I have enjoyed participating in the National Dialogue about Health IT. One of the challenges to my suggestion that decision makers should insist on FOSS in Health IT was the following comment:
in terms of privacy, there’s nothing inherent in FOSS that makes it superior to all proprietary products.
I have discussed this issue before, mostly when discussing HealthVault, but my comments have been spread out over several articles.
There is an inherent benefit to privacy, confidentiality and security for FOSS health IT systems.
There is another idea on the National Dialogue site that I thought was useful. It separates the concepts of privacy and confidentiality. Most people blur the concepts of privacy, security and confidentiality and talk about them in the same mouthful. For now I will consider “privacy” to be the ability to control who gets to see your data, although my points apply to confidentiality and security as well.
FOSS Health IT systems are inherently better at respecting privacy because they support “trust-but-verify”, while proprietary systems support only trust.
The only way to know what a program is doing is to read the most human-readable version of that program, which is typically called sourcecode. There are countless examples of programs doing things other than what they appear to be doing. Viruses, Spyware, Monitoring features and Bugs are classic examples of this.
When a proprietary Health IT program says it respects your privacy, there is no way for a user to know directly whether this is true; he must trust the proprietary vendor. The fact that most proprietary vendors are honest is irrelevant. The trouble with dishonest people is that you cannot tell the difference between them and honest people. We cannot know which proprietary Health IT vendors are respecting privacy and which are not. Also, the same large organizations that you might normally “trust” in fact have a history of privacy abuses; Microsoft being the best example.
So does HealthVault respect privacy? Probably. But there is no way to be sure without reading the code.
Does Dossia respect privacy? Probably. But we can check by auditing the sourcecode of Indivo, because Dossia is based on the FOSS Indivo project. Suppose that you believe that Indivo does not do a sufficient job of respecting privacy, or you find a back door (unlikely). You can fork the code, remove or change the offending portions of Indivo, and then run your own Indivo server with the privacy features that you want.
FOSS supports both trust-but-verify and trust-but-fork, which is the only way to be absolutely certain that privacy is maintained.
Therefore FOSS does have a fundamental advantage over proprietary software with regard to privacy.
-FT
Fred,
How can you be sure that Dossia is running the same code that you actually inspected? Unless you personally compiled the code, you still have to trust the sysadmins at Dossia, right?
Checking the code that someone *says* they are running is no proof of anything.
Etienne,
You are absolutely correct. This is why “trust-but-fork” is so important. Running your own server is the only way to be absolutely sure. Stallman, for instance, thinks it is crazy to trust a computer in the cloud.
People running parallel instances of FOSS PHRs help the whole community in another way: they allow the frontend functionality to be compared between instances. So if the frontend of Fred’s Indivo-based PHR does not match Dossia’s, then we know that there may have been backend changes as well.
There are ways for companies to ensure that the running sourcecode is the same as the published sourcecode. You can use code-signing cryptography for this. You can even get community-provided certificates for this purpose from CACert.org.
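To make the idea concrete, here is a minimal sketch of the verification side, assuming a hypothetical script name, path, and vendor-published digest (this is not actual Dossia or Indivo tooling). It computes one deterministic hash over a deployed source tree and compares it to the digest the vendor published; in real code-signing, that published digest would itself carry a cryptographic signature, for example backed by a certificate from CACert.org, so users can check that it was not tampered with.

```python
# Minimal sketch only: check that a deployed source tree matches a digest
# the vendor has published. The script name, path, and published digest
# are hypothetical; real code-signing would also verify a signature over
# the digest rather than trusting a bare hash.

import hashlib
import sys
from pathlib import Path


def digest_source_tree(root: str) -> str:
    """Return one SHA-256 digest over every file under root, in a stable order."""
    h = hashlib.sha256()
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            # Hash the relative path too, so renamed or moved files are detected.
            h.update(str(path.relative_to(root)).encode("utf-8"))
            h.update(path.read_bytes())
    return h.hexdigest()


if __name__ == "__main__":
    # Usage: python verify_source.py /srv/indivo <published-digest>
    deployed = digest_source_tree(sys.argv[1])
    published = sys.argv[2].strip().lower()
    if deployed == published:
        print("Deployed sourcecode matches the published digest.")
    else:
        print("MISMATCH: deployed sourcecode differs from what was published.")
```

Of course, a digest like this is only as trustworthy as the process that produced it, which is exactly the limitation described next.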
There is no way to prevent someone from publishing a sourcecode hash that was created from an unmodified version of the software while actually running modified software, and pretending that the hash describes what is running. However, those kinds of games begin to approach a breach of more traditional anti-fraud laws.
-FT