After reading the many articles listed to analyze through the lens of chapter 7 of Code 2.0, I chose "And You Thought a Prescription Was Private" by Milt Freudenheim. In this article, the problem is that a woman trying to conceive used various fertility drugs and in vitro fertilization and is still receiving advertisements and other materials for those kinds of drugs. She thought her prescription information was private, but in reality, it wasn't. Most of her personal information (name, address, even Social Security number) was listed on the prescription and the pill bottle (just like any prescription). Third parties were able to get that information either directly from the pharmacy (bought and sold) or by hacking the system.
Since this issue affected anyone who bought prescription drugs, something had to be done to regulate who has access to personal health information. Under the new stimulus law enacted by President Obama earlier this year, selling personal health information is illegal except for tracking trends in illnesses like the flu (aggregate data, not personal information).
Two things are in conflict here: protecting personal (health) information and personalized advertising (as Lessig describes in chapter 11 of Code 2.0, using such data "simply to make the market work more smoothly," p. 219). On the one hand, people want to be sure their information stays private, accessible only to them. On the other hand, vendors and companies want some of that information so they can target specific audiences and promote sales.
In this case, law is the constraint on prescription (health) information. Perhaps a better solution would be to combine legal constraints with architectural ones. While the pharmacies accused of selling personal health information (Walgreens, CVS) claim that "names of patients are removed or encrypted before data is sold, typically to drug manufacturers" (Freudenheim), that encryption obviously wasn't enough. Perhaps code could be written to encrypt personal information more thoroughly, though I admit I don't know whether that would actually protect the information any better than before. Other than these two constraints, I can't see other ways of protecting information. Even if the market raised the price of information, it would still be sold. And what kind of social norm would regulate the sale of personal information? Are people going to stop going to the drugstore? Likely not. So the best solution would be to add technological regulation to the legal regulation.
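As a side note on why "names are removed or encrypted" can fail: one common weak approach is replacing a name with a deterministic, unsalted hash. Here is a minimal, hypothetical sketch (the names, drug, and attack list are all invented for illustration, not taken from the article) showing how such a pseudonym can be reversed with a simple dictionary attack:

```python
import hashlib

def pseudonymize(name: str) -> str:
    # Weak "encryption": an unsalted SHA-256 hash of the name.
    # Deterministic, so the same name always yields the same token.
    return hashlib.sha256(name.encode()).hexdigest()

# A "de-identified" prescription record as a data broker might receive it.
record = {"patient": pseudonymize("Jane Doe"), "drug": "fertility drug"}

# An attacker with any list of candidate names (a phone book, a customer
# list) can hash each candidate and rebuild the name-to-token mapping.
candidates = ["John Smith", "Jane Doe", "Mary Major"]
rainbow = {pseudonymize(n): n for n in candidates}

# Looking the record's token up in that mapping re-identifies the patient.
reidentified = rainbow.get(record["patient"])
print(reidentified)  # Jane Doe
```

This is exactly the sense in which removing or hashing names "wasn't enough": without salting, keyed hashing, or aggregation, the pseudonym is only as private as the attacker's candidate list is short.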
I suppose the complication with using both architectural and legal constraints is that both are already in use to some degree. With the recent legislation, both law and code protect prescription information; the problem is that more legislation and/or more code may be necessary to really protect people's privacy the way they desire. Lessig would argue that there should be some way to release data that promotes market activity while also protecting privacy (p. 219). This seems impossible, however, because how will marketers target advertising without the information that allows them to do so? Perhaps the way to have "advertising ... go to those people for whom it would be useful information" (Lessig p. 219) is to make only addresses accessible, not names. Flyers could come to the house, but addressed to the household rather than to you by name, and if you moved, those flyers wouldn't follow you. Most people would say this is still an infringement on their privacy, but "people value privacy differently" (Lessig 228).
This was actually a difficult article to analyze. I would love to hear any comments or ideas you have on the subject.
Good overall analysis. I appreciate the regular reference back to Lessig. I am also encouraged that you did not see this as a simple privacy vs. market argument. You did a good job of identifying when and how there are some protections on privacy, and of recognizing that the companies are not just trying to make money but are using available data to serve their clients better. It becomes a much more nuanced "ambiguity" when it is not good vs. evil but, as you describe it, a question of how much information I want to give up to get slightly better targeted marketing, how much control I should have over that information, and who is watching out for that balance from both sides.