Digital Privacy
A Brave New World?
By Drew Lewis and Kelly L. Frey, II

All it takes is a visit to a downtown restaurant or a trip to an airport lobby to observe how ingrained in our daily lives Treos®, Blackberrys®, iPhones®, PDAs, laptop computers, and other digital devices have become. We have, in many ways, become cyborgs in a new social order centered upon digital convenience.1 Digital devices have become repositories for our most private and personal information. We keep our bank records on our laptops and pay bills digitally through WiFi links at the local coffee shop. We create a “commercial self” through the personal shopping preferences that reside on our computers and the cookies2 we accept from internet merchants eager to use our prior browsing and buying habits as triggers for new sales opportunities.3 We store our family photographs on our PDAs rather than carrying pictures of our loved ones in our wallets. We blog, we email, and invariably we hit the “save” button every time we convert a thought, a feeling, or a vision to digital form. Our digital devices have become extensions of ourselves, of our intangible personalities; they are the principal interface between us and the world around us.
But in this brave new world, can we expect the same protection for our “digital selves” as we expect for our “physical selves”? The short answer from new state legislation and recent enforcement actions by federal administrative bodies such as the Federal Trade Commission (FTC) seems to be yes. However, our federal legislators, courts, and law enforcement officials seem to be equivocating in supplying answers that properly balance our rights of privacy against our need for governmental protection in the new digital world.
State Legislation and Federal Administrative Activities
Recent state legislative efforts with respect to digital privacy have centered primarily on protecting personal information exchanged in consumer transactions. Tennessee, along with a number of other states, has enacted legislation designed to combat identity theft and protect consumers’ private information.4
The FTC is taking the lead from an administrative perspective on digital privacy issues.5 The FTC indicates that it has multiple responsibilities with respect to educating consumers, enforcing companies’ privacy guarantees, and helping to protect children from specific internet threats.6 Even though these recent state and federal actions have dealt primarily with protecting our commercial digital activities, they have fostered an increasing public expectation of both privacy and control over our private and personal information in digital form in all contexts. However, this increasing expectation of digital privacy seems to have had little effect upon law enforcement methods or upon judicial interpretations that rest on somewhat archaic analogies when confronted with new digital dilemmas.

The Fallacy of “Computer as a Footlocker”

Our privacy rights, and the attendant Fourth Amendment protections against unreasonable search and seizure of our personal space, have traditionally been based upon a reasonable expectation of privacy.7 So, for example, when a person leaves his/her home and locks the door, that person has explicitly manifested an expectation of privacy such that no unauthorized access to the premises will/should occur.8 Society seems comfortable recognizing this expectation of privacy as legitimate and, to a large degree, nonproblematic. If law enforcement sought to enter the home, the locked door manifests the necessary expectation of privacy such that a warrant or consent (or an exception to the warrant/consent requirement) would be needed to enter the secured personal space.
However, an increasing number of federal courts have held that securing our “digital space” (such that a right of privacy and assurance against warrantless search and seizure exists for our “digital selves”) may require a lot more effort than merely “locking the door.” The 6th Circuit recently decided, in United States v. Morgan,9 that just as with homes, a person can authorize the search of another person’s computer so long as there is apparent authority to consent. In that case the defendant’s wife, who suspected her husband was viewing child pornography, had installed a covert “spyware” application on the computer in order to confirm her suspicions.10 After notifying law enforcement, she consented to a search of the personal computer. Ultimately, the court concluded that the wife’s consent was valid because the appearance of her authority over the computer was reasonable: the computer was located in a common area of the home, both the defendant and his wife had access to it, and neither spouse had his or her own password to access the system.
The 4th Circuit also addressed the issue of common authority over personal computers and computer files in United States v. Buckner.11 In Buckner, the defendant’s wife consented to a search of a computer that was purportedly involved in wire and mail fraud crimes. The husband asserted that he had “password protected” the computer such that it could not be turned on without his unique password (i.e., “locking the box” so that he could have a reasonable privacy expectation with respect to the computer’s digital contents). The court, however, did not reach the “password” issue, since the computer was already on and running when the wife authorized the search. Thus, the condition of a “locked door” was not dispositive in Buckner, apparently because there was a “willing (if not authorized) invitation” by the wife (who, however, had no legal interest, or privacy expectation, in the personal computer records that were the object of the search).12

An even more strained analysis came out of the 10th Circuit in United States v. Andrus,13 where the court recently affirmed the conviction of a man whom police discovered to possess child pornography on his computer after his 91-year-old father consented to a search of his room and computer.14 The court concluded that even though the elderly father clearly did not have actual authority to permit a search of his son’s computer, the fact that the father owned the home and paid the internet service provider bill was sufficient indicia to conclude that he had apparent authority to authorize the search of the house and the computer. The 10th Circuit first considered which analogy was appropriate in conceptualizing the search of electronic storage devices within the house.
The court noted that the expectation of privacy in a personal computer has often been likened to that of a briefcase or footlocker and that a computer protected by a password is conceptualized as a “locked footlocker inside the bedroom.” The 10th Circuit determined that “it seems natural that computers should fall into the same category as suitcases, footlockers, or other personal items that command a high degree of privacy.”15 Despite this “higher degree of privacy,” however, the 10th Circuit came to the conclusion that the presence of a computer password did not create the same protection as a lock on a physical container.16 Even though the personal computer in Andrus was password protected (a “locked container”), law enforcement used a software application specifically designed to “allow[] user profiles and password protection to be bypassed.”17 The police then argued that they did not know that the Andrus computer was password protected (because they did not worry about such things – they could simply ignore any password protection and bypass the “locked door” using their great new software application). The Andrus court apparently accepted such logic in finding that “…since the password would not have been obvious to the officers at the time they obtained consent and commenced the search…” the search was legal.18 The result is an odd piece of jurisprudence in which the individual’s expectation of privacy (and guarantee against illegal search and seizure) depends not so much on the individual’s reasonable efforts to protect his/her personal space as on the government’s technological ability to avoid recognition of such efforts.19 What all of these courts fail to recognize is that the expectation of privacy in digital files stored on computers – the expectation of privacy in our “digital selves” – is qualitatively different from the expectation of privacy in a suitcase or footlocker (whether locked or unlocked).
Computers and PDAs are not “boxes” or “containers” with information physically stored inside. These devices have become digital extensions of our physical selves. They contain a great deal more information about us and our activities than could ever be physically stored within any “personal space.” And they have an annoying tendency to capture everything about our digital selves – intended or not – over the lifetime of the device.20 Computers are not footlockers or suitcases that require physical locks to assert privacy rights against invasion. Computers (and similar digital devices) are repositories of our most private and personal information. Computers have become our memories – and the courts’ failures to recognize this fundamental societal change places us all at risk.
The 9th Circuit has recognized this qualitative difference between (i) information stored on digital devices and (ii) physical objects enclosed within a container. Last year, in one of his dissents, Judge Kleinfeld remarked, “for most people, their computers are their most private spaces.”21 His comments presage a world where the expectation of privacy is defined by the contents of the computer, not the fact that the computer coincidentally resembles a physical container. Courts must make a conscious policy choice when facing issues of digital evidence-gathering techniques; sophisticated electronic devices that store most, if not all, of the average person’s private information should no longer be defined in terms of overly simplistic physical analogies. The courts must develop with the times and use more sophisticated paradigms for analyzing these types of cases.22

E-Mail Isn’t Really Mail

Equally alarming are recent federal activities related to our digital communications. In the past, communication with others was conducted primarily through mail couriers and then electronically, via telephone. Items mailed to an individual could be accessed by the government through a search only after the government had shown probable cause, a reasonable ground or belief of guilt,23 along with contemporaneous notice (such as “knock and announce”).24 Telephone communications were subject to an even higher level of protection from government intrusion.25
Today, technology has allowed us to merge the convenience of instantaneous telephone communication with the practicality of mail communication, allowing us to send emails, images, and other personal communications in multiple digital formats. In fact, many of us use email as our primary form of communication in both our professional and personal lives. The logical assumption for most of us is that the information we exchange through digital media would be, at a minimum, subject to the same protections as information exchanged through physical media (requiring reasonable grounds and notice for searches and seizures to occur). Unfortunately, neither reasonable grounds nor notice is required for search and seizure of many such digital communications.26 Digital communications have not been subject to the protections we would expect because Congress has chosen to treat them differently from mail and telephone communications.

Today, we often receive information through an email and store it in our inbox to access at a later time. This “open and save” technique has become prevalent in modern communications and has largely replaced the earlier techniques of handwriting notes to serve as memoranda of telephone calls or archiving physical correspondence in a filing cabinet or desk drawer. Although both techniques accomplish the same goal and are logically very similar, their protection from government intrusion is vastly different. While a digital communication is in transit, it is subject to the same heightened level of protection that a telephone call would have.27 However, any time the digital communication is stored, even if by a third-party carrier, the legal protection for that communication deteriorates.
The Electronic Communications Privacy Act (ECPA) was enacted in 1986 (before the Internet as we now know it existed) and set two levels of protection for digital information in the possession of third parties, such as Internet Service Providers (ISPs). The first level, for communications stored for fewer than 180 days, provides Fourth Amendment protection in regard to cause (the government must show probable cause) but not notice. A warrant for the search of a user’s archived email stored on an ISP’s servers can reach into all of the digital contents of the user’s inbox without the user even knowing that his/her email has been accessed. The email user thus effectively forfeits the opportunity to challenge the warrant and the ability to protect his/her privacy interest in the email’s contents. More alarming is the second level of protection under the ECPA, given to digital communications stored for more than 180 days. The ECPA allows the government to gain access to such information pursuant to a subpoena,28 which requires the government to show only that the information contained in the email is relevant to a government investigation. Effectively, any email stored in an inbox for more than 180 days can be subpoenaed without the user ever receiving notice, forcing the user to forfeit his/her opportunity to contest the subpoena.
Although ISPs typically have service contracts with each user that define the privacy parameters for emails transmitted through the ISP (and the FTC is fairly vigilant in ensuring that the protections granted in such agreements are enforced), these ISP service contracts are not sufficient to protect the privacy expectations of internet users. First, most ISP service agreements are in “clickwrap” form, such that their privacy provisions are not actually read by users. Second, and most notably, almost all ISPs’ privacy agreements allow the ISP to disclose information upon demand of the government.29 What this means for the email user is that neither his/her service/privacy agreement with the ISP nor his/her privacy rights under the ECPA provide the same level of procedural protection for the personal information contained in his/her digital inbox as for the personal information contained in his/her physical space (i.e., a letter in an envelope, correspondence in a closed desk drawer or file folder, etc.).
This asymmetry between the level of protection given to information stored digitally and information stored physically is incongruent with the precedents of search and seizure law. Traditionally, privacy protection is afforded to an individual upon a reasonable expectation of privacy and a manifestation of that expectation. Given the current state of digital communication, there is little or no practical difference in the privacy one would reasonably expect between (i) a letter stored for three months in an office desk or in transit with a carrier and (ii) an email that one stores in a folder in his/her email account or that is stored by his/her ISP for the user’s future access. However, in the realm of digital communication, the protection of individual privacy and adherence to the precedents of privacy law seem to have been abandoned for increased administrative efficiency in government investigations.

Conclusion
In every generation the law is faced with an essential redefinition of society. At the turn of the twentieth century the government sought to balance the power of monopolies against the protection of the consumer. Mid-century, the conflict concerned human rights – first the rights of expression and association and then the rights of equality across all races and classes. Our most recent generational conundrum has been the actual definition and protection of self, including dominion over the digital personas and communications we create as the digital tools we use become extensions of our identities and ourselves.
Congress’s and the judiciary’s first attempts at adjusting to our changing means of communication in the modern era have proved insufficient. Their attempts to apply doctrines established for information stored physically to information stored digitally have yielded results that are counterintuitive to the layman and the privacy law scholar alike. This misapplication of legal doctrine has created a great asymmetry between the constitutional protection we expect for our digital privacy and the protection we are actually given. The conclusions that we, as lawyers, will fashion through the courts, and that we, as voters, will determine through the legislative process, will decide whether the resultant society is one in which we and our children are adequately protected from both digital crime and unwarranted digital intrusion by the government. It is time for us all to “think outside the box” and adapt to a new world, a changing world – one that recognizes the rights inherent to our digital selves, rights inherent to the cyborgs we have become.30

--------------------------------------------------------

Drew Lewis is a law student at the University of Mississippi School of Law and a summer associate with the Nashville office of Baker Donelson Bearman Caldwell & Berkowitz, PC. He can be reached at walewis@olemiss.edu. Kelly L. Frey, II is a law student at Emory Law School and a summer clerk with the Atlanta office of Baker Donelson. He can be reached at kfrey@law.emory.edu. The authors wish to acknowledge the assistance of Kelly L. Frey, Sr., and Jonathan J. Cole (shareholders in the Nashville office of Baker Donelson) in the preparation and revision of this article. Mr. Frey, Sr. represents large corporations and vendors to such companies in information technology and corporate procurement transactions and may be reached at 615-726-5682 or kfrey@bakerdonelson.com. Mr. Cole focuses on resolving business and consumer disputes and defending companies in state and federal regulatory enforcement actions and may be reached at 615-726-7335 or jcole@bakerdonelson.com.

------------------------------

(Footnotes)

1 “Cyborg – A person whose physiological functioning is aided by or dependent upon a mechanical or electronic device.” Random House Unabridged Dictionary (2006).
2 The gentle euphemism leads most people to ignore how powerful these small data files have become, for criminals as well as for legitimate corporate marketers.
3 This practice has created a completely new industry, referred to as “market analytics,” that mines information about us across multiple merchants and across the increasing number of commercial databases that capture information every time we use a credit card, purchase online, or even search for potential items to purchase.
4 See, e.g., Identity Theft Victims’ Rights Act of 2004, Tenn. Code Ann. § 39-14-150. The Tennessee statute makes identity theft a Class D felony and identity theft trafficking a Class C felony. Tenn. Code Ann. § 39-14-150(i). See also http://www.ncsl.org/programs/lis/privacy/idt-statutes.htm for the current status of state identity theft laws and http://www.ncsl.org/programs/lis/privacy/eprivacylaws.htm for the current status of state computer privacy laws.

5 See http://www.ftc.gov/privacy/.

6 In addition to enforcement actions under various industry-specific legislation such as the Gramm-Leach-Bliley Act, 15 U.S.C. §§ 6801-6809, the FTC has acted to force Internet Service Providers (ISPs) to comply with the explicit privacy policies agreed upon with their customers in ISP service agreements. See James X. Dempsey, “Digital Search & Seizure: Updating Privacy Protections to Keep Pace with Technology,” Privacy Law Institute (Eighth Annual): Pathways to Compliance in a Global Regulatory Maze, p. 523.

7 Katz v. United States, 389 U.S. 347 (1967). See also the excellent discussion of privacy expectations as they relate to emails in the 6th Circuit case of Warshak v. United States, No. 06-4092, 2007 ILRWeb (P&F) 2025, ___ F.3d ___, 2007 WL 1730094 (6th Cir. June 18, 2007).

8 Even though warrantless surveillance of the exterior of the home appears to be acceptable – another dilemma for the courts as new technologies evolve that reduce closed curtains and solid walls to mere interference that can be filtered out by law enforcement officials now armed with “superman-like, X-ray vision” devices. See Kyllo v. United States, 533 U.S. 27 (2001).

9 United States v. Morgan, 435 F.3d 660 (6th Cir. 2006). The defendant was convicted of possession of child pornography discovered on his personal computer after his wife consented to a search of the computer.
10 A spyware program that captures whatever appears on the screen of the computer every ten seconds without the user’s knowledge.

11 United States v. Buckner, 473 F.3d 551 (4th Cir. 2007). The defendant was suspected of mail fraud, and his wife consented to a search of his personal computer.

12 In this case, the wife of the individual who placed the digital files on the computer clearly told the police that she did not use the computer – such that the digital records on the computer were not hers and the “privacy expectation” in the files legally lay with the husband. Nonetheless, the fact that the wife was the sole lessee of the computer (the “box” within which the files resided) may have ultimately proved a pivotal factor for the court (even if the wife had no interest, legal or otherwise, in the digital information stored on the computer).

13 United States v. Andrus, 483 F.3d 711 (10th Cir. 2007).

14 The court seemingly ignored two key facts as to apparent authority to consent. The first is that even though law enforcement officials thought the elderly man had consented to the search of his son’s computer, they still stopped their investigation once the son returned home and sought the son’s consent directly. The second, and more controversial, fact is that there was some discrepancy as to whether the elderly man flatly told police that he did not have access to the computer and thus could not have apparent authority. The police deny that he made any such statement, though he claims he did; the court indicated that there is a presumption favoring the account of the police when there is a conflict in testimony (a credibility element, not a privacy element).

15 Andrus, 483 F.3d at 718 (quotations omitted). It is worth noting that the court failed to provide the analogy for a password-protected file located on the hard drive of a password-protected computer.

16 Id. at 720 n.6.

17 Id. at 719 n.5. The specific software application used was EnCase.
For more information on EnCase and a new product, Neutrino, designed to do the same things as EnCase but on cellular devices, see http://www.guidancesoftware.com.

18 Id. at 720 n.6; see also id. at 723 (McKay, J., dissenting) (commenting that the alleged difficulty the majority claims is presented by passwords on computers is not worsened but rather avoided “altogether, simultaneously and dangerously sidestepping the Fourth Amendment in the process”).

19 The natural consequence would be to place the burden upon the individual to prove that the government does not have actual knowledge of the individual’s efforts to secure his/her privacy in order to claim the privacy right – a very Orwellian state of affairs, in which law enforcement merely needs to “look the other way” or “turn a blind eye” to any reasonable restrictions that individuals may place on their right of digital privacy in order to circumvent that right.

20 See the interesting recent Georgia Court of Appeals case of Barton v. State, No. A07A0486, 2007 WL 1775565, ___ S.E.2d ___ (Ga. Ct. App. June 21, 2007), discussing the presence of digital information in the “cache” on a personal computer (i.e., files that are stored by the computer technology being used, not by the user him/herself).

21 United States v. Gourde, 440 F.3d 1065, 1077 (9th Cir. 2006) (en banc) (Kleinfeld, J., dissenting).

22 Earlier this year, the Department of Justice released a special report entitled “Investigations Involving the Internet and Computer Networks” (available at http://www.ncjrs.gov/pdffiles1/nij/210798.pdf) as a guide to law enforcement in federal investigations. In the report, the DOJ dedicates only one chapter to “Legal Issues,” and fewer than three pages of the 140-page report deal with Fourth Amendment issues specifically. The DOJ report makes the blanket assertion that “[t]raditional Fourth Amendment principles, like those governing closed containers, apply to digital evidence.” Note 2 at 75 (emphasis added).
It is certainly possible that the term “closed containers” was randomly chosen to illustrate what was meant by “traditional”; however, it is an interesting choice of words, since that term most clearly illustrates the problems that may arise if law enforcement seeks to force the “square peg” of digital evidence into the “round hole” of traditional Fourth Amendment precedents.

23 Maryland v. Pringle, 540 U.S. 366 (2003).

24 Wilson v. Arkansas, 514 U.S. 927 (1995).

25 18 U.S.C. §§ 2510-2522 (“Wiretap Act”); see also Berger v. New York, 388 U.S. 41 (1967) (discussing the uniquely intrusive aspects of wiretapping and bugging and the requirement of heightened procedural protections from such covert activities).

26 For a detailed analysis of how search and seizure law governing digital communication varies from that governing other communication, see James X. Dempsey, “Digital Search & Seizure: Updating Privacy Protections to Keep Pace with Technology,” Privacy Law Institute (Eighth Annual): Pathways to Compliance in a Global Regulatory Maze.

27 18 U.S.C. § 2518 (requiring probable cause for the government to intercept emails in transit).

28 18 U.S.C. § 2703. An opened email is considered to be held in the inbox only for storage purposes and is thus afforded this lower level of protection.

29 Dempsey, supra note 26, at 523.

30 “[C]lauses guaranteeing to the individual protection against specific abuses of power, must have a similar capacity of adaptation to a changing world.” Olmstead v. United States, 277 U.S. 438, 472 (1928) (Brandeis, J., dissenting).