Smart Cities and the Expectation of Privacy
September 11, 2001: a date embedded in history across the globe, never to be forgotten. President George W. Bush delivered a mandate to all levels of government: "never again." That mandate ran through every security agency of the United States and, arguably, of every country in the world. The period since has been characterized as a 'surveillance surge': surveillance has found no outline or border, directly embedding itself within, and overriding, human values. The combination of public fear, the lobbying efforts of the industry, and linkages between political and economic interests has catapulted the industry to centre stage in the fight against terrorism. As Introna and Wood emphasize, a point known for some time, technology "by its very design includes certain interests and excludes others," allowing the powers that be to violate citizens' rights discreetly, serving their own self-interest over the values and rights of their citizens, and using technology for political gain through power and discipline. Values are trodden upon by all who are enrolled in the regime or take part in an action or function of society.
While the argument remains that, even within the sphere of a public place, an individual's journeys should be protected, with transparency there should be no expectation of privacy in public spaces: as Hunter v. Southam held, s. 8 of the Charter of Rights and Freedoms protects "people not places". At the same time, we need to be able to conceive of privacy as something that is not lost, or strongly diminished, simply because information is shared with others in some contexts for some purposes; as Justices Iacobucci and McLachlin put it in R v Mills, privacy is not an all-or-nothing right.
Smart Cities
With an understanding of the data trust, there needs to be an understanding of the smart city ecosystem. The ecosystem is premised on data collected from public spaces like streets and parks; data from publicly accessible private spaces like stores, building lobbies, or courtyards; and data from private spaces not controlled by those who occupy them, such as office thermostats. The defining principle is that this data is anchored to geography, unlike data collected through websites and mobile phones. There is no argument that the ecosystem is complex: smart technologies and services deliver several outcomes, namely quality of life, government efficiency, health and wellness, economic development, sustainability, public safety, and mobility. The services delivered from smart city infrastructure, through policies, processes, public partnerships, programs, and continued innovation, are the driving mechanism; each delivers the data that regulation is premised on protecting.
The ability of the city and its service providers to work hand in hand with governments to strategically develop and deploy privacy in the smart city is a core competency and a critical success factor in building a smarter, more responsive city. Over time, as services mature and users' expectations and needs change, regulation must be continuously examined and adjusted to maintain or exceed current laws, such as the expectation of privacy test developed in Hunter v. Southam. In that decision, the first Supreme Court of Canada case to interpret s. 8 of the Charter, Justice Dickson wrote that s. 8 protects "people not places". Austin portrays the decision as one in which "the court liberated the guarantee against unreasonable search or seizure from its common law roots - indeed shackles - of property".
Smart cities are an emergent subject of great interest to the planet's diverse urbanist communities. The technology must be embraced; its innovations offer benefits that outweigh the costs to an individual's privacy and security. It is inevitable that governments will seize the opportunity of smart cities to provide the security and efficiency necessary for growing municipalities. The efficient use of space, for both security and the general well-being of the community as a whole, is the premise of why smart cities are inevitable.
Privacy and Security
Prior to diving into the expectation of privacy test, the intertwining of security and privacy should be briefly discussed. Helen Nissenbaum elaborates on the combination of privacy and security through principles that delve into the nature of the information collected and how it may or may not affect the owner of the data. Her principles further examine who in fact owns the data, ultimately concluding that the consumer should always have control of their data. Relevant to Schneier is her principle of the sanctity of places. Surveillance is premised on the notion that there are no boundaries to where technology may be installed and used for government ends. Nissenbaum disagrees, underscoring how fundamental is the consumer's right to be "shielded from the gaze of others". Nissenbaum's concept of contextual integrity ultimately recognizes that we live in a digital realm: information flows in contexts of politics, convention, and cultural expectation, and protecting the sanctity of that information proves vital; it may not be "transgressed". Within a flow of data, what may or may not be revealed depends on the "appropriateness" of the data: how likely it is that the information would have been revealed in the ordinary course of information flow. Whether the data has been freely given, with complete knowledge of its use and knowledgeable consent, falls under "distribution". Finally, justice intervenes, through regulation or the community as a whole, when the sanctity of the consumer's data has been violated.
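Nissenbaum's "appropriateness" and "distribution" norms give contextual integrity a decision structure that can be sketched in code. The following is a minimal, purely illustrative Python sketch; the class, the context norms, and every name in it are this sketch's own assumptions, not a formalism Nissenbaum herself provides, and real contextual analysis is far richer than two boolean checks.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """One flow of personal information (all fields are illustrative)."""
    subject: str           # whom the data is about
    sender: str            # who transmits it
    recipient: str         # who receives it
    info_type: str         # e.g. "location", "health"
    consented_uses: frozenset  # uses the subject knowingly agreed to
    actual_use: str        # what the recipient actually does with it

# Hypothetical norms: which information types a context treats as
# appropriate to reveal at all.
CONTEXT_NORMS = {
    "public_street": {"location", "image"},
    "hospital": {"health", "identity"},
}

def violates_contextual_integrity(flow: Flow, context: str) -> bool:
    """A flow breaches contextual integrity if the information type is
    inappropriate for the context ("appropriateness"), or if the actual
    use falls outside what was consented to ("distribution")."""
    appropriate = flow.info_type in CONTEXT_NORMS.get(context, set())
    distributed_as_consented = flow.actual_use in flow.consented_uses
    return not (appropriate and distributed_as_consented)
```

On this sketch, a street camera feed used for the traffic management the community consented to passes, while the same feed sold for ad targeting, or health data flowing through a street-level context, fails: the data itself is unchanged, only the context and use differ, which is the point of Nissenbaum's account.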
Expectation of Privacy
Bruce Schneier speaks of this new surveillance society and criticizes the "expectation of privacy" test that is used to interpret the Fourth Amendment of the United States Constitution. He notes that this test is dangerous because "the whole 'expectations' test is circular: what the government does affects what the government can do". Values can now be brushed aside for the benefit of security. The American constitution guards citizens against unreasonable searches and seizures; this right is interpreted by looking at the individual's subjective expectation of privacy, as well as objective societal expectations. The ostensible tradeoffs between privacy and security are frequently misconceived. Not for nothing does the Fourth Amendment begin with the words "The right of the people to be secure…." Deprivations of privacy and security tend to walk in lockstep among poor and marginalized populations under surveillance. As we have seen, policing can instill terror; surveillance can ruin lives. When we speak of privacy and security, we would do well to ask: privacy for whom, and security against whom?
Schneier's questions and criticisms are equally valuable to Canadians, as the American test for the Fourth Amendment is similar to that used for section 8 of the Canadian Charter of Rights and Freedoms, the right to be secure against unreasonable search and seizure. The Supreme Court of Canada interpreted this right in the criminal case R v Tessling. Before one can even ask whether the government has violated someone's privacy, one must determine whether the subject matter of the government's action was private. This is where the reasonable expectation of privacy test comes in, drawing the line between the private and public spheres. First, the individual must subjectively expect that the subject matter in question is private; second, that belief must be objectively reasonable. In Tessling, the court applied the test to police who had used infrared imaging technology to detect houses with an unusual amount of heat emanating from them. Using this technology, along with other evidence, they were able to establish probable grounds that illegal narcotics were being grown on a specific property. When the property owners argued that this infrared data should be excluded from the evidence as private, the court ultimately concluded that they did not have a reasonable expectation of privacy in their property's thermal information.
Sidestepping the difficult debate about searches and seizures, this case shows how the test is designed to be flexible to changing circumstances. The entire common law system is based on the notion that society changes, both culturally and technologically, and that judges have a role in ensuring that old laws stay relevant to new contexts. To that degree, a reasonable expectation of privacy test serves its role. Schneier criticizes this approach, however, by pointing out that “today’s technology makes it easier than ever to violate privacy… but it doesn’t necessarily follow that we have to violate privacy.” Schneier’s criticisms are taken in good faith, but it is difficult to imagine how to define privacy without a technological context. Young people are growing up in the age of online social networking, and for better or for worse their boundaries of privacy are different from the generation before them. It is difficult to imagine a society that will protect information that the same society does not expect to maintain as private.
Schneier identifies a much more problematic aspect of the test in examining how the government itself can erode the expectation of privacy, and change a search from unreasonable to reasonable. True, a government would not be able to override a societal consensus about privacy with an arbitrary or sudden shift in policy. But as the example of warrantless wiretapping illustrates, the ability for the U.S. government to justify the program makes the application of the “expectation” test unclear:
In Katz v. United States, the Court ruled that the police could not eavesdrop on a phone call without a warrant: Katz expected his phone conversations to be private and this expectation resulted from a reasonable balance between personal privacy and societal security. Given NSA's large-scale warrantless eavesdropping, and the previous administration's continual insistence that it was necessary to keep America safe from terrorism, is it still reasonable to expect that our phone conversations are private?
Certainly, with the U.S. government promoting its wiretapping program as a tool to fight terrorism, it is not clear that Americans can reasonably continue to expect that their phone conversations are private. Moreover, with the rhetoric of security invoked, it is not even clear that Americans expect that their phone conversations should stay private. It may offer some relief that a poll indicates most Americans believe the government should obtain a warrant in order to wiretap a phone conversation. But Schneier notes that the court's test "isn't based on anything like polling data; it is more of a normative idea of what level of privacy people should be allowed to expect, given the competing importance of personal privacy on one hand and the government's interest in public safety on the other."
Schneier's criticism of the current legal scheme is well-founded. It is difficult to determine whether the test should rest on fixed principle or shift with public opinion. But since the test is supposed to protect citizens from unreasonable government searches, it becomes important to find a way to protect the test itself from the government.
Privacy rights are rights that attach to every individual. Article 8 of the European Convention on Human Rights (now embodied in Article 7 of the Charter of Fundamental Rights of the European Union) protects them by asserting that "everyone has the right to respect for his private and family life, his home and correspondence." To establish the lawfulness of a measure in relation to Article 8, the Court imposes a two-step test. First, the Court establishes whether a breach exists. Once the existence of the breach is established, its lawfulness is tested against the requirements of Article 8.2. The breach must be provided for by law, whether through legislation or case law, and must relate to one of the areas listed in paragraph 2, such as national security or the economic well-being of the country. Finally, the breach must be necessary in a democratic society, which is decided by balancing the value of protecting privacy against the interest that breaching this protection would serve. The 'reasonable expectation of privacy' is therefore utilised in Europe as the 'balancing test of deciding when a privacy violation is necessary in a democratic society depends on the gravity of the privacy violation which in turn relies to a certain extent on the way or amount of privacy that people experience in that particular context.' Despite this, we know that our privacy rights, like all our other rights, are not absolute and are by necessity limited. What is problematic, then, is achieving an appropriate balance in defining the scope and limits of this very unique right.
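The two-step Article 8 analysis just described has a clear decision structure, which can be sketched, purely as an illustration, in Python. The function name and the spelling of the legitimate aims below are this sketch's own shorthand (the aims paraphrase the list in Article 8.2), and the "necessary in a democratic society" prong is in reality a balancing judgment, not a boolean input.

```python
# Aims paraphrased from Article 8.2 ECHR; the identifiers are this
# sketch's own, not an official enumeration.
LEGITIMATE_AIMS = {
    "national_security", "public_safety", "economic_well_being",
    "prevention_of_disorder_or_crime", "protection_of_health_or_morals",
    "protection_of_rights_of_others",
}

def article8_lawful(breach_exists: bool,
                    prescribed_by_law: bool,
                    aim: str,
                    necessary_in_democratic_society: bool) -> bool:
    """Step 1: is there an interference with the Art. 8.1 right at all?
    Step 2: an interference is lawful only if it is provided for by law,
    pursues a legitimate aim listed in Art. 8.2, and is necessary in a
    democratic society (the balancing test)."""
    if not breach_exists:
        return True  # no interference, nothing to justify
    return (prescribed_by_law
            and aim in LEGITIMATE_AIMS
            and necessary_in_democratic_society)
```

The structure makes Austin's point visible: the heavy lifting happens entirely in the final, context-dependent balancing prong, which no static rule can decide in advance.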
A recent ruling by the European Court of Human Rights (ECHR) provides an example of the difficulty of discerning the boundaries of this right. The case concerned a hospital taking a photograph of a newborn baby as part of a commercial service. The court ruled that the mere taking of the photo without the parents' consent was a violation of the baby's right to privacy, even though the photograph was never published. In his ruling the judge made the following remarks:
the concept of a private life is a broad one encompassing the right to identity; he stressed that a person's image revealed his/her unique characteristics and constituted one of the chief attributes of his/her personality
the effective protection of the right to control one’s image presupposed in the present case obtaining the consent of the person concerned when the picture was being taken and not when it came to possible publication. The action of taking the photo breached the child’s right to a private life as guaranteed by Article 8 of the European Convention on Human Rights, and that the Greek court failed to uphold that right.
At first glance, this ruling may seem arbitrary especially because the photograph was never published, but what is underlying this decision is an implicit contextual balancing.
Transparency and Balancing
R v. Mills, co-written by Justices Iacobucci and McLachlin, delivers another avenue to privacy in public spaces: "Privacy is not an all or nothing right". Austin observes that this "statement, and its elaboration by Justices Iacobucci and McLachlin, creates the jurisprudential foundation for overcoming one of the most important conceptual hurdles with respect to informational privacy in the context of information sharing": "We need to be able to conceive of privacy as something that is not lost, or strongly diminished, simply because information is shared with others in some contexts for some purposes". Mills affirms that privacy is not lost simply because information is shared, and therefore secures the possibility of robust constitutional protection of personal information in the hands of third parties. My argument goes further: privacy is not merely preserved but enhanced when third parties are regulated and controlled to handle the data only for its permitted use, and to ensure it is divulged no further than is required for the security of the community and the technology the community requires.
Hunter emphasizes this balancing act, where "the outcome of the balancing exercise might be different" depending on the level of protection warranted. Austin provides the example that the Courts have accepted departures from the requirement of prior authorization, or from the standard of reasonable and probable grounds, at borders and in administrative and regulatory contexts. This balancing act should be extended to the surveillance of communities and the interoperability of technology that enhances the lives of individuals. Balancing, along with data management, is fundamental to these "smart communities".
Austin emphasizes that, altogether, the fundamental goal is to ensure that the management of the data delivers on the "reasonable expectation that private information will remain confidential to the persons to whom and restricted to the purposes for which it was divulged. Where private information is disclosed to individuals outside of those to whom, or for purposes other than for which, it was originally divulged, the person to whom the information pertains may still hold a reasonable expectation of privacy in this information."
Within these smart communities, Austin emphasizes that the privacy interest relies heavily on the "expectation of privacy" test. The argument: when in public and using the benefits of the public space, what expectation may one have? Austin concludes that in public there is no expectation of privacy in its own right, as one would have in one's home; she offers the example that one's diary should be kept secret from prying eyes. There is, however, an expectation that the information gathered will not be used for nefarious reasons, will be disclosed to the individuals concerned, and will remain transparent to the user. I like to call it "hiding behind bushes": I do not want the providers of the technology and surveillance hiding to avoid the critique and criticism of the public. Rather, provide transparency, through signage and/or phone pings, to inform everyone what technology is being used in the smart setting and how it is being used. The argument should always be about having all join together on interoperability, not masking the face of progress. Austin also notes that the constitutional right in s. 8 is "about protecting individuals from being surprised by state (provider) intrusions".
Conclusion
Pundits have been writing privacy's obituary for years. In 2014, Thomas Friedman wrote in the New York Times that "privacy is over." Facebook's Mark Zuckerberg said that "the age of privacy is over" in 2010. And Sun Microsystems' former CEO Scott McNealy declared, back in 1999, that we "have zero privacy anyway. Get over it." We have been told privacy is dying for so long that the average person on the street can be excused for thinking it died years ago, alone, gasping for breath. If we think of privacy as secrecy, then yes, it is over. The emphasis should instead fall on transparency about the data used within the smart community: ensuring the community as a whole is aware of the information gathered and its purpose, and retains control and the ability to opt out. All in all, innovation, technology, and the hearts and minds of people should be encouraged. The old way of thinking held that if something is good for people it is bad for business, and vice versa. This is absolutely archaic. The new way of thought should be about positive social outcomes. The equation is this: no expectation of privacy in public spaces, combined with transparency, places people and businesses at the forefront and leads to enhanced and efficient communities as a whole.
References
World Intellectual Property Organization (WIPO). (n.d.). What is Intellectual Property? Retrieved from https://www.wipo.int/about-ip/en/
Adler-Bell, B. G. (2017, December 21). The Disparate Impact of Surveillance. The Century Foundation. Retrieved from https://tcf.org/content/report/disparate-impact-surveillance/
Asma Khatoon, P. V. (2019). Blockchain in Energy Efficiency: Potential Applications and Benefits. Energies, 12(17).
Austin, L. M. (Spring 2007). Education, Administration, and Justice: Essays in Honour of Frank Iacobucci. The University of Toronto Law Journal, 499-523.
Bannerman, S. (2020). Privacy and smart cities: A Canadian survey. Canadian Journal of Urban Research, 17-38.
Bashir, I. (2017). Mastering Blockchain. Packt Publishing.
European Commission, Brussels. (2020, February 19). White Paper on Artificial Intelligence: A European approach to excellence and trust. COM(2020) 65 final.
Cavoukian, A. (2009, August). Privacy by Design: The 7 Foundational Principles. Retrieved from https://www.ipc.on.ca/wp-content/uploads/resources/7foundationalprinciples.pdf
Daniel Bergstrom, B. S. (2019, April 4). Blockchains and Online Trust: Charting the Future of Innovation. Blockchains and Online Trust, 3.
Fan Zhang, S. G. (August 2015). Federated Learning Meets Blockchain: State Channel based Distributed Data Sharing Trust Supervision Mechanism. Journal of Latex Class Files, 14.
FISA. (n.d.). VOTERS VIGOROUSLY OPPOSE WARRANTLESS WIRETAPS, BLANKET WARRANTS AND TELECOM AMNESTY. Retrieved from https://www.aclu.org/other/national-fisa-poll-mellman-group-voters-vigorously-oppose-warrantless-wiretaps-blanket?redirect=cpredirect/32189
Gasser, U. (2015). Interoperability in the Digital Ecosystem. Research Publication No. 2015-13 July 6, 2015, 30. Retrieved from https://dash.harvard.edu/bitstream/handle/1/28552584/SSRN-id2639210.pdf
Geer, D. E. (2015, July/August). The Right to Be Unobserved. IEEE Computer and Reliability Societies.
Geer, D. E. (n.d.). Complex Security, Secure Complexity. Retrieved from http://geer.tinho.net/bolin.geer_3LR_4.pdf
Gratton, E. (2013, October 3). If Personal Information is Privacy's Gatekeeper, then Risk of Harm is the Key: A Proposed Method for Determining What Counts as Personal Information. Albany Law Journal of Science and Technology, 24(1).
Guy Zyskind, O. N. (n.d.). Decentralizing Privacy: Using Blockchain to Protect Personal Data. iapp.
Hoffman, E. G.-C. (2014). Privacy, Trusts and Cross-Border Transfers of Personal Information: The Quebec Perspective in the. Dalhousie Law Journal, 255-300.
Hunter v. Southam Inc., [1984] 2 S.C.R. 145.
J.L. Zittrain and J.G. Palfrey, J. (2008). Internet Filtering: The Politics and Mechanisms of Control. Chapter 2 of Access Denied: The Practice and Policy of Global Internet Filtering, 29-56. Retrieved from https://pdfs.semanticscholar.org/4834/a0957b5d13f2d86e95ef415bb27b3f4cef87.pdf
Jayachandran, P. (2017). The difference between public and private blockchain. Blockchain: Background and Policy Issues. Congressional Research Service.
Katz v. United States, 389 U.S. 347 (1967).
Kulhari, S. (n.d.). Building-Blocks of a Data Protection Revolution. Nomos Verlagsgesellschaft mbH. Retrieved from https://www.jstor.org/stable/j.ctv941qz6.7#metadata_info_tab_contents
Landau, D. C. (2011). Untangling Attribution. Harvard National Security Journal. Retrieved from https://www.nap.edu/read/12997/chapter/4
Leenes, R., & Koops, B.-J. (2005). 'Code': Privacy's Death or Saviour? International Review of Law, Computers & Technology, 329-340.
Limpert, P. B. (2019). Background to Election and COVID Misinformation On Social Media. Law 6163.
Limpert, P. B. (2019). Position and Activities of Western Adversaries in Cyberspace. Law 6163.
Limpert, P. B. (2019). Theories About Misinformation. Law 6163.
Limpert, P. B. (2019). Various Issues in Government Sponsored Surveillance. Law 6163.
Lin, H. (2016). Attribution of Malicious Cyber Incidents: From Soup to Nuts. Journal of International Affairs(The Cyber Issue Winter 2016). Retrieved from https://jia.sipa.columbia.edu/attribution-malicious-cyber-incidents
Nissenbaum, H. (2004). Privacy as Contextual Integrity. Washington Law Review, 101-139. Retrieved from https://crypto.stanford.edu/portia/papers/RevnissenbaumDTP31.pdf
O'Hara, K. (2019). Data Trusts: Ethics, Architecture and Governance for Trustworthy Data Stewardship. Web Science Institute White Papers. Retrieved from http://dx.doi.org/10.5258/SOTON/WSI-WP001
Onuoha, M. (2017, January 30). What It Takes To Truly Delete Data. FiveThirtyEight.
Out-Law. (2004). Naomi Campbell wins privacy appeal in House of Lords. Pinsent Masons. Retrieved from https://www.pinsentmasons.com/out-law/news/naomi-campbell-wins-privacy-appeal-in-house-of-lords
Out-Law. (2008). Rowling privacy win confirms proper test for privacy, says expert. Pinsent Masons. Retrieved from https://www.pinsentmasons.com/out-law/news/rowling-privacy-win-confirms-proper-test-for-privacy-says-expert
Out-Law. (n.d.). Is filming someone in the street a breach of privacy? Pinsent Masons. Retrieved from https://www.pinsentmasons.com/out-law/news/is-filming-someone-in-the-street-a-breach-of-privacy
Personal Information and Protection of Electronic Documents Act (S.C. 2000, c. 5). Retrieved from https://laws-lois.justice.gc.ca/ENG/ACTS/P-8.6/index.html
Privacy Act (R.S.C., 1985, c. P-21). Retrieved from https://laws-lois.justice.gc.ca/ENG/ACTS/P-21/index.html
R. v. Mills, [1999] 3 S.C.R. 668.
R. v. Tessling, 2004 SCC 67, [2004] 3 S.C.R. 432. Retrieved from https://www.canlii.org/en/ca/scc/doc/2004/2004scc67/2004scc67.html
Rand, G. P. (2021). The Psychology of Fake News. Trends in Cognitive Sciences, 25(5), 388-402. Retrieved from https://reader.elsevier.com/reader/sd/pii/S1364661321000516
Rich, J. K. (2018). Truth Decay: An initial exploration of the diminishing role of facts and analysis in American Public Life. Rand Corporation, 21-38; 152-153.
Schneier, B. (2009). It's Time to Drop the 'Expectation of Privacy' Test. Wired. Retrieved from https://www.wired.com/2009/03/its-time-to-drop-the-expectation-of-privacy-test/
Schneier, B. (2016, March 4). Schneier on Security.
Solove, D. J. (2007). "I've Got Nothing to Hide" and Other Misunderstandings of Privacy. San Diego Law Review, p. 745. Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=998565
Thomasen, K. (2020). Robots, Regulation, and the Changing Nature of Public Space. Ottawa Law Review. Retrieved from https://commons.allard.ubc.ca/fac_pubs/633/
Troncoso, S. G. (n.d.). Engineering Privacy by Design Reloaded. Computers, Privacy & Data Protection, 25. Retrieved from https://www.esat.kuleuven.be/cosic/publications/article-1542.pdf
Waldman, A. E. (23 March 2018). Privacy as Trust Information Privacy for an Information Age. Cambridge University Press.
Wood, L. I. (2004). Picturing algorithmic surveillance: the politics of facial recognition systems. Surveillance and Society, 178-198. Retrieved from https://pdfs.semanticscholar.org/81df/e7091cda0e6e17f3344ff8a863c04041b3c5.pdf
Yuhong Li, K. O. (2020, April). A Blockchain-Assisted Intelligent Transportation System Promoting Data Services with Privacy Protection. Sensors, 20, 2-22.