Privacy is understood as the right of a person to have their personal data properly secured. Moreover, it is related to the ability of individuals to control, edit, manage and delete information about themselves and to decide how and to what extent such information is communicated to others [[i]]. Any data that can uniquely identify a person, or that is not supposed to be known to anyone other than its owner and/or their immediate family without their consent, is called Private Data [[ii]].
Cloud services make it easier for Public Authorities to take advantage of opportunities to share information. For example, sharing personal information with another Public Authority or Agency may be achieved by simply creating user accounts with the appropriate permissions within a SaaS solution, rather than having to implement a system-to-system interface to exchange information. Although cloud services have the potential to lower the technical barriers to information sharing, Public Authorities must ensure that they appropriately manage access to personal information and comply with the requirements of European and national privacy legislation.
The main threats to privacy in a cloud computing environment are:
- Lack of User Control
- Lack of Training and Expertise
- Unauthorized Secondary Usage and Loss of Trust
- Complexity of Regulatory Compliance
- Transborder Data Flow
- Legal Uncertainty
In 2014, the International Organization for Standardization (ISO) adopted ISO/IEC 27018:2014, an extension of the ISO/IEC 27002 code of practice and the first international code of practice for cloud privacy. Based on EU data-protection laws, it gives specific guidance to cloud service providers (CSPs) acting as processors of personally identifiable information (PII) on assessing risks and implementing state-of-the-art controls for protecting PII [[iii]].
The new standard sets out best practices for public cloud service providers. It establishes security guidelines to protect personal data and provides a privacy compliance framework that addresses the fundamental obligations of a data processor under EU data protection laws. Any organisation that processes PII through a cloud computing service under a contractual arrangement can be certified under ISO 27018 – this means all types and sizes of organisations, including public and private companies, government entities and not-for-profit organisations, are eligible. To qualify for certification under ISO 27018, the applicant provider must agree to be audited by an accredited certification body and must also submit to periodic third-party reviews.
Although the standard covers many privacy issues, the lack of physical control by cloud users over data storage, together with the absence of standardised and mature techniques for monitoring how data is accessed, processed and used inside the cloud, makes it harder to verify a cloud's compliance with such privacy policies.
In addition to evaluating the cloud provider, Public Authorities should also assess their Smart City services to identify issues that may lead to infringing users' privacy. This applies mainly to applications that store personal information or handle payments. In the first case the application must comply with local laws on storing personal data, including any rules about the location of data centres, such as the EU Data Protection Directive [[iv]], while in the second it must comply with any rules on secure payments, such as the Payment Card Industry Data Security Standard (PCI DSS) [[v]].
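One part of such an assessment can be automated. As a minimal illustration, the data-centre location rule could be checked programmatically; the region codes and allow-list below are hypothetical examples, not values from the STORM CLOUDS project:

```python
# Illustrative sketch: verify that every region where a cloud provider
# stores personal data is on an EU allow-list, as a data-residency
# requirement might demand. Region names are invented for this example.

EU_ALLOWED_REGIONS = {"eu-west", "eu-central", "eu-north"}

def residency_compliant(provider_regions):
    """Return True only if all regions storing personal data are EU-based."""
    return all(region in EU_ALLOWED_REGIONS for region in provider_regions)

print(residency_compliant({"eu-west", "eu-central"}))  # True
print(residency_compliant({"eu-west", "us-east"}))     # False
```

A real compliance check would of course also cover backup locations and sub-processors, which a simple allow-list lookup cannot capture.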
However, there are many Smart City infrastructure management applications, such as those related to public transport, street lighting or road traffic management, that do not fall into either of the above categories; for these, data privacy is less of a concern.
Agencies planning to place personal information on a cloud service should perform a Privacy Impact Assessment (PIA) to verify that privacy requirements are adequately addressed.
The STORM CLOUDS approach
The STORM CLOUDS Smart City services have been evaluated with respect to privacy issues. The Public Authorities involved, in collaboration with the applications' developers, performed a Privacy Impact Assessment (PIA) to identify any privacy risks associated with the use of the services, together with the controls required to manage them effectively.
The privacy impact assessment questionnaire, which was used for each application, contained the following 14 questions [[vi]]:
Q1. Will the project involve the collection of new information about individuals?
Q2. Will the project change the way personal data that is particularly sensitive to individuals is handled? Examples include racial and ethnic origin, political opinions, religious beliefs, trade union membership, health conditions, sexual life, offences and court proceedings.
Q3. Will the project handle information about 'persons at risk'? The addresses and phone numbers of a small proportion of the population need to be suppressed, at least at particular times in their lives, because such persons may suffer physical harm if they are found.
Q4. Will the project compel individuals to provide information about themselves?
Q5. Will the project perform any data processing on the personal data of a large number of individuals? Examples include applications seeking to locate people or to build or enhance profiles of them.
Q6. Will information about individuals be disclosed to organisations or people who have not previously had routine access to the information, or other third parties that are not subject to comparable privacy regulation?
Q7. Are you using information about individuals for a purpose it is not currently used for, or in a way it is not currently used?
Q8. Will the project involve new or significantly changed consolidation, inter-linking, cross-referencing or matching of personal data from multiple sources?
Q9. Will the project significantly contribute to public safety? Applications dealing with critical infrastructure and the physical safety of the population usually have a substantial impact on privacy.
Q10. Does the project involve using new technology which might be perceived as being privacy intrusive? For example, does the project use biometrics, facial recognition, radio frequency identification (RFID) tags, locator technologies (including mobile phone location, applications of global positioning systems (GPS) and intelligent transportation systems), profiling, data mining, and logging of electronic traffic?
Q11. Does the project involve new identifiers, re-use of existing identifiers, or intrusive identification, identity authentication or identity management processes? For example, does the project use digital signatures, presentation of identity documents as part of the registration scheme, or intrusive identifiers such as biometrics?
Q12. Will the project result in you making decisions or taking action against individuals in ways which can have a significant impact on them?
Q13. Is the information about individuals of a kind particularly likely to raise privacy concerns or expectations? For example, health records, criminal records or other information that people would consider to be particularly private.
Q14. Will the project require you to contact individuals in ways which they may find intrusive?
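The 14 screening questions above lend themselves to a simple checklist structure. The sketch below assumes a common screening rule (any "yes" answer triggers a full assessment); the rule and all names are illustrative, not part of the STORM CLOUDS methodology:

```python
# Hypothetical sketch of the PIA screening questionnaire as a checklist.
# Question texts are abbreviated; the trigger rule (any affirmative
# answer requires a full PIA) is an assumption for illustration.

PIA_QUESTIONS = {
    "Q1": "Collects new information about individuals?",
    "Q2": "Changes the handling of sensitive personal data?",
    "Q4": "Compels individuals to provide information about themselves?",
    # ... remaining screening questions would be listed here
}

def full_pia_required(answers):
    """answers maps question IDs to booleans; any True triggers a full PIA."""
    return any(answers.values())

screening = {"Q1": True, "Q2": False, "Q4": False}
print(full_pia_required(screening))  # True
```

Recording the answers per application in this form also leaves an audit trail, which is useful when the assessment has to be revisited after a service changes.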
In order to drive consistent privacy practices during the development of new Smart City applications, the Public Authorities should define a privacy framework that establishes standard privacy features and practices. Because security is critical to privacy, aligning complementary privacy and security processes helps minimise vulnerabilities in software code, guards against data breaches, and helps ensure that developers factor privacy considerations into Smart City services.
[i] Conducting privacy impact assessments code of practice, 2014, UK Information Commissioner’s Office
[ii] Privacy Issues and Measurement in Cloud Computing: A Review. International Journal of Advanced Research in Computer Science, Volume 4, No. 4, March-April 2013.
[iv] European Commission, Protection of personal data, viewed June 2, 2016 <https://goo.gl/xwemTY>
[v] PCI Security Standards Council, viewed June 2, 2016, <https://www.pcisecuritystandards.org>
[vi] STORM CLOUDS Project, 2015, Deliverable 4.3: Privacy and security measures