Children and Other Vulnerable Data Subjects
There is a market of IoT devices that clearly targets children, parents, or child-carers: from smart cameras and smart toys to wearables, tablets, and smartphones. From a very young age, children are exposed to a vast range of Internet-connected devices, which can, for example, keep them entertained or help them with learning [44]. Authors speak about the “datafication” and “quantification” of a child’s life, which may start even before they are born [44, p. 3] and can lead to their continuous “dataveillance” [45, p. 285]. Stories about malicious actors gaining access to baby monitors and Internet-connected dolls have made headlines in recent years [45, p. 286] and have led to bans on such products in some countries over concerns for children’s safety [21, p. 5]. A smart watch for kids offering localization features, for instance, may not directly pose risks to children’s safety, but it could be hijacked by malevolent or other third-party actors and used to track or contact the children [21, p. 5].
Within the international human rights framework, children6 are entitled to the protection of their private life, on top of Article 12 UDHR and Article 17 ICCPR, also explicitly in Article 16 of the United Nations Convention on the Rights of the Child (UNCRC). Those rights apply equally both offline and in the online, digital environment [45, p. 287]. At CoE level, children’s rights to privacy and data protection are considered part of the instruments we discussed earlier (Article 8 ECHR and Modernized Convention 108+). The same goes for the EU level (Articles 7 and 8 EUCFR). It is important to highlight that children’s rights are further protected in Article 24 EUCFR, under which children are entitled to the protection and care necessary for their well-being. Moreover, in all actions that concern them, the child’s best interests must be a primary consideration [45, pp. 287-288].
Even though the Data Protection Directive did not include specific provisions for the protection of children, already in 2009 the A29WP assessed the issues relating to children’s data protection in its Opinion [46]. The Working Party reiterated its positions later in relation to apps on smart devices and the IoT. Under the GDPR, special protection provisions were included to reduce online risks for children, in particular for those under 16 years old. In the GDPR, children are considered vulnerable data subjects [47], and their protection is addressed through the requirements for consent in Article 8, specific transparency obligations for data controllers in Article 12, and risk management with a particular focus on children in Recital 75. Recital 38 GDPR, read in the light of Recitals 58 and 65, provides that “children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data” [47].
Recital 38 further states that:
[s]uch specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child.
However, this is to be understood as meaning that the specific protection is not only related to marketing or profiling but refers to “a wider ‘collection of personal data with regard to children’ ” [48, p. 24]. According to Article 8(1) GDPR, where information society service providers7 rely on consent for the provision of such services directly to a child, the processing of the child’s personal data is only lawful if the child is at least 16 years old. Specifically, for minors under 16 years old, the GDPR requires the legal guardian’s consent when information society service providers process children’s personal data, aiming to address the unique needs of children as data subjects [49], even though the effectiveness of such a scheme has been disputed [50]. The consent must be given or authorized by the holder of parental responsibility, which can be more than one person [48, p. 24]. The age of digital consent can be lowered by individual Member States, but it may not be set below 13 years old.
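The Article 8 GDPR rule described above can be sketched as a simple decision function. This is an illustrative sketch only, not legal advice: the national thresholds used here are hypothetical placeholder values, and any real implementation would need to be checked against each Member State’s implementing legislation.

```python
# Illustrative sketch of the Article 8(1) GDPR age-of-digital-consent rule.
# The national thresholds below are hypothetical examples, NOT authoritative values.

GDPR_DEFAULT_AGE = 16   # Article 8(1) default age of digital consent
GDPR_MINIMUM_AGE = 13   # floor below which Member States may not lower it

# Hypothetical per-Member-State thresholds (placeholder values for illustration).
MEMBER_STATE_AGE = {
    "AA": 16,
    "BB": 15,
    "CC": 13,
}

def parental_consent_required(child_age: int, member_state: str) -> bool:
    """Return True if consent must be given or authorized by the holder
    of parental responsibility for this information society service."""
    threshold = MEMBER_STATE_AGE.get(member_state, GDPR_DEFAULT_AGE)
    if not (GDPR_MINIMUM_AGE <= threshold <= GDPR_DEFAULT_AGE):
        raise ValueError("national threshold must lie between 13 and 16")
    return child_age < threshold
```

A service relying on consent would call such a check before processing, falling back to the 16-year default where no national derogation applies.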
In the US, the Children’s Online Privacy Protection Act (COPPA) was put in place almost two decades before GDPR and also offers specific provisions for the processing of children’s data, in particular for minors under the age of 13 [45, p. 288]. Studies provide extensive comparative analyses of both legal instruments; in principle, some differences can be found in definitions and the application scope [49].
Complying with the principle of transparency proves to be rather challenging not only toward adult users, but foremost toward minor users [51, p. 17]. Providing meaningful information about the data processing in plain language, in a way that can be easily understood by children, requires effort, which could include both textual and visual information, fewer links to other webpages, and more concise texts [51, p. 17]. Empirical studies have demonstrated that data protection policies are often opaque, inadequate, and non-child-friendly [49]. Excessive collection of personal data from children and subsequent disclosure to third parties has also been observed [49]. This might entail that other, nonconventional ways of providing information should be adopted, including cartoons, pictograms, and animations, depending on the age and maturity of the child [52, p. 12].
Thus, all actors involved in the IoT ecosystem should “go the extra mile” to make sure that children’s rights are sufficiently protected. For IoT products directly targeting children, suggestions have been made for the inclusion of data protection policies on the packaging, addressed to both children and parents, including audio and visual notifications that inform children and parents of the data processing activities [45, p. 299]. Implementing the highest security standards, reconsidering the need to collect data from children and, further, who should have access to the collected data, as well as reshaping data storage solutions, should be of primary concern for actors involved in IoT targeting children, parents, and educators [45, p. 299]. Industry codes of conduct, tackling the particular issues in a self-regulatory manner, are also encouraged [45, p. 300].
Specifically, in relation to automated decision-making and profiling, such measures should not concern children [44, p. 5]. Nevertheless, the A29WP suggests that, even though no absolute prohibition exists, entities involved in processing children’s personal data should refrain from profiling children for marketing purposes.
Apart from children, vulnerability can be attributed to other data subjects as well, including adults. The key element of vulnerability appears to be a power imbalance between the data subject and the data controller. Thus, vulnerable data subjects can include children, as seen earlier, employees vis-à-vis their employer, and other individuals belonging to population groups which require special protection, for instance, “mentally ill persons, asylum seekers, or the elderly, patients, etc.” [47] Malgieri and Niklas argue, based on a layered analysis of vulnerability in the GDPR, that “everyone is potentially vulnerable, but at different levels and in different contexts.” In the IoT context, this would entail that IoT actors have to go two or more extra miles to take into consideration the particularities of each application. A way to identify and implement specific safeguards for vulnerable data subjects could be conducting data protection impact assessments and fully implementing the principle of data protection by design [47]. If the system is not designed to protect the most vulnerable, or if risks remain after the risk management process, those risks should be made known in a transparent manner.
Lastly, during the coronavirus outbreak, EU Member States considered standalone devices and wearables specifically for children and other vulnerable groups, which would constitute an alternative to smartphones by directly deploying the nationally adopted contact-tracing apps [53, p. 20]. The possibility of using domotics and other home-based solutions was also explored [53, p. 20].
To sum up, IoT actors will have to observe the relevant provisions of the forthcoming ePrivacy Regulation, as the latter will also have a significant impact on the protection of children and other vulnerable individuals [45, p. 290].
Inferences
Another issue, which is very complex in the IoT ecosystem, is that of inferred data. Personal data initially collected may not as such reveal sensitive aspects of someone’s private life. Nonetheless, those data, tracked over a period of time, combined with other data, and examined through advanced data analysis tools, may lead to inferences which could constitute personal data, even belonging to a special category, despite the fact that the initially collected data did not [54].
The A29WP, in its 2014 Opinion concerning the IoT, gives the example of applications which track data subjects’ movements in order to extract the number of daily steps and display information about an individual’s physical and mental condition [9, pp. 7-8]. Even though the user of the smart device may have been “comfortable” sharing the original information, they may not be comfortable with sharing the secondary (derived or inferred) information [9, pp. 7-8]. The concerned stakeholder should take into account the sensitivity of inferred data and ensure that all the purposes of the processing concerning the raw, the extracted, and the displayed data are known to the data subject, in line with the principles of transparency and purpose limitation [55].