1.4 Data Protection and Privacy Challenges Raised by the IoT Technologies, Applications, and Ecosystem

The inherent characteristics of IoT devices and services create a number of opportunities. Those same characteristics, however, also give rise to several challenges and risks [17]. The most important of these challenges are discussed below.

1.4.1 Insufficient Security Due to Heterogeneity

First of all, IoT devices are vulnerable to attacks, which could result not only in the disruption of the continuity and availability of a service but could potentially also cause material damage and physical harm. IoT devices may further enable unauthorized access to personal information, facilitate attacks on other systems, and even bring about general safety risks [56, p. 11]. One of the reasons for this vulnerability is that information security features are not embedded in the products by design but are configured later, once the desired functionality has been achieved. Moreover, a number of sensors do not allow for the establishment of encrypted links, and no automated update service is available. Another reason is the lack of harmonization, due to the development of IoT devices and services by disparate communities, and the lack of standardization, due to the use of proprietary standards and of standards applicable in different sectors. Therefore, security risk analysis, assessment, and mitigation are more difficult than in the case of coordinated ecosystems [1].

Security vulnerabilities and flaws have been reported by a number of consumer rights organizations in EU Member States concerning several interconnected products available on the market, findings which raise questions about the compliance of those products with the data protection requirements relating to security [10, p. 30].

1.4.2 Knowledge Asymmetries, Human Errors and Unregulated Access

App developers and device manufacturers are often unaware of the data protection requirements, even though this seems to be slowly changing, given that companies may be obliged to offer training to their employees and that more and more public debates on the matter are taking place. Given the large-scale processing involved, which can lead to extensive monitoring [57], the intrusive use of IoT devices, as well as unlawful surveillance, cannot be excluded.

Without proper training, additional risks may arise from human errors, insider threats, and internally or externally caused personal data breaches [9, p. 29].

1.4.3 Insufficient Transparency

The IoT landscape is characterized by the complexity of the relations among the many entities involved, which are far more numerous than in a traditional context. The vast range of actors can include hardware manufacturers, device manufacturers, device vendors, operating system and other software vendors, telecommunications and network providers, third-party app developers and vendors, cybersecurity experts, end users (individuals), including subscribers and owners, other third-party users, and people who are incidentally captured by the device or the service. Such a constellation of actors raises questions as to when a stakeholder acts as a data controller and when as a data processor.

The distinction is of high importance since the GDPR, even though it has introduced obligations for data processors too, considers the data controller the principal entity responsible for data processing operations, including engaging data processors who are GDPR-compliant. For instance, it is the data controller who has to provide the data subject with information about the processing of personal data, and it is the data controller whom the data subject will contact to revoke his or her consent, even if data processors are substantially involved in the processing of the data.

In addition, tracing data flows and functions may prove rather difficult, causing uncertainty as to who the owner of the generated data is and who the recipient of those data is. The individual appears to have little control over the dissemination and flow of the data, which could lead to excessive self-exposure. Moreover, in practice, there is limited possibility to use services anonymously [58].

1.4.4 Forced Consent Mechanisms, “Consent Fatigue” and Other Legal Grounds

Another concern is that users may be confronted with indirectly forced consent mechanisms in the context of the IoT, which could lead to low-quality or invalid consent. According to the GDPR,

consent of the data subject means any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.

The European Data Protection Supervisor suggests that consent is the legal basis that should:

be principally relied on in the IoT. In addition to the usual requirements (specific, informed, freely given and freely revocable), end users should be enabled to provide (or withdraw) granular consent: for all data collected by a specific thing; for specific data collected by any thing; and for a specific data processing. However, in practice it is difficult to obtain informed consent, because it is difficult to provide sufficient notice in the IoT.

[4]

The European Data Protection Supervisor even suggests that:

no one shall be denied any functionality of an IoT device (whether use of a device is remunerated or not) on grounds that he or she has not given his or her consent [...] for processing of any data that is not necessary for the functionality requested.

[4]

The European Data Protection Board specifically states that consent withdrawal must be as easy as giving consent, but this does not entail that the exact same procedure should be applied [48, p. 22]. In other words, when consent is obtained via electronic means, for example through a swipe or a mouse click, the withdrawal of consent should be equally easy and simple [48, p. 22]. Specifically, when consent is obtained through the interface of an IoT device, the data subject must be able to withdraw their consent via the same electronic interface and should not be required to use another interface solely for withdrawing their consent, as a different way would entail undue effort [48, p. 22]. Moreover, the withdrawal should not entail detriment for the data subject and must be possible free of charge and without resulting in a lowered service quality [48, p. 22].

It is also important to note that IoT devices constitute “terminal devices” under EU law, and thus any storage of information or access to information stored on an IoT device requires the end user's consent in line with the ePrivacy Directive [9, p. 14]. Legitimate interest is unlikely to constitute a valid legal ground for such processing, given the seriousness of the intrusion.

1.4.5 Exercising Data Subjects’ Rights

Another challenge is the heterogeneity of the data protection policies of the interconnected objects, which may become even more complex depending on the context where the objects are used and on the different applicable legal frameworks, even though some data protection and privacy considerations will be similar. Accessing one's data is also a challenge in the IoT context, in particular since the right of individuals extends not only to the displayed or requested data (e.g., registration data) but also to the raw data processed in the background. To these can be added the inferences drawn and the processing of data for secondary purposes, including the detection of behavior patterns and profiling, which could amount to surveillance.

1.4.6 Incidental Collection of Personal Data, Including Processing of Nonusers’ Data

The IoT environment is a multiuser context, where devices may have more than one user and data of nonusers may be collected in a way that can be perceived as nearly covert, for instance, when a sensor collects data of persons regularly visiting a building or when a device is used by different members of a family.

The quality of the consent can be also impacted by difficulties in providing information to individuals who are not the end-users of a device, for instance to individuals whose data get incidentally collected.

1.4.7 Mixed Datasets

In real-life applications of the IoT, datasets will likely be mixed, meaning that they will be composed of both personal and nonpersonal data. Nonpersonal data are data that are not “personal” as defined in the General Data Protection Regulation: on the one hand, data that never related to an identified or identifiable natural person, for example, data relating to the weather collected by building sensors; on the other hand, data that were originally personal but were later made anonymous, meaning that they can no longer be attributed to a particular individual.

The assessment of whether the data are properly anonymized and cannot be reidentified, not even with additional data, must be done on a case-by-case basis, by considering all means reasonably likely to be used by the controller or by another person to identify the individual. It is important to remind the reader that anonymized data are nonpersonal data, whereas pseudonymized data are personal data [59, p. 5]. Pseudonymization would be the case, for example, if personal data in a dataset were replaced by unique attributes and the actual personal data were kept separately from the assigned unique attributes in a secured database.
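To make this mechanism concrete, the following is a minimal Python sketch of the pseudonymization scheme just described. The record fields and the pseudonymize helper are hypothetical illustrations, and a real deployment would need to keep the mapping table in an encrypted, access-controlled store:

```python
import secrets

def pseudonymize(records, identifying_field):
    """Replace the identifying field in each record with a random unique attribute.

    Returns the pseudonymized records plus the token-to-identity mapping.
    The mapping must be stored separately and protected, because holding
    both together would allow straightforward re-identification.
    """
    mapping = {}        # token -> original identity; keep in a secured database
    pseudonymized = []
    for record in records:
        token = secrets.token_hex(16)       # random unique attribute
        mapping[token] = record[identifying_field]
        cleaned = dict(record)
        cleaned[identifying_field] = token  # real identifier removed from the dataset
        pseudonymized.append(cleaned)
    return pseudonymized, mapping

# Hypothetical sensor readings
readings = [
    {"resident": "Jane Doe", "room_temp_c": 21.5},
    {"resident": "John Roe", "room_temp_c": 19.0},
]
data, key_table = pseudonymize(readings, "resident")
# `data` remains personal data under the GDPR, because `key_table` allows
# re-identification; only its separate, secured storage distinguishes the
# dataset from plainly identified data.
```

Note that the sketch illustrates exactly why pseudonymized data stay personal data: as long as the key table exists anywhere, attribution to an individual remains possible.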

The assessment demands regular reviews, given the technological progress that can lead to the re-identification of previously anonymized data. The Commission in its Guidance gives as an example the quality control reports on production lines, which can relate to specific employees [59, pp. 6-7]. In some cases, even data relating to legal entities could be considered personal data, for instance, if the name of the legal entity corresponds to that of a living natural person [59, pp. 6-7].

If a dataset is composed of both personal and nonpersonal data, in other words, if it is a mixed dataset, then the following would apply [59, p. 9]:

  • the Free Flow of Non-Personal Data Regulation would apply to the nonpersonal data part of the dataset;
  • the General Data Protection Regulation would apply to the personal data part of the dataset;
  • “if the non-personal data part and the personal data parts are ‘inextricably linked’, the data protection rights and obligations stemming from the General Data Protection Regulation [would] fully apply to the whole mixed dataset, also when personal data represent only a small part of the dataset” [59, p. 9]. Neither of the two regulations defines the concept of “inextricably linked.” In practice, this can refer to a situation where separating the personal from the nonpersonal data would be impossible, not technically feasible, or economically inefficient [59, p. 10]. A schematic sketch of this decision rule follows the list.
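As a schematic illustration of how this allocation rule could be operationalized in a data pipeline, here is a short Python sketch. The field classifications, the applicable_regime helper, and the boolean “inextricably linked” flag are hypothetical simplifications for illustration, not a legal determination:

```python
# Hypothetical field classifications; in practice this categorization is
# itself the outcome of a case-by-case legal assessment.
PERSONAL_FIELDS = {"resident", "location", "heart_rate"}
NONPERSONAL_FIELDS = {"room_temp_c", "humidity"}

def applicable_regime(dataset_fields, inextricably_linked):
    """Map the parts of a (possibly mixed) dataset to the applicable regulation."""
    personal = dataset_fields & PERSONAL_FIELDS
    nonpersonal = dataset_fields & NONPERSONAL_FIELDS
    if personal and nonpersonal:
        if inextricably_linked:
            # GDPR applies to the whole mixed dataset [59, p. 9]
            return {"GDPR": dataset_fields}
        # Otherwise each regulation applies to its own part
        return {"GDPR": personal, "FFD Regulation": nonpersonal}
    if personal:
        return {"GDPR": personal}
    return {"FFD Regulation": nonpersonal}

print(applicable_regime({"resident", "room_temp_c"}, inextricably_linked=True))
# -> the GDPR governs both fields, since the parts cannot be separated
```

In reality, whether the parts are “inextricably linked” is an assessment rather than a flag, so the input to such a check would itself come from a case-by-case analysis of whether separation is possible, technically feasible, and economically efficient.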
1.4.8 Profiling and Discrimination

Wachter identifies three ways of profiling that could lead to discrimination in IoT systems:

a. data collection that leads to inferences about the person (e.g., Internet browsing behavior);

b. profiling at large through linking IoT datasets (sometimes called ‘sensor fusion’); and

c. profiling that occurs when data are shared with third parties that combine data with other datasets (e.g., employers, insurers) [20, p. 10].

IoT devices can only work if they collect data and make inferences, but this can be very intrusive into a person's private life [20, p. 11]. Even data which at first glance seem neutral, such as a postcode, can, when connected to other datasets, lead to discrimination based on ethnicity or gender [20, p. 12].

In the IoT ecosystem, a digital identity can be considered:

a type of profile, made up of all information describing the user that is accessible to a decision-maker, based on observations or prior knowledge (e.g., age, location), or inferences about the user (e.g., behaviours, preferences, predicted future actions).

[20, p. 7]

The way an individual perceives themselves might differ from the way external entities perceive them, especially given that those entities may only have at their disposal segments of an individual's profile [20, p. 7]. Moreover, individuals may not be fully aware of those external identities, may lack control over them, or may not be able to assess the validity of the inferences made about them [20, p. 7]. Those segmented profiles constitute virtual identities, which are rather contextual and may again interfere with a person's right to data protection as well as with a person's right to non-discrimination.

 