
ICO Tech Horizons Report – what does it mean for the use of … – Lexology


Overview

The Information Commissioner’s Office (ICO) published its first ‘Tech Horizons Report’ (the Report) on 15 December 2022. In this article, we provide a summary of the Report and our thoughts on its implications for the insurance industry.

The Report is the first of a new series to be released annually in line with the ICO’s commitment to setting out views on emerging technologies, which was made as part of its ICO25 strategic plan. This first Report explores technological developments and their associated data privacy risks, whilst also providing some recommendations as to how organisations and businesses should address these risks. It makes for an interesting read for both developers and users of these technologies.

The Report identifies four challenges which the ICO perceives to be common across all emerging technologies:

  • Technologies are collecting personal data via methods that are not transparent to individuals and that they do not have “meaningful control” over.
  • Data ecosystems can be too complex for individuals to understand the processing being undertaken and to enable them to exercise their data subject rights.
  • Technologies may collect more data than required for their primary purpose.
  • Several technologies are collecting special category data which require additional protective measures.

The Report focuses on four applied technologies that are expected to have “novel and significant implications for privacy in the next two to five years”. Interestingly, these technologies were selected by the ICO from an initial list of 11 by using a foresight methodology to assess the “probability, scale, and associated harms and benefits in relation to privacy law”. The four technologies focused upon in the Report are:

  • Consumer Healthtech;
  • Next Generation Internet of Things (IoT);
  • Immersive Technology; and
  • Decentralised Finance.

In addition, the ICO will publish a deep dive report in Spring 2023 on a fifth subject: neurotechnology. The ICO has identified this as the “most significant” area, meriting its own report focusing on the processing of subconscious neurodata and the application of the UK GDPR to brain-computer interfaces.

Consumer Healthtech and IoT

Although the ICO has chosen to separate Consumer Healthtech and IoT into two distinct areas, they are of course very much connected by their ability to track and collect data on individuals, their behaviour and their environment. The Consumer Healthtech section of the Report focuses on wearable devices (including smartwatches, smart fabrics and digital applications) that collect physical and mental health data of the wearer (such as heart rate and sleep quality), whereas the IoT section concentrates on technologies that equip sensors within the home and office environments (such as smart video cameras, TVs and smart home ventilation systems).

Both of these types of technologies are of particular relevance to the insurance industry, as they have been utilised for the benefit of firms and customers in recent years. Such devices have been used for risk management; to improve the efficiency of the insurance claims process, for example through real time notifications in the event of an accident; and to increase the accuracy of risk assessments. Insurers are also able to utilise this data to offer services directly relevant to a customer’s particular needs.

For example, in motor insurance, telematics has been used for a number of years to collect data on driver behaviour and speed in order to calculate premiums. More recently, we have seen a rise in the use of ‘smart’ monitors as part of home insurance offerings to measure moisture levels and identify potential leaks early on.

Risks

These technologies often involve the collection of vast datasets. In some instances, it may not be clear that the information is “personal data”, particularly if used in commercial environments. However, firms should be aware that such information will fall within the definition of “personal data” if:

  • observed or inferred data is linked to a person; or
  • the information collected otherwise allows for the identification of a person.

The ICO is particularly concerned by the potential for these technologies to collect large volumes of special category data. Whilst health data is the most obvious category, certain technologies will collect biometric data (for example, voice recognition). The Report emphasises that this data is subject to Article 9 of the UK GDPR, which requires a separate condition for processing. Firms should keep in mind that the scope of the “insurance” processing condition, set out in Schedule 1 to the Data Protection Act 2018, does not extend to biometric data, and therefore an alternative condition should be identified.

Firms should therefore interrogate these technologies in order to understand when personal data and special category personal data is being collected so as to ensure that all relevant requirements are complied with. The ICO recommends the use of a Data Protection Impact Assessment (DPIA) as a helpful tool; we would expect initiatives using these technologies to meet the DPIA threshold.

The Report also references Article 5(1)(d) of the UK GDPR, which requires personal data to be accurate. This is particularly relevant to data collected by the insurance industry when it is used to make key decisions regarding risk assessments and premium pricing. The ICO notes concerns over the accuracy of wearables, and these concerns could equally apply to IoT devices collecting personal data for a similar purpose. The Report therefore recommends that organisations check that processing via such technology is accurate, fair and monitored for risks of systemic bias. Eliminating systemic bias from such devices also aligns with the ESG aims of the insurance industry, which seeks to ensure that insurance products are suitable for a diverse range of customers. The ICO suggests that systemic bias can arise as a result of insufficient ‘demographic diversity’ in product test data.

In addition, considering the range of information that can be collected through these various devices, the ICO has highlighted the need for organisations to be transparent with customers about what is being collected and for what purpose. When utilising these devices, the ICO advises a need to draft privacy notices which provide ‘clear, intelligible information on how and why’ an individual’s data is being processed and to communicate this to customers. The ICO notes that this is a particular challenge in the absence of a user interface but encourages organisations to ‘continue to explore approaches to transparency and data minimisation’.

Finally, in an environment of increased and more sophisticated cyber-attacks, the ICO also notes that IoT devices provide additional routes for cyber-criminals and encourages organisations to ‘prioritise efforts’ by implementing the requirements of the upcoming Product Security and Telecommunications Infrastructure Bill, as well as the European Telecommunications Standards Institute’s IoT security standard.

Immersive Technologies

Another of the four technologies identified by the ICO in its Report is “Immersive Technology”. This term is used by the ICO to capture the use of both Augmented Reality (AR) and Virtual Reality (VR). In simple terms, AR is used to enhance reality with images, text and other information, whereas VR allows users to become immersed in a 3D computer-generated reality. Users experience VR mainly through sophisticated headset devices, whereas AR is currently delivered by devices such as phones and tablets, as well as wearables such as ‘smart glasses’.

The Report explains that some of the main uses of AR and VR are in entertainment (such as its already prevalent use in gaming), but also notes that they can be used in the wellness field, particularly when combined with the Consumer Healthtech devices discussed previously, for example via virtual yoga or fitness classes.

The ICO also notes the use of Immersive Technology in the workplace. This is something that is already being utilised in the healthcare industry in respect of education and training as AR and VR provide a more realistic alternative to books and videos for medical students and healthcare professionals in order to simulate real-life work situations. For example, immersive technology can be used to replicate a surgery environment before a medical student is exposed to a patient in real-life.

Risks

Whilst not yet commonplace, the use of Immersive Technology in the insurance industry is growing, for example as a risk management tool for complex property insurance.

The main risks concerning the use of immersive technology revolve around the high volume of information that devices are able to collect (and often are required to collect) in order to create virtual environments. In order for a user to be fully immersed, these devices employ sensors which can capture user characteristics, location information, and the user’s voice, eye and facial movement, among much more. In line with one of the key themes identified by the ICO across these emerging technologies, this raises concerns about the collection of special category data which, as outlined previously, requires additional safeguards under the UK GDPR. Given the large amount of data being collected, it is also important to ensure that data is retained only for as long as necessary to achieve the purpose for which it was collected, in line with the storage limitation principle.

The ICO also considers transparency as a key challenge of immersive technology. Users of immersive technology devices need to be appropriately educated and aware of how the device collects personal data. The Report casts doubt on whether a standard 2D privacy policy is beneficial in an immersive 3D environment. In addition, delivery of privacy information to certain groups will be difficult because of the ability of devices to collect data on individuals who are not using them.

The ICO recommends exploring “technical and policy solutions” in respect of these challenges, but does not provide any specific solutions at this stage.

Decentralised Finance

The ICO examines the development of Decentralised Finance (DeFi) and the privacy challenges this brings. A particular area of focus is the use of Blockchain, and other distributed ledger technologies (DLTs), to provide access to financial services. These technologies are increasingly popular in the insurance sector. DeFi systems can be used for a variety of transactions and are often governed by Decentralised Autonomous Organisations (DAOs): entities with no central authority, made up of ‘token-holders’.

The Report highlights the transfer of data via ‘smart contracts’ on DLTs as an issue. Fundamentally, a smart contract is a set of coded rules or instructions which can be executed in response to a pre-defined and agreed input. These ‘smart contracts’ can include information such as the IP addresses of the contracting parties, their financial information as well as encrypted messages.
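The idea of a smart contract as a set of coded rules executed automatically on a pre-defined and agreed input can be sketched in simplified form. The example below is purely illustrative (a hypothetical parametric flight-delay payout; the class and parameter names are our own, not drawn from the Report), and real smart contracts run as code on a distributed ledger rather than as an ordinary program:

```python
# Illustrative sketch only: a parametric insurance "smart contract" expressed
# as coded rules that execute against a pre-agreed input (e.g. an oracle's
# reported flight delay). All names and figures here are hypothetical.

class DelayPayoutContract:
    """Pays a fixed sum if a reported flight delay exceeds a threshold."""

    def __init__(self, policyholder: str, threshold_minutes: int, payout: int):
        self.policyholder = policyholder      # on a real DLT, typically a wallet address
        self.threshold_minutes = threshold_minutes
        self.payout = payout
        self.settled = False

    def execute(self, reported_delay_minutes: int) -> int:
        """Apply the coded rule to the agreed input; return the amount paid."""
        if self.settled:
            return 0  # a ledger would normally prevent double settlement
        self.settled = True
        if reported_delay_minutes >= self.threshold_minutes:
            return self.payout
        return 0

contract = DelayPayoutContract("policyholder-001", threshold_minutes=120, payout=250)
print(contract.execute(180))  # delay exceeds the 120-minute threshold, prints 250
```

Note that even this toy version processes inputs (such as the policyholder identifier) that could constitute personal data once linked to an individual, which is precisely the concern the Report raises.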

Risks

The very nature of a decentralised network causes difficulty in identifying the parties responsible for personal data obligations. The ICO highlights that the unstructured form of a DAO can make it very difficult to determine the controller, processor or joint controllers of the personal data processed in the relevant transactions. Not only does this present a challenge for liability and regulation under data protection law, it also leaves data subjects unclear as to where, or to whom, to direct any requests to exercise their fundamental personal information rights.

Furthermore, on the topic of data subjects’ rights, the immutable nature of Blockchain conflicts with the rights to have data rectified and deleted under Articles 16 and 17 of the UK GDPR. This also presents obvious challenges to compliance with the storage limitation principle.

The Report additionally considers that, whilst the privacy of DeFi transactions is one of its key benefits, personal data is often pseudonymised rather than completely anonymised. Pseudonymisation merely replaces or removes information in a dataset that identifies an individual, which leaves open the possibility of re-identification of a data subject. The ICO recommends that organisations ensure that they are transparent with data subjects, so that they are aware of what personal data is being processed during transactions and are able to make informed choices. This is especially relevant when considering the risk of novel cyber-attacks on assets stored on DeFi platforms, with the Report noting that there have been several well-publicised breaches.
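The distinction between pseudonymisation and anonymisation can be illustrated with a short sketch. This is our own illustrative example, not taken from the Report: a direct identifier is replaced with a token, but anyone holding the key (or a token-to-name lookup) can still re-identify the individual, which is why pseudonymised data remains personal data under the UK GDPR:

```python
import hashlib

# Illustrative sketch only: pseudonymisation swaps a direct identifier for a
# token, but the route back to the individual can still exist, so the data
# is pseudonymised rather than anonymised.

def pseudonymise(name: str, secret_salt: str) -> str:
    """Replace a name with a short salted-hash token."""
    return hashlib.sha256((secret_salt + name).encode()).hexdigest()[:12]

record = {"name": "Jane Doe", "claim_value": 1200}
token = pseudonymise(record["name"], secret_salt="keep-this-secret")
pseudonymised_record = {"name_token": token, "claim_value": 1200}

# Whoever holds the salt, or a token-to-name lookup table, can re-identify:
lookup = {token: record["name"]}
print(lookup[pseudonymised_record["name_token"]])  # prints Jane Doe
```

The key point is that deleting the lookup table (and the salt) is what moves the data towards anonymisation; while either survives, re-identification remains possible.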

In an article published in November 2022, we noted many of the same data protection compliance concerns that come with the use of Blockchain, but also suggested that this technology could actually help facilitate compliance with the UK GDPR by encouraging transparency, assisting with data portability and promoting compliance with data subjects’ rights. The ICO similarly concludes that organisations are developing solutions to address concerns, such as the use of Privacy Enhancing Technologies (PETs) which keep personal data ‘off-chain’. The ICO however does not provide any specific solutions to the problems it identifies in the Report, but instead encourages organisations to continue to work to resolve these.
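The ‘off-chain’ pattern the ICO mentions can be sketched in simplified form. This is an illustrative toy model of the general idea, not an ICO-endorsed or production design: the immutable ledger records only a hash, while the personal data sits in a mutable store that can be deleted to honour an erasure request without altering the chain:

```python
import hashlib

# Illustrative sketch only: personal data is kept "off-chain" in a deletable
# store; the append-only chain records just a hash of it. Erasing the
# off-chain record honours a deletion request without rewriting the ledger.

chain = []          # append-only list, standing in for an immutable ledger
off_chain = {}      # mutable store holding the actual personal data

def record_transaction(tx_id: str, personal_data: str) -> None:
    """Store personal data off-chain; commit only its hash to the chain."""
    off_chain[tx_id] = personal_data
    chain.append((tx_id, hashlib.sha256(personal_data.encode()).hexdigest()))

def erase(tx_id: str) -> None:
    """Honour an erasure request by dropping the off-chain data only."""
    off_chain.pop(tx_id, None)  # the bare on-chain hash remains, data does not

record_transaction("tx-001", "Jane Doe, 12 High Street")
erase("tx-001")
print("tx-001" in off_chain)  # prints False; the chain entry is untouched
```

Whether a residual on-chain hash is sufficiently de-linked from the individual is itself a fact-specific data protection question, which is one reason the ICO encourages continued work on these solutions rather than prescribing one.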

What’s next?

The ICO’s deep dive on neurotechnology is set to be published in the first half of 2023. Following this, we expect the ICO to publish the next report in its Tech Horizons series at the end of the year. It will be fascinating to see how the ICO’s recommendations develop with further research and understanding of technological advancements and the data protection concerns they pose. We hope that the second report in this series is able to provide more expansive solutions to organisations.

In the meantime the ICO will be doing the following in preparation for regulating these emerging technologies:

  • Working with the public regarding the risks and benefits;
  • Inviting organisations to participate in its Regulatory Sandbox;
  • Developing guidance for organisations, initially on data protection and IoT; and
  • Proactively monitoring market developments.
