Data Protection Authorities in Europe have begun to turn their focus to consumer devices that collect and process neural data, as neurotechnologies continue to advance rapidly. Consumer products already on the market use headbands or helmets to detect brain signals and infer information about a person, such as whether they are interested in something, paying attention, or drowsy. These devices also appear in consumer wellness products, such as tools to help people meditate or relax. Technologies are also being developed to enable a person to control a device, such as a computer, with their mind. Neurotechnologies like these can be used in the medical, entertainment, marketing, wellness, safety, employee monitoring, and education industries, among many others.
The Spanish supervisory authority (AEPD) and the European Data Protection Supervisor (EDPS) recently released a joint report titled “TechDispatch on Neurodata” detailing neurotechnologies and the data protection challenges associated with processing neural data (the “Report”).
The Report follows recent increased attention to neural data in South America, Europe, and the United States. In 2021, Chile amended its constitution to protect neurorights. Last year, the UK Information Commissioner's Office (ICO) published "ICO Tech Futures: Neurotechnology," a report that delved into the field of neurotechnology and its implications for data protection and privacy. This year in the United States, Colorado became the first state to expressly cover neural data in its comprehensive privacy law, and similar laws have been under consideration in California and Minnesota. (Read our recent MoFo Minute about neural privacy.)
The Report highlights the approach European regulators might take to address the unique challenges posed by neural data under EU data protection law. These challenges stem from the sensitive nature of neural data, its potential to reveal unexpected intimate personal details, and the complex technical processes involved in its collection and analysis, all of which are being amplified by the power of artificial intelligence to infer information from neural data. The Report touches on many of the same issues that the ICO’s report addressed in more depth one year earlier. Below we highlight the key takeaways from the Report.
Key Insights
- Neural data is, at least in some cases, personal data. In some use cases, users of neurotechnologies identify themselves, so that data collected from them is associated with them. Even where that is not the case, the Report cites research studies and other authorities supporting the view that neural data can be used to uniquely identify an individual, somewhat like a fingerprint. The Report assumes for its analysis that neural data is personal data.
- Neural data is, at least in some cases, a "special category of data." The Report notes that neural data often constitutes biometric data that can be used to uniquely identify a natural person, or data concerning health, both of which are "special categories of data" under the EU GDPR. The Report adds that neural data may provide "deep insights into people's brain activity and reveal the most intimate personal thoughts and feelings, including those that do not translate into actions and consequently cannot be measured or inferred by data collected through other technologies." Neural data that qualifies as a special category of data is subject to heightened requirements under the GDPR.
- Proportionality and data minimization are especially important due to the intrusiveness of neural data. Because the brain is continuously active and involved in many types of functioning, a massive amount of data can be collected from it, and additional data can be inferred from that data. As a result, neurotechnologies are "very intrusive, if not the most intrusive processing, encroaching upon the very mental privacy and possibly mental integrity of the person concerned." The Report accordingly advises that businesses considering processing neural data should analyze whether the purpose sought justifies the sensitive and invasive nature of the processing.
- "Brain fingerprinting" should be limited to healthcare purposes. The Report notes that the European Data Protection Supervisor considers that "brain fingerprinting," the detection of the existence of specific information in the brain, should occur only for healthcare purposes and should be accompanied by all data protection conditions and safeguards.
- Neural data may be subject to accuracy limitations. The principle of accuracy under the GDPR requires that personal data be kept up to date and that reasonable steps be taken to ensure that inaccurate personal data is rectified or deleted. However, because the brain changes over time ("brain plasticity"), and because of the possibility of errors when collecting neural data or false inferences drawn from it, the accuracy of neural data and the inferences based on it may be limited. The EDPS and AEPD recommend that companies keep these intrinsic accuracy limitations of neural data processing in mind to prevent misuse.
- Transparency may be difficult to achieve due to the nature of neural data and how it is collected. Under the GDPR, data controllers are required to disclose to data subjects how neural data is processed and the potential implications of that processing. Due to the intrinsic and involuntary nature of neural data, it may reveal more about a person than data subjects expect; even after reading a privacy notice, they may not fully understand the potential implications of neural data processing. The Report warns that companies should consider this challenge, especially when it comes to minors, for example, in the education and entertainment sectors.
- Neurotechnologies must be assessed for fairness. The principle of fairness under the GDPR requires that companies handle personal data in ways that people would reasonably expect and not process it in ways that would have unjustified adverse effects on individuals. The Report notes the risk that discrimination in the development and use of neurotechnologies could lead to biased or unfair outcomes, a risk that can be exacerbated when AI is involved in the processing of neural data. Neurotechnologies must therefore be carefully assessed and managed to reduce the risk of discrimination, especially in the education and healthcare sectors.
The processing of neural data presents both opportunities and significant data protection challenges that require careful consideration. Businesses in the neurotechnology sector should closely monitor these regulatory developments and conduct thorough assessments when considering developing neurotechnologies or processing neural data to ensure compliance with data protection laws.