New Products Collect Data From Your Brain. Where Does It Go?
There are products on the market that monitor your brain waves through caps or headbands: Some aim to improve mental health, sleep, or focus, while others can plunge users into virtual reality for gaming.
What happens to the neural data that neurotechnology companies collect from these devices? Consumers may be accustomed to their personal data from apps and social media being sold to third parties. However, the potential sale of brain data to a third party raises additional privacy concerns.
There are no federal laws governing the data collected by these wearable devices. But Colorado recently became the first state in the country to pass legislation protecting neural data in consumer products.
Guest host Arielle Duhaime-Ross talks with Jared Genser, general counsel and co-founder of The Neurorights Foundation, about the current landscape of neuroprivacy.
Jared Genser is General Counsel and Co-Founder of The Neurorights Foundation in Washington, DC.
ARIELLE DUHAIME-ROSS: This is Science Friday. I’m Arielle Duhaime-Ross. Maybe you’re getting really into meditation or good sleep, and you want to make sure you’re getting the most out of your new practice. So you google around a bit, and you find that you can purchase a headband or a cap that monitors your brainwaves for this exact purpose. Interesting, right?
But you’re pretty tech savvy, so you also think to yourself, wait. Where will all my brainwave data get stored, and how will it be used? Good question. It turns out that there are no federal laws governing the data collected by these wearable devices, but Colorado recently became the first state in the country to pass legislation protecting neural data in consumer products.
Joining me now to talk more about the current landscape of neuroprivacy is my next guest, Jared Genser, general counsel and cofounder of The Neurorights Foundation. He’s based in Washington, DC. Jared, welcome to Science Friday.
JARED GENSER: Thanks for having me.
ARIELLE DUHAIME-ROSS: Can you give me an overview of the types of consumer products currently on the market that collect neural data?
JARED GENSER: Sure. There are about 30 different products you can buy online today. They really fall into three different categories. The first is wellness, the kind of products you were describing. The second is entertainment. For example, you can actually fly a helicopter drone using your thoughts. And the third kind of device is really for do-it-yourselfers who want to dig into doing brain scans and develop software for processing the data that comes from those devices.
ARIELLE DUHAIME-ROSS: Wow. OK, so there’s a wide range, and your organization, The Neurorights Foundation, recently released a report which assessed the privacy of these consumer neurotechnology products already on the market. What did you find?
JARED GENSER: Well, unfortunately, what we found– and this was not really all that surprising to us, because this is an emerging market right now– is that there’s an enormous gap between the requirements of global privacy standards and the kinds of privacy rights that users are given in using these products today. And this was incredibly worrying to us, for the very questions you were raising in your introduction: users have every good reason to wonder, what is going to happen with this data?
It’s important to emphasize that these consumer products are using medical-grade brain-scanning devices, and each of them is downloading gigabytes to terabytes of data from each of its users individually to provide the services that they ultimately provide.
ARIELLE DUHAIME-ROSS: Well, I can’t help but notice that you just used the term medical grade. So does HIPAA, the privacy law protecting health data, cover these consumer products?
JARED GENSER: So, interestingly, it doesn’t, and this is, of course, why we’re especially concerned. Implantable brain-computer interfaces, neurotechnology devices that have to go, for example, through neurosurgery inside the skull, are heavily regulated, not surprisingly, as medical products, and all the data that come from them are protected by HIPAA as health data. But when the very same kind of device– let’s say an EEG scanner– is used in a consumer context, even though it provides medical-grade data, it doesn’t have to be licensed by the FDA, and the data it gathers is not protected under HIPAA because it is not health data gathered in a medical context.
ARIELLE DUHAIME-ROSS: OK, so if these were licensed as medical devices, companies would be required to protect your information, right?
JARED GENSER: Absolutely.
ARIELLE DUHAIME-ROSS: So what happens if I use one of these products and then a third party buys my neural data? What can these companies learn from it? How worried should I be?
JARED GENSER: Well, I think we’re really, really concerned about this, because the brain itself generates all of our mental and cognitive activity. Neural circuits in our brains create our thoughts, our emotions, our memories. They guide decision making. They also help form our personalities, identities, and senses of self. And that means that neural data, which is information concerning the activity of our central or peripheral nervous systems that can be gathered by these devices, can reveal deeply intimate information, including, even today, potential brain diseases that you may currently have but may not be aware of. But ultimately these devices are going to be able to decode mental states, emotions, health, and–
ARIELLE DUHAIME-ROSS: Brainwaves.
JARED GENSER: –information about our neural processing.
ARIELLE DUHAIME-ROSS: OK, so it sounds like you’re talking about both current applications and also future applications, and I just want to clarify here. Could a company figure out what I was thinking when the data was collected?
JARED GENSER: So the answer is there are researchers who could do that right now, today, with the kind of devices that are available in the consumer market. There’s a study published last year in Australia in which researchers, using an EEG scanner– which is what most of these devices on the consumer side currently are– combined with generative AI, were able to decode thought-to-text translation with about 40% to 60% accuracy with a wearable device.
Now that, of course, is not an immediate concern. It will probably take a few more years until that accuracy rate can get to 90% plus. But with implantable devices, there are already experiments that have been done, including one that was on the front page of the New York Times last year, about a woman who had thought-to-text translation at 80 words a minute with a very high level of accuracy. So that is not yet possible with wearable devices, but undoubtedly, with a year or two or more of work and effort, the thoughts you had while you were wearing these devices may well be able to be decoded after the fact once such wearable devices are brought to market.
ARIELLE DUHAIME-ROSS: So I’ve actually worn, very briefly, a cap of this nature, and I was able to write with my mind on a computer screen, which is great for accessibility reasons, right? But I had to be thinking about writing in that moment. So I hate to harp on this, but I do want to clarify. Can this neural data currently be used to figure out what I’m thinking if I’m not thinking about writing something on a screen?
JARED GENSER: So, I mean, the answer is yes, but only by these researchers in Australia, as far as we know. The way the researchers in Australia are undertaking their research does not require the cooperation of the subject in order to decode the person’s thoughts to text, if the person is thinking in their own mind in the form of words.
ARIELLE DUHAIME-ROSS: So we’re talking about a rather scary scenario. Are there positives to this kind of technology being more readily available to consumers?
JARED GENSER: I mean, there are really off-the-charts positives, and I think that any concern about the risks of misuse and abuse definitely needs to be weighed at the same time against the extraordinary benefits. We don’t yet fully understand how the brain works. If you were to ask a neuroscientist, “What is a thought, and how does it happen in your mind?” they don’t actually have an answer to that question today.
But one in three people in the world will have some neurological disease over the course of their lifetime, and these devices and the extraordinary research being done with them have the potential not only to diagnose and begin to cure brain diseases but ultimately to reverse their effects. Right now, no brain diseases are actually curable. At the very best, all we can do is slow their progression. But the expectation is that in the next 5, 10, or 20 years we’re going to be able to start to cure specific brain diseases and even begin to reverse their effects, and the impact on humanity will really be profound.
ARIELLE DUHAIME-ROSS: And so with this great potential outcome, this is also something where we need to start thinking, OK, what are the rules around this? So a bill was recently passed in Colorado that was aimed at protecting neural privacy. Your organization lobbied for it. What protections were included?
JARED GENSER: So there are 15 states in the United States that have consumer privacy laws, and all of them really unintentionally exclude protections for neural data. That’s because they protect data such as biometric data, genetic data, or other kinds of biological data, but neural data is actually electrical. As a matter of medicine, it doesn’t meet the definition of being biological, and so it’s excluded from all of them.
And so what we did in Colorado is extend the protections that are provided today to biometric and genetic data, for example, to neural data as well. That’s a big step forward, but as you noted earlier, there is no US law, no federal law, that protects neural data in any way. There are only two categories of consumer data that are protected at the federal level. One is health data, covered through HIPAA and health privacy laws, and the second is financial data about your bank accounts and financial information.
And, of course, it’s our view that if we can protect, for example, how much money you have in your bank account from disclosure, then surely consumer product companies should be prevented from disclosing medical-grade neural data.
ARIELLE DUHAIME-ROSS: It came to my attention while I was researching this that a policy network named TechNet that represents Apple, Meta, and OpenAI lobbied against this bill. What stipulations did they manage to get eliminated from the bill?
JARED GENSER: Well, the only change that they made– which, to me, was not a material one– was to require that, for information to be protected, it had to be information that would be personally identifiable back to the user. And the reason I’m not especially concerned about that change is that when the full brain scan is taken, it has to be connected back to the user in real time in order for the device to be able to properly decode the information and provide it back to the user. If it wasn’t personally identifiable, they wouldn’t be able to do that.
But I actually think that these industry lobby groups are probably not speaking for their members as much as they might say they are, because I can’t imagine that companies like Meta, Apple, and Snap, which are all developing consumer neurotechnology products right now as we speak, would ever make an argument to the public at large that the gigabytes to terabytes of neural data collected by their products shouldn’t be subject to any protections of any kind, which, if you oppose the bill in Colorado, is the position you would be taking.
Already today, all these companies have to protect much more rudimentary user data, including credit-card information, for example. So the idea that they would ever argue that they shouldn’t have to protect neural data strikes me as a position they’re never actually going to take.
ARIELLE DUHAIME-ROSS: All right, so it’s your view that, while these companies might currently be working on a product like this, what happened with TechNet probably isn’t the goal here. They’re probably not intending to sell any neural data that they might collect in the future.
JARED GENSER: Well, I think they might want to monetize it and sell it, but they would only do that with, obviously, the informed consent of the users themselves. One illustration: it was reported several months ago that Apple is developing, and in fact filed a patent application for, the next generation of AirPods that would include EEG scanners built into each of the AirPods. That could be very, very beneficial to tie to their Apple Watch for health-related purposes, but it could, frankly, do other things as well as the technology advances, including, for example, doing thought-to-text dictation on your iPhone.
So when you apply for a patent, you don’t have to explain exactly how you’re going to use a specific device, so it’s not clear yet what Apple intends to do. But I would imagine that if Apple is going to advance that kind of product to market, that kind of data could be used for many, many other purposes, and they’d love to use it that way. But they would have to disclose to users up front exactly what purposes the data might be used for and, if they were going to transfer that data to third parties, what that would look like, to what third parties they would transfer it, and for what purpose.
ARIELLE DUHAIME-ROSS: All right, thought to text with AirPods, that certainly sounds interesting. Do you think this bill in Colorado went far enough to protect consumer data?
JARED GENSER: I think it went as far as it could go in terms of the way these kinds of privacy laws exist in 15 states. We’re ultimately moving bills forward in California and Minnesota as well, but there really needs to be protection at the federal level for this kind of data, most especially because, again, we’re talking about medical-grade data, and about gigabytes to terabytes of data for each user of these devices.
And while today what can be learned from this data is more limited, undoubtedly that’s going to increase substantially over time. Today we can usually decode only about 1% to 2% at most of the data in a scan, but ultimately you’re going to be able to decode people’s dreams and even people’s subconscious thoughts. These are things that are coming not in the immediate term but in the 5-, 10-, or 15-year time frame.
ARIELLE DUHAIME-ROSS: So getting ahead of it seems important. What about on a global scale? Is this something that we could create global rules for?
JARED GENSER: We are working at the United Nations and with other regional organizations around the world on these issues. The European data protection authority is looking at the application of neurotechnology within the context of their own privacy framework. And so we absolutely do need to update global standards.
We’ve been advocating for what we describe as five neurorights: protections that we believe users of these devices should have around the world in order to ensure that they are properly protected. And we’re specifically talking about the need for a right of mental privacy, a right of mental agency, a right of mental identity, a right to nondiscrimination in the development and application of these devices, and ultimately also a right to fair access to mental augmentation.
There’s already a study published last year at Boston University, for example, studying senior citizens and their memory, and they found that, with externally applied electrical pulses to different parts of the skull, you could improve short-term memory by 30% in remembering a list of 20 specific items over time. And one could imagine, for example, how transformative it would be if you had a device on the market for consumers that enabled the improvement of short- or long-term memory, and what a big impact that would have in terms of both advancing humanity and potentially creating even larger divides between haves and have-nots in our societies.
ARIELLE DUHAIME-ROSS: Right. So with the rapid pace of this technology, we often see legislation lag behind and struggle to catch up. In the case of neurorights broadly, do you think you and others in the field are getting ahead of this issue?
JARED GENSER: Well, I think we’re definitely making progress. What we’re really saying is that we don’t necessarily need new human rights, but really we need to bring human rights into the 21st century. For example, there’s a right to privacy under international law, but the way that it’s currently interpreted is a right to be free from arbitrary interference with your privacy, and it may not, for example, protect against the passive monitoring or decoding of your brain data. And so we really would need to include within that existing right of privacy a right of mental privacy.
And I do think that we are moving things forward in terms of an understanding of the urgent need to address these issues. I think it’s notable, for example, that in Colorado, the bill to extend their state privacy law to cover neural data was adopted on a vote of 61 to 1 in the Colorado House and unanimously in the Colorado Senate. This kind of unanimity demonstrates that when ordinary people are asked whether it’s reasonable for consumer product companies to download and do whatever they want with your medical-grade neural data, virtually nobody thinks that’s a good idea, and that this is something that needs to be addressed.
So I think we are making progress. There’s still a very, very long way to go. Colorado is actually the only jurisdiction in the world right now that explicitly and unequivocally defines and protects neural data as a category of user data that needs protection.
ARIELLE DUHAIME-ROSS: Wow. All right. Yeah, I mean, I think it’s telling that, regardless of your political affiliation, it would be very clear and easy to grasp why this matters.
I’ve got to ask, though. You’ve worked as an international human-rights lawyer for over two decades, and as part of that work, you helped free a lot of high-profile political prisoners. What got you interested in working on neurorights, and does the work feel connected to you?
JARED GENSER: Well, it feels very connected to me. Three years ago or so, I had a conversation with an extraordinary neurobiologist and neurotech expert at Columbia University, a gentleman named Dr. Rafael Yuste. And in an hour-long conversation with him about the work he was doing, I realized that my own views as to the human-rights challenges we might face in the coming decades were really quite myopic and missed this enormous area of both exciting and worrying development.
And he and I decided to then co-found The Neurorights Foundation together and to engage in this work. He doesn’t have the expertise on human rights that I do, and I don’t have the medical and scientific expertise that he has. But I think that talking about these issues in both of those ways at the same time is really, really exceptionally important, so that people understand not only the science as a starting point but also both the extraordinary developments that are coming and the risks of misuse and abuse that need to be addressed.
ARIELLE DUHAIME-ROSS: Thank you so much for taking the time to speak with me.
JARED GENSER: My pleasure. Thanks for having me.
ARIELLE DUHAIME-ROSS: Jared Genser is general counsel and cofounder of The Neurorights Foundation. He’s based in Washington, DC.
Copyright © 2024 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/
Shoshannah Buxbaum is a producer for Science Friday. She’s particularly drawn to stories about health, psychology, and the environment. She’s a proud New Jersey native and will happily share her opinions on why the state is deserving of a little more love.
Arielle Duhaime-Ross is a freelance science journalist, artist, and podcast and TV host based in Portland, OR.