Note: BPC has partnered with the XR Association to set up an XR Initiative to study the policy implications of immersive technology. On June 23rd, the BPC initiative hosted a private convening with academics, industry representatives, civil society groups, and other stakeholders to better understand XR privacy and security issues. The following blog highlights research conducted in the lead-up to the convening, as well as the convening itself.
The debates around privacy and security will likely broaden as immersive technologies become more pervasive in civil society. Virtual reality headsets can track people's eye movement, while augmented reality glasses can record bystanders in various locations. The Bipartisan Policy Center brought together a diverse range of stakeholders representing industry, civil society, and other relevant organizations for a convening on XR privacy and security to preemptively discuss many of the policy debates likely to occur in the coming years. To encourage candor, the convening was held under the Chatham House Rule, so this post will not identify individual participants.
The initial phase of the convening focused on two hypothetical scenarios involving XR applications. The goal was to surface the potential privacy and security challenges each scenario might raise and to identify remedies and safeguards to prevent harm.
The first scenario concerned a VR app that helps identify and manage stress for health care and workplace purposes. The application relied on capturing certain biometric information, such as eye movement.
We first asked participants to identify the privacy and security challenges involved in this scenario. Many participants identified the collection, storage, and use of biometric data as a primary concern. Once stolen, biometric information is virtually impossible to change: a fingerprint cannot be reset the way a password can. This makes data breaches a major concern. Personal characteristics, such as health conditions, age, and gender, may also be inferred from biometric data, raising the prospect of discrimination against protected classes in areas such as hiring.
Compliance with existing health care regulations, such as the Health Insurance Portability and Accountability Act (HIPAA), was another concern; participants were uncertain when HIPAA's requirements would apply. Power asymmetries between workers and employers also worried participants. For instance, what rights do employees have to opt out of such a service, can employers access the data collected about employees, and what rights do employees have to review and delete that data?
Participants suggested several methods for addressing these challenges, including safeguarding data through encryption, anonymization, and auto-deletion, and giving users access to, and control over, their data along with audit trails. Policy-focused suggestions included enforcing compliance with existing privacy laws, restricting the use and sale of data, and informing users about the risks of using such devices.
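To make two of these safeguards concrete, the sketch below shows what pseudonymization and auto-deletion might look like in practice. The names, record structure, and 30-day retention window are illustrative assumptions, not drawn from any real XR product:

```python
import hashlib

RETENTION_SECONDS = 30 * 24 * 3600  # assumed 30-day retention policy

def pseudonymize(user_id: str, salt: str) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()

class BiometricStore:
    """Hypothetical store that keeps pseudonymized biometric readings
    and purges any record older than the retention window."""

    def __init__(self):
        self._records = {}  # pseudonym -> (timestamp, reading)

    def add(self, pseudonym: str, reading: dict, now: float):
        self._records[pseudonym] = (now, reading)

    def purge_expired(self, now: float):
        # Auto-deletion: drop records past the retention window.
        self._records = {
            k: (t, r) for k, (t, r) in self._records.items()
            if now - t < RETENTION_SECONDS
        }

    def __len__(self):
        return len(self._records)
```

Pseudonymization alone is not anonymization, since salted hashes can still link records over time; participants' suggestions of user access controls and audit trails would sit on top of a store like this.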
The second scenario involved AR glasses that help users navigate a city while recording their surroundings to improve the glasses' navigation capability. Here, participants' concerns focused primarily on bystander privacy, especially in sensitive locations: for instance, a user walking past a health clinic could record people entering and exiting the building. From location data to facial recognition, bystanders often unintentionally give up their data as they move through public spaces. Multiple participants discussed what constitutes meaningful consent, both in this scenario and more broadly. Again, participants questioned how data is collected and stored on such a device, especially data about children and non-consenting adults.
Participants also raised concerns about user privacy (such as a user's location), the recording of private property (for example, homes or small business storefronts), and data collection, storage, and usage. One participant asked whether property owners who object to such precise mapping technology would be able to shield their properties from it. Another noted that the patchwork of privacy standards governing the recording of sensitive information makes it difficult to know when and where a line has been crossed.
Participants shared several potential solutions to these challenges: de-identification technologies such as facial blurring, an indicator light to signal that AR glasses are recording, reduced location precision, opt-out options, geo-based “do not scan” zones, meaningful consent requirements, laws addressing sensitive recordings and data protection, and data deletion procedures.
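Two of these mitigations are simple enough to sketch: reducing location precision before data leaves the device, and checking recordings against geo-based “do not scan” zones. The zone coordinates and the two-decimal rounding policy below are assumptions for illustration, not any vendor's actual practice:

```python
# Hypothetical no-scan zones as (min_lat, min_lon, max_lat, max_lon).
NO_SCAN_ZONES = [
    (38.8895, -77.0501, 38.8905, -77.0491),
]

def coarsen_location(lat: float, lon: float, decimals: int = 2):
    """Round GPS coordinates so only a coarse (~1 km) area is reported."""
    return (round(lat, decimals), round(lon, decimals))

def in_no_scan_zone(lat: float, lon: float) -> bool:
    """Return True if a point falls inside any registered no-scan zone,
    signaling that recording should be suppressed there."""
    return any(
        min_lat <= lat <= max_lat and min_lon <= lon <= max_lon
        for min_lat, min_lon, max_lat, max_lon in NO_SCAN_ZONES
    )
```

A real deployment would need a governance question this sketch sidesteps: who maintains the zone registry, and how sensitive sites such as clinics get added to it.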
A roundtable discussion to identify shared concerns and policy priorities followed the scenario exercises. First, we focused on meaningful consent as it relates to the scenarios introduced above. Participants discussed the limitations of consent, including the typically lengthy user agreements that are accepted only once, often before users fully understand the consequences. One participant suggested an alternative, “privacy-by-default” approach that would require companies to protect user privacy first and foremost. An indicator light signaling that a pair of AR glasses is recording was raised as a way to address bystander privacy; however, one participant argued that a light alone would not suffice, since not everyone would see it or understand what it meant, and called for stronger bystander protections.
Another topic of discussion was data privacy. XR technology can collect vast amounts of data, putting user privacy and security at risk. Participants discussed why biometric data, other personally identifiable information (PII), and location data are particularly sensitive, and debated the merits of stronger protections for such data.
Security breaches are a major risk when these devices retain large amounts of sensitive data. One participant suggested encrypting the data to protect users from such breaches; another countered that encryption could limit the first party's ability to use the data. A third raised the prospect of heavy fines to discourage security and privacy breaches. Some participants shared frustration with the lack of coherence among data privacy and security laws: some laws and company policies require notice in the event of a breach, some clearly state how information will be used and deleted, while others offer little or no enforcement or protection.
To conclude the roundtable, we tried to identify areas of agreement among the participants. Though common themes ran through the event – privacy, consent, data protection, and data sharing – recommendations for addressing these issues diverged. Participants could not reach consensus on whether to build on existing privacy laws, as the group disagreed about whether those laws are themselves flawed. This prompted discussion of whether a review of existing laws would be appropriate.
Participants did reach consensus that a diversity of perspectives is needed to implement and deploy XR tools. A diverse and inclusive XR industry enables more people to contribute ideas, improve the technology, and help ensure safety for a broader population of users. One participant highlighted how risk factors for one set of users would not be the same for another, noting that someone using a virtual avatar to privately explore different gender identities may care far more about their avatar's privacy than the average user.
Technology changes regularly and brings new privacy and security concerns. The emergence of XR is no different, and it requires a forward-looking approach to inform policymakers. From our convening, we identified consensus on the need for a greater diversity of people and perspectives when designing, deploying, and making policy for XR tools and technology. Consensus was harder to find on other issues, as discussions about XR privacy and security are still in their infancy, with much more to be learned about the challenges and about the feasibility and implications of potential solutions. BPC hopes to help accelerate this process and encourages an evidence-based, bipartisan approach to identifying and addressing XR privacy and security concerns. Further research and debate are critical to ensuring policies reflect the complexity of the topic.