FaceApp's Privacy Concerns Highlight AI and AR/VR Issues |
NEWS |
The Artificial Intelligence (AI) selfie app FaceApp, which allows users to edit their selfies and see themselves looking younger, older, or with different hair styles and facial features, was launched in 2017 by a Russian startup called Wireless Lab. Recently, the app captured users’ attention and became the top free app in Apple’s App Store. However, a number of privacy concerns arose regarding users’ data and the company’s privacy policy, which states that the app can access a user’s location, IP address, and log file information for targeted advertising. The app also stores selected and edited photos in the cloud rather than locally on the device. Another concern, which ultimately faded away, was that the app leverages users’ photos to train facial recognition algorithms, which is to be expected. FaceApp is not unique in needing to address privacy issues, but it is a good opportunity for both users and businesses to pay more attention and become better informed about the parties that handle their data and how that data is acquired, stored, and used. Beyond FaceApp, the growth of Augmented Reality (AR) consumer apps that leverage users’ locations or likenesses to enable virtual try-on has also raised questions about users’ data protection.
What Are Some of the Privacy and Security Topics Specific to AR and VR? |
IMPACT |
Augmented Reality (AR) and Virtual Reality (VR) devices and apps collect significant amounts of biometric data, such as facial features and expressions, retina or iris scans, voiceprints, and hand movements, in order to overlay virtual content and enable more immersive and interactive experiences. Moreover, AR/VR devices may record conversations via microphones or capture information about where users are looking and their surrounding environments to overlay virtual objects and provide location-based information. In addition to its location, an AR mobile app may collect data about the mobile device and its operating system, the type of mobile Internet browser it uses, and other information about how users use the application itself. Consequently, AR and VR apps have the potential to capture more sensitive and confidential personal information than standard applications, which is used for product optimization, user experience enhancement, and algorithm training. This data can also be shared with third-party companies, especially for targeted advertising. All of this raises numerous questions about the companies that receive and manage the data, how long they store it, and the third parties with which it may be shared.
These additional biometric and visual data types offer valuable insight into user behavior, but they also present another opportunity for attack and exploitation. For instance, in the case of FaceApp and many other AR applications, captured images are sent to servers in order to offload processing from the device and optimize the output against a cloud dataset. However, this opens up the possibility of observing network traffic between the device and the provider’s AR server and extracting information about the user from the "raw" images that are transferred (for example, information about facial features or the surrounding environment). In addition, attackers may trick users by placing AR markers/QR codes that lead to malicious content, usually in public places.
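As a concrete illustration of reducing what such traffic can expose, the sketch below shows one way an AR client could re-encode a photo without EXIF metadata (GPS tags, camera details, timestamps) before uploading it over a TLS-verified connection. This is a minimal sketch under stated assumptions: the endpoint URL, field names, and helper functions are hypothetical, not any vendor's actual API.

```python
"""
Hypothetical sketch: strip metadata from a photo before it is uploaded to an
AR provider's cloud endpoint over TLS. The endpoint URL, field names, and
helper functions are illustrative assumptions, not a real API.
"""
import io

import requests
from PIL import Image

AR_UPLOAD_URL = "https://ar-provider.example.com/v1/enhance"  # hypothetical endpoint


def strip_metadata(path: str) -> bytes:
    """Re-encode the image with pixel data only, dropping EXIF (GPS, camera, timestamps)."""
    with Image.open(path) as img:
        rgb = img.convert("RGB")             # normalize mode for JPEG output
        clean = Image.new("RGB", rgb.size)
        clean.putdata(list(rgb.getdata()))   # copy pixels only, no metadata
        buf = io.BytesIO()
        clean.save(buf, format="JPEG", quality=90)
        return buf.getvalue()


def upload_selfie(path: str) -> requests.Response:
    payload = strip_metadata(path)
    # requests verifies TLS certificates by default (verify=True);
    # that check should never be disabled for endpoints receiving user images.
    return requests.post(
        AR_UPLOAD_URL,
        files={"image": ("selfie.jpg", payload, "image/jpeg")},
        timeout=10,
    )


if __name__ == "__main__":
    print(upload_selfie("selfie.jpg").status_code)
```

Even with transport encryption, the provider still receives the full image, which is why the data-handling questions above remain relevant.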
How to define ownership of trademarks across VR environments and the physical world, and how to reconcile virtual and real-world Intellectual Property (IP) rights, is another crucial legal issue that needs to be addressed. Businesses need to be proactive in protecting their trademarks and copyrights in virtual and augmented reality environments. At the same time, developers need to be wary of content ownership, both of their own content and that of third parties for items like virtual logos/trademarks, likenesses, and locations, and of how that content is used.
Robust Security Systems and Privacy Protocols Can Ensure User Data Safety in AR and VR |
RECOMMENDATIONS |
The upcoming mass adoption of AR and VR solutions in both consumer and enterprise areas is increasing the need for more robust and modern security systems that can handle new avenues of attack and novel data capture methods. This, in turn, requires clear and detailed privacy, data, and trademark protection regulations. According to ABI Research’s latest Augmented and Mixed Reality Devices and Enterprise Verticals (MD-ARMR-102) Market Data, internal IT spending will reach almost US$6 billion and system integration revenues will hit almost US$8 billion by 2023, showcasing the scale of spending on in-house and third-party security.
By adopting software security systems and security protocols, companies can minimize the risk of an AR attack and protect users’ data. For instance, AR browsers could inform users about the origins of AR content before displaying it (such as the developer’s name or information about the AR channel) so that users can judge whether it is reliable. An enhanced user interface in AR browsers is also valuable, as it can warn users about suspicious content in advance. As for images offloaded to AR servers, security protocols are essential to support image matching without exposing irrelevant information in the images transferred from the device to the AR server, thereby preventing the leakage of sensitive information.
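One way to read the "matching without exposing the raw image" idea is for the device to compute a compact fingerprint locally and send only that to the AR server for lookup. The sketch below uses a simple average hash purely for illustration; the function names, catalogue format, and threshold are assumptions, and a production system would use stronger descriptors and dedicated protocols.

```python
"""
Hypothetical sketch: the device computes a perceptual hash locally and sends
only that fingerprint to the AR server, which compares it against a catalogue
of known markers. Names, formats, and the threshold are illustrative.
"""
from __future__ import annotations

from PIL import Image


def average_hash(path: str, hash_size: int = 8) -> int:
    """Downscale to a hash_size x hash_size grayscale image and threshold on the mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")


# Server side (illustrative): match the fingerprint without ever seeing pixels.
def matches_known_marker(client_hash: int, catalogue: dict[str, int], threshold: int = 10) -> str | None:
    for name, anchor_hash in catalogue.items():
        if hamming_distance(client_hash, anchor_hash) <= threshold:
            return name
    return None


if __name__ == "__main__":
    catalogue = {"store_front_marker": average_hash("marker_reference.jpg")}
    print(matches_known_marker(average_hash("camera_frame.jpg"), catalogue))
```

The design point is that the server only ever handles coarse fingerprints, so intercepting the exchange reveals far less about the user's face or surroundings than intercepting the raw image would.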
In the coming years, the majority of AR/VR devices will include eye tracking systems and more advanced input options (like gesture control and AI voice assistants) that will gather even more valuable data and therefore increase the possibility of cyberattacks and data misuse. Companies need to adopt and frequently update robust security systems that protect users’ data and minimize the risk of attack. Privacy and data policies should also clarify which authorized third parties can access data, how data is acquired and how it will be used, and which steps users can take to manage data collection.
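To make the "steps users can take" point concrete, the sketch below shows one hypothetical way an app could gate every collection category behind explicit, user-editable consent settings, defaulting everything to off. The category names, defaults, and storage location are illustrative assumptions rather than any specific vendor's implementation.

```python
"""
Hypothetical sketch: per-category consent settings that an AR/VR app checks
before collecting any data. Categories, defaults, and the storage path are
illustrative assumptions.
"""
from __future__ import annotations

import json
from dataclasses import asdict, dataclass
from pathlib import Path

SETTINGS_PATH = Path("consent_settings.json")  # illustrative on-device location


@dataclass
class ConsentSettings:
    face_data: bool = False            # facial features and expressions
    eye_tracking: bool = False         # gaze, iris, or retina data
    microphone: bool = False           # voice and ambient audio
    location: bool = False             # GPS or coarse location
    third_party_sharing: bool = False  # sharing with advertising partners

    def save(self) -> None:
        SETTINGS_PATH.write_text(json.dumps(asdict(self), indent=2))

    @classmethod
    def load(cls) -> "ConsentSettings":
        if SETTINGS_PATH.exists():
            return cls(**json.loads(SETTINGS_PATH.read_text()))
        return cls()  # default: nothing is collected until the user opts in


def may_collect(category: str, settings: ConsentSettings) -> bool:
    """Return True only if the user has explicitly opted in to this category."""
    return bool(getattr(settings, category, False))


if __name__ == "__main__":
    settings = ConsentSettings.load()
    settings.location = True  # user opts in to location-based AR features
    settings.save()
    print("May collect eye tracking data:", may_collect("eye_tracking", settings))
```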
All in all, as AR and VR solutions scale, so, too, do security and privacy concerns. More tailored policies and regulations need to be applied, especially in enterprises that enable AR/VR apps to handle confidential data and integrate with other infrastructure, like the Internet of Things (IoT). Apart from AR/VR providers, users need to take responsibility and pay extra attention to the terms and conditions that they accept. Undoubtedly, privacy regulations need to be less complex, clearer, and ultimately more secure to ensure AR/VR solutions are reliable and trustworthy enough to enable large-scale adoption.