The company formerly known as Facebook has introduced a new corporate identity, Meta, at its annual conference, signaling a pivot toward the ‘metaverse’: a virtual realm designed for interaction, commerce, and entertainment. While the Facebook app retains its name, the rebranding reflects the company’s broader strategic commitment to expanding beyond social media.
Dr. Marcus Carter of the Socio-Tech Futures Lab at the University of Sydney and Dr. Ben Egliston of QUT have been analyzing Facebook’s ventures into virtual reality (VR) and augmented reality (AR). Since Facebook acquired Oculus in 2014 for $2 billion, Meta has channeled substantial resources into VR and AR, with reported annual spending of around $10 billion on its research and development division in these areas.
The ‘Ethical Implications of Emerging Mixed Reality’ project, led by Dr. Carter and Dr. Egliston, examines the ethical considerations raised by the proliferation of VR and AR. They warn that these technologies’ data-collection capabilities are extensive, and that the resulting data could be used in ways that harm individuals or society, echoing concerns raised around artificial intelligence.
The researchers argue that the fine-grained data derived from VR, regarded as a behavioral biometric because it is unique to each individual, could enable privacy violations if misused for identification. With Meta’s business model heavily reliant on targeted advertising, which accounts for the bulk of its revenue, the tracking capabilities VR affords are particularly relevant.
Despite Meta’s announced $50 million investment in shaping the metaverse ethically, the researchers remain skeptical, arguing that such efforts are unlikely to address the structural impacts of new technologies on society. They point to the company’s history, including its handling of the Cambridge Analytica scandal, as evidence of a recurring pattern of ethical problems around data use and corporate influence.
The researchers also point to the limits of Meta’s consultations with digital rights organizations: Access Now, for example, participated in a privacy-focused event for the Ray-Ban Stories AR smartglasses, only to see its primary recommendation overlooked. They note that many of the entities Meta consults receive funding from the company or the Chan Zuckerberg Initiative, which could compromise the objectivity of the feedback process.
As Meta forges ahead in creating its version of the metaverse, these developments prompt calls for closer scrutiny of privacy practices and potential regulatory actions to ensure user data is managed responsibly.