Meta is reportedly staying quiet on whether it collects video and image data from its artificial intelligence (AI) powered wearable, the Ray-Ban Meta smart glasses, to train its large language models (LLMs). The company announced a new real-time video feature for the device that lets users ask the AI questions and request suggestions based on their surroundings. However, there is no clarity on what happens to this data once the AI has responded to the query.
The feature in question is the real-time video capability that allows Meta AI to “look” at a user’s surroundings and process that visual information to answer queries. For instance, a user can ask it to identify a famous landmark, show it their closet and ask for wardrobe suggestions, or even ask for recipes based on the ingredients in their refrigerator.
However, each of these functionalities requires the Ray-Ban Meta smart glasses to passively capture video and images of the surroundings so the AI can understand the context. Ideally, once the response has been generated and the user has ended the conversation, this data should be deleted immediately or, at the very least, kept on private servers, since much of it could contain private information about the user’s home and belongings.
But Meta is reportedly not committing to this. When asked whether the company stores this data and uses it to train its AI models, a Meta spokesperson told TechCrunch that the company is not publicly discussing the matter. Another spokesperson reportedly highlighted that this information is not being shared externally, adding that “we’re not saying either way.”
The company’s refusal to clearly state what happens to this user data is concerning given the private and potentially sensitive nature of what the smart glasses can capture. While Meta has already confirmed that it trains its AI models on the public posts of its US-based Facebook and Instagram users, the data captured by the Ray-Ban Meta smart glasses is not public.
Gadgets 360 has reached out to Meta for comment. We will update this story once we receive a statement from the company.