Meta starts AI training with personal data - those who want to object must act quickly
https://datenschutz-hamburg.de/news/meta-starts-ai-training-with-personal-data
15 April 2025

Until now, Meta's AI technology has not been trained on the Facebook and Instagram data of European users. This is now set to change: from the end of May 2025, Meta intends to use the data of all adult European Facebook and Instagram users to train its own AI applications. The data will then be used to train the company's AI-based services, such as the Meta AI chatbot on WhatsApp or language models such as Llama.
Training originally planned for 2024 postponed for the time being
Meta had already announced last year that it would use all public posts and photos from its European Facebook and Instagram users for AI training. However, Meta postponed the project after the Irish Data Protection Authority (IDPC), the lead supervisory authority for Meta in the EU, approached the company with data protection concerns, particularly regarding the legal basis and transparency. The procedure now in place makes it easier to object to Meta's use of personal data for AI training.
Those who want to object should do so now
Registered users of Facebook and Instagram must now decide whether the company may train AI models with their personal data (posts, photos, etc.). This applies not only to future data, but also retroactively to all data from the past.
If you have no concerns about your publicly accessible posts, comments and photos in your own account being used for AI training, you don't have to do anything.
Anyone who wants to prevent their own data from being used by Meta for AI training purposes must take action now. All European users of legal age will be notified by Meta and informed of their options. You can object in the respective apps, or for Facebook at https://www.facebook.com/help/contact/712876720715583 and for Instagram at https://help.instagram.com/contact/767264225370182. For the objection to take full effect, it must be lodged before the end of May 2025. Although it remains possible to object at any time after this date, the use of data that has already gone into AI training cannot be undone: according to the current state of the art, training data is irrevocably incorporated into AI models, and its influence can no longer be removed from the model.
Thomas Fuchs, Hamburg Commissioner for Data Protection and Freedom of Information: "I can well understand that users are concerned when all the images and texts they have shared on social networks are now incorporated into AI models. The only way to protect yourself here is to object in good time. If at all, then now."