Social networking site LinkedIn has been accused of harvesting unwitting Australian users’ data to train its artificial intelligence models without letting them know.
Creative media expert Dr James Birt blasted the tech giant for making users opt out of a policy they did not even know existed – after automatically setting their accounts to agree to have their profiles’ data pillaged.
‘This showcases the dark side of big tech,’ the Bond University associate professor told news.com.au.
‘While users can opt out, the setting is enabled by default, which raises concerns about informed consent.
‘The absence of a proactive opt-in is a typical example of how big tech leverages user apathy or a lack of awareness to further its AI initiatives.’
The setting, known as ‘Data for Generative AI Improvement’, has been automatically switched on for users outside the EU, EEA, UK or Switzerland, giving permission for LinkedIn and unnamed ‘affiliates’ to ‘use your personal data and content you create… for training content creation AI models’.
Leaving it on allows the app to keep sharing users’ activity with those affiliates to train its models.
This includes anything that someone posts and even the contents of a user’s profile.
Turning off the setting will stop LinkedIn from harvesting data going forward but will not erase what it has already taken while the setting was active.
LinkedIn is not the only mainstream app to ‘scrape’ users’ data for its own benefit.
Meta, the company behind Facebook and Instagram, confirmed earlier this month it had also stored data on Australian adult users’ photos and posts since 2007.
Meta made the admission when its privacy policy director, Melinda Claybaugh, appeared before an inquiry and said the company harvested data when pressed by senators.
Regulations surrounding this kind of behaviour have tightened in Europe, where companies need permission to store users’ data, but in other regions, such as Australia, no such consent is required.
Dr Birt said the decision to automatically opt users in for this kind of practice exemplified the ethical concerns around personal data storage.
LinkedIn states that it uses generative AI for ‘a variety of purposes’ including in its writing assistant which helps users draft messages.
Microsoft owns the platform, but the AI models being trained are LinkedIn’s own rather than those of Microsoft’s Azure OpenAI service, which offers the technology behind ChatGPT.
LinkedIn spokesperson Greg Snapper clarified the app was ‘not sending data back to OpenAI for them to train their models’.
Daily Mail Australia has contacted LinkedIn for comment.
To turn off the setting in the app, users have to tap their profile, go to Settings, then Data Privacy, and finally open ‘Data for Generative AI Improvement’.
From there the setting can be turned off.
When users click ‘learn more’ at the final stage, the app explains its AI usage.
‘This setting applies to training and finetuning generative AI models that are used to generate content (e.g. suggested posts or messages) and does not apply to LinkedIn’s or its affiliates’ development of AI models used for other purposes, such as models used to personalise your LinkedIn experience or models used for security, trust, or anti-abuse purposes,’ it states.
In LinkedIn’s generative AI FAQs, the app claims it will ‘seek to minimise personal data in the data sets used to train the models’.