Australians who attend sports games could unknowingly be trading their biometric data for entry to the match as advances in artificial intelligence incentivise risky data collection.
The latest report from the Productivity Commission says the growing uptake of artificial intelligence (AI) and generative AI technologies poses significant data-related risks.
“AI is heightening data-related risks including through increased capacity to misuse (or even weaponise) data,” the report says.
“Data that is not risky today may become so tomorrow.”
AI technologies are generally trained on immense datasets, a demand that has increased the value of data and changed the nature of its collection.
Developments such as facial recognition have increased the potential for intrusive personal information collection that can put a person’s privacy and rights at risk.
For example, the report says crowds at sports grounds that use facial recognition technology could have their faces scanned and the data sold.
If a venue does not have a clearly signposted area free from facial recognition devices, visitors might not have the opportunity to give meaningful and informed consent.
The discussion of consent and AI has reignited in recent days after an image of a Victorian MP published by Nine News was digitally altered to depict her with enlarged breasts and a more revealing outfit.
The broadcaster apologised for the image and said the changes were a result of Adobe Photoshop’s “generative expand” tool, which uses AI to fill in areas as it essentially un-crops photos.
The report also raises issues of bias, warning that the technology could discriminate against certain communities.
An over-representation of Aboriginal and Torres Strait Islander people in administrative data could result in biased AI outputs, for example.
Improved data on driving behaviour could also lead to better drivers paying lower car insurance premiums, while riskier drivers pay more.
There are also concerns about generative AI programs using art in their training data.
This raises concerns not only for artists’ copyright but also for the authenticity of the work produced.
For example, if First Nations art is used in AI training data sets, it allows generative AI programs to produce “culturally unsafe outputs”, the report said.
Productivity Commissioner Stephen King says Australia could benefit from AI technology, but governments must improve protections to unlock its full potential.
Clear rules around text and data mining for AI training models would help protect creative industries and improve data accessibility, as would a national data strategy.
“Judicious policy interventions and a practical approach to regulation would put the Australian economy in the best position to ride that wave,” King said.
This article was first published by AAP.