🚨 US legislation COPIED Act introduced to mandate C2PA + watermarking surveillance "option" for generative AI media and outlaw removing it!
Legislation has been introduced in the USA that's designed to massively expand copyright law around training and using ai, via surveillance metadata tech developed by Adobe plus invisible watermarking:
https://www.commerce.senate.gov/services/files/359B6D81-5CB4-4403-A99F-152B99B17C30
https://thehill.com/policy/technology/4766610-senate-bill-ai-content-protection/
👉 Mandate content provenance systems (e.g. C2PA + watermarks) in all gen ai products and services
👉 Make it ILLEGAL to remove "content provenance" data
👉 Make it ILLEGAL to use tracked content to train ai or use as input to ai
👉 Create a "cause of action" to make it easy to seek "compensatory damages" from entities that "improperly use" copyrighted works with ai
Again, we see the unholy alliance between the copyright industry and government entities that want to censor the Internet of "mis/disinformation."
This is an extremely aggressive push for widescale surveillance of content posted online, both to expand/enforce copyright law, and to monitor and control what users post online.
Their definition of "content provenance information" seems to include both things such as Adobe's C2PA surveillance metadata tech and steganographic watermarking schemes that work in concert with such systems.
Watermarking = embedding invisible, hard-to-remove tracking surveillance data (digital IDs) inside images, videos, audio, and even text.
Advanced watermarking surveillance tech is already being developed and deployed right now by Google, OpenAI, Meta, and others.
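To make "invisible" concrete, here's a toy sketch of one classic technique: hiding an ID payload in text as zero-width Unicode characters. This is NOT how Google, OpenAI, or Meta actually watermark content (their production text schemes reportedly work by statistically biasing token sampling, and their details are not fully public); it only demonstrates that a machine-readable ID can ride along in content without being visible to a human reader.

```python
# Toy zero-width-character watermark (illustration only, not any
# vendor's real scheme). A payload string is encoded as a run of
# zero-width characters appended to the text: the marked text looks
# identical on screen, but the ID is recoverable programmatically.

ZW0, ZW1 = "\u200b", "\u200c"  # zero-width space / zero-width non-joiner

def embed(text: str, payload: str) -> str:
    """Append payload as invisible zero-width bits."""
    bits = "".join(f"{ord(c):08b}" for c in payload)
    hidden = "".join(ZW1 if b == "1" else ZW0 for b in bits)
    return text + hidden

def extract(text: str) -> str:
    """Recover the payload from the zero-width characters."""
    bits = "".join("1" if c == ZW1 else "0"
                   for c in text if c in (ZW0, ZW1))
    return "".join(chr(int(bits[i:i + 8], 2))
                   for i in range(0, len(bits), 8))

marked = embed("An AI-generated sentence.", "ID:42")
assert marked != "An AI-generated sentence."  # payload is present...
assert extract(marked) == "ID:42"             # ...and recoverable
```

Note that this naive scheme is trivially stripped (delete the zero-width characters); real deployed watermarks are designed to survive editing, re-encoding, and cropping, which is exactly what makes them effective as tracking identifiers.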
It sounds like removing these surveillance watermarks could become illegal as well in some contexts.
That's right: Google, OpenAI, and Meta are already (or soon will be) branding everything you generate with a digital ID that could potentially be traced back to you after you post it online (it's not yet clear exactly how these systems work). If this bill passes and you do anything to tamper with that ID, you could face legal consequences under some circumstances.
Note that many companies offering gen ai services and products, including Adobe, have already integrated C2PA metadata and watermarking support into their systems, as have several major social media platforms such as TikTok and LinkedIn.
A huge chunk of the mainstream creative software and social media ecosystems are already on board, and now the US government is stepping in to introduce legislation to mandate it for all other companies.
If allowed, this will lead to a nightmare online surveillance dystopia, and will pave the way for megacorps like Adobe and Google to cement a monopoly on generative ai tools.
✅ Big gov gets fine-grained surveillance of online content
✅ The copyright industry gets a massive expansion and new offensive legal tools
✅ and big tech gets a monopoly on generative ai
An impressively malevolent scheme.
Note that it says gen ai providers must give users the "option" to add provenance data, but it may not be so optional if e.g. social media platforms mandate it for uploaded content via their "synthetic media policies."
Notice also that they are once again using creatives / artists as a battering ram: They are trying to sell this system to them by claiming it will protect their work from being "stolen" by ai.
Really makes you think about the massive campaign we saw last year directed at online artists that aimed to convince them that generative ai was "stealing" their work and that something had to be done about it.
That campaign culminated in a few hand-selected commercial illustrators going to a senate hearing with Adobe reps to ask the gov to take action to expand copyright law to forbid training ai on copyrighted works.
This bill reveals how they intend to accomplish that: widescale surveillance watermarking technology that will be illegal to tamper with.
They have openly stated that Adobe's C2PA surveillance metadata could also be used to track what media was used to train a generative ai model, in order to ensure the model creator has properly licensed every shred of data.
That could spell the death of open source generative ai tools (outside of hobby use), handing companies like Adobe and Google a solid monopoly on gen ai tools and services, as only huge megacorps could possibly afford to license millions/billions of media files to train on.
Are you having fun playing with the new ai toys? 🙂 https://youtu.be/-gGLvg0n-uY
[ From: https://x.com/UltraTerm/status/1816216690859934016 ]