Outrage in Australia Over AI and Author Rights
The Australian literary community has erupted in outrage. Black Inc, the publisher behind the widely respected Quarterly Essay and works by renowned authors, recently sought consent from its authors to use their texts for training AI models. The plan, which involves sharing potential revenue with the authors, has sparked a heated debate.
For context, I’m not just an observer in this scenario. As someone who has published four books with Black Inc, with another on the way, and a sixth on the horizon, I have a vested interest in their decision. Having spent 40 years as an AI researcher, I also have firsthand experience with training AI models using written data.
The Dilemma of Small Publishers
I agreed to Black Inc’s proposal, but I can’t help but think the communication around it could have been clearer. Questions linger: Who exactly is Black Inc partnering with? What is the ultimate goal? Why the rush to get author consent? Despite these concerns, my sympathy lies with Black Inc.
Small publishers like Black Inc play a crucial role in nurturing Australian literature and preserving our cultural heritage. They often operate in a challenging environment, competing against publishing giants like Penguin Random House. For them, many books lose money, and survival often hinges on the occasional bestseller. Without small publishers, many deserving authors, including myself and others like Richard Flanagan and David Marr, might never be heard.
A Deep Sense of Outrage
Yet, my support for Black Inc does not shield me from my own outrage. My ire is directed primarily at big tech companies like OpenAI, Google, and Meta. These corporations have trained AI models, including ChatGPT and Gemini, using my copyrighted works without permission or compensation, effectively stealing from authors and publishers alike.
In early 2023, I alerted Black Inc to the unauthorized use of our works. When they expressed skepticism about how I knew, I pointed out that ChatGPT was able to summarize a chapter from my book. This lack of transparency remains a significant problem.
Big tech argues that training AI models on copyrighted material is ‘fair use’. I vehemently disagree. Last year at the Sydney Writers’ Festival, I labeled this practice the "greatest heist in human history." The reality is that these companies are profitably siphoning from a vast reservoir of human culture.
The Need for Change
What adds to my frustration is that these tech giants didn't even pay for the copies of my work they used, likely sourcing them from illegal compilations. The situation mirrors the early days of music piracy, when streaming services emerged from the ruins left by platforms like Napster, which bankrupted countless musicians.
In publishing, we find ourselves in a similar predicament. We need a model that ensures creators receive fair compensation for their work. This is crucial, especially for small publishers, to secure a seat at the negotiation table with colossal tech firms. So I signed Black Inc's contract—with a heavy heart, I admit—seeing it as the lesser of two evils.
Global Implications
Equally alarming are recent changes proposed by the British government regarding copyright laws. These changes could allow AI developers to utilize any material they have lawful access to for training, placing the burden on creators to opt out if they do not wish their works to be used.
The argument that AI training on books is comparable to human reading is deeply flawed. AI processes far more information than any single person could in a lifetime. The implications of this go beyond literature, potentially encompassing all digital knowledge—including science and cultural heritage.
Imagine a world where tech giants amass and control all this knowledge, shaping our behaviors and decisions behind a veil of unseen manipulation. This is a reality we must guard against.
Conclusion: A Call to Action
As this issue unfolds, it’s evident that the digital framework surrounding our creative works is under threat. We’re in a pivotal moment—one that demands awareness and action from both creators and consumers alike.