UNSW petition challenges 'Generative AI for Artists' elective: what creatives should know
More than 7,000 people have signed a petition urging the University of New South Wales (UNSW) to cancel its new elective, DART2252: Generative AI for Artists. Final-year Arts student Robin Chessell started the petition in September, arguing the subject undermines intellectual property, increases emissions, and fuels "AI slop" that drowns out original work.
The course is a rebrand of Emerging Media Technologies Studio, first offered in 2021. UNSW says the subject pushes students to engage critically with generative AI, including issues of misuse, plagiarism, and environmental impact. Critics counter that students are paying roughly $1,000 for material they say is freely available online.
Why students are pushing back
Chessell says mass-produced AI content floods social feeds and buries the work of artists who rely on those platforms for income. The petition also calls out the energy demands of AI systems and the ethical risk of training on copyrighted work without consent.
"AI slop," as described in the petition, is more than an aesthetic complaint. It's a distribution problem: low-effort content crowds channels where attention is already scarce.
How the course team responded
Associate Professor Oliver Bown, the course convenor, acknowledged the criticism and updated the subject. Students must now complete an ethical and environmental impact assessment of any AI tools they use. Those who object to using AI are not required to use it and can discuss alternative tools.
UNSW maintains the elective is consistent with its ethics and sustainability standards and says the class critically explores AI rather than promoting it. The university also noted the current enrollment is under 20 students.
The sustainability question
UNSW has stated the course is consistent with its Environmental Sustainability Plan. You can read the university's sustainability commitments here: UNSW Sustainability.
More broadly, AI's energy use is under scrutiny across the sector. For context on data centres and AI electricity demand, see the International Energy Agency's overview: IEA: Data centres and networks.
What this means for working creatives
Whether you're for or against AI in studio practice, this moment is about control: over your process, your rights, and your signal in a noisy feed. Here's how to protect your work and your income while staying informed.
Practical steps for your practice
- Publish your AI policy: disclose where you do or don't use AI, label outputs clearly, and set boundaries for client work. Make your stance part of your brand.
- Protect IP in briefs and contracts: specify training-data restrictions, style-copying limits, and licensing terms. Add a "no unauthorized AI training" clause where relevant.
- Differentiate with process: document sketches, drafts, and behind-the-scenes work. Process proof builds trust clients can feel and algorithms can't fake.
- Own your channels: reduce reliance on social feeds prone to "AI slop." Prioritize newsletters, communities, and site-based portfolios with strong case studies.
- Audit tool impact: prefer providers that publish energy-use figures and model documentation, choose smaller models where they suffice, and batch heavy tasks to reduce wasted compute.
- Learn with guardrails: if you explore AI, do it with clear ethics, consent-aware datasets, and explicit client communication. Curated learning hubs can help sort signal from noise.
The core tension
Chessell appreciates the course team's willingness to adjust, but argues the subject should focus on critique without requiring AI use. Bown believes the educational upside outweighs the downside if ethics are built in. That disagreement reflects the broader fight in creative work: preserve craft and rights while dealing with tools already affecting markets, algorithms, and client expectations.
Add your voice
If you want to support or review the petition, you can find it here: Stop DART2252: Generative AI for Artists at UNSW.