Keep Expressing Yourself and Stay Human, Urges AI Expert Brian Christian
Artificial intelligence will suggest your words, your tone, even your opinions, if you let it. Brian Christian, author and AI ethics expert, visited campus on October 6 as part of the College's Hastings Initiative for AI and Humanity, which prepares students to lead as AI changes how we live and work. His message was blunt: keep your voice. Don't outsource it to autocomplete.
Why your words matter
Christian shared a simple moment that says a lot. He typed "ill." His phone kept switching it to "I'll." He considered changing the word to "sick," but that would be letting the tool edit his identity. "We must insist on saying what we mean and on sounding like ourselves," he said.
He pointed to a core insight from computer science: making typical things easier often makes atypical things harder. Your uncommon phrasing and your personal rhythm are exactly what get sanded down by default settings.
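To make that computer-science point concrete, here is a minimal, purely illustrative sketch (the word list and frequency numbers are invented, not drawn from any real keyboard software) of how a frequency-weighted autocorrect favors the common word over the rarer one a writer actually meant:

```python
# A toy autocorrect: replace each typed word with the highest-frequency
# dictionary entry within one edit. A rare-but-intended word like "ill"
# loses to the far more common "I'll".

# Hypothetical frequency table (counts per million words, made up for illustration).
FREQUENCIES = {"i'll": 900, "ill": 40, "sick": 300}

def within_one_edit(a: str, b: str) -> bool:
    """True if a and b differ by at most one insertion, deletion, or substitution."""
    if abs(len(a) - len(b)) > 1:
        return False
    # Standard dynamic-programming edit distance; fine for short words.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = curr
    return prev[-1] <= 1

def autocorrect(word: str) -> str:
    """Return the most frequent known word within one edit of the input."""
    candidates = [w for w in FREQUENCIES if within_one_edit(word.lower(), w)]
    return max(candidates, key=FREQUENCIES.get, default=word)

print(autocorrect("ill"))  # -> "i'll": the common word wins, even though "ill" was meant
```

Because the tool optimizes for the most probable word, the unusual but intended one never gets a vote.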
"Your idiosyncrasy, your ability to think, to speak, to act for yourself is the source of your power," he told the audience. Picasso had to invent a new way to paint the human face. At least his canvas wasn't nudging him back to center. Ours is.
The alignment problem, off the page and into life
Christian's latest book, The Alignment Problem, looks at a core question: How do we make sure AI systems serve human needs and values? The stakes are concrete, not abstract.
He cited cases where health algorithms gave worse estimates for minority patients because the data underrepresented them. That's not a theory issue; it's a care issue. For background on this kind of bias in health systems, see the Science study on racial bias in health algorithms; for an overview of AI alignment as a field, Christian's book is a natural place to start.
He also warned that predictive tools can quietly become generative. A model built to forecast house prices became so trusted that people started using it to set prices, in the same way that predictive text can end up shaping how we write.
AI demands more than code
"AI at this point demands a radical interdisciplinarity," Christian told faculty. That means computer science, yes, but also philosophy for ethics, political science for policy and supply chains, environmental studies for data center costs to the planet, and English for linguistic impact.
This fits a liberal arts approach. "We also talked about when these tools are useful for students and when they are not," said Associate Professor of Neuroscience and Psychology Erika Nyhus. The shared view: great for processing data; not a replacement for genuine expertise. You still need to study.
Practical takeaways for writers and creators
- Finish your sentence before you look at the suggestion. Don't let the next word pick you.
- Use the right word, even if it's unusual. Don't swap precision for convenience.
- Let AI crunch data or summarize. Don't ask it to grant you expertise you haven't built.
- Keep a short style sheet of phrases, tone rules, and examples. Guide your tools with it.
- Watch how tools change your habits. If you don't like the shift, turn features off or change apps.
For students and career builders
Students asked how to prepare for an AI-heavy job market. Christian's view: the field moves too fast to predict exact tools. Follow what genuinely interests you. Build one project at a time. You don't need a 10-year plan.
He also reminded students of their leverage. "This is a live situation, and it could go either way." What you decide is "cool" can reshape products, and the companies behind them. If a product launches with an always-on camera, you decide whether that's acceptable. Culture is a vote.
Student reactions
Alma Dudas '27: "I found his take on the alignment problem really interesting, particularly how small gaps in data or poorly defined goals can lead to completely unintended outcomes. He also drew on conversations with researchers across disciplines to make those complex ideas feel real."
Joe Gaetano '27: "The Alignment Problem completely changed the way that I think about AI and the direction it is heading. Mr. Christian was incredibly insightful in discussing the broader challenges and ideas surrounding AI and its alignment with human values."
Hold your ground
"Your choices, both collectively and individually, matter." That was the through line. Keep expressing yourself. Keep thinking for yourself. Use tools, but don't let them use you.
Next up for the Hastings Initiative: a Generative AI Hackathon on October 14, 9 a.m.-5 p.m.
Published October 10, 2025