Ghostwriting Group Publishes AI Use Guidelines for the Industry
Gotham Ghostwriters released a set of AI guidelines Wednesday aimed at establishing baseline standards for how ghostwriters and their clients should approach generative AI tools. The document covers potential uses, specific risks, and a framework for negotiating AI's role in collaborative writing projects.
The guidelines list three categories of AI application: administrative tasks, research and analysis, and generative uses such as drafting initial text or graphics that writers will revise later. A separate section enumerates five concrete risks: potential copyright ineligibility for AI-generated content, inclusion of confidential material in training datasets, plagiarism in generated text, inaccurate transcription, and factual errors.
Dan Gerstein, CEO of Gotham Ghostwriters, said the guidelines aim to "help ghostwriters and their clients navigate this disruptive change together." The working group included industry leaders from the Association of Ghostwriters and Splash Literary, along with several working writers and editors.
Survey Shows Split Among Writers on AI Adoption
The guidelines build on a November 2025 study by Gotham Ghostwriters that surveyed fiction authors and writing professionals. Sixty-one percent of respondents reported using AI tools. Those with more advanced AI experience expressed less concern about the technology's risks.
Gerstein noted a pattern in the data: "The more writers use AI, the more optimistic they are about its potential to elevate the profession instead of decimate it." He acknowledged that skepticism persists, framing the industry's challenge as persuading reluctant writers that AI's arrival is inevitable.
The guidelines position transparency as central. They're designed to help ghostwriters and clients discuss AI use openly rather than impose a single approach across the profession.
For more on how AI tools are changing the writing profession, see AI for Writers and Generative AI and LLM.