AI Puts a $340 Price on My Montana Book

Published on: Oct 12, 2025

AI Wants to Inhale Your Book. Is $340 Worth It?

An AI company offered $340 to license a 12-year-old essay collection about Montana for "one-time use" in training. That price feels precise, almost casual. But "one-time use" in model training isn't a wet wipe. Once your text is ingested, it influences outputs for the life of the model and any derivatives.

Writers are being asked to trade decades of craft for a single check. The real question isn't "Is $340 fair?" It's "What are they buying, how long does it last, and what does it replace?"

What AI Actually Does With Your Book

Large language models don't learn your big ideas. They break your sentences into tokens and predict the next token again and again. To them, your work is data. To you, it's voice, structure, judgment, and years of field-tested thinking.
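
To make "ingested as data" concrete, here is a minimal sketch using the open-source tiktoken tokenizer. This is illustrative only: tiktoken is one tokenizer among many, which tokenizer a given lab uses is an assumption here, and the sample sentence is invented.

```python
# A rough illustration only: tiktoken (pip install tiktoken) is one
# tokenizer among many; which tokenizer a given lab uses is an
# assumption here. The sentence is invented for the example.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
sentence = "The river cuts through the valley like a long argument."

token_ids = enc.encode(sentence)
print(token_ids)  # a list of integer IDs, which is all the model sees
print([enc.decode([t]) for t in token_ids])  # the fragments those IDs stand for
```

From the model's side, your book is just a long sequence of those integers to predict, one at a time.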

That gap matters. If a model absorbs your core patterns and style, it can produce "good enough" versions that compete with you, especially in commodity markets like SEO content, summaries, and generic explainers.

The $340 Dilemma: A Simple Valuation Frame

  • Baseline value: Annual royalties × the number of years you still care about. If you earn $1.19 per copy and sell slowly, $340 might equal only a few years of sales. (A back-of-envelope calculation appears after this list.)
  • Cannibalization risk: Will model outputs reduce future sales, gigs, or speaking fees? Estimate a range. This is often the biggest number.
  • Option value: Future uses you haven't imagined (translations, audio, derivative works) lose scarcity once your text is in training corpora.
  • Precedent risk: Your yes sets a price for your catalog and for other writers on your publisher's list.
  • Reality check: Some models have already been trained on pirated datasets like Books3/LibGen. Licensing can be a way to claw back value, but the terms must reflect that.
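
Here is that frame as a back-of-envelope calculation. Every input is an assumption; substitute your own royalty, sales, and risk numbers.

```python
# Back-of-envelope valuation. Every input below is an assumption;
# substitute your own royalty, sales, and risk numbers.
royalty_per_copy = 1.19   # dollars earned per copy
copies_per_year = 120     # hypothetical slow-but-steady sales
years_you_care_about = 5

baseline = royalty_per_copy * copies_per_year * years_you_care_about  # $714.00
offer = 340.00

# Cannibalization: suppose model outputs shave 20-50% off future sales.
loss_low, loss_high = 0.20 * baseline, 0.50 * baseline

print(f"Baseline value: ${baseline:,.2f}")
print(f"Offer covers {offer / baseline:.0%} of baseline")
print(f"Cannibalization range: ${loss_low:,.2f} to ${loss_high:,.2f}")
```

If the cannibalization range alone rivals the offer, the price is paying for access while ignoring replacement.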

Before You Say Yes: Non-Negotiables to Request

  • Purpose scope: Pretraining vs fine-tuning vs embeddings. No "any and all future uses."
  • Model scope: Specify model names, versions, and parameter caps. No sublicensing without your approval.
  • Time limit: Fixed term (e.g., 24-36 months) with no automatic renewal.
  • Deletion rights: Documented removal upon expiry or breach, including from checkpoints and derivatives.
  • Attribution + disclosure: Public statement that models were trained on licensed works, with your title acknowledged in a registry.
  • Data governance: Dataset manifest listing file hashes and storage locations; annual audit rights. (A sample manifest script appears after this list.)
  • Payment: Upfront fee + usage-based royalty (per token or revenue share). Most-favored-nation clause so you get improved terms offered to others.
  • Security: No redistribution of your files; watermarking or fingerprinting allowed for enforcement.
  • Indemnity: You're held harmless for downstream model outputs.
  • No training on private or student work product from your courses, workshops, or client projects without separate consent.
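
For the data-governance term above, here is a minimal sketch of the manifest you could ask a licensee to produce: one SHA-256 hash per licensed file, so both parties can verify exactly which bytes were licensed and later certify deletion. The file names and JSON shape are hypothetical, not an industry standard.

```python
# A minimal sketch of the dataset manifest described above. File
# names and the JSON shape are hypothetical, not an industry standard.
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

licensed_files = [Path("montana_essays.epub"), Path("montana_essays.pdf")]

manifest = [
    {"file": str(p), "sha256": sha256_of(p), "bytes": p.stat().st_size}
    for p in licensed_files
    if p.exists()
]

print(json.dumps(manifest, indent=2))
```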

How to Counter a Lowball Offer

  • Anchor higher: "Given replacement risk and ongoing value, my rate is $3,500 upfront + usage royalty for a 24-month term across Model X/Y only."
  • Tier by scope: Pretraining costs more than fine-tuning; larger models cost more than smaller ones. (A sample rate card appears after this list.)
  • Bundle smartly: Offer excerpts or older editions at a lower rate; keep flagship works exclusive.
  • Ask for transparency: "Which dataset, storage location, and deletion process will you use? Please provide a manifest."
  • Create walk-away conditions: No dataset manifest, no term limit, no deletion = no deal.
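
One way to keep tiered counteroffers consistent is a simple rate card. The sketch below encodes the tiers from the list above; every dollar figure and multiplier is an assumption for illustration, not a market rate.

```python
# A hypothetical rate card encoding the tiers above. Every dollar
# figure and multiplier is an assumption, not a market rate; the
# point is that price scales with scope, model size, and term.
BASE_FEE = 3500.00  # the upfront anchor from the example above

SCOPE = {"embeddings": 0.5, "fine-tuning": 1.0, "pretraining": 2.5}
SIZE = {"small": 1.0, "mid": 1.5, "frontier": 2.5}

def quote(scope: str, size: str, term_months: int = 24) -> float:
    # Scale linearly with term length relative to a 24-month baseline.
    return BASE_FEE * SCOPE[scope] * SIZE[size] * (term_months / 24)

print(f"Fine-tune, small model, 24 mo: ${quote('fine-tuning', 'small'):,.0f}")
print(f"Pretraining, frontier, 36 mo: ${quote('pretraining', 'frontier', 36):,.0f}")
```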

Practical Safeguards You Can Deploy Now

  • Rights inventory: List what you can license, what you'll never license, and what you'll excerpt only.
  • Controlled samples: Publish summaries or first chapters instead of entire PDFs. Keep full texts behind controlled access.
  • Monetize scarcity: Offer commentary, workshops, and annotated editions that models cannot replicate.
  • Direct relationship: Add a clear CTA in every edition to your newsletter and community. Your audience is the moat.
  • Collective action: Coordinate with your agent, publisher, or guild. Group licenses can increase price and improve terms.

Ethics and Enforcement

Many writers discovered their books in unauthorized datasets. Some companies have faced legal pressure and settlements. Others may skate by unless writers enforce rights together.

If you consider licensing, do it with eyes open and paperwork tight. Price reflects control, not just word count.

A Grounded Perspective

There's no grand plan inside a model, only next-token prediction. Nature works like that too: step by step, no tidy endings. Your edge is curation, taste, and lived context. Guard those, and charge for access to them.


Copy-Paste Clause Starter

  • Scope: License limited to fine-tuning Model [Name, Version], up to [Params] parameters, for [24] months. No pretraining. No sublicensing.
  • Data: Licensee must provide dataset manifest with file hashes; no redistribution; storage in [jurisdiction].
  • Payment: $[X] upfront + [Y%] of gross revenue attributable to the fine-tuned model; MFN applies.
  • Deletion: Upon expiry or breach, licensee deletes content from all datasets, checkpoints, and derivatives; provides written certification.
  • Attribution: Public disclosure: "Trained on licensed works from [Publisher/Author Title]."
  • Indemnity: Licensee indemnifies Licensor against claims arising from model outputs.

Say yes only if the terms pay for loss of control, not just access. If the offer treats your life's work like a disposable wipe, it's not a deal; it's a warning shot.

