New York's Personalized Pricing Law: What Legal Teams Need to Know
New York has set the marker for AI-driven pricing. It is the first state to require a conspicuous disclosure when retailers use algorithms and a customer's personal data to set prices online.
The rule is simple on its face: if a retailer uses personalized pricing, it must display the exact notice, "THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA." The law survived an early federal court challenge and is already drawing fire from both industry and consumer advocates.
What the law targets
The focus is on personalized or "surveillance" pricing: using AI and a shopper's data to individualize the price they see. Think higher prices for a customer who usually buys premium jeans, or elevated hotel rates shown to someone who already booked flights.
The intent is to prevent quiet exploitation of personal data to extract more value from certain users. Disclosure is the lever the state chose, not a blanket ban.
Who is likely in scope
Online retailers and platforms that individualize prices based on a user's personal data should assume coverage. That includes e-commerce, travel, ticketing, subscription services, and marketplaces experimenting with algorithmic pricing.
Dynamic pricing that reacts to market conditions may be treated differently if it doesn't rely on personal data about the individual. The gray area is where segment-based pricing (e.g., device type, location, referral source) becomes "personal." Expect challenges here.
What you must display
If your system uses personal data to set an individual's price, the disclosure must state exactly: "THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA."
Placement and prominence will matter. Prepare to show it where the price appears (not buried in a policy), and ensure it persists across app and web flows.
Immediate action plan for counsel
- Map use cases: Identify every pricing flow that uses user-level data or proxies tied to an individual.
- Trigger the disclosure: Implement a deterministic rule that shows the required notice whenever personalized pricing logic fires.
- Document "personal data": Align an internal definition and data inventory so teams know what turns the disclosure on.
- Review consent and notices: Make sure privacy disclosures match actual data use for pricing and any profiling.
- Logging and evidence: Record inputs, model/version, decision factors, and timestamps for each personalized price.
- Human review gates: Require approval for new features or experiments that personalize price using personal data.
- Vendor contracts: Add warranties, audit rights, and compliance clauses for any third-party pricing engines.
- UX standards: Set minimum font, placement, and recurrence for the notice; prohibit dark patterns around the price.
- Fairness checks: Test for disparate impact and protected-class proxies (e.g., ZIP, device, browsing history).
- Incident playbook: Define how to pause personalization and notify stakeholders if disclosures fail.
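The trigger-the-disclosure and logging items above can be sketched together. This is a minimal illustration, not a production design: the feature names, the `PriceDecision` record, and the choice of what counts as personal data are all assumptions standing in for your own data inventory; only the notice text comes from the law as described here.

```python
import json
import time
import uuid
from dataclasses import dataclass, field, asdict

# Exact notice text the law requires, per the discussion above.
REQUIRED_NOTICE = "THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA."

# Hypothetical internal inventory of features treated as "personal data".
# Build for the broad interpretation until regulators clarify scope.
PERSONAL_DATA_FEATURES = {
    "purchase_history", "browsing_history", "device_type",
    "ip_geolocation", "loyalty_tier", "referral_source",
}

@dataclass
class PriceDecision:
    price: float
    features_used: list        # inputs the pricing engine actually consumed
    model_version: str
    decision_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    timestamp: float = field(default_factory=time.time)

def requires_disclosure(decision: PriceDecision) -> bool:
    """Deterministic trigger: show the notice if ANY personal-data feature
    influenced the price, rather than trusting a model-level flag."""
    return bool(PERSONAL_DATA_FEATURES.intersection(decision.features_used))

def render_price(decision: PriceDecision):
    """Return the payload the storefront renders plus an audit record
    capturing inputs, model/version, decision id, and timestamp."""
    payload = {"price": decision.price}
    if requires_disclosure(decision):
        payload["notice"] = REQUIRED_NOTICE
    audit_record = json.dumps(asdict(decision))
    return payload, audit_record

decision = PriceDecision(price=129.99,
                         features_used=["purchase_history", "device_type"],
                         model_version="pricing-v3.2")
payload, audit = render_price(decision)
print(payload["notice"])
```

Making the trigger deterministic and keyed to the data inventory, rather than to what the model self-reports, is what lets the logs later prove why the notice did or did not appear.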
Key risks and how to blunt them
Deception and unfairness claims: A disclosure that's hard to see or missing at key touchpoints is low-hanging fruit for enforcers and class actions. Treat the notice like a seat belt: it belongs on every page that shows a personalized price.
Antitrust exposure: Algorithmic pricing can create tacit coordination risks, especially through shared vendors or data. Maintain independent pricing policies, monitor outcomes, and avoid sharing competitively sensitive signals.
Bias and proxies: Personal data signals can mirror protected traits. Strip high-risk features, monitor outcomes, and keep a written rationale for what the model uses and why.
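One way to monitor outcomes is a simple parity check on average prices across groups defined by a potential proxy. The sketch below borrows the shape of the employment-law four-fifths rule as an internal alerting threshold; the 0.8 cutoff and the sample data are assumptions for illustration, not a legal standard for pricing.

```python
from statistics import mean

def four_fifths_price_check(prices_by_group: dict):
    """If the cheapest group's mean price falls below 80% of the priciest
    group's, flag the model for human review. The 0.8 threshold is an
    assumed internal policy borrowed from the four-fifths rule's shape."""
    group_means = {g: mean(p) for g, p in prices_by_group.items()}
    ratio = min(group_means.values()) / max(group_means.values())
    return ratio, ratio < 0.8

# Illustrative data keyed by a potential protected-class proxy (ZIP prefix).
sample = {
    "zip_100": [124.0, 126.0, 125.0],  # mean 125.0
    "zip_112": [94.0, 96.0, 95.0],     # mean 95.0
}
ratio, flagged = four_fifths_price_check(sample)
print(round(ratio, 2), flagged)  # -> 0.76 True
```

A flag here is a prompt for review and a written rationale, not proof of discrimination; the point is to catch proxy effects before a regulator does.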
Open questions to watch
- Scope of "personal data": Are device type, IP-based geolocation, or referral source within scope? Build for the broader interpretation.
- Where the notice must appear: On product pages, cart, checkout, receipts, or all of the above? Aim for each place the price is displayed.
- Exemptions and edge cases: Loyalty discounts, coupons, A/B tests, and surge pricing without user-level data may be treated differently. Your logs should prove how the price was set.
How this fits with wider enforcement
Expect scrutiny under general consumer protection laws even beyond New York. The FTC's guidance on AI and algorithms signals that opaque automated decisions drawing on personal data are a priority area.
Multistate compliance will be easier if you treat New York's rule as the baseline. Build feature flags for state-specific disclosures, but default on transparency where feasible.
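The feature-flag idea above can be as small as a per-state lookup with a transparency-by-default fallback. In this sketch, the New York text comes from the law as described here; the registry structure and the default copy are assumptions you would replace with counsel-approved language.

```python
# Hypothetical per-state disclosure registry. New York's exact notice is
# the baseline; states without a mandate get a generic transparency notice.
STATE_NOTICES = {
    "NY": "THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA.",
}
DEFAULT_NOTICE = "This price was generated by an automated system."  # assumed copy

def disclosure_for(state: str, personalized: bool):
    """Feature-flag lookup: show the state-mandated text where one exists,
    and default on transparency everywhere else."""
    if not personalized:
        return None
    return STATE_NOTICES.get(state, DEFAULT_NOTICE)

print(disclosure_for("NY", True))
```

Keeping the registry data-driven means new state mandates become a one-line change rather than a code release.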
30-day rollout checklist
- Week 1: Complete a pricing data audit; freeze launches that use personal data without a compliant notice.
- Week 2: Ship a standardized disclosure component; add logging and alerting for missing notices.
- Week 3: Update privacy policy, help center, and internal FAQs; train product, growth, and legal ops.
- Week 4: Run fairness and antitrust reviews; finalize a regulator-ready memo explaining your approach.
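Week 2's alerting for missing notices can start as a batch scan over render logs. This is a minimal sketch under an assumed log shape (`page_id`, `personalized`, `notice`); your telemetry schema will differ.

```python
REQUIRED_NOTICE = "THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA."

def find_noncompliant_renders(renders):
    """Return pages that showed a personalized price but failed to display
    the exact required notice. `renders` is an assumed log format:
    dicts with page_id, a personalized flag, and the notice text shown."""
    return [r["page_id"] for r in renders
            if r["personalized"] and r.get("notice") != REQUIRED_NOTICE]

renders = [
    {"page_id": "pdp-1", "personalized": True, "notice": REQUIRED_NOTICE},
    {"page_id": "cart-7", "personalized": True},   # notice missing
    {"page_id": "pdp-2", "personalized": False},   # no disclosure needed
]
print(find_noncompliant_renders(renders))  # -> ['cart-7']
```

Wiring the same check into real-time alerting gives you the trigger for the incident playbook: any hit pauses personalization for the affected flow.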
Why this is the next battleground
Personalized pricing sits at the intersection of privacy, consumer protection, and competition. New York's move, and the law's early survival in court, invites copycat bills and tougher variants.
Legal teams that standardize disclosures, strengthen controls, and keep clean evidence will be ready for the next wave. Treat algorithmic pricing like any other high-risk automated decision: transparent, auditable, and defensible.
Resources
- FTC Business Blog: Using AI and Algorithms
- Complete AI Training: Courses by Job (for legal and compliance teams)