Overview
New York is one of a growing number of states introducing or passing legislation aimed at regulating or prohibiting algorithmic pricing practices, reflecting increased concern over the transparency of artificial intelligence (AI)-driven pricing tools that automate pricing decisions based on personal data. Since November 10, 2025, New York’s Algorithmic Pricing Disclosure Act (the Act) has required entities that set the price of a specific good or service using personalized algorithmic pricing to disclose their use of personal data to set pricing. While the Act is the first law specifically addressing algorithmic pricing in New York, it is part of a broader and accelerating trend as other states adopt similar laws.
In Depth
The Act requires any entity that sets the price of a specific good or service using personalized algorithmic pricing to add a clear and conspicuous disclaimer stating: “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA.” The disclaimer must be provided whenever an entity directly or indirectly advertises, promotes, labels, or publishes a statement, display, image, offer, or announcement of the pricing to a consumer in New York. Entities that fail to comply may face civil penalties of up to $1,000 for each violation. The law is not industry-specific and could apply to healthcare companies, digital health platforms, and other entities to the extent they use personalized pricing tools. In practice, these algorithmic and dynamic pricing tools are frequently powered by AI or machine learning models that automate pricing decisions based on data inputs such as location and shopping history.
The Act targets pricing practices that involve both automation and personalization. To specify the scope of these practices, the Act defines the following key concepts:
- Dynamic pricing is “pricing that fluctuates dependent on conditions.” The Act references “dynamic pricing” in its definition of “personalized algorithmic pricing,” which means dynamic pricing set by an algorithm that uses personal data. Dynamic pricing that does not take into account personal data (for example, pricing that fluctuates based on aggregate data, general market conditions, or time of day) is not governed by the Act.
- Personal data is “any data that identifies or could reasonably be linked, directly or indirectly, with a specific consumer or device.”
Although the Act is relatively new, recent regulatory attention suggests that New York authorities are already closely scrutinizing how companies deploy algorithmic pricing tools. New York Attorney General (AG) Letitia James has begun sending letters to companies regarding their use of algorithmic pricing.
Analysis
The Act does not regulate all forms of dynamic pricing but rather targets dynamic pricing that is personalized through the use of personal data and algorithmic decision‑making. Some states have taken a different approach by embedding algorithmic pricing restrictions within broader antitrust frameworks or by targeting specific industries.
State trends
- California Assembly Bill (AB) 325 took effect on January 1, 2026, and expanded the Cartwright Act, the state’s primary antitrust law. The law targets anticompetitive pricing practices by prohibiting the use or distribution of “common pricing algorithms” in anticompetitive agreements that restrain trade or commerce, including methodologies or technologies that rely on competitor data to recommend, align, stabilize, set, or otherwise influence prices or commercial terms.
- Connecticut passed House Bill (HB) 8002, which took effect January 1, 2026. Similar to California’s AB 325, HB 8002 houses its prohibition on algorithmic pricing tools in the state’s antitrust laws. Unlike New York’s broadly applicable disclosure requirements, HB 8002 focuses specifically on algorithmic pricing practices in the rental housing market.
- Pennsylvania is among the most recent states to introduce legislation targeting dynamic pricing. If enacted, Senate Bill (SB) 1205 would prohibit “unfair methods of competition and unfair or deceptive acts or practices in the conduct of any trade or commerce,” including promoting or engaging in dynamic pricing.
- Tennessee has introduced a bill similar to Pennsylvania’s SB 1205. SB 1807 would prohibit an entity from setting the price of goods and services using personalized algorithmic pricing. The bill would classify violations as “an unfair or deceptive act or practice affecting trade or commerce” in violation of the Tennessee Consumer Protection Act of 1977. If passed, it would take effect on July 1, 2026.
While the scope and mechanics of these laws vary, each reflects state governments’ interest in protecting citizens’ information and reducing unfair trade practices by increasing scrutiny of pricing decisions that rely on data‑driven or automated systems. This trend is particularly relevant for healthcare companies, where AI‑enabled pricing tools have become embedded in several core functions, including:
- Insurance premium pricing. Insurance companies have been using AI‑driven predictive models to set premiums and account more accurately for consumer behavior and market changes. These models analyze large data sets, including claims history, diagnostic codes, utilization patterns, and population‑level risk factors, to forecast costs and optimize pricing. While the models are typically applied at an aggregate level, increased reliance on data‑driven modeling raises questions about how premium price setting could be affected by a shift from anonymized data to more individualized personal or behavioral data.
- Provider‑side dynamic pricing. Hospitals have started using AI tools to gauge competitors’ rates, identify high‑margin procedures, and assist with resource management. Dynamic pricing has been implemented through cost adjustments for products and services based on supply‑and‑demand factors, which, when informed by automated tools or third‑party pricing software, may implicate state laws regulating algorithmic pricing practices.
- Direct‑to‑consumer health services. Direct‑to‑consumer health services such as telehealth platforms, digital health and wellness subscriptions, and concierge medicine services increasingly rely on AI‑enabled pricing models. These tools may be used to tailor subscription pricing, offer individualized discounts, or adjust fees based on user engagement, demand, or personal characteristics.
Key takeaway
As more states propose and adopt legislation regulating or prohibiting the use of algorithmic pricing, organizations using such tools should monitor developments at both the state and federal levels. Increasing legislation could soon change the landscape of how entities may legally incorporate AI and algorithmic pricing tools into their business practices. Healthcare companies that use algorithmic pricing tools should review applicable requirements and consider conducting periodic reviews of pricing methodologies to help ensure consumer data or competitor information is not being used in a prohibited pricing practice.
If you have questions or would like to discuss any issues related to New York’s disclosure requirement, contact your regular McDermott Will & Schulte lawyer or one of the authors.
Ashley Anumba, a law clerk in the New York office, also contributed to this client alert.