In an era defined by data and digital transformation, so-called “algorithmic pricing” has emerged as a natural extension of longstanding retail practices. Far from being a novel or dangerous innovation, algorithmic pricing is simply the use of tools suited to today’s fast-moving, competitive marketplace to bring greater efficiency, responsiveness and accuracy to how prices are set.
At its core, algorithmic pricing is a data-driven strategy that uses automated algorithms to adjust prices dynamically based on factors like consumer demand, competitor pricing, market trends and customer behavior.
These algorithms, often powered by artificial intelligence and machine learning, allow businesses to recalibrate prices in real time, maximizing competitiveness and optimizing supply and demand.
This isn’t new; it’s just faster. For decades, retailers have relied on manual competitor checks, seasonal adjustments and intuitive promotions to adjust prices. Algorithmic pricing does the same thing, just using modern tools.
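To make the idea concrete, a rule-based price adjustment of the kind described above might look like the following minimal sketch. The function name, thresholds and margin floor are illustrative assumptions, not any retailer's actual logic; real systems weigh many more signals.

```python
def adjust_price(base_price: float,
                 competitor_price: float,
                 demand_ratio: float,
                 floor_margin: float = 0.9) -> float:
    """Nudge a price toward a competitor's while respecting a margin floor.

    demand_ratio: recent sales vs. expected sales (1.0 = on target).
    floor_margin: lowest allowed fraction of base_price.
    """
    price = base_price
    # Match a lower competitor price -- the automated equivalent of a
    # manual competitor check -- but never drop below the margin floor.
    if competitor_price < price:
        price = max(competitor_price, base_price * floor_margin)
    # Soft demand (an off-peak period) triggers a small discount.
    if demand_ratio < 0.8:
        price *= 0.95
    return round(price, 2)
```

Run many times a day across thousands of items, logic like this simply automates the competitor checks and seasonal markdowns retailers have always performed by hand.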
Consumers benefit directly from this evolution. Pricing becomes more efficient, more competitive and more responsive to market realities. Common outcomes of responsible algorithmic pricing include lower prices during off-peak periods, tailored discounts and better inventory management.
Such pricing is not only common; today it is expected by consumers. Online retailers have long adjusted prices in response to demand shifts, customer segments and local conditions. These changes create real-time flexibility that supports consumer access and market fairness.
The Federal Trade Commission and the Department of Justice have both made clear that algorithmic pricing, when implemented independently and without coordination, is fully consistent with antitrust law and can be pro-competitive.
And yet, despite these clear benefits, some state governments have recently sought to paint algorithmic pricing as presumptively harmful. Disclosure mandates, such as those in a law recently enacted by the state of New York, impose sweeping obligations on businesses to notify customers every time a price is influenced by personal data.
While transparency is important, such mandates presume that algorithmic pricing is inherently problematic or unfair. That premise is not supported by the facts.
New York’s new disclosure law, for example, requires businesses to notify consumers whenever a price is determined by an algorithm using personal data. The notice must state: “This price was set by an algorithm using your personal data.” While the law aims to promote transparency, it does so by framing personalized pricing as something suspect — regardless of whether any harm or discrimination is present.
The statute’s vague definitions and broad scope may capture even routine loyalty programs and targeted discounts. That will undermine consumer trust in merchants that are doing nothing wrong; it also exposes businesses to compliance risks. And unfortunately, other states — including California and Minnesota — are now considering similarly overbroad proposals that could fragment the regulatory landscape and discourage responsible pricing innovation nationwide.
Requiring the disclosure of algorithmic activity may seem innocuous, but when the required disclosures are overly broad, the result will have a chilling effect on legitimate retail practices. Retailers might shy away from offering personalized discounts, tailored loyalty offers or efficient pricing updates simply to avoid the risk of triggering notice obligations that are confusing to consumers and mischaracterize benign pricing decisions as nefarious.
These vague and overbroad laws also create an uneven playing field: Smaller businesses that rely on third-party pricing software may face disproportionate burdens, while larger competitors with in-house tools are better able to navigate or absorb the compliance costs.
The result is a regulatory landscape that penalizes innovation and efficiency, particularly in the digital economy. By placing unnecessary burdens on dynamic pricing models, these mandates may lock in less efficient practices and reduce pricing flexibility, ultimately to the detriment of consumers.
More frequent price updates, for example, allow businesses to quickly respond to inflation, shifts in consumer demand and real-time inventory levels. Slowing or discouraging those updates in the name of disclosure could lead to less responsive pricing and reduced promotional opportunities.
Retailers are not seeking to avoid scrutiny. Those implementing algorithmic pricing models pay careful attention to fairness and to preserving customer trust. Industry best practices include using “white box” algorithms that are explainable, excluding protected personal data from pricing inputs and implementing oversight mechanisms to detect unintended bias.
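One of those safeguards, excluding protected personal attributes from pricing inputs, can be sketched as a simple guard. The field names here are hypothetical assumptions for illustration, not a prescribed schema.

```python
# Attributes a pricing model should never see; the list is illustrative.
PROTECTED_FIELDS = {"race", "gender", "age", "religion"}

def sanitized_features(customer_record: dict) -> dict:
    """Return only the features a pricing model is allowed to use."""
    return {key: value for key, value in customer_record.items()
            if key not in PROTECTED_FIELDS}

record = {"loyalty_tier": "gold", "age": 42, "recent_purchases": 7}
features = sanitized_features(record)
# "age" is stripped before any price is computed.
```

Guards like this run upstream of the pricing algorithm itself, which is what makes the resulting decisions auditable and explainable.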
The push for disclosure, while well-meaning, presumes guilt in a system that, when properly used, delivers tangible benefits to consumers.
In an increasingly digital retail economy, we should be fostering innovation that allows retailers to compete and respond to market forces — not discouraging it through burdensome, unnecessary regulations that suggest wrongdoing where there is none.
Consumers want fair prices. Retailers want to provide them. Algorithmic pricing helps make that possible. It’s time to stop treating it like a threat and start treating it like the tool for progress that it truly is.