Stopping the manipulation machines
By Greg Bensinger, New York Times
Last updated: May 20, 2021
Some things are difficult by design.
Consider Amazon. The company perfected the one-click checkout. But canceling a $119 Prime subscription is a labyrinthine process that requires multiple screens and clicks.
Or Ticketmaster. Online customers are bombarded with offers for ticket insurance, subscription services for razors and other items; once they navigate past those, they can expect a battery of text messages from the company with no clear way to stop them.
These are examples of “dark patterns,” the techniques that companies use online to get consumers to sign up for things, keep subscriptions they might otherwise cancel or turn over more personal data. They come in countless variations: giant blinking sign-up buttons, hidden unsubscribe links, red X’s that actually open new pages, countdown timers and pre-checked options for marketing spam. Think of them as the digital equivalent of trying to cancel a gym membership.
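To make the asymmetry concrete, here is a minimal sketch in TypeScript of two of those patterns: a pre-checked consent box and a deliberately buried decline option. The element names, copy and styling are invented for illustration and are not drawn from any real site.

```typescript
// A hypothetical sketch of two dark patterns described above: a pre-checked
// opt-in box and a visually buried decline link. All names, copy and styling
// here are invented for illustration.

function renderSignupForm(container: HTMLElement): void {
  const form = document.createElement("form");

  // The consequential choice is made for the user: the box arrives checked,
  // so doing nothing becomes "consent" to marketing messages.
  const optIn = document.createElement("input");
  optIn.type = "checkbox";
  optIn.checked = true; // pre-checked by default
  optIn.name = "marketing-consent";

  const label = document.createElement("label");
  label.append(optIn, " Keep me updated with exclusive offers");

  // The sign-up action is large and attention-grabbing...
  const signUp = document.createElement("button");
  signUp.textContent = "SIGN UP NOW";
  signUp.style.cssText = "font-size: 2em; background: #e00; color: #fff;";

  // ...while the decline path is a tiny, low-contrast text link.
  const decline = document.createElement("a");
  decline.textContent = "no thanks";
  decline.href = "#";
  decline.style.cssText = "font-size: 0.6em; color: #ccc;";

  form.append(label, signUp, decline);
  container.append(form);
}

renderSignupForm(document.body);
```

Nothing in that sketch is illegal on its face; the trick is entirely in the defaults and the visual weighting, which is what makes such patterns hard to regulate.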
There are plans in both the House and Senate to tackle dark patterns. And there’s movement at the state level, too. California strengthened its data privacy laws to include certain dark patterns, and in Washington State, lawmakers included similar language in a privacy bill of their own, though it failed to pass.
The phrase was coined more than a decade ago by Harry Brignull, a British user experience designer who maintains an online “hall of shame,” and since then dark patterns have only become more effective and pernicious. Because of the scale of the internet, if even a small percentage of these ploys work, many thousands or even millions of people may be affected.
Donald Trump’s 2020 campaign, for instance, used a website with pre-checked boxes that committed donors to give far more money than they had intended, a recent Times investigation found. That cost some consumers thousands of dollars that the campaign later repaid.
Enforcement against dark patterns has been uneven, and generally left to the Federal Trade Commission under its rules prohibiting “unfair or deceptive acts.” But those unfair and deceptive acts can be hard to identify, or even to notice — which is, of course, precisely as practitioners intend. Without a clear baseline of federal enforcement, dark patterns have flourished. Stronger rules defining the extent of the problem and addressing the most egregious tricks could help curtail the practice.
Parting consumers from their money is as old as retail itself. But with the benefit of real-time user data and the ability to quickly change online interfaces, dark patterns can be far more effective — and diabolical — than offline tricks. Protections that offline consumers enjoy, like cooling-off periods after buying a car, typically don’t apply to online transactions.
This week, the FTC held a workshop on dark patterns and their impact on consumers, particularly children. Last year, the FTC fined the parent company of the children’s educational program ABCmouse $10 million over what it said were tactics to keep customers paying as much as $60 annually for the service by obscuring language about automatic renewals and forcing users through six or more screens to cancel.
At the FTC event, panelists were shown an example of a children’s video game that threatened to turn a virtual pet over to the SPCA unless users made a $10 payment. Panelists, who included university researchers and regulators, also detailed how dark patterns are particularly effective when used against minority groups, the poor, the less educated and the elderly, echoing offline schemes.
“While there’s nothing inherently wrong with companies making money, there is something wrong with those companies intentionally manipulating users to extract their data,” Representative Lisa Blunt Rochester (Democrat, Delaware) said at the FTC event. She said she planned to introduce dark pattern legislation later this year.
A 2019 Senate bill would have banned tactics that “intentionally impair user autonomy, decision-making, or choice” and would have established a group to advise the FTC on dark pattern enforcement. It failed to advance, but will be reintroduced during this session of Congress. Without such uniform federal legislation, a patchwork of state rules could lead to varying levels of enforcement and definitions of dark patterns — potentially creating more confusion for consumers and opportunity for unscrupulous businesses.
At the state level, California has addressed some of the more unscrupulous dark patterns. It prevents companies from using design tricks to dupe Californians out of exercising their right to prohibit their data from being sold, for instance. The state’s privacy laws, due to be updated in 2023, will include further consumer protections.
That’s a start. But many other common practices must still be addressed for all consumers: buried or obscured unsubscribe buttons, fake sales countdown clocks, cancellation flows that force users to file multiple requests, inoperable links, intentionally confusing choices and miniature or obscured fonts.
In a recent experiment testing some of the most commonly used tactics, like multiple opt-out screens and double-negatives, a University of Chicago Law School professor, Lior Strahilevitz, and a law clerk, Jamie Luguri, found that using dark patterns is extremely effective at compelling consumers to pay for services they didn’t necessarily want. Participants in the study subjected to digital cajoling were nearly four times more likely than a control group to keep a paid data protection service they had been automatically signed up for.
More than 1 in 10 e-commerce sites rely on dark patterns, according to another study, which also found that many online customer testimonials (“I wouldn’t buy any other brand!”) and tickers counting recent purchases (“7,235 customers bought this service in the past week”) were phony, randomly generated by software programs.
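That finding is easy to picture in code. The sketch below is a hypothetical illustration, not anything taken from the study: it shows how a “purchases this week” ticker can be fabricated entirely on the client from a random number generator. The range and wording are invented.

```typescript
// A hypothetical sketch of a phony "recent purchases" ticker. The figure is
// invented on the client with a random number generator; no sales data is
// consulted at all. The range and wording are made up for illustration.

function fakePurchaseTicker(): string {
  // A random count in a plausible-looking range, different on every page load.
  const count = 5000 + Math.floor(Math.random() * 5000);
  return `${count.toLocaleString()} customers bought this service in the past week`;
}

console.log(fakePurchaseTicker());
```

Each visitor sees a different, equally fictitious figure, which is what makes such social-proof counters so cheap to fake and so hard for any individual shopper to disprove.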
“Everyone is frustrated with dark patterns,” Strahilevitz said. “Companies are taking a calculated risk that they won’t get caught doing deceptive things because there is no consistent enforcement mechanism for this.”
Strategies that took decades to streamline offline — at car lots, casinos, grocery stores and even on ballot initiatives — can now be perfected and refined essentially overnight thanks to the high-tech tools and real-time analytics available online. The largest companies also hire behavioral psychologists and game theorists to help hone their techniques.
“The internet shouldn’t be the Wild West anymore — there’s just too much traffic,” Lauren Willis, a Loyola Law School professor, said at the FTC event. “We need stop signs and street signs to enable consumers to shop easily, accurately.”
Companies can’t be expected to reform themselves; they use dark patterns because they work. And while no laws will be able to anticipate or prevent every type of dark pattern, lawmakers can begin to chip away at the imbalance between consumers and corporations by cracking down on these clearly deceptive practices.