- The Federal Trade Commission is calling out companies' use of so-called dark patterns that trick consumers into subscribing to services they don't want or into giving away data without realizing they're doing so.
- The agency says companies have long gotten away with deception, but the law is catching up to them.
- “These practices are squarely on the FTC’s radar,” the agency says.
In a report it released a few weeks ago, the FTC divides dark patterns into four broad groups: 1) inducing false beliefs, 2) hiding material information, 3) imposing unauthorized charges and 4) subverting privacy choices.
What they all share is design intended to mislead, such as making people think they're reading a news story when it's actually an advertisement, or waiting until a customer has already filled out a purchase form to disclose a fee.
In a typical case, the opt-in button is big and simple while the opt-out button is accessible only after making the user navigate to another page and search for it in paragraphs of dense legal text.
“Repeatedly presenting the choices as ‘Yes’ or ‘Not Now’ instead of ‘Yes’ or ‘No’” is a common pattern, the agency says.
Although Congress has yet to pass comprehensive online consumer protection legislation, the agency has a number of statutory tools at its disposal to go after companies, the FTC says.
These include the Restore Online Shoppers' Confidence Act, which targets negative-option subscription charges, as well as several decades-old laws that can be applied in an online context, such as the Telemarketing Sales Rule, which targets misleading pricing practices, and the Truth in Lending Act, which targets deceptive credit charges.
The Children's Online Privacy Protection Act, enacted in 1998 to regulate companies' collection of personal information from children under 13 years old, is another major tool the agency uses.
Probably the most comprehensive effort yet to combat dark patterns isn’t at the federal level but is incorporated into the California Consumer Privacy Act (CCPA), enacted in 2018 and amended in 2020. In a direct shot at how companies use design to manipulate people, the law specifically links some of the most important restrictions to how website and app functions look.
“It’s a really interesting concept they’re driving at here,” David Stauss, an attorney with Husch Blackwell, says. “They’ve essentially bundled the [enforcement] concepts with design aspects.”
To get permission to collect data, for example, “the link needs to appear in a similar manner as other links that the business uses on its home page,” says Shelby Dolen, also an attorney with Husch Blackwell. “They must use the same font size and color as any of the other links.”
Federal data privacy and security legislation making its way through Congress doesn't go nearly as far as the California law in mandating design elements, but it does require that opt-in and opt-out choices be presented clearly and with equal prominence, to prevent steering consumers in one direction over another.
That legislation, which passed a key House committee a few months ago with almost no opposition, is unlikely to pass this year but analysts see it as a rare example of major legislation that can overcome partisan gridlock.
Separate from that legislation, the FTC report shows, the agency has the regulatory tools to go after companies.
“Businesses [are] on notice that the FTC will continue scrutinizing these practices,” the agency says.