Dark patterns, the tricks websites use to make you say yes, explained

2021-04-01 15:20:00

If you're an Instagram user, you may have recently seen a pop-up asking whether you want the service to "use your app and website activity" to "provide a better ads experience." At the bottom are two boxes: in a slightly darker shade of black than the pop-up's background, you can choose to "make ads less personalized." A bright blue box urges users to "make ads more personalized" instead.

This is an example of a dark pattern: a design that manipulates or heavily pressures users into making certain choices. Instagram uses terms like "activity" and "personalized" instead of "tracking" and "targeting," so the user may not realize what they're actually giving the app permission to do. Most people don't want Instagram and its parent company, Facebook, to know everything they do and everywhere they go. But a "better experience" sounds like a good thing, so Instagram makes the option it wants users to choose more prominent and attractive than the one it hopes they'll avoid.

A "better ad experience" is subjective.

There is now a growing movement to ban dark patterns, and that could well translate into consumer protection laws and measures as the Biden administration's technology policies and initiatives take shape. California is currently tackling dark patterns in its evolving privacy laws, and Washington state's latest privacy bill includes a provision on dark patterns.

"When you look at the way dark patterns are used in digital engagement, they are generally (over the Internet) significantly exacerbated and made less visible to consumers," Rebecca Kelly Slaughter, acting chairman of the Federal Trade Commission (FTC), recode told. "Understanding its impact is very important to us in developing our strategy for the digital economy."

Dark patterns have been getting internet users to give up their data, money, and time for years. But if some lawmakers and regulators have their way, the companies that use them may not be able to do so for much longer.

Dark patterns, briefly explained

While you may not have heard the term dark patterns before, you have certainly seen countless examples of them – and experienced their effects:

  • The streaming service trial you signed up for that starts automatically billing you the moment the trial period expires
  • The interstitial ad in an app that you can't figure out how to close because the "X" in the top right corner is too small and faint to see ...
  • ... or the "X" is so small that you accidentally tap the ad itself and get taken to the advertiser's website
  • The drugstore account you have to create to get a vaccination appointment but can't easily delete
  • The marketing email that demands you respond within the next five minutes or else, complete with a fake countdown timer
  • The large pop-up urging you to sign up for a website's newsletter with a big red "Sign Me Up" button, while the decline option is much smaller and worded so passive-aggressively that anyone who clicks it must be a bad person who doesn't care about saving money or staying informed

But some effects aren't as clear-cut. Websites use dark patterns to trick users into consenting to be tracked, or into having their data used in ways they didn't expect or want. Or sites will claim to offer ways for users to opt out of tracking (usually because the law requires them to), but use misleading language or make it especially difficult to actually do so.

Ultimate Guitar's Pro Access sale has been just a few hours away from ending for the last several months, if not years.
Ultimate Guitar

Take cookie permission pop-ups, for example. Websites will tell you that they use cookies and then ask you to "accept" them, usually by clicking a large, prominent, brightly colored button. But if you want to refuse the cookies, you have to find your way to a settings menu and disable them manually. Most people don't have the time or the inclination to do this for every single website they visit, if they even understand what's being asked of them in the first place. Companies whose revenue depends heavily on user data don't want to make it easy for those users to refuse to provide it.

If you don't want Forever 21 to place cookies on your browser, you have to click through to the opt-out settings and manually disable each category.
Forever 21

Harry Brignull coined the term "dark patterns" in 2010 and has been cataloging them on his website ever since (he also wrote about them for The Verge in 2013). Dark patterns existed in the physical world long before the internet came along: '90s kids will remember mail-order music club Columbia House's great deal of 12 CDs for just one cent (plus shipping), which then automatically signed them up for a CD-a-month club that was almost impossible to cancel. But the internet has made dark patterns far more ubiquitous and powerful. Websites can fine-tune their methods using the very specific feedback their visitors provide, optimizing their manipulation on a scale the physical world could never reach in its wildest dreams.

"I think the Internet has made it easier to industrialize the way we persuade and, in turn, manipulate and mislead each other," Brignull told Recode.

Some of the more obviously scammy dark patterns, such as sneaking extra items into shopping carts or springing hidden costs on buyers, have been made illegal in some places, and the FTC has gone after some of the most egregious offenders. But the law isn't as clear-cut when it comes to privacy, data, and consent.

Some websites use guilt or shame tactics to convince you to hand over your personal information.
Florsheim

It's difficult to know what counts as an actionable deceptive act or practice when there's no privacy law at all. And it's hard for consumers to know what they're inadvertently giving away, or how it could be used against them, when it all happens behind the scenes.

"It's a bit like invisible health effects from inhaling vapors or getting a dose of radiation: you may not realize it at that point, but it has a hidden impact on you," Brignull said. "With privacy, it's pretty hard to think and understand what the long-term implications are for you. You're constantly leaking information about yourself to data brokers and you don't really know how they're using it to market."

That's why Brignull and a growing number of attorneys, regulators, and lawmakers believe legislation is needed to stop dark patterns, so that consumers can use the internet without constantly being manipulated into spending money, signing up for services they don't need, or giving away their personal details.

"Regulation works," Brignull said. "It can really turn the internet into a good place to be, rather than a complete Wild West environment. And we need it."

How laws and regulations can stop the worst dark patterns

If you live in California, you already have some of these protections. One of Attorney General Xavier Becerra's last acts before leaving office to head the Department of Health and Human Services was to add regulations around dark patterns to the state's Consumer Privacy Act (CCPA). These ban dark patterns intended to make it difficult for consumers to exercise some of the rights the law gives them, such as opting out of the sale of their data. Prohibited dark patterns include forcing users to click through multiple screens, making them scroll through lengthy privacy policies, urging them not to opt out, or using confusing language.

Washington state's third attempt to pass a privacy law, currently working its way through the legislature, says dark patterns may not be used to obtain user consent to sell or share data, a provision that echoes the recently passed California Privacy Rights Act (CPRA), an expansion of the CCPA.

Federal lawmakers are also paying attention to dark patterns. During a recent House Energy and Commerce Committee hearing on social media and disinformation, Rep. Lisa Blunt Rochester (D-DE) asked Big Tech CEOs Mark Zuckerberg, Sundar Pichai, and Jack Dorsey whether they would oppose legislation banning dark patterns that trick users into giving away their data. That data, she said, is often fed into algorithms that target people who are particularly susceptible to misinformation.

"Our children ... seniors, veterans, people of color, even our democracy is at stake here," Blunt Rochester said. "We have to do something. And we assure you that we will act."

At the end of last year, the congresswoman introduced the DETOUR (Deceptive Experiences To Online Users Reduction) Act, the House version of the bill of the same name that Sens. Deb Fischer (R-NE) and Mark Warner (D-VA) introduced in 2019.

"I introduced the DETOUR Act to address the common tactics tech companies use, which are used to collect as much personal information as possible," Blunt Rochester told Recode. "They are deliberately deceptive user interfaces that trick people into transferring their data."

The bills targeted only online services with more than 100 million monthly active users (Twitter, Facebook, and YouTube, for example) and would prohibit them from designing user interfaces that manipulate users into consenting to provide their personal information. The platforms would also be barred from running design-change experiments on users without their consent.

Blunt Rochester and Warner told Recode they plan to reintroduce the DETOUR Act this session.

"I am committed to working with my colleagues in Congress to ban the use of these deliberately manipulative practices designed to extract users' personal information," said Blunt Rochester.

Sen. Fischer did not respond to a request for comment, but she rolled the DETOUR Act into the SAFE DATA Act, Senate Commerce Committee Republicans' version of a federal privacy law, which they may reintroduce this session.

Finally, the FTC, which would likely be responsible for enforcing any legislation on dark patterns, is also scrutinizing the practice.

"This is behavior that we take seriously," said Slaughter, of the FTC.

The FTC is planning to hold a workshop on the topic in late April, where it will discuss how dark patterns manipulate consumers, which groups may be particularly vulnerable to or harmed by this manipulation, what rules exist to stop them, and whether additional rules are needed and what they should look like.

"I view this issue as much more of a data misuse than just data privacy," Slaughter said. “The first step in collecting your data may not be the immediate damage. But how then is that data aggregated, used and transferred to manipulate your purchases, target ads, create this surveillance economy that does a lot of downstream harm to users in a way that is less visible to the user or the public? "

The FTC's authority here stems from its mandate to act against deceptive or unfair trade practices, and the agency has gone after offenders that use dark patterns where it can. Tricking people into signing up and paying for subscriptions or services, then intentionally making them difficult to cancel, is one clear and actionable example. Making people think they're buying something for a set price without revealing the extra costs is another.

One of the few federal privacy laws we do have, the Children's Online Privacy Protection Act, gives the FTC authority over many privacy violations against children under 13, and many dark patterns fall within that law's scope. But no such law exists for adults, so confusingly worded privacy policies and opt-outs that lead to data misuse may require legislation that explicitly prohibits them before the FTC has the authority to act.

That legislation won't be easy to write, either. The line between deliberate deception and legitimately persuading a user to make a choice that substantially benefits a business can be blurry.

"Part of the challenge in regulating dark patterns is the gray areas: the cases where users of a technology are constrained to the point that they cannot exercise full autonomy, but they may not be fully manipulated, or may be forced but with a light touch," Jennifer King, privacy and data policy fellow at the Stanford University Institute for Human-Centered Artificial Intelligence, told Recode.

In lieu of a federal privacy law, Slaughter says she hopes to use Section 18 of the FTC Act to exercise the commission's rule-making power.

"The FTC should have a clearer, more straightforward administrative procedure law to handle this sort of thing," Slaughter said. But in the meantime, I'm really excited to use all the tools we have, including our Section 18 authority, to tackle it. Is it easy? No. Is it fast? No. Is it worth it? Yes Because if we just wait for Congress to do something, we can wait a long time. ”

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.


