Friday, November 22, 2024

Is Musk’s X Using Dark Patterns To Trick Users? EU Says “Yes”

The European Union has accused X, formerly known as Twitter, of using deceptive “dark patterns” to mislead users. According to the EU’s preliminary findings, these practices violate the bloc’s new social media law, the Digital Services Act (DSA).

What Are Dark Patterns?

Dark patterns are user interface (UI) designs crafted to trick users into taking actions they might not otherwise take. The term was coined by Harry Brignull, founder of Deceptive Patterns, and describes UI elements that manipulate users into decisions that benefit the service provider, often at the user’s expense. These patterns exploit cognitive biases and lapses in attention to steer users down a path they may later regret.

Examples of dark patterns include:

  • Fake Scarcity: Pressuring users to act quickly by falsely claiming that what they are looking at is almost gone (see the sketch after this list).
  • Disguised Ads: Tricking users into clicking on ads that look like navigation elements.
  • Roach Motel: Making it easy to get into a situation (e.g., signing up for a service) but hard to get out (e.g., canceling a subscription).
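
To make the first of these concrete, here is a minimal, hypothetical TypeScript sketch of a fake-scarcity widget. None of it is drawn from any real platform’s code; the function names and numbers are invented purely for illustration.

```typescript
// Hypothetical "fake scarcity" sketch – for illustration only.
// The urgency cues below are generated by the interface itself,
// with no connection to real inventory or a real deadline.

// The banner claims limited stock, but the number is random
// rather than read from any actual inventory system.
function renderStockBanner(): string {
  const fakeUnitsLeft = Math.floor(Math.random() * 3) + 1; // always "1–3 left"
  return `Only ${fakeUnitsLeft} left in stock – order soon!`;
}

// A countdown that silently restarts when it hits zero, so the
// "offer ends soon" pressure never actually expires.
function startFakeCountdown(onTick: (secondsLeft: number) => void): void {
  let secondsLeft = 600;
  setInterval(() => {
    secondsLeft = secondsLeft > 0 ? secondsLeft - 1 : 600; // reset, not expire
    onTick(secondsLeft);
  }, 1000);
}

// Usage: the page shows urgency that has no basis in reality.
console.log(renderStockBanner());
startFakeCountdown((s) =>
  console.log(`Offer ends in ${Math.floor(s / 60)}:${String(s % 60).padStart(2, "0")}`)
);
```

The deceptive part is not the banner or the timer themselves but the fact that they are manufactured by the interface rather than driven by any real scarcity.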

EU’s Allegations Against X

The European Commission’s investigation under the DSA concluded, in preliminary findings, that X’s blue checkmarks, available for a monthly fee, are a form of dark pattern. For most of Twitter’s history, these checkmarks were a mark of verification for celebrities, politicians, and other influential accounts, an indication that the identity of the account holder had been verified.

Other social networks, such as LinkedIn, continue to use a “verified” badge to indicate that they have authenticated the member’s identity.

In contrast, under Elon Musk’s ownership, any X user can obtain a blue checkmark by paying $8 per month. This shift has led to confusion and deception, as users can no longer rely on the blue checkmark as a sign of authenticity and trustworthiness.

A quick scan of trending topics on X showed plenty of blue checkmarks – some recognizable names, some unfamiliar ones that might or might not be real, and many anonymous accounts. At this point, the blue checkmark seems to mean only that somebody paid eight dollars to X – the person, an organization, a bot-farm operator… there’s no way to tell.

According to the EU, this practice “negatively affects users’ ability to make free and informed decisions about the authenticity of the accounts and the content they interact with.”

The blue checks are just one aspect of the broader accusations, which also include restricting researchers’ access to data and shortcomings in X’s ad database.

Understanding the EU Allegations about X

The EU’s findings suggest that the current system of blue checkmarks on X can mislead users into believing certain accounts are more trustworthy or authentic than they actually are. This practice undermines the original purpose of verification, which was to provide a reliable indicator of an account’s legitimacy. By allowing anyone to purchase a blue checkmark, X has stripped the badge of its reliability as a marker of genuine verification.

The allegations also extend to X’s compliance with transparency rules for advertising. The DSA requires platforms to maintain a searchable and reliable database of all digital advertisements, including details about who paid for them and their intended audience. The EU’s investigation found that X’s ad database is not up to these standards, hindering researchers’ ability to study emerging risks from online ads.
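
As a rough illustration of what such a repository entails, here is a hypothetical TypeScript record type loosely modeled on the kinds of fields the DSA asks platforms to publish. The field names and the query helper are invented for this sketch, not taken from X’s (or anyone else’s) actual schema.

```typescript
// Hypothetical ad-transparency record, loosely modeled on the DSA's
// ad-repository requirements. Field names are invented for illustration.
interface AdTransparencyRecord {
  adId: string;                  // stable identifier so researchers can cite the ad
  content: string;               // the ad creative, or a reference to it
  advertiserName: string;        // who the ad is presented on behalf of
  payerName: string;             // who paid, if different from the advertiser
  firstShown: Date;              // start of the period the ad ran
  lastShown: Date;               // end of the period the ad ran
  targetingParameters: string[]; // main criteria used to select the audience
  aggregateReach: number;        // aggregate number of users who saw the ad
}

// A repository is only useful if it is searchable; a minimal query helper:
function findAdsByAdvertiser(
  repo: AdTransparencyRecord[],
  name: string
): AdTransparencyRecord[] {
  return repo.filter((ad) => ad.advertiserName.toLowerCase() === name.toLowerCase());
}
```

The EU’s complaint, in essence, is that X’s repository does not reliably deliver this kind of searchable, complete record to researchers.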

The Broader Issue of Dark Patterns

Dark patterns are not unique to X. Many platforms use similar tactics to nudge users into making decisions that benefit the company. These can range from making it difficult to cancel subscriptions to automatically enrolling users in services without their explicit consent.

All of the major tech companies have been reported multiple times on the Deceptive Patterns site. Even Trump campaigners and fundraisers make the list.

The most common dark patterns involve hiding a choice that favors the user, or making the action that favors the brand look like the default or only option. Years ago, I inadvertently shared my contacts with LinkedIn by clicking a “Continue” button after adding a connection. Later, LinkedIn suggested I connect with my deceased mother on the platform.

Regulatory Response and User Awareness

The EU’s action against X is part of a broader effort to regulate digital platforms and protect users from manipulative practices. The Digital Services Act represents a significant step in that direction, setting clear guidelines for transparency and accountability.

As users, we need to be aware of dark patterns and understand how they can influence our decisions.

The EU is scrutinizing X for its alleged use of dark patterns, but this issue is common across many digital platforms. Understanding and recognizing dark patterns is crucial for consumers to make informed decisions and for businesses to maintain trust and transparency. By not only recognizing these tactics but also calling out the offenders, we can make better choices and hold platforms accountable.
