Unveiling the Deceptive Insight Scheme: A new dark pattern in town.

Yashasvi Nagda · Published in Bootcamp · 6 min read · Jan 12, 2024


Exposing the exploitation of curiosity and the illusion of personalization.

Today, we already have a well-defined list of dark patterns that exploit various negative human emotions such as fear, anxiety, frustration, guilt, shame, peer pressure, impatience, isolation, insecurity, and regret. But while many dark patterns capitalize on negative emotions, there are also manipulative tactics that leverage positive ones.

Recently, I identified a pattern that preys on the positive emotion of ‘Curiosity,’ which I’m calling the “Deceptive Insight Scheme.” This pattern has been operating in plain sight, generating interest by disclosing enticing but limited information to the user. For instance, platforms like Academia.edu and LinkedIn notify users that they have been cited or that their profile has been viewed, but require payment for the detailed insights, creating a sense of curiosity and prompting users to pay for the actual information. When you see that you have been cited in a seemingly valuable publication, or that multiple hiring managers are viewing your profile, it serves as social proof: an indication that others find value or importance in what you’re doing. The pattern’s strategy is to engage users and instil a sense of curiosity by partially revealing some information, ultimately leading them to pay for the actual insights.

Images: an email from a research website telling the user that they have been cited, with a call-to-action button to view the citations; and the website then asking the user to pay a certain amount to view them.

Example: Here, Academia.edu sends out an email or notification mentioning that the user has been cited or mentioned by someone. To see who mentioned you, or where you were cited, it asks you to pay for premium benefits.

Image: LinkedIn’s standard analytics on the user’s profile page.
Image: LinkedIn showing the number of profile viewers and partial information about probable viewers.

Similarly, on LinkedIn you get a notification saying that a certain number of people from certain reputable organisations have viewed your profile. The only way to see who viewed your profile is to subscribe to their paid Premium plans. (A spoiler: I tested it, and it neither shows the number of viewers claimed nor reveals any insight beyond what is already available.)

This pattern isn’t confined to formal products or spaces; dating applications like Tinder and Bumble also employ a similar strategy. Features like ‘Likes You’ and ‘Beeline’ show a certain number of people interested in the user but withhold details, encouraging users to subscribe to premium plans for full access.

It’s a strategy that plays on the desire for connection and curiosity about potential matches, using it as an incentive to convert users into paying customers. These instances can be categorized as a form of “Limited Access Deception.” By providing information about supposed mentions or profile views without revealing the specifics and then asking for payment to access those details, the user is manipulated into a sense of curiosity or urgency.

What makes these patterns problematic is the facade of authenticity created by the numbers shown.

Data, or rather numbers, play a substantial role in this pattern. For example: “13 papers mention you,” “48 hiring managers viewed your profile,” “7,801 people already liked your profile,” and so on. Numerical information is often perceived as more objective and scientific.

The use of numbers can create an illusion of precision and accuracy, leading individuals to view the information as more credible.

Dark patterns aim to exploit psychological tendencies to influence user behaviour, often by creating a false sense of urgency, scarcity, or exclusivity. In this case, the user may feel compelled to pay to uncover information, even though there’s uncertainty about the authenticity of the data they’re paying for. It is unclear whether the information is genuine or fabricated until the user makes the payment, creating a deceptive user experience.

Identifying the pattern

Image: a dating application showing partial information about users who may have already liked the user.
Note the commonality in how these different products present the information.

Following are the identifiable traits or elements of this pattern (a small illustrative sketch follows the list):

  1. Initiates engagement, typically through a personalised email or a notification. Note that, compared to other dark patterns, which are relatively subtle, this pattern is more explicit: it initiates user engagement rather than waiting for the user to use the product and fall into the trap.
  2. Shows numbers to build a facade of credibility.
  3. Shows believable but partial information: user names, roles, and (blurred) pictures.
  4. A pay-to-view action (the most crucial element).
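
To make the checklist concrete, here is a minimal sketch, in Python and purely for illustration, of how these four traits could be recorded and used to flag a product flow during a design audit. The class, its field names, and the “pay-to-view plus at least two supporting traits” rule are my own assumptions, not part of any established framework.

```python
# A minimal, purely illustrative sketch: the class, field names, and the
# scoring rule below are assumptions for this article, not an established method.

from dataclasses import dataclass


@dataclass
class NotificationFlow:
    initiates_engagement: bool  # trait 1: product reaches out via a mail or notification
    shows_numbers: bool         # trait 2: counts used to build a facade of credibility
    partial_identities: bool    # trait 3: blurred pictures, hidden names or roles
    pay_to_view: bool           # trait 4: full details locked behind a payment


def matches_deceptive_insight_scheme(flow: NotificationFlow) -> bool:
    """Flag a flow that has the pay-to-view trait plus at least two of the
    other three traits (an assumed threshold, purely for illustration)."""
    supporting = sum(
        [flow.initiates_engagement, flow.shows_numbers, flow.partial_identities]
    )
    return flow.pay_to_view and supporting >= 2


# Example: the Academia.edu citation email described earlier.
citation_mail = NotificationFlow(
    initiates_engagement=True,   # unsolicited "you have been cited" email
    shows_numbers=True,          # "13 papers mention you"
    partial_identities=True,     # citing papers not named until you pay
    pay_to_view=True,            # premium plan required to see the citations
)
print(matches_deceptive_insight_scheme(citation_mail))  # True
```

The threshold is arbitrary; the point is simply that trait 4 is the decisive element, with the other traits acting as supporting evidence.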

Does it qualify as a dark pattern?

I absolutely believe that nothing is black or white. How dark these dark patterns are depends on a multitude of factors, as I have argued before. What some consider dark, others may see as a good sales tactic. Some of the aforementioned patterns may even have been introduced to provide a more delightful experience; it is all subject to discussion.

The identified pattern, referred to here as the “Deceptive Insight Scheme,” exhibits several traits commonly associated with dark patterns. Here are the key points that justify categorizing it as one:

  1. Deceptive information: This pattern presents information about potential matches or profile views, creating an illusion of personalization and user popularity. However, the pattern intentionally withholds specific details, leading users to form inaccurate perceptions.
  2. Curiosity exploitation: The pattern preys on users’ natural curiosity about who is interested in them or viewing their profile. By creating a sense of mystery and then requiring payment to unveil the details, it exploits users’ emotional responses to curiosity.
  3. Withholding information: The pattern deliberately withholds crucial information about the cited individuals or profile viewers unless the user pays for access. This lack of transparency goes against ethical design principles and can be considered manipulative.
  4. Monetary motive: The primary motivation behind the pattern is to drive users to pay for additional details. This aligns with the dark pattern characteristic of prioritizing financial gains over the user’s best interests.
  5. Potential for inauthenticity: Users who pay to access the information are left uncertain about the authenticity of the details provided. There’s a risk that the data, such as the number of profile views or mentions, may be exaggerated or fabricated.
  6. Emotional manipulation: This pattern relies on positive emotions, such as the desire for social validation and connection, to manipulate user behavior. The pattern leverages users’ emotional vulnerabilities to encourage payment.
  7. Lack of user empowerment: The pattern often involves design decisions that limit user agency or force them into actions they might not have taken with full information. The pattern restricts access to meaningful information, limiting users’ ability to make informed decisions without paying.

It’s important to note that while positive emotions can be ethically harnessed to improve user experience, the line is crossed when these emotions are exploited or manipulated in a way that leads to deceptive practices or coerces users into unintended actions. Ethical design must prioritize the well-being and autonomy of users.

Open for discussion

We can always argue about whether this is a dark pattern or a smart business tactic. The foreseeable next step is to find a healthy balance, so that we neither kill smart business practices nor manipulate or deceive users. This discussion matters because, yes, businesses today are built on and survive on data: everything is data, and data is everything. They invest resources and effort in collecting meaningful data to provide personalised, sometimes hyper-personalised, experiences. So, in my opinion, there is nothing wrong with using it to sustain the business.

I understand the greyness of the matter. But as a UX-er, I believe it is my duty to advocate for the users, so I’ll leave it to the other side to argue for the businesses. Perhaps in the near future we can find a happy medium that works for both users and businesses.


Talks about UX, Persuasive tech, Deceptive design • Experience Designer at Bosch • NIT-B Graduate