Social Media No Longer Compete to Connect: They Battle to Know More About You

For years, social media presented itself as the great infrastructure of digital conversation: a place to stay connected, discover information, and take part in communities. In 2026, that promise coexists with a much more tangible reality: a large part of the model is built to measure, profile, and optimize what we do, and what we are likely to do, for commercial purposes.

The ongoing trial in the United States over the addictive design of certain platforms and their impact on adolescent mental health comes at a moment when the discussion is no longer just about “what content circulates,” but about how behavior is engineered to keep the wheel spinning: notifications, infinite scroll, recommendations, and an incentive system that rewards quick reactions over reflection.

When “service” equals data collection

The key to understanding this phenomenon is simple: modern advertising is no longer just about showing ads, but about getting the right message to the right person at the right time. And to do that, information is essential.

This is where data extraction comes in. A comparative index published by IT Asset Management Group illustrates the kind of data collection that is becoming normalized on mobile devices. In its ranking of the “most invasive” apps, Instagram and Facebook are tied for first place: 32 types of data collected, 25 linked to the user, and 7 linked and tracked, for a score of 61.47/100.

This isn’t an abstract debate about “privacy” in general: these are specific categories that make clear the scope of the instrumentation built into our digital daily lives.

It’s not just how much data is collected, but how it’s used

The same ranking also reveals an important nuance: there is a difference between data that is “linked to the user” and data that is “linked and tracked.” Threads appears with 32 data types collected and 32 linked to the user, but 0 linked and tracked, for a score of 54.53/100. The same applies to Meta Business Suite and Messenger, with identical values and scores.

By contrast, other apps on the list combine less direct linking with more tracking: Grab shows 27 data types, 8 linked, and 15 linked and tracked (55.57/100). Toward the lower end of the top 10, Nordstrom Rack gathers 22 types, with 4 linked and 18 linked and tracked (53.62/100); Nordstrom, 22 types, with 5 linked and 17 linked and tracked (52.54/100); and Pinterest, 29 types, with 22 linked and 6 linked and tracked (50.06/100).
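To make the comparison easier to scan, here are the figures cited above laid side by side in a minimal Python sketch. The language and layout are ours for illustration; the scores are simply reproduced as the index reports them, and its scoring formula is not restated here.

```python
# Figures cited above, from IT Asset Management Group's ranking of the
# "most invasive" apps. Scores are reproduced as reported.
apps = [
    # (app, data types collected, linked to user, linked and tracked, score /100)
    ("Instagram",           32, 25,  7, 61.47),
    ("Facebook",            32, 25,  7, 61.47),
    ("Grab",                27,  8, 15, 55.57),
    ("Threads",             32, 32,  0, 54.53),
    ("Meta Business Suite", 32, 32,  0, 54.53),
    ("Messenger",           32, 32,  0, 54.53),
    ("Nordstrom Rack",      22,  4, 18, 53.62),
    ("Nordstrom",           22,  5, 17, 52.54),
    ("Pinterest",           29, 22,  6, 50.06),
]

# Print the comparison ordered by score, most invasive first.
for name, collected, linked, tracked, score in sorted(apps, key=lambda a: a[-1], reverse=True):
    print(f"{name:<20} collected={collected:>2}  linked={linked:>2}  "
          f"linked+tracked={tracked:>2}  score={score:.2f}")
```

Laid out this way, the two patterns described above are easier to see: some apps link almost everything directly to the user, while others link less but track more.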

The conclusion isn’t that “all are the same,” but that the ecosystem has become extraordinarily sophisticated on two fronts: identifying (matching data to a person) and tracking (cross-referencing signals to infer habits, interests, and propensity to act).

The attention economy: what’s rewarded multiplies

Once the product becomes measurable, the next step is optimization. And that is where the issue that worries families, educators, and regulators today comes in: if the system makes more money the longer you stay, the algorithm tends to prioritize whatever keeps you engaged.

This doesn’t necessarily mean “bad content,” but rather highly engaging content: controversy, emotion, outrage, fear, social comparison, and short stimuli that invite repetition. For teenagers, who are at a stage particularly sensitive to social acceptance and self-image, this dynamic can amplify insecurities and reinforce unhealthy habits.
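No platform publishes its ranking code, so the following is a deliberately simplified, hypothetical sketch: the Post fields, the weights, and the example items are all invented for illustration. It only captures the structural point made above: once a feed orders content by a score built from predicted clicks, reactions, and dwell time, whatever provokes the strongest response rises to the top.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    p_click: float          # predicted probability the user taps the post
    p_react: float          # predicted probability of a like, comment, or share
    expected_dwell: float   # predicted seconds of attention

# Hypothetical weights: a feed tuned purely for retention rewards any signal
# that keeps the user on the platform, regardless of how it makes them feel.
WEIGHTS = {"p_click": 1.0, "p_react": 2.0, "expected_dwell": 0.05}

def engagement_score(post: Post) -> float:
    """Toy objective: a weighted sum of predicted engagement signals."""
    return (WEIGHTS["p_click"] * post.p_click
            + WEIGHTS["p_react"] * post.p_react
            + WEIGHTS["expected_dwell"] * post.expected_dwell)

candidates = [
    Post("Calm explainer",          p_click=0.20, p_react=0.05, expected_dwell=40),
    Post("Outrage-bait thread",     p_click=0.55, p_react=0.40, expected_dwell=90),
    Post("Friend's vacation photo", p_click=0.35, p_react=0.25, expected_dwell=15),
]

# The feed simply shows the highest-scoring items first: the objective,
# not any editorial judgment, decides what gets amplified.
for post in sorted(candidates, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):6.2f}  {post.title}")
```

Change the weights and the feed changes character, which is why the debate keeps returning to incentives rather than to individual pieces of content.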

That’s why the debate over addictive design isn’t superficial. It’s not just about “using your phone less”: it’s a discussion about product architecture and about what happens when that architecture is deployed at scale across hundreds of millions of people.

Can social media be “fixed”?

Here, the debate splits into two paths, both of which are on the table:

  1. Reforms and limits to the current model. Greater transparency, restrictions on tracking, stricter controls for minors, external audits, limits on profile-based personalization, and obligations to explain how the content each user sees is selected.
  2. Changing incentives. If the business relies on profiling and retention, superficial changes are insufficient. Alternatives include models where profitability isn’t based on surveillance: subscriptions, paid versions without tracking, true interoperability, or smaller networks with different governance.

No solution is immediate, but one thing is clear: if incentives don’t change, design will tend to revert to the same pattern.

What can a user do today without becoming an expert?

For a general-audience tech outlet, it’s worth distilling all of this into practical decisions:

  • Review app permissions and limit anything that isn’t essential (location, contacts, cross-app tracking).
  • Reduce personalization (history, activity outside the platform, “interest-based” ads).
  • Simplify your usage: one app for messaging, another for entertainment, so that no single platform absorbs everything.
  • For minors, prioritize managing screen time and the type of content (not just blocking, but guiding and understanding their consumption).
  • Adopt a simple rule: if a service is “free” and lives off hyper-targeted ads, you are paying with your data and attention.

The blind spot of 2026: normalizing the abnormal

Social media won’t disappear tomorrow. But the way we evaluate them may change: less as “neutral public squares” and more as what they already are in practice, mass-market products with a business model based on data.

The trial in the U.S. is a sign of the times, but the real issue won’t be settled in the courts alone. The underlying question is whether society keeps paying the costs (privacy, attention, mental health) or starts demanding something different: services that connect without surveilling and entertain without exploiting.

Source: Redes Sociales
