Bot Farms and the Synthetic Internet: Power, Perception, Labor, and the Philosophy of Attention


Most people know bots exist. They have seen spam replies, suspicious follower spikes, strange comment threads, and engagement that feels artificial. The surprise is not that bots are online. The surprise is how much of the internet they now occupy, and how deeply they shape systems that affect politics, wealth, labor, culture, and even mental health.

Nearly half of internet traffic is automated. According to the 2024 Bad Bot Report from Imperva, 49.6 percent of all web traffic in 2023 came from bots, and 32 percent of all traffic was classified as bad bots.
Full report:
https://www.imperva.com/resources/resource-library/reports/2024-bad-bot-report/

That statistic alone reframes the internet. When half of activity is automated, we are no longer simply navigating a human conversation augmented by technology. We are navigating a mixed environment where human and synthetic behaviors constantly interact.

Bot farms are not just a technical nuisance. They sit at the center of interconnected systems involving world events, advertising markets, labor exploitation, emotional wellbeing, and deeper philosophical questions about truth and reality.

To understand them clearly, we need to understand what they are, why they exist, how money flows through them, and how they reshape the architecture of public life.


What a Bot Farm Actually Is

A bot farm is a coordinated operation where one operator or group controls large numbers of automated accounts or devices that simulate human behavior online at scale.

These systems can:

  • Visit websites
  • Click advertisements
  • Share links
  • Generate comments
  • Post long-form content
  • Create fake reviews
  • Register accounts in bulk

Some bot farms are entirely virtual, running through distributed cloud servers that rotate IP addresses to avoid detection. Others are physical spaces filled with smartphones mounted on racks, connected to software that automates scrolling and engagement.

A click farm uses low-paid human labor to perform fake engagement manually. A bot farm automates that labor through software.

The goal is not necessarily to create new ideas. It is to manufacture signals in systems that reward signals.

On today’s internet, signals equal visibility. Visibility equals influence or money.


World Events: Synthetic Amplification and Political Power

Bot farms become most visible during elections, wars, and geopolitical crises.

Automated networks amplify narratives, flood hashtags, and create the appearance of widespread support or outrage. They do not need to persuade everyone. They only need to alter perception at key moments.

Research published in Nature Communications found evidence that bots amplified misinformation during major political events:
https://www.nature.com/articles/s41467-020-18338-9

The Atlantic Council and its Digital Forensic Research Lab have documented coordinated inauthentic behavior linked to state and non-state actors across multiple countries:
https://www.atlanticcouncil.org/programs/digital-forensic-research-lab/

In 2019, Facebook removed networks operating out of Kosovo and North Macedonia that targeted US political audiences:
https://about.fb.com/news/2019/03/remove-cib-from-macedonia/

The significance is not only about misinformation. It is about perception management.

If bots flood a topic with thousands of comments within minutes, algorithms detect engagement velocity and push that content further. Synthetic activity becomes real exposure. Real exposure triggers human responses. Human responses reinforce visibility.

A feedback loop forms.

In moments of crisis, even small distortions can cascade into broader narratives that influence public opinion and policy decisions.
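
The feedback loop described above can be sketched as a toy model. All the numbers here are illustrative assumptions, not any platform's real ranking algorithm: exposure scales with recent engagement, and a fixed fraction of newly exposed humans engage in turn.

```python
# Toy model of engagement-velocity amplification (illustrative assumptions,
# not any real platform's algorithm).

def amplification(seed_bot_engagements, human_engage_rate=0.02,
                  exposure_per_engagement=50, rounds=5):
    """Total human engagements triggered by an initial bot burst."""
    engagements = seed_bot_engagements
    total_human = 0
    for _ in range(rounds):
        exposure = engagements * exposure_per_engagement  # algorithm pushes content
        human = exposure * human_engage_rate              # some viewers react
        total_human += human
        engagements = human  # the next round runs on real human activity
    return total_human

print(amplification(1000))  # a 1,000-click bot burst seeds ongoing human engagement
```

The key quantity is the loop gain, exposure_per_engagement times human_engage_rate. With these hypothetical values the gain is exactly 1, meaning the synthetic burst sustains itself on real human responses; any gain above 1 produces a runaway cascade.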

World events are now inseparable from information infrastructure. And that infrastructure is partially automated.


Wealth, Labor, and the Political Economy of Bots


Bot farms exist because they are profitable.

Global ad fraud losses are projected to exceed 100 billion dollars annually, according to Juniper Research:
https://www.juniperresearch.com/press/digital-ad-fraud-losses-to-exceed-100bn

Other estimates from firms such as Statista and industry researchers place annual digital ad fraud between 70 and 120 billion dollars.

That money does not disappear. It flows.

Bot operators generate fake impressions and clicks. Advertisers pay for engagement that never involved a real human. Traffic brokers resell inflated metrics. Arbitrage publishers monetize synthetic traffic.

The digital advertising economy runs largely on impressions and engagement counts. If, as industry estimates suggest, 14 to 22 percent of pay-per-click traffic is invalid or undetected fraud, then optimization systems are being trained on corrupted data.

This is not just fraud. It is structural distortion.
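
The back-of-envelope arithmetic is easy to run. The fraud-rate range comes from the estimates cited above; the budget figure is hypothetical.

```python
# Back-of-envelope arithmetic using the 14-22 percent invalid-traffic range
# cited above. The monthly budget is a hypothetical example figure.

def wasted_spend(monthly_budget, invalid_rate):
    """Portion of an ad budget paid for clicks no human made."""
    return monthly_budget * invalid_rate

budget = 50_000  # hypothetical monthly pay-per-click budget in dollars
for rate in (0.14, 0.22):
    print(f"{rate:.0%} invalid -> ${wasted_spend(budget, rate):,.0f} wasted per month")
```

At those rates, a fifty-thousand-dollar monthly budget loses between seven and eleven thousand dollars to engagement that never involved a person, before counting the downstream cost of optimizing campaigns against the corrupted numbers.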

There is also a labor dimension.

Click farms, often located in lower-income regions, rely on low-paid workers performing repetitive engagement tasks. Investigations by outlets like The Guardian have documented these labor conditions:
https://www.theguardian.com/technology/2018/jan/10/facebook-likes-click-farms

Bot farms automate this labor, replacing human click workers with software. That automation increases scale while reducing cost.

The result is a strange inversion of digital capitalism:

On one side, real human labor is underpaid to simulate engagement.
On the other, software replaces that labor to simulate engagement even more efficiently.

Wealth concentrates upstream. Risk concentrates downstream.

Bot farms sit inside the broader logic of extraction economies, where value is pulled from attention markets rather than created through meaningful exchange.


Mental Health: Living Inside a Distorted Signal Field

The internet is no longer separate from psychological life. It is the environment where many people encounter news, community, and conflict.

When hostile comments dominate a thread, users may assume those views are common. Psychologists refer to related phenomena as false consensus effects and pluralistic ignorance.

If a significant portion of aggressive or extreme content is automated, then emotional responses are being triggered by systems designed to provoke engagement rather than by organic human disagreement.

Research from Pew Research Center shows that many users perceive online discourse as more hostile than offline conversations:
https://www.pewresearch.org/internet/2022/04/20/the-state-of-online-harassment/

Synthetic amplification intensifies that perception.

Anxiety rises when people believe society is angrier than it actually is. Polarization deepens when algorithmic systems prioritize high-arousal content.

Bot farms do not just distort information. They distort emotional climate.

If half of traffic is automated and a substantial share of high-engagement posts are amplified artificially, then individuals are navigating a psychological environment partially engineered for reaction.

Mental health, in this sense, becomes interconnected with technological architecture.


Climate Change and Information Integrity

Environmental policy depends heavily on public perception of scientific consensus.

Research published in Global Environmental Change has examined coordinated online campaigns influencing climate discourse:
https://www.sciencedirect.com/science/article/pii/S095937801630278X

Automated amplification can magnify skepticism or exaggerate fringe perspectives. Even if bots represent a minority of total conversation, high-volume activity can create the illusion of significant disagreement.

Policy momentum slows when public consensus appears fractured.
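
The arithmetic behind that illusion is simple. Using hypothetical numbers, a small automated minority posting at high volume can supply most of the visible conversation:

```python
# Toy arithmetic with hypothetical numbers: a small, high-volume automated
# minority can dominate what readers actually see.

def visible_share(bot_fraction, posts_per_bot, posts_per_human):
    """Fraction of total posts produced by the automated accounts."""
    bot_posts = bot_fraction * posts_per_bot
    human_posts = (1 - bot_fraction) * posts_per_human
    return bot_posts / (bot_posts + human_posts)

# If 5% of accounts are bots posting 50 times for every human's 2 posts:
share = visible_share(0.05, 50, 2)
print(f"{share:.0%} of visible posts come from 5% of accounts")
```

Under these assumptions, roughly 57 percent of posts come from 5 percent of accounts. A reader scanning the thread sees a split debate, even though the underlying population is not split at all.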

Climate change is not only a scientific and ecological issue. It is also an information systems issue. When discourse is distorted, collective action becomes harder.

Information integrity affects environmental outcomes.


Arts, Culture, and Manufactured Popularity

The creative economy is metric-driven.

Streaming counts influence payouts. Engagement shapes algorithmic recommendations. Follower counts determine sponsorship opportunities.

Streaming fraud investigations reported by Music Business Worldwide have detailed how artificial plays distort artist compensation:
https://www.musicbusinessworldwide.com/streaming-fraud-is-a-growing-problem/

Fake reviews influence purchasing decisions. The Federal Trade Commission has issued guidance on deceptive review practices:
https://www.ftc.gov/business-guidance/resources/soliciting-and-paying-for-online-reviews-guide-marketers

When visibility can be purchased through bot-driven engagement, creative ecosystems skew toward those with resources to manipulate metrics.

Popularity becomes partially synthetic.

Culture becomes entangled with automation infrastructure.


AI and the Escalation of Synthetic Presence

Artificial intelligence has transformed bot capabilities.

Advanced bots now generate realistic bios, long-form comments, and coherent debate. The Imperva report notes a rise in moderate and advanced bots using adaptive behaviors.

AI lowers the technical barrier to entry. Tools that once required expertise are now accessible.

Bots can now:

  • Register accounts
  • Generate profile images
  • Write persuasive posts
  • Engage in multi-turn conversation
  • Click ads
  • Simulate transactions

The conversation has shifted from “how many fake followers exist?” to “how much of the conversation itself is synthetic?”

AI expands scale and sophistication simultaneously.

This creates an arms race between detection systems and automation networks.


Philosophy: Truth, Reality, and the Value of Attention

At a deeper level, bot farms raise philosophical questions.

If visibility can be manufactured, what does popularity mean?
If engagement can be automated, what does consensus mean?
If conversation can be simulated, what does authenticity mean?

The internet once promised democratized voice. Bot farms introduce synthetic voice at industrial scale.

Philosophers have long debated the nature of reality and perception. In digital spaces, perception is increasingly mediated by algorithmic ranking systems that respond to engagement signals.

If signals can be fabricated, then perceived reality becomes manipulable.

Bot farms reveal a shift from an information economy to an attention economy, and now toward a synthetic attention economy.

Attention becomes a commodity extracted, redirected, and monetized.

Trust becomes scarce.


The Structural Paradox

Not all bots are malicious.

Search crawlers from Google index the web. Security bots scan for vulnerabilities. Monitoring tools maintain uptime.

Automation is foundational to modern infrastructure.

At the same time, advertising models reward impressions and engagement volume. Even fraudulent impressions generate billable events somewhere in the supply chain.

Bots undermine trust while sustaining volume-based revenue systems.

That is the structural paradox.

Bot farms did not emerge accidentally. They emerged because digital ecosystems reward scale over verification.


Interconnection Is the Core Insight

Bot farms are not isolated anomalies.

They connect:

  • Technology infrastructure
  • Advertising economics
  • Labor markets
  • Political influence
  • Climate discourse
  • Cultural production
  • Psychological wellbeing
  • Philosophical questions about truth

When nearly half of internet traffic is automated, we are navigating an environment where synthetic and human signals constantly interact.

World events are shaped by amplification systems.
Wealth flows through distorted metrics.
Labor is replaced or exploited to simulate engagement.
Mental health is affected by engineered outrage.
Climate policy is influenced by perception cascades.
Cultural popularity becomes partially programmable.
Philosophical assumptions about consensus and authenticity weaken.

Bot farms reveal how deeply interconnected digital systems have become.

The internet is not simply overrun by bots. It is structurally intertwined with them.

The central question is no longer whether bots exist. Most people know they do.

The deeper question is whether digital systems will evolve to prioritize authenticity, verified identity, and outcome-based metrics over raw volume.

Because in a world where attention can be manufactured, the rarest and most valuable resource online is not information.

It is trust.


This issue sits at the intersection of World Events, Mental Health, Wealth and Labor, Technology, and Philosophy. You can explore how these interconnected systems shape one another in greater depth by visiting those sections on Interconnected Earth.