AI, Environmental Cost and Global South Exploitation: What Transparency Must Include

Take our Ethical AI Survey


Artificial intelligence is often presented as an invisible, frictionless technology. In reality, it has very real environmental, social and human costs – costs that are disproportionately borne by communities in the Global South.

This article builds on a simple proposal: AI tools such as ChatGPT should provide users with environmental impact feedback. It expands that idea to include a harder truth – that environmental impact cannot be separated from labour exploitation, data extraction, and unequal global power structures.


The Original Question: Should AI Show Its Environmental Impact?

The initial suggestion was straightforward: AI platforms could provide users with a simplified indication of the environmental impact of their usage – energy, water or carbon – for example:

  • As a simple scale (e.g. from Minimal to High)
  • Via weekly or monthly account summaries
  • As optional notifications or dashboard insights

This would not overwhelm users with technical data, but would gently raise awareness in the same way energy labels or food nutrition scores do.

Technically, this is feasible. AI providers already track compute time, model usage and infrastructure load. The barrier is not technology – it is willingness and governance.
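As a rough illustration of how simple such a feature could be, here is a minimal sketch of a weekly usage summary. All the figures in it (energy per query, water per kWh) and the band thresholds are assumptions chosen for illustration, not published provider data; a provider with real compute metrics would substitute measured values.

```python
# Illustrative sketch only: the per-query figures below are assumed
# placeholder values, not measured or published provider data.

WH_PER_QUERY = 0.3      # assumed average energy per chat query, in watt-hours
LITRES_PER_KWH = 1.8    # assumed data-centre water use per kWh of compute

def usage_summary(queries_this_week: int) -> str:
    """Turn a weekly query count into a plain-language impact summary."""
    energy_wh = queries_this_week * WH_PER_QUERY
    water_l = (energy_wh / 1000) * LITRES_PER_KWH

    # Map usage onto a simple band, like an energy label or nutrition score
    if energy_wh < 50:
        band = "Minimal"
    elif energy_wh < 250:
        band = "Moderate"
    else:
        band = "High"

    return (f"This week: {queries_this_week} queries, "
            f"~{energy_wh:.0f} Wh energy, ~{water_l:.2f} L water "
            f"- impact band: {band}")

print(usage_summary(400))
```

The point of the sketch is that nothing here is computationally hard: the only missing inputs are honest figures from providers, which is exactly the governance gap the article describes.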

Click here for original post


Why Environmental Impact Cannot Be Viewed in Isolation

Environmental transparency alone is insufficient if it ignores who pays the human cost of making AI work.

Behind “clean” interfaces sit vast global supply chains of:

  • Data centres built with public finance in lower-income countries
  • Workers exposed to traumatic content for poverty wages
  • Communities losing control of their data, water and infrastructure

This is where the conversation must widen.


Global South Exploitation: The Hidden Cost of AI

Richard earned $2 per hour watching suicide videos, child abuse content and graphic violence for nine hours a day in Nairobi.

His work made global tech platforms “safe”. His US-based counterparts earned $20 per hour for the same work.

When Richard and 150 colleagues attempted to unionise in 2023, they were fired, blacklisted, and told they were merely “freelance contractors” with no rights.

This is not an isolated case – it is systemic.

Across Africa, Asia and Latin America:

  • Public development finance is used to build digital infrastructure controlled by foreign corporations
  • Local populations provide the data that trains AI systems valued at tens of billions
  • Workers are paid poverty wages to absorb psychological harm
  • Governments are pressured into trade agreements that prevent data sovereignty

Examples documented include:

  • Biometric databases covering 1.2 billion people with foreign contractors holding full access
  • Chinese-financed surveillance systems used against political opponents
  • Only 45% of least developed countries having data protection laws, compared to 96% in Europe

Even physical infrastructure mirrors historical exploitation, with submarine cables following former slave trade routes – extracting data rather than people.


Assessing the Impact

  • Environmental: High water and energy use concentrated in regions with weaker regulation
  • Labour: Low-paid, high-trauma work outsourced to the Global South
  • Economic: Value extraction without fair local return
  • Data sovereignty: Foreign control of sensitive national data
  • Democracy & rights: Surveillance infrastructure used against citizens

Environmental impact reporting that ignores these dimensions risks becoming a form of greenwashing.


What Meaningful Action Would Look Like

For AI Companies

  • Environmental impact summaries paired with labour and sourcing transparency
  • Clear disclosure of where moderation and data work are performed and under what conditions
  • Fair pay parity for equivalent work, regardless of geography
  • Binding commitments on data sovereignty and local governance

For Governments and Funders

  • Environmental and human rights conditions attached to development finance
  • Mandatory local data protection laws and enforcement
  • Support for worker organising and legal accountability
  • Public ownership or co-governance of critical digital infrastructure

For Users

  • Demand transparency – not just performance
  • Support platforms and providers that publish ethical impact data
  • Understand that “free” AI often hides unpaid or underpaid human labour

Raising Awareness: Practical Next Steps

If this issue matters to you, consider:

Surveys & Polls

  • Run public surveys via Typeform or SurveyMonkey asking whether users want ethical and environmental reporting from AI tools
  • Use LinkedIn polls to reach professionals in tech, ESG and policy
  • Use Twitter/X or Mastodon polls to test public sentiment

Public Engagement

  • Write blog posts, Medium articles or Substack newsletters
  • Raise the issue in AI ethics forums and community discussions
  • Engage civil society organisations working on digital rights and labour justice

Direct Pressure

  • Submit feature requests and ethical concerns to AI providers
  • Ask organisations you work with how their AI tools address environmental and labour impacts

Conclusion

Environmental impact indicators for AI are a good idea – but they must not become a distraction from deeper structural harm.

True responsibility means recognising that AI’s footprint is not only measured in kilowatt-hours or litres of water, but in human wellbeing, labour rights, and data sovereignty.

The resistance is growing. Workers are organising. Courts are listening. Communities are pushing back.

The question is whether AI companies – and their users – are prepared to do the same.

Take our Ethical AI Survey


Our AI Disclosure Statement


