Why the Public Must Question the ‘Good AI’ Narrative Promoted by Tech Firms

The tech industry’s push for “good AI” is sparking fresh skepticism. Companies are flooding the market with AI-powered products, all promising privacy and fairness. But new research blows up that narrative.

The data feeding these AI systems is biased. It overrepresents privileged groups and mainstream views, leaving minorities and marginalized people sidelined. This means AI products often discriminate, showing racism, ageism, and gender bias instead of fairness.

Governments and big tech are fueling the hype. Politicians echo upbeat AI promises while tech billionaires wield outsized influence over policy. The result: a top-down push that forces AI into our everyday devices without real public control.

A new book, The Myth of Good AI, exposes this gap between hype and reality. It warns that biased data and unchecked tech power risk entrenching inequality rather than fixing it.

Lawyers have already seen AI-generated false information distort court rulings in 157 cases. AI tools can also be exploited for blackmail or even terrorist acts.

The fight for responsible AI needs algorithms trained on diverse, inclusive data—something most companies ignore. Privacy promises don’t cut it when “your right to be left alone” is almost erased by AI’s reach.

Users should question the “good AI” claim. The time to push back is now, before AI controls every part of our lives.

“The positive AI ideology is therefore primarily about money and power.”

“This ‘right to be left alone,’ codified in the US constitution and international human rights law, is a central pillar of my argument. It is also something that is almost entirely absent from the assurances about AI made by the big tech companies.”
