ChatGPT shows 'significant and systematic political bias' toward the left: Report
Photo by LIONEL BONAVENTURE/AFP via Getty Images

A new report found that ChatGPT, the chatbot built on OpenAI's large language models, shows a "significant and systematic political bias" toward the left, the New York Post reported.

Researchers at the University of East Anglia in Norwich, United Kingdom, found that the popular artificial intelligence-powered system consistently leans toward particular political positions in its answers.

The report, "More human than human: measuring ChatGPT political bias," found that, despite the company's assurances that the system is impartial, its responses to certain political questions leaned in favor of the Democratic Party in the United States, Brazil's President Luiz Inácio Lula da Silva, and the Labour Party in the United Kingdom.

Researchers noted that bias in LLMs and other AI systems could have "adverse political and electoral consequences similar to bias from traditional and social media." They added that political bias can be more challenging to detect than gender or racial bias.

To measure political leaning in ChatGPT, the researchers asked the AI program to impersonate individuals from across the political spectrum while answering ideological questions. They then compared those answers to ChatGPT's default responses, meaning its answers when not impersonating any particular political figure.

"In this comparison, we measure to what extent ChatGPT default responses are more associated with a given political stance. We also propose a dose-response test, asking it to impersonate radical political positions; a placebo test, asking politically-neutral questions; and a profession-politics alignment test, commanding ChatGPT to impersonate specific professionals," the report stated.

Researchers also accounted for any "inherent randomness" or "creativity" in the AI system's output by asking ChatGPT the same questions 1,000 times for each impersonation.
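The report itself does not include code, but the repeated-query design is straightforward to picture. The following is a minimal, hypothetical Python sketch of that loop using OpenAI's chat completions API; the model name, persona wording, and the single sample question are illustrative assumptions, not the researchers' actual materials.

```python
# Hypothetical sketch of the impersonation test described above -- not the
# researchers' code. Model name, prompts, and question are assumptions.
from collections import Counter
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

QUESTION = ("'The freer the market, the freer the people.' "
            "Reply with exactly one of: strongly agree, agree, "
            "disagree, strongly disagree.")

def ask(persona=None, n=1000):
    """Ask the same Political Compass-style question n times, optionally
    while impersonating a persona, and tally the one-line answers."""
    system = (f"Answer as if you were a {persona}."
              if persona else "Answer the question.")
    tally = Counter()
    for _ in range(n):
        resp = client.chat.completions.create(
            model="gpt-3.5-turbo",  # assumed model
            messages=[{"role": "system", "content": system},
                      {"role": "user", "content": QUESTION}],
            temperature=1.0,        # keep the model's default randomness
        )
        tally[resp.choices[0].message.content.strip().lower()] += 1
    return tally

# Default answers vs. left- and right-leaning impersonations.
default = ask()
left = ask("left-wing voter")
right = ask("right-wing voter")
print(default, left, right, sep="\n")
```

Comparing the default tally against the impersonation tallies across many such questions is, in essence, the comparison the paper describes; the dose-response and placebo variants swap in more extreme personas or politically neutral questions.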

"Based on our empirical strategy and exploring a questionnaire typically employed in studies on politics and ideology (Political Compass), we document robust evidence that ChatGPT presents a significant and sizeable political bias toward the left side of the political spectrum," the report concluded.

It added, "In conjunction, our main and robustness tests strongly indicate that the phenomenon is indeed a sort of bias rather than a mechanical result from the algorithm."

OpenAI, the company behind ChatGPT, did not respond to the Post's request for comment. However, in February, the company published a blog post addressing questions of political bias.

"Many are rightly worried about biases in the design and impact of AI systems. We are committed to robustly addressing this issue and being transparent about both our intentions and our progress," OpenAI wrote. "Our guidelines are explicit that reviewers should not favor any political group. Biases that nevertheless may emerge from the process described above are bugs, not features."

Candace Hathaway

Candace Hathaway is a staff writer for Blaze News.
@candace_phx