
28th Feb 2024

Tech News: Google Pauses Gemini AI Over ‘Historical Inaccuracies’

Only a month after its launch, Google has paused its text-to-image AI tool following “inaccuracies” in some of the historical depictions of people produced by the model. 

‘Woke’ … Overcorrecting For Diversity?

An example of the inaccuracy issue (as highlighted by X user Patrick Ganley recently, after asking Google Gemini to generate images of the Founding Fathers of the US) was when it returned images of a black George Washington. In another reported test, when asked to generate images of a 1943 German (Nazi) soldier, Google’s Gemini image generator returned pictures of people of clearly diverse ethnicities in Nazi uniforms.

Some have described the inaccuracies as examples of the model subverting the gender and racial stereotypes found in generative AI, of a reluctance to depict ‘white people’, and/or of conforming to ‘woke’ ideas, i.e. the model trying to remove its own bias and improve diversity, yet ending up so inaccurate as to be comical.

For example, on LinkedIn, venture capitalist Michael Jackson said the inaccuracies were a “byproduct of Google’s ideological echo chamber” and that for the “countless millions of dollars that Google spent on Gemini, it’s only managed to turn its AI into a nonsensical DEI parody.”

China Restrictions Too? 

Another issue (reported by Al Jazeera and noted on X by a former software engineer at Stripe) was that Gemini would not show an image of a man in Tiananmen Square in 1989, citing its safety policy and the “sensitive and complex” nature of the event. This and similar issues have prompted criticism from some that Gemini may also have restrictions related to China.

What Does Google Say? 

Google posted on X about the inaccurate images: “We’re working to improve these kinds of depictions immediately. Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”

Google has, therefore, announced: “We’re already working to address recent issues with Gemini’s image generation feature. While we do this, we’re going to pause the image generation of people and will re-release an improved version soon.”

Bias and Stereotyping 

Bias and stereotyping have long been issues in the output of generative AI tools. They exist primarily because AI models learn from vast amounts of data collected from human language and behaviour, which inherently contain biases and stereotypes. As models mimic the patterns found in their training data, they can replicate and amplify existing societal biases and stereotypes.
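
To make that mechanism concrete, here is a minimal, hypothetical Python sketch (the corpus, occupations, and pronoun skew are invented purely for illustration; this is not Google’s code or data). A model that simply learns word co-occurrence statistics from its training text reproduces whatever skew that text contains, and a generator that always picks the most likely continuation amplifies it.

```python
from collections import Counter

# Invented, deliberately skewed "training corpus" -- illustration only.
corpus = [
    "the engineer fixed his code",
    "the engineer debugged his program",
    "the engineer shipped his release",
    "the engineer reviewed her design",
    "the nurse checked her patients",
    "the nurse updated her notes",
]

def pronoun_counts(role: str) -> Counter:
    """Count which pronoun follows each occupation in the corpus."""
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        for i in range(len(words) - 2):
            if words[i] == role:
                counts[words[i + 2]] += 1  # pronoun sits two words after the role
    return counts

for role in ("engineer", "nurse"):
    counts = pronoun_counts(role)
    total = sum(counts.values())
    print(role, {p: f"{c / total:.0%}" for p, c in counts.items()})

# Prints:
#   engineer {'his': '75%', 'her': '25%'}
#   nurse {'her': '100%'}
# The 75/25 skew in the data becomes a 75/25 skew in the model, and a
# generator that always picks the most likely pronoun would say "his"
# for "engineer" 100% of the time -- amplifying the bias rather than
# merely reproducing it.
```

Real models are vastly more complex, but the principle is the same: the statistics of the training data become the statistics of the output unless they are deliberately corrected, and, as the Gemini episode shows, the correction itself can overshoot.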

What Does This Mean For Your Business? 

Google has only just announced that it is combining Bard with its new Gemini models to create its ‘Gemini Advanced’ subscription service, so this discovery is likely to be particularly unwelcome. The anti-woke backlash and ridicule are certainly something Google could do without right now, but the issue has highlighted the complications of generative AI, how it is trained, and the complexities of how models interpret the data and instructions they’re given. It also shows that, however advanced AI models may be, they don’t actually ‘think’ as a human would: they can’t perform the ‘reality checks’ humans can because they don’t ‘live’ in the real world. The story is also a reminder of how early we still are in the generative AI journey.

Google’s explanation has shed some light on the thinking behind the issue, and at least it has admitted to being wide of the mark in terms of historical accuracy, which is clear from some of the examples. It’s all likely to be an embarrassment and a hassle for Google in its competition with Microsoft and its partner OpenAI. Nevertheless, Google seems to think that with a pause plus a few changes, it can tackle the problem and move forward.
