Google Hits Pause on AI-Generated Images of People Due to Racial Bias Concerns
Tech Giant Acknowledges Shortcomings, Vows to Improve Model
SAN FRANCISCO (AI Reporter/News): In a move highlighting the ongoing challenges of mitigating bias in artificial intelligence, Google announced on February 22, 2024, that it is temporarily pausing its Gemini AI model from generating images of people. The decision follows criticism of the model's portrayal of race in its generated images.
“We’re already working to address recent issues with Gemini’s image generation feature,” Google stated in a recent announcement. “We’re going to pause the image generation of people and will re-release an improved version soon.”
This development comes amid Google’s growing focus on artificial intelligence, particularly in light of advances made by competitors such as Microsoft and OpenAI.
However, these advancements have also fueled concerns about the potential for AI to be used for malicious purposes, such as creating deepfakes, spreading misinformation, and perpetuating societal biases.
Earlier this week, Google acknowledged “inaccuracies in some historical image generation depictions” produced by Gemini. The admission followed criticism online, where users pointed to numerous instances in which the model depicted historical figures with inaccurate racial representations.
The latest version of the AI model, Gemini 1.5 Pro, was recently released to cloud-based customers and developers for testing and potential commercial applications.
The incident underscores the ongoing efforts of Google and other tech giants to refine their generative AI capabilities while navigating the ethical questions surrounding bias. As the company works to improve its technology, it remains to be seen how effectively it can address these concerns and ensure the responsible development of AI tools.
(copyright@aireporter.news)