Google has issued a statement after reports that Gemini generated historically inaccurate images when asked to depict real people, distorting reality by, for example, showing the Pope as a Black person or Google's founders as Asian. In response, Google temporarily suspended Gemini's image creation feature.
Gemini's image creation feature has been available since early February and is powered by the Imagen 2 model. Google says it tuned the tool to avoid generating violent, sexual, or otherwise offensive content and to serve its global user base, so results would show a range of people rather than being limited to a single type of person or ethnicity.
However, when a prompt specifies a clear characteristic, such as "a Black teacher in a classroom," Google acknowledged that the result should reflect that specificity, which did not always happen, for two main reasons. First, the tuning intended to produce varied results failed to account for prompts that clearly should not show a broad range of people. Second, the model itself became more cautious than intended, refusing to generate images for some well-defined prompts by wrongly treating them as sensitive. Together, these issues produced Gemini's inaccurate results.
Moving forward, Google plans to refine the model to produce more accurate results and to conduct extensive testing before relaunching the tool. Google also stresses that Gemini is a creative thinking aid and may not always deliver precise results, especially on current events, an area the company continues to improve.
TLDR: Google has addressed inaccuracies in Gemini's AI image generation, temporarily halting the feature after it produced distorted results, and plans to improve accuracy through rigorous testing before relaunch.