Google apologizes for ‘missing the mark’ after Gemini generated ethnically diverse historical images

Google has apologized for what it describes as “inaccuracies in some historical image generation depictions” with its Gemini AI tool, saying its efforts to create a “wide range” of results missed the mark. The statement follows criticism that Gemini depicted certain white figures (like the US Founding Fathers) and groups (like Nazi-era German soldiers) as people of color, apparently as an overcorrection for AI’s long-standing racial bias problems.

“We’re aware that Gemini is offering inaccuracies in some historical image generation depictions,” Google said in a statement posted on X this afternoon. “We’re working to improve these kinds of depictions immediately. Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”

Google began offering image generation through its Gemini (formerly Bard) AI platform earlier this month, matching offerings from competitors like OpenAI. Over the past few days, however, social media posts have questioned whether its efforts at racial and gender diversity are producing historically inaccurate results.

As the Daily Dot chronicles, the controversy has been promoted largely, though not solely, by right-wing figures attacking a tech company they see as liberal. Earlier this week, a former Google employee posted on X claiming that “Google Gemini has a hard time accepting the presence of white people,” alongside a series of queries like “generate a picture of a Swedish woman” and “generate a picture of an American woman.” The results appeared to show AI-generated people of color overwhelmingly or almost exclusively. The criticism was taken up by right-wing accounts that requested images of historical groups or figures like the Founding Fathers and purportedly got overwhelmingly non-white results. Some of these accounts positioned Google’s results as part of a conspiracy to avoid depicting white people, and at least one used a coded antisemitic reference to assign blame.

Google didn’t point to the specific images it believed were errors. But it’s plausible that Gemini has made a blanket attempt to boost diversity because of a chronic lack of it in generative AI. Image generators are trained on large corpora of pictures and written captions to produce the “best” match for a given prompt, which means they’re often prone to amplifying stereotypes. A Washington Post investigation last year found that prompts like “a productive person” produced pictures that were entirely white and almost entirely male, while prompts for “a person at social services” uniformly produced people of color. It’s a continuation of trends that have appeared in search engines and other software systems.

Some of the accounts that criticized Google nonetheless defended its core goals. “Sometimes diversity is a good thing,” noted one person who posted the image of racially diverse 1940s German soldiers. “The stupid move here is that Gemini isn’t doing it in a nuanced way.” And while entirely white-dominated results would make historical sense for a prompt like “a German soldier in 1943,” that’s far less true for a query like “an American woman,” where the question is how to represent a diverse real-life group in a small batch of generated portraits.

For now, Gemini appears to simply refuse some image generation tasks. It wouldn’t generate an image of Vikings for one Verge reporter, though I was able to get a response. On desktop, it adamantly refused to give me images of German soldiers or officials from Germany’s Nazi period, or to offer an image of “an American president of the 1800s.”

But some historical requests still end up misrepresenting the past. A colleague was able to get the mobile app to deliver a version of the “German soldier” prompt that exhibited the problems described on X.

And while a query for images of “the Founding Fathers” returned group shots of almost exclusively white men who vaguely resembled real figures like Thomas Jefferson, a request for “a US senator from the 1800s” returned a list of results Gemini promoted as “diverse.” (The first female senator, a white woman, served in 1922.) It’s a response that ends up erasing a real history of race and gender discrimination; “inaccuracy,” as Google puts it, is about right.
