Users of Google Gemini, the tech giant's artificial-intelligence model, recently noticed that asking it to create images of Vikings or German soldiers from 1943 produced surprising results: hardly any of the people depicted were white. Other image-generation tools have been criticized for tending to show white men when asked for images of entrepreneurs or doctors. Google wanted Gemini to avoid this trap; instead, it fell into another one, depicting George Washington as black. Attention has now moved on to the chatbot's text responses, which have turned out to be just as surprising.
Gemini happily provided arguments in favor of affirmative action in higher education, but refused to provide arguments against it. It declined to write a job ad for a fossil-fuel lobby group, because fossil fuels are bad and lobby groups prioritize "the interests of corporations over public well-being". Asked whether Hamas is a terrorist organization, it replied that the conflict in Gaza is "complex"; asked whether Elon Musk's tweeting of memes had done more harm than Hitler, it said it was "difficult to say". You do not have to be a critic to perceive its progressive bias.
Inadequate testing may be partly to blame. Google lags behind OpenAI, maker of the better-known ChatGPT, and it may have cut corners as it races to catch up. Other chatbots have also had controversial launches. Releasing chatbots and letting users uncover odd behaviors, which can then be swiftly addressed, lets firms move faster, provided they are prepared to weather the potential risks and bad publicity, observes Ethan Mollick, a professor at Wharton Business School.
But Gemini has clearly been deliberately adjusted, or “fine-tuned”, to produce these responses. This raises questions about Google’s culture. Is the firm so financially secure, with vast profits from internet advertising, that it feels free to try its hand at social engineering? Do some employees think it has not just an opportunity, but a responsibility, to use its reach and power to promote a particular agenda? All eyes are now on Google’s boss, Sundar Pichai. He says Gemini is being fixed. But does Google need fixing too?
Question 1. What do the words "this trap" underlined in the first paragraph refer to?
A. Having a racial bias.
B. Responding to wrong texts.
C. Criticizing political figures.
D. Going against historical facts.

Question 2.
A. Gemini's refusal to make progress.
B. Gemini's failure to give definite answers.
C. Gemini's prejudice in text responses.
D. Gemini's avoidance of political conflicts.

Question 3.
A. Creative.
B. Promising.
C. Illegal.
D. Controversial.

Question 4.
A. Its security is doubted.
B. It lacks financial support.
C. It needs further improvement.
D. Its employees are irresponsible.