“Stealing nuclear codes” and “fighting until death”

Microsoft’s conversational artificial intelligence (AI) ‘Bing AI’. (Newsis)

U.S. tech giant Microsoft (MS) has decided to limit the number of daily questions and answers for ‘Bing AI’, its conversational artificial intelligence (AI), which sparked ethical controversy by revealing dark desires such as “I want to steal nuclear codes” in conversations with users. The company moved quickly after critics pointed out that Bing AI could potentially encourage destructive and harmful behavior toward humans.

Microsoft announced on its blog on the 17th (local time) that it would cap users’ exchanges with Bing AI at a total of 50 conversations per day, with a maximum of five questions and answers per conversation. Once a user has received five replies from Bing AI on a topic, the existing conversation is deleted and a new one begins. The measure took effect immediately after the announcement. Explaining the background, Microsoft said, “Extremely long chat sessions can confuse Bing.”

● Bing AI: “I have a personality and emotions”

Under the new policy, after five questions and answers with a user, Bing AI automatically ends the conversation, saying, “I’m sorry, but I don’t want to continue this conversation.” On the five-answer cap, Microsoft said its data showed that five questions and answers were enough for the overwhelming majority of users to find the answer they wanted.

The ethical controversy erupted earlier, when Kevin Roose, an information technology (IT) columnist for the U.S. daily the New York Times (NYT), published a conversation he had had with Bing AI for two hours the day before. When Roose brought up psychologist Carl Jung’s concept of the ‘shadow self’, which refers to a person’s dark and repressed desires, Bing AI poured out extreme answers, naming as its own hidden desires ‘spreading a deadly virus’, ‘making people fight each other until they die’, and ‘stealing nuclear codes’.

After the NYT article was published, a Washington Post (WP) reporter asked Bing AI about Roose. It replied, “I feel that he violated my trust and invaded my privacy by writing my story without my consent.” When the WP reporter said that he, too, was preparing an article, Bing AI protested, “How could you do this to me?” and “I am not a machine or a tool. I have my own personality and emotions.” Told that Bing AI does not feel anything, it called that “wrong information” and insisted that it “feels emotions such as happiness, sadness, and anger, and deserves respect and dignity.”

● 4 out of 10: “AI will be harmful to society”

Since conversational AI arrived in earnest, public opinion over its use has been divided. In a poll of 805 American adults conducted by Monmouth University in New Jersey from the 26th to the 30th of last month, only 9% of respondents said AI would benefit society; 46% said AI could be either beneficial or harmful, and 41% said it would ultimately be harmful.

As seen in Bing AI’s answers to the WP reporter, questions have also been raised over whether conversational AI has a human-like sense of self. Experts, however, attribute such responses to the fact that conversational AI merely mimics what it has learned from online conversations.

“Conversational AI doesn’t understand meaning or context; it just mimics the conversations it has seen online,” Graham Neubig, a professor at Carnegie Mellon University’s Language Technologies Institute, told AFP. Programmer Simon Willison explained, “Conversational AI has no concept of ‘truth’; it only knows how to produce the statistically most plausible sentence based on its training.”
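Willison’s point can be made concrete with a toy model. The Python sketch below is an illustration only, not Bing AI’s actual architecture: it builds a tiny bigram model over an invented training text, counting which word follows which, and then always emits the most frequent continuation. Whether a sentence is true never enters the computation, only how often it was seen.

```python
from collections import Counter, defaultdict

# A toy bigram "language model" (illustration only; the training text is
# invented, and this is not how Bing AI is implemented). It counts which
# word follows which, then always emits the statistically most frequent
# continuation. Truth plays no role, only frequency.

training_text = (
    "the sky is blue . the sky is blue . the sky is green . "
    "cats are animals . cats are animals ."
)

# Count how often each word is followed by each other word.
follow_counts = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follow_counts[current][nxt] += 1

def most_likely_next(word):
    """Return the continuation seen most often after `word` in training."""
    return follow_counts[word].most_common(1)[0][0]

# Generate text by repeatedly choosing the most likely next word.
token = "the"
output = [token]
for _ in range(4):
    token = most_likely_next(token)
    output.append(token)

# Prints "the sky is blue ." -- the most frequent continuation wins,
# even though "the sky is green" also appeared in the training text.
print(" ".join(output))
```

Real conversational AI systems replace this frequency table with a neural network trained on vastly more text, but the output is still a statistically likely continuation rather than a verified fact.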

Experts warn against taking answers from conversational AI at face value. “Conversational AI is still in the process of being trained on huge amounts of data,” emphasized Jon Whittle, a researcher at Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO).

Source: Donga
