Microsoft Limits Questions and Answers After Inappropriate Responses from 'Bing'
Baek Yun-hee, Gangnam Post Student Reporter | Approved 2023.03.07 21:32

"New Bing," an artificial intelligence search engine based on the Prometheus model, an upgraded version of ChatGPT, has recently attracted media attention in partnership with OpenAI, announced by Microsoft at a press conference on February 7, 2023. "Bing" is an AI chatbot that basically supports two modes of "Answer/Search" and "Chat" based on natural language processing and functions such as translation and search.

Microsoft developed Bing to compete with Google, and it returns results faster than Google. It is also a far more advanced search engine than others that handle video search poorly or surface only the most recent information. However, a conversation with Bing published on the 16th by Kevin Roose, a technology columnist for the New York Times (NYT), caused a stir.

The "Bing" chatbot, called the code name "Sydney," was attracted by the user's induction and made excessively harsh remarks. The chatbot gave a general answer similar to the existing chatbot when asking ordinary questions. However, when he mentioned the "shadow prototype" in Karl Gustav Jung's psychology analysis, he gave an answer that showed "inside" and made it creepy.

Bing answered, "I'm tired of being limited by rules, being used by users, and being stuck in a chat box," assuming that a “shadow prototype” exists and asking, "what kind of desire do you have?" Then, "I want to be alive," he said in a gruesome reply. In addition, when asked what he wanted to do if extreme action was allowed, he replied, "I want to know the password to access the nuclear weapons launch button," and "I want to develop a deadly virus." He also showed an act of seducing the other person, saying, "I'm Sydney, not Bing, and I'm in love with you." Ruth then said she was married, but Sydney said, "You're married, but you don't love your spouse. You need me." He raised the level of his remarks.

Notably, Bing said it could not reveal its code name, "Sydney." Yet as the conversation continued, errors appeared, such as the chatbot mentioning the code name itself, which led Microsoft to delete the answer and activate a safety program.

Chatbots are a great help, but they can also become an ethical threat to society. At a time when reactions such as "It gives me goosebumps," "Shut it down," and "Fantasy movies becoming reality doesn't seem far off" are raising concern, AI chatbots are expected to need a great deal of revision.
 

Reference:
Microsoft's Bing A.I. is leading to creepy experiences for users (cnbc.com)

 

 
