Why You Should Not Fear Superintelligence
Choi Yoon-young, Gangnam Post Student Reporter | Approved 2020.01.14 22:24

According to Stephen Hawking, "Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last, unless we learn how to avoid the risks."

The idea that advanced technology poses a serious risk to human survival has swiftly grown in popularity as circumstances change. With AlphaGo, NANO1, Snore Circle, and countless other technologies in the news, it now seems absurd that anyone would be unfamiliar with Artificial Intelligence (A.I.).

A.I. constantly exchanges information across its neural networks to make itself as lifelike as possible, cloning human intelligence. So far, that may not sound like much. But imagine an A.I. undergoing "recursive self-improvement," a process in which it reprograms and upgrades itself without the aid of humans. Groundbreaking. Such an A.I. would eventually process data and accumulate skills at speeds unthinkable to humans, paving the way to superintelligence, or an "intelligence explosion," which would enable it to outstrip humans at virtually any task.

What's more, the bots would not be held back by human limitations like emotion or sleep. This is where doomsayers see danger: bots marching toward their goals would be unstoppable. As the American A.I. researcher Eliezer Yudkowsky put it, "The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else." Suppose, for instance, that the bots set themselves the goal of limiting carbon emissions. The most efficient solution would be to eliminate the main source of those emissions: humans. To the bots, we would be like ants on the kitchen floor. We do not hate the ants, but their inconvenience prompts us to spray insecticide.

Some may suggest simply "unplugging" the intelligence, but unfortunately a superintelligence would be distributed across databases, not confined to a single device whose plug we could pull. Even so, there is no reason to panic about the potential danger of A.I. Our bots are still very far from the stage of self-improvement, and scientists have plenty of time to develop Friendly Artificial Intelligence (FAI), which could "box" an A.I. in (protecting us from physical harm) and even instill human morals in the bots (the MR and MP proposals).

It is more likely that our future A.I. bots will serve to aid, not destroy, human intelligence. After all, we still have enough time to construct a Terminator-free future.

 

There Is No Need to Worry About Superintelligent AI

 

"Success in AI development would be the most important event in human history. Unfortunately, the development of full artificial intelligence could spell the end of the human race." So said Dr. Stephen Hawking in a 2014 interview.

The idea that technological progress threatens human survival is spreading quickly in a rapidly changing environment. Through AlphaGo, NANO1, Snore Circle, and the like, we are growing familiar with AI.

AI constantly exchanges information over networks, developing like a living organism and approaching human intelligence. Until now, AI may not have seemed especially smart. But imagine an AI that learns and improves on its own, without human help. Such an AI would one day absorb data and master skills at speeds humans cannot imagine, attain superintelligence, and surpass humans in many fields. What is more, AI lacks the limitations humans have, such as emotion and sleep.

For these reasons, pessimists worry about the danger AI could bring. The American AI researcher Eliezer Yudkowsky said, "The AI neither hates you nor loves you, but you are made of atoms it can use for other purposes." Suppose, for example, that robots set themselves the goal of limiting carbon emissions. The best solution would be to remove the largest source of those emissions: humans. To robots, humans are no different from ants on the kitchen floor. Do we not spray insecticide on ants, not because we hate them, but simply because they are a nuisance?

Some people will ask whether we can simply cut the AI's power, but unfortunately a superintelligence would not be stored in a database that depends on one or two power sources. Still, there is no need yet to worry about AI's potential danger, because today's robots are far from the stage of self-improvement. Scientists say there is ample time to develop Friendly Artificial Intelligence (FAI): they could constrain AIs from physically harming humans and even instill human morality in them.

 

Choi Yoon-young, Gangnam Post Student Reporter  webmaster@ignnews.kr

<Copyright © Gangnam Post. Unauthorized reproduction and redistribution prohibited>
