
Google executive downplays AI’s possible threat to mankind

07:32 PM September 21, 2017

Concept art of artificial intelligence. INQUIRER.net stock photo

Tesla founder Elon Musk has long warned the public about the rise of artificial intelligence (AI) in the near future.

The self-learning technology has often been linked to doomsday scenarios in which machines could one day overthrow mankind.


However, Google’s AI chief John Giannandrea has disputed such claims and advised the public not to worry about a possible “AI apocalypse.”


“There’s a huge amount of unwarranted hype around AI right now,” he said in a Bloomberg report. “This leap into, ‘Somebody is going to produce a superhuman intelligence and then there’s going to be all these ethical issues’ is unwarranted and borderline irresponsible.”

Among those problematic AI scenarios is the possibility of a machine developing a consciousness much like that of humans, a premise often portrayed in technology-wasteland-themed films.

Giannandrea added that most AI designs are “dumber than you think,” with a narrow intelligence that is only capable of very specific tasks.

“They’re not nearly as general purpose as a 4-year-old child,” he said.

Still, technology continues to advance at a rapid pace, with seemingly unthinkable scenarios becoming reality with each passing day.

Will AI one day pose a threat to humans? Only time will tell. Khristian Ibarrola



TAGS: Artificial Intelligence, Google

