I'm a silly fool...sure don't know much. But do try to consider what the greatest minds are saying, especially when they aren't pitching for their own company or interests.
So here are some big minds and AI experts...and what they say.
“Unless we learn how to prepare for, and avoid, the potential risks, AI could be the worst event in the history of our civilization.”
-Stephen Hawking (shortly before his death)
- - -
“The most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die. Not as in ‘maybe possibly some remote chance,’ but as in ‘that is the obvious thing that would happen.’ … If somebody builds a too-powerful AI, under present conditions, I expect that every single member of the human species and all biological life on Earth dies shortly thereafter.”
-Eliezer Yudkowsky, co-founder and research fellow at the Machine Intelligence Research Institute
- - -
"First of all, I tremendously respect Eliezer Yudkowsky and his thinking. Second, I do share his view that there's a pretty large chance that we're not going to make it as humans. There won't be any humans on the planet in the not-too-distant future. And that makes me very sad. We just had a little baby and I keep asking myself how old is even going to get?"
-Max Tegmark, physicist, cosmologist and machine learning researcher. Professor at MIT
- - -
"Don’t Look Up … but AGI instead of comet”
-Elon Musk
- - -
“The alarm bell I’m ringing has to do with the existential threat of them taking control. I used to think it was a long way off, but I now think it's serious and fairly close.”
-Geoffrey Hinton, widely regarded as the “Godfather of AI”