Frankenstein syndrome

Mohi Beyki
Nov 15, 2020

Should we be scared of what we might create? This is a question I have had ever since I watched the original Terminator movie. Will we ever create a machine that decides we no longer deserve to live on this planet?

I think, as it stands right now, we are not even close to that point. We have AIs capable of doing great things, but we are still far from creating a true AI capable of self-conscious thought. Outside that possibility, I think the only real threat to our existence is nuclear weapons, and they are quite capable of destroying all human life on Earth.

If we were close to creating truly self-aware AI, I would be scared of what might happen. I think such AIs would eventually surpass us as the dominant species on Earth, and then our existence would depend on how well we had built and trained them. Humans do hurt "lesser" creatures; we have already pushed some animals to extinction. To me, that is a possible fate for us once advanced, self-aware AIs exist. On the other hand, most humans try to preserve life and avoid harming animals or their habitats. Such AIs might likewise try to help us and devote themselves to our benefit because we are their creators. I think it is too soon to predict what would happen if we created true AI.

Should we be scared of atomic bombs that might wipe us out? Yes, they are scary and, in my opinion, not necessarily in good hands. Plenty of dictators have access to them, and a disaster is possible. All I can do about this is hope that our shared humanity will prevent such a disaster from happening.
