Sanjay:
I fully agree here: AGI can be very dangerous in the wrong hands. But the same is true of any powerful technology. Controlling the knowledge is only a temporary measure. In fact, general wisdom says that limiting knowledge to a chosen few can be more dangerous, since power corrupts easily and invites misuse.
I can't speak for others, but my goal is to create AGI as a tool, not as
something sentient. I believe a sentient AGI is possible, but that
possibility does not appeal to me. Building AGI as a passive tool is much
more important, IMO.
My main interest is just the opposite of yours...