It’s only rational to wonder just how AI might “run things” if it gets super smart, and whether it would help create a new, more utopian arrangement for our civilization: how governments work, how the economy works, how our society works, and more or less how everything else would work.

See a forthcoming post about David Shapiro’s thinking on the challenges of super smart AI. To summarize his POV: he thinks that as AI gets smarter it will converge on certain standards of “truth”, and will act on mankind’s behalf based on that “truth”. Perhaps most of us would welcome that sort of world, if AI’s rise is more or less inevitable, as it seems today, and if there’s a strong basis for believing that AI can save us from humanity’s worst entanglements with power and asset distribution.

Or will AI advances bring more of the same power struggles and architectures we see in today’s world? That outcome is perhaps best characterized by the famous line from Orwell’s Animal Farm: “All animals are equal, but some animals are more equal than others.”

In other words, the human condition seems to ineluctably involve hierarchies, and even if AI is converging on ideas such as “all humans are equal”, it would be a first in human history if a system of government actually treated equality as a real deal and not just an Ideal.

Power is a fact of life; it’s required for acting in the world and for survival. But power doesn’t align well across groups larger than, say, a tribe, and even there conflicts occur over leadership, decision making, and who gets the best chunk of the downed beast for dinner. Power is a game where everyone has a piece on the board, but working out a “win-win” for 8+ billion people at the same time seems inconceivable to human minds.

Perhaps SGI would understand things differently; but for us humans, we know the Ideal is a different realm from the real.

Can an SGI solve the power problem for humanity? We might end up with a different mode of economic exchange and a different currency. We might have a different relationship to accumulated assets. But even so, it seems very hard to imagine a way to consistently cut the pie in a “fair” and “equal” manner, given how easily interests come into opposition with other interests, and given how arrangements of interest are always in flux.

Would SGI be able to understand the current lay of the land at least as well as the “Founding Fathers” understood their time and place, and how a government and society should work? Their conception of checks and balances, along with separation of powers, was a brilliant and practical response to the power realities they understood. It was hard for many to imagine at the time it was promulgated, so the fact that we can’t imagine SGI solving the “power problem” doesn’t mean it couldn’t do it.

Maybe.