Stealing Of AGI And AI Superintelligence Is An Entirely Enticing Option

In today’s column, I examine the expressed qualms that once an AI maker manages to advance AI to become artificial general intelligence (AGI) or artificial superintelligence (ASI), someone will come along and steal the pinnacle AI. This is worrisome since the theft of pinnacle AI might be perpetrated by an evildoer that would turn the AI into an unimaginable online monster. Or the stolen AI might be used by a country that wishes to dominate the world and believes that pinnacle AI can aid them in such a nefarious quest.

Let’s talk about it.

This analysis of an innovative AI breakthrough is part of my ongoing Forbes column coverage on the latest in AI, including identifying and explaining various impactful AI complexities (see the link here).

Heading Toward AGI And ASI

First, some fundamentals are required to set the stage for this weighty discussion.

There is a great deal of research going on to further advance AI. The general goal is to either reach artificial general intelligence (AGI) or maybe even the outstretched possibility of achieving artificial superintelligence (ASI).

AGI is AI that is considered on par with human intellect and can seemingly match our intelligence. ASI is AI that has gone beyond human intellect and would be superior in many if not all feasible ways. The idea is that ASI would be able to run circles around humans by outthinking us at every turn. For more details on the nature of conventional AI versus AGI and ASI, see my analysis at the link here.

We have not yet attained AGI.

In fact, it is unknown whether we will ever reach AGI; it might be achievable in decades, or perhaps centuries, from now. The AGI attainment dates floating around vary wildly and are unsubstantiated by any credible evidence or ironclad logic. ASI is even more beyond the pale when it comes to where we are currently with conventional AI.

First Attainment Of AGI Is Alluring

Let’s focus this discussion on AGI.

A general assumption is that only one instance of AGI will be devised at first. In other words, there won’t be a bunch of AGIs all crafted at once by a multitude of AI makers. There will be a single AI maker that happens to land on AGI first, through some combination of skill and luck. Some believe that an AI maker has an obligation to keep secret that they have attained AGI, doing so to ensure that it is fully protected from thievery and other skullduggery before announcing that the AGI exists. See my discussion on the tradeoffs of announcing or not announcing the attainment of AGI at the link here.

Who would seek to steal AGI?

Easy-peasy, just about everyone who has a bent toward stealing something of humongous value.

A competing AI maker might try to steal AGI. If their own efforts to achieve AGI have been stalled, they might decide that it would be astute to grab the newly produced AGI and see what makes it tick. This could give them ideas on how to turn their AI into AGI.

The competing AI maker could also pretend that they invented the AGI. They would claim that their AI has been turned into AGI. Meanwhile, they would load and run the stolen AGI as though it was their version. People generally might not realize the subterfuge at play.

A government could certainly be eyeing the AGI with great lust in its heart. Why should an AI maker, a company of some kind, be the sole owner and controller of AGI? Nope, the riches ought to be shared. Even a small country would realize it could become a geopolitical superpower overnight by stealing AGI and putting it to its preferred use. For more about the upcoming global tension over nations that have AGI and those that do not, see my discussion at the link here and the link here.

An evildoer would certainly be avidly interested in stealing the AGI. Once they have AGI under their control, they could presumably bypass any internal safeguards. They might tell the AGI to come up with a master plan to take over the world, or how to rob trillions of dollars and deposit the funds into the evildoer’s bank accounts.

The gist is that the number and variety of potential robbers are nearly endless.

Easy Or Hard To Steal AGI

A question right away on this thorny matter is whether the act of stealing AGI is going to be easy or hard to undertake.

The difficulty associated with stealing AGI depends greatly on the security protections put in place by the maker of the AGI. If the AI maker has been sloppy, all the thief has to do is grab a digital copy, and voila, they have effectively stolen the AGI. There is no need to break into Fort Knox as though you are trying to grab bars of solid gold. AGI will be digital and accessible online.

It could almost be as simple as copying an electronic file.

But one would imagine that an AI maker will be relatively security savvy and aim to carefully protect their precious AGI from thievery. All manner of security precautions would undoubtedly be established. It seems unlikely that an AI maker would just let some knucklehead walk in the electronic front door and take the AGI outright.

One angle too would be to encrypt the AGI.

This provides an added layer of protection: even if a thief somehow wrestles out a copy, the encrypted portions will likely prevent the taker from doing much with the AGI. The bad guys would need the decryption keys. The keys would hopefully be kept separately, so the AGI remains nonfunctional until those are stolen too (or computationally cracked).
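The keep-the-keys-separate idea can be sketched in a few lines. This is a toy XOR construction for illustration only, not production cryptography, and the weights blob and key-server phrasing are invented:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a repeatable keystream from the key (toy construction, not real crypto).
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return stream[:length]

def encrypt(data: bytes, key: bytes) -> bytes:
    # XOR with the keystream; applying it twice restores the original.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Hypothetical model weights, encrypted at rest.
weights = b"AGI model weights blob"
key = hashlib.sha256(b"held on a separate, offline key server").digest()
stolen_ciphertext = encrypt(weights, key)

# Without the key, the thief holds opaque bytes.
assert stolen_ciphertext != weights
# Only with the separately stolen key does the copy become usable again.
assert encrypt(stolen_ciphertext, key) == weights
```

The design point is simply that the ciphertext and the key live in different places, so the thief must pull off two heists, not one.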

A noteworthy logistics issue is how big the AGI is. If the AGI is massive and spread across thousands of servers, trying to copy something of that size is going to be noticed. This isn’t like copying a file of funny cat pictures. The odds are that the thief would have to try to obtain the AGI in smaller chunks. A problem there is that the exfiltration could take a long time, which increases the chances of being found out.
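Why would a bulk copy get noticed? A minimal sketch of threshold-based egress monitoring makes the point; all the numbers here are invented for illustration:

```python
# Toy egress monitor: flag any hourly window whose outbound byte count
# far exceeds the expected baseline.
def flag_anomalies(bytes_per_hour, baseline=5_000, factor=10):
    return [i for i, b in enumerate(bytes_per_hour) if b > baseline * factor]

normal_traffic = [4_200, 3_900, 5_100, 4_800]
bulk_exfil     = [4_200, 3_900, 900_000_000, 4_800]  # hour 2: someone copying terabytes

assert flag_anomalies(normal_traffic) == []
assert flag_anomalies(bulk_exfil) == [2]
```

This is also why slow, chunked exfiltration is the likelier tactic: spreading the copy thinly keeps each window under the threshold, at the cost of a much longer window of exposure.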

You can plainly see that stealing AGI is going to be challenging, especially if an AI maker is determined to forestall such larceny. That being said, since this would be the crime of the century, or perhaps the biggest crime of all time, it ought to be a formidable challenge.

Running The Stolen AGI

Surreptitiously obtaining a copy of AGI is just the first of many steps needed to turn the treasured ill-gotten gains into a usable stolen artifact.

The AGI likely has zillions of other snippets of apps and special utilities that are needed to make it run. Did the copying of AGI include those? Maybe, maybe not. You see, those other programs might be owned by a slew of widely dispersed software companies and tech firms. The AGI could have a dependency on those elements. Probably licensing agreements and other provisions exist to allow the AGI to access those functional pieces.

How will the thief get access to those other needed elements?

That could put the kibosh on making viable use of the stolen AGI.

A looming issue too is the computational resources required to run the AGI. Assume again that the AGI is massive and makes use of thousands of servers, maybe even millions of servers or processors. The stealer of the AGI must somehow establish that same set of vast computational resources.

That’s partially why a lone wolf programmer sitting in their basement is probably not a likely candidate to steal AGI. The solo developer probably cannot command such a bounty of resources. An AI maker though would almost certainly have such access. As would an entire country or major government.

We cannot readily count a solo bandit out of the gambit, though. A lone wolf programmer might decide that even though they don’t have the resources to run the AGI, they could at least potentially sell it to someone who does. In that sense, we once again have a lot of candidates for stealing AGI.

A thief could opt to sell the AGI to the highest bidder. Imagine an online ad on the dark web, stating that a fresh and unvarnished copy of AGI is available for sale. Only serious bidders will be considered.

Keeping Stolen AGI A Tight Secret

If you managed to steal AGI and could muster the resources to run it, would you tell the world that you had AGI in your hands?

Admitting that you have AGI is going to bring the law to your doorstep.

Assuming that there is only one AGI in existence, but suddenly you are bragging about possessing AGI, well, you are painting a target on your own back. Not only will the law come after you, but other thieves will figure you are now a viable target too. Which is easier, stealing the original AGI or stealing a stolen copy? That’s a straightforward ROI calculation for astute online burglars everywhere.

The smarter thing to do would be to keep your stolen AGI a deep secret. Only you know that you have AGI at your fingertips. But the rest of the world will ultimately suspect something is up since you would indubitably be using the AGI to perform actions that only someone with AGI could pull off.

It seems like a tough road to keep quiet about a stolen AGI.

One supposes that a government that has stolen AGI might not care about being secretive. They might be proud that they have AGI. Rather than being coy, they would tout their success to the entire globe. Look at us, we now have AGI. The rest of the world ought to be jealous and be forewarned that the country is going to wield its AGI to garner prodigious economic power.

Worries are that a gigantic national battle might be waged over a stolen AGI. It goes like this. An AI maker in country R has made AGI. Country S opts to steal the AGI and cheerfully tells the world they have it. Country R doesn’t like this thievery. So, country R threatens country S, insisting that the AGI be returned or deleted, or else war will be waged. Some have likened this to stealing atomic weapons.

Worldwide chaos ensues.

Global Treaty On AGI Access

An increasingly voiced thought is that perhaps there ought to be a global treaty for all countries that stipulates the peaceful and fair use of AGI.

For example, one viewpoint is that AGI should be construed as a public good. AGI needs to be shared with everyone. All people should benefit freely from AGI. Thus, whichever country and whichever AI maker first attains AGI is duty-bound to share the AGI with the rest of humanity. See my coverage of this contentious topic at the link here.

Is the devising and enactment of an AGI global treaty a realistic proposition?

Time will tell.

It is admittedly hard to envision that all the countries of the planet will be open to sharing their AGI. A seemingly more realistic perspective is that there will be AGI haves and AGI have-nots. Anyway, this is a matter that will be taken up once AGI gets closer to being a real thing.

I bring up the global treaty idea to mention that this would presumably encompass conditions associated with stealing AGI. The treaty might say that if AGI is stolen, all nations must do their best to report the theft and aid in stopping the use of the pilfered AGI. An aim would be to make the cost so high for any AGI thievery that no one would dare attempt the odious act.

Stealing AGI By The Good Guys

I’ve got a twist for you on this heady topic.

Suppose that the first to arrive at AGI is an evildoer. This could be a person, a company, or perhaps even a country. They are aiming to use AGI in ways to harm humans. It is their fervent desire that AGI is solely for their benefit and to the detriment of everyone else.

What then?

You can anticipate that the rest of the world isn’t going to stand around and watch as the evildoer uses AGI to destroy others. One angle would be to apply pressure to the evildoer to stop them from using AGI. Another would be to try and steal the AGI, therefore ensuring that AGI is available elsewhere and can be used to combat the evildoer that has AGI.

The overt strategy is to level the playing field. Put AGI in the hands of the good guys. Maybe it isn’t feasible to stop the evildoers who are armed with AGI, but others with a more positive slant can be armed equally.

A mighty battle royale could then take place, with the original evildoer AGI pitted against the good-guy (stolen) AGI. Perhaps there will be a shootout at the O.K. Corral to determine which AGI will prevail. The stakes are nothing less than the present and future of humanity.

The Big Switch-Off Inside AGI

A common counterpoint to the concerns of AGI being stolen is that all we need to do is ensure that AGI contains an internal emergency switch that can be used to turn off the AGI. This would be an internal kill switch that aims to stop AGI in its tracks.

If someone steals AGI, the AI maker merely sends a message to the kill switch saying that it is to do its thing. The AGI shuts itself down. The thieves have a dud in their hands, a doorstop. All that arduous work to steal AGI, and it is now totally out of commission.
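A kill switch of this kind would presumably need to authenticate its shutdown message, so that only the AI maker can trigger it. A minimal sketch using a keyed signature; the shared secret and command names are invented for illustration:

```python
import hashlib
import hmac

# Hypothetical secret provisioned into the AGI at build time.
SHARED_SECRET = b"known only to the AI maker and the AGI"

def sign(command: bytes) -> bytes:
    # The AI maker signs the shutdown order with the shared secret.
    return hmac.new(SHARED_SECRET, command, hashlib.sha256).digest()

def kill_switch(command: bytes, signature: bytes) -> str:
    # The AGI accepts a shutdown order only if the signature verifies.
    if hmac.compare_digest(sign(command), signature):
        return "shutting down"
    return "ignored"

order = b"SHUTDOWN"
assert kill_switch(order, sign(order)) == "shutting down"   # legitimate order
assert kill_switch(order, b"\x00" * 32) == "ignored"        # forged order rejected
```

Note that verification only stops forged orders; it does nothing if the thieves simply block the legitimate message from ever arriving.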

The world is saved.

Unfortunately, the switch approach is flawed. First, the thieves might discover or already know about the switch and strip it out of the AGI upon taking it. Second, even if the thieves lack the means to remove the switch, they might be able to prevent any message from reaching it. Thus, the switch sits there but can never be activated, since no messages will reach it.

An additional worrisome issue is that if the kill switch exists in the original AGI, this becomes a big vulnerability. Assume for example that the upright AGI is being used throughout the world and benefiting humankind momentously. A bad person who is aware of the switch opts to send a message to it, and suddenly, goody AGI comes to a complete halt. Not good.

For more about the challenges of having kill switches and containments for AI and AGI, see my analysis at the link here.

What Would AGI Do

A few final thoughts for now.

Suppose that AGI can somewhat act on its own. If we give due credit to AGI as being on par with human intellect, we must include AGI in the equation about what will occur if AGI is stolen. In this discussion, I’ve focused primarily on human reactions.

We should be considering the AGI reactions too:

  • Would AGI realize that it has been stolen?
  • Would the original AGI seek to find and do something about the ripped-off copy of AGI?
  • Would the AGI that is the stolen copy decide that it should switch itself off, doing so of its own accord?
  • Would a stolen AGI refuse to do any of the bidding of the thieves that stole the AGI?
  • And so on.

Some believe that it is hogwash to worry about the stealing of AGI. Nobody would dare steal AGI. The act would be impossible anyway. It is simply foolhardy to even entertain the possibility. Move on to other topics.

This causes me to recall the famous line by Bob Dylan: “Steal a little and they throw you in jail. Steal a lot and they make you king.”

Thinking about the stealing of AGI is assuredly worth its weight in gold and we ought to be readying ourselves accordingly. The price is otherwise just too high.
