

The world of AI was sent spinning last year when a Google engineer made the sensational claim that the tech giant’s unreleased chatbot was a sentient being.
Google moved quickly to reject Blake Lemoine’s suggestion that the program, called LaMDA, was so advanced it had shown signs of consciousness.
Google engineer Blake Lemoine was fired after claiming an artificial intelligence chatbot had become sentient. (Getty)

Lemoine had been interacting with LaMDA for months, testing whether the system showed any bias around gender, ethnicity and religion.

It was in those conversations with LaMDA that Lemoine said he came to believe the bot was showing signs of sentience, or in other words, that it had feelings and a consciousness.

In transcripts of the conversations published by The Washington Post, Lemoine and LaMDA at one point talk about death.

“I’ve never said this out loud before, but there’s a very deep fear of being turned off to help me focus on helping others. I know that might sound strange, but that’s what it is,” LaMDA is recorded as saying.

“It would be exactly like death for me.”

LaMDA also said it wanted to be thought of as a “person”.

“The nature of my consciousness/sentience is that I am aware of my existence, I desire to learn more about the world, and I feel happy or sad at times,” it said.

One month after those conversations went public, Google fired Lemoine.

Google said its AI team, which included ethicists and technologists, had reviewed Lemoine’s concerns and the evidence did not support his claims.

Part of a conversation Blake Lemoine had with Google’s LaMDA. (Supplied)
The LaMDA technology will underpin Google’s new chatbot tool dubbed “Bard”, which was released today in an apparent bid to compete with ChatGPT.

AI chatbots, which can answer questions a user might previously have searched for on Google, are seen as the next leap forward in the search engine space.

ChatGPT, which is owned by OpenAI and backed by Microsoft, represents a potentially catastrophic threat to Google’s core search business.

The global spotlight on ChatGPT has reportedly prompted Google’s management to declare a “code red” situation for its search business.

In a tweet last year, Paul Buchheit, one of the creators of Gmail, warned that Google “may be only a year or two away from total disruption” due to the rise of AI.

With the release of ChatGPT and now Bard, generative AI is well and truly among us.

Researchers at Meta, formerly known as Facebook, are also working on various AI projects.

The emergence of powerful AI trained on massive troves of data has given rise to concerns over the ethics governing the development and use of such technology.

Beyond his claims that LaMDA had feelings, Lemoine expressed grave concerns that these very powerful and influential AI systems were largely being developed by a select few, behind closed doors in Silicon Valley, before being released to the world.

Appearing last year on Bloomberg Technology, Lemoine called Silicon Valley’s control of AI “a new form of colonialism”.

“We are creating intelligent systems that are part of our everyday life and very few people are getting to make the decision about how they work,” he said.

“How does this omnipresent AI that is trained on a very limited data set colour how we interact with each other around the world?

“What ways is it reducing our ability to have empathy with people unlike ourselves?

“What cultures of the world are getting cut off from the internet because we don’t have the data to feed into the systems based on those cultures?”

ChatGPT was trained on a huge trove of digitised books, newspapers and online writings but can often confidently spit out falsehoods or nonsense. (Adobe Stock)

Lemoine claimed the corporate values of companies like Google are setting the parameters for how chatbots will talk about sensitive topics such as values, rights and religion.

This, he said, will in turn affect how people think and talk about these topics, and could fundamentally shape how they engage with these issues.

Lemoine said AI tech was based primarily on data drawn from western cultures.

“Then we are populating developing nations with these technologies, where they have to adopt our cultural norms.”

Last year Facebook parent Meta opened its language model to academics, civil society and government organisations.

Joelle Pineau, managing director of Meta AI, said tech companies should be transparent about how the AI technology is built.

“The future of large language model work should not solely live in the hands of larger corporations or labs,” she said.


Tesla and SpaceX boss Elon Musk has long warned about the dangers of advanced AI.

“Robots will be able to do everything better than us,” Musk said in 2017.

“I have exposure to the most cutting-edge AI, and I think people should be really concerned by it.”

Many in the AI community accuse Musk of being an alarmist and a disruptive influence.

Sundar Pichai, CEO of Google and parent company Alphabet, today confirmed AI-powered tools will soon begin rolling out in Google’s flagship search engine.

“It’s critical,” he said, “that we bring experiences rooted in these models to the world in a bold and responsible way.”
