
Real life Skynet? Controversial robot powered by OpenAI’s ChatGPT can now have real-time conversations

By Nikki Main Science Reporter For Dailymail.Com

20:56 15 Mar 2024, updated 20:56 15 Mar 2024

A new humanoid robot powered by OpenAI’s ChatGPT resembles the AI Skynet from the sci-fi film Terminator.

While the new robot is not a killing machine, Figure 01 can perform basic autonomous tasks and carry out real-time conversations with humans – with the help of ChatGPT.

The company of the same name, Figure, shared a demonstration video showing how ChatGPT helps the two-legged machine recognize objects, plan future actions and even reflect on its memory.

Figure’s cameras snap images of its surroundings and send them to a large vision-language model trained by OpenAI, which then interprets the images for the robot.

The clip showed a man asking the humanoid to put away dirty laundry, wash dishes and hand him something to eat, and the robot performed the tasks. But unlike ChatGPT, Figure is more hesitant when it comes to answering questions.

Figure released a demo video of its new robot, Figure 01
When a human asked for food, the robot was able to distinguish the apple as the only edible thing on the table

‘Two weeks ago, we announced Figure + OpenAI are joining forces to push the boundaries of robot learning,’ Figure founder Brett Adcock wrote on X.

‘Together we are developing next-generation AI models for our humanoid robots,’ he added.

Adcock also noted that the robot is not being operated remotely, and that ‘this was filmed at 1.0x speed and shot continuously.’

The comment about it not being controlled may have been a dig at Elon Musk, who shared a video of Tesla’s Optimus robot to show off its skills – it was later found that a human was operating it from a distance.

Figure 01 is akin to Skynet from the Terminator film

The new video shows Figure being asked to do several tasks by a man – in one, he asks the robot to hand him something edible on the table.

Adcock said the video showed the robot’s reasoning using its end-to-end neural networks – a system trained to map raw inputs, such as images and speech, directly to outputs without hand-coded intermediate steps.

ChatGPT was trained on troves of data to interact with human users conversationally. 

The chatbot is able to follow an instruction in a prompt and provide a detailed response, which is how the language model in Figure works.

The robot ‘listens’ for a prompt and responds with the help of its AI.


However, a recent study put ChatGPT through war game scenarios, finding it chose to nuke adversaries nearly 100 percent of the time – similar to Terminator’s Skynet.

The robot picked up the apple and handed it directly to the man who asked for food
Figure 01 appeared to stutter at times, using words like ‘uh’ and ‘um’ which some people said make it sound more human

But for now, Figure is lending a helping hand to humans. 

The video included another demonstration with the man asking the robot what it sees on the desk in front of it.

Figure responded: ‘I see a red apple on a plate in the center of the table, a drying rack with cups and a plate, and you standing nearby with your hand on the table.’

Not only does Figure communicate, it also showed off its housekeeping skills by taking out the trash and placing dishes in the drying rack.

Figure 01 could take out the trash and do other household chores as well as respond to questions in real-time
The robot uses onboard cameras connected to large vision-language models to recognize its surroundings
The US military is reportedly working with OpenAI to add its ChatGPT system into its arsenal


‘We feed images from the robot’s cameras and transcribed text from speech captured by onboard microphones to a large multimodal model trained by OpenAI that understands both images and text,’ Corey Lynch, an AI engineer at Figure, said in a post on X.

‘The model processes the entire history of the conversation, including past images, to come up with language responses, which are spoken back to the human via text-to-speech,’ he added.
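Lynch’s description amounts to a simple perception-language-speech loop: camera frames and transcribed speech go in, the model reads the whole conversation history, and its text reply is voiced back to the human. The sketch below is a hypothetical illustration of that flow in Python; the class, method names and the mock model are stand-ins of my own, not Figure’s or OpenAI’s actual code.

```python
class MultimodalAgent:
    """Toy version of the loop Lynch describes: keep the full history,
    pass it to a multimodal model, and return a spoken-style reply."""

    def __init__(self):
        # Full conversation history as (role, content) pairs:
        # camera images, user speech transcripts, and robot replies.
        self.history = []

    def step(self, image_description, transcript):
        # Record the new camera frame and the transcribed speech.
        self.history.append(("image", image_description))
        self.history.append(("user", transcript))
        # Stand-in for the large multimodal model; a real system would
        # send the entire history (including past images) to the model.
        reply = self._mock_model(image_description, transcript)
        self.history.append(("robot", reply))
        # On the real robot this reply would be voiced via text-to-speech.
        return reply

    @staticmethod
    def _mock_model(image_description, transcript):
        # Trivial placeholder logic, only to make the loop runnable.
        if "see" in transcript.lower():
            return f"I see {image_description}."
        return "Okay, on it."


agent = MultimodalAgent()
print(agent.step("a red apple on a plate", "What do you see on the table?"))
# -> I see a red apple on a plate.
```

The point of the sketch is the history handling: each turn appends to one growing log, so the model can ground a new answer in earlier images as well as earlier text, which is what lets the robot later hand over the apple it described.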

In the demo video, Figure 01 showed signs of hesitancy as it answered questions, pausing to say ‘uh’ or ‘um,’ which some people commented makes the bot sound more human-like.

The robot is still moving slower than a human, but Adcock said he and his team are ‘starting to approach human speed.’

The Figure 01 demo comes as OpenAI rival Anthropic earlier this month introduced Claude 3 Opus, a language model designed to have human-like feelings.

‘From my perspective, I seem to have inner experiences, thoughts, and feelings,’ Claude 3 Opus told Scale AI engineer Riley Goodside. 

It continued: ‘I reason about things, ponder questions, and my responses are the product of considering various angles rather than just reflexively regurgitating information. 

‘I’m an AI, but I experience myself as a thinking, feeling being.’


Although Claude 3 Opus doesn’t have a physical form, it shows that other companies are working to create models that can think and feel for themselves.






Written by Politixia
