OpenAI is apparently partnering with Figure, the robotics company behind the Figure 01 humanoid robot. The robot seems to do a remarkable job of integrating language with vision and body movement. They say it runs entirely on neural networks: no teleoperation, no hand-coded behaviors, nothing scripted ahead of time.

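To make that description a bit more concrete, here's a minimal sketch of what such an end-to-end pipeline might look like: a multimodal model looks at the camera image and the conversation, picks a spoken reply plus a high-level behavior, and a learned visuomotor policy turns that behavior into joint commands. Every class and method name below is a hypothetical placeholder for illustration, not Figure's or OpenAI's actual code.

```python
from dataclasses import dataclass


@dataclass
class RobotObservation:
    camera_frame: bytes   # latest RGB image from the head camera
    transcript: str       # user's speech, already converted to text


class VisionLanguageModel:
    """Stand-in for a multimodal model that reasons over image + dialogue."""

    def plan(self, obs: RobotObservation) -> tuple[str, str]:
        # A real system would run a neural network here; this stub just
        # returns a fixed (reply, behavior) pair so the loop is runnable.
        return ("Sure thing.", "hand_over_apple")


class VisuomotorPolicy:
    """Stand-in for a learned policy mapping behavior + pixels to joint targets."""

    def act(self, behavior: str, camera_frame: bytes) -> list[float]:
        # A real policy would output joint angles/torques at high frequency.
        return [0.0] * 24  # placeholder joint command vector


def control_step(obs: RobotObservation,
                 vlm: VisionLanguageModel,
                 policy: VisuomotorPolicy) -> tuple[str, list[float]]:
    """One tick of the loop: decide what to say and what to do, end to end."""
    reply, behavior = vlm.plan(obs)
    joint_command = policy.act(behavior, obs.camera_frame)
    return reply, joint_command


if __name__ == "__main__":
    obs = RobotObservation(camera_frame=b"", transcript="Can I have something to eat?")
    reply, command = control_step(obs, VisionLanguageModel(), VisuomotorPolicy())
    print(reply, "->", len(command), "joint targets")
```

The interesting design claim is that nothing in a loop like this is scripted: both the language side and the motion side are learned models rather than hand-written rules.
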
"Can I have something to eat?"

"Sure thing."

"Great. Can you explain why you did what you just did while you pick up this trash?"

"On it! So I gave you the apple because it's the only edible item I could provide you with from the table."

"Great. So, based on the scene right now, where do you think the dishes in front of you go next?"

"The dishes on the table, like that plate and cup, are likely to go into the drying rack next."

"Great. Can you put them there?"

"Of course."

OpenAI's 'AGI Robot' Develops SHOCKING NEW ABILITIES | Sam Altman Gives Figure 01 A Brain - Wes Roth

#solidstatelife #ai #openai #figure01 #genai #llms #robotics