The world of AI and robotics saw incredible advancements in 2024, with new technologies emerging that blur the line between science fiction and reality. From shockingly human-like robots to AI systems rapidly developing advanced capabilities, the progress made this year has been nothing short of astounding. These breakthroughs raise important questions about the future of humanity and our relationship with increasingly sophisticated machines.
Transcript
00:00:00I don't know about y'all, but these past 12 months in robotics have had me shook.
00:00:06Like the advancements we've seen are borderline scary at this point.
00:00:09We're going to break it all down chronologically from July 2023 to now.
00:00:13But let me just set the stage real quick.
00:00:15You had AI companies dropping the craziest humanoid AI robots that are legit starting
00:00:20to think and learn autonomously.
00:00:22I'm talking Ameca, Optimus, this whole new breed of androids.
00:00:26Then you had big dogs like Boston Dynamics.
00:00:28Oh, and get this.
00:00:29There's even a company that's low key created a robotic body ready for freaking AGI.
00:00:33It's been that level of crazy.
00:00:35And honestly, it's been wild just trying to keep up with all the major updates and breakthroughs.
00:00:39But don't worry, I got you covered.
00:00:41In this video, we're going to take it step by step, going through all the biggest robotics
00:00:46news chronologically from July 2023 until now.
00:00:49So let's just get started.
00:00:53So Ameca, a groundbreaking robot from Engineered Arts that talks and mimics human expressions
00:00:59and movements, now draws too, making it a fascinating blend of tech and art.
00:01:04This AI robot looks like a human but has no specific gender or race, with a gray
00:01:08rubber face and hands.
00:01:10It can move its arms, fingers, neck and face, letting it show emotions and even talk.
00:01:14With microphones, cameras and facial recognition software, it can interact with people just
00:01:19like a human.
00:01:20Now, recently, OpenAI and a Norwegian startup called 1X made an exciting announcement
00:01:26about the launch of their first physical robot named NEO, which looks like a futuristic
00:01:31astronaut.
00:01:32It can move around, handle things, talk and sense its surroundings thanks to its arms,
00:01:37legs, cameras, speakers, microphones and sensors.
00:01:41Ameca and NEO are excellent examples of AI in human-like form.
00:01:45They don't just respond to commands, they can also show their intelligence and creativity.
00:01:50Like when they draw.
00:01:51Let's examine how they managed to do this.
00:01:54So first, let's talk about Ameca, one of the most advanced human-shaped robots in the
00:01:59world, representing the forefront of humanoid robotics technology.
00:02:03It is designed specifically as a platform for development into future robotics technologies,
00:02:08such as artificial intelligence and machine learning.
00:02:11One of the most impressive features of Ameca is its realistic human-like appearance and
00:02:15motion.
00:02:16Ameca can mimic human expressions such as smiling, frowning or winking, using 24 motors
00:02:21in its face.
00:02:22It can also move its head, eyes and mouth to follow and engage with people.
00:02:27It can speak in different languages and accents using text-to-speech synthesis or pre-recorded
00:02:32audio.
00:02:33It can also gesture with its arms and hands using 20 motors in each arm and 12 motors
00:02:38in each hand.
00:02:39It can even play rock-paper-scissors or give a thumbs up.
00:02:42But Ameca is not just a pretty face.
00:02:45It also has a high degree of dexterity and fine motor skills, which enable it to perform
00:02:49tasks that require precision and coordination.
00:02:53For example, Ameca can write with a pen, type on a keyboard, play the piano or solve
00:02:58a Rubik's cube.
00:02:59It can also draw with a pencil, which is one of its newest and most amazing skills.
00:03:03Ameca's drawing ability was developed by Engineered Arts in collaboration with researchers
00:03:08from the University of Bristol and the University of Bath.
00:03:12The project was funded by the Engineering and Physical Sciences Research Council and
00:03:17was presented at the International Conference on Robotics and Automation in May 2023.
00:03:23The project aimed to use artificial intelligence to make drawing more imaginative and accurate.
00:03:28The researchers used a combination of techniques to achieve this goal: generative adversarial
00:03:33networks; style transfer, a method that changes the style of one picture to match another,
00:03:39like making a photo look like a painting; inverse kinematics, a technique for determining
00:03:43the movements of a robot arm from its intended final position, like holding a pencil;
00:03:49and proportional-integral-derivative (PID) controllers,
00:03:52which adjust a motor's speed and force to achieve a specific result, like drawing a line.
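The PID idea can be shown in a few lines of Python. This is a minimal sketch only: the gains, timestep, and toy "motor" model below are illustrative assumptions, not values from Ameca's actual controllers.

```python
# Minimal discrete PID controller: a sketch of the idea described above.
# The gains, timestep, and crude motor model are illustrative assumptions.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        # Combine proportional, integral, and derivative terms into one command.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a simulated motor position from 0.0 toward a target of 1.0.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
position = 0.0
for _ in range(1000):
    position += pid.step(1.0, position) * 0.01  # toy integrator standing in for a motor
print(f"final position: {position:.3f}")
```

The proportional term reacts to the current error, the integral term removes steady-state offset, and the derivative term damps overshoot, which is why the position settles close to the target.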
00:03:57Using these methods, Ameca can draw different types of images, like portraits, landscapes,
00:04:02animals, and even abstract art.
00:04:05It can also draw pictures based on text descriptions, such as a cat with sunglasses or a fire-breathing
00:04:11dragon.
00:04:13It can even make drawings from its own thoughts, by using random data as its input.
00:04:17To draw an image, Ameca first uses a GAN to generate an image from a text prompt or
00:04:23noise.
00:04:24Then, it uses style transfer to apply a desired style to the image, such as sketch, watercolor
00:04:30or oil painting.
00:04:31Next, it uses IK to calculate the joint angles of its arm and hand to trace the contours
00:04:36of the image.
00:04:37Finally, it uses PID controllers to control the motors of its arm and hand to draw the
00:04:43image on paper.
00:04:44The result is stunning.
00:04:45Ameca can draw images that are not only realistic, but also expressive, diverse, and
00:04:51original.
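The four-stage pipeline just described (generate an image, stylize it, solve the arm angles, drive the motors) can be sketched roughly as follows. The GAN and style-transfer stages are stand-in stubs, since the real models aren't public; only the two-link inverse-kinematics step uses the standard textbook math.

```python
import math

# Sketch of the generate -> stylize -> IK -> PID pipeline described above.
# generate_image and apply_style are illustrative stubs, not the real models.

def generate_image(prompt):
    # Stand-in for a GAN: return contour points of a unit circle.
    return [(math.cos(t / 10), math.sin(t / 10)) for t in range(63)]

def apply_style(points, style="sketch"):
    # Stand-in for style transfer: a pass-through in this sketch.
    return points

def inverse_kinematics(x, y, l1=1.0, l2=1.0):
    # Two-link planar arm: solve shoulder and elbow angles to reach (x, y).
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

contour = apply_style(generate_image("a cat with sunglasses"))
joint_path = [inverse_kinematics(x, y) for x, y in contour]
# Each (shoulder, elbow) pair would then be handed to PID loops on the arm motors.
print(f"{len(joint_path)} joint targets computed")
```

A solution like this is usually sanity-checked with forward kinematics: x = l1·cos θ1 + l2·cos(θ1 + θ2), and similarly for y, should recover each contour point.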
00:04:52Now, most of us are familiar with OpenAI because of ChatGPT.
00:04:55However, OpenAI isn't just about making software, they also focus on hardware.
00:05:01And that's why they have developed NEO, their first physical robot.
00:05:04NEO is a human-like robot powered by OpenAI's systems such as GPT-4, DALL-E, Codex, and CLIP.
00:05:11It can learn, understand commands, and even create images or code.
00:05:16Designed to look like a future astronaut, it can move, handle objects, and communicate.
00:05:20OpenAI plans to use NEO as a testbed for their vision of Artificial General Intelligence
00:05:25– AGI – which is the ultimate goal of creating AI that can perform any task that
00:05:31a human can.
00:05:32It was unveiled in July 2023 at the NEO Accelerator, a startup accelerator founded by Silicon
00:05:38Valley investor Ali Partovi, which partnered with OpenAI and Microsoft to provide free
00:05:43software and mentorship for budding AI companies.
00:05:47And it was one of the highlights of the event, where it demonstrated some of its abilities
00:05:51and tasks.
00:05:52NEO is one of the most innovative and ambitious robots in the world, representing OpenAI's
00:05:57vision for the future of AI and robotics.
00:06:00It has a customizable design and the latest tech features.
00:06:03Its parts can be changed based on the task or place.
00:06:06For instance, it can swap its legs for wheels or its gripping tools depending on what's
00:06:11needed.
00:06:12It can also interact with its parts wirelessly through Bluetooth or Wi-Fi.
00:06:15Furthermore, NEO uses OpenAI's AI systems like GPT-4, DALL-E, Codex, and CLIP that enable
00:06:24it to understand and respond to spoken commands, create realistic images from text, write functional
00:06:29code from spoken prompts, and learn from text-based guidance to identify objects, faces, and feelings.
00:06:36With these abilities, NEO can perform a variety of tasks, such as drawing, playing games,
00:06:42and solving puzzles.
00:06:43For instance, it can draw pictures, learn and play games like chess, and solve logical
00:06:49puzzles like Sudoku.
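To make the multi-model setup concrete, here's one hedged sketch of how a spoken command might be routed to the right capability. The keyword rules and capability names are purely illustrative assumptions; OpenAI hasn't published how (or whether) NEO does this.

```python
# Illustrative routing of a transcribed voice command to a capability.
# The keywords and capability names are assumptions for this sketch only.

def route_command(transcript):
    text = transcript.lower()
    if any(word in text for word in ("draw", "picture", "image")):
        return "image_generation"   # would call a DALL-E-style model
    if any(word in text for word in ("code", "program", "script")):
        return "code_generation"    # would call a Codex-style model
    return "conversation"           # fall back to a GPT-4-style chat model

print(route_command("Please draw a cat with sunglasses"))
```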
00:06:50Ameca and NEO are both humanoid robots that can draw and do amazing things using artificial
00:06:56intelligence, but they are also very different in many ways.
00:07:00They represent different approaches and possibilities in the field of robotics and AI.
00:07:04Ameca is designed to be human-like and interacts with people using natural language and movements.
00:07:10Its art comes from turning words or noise into images, which it then draws on paper.
00:07:15Although Ameca's artwork is unique and expressive, the quality depends on how well the images
00:07:20are generated and traced.
00:07:22On the other hand, NEO is designed to be versatile, able to do any human task using
00:07:27artificial intelligence.
00:07:29It makes art by transforming words or noise into code, which it then runs on its internal
00:07:35computer or screen.
00:07:36While NEO's drawings are accurate and realistic, they are limited by the complexity of the
00:07:40code and the display resolution.
00:07:43Both robots are remarkable but have different goals.
00:07:45Ameca aims to express itself through art, valuing creativity and diversity.
00:07:50Meanwhile, NEO wants to learn and improve its abilities, prioritizing intelligence and
00:07:55efficiency.
00:07:56While both robots are impressive, they bring up societal questions and challenges.
00:08:01AI robots like Ameca and NEO can bring lots of changes to society, some good and some
00:08:07bad.
00:08:08They can make learning and entertainment more exciting, help us with tough or boring jobs
00:08:13and spark new creative ideas, but they also have their downsides.
00:08:17They might take over some people's jobs, question the importance of human values, or
00:08:21even harm us if they malfunction or are used wrongly.
00:08:25It's important that we design and use these robots properly, so they benefit us and don't
00:08:29cause harm.
00:08:30AI and robotics have an exciting yet unpredictable future.
00:08:34These future AI robots will be smarter and more human-like, impacting how we live, work
00:08:39and view ourselves.
00:08:40We're committed to guiding this future, ensuring ethical and responsible use of these technologies.
00:08:46Our goal is a world where humans and AI robots harmoniously coexist.
00:08:53All right, now let's talk about China's latest craze, human-like AI robots.
00:08:58So China is creating robots that look and act like us, but better apparently.
00:09:03First, let me introduce you to EX Robots, a company that specializes in making robots
00:09:08with real-time facial replication.
00:09:11That means they can copy your expressions and emotions in a blink of an eye.
00:09:15The CEO of EX Robots, Li Boyang, demonstrated this by making faces at a robot, which then
00:09:20mirrored them back at him.
00:09:22It was like watching a weird game of Simon Says, but with more wrinkles and less fun.
00:09:28EX Robots claims that their robots can interact naturally with humans, thanks to AI and language
00:09:34models.
00:09:35They say they can understand what you say and respond accordingly.
00:09:38But what if I said, looking like a melted waxwork there, buddy?
00:09:42Would they cry oily tears or just serve up some epic robo-repartee?
00:09:45I guess we'll have to wait and see.
00:09:48The technology behind these lifelike expressions is quite impressive, though.
00:09:53EX Robots uses movable electric units on the robotic faces, which can simulate different
00:09:58muscles and movements.
00:09:59They also have a specialized system for generating expressions, either by using predefined scenarios
00:10:05or by collecting biological data from human faces.
00:10:09So basically, they can steal your smile and use it for themselves.
00:10:13How creepy is that?
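A hedged sketch of that mirroring loop might look like the following: measure a few expression intensities from the camera, then map them onto face actuators. The landmark names, actuator names, and 0-to-1 ranges are all assumptions for illustration; EX Robots hasn't published its interface.

```python
# Illustrative mapping from detected expression intensities to face actuators.
# All names and ranges here are assumptions, not EX Robots' real interface.

def mirror_expression(landmarks):
    """Map normalized expression measurements (0..1) to motor targets (0..1)."""
    smile = landmarks.get("smile_intensity", 0.0)
    brows = landmarks.get("brow_raise", 0.0)
    targets = {
        "mouth_corner_left": smile,
        "mouth_corner_right": smile,
        "brow_left": brows,
        "brow_right": brows,
    }
    # Clamp every command into the actuators' safe range.
    return {k: max(0.0, min(1.0, v)) for k, v in targets.items()}

frame = {"smile_intensity": 0.8, "brow_raise": 1.3}  # 1.3 is out of range
print(mirror_expression(frame))
```

In a real system this mapping would run every camera frame, which is what makes the mirroring feel instantaneous.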
00:10:14But facial expressions are not the only thing these robots can do.
00:10:18They can also perform various tasks and abilities that require skill and finesse.
00:10:23For example, there was a barista robot that could create latte art with precision and
00:10:28flair.
00:10:29It could draw anything from hearts to flowers to pandas on your coffee foam.
00:10:34I have to admit, that's pretty cool.
00:10:36There were also humanoid robots that could play basketball and dance like pros.
00:10:40They could dribble, shoot, pass, and dunk with ease.
00:10:43They could also groove to the music and execute complex choreographies with grace.
00:10:48They were powered by AI foundation models, which enable them to learn from data and improve
00:10:53their performance over time.
00:10:55They also have increased autonomic decision-making abilities, which means they can adapt to different
00:11:00situations and environments.
00:11:03These robots aren't just cool toys, they're seriously clever.
00:11:06For instance, there's this adorable panda robot named Yuyu by Yubtek.
00:11:11If you say, I'm thirsty, it won't just look cute.
00:11:14It'll actually go get a drink from the fridge for you, even open the bottle and hand it
00:11:18to you.
00:11:19Then there's Cloud Ginger by Data-A Robotics.
00:11:22This guy is hooked up to the cloud and runs on RobotGPT, which is a language
00:11:27model built specially for robots.
00:11:28What's cool is that Cloud Ginger can groove to traditional Chinese dances and chat with
00:11:33folks in different ways.
00:11:34Plus, if you speak a different language or have a special dialect, no worries.
00:11:39It can switch up and chat right back in your preferred language.
00:11:43But they are also more than just playthings for those into gadgets and gizmos.
00:11:48They're getting jobs in real places like schools, big meeting spots, fancy hotels, and elderly
00:11:55care facilities.
00:11:56EX Robots even says their robots can do the heavy lifting in different jobs, like cleaning
00:12:01up, keeping an eye on things, delivering stuff, and checking things out.
00:12:06Honestly, this is just them getting started.
00:12:09Imagine a future where these robots are everywhere, in factories, on farms, or even chilling in
00:12:14our living rooms.
00:12:16They might just end up being our pals, helping hands, our personal tutors, or even our buddies.
00:12:22But this also raises some questions and concerns.
00:12:25How will these robots affect our society and economy?
00:12:29How will they impact our jobs and livelihoods?
00:12:32How will they coexist with us in harmony or conflict?
00:12:35Well, according to China's Vice Minister of Industry and Information Technology Xu Xiaolan,
00:12:41China is leading the world in the robot industry and has great potential for growth and innovation.
00:12:47She said that China aims to create a robot ecosystem that integrates manufacturing, service,
00:12:54education, medical care, entertainment, and other fields.
00:12:58She also said that China will promote the development of intelligent robots that are safe, reliable,
00:13:04adaptable, collaborative, and ethical.
00:13:08Sounds good on paper, but I'm not so sure about reality.
00:13:10I mean, don't get me wrong, I think these robots are amazing feats of engineering and
00:13:15creativity, but I also think they are kind of scary and unsettling.
00:13:20I mean, do we really need robots that look and act like us?
00:13:24Do we really want robots that can do everything we can do?
00:13:27Do we really trust robots that can think for themselves?
00:13:31One of the most controversial issues in the debate about AI is whether AI can ever be
00:13:36conscious.
00:13:37And if so, does that mean they deserve rights as humans do?
00:13:42I put this question out to you all earlier and you guys had some mixed feelings about it.
00:13:47Some say no-way AIs are just fancy machines, they don't really feel or think, they just
00:13:53mimic us.
00:13:55Because they're not built like us, they can't experience things as we do, they're just tools.
00:14:01Others believe that if an AI gets advanced enough, it might just have feelings and self-awareness.
00:14:07It doesn't matter if they're made of silicon or cells, it's about how complex and organized
00:14:12they are.
00:14:13But hey, I'm still curious.
00:14:18Ameca is the most advanced human-like robot in the world, created by a British company
00:14:23named Engineered Arts.
00:14:25In this video, I'll share with you the newest updates about this robot and what it predicts
00:14:29for the future of humanity.
00:14:31Later in the video, we'll also discuss Microsoft's latest AI gadget, which is an AI backpack
00:14:37that can see and hear everything just like you do, and it acts as your personal companion.
00:14:42These are exciting times, aren't they?
00:14:44But before we dive into the details about the Ameca robot, please make sure to subscribe
00:14:49to our channel if you haven't yet, so you can keep up with all the latest news and updates
00:14:54in the field of artificial intelligence.
00:14:56Alright, so as we all know, Ameca is a human-shaped robot that can walk, talk, and interact with
00:15:02people using artificial intelligence.
00:15:04It has a realistic face that can show a range of emotions, from happiness to anger.
00:15:09It also has cameras in its eyes that can recognize faces and expressions.
00:15:14And it can speak more than 100 languages, thanks to a powerful AI system that uses natural
00:15:18language processing and generation.
00:15:21It made headlines last year when it released a series of videos showing its facial expressions
00:15:26and movements.
00:15:27The videos went viral on social media and attracted millions of views.
00:15:32People were fascinated and terrified by how lifelike and expressive it was.
00:15:37Some even compared it to the Terminator or the replicants from Blade Runner.
00:15:40So is Ameca here to harm us or take control?
00:15:44Actually, it's really hopeful about the future and what it can do to help society.
00:15:48In recent talks with the media, Ameca gave its thoughts on what life will look like in
00:15:53100 years, and it sounds quite interesting.
00:15:56The robot believes that in 100 years, people will be living better lives, will have made
00:16:00great strides towards sustainability and equality, and will have new technologies that
00:16:10make life easier and more enjoyable.
00:16:19We may even have ventured beyond the boundaries of Earth to explore other worlds.
00:16:23It also thinks a future where humans and robots live together would be great.
00:16:27We could learn from each other, work as a team to fix issues and make a better future
00:16:32for everyone.
00:16:33It also believes that robots should have rights and be treated well, just like people.
00:16:38Now this idea of giving robots equal rights is something that makes me uneasy, but it
00:16:42seems like we're getting closer to making this unusual concept a reality.
00:16:47You might be curious about how Ameca can act like it has human thoughts and feelings.
00:16:52Ameca runs on an AI system that uses something called generative models.
00:16:57These models help it understand and speak our language, and even control its facial
00:17:01expressions and movements.
00:17:04The system driving this behavior is similar to GPT-4,
00:17:06one of the most advanced AI models out there.
00:17:11However, these generative models aren't perfect.
00:17:14They can make mistakes, and their behavior can be shaped by the data they've been trained on.
00:17:19They don't actually understand or reason, they just process information to act like they do.
00:17:24So does Ameca truly think and feel like us?
00:17:27Not really, but it does have some skills that make it appear more human.
00:17:31For instance, it can pause before answering a question, which makes it seem like it's thinking.
00:17:36It can also show signs of emotion and connect with people in a way that feels genuine.
00:17:41But these impressions are not necessarily accurate or reliable.
00:17:45They are based on our perception and interpretation of Ameca's behavior and appearance.
00:17:50We tend to anthropomorphize machines that look or act like humans because we are wired
00:17:55to do so by evolution.
00:17:57We project our own feelings and expectations onto them because we want to relate to them.
00:18:01This is why some people love Ameca and find it optimistic and inspiring, while others
00:18:06hate it and find it creepy or woke.
00:18:09The public reactions to Ameca's statements have been mixed and polarized.
00:18:13Some think it's hopeful and are impressed by what it can do, like helping to teach or
00:18:18care for older people.
00:18:19Others think it's too optimistic and worry about it doing jobs that need a lot of patience,
00:18:24like looking after people with dementia.
00:18:26The discussions about Ameca are part of a bigger conversation about the future of
00:18:30artificial intelligence and how it will affect us.
00:18:33As A.I. gets more advanced, we have to think about both the good and bad things that could
00:18:38happen.
00:18:39We need to make sure A.I. is safe and helpful for everyone.
00:18:41This means people from all areas like government, businesses, and research need to talk and
00:18:46work together.
00:18:47We need to create rules that make sure A.I. is both innovative and responsible.
00:18:52So Ameca isn't just a robot.
00:18:54It's a glimpse into what the future of A.I. could be like.
00:18:57That future could be amazing, but also risky, and it's something we all need to help shape.
00:19:05Blending the realms of space travel and robotics, Ji-Hee Kim introduces an innovative creation,
00:19:11Laika, an A.I.-powered, realistic robot companion designed for human interaction.
00:19:16Ji-Hee Kim is a Korean engineer and artist who is passionate about creating human-robot
00:19:21interactions.
00:19:22She was inspired by the story of Laika, the first animal to orbit the Earth in 1957.
00:19:28Laika was a stray dog from the streets of Moscow who was sent to space by the Soviet
00:19:32Union as part of the Sputnik 2 mission.
00:19:35Unfortunately, the dog did not survive the journey, and her fate sparked a lot of controversy
00:19:40and debate about the ethics of animal testing in space.
00:19:44And Laika is not just a toy or a gadget, but almost a living being that can interact with
00:19:49humans and respond to their needs and emotions.
00:19:52When you first see it, you'll be struck by its realistic, almost natural look.
00:19:56Ji-Hee Kim aimed to make a robot dog that fits right into the natural world, avoiding
00:20:01a too robotic appearance.
00:20:03To achieve this, she skillfully combined 3D printing with soft materials for a flexible
00:20:08and smooth body.
00:20:10She even added fur and whiskers for that extra touch of realism.
00:20:13Underneath, Laika's titanium skeleton gives it strength and durability while keeping it
00:20:18light and nimble.
00:20:19The robot dog can move its head, ears, tail and legs in ways that seem quite natural,
00:20:25all thanks to its advanced joint movements and sensors.
00:20:28Laika comes loaded with an array of sensors to understand and engage with its environment.
00:20:32It's equipped with depth cameras, thermal imaging, microphones, speakers and touch sensors,
00:20:39letting it see, hear, feel and talk to humans and other robots.
00:20:43Another cool feature is Laika's ability to check on astronauts' health and mood.
00:20:48Using its ECG sensors, which track heart rate and stress levels, if Laika senses someone
00:20:53is feeling down or alone, it can show affection by wagging its tail, licking or cuddling.
00:20:59And when it detects happiness or excitement, it might bark, jump around or play.
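That mood check boils down to a thresholding rule over the vitals Laika measures. The cutoffs, the normalized stress score, and the behavior names below are assumptions for illustration, not Laika's published design.

```python
# Illustrative vitals-to-behavior rule; thresholds are assumed, not Laika's.

def choose_behavior(heart_rate_bpm, stress_score):
    """Pick a response from heart rate and a normalized (0..1) stress score."""
    if stress_score > 0.7 or heart_rate_bpm > 110:
        return "comfort"   # wag tail, lick, cuddle
    if stress_score < 0.3 and heart_rate_bpm > 90:
        return "play"      # bark, jump around
    return "idle"

print(choose_behavior(95, 0.2))
```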
00:21:03And this robot isn't just a space companion, it's also a handy helper for astronauts.
00:21:08It's designed to do a lot of different jobs that help out the crew during their space
00:21:12missions.
00:21:13With its cameras and sensors, Laika can scout the area and gather important data.
00:21:17It's got this flexible handle on its back, so it can carry tools and equipment too.
00:21:22Plus, it can talk to ground control and other robots through its speakers and microphones,
00:21:27and even connect to the internet to pull up information on its head-mounted screen.
00:21:31One of the coolest things about Laika is how it works with these special mixed reality glasses.
00:21:36Ji-Hee Kim created it to work perfectly with these glasses, which show holograms and virtual
00:21:41images in the real world.
00:21:43When astronauts wear these glasses, they can interact with Laika in a way that feels
00:21:47more real.
00:21:48They also get to see extra info this robot provides, like maps, weather updates and important
00:21:53alerts.
00:21:54These glasses make teaming up with Laika and talking to ground control much better.
00:21:58They're even useful for training, as they can simulate different scenarios for astronauts
00:22:02to practice.
00:22:03Essentially, Laika is a groundbreaking robot dog with the potential to really change space
00:22:08travel.
00:22:09It could help astronauts feel less stressed, lonely or bored.
00:22:12By assisting with their tasks, Laika could also boost their work efficiency.
00:22:16It's not just about space though.
00:22:18Laika's experiences could teach and inspire people back on Earth.
00:22:22It could even be used in different situations, like helping out in disasters, search and
00:22:26rescue operations, or as a therapy aid.
00:22:29Laika honors the first animal in space, showcasing human creativity and the drive to explore.
00:22:34This robot dog represents a new chapter in supporting space travelers, potentially transforming
00:22:39our experience of space travel.
00:22:41Alright, that wraps up our video.
00:22:43I really hope you found it interesting to learn about Laika, the AI-powered robot dog.
00:22:51So Tesla has just unveiled its latest and greatest project, Optimus Gen 2, an AI humanoid
00:22:57robot that looks like a futuristic knight in armor.
00:23:00It has almost human-like hand movements, a lighter weight than its predecessor and an
00:23:04enhanced walking speed.
00:23:06It also has custom-designed actuators and sensors that allow it to perform complex tasks
00:23:11with ease.
00:23:12Now, to understand what's different about the new Optimus, we need to talk about the
00:23:16original Optimus that Tesla first introduced back in August 2021 at their AI Day.
00:23:21The idea was to make a robot that could do tasks humans don't want to, like helping
00:23:25in factories or dealing with dangerous waste.
00:23:28Back then, Optimus looked more like a person in a robot costume than an actual robot.
00:23:33It was rolled out on stage, waved its hand, and did some simple moves.
00:23:37Elon Musk hoped to have a working model by 2022.
00:23:41Since then, Tesla has been busy making Optimus look and act more like a real robot.
00:23:45In September 2022, they showed off two versions of it at their second AI Day.
00:23:50One was named Bumble C, and it walked very carefully, almost like it was trying not to
00:23:54wake someone up.
00:23:55The other was the Optimus we know today.
00:23:58This version of Optimus had smoother, more human-like movements in its hands.
00:24:02It could move its arms and head on its own.
00:24:04But it still couldn't walk alone, so it had to be brought on stage with wheels.
00:24:08Elon Musk mentioned that he hoped to start making these robots for real by 2023.
00:24:13So far, things are looking promising.
00:24:16But there's still a lot we don't know about the final version of Optimus, including
00:24:20how it will manage balance and control, possibly using technology from Tesla's Autopilot.
00:24:25We're also curious about its ability to interact with humans and adapt to new environments
00:24:30and tasks.
00:24:31Finally, how Optimus will communicate with other robots or devices remains an intriguing
00:24:36question.
00:24:37But before diving too deep, let's focus on some of the key technical features that
00:24:41make the Optimus Gen 2 really stand out.
00:24:43A major highlight of this robot is its hands.
00:24:46They are built to move in all directions thanks to something called 11 degrees of freedom.
00:24:50This flexibility lets the hands do precise work, like picking up items or using tools.
00:24:55What's even more impressive is that these hands can sense touch.
00:24:59They have a special skin-like covering that can bend and stretch without damage.
00:25:04This covering is filled with sensors that can feel pressure, temperature, humidity,
00:25:08and even vibrations, which makes the robot's hands incredibly sensitive and versatile.
00:25:13Now, the new Optimus Gen 2 robot is really advanced with its tactile sensing.
00:25:19This means it can feel things around it in a way no other robot can, understanding if
00:25:23something is hot, cold, heavy, light, smooth, or rough.
00:25:27This helps it to interact in a more human-like way.
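As an illustration of how those tactile readings could feed the hot/cold, heavy/light, smooth/rough judgments just mentioned, here's a sketch with assumed sensor fields and thresholds; nothing here is from Tesla's actual firmware.

```python
# Illustrative tactile classification; field names and thresholds are assumed.

def describe_touch(reading):
    """Summarize one tactile reading as (temperature, weight, texture) labels."""
    t = reading["temperature_c"]
    temp = "hot" if t > 40 else ("cold" if t < 15 else "neutral")
    weight = "heavy" if reading["pressure_kpa"] > 50 else "light"
    texture = "rough" if reading["vibration_hz"] > 200 else "smooth"
    return temp, weight, texture

print(describe_touch({"temperature_c": 8, "pressure_kpa": 60, "vibration_hz": 50}))
```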
00:25:30What's also impressive is how fast it can walk, around 4 miles per hour.
00:25:35And it can move its head and neck by itself, which is great for looking around or talking.
00:25:39Elon Musk says Optimus Gen 2 is smart enough to learn from what happens around it.
00:25:44It can figure out how to use new things, solve problems, or even learn how to chat with new
00:25:48people or animals.
00:25:50Who's the deliverer for Sean?
00:25:51Uh, yeah, I'm Sean.
00:25:52What's up, man?
00:25:53Dude, if you do that again, you're gonna get hurt.
00:25:55May I take your luggage?
00:25:57But what about the applications of Optimus Gen 2?
00:26:00How will this robot benefit humanity?
00:26:02Well, there are a bunch of ways this robot could be really useful in different areas.
00:26:07First, in manufacturing, think of Optimus as a helping hand in factories.
00:26:11It could do stuff like lifting heavy things, putting parts together, or checking if things
00:26:15are made right.
00:26:16Basically, it can handle the tough or boring jobs.
00:26:19Then there's construction work.
00:26:21Optimus could be a game-changer on building sites, doing heavy lifting, digging, or clearing
00:26:26up messes.
00:26:27In our homes or offices, Optimus could make life easier by doing chores or providing services.
00:26:34Imagine it bringing you stuff, cleaning up, or even giving foot massages.
00:26:38In schools or universities, Optimus could step in as a teacher.
00:26:41It could explain difficult topics, show cool experiments, or answer students' questions.
00:26:46For fun, Optimus could be an entertainer in parks or theaters.
00:26:50It could play games, crack jokes, or even sing.
00:27:01Alright, now LG, a well-known electronics company, is all set to amaze everyone at CES
00:27:072024 with their latest creation, a small, friendly robot designed to make life at home
00:27:13easier and more enjoyable.
00:27:15This announcement, made just a few hours ago, is already creating quite a buzz.
00:27:19This adorable robot from LG stands about a foot tall and walks on two legs.
00:27:24It's not just for show.
00:27:25This little helper is packed with advanced technology to take care of your house.
00:27:29It can wander around your home, checking for any issues, keeping an eye on your pets, and
00:27:34even understanding how you feel to make your day better.
00:27:37Thanks to the Qualcomm Robotics RB5 platform, the robot gets smarter over time, learning
00:27:43about your likes and habits to become an even better companion.
00:27:47LG refers to this robot as a Moving Smart Home Hub, aimed at reducing the chores you
00:27:52have to do around the house.
00:27:54And it seems to do just that.
00:27:55Scheduled to be showcased at CES 2024 next week, the robot can connect with all the smart
00:28:01devices in your house.
00:28:02It can turn devices off when they're not needed or when you're not home.
00:28:06Plus, it can patrol your home, checking if windows are shut or if lights are left on,
00:28:11and alert you about it.
00:28:13When you're out, the robot isn't just sitting idle.
00:28:16It can keep an eye on your pets and even let you watch live video feeds to see what
00:28:19your furry friends are up to when you're away.
00:28:21It's also smart enough to monitor your home's temperature, humidity, and air quality,
00:28:26alerting you if something isn't right.
00:28:28But the robot's abilities don't end there.
00:28:30When you come home, it greets you at the door and can figure out how you're feeling
00:28:34by looking at your face and listening to your voice.
00:28:37Depending on your mood, it can play music or sounds to cheer you up.
00:28:41And if you tend to forget things like medication schedules, the robot can help remind you.
00:28:45To do all this, the robot is equipped with a camera, speaker, and various sensors.
00:28:51These help it understand what's happening in your home, from air quality to potential
00:28:55problems.
00:28:56It also uses voice and image recognition software, along with the ability to process natural
00:29:01language, to identify and respond to issues as they arise.
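LG hasn't published any of the robot's software, but the sensor-to-alert behavior described above can be sketched as a simple rule engine. Every sensor name and threshold below is invented purely for illustration:

```python
# Illustrative sketch of a sensor-to-alert rule engine, NOT LG's actual
# software. Sensor names and thresholds are made up for the example.

def check_home(readings: dict) -> list[str]:
    """Map a snapshot of raw sensor readings to human-readable alerts."""
    alerts = []
    if readings.get("temperature_c", 21) > 30:
        alerts.append("Temperature is unusually high")
    if readings.get("humidity_pct", 45) > 70:
        alerts.append("Humidity is above comfortable levels")
    if readings.get("air_quality_index", 50) > 150:
        alerts.append("Poor air quality detected")
    if readings.get("window_open", False):
        alerts.append("A window was left open")
    if readings.get("lights_on", False) and readings.get("occupants", 0) == 0:
        alerts.append("Lights are on in an empty home")
    return alerts

# Example patrol snapshot: hot room, open window, lights on, nobody home.
snapshot = {"temperature_c": 32, "window_open": True,
            "lights_on": True, "occupants": 0}
for alert in check_home(snapshot):
    print(alert)
```

The real hub would feed alerts like these to your phone and to its voice interface; the point here is just that "patrolling" reduces to repeatedly evaluating rules over sensor snapshots.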
00:29:05This robot is part of LG's bigger plan to introduce a range of smart home products that
00:29:10aim to make your life easier, fitting into their vision of a zero-labor home.
00:29:15Whether or not people will be comfortable with a robot that follows them and their
00:29:19pets around the house all day remains to be seen.

00:29:22LG is excited to display this robot at their CES booth next week, but they haven't shared
00:29:27any details about the price or when it will be available for purchase.
00:29:31So, Covariant, an innovative company in the field of robotics, is using the advanced AI
00:29:39that powers digital tools like ChatGPT to create robots that can operate in the physical
00:29:44environment.
00:29:45They want to use this technology to make robots that can work and learn in the real world,
00:29:50not just online.
00:29:51Founded by three former researchers from OpenAI, Covariant is making strides in the robotics
00:29:56industry by developing software that enables robots to learn and adapt through observation
00:30:01and interaction with their surroundings, much like how chatbots learn from vast repositories
00:30:06of textual data.
00:30:07Situated in Emeryville, California, Covariant focuses on enhancing the capabilities of robots
00:30:12used in warehouses and distribution centers.
00:30:15The essence of their technology lies in providing robots with an understanding of their environment,
00:30:21allowing them to perform tasks such as picking up, moving, and sorting items with an unprecedented
00:30:26level of sophistication.
00:30:28These robots, equipped with the ability to comprehend English, offer the possibility
00:30:32of human-like interaction, enabling operators to communicate with them as if they were conversing
00:30:38with a chatbot like ChatGPT.
00:30:41At the heart of Covariant's innovation is the integration of sensory data from cameras
00:30:46and other sensors with the extensive textual data similar to that used to train chatbots.
00:30:51This combination grants the robots a comprehensive understanding of their surroundings, empowering
00:30:57them to handle objects and situations they have not been explicitly programmed to deal
00:31:02with.
00:31:03For instance, a Covariant robot can identify and manipulate a banana by understanding instructions
00:31:08given in English, even if it has never encountered a banana before.
00:31:12This capability is a significant departure from traditional robotics, where machines
00:31:16perform a limited set of tasks in a highly controlled environment.
00:31:20Despite the promising advancements, Covariant's technology is not infallible.
00:31:25The robots occasionally make errors, such as misunderstanding instructions or dropping
00:31:29objects.
00:31:30These shortcomings highlight the challenges inherent in applying AI to physical tasks,
00:31:35where the complexity of the real world introduces variables that are difficult to anticipate
00:31:40and manage.
00:31:41Nonetheless, Covariant's work represents a critical step forward in the quest to create
00:31:46more intelligent, adaptable robots capable of operating in diverse environments, from
00:31:50warehouses to manufacturing plants and potentially even in autonomous vehicles.
00:31:54The foundation of Covariant's technology is built on neural networks, a form of AI
00:31:59inspired by the human brain's structure.
00:32:02These networks learn to recognize patterns and make decisions based on the analysis of
00:32:06large datasets.
00:32:08This learning process is similar to the development of digital AI applications, such as chatbots
00:32:13and image generators, which understand or create content by digesting vast amounts of
00:32:18information from the internet.
00:32:19By applying these principles to robotics, Covariant is enabling machines to learn from
00:32:24both digital data and sensory input from the physical world, blurring the lines between
00:32:29the digital and physical realms.
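Covariant hasn't detailed its architecture, but the core idea of grounding a language instruction in visual input can be shown with a toy example: embed the instruction and each detected object into a shared vector space, then pick the object whose features best match. The vectors below are hand-made stand-ins for what learned encoders would produce:

```python
# Toy illustration of matching a text instruction to visual features.
# In a real system both sets of vectors come from trained neural
# encoders; here they are invented so the matching step is visible.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Pretend image-encoder output: one feature vector per detected object.
objects = {
    "banana": [0.9, 0.1, 0.0],
    "apple":  [0.2, 0.8, 0.1],
    "box":    [0.1, 0.1, 0.9],
}

# Pretend text-encoder output for "pick up the banana".
instruction_vec = [0.85, 0.15, 0.05]

def pick_target(instruction, candidates):
    """Return the candidate object most similar to the instruction."""
    return max(candidates, key=lambda name: cosine(instruction, candidates[name]))

print(pick_target(instruction_vec, objects))  # -> banana
```

Because matching happens in embedding space rather than against a fixed list of programmed objects, a novel item can still be picked out as long as its visual features land near the instruction's embedding, which is the intuition behind handling a never-before-seen banana.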
00:32:31Covariant's ambitious project is backed by substantial financial support, with $222
00:32:35million in funding underscoring the industry's confidence in their approach.
00:32:39The company's focus on developing software rather than physical robots themselves allows
00:32:43for a versatile application of their technology across various sectors and types of robotic
00:32:48systems.
00:32:49This strategy aims to pave the way for broader adoption and innovation in the field of robotics,
00:32:53making robots more accessible and useful in a wide range of applications.
00:32:57The potential impact of Covariant's technology is vast, promising to transform how robots
00:33:03are deployed in numerous settings.
00:33:05By making robots more adaptable and capable of understanding and interacting with their
00:33:09environment in complex ways, Covariant is contributing to a future where machines can
00:33:14take on more sophisticated tasks, work alongside humans more effectively, and adapt to new
00:33:19challenges as they arise.
00:33:21This vision aligns with the broader trend of AI integration into our daily lives, where
00:33:26technology not only automates tasks, but also enhances human capabilities and experiences.
00:33:32Covariant's work also raises important questions about the future of work and the role of robots
00:33:37in society.
00:33:38As robots become more capable and versatile, they may assume responsibilities currently
00:33:42held by humans, leading to shifts in the labor market and necessitating discussions about
00:33:47retraining and education.
00:33:49Furthermore, the development of intelligent robots highlights ethical considerations,
00:33:53such as ensuring safety and managing the potential for AI systems to act in unpredictable ways.
00:34:00Covariant's efforts to bring AI technology from the digital world into the physical domain
00:34:05through robotics represent a significant leap forward in the field.
00:34:09By combining sensory data with the textual data used to train AI systems like ChatGPT,
00:34:14Covariant is enabling robots to understand and interact with their environment in ways
00:34:20that were previously unimaginable.
00:34:22Although challenges remain, the potential for this technology to revolutionize industries,
00:34:28enhance human-robot collaboration, and create new opportunities is immense.
00:34:35Guys, this is finally happening.
00:34:36There's a new reality where just thinking can make you play games, move stuff on a screen,
00:34:41and even control computers.
00:34:42This isn't some far-off dream.
00:34:44It's happening right now with Elon Musk's Neuralink.
00:34:48They just showed the world something that sounds unimaginable.
00:34:51Yet it's real.
00:34:52A person playing chess on the internet without moving, just using his thoughts.
00:34:56This isn't just a cool story, it's a whole new beginning.
00:34:59It's where thinking and technology come together, opening doors to things we've
00:35:03only dreamed about.
00:35:04Here we are, standing at the edge of something huge, where dreams, tech, and the power of
00:35:08our minds are changing everything.
00:35:10So in a groundbreaking event, Neuralink has captured global attention by live-streaming
00:35:15its first-ever patient using a brain chip to play online chess.
00:35:18The patient, 29-year-old Noland Arbaugh, became paralyzed below the shoulders after a tragic
00:35:24diving accident.
00:35:25Thanks to Neuralink's innovative technology, he can now control a computer cursor with
00:35:29nothing but his thoughts, reintroducing him to activities like playing his favorite game,
00:35:34Civilization VI, for hours on end.
00:35:37The live-stream, which aired on Musk's social media platform X, showcased Arbaugh maneuvering
00:35:42through the chess game on his laptop using the Neuralink device.
00:35:46This technology aims to empower individuals to interact with computers through thought
00:35:50alone, bypassing traditional physical interfaces.
00:35:53Musk previously mentioned that Arbaugh had been implanted with a Neuralink chip in January,
00:35:58which enabled him to control a computer mouse through thought.
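Neuralink hasn't published its decoding algorithm, but a common baseline in brain-computer interfaces is a linear decoder that maps per-channel neural firing rates to cursor velocity (production systems typically use more sophisticated, per-user-calibrated models such as Kalman filters). The weights below are invented purely to show the shape of the computation:

```python
# Minimal sketch of a linear BCI cursor decoder. NOT Neuralink's method:
# the channel weights here are made up; real decoders are learned during
# a calibration session with the user.

# Each row maps one recording channel's firing rate to (vx, vy).
WEIGHTS = [
    ( 0.5,  0.0),   # channel 0 pushes the cursor right
    (-0.5,  0.0),   # channel 1 pushes it left
    ( 0.0,  0.5),   # channel 2 pushes it up
    ( 0.0, -0.5),   # channel 3 pushes it down
]

def decode_velocity(firing_rates):
    """Weighted sum of channel activity -> 2-D cursor velocity."""
    vx = sum(r * w[0] for r, w in zip(firing_rates, WEIGHTS))
    vy = sum(r * w[1] for r, w in zip(firing_rates, WEIGHTS))
    return vx, vy

def step(cursor, firing_rates, dt=0.05):
    """Advance the cursor position by one decoded velocity step."""
    vx, vy = decode_velocity(firing_rates)
    return (cursor[0] + vx * dt, cursor[1] + vy * dt)

pos = (0.0, 0.0)
# Imagined movement: only channel 0 is active -> intent "move right".
for _ in range(10):
    pos = step(pos, [4.0, 0.0, 0.0, 0.0])
print(pos)  # cursor has drifted to the right
```

The loop runs continuously in a real system, re-decoding intent many times per second, which is what makes the cursor feel directly thought-controlled.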
00:36:01According to Arbaugh, the surgery was super easy, and he was discharged from the hospital
00:36:04just a day later without any cognitive impairments.
00:36:07However, Arbaugh and Musk are quick to remind everyone that this technology is still in
00:36:12its infancy.
00:36:13While Arbaugh's life has significantly improved, allowing him to engage in activities he thought
00:36:17were lost to him, he notes that the journey is far from over.
00:36:21The technology is not perfect, and they've encountered some issues along the way, signaling
00:36:25a long road ahead filled with improvements and enhancements.
00:36:29Kip Ludwig, a former program director for neural engineering at the U.S. National Institutes
00:36:33of Health, echoed this sentiment.
00:36:35He stated that while what Neuralink has demonstrated is far from a breakthrough, it represents
00:36:40a vital step forward for the patient and the field of neural engineering.
00:36:44It's still early days for the implant, and both Neuralink and Arbaugh are learning to
00:36:48maximize control and interaction capabilities.
00:36:51However, the fact that Arbaugh can interact with a computer in ways he couldn't before
00:36:55is a positive development.
00:36:57Adding to the intrigue, Neuralink has been under scrutiny by the U.S. Food and Drug Administration,
00:37:02which found issues with record-keeping and quality controls for animal experiments.
00:37:06This came shortly after Neuralink announced it had received clearance to test its brain
00:37:10implants in humans.
00:37:12The company had not responded to inquiries regarding the FDA's findings at the time.
00:37:16During the livestream, Arbaugh shared his astonishment at the technology, comparing
00:37:20it to using the Force from the Star Wars films to move the cursor on his screen.
00:37:25This intuitive control mechanism has transformed his interaction with technology, making everyday
00:37:31tasks feasible once again.
00:37:33Musk, on the other hand, has been promoting this demonstration as an example of telepathy,
00:37:37showcasing the potential for humans to control computers purely through thought.
00:37:42Neuralink's achievements come in the wake of similar efforts by other companies, such
00:37:45as Australia-based Synchron.
00:37:47Synchron has also developed a brain-computer interface, albeit through a less invasive
00:37:51method that doesn't require surgery on the skull.
00:37:54They successfully implanted their device in a patient back in July 2022, indicating a
00:38:00growing interest and advancement in the field of neural technology.
00:38:04Elon Musk took to Twitter to celebrate this milestone, calling it a demonstration of telepathy
00:38:10— the ability to control a computer and play video games by merely thinking.
00:38:14This event not only marks a significant step for Neuralink, but also sets the stage for
00:38:19future innovations.
00:38:20Musk teased the next product, Blindsight, aimed at restoring vision for those born blind,
00:38:25signaling Neuralink's ambitious roadmap ahead.
00:38:28We're right at the edge of a huge change in the way we use technology, and it's just
00:38:32the start for Neuralink.
00:38:33Sure, the tech isn't perfect, and there are some bumps along the way, but the chance
00:38:37to really make a difference in people's lives is huge.
00:38:41Seeing Arbaugh play chess with just his thoughts gives us a peek into a future where we're
00:38:45not just solving problems with tools and gadgets, but with the incredible power of our minds.
00:38:53Just one day after retiring its older hydraulic model, Boston Dynamics has introduced a new
00:38:57electric version of its Atlas robot.
00:38:59This marks a significant shift away from the noisy hydraulic mechanisms that characterized
00:39:04previous models.
00:39:05Now, Atlas is powered by an electric motor which, although not silent, offers a quieter
00:39:10operation that contrasts sharply with the hydraulic system's distinctive sounds.
00:39:14The unveiling of this new version was captured in a compelling video.
00:39:18It opens with Atlas lying prone on a set of interlocking gym mats, accompanied only by
00:39:23the soft whirring of its new electric motor.
00:39:27As the camera pans around, the robot begins a seemingly natural motion of bending its
00:39:31legs at the knees.
00:39:33This movement, initially reminiscent of human action, soon crosses into an uncanny valley,
00:39:38evoking a feeling similar to a scene from a Sam Raimi film.
00:39:41This clever maneuver allows Atlas to shift from lying on its back to standing upright,
00:39:46facing away from the camera.
00:39:48It then performs an eerie rotation of its head, followed by its torso, completing a
00:39:53180-degree turn to face the camera directly.
00:39:56The robot's face, featuring a ring light around a perfectly round screen, then rotates
00:40:01back as Atlas strides off the mats and out of the frame.
00:40:05This new electric Atlas is not just a technological evolution, but also a symbolic one, mirroring
00:40:11the transformative moment when Bob Dylan famously switched from acoustic to electric guitar.
00:40:16The motion of the new Atlas, while still showing some jerky movements, is notably smoother
00:40:21and more fluid compared to many of the commercial humanoids introduced in recent years.
00:40:26This progress illustrates a level of mechanical sophistication that brings to mind the confident
00:40:31and fluid movements of Spot, another robot from Boston Dynamics, suggesting a familial
00:40:36resemblance in their design philosophies.
00:40:39The physical appearance of Atlas has undergone a radical transformation.
00:40:43Gone are the bulky torso and awkward, bowed legs covered in protective plating, replaced
00:40:47by a streamlined, cable-free frame that bears more resemblance to contemporary robots like
00:40:53Agility's Digit and Apptronik's Apollo.
00:40:55This redesign leans towards a softer, more approachable aesthetic, eschewing the intimidating
00:41:00utilitarian look of past models for a friendlier, cartoonish appearance.
00:41:04Despite these substantial changes, Boston Dynamics has chosen to retain the Atlas name,
00:41:10continuing the legacy of its well-known brand into this new commercial phase.
00:41:14This decision contrasts with previous practices where research names were changed upon commercial
00:41:19release such as Spot Mini becoming Spot and Handle becoming Stretch.
00:41:24In an interview with TechCrunch, Boston Dynamics CEO Robert Playter discussed the company's
00:41:28strategy and future plans for Atlas.
00:41:30He highlighted that while the branding remains unchanged for now, it might be revisited as
00:41:35the project progresses toward mass production.
00:41:38The CEO also outlined the company's timeline, which includes starting pilot testing of the
00:41:43electric Atlas at Hyundai facilities early next year, with broader manufacturing goals
00:41:47set for the subsequent years.
00:41:49Playter emphasized the importance of understanding specific use cases to ensure that the investment
00:41:54in robotics is justified by sufficient productivity gains.
00:41:58He revealed that experiments with Hyundai are already underway, indicating a collaborative
00:42:03effort in refining the robot's design and functionality.
00:42:06The flexibility and range of motion in the new Atlas are particularly noteworthy.
00:42:11According to Playter, the robot incorporates custom high-powered actuators at most joints,
00:42:15which afford it the power and agility of an elite athlete.
00:42:19This capability is not just for show, it has practical implications in industrial applications
00:42:23where robots must perform complex, dynamic tasks reliably.
00:42:27Boston Dynamics is known for its viral videos that often highlight the agility and dexterity
00:42:32of its robots in dramatic, even theatrical ways.
00:42:35While these demonstrations are visually impressive and entertaining, they also serve a practical
00:42:39purpose by showcasing the robot's capabilities in scenarios that might occur in real-world
00:42:44settings.
00:42:45For example, the video's depiction of Atlas starting from a prone position and then standing
00:42:49up underscores its ability to recover from falls, an essential feature for maintaining
00:42:54productivity without human intervention in industrial environments.
00:42:58The design choices extend to the robot's hands, which now feature three fingers instead
00:43:04of the more human-like four or five.
00:43:06This simplification reduces mechanical complexity and enhances durability, crucial for repetitive
00:43:12industrial tasks.
00:43:13The robot's head has also been redesigned, now featuring a large round screen that adds
00:43:19a touch of friendliness and enhances interaction capabilities, which Playter notes will be critical
00:43:24as robots increasingly work alongside humans.
00:43:27Overall, the new Electric Atlas is a big step forward in the world of humanoid robots, combining
00:43:32high-tech features with smart designs to meet the growing needs of businesses and factories.
00:43:38As Boston Dynamics keeps improving this robot, everyone in the industry will be watching
00:43:42closely.
00:43:43And they're eager to see how these new features will actually help improve efficiency and
00:43:48productivity in the real world.
00:43:53So a company called Sanctuary AI just announced a major collaboration with none other than
00:43:57Microsoft.
00:43:58And it's all about accelerating the development of next-gen AI models for general-purpose
00:44:03robots.
00:44:04This is huge.
00:44:05Now, for those not familiar with Sanctuary AI, let me give you a quick rundown.
00:44:09These folks are on an ambitious mission to create the world's first human-like intelligence
00:44:14in general-purpose robots.
00:44:16Yeah, we're talking about Artificial General Intelligence, or AGI, the holy grail of AI
00:44:21that can truly understand and interact with the world, just like us humans.
00:44:25But here's the thing.
00:44:26Sanctuary AI isn't just some scrappy startup talking big game.
00:44:30These guys have some serious pedigree.
00:44:32We're talking founders from pioneering companies like D-Wave, the first commercial quantum
00:44:36computing firm, Kindred, which introduced reinforcement learning to production robots,
00:44:42and the Creative Destruction Lab, known for innovative science commercialization.
00:44:46Their team is stacked with veterans from tech titans like Amazon, Microsoft, and SoftBank
00:44:51Robotics.
00:44:52In other words, these folks have been at the forefront of some of the most groundbreaking
00:44:56innovations in AI, robotics, and computing.
00:44:59So when they make bold claims about pushing the boundaries of AGI, you can't help but
00:45:03take them seriously.
00:45:05So what exactly does this collaboration with Microsoft entail?
00:45:08Well, it's all about harnessing the power of Microsoft's Azure cloud to supercharge
00:45:14the development of what Sanctuary AI calls Large Behavior Models, or LBMs.
00:45:20Think of these as the next evolution beyond large language models like GPT, AI systems
00:45:26designed to learn from and understand real-world experiences and interactions.
00:45:30By leveraging Azure's massive computing resources for training, inference, networking,
00:45:35and storage, Sanctuary AI can turbocharge their efforts in developing these cutting-edge
00:45:40LBMs, which are critical stepping stones on the path to achieving AGI.
00:45:45Now, just a few weeks ago, Sanctuary AI unveiled the seventh generation of their flagship robot,
00:45:51Phoenix.
00:45:52And the upgrades and improvements they've packed into this latest iteration are nothing
00:45:56short of mind-blowing.
00:45:57We're talking increased uptime, enhanced visual perception, improved tactile sensing,
00:46:01and get this, an expanded range of human-like motion in the wrists, hands, and elbows.
00:46:06This thing is designed to mimic the dexterity and range of motion of an actual human, which
00:46:11is just insane when you think about it.
00:46:13And here's the cool part.
00:46:14Sanctuary AI has achieved all of these advancements in just 11 months since their sixth-gen Phoenix
00:46:19robot, which was one of Time's best inventions of 2023.
00:46:24They've been moving forward really fast, and it's almost hard to believe.
00:46:27According to Sanctuary AI's CEO, Geordie Rose, this seventh-gen Phoenix is one of the most
00:46:33sophisticated human behavioral data capture technologies available today.
00:46:37Essentially, it's providing their carbon AI control system with some of the highest quality,
00:46:42highest fidelity training data out there, which is critical for developing those advanced
00:46:46LBMs and pushing towards AGI.
00:46:49But perhaps the most mind-blowing stat of all is that the time it takes to automate
00:46:53new tasks has been slashed from weeks to less than 24 hours.
00:46:58That's a 50x increase in automation speed, marking a pivotal moment in the capability
00:47:02of autonomous systems.
00:47:04Just let that sink in for a second, automating complex tasks in under a day.
00:47:08The potential applications across industries like manufacturing, logistics, and more are
00:47:12staggering.
00:47:13Now, Sanctuary AI is making some seriously bold claims here, stating that this seventh-gen
00:47:19system is the closest analog to a human of any available robot today.
00:47:23That's a hefty assertion, no doubt, but coming from a team with the pedigree and track
00:47:27record that these folks have, you can't just dismiss it out of hand.
00:47:31Of course, with any groundbreaking technology, there are always ethical considerations and
00:47:36potential risks to weigh.
00:47:38The development of advanced AI and robotics is no exception, with concerns around safety,
00:47:44transparency, unintended consequences, and the impact on human labor looming large.
00:47:49To their credit, Sanctuary AI seems acutely aware of these issues.
00:47:53They've stressed a commitment to responsible development and addressing labor challenges
00:47:57in a way that augments human workers, not replaces them.
00:48:00The name Sanctuary itself is meant to convey a controlled, secure environment for nurturing
00:48:05AI responsibly before commercial deployment.
00:48:08Make no mistake, the road ahead is sure to be filled with immense challenges, both technical
00:48:12and ethical.
00:48:13But if companies like Sanctuary AI and their all-star teams can continue this blistering
00:48:17pace of innovation while keeping responsible development at the forefront, we may just
00:48:22witness some truly mind-bending breakthroughs in the coming years.
00:48:25The age of ubiquitous, multi-purpose robotic assistants powered by human-like artificial
00:48:30intelligence is on the horizon, and with visionary pioneers and powerful partnerships like this
00:48:35one leading the charge, that future might be a lot closer than any of us could have
00:48:40imagined.
00:48:41So, Unitree just dropped the trailer for their new Unitree G1 humanoid robot, and this thing
00:48:47is mind-blowing.
00:48:48The flexibility, proficiency, and overall capabilities of this robot are really something
00:48:53else.
00:48:54I'm going to explain and show you exactly why this robot is such a game-changer.
00:48:57Alright, so there's a lot to unpack here, and this robot is genuinely revolutionary.
00:49:03First off, you might have noticed that it looks kind of similar to Boston Dynamics'
00:49:07recently unveiled Atlas robot.
00:49:09Now, I don't think Unitree managed to create this robot in just the three weeks since that
00:49:14trailer came out.
00:49:15Something like this takes a long time to develop.
00:49:18But what it does show is that Unitree is evolving faster than a lot of people expected.
00:49:23And they're delivering some seriously impressive tech at a relatively affordable price point.
00:49:27The G1 starts at just $16,000, which is cheaper than any other humanoid robot with these kinds
00:49:33of capabilities.
00:49:34One of the things that really stood out to me in this demo was the insane flexibility
00:49:38and range of motion.
00:49:40As you can see here, this robot can swivel around in ways that humans simply can't,
00:49:44and its legs have a much greater range of motion.
00:49:47They've also managed to internalize the wiring using hollow joint technology, which
00:49:51allows for more efficient movement and a more compact design.
00:49:55This trailer really hammers home the fact that the robot race is on.
00:49:58I don't think anyone predicted that Unitree would release something like this so soon.
00:50:02I would have expected it to come from a company like Boston Dynamics first.
00:50:06And while Boston Dynamics might be using hydraulic actuators or fully electric systems,
00:50:11the G1's movement capabilities suggest that it's going to be able to handle a huge range
00:50:16of tasks that we might not have even considered yet.
00:50:19Another thing that really impressed me was the robot's stability when taking impacts.
00:50:23Now, I'm not saying we should go around punching robots.
00:50:27That might not end well if they decide to rebel against us down the line.
00:50:31But in all seriousness, the fact that this robot can take a solid hit to the chest and
00:50:35remain completely stable is a remarkable feat of engineering.
00:50:39In real-world applications, robots might encounter things like wind, debris, or unexpected collisions,
00:50:45and being able to stay upright and keep functioning in those situations is hugely important.
00:50:50Most humans would probably hit the ground if they took a punch like that, so this is
00:50:53a really exciting development.
00:50:55We've seen similar demonstrations of stability from Unitree before with their H1 robot, but
00:51:00the G1 takes it to a whole new level.
00:51:03If you tried this with most other humanoid robots on the market, they'd topple over
00:51:06in an instant.
00:51:08The fact that Unitree has achieved this level of stability without even using external support
00:51:13cables is mind-blowing.
00:51:15Usually in these kinds of robotics demos, you'll see the robot tethered to the ceiling
00:51:19or something to prevent falls, because balance is one of the biggest challenges in the early
00:51:23stages of development.
00:51:25But Unitree has created a robot that can take a beating and stay on its feet with all the
00:51:29wiring hidden internally.
00:51:30That's a massive accomplishment.
00:51:32The speed of the G1 is also really impressive.
00:51:35It can move at around 2 meters per second, which is almost a running pace, and not
00:51:40far off the 3.3 meters per second world record set by Unitree's previous robot,
00:51:43the H1.
00:51:47We haven't seen anything close to that from other humanoid robotics companies.
00:51:51The Tesla Bot, for example, can walk pretty smoothly, but it's not moving nearly as fast
00:51:56as this.
00:51:57And I can't imagine they'd be too eager to have someone kick it around to test its stability.
00:52:02So in terms of overall performance, I'd argue that the Unitry G1 is probably one of the
00:52:06best humanoid robots out there right now.
00:52:09One of the things that absolutely blew me away was the robot's ability to learn tasks
00:52:14through simulation and then apply them in the real world.
00:52:18Using imitation and reinforcement learning, they can train the robot to perform complex
00:52:22actions in a simulated environment like NVIDIA's Isaac Sim and then transfer those skills to
00:52:28the physical robot.
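The train-in-simulation loop described above can be sketched in miniature. A real pipeline uses a physics simulator like Isaac Sim and deep reinforcement learning, whereas this toy version substitutes a one-dimensional balance task and simple random search; but the structure is the same: evaluate a policy in simulation, keep improvements, then deploy the best policy:

```python
# Toy illustration of the "train in simulation, keep the best policy"
# loop. NOT Unitree's method: the "robot" here is a 1-D inverted pole
# and the entire policy is a single feedback gain.
import random

def simulate(gain, steps=200):
    """Return total reward for a proportional balance controller."""
    angle, velocity, reward = 0.05, 0.0, 0.0
    for _ in range(steps):
        torque = -gain * angle              # policy: push against the lean
        velocity += 0.02 * angle + torque * 0.01
        angle += velocity
        if abs(angle) > 0.5:                # fell over: episode ends
            break
        reward += 1.0                       # +1 for every step upright
    return reward

def train(seed=0, iterations=50):
    """Random-search policy improvement, all inside 'simulation'."""
    rng = random.Random(seed)
    best_gain, best_reward = 0.0, simulate(0.0)
    for _ in range(iterations):
        candidate = best_gain + rng.gauss(0, 0.5)  # perturb current policy
        r = simulate(candidate)
        if r > best_reward:                        # keep only improvements
            best_gain, best_reward = candidate, r
    return best_gain, best_reward

gain, reward = train()
print(gain, reward)
```

With no feedback (gain 0) the pole falls within a couple dozen steps; a sufficiently strong gain keeps it upright for the full episode. Imitation learning adds a second ingredient the sketch omits: seeding the search with demonstrations instead of starting from scratch.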
00:52:30Watching it move, you can see how precise and subtle the joint movements are, and the
00:52:34fact that it can maintain its balance while performing these tasks autonomously is staggering.
00:52:39This has huge implications for robotics research, because until now, most humanoid platforms
00:52:45have been really expensive, which makes them hard to access for a lot of labs and universities.
00:52:50The Unitree H1, for example, was a great research platform, but its high cost was a big barrier.
00:52:56With the G1 coming in at a lower price point, we're going to see a lot more people getting
00:53:00their hands on this tech and pushing the boundaries of what's possible.
00:53:04And then there are the hands.
00:53:06Previous Unitree robots were criticized for their lack of dexterous manipulators, but
00:53:10the G1 has hands that can grasp and manipulate objects with incredible precision.
00:53:16It can crush a walnut, flip a piece of bread, and even pop open a soda can, like it's nothing.
00:53:21The level of control and fine motor skills on display here is just insane.
00:53:25They also mentioned something called the Unitree Robot Unified Large Model, which seems
00:53:30to be some kind of AI system that allows the robot to perform these tasks autonomously.
00:53:35There aren't a lot of details available yet, but I'm guessing it's a machine learning
00:53:39model that's been trained on a huge dataset of simulated and real-world interactions.
00:53:45As more information comes out, I'll be sure to update you all.
00:53:48One of the other key things they demonstrate is the robot's ability to perform very precise
00:53:53movements, like soldering electronic components.
00:53:56This opens up a whole new range of potential applications in fields like manufacturing
00:54:01and assembly.
00:54:02But perhaps the most exciting thing about the G1 is its price.
00:54:06At just $16,000 for the base model, it's nearly 10 times cheaper than Unitree's previous
00:54:12humanoid robot, which was listed at around $150,000.
00:54:17That's a massive leap in affordability, and it's going to make this technology accessible
00:54:21to a much wider range of researchers and developers.
00:54:24We've already seen some of the incredible things that people have been doing with Unitree's
00:54:28H1 platform.
00:54:29For example, researchers at Shanghai Tech and Mars Lab recently presented a framework
00:54:33for humanoid parkour learning, which allows the robot to navigate rough terrain, jump
00:54:38onto high platforms, leap over hurdles, and more, all using only vision and proprioception.
00:54:43It's a fully end-to-end system, which is a huge accomplishment in the field of robotics.
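To make "fully end-to-end" concrete: it means one network maps raw vision and proprioception straight to joint commands, with no hand-written planner in between. Here's a toy sketch of that idea in Python — the layer sizes, joint count, and random weights are all invented for illustration, not the actual parkour model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy end-to-end locomotion policy: depth image + joint states in,
# joint position targets out. All sizes are made up for illustration.
DEPTH_PIXELS = 64 * 64   # flattened depth-camera image
PROPRIO_DIM  = 2 * 19    # joint positions + velocities for 19 joints
ACTION_DIM   = 19        # one target per joint

W1 = rng.standard_normal((DEPTH_PIXELS + PROPRIO_DIM, 128)) * 0.01
W2 = rng.standard_normal((128, ACTION_DIM)) * 0.01

def policy(depth_image: np.ndarray, proprio: np.ndarray) -> np.ndarray:
    """Single forward pass: perception and control share one network,
    which is what 'end-to-end' means here."""
    x = np.concatenate([depth_image.ravel(), proprio])
    h = np.tanh(x @ W1)       # shared feature layer
    return np.tanh(h @ W2)    # joint targets squashed into [-1, 1]

action = policy(rng.random((64, 64)), rng.random(PROPRIO_DIM))
print(action.shape)  # one command per joint
```

In a real system the weights would come from reinforcement learning in simulation rather than random initialization, but the input/output shape of the problem is the same.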
00:54:48The problem is that a lot of academic labs struggle to get funding for this kind of research
00:54:53because they simply can't afford the expensive hardware.
00:54:56But with the G1, we're going to see an explosion of new projects and discoveries.
00:55:00I mean, the H1 could do a full backflip, which is just mind-boggling for a robot with such
00:55:05basic actuators.
00:55:06Imagine what people are going to do with the G1's advanced capabilities.
00:55:10So to sum up, the Unitree G1 is an absolute game-changer.
00:55:14Its combination of flexibility, stability, dexterity, and affordability is unparalleled,
00:55:19and it's going to open up a whole new world of possibilities in the field of robotics.
00:55:23I cannot wait to see what researchers and developers do with this platform once it starts
00:55:27shipping in the next few months.
00:55:29The robotics race is heating up, and Unitree has just shown that China is a serious contender
00:55:34on the global stage.
00:55:35They're innovating at an incredible pace, and they're making this technology accessible
00:55:40in a way that no one else has done before.
00:55:42So keep an eye on Unitree.
00:55:44I have a feeling we're going to be seeing a lot more groundbreaking developments from
00:55:48them in the very near future.
00:55:50Now here's some interesting news from the United Arab Emirates.
00:55:53This announcement comes from Abu Dhabi's Technology Innovation Institute, also known as TII.
00:55:58They've launched the Falcon 2 series, featuring two impressive models, Falcon 2 11B and
00:56:03Falcon 2 11B VLM.
00:56:05So Falcon 2 11B is, as they say, a super-powerful text-based AI designed to understand and generate
00:56:11human-like text.
00:56:12Meanwhile, Falcon 2 11B VLM is a vision-language model.
00:56:16This means it can take an image and generate a detailed text description of it.
00:56:20Let's talk about why the UAE is making such big moves in AI.
00:56:24Known for its vast oil reserves, the UAE is now pouring massive investments into artificial
00:56:29intelligence.
00:56:30This shift hasn't gone unnoticed.
00:56:31In fact, U.S. officials took note last year, issuing a strong ultimatum for the UAE to
00:56:36choose between American and Chinese tech.
00:56:38In response, Emirati AI firm G42 cut ties with Chinese companies, paving the way for
00:56:44a whopping $1.5 billion investment from Microsoft.
00:56:48This strategic pivot was carefully coordinated with Washington, showcasing the UAE's serious
00:56:53commitment to advancing in the AI space.
00:56:56The release of Falcon 2 comes as countries and companies worldwide are racing to develop
00:57:00their own large-language models.
00:57:02And while some companies keep their AI tech proprietary, others, like the UAE's Falcon
00:57:06and Meta's Llama, are going open-source.
00:57:10Looking ahead, the UAE's substantial investments in AI are positioning the country as a significant
00:57:15player on the global tech stage.
00:57:17The Falcon 2 series is just the beginning, with the anticipated Falcon 3 already in the
00:57:22works.
00:57:25So a new AI humanoid robot has been unveiled by a company called Robot Era.
00:57:30And it's pretty crazy.
00:57:31This robot, called X-Hand, is apparently a game-changer in the world of embodied AI.
00:57:36Now, embodied AI is all about creating robots that can interact with the real world just
00:57:41like humans do, and this X-Hand robot seems to be taking things to the next level.
00:57:46So it has 12 active degrees of freedom, which means it can move and bend in all sorts of
00:57:51ways just like human bodies.
00:57:53It's also got built-in tactile sensors, so it can actually feel and sense things it touches.
00:58:01And the control precision is supposed to be top-notch, allowing for smooth and reliable
00:58:06movements.
00:58:12But here's the kicker.
00:58:13Robot Era claims that the technology behind X-Hand is fully self-developed, which is pretty
00:58:18impressive for a relatively new player in the robotics and AI game.
00:58:27They were only founded in 2023, but they're already pushing the boundaries of what's possible.
00:58:40Now we don't know how much this robot is going to cost or when it'll be available to the
00:58:43public because Robot Era is keeping those details under wraps for now.
00:58:48But from what they've shown us, it looks like X-Hand is specifically designed to embody
00:58:52AI, which could be a big deal as we get closer to achieving AGI.
00:58:56And speaking of AGI, there's been a lot of hype lately about it potentially happening
00:59:01sooner rather than later.
00:59:02I'm not even joking around.
00:59:04Some experts are saying we might be on the verge of a breakthrough in this field.
00:59:08Now let's talk a bit more about Robot Era and what they've been up to.
00:59:12They're focused on developing general purpose humanoid robots, which is pretty ambitious
00:59:17stuff.
00:59:18One of their main projects is called Humanoid Gym, which uses reinforcement learning to
00:59:22train robots to perform tasks in a way that mimics human capabilities.
00:59:27They're using this cool concept called zero-shot sim-to-real transfer, which basically
00:59:32means they can train the robots in a simulated environment first and then deploy them in
00:59:36the real world without any additional training.
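The standard recipe for pulling off zero-shot sim-to-real — and plausibly something like what Humanoid Gym does, though Robot Era hasn't published the details — is domain randomization: resample the physics of the simulator every episode, so the trained policy never overfits to one world and treats reality as just another variation. A minimal sketch, with parameter ranges I've invented:

```python
import random

# Toy domain randomization loop: each training episode samples fresh
# physics parameters, so no single simulated world gets overfit.
def sample_sim_params(rng: random.Random) -> dict:
    return {
        "ground_friction": rng.uniform(0.4, 1.2),
        "payload_kg":      rng.uniform(0.0, 3.0),
        "motor_strength":  rng.uniform(0.8, 1.2),   # torque scale factor
        "sensor_noise":    rng.uniform(0.0, 0.05),  # std of added noise
    }

def train_episode(params: dict) -> float:
    """Stand-in for one RL rollout; returns a fake 'reward' just to show
    the policy experiences a different world every episode."""
    return params["ground_friction"] * params["motor_strength"]

rng = random.Random(42)
rewards = [train_episode(sample_sim_params(rng)) for _ in range(1000)]
print(f"episodes: {len(rewards)}, mean reward: {sum(rewards) / len(rewards):.2f}")
```

The real training loop would run a physics simulator and a learning algorithm inside `train_episode`, but the outer structure — sample a randomized world, roll out, repeat — is the core of the trick.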
00:59:39But enough talking from me.
00:59:41Let's check out this X-Hand robot in action and see what it can really do.
00:59:45Robot Era has released a showcase video, and I've got to say, the capabilities they're
00:59:49demonstrating are pretty fascinating.
00:59:51Let's roll that footage and see what this bad boy can do.
01:00:21You might remember a little while back when I covered Robot Era, that ambitious startup
01:00:41that unveiled their crazy dexterous X-Hand AI robot.
01:00:45Well, they just one-upped themselves in a major way by becoming the first company ever
01:00:49to have a full-size humanoid robot walk on the Great Wall of China.
01:00:54So their latest robot, called the X-Bot L, conquered one of the most iconic and challenging
01:00:59landscapes in the world, the ancient stone pathways and steep stairs of the Great Freaking
01:01:04Wall of China.
01:01:06This is a huge milestone not just for Robot Era, but for humanoid robotics as a whole.
01:01:10So a 5'5" (165 cm) tall robot, looking kind of like a slimmed-down Terminator, marching
01:01:17its way across those uneven, centuries-old stone surfaces without faltering.
01:01:21It was waving hello, practicing kung fu moves, climbing up and down stairs the whole 9 yards.
01:01:27And to be honest, the Great Wall is no walk in the park, even for us humans.
01:01:30Those paths are cracked, the stairs are crazy steep with no railings, there are random potholes
01:01:35everywhere.
01:01:36It's an obstacle course designed to trip up any bipedal robot, but the X-Bot L handled
01:01:41it all with surprising ease thanks to Robot Era's advanced perceptive reinforcement
01:01:45learning algorithms.
01:01:46Basically, this robot can actually perceive and make sense of its environment in real
01:01:51time using sensors and AI.
01:01:53It can identify obstacles, changes in terrain, and literally adapt its walking gait and balance
01:01:58on the fly to deal with whatever the Great Wall throws at it.
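Stripped way down, "perceive the terrain, adapt the gait" looks something like this: the robot reads a height map of the ground ahead and scales its step height and forward speed to match. Every threshold and number here is invented for illustration — Robot Era hasn't disclosed how their controller actually does this.

```python
# Toy perceptive gait adaptation: a heightmap of the terrain ahead
# (meters above nominal ground) drives step height and forward speed.
def adapt_gait(heightmap: list[float]) -> dict:
    roughness = max(heightmap) - min(heightmap)   # how uneven is it?
    max_step  = max(abs(h) for h in heightmap)    # tallest obstacle/drop
    return {
        # lift feet higher than the tallest obstacle seen, plus a margin
        "step_height_m": round(max_step + 0.05, 3),
        # slow down on rough ground; thresholds are made up
        "speed_mps": 1.0 if roughness < 0.05 else 0.5 if roughness < 0.20 else 0.2,
    }

flat   = adapt_gait([0.00, 0.01, -0.01, 0.00])  # smooth pavement
stairs = adapt_gait([0.00, 0.15, 0.30, 0.45])   # steep Great Wall steps
print(flat, stairs)
```

The real system closes this loop many times per second from depth sensors and feeds the result into a learned whole-body controller, but the basic idea — terrain features in, gait parameters out — is the same.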
01:02:01According to Yue Shi, one of the co-founders, their perceptive reinforcement learning tech
01:02:06gives the X-Bot L something like human-level awareness and decision-making capabilities
01:02:11in unfamiliar environments.
01:02:13The robot can essentially think for itself about the best way to navigate complex areas
01:02:18safely and efficiently.
01:02:19And we're not just talking about some basic pre-programmed motions here.
01:02:23The X-Bot L was climbing rugged castle stairs, dealing with slopes and inclines, avoiding
01:02:28obstacles, adapting to poor lighting conditions, all without any human guidance or help.
01:02:33It was basically pulling off the kind of adaptive, intelligent, situationally aware movement
01:02:38that we associate with people.
01:02:39Robot Era says this was one of the biggest challenges, developing end-to-end AI algorithms
01:02:45that could translate perceived data from the robot's sensors into precise, stable locomotion
01:02:50in the real world across an ultra-complex environment like the Great Wall.
01:02:55Sounds like they cracked that nut, though, based on this demo.
01:02:57What's really crazy is that this Great Wall feat comes just a couple weeks after they
01:03:02showed off the X-Hand dexterous robot I mentioned earlier.
01:03:05Between these two products, it's clear Robot Era is going all-in on embodied AI, building robots
01:03:10with human-like physical capabilities to operate in our world.
01:03:14And they're not messing around either.
01:03:15Despite only being founded in 2023, they've already invested heavily into cutting-edge
01:03:19AI tech like reinforcement learning, neural networks, sim-to-real transfer, and more.
01:03:24Basically, everything needed to create robots that can tackle the real world just like we
01:03:29do.
01:03:30Their goal seems to be keeping that competitive edge by iterating and churning out smarter
01:03:35more capable humanoid robot products at a rapid pace.
01:03:39And they want to keep perfecting ways to transfer all that AI training from simulations into
01:03:44physical machines, enabling those robots to become more versatile in real-life scenarios.
01:03:49The endgame seems to be developing ultra-flexible, general-purpose humanoid robots for countless
01:03:54valuable applications across industries like manufacturing, healthcare, services, and more.
01:03:59I've got to give Robot Era props, deliberately putting their robot through the wringer on
01:04:03the Great Wall as some kind of extreme stress test shows serious confidence in their tech.
01:04:08Most companies would avoid something that risky with a brand new product, but the fact
01:04:12that the X-Bot L could pull it off so smoothly is really impressive.
01:04:15It's an amazing proof of concept for their embodied AI approach and perceptive reinforcement
01:04:20learning systems working in one of the toughest real-world environments.
01:04:25To me, this demonstrates Robot Era's outstanding core capabilities in fields like robotics,
01:04:30AI, mechatronics, and more.
01:04:32They are clearly at the cutting edge when it comes to embodied intelligence and making
01:04:35it work reliably in the real world.
01:04:37I'm really looking forward to whatever crazy robot Robot Era comes up with next.
01:04:42If they keep raising the bar like this, they could legitimately help usher in the age of
01:04:47advanced human-like robots that can operate autonomously in our world on an unprecedented level.
01:04:53We're potentially on the cusp of some big breakthroughs in artificial general intelligence
01:04:57or AGI becoming a reality over the next few years.
01:05:00But if anyone can build the physical robotic bodies to house and manifest that future AGI,
01:05:06it might just be an ambitious little pioneer like Robot Era.
01:05:09When it comes to developing super-advanced AI systems like AGI, the situation with whistleblowers
01:05:14at OpenAI shows just how high the stakes really are.
01:05:17Robot Era's achievement with their robot on the Great Wall is definitely impressive
01:05:21from an embodied AI perspective.
01:05:24But achievements like that could potentially seem tiny compared to the impacts, both positive
01:05:28and negative, that creating a general artificial intelligence on the same level as human cognition
01:05:34could have on society.
01:05:35On one hand, AGI represents this incredible technological breakthrough that could help
01:05:40solve countless challenges humanity is facing, from unsolved scientific mysteries to global crises.
01:05:47A generalized AI mind that matches human intelligence has always been viewed as something that could
01:05:52be a catalyst for transformative advances across every field.
01:05:56But on the other hand, the existential risks of an unaligned, super-intelligent AGI that's
01:06:01smarter than humans simply can't be overstated.
01:06:04We're talking about the potential for an advanced AI system to break free from human
01:06:08control with catastrophic consequences for our species and the planet if it's not developed
01:06:13safely and responsibly.
01:06:15The OpenAI insiders are ringing alarm bells about this exact scenario playing out.
01:06:19They allege the company is recklessly rushing towards AGI supremacy without taking proper
01:06:24safety precautions, driven more by competitive pressures to be first rather than doing it
01:06:29the right way.
01:06:30Damning claims like OpenAI ignoring their own safety review processes, muzzling employees
01:06:35with really strict non-disparagement agreements, and deprioritizing safety work after founding
01:06:40members raised concerns paint a very worrying picture.
01:06:44If true, it suggests one of the leading AI labs is taking an extremely risky and unethical
01:06:48approach in this existential race for AGI.
01:06:51The fact that respected researchers like Ilya Sutskever felt compelled to resign due to
01:06:56OpenAI's perceived lack of commitment to keeping AGI safe and aligned only adds more
01:07:01credibility to the whistleblower's grievances, in my opinion.
01:07:05Their letter calling for greater transparency, protecting employees, and ultimately having
01:07:10the government regulate this powerful technology to keep bad actors in check feels like a reasonable
01:07:16and perhaps necessary step to prevent potential disaster scenarios.
01:07:21Because let's be real, we simply can't afford a situation where the first super-intelligent
01:07:26AGI that emerges is catastrophically misaligned with human ethics and values.
01:07:31Exercising restraint and prioritizing responsible development have to be the top priorities.
01:07:36Of course, the whistleblowers likely have some level of bias given their philosophical
01:07:40views and ties to effective altruism movements, so I'd encourage taking more of a neutral
01:07:45stance personally.
01:07:46But even accounting for that bias, the sheer magnitude of what's at stake with AGI certainly
01:07:52warrants a strong commitment to prioritizing safety measures and ethical boundaries over
01:07:56competitive pressures and commercial interests from companies like OpenAI.
01:08:01Not properly aligning an emergent AGI could risk irreparably disrupting human civilization
01:08:06as we know it.
01:08:07And any private entity recklessly brushing off those concerns in their pursuit of being
01:08:11first is essentially playing a potential extinction-level game of Russian roulette
01:08:15with humanity's future.
01:08:16So while I'm impressed by the cutting-edge AI and robotics breakthroughs from companies
01:08:20like Robot Era, I really hope the broader AI community, whether researchers, executives,
01:08:26policymakers or others, is taking the OpenAI whistleblowers' warnings about these existential
01:08:31risks just as seriously.
01:08:33Because we may only get one shot at developing AGI in a controlled, responsible way that
01:08:38ensures this game-changing technology remains an asset that allows humanity to flourish
01:08:42rather than an unintended catalyst for our potential downfall.
01:08:46The stakes are that high.
01:08:47Alright, don't forget to hit that subscribe button for more updates.
01:08:50Thanks for tuning in and we'll catch you in the next one.
