Category: 📺 TV Transcript
00:00How and when did our human species begin?
00:12The NHK special, Out of the Cradle, was planned to explore this simple yet profound question.
00:21But the production faced a huge problem.
00:25How could humanity's ancient ancestors and their long lost worlds be brought back to life on screen?
00:36Surmounting this daunting challenge required the latest in research and computer graphics.
00:49Ultra-high resolution CGI to recreate our distant evolutionary ancestors with amazing realism.
00:57Creating these images of stunning quality was the result of a new kind of collaboration
01:01between a world-renowned video game developer and expert scholars.
01:11The quality of CGI for this program was quite exceptional. I think it's the first of its kind.
01:19This ambitious project, exploring the millions of years of human evolution, has been two years in the making.
01:26Get ready for a behind-the-scenes look at the amazing process
01:29that has made this exciting journey back through human evolution possible.
01:55This is a scene from the NHK special, Out of the Cradle.
01:59You can almost feel the animal's warm breath.
02:06We are seeing an ancient earth millions of years before cameras were invented.
02:19The grass, the ground, the sun, the trees, everything, including this hominid, was rendered by computers.
02:33This breathtaking virtual world was created by a leading Japanese video game company.
02:42Its hits include enduring series like Final Fantasy and Dragon Quest.
02:48Now, the state-of-the-art technology they have honed for producing games
02:52is being used to recreate the lost worlds of early hominids.
03:02Game creator Hajime Tabata is leading the team that is conjuring up this prehistoric earth.
03:10We do have comprehensive digital movie-making technology and expertise,
03:15but we've had no opportunity to use it for any other application except for making video games.
03:21We knew there would probably be many hurdles to overcome along the way,
03:25but our team was eager to tackle them.
03:28So, we accepted the offer right away.
03:38Creating a computer-generated movie begins with storyboarding.
03:42Storyboards are made up of sketches.
03:44You could call them the blueprint for the movie.
03:53The test of skill for a CGI production crew is how well they can translate the storyboards into a compelling film.
04:06Based on these storyboards, they must first gather reference materials about prehistoric earth and its inhabitants.
04:15Hominids, animals, and everything else in the environment must be digitally modeled.
04:20So, they assemble and create a huge range of reference material.
04:31These materials are then condensed into concept art, like this.
04:37Concept art provides a visual representation of key ideas and moods for the project.
04:42All of the creators work from these master design references.
04:47They are critical in providing a shared visual vocabulary for everyone working on the project.
04:55Canadian illustrator Naomi Savoy is in charge of the concept art for Out of the Cradle.
05:02Since we're making this time more of an educational-based program,
05:07we really had to work closely with anthropologists and experts,
05:11just to ensure that the things we were making in the program were actually accurate.
05:17So, we had several visits where experts would actually come in the office
05:20and teach us about all sorts of things regarding that period of time.
05:24And we had regular check-ups and feedbacks and conversations to make sure that everything was correct.
05:29As well, we actually did some extra activities where we were making stones and tools from scratch
05:35to see actually how it would feel like to make these things back then and learn the proper way to make it.
05:41So, that was also pretty interesting.
05:45The CGI in the series is created on the principle of reconstructing the distant past as faithfully as possible.
05:54So, the advice of academic experts from various fields is a vital component.
06:05Anthropologist Hisao Baba provides overall supervision.
06:12As researchers, we should normally only refer to the things that we are completely sure about.
06:18But when it comes to making CGI, we have to take some risks, although we do sometimes hesitate.
06:25But by going out on a limb a little, we can show things in a way that's more readily understandable to a general audience.
06:32And then the final product provides a basis for exchange of views among many researchers.
06:42The gaming company gave us special permission to film the team as they create the CGI.
06:48This is their office.
07:01The team that created the CGI for Out of the Cradle is the same team in charge of the famous game series Final Fantasy.
07:08These creative talents have taken on a big challenge.
07:11To reconstruct a world none of us have ever actually seen.
07:18Let's follow the workflow used to create a computer-generated movie.
07:26Using a storyboard as a blueprint, a crude version of the animation, known as a previsualization, is created.
07:33This previs, as it's also called, acts as a sort of visual script.
07:38After that comes the process of making the digital models, known as assets.
07:43Then comes the animation process, where movies are made from these assets.
07:49The final step in the process is placing them into a scene.
07:53Let's take a look at how these processes have unfolded in the creation of the CGI for Out of the Cradle.
08:00Our first character is a very early hominid species called Ardipithecus ramidus.
08:06It is thought to have walked on two legs and lived approximately 4.4 million years ago.
08:13This CGI version of Ardipithecus ramidus is amazingly lifelike.
08:19Look at how naturally it moves.
08:22The skin and hair textures, as well as the facial expressions, are packed with ultra-realistic detail.
08:29It's surprising that a computer-generated image can be so expressive that it feels as though we are watching a live-action film.
08:37These are fossil bones from Ardipithecus ramidus discovered in Ethiopia between 1994 and 1995.
08:45Numerous reference sources, including the analytical data from these bones,
08:50were used to resurrect a 4.4 million year old hominid individual in virtual form.
08:56To reconstruct the ramidus male, they first created his basic body.
09:00Character modeler Kazuaki Iwasawa was in charge of this step.
09:09The process is like building a model from clay, but on a computer.
09:15This process is called digital modeling.
09:18Basic design factors, such as the height, arm length, and other body measurements of ramidus, are determined here.
09:24The modeling for the Out of the Cradle series was exceptionally challenging.
09:30I had no idea what this ramidus looked like,
09:34so I had to ask the expert researchers a lot of questions about its characteristic ecology, muscles, skeleton, and so on.
09:45And based on that, I kept tweaking the outline.
09:48I repeated that process again and again.
09:54Iwasawa built the digital model in close consultation with experts.
09:58His first attempts were met with harsh critique.
10:05Advisory hearings with experts were used to evaluate the digital creation of Ardipithecus ramidus.
10:12In order to keep the model as scientifically accurate as possible,
10:16experts were asked to check the quality of the CGI at each production stage.
10:23Hisao Baba is the anthropologist assigned as overall supervisor for the series' CGI.
10:36Today, he is assessing the proportions of the latest iteration of Ardipithecus ramidus.
10:50It looks like the hands are a bit too large in this area.
10:55They look much bigger than the feet.
10:58Don't the hands seem about twice as big as the feet?
11:01Why is that?
11:03Or is that just because of the lighting?
11:08Hisao Baba is unhappy with the hands' proportions relative to the feet.
11:19Regardless of the data, I think we should avoid giving an unnatural impression.
11:30Hisao Baba scrutinizes the model closely,
11:34right down to the size of the fingernails.
11:47The model had to undergo numerous checks like this before receiving final approval.
11:57The way the fingers attach, the proportions of the segments, all those little details.
12:04I worked back and forth with the experts, refining it bit by bit, to arrive at the final form.
12:14These fine-tuning rounds have taken five months.
12:18Now the ramidus body model is finished.
12:22It's still just a featureless surface, but the shape is perfect.
12:27It's time for the next step in the process to begin.
12:35Ramidus was an erect bipedal primate that lived 4.4 million years ago.
12:41Its gait was slightly different from modern apes and modern humans.
12:46How can that be recreated?
12:57Next comes a process called setup.
13:06Though the character model is completed, it is still static like this,
13:11so we need a few tricks to get it to move.
13:15He adds bones and joints to the digital body.
13:19These give the model its range of motion.
13:22But that bending looks pretty unnatural.
13:28Basically, after the first processing, when the joints flex like this,
13:33some parts still stretch out too thin or shrink too short.
13:39So we correct them all.
13:41But it's not easy.
13:43It's a lot of work.
13:51But it's a tricky job to fix the body parts in motion into smooth, natural shapes.
13:58Here is the leg after some corrections.
14:02Now the flesh looks natural.
14:05This process demands a keen understanding of human body anatomy.
14:09If the bones move like this, then what should the shapes of the bending parts look like?
14:14That's my daily struggle.
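(An aside for the technically curious: the thinning and stretching described here is inherent to the standard technique, linear blend skinning. A minimal Python sketch, with made-up weights and transforms rather than anything from the studio's pipeline, shows why a blended joint shrinks the flesh:)

```python
import numpy as np

def blend_skinning(vertices, weights, bone_transforms):
    """Linear blend skinning: each vertex is moved by a weighted mix
    of its bones' 4x4 transforms (weights per vertex sum to 1)."""
    out = np.zeros_like(vertices)
    homo = np.hstack([vertices, np.ones((len(vertices), 1))])  # homogeneous coords
    for b, T in enumerate(bone_transforms):
        out += weights[:, b:b + 1] * (homo @ T.T)[:, :3]
    return out

# Two bones: one at rest, one bent 90 degrees about the z-axis
rest = np.eye(4)
bent = np.eye(4)
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
bent[:2, :2] = [[c, -s], [s, c]]

verts = np.array([[1.0, 0.0, 0.0]])   # a point on the "elbow" surface
w = np.array([[0.5, 0.5]])            # influenced equally by both bones
skinned = blend_skinning(verts, w, [rest, bent])
# the blended point lies inside the unit arc, i.e. the flesh shrinks
```

Averaging the two bone transforms pulls the vertex inside the arc it should follow, which is exactly the volume loss the artists then correct by hand.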
14:19Now, thanks to a skeleton with joints, the ramidus model is able to move.
14:24The next challenge is its bipedal walking, the species' most important trait.
14:29Motion capture techniques are used to digitally simulate the gait of ramidus.
14:33A live human actor's movements are recorded,
14:36and that data is used to animate the digital character.
14:42The actor's movements are captured with an array of over 100 specialized cameras.
15:02To ensure accuracy of the movements,
15:04supervising anthropologist Hisao Baba also attends the motion capture session.
15:10He provides instructions on how to walk.
15:14The force of the feet pushing the earth backward should not be very strong.
15:18It's somewhat like an elderly person's walking style,
15:21say an 80-year-old.
15:23Think of it that way.
15:26Not much backward force, so should I move like this?
15:31That's it. That's about right.
15:37Let's try it.
15:41Rolling!
15:47A motion capture system makes it possible to assess in real time
15:51the subtleties of how the movement translates to the digital model.
15:58The legs are a little too far apart.
16:01Can you bring them closer together?
16:05OK, rolling!
16:09The movements are recorded over and over,
16:12with fine adjustments made each time.
16:18OK, rolling!
16:22OK, rolling!
16:27There shouldn't be alternating swinging of the arms.
16:31Your arms would dangle, like this, just apart from the elbows down.
16:36OK, rolling!
16:40It's not easy mimicking an ancient walking motion that no one has ever seen.
16:45An hour goes by in a flash.
16:52The way the arms swing is not quite even,
16:55but that may actually add some personality.
16:58The left arm seems to swing a little longer,
17:01but slight inconsistencies may be better.
17:06The motion capture data is fed into the digital model.
17:10The result is a stripped-down digital animation.
17:13This is a pre-visualization, or previz.
17:194.4 million years after it last walked the Earth in the flesh,
17:23Ardipithecus ramidus has started to move again, on a computer screen.
17:45Its movements are so convincing already,
17:48but there's still more work to be done.
17:51Even though we can get the movement from the motion capture,
17:55it's not perfect yet.
17:57There are little glitches in how the cameras pick things up.
18:01And it doesn't capture fingers.
See how the pointer finger is off, how it doesn't make contact with the ground.
18:11Motion capture will always leave some unnatural glitches in the movement.
18:15Those have to be corrected manually, one by one.
18:19Kawamura has a technique to make the movement look more believable.
18:25In the case of computer-generated images,
18:29if, for example, the fingers touch down rigidly, it will just look fake.
18:34Actually, a little bit of flattening motion or some extra tension in the pushing motion is absolutely necessary.
18:40If you don't add those little details to the character's movements,
18:45the finished animation will lack realism.
18:49Looking wooden and fake, that is, very much like CGI.
18:58Kawamura filmed his own hand movements to carefully observe how the joints and muscles behave,
19:03and then he used that information to improve the animation.
19:12There's a lot of trial and error.
19:15Until you actually see it animated, you can't tell exactly what the range of movement should be,
19:20where to pump it up, or tone it down.
19:26Okay, rolling.
19:29In this motion capture session, they have to record an even more challenging movement, running.
19:36Here, we really don't get the impression he's running fast.
19:41It's hard to run fast when you can't push your legs backward.
19:45Right, it's all in the glutes.
19:48Because ramidus lived at the dawn of bipedal locomotion, he was probably not a skilled runner.
19:54Therefore, the animation must look appropriately awkward.
19:59As a reference, Baba looks at the way gorillas run.
20:11Here is the final animation of Ardipithecus ramidus.
20:15It certainly looks realistic.
20:17Its movement is based on a live human performance, which is then tweaked manually to become even more natural.
20:28The virtual ramidus is looking more and more like a real animal.
20:32So what's next?
20:36The smooth skin has been covered in fur.
20:41And another related step will be essential in making the digital model come alive.
20:51The fur must move naturally.
20:56Shiho Kobayashi takes charge of the task.
21:01I'm handling the fur simulation.
21:06It's the process of physical simulation.
21:10We simplify objects, assign them mass, and add the pull of gravity.
21:20The digitally modeled fur is static at first.
21:24To make it move, parameters like hair weight, gravity factor, and levels of wind are fed to a computing system
21:31to calculate how the fur should move.
21:42Kobayashi uses a number of techniques to make each strand of hair behave in a lifelike way.
21:49The process is long and intricate.
21:53But the results look great.
21:57Today, a year and a half since the beginning of the project,
22:01the making of the ramidus CGI is entering its final stage.
22:07They still have work to do on the face of ramidus.
22:11Perhaps more than anything, the face is what will breathe life into this digital model.
22:17This tricky final stage is literally a vital, life-giving stroke.
22:24Creating the character's face involves multiple processes.
22:33The process is called sculpting.
22:36Now we add the fine wrinkles and other details onto the skin.
22:44Much like a real sculptor, Iwasawa draws in the wrinkles one by one.
22:49Old-fashioned artistry unfolds at the heart of cutting-edge CGI production.
22:56I'm taking my inspiration from chimpanzees to model the facial wrinkles of ramidus.
23:02The texture and color of the skin are also modeled after chimps.
23:11Each spot and wrinkle is drawn by hand.
23:16After the sculpting process comes surfacing.
23:19This adds skin color, spots, and other features.
23:23A scientific basis is used in the design of the skin texture.
23:35Since ramidus lived in the forest, his skin probably wasn't completely dark.
23:47I looked through chimp photos and saw ones with spotted skin.
23:51So I decided to try that on ramidus.
23:58Now for the finishing touches.
24:04Now I'm doing what we call the facial setup.
24:07The process to add facial expressions to the model.
24:10Specifically, I use a trick to get the face to move by using a combination of different mimetic muscles.
24:19Mimetic muscles is another name for the facial muscles.
24:22We modern humans also have mimetic muscles, and by moving them, we create different expressions.
24:30To create the digital ramidus face, 20 to 30 different facial expressions were set up.
24:35All of them were modeled after human and chimp expressions.
24:39Not even human actors can be so animated.
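(The combination of expressions described here is commonly done with blendshapes. A minimal Python sketch, using an invented three-vertex "face" and two hypothetical expression targets, shows the weighted-offset idea:)

```python
import numpy as np

def blend_face(neutral, shapes, weights):
    """Blendshape facial setup: the final face is the neutral mesh plus
    a weighted sum of per-expression offsets ("deltas")."""
    result = neutral.copy()
    for name, w in weights.items():
        result += w * (shapes[name] - neutral)
    return result

# Hypothetical three-vertex "face" with two expression targets
neutral = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
shapes = {
    "smile": np.array([[-0.1, 0.1], [1.1, 0.1], [0.5, 1.0]]),  # mouth corners out and up
    "frown": np.array([[0.1, -0.1], [0.9, -0.1], [0.5, 1.0]]),
}
half_smile = blend_face(neutral, shapes, {"smile": 0.5})  # a subtler expression
```

Mixing fractional weights across 20 or 30 such targets is how a model can produce far more expressions than were explicitly sculpted.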
24:47The nose, mouth, and cheeks, it was painstaking to make those areas expressive.
24:54So what do you think about the outcome?
24:59I wouldn't say it's 100% perfect, but it's in the 60 to 70% range.
25:06Shouldn't the rate be a bit higher?
25:11Maybe. It'll be up to the viewers to decide.
25:17The digital ramidus now has a very expressive face.
25:20It can make more than 50 types of facial expressions through varied combinations of muscular movements.
25:31It's been about nine months since the work on ramidus started.
25:35Out of a complex series of tasks, the finalized ramidus has emerged.
25:41At last, we can come face to face with the primitive ape man.
25:50We still need to recreate everything in this virtual world.
25:53This shot, for example, requires more than just making the character.
25:59Look at the background, the whole collection of trees, shrubs, and grass.
26:05The background design needs to recreate the Earth of 4.4 million years ago.
26:10Millions of trees are needed to create this environment.
26:20The most difficult part was figuring out how the habitat of ramidus should look.
26:28He conducted in-depth research on ancient and modern plants from across the globe,
26:32and then created the digital models of necessary plants one by one.
26:42While I was modeling these plants, I always felt unsure about everything.
26:47I could not feel relieved until the experts approved the models I showed them.
26:52It's taken three whole months to determine the full set of vegetation types
26:56to be included in the virtual environment.
27:02Then, the plants need to be placed.
27:04But can you imagine just how much work is required to fill up even just this one shot?
27:12I'm placing so many objects, but they are all natural things.
27:16So they should vary in shape.
27:18Environments and locations are all differing factors.
27:22For example, in this shot, seen from this angle,
27:25separately created plants are all placed together.
27:29But as similar shapes are bunched, the shot will look unnatural.
27:33So you must make individual corrections on each shot.
27:36It's a lot of work.
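(The placement-with-variation problem described here is often handled with a scatter tool. A toy Python sketch, with invented plant names and jitter ranges, randomizes model choice, rotation, and scale so identical copies don't read as obvious repeats:)

```python
import random

def scatter_plants(count, area, models, seed=0):
    """Scatter plant instances over a square area, randomizing model
    choice, rotation, and scale so repeated copies of the same few
    meshes don't look obviously identical."""
    rng = random.Random(seed)
    instances = []
    for _ in range(count):
        instances.append({
            "model": rng.choice(models),
            "pos": (rng.uniform(0.0, area), rng.uniform(0.0, area)),
            "rotation": rng.uniform(0.0, 360.0),  # degrees about the vertical axis
            "scale": rng.uniform(0.8, 1.2),       # +/-20% size jitter
        })
    return instances

forest = scatter_plants(1000, area=100.0, models=["acacia", "fig", "shrub"])
```

Even with such randomization, shots still need the per-shot hand corrections the artist describes, since random placement can still bunch similar shapes together.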
27:38Creating the habitat for Ramidus may have been tougher than creating Ramidus itself.
27:52Another technique for making this prehistoric world look real is the lighting.
27:57In the digital world, there is no sense of light until you put one there.
28:01There's no sky or sun.
28:05In this step, we introduce a sky and a sun that match the world,
28:09set the angles at which the light falls,
28:12and thus give the finishing touches to the world.
28:17The lighting is done in a way that makes it look real.
28:20In the dark space, a sky and a sun are placed.
28:24By controlling the hue, brightness, and angle of the light,
28:27he optimizes the lighting of the scene.
28:34Finding the best angle for illuminating the characters
28:37is an important job of a lighting specialist.
28:56Without lighting, the completed Ramidus model would look like...
29:01nothing but a black blob.
29:12In this scene, the lighting helps create the atmosphere of Ramidus' forest home.
29:22The art of lighting is a big part of making Ramidus come to life.
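(The hue, brightness, and angle controls described here ultimately feed a shading computation. A minimal sketch of the simplest such model, Lambertian diffuse shading, with made-up sun and fur colors:)

```python
import numpy as np

def lambert(normal, light_dir, light_color, albedo):
    """Lambertian diffuse shading: brightness falls off with the cosine
    of the angle between the surface normal and the light direction."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    return albedo * light_color * max(float(np.dot(n, l)), 0.0)

sun = np.array([1.0, 0.95, 0.8])     # slightly warm sunlight (illustrative values)
fur = np.array([0.45, 0.32, 0.25])   # brownish fur albedo (illustrative values)

up = np.array([0.0, 1.0, 0.0])
overhead = lambert(up, np.array([0.0, 1.0, 0.0]), sun, fur)  # noon sun
grazing = lambert(up, np.array([1.0, 0.2, 0.0]), sun, fur)   # low evening sun
```

A surface facing away from the light receives nothing at all, which is why an unlit model renders as a black blob.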
29:28The scene is almost finished.
29:31And even these pieces of fruit were the product of thorough research and painstaking work.
29:37This time, I created a type of fruit called Myrianthus.
29:43This is a Myrianthus, or a giant yellow mulberry.
29:48It has a knobbed surface and is made up of many layers of pulp.
29:55It is native to Africa and is frequently eaten by chimpanzees.
30:00We don't have this type of fruit in Japan.
30:03To create the model of the fruit, Katsuyama collected a lot of reference data, not only about Myrianthus itself,
30:09but also about other types of fruit.
30:13She not only gathered data, but also bought various kinds of fruit,
30:18and carefully studied them to make the digital Myrianthus more realistic.
30:25She seems to be pretty good at it.
30:27I like texture mapping, so I sort of created my own fruit stand.
30:32I built four different base models with different patterns,
30:36from which I created more variations.
30:39That part was a lot of fun.
30:44Katsuyama also created a whole series of fruit models.
30:48This is one of them, a yellow mulberry.
30:54She also created models representing the varied stages of the fruit's life,
31:01from green, to ripe, to rotten.
31:09She spent two weeks observing the rotting process of real fruit prior to her modeling work.
31:15To make the Myrianthus look even more real, one last touch has been added.
31:20Let's rewind.
31:24Can you spot the bugs flying around the fruit?
31:28That was added by a creator in charge of visual effects.
31:35This is Masanori Yamao, a visual effects designer.
31:39Here's the bug that should be flying around.
31:42This model already has its wing-flapping animation.
31:46Zoomed in, we can see that it's a fly.
31:49That's what's buzzing around the fruit.
31:52Even the detailed work of making this small fly buzz around has to be created.
31:59He's also particular about some parts of the fruit.
32:02Well, on this animation, I debated whether flies should go near the fruit.
32:07It took me a while to work this out.
32:14I mean, what happens if they go near each other?
32:17Because in real life, when bugs get close together, they don't just keep flying.
32:32They're actually aware of each other and react accordingly.
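(The fly behavior described here, awareness of neighbors, is classically modeled with a boids-style separation rule. A minimal Python sketch with two invented flies; the radius and steering strength are made-up parameters:)

```python
import numpy as np

def separation_step(positions, velocities, radius=1.0, push=0.1, dt=0.1):
    """One step of a boids-style separation rule: flies that come within
    `radius` of a neighbor steer away instead of flying through it."""
    pos = positions.copy()
    vel = velocities.copy()
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i == j:
                continue
            d = pos[i] - pos[j]
            dist = np.linalg.norm(d)
            if 0.0 < dist < radius:
                vel[i] += push * d / dist  # steer directly away from the neighbor
    return pos + vel * dt, vel

flies = np.array([[0.0, 0.0], [0.5, 0.0]])  # two flies too close together
vels = np.zeros((2, 2))
new_flies, new_vels = separation_step(flies, vels)  # the pair drifts apart
```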
32:37These flies are so tiny that some viewers might not even have noticed them.
32:42But they give a visual clue to the sweet odor of the Myrianthus fruit.
32:46And they even hint at the passage of time that rots the fruit.
32:51Finally, after so many stages of work, done by so many creative talents,
32:57the earth of 4.4 million years ago that was once home to Ardipithecus ramidus has been digitally reborn.
33:04Now, let's explore this prehistoric world.
34:15You would never be able to film this today anywhere on Earth.
34:18It was lost a long time ago.
34:21The prehistoric landscape and the life of our distant human ancestors have now come to life in a hyper-realistic 4K movie.
34:38The quality of CGI for this program was quite exceptional.
34:42I think it's the first of its kind.
34:45The overall impression is different.
34:48It doesn't only recreate the body shape, but also the skin texture,
34:53the way the fur is slightly disarrayed, and how all these elements change when the character moves.
34:59And also the facial expressions.
35:02The characters can make a puzzled face, show a sense of danger, or even make a truly subtle expression.
35:09All those delicate movements are brilliantly recreated.
35:13And the integration of the characters with the environment is also superb.
35:18For example, you can really feel that the sun is beating down and how hot it is.
35:24You can easily empathize with the characters.
35:27You can feel their relief when they reach the cool shade of the trees.
35:31I've never seen any computer-generated movie that can convey this level of reality.
35:37For this project, CGI was used to also reconstruct other hominid species,
35:43and their habitats from different epochs.
35:53This is Earth of 3.5 million years ago.
35:56900,000 years have passed since the time of ramidus.
36:00The star of this era is a species that lived in groups on the plains,
36:03its forests having declined due to global aridification.
36:07They are Australopithecus afarensis, a bit of a tongue twister of a name.
36:13Let's take a look at their life.
36:27The group is foraging together.
36:51But danger may be lurking.
36:55A predator passes.
37:14Yes, in this epoch, hominids were prey to carnivores.
37:20They were largely defenseless against such ferocious enemies.
37:29Whoa!
37:44This is Earth, 2.4 million years ago,
37:48when it was inhabited by another hominid species, Homo habilis.
37:53They are considered a very important species.
37:55Homo habilis were the first proto-humans to use tools,
37:59and may have even been responsible for the invention of stone tools.
38:06This group seems to have spotted something.
38:09They've come upon a pack of hyenas, feeding.
38:37Will they try to steal the kill?
38:59Very bold.
39:00But, to survive, they have to eat.
39:16Some researchers hypothesize that the invention of stone tools
39:19may have been a happy accident.
39:22Out of the Cradle dares to recreate that scene based on this hypothesis.
40:30Homo erectus lived 1.8 million years ago.
40:34This species had two key traits.
40:43The first is that they started hunting other animals as prey.
40:49The second is that their body shapes and movements
40:52had come to resemble our species much more closely than any earlier ones.
41:03Since they look so much like modern humans,
41:05a combination of live-action filming and digital recreation
41:09was used to reconstruct their life.
41:13Location shooting was done in Australia.
41:18Australia was chosen as it offers a variety of pristine natural environments,
41:22from desert to forest to coastlines.
41:28This ambitious effort to bring Homo erectus back to life
41:31was undertaken by a Japanese production team
41:34working alongside filmmakers from Australia and Hollywood.
41:38Around 100 people in total were involved in the shoot.
41:47This man is playing Homo erectus, our main character.
41:54He is made up with prosthetics before the shoot.
41:58Let's take a peek at how the special effect makeup is done.
42:05A prosthetic makeup artist from Hollywood
42:07will give the actor a face that turns the clock back 1.8 million years.
42:17A special mask restores the raised cheekbones and the brow ridge
42:21that are characteristic to Homo erectus.
42:27The makeup artist took a cast of the actor's face
42:30and under the guidance of anthropologist Hisao Baba,
42:33built a mask with these features.
42:40After two hours in the makeup chair,
42:42his modern human face travels back in time
42:44to create a 1.8 million year old Homo species.
42:49The actor's face is now that of Homo erectus.
42:53He's also wearing a green cap
42:55so that they can add digital hair later on.
42:58Essentially, it's a high-tech wig.
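(The green cap works like any chroma key: pixels near the key color are masked out so digital hair can be composited in later. A toy Python sketch with a two-pixel "frame"; the threshold value is illustrative:)

```python
import numpy as np

def chroma_key(image, key=(0, 255, 0), threshold=100.0):
    """Chroma key: mark pixels whose RGB color is within `threshold`
    of the key color (here pure green) so they can be replaced later."""
    diff = image.astype(float) - np.array(key, dtype=float)
    return np.linalg.norm(diff, axis=-1) < threshold

# A "frame" of two pixels: a green-cap pixel and a skin-tone pixel
frame = np.array([[[0, 250, 10], [180, 120, 90]]], dtype=np.uint8)
mask = chroma_key(frame)  # True where digital hair should be composited
```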
43:20It's finally time to roll the cameras.
43:23First up is a scene of Homo erectus hunting.
43:37The prey is a species that no longer exists,
43:40so it will have to be added digitally.
43:42For shooting purposes, he stabs his spear into a prop.
43:54When the live-action footage shot in Australia
43:57is combined with a digital model created in Japan,
44:00it looks something like this.
44:07For Mark Rees-O'Gill, the producer of the Australian shoot,
44:10this was a brand new experience.
44:13You know, new people, new experiences, new locations.
44:22You know, this is very rewarding for me as a producer.
44:27So, yeah, it was a great opportunity.
44:31During their month on location,
44:33they shot a variety of scenes with Homo erectus.
44:39Cinematographer Adam Clark explains the shoot's difficulty.
44:44There have been a lot of VFX elements
44:48and, of course, going out into real locations
44:52and shooting, but also having to bear in mind
44:56the visual effects, the prosthetics,
44:59the post-production side of it.
45:04That has challenges in its own right.
45:07We've been to the Virwa fields
45:09and had to build an 80x20 blue screen.
45:13While there's always pressure to shoot in terms of the time,
45:17there's also the time you actually are required
45:20to get all of those elements together and right
45:23so that at the end of the day, someone can put the compositions together
45:27and actually has the information that they require.
45:35In this scene, a group of Homo erectus feast on meat from their prey.
45:47The primary food source for Homo erectus
45:50gradually transitioned from plants to meat.
45:53Some experts believe the meat-eating diet
45:56fueled an increase in their brain size.
46:29To recreate a vivid prehistoric world,
46:32you need something beyond visuals of ancient species and their habitats.
46:43The production also needed to create the right sounds.
46:54The right sounds include the speech of Neanderthals and Homo sapiens.
47:00Kakupa!
47:11Reconstructing the language of our human ancestors
47:14was one of the project's daring challenges.
47:17Ka-ku-pa.
47:19Ka-ku-pa.
47:21Good, and we want to kind of flow to the next gesture, right?
47:24So these would all have been part of a sentence.
47:26Ka-ku-pa.
47:27Ka-ku-pa.
47:29For advice on the proto-language used by the characters,
47:32the team turned to Andrew Byrd, a historical linguist.
47:38But how does he reconstruct something spoken in distant prehistory?
47:50We don't know if the Neanderthals had language.
47:55The further along we get within our research,
47:58the more likely it seems that they did have language.
48:02But we do know a couple things.
48:04First off, we know that they were highly intelligent.
48:08Second, we know that their anatomy,
48:10while slightly different from that of Homo sapiens,
48:13was almost certainly tolerant,
48:17or it almost certainly permitted the production of speech.
48:20Now that speech may have sounded different from that of our own,
48:24but it would have been something I would imagine fairly similar.
48:31This scene depicts the possible first encounter
48:34between Neanderthals and Homo sapiens,
48:36as imagined by researchers.
48:39The Neanderthals shout as they run away.
48:47What does that mean?
48:50The word for father for the Neanderthals was babu,
48:54and I essentially used babbling as an inspiration there.
48:58But of course babies aren't babbling about everything, right?
49:01So there are other concepts that a baby's not going to be able to create.
49:06So I used onomatopoeia for that.
49:08So the idea is that there are certain sounds in the world
49:12that we try to imitate, and that becomes the word.
49:15And so the idea that came to me
49:18was that "to wait" is the idea of having two feet
49:22planted down in the ground, right, in the mud,
49:25and that you're waiting, physically waiting there.
49:28So the word is puka,
49:30which is created to imitate the planting of feet within the mud.
49:37Another sound that needed to be reconstructed
49:40was that of prehistoric music.
49:42It can be heard during a scene
49:44where a group of Homo sapiens is performing a ritual.
50:04Here is the man who created the music of early Homo sapiens.
50:14He is Rupert Till of the University of Huddersfield in the UK.
50:21He is both an archaeologist and a composer.
50:24It makes a very deep sound.
50:26He's been conducting research on the music of the Paleolithic era
50:30for many years.
50:39I try to avoid using any sounds or instruments that exist today.
50:44These are sounds that are being made as part of a ritual,
50:48and the music and the dance and everything comes together.
50:52So I try to use natural sounds, sounds from the body,
50:56clapping, singing, making sounds with your voice,
51:00and the instruments we know about,
51:03like ancient prehistoric flutes.
51:11Till also found something ideal
51:13to help achieve the sounds from ancient times.
51:22I was really lucky to be able to go to some caves in Spain,
51:26and while we were there,
51:28we were looking for rocks that ring when you hit them.
51:32And in this cave, we discovered these amazing formations,
51:39stalagmites rising from the ground and stalactites hanging from the ceiling.
51:43And when you hit them, we hit them with a very soft mallet
51:47so that it wouldn't damage them.
51:49Some of them ring like bells.
51:5310, 11, 12, 13,
51:5914, 15, 16, 17...
52:17Even over these incomplete images,
52:19Till's ritualistic music lends an enchanting atmosphere.
52:50Wow!
52:59I think it's important to convey
53:01actual images of ancient people to the public.
53:04As researchers, we try to grasp the past as accurately as possible,
53:09based on previous excavations and research.
53:12And these technologies are a truly fantastic means
53:16of communicating our discoveries to the public.
53:24Thanks to a vast amount of reference material,
53:27the expertise and insight of many scholars,
53:30and state-of-the-art digital imaging technology,
53:33these lost prehistoric worlds have been brought back to life.
53:39If a real Ardipithecus ramidus could see this digital model of himself,
53:43how would he react?
53:47Peek-a-boo!
53:49What words of surprise would a Neanderthal blurt out?
53:55Kakupa!
54:00These digital images will spark your imagination,
54:03and perhaps make you feel closer to these prehistoric species.
54:16In Shibuya, Tokyo,
54:19the soundtrack for the Out of the Cradle series is being recorded today.
54:35Shu Kanematsu and Takashi Omama
54:39are the two talented young musicians
54:41who composed and performed the original music for the series.
54:47The original score comprises more than 40 pieces.
54:51One, two...
55:16Oh, oh, oh!
55:47An ambitious project to recreate human evolution
55:51in hyper-realistic CGI.
55:56The two-year production of these stunning images
55:58was in fact driven by good old-fashioned artistry,
56:01contrary to the stereotypical image of cutting-edge digital media.
56:10It was a long, gradual process, much like human evolution itself:
56:14from standing upright,
56:16to inventing tools,
56:17to acquiring language,
56:19and ultimately to the arrival of modern humans.
56:24This cutting-edge CGI has come to life through painstaking creativity and knowledge
56:28from artists and researchers alike.
56:31So for the first time,
56:32we can see how this stunning series came Out of the Cradle.