Altered States

Idea generation and iteration

In this lesson we were introduced to the AI image generator DALL·E 2, which can generate images for our ideas: https://openai.com/product/dall-e-2

In this project we had the idea to make something surreal, which could look great and be fun. The scene of our choice was a lake, a grass field or a flower bush. We wanted to do something special, and some of the other students had already chosen grass, so for the time being we decided on the lake or the seaside.

This is our mind map, where we collected all the elements we wanted and tried to build a story.

AI production

In today’s session, after a discussion with Vallerie, we decided to make a huge mask floating on the surface of the sea, representing the identity of the human being. Next to it are two hands, representing the manipulation and confinement of each individual’s life. The camera then pushes in as the mask opens its mouth, and inside appears the exact same scene, representing people’s endless but meaningless exploration of themselves.

My team members did some simpler concept images and then created the storyboard.

We went to Kew Gardens, Holland Park and Tokyo Park for research, and on Billie’s advice it was best to find a rock pool rather than a lake with a lot of fluctuating water. In the end we chose Kew Gardens, which matched our ideas best.

During the presentation, Sam and Billie suggested that we place a reference for the mask at the waterside so that it would be easier to replace it when tracking.

New Footage

During spring break my group member Vallerie went back to her home country of Indonesia and shot the footage we needed. After spring break we watched the clips and felt the different mood brought by the seaside, the rocks and the distinctive tropical architecture.
We then changed the original idea and set the whole theme in a mysterious, intriguing mood of exotic islands, pirate ships and human remains.

We then made a new storyboard and moodboard based on the footage. We wanted the atmosphere to be mysterious, creepy and disgusting.

We did the 2D tracking test together using four-point tracking. Since three of us are 3D students, Nuke was a brand-new tool and really confusing at first touch, but with Sam’s help we finally figured out how to make the images attach to the footage.
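Under the hood, a four-point track works like Nuke’s CornerPin: it solves for the 3×3 homography that maps an image’s four corners onto the four tracked points, then warps every pixel with it. Here is a minimal pure-Python sketch of that idea; the corner coordinates are made-up example values, not from our actual shot.

```python
# Sketch of four-point (corner-pin) warping: solve for the homography H
# that maps four source corners to four tracked corners, then warp points.
# All coordinates below are illustrative, not from our real footage.

def solve(a, b):
    """Solve a @ x = b by Gaussian elimination with partial pivoting."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def corner_pin(src, dst):
    """Return a function warping (x, y) by the homography src -> dst."""
    rows, rhs = [], []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); rhs.append(u)
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y]); rhs.append(v)
    h = solve(rows, rhs) + [1.0]   # fix h[8] = 1 (8 unknowns, 8 equations)
    def warp(x, y):
        w = h[6] * x + h[7] * y + h[8]
        return ((h[0] * x + h[1] * y + h[2]) / w,
                (h[3] * x + h[4] * y + h[5]) / w)
    return warp

# Example: pin a unit square onto four hypothetical tracked corners.
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 20), (110, 25), (105, 130), (8, 125)]
warp = corner_pin(src, dst)
print(warp(0, 0))   # -> approximately (10.0, 20.0), the first tracked corner
```

Nuke tracks where the four corners go on each frame and re-solves this mapping per frame, which is why the pinned image appears glued to the footage.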

We did the 3D tracking test separately, and with Ria’s help we solved a lot of problems. We learned how to set up a reference box for the ship model, which is what the animation work in Maya is based on.

Test:

Skull Modelling

Nomad Sculpture
Upgrade

I first sculpted a skull in Nomad from a sphere based on a reference drawing, from which I discovered how important it is for a sculptor to keep rotating the model to check its accuracy from all angles. I then used the crop tool to remove the jawbone, as I thought a mutilated skull would be more atmospheric for our theme. After that I made the skull’s corroded teeth one by one from cubes.

High-poly
Low-poly

I then imported the model into ZBrush for further detail sculpting and polygon reduction, ahead of the texturing work in Substance Painter. For a model with a hundred thousand polygons it is very difficult to unwrap the UVs, so I used ZBrush’s ZRemesher, which creates a low-poly mesh based on the high-poly model, and then used Project All, which ensures that the details are preserved on the retopologised model. I then did some detail sculpting such as cracks, and finally exported it in OBJ format.

This was the trickiest part: the tutorials I found used UV Master, which we don’t have on the uni PCs, so I unwrapped the UVs in Maya instead. You can see in the screenshot that I used the automatic UV mapping, and it turned out not so good. Later, after doing the shipwreck model, we found out that it is possible to unwrap UVs in ZBrush, which I will try next time.

Because our skull is a deeply corroded object, messy UVs won’t cause too many problems, but it’s still very sloppy by 3D-industry standards and I’ll be redoing the UVs after this project.

Substance Painter

I then adjusted its colour and materials in Substance Painter and, together with the group, chose the most suitable look for it.

These were the final render results in Maya, and something happened during the process that drove me crazy.

After I applied all the texture maps, the skull was stretched out in the render window. I didn’t know what was happening and checked all the render settings and the UVs without finding a problem. Finally I discovered it was just a matter of the Normal and Diffuse maps being assigned the wrong way round.

https://skfb.ly/oHJuQ

Here is the final model I uploaded onto Sketchfab.

2D tracking: after calculating the tracking points, I erased the positioned objects in the video and placed the rendered skull on the footage. I then made a copy as a reflection and gave it a ripple distortion effect.

3D tracking: my groupmate let Nuke calculate the tracking points, exported the camera and imported it into Maya. After finding the appropriate location, she placed the model and set up the lighting. Analysing the video, she lit the object with soft, cool light and added a point light in the dark areas to bring out more detail. After completing the animation she rendered the image sequence and imported it into Nuke for compositing and colour adjustment. The colour work mainly uses Grade and ColorCorrect nodes, and she also used Constant nodes to simulate fog, following the principle of aerial perspective so that near objects look solid and far ones fade away. She dimmed the original video, reduced the contrast and shifted the colours to a cool grey to create a more realistic effect. We then imported the composite into After Effects, added sound effects and music, and exported it.
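The aerial-perspective fog trick can be summed up in one formula: blend each pixel toward a flat fog colour by an amount that grows with depth. Here is a minimal pure-Python sketch of that blend; the colours, depths and density value are illustrative, not the ones used in our comp.

```python
# Aerial perspective as a depth-driven mix toward a fog colour:
# near pixels keep their own colour, far pixels fade to the fog constant.
# All numbers here are made-up examples, not values from our actual comp.
import math

def fog_mix(pixel, fog_colour, depth, density=0.5):
    """Exponential fog: t goes from 0 near the camera toward 1 far away."""
    t = 1.0 - math.exp(-density * depth)
    return tuple(p * (1 - t) + f * t for p, f in zip(pixel, fog_colour))

warm_rock = (0.8, 0.4, 0.2)      # a warm foreground colour
cool_grey = (0.5, 0.55, 0.6)     # the fog constant, a cool grey

near = fog_mix(warm_rock, cool_grey, depth=0.1)
far  = fog_mix(warm_rock, cool_grey, depth=8.0)
print(near)  # barely changed from the rock colour
print(far)   # almost entirely the cool grey fog colour
```

In Nuke this would be a Constant node merged over the plate with a depth-driven mask doing the job of `t`; the exponential falloff is a common choice, not the only one.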

Final

According to Sam’s feedback, it would be more realistic if we added some light effects on the boat.