In the book Einstein’s Dreams, there is a chapter about how trapped we are in time. We can only experience this moment; it’s not possible to travel to the past, nor to foresee the future.
This idea haunted me for weeks, and I kept thinking: what if we created an illusion that makes people feel they can control time?
And furthermore, that they are not confined to this body, this time, and this space?
After a series of brainstorms, we came up with this plan:
The original idea:
- Install a 360 camera to capture the environment in the room.
- Have the user sit in a chair wearing a VR headset, controllers in hand
- Live stream the 360 video and play it back in the VR headset
- The user would then be able to see the actual environment in the VR world, and could control the playback speed of the 360 video with the controllers, making it seem like he/she is controlling the rate at which time passes.
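The core trick behind this plan is that a live stream watched with a deliberate delay is really just a buffer of recent frames, and “controlling time” means moving a playback cursor through that buffer at a variable rate. Here is a minimal sketch of that idea in plain Python (the class and frame representation are hypothetical, for illustration only; there is no real video I/O here):

```python
from collections import deque

class DelayedPlayback:
    """Buffer incoming frames and play them back at a variable time rate.

    Frames are (timestamp, payload) pairs. `speed` scales how fast the
    playback clock advances relative to wall time: 1.0 = real time,
    0.5 = slow motion, 0.0 = frozen.
    """

    def __init__(self, delay_seconds):
        self.buffer = deque()                  # frames waiting to be shown
        self.play_clock = -delay_seconds       # viewer's "now" lags the stream
        self.speed = 1.0

    def ingest(self, timestamp, frame):
        # Called for every frame arriving from the live stream.
        self.buffer.append((timestamp, frame))

    def tick(self, wall_dt):
        # Advance the playback clock by scaled wall time, then return the
        # newest buffered frame whose timestamp we have reached (if any).
        self.play_clock += wall_dt * self.speed
        shown = None
        while self.buffer and self.buffer[0][0] <= self.play_clock:
            shown = self.buffer.popleft()
        return shown
```

With a 2-second delay and `speed = 1.0`, the viewer always sees the room as it was 2 seconds ago; dropping `speed` below 1.0 makes the visible world slow down while the buffer quietly grows.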
Make it happen
Then we started working. The first step: googling.
After a week of googling around the internet, we found that there were many technologies we could employ, including Unreal, Unity, different brands of 360 cameras, RTMP, HLS, HTC VIVE, Google Daydream, Cardboard VR, etc. To decide which to use, we tried most of them and settled on the ones that suited us best.
So, here is what we tried:
HLS and RTMP are the two mainstream live-streaming protocols; in the end, we chose RTMP because its latency is easier to keep under control.
We tried building a playback client in both Unreal and Unity, and in the end we chose Unity because it works better with the Google Daydream SDK.
We first built an RTMP live-streaming server based on Nginx, and it worked well, but we later switched to SRS because SRS is much faster to deploy.
As for the camera, we chose the Ricoh Theta V because it was the only one available that could live stream in 4K.
So, finally, the plan:
- Create an RTMP server using SRS.
- Set up the Ricoh Theta and live stream the footage via OBS.
- Build a client in Unity on a Google phone, and play back the live stream through a Google Daydream kit.
- User test and fine-tune.
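For the server step, SRS works with a very small configuration out of the box. A minimal sketch of an `srs.conf` for plain RTMP might look like the fragment below (port and limits are defaults; the exact stream URL path, e.g. `live/stream`, is whatever you set in OBS, not something fixed by SRS):

```nginx
listen              1935;
max_connections     100;
vhost __defaultVhost__ {
}
```

OBS then pushes to something like `rtmp://<server-ip>/live/stream`, and the Unity client pulls from the same URL.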
We also had some more ideas to try, such as setting different latencies and placing the camera at different heights.
We tried putting the camera about 3 meters high, almost touching the ceiling, to create a static bird’s-eye view; we also tried putting it on the floor, so you see everyone as a giant, just like in Gulliver’s Travels.
The shortest latency we achieved was 2 seconds, but when the latency is too short, you can only ever see the back of your own head; you cannot see yourself turning around. We loved the “see yourself turning around” part, so we set the latency to 4 seconds.
To make it an immersive experience, we added a voice-over track of April reading the chapter of Einstein’s Dreams that gave us the inspiration.
We put a “stop the clock” button on the controller and tried different schemes, like completely stopping the video versus slow motion; we liked slow motion best.
After a few seconds in slow motion, the video plays back at 2x speed to catch up with the stream, suggesting that you cannot always live in memories.
One other thing worth mentioning: we halved both the speed and the pitch of the voice-over track during slow motion.
We also added a sound effect for entering slow motion, played when the user presses the button, and another for when the slow motion ends.
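The catch-up timing works out neatly: at 0.5x you fall behind the stream by half a second of content per wall second, and at 2x you recover one extra second per wall second, so the catch-up phase lasts half as long as the slow-motion phase. A back-of-the-envelope sketch (Python; the constants are our values, but the function names are made up for illustration):

```python
SLOW_SPEED = 0.5     # playback rate during slow motion
CATCHUP_SPEED = 2.0  # playback rate while catching back up

def catchup_duration(slowmo_seconds, slow=SLOW_SPEED, fast=CATCHUP_SPEED):
    """Seconds of fast playback needed to re-sync with the delayed stream.

    During slow motion we fall behind by (1 - slow) seconds of content per
    wall second; at `fast` speed we recover (fast - 1) seconds per second.
    """
    lag = slowmo_seconds * (1.0 - slow)
    return lag / (fast - 1.0)

def voice_over_params(speed):
    # We scale the voice-over rate together with the video speed; naive
    # resampling like this also halves the pitch at 0.5x, which is exactly
    # the dreamy effect we wanted (rather than pitch-preserving stretching).
    return {"rate": speed, "pitch": speed}
```

So 4 seconds of slow motion leaves a 2-second lag, which 2x playback clears in 2 seconds.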
I love this project
It ended up being a pretty cool project, and I love seeing people get surprised. We proudly showed it at the 2018 ITP Winter Show and captured some “wow” moments.
See more: [Note for setting up ITTGTT](http://yangyang.blog/2018/12/nothing-doing-it-again/).