
Documentary video

https://youtu.be/Yauy3niIFIQ

Editor: Xiyue Huang

Script: Qiandai Sun

Voice actor: David

In our VR game, players embody a tree and experience its life cycle, from a seedling to a mature tree, observing day/night and seasonal changes until death. The Wood Wide Web concept represents the exchange and cycle of life: when a tree dies, it becomes a source of nutrition for other organisms, injecting new vitality into the ecosystem. Physical interaction is provided by players outside the field, who use water sprays instead of rain and electric fans instead of wind, and tap the player to simulate interactions with small animals.

Our device is the Oculus Rift, and players control the game through its controllers and headset. Connecting the device to Unity is time-consuming due to configuration and connection issues.

Our team divided the work according to the storyboard and carried out scene modeling, animal animation, nutrition-transfer animation, day/night changes, seasonal changes, and the design, collection, and import of sound.

Cooperating as a group, we faced problems, modified the plan, and solved the issues together. This process was valuable and laid the foundation for the output of our final project.

After our presentation, we received positive feedback. Most players found the experience immersive and interesting. However, we need to add transitions between different scenes to avoid abrupt changes. We will continue improving the game experience to engage players better.

Personal Reflection: Xiyue Huang

As part of the Unity VR team project, I had the opportunity to work with a talented group of individuals, each with their own skills and strengths. Throughout the project, we faced various challenges and obstacles that required effective collaboration and communication to overcome. In terms of collaboration, I regret that we were not able to fully utilize each team member's potential, and our communication efficiency could have been better, which led to some delays and misunderstandings during the project. I learned the importance of clear communication, active listening, and constructive feedback in creating a cohesive and efficient team.

From a technical perspective, I was able to expand my knowledge and skills in Unity and VR development. I learned how to implement interactive features such as grabbing and interacting with objects in VR, and how to optimize performance for VR applications. I also had the chance to work with various VR devices, including the Oculus. I am grateful to our kind mentor Leo for his careful guidance and for leading me to solve problems step by step. I learned how to use VR devices and Unity to implement interactive functions, how to use Unity's API and tools to create scenes and objects, and how to use coroutines to manage scene and object switching. When connecting the VR devices, I encountered countless problems such as headset latency, unconnected controllers, HDMI disconnections, and camera blackouts. These issues involved version-matching problems between the SDK and OpenXR, which I solved one by one. Now I can proudly say that I am a VR connection master, and I will never have trouble connecting again. I also learned how to organize code better, making it more maintainable and extensible. These technical skills and knowledge will have a positive impact on my future learning and career development. I am very happy to have participated in this project; although it was challenging and required a lot of work, I gained much more in return.

Looking back on this project, I do have some regrets about areas where I feel we could have done better. One area that comes to mind is the overall interactivity of the project, which could have been more robust and engaging. Additionally, the transitions between different parts of the experience could have been smoother and more seamless. Specifically, I think it would have been great if we could have added more functionality to the controller buttons to enhance the interactive experience. While I’m proud of what we accomplished, I believe there is always room for improvement and I hope to incorporate these lessons into future projects.

Looking towards the future, I believe this experience has equipped me with valuable skills and knowledge that I can apply to future projects and career opportunities. I feel more confident in my ability to collaborate with a team and tackle complex technical challenges. Additionally, I am excited to continue learning and exploring the possibilities of VR development and how it can be applied in various industries and fields.

Overall, I am grateful for this experience and the opportunity to work with such a talented and dedicated team. It has been a valuable learning experience that I will carry with me as I continue to grow and develop in my career.

VR Virtual Interaction

Camera Raise

To make the camera move up when the A button on the controller is clicked, I use the OVRInput class to detect button presses on the Oculus Touch controller. Specifically, I use the OVRInput.GetDown(OVRInput.Button.One) method to check whether the A button has been pressed.

To move the object, I check for user input (the KeyCode.E key or the OVRInput.Button.One button being pressed) and then use the transform.Translate method to move the object upward, in the Vector3.up direction multiplied by a speed factor (cameraSpeedSeed) and Time.deltaTime. Once the object has reached a certain height (50), the code sets the firstScene boolean to false and the secondScene boolean to true, then starts a coroutine called “SecondScene”. In the second scene, the player’s position, direction, and rotation are stored in their respective variables for later use.
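Below is a minimal sketch of this logic, assuming a script attached to the camera object; the names cameraSpeedSeed, firstScene, and secondScene and the height threshold follow the description above, while everything else is illustrative. The OVRInput calls require the Oculus Integration package.

using System.Collections;
using UnityEngine;

public class CameraRaise : MonoBehaviour
{
    public float cameraSpeedSeed = 2f;   // upward speed factor
    public bool firstScene = true;
    public bool secondScene = false;

    void Update()
    {
        // Raise the camera while E on the keyboard or the A button
        // (OVRInput.Button.One) on the Oculus Touch controller is held.
        if (firstScene &&
            (Input.GetKey(KeyCode.E) || OVRInput.Get(OVRInput.Button.One)))
        {
            transform.Translate(Vector3.up * cameraSpeedSeed * Time.deltaTime);
        }

        // Once the camera reaches the target height, hand over to the next scene.
        if (firstScene && transform.position.y >= 50f)
        {
            firstScene = false;
            secondScene = true;
            StartCoroutine("SecondScene");
        }
    }

    IEnumerator SecondScene()
    {
        // Placeholder: the real coroutine stores the player's position,
        // direction, and rotation for later use (see "Timeline Events").
        yield return null;
    }
}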

Day and night changes:

Two scripts were set up for day and night changes.
One controls the movement of the sun to create changes in the shadows. The CalculateTimeDifference(sunriseTime, sunsetTime) method is used to set the sunrise and sunset times. However, it was observed that the changes in the shadows were not very noticeable.
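A minimal sketch of such a sun controller, assuming the directional light is swept from 0° at sunrise to 180° at sunset; the CalculateTimeDifference helper mirrors the method named above, while the field names and day length are illustrative assumptions.

using UnityEngine;

public class SunController : MonoBehaviour
{
    public Light sun;                    // the scene's directional light
    public float sunriseTime = 6f;       // in-game hours
    public float sunsetTime = 18f;       // in-game hours
    public float dayLengthSeconds = 60f; // one in-game day in real seconds

    void Update()
    {
        // Simulated hour of day, looping over 24 hours.
        float hour = (Time.time / dayLengthSeconds * 24f) % 24f;
        float daylight = CalculateTimeDifference(sunriseTime, sunsetTime);

        // Sweep the sun across the sky during daylight so shadows move.
        float t = Mathf.Clamp01((hour - sunriseTime) / daylight);
        sun.transform.rotation = Quaternion.Euler(t * 180f, 170f, 0f);
    }

    float CalculateTimeDifference(float sunrise, float sunset)
    {
        return sunset - sunrise; // daylight duration in hours
    }
}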

Another script was written to switch between three different skyboxes for morning, noon, and evening. The switch happens automatically every 10 seconds, using the if (Time.time % 10 < Time.deltaTime) statement in the Update function.
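A minimal sketch of that skybox switcher, assuming the three skybox materials are assigned in the Inspector; the array and field names are illustrative.

using UnityEngine;

public class SkyboxCycler : MonoBehaviour
{
    public Material[] skyboxes;   // morning, noon, evening
    public float interval = 10f;  // seconds between switches
    private int index = 0;

    void Update()
    {
        // True for roughly one frame every `interval` seconds.
        if (Time.time % interval < Time.deltaTime)
        {
            index = (index + 1) % skyboxes.Length;
            RenderSettings.skybox = skyboxes[index];
            DynamicGI.UpdateEnvironment(); // refresh ambient lighting
        }
    }
}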


Seasonal Changes:

I implemented seasonal changes for the tree materials using Unity’s URP, with color variations in the material representing the four seasons. Smooth automatic switching between the four material textures was achieved by setting a timer to control the interval between season changes: in each Update call, the timer is incremented by the time of the last frame, and if the elapsed time since the last season change exceeds the set interval, a season change is triggered. The Lerp function is used to transition smoothly between materials. One problem was that the shader material for the tree trunk and the leaves changed together; this was resolved by separating the material textures for the trunk and the leaves into separate nodes.
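A minimal sketch of this timer-and-Lerp pattern, assuming four season materials and a leaf renderer separate from the trunk, as described; the field names and blend duration are illustrative.

using System.Collections;
using UnityEngine;

public class SeasonChanger : MonoBehaviour
{
    public Renderer leaves;            // leaf renderer, separate from the trunk
    public Material[] seasonMaterials; // spring, summer, autumn, winter
    public float seasonInterval = 15f; // seconds per season
    public float blendTime = 2f;       // duration of the cross-fade

    private int season = 0;
    private float timer = 0f;

    void Update()
    {
        // Increment the timer by the time of the last frame; trigger a
        // season change once the elapsed time exceeds the interval.
        timer += Time.deltaTime;
        if (timer >= seasonInterval)
        {
            timer = 0f;
            int next = (season + 1) % seasonMaterials.Length;
            StartCoroutine(BlendTo(seasonMaterials[season], seasonMaterials[next]));
            season = next;
        }
    }

    IEnumerator BlendTo(Material from, Material to)
    {
        // Material.Lerp interpolates the colors and float properties of the
        // two season materials for a smooth transition.
        for (float t = 0f; t < 1f; t += Time.deltaTime / blendTime)
        {
            leaves.material.Lerp(from, to, t);
            yield return null;
        }
        leaves.material = to;
    }
}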

Animal Interaction:

To control the animal animations, two methods were attempted: one using scripts and the other using animation clips.
There were some difficulties with the Animator at the beginning because the imported animal model came with bone animations, such as a bird flapping its wings, but I needed to add a movement clip to the bird and play both animations simultaneously. To achieve this, I created an empty Animation Clip as a parent and dragged the two child Animation Clips into it. Then, I used a Blend Tree in the Animator Controller to make the bird play both animations at the same time.
The other method involved triggering movement through coding, by setting a target point with “targetPosition = new Vector3(Random.Range(-5f, 5f), Random.Range(-3f, 3f), 0f);” and calculating the distance from the animal to the target point with “distance = Vector3.Distance(transform.position, targetPosition);”. This allowed for automatic animation control.
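A minimal sketch of that script-driven wandering, using the target-point and distance calculations quoted above; the speed and arrival threshold are illustrative assumptions.

using UnityEngine;

public class AnimalWander : MonoBehaviour
{
    public float speed = 1.5f;
    private Vector3 targetPosition;

    void Start()
    {
        PickNewTarget();
    }

    void Update()
    {
        float distance = Vector3.Distance(transform.position, targetPosition);
        if (distance < 0.1f)
        {
            PickNewTarget(); // arrived: choose the next point
        }
        else
        {
            transform.position = Vector3.MoveTowards(
                transform.position, targetPosition, speed * Time.deltaTime);
        }
    }

    void PickNewTarget()
    {
        targetPosition = new Vector3(Random.Range(-5f, 5f), Random.Range(-3f, 3f), 0f);
    }
}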

Timeline Events

Under the guidance of the professor, we attempted to implement the multitasking switching function using coroutines. Three coroutine functions were defined: SecondScene, ThirdScene, and FourthScene, which were used to switch between different scenes at different time intervals.

The first coroutine function, SecondScene, waits for 5 seconds, calculates a new position spawnPosition, and instantiates a game object birdPrefab at this position. It then waits for another 20 seconds, sets the variable secondScene to false, and ends the bird animation. Finally, it starts another coroutine function, ThirdScene.
The second coroutine function, ThirdScene, waits for 1 second, then instantiates a game object rainPrefab, enables the particle system for rain, and plays a sound effect. It then waits for 10 seconds, destroys the rain object, stops the sound effect, and starts another coroutine function, FourthScene.
The third coroutine function, FourthScene, is the scene for finding fire. It sets multiple fire particles and activates each game object in the fires array sequentially using a script, while playing a sound effect. Each fire object waits for 2 seconds, and then they are destroyed. Finally, the sound effect is stopped, and another scene, trunk, is loaded.
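A condensed sketch of this coroutine chain, with the timings as stated above; the prefab, audio, and spawn-position details are assumptions rather than the project’s exact code.

using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneTimeline : MonoBehaviour
{
    public GameObject birdPrefab;
    public GameObject rainPrefab;
    public GameObject[] fires;
    public AudioSource rainSound;
    public AudioSource fireSound;
    public bool secondScene = true;

    IEnumerator SecondScene()
    {
        yield return new WaitForSeconds(5f);
        Vector3 spawnPosition = transform.position + Vector3.up * 10f; // assumed offset
        Instantiate(birdPrefab, spawnPosition, Quaternion.identity);

        yield return new WaitForSeconds(20f);
        secondScene = false; // ends the bird animation
        StartCoroutine(ThirdScene());
    }

    IEnumerator ThirdScene()
    {
        yield return new WaitForSeconds(1f);
        GameObject rain = Instantiate(rainPrefab); // its particle system plays on spawn
        rainSound.Play();

        yield return new WaitForSeconds(10f);
        Destroy(rain);
        rainSound.Stop();
        StartCoroutine(FourthScene());
    }

    IEnumerator FourthScene()
    {
        fireSound.Play();
        foreach (GameObject fire in fires)
        {
            fire.SetActive(true);                // light each fire in turn
            yield return new WaitForSeconds(2f);
        }
        foreach (GameObject fire in fires)
            Destroy(fire);
        fireSound.Stop();
        SceneManager.LoadScene("trunk");
    }
}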

Hardware: VR Equipment and Connection


I am mainly responsible for the setup and operation of Unity VR.

Device Selection

We chose to use the Oculus Rift for our project. Initially, we used the Oculus Quest, but during the first Unity test we found that the Quest’s rendering speed was slow, its resolution was low, and there was significant lag that severely affected the gameplay experience. Because our project uses Unity shaders, rendering speed is critical. We therefore resolved this issue by using Unity post-processing rendering and the low-latency head tracking and visual optimization capabilities of the Oculus Rift, which yielded better results.


Connecting the Headset:

To connect the Oculus Rift headset to Unity, I needed to ensure that the Unity and Oculus SDK versions matched. Then, I set up the Unity environment by enabling Oculus support in the XR Plugin Management, importing the Oculus Integration package, and setting up the OVRCameraRig. This required removing all other cameras in the scene to avoid conflicts that could prevent the display from working correctly.
Next, I added the OculusInteractionSampleRig to the scene and used the OVR Manager component in the OculusInteractionSampleRig > OVRCameraRig to adjust the Tracking Origin Type to Floor Level. By following these steps, we were able to successfully connect and use the Oculus Rift headset in Unity for our project.

Connecting VR Controllers:

To connect VR controllers in Unity, we needed to first ensure that the controllers were compatible with the Rift headset we were using.

Next, I imported the Oculus Integration package into Unity and added the OVRCameraRig to the scene. Then, I added the OVRInput component to the scene’s controllers to enable input functionality.

To map controller inputs to game actions, I used the Unity Input System, which allowed us to define custom input actions and map them to specific controller buttons. We then used these actions to control various elements of the VR experience, such as player movement and interaction with objects in the environment.
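A minimal sketch of binding a controller button to a game action with the Unity Input System; the binding path targets the generic XR controller layout and the action name is illustrative (the project’s actual mappings live in its Input Actions asset).

using UnityEngine;
using UnityEngine.InputSystem;

public class ControllerActions : MonoBehaviour
{
    private InputAction raiseAction;

    void OnEnable()
    {
        // Bind the right-hand primary button (A on Oculus Touch) to an action.
        raiseAction = new InputAction(
            "Raise", binding: "<XRController>{RightHand}/primaryButton");
        raiseAction.performed += ctx => Debug.Log("Raise pressed");
        raiseAction.Enable();
    }

    void OnDisable()
    {
        raiseAction.Disable();
    }
}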

Sometimes, I encountered issues where only the headset was working, but the VR controllers were not responding. In the Oculus Store, it showed that the controllers were connected, but we could not control anything with them. I suspected that the controllers were not connected to Unity. To resolve this issue, I used OpenXR as a replacement and selected Oculus under Play Mode OpenXR Runtime. Then, in the Interaction Profiles, I clicked the “+” button to add the Rift Controller option, which enabled us to connect the controllers. Finally, I imported the XR Interaction Toolkit, which allowed me to use the controllers in our VR project.

At the beginning, the XR Interaction Toolkit was not running, and I discovered that it was due to an outdated version. However, the latest version was not displayed in the package list within Unity. To solve this, I selected the “+” button under the Package Manager and chose “Add Package by name.” Then, I entered “com.unity.xr.interaction.toolkit” and imported the latest version.

By following these steps, I was able to resolve the issue of the VR controllers not connecting in Unity and use them effectively in our project.


Grab Function:

To implement grabbing functionality in Unity VR, I use a prefab that includes a ControllerGrabInteractor component and make it a child of the ControllerInteractors object. I add the Grab Interactor script to the controller object and use the HandGrabInteractor script for gesture tracking.

To make an object grabbable, I add the following necessary components: Collider, Rigidbody, Grabbable, and Grab Interactable. These components allow the object to have physical properties such as collision detection, mass, and gravity, and enable it to interact with the grab interactor script for grabbing and releasing.
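A minimal sketch of preparing an object to be grabbable. The Grabbable and Grab Interactable components come from the Oculus Integration package and are normally added in the Inspector; they appear below only as comments listing the required pieces, while the physics setup is standard Unity.

using UnityEngine;
// using Oculus.Interaction; // from the Oculus Integration package

public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // Physical properties: collision detection, mass, and gravity.
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        Rigidbody rb = GetComponent<Rigidbody>();
        if (rb == null)
            rb = gameObject.AddComponent<Rigidbody>();
        rb.useGravity = true;

        // Added in the Inspector in our project (names as in the SDK):
        // gameObject.AddComponent<Grabbable>();
        // gameObject.AddComponent<GrabInteractable>();
    }
}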

Reference:

https://circuitstream.com/blog/oculus-unity-setup

Interactive Design

Room switching: by clicking a controller button, the character model automatically moves forward.
Question: how can characters move within the small physical space of VR?

The design of the controller functions:
1. The controller is transformed into a specific tool.
2. The controller simulates the hand and its movements: pick up, put down, touch.
3. Controller buttons: the Quest 2 buttons support touch recognition, so touch and press can be recognized and distinguished as two separate actions (see the sketch after this list). In addition, the joystick on the controller is not limited to 4 directions; in the future we could incorporate the idea of the knob input from fighting games. The joystick can also be pressed and used as a functional switch button for interactive logic.
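A minimal sketch of distinguishing touch from press on the A button with OVRInput, which exposes separate Touch and Button enums for the same physical control; the logging is illustrative.

using UnityEngine;

public class TouchVsPress : MonoBehaviour
{
    void Update()
    {
        bool touched = OVRInput.Get(OVRInput.Touch.One);  // finger resting on A
        bool pressed = OVRInput.Get(OVRInput.Button.One); // A actually pressed

        if (touched && !pressed)
            Debug.Log("A is touched but not pressed");
        else if (pressed)
            Debug.Log("A is pressed");
    }
}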

The Life Loop project is a VR experience that helps users experience a full life. The game runs mainly on HTC VR equipment and Unity.

The main functions for users are:
1. Experience the life of a flower. At this stage the user cannot move; their viewpoint is the flower’s. Through the first-person controller view and the flower’s growth animation, the user experiences the flower’s growth, the perspective rising slowly from a seed in the soil upward. The user can interact with the environment, observe the surroundings, and hear its sounds: soil breaking, wind, rain, and birdsong. In coordination with the in-game environmental sound, 4D interaction is prepared outside the VR equipment, following the 4D movie model: shaking the chair when breaking through the ground, sprinkling water when it rains, blowing wind in spring, and using sunlamps to create a heat source overhead in summer, enabling users to go beyond the limits of a 3D experience and achieve a 4D one.

http://www.yd4dmax.com/product/4d-movie-theater/

2. Experience a human life. The flower is picked, the controller switches to the human perspective, and the player becomes a human who can move freely and explore the space of the human world. Three scenes are set up, representing infancy, adolescence, and old age, and the rooms are selected by clicking a controller button. A representative scene is built for each. Users can move and explore the scene as a human subject and interact with some items, such as viewing books and letters, switching lights on and off, and using crutches. When the experience of the three scenes is complete, a jump function returns the player to the first scene for looping.


Meeting minutes

The First Meeting:

Xinyue Lin:

The concept of this project is to turn the viewer into a plant, such as a tree or a flower. Participants will have the opportunity to see and feel life from a plant’s perspective rather than a human’s. For example, they can experience wind, rain, and sunshine, and watch bees gathering nectar, birds flying overhead, insects foraging, and more. Viewers will also witness the growth stages and life cycle of the plants, such as the elongation, flourishing, and withering of leaves. The project will also incorporate sound effects: for example, when an insect flies onto a flower, you can hear it clearly, and hear the sound getting further and further away as the insect flies off the flower.

First meeting


The Second Meeting:

Amber Zhang:

Feedback from the lecture about the group project (may not fully cover all feedback points):

Inspiration suggestion from Dr. Jules Rawlinson: presence vs absence? (How might the project achieve both concepts?)

Inspiration research areas suggested by Dr. Jules Rawlinson: academic research keywords “death and digital presence”, “VR grief representation resolution”.

Inspiration example from Dr. Jules Rawlinson: looking back (note: we were not fully sure this project name is correct).

Inspiration suggestion from Mr. Leo Butt: “death” could be repeatable in digital games; players could be reborn and learn from “death” in digital games?

Target audience: will our project still aim at the treatment of mental issues such as stress and grief?

Perspective idea of soul transition in our project? Becoming the soul of plants and humans?

Immersive technology installation agenda? AR/VR projectors, real simulated physical effects: rain, sunlight?

The Third Meeting:

Xiyue Huang:

Idea about ‘Death’ & ‘Presence’

What is presence? I thought of its antonym, death, and of what existence is. The conceptual inspiration comes from Coco, an animated movie about death: although people are dead, they have existed before, and if their memories are remembered by others, they are not really dead; if they are remembered by everyone, that is presence. The real passing is not death, but being forgotten.
Purpose: to achieve a spiritual healing effect and heal the soul. Death is a part of life; people always ignore its existence and cannot believe it. But one thing is clear: we all have to face it and experience it.

I want to set up a memory palace scene. The overall space is clean and simple, with part of the VR space in front of the player. For now there are four rooms, each representing the memory of a dead person. Through the interactive experience within a room, players can feel what kind of person its owner was. We use sound to express the human state, visualizing the sound as a waveform.

Model style / Keywords: fantasy, simplicity, color, line


After the third meeting, the professor questioned the death theme: compared with the flower theme, it seemed insufficient for users to participate immersively. The antonym of presence is absence, and rather than making the player feel presence, this design would make the player’s experience absent. We finally settled on the other theme.

