
Group 2_Final Report

Summary

Link to all our blogs for Submission Two: Posts ‹ dmsp-process23-Submission 2-Group 2
Since several blogs are co-edited, please check their titles, which mark the co-writers.
Our Project Video:

Link to the video above: Presentation Video_ Google Drive

Live Demo: Live Demo *2_ Google Drive

Links to Source Files:

MaxMSP_Projection: Max Projection_ Google Drive

MaxMSP_Sound:  MAX Sound_ Google Drive

TouchDesigner: Touch Designer_Google Drive

MadMapper: MadMapper_Google Drive

Video: Video of the movement of the characters_ Google Drive

Group 2_Teamwork_Future Improvement

The content of this blog consists of the work of  Rudan, Jin, Yuan and Jaela.

Space

Although we understood that the presentation space (E15A) is neither large nor tidy enough for our installation, we still chose it for the updated project: several rooms, such as C09 and C02, were booked for the degree show, and others, such as C08 and E21, were unavailable during the spring vacation. Therefore, we used a large black curtain to cover the floor.

A possible future improvement would be a more suitable room for the audience to walk around in. The room would also need to be light-tight enough for projection and the right size for the music to fill it.

In the future we hope to find a room large enough that the projected water ripples and the TouchDesigner interaction can fill it. Max would still capture the characters' movement, but the resulting changes would be visually more dramatic, and the effect the characters have on the image would become more visible.

Installation

The current installations look too crude. In the future we can improve the visual effect in terms of the beauty of the installations, the neatness of the environment and the refinement of every aspect.

Interaction

Although we enriched the ways of interacting, the experience could still be regarded as insipid. It would be better to make the two video walls themselves interactive rather than just showing videos with masks; for example, the audience could be captured and used as the human mask.

Beyond this, the current interaction is still between the person and the wall, or the person and the installation; we want the movement of a person to affect the whole room. For example, the floor could ripple under people's steps as they walk, and the water ripples on the walls could change with how many people are talking in the room, the volume of the sound, and so on.

We expect future interaction to be dual: between people and people, and between people and things.

Music

We added another piece of music, but the main part remains the same. The original piece was generated to vibrate the water, so it has a strong sense of rhythm. It would be better to change the style and atmosphere from sci-fi music to a sound that conveys pleasantness and peace.

Conveying core ideas

We have been working on this problem for a long time, but the final result is still not satisfactory. Apart from the introduction before watching and the explanation from us as the designers, it seems difficult for the audience to understand what we want to convey and the metaphor behind each piece of content. In future projects, we would prefer to learn from more projects that can be easily understood.

Group 2_Update of Interaction in TouchDesigner with Kinect

Based on our meeting on April 19, we decided to make the installation more interactive and playable. I found some video tutorials on using TouchDesigner with Kinect and Leap Motion on YouTube. Since a Leap Motion is not available from the university, I booked a Kinect camera from the music store.

From my perspective, the theme and style need to stay consistent with our project, which means they should relate to water and water-like imagery. Therefore, I prefer to create a water-painting effect using the feedback function and the Slope TOP.
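The effect itself is built as a node network rather than written code, but the idea behind it can be sketched. The C++ sketch below is only an illustration of the feedback-plus-slope principle, not our actual TouchDesigner network: each frame the previous canvas is faded slightly, a soft brush is stamped at the current hand position, and a slope (gradient) pass turns the accumulated trail into a relief that reads like wet paint. The resolution, decay rate and brush size are made-up values.

// Illustrative sketch of the feedback + slope idea behind the water-painting
// effect (not the actual TouchDesigner network).
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

const int W = 320, H = 240;        // canvas resolution (assumed)
const float DECAY = 0.97f;         // feedback decay: how slowly the trail fades
const float BRUSH_RADIUS = 12.0f;  // soft brush size in pixels (assumed)

// One frame of the loop: fade the previous canvas, then stamp a soft brush.
void feedbackStep(std::vector<float>& canvas, float bx, float by) {
    for (float& p : canvas) p *= DECAY;                  // analogue of the feedback/level stage
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            float d = std::hypot(x - bx, y - by);
            if (d < BRUSH_RADIUS) {
                float soft = 1.0f - d / BRUSH_RADIUS;    // soft-edged circle
                int i = y * W + x;
                canvas[i] = std::fmin(1.0f, canvas[i] + soft);
            }
        }
}

// Analogue of the Slope TOP: finite differences give the embossed, painted look.
float slopeAt(const std::vector<float>& canvas, int x, int y) {
    int xm = std::max(x - 1, 0), xp = std::min(x + 1, W - 1);
    int ym = std::max(y - 1, 0), yp = std::min(y + 1, H - 1);
    float dx = canvas[y * W + xp] - canvas[y * W + xm];
    float dy = canvas[yp * W + x] - canvas[ym * W + x];
    return std::sqrt(dx * dx + dy * dy);
}

int main() {
    std::vector<float> canvas(W * H, 0.0f);
    // Simulate a hand moving across the canvas for 60 frames.
    for (int frame = 0; frame < 60; ++frame)
        feedbackStep(canvas, 40.0f + frame * 4.0f, 120.0f + 30.0f * std::sin(frame * 0.2f));
    std::printf("slope at centre: %f\n", slopeAt(canvas, W / 2, H / 2));
    return 0;
}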

Tutorial Link: Water painting Effect

Water Painting Part Effect:

Kinect Camera Connection

I use the Kinect v1 TOP and the Kinect CHOP, with select operators, to get the left- and right-hand data from the camera. I then drag this data as a CHOP reference onto the position of the water-painting circle.

I then combined this part with the sound visualization: the sound-visualization layer now appears wherever it is covered by the circle.
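The exact channel names coming out of the Kinect CHOP depend on the setup, so the short C++ sketch below only illustrates the two mappings described above, assuming the hand position arrives normalized to a 0–1 range: the hand coordinates are remapped to the circle's centre, and the circle acts as a mask that reveals the sound-visualization layer underneath.

// Illustrative sketch of the CHOP-reference idea: a normalized Kinect hand
// position drives the circle's centre, and the circle reveals the sound
// visualization wherever a pixel falls inside it. Values are assumptions.
#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };

// Hand position assumed to be normalized to 0..1 (u, v); remap to a
// -0.5..0.5 centre space and flip v so "up" on camera is "up" on screen.
Vec2 handToCircleCenter(float u, float v) {
    return { u - 0.5f, 0.5f - v };
}

// Reveal rule: show the sound visualization only where the pixel lies
// inside the circle drawn at the hand position.
bool revealsSoundViz(Vec2 pixel, Vec2 center, float radius) {
    float dx = pixel.x - center.x, dy = pixel.y - center.y;
    return std::sqrt(dx * dx + dy * dy) < radius;
}

int main() {
    Vec2 c = handToCircleCenter(0.7f, 0.3f);   // e.g. right hand, upper right of frame
    Vec2 p = { 0.15f, 0.22f };                 // an arbitrary pixel in centre space
    std::printf("circle centre (%.2f, %.2f), revealed: %d\n",
                c.x, c.y, revealsSoundViz(p, c, 0.1f));
    return 0;
}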

Group 2_Teamwork_Group Meeting about Project Update

The content of this blog consists of the work of  Rudan, Jin, Yuan and Jaela.

Following Philly's and Jules's feedback on our presentation and exhibition in ECA Main Building E15A on April 19, I noted their points along with some corresponding ideas and held a meeting with my team members.

Feedback Record:

From Jules:

The sound generated by MaxMSP is an active feedback process: the waves travel across the water’s surface, hit the other end, and then come back again.

  1. The sound should be louder to avoid breaking up. The speaker is not big enough to produce the high frequencies and create the vibration. Also, interaction by clapping or making other noise is not continuous; try to actively use feedback to keep it stable.
  2. Refer to Chladni patterns.
  3. Use sound or fans to create ripples on the surface, and also make the strings move.
  4. Animate the videos so the character moves from one screen to another; the person can walk around, disappear and so on. Merge the three screens.
  5. The music feels like sci-fi music rather than creating a positive, pleasant atmosphere. Add a bit of variety.

 

From Philly:

  1. Work in a slightly bigger, more navigable space that allows the audience to walk around; present the piece as a more walkable, immersive installation.
  2. Place objects or interactive elements within the space.
  3. The video materials did not effectively communicate the themes of the piece. Create significantly more variety and use more abstract imagery.
  4. The TouchDesigner aspect needs more variation.
  5. Bring the proposed theme out more through the video content and the evolution of the music or visuals.

I summarized the feedback given by Jules and Philly, categorized it into several parts, and organized a meeting on April 19 to discuss the update plan.

I found some references, as follows:

After the meeting, we decided to make the following improvements:

  • Yuan will generate more peaceful and pleasant sound pieces with GarageBand and other software.
  • Jin will keep working on the digital ripple in TouchDesigner and create a new mode of interaction for the audience: a water-painting effect driven by gesture, based on the Kinect camera and the feedback function. In addition, Jin will prepare the script and voice-over for the final video.
  • Rudan will prepare the new material, refine the video of the characters walking around, and edit the final video.
  • Jaela will prepare the video recording and editing.

Reference

Abstract speech visualisation – touchdesigner tutorial 45 (2021) YouTube. YouTube. Available at: https://www.youtube.com/watch?v=1FgJ842dyr4 (Accessed: April 27, 2023).

Touchdesigner Tutorial edge feedback (2021) YouTube. YouTube. Available at: https://www.youtube.com/watch?v=hve2UbKgJ9s (Accessed: April 27, 2023).

Group 2_Teamwork_Mar 29_TouchDesigner, AE and MadMapper

The content of this blog consists of the work of  Jin and Yuan.

______Jin______

There are two aspects I would love to talk about in this week’s blog: one is the choice of space for our projection and installation, the other is enriching the effects.

During the past two weeks, we kept testing studio spaces and rooms to find a place where we can achieve the effects we want (sufficient projection distance and shading).

First we tested E15A and E15B in the ECA Main Building. Because of the size of the rooms, it is hard to project the frame fully onto the wall. On the positive side, they are dark enough, and the sound can fill the whole room.

We used TouchDesigner to create colorful ripple effects. We found that when we placed a collage made of PVC pieces and laser paper in front of the projector, or changed the projector’s focus to blur the picture, the final effect combined better with the real-time water ripples.

simple device

Then we tried a photographic studio at Evolution House, where we used a camera to capture the overlay of the virtual TD ripples and the real-time ripples. We then used the captured picture as the content of the human mask (to represent the person’s internal emotions/heart).

The effect did not meet our expectations.

Also, when we projected the TouchDesigner output, the software border could not be hidden, which affected the audience’s experience.

Therefore, I want to use Madmapper to control the frame size and shape the overlay effects.

First I tried to create the mask in MadMapper, but it did not work. So in After Effects I created a mask at the position of the heart, which follows the movement of the person; the layer below can now be seen through the mask.

I pre-recorded the TD ripples to test the effect, and that went well, so I then worked on making it a live input. I tried using an NDI camera to capture the screen, but it did not work.

Then I found that TouchDesigner has a mapping function similar to MadMapper’s, so I tried Kantan Mapper, but it had the same problem: the full-screen function did not work.

After that, I used Syphon to connect TD and MadMapper.

Then I tried to use the corner to show the inner world and external appearance.

______Yuan______

After these two weeks of constant adjustments and modifications, a version has now taken shape. In terms of sound design, my initial idea was to make ambient music, but after testing it I found the sound was too smooth and did not highlight the rhythm of the characters on screen.

Screen recording 2023-03-30 01.44.32

So I added sound effects to the design. I used four samples for the ambient soundtrack, drawn from the water and ambient sounds we recorded outside last week, plus three dotted tones, using footsteps and water drops to make a soundscape. To match the image, you also need to take into account the rhythm, beat, pitch, timbre, mood and theme.

In our work, we show three sections of people walking, running and galloping to map the inner musical combination of the characters. Since walking, running and galloping each have a certain rhythm, we matched the footsteps to the rhythm of the drums. Ambient sound effects are used to enhance the music and make it more relevant to the theme.

Combining TouchDesigner’s audio visualization with the installation’s water ripples:

In our tests we used TouchDesigner to pick up the audio from Max and make the water drops change, creating an effect that matches the rhythm and intensity of the audio.

Group 2_Teamwork_Jin&Rudan_Mar 19_Touch Designer

Week 8

Based on the content of this week's tutorial with Jules and his advice to us, we have made further improvements to our design. By using TouchDesigner and Max, we have increased the interactivity of the design and improved its visual effect. Rudan and Jin were responsible for the TouchDesigner part, and Jaela and Yuan for the Max part.

Design Development

Our design concept is a ‘human lifetime’, and the implementation is still water, sound and installation. But we wanted ‘audience action/sound, installation and visuals’ to interact with each other and change in real time with the audience’s movement and sound, so we introduced TouchDesigner and Max. The interaction flow is shown in the diagram below:

(This image was drawn by Rudan Zheng)

We plan to show different effects on the two walls: one wall shows people’s inner world, and the other shows people’s behavior and actions. The two walls are displayed together with the installation and the projector screen, which can interact with the audience in real time.

It can be understood that we are showing a “person”: the voice and behavior of every audience member who comes to watch this “person” has an impact on them, and this influence changes the shape of the projection on the wall.

Rudan & Jin_Touch Designer

Reference:

By following some TD tutorials on YouTube and testing, we achieved real-time visualisation of sound. The input can be translated into flowing water and changes based on its rhythm and strength.

Rudan’s First Test

Touch Designer_Jin

Jin’s Second Version①

Jin’s Second Version②

I used VB-Audio Virtual Cable as the input and output for sound. This means we can change the music at any time simply by changing the track in Spotify or YouTube, without importing music files into TouchDesigner and using a Switch to change between them. It can also handle sound from the microphone, environmental sound and music played in the space, and generate animations, which we would like to project onto the water ripples created by our installation.

In addition, the colors and shapes of the water ripples in TD can be changed through the Ramp and Displace operators respectively (details can be found in the video and pictures ① and ②). The size of the ripples can be changed with the multiply parameter of a Math operator, and the volume of the music can also be adjusted flexibly.
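As a rough illustration of how the incoming audio can drive the ripple size (not the exact operator chain in our project file), the sketch below reduces each block of samples to a smoothed loudness value and multiplies it into a base ripple size, the same idea as feeding an analyzed level through a multiply parameter. The block size, smoothing factor and gain are assumptions.

// Illustrative audio-reactive scaling: RMS loudness of each sample block is
// smoothed and multiplied into the ripple size. Values are assumptions.
#include <cmath>
#include <cstdio>
#include <vector>

// Root-mean-square level of one block of samples (roughly 0..1 for normalized audio).
float rmsLevel(const std::vector<float>& block) {
    float sum = 0.0f;
    for (float s : block) sum += s * s;
    return block.empty() ? 0.0f : std::sqrt(sum / block.size());
}

// One-pole smoothing so the ripple size does not jitter from frame to frame.
float smooth(float previous, float current, float factor = 0.9f) {
    return previous * factor + current * (1.0f - factor);
}

int main() {
    float smoothedLevel = 0.0f;
    const float baseRippleSize = 1.0f;   // baseline of the multiply parameter
    const float audioGain = 4.0f;        // how strongly loudness scales the ripples

    // Fake a few audio blocks; a real program would read them from the audio device.
    for (int i = 0; i < 5; ++i) {
        std::vector<float> block(512, 0.1f * (i + 1));   // increasing loudness
        smoothedLevel = smooth(smoothedLevel, rmsLevel(block));
        float rippleSize = baseRippleSize * (1.0f + audioGain * smoothedLevel);
        std::printf("block %d -> ripple size %.2f\n", i, rippleSize);
    }
    return 0;
}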

One obvious problem is the sound quality, and so far I have no idea how to improve it.

Group 2_Teamwork_Analysis and Reflection of other Installation Projects with Arduino

The blog content is by Rudan in collaboration with Jin.

Blog Background:

Following Philly's feedback on our Submission One, we decided to reflect on our initial ideas and concepts and analyze other installation projects, in order to gain a deeper understanding of the wider context.

In addition, after discussion, our group was divided into two sub-groups for further research and a clearer division of labor. Rudan and Jin are responsible for the more detailed Arduino coding and the process of building the installation, while Jaela and Yuan are in charge of music generation with Max. In this blog, Rudan and I (Jin) reflect on our Submission One based on our research into other installations using Arduino and lights.

Rudan:

Project One: Plant Sensory Visualization

【Notes: On the rational integration of installation design and theme】

https://www.bilibili.com/video/BV1LL4y1T7oG/?share_source=copy_web&vd_source=a8711564270b15c0c52431a25b613a14

Case concept introduction:

Plants are often overlooked, but they live in symbiosis with us. They collect invisible information about nature and express it through their own state – the rich color of their leaves, their uplifted or drooping form – and they are an important part of our lives.

This project aims to explore the possibilities of human-plant interaction by ‘touching’ plants through art installations and ‘observing’ their data – light, water, and emotion. This will help people to understand what plants know and feel, and to deepen their connection with them.

【Notes: There is a certain similarity to our concept. The design uses the Arduino and the installation to show the life of a plant, and our design uses water to show the life of a person. Through this design, I thought about the possibility of using devices to complement the concept of ‘human life’.】

Components worth studying:
  • How the touch and humidity sensors pick up data and import it into the device (data transfer via the Arduino only).
  • How the plant's state is represented figuratively.
  • How the device interacts with people.
Arduino:

The installation is divided into two parts: the moss and the device (blue optical fiber). A moisture sensor obtains soil-moisture data, which is fed into the Arduino, and the number of blue light dots on the optical fiber increases accordingly. In this way, the soil moisture is visualized.

A touch sensor makes the red light spots on the optical fibers increase when a person touches the moss.
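The project's own code is not shown in the source video, but the mapping it describes could look roughly like the Arduino sketch below. The pins, thresholds and the use of an addressable (NeoPixel-style) strip standing in for the optical fiber are assumptions made purely for illustration.

// Illustrative Arduino sketch of the mapping described above (not the
// original project's code): soil moisture sets how many blue dots light up,
// and a touch sensor adds red dots. Pins and thresholds are assumptions.
#include <Adafruit_NeoPixel.h>

const int MOISTURE_PIN = A0;   // analog soil-moisture sensor
const int TOUCH_PIN    = 2;    // digital touch-sensor module
const int LED_PIN      = 6;    // data pin of the addressable strip
const int NUM_LEDS     = 30;

Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  pinMode(TOUCH_PIN, INPUT);
  strip.begin();
  strip.show();
}

void loop() {
  int moisture = analogRead(MOISTURE_PIN);               // 0..1023
  int blueDots = map(moisture, 0, 1023, 0, NUM_LEDS);    // wetter soil -> more blue dots
  bool touched = digitalRead(TOUCH_PIN) == HIGH;

  strip.clear();
  for (int i = 0; i < blueDots; i++) {
    strip.setPixelColor(i, strip.Color(0, 0, 150));      // blue dots for moisture
  }
  if (touched) {
    for (int i = 0; i < 5; i++) {                        // a few red dots appear on touch
      strip.setPixelColor(random(NUM_LEDS), strip.Color(150, 0, 0));
    }
  }
  strip.show();
  delay(100);
}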

【Notes: The installation visualizes the state of the plant in a fiber-optic variation. In our design, the concept of the ‘human life cycle’ was intended to be presented through the changing projection of water on the wall, but this did not seem to be a direct way of making the viewer appreciate our intentions. We thought about the possibility of using some sensors to make the projection of the water change and at the same time make the wall appear to represent the stages of a person’s life (child, middle-aged, elderly).】

Project Two: Mirror World

【Notes: On the use of the sense of fragmentation in the expression of meaning.】

https://www.bilibili.com/video/BV1q14y1h74J/?spm_id_from=333.788.recommend_more_video.0&vd_source=6fa38bbeeec457ffe4bd5ac5a5e3fb8e

Case concept introduction:

The materials are mirrors, acrylic panels, acrylic columns and UV adhesive, and the model is entirely handmade. It is a visual art display built from these materials.

【Notes: The sound and nature of the material can be used wisely to achieve unexpected visual effects.】

Components worth studying:
  • Visual effects.
  • Sound effects.

【Notes: Consider whether the sounds and effects of the material can be used to further highlight the changing ‘stages of life’.】

– Thinking about the development –
  • Using the principle of pinhole imaging, the projection of ‘a human life’ on the wall will change.

https://byjus.com/question-answer/draw-a-labelled-diagram-to-show-the-formation-of-an-image-of-a-tree-by/

For example, in our design we use wall projections to present the three stages of a person’s life, i.e. in the first stage, the projection of a child appears on the wall and the colorful ripples produced by the water surface are reflected in the child’s projection; in the second stage, the projection switches to a middle-aged person and the colorful ripples reflected in the middle-aged person’s projection change; in the third stage, the projection switches to an elderly person and the ripples change with the music.

(This image was drawn by Rudan Zheng)

  • Transforming the pinhole-imaged picture using the kaleidoscope principle, achieved with a mechanical gear mechanism.

Conception: the weight of the water is sensed by a pressure sensor, and the rotation of the gear is controlled by the change in weight (the fall of the steel ball). The pinhole-imaged picture is attached to the gear and changes as the gear rotates; a rough sketch of this sensing-to-rotation mapping follows the case reference below. Case reference:

https://www.bilibili.com/video/BV1oo4y1677q/?spm_id_from=333.1007.tianma.4-4-14.click&vd_source=6fa38bbeeec457ffe4bd5ac5a5e3fb8e
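A very rough Arduino sketch of this conception is given below; the pins, threshold and the use of a servo to turn the gear are all assumptions rather than a worked-out design.

// Rough sketch of the conception above: a pressure sensor under the water
// vessel detects weight changes, and a servo steps the gear that carries the
// pinhole image. All values are assumptions.
#include <Servo.h>

const int FSR_PIN   = A0;          // force-sensitive resistor / pressure sensor
const int SERVO_PIN = 9;
const int CHANGE_THRESHOLD = 20;   // how big a weight change counts as an event

Servo gearServo;
int lastReading = 0;
int gearAngle = 0;

void setup() {
  gearServo.attach(SERVO_PIN);
  lastReading = analogRead(FSR_PIN);
}

void loop() {
  int reading = analogRead(FSR_PIN);            // 0..1023, rises with weight
  if (abs(reading - lastReading) > CHANGE_THRESHOLD) {
    gearAngle = (gearAngle + 30) % 180;         // step the gear on each weight change
    gearServo.write(gearAngle);                 // rotating the gear swaps the image
    lastReading = reading;
  }
  delay(50);
}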

  • Combine fragments of highly reflective materials such as broken mirrors and glass with the reflection of light to make it more complex and achieve a more striking visual effect.
  • Consider incorporating the sound of breaking mirrors and glass into the soundtrack of the ‘middle age’ stage to convey the idea that life is not always easy.
– Arduino (distance sensor coding) –

CS:

Arduino:
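The original code screenshots are not reproduced here. As a stand-in, a typical distance-reading loop for an HC-SR04-style ultrasonic sensor might look like the sketch below; the sensor model, pins and serial output are assumptions rather than the project's actual code.

// Stand-in sketch for the distance-sensor part: trigger an HC-SR04-style
// ultrasonic sensor and convert the echo time to centimetres.
const int TRIG_PIN = 9;
const int ECHO_PIN = 10;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  // Send a 10-microsecond trigger pulse.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // Echo time (microseconds) -> distance in centimetres.
  long duration = pulseIn(ECHO_PIN, HIGH);
  float distanceCm = duration * 0.0343f / 2.0f;

  Serial.println(distanceCm);   // e.g. to trigger a reaction when a visitor comes close
  delay(100);
}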

