
Exhibition Feedback

Sound Design Feedback from Jules

  1. Speaker Setup:

Jules suggested that the speaker arrangement could be more “irregular” to align with the innovative nature of the exhibition. We can reorient the four speakers to “pair” differently with West Court’s built-in stereo system. For instance, adjusting them so that the field recordings are heard predominantly from the right, while the data sonifications emerge from the left. Additionally, changing the direction of the speakers will enable us to capture both the direct sound and its reflections off the venue’s surfaces, adding a dynamic layer of auditory variation.

  2. Spatial Mixing:

Jules also recommended considering spatial mixing to better replicate the LiDAR scanning process. Currently, our data sonifications are set to stereo output. However, enhancing this setup so that the RGB sounds emanate from varying heights, or having the high-tech LiDAR sounds orbit around the listener, could significantly enrich the auditory experience. Such adjustments would not only simulate the three-dimensional aspect of LiDAR scanning but also create a more immersive and engaging environment for the audience.
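One way the orbiting effect could be approximated over our four extra speakers is equal-power panning, sweeping the source azimuth over time. This is only a sketch of the idea, not the actual exhibition patch; the speaker angles and the cosine taper are illustrative assumptions, and a full implementation would use VBAP or an Ambisonic panner.

```python
import numpy as np

def orbit_gains(azimuth_deg, speaker_angles_deg=(45, 135, 225, 315)):
    """Equal-power gains for a source at `azimuth_deg` over a quad layout.

    A simple cosine taper between the nearest speakers; a rough
    approximation of VBAP-style panning, not a full implementation.
    """
    az = np.radians(azimuth_deg)
    gains = []
    for sp in speaker_angles_deg:
        # Angular distance between source and speaker, wrapped to [-pi, pi]
        diff = np.angle(np.exp(1j * (az - np.radians(sp))))
        # Cosine taper: full gain when aligned, zero beyond 90 degrees
        gains.append(max(0.0, np.cos(diff)))
    g = np.array(gains)
    return g / np.linalg.norm(g)  # normalise for constant total power

# Sweep the source once around the listener over 10 seconds
for t in np.linspace(0, 10, 5):
    print(round(t, 1), np.round(orbit_gains(36 * t), 3))
```

Driving `azimuth_deg` from a clock (or from the scan data itself) would make the sonification appear to circle the audience.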

 

Visual feedback

Visitors praised the use of space throughout the exhibition and the clear arrangement of its diverse elements. Particularly effective was the central installation of suspended white paper, which commanded attention and gave the space a sense of cohesion and intrigue. The interactive projects, such as those showcasing the Edinburgh College of Art (ECA) and Dean Village, stood out among visitors, earning widespread recognition for their engaging and immersive qualities.

 

Furthermore, Jules highlighted the potential for a closer symbiosis between the projects and the exhibition venue itself, suggesting that the layout of the venue could be aligned with that of the ECA projects to foster seamless integration and heighten audience immersion. By mirroring the configuration and ambiance of the projects within the exhibition space, attendees could be drawn into an experiential journey beyond mere observation, deepening engagement with and appreciation of the showcased content.

Pre-Exhibition

Venue Setup

On April 4th, we held our exhibition at West Court, designed to let visitors experience Edinburgh anew, interpreting the city through various media driven by LiDAR data.

Figure 1: Overall venue setup top view

Figure 2: Overall venue setup left view

Figure 3: Overall venue setup isometric view

Video Setup:

To create an immersive experience, we projected four different videos onto the room’s four walls. The front and back walls, which were plain white, served as direct projection surfaces. For the uneven side walls, unsuitable for projection, we installed large displays to ensure clear image quality. Two of these displays were interactive, equipped with stands holding a keyboard and mouse for gaming and Leap Motion Controllers for visual interaction.

Figure 4: Video setup

 

Audio Setup:

The audio component of the exhibition used West Court’s built-in stereo speakers, complemented by four additional speakers. The existing speakers broadcast the field recordings, while the four additional speakers, arranged in two stereo pairs, played the data sonifications of Vennel Steps and Dean Bridge. Positioned centrally, these speakers acted as an auditory link between the different videos, enhancing the immersive experience.

Figure 5: Audio setup

 

Light Setup:

We transformed West Court into a darkened environment by turning off all interior lights, which made our monitors and projections prominently visible in the darkness. Brightness was strategically used to guide visitors through the venue. Additionally, display lights were installed around the edges of a central mirrored table, shining outward to prevent the space from being too dark while accentuating the 3D prints placed in the center.

Figure 6: Light setup

 

Paper Setup:

We utilized the room’s existing pipes as anchor points to stretch thin ropes across the space, with paper sheets hung on them to visually encapsulate visitors within layers of data. Papers were also placed on the floor to guide visitors along a designated path and wrapped around various stands, enhancing the theme that the artworks were composed of scanned data.

 

iPad & 3D Printing Setup:

At the entrance, an interactive iPad displayed the scanned model of Vennel Steps, offering visitors an initial glimpse into the LiDAR scanning results. The 3D prints were displayed on a mirrored table at the center of the venue, where the reflection from the mirrors helped focus attention on the intricate details of the 3D models, effectively showcasing the blend of technology and artistry.

Figure 7: Paper and 3D printing setup

Equipment List

Audio Equipment:

  • 1 laptop
  • 4 Genelec 1029 speakers (with stands)
  • RME – FireFace UC
  • Cables

Video Equipment:

  • 4 laptops
  • 1 Bluetooth mouse
  • 1 Bluetooth keyboard
  • 1 Leap Motion Controller
  • 2 Projectors
  • 2 Televisions

Other Equipment:

  • 1 Music Stand (for iPad display)
  • 4 common stands (for keyboard, etc)
  • 1 mirror table
  • 6 exhibition lights
  • Glue and tape
  • Rope (for hanging paper)

Figure 8: Audio equipment

Figure 9: Video equipment

Figure 10: Stand

Overall Introduction

The inspiration for Hyperobject X Edinburgh comes from Timothy Morton’s concept of the Hyperobject. This concept views the entire Earth as a whole and elevates the status of objects to the same level as humans. Therefore, in this installation art, we elevate the Place within the Hyperobject to a position equal to that of humans. As people look at LiDAR data, they also become like LiDAR, scanning the entire Place within Hyperobject Edinburgh.

Heidegger discusses world and earth in “The Origin of the Work of Art”: the earth is the foundation on which the world is revealed, and the world is displayed through the earth. Thus, in our installation art, Data is our earth, the basis of all our works, without which they could not be created. At the same time, this Hyperobject Edinburgh is displayed in a new way because of the existence of Data.

In Hyperobject X Edinburgh there are seven works in total: paper that materializes LiDAR data, interactive installations of ECA and Dean Village, video installations of EFI and Vennel Steps, a 3D-printed work depicting movement and stillness in a superimposed state, and a 3D-printed work of a Scottish Blackface sheep.

The paper materializing LiDAR data is the heart of the entire installation: it not only connects the works across the various Places but also leads people to reconsider this digital data, asking whether, from the perspective of LiDAR, Edinburgh under the Hyperobject is merely a set of numbers.

Figure 1: Paper project

 

ECA and EFI, as parts of the University of Edinburgh, not only represent various aspects of our lives but also symbolize a vision for the future world. Vennel Steps and Dean Village are more like representatives of the city, each connecting different places within Edinburgh. Vennel Steps, linking Lauriston Place and West Port, provides a beautiful view of Edinburgh Castle thanks to its location.

Figure 2: ECA project

Figure 3: EFI project

Figure 4: Dean Village project

Figure 5: Vennel Steps project

 

The only 3D print modeled on a human brings the perspective into a superimposed state of motion and stillness, illustrating that humans are inherently limited and can be influenced by other factors, thanks to the existence of this superimposed state. The Scottish Blackface sheep toy, wearing a hat it seemingly ‘created’ itself, represents Scotland: the hat is whimsically made from the sheep’s own LiDAR data.

Figure 6: Human project

Figure 7: Scottish Blackface sheep project

 

Hyperobject X Edinburgh: Main Object, Human Object, Small Object

Main object

Our installation will be set up in West Court, maximizing the use of this space.

Fig 1-rendering of main object

Fig 2-rendering of main object

In the entire space, the most important media of the Main Object are paper and sound. Paper not only gives data a physical entity in the real world but also visualizes the data, allowing the audience to see, touch, and smell it. Therefore, we have ordered six pieces of paper measuring 31cm by 1000cm, and the data will be scrambled, randomly sorted, and assigned different font sizes.

Fig 3 - Data paper in practice

Fig 4 - PDF of the data paper

Among these data, 20% are set significantly larger while 80% are of a more regular size, greatly enhancing the visibility of the data. This 60-meter run of paper will become the main body of the Main Object, and other data-bearing papers will appear throughout the space: affixed to walls, laid beneath the object as a map, and so on.
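The scrambling and the 20/80 size split described above could be scripted before typesetting. A minimal sketch, assuming the data arrive as lines of coordinates; the point sizes and sample values are placeholders, not the ones used for the actual print:

```python
import random

def layout_data_lines(values, large_ratio=0.2, seed=42):
    """Shuffle data values and tag 20% with a large font size, 80% regular.

    Returns (value, font_size_pt) pairs ready for typesetting; the point
    sizes here are placeholders, not the exhibition's actual sizes.
    """
    rng = random.Random(seed)
    shuffled = list(values)
    rng.shuffle(shuffled)
    n_large = int(len(shuffled) * large_ratio)
    # First n_large entries after the shuffle get the larger size
    return [(v, 36 if i < n_large else 12) for i, v in enumerate(shuffled)]

# Hypothetical point-cloud lines standing in for the scan export
sample = [f"{x:.3f},{y:.3f},{z:.3f}" for x, y, z in
          [(1.2, 3.4, 0.5), (2.0, 1.1, 9.8), (0.3, 7.7, 2.2),
           (5.5, 0.1, 4.4), (8.8, 6.6, 3.3)]]
for value, size in layout_data_lines(sample):
    print(f"{size}pt  {value}")
```

Fixing the random seed keeps the layout reproducible across the six printed strips.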

The sound will be divided into Diegetic and Non-Diegetic parts, with the Diegetic part comprising the soundscape. These sounds will permeate the entire space, serving as a key leading people into the hyperobject Edinburgh: they create a realistic atmosphere, generating the illusion of being in the city. The sound design of the Non-Diegetic part is even more crucial, as it is the key that leads people into the world of Data.

 

Human Object

John Cage played a trick in 4’33”: the audience expects a performance from the performer during this duration, while in reality the audience itself becomes part of the work. We will play a similar trick within this Human Object. The trick consists of two parts: paper and 3D print. We will print a sheet of paper saying “Please listen carefully”, misleading the audience into listening for sound from the print. In reality, there will only be ambient sounds within the space; the 3D print itself produces no sound.

The 3D print part represents me, or rather a fusion of myself in both static and dynamic states, symbolizing a superimposed state. Together, our group covered me with black plastic bags and aluminum foil and scanned me with a smartphone’s LiDAR. The scanning results were imported into CloudCompare for processing and editing, exported as a new model, then imported into Cinema 4D for further modeling and editing, and finally exported in a 3D-printable format and uploaded to uCreate for 3D printing.

Fig 5 - Static state and dynamic state in CloudCompare

Fig 6 - Static state and dynamic state in CloudCompare

Fig 7 – Human object in C4D

Fig 8 – Human object 3D print in PrusaSlicer

Small Object

Marcel Duchamp transformed a urinal into ‘her’ work “Fountain”; similarly, a Scottish Blackface can become our work, but unlike a regular Scottish Blackface, it is one wearing a hat. This hat is made through 3D printing, and it also represents the sheep’s scanned data. After scanning the Scottish Blackface, we processed the data to transform it into a hat, presenting both the real Scottish Blackface and the data Scottish Blackface simultaneously, in various hat sizes, of course.

Fig 9 - Data Scottish Blackface in CloudCompare

Fig 10 - Data Scottish Blackface in CloudCompare

Fig 11 - Data Scottish Blackface Hat in C4D

Fig 12 - Data Scottish Blackface Hat in PrusaSlicer

Plans for the next stage – Hyperobject

HYPER-OBJECT X EDINBURGH

MULTI-MODAL INSTALLATION

Concept

Hyperobjects is a book by Timothy Morton, in which he uses global warming as an example to explain in detail what a hyperobject is. In the book he suggests that a “Hyperobject may be a map of high-dimensional space”. From another perspective, a Hyperobject exists around us in a form that cannot be fully seen; due to human limitations, we cannot see all of it. Just as La Niña is a footprint of global warming, it is time, it is history, it is also a Hyperobject. The Hyperobject does not occur in spacetime but emits spacetime from within itself to be seen. So how can we capture the Hyperobject in the city of Edinburgh through LiDAR?

The Tao Te Ching mentions a phrase, “the largest object is shapeless”, so we can regard the city of Edinburgh as a Place in which various Hyperobjects exist. Because a Hyperobject, like time and wind, cannot be directly seen but only indirectly seen or felt, LiDAR is like a new eye, re-examining everything in this Place on our behalf and finding evidence of the Hyperobject’s existence.

Idea-What we will do

The vast place of Edinburgh will become the location of our activities. In this place, we use Lidar to collect and scan data, and then edit and process these data through software. We will create new works based on these data. These created works are no longer limited by specific physical space, they can appear anywhere in Edinburgh, and each of them is both individual and unified. Individually, they exist as independent works. Unified means that these works are all created based on the same data. Therefore, we will design a main work and many other small works to be placed in different spaces, making them a part of the whole Edinburgh, a part of the Hyperobject, and a part of this spacetime.

OUR PROJECT

We want to create 12 different hyperobjects with the help of 12 different phenomena and 12 different scans.

Through the LiDAR scans, Edinburgh is no longer a place but data. We aim to showcase this data in all forms and shapes, using software and technology to manipulate it, showing the past, present, and future of the same place in various forms.

It is left to the human senses to make sense of it and put the whole place together: different human senses are awakened in different places but combine into the feel of a single place (Edinburgh).

OBJECTS

1. Sonification of the acquired data.

The data obtained would be sonified and presented to the audience as audio through small speakers.

2. Physical state

The transition of a place into a form transcending past, present, and future would be 3D printed and displayed physically.

3. Virtual

The data would be manipulated visually creating an animation along with specific sound effects.

4. Raw data

The raw data underlying all these visual forms would be presented as running and printed text, accompanied by a QR code that takes viewers into the visual place.

 

Week 5/Week 6 – LiDAR scanning and data collection

February 2024

During the fifth and sixth weeks, we decided to scan two areas.

The first is the West Court and Sculpture Court in the ECA Main Building, as well as the corridors connecting them. These are not only places where many campus activities take place but also areas with heavy foot traffic, full of interesting things and data.

Figure 1 Sculpture Court

The second is Vennel Steps, which is not only a passage connecting Lauriston Place and West Port but also a good spot from which to view Edinburgh Castle. The passage carries the memory of the city, acts as a bridge linking places, and is itself part of the Hyperobject of the whole of Edinburgh.

Figure 2 Vennel Steps

A total of 26 scans were conducted in the ECA Main Building, obtaining 34,409,476 points.

Figure 3 ECA Main Building top view

Figure 4 ECA Main Building side view

Figure 5 ECA Main Building side view

Figure 6 ECA Main Building Overall top view

Figure 7 west court and north corridor

Figure 8 South corridor by west court

Figure 9 Sculpture Court and exhibition works

Figure 10 Sculpture Court and exhibition works

A total of 32 scans were conducted at Vennel Steps, obtaining 716,122,248 points.

Figure 11 Vennel Steps top view

Figure 12 Vennel Steps side view

Figure 13 Vennel Steps front view

Figure 14 Vennel Steps top view

Figure 15 Vennel Steps top view

Figure 16 Vennel Steps top view

Figure 17 Vennel Steps top view

These two scans brought in over 700 million points of data, which can become the material for our next stage of creation after we edit them.

Sound Design – Qinglin Zhu, Ming Du

1. Concept and Idea

Concept

The combination of visuals and sound in films brings us an immersive experience, where sound plays a crucial role in shaping immersive spaces. Michel Chion, in “Audio-Vision: Sound on Screen”, categorizes film sound into two parts: Diegetic, sound within the narrative space, and Non-Diegetic, sound outside it. The interplay and transformation between these two types of sound create a sensory immersion for the audience. Meanwhile, Ben Winters argues in “The Non-diegetic Fallacy: Film, Music, and Narrative Space” that Non-Diegetic sound is partly a sign of the fictional state of the world created on screen. Can we, then, apply the theories of Michel Chion and Ben Winters to art installations, making sound an indispensable part of sensory immersion and letting it work with visuals to create a field within the “Place” of this installation?

Idea

Sound is divided into two parts: Diegetic and Non-Diegetic. Diegetic refers to Place Sonification, while Non-Diegetic refers to LiDAR Sonification. For the Diegetic part, we use sounds closer to the real world, giving the audience a sense of familiar unfamiliarity; this type of sound shortens the distance between the audience and the space of the installation. For the Non-Diegetic part, we primarily use sounds from synthesizers, which are distinctive and quickly capture the audience’s attention. Through the combination of Diegetic and Non-Diegetic sounds, the installation’s field sits at the intersection of reality and virtuality, both real and beyond reality.
For our simulated Edinburgh immersive experience, it’s essential to employ strategic spatial audio techniques. To achieve this, our recording plan incorporates the Ambisonic format for mixing, with the aim of utilizing a 5.1 (or 5.0) surround sound playback format to immerse the audience.
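As a sketch of how first-order Ambisonic material could reach the planned 5.1/5.0 layout, the following implements a basic projection ("sampling") decoder. This is an assumption-laden simplification: real decoders add dual-band shelf filtering and energy optimisation, the LFE channel is ignored, and the speaker azimuths are the nominal ITU ones, not the venue's actual layout.

```python
import numpy as np

# Nominal 5.0 speaker azimuths in degrees (L, R, C, Ls, Rs), per ITU-R BS.775
SPEAKERS = [30, -30, 0, 110, -110]

def decode_bformat(w, x, y, speakers=SPEAKERS):
    """Decode horizontal first-order B-format (W, X, Y) to speaker feeds.

    Basic projection decoder: feed_i = W + X*cos(a_i) + Y*sin(a_i).
    A minimal sketch only; not a production Ambisonic decoder.
    """
    w, x, y = (np.asarray(c, dtype=float) for c in (w, x, y))
    feeds = []
    for az_deg in speakers:
        a = np.radians(az_deg)
        feeds.append(w + x * np.cos(a) + y * np.sin(a))
    return np.stack(feeds)  # shape: (n_speakers, n_samples)

# A source hard to the front (+X axis): the centre speaker gets the most level
t = np.linspace(0, 1, 8)
sig = np.sin(2 * np.pi * 4 * t)
feeds = decode_bformat(w=sig / np.sqrt(2), x=sig, y=np.zeros_like(sig))
```

The W channel is scaled by 1/√2 here following the traditional B-format convention; material recorded in AmbiX would need the appropriate channel-order and normalisation conversion first.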

2. Field recording

Recording Plan

Dual Microphone Sets

  • Stereo (ORTF): This set will employ the ORTF stereo technique to ensure high-quality 2-channel stereo recordings. This approach is aimed at capturing a rich, natural sound field.
  • Ambisonic: In parallel, we will use an Ambisonic microphone to capture sounds in the first-order Ambisonic format. This technique is chosen for its ability to preserve the spatial characteristics of the soundscape, offering a more immersive listening experience.

Recording Perspectives

  • Distant Shots: Aim to capture a broad spatial layering, retaining the scene’s depth.
  • Medium Shots: These should align with everyday auditory habits, ensuring there are plenty of dynamic changes.
  • Close-up Shots: Focused on detailing specific sounds within the environment, these recordings will be more targeted, serving potentially as sound effects or sonification elements, rather than general ambience.

Impulse responses

  • To record the acoustic reverberation of different places, we plan to use a slapstick and balloons to create impulse responses, allowing us to capture the unique acoustic signatures of various spaces effectively.
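The captured impulse responses can later be applied to dry recordings by convolution, reproducing each space's reverberation. A minimal sketch assuming NumPy and mono signals; the toy signals below are illustrative, not our actual recordings.

```python
import numpy as np

def apply_impulse_response(dry, ir):
    """Convolve a dry signal with a recorded impulse response via FFT.

    Equivalent to numpy.convolve(dry, ir) but much faster for long IRs;
    output length is len(dry) + len(ir) - 1.
    """
    n = len(dry) + len(ir) - 1
    nfft = 1 << (n - 1).bit_length()  # next power of two >= n
    wet = np.fft.irfft(np.fft.rfft(dry, nfft) * np.fft.rfft(ir, nfft), nfft)
    return wet[:n]

# Toy check: a unit impulse as the "room" returns the dry signal unchanged
dry = np.array([1.0, 0.5, -0.25, 0.0])
ir = np.array([1.0, 0.0, 0.0])
wet = apply_impulse_response(dry, ir)
```

With a balloon-pop IR recorded at, say, Vennel Steps, the same function would place any studio recording acoustically "inside" that space.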

Equipment List

1. Microphone
Sennheiser AMBEO VR mic
https://bookit.eca.ed.ac.uk/av/wizard/resourcedetail.aspx?id=4149

Schoeps – MK4 + CMC1L * 2

 

2. Portable Field Recorder
Sound Devices – MixPre-6 II
https://bookit.eca.ed.ac.uk/av/wizard/resourcedetail.aspx?id=9483

Zoom – F8(backup)
https://bookit.eca.ed.ac.uk/av/wizard/resourcedetail.aspx?id=4175

3. Accessories
Toca – Slapstick
https://bookit.eca.ed.ac.uk/av/wizard/resourcedetail.aspx?id=7135

Rycote – Cyclone – Fits Schoeps MK4 Pair
https://bookit.eca.ed.ac.uk/av/wizard/resourcedetail.aspx?id=10052

Rycote – Cyclone – Fits Sennheiser AMBEO
https://bookit.eca.ed.ac.uk/av/wizard/resourcedetail.aspx?id=10049

K&M Mic Stand * 2
https://bookit.eca.ed.ac.uk/av/wizard/resourcedetail.aspx?id=8261

Sound Devices – Battery
https://bookit.eca.ed.ac.uk/av/wizard/resourcedetail.aspx?id=1509

AA Battery Charger
https://bookit.eca.ed.ac.uk/av/wizard/resourcedetail.aspx?id=10083

AA Rechargeable Batteries x 4 *2
https://bookit.eca.ed.ac.uk/av/wizard/resourcedetail.aspx?id=6024

3. Place Sonification – Diegetic

LiDAR technology, which captures environments to generate point cloud data, provides a distinctive avenue for uncovering the hidden characteristics of a location through sound. By selecting data points that hold particular relevance for auditory interpretation, these elements can be converted into sounds that are perceptible to an audience. The point cloud data obtained from LiDAR scans can be converted into a CSV format readable by Max/MSP using CloudCompare, facilitating the manipulation of audio based on data.
In this project, the intention is to use granular synthesis to represent the granularity of point cloud data. By controlling parameters such as grain rate, duration, pitch, and amplitude in real time based on the data from the place, the variation within the point cloud data can be audibly demonstrated. Moreover, Max/MSP allows for further sonification through data input, such as using the data to control the parameters of processors and synthesizers, or triggering specific samples with extreme values. This approach enables real-time sound matching based on visual effects, bringing the scanned environment to life in a unique and engaging way.
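As an illustration of how the exported CSV could drive grain parameters, the sketch below maps height (Z) to pitch and X position to grain duration. The column layout, mapping ranges, and parameter choices are assumptions for illustration; the actual patch lives in Max/MSP, not Python.

```python
import csv, io

def points_to_grains(csv_text, pitch_range=(200.0, 2000.0),
                     dur_range=(0.02, 0.2)):
    """Map CloudCompare-style CSV rows (X,Y,Z,...) to grain parameters.

    Z drives pitch and X drives grain duration; both are normalised
    to the cloud's own extent before mapping.
    """
    rows = [list(map(float, r[:3]))
            for r in csv.reader(io.StringIO(csv_text)) if r]
    xs, zs = [r[0] for r in rows], [r[2] for r in rows]

    def norm(v, lo, hi):
        return 0.5 if hi == lo else (v - lo) / (hi - lo)

    grains = []
    for x, _, z in rows:
        u, v = norm(z, min(zs), max(zs)), norm(x, min(xs), max(xs))
        grains.append({
            "pitch_hz": pitch_range[0] + u * (pitch_range[1] - pitch_range[0]),
            "dur_s": dur_range[0] + v * (dur_range[1] - dur_range[0]),
        })
    return grains

# Three hypothetical points: pitch rises with height, duration with X
sample = "0,0,0\n1,0,5\n2,0,10\n"
for g in points_to_grains(sample):
    print(g)
```

In practice the same per-point mapping would be done inside Max/MSP (e.g. reading the CSV with a data object and scaling values to the granular engine's inlets); the Python version just makes the mapping explicit.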

Audio 1: A granular synthesis for rocks (using weather data from Iceland).

Figure 1: A granular synthesizer in Max/MSP.

4. LiDAR Sonification – Non-Diegetic

LiDAR, as a contemporary high-tech innovation, has inspired audiovisual installations that often employ nearly sci-fi sonic textures to reflect their technological essence. These installations predominantly use synthesizers and processors to create sounds, aiming for an auditory experience that matches the visual in its capacity to transport audiences beyond the familiar. The sound design in these setups focuses on dynamic and timbral shifts that keep pace with visual transformations, emphasizing rhythm and atmosphere rather than conventional melody.

https://vimeo.com/649452074

A notable example in this domain is Ryoichi Kurokawa’s “s.asmbli [ wall ],” where sound plays a pivotal role in creating the atmosphere and making the LiDAR scanning process audible. The installation capitalizes on the rhythmic aspects of the scanning to set its tempo, relying mainly on synthesized sounds without clear melodic content. It employs reverberation automation to reflect changes and progressions within the scene, enabling the audience to intuitively understand the creation and evolution of data. This immersive experience guides viewers through the construction, transformation, and dismantling of various places, driven by the combined narrative of sound and visual.

The adoption of technologically derived timbres is intended to signal the processes underlying the visual display, employing non-diegetic sounds to hint at the artificial crafting of these places. This strategy not only deepens the immersive effect by linking technology with creativity but also prompts the audience to consider the interplay of human and technological efforts in depicting and altering environments. Installations like Kurokawa’s encourage exploration of the relationship between the organic and the technological, showcasing the artistic possibilities offered by LiDAR.

Ming Du-Idea of installation

After communicating with Dr. Asad Khan last week, I began to reflect on my ideas and consider how they could be expanded.

The existence of time made me think about the differences between dimensions: if a point is zero-dimensional, then one dimension is a line made up of countless points, two dimensions a plane composed of countless lines, and three dimensions a cube composed of countless planes. So what does the fourth dimension look like? Is it composed of countless stacked three-dimensional objects?

This also reminds me of the last part of the movie “Interstellar,” where the protagonist enters a black hole and arrives in a four-dimensional space, and in this four-dimensional space, he is able to traverse time and communicate with his daughter through gravity.

Then, through Wikipedia, I learned about the existence of the tesseract, and how it might appear in two-dimensional and three-dimensional worlds.
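The tesseract mentioned above can be generated combinatorially and projected down to 2D, mirroring how its "shadow" appears in lower dimensions. A sketch; the projection distances are arbitrary choices, and a real visualisation would feed the 2D points to a drawing library.

```python
from itertools import product

def tesseract():
    """Vertices and edges of a unit tesseract (4-cube).

    Vertices are all 16 sign combinations in 4D; an edge joins two
    vertices that differ in exactly one coordinate.
    """
    verts = list(product((-1, 1), repeat=4))
    edges = [(i, j) for i in range(16) for j in range(i + 1, 16)
             if sum(a != b for a, b in zip(verts[i], verts[j])) == 1]
    return verts, edges

def project(v, distance=3.0):
    """Perspective-project a 4D point to 2D: first 4D -> 3D, then 3D -> 2D."""
    x, y, z, w = v
    s3 = distance / (distance - w)   # 4D -> 3D perspective scale
    x3, y3, z3 = x * s3, y * s3, z * s3
    s2 = distance / (distance - z3)  # 3D -> 2D perspective scale
    return x3 * s2, y3 * s2

verts, edges = tesseract()
print(len(verts), len(edges))  # a 4-cube has 16 vertices and 32 edges
```

Rotating the vertices in the x-w plane before projecting produces the familiar animation of the inner cube turning "inside out", the 3D shadow of a 4D rotation.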

 

So, I began to contemplate what I should do.

Can we use LiDAR and installation to achieve the projection of this four-dimensional space?

Can we also use time-lapse photography with LiDAR, thereby not only animating the scanned scenes but also materializing time? (Although the fourth dimension is not time, it can be visualized this way.)

Perhaps it could be designed as an installation. This installation would consist of four projection screens, not connected to each other, with gaps allowing viewers to enter the cube formed by the four screens. In the center, there would be a cube made of a projectable and translucent material, serving as an interactive area for the audience. The entire setup would create an immersive experience, thus requiring an immersive sound design as well. Visually, the LiDAR-processed images are projected not only on the four screens but also on the material in the interactive area.
