
Sound Object – diegetic field recordings and nondiegetic data sonification

The combination of visuals and sound in film creates an immersive experience, with sound playing a crucial role in shaping immersive spaces. In “Audio-Vision: Sound on Screen”, Michel Chion divides film sound into two categories: diegetic sound, which originates within the narrative space, and nondiegetic sound, which lies outside it. The interplay and transformation between these two types of sound create a sensory immersion for the audience. Meanwhile, Ben Winters argues in “The Non-diegetic Fallacy: Film, Music, and Narrative Space” that nondiegetic sound is partly a sign of the fictional state of the world created on screen. Can we, then, apply the theories of Chion and Winters to art installations, making sound an indispensable part of sensory immersion and allowing it to work with the visuals to create a field within the “Place” of this installation?

In this project, sound is likewise divided into two parts: the diegetic component consists of field recordings, while the nondiegetic component consists of LiDAR data sonification. In the diegetic component, field recordings reveal unnoticed details of the real world, giving the audience a sense of familiar unfamiliarity; this kind of sound shortens the distance between the audience and the space of the installation. In the nondiegetic component, we primarily use synthesizer sounds, which are distinctive and quickly capture the audience’s attention. Through the combination of diegetic and nondiegetic sounds, the installation’s field sits at the intersection of reality and virtuality, making it both real and beyond reality.

 

The sound design of this project is strategically divided into two principal components: diegetic field recordings and nondiegetic data sonification.

Diegetic Component: This segment comprises field recordings from iconic locations across Edinburgh, including the historic Royal Mile, bustling Princes Street, tranquil Dean Village, busy Waverley Station, and relaxing Portobello Beach. Crafted from meticulous field recordings, this object captures the essence of Edinburgh’s diverse auditory environments, weaving the distant chatter of passersby and the rush of flowing water into a rich tapestry that embodies the city’s unique atmosphere. This approach allows exhibition visitors to immerse themselves in the complex soundscapes of Edinburgh, experiencing the city as a cohesive hyperobject within a single exhibition space.

Nondiegetic Component: This component focuses on the sonification of LiDAR scan data, employing Max/MSP to transform the point cloud into sound. Specifically, this data-driven sound design translates the intricate spatial (XYZ) and color (RGB) data of Edinburgh’s Vennel Step and Dean Bridge into a captivating auditory experience. The essence of these places is not merely represented but reimagined through sonification, creating a soundscape where technology meets artistry.

 

1. DIEGETIC FIELD RECORDING

The field recordings are edited into several one-minute segments, seamlessly interconnected by ambient music transitions, offering a continuous 12-minute auditory journey. Each segment is introduced by a voiceover, delivered in a Scottish accent, that introduces the featured location, thereby grounding each auditory snapshot in its geographical and cultural context.

Figure 1: The edit session of the 12-minute field recording

The ambient music transitions serve not only to give the audience a moment to refresh their ears and prepare for the next soundscape, but also to reflect the high-dimensional nature of Hyperobjects. These transitions use pads to establish a sonic zone conducive to mind wandering, complemented by the use of reversed bell sounds to symbolize the free flow of time. This musical approach not only enriches the listening experience but also simulates the sensation of navigating through time and space within a multi-dimensional framework.

Figure 2: The arrangement session of ambient music transitions

This auditory experience transforms everyday noises into a profound exploration of space and memory, guiding you through an unseen Edinburgh where every sound tells a story. It extends an invitation to experience Edinburgh as never before, where the essence of each location is not only heard but deeply felt, creating a vivid and lasting impression of the city’s atmospheric diversity.

 

2. NONDIEGETIC DATA SONIFICATION

LiDAR technology, which captures environments to generate point cloud data, provides a distinctive avenue for uncovering the hidden characteristics of a place through sound. The point cloud data obtained from LiDAR scans can be converted into a CSV format readable by Max/MSP using CloudCompare, facilitating the manipulation of audio based on data.
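As a rough illustration of what the exported data looks like once it leaves CloudCompare, the Python sketch below reads a point-per-row CSV into memory. The filename and the X, Y, Z, R, G, B column order are assumptions for illustration; in the installation itself the parsing happens inside Max/MSP.

```python
import csv

# Hypothetical CloudCompare export: one point per row, with an assumed
# column order of X, Y, Z, R, G, B (adjust to match the actual file).
points = []
with open("vennel_step.csv", newline="") as f:
    for row in csv.reader(f):
        try:
            x, y, z = (float(v) for v in row[:3])
            r, g, b = (int(float(v)) for v in row[3:6])
        except ValueError:
            continue  # skip a header row or a malformed line
        points.append((x, y, z, r, g, b))

print(f"{len(points)} points loaded")
```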

Figure 3: The raw point cloud data of Vennel Step

Within the Max patch framework, RGB-driven sounds are the primary sonic elements. A general control mechanism assesses each point’s RGB values to identify the dominant color, which then triggers playback of the sound assigned to that color. The sound mapping is guided by subjective color associations: red with sharp, intense sounds; green with natural sounds; and blue with harmonious sounds. This method not only audibly illustrates the visual spectrum but also invites listeners to explore the emotional resonance of colors.
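A minimal sketch of that control logic follows: classify each point by its strongest channel and look up the corresponding voice. The sample names are hypothetical stand-ins; in the installation this decision is made inside the Max patch.

```python
# Hypothetical voice names standing in for the patch's actual sounds.
SOUND_FOR = {"red": "sharp_hit", "green": "natural_texture", "blue": "harmonic_pad"}

def dominant_color(r, g, b):
    """Return the color channel with the highest value for a point."""
    channels = {"red": r, "green": g, "blue": b}
    return max(channels, key=channels.get)

# A saturated blue point would trigger the harmonious voice.
print(SOUND_FOR[dominant_color(30, 40, 220)])  # -> harmonic_pad
```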

Depth is articulated through the Z data, introducing a layer of complexity to the soundscape. By mapping the vertical extremities of each site to distinct sounds, the installation captures the topographical variance of Edinburgh. This approach not only highlights the physical contours of the locations but also envelops the listener in a multi-dimensional auditory space.

Furthermore, significant shifts detected between adjacent rows of LiDAR data—marking the completion of an angular scan—are signified through a distinctive “hit” sound, thereby audibly marking the progress of the scanning process. Other data types, not directly converted into sound, serve as control signals that adjust sound parameters such as cut-off frequency and volume in real-time, adding a dynamic layer to the auditory experience.
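To make these two mechanisms concrete, here is a hedged sketch of both: a jump detector that fires a “hit” whenever consecutive points leap farther than a threshold (the threshold value is an assumption, not the patch’s actual setting), and a linear scale function, analogous to Max’s [scale] object, of the kind that would steer cutoff frequency or volume from the non-sonified data.

```python
JUMP_THRESHOLD = 0.5  # metres; illustrative value, not taken from the patch

def scan_hits(points, threshold=JUMP_THRESHOLD):
    """Yield the indices where a new angular sweep appears to begin."""
    prev = None
    for i, (x, y, z, *_rest) in enumerate(points):
        if prev is not None:
            dx, dy = x - prev[0], y - prev[1]
            if (dx * dx + dy * dy) ** 0.5 > threshold:
                yield i  # trigger the "hit" sample here
        prev = (x, y)

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linear mapping, analogous to Max's [scale] object."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

# e.g. map an assumed 0-255 intensity value to a filter cutoff in Hz
cutoff_hz = scale(180, 0, 255, 200, 8000)
```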

Figure 4: The diagram of the Max patch

While the Max patches for both locations share a basic logic, modifications have been made in audio sample selection and data mapping to reflect the unique characteristics of each site.

 

Red: Shared Elements Across Sites

Both locations employ the same audio assets for the color red, which represents the high-tech texture of LiDAR. This uniformity emphasizes the technological underpinnings central to both sites, thereby reinforcing their conceptual linkage.

 

Green: Site-Specific Sound Design

For the color green, we have implemented granular synthesis to reflect the granularity of point cloud data typical of LiDAR scans. By dynamically adjusting parameters such as grain rate, duration, pitch, and amplitude, we transform the data’s variance into an auditory experience. Specifically, the base materials differ: Vennel Step incorporates sounds derived from rock, while Dean Bridge uses metal sounds, reflecting their respective construction materials.
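The exact mappings live in the Max patch; the sketch below only illustrates the kind of data-to-grain mapping described here, with parameter ranges chosen for illustration rather than taken from the project.

```python
def grain_params(z, r, g, b, z_min, z_max):
    """Illustrative mapping from one point's data to granular parameters."""
    def scale(v, lo, hi, out_lo, out_hi):
        t = (v - lo) / (hi - lo)
        return out_lo + t * (out_hi - out_lo)
    return {
        "rate_hz": scale(g, 0, 255, 5.0, 50.0),         # denser grains for greener points
        "dur_ms": scale(z, z_min, z_max, 20.0, 200.0),  # longer grains higher up
        "pitch_semitones": scale(r - b, -255, 255, -12.0, 12.0),
        "amplitude": scale(g, 0, 255, 0.1, 1.0),
    }

print(grain_params(z=4.2, r=40, g=200, b=60, z_min=0.0, z_max=10.0))
```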

 

Blue: Interpreting the Color and the Site Feature

At Vennel Step, the color blue is conveyed through MIDI outputs that trigger harmonious synthesizer sweeps. These sweeps metaphorically depict the significant elevation changes of the steps, likening the vertical variation to the changing rate of an LFO (Low Frequency Oscillator) in a synthesizer. This auditory translation invites listeners to experience the physicality of the steps through sound.
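A small sketch of the elevation-to-LFO idea, with an assumed rate range; the actual sweep rates come from the MIDI mapping in the patch.

```python
def lfo_rate_hz(z, z_min, z_max, rate_lo=0.1, rate_hi=4.0):
    """Map height on the steps to an LFO sweep rate (ranges assumed)."""
    t = max(0.0, min(1.0, (z - z_min) / (z_max - z_min)))
    return rate_lo + t * (rate_hi - rate_lo)

print(lfo_rate_hz(7.5, z_min=0.0, z_max=10.0))  # faster sweeps near the top
```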

At Dean Bridge, blue is represented by playing a specific audio sample, composed of harmonious synth hits and wind chime sounds, both forwards and backwards. This technique symbolically reverses time, reflecting the bridge’s longstanding presence and its role as a temporal conduit linking the past, present, and future. Through these sounds, we encourage the audience to perceive the bridge across various temporal dimensions.

 

Z Depth: Reflecting the Surroundings of the Site

Vennel Step’s low end is adjacent to a bustling road. When Z is in its lowest range, the sounds of traffic, conversing crowds, and pedestrian footsteps dominate, capturing the urban vibrancy of the area. As Z ascends to its highest range, the ambience shifts dramatically: birds chirping and wind rustling, previously masked by the city noise, become discernible, just as in reality they are only perceptible at elevated heights.

At Dean Bridge, the extremely low range of Z captures the sound of water flowing under the bridge. At the highest range, the auditory scene shifts to include birds, wind, and the rustling of trees. This reflects the natural environment surrounding the bridge, embraced by venerable trees.
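One plausible way to realize this Z-driven shift is an equal-power crossfade between the two ambience layers; the sketch below illustrates the idea and is not the patch’s actual implementation.

```python
import math

def ambience_gains(z, z_min, z_max):
    """Equal-power crossfade: the ground layer fades out as the elevated layer fades in."""
    t = max(0.0, min(1.0, (z - z_min) / (z_max - z_min)))
    low_gain = math.cos(t * math.pi / 2)   # traffic / water layer
    high_gain = math.sin(t * math.pi / 2)  # birds / wind / trees layer
    return low_gain, high_gain
```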

Figure 5: The Max patch of Vennel Step

This data sonification offers more than just a novel way to perceive data; it invites participants to immerse themselves in the digital heartbeat of Edinburgh, where every data point sings a piece of the city’s story, crafting an immersive narrative that bridges the gap between the digital and the sensory.

 

The overall sound design for this project represents a pioneering endeavor to merge the realms of environmental sounds and data sonification into an immersive auditory experience. By capturing the essence of Edinburgh through field recordings and interpreting physical data through sonification, we offer a unique dual narrative of the city. This innovative approach not only showcases the potential of sound as a medium to interpret and represent LiDAR data, but also invites audiences to engage with the environment and data in a deeply immersive and sensory manner.

Exhibition Feedback

Sound Design Feedback from Jules

  1. Speaker Setup:

Jules suggested that the speaker arrangement could be more “irregular” to align with the innovative nature of the exhibition. We could reorient the four speakers to “pair” differently with West Court’s built-in stereo system, for instance so that the field recordings are heard predominantly from the right while the data sonifications emerge from the left. Additionally, changing the direction of the speakers would let us capture both the direct sound and its reflections off the venue’s surfaces, adding a dynamic layer of auditory variation.

  2. Spatial Mixing:

Jules also recommended considering spatial mixing to better replicate the LiDAR scanning process. Currently, our data sonifications are set to stereo output. However, enhancing this setup to allow RGB sounds to emanate from varying heights or to have the LiDAR high-tech sounds orbiting around the listener could significantly enrich the auditory experience. Such adjustments would not only simulate the three-dimensional aspect of LiDAR scanning but also create a more immersive and engaging environment for the audience.
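As a starting point for the orbiting idea, the sketch below circles a source through the stereo field with equal-power panning. A full spatial mix would address more loudspeakers, and the eight-second period is an arbitrary choice.

```python
import math

def orbit_gains(time_s, period_s=8.0):
    """Equal-power stereo gains for a source circling the listener."""
    angle = 2 * math.pi * (time_s % period_s) / period_s
    pan = (math.sin(angle) + 1) / 2           # 0 = hard left, 1 = hard right
    left = math.cos(pan * math.pi / 2)
    right = math.sin(pan * math.pi / 2)
    return left, right
```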

 

Visual Feedback

Visitors were effusive in their praise for the astute utilization of space evident throughout the exhibition, lauding the discernible arrangement of diverse elements. Particularly striking was the efficacy of the central installation featuring suspended white paper, which commanded attention and imbued the space with a sense of cohesion and intrigue. Notably, interactive projects, such as those showcasing the Edinburgh College of Art (ECA) and Dean Village, stood out prominently among visitors, garnering widespread recognition for their engaging and immersive qualities.

 

Furthermore, valuable input from Jules highlighted the potential for enhancing the symbiosis between the projects and the exhibition venue itself. Specifically, suggestions were made to align the layout of the venue with that of the ECA projects, thereby fostering a seamless integration and heightening audience immersion. By mirroring the configuration and ambiance of the projects within the exhibition space, attendees could be further enveloped in an experiential journey that transcends mere observation, fostering deeper engagement and appreciation of the showcased content.

Field Recording

We conducted field recordings at several iconic locations throughout Edinburgh, including the Royal Mile, Princes Street, Dean Village, Waverley Station, and Portobello Beach, each chosen for its unique soundscape.

 

(Amb_DeanVillage_waterflow_rapid_close)

In Dean Village, the uneven terrain necessitated a flexible recording setup, so we used two Schoeps MK4 microphones mounted on a boom pole. The stereo windshield that housed them was preset to the XY recording configuration, so we changed our recording plan from ORTF to XY. During the sound walk, we identified the sound of flowing water as the distinctive soundmark of Dean Village, and we captured it from distant, medium, and close perspectives to record the water’s diverse sonic expressions.

 

(Amb_RoyalMile_ScottishBagpipes_withApplause)

At other locations, the presence of large crowds made it impractical to use such conspicuous equipment. As a result, we switched from the MixPre-6 to the handier Zoom H4n recorder, which features an integrated XY stereo microphone. Although this handheld recorder’s built-in microphone has a limited optimal working distance and does not match the sound quality of the MK4 pair, it allowed us to capture the essence of each place without disrupting natural behavior. For instance, on the Royal Mile we captured the iconic sounds of Scottish bagpipes and crowd applause; on Princes Street, the ding-ding of trams and the hum of busy traffic; and at Portobello Beach, not only the waves but also the sounds of beach volleyball and lively chatter from seaside restaurants.

This strategic adaptation to each location’s specific acoustic environment allowed us to capture the most authentic and vibrant sounds possible. These recordings not only document the sonic diversity of Edinburgh but also enhance our project’s ability to transport listeners into the heart of each iconic place through sound.

Pre-Exhibition

Venue Setup

On April 4th, we held our exhibition at West Court, designed to let visitors experience Edinburgh anew by interpreting the city through various media driven by LiDAR data.

Figure 1: Overall venue setup top view

Figure 2: Overall venue setup left view

Figure 3: Overall venue setup isometric view

Video Setup:

To create an immersive experience, we projected four different video works onto the room’s four walls. The front and back walls, which were plain white, served as direct projection surfaces. For the uneven side walls unsuitable for projection, we installed large displays to ensure clear image quality. Two of these displays were interactive, equipped with stands holding a keyboard and mouse for gaming and Leap Motion Controllers for visual interaction.

Figure 4: Video setup

 

Audio Setup:

The audio component of the exhibition utilized West Court’s built-in stereo speakers, complemented by four additional speakers. The existing speakers broadcast the field recordings, while the four additional speakers, arranged in two stereo pairs, played the data sonifications of Vennel Step and Dean Bridge. Positioned centrally, these speakers acted as an auditory link between the different video works, enhancing the immersive experience.

Figure 5: Audio setup

 

Light Setup:

We transformed West Court into a darkened environment by turning off all interior lights, which made our monitors and projections prominently visible in the darkness. Brightness was strategically used to guide visitors through the venue. Additionally, display lights were installed around the edges of a central mirrored table, shining outward to keep the space from being too dark while accentuating the 3D prints placed at the center.

Figure 6: Light setup

 

Paper Setup:

We utilized the room’s existing pipes as anchor points to stretch thin ropes across the space, with paper sheets hung on them to visually encapsulate visitors within layers of data. Papers were also placed on the floor to guide visitors along a designated path and wrapped around various stands, enhancing the theme that the artworks were composed of scanned data.

 

iPad & 3D Printing Setup:

At the entrance, an interactive iPad displayed the scanned model of Vennel Step, offering visitors an initial glimpse into the lidar scanning results. The 3D prints were displayed on a mirrored table at the center of the venue, where the reflection from the mirrors helped focus attention on the intricate details of the 3D models, effectively showcasing the blend of technology and artistry.

Figure 7: Paper and 3D printing setup

Equipment List

Audio Equipment:

  • 1 laptop
  • 4 Genelec 1029 speakers (with stands)
  • 1 RME Fireface UC audio interface
  • Cables

Video Equipment:

  • 4 laptops
  • 1 Bluetooth mouse
  • 1 Bluetooth keyboard
  • 1 Leap Motion Controller
  • 2 Projectors
  • 2 Televisions

Other Equipment:

  • 1 music stand (for iPad display)
  • 4 common stands (for keyboard, etc.)
  • 1 mirror table
  • 6 exhibition lights
  • Glue & tape
  • Rope (for hanging paper)

Figure 8: Audio equipment

Figure 9: Video equipment

Figure 10: Stand

Overall Introduction

The inspiration for Hyperobject X Edinburgh comes from Timothy Morton’s concept of the Hyperobject. This concept views the entire Earth as a whole and elevates the status of objects to the same level as humans. In this installation, therefore, we elevate the Place within the Hyperobject to a position equal to that of humans. As people look at LiDAR data, they also become like LiDAR, scanning the entire Place within Hyperobject Edinburgh.

Heidegger discusses the world and the earth in “The Origin of the Work of Art”: the earth is the foundation on which the world is revealed, and the world is displayed through the earth. In our installation, data is our earth, the basis of all our works, without which they could not have been created. At the same time, Hyperobject Edinburgh is displayed in a new way because of the existence of this data.

In Hyperobject X Edinburgh there are seven works in total: paper that materializes LiDAR data, interactive installations of ECA and Dean Village, video installations of EFI and the Vennel Steps, a 3D-printed work depicting movement and stillness in superimposed states, and a 3D-printed work of a Scottish Blackface sheep.

The paper materializing LiDAR data is the heart of the entire installation: it not only connects the works across the various Places but also guides people to reconsider this digital data, asking whether, from the perspective of LiDAR, Edinburgh under the Hyperobject is merely a set of numbers.

Figure 1: Paper project

 

ECA and EFI, as parts of the University of Edinburgh, not only represent various aspects of our lives but also symbolize a vision for the future world. The Vennel Steps and Dean Village are more like representatives of the city, each connecting different places within Edinburgh. The Vennel Steps, linking Lauriston Place and West Port, provide a beautiful view of Edinburgh Castle thanks to their location.

Figure 2: ECA project

Figure 3: EFI project

Figure 4: Dean Village project

Figure 5: Vennel Steps project

 

The only human-subject 3D print brings the perspective into a superimposed state of motion and stillness, illustrating that humans are inherently limited and can be influenced by other factors thanks to the existence of this superimposed state. The Scottish Blackface sheep toy represents Scotland, and the work whimsically places on the sheep a hat it seemingly ‘created’ itself, made from its own LiDAR data.

Figure 6: Human project

Figure 7: Scottish Blackface sheep project

 

Dean Village with Leap Motion

Dean Village

Dean Village Visual Object:

Dean Village, chosen for its prominence as a tourist destination in Edinburgh, serves as a canvas for digital manipulation and reinterpretation. Through LiDAR scanning and subsequent visual manipulation, Dean Village is depicted as spiraling into an alternate dimension, eliciting intrigue and personal engagement from the audience. This intervention not only offers a new perspective on a familiar space but also underscores the transformative potential of digital technologies in reshaping our understanding of urban environments.

Data Acquisition and Preparation:

  • LiDAR data of Dean Village was acquired and processed, resulting in a comprehensive digital representation of the environment.
  • The data, initially captured at various locations, was stitched together using Cycle 360 on an iPad, yielding files in .e57 and .las formats.

Subsampling Using CloudCompare:

  • The LiDAR data was imported into CloudCompare for subsampling, aiming to reduce point density for compatibility with TouchDesigner.
  • Subsampling techniques were employed to optimize the data while preserving essential details (a minimal sketch of the idea follows this list).
  • The subsampled data was then exported for further processing in TouchDesigner.
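For readers unfamiliar with subsampling, here is a minimal voxel-grid sketch of the idea: thin the cloud so that no two retained points share one small cell. CloudCompare’s spatial subsampling is more sophisticated, and the 2 cm cell size here is arbitrary.

```python
import numpy as np

def voxel_subsample(xyz, cell=0.02):
    """Keep one point per (cell x cell x cell) voxel; cell size in metres."""
    keys = np.floor(xyz / cell).astype(np.int64)
    _, first_idx = np.unique(keys, axis=0, return_index=True)
    return xyz[np.sort(first_idx)]

cloud = np.random.rand(100_000, 3)  # stand-in for the scanned point cloud
print(voxel_subsample(cloud).shape)
```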

TouchDesigner Integration:

  • Upon import into TouchDesigner, the LiDAR data file was divided into distinct paths for positional and color input into the geometry (the split is illustrated in the sketch below).
  • These inputs were subsequently connected to the camera, facilitating visualization within the TouchDesigner environment.
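Outside TouchDesigner’s node graph, the equivalent split looks like this; the filename is hypothetical, the file is assumed to have no header row, and the column order X, Y, Z, R, G, B is an assumption.

```python
import numpy as np

# Hypothetical subsampled export with assumed columns X, Y, Z, R, G, B.
data = np.loadtxt("dean_village_subsampled.csv", delimiter=",")
xyz = data[:, :3]   # positions feed the instanced geometry
rgb = data[:, 3:6]  # colors feed the per-point shading

print(xyz.shape, rgb.shape)
```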

Interaction Design:

  • Initially, noise elements were introduced to the data file to enhance visual complexity and dynamism.
  • Mouse input was incorporated to enable user interaction, allowing specific sections of the data to respond dynamically, such as spiraling movements.

Integration of Leap Motion:

  • To enhance audience interaction further, Leap Motion technology was integrated into the project.
  • The Leap Motion SDK was configured on a laptop, enabling seamless integration with the project environment.
  • Customization of the Leap Motion interface was undertaken to accommodate single-hand input, ensuring intuitive interaction.
  • The Leap Motion input was integrated into the feedback loop, enabling users to interact with the visual representation through gestures and hand movements.

Output Projection:

  • The final output, incorporating all interactive elements and visual manipulations, was connected to a window output within TouchDesigner.
  • This output was then projected onto a screen, providing a dynamic and engaging visual experience for viewers within the exhibition space.

Final objects

Projects

Main project

Our project aimed to investigate various modalities for the dissemination of lidar data. Our exploration encompassed the following mediums:

  1. Paper: The utilization of printed raw lidar data facilitated a tactile engagement, allowing users to physically interact with the information. This approach offered a tangible means of presenting the data, enhancing comprehension and immersive understanding.

  2. 3D Printed Objects: Through the process of 3D printing, we rendered the lidar-scanned data into physical objects, such as the representation of a sheep’s hat. This reinterpretation of the data into tangible forms provided a novel perspective, transforming abstract data into palpable entities, thereby enriching the experiential dimension of the project.

  3. Visual Objects: Employing various visual effects and manipulations, we presented processed lidar data in digital formats. By leveraging digital visualization techniques, we aimed to offer diverse perspectives and experiences of the scanned environment, enabling users to explore the data through different visual lenses.

  4. Sound Objects: Our project incorporated both diegetic (field recordings) and nondiegetic (derived from lidar-scanned data) sounds from the scanned environment. These auditory elements were integrated with the visual objects, synergistically enhancing the overall experiential dimension. By intertwining soundscapes with visual representations, we aimed to provide users with a holistic and immersive sensory experience, enriching their understanding of the scanned environment.

In summary, our project engaged with a multidimensional approach to the dissemination of lidar data, encompassing tactile, visual, and auditory modalities. Through these diverse mediums, we aimed to offer users a comprehensive and immersive exploration of the scanned environment, transcending traditional data dissemination methods and fostering novel modes of engagement and understanding.

Paper:

 

The exhibition space incorporates the use of paper as a fundamental medium, symbolizing the unprocessed lidar data obtained from scans. Positioned strategically throughout the exhibition, these paper displays afford visitors direct access to the foundational data upon which the entire curated experience is based. Furthermore, the arrangement of these paper elements serves a dual purpose, guiding the flow of visitors through the space while also concealing undesirable negative spaces, thereby fostering a cohesive and immersive exhibition narrative.

3D Printed Sheep’s Hat:

 

Utilizing 3D printing technology, the project explores the transformation of traditional objects, such as the Scottish Blackface sheep’s hat. Through scanning and subsequent manipulation of size and color, the hat undergoes perceptual metamorphosis, transcending its conventional identity. This intervention prompts viewers to reconsider their preconceived notions of form and function, thereby enriching their engagement with the exhibited artifacts.

3D Printed Human Object:

 

A member of the group, Ming, serves as the subject for the 3D printed human object. Ming is captured amidst black dustbin bags and foil, creating data voids within the scan. The resulting 3D-printed representation encapsulates Ming’s movements, highlighting the interplay between presence and absence. This exploration not only captures Ming’s physicality but also underscores the expressive potential of material choices within the scanning process.

Vennel Staircase Visual Object:

 

The Vennel Staircase, a renowned photographic landmark in Edinburgh, undergoes reinterpretation through lidar scanning and digital manipulation. The staircase is visually reimagined as cascading downwards akin to a waterfall, a transformation further accentuated by accompanying auditory elements. This intervention not only reframes the familiar space but also invites viewers to contemplate the dynamic interplay between physical reality and digital representation, thereby fostering a renewed appreciation for the urban landscape.

Dean Village Visual Object:

 

As described in the dedicated Dean Village post above, the village is depicted as spiraling into an alternate dimension through LiDAR scanning and visual manipulation, offering a new perspective on a familiar space and underscoring the transformative potential of digital technologies.

ECA Visual Object:

The guiding principle in designing the point cloud for the campus section was the reinterpretation of familiar views. The first piece was the Edinburgh College of Art (ECA), whose theme is “natural growth”. It combines several elements: the West Court of the ECA main building, the architectural interior of the Sculpture Court, and the outdoor expanse of the ECA central courtyard. Using the transformative capabilities of the point cloud model, each data point was given characteristics of verdant vitality, evoking the organic growth of plants. Every point was rendered as a continuously metamorphosing graphic, like a flourishing plant, so that the ECA buildings took on the appearance of a post-apocalyptic structure enveloped by lush vegetation. A third-person controller lets visitors traverse and explore the environment, offering a departure from everyday experience and a transformative encounter with the built environment.

EFI Visual Object:

The next step was to process the point cloud model of the EFI building. Situated within the Old Royal Infirmary and part of the University of Edinburgh, the Edinburgh Futures Institute (EFI) is a nexus of innovation and forward-thinking scholarship. Its design theme, “fantasy and technology”, reflects this futuristic orientation. Accordingly, the aesthetic of EFI’s particle representation is pronouncedly technological, characterized by an interplay of vibrant blues and yellows. Each data point receives a treatment markedly different from the ECA’s: every point takes the form of a fluctuating piece of paper, symbolizing the transient nature of information as it spreads and evolves. The particles’ colors extend beyond blue and yellow to an eclectic array including azure, emerald, amethyst, and citrine, and the dynamic positioning of each point gives the ensemble an ethereal, ever-shifting, dreamlike quality. Finally, a first-person camera lets users immerse themselves in the reimagined EFI building, a journey akin to traversing a dream.

Sound object: 

The sound design pairs diegetic field recordings with nondiegetic LiDAR data sonification, placing the installation’s field at the intersection of reality and virtuality; see the dedicated Sound Object post above for the full discussion.

 

Vennel staircase – waterfall effect

Vennel Staircase Visual Object:

As described under Final objects above, the staircase is visually reimagined as cascading downwards like a waterfall; the steps below document the pipeline behind that effect.

Data Acquisition for Vennel Staircase:

  • The lidar data capturing the Vennel Staircase environment was meticulously acquired using specialized equipment and techniques.
  • The scanning process aimed to capture detailed spatial information, including the architectural features and contours of the staircase and its surroundings.

Subsampling for Vennel Staircase:

  • Subsequent to data acquisition, the lidar data underwent subsampling procedures to optimize its density and format for downstream processing.
  • Utilizing software tools such as CloudCompare, the point cloud data was subsampled to reduce computational overhead while preserving essential details.

TouchDesigner Integration for Vennel Staircase:

  • Upon completion of subsampling, the subsampled lidar data was imported into TouchDesigner, a visual programming environment.
  • Within TouchDesigner, the imported data was processed and manipulated to create dynamic visualizations of the Vennel Staircase environment, incorporating elements such as color mapping and texture rendering.

Point Transformation for Vennel Staircase:

  • A critical component of the visualization involved point transformation techniques that simulate specific visual effects, notably the waterfall-like motion (a minimal sketch follows this list).
  • Through customized algorithms and feedback loops, certain points within the lidar data representing the staircase were dynamically manipulated to create the desired visual impact.
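The following is a guess at the core of such a transformation, reduced to a few lines: each frame, points slide downward and wrap back to the top, which reads as a continuous pour. The real TouchDesigner network achieves this with feedback operators rather than Python.

```python
def waterfall_step(z_values, z_min, z_max, speed=0.05):
    """Advance one animation frame: drop each point and wrap it to the top."""
    span = z_max - z_min
    return [z - speed + span if z - speed < z_min else z - speed for z in z_values]

print(waterfall_step([0.02, 3.7, 9.9], z_min=0.0, z_max=10.0))
```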

Output Projection for Vennel Staircase:

  • The final output generated within TouchDesigner, incorporating all visual enhancements and interactive elements, was prepared for projection onto a display screen.
  • Utilizing appropriate projection equipment, the enhanced visualization of the Vennel Staircase was projected within the exhibition space, providing viewers with an immersive and engaging experience of the digitally transformed environment.

Field Work

In terms of scene scanning, we scanned eight subjects in total, including four large scenes and two special objects that we processed further in other software during post-production.

Firstly, we chose EFI as the test site for our first scan and recorded the detailed scanning process so that future scans would go more smoothly. We chose EFI because it offers a wide environment, its rooms are arranged in a regular manner, and there is little foot traffic, which makes data collection more convenient. To enhance the collection of data, a two-phase approach was implemented. Initially, a comprehensive scan of the classrooms was conducted, encompassing all corners to delineate the spatial extent of the building, followed by a detailed examination of the interior of each room to capture nuanced features. The second phase concentrated on the internal corridors of the building, employing an inward-to-outward methodology to gather comprehensive spatial data. This approach encompassed the acquisition of positional, chromatic, and intensity data, enabling a meticulous assembly of point cloud data and the identification of optimal capture angles for detailed analysis.

The initial selection for our primary scanning endeavor was the Edinburgh College of Art (ECA). Given our nascent acquaintance with LiDAR scanning technology at this juncture, we opted for an environment with which we held a profound familiarity: the ECA Main Building. Specifically, our focus extended to the West Court and Sculpture Court within that edifice, alongside the interlinking corridor facilitating access between these locales. Notably, these areas not only serve as focal points for numerous campus activities but also witness considerable pedestrian traffic, thereby affording a rich tapestry of diverse and compelling data. Moreover, given their recurrent role as venues for our academic pursuits and culminating events, we deemed it a singularly poignant endeavor to encapsulate these spaces as emblematic data points.

Subsequently, our exploration extended beyond the confines of the campus to encompass an off-campus locale—the Vennel Steps. This distinctive site not only serves as a pivotal conduit linking Lauriston Place and West Port but also affords a panoramic vantage point overlooking the majestic Edinburgh Castle. Reverberating with the collective memory of the city, the Vennel Steps transcend their utilitarian function to assume the role of a symbolic bridge, forging connections between disparate locales. From a data-centric perspective, this site boasts a singularly unique topography characterized by staggered elevations, punctuated by the surrounding residential structures. Such distinctive features lend themselves to an unparalleled portrayal through the lens of data points, offering a nuanced depiction distinct from that of other locations.

After that, our data collection efforts transitioned towards outdoor environments, with Dean Village—an esteemed historic locale nestled in the north-western precincts of Edinburgh—emerging as the focal area for our endeavors. Our initial foray led us to the environs of the Water of Leith Walkway, where we meticulously scanned a wooden bridge adorning the path. Nestled amidst a serene ambiance, this bridge epitomized solidity, connectivity, and enduring continuity in its steadfast form. Its tangible presence served as a tangible point of embarkation, imbuing our journey with a palpable sense of foundation and materiality. Subsequently, we directed our attention towards a nearby waterfall, strategically chosen for its dynamic character—a departure from the static features of previous scanning locales. This marked our inaugural attempt at capturing data from a substantial, dynamically evolving entity, thereby broadening the scope of our technical proficiency and experiential repertoire.

In addition to large scenes, we also scanned a number of other objects, such as people and some small objects.

