Ruiqi:
1. Presentation Procedure:
Experiencers head to the spots near the brink of the hill — not quite falling off, but close enough for a great view of the city and Arthur’s Seat.
They plop down on the blanket, put on some cardboard VR headsets, and dive into the world as seen and heard by an Irish Wolfhound and a Chihuahua — one tall, one tiny, both dramatic.
Next, they move a bit closer to the hill’s edge — still safe — and can choose to sit down again or stay standing, depending on how brave or cold they’re feeling.
Then, it’s headset time again, this time to experience the world like a slightly confused Labrador: vision a bit blurry, hearing a bit off (or on), but the vibes? We think they match.
2. Presentation Materials:
⭐️Binaural dogs’ heads: Chihuahua / Labrador / Irish Wolfhound
⭐️Beyerdynamic Headphones x3
⭐️Adaptors: 6.5mm to Lightning x2 / to Type-C x1
⭐️Cardboard VR headsets

⭐️Blanket (for experiencers to sit or kneel on, so they can be fully immersed in the dogs’ perspectives)
⭐️Dog chew toy (for immersive decoration)
⭐️Dog treat ball (we chose the blue one because blue and yellow are the colours dogs see most clearly)
⭐️Laminated cue boards

3. Presentation Purpose
In this experience, people will see the world through the eyes (and ears) of three very different breeds of dogs. Ever wondered how a tiny Chihuahua, a giant Irish Wolfhound, or a slightly confused Labrador makes sense of things? Well, now’s your chance.
4. How Are Human Beings Connected to the Dogs?

5. Lovely Moment—Warm Group Photo

Zixuan:
Today, we officially held our immersive exhibition on Calton Hill. The event went smoothly overall, and we received many valuable pieces of feedback that gave us a lot to reflect on.
This exhibition was divided into two main sections:
The first section focused on the auditory differences between large and small dogs. We set this part up on the grass, where we laid out a picnic blanket so participants could choose to sit or lie down—bringing their physical perspective closer to that of a dog. To enhance immersion, we also placed real props—the same balls and dog toys shown in the video—on the lawn. This allowed participants to see the objects from both a dog’s perspective (in VR) and their own, deepening the contrast between the two and reinforcing the immersive effect.

The second section took place near the edge of the hill, where participants had a clear view of Arthur’s Seat and the surrounding landscape. The scenery was stunning, and we found that the natural beauty of the environment made the contrast with the “sensory impairment” content (blind/deaf dogs) even more striking and poignant.
To maintain immersion, we scheduled the presentation at nearly the same time of day as when we shot our footage. This ensured that the lighting and sun angle matched, providing a consistent and believable visual experience.
One fun and unexpected moment during the exhibition was when a dog that happened to be playing on the hill ran off with one of our toy props—a treat ball from the setup. It seemed to really enjoy it, and the moment brought a smile to everyone’s face, making the experience feel even more alive.

🎧 Feedback & Reflections
After the participants finished the experience, we had short conversations with many of them to gather feedback. Based on their responses, I reflected on a few aspects that could be improved:
- VR Headset and Accessibility: Some participants had vision issues, such as nearsightedness or astigmatism, which made the cardboard VR headset uncomfortable to use. Those with astigmatism saw double images, while nearsighted users reported dizziness. This made me realize that we hadn’t fully considered the diversity of user needs during the design phase, and future iterations should take this into account.
- Video Transition Breaks Immersion: In the second part of the experience, participants had to manually switch between the two video segments, which broke the sense of immersion for a few people. I’m considering having both videos play automatically in sequence, which would create a smoother, uninterrupted experience (see the sketch after this list).
- Headphone Isolation Issue: Our headphones weren’t effective at blocking out ambient noise, so some of the more delicate audio details we designed were lost in the background. In the future, we may need more closed-back or noise-cancelling headphones to preserve the full quality of the soundscape.
- Insightful Feedback from Professor Jules: Professor Jules offered some thoughtful suggestions after experiencing the work. He recommended experimenting with bone-conduction audio to simulate a more realistic, first-person auditory experience. He also noted that the sense of depth in the hearing experience, especially between large and small dogs, could be more pronounced. This was a detail I hadn’t fully considered before, and I plan to improve it in the next version of the sound mix.
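For the video-transition point above, here is a minimal sketch of how the two segments could be chained automatically. It assumes the clips are played from a simple web page on the phone inside the cardboard headset; the element id and file names are hypothetical, so this is one possible direction rather than a description of our current setup.

```ts
// Minimal sketch: chain the two POV clips so the second starts automatically.
// Assumptions: playback happens in a simple web page on the phone inside the
// cardboard headset; the element id and file names below are hypothetical.
const player = document.querySelector("#dog-pov-player") as HTMLVideoElement;
const clips = ["labrador_part1.mp4", "labrador_part2.mp4"]; // hypothetical file names
let current = 0;

function playNext(): void {
  if (current >= clips.length) return; // both segments have finished
  player.src = clips[current];
  current += 1;
  void player.play(); // play() returns a Promise in modern browsers
}

player.addEventListener("ended", playNext); // jump straight into the next clip
playNext(); // start the first segment
```

Listening for the `ended` event means the second clip starts the moment the first one finishes, so nobody has to take the headset off to tap the screen.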
Carly:
On the 2nd of April, we presented our project to the world at the top of Calton Hill. We met two hours beforehand so we had enough time to get up there, reserve our chosen spot and set everything up. When 16:00 arrived, I went to the top of the stairs to meet Jules and Andrew. Jules experienced our presentation first, while Andrew visited the other group’s; when they had finished, they swapped, and Andrew came to experience ours.
We got some great feedback from our professors, classmates, colleagues and friends. Some said they felt like a dog, some would have loved more movement, and some found it really interesting that the position of the sun matched the one in the footage we were showing them…


