S. J. Zhang


Islands of Sound

We set out to design an interactive interface where people can compose their own ambient sounds by touching different natural and synthetic objects, while projected graphics reflecting each object's unique energy are generated along with the sounds.

We ended up creating an interactive product that works as a wireless speaker as well as a piece of home decor.

To let every user take the experience on the go, we also designed an app.




Physical Computing

User Testing





Piezo Sensor

Capacitive Sensor




2018.11 - 2018.12

A six-week project


How it started

Initial Idea

Our main concept is to inspire people to find the neglected beauty in daily life. While considering what could connect the human mind and the physical world in an intuitive way, we started to think about texture, graphics, and sound. We therefore aimed to design an experience that engages the senses of touch, sight, and hearing.

Aggregations of natural and artificial objects, as well as mixtures of natural and synthesized sound, surround people all the time, yet they might be the last things to be noticed. To reintroduce these combinations, we chose stone, metal, wood, acrylic, and fabric, common natural and artificial materials that make up our living environment, as our design elements.


Based on Interviews

Explore Opportunities

Our first attempt was to figure out what kind of experience participants expect from a speaker. We ran some tests and interviewed a few people (including two musicians, three designers, and five people familiar with neither speakers nor interaction design) to help us develop our idea further. The following is what we noticed during the interviews.

I. Placement Options Do Affect Interaction

We had two placements in mind: the product could sit directly on a table, or it could be mounted on a wall, and unsurprisingly, where the product is placed influences how people interact with it.


Table Based

Expectations from interviewees: a meditation experience with pure, focused sounds and simple interactions.

Challenges: consumes space as home decor; demands a quiet environment; makes graphic projection difficult.

Opportunities: design a table that comes with the product; project graphics on the wall behind the users, or drop the projection entirely to avoid distraction.


Wall Mount (Chosen)

Expectations from interviewees: rich body movements; graphics generated somewhere on the wall during interactions; could be left aside and continue playing sounds.

Challenges: placement of the objects; selection and mixing of sound sources; ways to mount objects, circuits, and sensors on the wall.

Opportunities: linear placement of the objects to give a sense of a keyboard instrument; ambient sounds instead of meditation sounds; a board that could be hung up.


II. Build an Interaction Starting from a Simple Touch

One of the most intuitive things people do when trying to “feel” something is to touch it. When we asked interview participants to describe a material they liked, all of them mentioned its texture. We focused on this point and decided to build the interaction around a simple touch.

So, this is how the interaction works:

Users generate and mix sounds simply by touching different objects; graphics indicate the on/off state of each sound; and the speaker itself can be left aside, still playing.


Visual Inspiration for Design

We took inspiration from sculptures of Isamu Noguchi, works of Arnout Meijer Studio and creations of Olafur Eliasson. We wanted to combine an artful composition of natural materials with programmed lighting effects and a mix of natural and synthetic sounds.

Credit to Cereal Magazine


Credit to Arnout Meijer Studio


Credit to Olafur Eliasson



How do we do it

Designing the Installation

To create a complete sensory experience, we decided to start with the physical objects. There are three main components to the installation:

  • The touch-sensitive physical interface composed of different objects.

  • The sounds triggered by users touching objects.

  • The graphics accompanying the sounds.

We drew a system diagram to better dissect what we needed to accomplish for each part.

System Diagram.png

We then narrowed down to the few objects we wanted to use on the interface: wood, rock, bronze, glass, and a linear element.

We decided to draw grids of conductive paint over non-conductive objects like the wood and rock to make them touch sensitive. We explored two options for routing the signal to the microcontroller.

  i. Draw the paint lines over the front of the board and gather them into wires at the board’s edge. The lines could double as decoration.

  ii. Draw the paint on the back of the objects, connect the wires there, and route all the connections behind the board.

We went with option ii due to the conductivity constraints of the paint: the charge doesn’t travel long distances well.


Trial and Error

Prototyping & User Testing


Capacitive Sensing

Initially, the linear element on the board was only there for visual hierarchy, to balance the composition. Through our user testing, however, we found that most people assumed the linear elements on the board were sliders.

“They look like volume bars.”

“I’d like to have volume levels projected directly on them as a visual feedback.”

Given this, we chose a brass rod as our volume controller. By summing the capacitance readings, we could tell how long a user had been holding the rod, and increase or decrease the volume accordingly.
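A minimal sketch of that hold-duration idea; the threshold and tick rate here are assumptions for illustration, not the values from our actual sketch:

```javascript
// Count readings above a touch threshold to estimate how long the rod
// has been held. The threshold (200) is an illustrative value.
function holdTicks(readings, threshold = 200) {
  return readings.filter(v => v > threshold).length;
}
```

At, say, roughly 60 readings per second, 120 ticks would correspond to about two seconds of holding, which can then be mapped to a proportional volume change.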

Testing conductive paint for physical user input.



Usability Testing

We put one of each of our materials on a white foam board and stuck it to the wall. We observed how users interacted with the objects and interviewed them about their expectations. Their interactions validated a few of our hypotheses, and some of their ideas inspired new ways to improve the experience.




Usability Testing II: Capacitive Readings Weren’t Good Enough

We pasted conductive tape onto two objects and wired them up for the second round of user testing. The interactive experience was disappointing: the readings were sensitive to the physical environment and varied from one user to another. We had to find another way to stabilize the sensor readings.

The volume rod worked well, since it relied only on the duration of touch rather than a precise capacitance value.


New Sensor

A piezo disc (knock sensor) senses vibration when connected directly to an analog pin (+) and ground (−). It proved both sensitive and stable, and unlike capacitive sensing, it works almost regardless of environmental influence.

We attached piezo sensors under the objects, replacing the conductive tape. The Arduino now receives much more responsive signals to trigger the subsequent instructions.
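The gist of the knock detection can be sketched as follows. In practice this logic runs against the Arduino's analog readings; it is shown here in JavaScript, and both the threshold and the debounce window are assumptions to be tuned per sensor:

```javascript
// A reading above the threshold counts as a tap; a short cooldown
// afterwards prevents one physical tap from firing twice.
function makeKnockDetector(threshold = 100, debounceTicks = 10) {
  let cooldown = 0;
  return function onReading(value) {
    if (cooldown > 0) { cooldown -= 1; return false; } // still debouncing
    if (value > threshold) { cooldown = debounceTicks; return true; }
    return false;
  };
}
```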


New Plan

At this point, we have the following components:

I. Input:

  1. Four piezo discs, each connected to a 10 MΩ fixed resistor (in parallel), to pins A0–A3 and ground.

  2. Three pieces of foil (the conductive materials), each connected to a 10 MΩ fixed resistor (in series), to pins D2, D6, and D7, with D4 as the receive pin.

II. Serial Port

  1. A Bluetooth module connected to 5 V power, D0 (TX), D1 (RX), and ground.


Usability Testing III: Unclear Instructions & Noisy Environment

The shape of our objects confused users, as they looked like knobs. Most users didn’t know how or where to start.

The third round of user testing also took place in a noisy environment, similar to the show we wanted to present at. Even though the installation itself worked fine, users couldn’t get a satisfying sound-composing experience because of the noise.

Solution: Instruction & Headphone

  1. We decided to project the instruction “light tap on us” on the board when it is not being interacted with.

  2. For the ITP Show, we decided to set up headphones instead of speakers, making the composing interaction a single-user experience.


The Making

Generative Graphic

For the visual elements projected on the board to indicate the status of each object, we programmed generative graphics that reflect the amplitude and frequency of the sound. From pixelated water ripples to fading ellipses, from single and multiple particle systems to bezier waves, we tested the visual effects on both laptop monitors and the physical projected interface, iterating several times to get closer to our expectations.



Water Ripples Effect: Object Status

We simulated a dynamic expanding-and-fading water ripple effect, mapping the amplitude of each sound track to the radius of its ripples to show the on/off status of each object.

In the end, the size of the water ripples was reduced to fit the limited projection surface.
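A minimal model of that mapping, with illustrative constants rather than our exact p5.js code:

```javascript
// The track's amplitude caps the maximum ripple radius, so a muted
// (off) object shows no ripple at all.
function maxRadiusFor(amplitude, base = 120) {
  return base * Math.min(1, Math.max(0, amplitude));
}

// Each frame the ring expands; its opacity fades with radius, and it
// restarts from the centre once it reaches the maximum radius.
function stepRipple(radius, grow, maxR) {
  if (maxR <= 0) return { radius: 0, alpha: 0 };  // track is off
  const r = (radius + grow) % maxR;               // expand, then restart
  return { radius: r, alpha: 1 - r / maxR };      // fade as it expands
}
```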


Frequency and Amplitude

In the beginning, we tried particle systems to reflect the frequency and amplitude of the sounds. We had a hard time managing multiple particle systems because of the nested classes, and the result turned out far more complicated and visually messy than we expected.


Bezier Wave

We decided to keep the graphics simple to avoid distracting users from enjoying the sound itself. People are already familiar with the shape of sound waves, so to build on that while simplifying the visual, we created two bezier waves reflecting the low/bass and high/mid frequencies, respectively. We also programmed them to move gently when no sound is active, so they feel inviting to users.
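One way such a wave can be derived: the band's energy scales the wave's amplitude on top of a small idle motion, so the wave keeps breathing even in silence. All names and constants here are illustrative, not our exact p5.js code:

```javascript
// Compute the anchor points of one wave.
// bandEnergy: 0..1 energy of the low or high frequency band.
// t: time in seconds, driving the gentle idle motion.
function wavePoints(bandEnergy, t, width = 800, midY = 300, n = 8) {
  const idle = 5;                       // small idle amplitude keeps it inviting
  const amp = idle + bandEnergy * 80;   // louder band -> taller wave
  const pts = [];
  for (let i = 0; i <= n; i++) {
    const x = (i / n) * width;
    pts.push({ x, y: midY + amp * Math.sin(t + (i / n) * Math.PI * 2) });
  }
  return pts;
}
```

In a p5.js sketch, these points would then be joined with bezier curves each frame.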


How it was made


While exploring different materials for the board, we decided to add a Bluetooth chip to our microcontroller so that the only cord into the board is for power. This configuration also makes it easier to move the board between places: for the complete experience, we only need to plug in the power and map the projection, while the powered board on its own can run a sound-only mode.


The Board

Since we need to mount objects onto the board, the board needs to be sturdy. Both acrylic and wood work for the purpose, but since we also need some depth to hide the circuits and speakers in the back, something that comes with a hollowed recess would be helpful. Luckily, we found a lazy Susan at the size we needed, with a removable bearing base.


The Cover

We found the perfect fabric to cover the surface: a light grey wool with a slight linen-like texture that is soft and stable. Taking inspiration from the construction of mattress covers, we sewed a cover with an elastic back for the board. When it is mounted or hung against the wall, there is only one noticeable seam around the edge, leaving a clean side. There are more seams on the cover, but they mostly run along the grain, so they are barely noticeable.


The Interface

When constructing the real interface with objects at scale, we asked a few musicians to arrange their ideal compositions. They told us that if we want users to actually remember the sounds associated with the objects and purposefully compose something, we should go for a linear arrangement reminiscent of a piano keyboard and other MIDI instruments.

We layered a few objects to give them some height, and now the board looks like a landscape from the side.


Volume Control

Our original idea was that users could slide a finger along the metal rod to adjust the volume. Due to the unpredictable nature of capacitive sensing, we couldn’t get accurate data on where the contact point between rod and finger was. We opted for a different solution: the rod is separated into two parts, and touching one increases the volume while touching the other decreases it. We bent metal strips to fix the rod onto the board.
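The resulting control logic is simple; here is a sketch with an assumed step size:

```javascript
// The two halves of the rod map to two discrete actions.
// 'part' is 'up' or 'down'; the step size (0.05) is illustrative.
function adjustVolume(volume, part, step = 0.05) {
  const next = part === 'up' ? volume + step : volume - step;
  return Math.min(1, Math.max(0, next)); // keep volume within [0, 1]
}
```

Calling this once per detected touch tick makes holding a half of the rod ramp the volume smoothly.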



We drilled through the wooden board, led the wires to the back, and connected them to the Arduino. The Arduino is then connected to a Bluetooth chip that communicates wirelessly with a computer.

Circuits, Arduino Board and Bluetooth chip on the back.



Get to the Mix

The Sound

Due to limited time and our limited ability to compose sound or music, we chose free sound samples for the project. This came with two problems: first, it was hard to mix four sounds with different tempos and beats; second, it is up to the users when to turn each sound on or off.

To deal with these two problems, we edited the sounds, matched their tempos manually, and cut them to the same duration. We also rewrote the p5.js sketch to loop all the sounds from the beginning at zero volume, ensuring that whatever sound a user triggers is always part of our melody.
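The always-looping approach can be sketched like this. Track names and the binary 0/1 volumes are placeholders (a real sketch would likely fade p5.js SoundFile volumes smoothly); the point is that every track stays on the shared timeline, so toggling volume never breaks the tempo:

```javascript
// All tracks start looping together at volume 0; a tap only toggles
// a track's volume, never restarts it, keeping everything in sync.
function makeMixer(trackNames) {
  const volumes = Object.fromEntries(trackNames.map(n => [n, 0]));
  return {
    toggle(name) {                       // called when an object is tapped
      volumes[name] = volumes[name] > 0 ? 0 : 1;
      return volumes[name];
    },
    volume: name => volumes[name],
  };
}
```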


Manual and Digital

Projection Mapping

  1. We used Syphon to capture the canvas from the browser in real time.

  2. We created a round mask to cut out what we needed from the sketch, then fixed the skewed shape by adjusting the perspective of the output media in Isadora.

  3. Physically, we added a cut-out filter in front of the projector lens to block the “colorful black” rectangle that would otherwise be projected around the board.


Hello world

Introducing Islands of Sound

After five weeks of testing and pivoting, we finally put all the components together. The board itself is wireless, using Bluetooth to communicate with a nearby computer. The computer receives serial data from the sensors on the board, renders the graphics, and plays the audio in the browser. Finally, Isadora (projection-mapping software) captures the canvas via Syphon and sends the graphics to the projector on the ceiling.

Here is a demo:


ITP Winter Show 2018 

This project was selected for the ITP Winter Show, a two-day exhibition of recent creative interactive projects by the students of ITP and IMA. We had the opportunity to show industry professionals, as well as friends and family, what we had been up to at school.

Over two days and roughly eight hours of opening time, we welcomed more than two hundred guests who listened to and interacted with the project. We received overwhelmingly positive responses from everyone, from gallerists and artists to high school students and advertising veterans.

A guest experiencing the project. *All guests consented to having their pictures taken at the show.



Thank you for making it to the bottom!

Throughout this process, we wanted users to be able to take the experience with them, so we designed an app.