Are we losing touch with sound? The growing trend of non-tactile controllers
With the increasing presence of touch screens in both studio and live applications, it’s important to consider what their impact may be on our creative workflows. The convergence of our daily tools into one place has meant we are often obsessed with our screens. I have students unable to detach themselves from what may be happening on social media. I see it in bars and cafes; people staring at their phones while half listening to friends directly opposite them. And I often see people driving while tapping away – my biggest pet peeve of them all. I’m by no means innocent of this distraction. I regularly find myself zoning out, staring at my phone when I should be paying attention (though never behind the wheel). These devices are not only a distraction, they are also a barrier. They demand our participation in a visual task, and the attention that takes often drags us away from other important interactions.
So is it possible that these devices will take our attention away from more important aspects of our audio work?
The first indications I observed of how touchscreens were going to affect music production came through basic apps that aimed to simulate instruments: a simple piano keyboard that played notes when you pressed down (though lacking the dynamics of any physical counterpart), a drum kit you could tap “ba-dum-tiss” on, a flute you could play by blowing into the mic and covering fake holes on the screen. And hundreds, if not thousands, more as app stores grew across each platform. But I, like many, never believed these would come to replace the physical necessity for pianos, drums and flutes. The tools on tablets are becoming more sophisticated, though, and some of the instruments are far less primitive than the ones I have just mentioned. The apps from Moog and NinjaJam have shown that some of these interactions can be fun and genuinely creative for music makers.
Animoog uses slider keys to make up for the lack of dynamic control
Some of this functionality and integration has been excellent; the Slate Raven systems have made their way into beautiful studios around the world and are being shown off by many a famous artist and mix engineer in Steven Slate’s typically slick way. The new controller implementation for Logic Pro X has given many users a control surface through their phones and tablets. The Avid S6 dock has brought together one of the most powerful DAW controllers and one of the most flexible tablet devices, albeit at a very high price point. And there are many more examples of these types of products. The benefits of these controls are obvious. Apps are cheap, and tablets are relatively inexpensive compared to your average hardware controller. Faders, potentiometers and everything that comes with manufacturing those mechanics are a cost that can now be represented visually in a simple GUI. The interfaces are often customisable and are compatible with a wide range of software using protocols like OSC. Tablet interaction is becoming very important in the live sector, with engineers wanting to roam the venue while still maintaining control – although many will return to the main board to mix the actual show. The benefits are there for creator and consumer, amateur and professional alike. It is a popular tool with a lot of uses.
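To give a sense of how lightweight that OSC glue can be, here is a minimal sketch using the python-osc library. The host, port and the address "/1/fader1" are assumptions for illustration – they mimic the kind of layout a typical tablet controller app might expose, not any fixed standard – and the receiving DAW or mixer would need to be configured to listen for them.

```python
# A minimal sketch of sending a fader move over OSC (assumed setup, not a
# definitive integration): the receiving app is presumed to listen on
# 127.0.0.1:8000 and to map "/1/fader1" to a channel fader.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8000)  # host and port of the OSC receiver

# Fader positions are commonly sent as normalised floats between 0.0 and 1.0
client.send_message("/1/fader1", 0.75)
```

The same handful of lines, pointed at a venue’s wireless network instead of localhost, is essentially what lets an engineer wander the room with a tablet while the console follows along.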
However, there is still a fundamental issue with all of these integrations – the need for screens.
The use of a screen to interact with an action is one that requires visual attention – not just with touchscreens, but even with your standard monitor. Mixing with a mouse is one of the most tedious and dull experiences for any musical creative. Yet even omitting the mouse, touchscreens are still not the best experience when you demand responsive feedback. The lack of tactile response from a flat piece of glass means our eyes are always focused on the action and not always on the intent. This was my first reaction to working with touchscreens in the studio. Each tweak I made to the mix had me looking down at the screen, placing my finger on a specific piece of the glass and hoping it would respond appropriately to the movement I was making. It may be that I am simply a creature of habit and that sitting in front of a bank of faders feels like a more natural interaction.
Tuna: Does adding physical controls help, or defeat the object?
So where can screens improve in regards to control? Some have attempted to add accessories to the surface to allow for a more hands-on experience. The Kickstarter-funded kits from Tuna showed there was a market of twiddlers interested in expanding the glass experience. This time it was DJs who wanted a way to get away from staring at the screen and instead do their Guetta-dance and jab the sky with their free hand while the crowd screams. Or something.
Haptic force feedback could also be an option, although it is still a little primitive at this stage. The physical object is still missing, but you can at least receive some sense of feedback from your actions without staring down. So maybe we need to really think outside the box.
MIT’s Sean Follmer and his malleable surface.
Enter the Human-Computer Interaction (HCI) crowd. In recent years, HCI has led the way in trying to push the boundaries of the screen. Devices with moderate success include the Leap Motion 3D optical detector and the Myo gesture-recognising armband. Both have seen integrations with music and continue to find interesting places in performance and research alike. Through my playing around with them, I stumbled across Sean Follmer and the crew in the Tangible Media Group at MIT. Among some of their amazing creations, I stopped in awe at the Tunable Clay project: “Tunable Clay uses material stiffness as an extra dimension for 3D modeling in its malleable interface”. Can I make faders? Pots? Could this be a fully customisable physical control surface that still allows for all the dynamics and control a console gives us? The brain wonders…
Can malleable controls work with an iPad? Yup…
Another project from this department is the Behind-the-Tablet Jamming Interface: “The Behind-the-Tablet Jamming Interface enables malleable input with varying stiffness as haptic feedback, while avoiding occlusions with on-screen content”. And that’s where we come back to the screen. Avoiding occlusions is a nice way of putting it. The team’s aim is to get you working with the physical material and, in this case, haptic feedback. Maybe this could be out of the box enough!
How much do you use the mixer as an instrument?
So what about the occlusions of visual content we currently face in the studio today? Consider the days before DAWs dominated our sightlines, when the mix engineer only had a meter bridge to contend with for visual stimulus. All the focus was on the audible sweet spot and not the visual one. And yes, all the lovely meters and visuals we get from the likes of Ozone, T-racks et al are amazing and have their place – but the most important thing was always listening, and then reacting in a dynamic, physical way.
Audio and visual stimuli are known to overlap, and often they support each other. Information missing from the visual field can be found in audio cues, and missing audio information can be recovered from visual cues – think of following a single voice at a noisy party, the cocktail party effect, for example…
Sometimes one can even trick the other, such as with the McGurk effect…
So are we potentially being tricked, as in the McGurk effect, or enhanced, as in the cocktail party effect?
This is known as cross-modal interaction and is studied quite extensively in psychology. The brain is constantly interpreting a complex array of sensory inputs to give you information, but these can be coded incorrectly and form illusions. Studies like the one conducted by McGurk and MacDonald in 1976 showed that vision often dominates when we interpret conflicting signals, and the “illusory flash” effect demonstrated by Shams, Kamitani and Shimojo in 2000 showed that the interference can run the other way too, with sound altering what we think we see. Our eyes often dominate our ears and may tell our brains a different, less accurate story.
Many times in the studio I will switch off all screens to focus on the faders and what is coming out of the monitors, with fewer visual distractions to sway my judgement. My eyes can often remain closed and I will still be able to locate the appropriate fader and adjust as I need. To me, the fader is a physical object holding the reins of a sound and keeping it in its place. In my hands, it gives me the power to shape dynamic changes and relate them to the other physical objects around it. Like a key sitting in a piano, each has its place and each will be used in a way that gives ebb and flow to a piece of music. It gives me an aspect of order and control I don’t get from the glowing glass of a tablet.
Are we often seeing too much?
Like the apps pretending to be pianos, drums and flutes, I don’t believe touch screens will completely replace the humble pot and fader just yet, although they have certainly made a case for ditching the mouse. The dynamic intricacies of building a mix on a physical, mechanical surface still live on in the art form of mixing. As the audio industry adds more and more functionality through these devices, more technology companies are looking for pathways away from the barriers of screen interaction. I’m not saying we should abolish screens in the studio, but I think more engineers should be aware of how their perceptions may be swayed by too much visual stimulus. The audio industry needs to begin looking beyond the glass and think about how we can return to interactions that rely more on our raw audio perception than on being glued to a screen.