Music making with iOS; the pros and cons; part 4; the touchscreen

So far, in this short series of short(ish) articles looking at the pros and cons of iOS as a music technology platform – and intended to provide a balanced view so that those considering what iOS might have to offer them are better placed to judge – we have considered the issues of storage format (part 1), size (physical footprint; part 2) and price (hardware vs software; part 3). In part 4, I’d like to look at one of the most obvious highlight features of iOS (and any modern smartphone/tablet); the touchscreen.

Getting touchy feely; the downsides

Touchscreen control of software is something that, on the whole, makes for a very intuitive user experience. I’ll come to some of the very specific ways that this is a clear positive for the musician in a minute but, even for general functions such as email and web browsing on a device designed to be mobile (although it can work quite happily in your office), a user interface that doesn’t require you to lug a keyboard or mouse around is an obvious bonus.

It’s not all ‘pros’ though and, for musicians, I think there are some obvious examples where the touchscreen actually makes certain tasks more difficult. In short, I think these really boil down to a couple of practical issues. First, touchscreens require you to touch them (doh!), and that means your finger(s) end up covering the very thing – and in a music software context, that might be a control in a synth or a note in a MIDI sequence – that you need to be able to see in order to adjust it properly. Second, touchscreens are flat (so are most computer screens; double doh!), so virtual representations of controls, knobs, faders, strings or piano keys, while looking the part, don’t give you the same tactile response when you ‘touch’ them.

Real keyboard vs virtual keyboard? Which gives the superior playing experience… ? And which is easier to put in your pocket for use on your commute…?

Let’s take one example that perhaps illustrates both of these issues. When you are using a software synth, guitar amp sim, DAW/sequencer, etc., and you reach out to adjust a rotary-style knob, even in a UI where things have been kept fairly spacious, the odds are your finger then covers that control. Depending upon what sort of touch/gesture the developer has built into the UI, you might then have to drag up/down or left/right, or even in an actual ‘rotary’ direction, to adjust the control.

So, your finger (and perhaps even the rest of your hand) starts by covering up the very thing you are trying to adjust (the first problem outlined above) and, as you make the gesture to adjust the control, you don’t really get the same sense of ‘control’ that you would have with a physical rotary knob (the second problem outlined above). Yep, it works and, yep, developers can add some useful pop-up graphics elsewhere within the display that make it easier to see what you are doing as you adjust the control (and these are rather elegantly implemented in some apps), but it is nowhere near as gratifying or as intuitive as real hardware.
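For the curious, the drag-to-adjust idea is easy to sketch in code. The snippet below (Python, purely illustrative – the function name and the sensitivity value are my own assumptions, not any particular app’s API) shows how a vertical drag distance is typically mapped onto a knob’s normalised parameter value:

```python
def drag_to_value(current, dy_pixels, sensitivity=0.005):
    """Map a vertical drag (in pixels, upward = positive) onto a
    normalised 0.0-1.0 knob value, clamped at both ends.

    'sensitivity' trades off precision against finger travel: a
    smaller value means finer control but longer drags.
    """
    return max(0.0, min(1.0, current + dy_pixels * sensitivity))
```

A developer tunes the sensitivity (and often shows a pop-up readout elsewhere on screen) precisely because, as described above, your finger is sitting on top of the control while you drag.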

Ooohhh…. that feels so good…. :-)

However, for someone (like me) who spends a lot of time in a DAW/sequencer on their iOS setup, I think there is another ‘my finger is in the way’ example that is perhaps more irritating (and can get old very quickly); MIDI editing. Yes, it is something of a minor wonder to be able to record and edit MIDI data on a device as compact as an iPhone or iPad but, having played in my MIDI parts (and my keyboard skills are not great), there is generally some tidying up to be done. With a touchscreen, this is not something I ever really look forward to.

Steinberg have done a great job with the MIDI Editor within Cubasis…. but that doesn’t stop my fingers getting in the way of actually seeing what I’m doing when I use it.

It doesn’t really matter what the editing task is – moving notes, changing their length, copy/paste tasks, manually adding notes or adjusting MIDI CC data (e.g. note velocity) – your fingers just get in the way. I’ve tried the majority of iOS sequencers and, to one extent or another, I think they all suffer from this issue. Things can be a little better if you use a stylus designed for a touchscreen (and the Apple Pencil is good in this context but not exactly cheap) but I really think MIDI editing is one example where the computer mouse is actually a far superior tool for getting a job done. And if you are fond of a bit of photo editing or graphics creation on an iPhone or iPad (or other touchscreen device) then you will suffer the same sorts of practical issues there; in this regard, the touchscreen can be a bit of a workflow limiter.

Perhaps the other obvious example of the second issue – the lack of tactile feedback from the flat touchscreen – has two sides to it but let’s start with the downsides; the rather unsatisfying nature of playing a virtual piano keyboard. Yes, it’s great that we can play keyboard parts in via the touchscreen as we sit on the bus going to/from work, but it’s not the same as interacting with ‘real’ piano keys. It’s difficult to give your performance expression when the instrument (in this case, the touchscreen) doesn’t physically respond in some way to your touch.

Getting touchy feely; the upsides

So much for the obvious cons of the touchscreen; what about the pros? Well, let’s start with the obvious one and it’s the flip-side of the last point made above. Music app developers are a pretty smart bunch on the whole and a good number of them have spent a considerable amount of time trying to design touchscreen interfaces that alleviate that rather unsatisfying ‘not so tactile’ playing experience. This has been attempted in various ways but, at their most innovative, these ‘performance surfaces’ have actually brought us completely new ways to ‘play’ music.

At one level, this is simply the (still clever) on-the-fly redesign of a virtual-piano keyboard so that it only includes the notes in your preferred key/scale combination. Animoog perhaps paved the way here and others have followed that lead. No, it’s still not a really tactile experience but it certainly makes the touchscreen more easily playable…. and fewer duff notes to start with then means less MIDI editing later (see above).
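The key/scale-constrained keyboard idea is simple to express in code. The sketch below (Python, illustrative only – the names and the two scale tables are my own assumptions, not Animoog’s actual implementation) builds the set of MIDI note numbers such a keyboard would expose:

```python
# Scale degrees as semitone offsets from the root note
SCALES = {
    "major": [0, 2, 4, 5, 7, 9, 11],
    "minor": [0, 2, 3, 5, 7, 8, 10],
}

def scale_notes(root, scale, low=0, high=127):
    """All MIDI note numbers in the given key/scale within [low, high];
    this is the note set a scale-constrained keyboard would display."""
    degrees = SCALES[scale]
    return [n for n in range(low, high + 1) if (n - root) % 12 in degrees]
```

For example, C major from middle C (MIDI note 60) up one octave gives eight playable keys instead of thirteen – every one of them ‘in key’, which is exactly why there are fewer duff notes to edit later.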

Animoog – including a brilliant reinvention of the virtual piano keyboard.

At the next level we have also seen some brilliant ‘MIDI performance’ apps developed for iOS. These often do away with the whole virtual piano keyboard concept and give the user something very different as a means of generating MIDI data. Apps such as Chordion, ChordPolyPad and Navichord come to mind as obvious examples but there are others. And, what’s more, you don’t really need traditional piano keyboard skills to get something useful out of these apps. Indeed, some of them are so good that they are more than enough of an excuse to hook your iPad up to your desktop music production system and use them for MIDI data entry there also.

With an app like Chordion, whole chords can be triggered with a single button…. touchscreen MIDI data entry made easy….

Of course, if you want to go even further down this touchscreen ‘reinvent the performance interface’ route, you can. This is, perhaps, where there are some real design gems within iOS….. brought to us by those developers that have thought some distance outside the box. I’d perhaps put an app such as ThumbJam in this group. It takes the concept of an X-Y pad (which is another concept that translates in a powerful and flexible fashion to the touchscreen) and turns it into a very expressive means of playing a virtual instrument.
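The appeal of the X-Y pad translates very naturally to code: a single touch position drives two parameters at once. A minimal sketch (Python, purely illustrative; the names here are my own, not any app’s API) might look like this:

```python
def xy_to_params(x, y, width, height):
    """Map a touch point inside a width x height X-Y pad onto two
    normalised 0.0-1.0 parameters (y is inverted so 'up' = higher),
    clamped so touches at the pad edges stay in range."""
    u = max(0.0, min(1.0, x / width))
    v = max(0.0, min(1.0, 1.0 - y / height))
    return u, v
```

One finger position might then drive, say, pitch and volume, or filter cutoff and resonance, simultaneously – the kind of two-dimensional expression that is genuinely awkward with a mouse.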

ThumbJam is one of the most expressive performance interfaces you can find in an iOS music app…. a very clever bit of design.

For all its simplicity as an app, Figure is another example. It might be limited in many ways but, as a means of creating what are essentially MIDI sequences, it is a UI that almost anyone can enjoy and exploit.

Figure – the app itself might have a streamlined feature set but the method for making music is clever, creative and very accessible even to those without traditional musical skills.

However, perhaps my personal favourite in this area is an app that I think is an under-appreciated gem; Oscilab. This combines elements of that key/scale constraint approach with a sequencing system that is based upon drawing note ‘flows’ into a sequencing grid and then modulating those flows based upon various waveforms. This aspect of the design is both innovative and – to me at least – incredibly musical. To a large extent, the design does away with the problem of MIDI note editing described above because note editing doesn’t really involve editing individual notes…. Instead you edit the ‘flow’ of those notes within a pattern and Oscilab pitch quantizes that ‘flow’ to the chosen key/scale. For EDM music styles it’s a great tool and one that I hope the developers can keep plugging away at.
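The pitch-quantizing idea at the heart of this approach can be sketched very simply (Python, illustrative only – this is not Oscilab’s actual code, and the example scale is my own assumption): every point along the drawn ‘flow’ is snapped to the nearest note in the chosen key/scale.

```python
# Allowed notes for one octave of C major (an assumed example scale)
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]

def quantize(pitch, allowed=C_MAJOR):
    """Snap a continuous pitch value (e.g. sampled from a drawn curve)
    to the nearest permitted MIDI note, so every note lands in key."""
    return min(allowed, key=lambda n: abs(n - pitch))
```

Because the quantizer guarantees every note is in key, you can modulate the flow quite freely and the result stays musical – which is exactly why this design sidesteps so much note-by-note editing.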

Oscilab – a brilliant bit of touchscreen design. It’s a sequencer but not as we know it….

Oscilab is one example of many where an iOS music app developer has reinvented what a musical instrument might be for the touchscreen; there are plenty of others. I think it’s great when a developer doesn’t just try to recreate the features of a ‘real’ instrument or piece of equipment (although that approach can also be great). Instead, they reimagine the key functions of that instrument or equipment for the touchscreen device. In this way – and when the developer gets it right – the touchscreen becomes one of the biggest ‘pros’ there is for the iOS music maker.

And there’s more….?

This is an area in which I’m sure there is still plenty of scope for some future surprises…. while not always the perfect ‘input device’, the touchscreen is something that can be exploited in some incredibly flexible ways…. but it does take a bit of a special mind within the development process to deliver something both novel and functional. For me, however, this is one of the most interesting and exciting aspects of iOS music technology….. and I’ll happily live with the (relatively minor) cons of the touchscreen to experience the benefits it can bring when a developer delivers something novel, brilliant and genuinely useful to the musician.


    Comments

    1. I was going to hold back this comment as I suspect you will cover it in a future article. I am currently looking for a way to perform live with the iPad. In my mind I need to connect external controllers and I’m exploring the new crop of mini keyboards – some with mini keys, some with regular-size keys – but within the 25-key limitation. I also want to connect other MIDI controllers like the Casio horn and possibly the drumKAT, plus a real guitar, and I just snagged the iRig BlueBoard. This is my attempt to get over the limitations of a piece of glass as a controller. That being said, I totally agree with your comments about how some developers have interfaces that are brilliant for controlling with glass. These are exciting and frustrating times in the Wild West of iOS!

    2. Still having issues emailing you back replies but this post goes well with your third question about our needs.
      What if Apple were to announce 3D Touch (Force Touch) on the iPad, next week?
      Glad you mentioned ThumbJam and Animoog. To me, they really take advantage of the touchscreen. On iPhone, ThumbJam supports Force Touch in a very useful way. (Animoog also supports it, AFAICT, but it’s less obvious.) Seaboard 5D is another example, and the whole “Multidimensional Polyphonic Expression” thing (MPE) might really emerge from iOS apps.

      And since you talk about tactile feedback, it makes me think about support for “taptic” response. Haven’t heard of a music app which takes advantage of it (and my iPhone 6s Plus has very limited taptic support as compared to the iPhone 7). Yet it’s easy to imagine some virtual sliders and XY pads which make you feel your gestures (knobs are probably less amenable to this kind of haptic/taptic treatment).

      And to piggyback on Eric’s point, part of the issue with most iOS apps in terms of UI is that they combine control and sound production. Sure, there are some apps which are only meant as controllers (AC Sabre being a rather interesting case). And Audio Units may eventually pave the way to apps which delegate control to other apps. But most synth apps (and romplers, etc.) devote most of their interfaces to an on-screen virtual keyboard which is very limited in usefulness. For anyone wanting to use an iOS device in performance, it’d make a lot of sense to separate control and sound production.
      One way to achieve this could be with separate devices. In this sense, an iPhone running AC Sabre can be a controller for a synth running on an iPad. Or a laptop can drive audio effects running on an iPhone. Not to mention that we can use external MIDI controllers through USB, MIDI, or Bluetooth. (Like Eric, using a 25 key device, on occasion. Been mostly focused on a Wind Controller.)

      In my experience, touchscreen devices excel at two types of “control”: innovative ways to send MIDI/OSC messages (à la TouchOSC, ThumbJam, AC Sabre, Animoog, etc.) or ways to tweak multiple aspects of sound at the same time (XY pads, multi-sliders, etc.). Of course, those overlap in that you can use MIDI to modulate a sound. But there’s some usefulness in the conceptual difference between sending notes and doing a filter sweep.

      To my eye, the GUIs of most iOS musicking apps remain way too cluttered, especially for live performance. As we know from visually impaired users, a touchscreen need not focus on visual elements. While you’re shredding, you’re usually not looking at your hands. So, having as minimal an interface as possible can actually help iOS musickers quite a bit. The Klevgränd Kanvas GUI may be a step in that direction, but it’s probably not so much of a victory in terms of ergonomics.

      Part of the issue, though, might be that the App Store focuses so much on screenshots. An app which only displays two XY pads is less likely to sell than something inspired by the “spaghetti” mess of modular synthesis. Yet, for musicking, we often care more about the sounds we can produce than about having as many virtual buttons, wires, and knobs as possible.

      Imagine an app which gives you all the power of Camel Audio’s Alchemy in terms of sound, focused its on-screen elements on controlling the sound (with the eight pads and/or some XY pads), supported the accelerometer, gyroscope, and 3D Touch, and relied on MIDI input for notes and such. No on-screen keyboard, not even knobs. You prepare your patches before performance and you play by controlling the things it makes sense to control while playing. You could have another app, running on another device, which would focus on notes and “expression”.

      In a way, this could be part of a kind of iOS MainStage.

      Not holding my breath. But it’d sure help Apple maintain its dominance in mobile musicking.

    3. Knobs! Aargh! If you were designing a new touchscreen interface then tiny circular knobs are probably the last thing you would want for ease of use and accuracy of parameter changes. I tend to map as many controls as possible to my MIDI devices and leave the screen for more touching and visual cues.

    4. If you are a fan of Omnisphere, as I am, their TR app is a gem. Wireless control over the local network. The Orb is an advanced XY controller. Fun for hours. Can’t praise Spectrasonics enough. Constant inspiration.
      TC-11 by Bit Shape is also a nice synth using touch in a different way. It’s visually pleasing too. Mirror the screen to a large monitor via Apple TV. Fun for hours.
