The mobile device has brought the freedom to make music almost anywhere. Whether it’s iOS or Android, smartphone or tablet, music apps let you indulge your music-making habit – on the bus, on the train or on the beach – the choice is yours. And while, as yet, the software available on these mobile platforms cannot match its desktop/laptop counterparts for power and features, it is still pretty amazing – GarageBand, NanoStudio, AmpKit+, djay, Animoog, AmpliTube, Figure, ThumbJam, SynthStation, iMS-20 and many others – all take a bow. These apps are capable of some seriously good results in their own right, and the fact that they run on a device you can slip into your coat pocket makes them even more remarkable.
For many users, myself included, this combination of portability and power is enough to sell the idea of music making on a smartphone or tablet and to convince me that the technology has something serious to offer. However, there is another aspect of these devices whose significance is easy to miss when you are busy being impressed by the ‘studio in your pocket’ concept – the introduction of the touch screen to the music-making masses.
Control at your fingertips
It is pretty easy to predict that touch-screen technology will eventually filter down into the mainstream laptop and desktop world (indeed, if you are prepared to pay for it, you can already get touch-screen monitors) but, for now, it remains the domain of the smartphone and tablet – and one of the technology’s unique selling points, as it creates a very intuitive means of interacting with the software.
This is true of the music software we now all use on a daily basis – virtual faders to slide on your mixer, virtual knobs to rotate on your amp sim and virtual keyboards to play on your synth or piano instruments. These are all great in their own right and, if portability is ultra-important (or just a necessary convenience), they mean you don’t need to lug extra bits of kit around with your smart device – you can get by with just the touch screen and the virtual controls provided by the software developers.
Less virtual, more innovative
Anyone who regularly uses music software, whether in a desktop/laptop or mobile environment, will be used to the virtual recreations that software developers provide of devices that also exist in a hardware format. Virtual instruments, amp sims and effects processors are the obvious examples.
It’s easy to understand why this approach works. Present a guitar player with something like Guitar Rig or Amp Farm and it won’t take them long to find their way around, because the software is designed to look and operate like a virtual version of the real (hardware-based) thing. Indeed, the graphical representation often goes far enough that the guitarist can tell which specific amp or cab model is being represented in software, or which ‘real’ wah-wah pedal they are sticking in their virtual effects pedal board. The same applies to synths and effects processors, where virtual emulations of specific hardware units are common practice. Again, the reason for this design is obvious – the devices present a familiar-looking interface to the user and are therefore instantly accessible.
All of which is great – but there is something else going on as well. Take a look at apps like Figure, ThumbJam, Animoog and even good, old Garageband, and you can also see evidence that some developers are starting to think outside of the ‘virtual’ box. Instead of providing the user with a software emulation of a familiar hardware interface, they are providing an alternative take – one that is less about recreation and more about innovation.
And the key thing here is that the user interacts with the software via a touch-screen. Virtual knobs, sliders and piano keys are fine but, when you have a touch-screen interface, are they the only way, or even the best way, that a musician might control their software?
Musical by design
Propellerhead’s Figure is one of the better examples here. While there are all sorts of things missing from the app in its current state of development (the ability to save your compositions being one of the more significant!), it does have two impressive features: great sounds, and instruments that, instead of emulating a traditional hardware control surface, are designed from the ground up to make the best use of the touch-screen interface.
So, the lead and bass instruments don’t have a piano keyboard in sight. Instead, we are presented with a ‘playable’ section of the screen that allows the user to control both pitch and some aspect of the synth’s sonic qualities by simply tapping or swiping their finger across this playable zone. And used in combination with the Rhythm, Range and Scale Steps controls, you can easily – and very intuitively – create all sorts of very musical phrases. In short, it is a brilliant piece of design that has absolutely nothing to do with a traditional ‘instrument interface’ that a musician might be familiar with from the world of hardware.
ThumbJam’s playing surface – where you can limit the available notes to a particular scale, apply vibrato by moving your fingertip and pitch-bend by tilting your iPhone or iPad – is another brilliant piece of innovation. Despite what must be a somewhat stripped-back sample base, the experience of playing the cello, violin or string section instruments via this interface is quite something. It is incredibly expressive. I can imagine the same instrument with a further section of the screen reserved for selecting different string articulations (legato, pizzicato, staccato, etc.) and, as someone who is primarily a guitar player, I would absolutely prefer this method of data entry over a traditional piano/MIDI keyboard. GarageBand’s various Smart Instruments and Animoog’s scale-specific keyboard hint at the same thing – developers realising that they don’t have to clone traditional hardware, and that there are alternative, more innovative and perhaps more musical ways to get a performance out of a virtual instrument (or to control an effect) than emulating the knobs, buttons, faders, strings and keys we have used for so long, and are so used to, in the hardware world.
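Under the hood, these scale-limited playing surfaces all rest on the same simple idea: a continuous touch position is snapped to the nearest note of a chosen scale, so a swipe can only ever produce in-scale pitches. Here is a minimal sketch of that idea – the function names, scale and note range are illustrative, not taken from any of the apps mentioned:

```python
# Semitone offsets of the C major scale within one octave.
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]

def touch_to_midi_note(x, scale=C_MAJOR, root=60, octaves=2):
    """Map a horizontal touch position x in [0, 1] to a MIDI note
    drawn only from the given scale, starting at the root note."""
    # Build the full list of allowed notes across the playable range.
    notes = [root + 12 * octave + step
             for octave in range(octaves)
             for step in scale]
    # Snap the touch position to one of the allowed notes.
    index = min(int(x * len(notes)), len(notes) - 1)
    return notes[index]

# A left-to-right swipe: every note lands in C major.
print([touch_to_midi_note(x / 10) for x in range(10)])
```

However many fingers you throw at a surface like this, nothing dissonant can come out – which is exactly why these interfaces feel so immediately musical to play.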
The next step (or two)
Personally, I think (and hope) that the apps mentioned above represent just the start. Yes, there will always be a place for the ‘hardware clone’ approach – it works very well when implemented in an appropriate fashion – but, for those app developers prepared to think a little differently, the potential to create something new, exciting, innovative, musical and, like Figure, just outright fun to use is considerable. The world of music software development for touch-screen smart devices is still a young one. Let’s hope the innovations we are seeing now are just the first step…
Another aspect of the innovative control options that smart-device touch screens provide also interests me – how smart-device technology can be fed back into the desktop studio environment. With apps like Figure, ThumbJam or GarageBand, the touch screen of my iPhone or iPad provides an alternative, and very intuitive, means of creating music but, when it comes to my project studio built around a desktop computer, I’m still stuck with my mouse, computer keyboard and a traditional MIDI master keyboard (oh, and the occasional bit of MIDI guitar, but that’s a control option I’m still constantly disappointed by despite having tried almost every product available over the last 20 years).
However, my iPhone or iPad already has a role in my desktop studio, as I regularly use Steinberg’s Cubase iC or Neyrinck’s excellent V-Control Pro to control my desktop DAW (usually Cubase but also Reason and, occasionally, Logic). This technology allows my iDevice to link seamlessly to my desktop DAW via Wi-Fi and the communication is, in the main, both solid and responsive. So, what about taking this to the next stage? When can I have an interface like Figure’s sitting on my iPad (for example) and use it to play (rather than just control) a virtual instrument running in Reason (or Cubase or Logic)?
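The transport side of that next stage is not exotic: playing a desktop instrument from a tablet amounts to sending note messages across the local network, for example as Open Sound Control (OSC) packets over UDP – a protocol family many DAW-remote tools already speak over Wi-Fi. A minimal sketch of the sending end, with a hypothetical address pattern and port that a real receiver would define for itself:

```python
import socket
import struct

def osc_pad(data):
    """Pad bytes to a multiple of 4, as the OSC spec requires."""
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address, *ints):
    """Build an OSC message carrying int32 arguments only."""
    msg = osc_pad(address.encode("ascii") + b"\x00")   # address pattern
    msg += osc_pad(b"," + b"i" * len(ints) + b"\x00")  # type tag string
    for value in ints:
        msg += struct.pack(">i", value)                # big-endian int32
    return msg

def send_note(host, port, note, velocity):
    """Fire a note message at a listener on the LAN (hypothetical address)."""
    packet = osc_message("/instrument/note", note, velocity)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, (host, port))
    sock.close()

# e.g. send_note("192.168.1.10", 9000, 60, 100)  # middle C, velocity 100
```

The hard part, of course, is not the plumbing but the instrument-side design – building desktop plug-ins that expect this kind of expressive, touch-driven input rather than plain MIDI keyboard data.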
We will have to wait a while before 27” touch-screen monitors with the iPad’s screen quality and responsiveness are affordable enough to hit the mainstream, but the smartphone and tablet are here now, the ability to link them to a desktop system is here now – all the technology is already in place. All we need is for an innovative software developer to come along and realise that there might be some money to be made from capitalising on this potential. I, for one, can’t wait to see it happen.
All of the apps mentioned in this article are available on the iTunes App Store