Anyone remember when music technology meant hardware synths, drum machines and MIDI cables? OK, I’m being a bit flippant here because these technologies are still a very valid part of both live and recording contexts. However, if you do remember when this was all there was, then I suspect you also remember the times it didn’t work – and nobody could work out why.
Of course, we also went through the same head-scratching when we moved over to computers, with MIDI streaming in from these external hardware devices and, a little while later, MIDI data whizzing around inside the computer from one virtual MIDI device to another. However, virtual MIDI is now a pretty mature technology and, if you are lucky enough to use one of the top-flight DAWs on your desktop of choice, you can now safely (well, pretty safely) send MIDI data wherever you want, whenever you want. At its best, it is sophisticated, slick and, very importantly, keeps everything locked vice-like to a tempo dictated by whichever MIDI device within your system is identified as the ’master’.
The state of the (iOS) art
Now we have iOS and it’s cute, hi-tech, exciting and has lots to offer the performing or recording musician… but it’s also new and, as yet, still in the spotty teenager stage rather than mature – energetic and thrusting – adulthood. And we all know what that means – it can occasionally be stroppy and unpredictable – and still nobody can work out why.
In terms of routing audio, things in iOS land are actually in pretty good shape. Background audio, Audiobus and, more recently, inter-app audio (IAA), while still not without their own particular limitations or quirks, are generally straightforward to use and they get the job done. But when it comes to MIDI? Well, let’s just say that there is still some room for improvement.
There is not too much problem now in getting basic MIDI note or controller data into or out of most iOS music apps. So, for example, if you want to record MIDI data into your DAW app and then send it out to a virtual iOS synth or two, then that can usually be achieved without too much difficulty. And if you do get a little stuck, then perhaps something like MidiBridge – a very useful MIDI utility app by developer Audeonic – can help you establish MIDI communication between apps that might not otherwise be happy to chat.
However, MIDI is not all about note and controller data; it also serves a synchronisation function, allowing you to use one MIDI device to control the transport functions (start, stop, tempo) of other ’slave’ devices. In the days of hardware, a typical situation might have had a hardware sequencer acting as a master and being used to trigger a drum machine (playing drum patterns) and a synth (playing step sequenced or arpeggio patterns).
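For reference, MIDI sync is carried by a handful of single-byte System Real-Time messages – Start (0xFA), Continue (0xFB), Stop (0xFC) and Timing Clock (0xF8) – with the clock byte sent 24 times per quarter note. A slave recovers the master’s tempo purely from the spacing of those clock bytes. A minimal Python sketch of that arithmetic (illustrative only, not any particular app’s implementation):

```python
# MIDI System Real-Time status bytes (as defined in the MIDI 1.0 spec)
CLOCK, START, CONTINUE, STOP = 0xF8, 0xFA, 0xFB, 0xFC

PPQN = 24  # MIDI clock pulses per quarter note

def clock_interval_seconds(bpm):
    """Time between successive 0xF8 clock bytes at a given tempo."""
    return 60.0 / (bpm * PPQN)

def bpm_from_interval(seconds):
    """Recover the master's tempo from the spacing of its clock bytes."""
    return 60.0 / (seconds * PPQN)

# At 120 BPM, a clock byte arrives roughly every 20.8 ms
interval = clock_interval_seconds(120)
print(round(interval * 1000, 1))              # → 20.8 (ms between clocks)
print(round(bpm_from_interval(interval), 1))  # → 120.0 (tempo recovered)
```

Any jitter or latency in delivering those clock bytes translates directly into tempo wobble at the slave end, which is why tight timing matters so much here.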
This is probably where the biggest problems still lie in iOS MIDI. Some apps provide MIDI clock sync support and some don’t. Of those that do, not all of them can serve as both master and slave with the user being able to configure which role they might play in a given situation (for example, Cubasis can currently only act as a master). And even when you do find a configuration that, in principle, ought to work (for example, one app acting as master and a second acting as slave), the teenager can sometimes kick in and things don’t always work. Perhaps your slave just refuses to start, starts only intermittently or, if it does start, there is some latency in the system and the timing between the two devices is simply not tight enough.
The solution, of course, is for Apple and every app developer to really get to grips with Core MIDI and to ensure these wonderful music apps we have available to us are speaking the same (MIDI) language when we need them to communicate. While this sounds like it ought to be a simple enough task (after all, a similar process has proceeded at some speed for audio), we still seem to be some distance from seeing this happen.
All aboard the (Midi) bus
There is, however, what might be a very useful little chink of light that has just appeared on the horizon: MidiBus. Like MidiBridge, this is the work of Audeonic and, also like MidiBridge, MidiBus is, on the surface, a rather unglamorous MIDI utility app that offers one (OK, perhaps two) basic functions.
In the case of MidiBus that main function is to act as a MIDI clock sync master device. And, so you can judge something about the quality of that MIDI clock sync, the second function is to provide monitoring of the MIDI data; you can see how the clock varies over time.
Actually using MidiBus itself couldn’t be more straightforward. In the Transport screen, you can see all the currently running MIDI-enabled apps in a list and can tap these to toggle them on as destinations for MidiBus to send MIDI sync data to. If you engage ’play’ in the MidiBus transport controls, then MidiBus will start transmission.
What happens next depends upon the MIDI capabilities of the target apps and – the more complex bit of the process – the configuration of the MIDI settings within each of those apps. This is where the fun starts because there is little by way of consistency here and it can often take a little digging and head scratching to find the settings you might want. And, of course, some of that digging might be in vain as quite a number of iOS music apps cannot currently accept MIDI clock sync.
All this means, I think, that MidiBus is perhaps just a wee bit ahead of its time – but perhaps also a bit of a wake up call (kick in the a**) to the iOS music app development world – as there is a good number of apps that still need some work on the MIDI front.
That said, if you chuck MidiBus at some of those apps that do respond to MIDI sync, how do things work out?
To start my testing, I picked a couple of apps that offer support for receiving MIDI clock data – DM1 and Thesys. Once I’d made the necessary settings within their MIDI configuration options, I was easily able to get them to start and stop via MidiBus, and they seemed to lock pretty well together in terms of tempo (although I know there are MIDI geeks who are way more sensitive to MIDI timing than me, so this is something you really do have to test for yourself). Having a pattern-based sequencer app like Thesys lock to a drum machine app is a lot of fun and, if you like to perform or compose using pattern-based tools, it’s great to be able to sync these sorts of apps together.
Flipping to the MidiBus Monitoring screen suggested that the clock data issued by MidiBus is pretty solid in terms of its timing with zero variation in tempo and low latency (about 0.4ms). This is obviously very welcome, although how tightly the slave apps respond is, of course, also dependent upon their MIDI coding, not just that of MidiBus.
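To put that 0.4ms figure in context: at 24 pulses per quarter note, the gap between clock bytes at 120 BPM is around 20.8ms, so the reported latency amounts to only a couple of percent of one clock interval. A quick back-of-envelope check (my own arithmetic; the 120 BPM reference tempo is just an illustrative choice):

```python
PPQN = 24
bpm = 120

clock_gap_ms = 60_000.0 / (bpm * PPQN)  # gap between 0xF8 clock bytes, in ms
latency_ms = 0.4                         # latency figure shown by the monitor

print(round(clock_gap_ms, 2))                      # → 20.83
print(round(100 * latency_ms / clock_gap_ms, 1))   # → 1.9 (% of one clock gap)
```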
Both Cubasis and Auria can generate MIDI clock data of their own so, assuming MidiBus’ monitoring provides an accurate means of observation, I was interested to see just how stable those clocks might be. The results were quite interesting. With Cubasis, the tempo variation averaged around +/- 0.7% and the latency around 0.5ms. With Auria, the tempo variation averaged +/- 0.5% (although the reporting of it seemed a little odd) and the latency was 0.6ms. In both cases, I had no other apps running in the background and was just using a project containing four stereo audio tracks.
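To translate those percentages into something more tangible, a +/- 0.7% variation works out as a swing of under one BPM at typical song tempos. The conversion is trivial (again, the 120 BPM reference is my illustrative assumption, not the tempo of the test project):

```python
def bpm_swing(bpm, variation_pct):
    """Absolute tempo swing implied by a +/- percentage variation."""
    return bpm * variation_pct / 100.0

# Measured variation figures, applied to an illustrative 120 BPM tempo
for app, pct in [("Cubasis", 0.7), ("Auria", 0.5)]:
    print(app, round(bpm_swing(120, pct), 2))  # Cubasis → 0.84, Auria → 0.6
```

Fractions of a BPM sound small, but a slave chasing a wandering clock can still drift audibly against a straight-ahead drum pattern, which is why a rock-steady master is preferable.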
Clearly, if the data MidiBus generates from its Monitor display is accurate – and assuming that there are no other variables or data bottlenecks at play here – the clock data generated by MidiBus itself is more consistent than that of either Cubasis or Auria. If you are particular about your MIDI sync, MidiBus would be well worth experimenting with.
A test too far?
Just for the heck of it, I then tried a test that I fully expected to be problematic. I used Secret Base Design’s Apollo to establish a MIDI over Bluetooth connection between my iPad Air and iPad 3rd gen. I then closed my eyes, crossed my fingers and tried to use MidiBus running on my iPad Air to send MIDI clock data to DM1 running on my iPad 3rd gen.
The results were – perhaps as I had expected – just a little unpredictable. Sometimes DM1 would trigger and sometimes it would not. On other occasions, it would trigger but not match the tempo set by MidiBus. It would seem that – at present at least – that extra layer of communication involved in the transmission of the MIDI clock data was just a step too far. Clearly, however, there is some potential here and, as and when all the wrinkles are ironed out (wherever those wrinkles might be in this particular data chain), the prospect of an ‘iPad band’ all sync’ed together via MidiBus would be an interesting musical experiment.
As I suggested earlier, I think MidiBus is a much-needed and very useful MIDI utility app. Audeonic’s website has information for developers on how support for the library of code can be implemented in apps and suggests that it is not a difficult task. If that’s the case, then it would be great to see MidiBus act as a bit of a nudge in getting us closer to a more uniform, consistent and reliable MIDI communication environment under iOS.
At UK£1.99, MidiBus is unlikely to break anyone’s bank and, if you use independent MIDI apps such as drum machines or synths with step sequencer features as part of your music making, then MidiBus might be right up your street. As a means of locking those sorts of apps together – providing their MIDI implementation is up to it of course – MidiBus is a simple and effective tool. It might not be the most glamorous or exciting of apps to look at but, as one of those very useful utility apps that can help everything else run smoothly, MidiBus is well worth having to hand.