If what remains of your hair (!) is mostly grey, then the odds are you might remember the wonder of MIDI’s first appearance. Having been subject to various bits of development work for a couple of years (and with a significant contribution from Sequential Circuits, makers of the Prophet synth so recently modelled for iOS as the iProphet), the first Musical Instrument Digital Interface (MIDI) standard/specification was published in 1983, having been agreed by companies such as Roland, Yamaha, Korg, Oberheim and Sequential themselves.
MIDI brought a new paradigm to workflows, whether in the context of live performance or in the recording studio. It allowed you to connect MIDI-capable pieces of equipment together and send digital data between them. One keyboard (or, indeed, many keyboards) could be controlled from another (master) keyboard. Equally, with allied developments in personal computer technology and the concept of sampling, it was not long before MIDI data was being recorded and sequenced by computer software. The age of computer-based recording was upon us.
However grand that all sounds, that’s not to say that the early adopters of MIDI didn’t face some fairly significant teething problems. Troubleshooting complex MIDI rigs therefore became a common task and, for those who really understood what the technology was doing, there was some gainful employment available helping those who didn’t but who could see the potential and wanted to use the technology.
While it has been refined and improved over the years, MIDI still forms the core protocol for many tasks in the digital recording studio, whether that’s for triggering synths or virtual instruments or automating mix parameters within your DAW/sequencer. Some 30+ years after it first appeared, MIDI is still alive and kicking and embedded in the vast majority of music technology equipment in one form or another. Slowly, however, over the course of a number of years, manufacturers and software developers mastered the MIDI protocol and, on the whole, MIDI in hardware form is now a pretty painless experience.
However, the same is not always true once you get to MIDI in a computer environment, particularly when you start to get involved with ‘virtual’ MIDI (software-based MIDI connections between multiple virtual instruments). Our two most common desktop operating systems – Windows and Mac OS – have different underlying code that handles MIDI and, as a consequence, software developers working across both platforms have often faced some interesting challenges in ensuring compatibility and smooth operation. That said, the technology has come a long way and, in recent years, under desktop operating systems, MIDI is usually a solid and dependable (if occasionally quirky and imperfect) part of the overall music technology system.
New OS, old problems
If part of the challenge for music software manufacturers has been dealing with MIDI under Windows and OS X, then it is perhaps not surprising that a new OS – in this case iOS for Apple’s mobile computing platform – might create a new set of potential problems.
Equally, iOS has seen a whole bunch of new software developers come to the market and, while the likes of Steinberg, Korg or Yamaha might have plenty of MIDI programming experience under their belts, for lots of indie app developers, this was not the case. In short, therefore, it is perhaps not that surprising that, under iOS, MIDI implementation within apps, and MIDI communication between apps, has been a bit of a bumpy ride.
I’ve been dabbling with iOS music technology for a number of years (and some of you, I’m sure, for longer than I have) but, in the last 2+ years of running the Music App Blog, I’ve quite often made comments about less-than-perfect MIDI support in the apps I’ve been reviewing. I am, on the whole, sympathetic to developers in their struggles with MIDI under iOS, but that doesn’t mean that I enjoy apps with incomplete MIDI specs, MIDI sync that doesn’t work (or, just as bad, works inconsistently) or apps that ought to talk to one another via MIDI but, for no reason that I can work out, don’t.
Fortunately, there are dedicated folk out there who want to put that right. Perhaps one of the best examples of this is developer Audeonic. I reviewed their excellent MidiBridge app back at the start of 2013 (and it has recently been updated to provide iOS8 support) and I’ve lost count of the number of times MidiBridge has dug me out of a MIDI hole by allowing me to make MIDI connections between iOS apps that otherwise just refused to talk to one another. Like Audiobus for audio communications between apps, MidiBridge is a very useful utility for solving MIDI-based data flow problems.
Equally, I was impressed with their MidiBus app – a MIDI Clock utility that can act as a ‘master’ MIDI Clock source and, as a result, ensure all your other iOS MIDI apps are being driven from the same, precise, clock source. The app also acts as a MIDI monitoring tool so you can see how the clock data from your various other apps varies (no, of course it shouldn’t, but it does) over time.
MidiBus is, however, also a library of code. Just as developers can incorporate the Audiobus SDK into their apps to ensure audio communications follow a standardised protocol, so they can incorporate the MidiBus library (it’s free to do so) to ensure that their apps follow a standard MIDI communication protocol.
Having chatted with Nic Grant from Audeonic about the improving, but still somewhat sorry, state of MIDI under iOS, he was kind enough to provide me with some very informed insights into where he thinks we are currently at. His thoughts make for an interesting read and – with Nic’s permission – much of the discussion that follows below is based on his input. If you would like a somewhat more informed take on why iOS MIDI doesn’t always work as it should and, importantly, what developers might do about it (in Nic’s view at least), then read on…. I’ll then chip in again at the end…. over to Nic….
A short (iOS) history
Apple introduced CoreMIDI to iOS in iOS 4.3 (it had been around on Mac OS X for ages) – before then, only proprietary MIDI hardware under the ‘MFi’ banner was available, such as Line 6’s MIDI Mobilizer and the Akai SynthStation.
The advent of CoreMIDI was ground-breaking but brought with it a new set of problems. The documentation was scant and it was up to each developer to implement MIDI in their apps as best they could. Some guidance was provided by members of the OMAC (Open Music App Collaboration) group but the dissemination of this guidance was fairly limited. Pete Goodliffe created an open source class, ‘PGMidi’, but this only painted part of the picture and many developers dropped the class into their apps and hoped for the best.
The end result was that a very large number of apps did not work well with each other (or didn’t work at all) even for basic MIDI interaction, let alone for synchronising with other apps or devices. Developers were going it alone, and user confusion and dissatisfaction soon followed.
Over time, developers came to understand better what was required of a ‘well behaved’ MIDI app and things improved somewhat, yet a large proportion of music apps remained hamstrung by broken or incomplete MIDI facilities; even the ‘big guns’, who should have known better, failed to get it right.
Enter the MidiBus library
One of the more comprehensive MIDI-based apps was Audeonic’s MidiBridge, which is often used to ‘glue’ together apps with MIDI shortcomings. Audeonic has now attempted to provide developers with an easy-to-use, robust and feature-rich library that they can easily integrate into their apps, resulting in a ‘well behaved’ MIDI app out of the box, along with access to highly accurate MIDI clock output, MIDI topology management and correct handling of system exclusive messages, amongst other features.
Essentially, the MidiBus library is MidiBridge’s CoreMIDI engine neatly packaged for re-use, along with the OMAC fast-switch implementation and a new MIDI clock generator. Additionally, it defines an app’s virtual MIDI ports correctly and provides a database-like view of the device’s MIDI landscape, with all types of ports (physical, network and virtual) presented in one homogeneous fashion.
The aim of the library is to ensure that apps implementing it ‘just work’ together, without the developer needing to give much thought to the matter.
The MidiBus app
In conjunction with the library, Audeonic released the first ‘MidiBus powered’ app – the creatively named MidiBus. The features of the app are detailed elsewhere but as well as providing a handy master clock for an iOS device the app was designed to showcase the MidiBus library in a real-life working app.
Developers have gradually become aware of the library (the ‘bus’ in the name probably helped raise its profile a bit) and the list of apps sporting the library is growing steadily, as shown at the MidiBus site. Users of MidiBus powered apps have come to appreciate the ease with which they interact with each other for live playing and synchronisation.
Let’s look at the main features of the library in a bit more detail:
Virtual MIDI ports
Internally, CoreMIDI manages an app’s virtual ports differently from all other ports, and a common pitfall is failing to take that into account. This is why many apps’ own virtual inputs never seem to work. It is something that MidiBus takes care of – all ports can be treated in the same way.
Creating virtual ports is also a little fiddly with CoreMIDI. For optimum interactivity, an app should name its virtual ports simply after the app itself and nothing more. This makes it apparent to users of other apps exactly which app they are dealing with. In MidiBus, your virtual port names are identical to the name of the app.
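To give a flavour of what the library is smoothing over, here is a minimal Swift sketch of the underlying plain-CoreMIDI calls (this is not the MidiBus API itself, and ‘MySynth’ is just a hypothetical app name):

```swift
import CoreMIDI

// Create a MIDI client plus a pair of virtual ports named exactly
// after the app, so other apps see one clearly labelled entry for it.
var client = MIDIClientRef()
var virtualOut = MIDIEndpointRef()
var virtualIn = MIDIEndpointRef()
let appName = "MySynth" as CFString  // hypothetical app name

_ = MIDIClientCreateWithBlock(appName, &client) { _ in
    // Called whenever the MIDI setup changes (apps/devices come and go).
}

// Virtual source: other apps connect to this to receive our MIDI output.
_ = MIDISourceCreate(client, appName, &virtualOut)

// Virtual destination: other apps send to this; we parse the packets here.
_ = MIDIDestinationCreateWithBlock(client, appName, &virtualIn) { packetList, _ in
    // Walk the MIDIPacketList and handle each incoming message.
    _ = packetList
}
```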
MIDI Topology Queries
Apps, devices and network connections can come and go over the lifecycle of an application. MidiBus implements a notification scheme whereby, whenever there is any change to the MIDI landscape, your app is notified that a change has taken place.
An app can carry out a query at any time on all the MIDI devices in the system. A filter is passed to the query, allowing a subset of ports to be retrieved – for example, all apps that can receive MIDI, or all physical devices.
Query results are a set of interfaces which pair up any MIDI in with its corresponding MIDI out, so if an app wishes to display MIDI ports in pairs, as in traditional hardware MIDI (or as in MidiBridge), then this is possible too.
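For comparison, the raw CoreMIDI enumeration that a query layer like this has to wrap looks roughly like the Swift sketch below (MidiBus’s own query and filter calls are not reproduced here; endpointName is a helper of my own):

```swift
import CoreMIDI

// Read the display name of any endpoint, falling back if the query fails.
func endpointName(_ endpoint: MIDIEndpointRef) -> String {
    var name: Unmanaged<CFString>?
    if MIDIObjectGetStringProperty(endpoint, kMIDIPropertyDisplayName, &name) != noErr {
        return "(unnamed)"
    }
    return (name?.takeRetainedValue() as String?) ?? "(unnamed)"
}

// List every MIDI input and output currently visible to the system –
// physical, network and virtual ports all appear in this one flat view.
for i in 0..<MIDIGetNumberOfSources() {
    print("IN : \(endpointName(MIDIGetSource(i)))")
}
for i in 0..<MIDIGetNumberOfDestinations() {
    print("OUT: \(endpointName(MIDIGetDestination(i)))")
}
```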
OMAC App fast-switching
The ability to switch from app to app via each app’s UI was pioneered by Audeonic as part of the OMAC group. This was later replicated in Audiobus but the OMAC version has lived on and has since been extended to permit switching between apps using a MIDI controller.
MidiBus powered apps automatically receive this feature and when an app is sent an OMAC MIDI-switch message (message format is open and published) it will initiate a switch to the requested app.
Using a ‘management’ app such as MidiBridge, it is therefore possible to program buttons on a synth or controller and move to various apps simply by pushing a button.
MIDI Clock sync
MIDI sync has been extremely problematic under iOS, both sending and receiving. Creating a stable clock signal requires a good understanding of the MIDI sync protocol and also of how CoreMIDI works.
The MidiBus library includes a built-in clock generator so developers can send a proven and reliable clock signal from their own apps (just as accurate as the MidiBus app itself).
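The underlying arithmetic is simple enough – MIDI Clock runs at 24 pulses per quarter note, so the tick interval at a given tempo is 60 / (BPM × 24) seconds – but the stability comes from timestamping packets ahead of time rather than firing them from a timer callback. A hedged Swift sketch of sending a single timestamped tick with plain CoreMIDI (not the MidiBus generator itself) might look like this:

```swift
import CoreMIDI

// Interval between MIDI Clock ticks (0xF8) in nanoseconds:
// 24 ticks per quarter note, so 60 / (bpm * 24) seconds per tick.
// (Converting nanoseconds to mach host-time units via
// mach_timebase_info is omitted here for brevity.)
func clockIntervalNanos(bpm: Double) -> UInt64 {
    UInt64(60.0 / (bpm * 24.0) * 1_000_000_000)
}

// Send one Timing Clock byte, stamped for a future host time so that
// CoreMIDI (not our app's scheduler) handles the precise delivery.
func sendClockTick(port: MIDIPortRef, dest: MIDIEndpointRef, hostTime: MIDITimeStamp) {
    var packetList = MIDIPacketList()
    let packet = MIDIPacketListInit(&packetList)
    var tick: UInt8 = 0xF8  // MIDI real-time Timing Clock message
    _ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                          packet, hostTime, 1, &tick)
    _ = MIDISend(port, dest, &packetList)
}
```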
Sending clock is one half of the sync problem. Receiving/slaving is the other side of the coin but, because there are so many differing audio engines in use by developers, it would be virtually impossible for a MIDI library to impose an audio syncing structure on an app’s architecture.
Rather than attempting to force a particular audio sync architecture, the MidiBus library comes with a suggested model (example code) for slaving an app to its own clock output, which is looped back and fed into the app.
The neat trick here is that the app in question is then instantly capable of slaving either to its own clock or to another source, since the logic for doing so is the same. In addition, it is possible to configure MidiBus to present the clock events in advance, so that the app has time to schedule audio (or MIDI) events according to the timestamps of the incoming signal.
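Whatever model an app adopts, the slaving arithmetic itself is the mirror image of sending: each incoming 0xF8 tick is 1/24 of a quarter note, so tempo can be derived from the host-time gap between consecutive ticks. A rough Swift sketch (my own illustration, not MidiBus’s example code):

```swift
import CoreMIDI
import Darwin  // mach_timebase_info

// Convert mach host-time ticks to seconds.
var timebase = mach_timebase_info_data_t()
_ = mach_timebase_info(&timebase)
let secondsPerHostTick = Double(timebase.numer) / Double(timebase.denom) / 1_000_000_000

var lastTickTime: MIDITimeStamp = 0

// Call this for every incoming 0xF8 byte, using the packet's timestamp.
func handleClockTick(at hostTime: MIDITimeStamp) {
    if lastTickTime != 0 {
        let tickSeconds = Double(hostTime - lastTickTime) * secondsPerHostTick
        let bpm = 60.0 / (tickSeconds * 24.0)
        // Raw tick-to-tick values jitter; smooth over several ticks
        // before adjusting the audio engine's transport.
        print(String(format: "incoming tempo ≈ %.2f BPM", bpm))
    }
    lastTickTime = hostTime
}
```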
Audeonic makes the library available at no charge to developers and actively supports it – developers who sign up for the library are given access to the Audeonic developer forum for posting questions, requesting features and generally keeping up to date with the library.
Future enhancements are expected to add MIDI Time Code (MTC) sync facilities and the ability to load custom MIDI processing filters into the MidiBus engine itself.
If you are an app developer about to embark on a new app (or want to make an existing app work better MIDI-wise) you really ought to check out the MidiBus library – Audeonic really has made adding first-rate MIDI support a ‘no-brainer’. As a final sweetener, the MidiBus library is also available for Mac OS X, so if you have cross-platform apps it will provide the same benefits there.
Can we fix it? Yes we can!
I don’t know about you, but I’m kind of glad my own head is spared having to untangle the complexities of MIDI data flow under iOS and that developers such as Audeonic (and Nic himself) are prepared to do it on my behalf (a long time ago I used to be a coder but I’m definitely in the ‘software user’ category now rather than the ‘software developer’ category).
Compared to audio – with its more tangible nature and the instant gratification many of the wonderful iOS audio effects apps can bring – MIDI is perhaps the unglamorous partner in iOS music technology terms. Indeed, there are lots of iOS musicians who simply ignore it and just record all their synth parts ‘live’ as audio…
This is perhaps a shame as MIDI has a considerable amount to offer the high-tech musician. On the desktop, this is a pretty mature technology and, if you use any of the popular DAW/sequencing environments, then MIDI is a key part of the furniture. It allows you to perfect your synth and virtual instrument performance, control your plugin effects and automate your complete mix and, whether you ever ‘see’ that data or not, it is a vital element of what makes computer-based recording such a powerful tool.
Under iOS, as yet, MIDI can’t quite deliver the same benefits. Given that the OS is still in the grumpy teenager stage (rather than wise, but still thrusting, middle age), maybe this is not altogether surprising. However, if technology such as Audiobus is helping move the iOS experience for musicians closer to that of the desktop, then we really do need MIDI support to keep up and complete the picture.
And, for that to happen, we need DAWs/sequencers that feature comprehensive MIDI specifications (with all the very creative options that brings such as groove quantizing, MIDI-based key/scale harmonization and full mix automation). But, before we can get there, we need MIDI under iOS to be technically transparent to developers… and that’s where a technology such as MidiBus – the Audiobus for MIDI perhaps? – needs to really take off with the development community.
I’ve no idea if that will happen but, if iOS music technology is to continue to move towards maturity (and hence stability), solid, consistent MIDI implementation is an important part of the development pathway. When they are not busy playing an instrument or typing stuff for the blog, my fingers are firmly crossed this is going to come to pass sooner rather than later….
Finally, many thanks to Nic Grant from Audeonic for allowing me to incorporate his detailed understanding of MIDI under iOS into this piece :-) and, if you happen to use an iOS music app that you think has a MIDI spec that still needs some work, then do give the developer a polite nudge. Some of this change is going to be driven by the demands of users so, whether we understand the details of the technology or not, we can all contribute towards encouraging some solutions….