MIDI on iOS – taming the mess

If what remains of your hair (!) is mostly grey, then the odds are you might remember the wonder of MIDI’s first appearance. After a couple of years of development work (and with a significant contribution from Sequential Circuits, makers of the Prophet synth so recently modelled for iOS as the iProphet), the first Musical Instrument Digital Interface (MIDI) specification was published in 1983, having been agreed on by companies such as Roland, Yamaha, Korg, Oberheim and Sequential themselves.

MIDI brought a new paradigm for workflows, whether that was in the context of live performance or in the recording studio. It allowed you to connect MIDI-capable pieces of equipment together and send digital data between them. One keyboard (or, indeed, many keyboards) could be controlled from another (master) keyboard. Equally, with allied developments in personal computer technology and the concept of sampling, it was not long before MIDI data was being recorded and sequenced by computer software. The age of computer-based recording was upon us.
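To give a flavour of just how lightweight that digital data is: a MIDI Note On message is only three bytes on the wire. This is a sketch of the MIDI 1.0 message format (Python used purely for illustration; this is the published wire format, not any particular app’s code):

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a 3-byte MIDI Note On message.

    channel is 0-15 (shown to users as 1-16), note and velocity are 0-127.
    The status byte is 0x90 with the channel packed into the low nibble."""
    if not (0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("value out of range")
    return bytes([0x90 | channel, note, velocity])

def note_off(channel: int, note: int) -> bytes:
    """Matching Note Off (status 0x80); release velocity 0 by convention."""
    if not (0 <= channel <= 15 and 0 <= note <= 127):
        raise ValueError("value out of range")
    return bytes([0x80 | channel, note, 0])

# Middle C (note 60) on channel 1 (index 0) at full velocity:
msg = note_on(0, 60, 127)
```

Three bytes per note event is why a 1983-era serial link could drive a whole rig of synths in real time.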

Would you expect to buy a hardware synth without MIDI connectivity? No, I thought not....  MIDI is an essential part of even the most humble of music technology hardware.


However grand that all sounds, that’s not to say that the early adopters of MIDI didn’t face some fairly significant teething problems. Troubleshooting complex MIDI rigs therefore became a common task and, for those who really understood what the technology was doing, there was gainful employment to be had helping those who didn’t but could see the potential and wanted to use the technology.

While it has been refined and improved over the years, MIDI still forms the core protocol for many tasks in the digital recording studio, whether that’s for triggering synths or virtual instruments or automating mix parameters within your DAW/sequencer. Some 30+ years after it first appeared, MIDI is still alive and kicking and embedded in the vast majority of music technology equipment in one form or another. Slowly, however, over the course of a number of years, manufacturers and software developers mastered the MIDI protocol and, on the whole, MIDI in hardware form is now a pretty painless experience.

However, the same is not always true once you get to MIDI in a computer environment, particularly when you start to get involved with ‘virtual’ MIDI (software-based MIDI connections between multiple virtual instruments). Our two most common desktop operating systems – Windows and Mac OS – have different underlying code that handles MIDI and, as a consequence, software developers working across both platforms have often faced some interesting challenges in ensuring compatibility and smooth operation. However, the technology has come a long way and, in recent years, under desktop operating systems, MIDI is usually a solid and dependable (if occasionally quirky and imperfect) part of the overall music technology system.

MIDI connectivity of music hardware – traditional 5-pin ports or USB-based – is now routine and (generally) reliable technology. When will MIDI under iOS catch up?


New OS, old problems

If part of the challenge for music software manufacturers has been dealing with MIDI under Windows and OS X, then it is perhaps not surprising that a new OS – in this case iOS for Apple’s mobile computing platform – might create a new set of potential problems.

Equally, iOS has seen a whole bunch of new software developers come to the market and, while the likes of Steinberg, Korg or Yamaha might have plenty of MIDI programming experience under their belts, for lots of indie app developers, this was not the case. In short, therefore, it is perhaps not that surprising that, under iOS, MIDI implementation within apps, and MIDI communication between apps, has been a bit of a bumpy ride.

I’ve been dabbling with iOS music technology for a number of years (and some of you, I’m sure, for longer than I have) but, in the last 2+ years of running the Music App Blog, I’ve quite often made comments about less-than-perfect MIDI support in the apps I’ve been reviewing. I am, on the whole, sympathetic to developers in their struggles with MIDI under iOS, but that doesn’t mean that I enjoy apps with incomplete MIDI specs, MIDI sync that doesn’t work (or, just as bad, works inconsistently) or apps that ought to talk to one another via MIDI but, for no reason that I can work out, don’t.

Fortunately, there are dedicated folk out there who want to put that right. Perhaps one of the best examples of this is developer Audeonic. I reviewed their excellent MidiBridge app back at the start of 2013 (and it has recently been updated to provide iOS8 support) and I’ve lost count of the number of times MidiBridge has dug me out of a MIDI hole by allowing me to make MIDI connections between iOS apps that otherwise just refused to talk to one another. Like Audiobus for audio communications between apps, MidiBridge is a very useful utility for solving MIDI-based data flow problems.

MidiBridge – now working quite happily under iOS8 :-)


Equally, I was impressed with their MidiBus app – a MIDI Clock utility that can act as a ‘master’ MIDI Clock source and, as a result, ensure all your other iOS MIDI apps are being driven from the same, precise, clock source. The app also acts as a MIDI monitoring tool so you can see how the clock data from your various other apps varies (no, of course it shouldn’t, but it does) over time.
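MidiBus’s monitor works on live CoreMIDI traffic, but the arithmetic behind any clock monitor is straightforward: MIDI Clock runs at 24 pulses per quarter note, so tempo and jitter can both be recovered from the tick timestamps. A rough sketch of the idea (my own illustration, not Audeonic’s code):

```python
PPQN = 24  # MIDI Clock sends 24 pulses per quarter note

def clock_stats(timestamps):
    """Estimate tempo (BPM) and worst-case jitter (seconds) from a
    list of MIDI Clock tick arrival times, in seconds."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean = sum(intervals) / len(intervals)
    bpm = 60.0 / (mean * PPQN)
    jitter = max(abs(i - mean) for i in intervals)  # worst deviation from the mean
    return bpm, jitter

# A perfectly steady 120 BPM clock ticks every 60 / (120 * 24) seconds:
ticks = [n * (60.0 / (120 * PPQN)) for n in range(25)]
bpm, jitter = clock_stats(ticks)
```

A real app’s clock output will show non-zero jitter here – which is exactly the variation the MidiBus monitor screen makes visible.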

MidiBus is, however, also a library of code. Just as developers can incorporate the Audiobus SDK into their apps to ensure audio communications follow a standardised protocol, so they can incorporate the MidiBus library (it’s free to do so) to ensure that their apps follow a standard MIDI communication protocol.

Having chatted with Nic Grant from Audeonic about the improving, but still somewhat sorry, state of MIDI under iOS, he was kind enough to provide me with some very informed insights into where he thinks we are currently at. His thoughts make for an interesting read and – with Nic’s permission – much of the discussion that follows below is based on his input. If you would like a somewhat more informed take on why iOS MIDI doesn’t always work as it should and, importantly, what developers might do about it (in Nic’s view at least), then read on…. I’ll then chip in again at the end….  over to Nic….

MidiBus will provide you with data about the MIDI clock quality through its Monitor screen.


A short (iOS) history

Apple introduced CoreMIDI to iOS in iOS 4.3 (it has been around on Mac OS X for ages) – before then, only proprietary MIDI hardware under the ‘MFi’ banner was available, such as Line 6’s MIDI Mobilizer and the Akai SynthStation.

The advent of CoreMIDI was ground-breaking but brought with it a new set of problems. The documentation was scant and it was up to the developer to implement MIDI in their apps as best they could. Some guidance was provided by members of the OMAC (Open Music App Collaboration) group but the dissemination of this guide was fairly limited. Pete Goodliffe created an open source class ‘PGMidi’ but this only painted part of the picture and many developers dropped the class into their apps and hoped for the best.

The end result was that a very large number of apps did not work well with each other (or didn’t work at all) even for basic MIDI interaction, let alone synchronising with other apps or devices. Developers were going it alone, and user confusion and dissatisfaction soon followed.

Over time, developers understood better what was required of a ‘well behaved’ MIDI app and things improved somewhat, yet a large proportion of music apps remained hamstrung by broken or incomplete MIDI facilities; even the ‘big guns’, who should have known better, failed to get it right.

Enter the MidiBus library

One of the more comprehensive MIDI-based apps is Audeonic’s MidiBridge, which is often used to ‘glue’ together apps with MIDI shortcomings. Audeonic has now attempted to provide developers with an easy-to-use, robust and feature-rich library that they can easily integrate into their apps, resulting in a ‘well behaved’ MIDI app out of the box, along with access to highly accurate MIDI clock output, management of the MIDI topology and correct handling of system exclusive messages, amongst other features.

Essentially, the MidiBus library is MidiBridge’s CoreMIDI engine neatly packaged for re-use, along with the OMAC fast-switch implementation and a new MIDI clock generator. Additionally, it defines virtual MIDI ports correctly for an app and provides a database-like view of the device’s MIDI landscape, with all types of ports (physical, network and virtual) presented in one homogeneous fashion.

The aim of the library is to ensure that apps implementing it ‘just work’ together without the developer needing to give it much thought.

The MidiBus app

In conjunction with the library, Audeonic released the first ‘MidiBus powered’ app – the creatively named MidiBus. The features of the app are detailed elsewhere but as well as providing a handy master clock for an iOS device the app was designed to showcase the MidiBus library in a real-life working app.

Developers have gradually become aware of the library (the ‘bus’ probably helped raise its profile a bit) and the list of apps sporting the library is growing steadily, as shown at the MidiBus site. Users of MidiBus-powered apps have come to appreciate the ease with which they interact with each other for live playing and synchronisation.

Let’s look at the main features of the library in a bit more detail:

Virtual MIDI

Internally, CoreMIDI manages an app’s virtual ports differently from all other ports, and a common pitfall is not taking that into account. This is why many apps’ own virtual inputs never seem to work. MidiBus takes care of this – all ports can be treated in the same way.

Creating virtual ports is also a little fiddly with CoreMIDI. For optimum interactivity, an app should name its own virtual ports simply after the app and nothing more. This makes it apparent to users of other apps exactly which app they are dealing with. In MidiBus, your virtual port names are identical to that of the app.

MIDI Topology Queries

Apps, devices and network connections can come and go over the lifecycle of an application. MidiBus implements a notification scheme where whenever there is any change to the MIDI landscape, your app is notified that a change has taken place.

An app can carry out a query at any time on all the MIDI devices in the system. A query filter is passed to the query, allowing a subset of ports to be retrieved – for example, all apps that can receive MIDI, or all physical devices.

Query results are a set of interfaces which pair up any MIDI in with its corresponding MIDI out, so if an app wishes to display MIDI ports in pairs, as in traditional hardware MIDI (or in MidiBridge), then this is possible too.
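The pairing described here is essentially a join over the flat port list, keyed by name. A toy illustration of the idea (the data structures are invented for the example – this is not MidiBus’s actual API):

```python
def pair_ports(ports):
    """Group a flat list of {'name', 'direction'} port records into
    hardware-style (in, out) pairs keyed by port name. A port with no
    counterpart keeps None in the missing slot."""
    pairs = {}
    for p in ports:
        entry = pairs.setdefault(p["name"], {"in": None, "out": None})
        entry[p["direction"]] = p
    return pairs

# Hypothetical topology: Thor exposes both directions, Nave input only.
ports = [
    {"name": "Thor", "direction": "in"},
    {"name": "Thor", "direction": "out"},
    {"name": "Nave", "direction": "in"},
]
paired = pair_ports(ports)
```

Presenting ports this way lets an app draw the familiar IN/OUT pairs of a hardware patchbay even though CoreMIDI itself hands back sources and destinations separately.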

OMAC App fast-switching

The ability to switch from app to app via each app’s UI was pioneered by Audeonic as part of the OMAC group. This was later replicated in Audiobus but the OMAC version has lived on and has since been extended to permit switching between apps using a MIDI controller.

MidiBus-powered apps automatically receive this feature, and when an app is sent an OMAC MIDI-switch message (the message format is open and published) it will initiate a switch to the requested app.

Using a ‘management’ app such as MidiBridge, it is therefore possible to program buttons on a synth or controller and move to various apps simply by pushing a button.

Clocking Out

MIDI sync has been extremely problematic under iOS, both sending and receiving. Creating a stable clock signal requires a good understanding of the MIDI sync protocol and also of how CoreMIDI works.

The MidiBus library includes a built-in clock generator so developers can send a proven and reliable clock signal from their own apps (just as accurate as the MidiBus app itself).
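The arithmetic underneath any clock generator is the same 24-PPQN relationship, and the standard trick for stability is to schedule every tick against an absolute start time rather than sleeping between ticks (sleep drift accumulates; absolute scheduling doesn’t). A hedged sketch of that pattern, not the MidiBus implementation itself:

```python
PPQN = 24  # MIDI Clock runs at 24 pulses per quarter note

def tick_interval(bpm: float) -> float:
    """Seconds between successive MIDI Clock ticks at a given tempo."""
    return 60.0 / (bpm * PPQN)

def schedule_ticks(bpm: float, n: int, start: float = 0.0) -> list:
    """Absolute tick times measured from 'start'. Each tick is computed
    directly from the start time, so timing errors never accumulate
    from one tick to the next as the song runs on."""
    step = tick_interval(bpm)
    return [start + i * step for i in range(n)]

# At 120 BPM a tick lands roughly every 20.8 ms; 96 ticks later
# (four quarter notes) we should be exactly 2 seconds in.
ticks = schedule_ticks(120, 97)
```

In a real app these timestamps would be handed to CoreMIDI ahead of time so the OS delivers each tick at the scheduled moment, rather than whenever the app’s own thread happens to run.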

Clocking In

Sending clock is one half of the sync problem; receiving/slaving is the other side of the coin. However, it would be virtually impossible for a MIDI library to impose an audio syncing structure on an app’s architecture, because there are so many differing audio engines in use by developers.

Rather than attempting to force a certain audio sync architecture, the MidiBus library comes with a suggested model (example code) for slaving an app to its own clock output, which has been looped back and fed into the app.

The neat trick here is that the app in question is then instantly capable of slaving to its own clock or to another source, since the logic for doing so is the same. In addition, it is possible to configure MidiBus to present the sync data in advance, so that the app has time to schedule audio (or MIDI) events according to the timestamps of the incoming signal.
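Slaving boils down to the clock arithmetic in reverse: estimate the tempo from the incoming tick stream, smoothing the estimate so a single late tick doesn’t yank the transport around. A hedged sketch of that idea (my illustration, not the MidiBus example code):

```python
PPQN = 24  # MIDI Clock sends 24 pulses per quarter note

class ClockFollower:
    """Track the tempo of an incoming MIDI Clock stream using
    exponential smoothing of the inter-tick interval."""

    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha      # smoothing factor: lower = steadier, slower to react
        self.last_tick = None   # arrival time of the previous tick
        self.interval = None    # smoothed seconds-per-tick estimate

    def on_tick(self, now: float):
        """Feed in the arrival time (seconds) of each MIDI Clock tick."""
        if self.last_tick is not None:
            raw = now - self.last_tick
            if self.interval is None:
                self.interval = raw
            else:
                # Nudge the estimate toward the new measurement.
                self.interval += self.alpha * (raw - self.interval)
        self.last_tick = now

    @property
    def bpm(self):
        if self.interval is None:
            return None
        return 60.0 / (self.interval * PPQN)

follower = ClockFollower()
for n in range(50):
    follower.on_tick(n * (60.0 / (120 * PPQN)))  # a steady 120 BPM stream
```

Because the follower only cares about tick timestamps, the same logic works whether the clock comes from the app’s own looped-back output or from an external source – which is exactly the point Nic makes above.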

Worth investigating?

Audeonic makes the library available at no charge to developers, and it is actively supported: developers who sign up are given access to the Audeonic developer forum for posting questions, requesting features and generally being kept up to date with the library.

Future enhancements are expected that will provide MIDI Time Code (MTC) sync facilities and the ability to load custom MIDI processing filters into the MidiBus engine itself.

If you are an app developer about to embark on a new app (or you want to make an existing app work better MIDI-wise), you really ought to check out the MidiBus library – Audeonic really has made adding solid MIDI support a ‘no-brainer’. As a final sweetener, the MidiBus library is also available for Mac OS X, so if you have cross-platform apps it will provide the same benefits there.

If iOS is to approach what’s possible with music technology on the desktop, we need MIDI support, both in terms of MIDI hardware (such as the iRig MIDI2 shown here) and in terms of software.


Can we fix it? Yes we can!

I don’t know about you, but I’m kind of glad my own head is spared having to untangle the complexities of MIDI data flow under iOS and that developers such as Audeonic (and Nic himself) are prepared to do it on my behalf (a long time ago I used to be a coder but I’m definitely in the ‘software user’ category now rather than the ‘software developer’ category).

Compared to audio – with its more tangible nature and the instant gratification many of the wonderful iOS audio effects apps can bring – MIDI is perhaps the unglamorous partner in iOS music technology terms. Indeed, there are lots of iOS musicians who simply ignore it and just record all their synth parts ‘live’ as audio…

This is perhaps a shame as MIDI has a considerable amount to offer the high-tech musician. On the desktop, this is a pretty mature technology and, if you use any of the popular DAW/sequencing environments, then MIDI is a key part of the furniture. It allows you to perfect your synth and virtual instrument performance, control your plugin effects and automate your complete mix and, whether you ever ‘see’ that data or not, it is a vital element of what makes computer-based recording such a powerful tool.

Under iOS, as yet, MIDI can’t quite deliver the same benefits. Given that the OS is still in the grumpy teenager stage (rather than wise, but still thrusting, middle age), maybe this is not altogether surprising. However, if technology such as Audiobus is helping move the iOS experience for musicians closer to that of the desktop, then we really do need MIDI support to keep up to complete the picture.

iOS DAW/sequencers such as Cubasis now provide mix automation based upon MIDI technology but, as I'm sure Steinberg would be the first to admit, there is still more that could be done in terms of the MIDI feature set under iOS.


And, for that to happen, we need DAWs/sequencers that feature comprehensive MIDI specifications (with all the very creative options that brings such as groove quantizing, MIDI-based key/scale harmonization and full mix automation). But, before we can get there, we need MIDI under iOS to be technically transparent to developers… and that’s where a technology such as MidiBus – the Audiobus for MIDI perhaps? – needs to really take off with the development community.

I’ve no idea if that will happen but, if iOS music technology is to continue to move towards maturity (and hence stability), solid, consistent MIDI implementation is an important part of the development pathway. When they are not busy playing an instrument or typing stuff for the blog, my fingers are firmly crossed this is going to come to pass sooner rather than later….

Finally, many thanks to Nic Grant from Audeonic for allowing me to incorporate his detailed understanding of MIDI under iOS into this piece :-) and, if you happen to use an iOS music app that you think has a MIDI spec that still needs some work, then do give the developer a polite nudge. Some of this change is going to be driven by the demands of users so, whether we understand the details of the technology or not, we can all contribute towards encouraging some solutions….





    1. I confess to being one of those who records synth parts “live” as audio. But it’s only because I can’t figure out how to use MIDI! I understand its potential – and I suppose I do use MIDI a little in an app like Gadget where I edit the “live” synth parts I record by lengthening/shortening or adding/deleting the little blocks in the piano roll grid (which is wonderful). But going from app to app is where I get completely lost. Case in point: I recently bought Xynthesizer, and I love the interface. I want to use it to create a pattern, but have it play the sound of another app, like Thor or Nave. (That’s what MIDI is all about, right?) How do I do this? Is there a “MIDI for dummies” or somesuch that can walk me through this basic process? Apologies if my questions are too basic.

      • Hi Martin… don’t apologise… this kind of question is useful to people like myself (and other iOS music bloggers) because it reminds us that we shouldn’t take anything for granted about what our audience knows and what they don’t. One of the things I like about iOS is that I think it is (slowly) becoming the first point of contact that some musicians have with music technology. As such, these ‘newbie’ (and I appreciate that term may well not apply to yourself) music tech folk don’t necessarily have the background of having worked on, for example, a desktop music production system and having grasped some basics from that that translate over to iOS. At some stage – in between the review and the updates and the news coverage – I ought to get around to doing some other tutorial style pieces. MIDI under iOS really ought to be on that list. I’ll get there at some point :-) best wishes, John

      • Martin – No, those aren’t dumb questions. Learning MIDI even on a very basic level can be quite confusing because “MIDI” can mean so many different things. It’s best to think of it almost like a “language”. Languages can be written/transcribed for others to view and use – this would be like the “piano roll” notation you referred to. It’s part computer language, part musical notation (almost like a treble clef or a guitar tablature grid). Everything in MIDI is assigned a numerical value from 0 to 127. This includes notes, such that “C3” (a common centre point on a keyboard) is MIDI number 60, whereas one octave higher (C4) is MIDI note 72. And this makes sense, musically, since there are 12 half-steps or semitones between octaves.

        Languages can ALSO be spoken in real-time to give direction to others – this would be like what happens when you set Xynthesizer’s “MIDI Out” port to Thor or another synth, and then check to see that Xynthesizer’s “MIDI In” setting is receiving “Notes” and/or “Clock” from Xynthesizer. The notes programmed in Xynthesizer then “talk” to Thor and get it to play those notes using a Thor engine.

        What really threw me off (and still does) with MIDI in iOS is that it’s not as simple as hooking up a cable. Things that are otherwise “connected” may fail to work for completely different reasons. So, in iOS, a common example of this is “background audio”. Many apps default to being able to continue playing when they are minimized in the background of iOS…but some do not, or BGKD AUDIO has to be turned on. There may be other issues, such as having the wrong MIDI “channel” selected (OMNI means it is listening on every channel; this usually suffices for simple connections between apps).

        Anyway, keep at it, and hope this helps! Thanks to John for the comprehensive history and article!

    2. Rob Jones says:

      Especially on the iPhone which lends itself poorly to the task of playing on a keyboard. While some ux designs work really well, such as Jordan Rudess’s apps or the iFretless series, the miniaturized versions of most synths are fun toys but require an external keyboard to be useful. But as a MIDI sound module it’s a better deal, if only I had the ability to set up genuine MIDI sync and in/thru/out within the virtual cabling. While there are some partial implementations of multichannel MIDI networking, I have yet to find the AudioShare model for state-saving a rig and creating/opening MIDI files on the phone.

    3. In addition to the above, we should mention the new concept of MIDI over Bluetooth, which has been added to iOS 8 and supports the Bluetooth 4.0 LE standard with very low power consumption. Yesterday I tried it between my iPad mini (with the bs-16i application) and GarageBand on an iMac, and controlled the iMac’s sound with the iPad at high speed, by the way.

      Also coming soon is the Mi.1 Bluetooth communicator from Quicco (https://www.indiegogo.com/projects/mi-1-wireless-midi-interface-connecting-digital-piano-and-ios-device), a project funded through crowdfunding.

    4. Jayson Vein says:

      Thanks for this article John. I only skimmed it for now. I’ve read a lot on midi. I run some ios apps doing the virtual midi thing. But, for the most part, I do not understand midi. At all. How it works, why it is useful. I can’t seem to wrap my mind around it.

      The developer of (I think it was) MidiBridge emailed me back after I asked a lot of questions about using my BOSS GT-3 multi-effects pedal as a controller for JamUp. It was informational. And prob. helpful too… I just didn’t comprehend much of it.

    5. Jayson Vein says:

      Huh, when I posted my post, the other responses were not visible to me. They are all very helpful everyone.


    6. Thank you very much John and Nic Grant from Audeonic, this article helped to fill in some MIDI holes in my brain…though not all of them. I’ve noticed that as I create more complex music on the iPads that the reliance on MIDI has grown and it’s an upward trend that is heading to the desktop and vice versa.

      As Jorge mentions above, now there is MIDI over Bluetooth (MOB) which worked great for me using the Apollo app iPad-mini to iPad4 (haven’t tried the native iOS yet). I wonder though, if that’s just another area of confusion and complication for app users as we move forward and if so, is MIDIBridge a solution for that as well?

    7. To borrow a “Doug-ism” from The SoundTest Room vids, I am a bonafide “midiot.” What I’ve learned has been largely through trial, error and online research. Within 6 months I’ve gone from “huh, wha?” to being able to connect most apps the way I want. The lone holdout for a long time was the M3000 Mellotron, which just didn’t want to talk to Genome at all.

      Finally tried it as in app audio with Beatmaker and boom! Piano roll! Which is huge, because I’m a worthless keyboard player at best.

      Still, just last week I had Gadget drifting all over the map after 16 bars when playing into Auria. I seriously thought about grabbing Midibus, not for the first time, because it was specifically a clock syncing issue.

      I hope Audeonic’s bus catches on like that more famous ‘bus app. Thanks, John, to both you and Nic for sharing your insights and once again starting an important discussion towards the future of iOS music.

    8. Many thanks to John and Jeff H for the replies, and to all for the thoughtful comments. What a great blog and supportive community!

    9. Erik Hauri says:

      For starters, how about every midi-capable bit of software just include a standard Midi Implementation Chart? Like ALL hardware manufacturers do?

      It is really bad policy to make customers have to figure the implementation out by trial and error.

    10. This is a bit of an offshoot, but the MIDI/scale harmonization features of Korg Gadget have been a keystone for me in writing songs. I’ve recently been looking into ways to make the Logic Pro X MIDI editor only allow input of notes within the key and scale, as available in Korg’s Gadget. Even more specifically, the Marseille gadget’s chord function (I’m not exactly sure what kind of chords it creates within the key and scale – I’m guessing 1st, 3rd, 5th).

      I would love to implement this in my full DAW setup in Logic Pro X if anyone knows how? It’d be nice to use this with all my synths and VSTs. It’s an amazing way to compose and cuts out some of the time spent writing out the scale before I start composing.

      Also, the fact that Cubasis now lets you use app effects as sends is really cool. Wish I would have got it over Auria now, as I don’t really use the expert features as much on my iPad because it’s usually for quick compositions and ideas before I haul them over to the computer.

    11. Stephanie Merchak says:

      Thank you for this interesting article. Reading it made me realize that I’m not dumb after all. I’ve been successfully using computers with MIDI (virtual and hardware) for something like 15 years, but when I got to making music on iOS I was never able to make apps talk to each other via MIDI. I thought I was doing something wrong, but it seems after all that I’m not the reason it’s not working – the apps are. As I understand from your article, there’s no way of making those music apps work flawlessly with MIDI for the moment unless one uses MidiBridge and MidiBus. Even following the instructions in app manuals (like Music Studio or FL Studio) doesn’t seem to lead to good results with MIDI. One app that was very promising, and that was in a certain way a combination of MidiBridge, MidiBus and Audiobus, was JACK. Unfortunately, not many apps were compatible with it and JACK stopped working altogether after iOS got updated to version 7.
