Thoughts on the future of AU MIDI

This is not specific to NS2; these are just my thoughts on how AU MIDI is handled in iOS apps in general.

At this time, AU MIDI seems to be one of those ‘what the hell do we do with this?’ technologies.
The initial implementations in AUM and BM3 were reasonably successful, if a little lacking in some areas. Later implementations by Cubasis, Auria, BM3 (the new implementation) and NS2 seemed (imo) to push AU MIDI even further from its potential, with missing or poor recording options. They also missed the modular factor, but more on that later.

Now we have our first AU MIDI keyboard in the KB-1. Not a bad first effort and quite a useful app, or it will be once AU MIDI is sorted out in host apps. The problem is that this keyboard is classed as a separate entity from the host’s own built-in keyboard, and that, imo, misses one of the best features of AU MIDI.

Imagine, if you will, a time when we are not limited to the input devices of any host app - a time when the input of MIDI data to either the timeline or a sound device is modular. Want that keyboard from Animoog? Then load in their AU MIDI version of it. Want the esoteric keyboards from Seline Redux? Load the AU MIDI version in. Want to build your own setup? Then load in an AU MIDI keyboard designer. Want an XY pad to play keys or drum pads? Well, you get the picture.

But how will they communicate? MIDI, of course. But what about communication differences between sound devices? Have shareable, loadable and editable MIDI setups, so that you can have that XY axis on your keyboard routed to the correct value. This way you can have knobs and sliders, XY pads, keys, drum pads, sliding wobbly wobbly input devices - who cares what... it’s all MIDI information being sent via what your fingers do on screen! It just needs to be managed correctly and be user friendly.
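
As a very rough sketch of what I mean (made-up names - this isn’t any existing host’s format), one entry in such a shareable setup could simply describe which MIDI message an on-screen control sends. In Swift it might look something like this:

    import Foundation

    // Hypothetical description of one on-screen control in a shareable MIDI setup.
    // Purely illustrative - no host currently defines a format like this.
    struct ControlMapping: Codable {
        enum Kind: String, Codable { case key, drumPad, slider, xyPadX, xyPadY }

        let kind: Kind
        let midiChannel: UInt8   // 0-15
        let ccNumber: UInt8?     // for continuous controls such as an XY axis
        let note: UInt8?         // for keys and drum pads

        // Turn a normalised 0.0-1.0 touch position into a 3-byte control change message.
        func controlChange(for value: Double) -> [UInt8]? {
            guard let cc = ccNumber else { return nil }
            let clamped = min(max(value, 0.0), 1.0)
            let scaled = UInt8((clamped * 127.0).rounded())
            return [0xB0 | (midiChannel & 0x0F), cc, scaled]
        }
    }

Because it all boils down to plain MIDI bytes, the same setup could drive any synth or drum kit that listens for those messages.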

Obviously I’m not saying let’s not have built-in input devices. No, I’m saying let’s have ones that can be interchanged for whatever device we want!

I’m also not saying that AU MIDI apps should only inhabit this space - they should sit in many places in the host’s chain: AU MIDI FX chains, AU MIDI sequencers, etc.

Now, I’m aware that AU MIDI as a concept is relatively new, and I have been told that the documentation is not, let’s say, ‘top notch’ :p , but hey, let’s get thinking outside the box, people. Then let’s let our devs know that this is what we want as the future - a future where the nature of iOS as a touch-screen device is one of its biggest advantages (along with portability, of course).

Yes, compatibility issues will plague this concept at first, just as screen issues have plagued AU apps in general, but ya gotta start somewhere.

Let the devs know they don’t have to build all the options themselves for the future ‘Super Host’ or ‘DAW’ - they can set things up so that other devs take some of the development strain by adopting a modular AU MIDI concept!

Food for thought.

Comments

  • Nice thoughts. Sounds like you’re looking down the road a ways, which is good to do. It’s almost like the developers need to have a convention to agree on some standards that would allow them to move AU & MIDI further, then tell Apple what they want - that’ll be the biggest hurdle.

  • @User_Error said:
    Nice thoughts. Sounds like you’re looking down the road a ways, which is good to do. It’s almost like the developers need to have a convention to agree on some standards that would allow them to move AU & MIDI further, then tell Apple what they want - that’ll be the biggest hurdle.

    Would be interesting to be a fly on the wall at such a meeting :)

  • Yes, it's an interesting, if not somewhat confusing time. When I made Rozeta there was indeed no documentation. I figured stuff out (with lots of assumptions) based on a single presentation slide from WWDC'17.

    But more significant than missing documentation was the fact that there were no conventions for how the technology should be used. I built Rozeta (at the time still named Obsidian ;) ) together with Jonatan of AUM/Kymatica. I did the plugin side of things, he made it work on the host's end. So AUM being the first host for AU MIDI obviously had a huge impact on how I approached the concept and how I came to the modular nature of the plugin suite.

    Now that AU MIDI is also being adopted by more ‘linear’ (non-modular) hosts like NS2 and Cubasis, I suddenly run into aspects of the design that don’t feel right - which means I suddenly had to implement MIDI THRU in the plugins, etc.

    Apple, in the meantime, are also clarifying some of their original intentions with the format. Unfortunately those are not always aligned with how Jonatan and myself interpreted the technology (a direct result of working without documentation). For example: all Rozeta plugins are built as regular AU instruments which send out MIDI, whereas Apple has now made a separate AU MIDI format available. A real (if minor) headache for me and host makers: I can’t change Rozeta to this new format without breaking everyone’s existing projects, and host makers now have to support both formats.
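
    (For the curious: the difference shows up in the component type a host scans for. A rough sketch - not anyone’s actual host code - using the standard AVAudioUnitComponentManager API:)

        import AVFoundation
        import AudioToolbox

        let manager = AVAudioUnitComponentManager.shared()

        // "Classic" instrument AUs (type 'aumu') - Rozeta is registered this way and sends out MIDI.
        var instrumentDesc = AudioComponentDescription()
        instrumentDesc.componentType = kAudioUnitType_MusicDevice

        // Dedicated AU MIDI processors (type 'aumi'), the format Apple later clarified.
        var midiDesc = AudioComponentDescription()
        midiDesc.componentType = kAudioUnitType_MIDIProcessor

        // Zeroed fields act as wildcards, so these queries list every installed match of each type.
        let instruments = manager.components(matching: instrumentDesc)
        let midiProcessors = manager.components(matching: midiDesc)
        print("instrument AUs: \(instruments.count), MIDI processor AUs: \(midiProcessors.count)")

    A host that only queries one of the two types simply never sees the other kind of plugin.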

    Anyway... we're getting there. Slowly. But we're learning as we go.

    On the positive side: if Apple had laid down their original intentions and restrictions from the start, it’s less likely that I would have made Rozeta conceptually as "modular" as it is now, because the technology was obviously not meant to be used this way if it were up to Apple (evidence: GarageBand still doesn’t support this way of using Rozeta).

    :)

  • @brambos

    Thanks, that is very insightful.

  • @brambos

    On a side note, I thought I read somewhere that your instrument apps are actually AU instruments inside a host for the standalone part? If so, do the changes Apple have made with regards to AU midi affect them in any way?

  • @Fruitbat1919 said:
    @brambos

    On a side note, I thought I read somewhere that your instrument apps are actually AU instruments inside a host for the standalone part? If so, do the changes Apple have made with regards to AU midi affect them in any way?

    Correct, my standalone apps are mini-hosts which host a single instance of my plugin. But Apple's changes don't affect them, because I'm not using the AU MIDI aspect in standalone mode.

  • edited December 2018

    We’ve seen some great AU MIDI sequencers, drum sequencers, keyboards, object-triggering AUs, XY pads, LFOs. Interested to see what other MIDI AU ideas appear at the iOS party.

  • @brambos Great post! That helps explain things a lot. A lot of us are left wondering what is going on, and it is always nice to read your posts from the dev perspective. I don't envy any of you guys - really difficult tasks. And Apple... it seems to me to be super frustrating to deal with them, but there is no choice. Thanks for the pioneering work you have done and your willingness to help other devs.

  • Thinking back to hardware days, everything had midi in, out, and usually through. You could send notes to a piece of hardware. That piece of hardware would listen on the input, filter out what isn’t intended for it, and react to the data meant for it. It could generate its own data based on what was incoming and also on its own or by being played by a person. It then passed that data to the out port. Essential to that was midi through because everything was connected in a chain. Nobody else cared what kind of devices were out there because they were all just inputting, outputting, and passing along data.

    It’s a vast oversimplification, I’m sure, but it seems to me that AU midi "FX" (a bad term IMO) should be similar. It shouldn’t matter to a host whether the plugin is a drum machine, an on-screen keyboard, or a midi mangler of some sort. Plugins should receive data, ignore what isn’t intended for them, then generate, add to or manipulate it, and pass it along. If, as in the case of Rozeta, certain data should not be passed along, such as the notes used to trigger different patterns, that should be the plugin’s responsibility.

    Fortunately we don’t have to talk about midi clock, start/stop, song position, etc. here, as the host handles all that. (At least I don’t think we do.)

    Anyway, it all seems to have gotten very convoluted, but I can’t help thinking it’s simple at the root. Midi goes in. Midi gets used, created, or modified. Midi passes out (including anything not intended for the plugin). I can’t stress that last point enough: midi through for all data received, except what’s intentionally blocked by the plugin, is critical.
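
    To make that concrete, here’s a rough sketch (in Swift, with made-up names - not any real plugin’s code) of the kind of pass-through behaviour I mean: look at each incoming message, react only to what’s addressed to you, and forward everything else untouched.

        // Sketch of plugin-side MIDI THRU: react to messages on our channel,
        // forward everything else unchanged.
        struct MidiMessage {
            let bytes: [UInt8]            // raw MIDI bytes, e.g. [0x90, 60, 100]
            var channel: UInt8? {         // nil for system messages (status 0xF0-0xFF)
                guard let status = bytes.first, status < 0xF0 else { return nil }
                return status & 0x0F
            }
        }

        struct ThruFilter {
            let listenChannel: UInt8      // the channel this plugin responds to (0-15)

            // Returns the messages to pass downstream; the rest are handled internally.
            func process(_ incoming: [MidiMessage],
                         handle: (MidiMessage) -> Void) -> [MidiMessage] {
                var passedThrough: [MidiMessage] = []
                for message in incoming {
                    if message.channel == listenChannel {
                        handle(message)               // meant for us: react, don’t forward
                    } else {
                        passedThrough.append(message) // not ours: midi through
                    }
                }
                return passedThrough
            }
        }

    Whether notes on other channels or system messages get forwarded or deliberately blocked would be the plugin’s call, which is exactly the responsibility I’m arguing it should have.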

    I hope that made sense.

  • edited December 2018

    Oops, I forgot one point. Midi FX can also receive and react to audio, producing midi output based on how they react to it. But the principle is the same: data (audio) comes in, data is reacted to, data (midi and audio) is output. Audio data pass-through is obviously essential in this case as well.

  • @number37 said: Midi goes in. Midi gets used, created, or modified, midi passes out (including anything not intended for the plugin). I can’t stress that last thing enough. Midi through for all data received except that intentionally blocked by the plugin is critical.

    That's nice, but it's not quite that simple. Physical cables can be easily split and rerouted around other devices. In software, the host needs to take care of such things.

    Example: in the case of Rozeta Bassline, MIDI input is used to transpose and/or trigger pattern playback. The plugin cannot distinguish between MIDI input which is or isn’t meant for it, other than by MIDI channel. It’s up to the routing capabilities of the host (and possibly methods for [re]mapping MIDI channels) to avoid conflicts.
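
    (A rough illustration of what I mean by remapping - not actual host code: the host could rewrite the channel nibble of a channel-voice message before handing it to a plugin, so two sources sending on the same channel don’t collide.)

        // Sketch: host-side channel remap on a raw channel-voice message (e.g. [0x90, 60, 100]).
        // sourceChannel and targetChannel are 0-15; system messages pass through unchanged.
        func remapChannel(_ bytes: [UInt8], from sourceChannel: UInt8, to targetChannel: UInt8) -> [UInt8] {
            guard let status = bytes.first, status < 0xF0,              // channel-voice message?
                  status & 0x0F == sourceChannel else { return bytes }  // only touch the chosen channel
            var remapped = bytes
            remapped[0] = (status & 0xF0) | (targetChannel & 0x0F)      // keep message type, swap channel
            return remapped
        }

    But that kind of plumbing only exists if the host provides it, which is exactly the point.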

    Virtual MIDI easily becomes a UX nightmare, because its routing tends to be mostly invisible.

  • This stuff is such a non-issue that it’s barely even a talking point on desktop/Surface Pro etc. Really hope Apple, or the devs, or wherever the conflicts currently lie, can get their heads together and put it in the past :)

  • edited December 2018

    @brambos, sorry, but I don’t understand the need for host routing. One would only need to split or route around equipment that can’t pass data through or doesn’t properly filter out data it shouldn’t. If every piece of hardware properly used/filtered/passed through data as it should, it would just be one string of devices.

    I disagree that the host should do all that routing, or at least that there is a need for it if the plugins handle data as they should. And what I’m saying is yes, the plugin should receive all the data, decide what is meant for it, and pass the rest along. Of course midi channel should be used to decide what to react to, but message type (CC, PC, Sysex...) is also available and important.

    To do otherwise is inviting the equivalent of the creative cable mess we have in hosts now.

  • @flockz said:
    This stuff is such a non-issue that it’s barely even a talking point on desktop/Surface Pro etc. Really hope Apple, or the devs, or wherever the conflicts currently lie, can get their heads together and put it in the past :)

    What conflicts?

  • edited December 2018

    @brambos said:

    @flockz said:
    This stuff is such a non-issue that it’s barely even a talking point on desktop/Surface Pro etc. Really hope Apple, or the devs, or wherever the conflicts currently lie, can get their heads together and put it in the past :)

    What conflicts?

    Erm... one element of MIDI AU works in one host but not in another. Open routing in one host but not in another. SPA works in one host, not in another... and so on. It’s like Russian roulette ;)

  • edited December 2018

    @number37 said:
    @brambos, sorry, but I don’t understand the need for routing. One would only need to split or route around equipment that can’t pass data through or doesn’t properly filter out data it shouldn’t. If every piece of hardware properly used/filtered/passed through data as it should, it would just be one string of devices.

    I disagree that the host should do all that routing, or at least that there is a need for it if the plugins handle data as they should. And what I’m saying is yes, the plugin should receive all the data, decide what is meant for it, and pass the rest along. Of course midi channel should be used, but message type (CC, PC, Sysex...) is also available.

    To do otherwise is inviting the equivalent of the creative cable mess we have in hosts now.

    You can't simply leave the host out of the equation; it provides the virtual cables/routing. The first AU MIDI hosts (AUM, ApeMatrix, BM3) offered free routing of MIDI between plugins: you could wire the cables yourself.

    But later, e.g. in the case of NS2 and Cubasis, the routing suddenly became fixed, i.e. the output of MIDI FX1 is forced into MIDI FX2, etc. This was a new variable which changed the usage context of AU MIDI plugins long after they had been made.
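
    (To illustrate the difference with a sketch rather than real host code: in a fixed chain the host simply folds each MIDI FX’s output into the next one’s input, whereas free routing lets the user wire arbitrary sources to arbitrary destinations.)

        // A made-up processor type: one MIDI event in, zero or more events out.
        typealias MidiProcessor = ([UInt8]) -> [[UInt8]]

        // Fixed serial routing: MIDI FX1 -> MIDI FX2 -> ... -> instrument.
        func runChain(_ chain: [MidiProcessor], input: [[UInt8]]) -> [[UInt8]] {
            chain.reduce(input) { events, processor in
                events.flatMap(processor)
            }
        }

    With free routing the user builds that chain (or any other graph) themselves; with fixed routing the host decides it for you.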

    To say it's all the fault of the plugins because they somewhat resemble how physical MIDI devices used to work is oversimplifying things quite a bit. There is no equivalent for a "host" in the physical world. This is complex shizzle.

  • edited December 2018

    No, no, no, no, the last thing I’m saying is it’s the fault of the plugins. :o

    I’m saying it could all be greatly simplified if plugins began to work that way. I fully recognize that the various host implementations have introduced this new complexity. I don’t blame the hosts either. Everyone is feeling their way.

    I’m only attempting to introduce a concept which I believe would simplify things if it ever came to pass. I was just tossing out a theory to see if it stands up to others poking holes in it.

    If it doesn’t add to the discussion then I’m happy to drop it.

  • As an end user, with AU I just get a smaller UI and more confusion as to what works and what doesn’t. Multiple instances, yes, but half the time I don’t care if I get to use multiple instances in a track. Everyone screams for AU with every app release; I’m thinking please just stick with full screen. IAA and AB have their own issues, but the AU standard on iOS is far from prime time. Interesting times.

  • @number37 said:
    No, no, no, no, the last thing I’m saying is it’s the fault of the plugins. :o

    I’m saying it could all be greatly simplified if plugins began to work that way. I fully recognize that the various host implementations have introduced this new complexity. I don’t blame the hosts either. Everyone is feeling their way.

    I’m only attempting to introduce a concept which I believe would simplify things if it ever came to pass. I was just tossing out a theory to see if it stands up to others poking holes in it.

    If it doesn’t add to the discussion then I’m happy to drop it.

    No need to... and either way, I'm adding features to my plugins to accommodate Nanostudio because hosts need plugins and vice versa. But we're learning as we go, and it's always easier to spot learnings in hindsight :)

    p.s. quite frankly when I made Rozeta I felt it was such an obscure use case that I didn't expect any significant uptake of the concept in the first place :)

  • edited December 2018

    @brambos said:
    p.s. quite frankly when I made Rozeta I felt it was such an obscure use case that I didn't expect any significant uptake of the concept in the first place :)

    I’m sure there’s some clever, esoteric literary comparison about wars being started by falling in love with the wrong woman, etc., but I’m not well enough read to pull it off. Where is @JohnnyGoodyear when we need him?

  • edited December 2018

    @number37 said:

    @brambos said:
    p.s. quite frankly when I made Rozeta I felt it was such an obscure use case that I didn't expect any significant uptake of the concept in the first place :)

    I’m sure there’s some clever, esoteric literary comparison about wars being started by falling in love with the wrong woman, etc., but I’m not well enough read to pull it off. Where is @JohnnyGoodyear when we need him?

    Yeah.. we need an obscure vintage GIF B)

  • Oh no, you haven’t shouted the GIF meister lol
