Thoughts on the future of AU midi
This is not specific to NS2, but my thoughts on how AU midi is handled in all iOS apps.
At this time, AU midi seems to be one of those ‘what the hell do we do with this’ technologies.
Initial implementations in AUM and BM3 were reasonably successful, if a little lacking in some areas. Later implementations by Cubasis, Auria, BM3 (the new implementation) and NS2 seemed (imo) to push AU midi even further from its potential, with poor or missing recording options. They also missed the modular factor, but more on that later.
Now we have our first AU midi keyboard in the KB-1. Not a bad first effort and quite a useful app, or it will be once AU midi is sorted in hosts. Problem is, this keyboard is classed as a separate entity from the built-in keyboard itself, and this imo misses one of the best features of AU midi.
Imagine, if you will, a time when we are not limited to the input devices of any host app - a time when the input of midi data to either timeline or sound device is modular. Want that keyboard from Animoog? Then load in their AU midi version of it. Want the esoteric keyboards from Seline Redux? Load the AU midi version in. Want to build your own setup? Then load in an AU midi keyboard designer. Want an XY pad to play keys or drum pads? Well, you get the picture.
But how will they communicate? Midi, of course. But what about communication differences between sound devices? Shareable, loadable and editable midi setups, so that you can have that XY axis on your keyboard routed to the correct value. This way you can have knobs and sliders, XY pads, keys, drum pads, sliding wobbly input devices - who cares what... it's all midi information being sent via what your fingers do on screen! It just needs to be managed correctly and be user friendly.
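To make the "editable midi setup" idea concrete, here's a minimal sketch of what such a setup could look like under the hood: a remap table that rewrites controller numbers coming from an input device (say, an XY pad) into whatever CCs the target sound device expects. All names here (`MidiMapping`, `ccMap`, `apply`) are purely illustrative assumptions, not part of any real AU or Core MIDI API; only the 3-byte Control Change message layout comes from the MIDI spec.

```swift
import Foundation

// Hypothetical "midi setup": a shareable remap of controller numbers.
// Codable so a host could save/load/share it as the post suggests.
struct MidiMapping: Codable {
    // source CC number -> destination CC number
    var ccMap: [UInt8: UInt8]

    // Rewrite a 3-byte Control Change message (status 0xBn) per the map;
    // anything unmapped, or any non-CC message, passes through untouched.
    func apply(to message: [UInt8]) -> [UInt8] {
        guard message.count == 3, message[0] & 0xF0 == 0xB0 else {
            return message // not a Control Change message
        }
        guard let dest = ccMap[message[1]] else { return message }
        return [message[0], dest, message[2]]
    }
}

// Example: route the XY pad's X axis (sent as CC 12 here, an assumption)
// to the synth's filter cutoff (conventionally CC 74).
let setup = MidiMapping(ccMap: [12: 74])
let remapped = setup.apply(to: [0xB0, 12, 100]) // CC 12, channel 1, value 100
// remapped is now [0xB0, 74, 100]
```

The point is that the host only needs to manage tables like this; the input devices and sound devices never have to know about each other.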
Obviously I'm not saying let's not have built-in input devices. No, I'm saying let's have ones that can be interchanged for whatever device we want!
I'm also not saying that AU midi apps should only inhabit this space - they should sit in many places in the host's chain: AU midi fx chains, AU midi sequencers, etc.
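The fx-chain idea above can be sketched the same way: each midi fx is just a stage that takes midi bytes in and hands midi bytes out, so stages can be stacked in any order the host allows. This is an illustration of the chaining concept only; `MidiFx` and `transpose` are made-up names, not a real AUv3 interface.

```swift
import Foundation

// Hypothetical midi fx: bytes in, bytes out, so fx compose like plugins
// in a host's midi slot.
typealias MidiFx = ([UInt8]) -> [UInt8]

// A transpose fx: shifts Note On/Off messages (status 0x8n/0x9n) by
// `semitones`, clamped to the valid 0...127 note range.
func transpose(by semitones: Int) -> MidiFx {
    return { message in
        guard message.count == 3,
              message[0] & 0xE0 == 0x80 // Note On or Note Off only
        else { return message }
        let note = min(127, max(0, Int(message[1]) + semitones))
        return [message[0], UInt8(note), message[2]]
    }
}

// Chaining is just folding the message through each fx in turn.
let chain: [MidiFx] = [transpose(by: 12)]
let out = chain.reduce([0x90, 60, 100]) { msg, fx in fx(msg) }
// out: middle C (60) transposed up an octave to 72
```

A sequencer AU would slot into the same chain, except it generates messages rather than transforming them.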
Now, I'm aware that AU midi as a concept is relatively new, and I have been told the documentation is not, let's say, 'top notch', but hey, let's get thinking outside the box, people. Then let's let our devs know that this is what we want as the future - a future where the touch screen nature of iOS is one of its biggest advantages (along with portability, of course).
Yes, compatibility issues will plague this concept at first, just as they have with the screen issues of AU apps in general, but ya gotta start somewhere.
Let the devs know they don't have to build all the options themselves for the future 'Super Host' or 'DAW' - they can set things up so that other devs take some of the development strain by adopting the modular AU midi concept!
Food for thought.