brambos

About

Username: brambos
Joined:
Visits: 95
Last Active:
Roles: Member

Comments

  • I suspect that a lot of novice AU developers aren't aware that they don't have to expose all parameters to the end-user. Surely having hundreds of exposed parameters is not good practice and isn't all that useful in the real world. Like... I'm happy my gearbox only exposes a single gear stick and not all the individual… (a rough sketch of this idea follows the comment list)
  • Yeah.. I love NS2 for its lean simplicity. Feature creep is a nasty beast.. but I think Matt has a knack for knowing which things to implement and which to leave out.
  • Not completely true. In this case it was Perforator which (as a bonus feature) sends out MIDI CC data. In most hosts this CC data is not automatically passed on to other plugins if the user doesn't want it, but NS2 does and there's no option to disable it. To avoid weird side-effects I added a switch to disable sending of…
  • No, you're right. It's false accuracy too, because the iOS device can do dynamic CPU throttling without the app knowing about it (which can result in false alarms: you may have more CPU headroom than the meter suggests, and CPU peaks may be less of a problem than they appear).
  • Yeah.. we need an obscure vintage GIF B)
  • No need to... and either way, I'm adding features to my plugins to accommodate Nanostudio because hosts need plugins and vice versa. But we're learning as we go, and lessons are always easier to spot in hindsight :) P.S. Quite frankly, when I made Rozeta I felt it was such an obscure use case that I didn't expect any…
  • You can't simply leave the host out of the equation; it provides the virtual cables/routing. The first AU MIDI hosts (AUM, ApeMatrix, BM3) offered free routing of MIDI between plugins: you could wire the cables yourself. But later, e.g. in the case of NS2 and Cubasis, the routing was suddenly fixed in place rather than user-configurable. I.e. the output of MIDI…
  • What conflicts?
  • That's nice, but it's not quite that simple. Physical cables can be easily split and rerouted around other devices. In software, the host needs to take care of such things. Example: in the case of Rozeta Bassline, MIDI input is used to transpose and/or trigger pattern playback. The plugin cannot distinguish between MIDI…
  • Correct, my standalone apps are mini-hosts which host a single instance of my plugin. But Apple's changes don't affect them, because I'm not using the AU MIDI aspect in standalone mode. (A rough sketch of such a single-plugin mini-host follows the comment list.)
  • Yes, it's an interesting, if somewhat confusing, time. When I made Rozeta there was indeed no documentation. I figured stuff out (with lots of assumptions) based on a single presentation slide from WWDC'17. But more significant than the missing documentation was the fact that there were no conventions for how the technology…
  • Hit the “utils” button, where you can configure all the other MIDI settings.
  • There are still problems with Swift and the CoreAudio/AUv3 framework. I would recommend sticking to C++ for realtime audio/DSP and ObjC for UI/audio framework stuff.
  • Would this be a good thread to congratulate Matt @"Blip Interactive" on a monumental achievement and an incredibly smooth launch? I think most developers will agree that launching an app this complex without significant bugs is a rare phenomenon. A stellar addition to the iOS landscape (and yet another host I need to…
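
On the parameter-exposure comment above: a minimal sketch, assuming a standard AUv3 built on AUAudioUnit. The class name, parameter names and addresses are made up for illustration; the point is simply that only parameters added to the AUParameterTree are visible to the host and the end user, while internal settings can stay private.

```swift
import AudioToolbox

// Hypothetical AUv3 subclass: the DSP may have many internal settings,
// but only the parameters placed in the AUParameterTree are visible to
// the host and the end user.
final class ExampleAudioUnit: AUAudioUnit {

    // Internal state the host never sees: simply not added to the tree.
    private var internalDriveCurve = [Float](repeating: 0, count: 128)

    private var _parameterTree: AUParameterTree?
    override var parameterTree: AUParameterTree? {
        get { _parameterTree }
        set { _parameterTree = newValue }
    }

    override init(componentDescription: AudioComponentDescription,
                  options: AudioComponentInstantiationOptions = []) throws {
        try super.init(componentDescription: componentDescription, options: options)

        // The only two parameters exposed to the outside world.
        let cutoff = AUParameterTree.createParameter(
            withIdentifier: "cutoff", name: "Cutoff", address: 0,
            min: 20, max: 20_000, unit: .hertz, unitName: nil,
            flags: [.flag_IsReadable, .flag_IsWritable],
            valueStrings: nil, dependentParameters: nil)

        let mix = AUParameterTree.createParameter(
            withIdentifier: "mix", name: "Dry/Wet", address: 1,
            min: 0, max: 1, unit: .linearGain, unitName: nil,
            flags: [.flag_IsReadable, .flag_IsWritable],
            valueStrings: nil, dependentParameters: nil)

        _parameterTree = AUParameterTree.createTree(withChildren: [cutoff, mix])
        // (Render block, buses, etc. omitted for brevity.)
    }
}
```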
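On the standalone "mini-host" comment: a minimal sketch of hosting a single AUv3 instance with AVAudioEngine, assuming an effect-type plugin. The component type/subtype/manufacturer codes are placeholders, and audio-session and permission handling is omitted; the real apps presumably do considerably more.

```swift
import AVFoundation
import AudioToolbox

// Minimal single-plugin host: load one AUv3 and wire it between the
// engine's input and the main mixer.
let engine = AVAudioEngine()

var desc = AudioComponentDescription()
desc.componentType = kAudioUnitType_Effect
desc.componentSubType = 0x64656D6F        // 'demo' (placeholder)
desc.componentManufacturer = 0x4272616D   // 'Bram' (placeholder)

AVAudioUnit.instantiate(with: desc, options: .loadOutOfProcess) { avUnit, error in
    guard let avUnit = avUnit else {
        print("Could not load plugin: \(String(describing: error))")
        return
    }

    engine.attach(avUnit)

    // input -> plugin -> main mixer (the mixer-to-output link is implicit).
    let format = engine.inputNode.outputFormat(forBus: 0)
    engine.connect(engine.inputNode, to: avUnit, format: format)
    engine.connect(avUnit, to: engine.mainMixerNode, format: format)

    do {
        try engine.start()
    } catch {
        print("Engine failed to start: \(error)")
    }
}
```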