Some notes for AU developers

This would probably be better as a blog post, but I don't have one of those, so I thought I'd add some notes here. I've now gained a fair amount of experience looking into AU compatibility issues, and since they often fall into a few common themes I thought it would be worth sharing.

Sample rate and buffer size

NanoStudio always uses 44.1kHz for real-time render, and upsamples to 48kHz if the hardware output device requires it. An additional complication is that some newer devices may choose a different sample rate based upon the hardware output device currently in use (eg. internal speakers vs. headphone jack).

If NanoStudio needs to upsample, it may request a different number of sample frames on each successive call to the Audio Unit's renderBlock. For example, when filling a 256 frame hardware buffer at 48kHz from a 44.1kHz render, the number of source frames required per render call is 44.1 / 48 * 256 = 235.2. Since this is a non-integer number of frames, NanoStudio will sometimes render slightly more frames than it needs and save the extra frames for delivery at the start of the next buffer requested by iOS. In this example, the AU will be asked to render something like 236, 236, 232, 236 etc. frames on each call to its renderBlock, which averages out to 235.2 frames per render call over a long time period.
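
As an illustration only (this isn't NanoStudio's actual code), the sort of carry scheme sketched below produces frame counts which are always a multiple of 4 yet average 235.2 frames per call:

    #import <AudioToolbox/AudioToolbox.h>
    #include <math.h>
    #include <stdio.h>

    // Illustrative only: choose per-call frame counts that are multiples of 4
    // and average 44.1 / 48 * 256 = 235.2 frames over time.
    static void PrintExampleFrameCounts(void)
    {
        const double framesNeeded = 44100.0 / 48000.0 * 256.0;  // 235.2 source frames per output buffer
        double carried = 0.0;                                    // surplus frames rendered ahead of time

        for (int call = 0; call < 10; call++) {
            double shortfall = framesNeeded - carried;
            // Round the shortfall up to the next multiple of 4 (SIMD-friendly).
            AUAudioFrameCount framesToRender = (AUAudioFrameCount)ceil(shortfall / 4.0) * 4;
            carried += (double)framesToRender - framesNeeded;
            printf("call %d: render %u frames\n", call, (unsigned)framesToRender);
        }
        // Prints a sequence such as 236, 236, 236, 236, 232, ... averaging 235.2.
    }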

During mixdown (offline render), NanoStudio always uses a fixed number of frames (typically 256). Users may choose the sample rate for the mixdown, from 22.05kHz up to 96kHz. Audio units should be able to handle this range.

NanoStudio always follows these rules:

  • The number of frames requested by the call to the AU's renderBlock will always be a multiple of 4 (to aid SIMD optimization)
  • The number of frames requested will always be more than zero
  • The number of frames requested will always be less than or equal to maximumFramesToRender
  • If NanoStudio needs to change the sample rate or buffer size, it will first call deallocateRenderResources, set up the new properties and then call allocateRenderResourcesAndReturnError (see the sketch below). These functions will be called exactly once from the main thread, and no calls to the AU's renderBlock will be made in between.
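
From the host's side, that reconfiguration sequence looks something like the sketch below (illustrative only, not NanoStudio's actual code; avAudioUnit is assumed to be the instantiated AVAudioUnit):

    // Illustrative host-side sketch: tear down, apply the new format, then
    // re-allocate before any further renderBlock calls are made.
    AUAudioUnit *au = avAudioUnit.AUAudioUnit;
    [au deallocateRenderResources];

    NSError *error = nil;
    AVAudioFormat *format = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:48000.0 channels:2];
    for (NSUInteger i = 0; i < au.inputBusses.count; i++) {
        [[au.inputBusses objectAtIndexedSubscript:i] setFormat:format error:&error];
    }
    for (NSUInteger i = 0; i < au.outputBusses.count; i++) {
        [[au.outputBusses objectAtIndexedSubscript:i] setFormat:format error:&error];
    }
    au.maximumFramesToRender = 4096;

    if (![au allocateRenderResourcesAndReturnError:&error]) {
        NSLog(@"allocateRenderResources failed: %@", error);
    }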

Some AUs are hardcoded to a fixed rate (eg. 44.1kHz), or they use [AVAudioSession sharedInstance].sampleRate to obtain the sample rate. Instead, they must obtain the sample rate from the bus format specified by the host, which can be different to the AVAudioSession's hardware rate. NanoStudio sets all input and output buses of the AU to the same sample rate, so checking the first output bus is sufficient, as follows:

    // Read the sample rate the host has set on the AU's first output bus
    double sampleRate = [avAudioUnit.AUAudioUnit.outputBusses objectAtIndexedSubscript:0].format.sampleRate;

Some AUs expect to always get the number of frames specified by maximumFramesToRender. However, maximumFramesToRender is simply a hint from the host specifying the maximum number of frames it will ever ask for in a call to renderBlock, so that the AU can allocate worst-case buffers in its implementation of allocateRenderResourcesAndReturnError. The AU should instead render only the number of frames that the host passes to its renderBlock (which follows the rules outlined above).
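
In an AUAudioUnit subclass, that boils down to something like the sketch below (illustrative only; a real AU would run its DSP where this writes silence):

    // Illustrative internalRenderBlock: render exactly frameCount frames, which
    // may change from call to call and is always <= maximumFramesToRender.
    - (AUInternalRenderBlock)internalRenderBlock {
        return ^AUAudioUnitStatus(AudioUnitRenderActionFlags *actionFlags,
                                  const AudioTimeStamp *timestamp,
                                  AUAudioFrameCount frameCount,
                                  NSInteger outputBusNumber,
                                  AudioBufferList *outputData,
                                  const AURenderEvent *realtimeEventListHead,
                                  AURenderPullInputBlock pullInputBlock) {
            for (UInt32 buf = 0; buf < outputData->mNumberBuffers; buf++) {
                float *out = (float *)outputData->mBuffers[buf].mData;
                if (out == NULL) continue;  // a real AU would point this at its own pre-allocated buffer
                for (AUAudioFrameCount frame = 0; frame < frameCount; frame++) {
                    out[frame] = 0.0f;      // real DSP goes here, using frameCount (not maximumFramesToRender)
                }
            }
            return noErr;
        };
    }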

The symptoms of not handling the host's sample rate/buffer size correctly typically manifest as the instrument sounding out of tune, or as crackly/chopped audio because the wrong number of frames is rendered.

Parameter changes are not sent to the host's observer functions

NanoStudio registers two observers with the AU's parameter tree prior to calling allocateRenderResourcesAndReturnError:

  • tokenByAddingParameterObserver
  • tokenByAddingParameterRecordingObserver

These observer functions are used to record automation parameter changes and to mark the project as modified when an AU's parameter is altered using its native editor UI. Some AUs don't send these observer messages, so NanoStudio doesn't record automation movements and doesn't mark the project as modified, even though the user has changed the AU's parameters.

To ensure that these observer functions are called, the AU should use AUParameter.setValue when setting state from the UI.
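
For example, a control callback in the AU's own UI might look something like the sketch below (self.audioUnit, kMyCutoffAddress and _uiObserverToken are placeholder names):

    // Illustrative sketch: set the value via the AUParameter so that host
    // observers are notified. The identifiers here are placeholders.
    AUParameter *cutoff = [self.audioUnit.parameterTree parameterWithAddress:kMyCutoffAddress];
    [cutoff setValue:0.75f originator:_uiObserverToken];
    // Setting cutoff.value directly also notifies observers; passing the UI's own
    // observer token as the originator simply avoids echoing the change back to it.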

Providing overview parameters

NanoStudio will automatically create a default macro mapping if the user hasn't saved their own one. To do this, it uses the following rules:

  • It checks for the most 'useful' parameters by calling the AU's parametersForOverviewWithCount
  • If the AU doesn't provide any overview parameters, it simply chooses the first 10 parameters in the tree

If the AU specifies its own overview parameters, the default macro mapping (and therefore the user experience) is much better.
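
Providing them is a single method override in the AUAudioUnit subclass; a minimal sketch (the parameter addresses are placeholders):

    // Illustrative override: return the addresses of the AU's most useful
    // parameters, up to the number the host asks for.
    - (NSArray<NSNumber *> *)parametersForOverviewWithCount:(NSInteger)count {
        NSArray<NSNumber *> *addresses = @[ @(kCutoffAddress), @(kResonanceAddress),
                                            @(kAttackAddress), @(kReleaseAddress) ];
        NSUInteger n = MIN((NSUInteger)count, addresses.count);
        return [addresses subarrayWithRange:NSMakeRange(0, n)];
    }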

AUParameter.setValue doesn't work over the range specified by its minValue and maxValue

For example, if an AU parameter has a minValue and maxValue of 0.0f and 1.0f respectively, it should be possible to use setValue to set any value between 0.0f and 1.0f (inclusive). However, some AUs specify an incorrect range, and setValue doesn't work for all of the values which should be possible.
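
A quick way to catch this during development is a round-trip check over the whole parameter tree, something like the sketch below (assuming audioUnit is the instantiated AUAudioUnit):

    // Illustrative test: set each parameter to its declared minValue and maxValue
    // and read the value back to confirm the full range is accepted.
    for (AUParameter *param in audioUnit.parameterTree.allParameters) {
        AUValue targets[2] = { param.minValue, param.maxValue };
        for (int i = 0; i < 2; i++) {
            param.value = targets[i];
            if (param.value != targets[i]) {
                NSLog(@"%@ rejected %f (declared range %f..%f)",
                      param.identifier, targets[i], param.minValue, param.maxValue);
            }
        }
    }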

In summary

  • Always use the host's sample rate by reading it from the AU's output bus
  • Always render the number of frames passed to renderBlock, not the number of frames specified by maximumFramesToRender, as this is just a worst-case maximum
  • Provide overview parameters
  • Ensure that each parameter's minValue and maxValue are set up correctly. It's easy to add some test code which calls setValue on the minimum and the maximum values and then reads the value back to ensure it matches.

I'm always keen to improve AU compatibility, so feel free to drop me a line at https://www.blipinteractive.co.uk/contact.php and I'll be happy to help.

Also feel free to contribute to this thread if you think I've got anything wrong - this has been a learning experience for me and I won't claim to be an authority on this by any means.

  • Matt