Roli / MPE discussion and workarounds in NS2

edited December 2018 in General chat
This discussion was created from comments split from: Small sound test - ModelD vs. Obsidian.

Comments

  • edited December 2018

    @MattFletcher2000

    But I thought there's a need to do something special (in terms of app implementation) to support the Roli stuff (I mean specifically the Seaboard). So it means you just need to set a wider range for pitch bend and that's all?

  • edited December 2018

    @Cinebient

    Nice to know. I wonder if you can also set it individually for + and -.

    no, this is not possible...

    The other problem is that most DAWs on iOS (if any) can't record MPE data. That means recording exactly each event on each channel as you played it. Even a lot of desktop DAWs can't handle it.

    This should be easy in NS2, at least theoretically... you can create 16 tracks and set the same MIDI input hardware for every track, just a different MIDI channel (and/or a restricted range of accepted MIDI notes if you want to limit it)...

    then you hit record - and every track will record just the notes from that particular channel/range...

    In NS2 you can send MIDI from one channel to any number of other channels, and you can join MIDI notes from any number of channels back into a single channel - basically the same routing you can do with audio is also available for MIDI.
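    To make this concrete, here's a little sketch of the idea (plain Python with made-up event tuples - nothing to do with NS2's actual internals): an MPE controller spreads each note's events over its own "member" channel, so 16 tracks that share one MIDI input but each filter a different channel will capture the whole performance between them.

        # An incoming MPE stream as (midi_channel, event_type, data) tuples.
        incoming = [
            (2, "note_on",    (60, 100)),  # C4 starts on member channel 2
            (3, "note_on",    (64, 90)),   # E4 starts on member channel 3
            (2, "pitchwheel", 1024),       # per-note bend: only touches channel 2's note
            (3, "cc74",       80),         # per-note 'slide': only touches channel 3's note
            (2, "note_off",   (60, 0)),
            (3, "note_off",   (64, 0)),
        ]

        # One "track" per channel, like 16 NS2 tracks sharing the same MIDI
        # input hardware but each set to a different channel.
        tracks = {ch: [] for ch in range(1, 17)}

        for channel, kind, data in incoming:
            tracks[channel].append((kind, data))  # the channel filter

        for ch in (2, 3):
            print(f"track for channel {ch}: {tracks[ch]}")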

  • edited December 2018

    Do MIDI tracks also allow you to trigger other tracks from the sequencer?

    yes - imagine a situation where you have, for example, 5 Obsidian (or AU) tracks... now, if you group them and set "send midi to all child tracks" on the group channel, all MIDI notes on that group channel - no matter if recorded in a clip or received from an external MIDI keyboard - will be sent to all child tracks...

    (so MIDI feedback loops would kind of be possible?)

    feedback is not possible - if you try to do it, the app will prevent it... you'll get a red warning text near the send, and routing at that point will be disabled
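    I have no idea how it's implemented internally, but conceptually preventing feedback is just cycle detection on the routing graph. A rough sketch in Python (hypothetical names, definitely not NS2's actual code):

        # A MIDI send from `src` to `dst` is only allowed if `dst` cannot
        # already reach `src` through existing sends.
        def reaches(sends, start, target):
            """True if MIDI from `start` can reach `target` via existing sends."""
            seen, stack = set(), [start]
            while stack:
                node = stack.pop()
                if node == target:
                    return True
                if node in seen:
                    continue
                seen.add(node)
                stack.extend(sends.get(node, ()))
            return False

        def add_send(sends, src, dst):
            if reaches(sends, dst, src):  # the new send would close a loop
                raise ValueError("feedback loop - send disabled (red warning)")
            sends.setdefault(src, set()).add(dst)

        sends = {}
        add_send(sends, "Group", "Obsidian 1")        # group -> child: fine
        add_send(sends, "Obsidian 1", "Obsidian 2")   # chaining: fine
        # add_send(sends, "Obsidian 2", "Group")      # would raise: feedback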

  • @Cinebient said:

    Thanks! That's pretty much like in Logic, so it's exactly what I want.
    This is getting better and better here! :+1:
    Not sure if any other iOS DAW allows this yet.

    The problem I'm thinking of in a pseudo 4-voice MPE set-up such as:

    4 NS2 tracks recording 4 different MIDI channels from an MPE MIDI controller (e.g. a Roli Lightblock) rotating events around 4 MIDI channels

    is not the notes or the pitch bend messages, because as you say, they could all be routed to a single MPE-enabled AU instrument (which could handle the MPE MIDI data).

    The problem in NS2 is:

    • No aftertouch/poly pressure is recorded.
    • CC data is not simply forwarded on to the AU. You'd have to assign a macro, and I think this would mess up the per-MIDI-channel split you'd need to keep things separate.

    MPE, as Roli defines it, has 5 'dimensions':

    • Pitch bend ('glide')
    • CC74 on the Y axis of the keys ('slide')
    • Per-key aftertouch/poly pressure ('pressure')
    • Velocity
    • Release velocity (I never use it since it's so tricky to control and most of the AUs don't support it!)

    Right now NS2 could only be faked into supporting per-note pitch bend and velocity, as far as I can see. So only 2 of the 5. It would be great if it could at least handle poly pressure and/or CC74 slide.
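    For reference, here's what those five dimensions look like as raw MIDI bytes on one note's member channel (a sketch with plain Python tuples; channels are 0-based, so 1 below means MIDI channel 2 - note that MPE 'pressure' travels as channel pressure on the per-note channel):

        CH = 1      # member channel 2 (0-based): one note's private channel
        NOTE = 60   # middle C

        five_dimensions = {
            "strike (velocity)":       (0x90 | CH, NOTE, 100),  # note on, velocity 100
            "glide (pitch bend)":      (0xE0 | CH, 0x00, 0x50), # 14-bit bend, LSB then MSB
            "slide (CC74, Y axis)":    (0xB0 | CH, 74, 96),     # control change 74
            "press (aftertouch)":      (0xD0 | CH, 64),         # channel pressure
            "lift (release velocity)": (0x80 | CH, NOTE, 30),   # note off, velocity 30
        }

        for name, msg in five_dimensions.items():
            print(f"{name}: {[hex(b) for b in msg]}")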

  • Bit early to be focusing on MPE stuff imho. The car needs wheels before you pimp the upholstery ;)

  • edited December 2018

    @dendy said:
    @MattFletcher2000

    But I thought there's a need to do something special (in terms of app implementation) to support the Roli stuff (I mean specifically the Seaboard). So it means you just need to set a wider range for pitch bend and that's all?

    In short, you can get 'mono' pitch bend with the Seaboard (or similar) from its keyboard just by setting NS2 to a pitch bend range of ±48. I'm gonna try it later.

    But 'poly' per-note pitch bend is more complicated and indeed requires each app/AU to implement it.

    Edit: and that's just pitch bend. For per-note things like filter control, LFO control etc., the app needs to be specially enabled. But more and more are (e.g. the Moog apps, Quanta, Synthmaster One, Volt, Tardigrain, Roli Noise, the PPGs etc.).

    I appreciate it’s not always needed and it’s not everyone’s thing - but it sounds pretty great once you have it working.
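    If it helps anyone, the reason ±48 covers the whole surface is just arithmetic: pitch bend is a 14-bit MIDI value (0-16383, centre 8192), and the synth's bend range decides how many semitones the extremes map to. A quick sketch (assuming the usual linear mapping):

        def bend_semitones(bend14, bend_range=48):
            """Map a 14-bit pitch bend value to a semitone offset."""
            return (bend14 - 8192) / 8192 * bend_range

        print(bend_semitones(16383))              # ~ +48 semitones: full upward glide
        print(bend_semitones(8192))               #    0: centred, no bend
        print(bend_semitones(8192 + 8192 // 48))  # ~ +1 semitone: one key's width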

  • @MattFletcher2000 said:

    @dendy said:
    @MattFletcher2000

    it’s +/- 2 by default (init patch)

    just go to the mod matrix page and locate the "pitch wheel" source; it's by default routed to "Osc All: Coarse Tune" with amount "2". Change it to 48 ;)

    Brilliant. Thanks!

    Just tried this for real with the Roli Lightblock. Works a treat when the pitch bend amount is set to 48 - it allows you to bend full notes across the Roli playing surface (only mono though). Sounds really great with that Model D copy patch you shared in this thread, actually!

  • edited December 2018

    @Cinebient said:

    As far as I saw, Obsidian also supports release velocity (and that is sadly rare, even among desktop synths).
    But I wonder why NS2 wouldn't record aftertouch events and MIDI CC74. Does it ignore these messages?
    So in the worst case we have to use macros for this? I mean, if it works it's OK for me.

    Just been experimenting further.

    I don't actually think it ignores aftertouch messages. I believe (although I might be wrong) it passes them through to the AU when playing an external MPE device, but it doesn't record them.

    In terms of MIDI CC74, NS2 seems to block all external MIDI CCs from getting to the AU, unless you map the MIDI CC to a macro and then assign that macro to an AU parameter. This process then appears to end up feeding the CC values to the AU on a single channel (irrespective of what channel they originated on). I know this from testing and listening (you can hear the different CC74 values conflict and jump around).

    So it seems to me you can kind of emulate MPE by having, say, 4 NS2 tracks with 4 Obsidian instances on them - split across 4 channels. That would work. You could do this with any AU instrument too.

    But you can't have 4 NS2 external MIDI tracks routed to just 1 AU track with an MPE-enabled AU loaded. It doesn't work because the MIDI CCs don't have their MIDI channel respected/retained once they go into the AU via the 'macros' system.

    Never mind :).

    The only way round (which isn't too bad, and which I've tried successfully) is to run the MPE app outside of NS2, with 4 tracks of data (on different channels) being sent to it from NS2. With the right routing this works. You can record your MIDI data in NS2, record the audio output of the MPE app separately, and then, when you're happy, put it into your NS2 project as one or more samples on a 'Slate' instance. Just a bit more of a faff (and you still can't get the aftertouch element).
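    For anyone trying this, the 'rotating events around channels' part is easy to picture as a round-robin note allocator restricted to channels 2-5, so each note (and its later bends/CCs) lands on one of the 4 single-channel tracks. A sketch (made-up helper, not a Roli or NS2 API; real controllers allocate a bit more cleverly):

        from itertools import cycle

        class FourVoiceAllocator:
            def __init__(self, channels=(2, 3, 4, 5)):
                self._next = cycle(channels)  # round-robin over member channels
                self.active = {}              # note number -> assigned channel

            def note_on(self, note):
                ch = next(self._next)
                self.active[note] = ch
                return ch                     # send the note and its bends/CCs here

            def note_off(self, note):
                return self.active.pop(note)  # this note's events lived on this channel

        alloc = FourVoiceAllocator()
        for n in (60, 64, 67):                # a chord: each note gets its own channel
            print(f"note {n} -> channel {alloc.note_on(n)}")
        print(f"note 60 released on channel {alloc.note_off(60)}")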

  • edited December 2018

    @MattFletcher2000 said: Brilliant. Thanks!
    Just tried this for real with the Roli Lightblock. Works a treat when the pitch bend amount is set to 48 - it allows you to bend full notes across the Roli playing surface (only mono though). Sounds really great with that Model D copy patch you shared in this thread, actually!

    :+1:

    Btw, the other workarounds you suggested here would also be worth sharing on the AB forums for people who aren't active here...

  • @Cinebient said:

    Hmm, I still don't get why there is a problem with aftertouch.
    I would anyway use an external MPE controller outside of NS2 to record into the NS2 sequencer. So what I get so far is that when I want, for example, a 4-voice MPE-style synth, I would use 4 instances of Obsidian, setting them to channels 2, 3, 4 and 5 (and of course I would also set up my Seaboard for 4 voices).
    So if I now MIDI-learn a macro with CC74, don't I have all I need?
    Doesn't Obsidian support aftertouch as a source? Otherwise a MIDI modifier (which can transform aftertouch output into any other MIDI CC message) would be needed here.
    But then I would have all I need in theory: velocity, release velocity, aftertouch/pressure, slide (CC74) and glide (pitch bend).
    Now I wonder how it would handle the global channel on top, like mod wheel and pitch bend for all voices.
    But I also think the Roli controllers are a bit too limited. For example, if I just want to change slide to another CC, it's not possible, not even in the desktop Dashboard app. So we depend on the flexibility of the instruments and/or the MIDI tools in between.
    I think other MPE controllers are more flexible here.

    Yep. That setup would work, I think.
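    And the 'MIDI modifier' you mention is simple enough to sketch too: rewrite channel pressure (aftertouch) into some CC the synth can MIDI-learn, keeping the channel so the 4-instance split stays intact. In Python, with plain status-byte tuples (CC2 is just an arbitrary example):

        def aftertouch_to_cc(msg, cc_number=2):
            status, channel = msg[0] & 0xF0, msg[0] & 0x0F
            if status == 0xD0:  # channel pressure: (status, value)
                return (0xB0 | channel, cc_number, msg[1])
            return msg          # everything else passes through untouched

        stream = [
            (0x91, 60, 100),  # note on, MIDI channel 2 (0-based channel 1)
            (0xD1, 75),       # pressure on channel 2 -> becomes CC2 on channel 2
            (0x81, 60, 0),    # note off
        ]
        print([aftertouch_to_cc(m) for m in stream])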

  • edited December 2018

    Doesn't Obsidian support aftertouch as a source?

    yes, this would be REALLY handy for creating more expressive patches for keyboard players...

  • @Cinebient I did read somewhere that Apple has plans to remove 3D Touch in the next generations of devices... But it was just rumours, so no idea how real it is...

  • @dendy said:
    @Cinebient I did read somewhere that Apple has plans to remove 3D Touch in the next generations of devices... But it was just rumours, so no idea how real it is...

    I read this too, but I'm also not sure how real it is. I also heard it may only be removed from iPhones, but that is also just a guess/rumour. I hope they don't remove it, as it's another step in the right direction for responsive touch surfaces.

  • @toneman said:

    I read this too, but I'm also not sure how real it is. I also heard it may only be removed from iPhones, but that is also just a guess/rumour. I hope they don't remove it, as it's another step in the right direction for responsive touch surfaces.

    If they remove it, they will probably replace it with a better option. It doesn't only serve music purposes; it's handy in drawing or writing applications to simulate expressive touches.
