Roli / MPE discussion and workarounds in NS2

edited December 2018 in General chat
This discussion was created from comments split from: Small sound test - ModelD vs. Obsidian.

Comments

  • edited December 2018

    @MattFletcher2000

    But i thought there is a need to do something special (in terms of app implementation) to support Roli stuff.. (i mean specifically the Seaboard) So it means you just need to set a wider range for pitchbend and that's all?

  • @dendy said:
    @MattFletcher2000

    But i thought there is a need to do something special (in terms of app implementation) to support Roli stuff.. (i mean specifically the Seaboard) So it means you just need to set a wider range for pitchbend and that's all?

    It's not quite that simple, but you can also use the Seaboard as a "normal" keyboard. However, if you want to bend notes manually you have to set the synth to +-48 (in some cases +-24), or it will sound off when you do a manual pitch bend, no matter if it's a semitone or more than an octave.
    But since Obsidian seems low on CPU, you could just copy instances of an Obsidian preset, and each track should have its own MIDI channel.
    In general a Seaboard sends MIDI out per note/channel, so each new note will go to another MIDI channel, each with its own MIDI messages for bend, slide, glide, velocity and release velocity.
    You still have a global channel for the mod wheel and global pitch bend (which is normally channel 1, but could also be channel 16). That means you have up to 15 individual channels and 1 global channel.
    So you could set up to 15 Obsidian presets/instances to play as one instrument with MPE (see the sketch at the end of this post).
    Not sure how that routing works in NS2. There are a few apps which just always let all channels through, even if you set it to 1, 2.....etc. And some will ignore the global channel.
    I wish I could test it right now. For sure I will, and if that works I can't wait to set up some Obsidian performances for my Seaboard Rise.
    There are also some apps which send out MPE, like GeoShred and ThumbJam.
    So you might try it yourself.
    There are also workarounds with some MIDI tools like Polymer (Mac only sadly; maybe similar tools exist on iOS) where you can even split the MIDI out into channels per voice, or even send 2 notes to channel 2 and 3 to channel 4 and whatever, if you want.
    MPE is the future for me for expressive playing. It's not always needed, but it's always nice to have, and for me it's a standard by now.
    Of course it's great to have native MPE support in instruments, but in most modern DAWs it's easy to turn even any mono or non-MPE instrument into a polyphonic MPE monster :)
    The other problem is that most DAWs on iOS (if not all) can't record MPE data. That means it should record exactly each event on each channel as you played it. Even a lot of desktop DAWs can't handle it.
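
    To illustrate the per-note channel rotation I mean, here is a rough sketch in Python with the mido library (just an illustration of the concept, not NS2 code; the default output port is an assumption):

    ```python
    # Rough sketch of MPE-style note rotation (Python + mido, illustration only).
    # mido counts channels 0-15, so channel 0 = MIDI channel 1 (the global channel)
    # and channels 1-15 = MIDI channels 2-16 (the per-note "member" channels).
    import mido

    out = mido.open_output()  # default output port, assumed to exist on your system

    MEMBER_CHANNELS = list(range(1, 16))
    next_voice = 0

    def note_on(note, velocity):
        """Send each new note on the next member channel, so bend/slide/pressure
        on that channel affect only this one voice."""
        global next_voice
        ch = MEMBER_CHANNELS[next_voice % len(MEMBER_CHANNELS)]
        next_voice += 1
        out.send(mido.Message('note_on', channel=ch, note=note, velocity=velocity))
        return ch  # remember the channel so note_off and bends target the same voice
    ```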

  • edited December 2018

    @dendy said:
    @MattFletcher2000

    it’s +/- 2 by default (init patch)

    just go to mod matrix page, locate “pitch wheel” source, it’s by default routed to “Osc All: Coarse Tune” with amount “2”. Change it to 48 ;)

    Nice to know. I wonder if you can also set it individually for + and -.
    So e.g. sometimes it is useful to have a -2 but +5 setting.

  • edited December 2018

    @Cinebient

    Nice to know. I wonder if you can also set it individually for + and -.

    no, this is not possible...

    The other problem is that most DAWs on iOS (if not all) can't record MPE data. That means it should record exactly each event on each channel as you played it. Even a lot of desktop DAWs can't handle it.

    This should be easy in NS2, at least theoretically... you can create 16 tracks, set for every track the same midi in hardware, just a different midi channel (and/or also an accepted midi note range if you want to limit it) ...

    then you hit record - and every track will record just the notes from that particular channel / range ...

    In NS2 you can send midi from one channel to any number of other channels, and you can join midi notes from any number of channels back to a single channel - basically the same routing you can do with audio is available also for midi
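
    something like this, if i sketch the idea in python with mido (just a toy model of the concept, not actual NS2 code - every "track" listens to the same input and keeps only its own channel):

    ```python
    # Toy model of the 16-track channel split (Python + mido, illustration only).
    import mido

    tracks = {ch: [] for ch in range(16)}  # one bucket per channel, like 16 NS2 tracks

    with mido.open_input() as port:        # the shared hardware/virtual MIDI input
        for msg in port:
            if hasattr(msg, 'channel'):    # clock/sysex messages carry no channel
                tracks[msg.channel].append(msg)  # each "track" records only its channel
    ```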

  • edited December 2018

    @dendy said:
    @Cinebient

    Nice to know. I wonder if you can also set it individually for + and -.

    no, this is not possible...

    The other problem is that most DAWs on iOS (if not all) can't record MPE data. That means it should record exactly each event on each channel as you played it. Even a lot of desktop DAWs can't handle it.

    This should be easy in NS2, at least theoretically... you can create 16 tracks, set for every track the same midi in hardware, just a different midi channel (and/or also an accepted midi note range if you want to limit it) ...

    then you hit record - and every track will record just the notes from that particular channel / range ...

    In NS2 you can send midi from one channel to any number of other channels, and you can join midi notes from any number of channels back to a single channel - basically the same routing you can do with audio is available also for midi

    Ah sorry, I meant MPE recording in one track, plus all the automation, if you use native MPE instruments and want to replace them with e.g. another one but still get exactly the same automation, CC messages etc. on exactly the same channels. Even DAWs like Bitwig failed here (maybe it is solved now).
    They recorded all the data but not the specific MIDI channel with it (like Logic can).
    Otherwise NS2 seems to offer some great MIDI routing that even some desktop DAWs fail at, or where it's much more complicated to set up.
    Oh, a question: do MIDI tracks also allow triggering other tracks from the sequencer (so, possible MIDI feedback loops)?

  • edited December 2018

    Do MIDI tracks also allow triggering other tracks from the sequencer

    yes - imagine a situation where you have for example 5 Obsidian (or AU) tracks.. now, if you group them and set on the group channel "send midi to all child tracks" - all midi notes on that group channel - no matter if recorded in a clip or received from an external midi keyboard - will be sent to all child tracks ..

    (so, possible MIDI feedback loops)

    feedback is not possible, if you try to do it the app will prevent it.. you will get a red warning text near the send and routing at that point will be disabled
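
    roughly how i imagine that feedback guard working internally (my own toy python model, NOT actual NS2 code) - adding a send is refused if the destination can already reach the source:

    ```python
    # Toy model of send routing with MIDI feedback prevention (illustration only).
    sends = {}  # track name -> set of destination track names

    def would_loop(src, dst):
        """True if dst can already reach src, so adding src -> dst closes a cycle."""
        stack, seen = [dst], set()
        while stack:
            t = stack.pop()
            if t == src:
                return True
            if t not in seen:
                seen.add(t)
                stack.extend(sends.get(t, ()))
        return False

    def add_send(src, dst):
        if would_loop(src, dst):
            print("red warning: send disabled, would create a midi feedback loop")
            return False
        sends.setdefault(src, set()).add(dst)
        return True
    ```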

  • edited December 2018

    @dendy said:

    Do MIDI tracks also allow triggering other tracks from the sequencer

    yes - imagine a situation where you have for example 5 Obsidian (or AU) tracks.. now, if you group them and set on the group channel "send midi to all child tracks" - all midi notes on that group channel - no matter if recorded in a clip or received from an external midi keyboard - will be sent to all child tracks ..

    (so, possible MIDI feedback loops)

    feedback is not possible, if you try to do it the app will prevent it.. you will get a red warning text near the send and routing at that point will be disabled

    Thanks! That's pretty much like in Logic, so it's exactly what I want.
    This is getting better and better here! :+1:
    Not sure if any other iOS DAW allows this yet.

  • @Cinebient said:


    Thanks! That's pretty much like in Logic, so it's exactly what I want.
    This is getting better and better here! :+1:
    Not sure if any other iOS DAW allows this yet.

    The problem I'm thinking of in a pseudo 4-voice MPE set-up such as:

    4 NS2 tracks recording 4 different MIDI channels with an MPE MIDI controller (e.g. Roli Lightblock) rotating events around 4 MIDI channels

    is not the notes or the pitch bend messages, because as you say they could all be routed to a single MPE-enabled AU instrument (which could handle the MPE MIDI data).

    The problem in NS2 is:

    • no aftertouch/poly pressure is recorded
    • CC data is not simply forwarded on to the AU. You'd have to assign a macro, and I think this would mess up the per-MIDI-channel split you'd need to keep things separate.

    MPE, as Roli defines it, has 5 'dimensions':

    • pitch bend ('glide')
    • CC74 on the y axis of the keys ('slide')
    • per-key aftertouch/poly pressure (pressure)
    • velocity
    • release velocity (I never use it since it's so tricky to control and most of the AUs don't support it!)

    Right now NS2 can only be faked into supporting per-note pitch bend and velocity, as far as I can see. So only 2 of the 5. It would be great if it could at least handle poly pressure and/or CC74 slide.
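
    For reference, this is roughly how those 5 dimensions arrive as raw MIDI on each member channel (a Python/mido sketch of my understanding, not NS2 code - note that MPE 'pressure' normally arrives as channel aftertouch on the member channel rather than true poly pressure):

    ```python
    # Classify incoming MPE messages by 'dimension' (Python + mido, sketch only).
    import mido

    def classify(msg):
        if msg.type == 'note_on':
            return ('strike', msg.velocity)   # velocity (velocity 0 acts as a lift)
        if msg.type == 'note_off':
            return ('lift', msg.velocity)     # release velocity
        if msg.type == 'pitchwheel':
            return ('glide', msg.pitch)       # per-note pitch bend, -8192..8191
        if msg.type == 'control_change' and msg.control == 74:
            return ('slide', msg.value)       # CC74, y axis of the key
        if msg.type == 'aftertouch':
            return ('press', msg.value)       # channel pressure on a member channel
        return None                           # anything else (global channel etc.)
    ```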

  • Bit early to be focusing on MPE stuff imho. Car needs wheels before you pimp the upholstery ;)

  • edited December 2018

    @dendy said:
    @MattFletcher2000

    But i thought there is a need to do something special (in terms of app implementation) to support Roli stuff.. (i mean specifically the Seaboard) So it means you just need to set a wider range for pitchbend and that's all?

    In short, you can do 'mono pitch bend' with the Seaboard (or similar) using its keyboard just by setting NS2's pitch bend range to +-48. I'm gonna try it later (there's a quick check of the maths at the end of this post).

    But 'poly' per-note pitch bend is more complicated and indeed requires each app/AU to implement this.

    Edit: and that's just pitch bend. For per-note things like filter control, LFO control etc. the app needs to be specially enabled for this. But more and more are (e.g. the Moog apps, Quanta, Synthmaster One, Volt, Tardigrain, Roli Noise, the PPGs etc).

    I appreciate it’s not always needed and it’s not everyone’s thing - but it sounds pretty great once you have it working.
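
    And the maths check I mentioned - why the receiver's range matters for what a given bend value sounds like (plain Python, standard 14-bit pitch bend values; just a sketch, not NS2 code):

    ```python
    # Pitchwheel value (-8192..8191) needed for a bend of `semitones`,
    # given the receiver's bend range in semitones.
    def bend_value(semitones, bend_range=48):
        return round(semitones / bend_range * 8192)

    print(bend_value(1))      # ~171 -> one semitone when the range is +-48
    print(bend_value(12))     # 2048 -> one octave when the range is +-48
    print(bend_value(2, 2))   # 8192 (clips to 8191) -> full deflection is only
                              # 2 semitones when the range is +-2
    ```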

  • @MattFletcher2000 said:

    @dendy said:
    @MattFletcher2000

    it’s +/- 2 by default (init patch)

    just go to mod matrix page, locate “pitch wheel” source, it’s by default routed to “Osc All: Coarse Tune” with amount “2”. Change it to 48 ;)

    Brilliant. Thanks!

    Just tried this for real with the Roli Lightblock. Works a treat when pitch bend is set to 48 - it allows you to bend full notes on the Roli playing surface (only mono though). Sounds really great with that Model D copy patch you shared in this thread, actually!

  • edited December 2018

    @MattFletcher2000 said:

    The problem in NS2 is:

    • no aftertouch/poly pressure is recorded
    • CC data is not simply forwarded on to the AU. You'd have to assign a macro, and I think this would mess up the per-MIDI-channel split you'd need to keep things separate.

    MPE, as Roli defines it, has 5 'dimensions':

    • pitch bend ('glide')
    • CC74 on the y axis of the keys ('slide')
    • per-key aftertouch/poly pressure (pressure)
    • velocity
    • release velocity (I never use it since it's so tricky to control and most of the AUs don't support it!)

    Right now NS2 can only be faked into supporting per-note pitch bend and velocity, as far as I can see. So only 2 of the 5. It would be great if it could at least handle poly pressure and/or CC74 slide.

    As far as I saw, Obsidian also supports release velocity (and that is sadly rare even on desktop synths).
    But I wonder why NS2 wouldn't record aftertouch events and MIDI CC74. Does it ignore these messages?
    So in the worst case we have to use macros for this? I mean, if it works it's OK for me.

  • edited December 2018

    @Cinebient said:


    As far as I saw, Obsidian also supports release velocity (and that is sadly rare even on desktop synths).
    But I wonder why NS2 wouldn't record aftertouch events and MIDI CC74. Does it ignore these messages?
    So in the worst case we have to use macros for this? I mean, if it works it's OK for me.

    Just been experimenting further.

    I don't actually think it ignores aftertouch messages. I believe (although I might be wrong) it passes them through to the AU when playing an external MPE device, but it doesn't record them.

    In terms of MIDI CC74, it seems to block all external MIDI CCs coming in from getting to the AU, unless you map the MIDI CC to a macro and then assign that macro to an AU parameter. This process then appears to end up feeding all the CC values to the AU on the same channel (irrespective of what channel they originated on). I know this from testing and listening (you can hear the different CC74 values conflict and jump around).

    So it seems to me you can kind of emulate MPE by having, say, 4 NS2 tracks with 4 Obsidian instances on them, split across 4 channels. That would work. You could do this with any AU instrument too.

    But you can't have 4 NS2 external MIDI tracks routed to just 1 AU track with an MPE-enabled AU loaded. It doesn't work, because the MIDI CCs don't have their MIDI channel respected/retained once they go into the AU via the 'macros' system.

    Never mind :).

    The only way round (which isn't too bad, and which I've tried successfully) would be to run the MPE app outside of NS2, with 4 tracks of data (on different channels) being sent to it from NS2. With the right routing this works. You can record your MIDI data in NS2, and you can record the audio output of the MPE app separately and, when you're happy, put it into your NS2 project as one or more samples on a 'Slate' instance. Just a bit more of a faff (and you still can't get the aftertouch element).

  • edited December 2018

    @MattFletcher2000 said: Brilliant. Thanks!
    Just tried this for real with the Roli Lightblock. Works a treat when pitch bend is set to 48 - it allows you to bend full notes on the Roli playing surface (only mono though). Sounds really great with that Model D copy patch you shared in this thread, actually!

    :+1:

    Btw, the other workarounds you suggested here would also be worth sharing on the AB forums, for people who aren't active here...

  • edited December 2018

    @MattFletcher2000 said:


    Just been experimenting further.

    I don't actually think it ignores aftertouch messages. I believe (although I might be wrong) it passes them through to the AU when playing an external MPE device, but it doesn't record them.

    In terms of MIDI CC74, it seems to block all external MIDI CCs coming in from getting to the AU, unless you map the MIDI CC to a macro and then assign that macro to an AU parameter. This process then appears to end up feeding all the CC values to the AU on the same channel (irrespective of what channel they originated on). I know this from testing and listening (you can hear the different CC74 values conflict and jump around).

    So it seems to me you can kind of emulate MPE by having, say, 4 NS2 tracks with 4 Obsidian instances on them, split across 4 channels. That would work. You could do this with any AU instrument too.

    But you can't have 4 NS2 external MIDI tracks routed to just 1 AU track with an MPE-enabled AU loaded. It doesn't work, because the MIDI CCs don't have their MIDI channel respected/retained once they go into the AU via the 'macros' system.

    Never mind :).

    The only way round (which isn't too bad, and which I've tried successfully) would be to run the MPE app outside of NS2, with 4 tracks of data (on different channels) being sent to it from NS2. With the right routing this works. You can record your MIDI data in NS2, and you can record the audio output of the MPE app separately and, when you're happy, put it into your NS2 project as one or more samples on a 'Slate' instance. Just a bit more of a faff (and you still can't get the aftertouch element).

    Hmm, I still don't get why there is a problem with aftertouch.
    I would anyway use an external MPE controller outside of NS2 and record it into the NS2 sequencer. So what I get so far: if I want e.g. a 4-voice MPE-style synth, I would use 4 instances of Obsidian set to channels 2, 3, 4 and 5 (and of course I would also set up my Seaboard for 4 voices then).
    So if I now MIDI-learn a macro with CC74, don't I have all I need?
    Does Obsidian not support aftertouch as a source? Otherwise a MIDI modifier (which can transform aftertouch output into any other MIDI CC message) would be needed here.
    But then I would have all I need in theory: velocity, release velocity, aftertouch/pressure, slide (CC74) and glide (pitch bend).
    Now I wonder how it would handle the global channel on top, like mod wheel and pitch bend for all voices.
    But I also think the Roli controllers are a bit too limited. E.g. if I just want to change slide to another CC, it's not possible, not even in the desktop Dashboard app. So we depend on the flexibility of the instruments and/or the MIDI tools in between.
    I think other MPE controllers are more flexible here.
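
    Something like this is what I mean by a MIDI modifier, roughly (a Python/mido sketch, untested - the target CC number is just an example, it would be whatever the macro is learned to):

    ```python
    # Turn incoming channel pressure into a CC that a macro can MIDI-learn
    # (Python + mido, illustration only).
    import mido

    TARGET_CC = 1  # example choice - any CC number a macro could learn

    with mido.open_input() as inp, mido.open_output() as out:
        for msg in inp:
            if msg.type == 'aftertouch':  # channel pressure from the Seaboard
                out.send(mido.Message('control_change', channel=msg.channel,
                                      control=TARGET_CC, value=msg.value))
            else:
                out.send(msg)             # pass everything else through untouched
    ```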

  • @Cinebient said:


    Hmm, I still don't get why there is a problem with aftertouch.
    I would anyway use an external MPE controller outside of NS2 and record it into the NS2 sequencer. So what I get so far: if I want e.g. a 4-voice MPE-style synth, I would use 4 instances of Obsidian set to channels 2, 3, 4 and 5 (and of course I would also set up my Seaboard for 4 voices then).
    So if I now MIDI-learn a macro with CC74, don't I have all I need?
    Does Obsidian not support aftertouch as a source? Otherwise a MIDI modifier (which can transform aftertouch output into any other MIDI CC message) would be needed here.
    But then I would have all I need in theory: velocity, release velocity, aftertouch/pressure, slide (CC74) and glide (pitch bend).
    Now I wonder how it would handle the global channel on top, like mod wheel and pitch bend for all voices.
    But I also think the Roli controllers are a bit too limited. E.g. if I just want to change slide to another CC, it's not possible, not even in the desktop Dashboard app. So we depend on the flexibility of the instruments and/or the MIDI tools in between.
    I think other MPE controllers are more flexible here.

    Yep. That setup would work I think.

  • edited December 2018

    Does Obsidian not support aftertouch as a source?

    yes, this would be REALLY handy for creating more expressive patches for keyboard players ...

  • edited December 2018

    @dendy said:

    Does Obsidian not support aftertouch as a source?

    yes, this would be REALLY handy for creating more expressive patches for keyboard players ...

    ...as well as breath and expression. Then there is sustain and sostenuto and....... :)
    But aftertouch would be the first of these, since even cheap MIDI keyboards and iPhones with 3D Touch support it. Did I mention how awesome 3D Touch actually is for aftertouch? Almost as good (or even better in some cases) as my Seaboard Rise. I hope one day iPads get 3D Touch, oh that would be wonderful. Even on the small screen it's so much more expressive to play e.g. Model 15 on my iPhone than I could on an iPad. It's also very usable as long as you don't use more than 4 voices. But it sucks for velocity.

  • @Cinebient i did read somewhere that apple has plans to remove 3D touch in the next generations of devices... But it was just rumours so no idea how real it is...

  • edited December 2018

    @dendy said:
    @Cinebient i did read somewhere that apple has plans to remove 3D touch in the next generations of devices... But it was just rumours so no idea how real it is...

    I saw that too, but I can't believe it. If they did that I would leave all the iOS devices forever in a heartbeat. 3D Touch is THE feature for me, but it seems that many people don't even know what it can actually do. They think it's just a monophonic on/off switch like a mouse right-click. But it actually works very accurately, and with 5 fingers individually. It even seems to be a feature which shines best in music applications. So removing it would be really odd. But then they also removed the finger sensor thing and replaced it with the face lock thing I really hate.
    But then I also find the Touch Bar on the new MacBooks amazing for Mac DAWs, and even the large trackpad with Force Touch (kind of a monophonic 3D Touch) is super for actual real multi-touch in music tools. There is interesting software which takes advantage of it.
    But of course it's all maybe just useful for the niche we like.
    I'm even scared that someday Apple will abandon Logic... I hope that won't happen in the next few years.

  • @dendy said:
    @Cinebient i did read somewhere that apple has plans to remove 3D touch in the next generations of devices... But it was just rumours so no idea how real it is...

    I read this too, but I'm also not sure how real it is. I also heard it may only be removed from iPhones, but that is also just a guess/rumor. I hope they do not remove it, as it is another step in a good direction for responsive touch surfaces.

  • @toneman said:

    @dendy said:
    @Cinebient i did read somewhere that apple has plans to remove 3D touch in the next generations of devices... But it was just rumours so no idea how real it is...

    I read this too, but I'm also not sure how real it is. I also heard it may only be removed from iPhones, but that is also just a guess/rumor. I hope they do not remove it, as it is another step in a good direction for responsive touch surfaces.

    If they remove it, they will probably replace it with a better option. It doesn't only serve music purposes; it's handy in drawing or writing applications for simulating expressive touches.
