Channel touch not working in poly multichannel subpatch?


#1

hi there, got my board a week ago and started programming some really simple stuff today. i tried to create a 4-channel synth with a subpatch and poly multichannel on. this works as expected, but as soon as i add channel touch into the equation, things start to go wrong. somehow the touch on one channel sometimes affects the other channels as well; i have not really found a pattern. am i doing something wrong? what i want is to control the cutoff frequency of the two filters with channel touch on each channel individually.
squarevoice_1.axp (4.3 KB)

multivoicetest.axp (7.4 KB)


#2

it's fine here....

I tested it with my soundplane, and each channel was definitely responding independently as expected.

to test further (as I usually use poly expression), I extended it to use pressure to drive the vca, pb for sliding pitch and cc 1 for controlling your sine pwm osc.... then played with it, and everything (pressure, cc, pb) was correctly independent per channel.

I played for a few minutes with my setup; since pressure was also controlling the envelope, any interference on channel pressure (touch) would have resulted in me hearing a change in gain, which would have been very noticeable.

how do you have this set up? what's feeding the midi, etc.?
it might be worth checking that the midi being sent is correct?
also, when you updated the sub patch... did you remember to press the update buttons?
(what you're describing sounds like polyphonic mode... non-multichannel, but the patch settings were for poly multichannel)

(btw: you don't need to post squarevoice_1.axp, as your main patch has the voice patch embedded)


#3

thanks for checking! can you post your patch, so i can try it and learn :slight_smile: ?

i tested it with puredata and sent the data over usb to the axoloti. i will check again, but i am pretty sure the puredata patch is alright. i can try with my actual instrument on tuesday.

cool that sub patches are embedded, i'm used to puredata...

good to know, it works for you.


#4

if you have puredata, you can try my patch. i get some notes on the axoloti that are much too bright (cutoff opened too much) even when touch out is set to 0 on the respective channel. you will have to rename the patch to axolotitest.pd

axolotitest.axp (4.9 KB)


#5

ok, did a bit more testing, and actually I can get 'stuck' voices with poly multichannel.... so possibly a bug in the voice allocation code, I'll take a look
EDIT: actually not a bug, the patch just doesn't take note status into account, see following post

here's the patch I'm testing with.... it's now kind of MPE compliant, as that's easier for me to test.
multivoicetest2.axp (8.3 KB)

EDIT: as comment above/post below, this has a bug, due to note status, hence notes will occasionally stick.

but if you are getting an MPE enabled controller, you should use poly expression and the mpe object, as I use these more frequently, so they don't have the above bug - the MPE object 'deals' with the issue.
here is your patch using it... as you can see it's much simpler too, as I already calc things like pitch and timbre :slight_smile:

multivoicempe.axp (6.8 KB)

EDIT: again, this won't work with your PD patch 'as you expect', due to issues mentioned in the following post.

actually sounds quite nice :slight_smile:

so what controller are you getting this week?


#6

it's my own diy controller, a midi bass with FSRs for aftertouch and ribbon softpots for the pitch. it has 4 "strings" and plays much like a regular bass (since i'm a bass player). maybe i can try to implement mpe on my controller (an arduino micro brain sending serial data and midi over usb). i'm just not home atm


#7

cool... yeah, implementing it as MPE compliant is a good idea, as it's an 'emerging' standard.

the main features are:
send global (non-per-note) expression on channel 1
send all note messages and per-note expression on channels 2-16
(it's the starting at channel 2 that catches most people out!)

per-note expression = pitch bend, channel pressure, and CC 74 as 'timbre'

the pitch bend range is by default assumed to be 48 semitones. (it can be altered, but for your controller just assume it for now, so you don't need to start sending RPN messages :wink: )

the rest of the MPE spec is not being closely followed yet, so if you do the above... you'll be pretty MPE compliant, and certainly axoloti will work fine.
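the channel layout above can be sketched as a tiny sender. this is a minimal sketch, not a full MPE implementation - all the names (MpeSender, FIRST_MEMBER_CH, etc.) are made up, and real code would write the bytes to a MIDI port instead of collecting them in a list:

```python
# Minimal sketch of the MPE channel layout described above. All names
# are hypothetical; real code would send the bytes to a MIDI output.

class MpeSender:
    GLOBAL_CH = 0        # MIDI channel 1 (0-based): global expression only
    FIRST_MEMBER_CH = 1  # note messages and per-note expression start on ch 2

    def __init__(self):
        self.out = []    # collected raw MIDI messages, as tuples of bytes

    def note_on(self, string, note, velocity):
        ch = self.FIRST_MEMBER_CH + string        # one "string" per member channel
        self.out.append((0xE0 | ch, 0x00, 0x40))  # centre pitch bend (8192)
        self.out.append((0xB0 | ch, 74, 64))      # CC 74 'timbre' to centre
        self.out.append((0x90 | ch, note, velocity))

    def note_off(self, string, note):
        ch = self.FIRST_MEMBER_CH + string
        self.out.append((0xD0 | ch, 0))           # clear channel pressure on release
        self.out.append((0x80 | ch, note, 64))

s = MpeSender()
s.note_on(0, 40, 100)   # low E on "string" 0 goes out on MIDI channel 2
print(s.out[-1])        # -> (145, 40, 100): note on, channel 2 (0x91)
```

resetting pitch bend, timbre and pressure right before the note on is the detail that avoids the 'stale expression' problems discussed later in this thread.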

I've just had a quick look at the code, and it's not really a bug in poly multichannel... more a 'feature', and something I take care of in the MPE object - probably what's causing your issues too.

poly multichannel is simple... it basically applies CC/PB/touch only when a note is active.
but this can cause issues, since if you change any of these whilst a note is not active, it will not change the voice.
this is further complicated by the fact that the midi channel does NOT have a direct correlation to the voice (for good reasons I won't go into)

so imagine the following... all on channel 1, assume the filter starts closed
note on (60)
channel pressure 127 - opens filter as expected, on voice 1
note off (60)
channel pressure 0 (under the hood this is 'ignored')
note on (61) ... filter is heard fully open !!!

now this gets even more complicated if we have round-robin note-to-voice routing.
this is all on midi channel 1, assume 4 voices
note on (60) (assigned voice 1)
channel pressure 127 - opens filter as expected, on voice 1
note off (60)
channel pressure 0 (under the hood this is 'ignored')
note on (61) (assigned voice 2) ... filter is heard closed (its start state!)
note off (61)
note on (62) (assigned voice 3) ... filter is heard closed (its start state!)
note off (62)
note on (63) (assigned voice 4) ... filter is heard closed (its start state!)
note off (63)
note on (64) (assigned voice 1) ... filter is heard OPEN !!!
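the walkthrough above can be simulated in a few lines. this is a sketch of the behaviour being described, not Axoloti's actual allocator - all names are made up:

```python
# Sketch of round-robin voice allocation with "blind" expression routing:
# channel pressure is applied only to voices with an active note, and a
# voice's pressure is never reset, reproducing the walkthrough above.

class RoundRobin:
    def __init__(self, n_voices=4):
        self.pressure = [0] * n_voices   # per-voice pressure state (filter cutoff)
        self.note = [None] * n_voices    # per-voice active note, or None
        self.next_voice = 0              # next voice to assign

    def note_on(self, note):
        v = self.next_voice
        self.next_voice = (self.next_voice + 1) % len(self.note)
        self.note[v] = note              # NOTE: pressure is NOT reset here

    def note_off(self, note):
        for v, n in enumerate(self.note):
            if n == note:
                self.note[v] = None      # voice freed, stale pressure kept

    def channel_pressure(self, value):
        for v, n in enumerate(self.note):
            if n is not None:            # ignored when no note is active
                self.pressure[v] = value

rr = RoundRobin()
rr.note_on(60); rr.channel_pressure(127); rr.note_off(60)
rr.channel_pressure(0)                   # 'under the hood' this is ignored
for n in (61, 62, 63):                   # voices 2-4 sound with filter closed
    rr.note_on(n); rr.note_off(n)
rr.note_on(64)                           # round-robin wraps back to voice 1...
print(rr.pressure[0])                    # -> 127: filter heard OPEN !!!
```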

the reason for all this is that poly multichannel really isn't a protocol; it doesn't say anything about what midi messages mean, it just says to blindly route the messages.

this is where MPE kicks in, as it defines what the midi messages mean, and so can have some logic to deal with the above edge cases - in particular, you can do things like reset pressure and timbre on note off!

of course, with poly multichannel you can (partially) work around this in your patch, i.e. do as I'm doing in MPE.
basically, what you do is use the gate...
so rather than feed touch directly into (say) your filter or vca, first multiply it by the gate...
this means that when the gate is inactive, the filter will close....
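the gate trick boils down to one multiply. a minimal sketch (the helper name is made up):

```python
# Sketch of the gate workaround described above: instead of feeding touch
# straight into the cutoff, scale it by the gate, so an inactive voice
# always falls back to a closed filter. Name is hypothetical.

def cutoff_mod(touch, gate):
    """touch: 0..127 channel pressure; gate: 1.0 while the note is held, else 0.0."""
    return (touch / 127.0) * gate    # 0 whenever the gate is inactive

print(cutoff_mod(127, 1.0))   # -> 1.0: note held, filter fully open
print(cutoff_mod(127, 0.0))   # -> 0.0: stale touch ignored, filter closed
```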

in your PD patch this still won't work as you expect, though... if you (say) open the filter whilst a note is not active, it won't do anything.... but this is because you are not conforming to per-note expression.
what you would need to do is change your PD patch such that you send the touch value every time you start a note.
(this is still not quite perfect, for complicated reasons, but it's close enough :wink: )

what's worth noting is that, when using your controller, this should all just work.... as you should make sure you only send per-note expression while a note is active.

this probably all sounds a bit complicated... but you get used to it. per-note expression, like everything, is a simple concept, but there are certain 'details' required to make it work musically :wink:

anyway hope this helps
Mark


#8

thanks for the in-depth answer. it all makes sense to me.

my controller already supports most of mpe, it just sends on channels 1-4. i'll try changing that to 2-5 and see how things work. as of now, i have no general expression planned on this instrument.

btw, how do you handle glide and legato playing (new note on before old note off on a mono synth)? on traditional synths this just disables the adsr retrigger and, if a glide time is set, glides to the next note. is there an easy way to do this with axoloti? (the second gate on the midi in??) i haven't really looked into it, just thought i'd ask :smile:


#9

using MPE?
continuous glides in MPE are done with pitch bends over a large pitch bend range (default 48 semitones). (this is to avoid envelopes being re-triggered)
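the 48-semitone range maps onto the 14-bit pitch bend value like this (a minimal sketch; the helper name is made up):

```python
# Sketch: convert a glide offset in semitones to a 14-bit MIDI pitch bend
# value, assuming the MPE default range of +/-48 semitones. Name made up.

def semitones_to_bend(semitones, bend_range=48):
    value = round(8192 + (semitones / bend_range) * 8192)
    return max(0, min(16383, value))   # clamp to the 14-bit range

print(semitones_to_bend(0))    # -> 8192 (centre, no bend)
print(semitones_to_bend(12))   # -> 10240 (glide up one octave)
print(semitones_to_bend(-48))  # -> 0 (bottom of the range)
```

the wide range is what makes continuous glides practical: with the usual +/-2 semitone default, a one-octave slide would already be out of range.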

legato.... hmm, theoretically this is covered in the keyb/mpe object - as you say, with the note on before the note off... I would have to check whether the voice allocation code works with this. I've not had a problem with it, so I think it's ok.

I think your 'challenge' is that if you don't have another MPE device to play with, you cannot see what messages it emits - so it's hard to test your implementation. there are (as shown above) quite a few 'corner cases' to get it to work well.

a couple of things that might help you....

  • most of the mpe controllers have open source code, so you can see how they handle certain scenarios etc... though their own complexities might make it tricky.
    the soundplane code is reasonably easy to read/straightforward; here is the midi output, which contains what you are interested in:
    https://github.com/madronalabs/soundplane/blob/master/Source/SoundplaneMIDIOutput.cpp

  • I think there are some iPad apps which now output MPE, so that would help you see MPE in action.

  • there are some iPad apps and VSTs that support incoming MPE, so you can test your implementation against them.

  • finally, Max/MSP supports both input and output of MPE. it's pretty new though, so I don't know how well it conforms... or how much it will prevent you from outputting 'spurious' midi.
    but you can get a free trial I think, which might be enough to try it out.


#10

no, i actually meant how it is done with axoloti if you don't use the mpe way but just send a new note on before the old note off to get legato playing, and then add glide (portamento) if you want (the old-school midi way...)
EDIT: see my post here for a solution: https://sebiik.github.io/community.axoloti.com.backup/t/mono-legato-detecting-overlapping-notes/1780/12

i think i cannot implement pitchbend in my controller, since the soft pots are nominally linear but not really. the spacing between semitones is also not consistent, just as on a regular bass. additionally, i use a pull-up resistor to determine when no fret is pressed, which makes the voltage divider even less linear. all that, plus the limited accuracy of the arduino adc, makes it hard to implement. i now have a lookup table with every fret position saved (with a calibration routine).
if i glide on the string, the controller just sends legato notes (as described), and if i adjust my synths accordingly i get very smooth glides. :slight_smile:
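the calibration-table approach described above can be sketched like this. the table values, threshold, and names are all made up for illustration; a real table would come from the calibration routine:

```python
# Sketch of looking up a fret from a raw ADC reading via a calibration
# table, as described above. CAL values and the open-string threshold are
# invented; a real controller fills CAL via its calibration routine.

CAL = [15, 92, 165, 240, 318, 399]   # one (nonlinearly spaced) ADC value per fret
OPEN_THRESHOLD = 900                 # pull-up drives the reading high when no fret is pressed

def fret_from_adc(reading):
    if reading >= OPEN_THRESHOLD:    # pull-up dominates: open string
        return None
    # nearest calibrated position wins, which tolerates the nonlinear
    # spacing and the limited ADC accuracy
    return min(range(len(CAL)), key=lambda f: abs(CAL[f] - reading))

print(fret_from_adc(170))    # -> 2: closest to the calibrated 165
print(fret_from_adc(1000))   # -> None: open string
```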

thanks for all your time and effort!!


#11

one other question: i now send only on channel 2, and i want a mono synth with legato sounds not retriggering. i use the first gate, but i can still play polyphonically on that one channel - up to four voices (the same as the subpatch voice count). how do i make the subpatch mono, so that i get four mono voices? neither poly multichannel nor poly expression works as expected; they are both polyphonic on each channel.

multivoicetest.axp (7.7 KB)


#12

glide - look at something like a smooth object.
a mono synth is a subpatch with one voice.

in axoloti the best way to do this is to edit the voice patch, change its settings to mono, and tick the midi selector. now you get to choose the midi channel for the voice patch on the parent patch - so you can duplicate the voice patch and change the midi channel for each.

(as I mentioned, the voice allocation and midi channel are not a one-to-one mapping in the poly modes - and yes, the mpe spec etc. allows for polyphony on one midi channel - but of course per-note expression is sacrificed.)


#13

thanks!! that works. the slight disadvantage is that i now have to edit each voice individually if i change something...

i thought i read somewhere that poly multichannel is for guitar midi controllers, and hence assumed it was what i have now implemented with the individual mono voices. i therefore ask for a polyphonic guitar mode :slight_smile:

that works perfectly, thanks. actually there are about 5 or 6 different implementations available!


#14

that was one use... not the only :slight_smile:

at some point we'd like to make it such that users can alter the voice allocation algorithms, but it's not 'simple' to do.

make the parameters inlets, and then attach dials to the inlets - that way you can share...
not ideal, but it can work. another way is to use a mod source and attach it to dials...
the only drawback is you lose 'units', as the dials don't have things like 'pitch', hz, etc.


#15

yeah, i get that. what i meant is that when i change some objects in the subpatch that i copied, i have to change them in every subpatch - but maybe there is a workaround for that as well...


#16

one suggestion would be an option at creation of a subpatch, like "master subpatch", which would then mirror all changes made to this patch in all copied instances. but maybe that is quite hard to do, since i guess the axoloti editor is not aware of a copy being a copy. it kind of works when i create a subpatch and save it as an axs though, but i have to restart the main patch to see the changes.


#17

exactly, and it's those kind of 'extensions' that bite you in the a*s later...

better to tackle the underlying shortcoming, which is to allow users more control over voice allocation.

essentially what you want (a 1:1 channel = voice mapping) is simple enough to implement. it's just that you can imagine a huge number of similar things... so hard-coding all the possibilities is not practical; we need to 'give this up to the end-user'...


#18

that's of course true and i totally agree this should be up to the user! if voice allocation can be improved this would be huge!


#19

If you're making a guitar (bass) controller, then I recommend using the original MIDI Channel Mode 4 (Omni Off, Mono) that has been supported by MIDI guitar since the beginning of time.

About the only thing that MPE changes is attempting to support multiple notes per channel, which doesn't make sense on any guitar since each string can only play one note at a time. MPE does have the cool feature of an associated global channel that saves sending Pitch Bend on every channel when you hit the whammy bar, but I don't think I see a whammy bar on your MIDI bass controller.


#20

thanks for your message and suggestions. i actually have two controllers on my midi bass that are general modulations (equal on all four strings): one is a joystick, the other a recently added breath controller. i took a hybrid approach, since i have both serial midi out and usb midi out.

serial midi out behaves exactly as you describe, so it conforms to midi channel mode 4. since i have many old analog midi synths, i never planned to implement anything else there, because they simply don't support it.
on usb midi out i implemented something like mpe: channels start at 2, and 1 is used for general modulation only. this reduces the midi messages over usb by quite a bit and allows for less cpu-hungry patching on the Axoloti. i found the poly subpatch modes don't work reliably with midi guitar mode (one note per channel), so i use four equal sub-patches, each set to the corresponding channel. this works perfectly but is a bit of a hassle when setting parameters.