I know everything is MIDI-learnable, but it’s always good to have some actual params exposed. Maybe just basic ones like play, stop, etc.?
That’s complicated. Drambo is very flexible.
what’s a basic parameter?
what’s the parameter you expect in this use case?
maybe you don’t use the sequencer at all
Yes, I see what you mean. I assumed everyone would use the transport but I guess not.
Still, some settings are more global than others, no?
is it just one instrument without a seq?
what’s a basic parameter now?
we have 12 filter cutoffs,
which one is the important one? Maybe none of them.
do we have 12 tracks with 12 instruments?
Lots of apps expose lots of params I’ve never automated, but they’re still there.
But I do get what you’re saying. Can certainly live without them and just use MIDI Learn.
I like the idea of having, say, 8 or 12 dedicated Morph instances that are always exposed and that you can use like macro controls, automated in a linear host’s timeline (e.g. NS2, which doesn’t send CC messages but does automate AU params).
Those Morph/“macro” instances could then be assigned to whatever you like within Drambo, as currently, or left unused if you have no use for them.
That solves the riddle as to which of the myriad parameters should be exposed.
… and yes, I realise this is probably NS2’s problem (sort of), but as development there on something like sending CCs is unlikely to ever happen, implementing this idea would certainly help those of us who use Drambo purely for its sound-making abilities as an AU within a linear host, without using its sequencer.
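The Morph/“macro” idea above could be sketched roughly like this: one host-automatable knob fanned out to several internal destinations, each with its own range. A minimal sketch only; the type names and the `target` strings are illustrative assumptions, not Drambo’s actual module API:

```swift
// Sketch of a "macro" control: one exposed knob fanned out to several
// internal destinations, each with its own range.
// All names here are hypothetical, not Drambo internals.

struct Assignment {
    let target: String   // destination parameter name (illustrative)
    let min: Double      // value when the macro is at 0
    let max: Double      // value when the macro is at 1
}

struct Macro {
    var value: Double = 0.0          // the single host-automatable knob
    var assignments: [Assignment] = []

    /// Resolve the macro position into concrete per-target values.
    func resolved() -> [String: Double] {
        var out: [String: Double] = [:]
        for a in assignments {
            out[a.target] = a.min + (a.max - a.min) * value
        }
        return out
    }
}

var morph = Macro(value: 0.5, assignments: [
    Assignment(target: "filter.cutoff", min: 200, max: 8000),
    Assignment(target: "osc.detune",    min: 0,   max: 0.1),
])
print(morph.resolved())  // ["filter.cutoff": 4100.0, "osc.detune": 0.05]
```

The point is that the host only ever sees the one `value` knob; what it actually does inside the instrument is entirely up to the user’s assignments.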
miRack uses a set of assignable slots for exposed params. Possibly makes sense to do it the same way in D?
It could be a module converting 8 pre-mapped AUv3 parameters into incoming CC messages maybe.
Or maybe simpler: add an ‘expose in slot’ option to the MIDI Learn popover menu, with a list of slot numbers (1–?) to select?
It could also be another event type, available when D is hosted as an AUv3.
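The param-to-CC module idea could work something like the sketch below: each pre-mapped slot holds a CC number and channel, and a normalized parameter value is turned into a 3-byte Control Change message. The 8-slot count, the CC 20–27 assignments, and the 0.0–1.0 parameter range are all assumptions for illustration, not how Drambo actually works:

```swift
// Minimal sketch: map a normalized AUv3 macro parameter to a MIDI CC message.
// Assumptions (not Drambo internals): 8 macro slots, each pre-assigned a CC
// number on channel 1, and parameter values normalized to 0.0...1.0.

struct MacroSlot {
    let ccNumber: UInt8   // controller number this slot emits
    let channel: UInt8    // 0-based MIDI channel
}

// Hypothetical fixed mapping: slots 0-7 -> CC 20-27 on channel 1.
let slots: [MacroSlot] = (0..<8).map { i in
    MacroSlot(ccNumber: UInt8(20 + i), channel: 0)
}

/// Convert a normalized parameter value into a 3-byte Control Change message.
func ccMessage(slot: Int, value: Double) -> [UInt8] {
    let s = slots[slot]
    let clamped = min(max(value, 0.0), 1.0)
    let ccValue = UInt8((clamped * 127.0).rounded())
    return [0xB0 | s.channel,   // status byte: Control Change + channel
            s.ccNumber,
            ccValue]
}

print(ccMessage(slot: 0, value: 0.5))  // [176, 20, 64]
```

Internally the messages would feed Drambo’s existing MIDI Learn routing, so everything that is already CC-mappable would become host-automatable through the 8 exposed params.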
Yeah - all great ideas.
I was basing my Morph-module/“macro” idea on the way Serum and PhasePlant have implemented the same kind of thing on desktop (I think Ableton does this in Live too? I don’t use it personally).
There are some dials within the VST that the host knows about and can automate; you then map them (or not) to whatever you like within the VST, it’s up to you.
When it comes to hosts on iOS, I wouldn’t know whether they have to be aware of exposed params at the time the AU instrument is loaded, or whether they become aware dynamically as params are exposed “live”, as it were.
Likewise, I don’t know whether it’s possible on iOS to add (or remove) exposed params dynamically, on the fly?