Binding Delay Oddities

I’ve set up a string of bindings to initialize a hardware synth (VL70) on Song Load:

Every red dot I have drawn represents a 50msec binding post-delay. The binding with three dots has a 150msec post-delay and looks like this:

[screenshot: BindingDelay_02_Screenshot 2024-02-15 065548]

… and the one-dot bindings with 50msec post-delay look like this:

[screenshot: BindingDelay_03_Screenshot 2024-02-15 065627]

ALL bindings are scheduled with “Other bindings invoked by the same event”, all have [X] Pause song and state… checked, and none has a pre-delay or a re-trigger limit.

The MIDI Monitor for the Green bindings above is:

… and for the Blue bindings above it shows:

Here are the issues:

  • The post-delay after the Green bindings is 150msec, but the actual timing runs around 100msec (92msec in this case: from 5,694.822 to 5,694.914)

  • The Controller 61 event should be delayed by 50msec from the Controller 60 event, but there appears to be no delay.

I may be misunderstanding how these bindings work … but any thoughts??

Hey Clint,

I’ve done some testing on this and the measured delay doesn’t always match the delay value plugged into the binding. For instance, a 100 ms delay I set is recorded by the MIDI monitor as executing at 100, 92, 110, or 120 ms when the binding group is invoked. I could understand it taking longer, since the system has to schedule a core for the task, but I don’t get how it could be less. Maybe @brad could help explain why. It’s like the hand-off from one delayed binding to the next delayed binding isn’t working consistently. Is the CC61 binding the last binding in the list of triggered items in your case?

Dave

No, there are a bunch after CC61, but they have no delays.

The big concern is the 150 msec delay. I’m trying to put in the delays that are needed by the VL70 to get it to work reliably, while keeping patch change times to a reasonable minimum.

I probably need to plot binding delay vs real-world delay … I suspect the difference is proportional to the binding delay (i.e. not a fixed offset between binding and real-world delay).

Agreed. I tried adding 25% and got a lot closer to always achieving the minimum value I wanted. So for 150 ms I went with 175 ms and it fired at 150 or a little more.

Here are some raw tests. I’m firing all of these from keyboard [Fn] keys **

** The reason I’m using Fn keys is that I want to be able to execute all the binding actions of a Song On-Load at any time. I’ve actually got several Fn keys … one (F5) executes all On Load bindings, and another (F4) executes just the actions to reset the VL70.

  • The Left column is the configured binding delay, in msec.

  • The Center (F4) column is the measured delay when the binding in question is executed first, from a global trigger.

  • The Right (F5) column is the measured delay when the binding in question is preceded by another global trigger that configures some of my routes; those bindings have no delays.

Delay  Actual Actual
-----  --F4-- --F5--
 50      26      4   (!!!)
 50      25      4
 50      24      4

100      74     59
100      74     46
100      82     45

150     125    104
150     125     92
150     122     94

200     181    154
200     170    143
200     180    150

500     472    452
500     471    441
500     479    442

Contrary to my hunch, the F4 bias seems to be around -25 msec and the F5 bias around -50 msec.
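
For anyone checking the arithmetic, here’s a throwaway sketch (plain Python, nothing Cantabile-specific) that just re-types the table data and averages the offsets:

# Averages of (measured - configured) delay from the table above, in msec.
f4 = [(50, 26), (50, 25), (50, 24), (100, 74), (100, 74), (100, 82),
      (150, 125), (150, 125), (150, 122), (200, 181), (200, 170), (200, 180),
      (500, 472), (500, 471), (500, 479)]
f5 = [(50, 4), (50, 4), (50, 4), (100, 59), (100, 46), (100, 45),
      (150, 104), (150, 92), (150, 94), (200, 154), (200, 143), (200, 150),
      (500, 452), (500, 441), (500, 442)]

def mean_offset(rows):
    return sum(actual - configured for configured, actual in rows) / len(rows)

print(f"F4 bias: {mean_offset(f4):.0f} msec")   # about -25
print(f"F5 bias: {mean_offset(f5):.0f} msec")   # about -51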

Haven’t got a clue why configuring (enabling / disabling) some routes would affect the binding delays in downstream bindings …

I think it’s a composite workload thing: reduce the components in the workload that contribute real-world delay, and the real-world delay is reduced.

Hi @Clint,

Thanks for posting this - I’ll take a look at this today and get back to you.

fwiw: the timing of delays for some bindings is not precise (it depends on whether it’s a purely MIDI-to-MIDI binding or the UI thread is involved), but it should be at least the delay amount and never less.

Brad

Hi Clint,

I just replicated your setup here, and I can definitely see the bindings firing too quickly and I’ll look into it.

But the second problem you describe with the blue bindings I’m not seeing. Can you send me a copy of your song/rack with the bindings and I’ll see if I can replicate it.

Brad

I’ve looked into it and the problem is not that the events are firing too early - it’s that the previous event is firing too late.

To explain… when bindings are scheduled like this, at the time the source event is triggered, the target actions are all scheduled into the future based on the current time stamp.

So say the current time is 10,000 and you have three bindings at +0, +50 and +100. Those bindings would be scheduled to trigger at 10,000, 10,050 and 10,100.

Now suppose that, because these are being dispatched on the UI thread and the UI is busy for a short period, the second event isn’t dispatched until 10,075. That doesn’t affect the third event, which is still scheduled for 10,100, so the final delay between them will be only 25 ms instead of 50 ms.
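
In other words (an illustrative Python sketch using the numbers from the example above, not the actual Cantabile scheduling code):

# All three targets get absolute timestamps up front, relative to the trigger.
trigger_time = 10_000                               # ms
offsets = [0, 50, 100]                              # configured post-delays
scheduled = [trigger_time + o for o in offsets]     # [10000, 10050, 10100]

# If the UI thread is busy and the second event actually dispatches at
# 10,075, the third event keeps its original timestamp of 10,100...
actual = [10_000, 10_075, scheduled[2]]

# ...so the gap between the second and third events shrinks to 25 ms.
gaps = [b - a for a, b in zip(actual, actual[1:])]
print(gaps)                                         # [75, 25]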

One solution to this would be to not schedule the following events until the previous events have actually been triggered, but that gets very complex very quickly.
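
Just to illustrate the difference in arithmetic (the complexity is in actually doing this across threads, not in the math), chained scheduling would keep the configured gaps intact:

# Chained scheduling: each delay is measured from the previous event's
# actual firing time, so a late dispatch pushes everything downstream
# later instead of eating into the next gap.
gaps_configured = [0, 50, 50]     # the +0/+50/+100 schedule, as relative gaps
lateness        = [0, 25, 0]      # the second event dispatches 25 ms late
now, actual = 10_000, []
for gap, late in zip(gaps_configured, lateness):
    now = now + gap + late
    actual.append(now)
print(actual)                      # [10000, 10075, 10125] -> gaps of 75 and 50 ms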

Thinking about it further:

  • For MIDI targets, the timing between events can be critical when you need to enforce a minimum delay between events for a particular device.

  • For all other binding targets, this kind of precision isn’t really required and jitter in the delays isn’t really a problem. (I think, correct me if I’m wrong)

  • Setting up sequences of MIDI events like this through a series of bindings is tedious because you need to get the scheduling right on each binding and there’s a whole bunch of similar bindings.

What if instead of trying to fix this, I added:

  • On a MIDI Target you could choose to send a MIDI “Sequence”
  • This would bring up an editor where you could enter a sequence of MIDI Events, with a time delay between each.

With this approach the entire sequence can be passed to the audio engine and its events can be scheduled sample-accurately. The editing becomes easier because you just have the one binding.
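
Conceptually it would boil down to something like this (a Python sketch of the idea only, not engine code; the sample rate and the raw MIDI bytes are just for illustration):

# A sequence of MIDI events with millisecond delays between them...
sequence = [                        # (delay before event in ms, raw MIDI bytes)
    (0,   bytes([0xB0, 0x00, 33])), # CC 0 (bank MSB) = 33, ch 1
    (0,   bytes([0xB0, 0x20, 1])),  # CC 32 (bank LSB) = 1, ch 1
    (150, bytes([0xC0, 127])),      # program change 127, ch 1
]

# ...is converted to absolute sample offsets once it reaches the audio
# engine, so the spacing is sample-accurate regardless of UI activity.
SAMPLE_RATE = 48_000                # Hz, assumed for illustration

def to_sample_offsets(seq, sample_rate=SAMPLE_RATE):
    out, elapsed_ms = [], 0
    for delay_ms, event in seq:
        elapsed_ms += delay_ms
        out.append((round(elapsed_ms * sample_rate / 1000), event))
    return out

for offset, event in to_sample_offsets(sequence):
    print(offset, event.hex())      # 0, 0, 7200 samples (150 ms at 48 kHz)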

The downside is that although the timing between events in a sequence could be guaranteed, the timing relative to other bindings (including other sequences) couldn’t, and those could still jitter or be offset.

Thoughts?

Sounds easier to manage and tightens up the delay accuracy, so I’m in favor of it.

Does this mean bindings on the UI thread could alter a sequence of timed bindings you had set up while it was executing? Could the MIDI sequence events on the audio thread be insulated until completion, and then let the next scheduled UI-thread binding carry on?

Last question: do you have an approximation of how much busy time on the UI thread would add to any given binding delay in a sequence like this?

Dave

The real-world scenario for me is to pause after sending MIDI to allow my hardware unit (VL70) to complete that task before sending more MIDI requests. However, I see now that this strategy pertains not only to the VL70, but to the C4 itself.

The [F5] scenario fires a prior operation that entails significant work: enabling routes, which fires bindings that cause Controller Bar updates, as well as setting levels, which changes gains on routes, which fires more bindings, etc. etc. That doesn’t happen in femtosecond time.

I believe I have to treat that complex operation the way I would treat a patch change on the VL70 and add a delay to that prior operation.

However, I am wondering if a system of semaphores might be preferable to delays, for complex tasks involving C4 itself. (see below)

To me, the complexity of an additional framework for delays seems daunting. Maybe I’m just not seeing how it would work, and it may be straightforward in the end, but at this juncture it does seem daunting.

That said, I have a history with delays that is not pretty (see below) … so I am wondering if a system of … Semaphores might be preferable …

Semaphores

Basically a semaphore could be declared and grabbed by a binding until the tasks of that binding are complete, and then the semaphore is released. Another, later binding would wait on that same semaphore and be blocked until the semaphore is released.

A queue of waiting bindings would be needed so that they wake up in the order in which they attempted to grab the semaphore.
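
To sketch what I mean (plain Python with standard threading primitives, purely to illustrate the idea, not a proposal for how Cantabile would actually implement it):

import threading
from collections import deque

class FifoSemaphore:
    """A binary semaphore that wakes waiters in the order they arrived."""
    def __init__(self):
        self._lock = threading.Lock()
        self._held = False
        self._waiters = deque()              # FIFO queue of blocked "bindings"

    def acquire(self):
        with self._lock:
            if not self._held:
                self._held = True            # free: grab it and carry on
                return
            gate = threading.Event()
            self._waiters.append(gate)
        gate.wait()                          # block until it's our turn

    def release(self):
        with self._lock:
            if self._waiters:
                self._waiters.popleft().set()   # hand off to the oldest waiter
            else:
                self._held = False

# A binding would acquire() the semaphore, do its work (e.g. send MIDI and
# wait for the VL70 to settle), then release(); a later binding blocks in
# acquire() until then, and waiters wake in the order they arrived.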

Begin Extended Somewhat-Related Side-trip

It’s the mid ’90s and I’m working for the NY and American Stock Exchanges. Bringing up all the computers needed to run the NY floor started at maybe 2 AM. Of course, some machines needed to boot only after other machines booted, or after other machines completed booting, or after other machines booted and then completed some set of tasks (loaded databases, etc.).

This boot sequence was controlled by a system (not mine) of timing delays. I would hear comments over my cubicle wall like “OK, just add another 3 second delay to X17354 and that should cover it”.

Then … Black Monday … the system repeatedly failed to come up because some host took longer to boot than expected. The markets opened an hour late, and all the computer folks lost their yearly bonus (which was pegged to the percentage of uptime of the computer systems).

My thought at the time was to develop a semaphore system …

End Side-trip


No, all I’m proposing is that the entire sequence of MIDI events be passed to the audio engine as a unit, where they can be properly scheduled in relation to each other. It wouldn’t affect other UI-based bindings.

It shouldn’t add any, but not sure I understand the question.

This might be preferable if you need to schedule non-MIDI things as well… but I’m not sure that’s an actual requirement for this level of precision? Semaphores make sense and would be the best way to guarantee these kinds of operations, but they also sound even more difficult to set up than the current binding system. What do you envision the UI for this looking like?

What I’m proposing is a way to set up a simple MIDI Sequence, with delays between events. That whole sequence would be a single MIDI binding point target and would be passed to the audio engine as a self-contained unit. Once it lands on the audio thread, the timing would be scheduled starting at that point and should be sample-accurate (which is sub-millisecond accuracy).

As for editing this MIDI sequence, it could be a GUI-style editor, or it could be more code-like, but basically it would let you create a single binding that does:

SendCC(0, 33, 1);   // cc 0, value 33, ch 1
SendCC(32, 1, 1);   // cc 32, value 1, ch 1
Delay(150);
SendPR(127, 1);     // program 127, ch 1

(and similar for your other sequence)


Just did a quick re-test with v4.165. Much improved.

I mainly looked at the first issue from Post 1 above (“The post-delay after the Green bindings is 150msec, but the actual timing runs around 100msec”)

… and did tests similar to the F5 tests in Post 5 above. Here are quick results:

  • With a delay on the Green binding of 50 msec,
    the actual delay is now 34 msec
    vs. a delay of 4 msec in v4.161.

  • With a delay on the Green binding of 150 msec,
    the actual delay is now 140 msec
    vs. a delay of 97 msec in v4.161.

  • With a delay on the Green binding of 166 msec,
    the actual delay is now 149 msec.

I’ve run some follow-up tests and found it is much tighter than before. There is still some shifting of the values for each binding on each triggering, but as stated it is much closer to the values loaded into the delay.

Hey Guys,

I did make some changes here for bindings triggered on the UI thread. Previously these bindings were getting quantized to the audio buffer size. Now they should be millisecond-accurate, but possibly late, which explains the compression of delays between bindings.

I still think the MIDI sequence binding point will be a more effective way of resolving this, but I haven’t had a chance to even start implementing it yet (I’ve been swamped with support emails this week for some reason).

Brad


Hey Guys,

I’ve been playing around with an idea here. Rather than implement an entirely new binding point target for MIDI sequences, I’ve updated the sys-ex binding point to allow delays. I’ve also added some helper functions for generating common (ie: non-sysex) MIDI events.

So you can now do this:

The delay function takes a milliseconds parameter and is encoded in the MIDI stream as a special sys-ex that’s picked up later by Cantabile and causes the events to be scheduled accordingly.

The timing is much more precise (within a millisecond or two).
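
The real byte layout is internal to Cantabile, but the general pattern is something like this (a Python sketch with a completely made-up sys-ex format, just to show the round trip):

# Made-up encoding for illustration: a delay in milliseconds packed into
# two 7-bit bytes inside a sys-ex message, then recovered downstream.
DELAY_TAG = 0x7D                     # arbitrary ID for this sketch only

def encode_delay(ms: int) -> bytes:
    return bytes([0xF0, DELAY_TAG, ms & 0x7F, (ms >> 7) & 0x7F, 0xF7])

def decode_delay(msg: bytes):
    if len(msg) == 5 and msg[0] == 0xF0 and msg[1] == DELAY_TAG:
        return msg[2] | (msg[3] << 7)    # milliseconds
    return None                          # not a delay marker; pass it through

print(decode_delay(encode_delay(150)))   # 150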

Is this useful?


Neat approach!

However … when thinking about how it would fit into my current (possibly ill-conceived) scheme for setting my VL-70 for a given patch / song (I’ve got hundreds of song files like this):

… where each of the bindings has a different delay (typically 0, 66, or 166 msec), based on my experience. All of these bindings would now need to be converted into a single SysEx binding to take advantage of midi_delay() … Hmmm.

Some issues I see:

  • Losing all the comments

  • Not sure how to handle output to different Cantabile ports (note in the screenshot that some go out the VL70 Mout port and later ones are sent to the RigCtrl port of the VL70 Control rack).

… and it feels like I’d be moving from a nicely crafted high-level language to Assembler code … for a substantial amount of my binding scaffolding …

My experience is that my current style of bindings fails less and less often as I’ve added more delays. It only happened once in Saturday evening’s show. The problem is I have to watch for it (and then manually hit my home-grown “re-fire all bindings” key). So …

On balance, I would be resistant to a wholesale change of my current use of individual bindings, and would probably just increase the delays as needed. I use odd amounts for my delays (66 and 166) so I can write a script to alter them wholesale across all of my JSON song files.
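
Something along these lines is what I have in mind (a rough Python sketch; the folder, file extension, “delay” key test, and new values are all guesses I’d verify against the actual song-file JSON first):

import json
from pathlib import Path

# Bump my distinctive delay values wholesale across all song files.
# The mapping, the key test and the file glob are placeholders.
BUMP = {66: 82, 166: 190}            # old delay (msec) -> new delay (hypothetical)

def bump_delays(node):
    if isinstance(node, dict):
        return {key: (BUMP.get(value, value)
                      if "delay" in key.lower() and isinstance(value, int)
                      else bump_delays(value))
                for key, value in node.items()}
    if isinstance(node, list):
        return [bump_delays(item) for item in node]
    return node

for path in Path("Songs").glob("*.cantabileSong"):    # folder/extension assumed
    data = json.loads(path.read_text())
    path.write_text(json.dumps(bump_delays(data), indent=2))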

If I could write a script to convert my binding stream into the single-binding SysEx, that would be cool … but it seems daunting …

Hi @Clint,

Yes I understand the concern and I wouldn’t suggest converting everything just for the sake of it. The idea was to provide another mechanism and I was especially thinking of the case where you need a delay between sending a bank select and then the program change.

As for comments, you can still have a comment on the binding itself, and the sys-ex bindings also support embedded // C-style comments.

ie: it’s another tool in the kit to use as appropriate.

I’m still thinking about your semaphore idea but I’m concerned that would get complex pretty quickly.

The other possibility is some sort of batched binding processing. The root cause of the issue is that, for the timings to be accurate, the entire set of MIDI events needs to be passed to the audio engine as a batch so they can be properly scheduled relative to each other. Because they’re currently passed one by one as each binding is invoked, the timing can jitter.

I’ll include the above-mentioned changes in the next build because they’re pretty much ready to go. If they’re useful then good; otherwise I don’t think they hurt anything.

Brad


Yep. I played around with some interface ideas and came up with nothing that was straightforward and comprehensible …

What is really needed is a message back from the hardware that says “ready for more MIDI …”