precise timing of the read-in lag
Posted: Mon Jun 07, 2004 2:41 pm
Hi, I still have one question connected with my previous posts.
For the application I have in mind, I need to analyse some data online and to provide a trigger output depending on the result. It is not paramount that the online analysis is fast, but it is essential that its timing is known.
The principle of the ring buffer and the variable data lag involved is clear to me. Having understood that the data seen by the computer is variably delayed with respect to reality, I try to measure this delay on the fly as precisely as possible. If it were known, I could program the trigger to be sent in the [computer] future, after a suitable wait time, adjusted by subtracting the current lag. I have already seen that with some hackery (highest priority in the proper threads, clock timer checked frequently, Wait VIs dynamically adjusted, etc.) I can achieve trigger timings with errors down to just a few msec. However, I still fail by several tens of msec in estimating the time lag. The formula I'm using is:
lag = [(buffer_pointer - 4 * display_index) modulo (4 * buffer_size_in_I32)] / 2^21 + (t2 - t1),
where 2^21 stands for 2 Msamples/sec, t1 is the timer value read by a clock located in a sequence box that encloses READ_POINTER (the big orange while loop in state 4), and t2 is the timer value at the very end of my calculation, just before computing the final delay before the trigger. Is this correct? Is there some variable contribution to the lag (e.g., USB data transfer) which I'm overlooking?
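To make the computation explicit, here is a minimal sketch (in Python rather than LabVIEW) of what my diagram does; buffer_pointer, display_index, t1, t2, target_delay and the buffer size are just placeholders for the corresponding values in my VI:

BUFFER_SIZE_I32 = 2**18   # ring buffer size in I32 words (placeholder value)

def estimate_lag(buffer_pointer, display_index, t1, t2):
    # Backlog between the hardware write pointer and my read position,
    # wrapped around the ring buffer; the factor 4 converts the I32 index
    # to the pointer's units, as in the formula above.
    pending = (buffer_pointer - 4 * display_index) % (4 * BUFFER_SIZE_I32)
    # Convert the backlog to seconds at 2^21 samples/sec and add my own
    # processing time t2 - t1.
    return pending / 2**21 + (t2 - t1)

def wait_before_trigger(target_delay, lag):
    # The trigger should fire target_delay seconds after the event in the
    # data, so only the remaining time is waited.
    return max(0.0, target_delay - lag)

The result of wait_before_trigger is what I feed into the dynamically adjusted Wait VI mentioned above.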
Thanks a lot in advance, Enrico