Some time ago we talked about the advantages of synchronizing the digitizer sample clock to the laser. Our new system incorporates an on-board PLL multi-phase clock that allows Scanbox to automatically measure the contrast in a test image as a function of sample phase delay and select the optimal value for your setup.
A typical measurement showing the change in normalized contrast of a target image as a function of phase delay (here the range from -8 to 8 covers the entire 12.5 ns period of the laser).
Scanbox automatically shows the resulting images, on the same scale, for each phase delay value as well:
Scanbox will also plot the normalized images obtained for the settings that yield the lowest and highest contrasts:
One can clearly see from the images that phase does not simply scale the contrast of the images; it has an obvious effect on their SNR as well. (Exactly why this occurs is still a matter of debate here, but the data rule and the results are clear.)
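The automatic search described above can be sketched roughly as follows. This is only a sketch, not Scanbox's actual implementation: `acquire` is a hypothetical callback standing in for the real hardware interface, and the std/mean contrast used as a default here is just a placeholder metric.

```python
import numpy as np

def best_phase(acquire, phases, contrast=lambda im: im.std() / im.mean()):
    """Sweep the sample-clock phase delay, grab a test image at each
    setting, and return the phase giving the highest contrast.

    `acquire(phase)` is a hypothetical callback that programs the PLL
    phase and returns one frame as a 2-D array; `contrast` maps an
    image to a scalar score (std/mean is just a stand-in metric)."""
    scores = [contrast(acquire(p)) for p in phases]
    return phases[int(np.argmax(scores))], scores
```

With the setup from the plot above, `phases = range(-8, 9)` would cover the full 12.5 ns period of the laser.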
This option will be automatically available to those who adopt the new tower system. What can you do if you have one of the older systems? You can simply change the sample clock phase by extending the length of the cable running from the laser SYNC OUT to the external clock of the digitizer. Extensions 50 cm in length can be connected together to yield phase steps of about 1/8th of the laser period. So if you want to optimize the contrast and SNR of your images, take a day to find your optimal delay and improve the quality of your data.
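For reference, the delay added per length of cable depends on the cable's velocity factor, so the exact phase step per extension will vary with the coax you use. A back-of-envelope sketch (the velocity factor of ~0.66, typical for solid-PE coax such as RG-58, is an assumption; check your cable's datasheet):

```python
# Propagation speed in coax = velocity_factor * c.
C_M_PER_S = 299_792_458.0

def coax_delay_ns(length_m, velocity_factor=0.66):
    """One-way propagation delay of a coax cable, in nanoseconds."""
    return length_m * 1e9 / (velocity_factor * C_M_PER_S)

def coax_length_m(delay_ns, velocity_factor=0.66):
    """Cable length needed to realize a given delay, in meters."""
    return delay_ns * 1e-9 * velocity_factor * C_M_PER_S
```

For example, `coax_delay_ns(0.5)` gives roughly 2.5 ns per 50 cm extension at that velocity factor; invert with `coax_length_m` to find the length for the phase step you want.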
Thanks! Doing it on-board and automatically is a nice idea, especially because without a constant-fraction discriminator the trigger output from the laser is massively wavelength-dependent. This way one can quickly reset the phase delay for each wavelength…
May I ask how you define normalized contrast? (Max-Min) / Mean?
Tip: For variable manual phase settings we use (relatively) cheap coax-delays (like: http://www.thinksrs.com/downloads/PDFs/Catalog/DB64c.pdf).
The contrast of an image was defined as the 5-95 percentile range divided by the mean, taken over a square sub-image at the center of the field (to avoid the dead-bands). The relative contrast was computed relative to the image with the lowest contrast. So the graph above can be read as saying the optimal contrast was 2.3 times larger than the minimum. More interesting, of course, is quantifying SNR — which I did not do. But, at least experimentally, SNR seems related to contrast, something that has caused a lot of angry discussions with theorists here and elsewhere. But hey… data rules. Do you have similar measurements with your scope? I remember you shared this graph — http://goo.gl/l0PBYz
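In code, that metric might look like this minimal numpy sketch. The percentile range and normalization follow the definition above; the size of the central crop is an assumption, since the post does not specify it.

```python
import numpy as np

def contrast(img):
    """5-95 percentile range divided by the mean, computed over a central
    square sub-image to avoid the dead-bands (crop size is an assumption)."""
    h, w = img.shape
    c = img[h // 4: 3 * h // 4, w // 4: 3 * w // 4]
    p5, p95 = np.percentile(c, [5, 95])
    return (p95 - p5) / c.mean()

def relative_contrast(images):
    """Contrast of each image relative to the lowest-contrast image,
    so the minimum of the returned array is 1.0."""
    cs = np.array([contrast(im) for im in images])
    return cs / cs.min()
```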
…not yet – these measurements were in a fluorescein sea. But I will try to do the same measurement (same prep even) as soon as I find some time. We’re also upgrading to GHz FPGAs/digitizers that would allow even finer-grained lock-in sample picking.
In the end we stopped using lock-in sampling on our Alazar setups and simply put in 15 MHz in-line low-pass filters. The laser clock was simply too unreliable and wavelength-dependent, so lock-in sampling was too finicky for most users. I chose the filters empirically, going for the maximum brightness with our TL ti60 amps that still does not lead to “pixel smearing” at our commonly used dwell times. I should, of course, quantify that in SNR / contrast terms at some point.
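A back-of-envelope check for the “no pixel smearing” condition, assuming a simple first-order filter response. Real in-line filters are higher-order, and the 3-time-constant settling criterion is a rule of thumb, so this is only a rough guide.

```python
import math

def first_order_tau_ns(cutoff_hz):
    """Time constant of a first-order low-pass: tau = 1 / (2 * pi * fc)."""
    return 1e9 / (2.0 * math.pi * cutoff_hz)

def smears_pixels(cutoff_hz, dwell_ns, settle_taus=3.0):
    """Rough rule of thumb: the filter should settle (a few time constants)
    within one pixel dwell time, otherwise signal bleeds into neighbors."""
    return settle_taus * first_order_tau_ns(cutoff_hz) > dwell_ns
```

A 15 MHz cutoff gives a time constant of about 10.6 ns, which settles comfortably within a 100 ns dwell time by this criterion.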
It may take a while but I’ll post the results here.