Month: March 2014

A Processing Grating Shader

We have been using Processing to display visual and auditory stimuli in the lab for a while. Some of you have asked for examples of full-screen gratings using shaders. So this is not about Scanbox proper, but it should still be useful for those of you studying sensory systems.

Below is one demonstration of a drifting sinusoidal grating shader. It shows a full-field grating whose orientation (the ‘a’ and ‘s’ keys) and spatial period (the ‘w’ and ‘q’ keys) you can change on the fly, watching the grating update in real time.

// Processing 2 Sinusoidal Grating Shader

PShader myshader;

void setup() {
  size(displayWidth, displayHeight, P2D);
  myshader = loadShader("sine.frag");
  myshader.set("resolution", float(displayWidth), float(displayHeight));
  noCursor();
}

float per = 200.0; // period in pixels
float th = 0.0;    // orientation in radians
float c = 0.5;     // contrast

void draw() {

  if (keyPressed) {
    switch(key) {
      case 'a':
        th += PI/360.0;  // rotate CW
        break;
      case 's':
        th -= PI/360.0;  // rotate CCW
        break;
      case 'w':
        per = per*1.05;  // increase period
        break;
      case 'q':
        per = per*0.95;  // decrease period
        break;
    }
  }

  myshader.set("th", th);
  myshader.set("sper", per);
  myshader.set("contrast", c);
  myshader.set("time", millis()/1000.0);
  shader(myshader);
  rect(0, 0, displayWidth, displayHeight);
}

boolean sketchFullScreen() {
  return true;
}

The code for the shader itself, “sine.frag”, is more definitions than anything else:

#ifdef GL_ES
precision mediump float;
#endif

#define PROCESSING_COLOR_SHADER

uniform float time;
uniform vec2 mouse;
uniform vec2 resolution;

uniform float mean;
uniform float contrast;
uniform float tper;
uniform float sper;
uniform float th;
uniform float th_speed;

const float PI = 3.1415926535;

void main( void ) {

  float sth = sin(th);
  float cth = cos(th);

  // project the pixel coordinate onto the grating axis; adding 'time' makes it drift
  float color = 0.5 + contrast*sin(2.0*PI*((gl_FragCoord.x/sper*cth + gl_FragCoord.y/sper*sth) + time));

  gl_FragColor = vec4(color, color, color, 1.0);

}

Yup, that’s all…

Of course, there are many wonderful things you can do with shaders that make gratings look somewhat boring… and there are many places to share and test them. The one we recommend is Shadertoy.

Processing also has some neat libraries for interfacing with other hardware that will make your life simpler when programming new experiments.  Here is an example controlling the shader’s parameters (orientation, spatial frequency and contrast) via a Leap Motion sensor running on my laptop.

We synchronize visual stimulation in Processing with Scanbox using an Arduino board.  Scanbox has two TTL inputs that timestamp the rising edges, falling edges, or both edges of a TTL signal with the (frame,line) at which they occurred.  Processing communicates with the firmware on the Arduino through the Serial library.  More about this later…
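In the meantime, here is a minimal sketch of reading those timestamps back on the analysis side. The field names (info.frame, info.line) are assumptions for illustration; verify them against the Matlab file saved with your own data.

% Sketch: read back the TTL event timestamps saved with the raw data.
% The 'info' structure and its fields are assumptions for illustration.
load([fname '.mat'], 'info');
events = [info.frame info.line]; % one row per TTL edge: (frame,line)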

Another advantage of using Processing is that you can easily deploy the code on different target architectures.  Here is an example of a few transparent gratings running on both a Mac and my Android tablet.

A bunch of gratings running on my Mac and Android tablet.


Recursive Image Alignment and Statistics

If it is awake, it will move.  And if it moves, your images are likely to move too.  This means that, almost surely, the first step in processing your data will be to align (or register) all the images in the sequence. This is such a basic problem that there are probably as many solutions as readers of this blog, and I invite you to share yours in the comments below.

Below is our approach, which seems to work reasonably well.  (At some later time I will discuss the possibility of doing this alignment in real-time as images are being acquired.)

Suppose we have a stack of images x_i that we want to align.  We will write a function that takes this set of images as input and returns a corresponding set of transformations t_i that, when applied to each of the images, results in their alignment.  The function also returns the mean image after alignment, \mu.

The general strategy consists of splitting the sequence into two sets, A and B.  The set A holds the first half of the sequence and B the second half.  Then we recursively call our alignment function to register the set A, which returns (\mu_A, {t_i^A}), and then the set B, which returns (\mu_B, {t_i^B}).

The final step is to find the transformation t_{AB} required to align the mean image of A with the mean image of B, and to return the mean after alignment, \mu = (|A| t_{AB}\mu_A + |B| \mu_B)/(|A| + |B|), along with the transformations (t_{AB} \odot t_i^A, t_i^B).  Here, \odot denotes the composition of transformations.  That is, the transformation t_{AB} is applied to all the transformations of the set A.  The recursion ends when the function is called with a single image, in which case it returns the image itself as the mean and the identity transformation.

It turns out that you can compute not only the mean recursively, but all the other moments as well. Thus, in a single pass through the data, the algorithm can align the images and compute the mean, second, third and fourth moments of the aligned sequence.  From there it is easy to compute the images corresponding to the variance, skewness and kurtosis of the distribution of the signal at each location.
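For instance, once the sbxalign function below returns its result r (where r.m{2} through r.m{4} accumulate the sums of the second through fourth central powers and r.n counts the frames), the statistics follow in a few lines. This is just a sketch based on those definitions:

% Convert the accumulated central moments into statistics images.
n  = r.n;
mu = r.m{1};               % mean image
v  = r.m{2}/n;             % variance image
sk = (r.m{3}/n) ./ v.^1.5; % skewness image
ku = (r.m{4}/n) ./ v.^2;   % kurtosis image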

Note that the algorithm works for any class of transformations, but it helps to keep things simple.  If you are using a resonant two-photon microscope then, because of the relatively high rate of imaging, the distortions from one frame to the next are well approximated by a simple translation.  Feel free to try other transformations, but beware of the fact that you will pay a heavy price in processing time.

The resulting code is only a few lines long:

function r = sbxalign(fname,idx)

if(length(idx)==1)

 A = sbxread(fname,idx(1),1); % read the frame
 S = sparseint; % sparse interpolant for...
 A = squeeze(A(1,:,:))*S; % spatial correction of green channel

 r.m{1} = A; % mean
 r.m{2} = zeros(size(A)); % 2nd moment
 r.m{3} = zeros(size(A)); % 3rd moment
 r.m{4} = zeros(size(A)); % 4th moment

 r.T = [0 0]; % no translation (identity)
 r.n = 1; % # of frames

else

 idx0 = idx(1:floor(end/2)); % split into two groups
 idx1 = idx(floor(end/2)+1 : end);

 r0 = sbxalign(fname,idx0); % align each group recursively
 r1 = sbxalign(fname,idx1);

 [u v] = fftalign(r0.m{1},r1.m{1}); % align their mean images

 for i = 1:4 % shift mean image and moments of group A
  r0.m{i} = circshift(r0.m{i},[u v]);
 end

 delta = r1.m{1}-r0.m{1}; % online update of the moments (see the Pebay paper)
 na = r0.n;
 nb = r1.n;
 nx = na + nb;

 r.m{1} = r0.m{1} + delta*nb/nx;
 r.m{2} = r0.m{2} + r1.m{2} + delta.^2 * na * nb / nx;
 r.m{3} = r0.m{3} + r1.m{3} + ...
  delta.^3 * na * nb * (na-nb)/nx^2 + ...
  3 * delta / nx .* (na * r1.m{2} - nb * r0.m{2});
 r.m{4} = r0.m{4} + r1.m{4} + delta.^4 * na * nb * (na^2 - na*nb + nb^2) / nx^3 + ...
  6 * delta.^2 .* (na^2 * r1.m{2} + nb^2 * r0.m{2}) / nx^2 + ...
  4 * delta .* (na * r1.m{3} - nb * r0.m{3}) / nx;

 r.T = [(ones(size(r0.T,1),1)*[u v] + r0.T) ; r1.T]; % concatenate transformations
 r.n = nx; % number of images in A+B

end

The algorithm lends itself to parallel processing by enclosing the two recursive calls (the computation of r0 and r1 above) in a parfor loop, as sketched below. However, the gains (or not) will depend largely on the disk array you have.
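Here is a minimal sketch of that idea, splitting the sequence in two at the top level and letting parfor dispatch the halves; the final merge is exactly the else-branch of sbxalign above:

% Top-level parallel split (a sketch; idx is the full list of frame indices).
halves = {idx(1:floor(end/2)), idx(floor(end/2)+1:end)};
r = cell(1,2);
parfor k = 1:2
 r{k} = sbxalign(fname, halves{k}); % each worker aligns one half
end
% ...then align r{1}.m{1} to r{2}.m{1} with fftalign and combine the
% moments and transformations exactly as in the code above.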

The optimal translation (with single-pixel resolution) is found using FFTs:

function [u,v] = fftalign(A,B)

N = min(size(A)); % use a central square region of size NxN

yidx = round(size(A,1)/2)-N/2 + 1 : round(size(A,1)/2)+ N/2;
xidx = round(size(A,2)/2)-N/2 + 1 : round(size(A,2)/2)+ N/2;

A = A(yidx,xidx);
B = B(yidx,xidx);

C = fftshift(real(ifft2(fft2(A).*fft2(rot90(B,2))))); % cross-correlation via FFT
[~,i] = max(C(:)); % location of the correlation peak
[ii jj] = ind2sub(size(C),i);

u = N/2-ii; % optimal row and column shifts
v = N/2-jj;

You can allow for more complex transformations by using Matlab’s imregtform but, I warn you once more, you will have to wait substantially longer. Remember that in a typical experiment you will be aligning tens of thousands of images (18,000 in the example below).

So here is an example of the estimated translations along with the movement of the ball, which is obtained via a camera synchronized to the frames of the microscope. Not surprisingly, when the mouse runs, relative movement shows up in the estimated translations of the alignment.

movement

Also note that the movement at this magnification has an extent of about 5-6 pixels, which is comparable to the radius of a cell body. A typical result of aligning the images using this method is shown below, where you can still discern individual processes after a 20 min imaging session.

stabilized

The outcome of the alignment is the mean image, along with the moments and the sequence of transformations.  We do not store a new set of aligned images, as doing so would require much more space. Instead, the next time we read images from the raw data file, if the alignment data are present, we apply the stored transformations on the fly and effectively read an aligned stack.
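Here is a minimal sketch of that read-and-apply step for a single frame, assuming the result r of sbxalign has been saved next to the data (frame indices in sbxread are zero-based, so frame k maps to row k+1 of r.T):

% Sketch: read raw frame k and apply its stored translation on the fly.
S = sparseint;                % same spatial correction used during alignment
A = sbxread(fname, k, 1);     % raw frame k (zero-based index)
A = squeeze(A(1,:,:))*S;      % green channel, spatially corrected
A = circshift(A, r.T(k+1,:)); % apply the stored [u v] translation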

What do you do?  Do you have a different method to share?  

Synchronize to the Laser

Scanbox makes sure the acquisition of the signals from the photo-multipliers is synchronized to the laser pulses.  Let me explain why.  Suppose you run an asynchronous data acquisition clock at 25 MHz while the laser has a pulse rate of 80 MHz.  Then you may obtain images of single cells that look like this:

problem

This is a zoomed-in view of a single cell labeled with GCaMP6s.  Notice something funny about this picture? If you look along individual lines you will see that bright pixels are often next to black pixels.  If you look at another frame you will see the same pattern, except that the locations of the bright/dark pixels will have shifted.

What’s going on?

What is happening is that the phase of the 25 MHz clock relative to the laser pulses changes over time, producing a beat pattern.  For this choice of frequencies, the beat pattern has a period of 5 pixels.  This is because 5 pixels @ 25 MHz and 16 pulses @ 80 MHz both last exactly 200 ns. The pattern is obvious when you compute the average auto-correlation of the fluctuations along the lines:

auto
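The 5-pixel period follows directly from the ratio of the two clocks: for any pair of (integer) rates, the beat period pops out of their greatest common divisor. A quick check:

% Beat period between the sampling clock and the laser repetition rate.
fs = 25; fl = 80;  % rates in MHz
g  = gcd(fs, fl);
pixels = fs/g      % the pattern repeats every 5 pixels...
pulses = fl/g      % ...that is, every 16 laser pulses (200 ns)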

Yuck… Two pixels away there is a 40% anti-correlation in the signal!  How do you get rid of this problem?  Running the sampling clock at 20 MHz does not solve it.  Why?  Because the laser does not run at exactly 80 MHz, so the relative phase will still drift. Moreover, the actual frequency of the laser changes with your selection of the wavelength, and we want a solution that works for all wavelengths.

Another partial solution is to temporally filter your signal by selecting PMT amplifiers with a relatively low high-cut frequency.  This, however, will decrease the spatial resolution along the resonant axis and will not solve the problem if your choice of frequencies causes the phase to drift slowly in time.  In that case, the overall brightness of your images will drift slowly back and forth between bright and dark levels.

The simplest solution is to synchronize the data acquisition with your laser.  Indeed, then your single frames look nicer (note the image is from a different cell) and you get sharp bright/dark edges with single-pixel resolution.

fixed

In Scanbox this synchronization is achieved simply by ‘cleaning up’ the sync-out signal of the Chameleon with a BBP-70+ band-pass filter from Mini-Circuits.  Then, the level and slope of the trigger in the AlazarTech board are fine-tuned to find the relative phase that achieves a maximal signal. The resulting settings will depend on the PMT filters that you use, but will generate the crispest, noise-free images your optics can support.

So, if you see the sync-out of your laser not connected to anything, and your single images look like the one at the top, it is time to get this fixed.  Your data will thank you.

The Heart of Scanbox

The heart of Scanbox is the custom designed card shown below, now in its third revision:

scanbox_card.001

Here is a brief summary of the architecture.

The card communicates with the host PC through a USB line.  A number of Matlab functions allow one to easily communicate with the card: sending scanning parameters, starting and stopping scanning, receiving TTL events, and so on.

TTL events are time-stamped by the card by assigning them the (frame,line) pair at which they occurred.  TTL lines can be programmed to detect rising edges, falling edges, or both. These data are saved, along with the entire state of the microscope (including position, PMT gains, laser wavelength, etc.), in a Matlab file.

The fast shutter line controls a Uniblitz shutter that is open only while scanning.  This line can also be used as a TTL signal indicating when the microscope is in the middle of a scan.

The Pockels cell signal modulates the power of the laser during the line.  It is generated by a look-up table in the hardware, so it can be programmed to any arbitrary shape.  By default, the shape keeps the mean image brightness uniform, except at the very edges where the laser is completely blanked.
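As a purely illustrative sketch (not the actual table Scanbox programs), a look-up table that is flat across the line and blanks the laser at the turnaround edges could be built like this; the table length and the 5% blanking fraction are arbitrary choices:

% Illustrative only: a flat 256-entry Pockels look-up table with the
% laser blanked at the line edges.
n    = 256;
lut  = ones(1,n);
edge = round(0.05*n);  % blank ~5% at each edge
lut(1:edge) = 0;
lut(end-edge+1:end) = 0;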

The line-acquisition trigger is sent to the AlazarTech 9440 to trigger the acquisition of one line.  The PMT power and gain signals are provided directly from the card.  The same applies to the CRS resonant mirror controller and the slow galvanometer controller.

A mirror moved by a Firgelli linear actuator is controlled via a PWM signal available from the card.  An I2C interface is provided to communicate with other sensors, and an extension header provides 34 additional lines of digital or analog I/O that can be accessed by the PSoC chip.

Of course, the amazing PSoC 5 chip is in charge of everything.

To learn more about the microscope optics and its capabilities go to this page.

A Quick Look at Scanbox’s GUI

In case you were wondering, this is how things look for now in the first release of Scanbox 1.0.

scanbox

Click on the figure to get a larger picture.  Here is a brief description of the panels.

Chameleon: It allows you to change the laser wavelength and power, and to read out the laser status.

Scanner: Defines scanning parameters, including the total number of frames to be collected, the number of lines per frame, and the magnification (x1, x2 or x4); alternatively, instead of the number of lines, one can specify the frame rate needed.

Position: It allows you to control the microscope position in (x,y,z) and the tilt of the objective axis (a-axis).  You can zero the position and later return the microscope to the exact same position.  The control is via a single ShuttleExpress controller.

File Storage: Defines where the data will be stored: the subject ID, field # and experiment #.  It also allows you to select which data channels you want to save (PMT0 – green, PMT1 – red, or both).

PMT Control: It enables the PMTs and controls their gains.  Gains can be changed on the fly as the data are displayed.

Camera Path: It slides a mirror into place that diverts the optical path to an IMPREX B2020M camera used to target a specified location or to perform intrinsic imaging through the same (or a different) objective.  When the mirror is in place, the camera images are displayed in the same window that otherwise displays the two-photon data.  Two controls allow one to change the exposure and gain of the camera on the fly.

Microscope control: It allows one to perform imaging in “focus” or “grab” modes.  In focus mode the data are displayed but not saved, while grab both displays the data and streams them to disk in real time.  The counters below continuously update the number of frames collected and the time elapsed since the beginning of the acquisition.

Image display: Allows the selection of the data one wants to display.  Available options are the individual channels (PMT0 or PMT1), the channels fused with a red-green pseudo-color map, or both channels side by side.  One can also perform on-the-fly averaging or take the maximum of the images as they come in, and the accumulated/max image can be reset on the fly (for example, after moving the microscope to another location).

Colormap/contrast: Controls the lookup table of the colormap used to display the data for both PMTs.

Navigation: Allows on-the-fly digital zoom and panning of the image data.  Note that this is digital zoom and will not change the scanning parameters.  If you want to change the actual optical angle of the scan, that can also be done on the fly by changing the magnification selection in the Scanner panel.

Ball tracker: It enables image acquisition on a GigE camera to track the movement of the ball.  This is done with a Dalsa Genie camera under infrared illumination (the ball has a pattern that absorbs in the infrared to allow for tracking).  The camera is synchronized to the frames of the microscope.

Eye tracker: It enables image acquisition on a GigE camera to track eye position and pupil size during the imaging session.  This is again done with a Dalsa Genie camera.  There is no need for infrared illumination because, during imaging, plenty of laser light is scattered out of the pupil, allowing for easy segmentation and tracking.

Z-stack: Sets up the microscope to sequentially run imaging sessions at a series of depth steps.  The data are stored in separate files that are numbered sequentially.

Optotune: Allows control of the waveform driving an Optotune electrically tunable lens, for continuous scanning along the Z axis without moving the servos.

Network control: A network protocol allows other Matlab applications running on the network to control the microscope: setting parameters, starting and stopping acquisition, moving the position of the microscope, and so on.
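To give a flavor of what this looks like from the client side, here is a hypothetical sketch from another Matlab session. The host, port and one-letter commands below are invented for illustration; they are not the documented Scanbox protocol:

% Hypothetical remote-control client (illustration only).
sb = tcpip('scanbox-host', 7000); % placeholder host and port
fopen(sb);
fprintf(sb, 'G');                 % e.g., start grabbing
pause(60);                        % image for one minute
fprintf(sb, 'S');                 % e.g., stop acquisition
fclose(sb);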

Welcome to Scanbox!

Welcome to the Scanbox File Exchange.  This is the site where you can find regular updates to the Scanbox software for two-photon imaging, report bugs, and share data analysis tools and ideas for improvement.

As time goes by we will be posting tips and tricks on how to use the software, insights into the code (the Matlab software is open to the community), and documenting the various functions you can use to analyze the data.

Some of the features already implemented in the release version of Scanbox:

  • Two analog and two digital channels sampled at the laser frequency (80 MHz) with 16-bit depth.
  • Control of PMT gains.
  • Non-uniform spatial sampling correction in real time (raw data are streamed to disk).
  • Real time averaging and display of data.
  • Uniform power density over scan line by modulation of Pockels cell (an arbitrary waveform can be programmed).
  • Control X, Y, Z stage and tilt angle of objective.
  • Z-stack data collection.
  • Movement in a rotated coordinate system that keeps the (x,y) plane normal to the objective.
  • Control of laser parameters (power, shutter, wavelength).
  • Two additional TTL signals timestamped with the frame and line number where they occurred.
  • Two GigE cameras synchronized to the microscope frame to acquire eye pupil/position and ball movement.
  • Additional GigE camera for intrinsic imaging through the same (or different) objective.
  • Additional digital I/O, I2C, SPI, and current-generator (for an electrically tunable lens) expansion capability.
  • Remote control of the microscope over the network (change file names, start acquisition, stop acquisition, etc).
  • Matlab software for reading data, motion correction, segmentation, and signal extraction.

At the heart of the system is an AlazarTech 9440 digitizer and a custom-designed card based on the PSoC 5LP 32-bit ARM processor from Cypress, which is in charge of generating the scan signals, triggering the cameras, timestamping external TTL events, and more.  The card communicates with the host computer through a USB serial line.

That’s it for a quick introduction…  but, before leaving, here is one of the first movies we obtained with the microscope in Josh Trachtenberg’s Lab showing 1 min in the life of prefrontal cortex (please don’t ask me what I was doing there).  Enjoy —