New alignment and segmentation tools

The latest version of Scanbox includes improved alignment and segmentation tools, which retain much of the functionality of the previous version.

sbxaligntool. The new alignment tool, shown below, adds batch processing of files, including the processing of eye and ball motion if those data are present.  A region of interest (ROI) can optionally be selected manually or automatically.  For file entries where manual selection was specified, the program stops and presents a rectangle on the screen for the user to position over the desired ROI.  Typically, automatic ROI selection works fine, and it does not require the user to stand by the computer to specify the ROI each time a new file starts processing.


As files are aligned, the Status column and status message display the progress. The alignment can also be visualized by clicking the Live update checkbox, which displays the mean of the entire image stack as the process moves along.  Pan and Zoom buttons allow the user to inspect details in the live image, such as fine branches, while the system carries out the alignment. This tool performs rigid alignment and stores the result in a *_rigid.sbx file; the original data are left untouched. The tool aligns images relatively fast (about 65 frames/sec on my computer), but it can take a few minutes to compute the reference image if the sequence is 15 min or longer (please be patient). Alignment improves with the number of passes requested.  Usually one pass is very good, but you can try two or more passes by changing the appropriate entry in the column. The alignment algorithm has also been improved.
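For readers curious about what rigid alignment involves, here is a minimal sketch of the general idea, FFT-based cross-correlation of each frame against a mean reference image. This is illustrative only, with assumed array shapes, and is not Scanbox's actual algorithm:

```python
# Illustrative sketch of rigid alignment (not Scanbox's actual code):
# estimate each frame's translation relative to a reference image via
# FFT cross-correlation, then undo it.  Frames are 2-D float arrays.
import numpy as np

def estimate_shift(reference, frame):
    """Return the (dy, dx) shift that best aligns `frame` to `reference`,
    found as the peak of the circular cross-correlation."""
    xc = np.fft.ifft2(np.fft.fft2(reference) * np.conj(np.fft.fft2(frame)))
    dy, dx = np.unravel_index(np.argmax(np.abs(xc)), xc.shape)
    # Interpret shifts larger than half the image size as negative
    if dy > reference.shape[0] // 2: dy -= reference.shape[0]
    if dx > reference.shape[1] // 2: dx -= reference.shape[1]
    return dy, dx

def align(frames):
    """One alignment pass: shift every frame onto the current mean image.
    Additional passes would simply repeat this with the updated mean."""
    ref = frames.mean(axis=0)
    return np.stack([np.roll(f, estimate_shift(ref, f), axis=(0, 1))
                     for f in frames])
```

Each extra pass recomputes the mean from the newly aligned stack, which sharpens the reference and can refine the shifts, consistent with alignment improving with the number of passes.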

sbxsegmenttool. The segmentation tool works in much the same way as before. After loading the aligned *_rigid.sbx file, it displays the correlation map.  Segmentation then proceeds as in the previous version.
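As a rough illustration of what a correlation map is (an assumption about how such maps are commonly computed, not necessarily Scanbox's exact recipe): for each pixel, the average temporal correlation of its time course with those of its immediate neighbors, which makes coactive structures such as cell bodies stand out:

```python
# Illustrative correlation map (assumed common recipe, not Scanbox's
# exact algorithm): mean temporal correlation of each pixel with its
# 8 immediate neighbors.  Edges wrap around for simplicity.
import numpy as np

def correlation_map(stack):
    """stack: (frames, height, width) array -> (height, width) map."""
    z = stack - stack.mean(axis=0)        # zero-mean each pixel's trace
    z = z / (z.std(axis=0) + 1e-9)        # unit variance
    cmap = np.zeros(z.shape[1:])
    shifts = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)
              if (dy, dx) != (0, 0)]
    for dy, dx in shifts:
        # correlation with the neighbor offset by (dy, dx)
        cmap += (z * np.roll(z, (dy, dx), axis=(1, 2))).mean(axis=0)
    return cmap / len(shifts)
```

Pixels belonging to the same cell fluctuate together, so they end up bright in the map; uncorrelated background stays near zero.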


Once a number of cells are selected, you must save the segmentation and then extract the signals by pressing the corresponding buttons. After the signals are extracted, you can select a cell with the pull-down menu at the bottom left, and the traces corresponding to that cell (now highlighted in green) will be displayed.  The blue trace represents the average signal within the cell, the gray trace is the neuropil, and the remaining trace is the estimated spike rate obtained with the Vanilla algorithm using parameters optimized for GCaMP6f.
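As a simplified illustration of what signal extraction involves (an assumption about the general approach, not Scanbox's actual code), the cell trace can be taken as the mean over the ROI's pixels on each frame, and the neuropil trace as the mean over a surrounding ring that excludes the cell itself:

```python
# Illustrative signal extraction (assumed, simplified): cell trace is
# the per-frame mean over the ROI mask; neuropil trace is the mean over
# a surrounding ring obtained by dilating the mask and removing it.
import numpy as np

def dilate(mask, n):
    """Grow a boolean mask by n pixels (4-connected, edges wrap)."""
    out = mask.copy()
    for _ in range(n):
        grown = out.copy()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            grown |= np.roll(out, (dy, dx), axis=(0, 1))
        out = grown
    return out

def extract_traces(stack, cell_mask, ring=3):
    """stack: (frames, h, w); cell_mask: boolean (h, w)."""
    neuropil_mask = dilate(cell_mask, ring) & ~cell_mask
    cell = stack[:, cell_mask].mean(axis=1)
    neuropil = stack[:, neuropil_mask].mean(axis=1)
    return cell, neuropil
```

The neuropil trace is useful for later correcting the cell trace for contamination from surrounding tissue; spike-rate estimation would then be run on the corrected trace.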

Improvements include an Undo button, which removes the last segmented cell; the ability to load a previous segmentation (it loads automatically after you select the *_rigid.sbx file) and continue adding cells to it; the ability to define an ROI in the correlation map, which automatically increases the contrast of the map as the most salient cells are selected; and a zoomed version of the mean image on the right to go along with the correlation map.  The tool now also saves the neuropil and deconvolved signals.

Give these tools a try. Report back any suggestions for improvements or problems you encounter.

A Processing Grating Shader

We have been using Processing for displaying visual and auditory stimuli in the Lab for a while. Some of you have asked for examples of full-screen gratings using shaders. So this is not about Scanbox proper, but it is still useful for those of you studying sensory systems.

Below is a demonstration of a drifting sinusoidal grating shader. It shows a full-field grating whose orientation (the ‘a’ and ‘s’ keys) and spatial period (the ‘w’ and ‘q’ keys) can be changed on the fly, with the grating updating in real time.

// Processing 2 Sinusoidal Grating Shader

PShader myshader;

float per = 200.0;  // period in pixels
float th = 0.0;     // orientation in radians
float c = 0.5;      // contrast

void setup() {
  size(displayWidth, displayHeight, P2D);
  myshader = loadShader("sine.frag");
  myshader.set("resolution", float(displayWidth), float(displayHeight));
}

void draw() {

  if (keyPressed) {
    switch(key) {
    case 'a':
      th += PI/360.0;   // rotate CW
      break;
    case 's':
      th -= PI/360.0;   // rotate CCW
      break;
    case 'w':
      per = per*1.05;   // increase period
      break;
    case 'q':
      per = per*0.95;   // decrease period
      break;
    }
  }

  myshader.set("th", th);
  myshader.set("sper", per);
  myshader.set("contrast", c);
  myshader.set("time", millis()/1000.0);
  shader(myshader);
  rect(0, 0, displayWidth, displayHeight);
}

boolean sketchFullScreen() {
  return true;
}
The code for the shader itself, “sine.frag”, is mostly definitions:

#ifdef GL_ES
precision mediump float;
#endif

uniform float time;
uniform vec2 mouse;
uniform vec2 resolution;

uniform float mean;
uniform float contrast;
uniform float tper;
uniform float sper;
uniform float th;
uniform float th_speed;

const float PI = 3.1415926535;

void main( void ) {

  float sth = sin(th);
  float cth = cos(th);

  float color = 0.5 + contrast*sin(2.0*PI*((gl_FragCoord.x/sper*cth + gl_FragCoord.y/sper*sth) + time));

  gl_FragColor = vec4( color, color, color, 1.0 );
}

Yup, that’s all…
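To see exactly what the shader computes, the same formula can be evaluated in NumPy: each pixel coordinate is projected onto the direction (cos θ, sin θ), divided by the spatial period, drifted with time, and passed through a sinusoid centered at mid-gray:

```python
# The grating formula from sine.frag, evaluated in NumPy for clarity:
# 0.5 + contrast * sin(2*pi*((x*cos(th) + y*sin(th)) / sper + t))
import numpy as np

def grating(width, height, sper=200.0, th=0.0, contrast=0.5, t=0.0):
    y, x = np.mgrid[0:height, 0:width].astype(float)
    phase = (x * np.cos(th) + y * np.sin(th)) / sper + t
    return 0.5 + contrast * np.sin(2.0 * np.pi * phase)

img = grating(640, 480)   # one full-field frame; increase t to drift
```

With contrast at most 0.5, the output stays in [0, 1], matching what the fragment shader writes to each color channel.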

Of course, there are many wonderful things you can do with shaders that make gratings look somewhat boring… and there are many places to share and test them. The one I recommend is Shadertoy.

Processing also has some neat libraries for interfacing with other hardware, which will make your life simpler when programming new experiments.  Here is an example of controlling the shader’s parameters (orientation, spatial frequency, and contrast) with a Leap Motion sensor running on my laptop.

We synchronize visual stimulation in Processing with Scanbox using an Arduino board.  Scanbox has two TTL inputs that timestamp the rising edge, the falling edge, or both edges of a TTL signal with the (frame, line) at which it occurred.   Processing communicates with the firmware on the Arduino through the Serial library.  More about this later…
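To make the (frame, line) timestamp concrete, here is a toy calculation (illustrative numbers and function name, not firmware code): given the scanner's line rate and the number of lines per frame, an event time maps to a frame index and a line within that frame:

```python
# Illustrative arithmetic only (not Scanbox firmware): convert an event
# time into the (frame, line) pair at which it occurred, given the
# line rate and lines per frame.  All values here are assumptions.
def frame_line(t_event, line_rate_hz, lines_per_frame):
    """t_event: seconds since acquisition start."""
    total_lines = int(t_event * line_rate_hz)
    return divmod(total_lines, lines_per_frame)

# e.g. with an assumed 8 kHz line rate and 512 lines per frame,
# an event at t = 0.1 s lands on frame 1, line 288:
frame, line = frame_line(0.1, 8000, 512)
```

This is why a (frame, line) stamp localizes a stimulus event to a fraction of a frame period, which is much finer than frame-level timing alone.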

Another advantage of using Processing is that you can easily deploy the code on different target architectures.  Here is an example of a few transparent gratings running on both a Mac and my Android tablet.

A bunch of gratings running on my Mac and Android tablet.

A bunch of gratings running on my Mac and Android tablet.