# New alignment and segmentation tools

The latest version of Scanbox includes improved alignment and segmentation tools, while retaining much of the functionality of the previous version.

sbxaligntool. The new alignment tool, shown below, adds batch processing of files, including the processing of eye and ball motion if those data are present.  A region of interest (ROI) can be selected either manually or automatically.  For file entries where manual selection was specified, the program will stop and present a rectangle on the screen for the user to position over the desired ROI.  Typically, automatic ROI selection works well and does not require the user to stand by the computer to specify the ROI each time a new file starts processing.

As the files are aligned, the Status column and Status message display the progress. The alignment can also be visualized by clicking the Live update checkbox, which will display the mean of the entire image stack as the process moves along.  Pan and Zoom buttons allow the user to inspect details in the live image, such as fine branches, while the system carries out the alignment. This tool performs rigid alignment, and the result is stored in a *_rigid.sbx file; the original data are left untouched. The tool aligns images relatively fast (about 65 frames/sec on my computer), but it can take a few minutes to compute the reference image if the sequence is 15 min or longer (please be patient). Alignment improves with the number of passes requested.  Usually one pass is very good, but you can try two or more passes by changing the appropriate entry in the column. The underlying alignment algorithm has also been improved.

sbxsegmenttool. The segmentation tool works in much the same way as before. After loading the aligned *_rigid.sbx file, it will display the correlation map.  Segmentation then proceeds as in the previous version.

Once a number of cells have been selected, save the segmentation and then extract the signals by pressing the corresponding buttons. After the signals are extracted, you can select a cell with the pull-down menu on the bottom left, and the traces corresponding to that cell (now highlighted in green) will be displayed.  The blue trace represents the average signal within the cell, the gray trace is the neuropil, and the third trace is the spike rate estimated by the Vanilla algorithm with parameters optimized for GCaMP6f.

Improvements include:

• An Undo button, which removes the last segmented cell.
• The ability to load a previous segmentation (it loads automatically after you select the *_rigid.sbx file) and continue adding cells to it.
• The ability to define an ROI in the correlation map, which automatically increases the contrast of the map as the most salient cells are selected.
• A zoomed version of the mean image on the right, to go along with the correlation map.
• The tool now saves the neuropil and deconvolved signals as well.

Give these tools a try. Report back any suggestions for improvements or problems you encounter.

# Spikefinder: the Vanilla algorithm

Over the last few months, the Spikefinder challenge has provided a testing ground for colleagues to offer ideas about how to estimate the spiking of individual neurons from the measured activity of fluorescent calcium indicators.

The challenge was to come up with strategies that beat the performance of state-of-the-art algorithms, STM & oopsi. A good number of algorithms were able to achieve this in a short time, including one I submitted, termed Vanilla.

The best performing algorithms seem to have relied on modern machine learning methods. Vanilla is nothing more than a linear filter followed by a static-nonlinearity $y(t) = \phi ( h(t) * x(t) )$ — thus the name.

The filter $h(t)$ is a linear combination of an even filter, estimating the mean of the signal at time $t$, and an odd filter, estimating the derivative of the signal at time $t$.

The even filter is a Gaussian, $h_{even}(t) = A \exp ( -t^2 / 2 \sigma^2 )$, and the odd filter is the derivative of a Gaussian, $h_{odd}(t) = B t \exp (-t^2 / 2 \sigma^2)$.  The constants $A$ and $B$ are chosen so that each filter has unit norm, $\| h_{even} \| = \| h_{odd} \| = 1$.  These two filters are linearly combined while keeping the norm of the resulting filter equal to one, $h(t) = \cos \alpha \: h_{even}(t) + \sin \alpha \: h_{odd}(t)$; since the even and odd filters are orthogonal, the combination has unit norm for any $\alpha$.
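To make the filter construction concrete, here is a small Python sketch (the original implementation was in Matlab; the sampling grid and parameter values below are arbitrary choices of mine for illustration). Because the even and odd filters are orthogonal over a symmetric grid, the combined filter keeps unit norm regardless of the mixing angle:

```python
import numpy as np

def make_filters(sigma, t):
    """Unit-norm even (Gaussian) and odd (Gaussian-derivative) filters."""
    g = np.exp(-t**2 / (2.0 * sigma**2))
    h_even = g / np.linalg.norm(g)
    h_odd = t * g / np.linalg.norm(t * g)
    return h_even, h_odd

t = np.arange(-50, 51)              # symmetric support, in samples
h_even, h_odd = make_filters(5.0, t)

alpha = 0.7
h = np.cos(alpha) * h_even + np.sin(alpha) * h_odd
print(np.dot(h_even, h_odd))        # ~0: even and odd parts are orthogonal
print(np.linalg.norm(h))            # ~1 for any alpha
```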

The output nonlinearity is a rectifier to a power, $\phi ( x ) = (x- \theta)^\beta$ if $x>\theta$, and zero otherwise.

The model has only 4 parameters, $\{\sigma, \alpha, \theta, \beta \}$. The amount of smoothing of the signal is controlled by $\sigma$, the shape of the filter is controlled by $\alpha$, and the threshold $\theta$ and power $\beta$ determine the shape of the nonlinearity.

The model is fit by finding the values of $\{\sigma, \alpha, \theta, \beta \}$ that maximize the correlation between its output $y(t)$ and the recorded spiking of the neuron.  I used Matlab’s fminsearch() to perform this optimization, which typically finished in about 60 seconds or less for most datasets.
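The whole procedure fits in a few lines. The sketch below is a Python rendition of the method described above, not the original Matlab code; the synthetic "recording" is my own toy stand-in for a Spikefinder dataset, and Nelder-Mead is the same simplex method that underlies fminsearch:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.signal import fftconvolve

def vanilla(params, x, t):
    """Vanilla model: y(t) = phi(h(t) * x(t)), a linear filter plus static nonlinearity."""
    sigma, alpha, theta, beta = params
    g = np.exp(-t**2 / (2.0 * sigma**2))
    h_even = g / np.linalg.norm(g)                 # unit-norm even filter
    h_odd = t * g / np.linalg.norm(t * g)          # unit-norm odd filter
    h = np.cos(alpha) * h_even + np.sin(alpha) * h_odd
    u = fftconvolve(x, h, mode='same')             # non-causal linear filtering
    return np.maximum(u - theta, 0.0) ** beta      # rectifier raised to a power

def neg_corr(params, x, spikes, t):
    """Objective: minus the correlation between model output and recorded spikes."""
    y = vanilla(params, x, t)
    if not np.all(np.isfinite(y)) or y.std() == 0:
        return 1.0                                 # penalize degenerate parameter sets
    return -np.corrcoef(y, spikes)[0, 1]

# Toy data: sparse spikes convolved with an exponential calcium transient plus noise
rng = np.random.default_rng(1)
spikes = (rng.random(3000) < 0.03).astype(float)
calcium = fftconvolve(spikes, np.exp(-np.arange(200) / 40.0))[:3000]
x = calcium + 0.2 * rng.standard_normal(3000)
x = (x - x.mean()) / x.std()                       # z-score, the only pre-processing

t = np.arange(-100, 101)
x0 = np.array([10.0, 0.5, 0.2, 1.0])               # initial sigma, alpha, theta, beta
res = minimize(neg_corr, x0, args=(x, spikes, t), method='Nelder-Mead')
print('correlation:', -res.fun)
```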

The only pre-processing done was a z-scoring of the raw signals.  In one dataset (dataset #5, GCaMP6s in V1), we allowed for an extra-delay parameter between the signal and the prediction.

I was surprised this entry did relatively well, as the algorithm is basically a version of STM. I think the particular shape of the output nonlinearity (rectifier raised to a power vs an exponential), the constraint imposed on the shape of the filters, and the resulting small number of parameters played a role in Vanilla doing better overall.

The top algorithms reached an absolute performance of about 0.47 and it seems unlikely this performance can be improved by a lot. This seems to highlight the limitations of the current families of calcium indicators in yielding precise spiking information — so there is plenty of opportunity to improve them.

It is interesting that, despite its simplicity, the performance of Vanilla, with a correlation coefficient of 0.428, was not dramatically inferior to that of the top-performing deep network models, with all their bells and whistles, which landed at 0.464.  So, one must pay due respect to deep networks, but I was honestly expecting Vanilla to be completely blown out of the water, and I don’t think it was.

Finally, Vanilla is a non-causal algorithm, as both past and future samples are used to predict the response at time $t$. In some situations, however, such as when we are trying to close the loop as quickly as possible by controlling a stimulus based on neural activity itself, we need causal algorithms that can provide a fast, online estimate of spiking activity. I wonder whether any of the submissions are causal algorithms and, if not, what performance these methods could attain if we allowed them to estimate spiking based only on past samples.

# External TTL trigger

To automatically start/stop acquisition by means of an external TTL signal, follow these instructions.

First, connect an appropriate TTL control signal to pin P1.6 (pin #21) of the extension header of the Scanbox board.  The image below shows the Scanbox control board from the top.  The pin in question is located on the back row of connectors when viewing the board from the front.  Pin #3 can be used as ground.  To make the connections easier, it helps to get this cable and route it outside the box.

Start the Scanbox software and operate as usual by focusing and selecting the area you want to record.  When ready to switch to external trigger control, simply click the “External TTL Trigger” checkbox, which is located in the middle of the Scanner control panel.

After enabling the TTL trigger, the manual Focus/Grab buttons are grayed out and disabled.  If you want to go back to manual control, simply deselect the TTL Trigger checkbox.

The rising edge of the TTL control signal is used to start/stop the microscope.  Minimum pulse width is 1 ms.

While controlling the microscope with an external TTL signal, it is useful to run it in continuous resonant mode (so you avoid waiting for the resonant mirror to warm up) and to set the “autoinc” configuration variable to “true”, so file numbers increment automatically after the completion of each session.

To use this feature you have to update to the latest version of the firmware/software.

# Knobby scheduler

A new Scanbox panel allows users to define arbitrary changes in (x,y,z) position over time (frames) which are then executed by Knobby (version 2 only) while imaging.

Each entry defines changes in x, y, and z (in micrometers) relative to the present position, along with the frame number at which they will take place.

The “mem” column allows one to specify one of the stored absolute coordinates instead (memory locations are coded A=1, B=2, C=3).  If a memory location is defined, the other entries are ignored and the position in the referenced memory is used instead.

This mechanism extends the z-stack functionality to include the ability to tile a sample and brings back the control window to one of the panels in Scanbox (as opposed to being controlled in Knobby’s screen).  The Knobby table is also saved in the info.knobby_table variable.

Paths can be computed offline and stored in a Matlab file that can be loaded.  The example below shows knobby moving the sample along a circular path.
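As an illustration of computing a path offline, here is a Python sketch that generates relative moves tracing a circle in the (x,y) plane and writes them to a Matlab file.  The column layout (frame, dx, dy, dz) and the variable name `knobby_table` are my assumptions for the example, not a documented format:

```python
import numpy as np
from scipy.io import savemat

# Assumed layout: one row per scheduled move -> [frame, dx, dy, dz],
# with dx/dy/dz in micrometers relative to the current position.
n_steps = 36
radius = 50.0                                    # circle radius in micrometers
angles = np.linspace(0, 2 * np.pi, n_steps + 1)
xy = radius * np.column_stack([np.cos(angles), np.sin(angles)])
deltas = np.diff(xy, axis=0)                     # relative moves between points
frames = 100 * np.arange(1, n_steps + 1)         # one move every 100 frames
table = np.column_stack([frames, deltas, np.zeros(n_steps)])

savemat('circle_path.mat', {'knobby_table': table})
```

Since the moves are relative, the per-row deltas sum to zero over a full revolution, returning the sample to its starting position.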

# How to update Scanbox?

Installing updates is not very difficult.  Here is a step-by-step guide.

• Uncompress the zip file in Documents/MATLAB/<new scanbox directory>.  Keep your old distribution intact, in case something does not work and you want to restore the old version.
• Start Matlab
• Type “edit scanbox_config” in the Matlab window to edit the scanbox_config.m from the old version.
• Use pathtool in Matlab to add all the subdirectories in the new distribution to the path.
• Type “edit scanbox_config” to edit the new configuration file.  You should now have both the old and new config files open in the Matlab editor. Copy the required settings from the old file to the new one.
• To update the firmware in Scanbox do the following:
1. Open the Cypress Bootloader Host (search for Bootloader if you can’t find it)
2. For the file entry in the Bootloader Host, select the file <newdir>/drivers/DarioBox.cyacd
3. In the Matlab command window type “scanbox_config; sb_open; sb_reset; sb_close”
4. In the Ports panel of the Bootloader Host you will see a USB Human Interface Device listed immediately.  Select the device and click the Download button at the top left of the Bootloader Host window (it is the one with an arrow pointing down). Note: you have 20 seconds to upload the new firmware (from the time you issued the Matlab commands); otherwise Scanbox will continue booting normally. The Log panel in the Bootloader Host will show whether the programming of the box was successful.  You can now quit the Bootloader Host.
5. The new firmware version should now show up on the Scanbox LCD display.
• Now we need to update Knobby.  To do so, simply go to the Matlab window and type “knobby_update”.  Wait for Knobby to update.
• Now we need to update the firmware in the motor box.  To do so follow the instructions here
• Close Matlab.
• Run the vc_redist.x64.exe file in the scanbox/core/ directory.
• Open a command window and type “conda install pyserial”.  This will update the python serial library and any dependent components.  Reply [yes] when asked to proceed with the update.

That’s all.  Restart Matlab and launch Scanbox.  You have been upgraded!

If something goes wrong it will likely happen during startup and you will get a corresponding error message (in the form of red text in the Matlab window).  Send me the message and I will help.

# Bada boom! New system coming soon!

So why the long silence in the Scanbox blog?

We have been working hard on the development of our new system: a modular, expandable system that will run the new line of Neurolabware microscopes (aka the Kraken microscope) and is backward compatible with our previous box.

Want a sneak peek?

Here is a closeup of some of the LCD/power modules…

If you are interested in the new features of the Kraken microscope and the new modular system please get in touch with Neurolabware.

# Virtual Knobby

Happy new year! We have plenty of exciting Scanbox developments happening this year, so stay tuned to the blog.  You don’t want to miss anything!

We recently introduced a wireless version of Knobby that runs on Android tablets. The same software is now available for Windows, running side by side with your Scanbox application.

The controls and behavior are identical to the tablet version.  To use virtual knobby simply set the tri_knob configuration variable to “127.0.0.1”.

After launching Scanbox from Matlab, go to the yeti/knobby_virtual/ directory and launch the knobby_virtual.exe application.  That’s all…  Go ahead, give it a try!

So you now have three options for position control: classic knobby, knobby tablet and virtual knobby.

Note also that if you are a user of classic knobby and run into some issues (like a rotary encoder going bad) you can always use virtual knobby as an emergency replacement while the hardware version gets fixed.  So no more downtime for a broken knobby.