Happy Holidays from Scanbox

Happy holidays to all Scanbox users!

This was a very productive year.  We introduced a number of new features including:

  • Introduction of Virtual Knobby
  • Introduction of Knobby 2.0
  • Introduction of the Knobby Scheduler
  • Support for TTL start/stop acquisition mode
  • Enhancements to the alignment and segmentation tools
  • Introduction of the plugin server
  • Spatial calibration and return-to-origin with Knobby 2.0
  • Remote control of microscope position with click-and-center mode
  • Automatic calibration of the Optotune
  • Automatic gain control of laser power
  • Visualization of individual slices during acquisition of volumetric data
  • Spatial calibration for multiple objectives
  • New functions to process volumetric data
  • Tiling with Knobby 2.0
  • Methods to program the ETL to sample on a surface
  • Support for intrinsic and wide-field imaging
  • Configuration settings for different PMT amplifiers
  • PPL delay line for optimal sample clock phase
  • Support for the NLW Mesoscope
  • Automatic procedure for Pockels cell calibration
  • Completion of our new tower system

All software features are offered at no additional cost to our Scanbox users.

What will 2018 bring?

Well… that depends on your requests for new features. What would you like to see implemented?  What needs improvement?

Let us know in the comments below!


Measuring the field of view and validating the uniformity of spatial correction

If you followed the instructions on spatial calibration, you should have a nice uniform field in Scanbox.  One simple way to measure the resulting spatial resolution and uniformity at each zoom setting is to mount a 40 line pairs per mm Ronchi calibration slide on top of a green auto-fluorescent slide.  By aligning the slide with the horizontal and vertical axes of the scan you can precisely measure the size of your field:
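The arithmetic behind the measurement is simple enough to sketch: a 40 line pairs per mm ruling has a 25 µm period, so counting the number of full periods spanning the frame gives the field size directly. The counts below are made-up examples, not measurements from this setup:

```python
# Field size from a Ronchi ruling: each dark+bright pair of a
# 40 lp/mm ruling spans 25 um, so field width = periods counted x 25 um.

def field_size_um(n_line_pairs, line_pairs_per_mm=40):
    """Field width spanned by n_line_pairs full periods of the ruling."""
    period_um = 1000.0 / line_pairs_per_mm  # 25 um per line pair at 40 lp/mm
    return n_line_pairs * period_um

print(field_size_um(32))  # 32 periods across the frame -> 800.0 um
print(field_size_um(16))  # half as many at 2x zoom -> 400.0 um
```

Repeating the count along both axes at each zoom setting also gives you a quick check that the calibration scales as expected.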



You can also superimpose the two images to visually judge whether the spatial correction is isotropic throughout the field… which, in our setup, is rather uniform:


The dark side bands are blanked by the Pockels cell, and their size is controlled by the deadband configuration parameter.

A simple lick-o-meter and liquid reward delivery system

Some behavioral experiments reward correct performance. A typical response is a lick of a spout, and the reward is a fixed volume of water. One simple way to achieve this is with an Arduino board and the capacitive sensing library.  The diagram below summarizes the parts used.  Just connect a 1 MΩ resistor between pins 2 and 3.  Pin 2 is also connected to a metallic spout. Short pins 22, 24, and 26 and connect them to one of the solenoid pins of the valve.  The micro-valve has 3 ports.  The port farthest from the pins must be sealed off.  The middle port is connected to the spout.  The port closest to the pins must be connected to a 60 cc syringe reservoir.  With the syringe filled with 30 cc of fluid at a height of 20 cm above the spout, the code below delivers 2 µL of fluid per pulse.


Once downloaded, the Arduino code below accepts 1-byte commands. A command of 0x00 (0) simply resets the lick variable and starts monitoring for licks.  A command of 0x01 (1) reads out whether there has been a lick since the last reset (a single-byte reply).  A command of 0x02 (2) delivers a reward.  Any other number is interpreted as a change in the valve pulse width, thereby changing the volume delivered each time. The only variable that may need to be adjusted is the threshold for lick detection, which is initially set at 500.  The code is just a skeleton…  you will likely need to modify it for your own application.  Credit: Nick Olivas in the Trachtenberg Lab helped design this.
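To make the 1-byte protocol concrete, here is a small host-side model of the device's behavior. The class name, default pulse width, and the `lick()` hook are illustrative only; the real logic lives in the Arduino sketch:

```python
# Model of the lick-o-meter's 1-byte serial protocol described above:
# 0x00 resets the lick flag, 0x01 queries it, 0x02 delivers a reward,
# and any other byte sets the valve pulse width (and thus the volume).

class LickometerSim:
    def __init__(self, pulse_ms=30):      # default pulse width is an assumption
        self.licked = False
        self.pulse_ms = pulse_ms
        self.rewards = 0

    def command(self, b):
        if b == 0x00:          # reset lick flag, start monitoring
            self.licked = False
        elif b == 0x01:        # reply 1 if a lick occurred since last reset
            return int(self.licked)
        elif b == 0x02:        # pulse the solenoid valve once
            self.rewards += 1
        else:                  # any other value: new pulse width in ms
            self.pulse_ms = b

    def lick(self):            # capacitive sensor crossed its threshold
        self.licked = True

dev = LickometerSim()
dev.command(0x00)
dev.lick()
print(dev.command(0x01))  # -> 1
dev.command(50)           # widen the pulse -> larger reward volume
```

A model like this is handy for testing your behavioral control code before the hardware is on the rig.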



Adding an SLM path for optogenetics

As promised, there has been some progress in adding an SLM path for optogenetics to the microscope and things are looking good…

We are now calling on potential users to provide input on the features they would need or want in a GUI, as this may impact how useful this addition will be to our own work.

At the moment here is our basic plan:

  • Provide a tool to align the SLM and two-photon imaging paths.  The user will employ the camera port to acquire an image of the scan area on a chroma slide.  The system will then flash a number of single spots on a different slide and acquire their positions to compute the best affine transformation between the SLM image plane and the two-photon image.  Calibration will be required just before the beginning of each imaging session.
  • Once a sample is imaged, provide a cell-selection tool to obtain the coordinates of N desired locations in the imaging plane.
  • After cell selection, a Matlab-based server will use the calibration and the selected points to accept network commands describing the desired intensity at each of the N locations and the duration of the pattern. After the optimal phase is computed on the GPU, it will be presented to the SLM for the specified duration, and the start/end times of the presentation will be logged by the microscope through an event line in Scanbox.
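The calibration step in the first bullet boils down to a least-squares affine fit between the flashed spot positions and their imaged positions. A minimal sketch (the point coordinates are invented for illustration; this is not the actual Scanbox tool):

```python
# Fit the affine transform mapping SLM-plane spot positions to their
# measured positions in the two-photon image, via least squares.
import numpy as np

def fit_affine(slm_pts, img_pts):
    """Return the 2x3 matrix A such that img ~= A @ [x, y, 1]."""
    slm = np.asarray(slm_pts, float)
    img = np.asarray(img_pts, float)
    X = np.hstack([slm, np.ones((len(slm), 1))])  # homogeneous coordinates
    B, *_ = np.linalg.lstsq(X, img, rcond=None)   # solves X @ B = img
    return B.T

# Four flashed spots (made-up data: scale 2 in x, 3 in y, offset (10, 20))
slm = [(0, 0), (1, 0), (0, 1), (1, 1)]
img = [(10, 20), (12, 20), (10, 23), (12, 23)]
A = fit_affine(slm, img)
print(A @ [1, 1, 1])  # maps SLM point (1, 1) to its image position
```

With more than three spots the fit is overdetermined, so noise in the measured spot positions averages out; the residuals also give a quick sanity check on the calibration.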

So, present and future Scanbox users…  is this a reasonable starting point? As it stands, the system would be limited to N (as large as you want) points of different intensities.

Please add your comments and/or suggestions below.  This is your opportunity to have input into the design of the SLM path…  don’t miss it!

Yeti turns one! (and plans for next year)

The first image acquired by Yeti is just 1 year old!  Happy birthday Yeti!

We have seen Yeti grow a lot during the first year…

He added capabilities to stabilize images and extract signals in real time, enlarged the field of view, offered volumetric scanning allowing for co-variation of laser power and focal plane, included bidirectional scanning, provided on-line computer-assisted segmentation and linear power calibration, enabled remote firmware updates via the Yeti bootloader, and more…

Of course we are far from done…  Yeti has a lot of interesting plans for the next year, including:

Position control: Several colleagues accustomed to other 4D manipulators would prefer to have separate turning knobs for each axis. We are planning to provide such an option, which will also off-load all the motor control from Yeti’s GUI, thereby providing better performance during acquisition. This new device, called Knobby, is in its advanced stages of development.  It features a beautiful touch screen to interact with the user and display position information.  The 3D mouse control will still remain an option for those of you who prefer to keep working with it. Knobby is expected to be available soon.

Processing GUI: While some colleagues have developed and are happy with their own processing pipelines, other users would rather have a simple interface to perform standard image processing of the acquired image sequences, such as alignment, cell segmentation, signal extraction, and spike detection.  We are planning to produce such an interface this year, which will make life easier for many users.

Support of memory-mapped files: Yeti will provide a memory-mapped file with a ring buffer of recent data that can be consumed by other processes performing real-time analysis of the data stream.  This can come in handy for those of you who want to perform closed-loop experiments or develop your own real-time spike detection algorithms.
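A toy sketch of how such a ring buffer could work: the writer (the acquisition process) stores each frame in one of N slots and then publishes a write counter; a reader mapping the same file grabs the most recent frame. The file layout here (one uint64 counter followed by N fixed-size slots) is our assumption for illustration, not Yeti's actual format:

```python
# Memory-mapped ring buffer: writer publishes a counter after each frame;
# reader uses the counter to locate the newest slot.
import mmap, os, struct, tempfile

N_SLOTS, FRAME_BYTES, HEADER = 4, 8, 8  # tiny "frames" for demonstration

path = os.path.join(tempfile.mkdtemp(), "ring.dat")
with open(path, "wb") as f:
    f.write(b"\x00" * (HEADER + N_SLOTS * FRAME_BYTES))

def write_frame(mm, count, payload):
    slot = count % N_SLOTS
    start = HEADER + slot * FRAME_BYTES
    mm[start:start + FRAME_BYTES] = payload
    mm[0:HEADER] = struct.pack("<Q", count + 1)  # publish after the data

def latest_frame(mm):
    count, = struct.unpack("<Q", mm[0:HEADER])
    slot = (count - 1) % N_SLOTS
    start = HEADER + slot * FRAME_BYTES
    return bytes(mm[start:start + FRAME_BYTES])

with open(path, "r+b") as f, mmap.mmap(f.fileno(), 0) as mm:
    for i in range(6):  # six writes wrap around the four slots
        write_frame(mm, i, bytes([i]) * FRAME_BYTES)
    last = latest_frame(mm)

print(last)  # -> b'\x05\x05\x05\x05\x05\x05\x05\x05'
```

A real implementation would also need to guard against the reader catching a slot mid-write, e.g. with a per-slot sequence number, but the counter-then-data ordering above is the core of the idea.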

SLM path and control for optogenetics:  Yes, yes, we know…  you need this as soon as possible! We are working on it!  If there are any suggestions as to the kind of protocols you would like implemented for stimulation let us know.

Of course, if there are any other features that would make your work easier, just add them to the comments section.  Depending on demand and feasibility, we will try to include them.

In the meantime, happy holidays to Yeti users everywhere and may the new year bring peace, health and great science!

Welcome to Scanbox!

Welcome to the Scanbox File Exchange.  This is a site where you can find regular updates to the Scanbox software for two-photon imaging, report bugs, share data analysis tools, and discuss ideas for improvement.

As time goes by we will be posting tips and tricks on how to use the software, insights into the code (the Matlab software is open to the community), and documenting the various functions you can use to analyze the data.

Some of the features already implemented in the release version of Scanbox:

  • Two analog and two digital channels sampled at the laser frequency (80 MHz) with 16-bit depth.
  • Control of PMT gains.
  • Non-uniform spatial sampling correction in real time (raw data are streamed to disk).
  • Real time averaging and display of data.
  • Uniform power density over scan line by modulation of Pockels cell (an arbitrary waveform can be programmed).
  • Control X, Y, Z stage and tilt angle of objective.
  • Z-stack data collection.
  • Movement in a rotated coordinate system that keeps the (x,y) plane normal to the objective.
  • Control of laser parameters (power, shutter, wavelength).
  • Two additional TTL signals timestamped with the frame and line number where they occurred.
  • Two GigE cameras synchronized to the microscope frame to acquire eye pupil/position and ball movement.
  • Additional GigE camera for intrinsic imaging through the same (or different) objective.
  • Additional digital I/O, I2C, SPI, current generator (for electrical tuned lens) expansion capability.
  • Remote control of the microscope over the network (change file names, start acquisition, stop acquisition, etc).
  • Matlab software for reading data, motion correction, segmentation, and signal extraction.
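One detail worth spelling out from the list above: because external TTL events are stamped with the frame and line number where they occurred, converting a stamp to time only requires the scanner's line rate. The rates below are illustrative defaults, not Scanbox's actual configuration:

```python
# Convert a TTL event stamp (frame, line) to seconds, given the number
# of lines per frame and the resonant scanner's effective line rate.

def event_time_s(frame, line, lines_per_frame=512, line_rate_hz=7910.0):
    """Time of a TTL event, counted in scan lines from acquisition start."""
    total_lines = frame * lines_per_frame + line
    return total_lines / line_rate_hz

# An event on line 256 of frame 10, i.e. 5376 lines into the recording:
print(round(event_time_s(10, 256), 4))  # -> 0.6796
```

The line-number resolution is what makes these stamps useful for aligning stimuli or behavioral events to the imaging data at sub-frame precision.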

At the heart of the system is an AlazarTech 9440 digitizer and a custom-designed card based on the PSoC 5LP 32-bit ARM-based processor from Cypress, which is in charge of generating the scan signals, generating trigger signals for the cameras, timestamping external TTL events, and more.  The card communicates with the host computer through a USB serial line.

That’s it for a quick introduction…  but, before leaving, here is one of the first movies we obtained with the microscope in Josh Trachtenberg’s Lab showing 1 min in the life of prefrontal cortex (please don’t ask me what I was doing there).  Enjoy —