A one-line analysis pipeline

With the ability to read Scanbox binaries in Suite2p, I wrote a one-line processing pipeline for Scanbox. If you are in a data directory containing the folders you want to process, all you have to do is pass a list of the experiments to sbxsuite2p():


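For example, with a list of two experiments (the folder names here are hypothetical placeholders):

```matlab
sbxsuite2p({'xx0_000_001','xx0_000_002'});   % process two experiment folders
```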
Matlab will then open a command terminal, activate the suite2p virtual environment, process the files, create a suite2p/ folder with the output, and generate matching Scanbox files for each experiment in the list: *.signals (containing the spikes), *.segment (containing the ROIs), and *.align (containing the mean image).

sbxsuite2p() running the pipeline on a list of experiments

The Python script that runs these steps is called suite2p_process.py, and you can modify it to override some of the default options.

from suite2p import run_s2p, default_ops
import numpy as np
import sys, os
import matlab.engine
import scipy.io

data_dirs = sys.argv[1:]

ops = default_ops()					# load default options

for dd in data_dirs:				# batch suite-2p processing of each file

	print('\nProcessing: ' + dd + '\n')

	db = {
	      'input_format': 'sbx',          # process sbx files
	      'delete_bin': True,             # delete the binary file when done
	      'look_one_level_down': False,   # whether to look in ALL subfolders when searching for sbx files
	      'fs': 15.56,                    # typical sampling frequency
	      'data_path': [dd],              # a list of folders with data - just one here...
	      'save_path': dd,                # save path
	      'subfolders': [],               # choose subfolders of 'data_path' to look in (optional)
	      'save_mat': True,               # save matlab output (Fall.mat)
	      'fast_disk': 'h:/bin'           # where the binary file will be stored (should be an SSD)
	     }

	opsEnd = run_s2p(ops=ops, db=db)

for dd in data_dirs:									# Do post-processing 

	print('\nPost-processing: ' + dd)

	os.chdir(dd + '/suite2p/plane0/')					# change directory to where the results are
	fn = dd.split('/')[-1]								# experiment name = last path component

	eng = matlab.engine.start_matlab()					# start up your engines...
	eng.sbxsuite2sbx('Fall.mat',fn,0.8,nargout=0)		# convert back to Scanbox segment and signal files
	ops = np.load('ops.npy', allow_pickle=True)			# load the options/results dictionary
	ops = ops.item()
	m = ops['meanImg']									# extract the mean image
	os.chdir('../../')									# go back to base folder
	scipy.io.savemat(fn + '.align',{'m': m})			# save mean image in align file

And the Matlab function that executes the script is:

function sbxsuite2p(fname)

str = [];

if iscell(fname)
    for i = 1:length(fname)
        str = [str ' ' getpath(fname{i})];
    end
else
    str = getpath(fname);
end

str = strrep(str,'\','/');
pyscript = strrep(which('suite2p_process.py'),'\','/');     % find the processing script
cmd = sprintf('conda activate suite2p && python.exe %s %s & ', pyscript, str);
system(cmd);                                                % launch the pipeline in the background

function p = getpath(fn)

    s = what(fn);
    if isempty(s)
        error('No such file');
    end
    p = s.path;

That’s all… So, after you close Scanbox and turn the laser off (don’t forget to turn the laser off), get sbxsuite2p() going and come back in the morning ready to analyze the data. Your old Matlab processing scripts should work the same whether you used the Scanbox or the Suite2p pipeline.

To use this script you will need to add the Matlab Python engine API to the Suite2p virtual environment. Here are the instructions.
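A minimal sketch of that step, assuming a standard Matlab installation and that the suite2p environment's python.exe is the one found on the path (check the MathWorks instructions for the exact procedure on your version):

```matlab
% Install the Matlab engine API into the active Python environment.
% Run from the Matlab prompt after activating the suite2p environment,
% or equivalently run 'python setup.py install' from this folder in a terminal.
cd(fullfile(matlabroot,'extern','engines','python'));
system('python setup.py install');
```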