DAQ Synchronization

Align and synchronize data acquisition systems

Research, and behavioral neuroscience in particular, often requires integrating multiple simultaneous measurements of physical phenomena. Such measurements can often be made electronically through Data Acquisition (DAQ) systems. There are many reasons why the recordings of a whole experimental system cannot be made on a single DAQ. Any two recording DAQ (even two identical DAQ, and even expensive and fast ones), if not properly clock-synced, will most likely not have aligned clocks, and therefore will not perfectly agree on the passage of time. This error, or rather mismatch, becomes apparent when a recording epoch is sufficiently long relative to the duration of an event of interest. Indeed, even the amazing recordings of the very fast (and fragile!) Neuropixel electrophysiology probes do not agree on the passage of time: their actual sample rates fall anywhere between 29999.9 and 30000.1 Hz.

But fear not! Synchronization of multiple DAQ can be accomplished offline (post hoc)! If a proper synchronization signal is shared and recorded amongst all DAQ in a system, then the recordings can be aligned so that all DAQ agree on timing. Herein is a project and pipeline showing how to generate a simple, inexpensive, and unique barcode synchronization signal and use it to align multiple DAQ.

A simple microcontroller can be used to generate a unique synchronization barcode signal, either at set intervals or upon pressing a button, and this signal can be sent to all recording DAQ. One DAQ/clock is chosen (somewhat arbitrarily) to be the Main DAQ, and all other DAQ are treated as Secondary. The Main DAQ records the synchronization signal and any other signals required, and its clock is treated as ‘true’ time. All Secondary DAQ also record the synchronization signal along with their respective signals. The synchronization signal of each Secondary DAQ is compared to that of the Main DAQ, and then, through simple linear math, the two recordings can be aligned.
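The "simple linear math" amounts to fitting a line that maps Secondary sample indices onto Main sample indices. A minimal sketch in Python, using made-up barcode index values (the variable names and numbers below are illustrative, not from the repository), might look like:

```python
import numpy as np

# Hypothetical sample indices at which the SAME barcodes were detected
# on the Main DAQ (30 kHz) and a Secondary DAQ (2 kHz). Values are made up.
main_idx = np.array([30000, 330001, 630004, 930009], dtype=float)
sec_idx = np.array([2000, 22000, 42001, 62003], dtype=float)

# Least-squares fit of main_idx = slope * sec_idx + offset.
slope, offset = np.polyfit(sec_idx, main_idx, 1)

# Any Secondary sample index can now be expressed in Main 'clock' units,
# which corrects both the time-zero shift and the clock-rate scaling.
def sec_to_main(i):
    return slope * i + offset
```

Using at least two shared barcodes fixes both the slope (scaling) and the offset (alignment to time zero); using all of them averages out detection jitter.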

General Workflow

A general workflow for a typical project is the following:

All code with example data can be found within this zip file. Simply download and extract.

More Specific Workflow

Example: LabJack and Neuropixel

This project started from users’ needs for integrating Neuropixel (NP) systems into behavioral setups. The current IMEC cards for recording NP can output a basic signal for integration with other DAQ, but it does so without indication of a ‘start’ (zero time) such that it is difficult to align recordings. Alternatively, you can record an input signal from outside equipment into the NP system and then use this to align all recordings (including the NP ephys) to time zero, but this does not account for scaling (DAQs that do not agree on time). A better solution would be to generate a simple, unique signal and record this on both the NP and an outside DAQ. With this known signal, you can account for alignment to time zero and scaling, and since this synchronization signal is generated regardless of whether a DAQ is on or off, recording systems can begin independently without the need for a shared start signal.

An example use of the code and resources is shared with data here. Simply click this link to download all code and example data. In this example, a simple Arduino (called a Beetle) is used to generate a random 32-bit barcode. It then outputs this barcode on one of its 5V pins as a TTL ‘synchronization’ signal. It can do so automatically or, if one so chooses, only after a trigger (button push or input TTL). The Arduino then increments the barcode value by one and outputs that signal, ad infinitum (or for a set number of barcodes, if using a trigger). This synchronization signal is recorded by the IMEC NP system (along with its ephys recording), and is also recorded by another DAQ along with other behavioral signals. We chose the all-powerful LabJack DAQ because it is inexpensive, powerful, and fast enough for our needs. We were using a camera system for recording behavior with the super awesome DeepLabCut. This camera was recording at 200 Hz, so setting our DAQ to record at 2000 Hz is way more than sufficient (research “Nyquist rate” for more info). We also use the LabJack (LJ) to record the start of a trial and an LED presentation to the animal (as TTLs), and the output of a treadmill (as an inverting analog signal).

If your experiment uses NP and LJ as recording devices, you can follow this example code and data to learn how to process the LJ data and align it to the NP data. The LJ data first requires processing because the LJ can output its data as multiple .dat files with headers; to clean them up, we remove the headers and combine them into one .npy or .csv file. We then take the NP (“Main”) recordings and the LJ (“Secondary”) recordings, and extract the barcodes in each dataset. With the barcodes indexed on both recordings, we can compare the barcodes shared by both Main and Secondary recordings, use the difference in Main/Secondary index values between barcodes to generate a linear conversion formula, and then apply this conversion to the Secondary recordings so they align and scale to the Main recordings. Now all signals are aligned and scaled appropriately.
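As a rough illustration of the barcode-extraction step, a recorded TTL trace can be reduced to its rising and falling edge indices with a simple threshold. The function name and 3.0 V threshold below are assumptions for the sketch; the repository's own code handles the full barcode decoding:

```python
import numpy as np

def barcode_edges(signal, threshold=3.0):
    """Return sample indices of rising and falling edges of a TTL trace."""
    high = signal > threshold                 # boolean: line is high
    flips = np.diff(high.astype(np.int8))     # +1 at rising, -1 at falling
    rising = np.flatnonzero(flips == 1) + 1   # first sample above threshold
    falling = np.flatnonzero(flips == -1) + 1 # first sample back below
    return rising, falling

# Toy trace: low, a 5-sample 5V pulse, low again.
trace = np.array([0, 0, 5, 5, 5, 5, 5, 0, 0], dtype=float)
rise, fall = barcode_edges(trace)  # rise -> [2], fall -> [7]
```

The inter-edge intervals (in each DAQ's own sample counts) are what identify which barcode is which, so the same barcode can be matched across Main and Secondary recordings.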




More Specific Workflow/Example: Other DAQs

Although we originally designed the code in this repository to align data between a LabJack DAQ and Neuropixel, it should work unaltered for any DAQ systems that output their data in either Data file (.dat) or Numpy file (.npy) format, with the signals in those files arranged in columns. Even if your multiple-DAQ setup has systems that do not meet these conditions, fear not! With a basic understanding of Numpy and Python, it should be relatively simple to alter the Python code within the repository to handle different file types and/or data formatting to meet your needs. You can also email us with questions.
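For example, a small helper along these lines (the function name, channel count, and dtype are assumptions for your own DAQ, not part of the repository) could load either file type into the expected samples-by-channels layout:

```python
import numpy as np

def load_columns(path, n_channels=None, dtype=np.float64):
    """Load a recording as a 2-D (samples x channels) array.

    .npy files load directly; for a raw binary .dat file you must supply
    the channel count and sample dtype your DAQ actually wrote.
    """
    if str(path).endswith(".npy"):
        data = np.load(path)
    else:
        flat = np.fromfile(path, dtype=dtype)   # 1-D stream of samples
        data = flat.reshape(-1, n_channels)     # interleaved -> columns
    return data
```

Once your data is in this samples-by-channels shape, the rest of the barcode extraction and alignment pipeline should apply as-is.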

Example data

Within the zip file is a folder titled ExampleData, which contains example data that one can use for running all code provided here. A LabJack recorded several signals, and a Neuropixel system was set up to record its one TTL input and its ephys data.

An Arduino was used to generate a barcode that was recorded simultaneously (via a BNC cable with a T-connector) on both the custom National Instruments PCIe card used for Neuropixels and a LabJack. OpenEphys was used to record NP data from a used probe that had its tip broken off (we had it lying around). In the short recording time, 8 GB of data was generated, so the ephys data file was removed from this example. Prior to this, it could have been found in:


OpenEphys outputs the TTL in a separate data file (aligned by indices of the ephys data) in:


The LabJack recorded the barcode TTL signal and several other signals meant to represent a behavioral setup. This is similar to the LabJack Example project and more details for customization for YOUR research can be found there. In brief, the barcode signal was recorded on Channel (LJStreamUD Row) 0, Channel 1 recorded a 3.3 V 15 Hz signal from a camera (a frame is taken at every up and down tick of the signal), Channel 2 was not used (could record a Trial Start Signal), and Channel 3 recorded an inverting 10V sinusoid (meant to represent…oh, let’s say a treadmill).

| Equipment | Signal | LabJack Input | LJStreamUD Row | +Ch | -Ch | Scaling Equation |
|---|---|---|---|---|---|---|
| Arduino | 5V Digital | FIO4 | 0 | 193 | 199 | y=floor(a/2^4)-2*floor(a/2^5) |
| Camera | 3.3V Digital | FIO5 | 1 | 193 | 199 | y=floor(b/2^5)-2*floor(b/2^6) |
| Not Used | NA. What could you put in here?? | FIO6 | 2 | 193 | 199 | y=floor(c/2^6)-2*floor(c/2^7) |
| Treadmill | +/-10V Analog | AIN00 | 3 | 0 | 199 | y=d |
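The digital-channel scaling equations in the table are just bit extraction: floor(a/2^n) - 2*floor(a/2^(n+1)) picks out bit n of the packed digital word that the LabJack streams. A quick Python check (the helper names are ours, purely illustrative) shows the equivalence with an ordinary bitwise operation:

```python
import math

def scaling_bit(a, n):
    """The table's scaling equation: bit n of word a, via floor division."""
    return math.floor(a / 2**n) - 2 * math.floor(a / 2**(n + 1))

def bit(a, n):
    """Same thing written as a bitwise operation."""
    return (int(a) >> n) & 1

# e.g. word 48 = 0b110000 has bits 4 and 5 set:
# scaling_bit(48, 4) == 1, scaling_bit(48, 6) == 0
```

So Row 0 recovers the FIO4 barcode line (bit 4), Row 1 the FIO5 camera line (bit 5), and Row 2 the FIO6 line (bit 6); the analog treadmill channel needs no unpacking (y=d).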

The LabJack output its files to:


For clarity of this example, the NP TTL file is copied to another folder, called Main (as in the Main DAQ; the DAQ we will treat as ‘true’ time). The LJ data is likewise copied into a folder called Secondary.

Notes and References

Much of this work was inspired by the LabJack Community and the Open Ephys system, ESPECIALLY for barcode generation.

A special thanks to Josh Hunt of the Felsen lab for assistance, direction, and some fancy dancy coding help. Check out his git here.

ONE Core acknowledgment

Please acknowledge the ONE Core facility in your publications. An appropriate wording would be:

“The Optogenetics and Neural Engineering (ONE) Core at the University of Colorado School of Medicine provided engineering support for this research. The ONE Core is part of the NeuroTechnology Center, funded in part by the School of Medicine and by the National Institute of Neurological Disorders and Stroke of the National Institutes of Health under award number P30NS048154.”