Scientific cameras can be expensive. But oftentimes, high resolution at high frame rates is not necessary. In fact, because of limitations in computation times and data storage, most videos taken with high resolution/speed cameras are immediately down-sampled. See Fig 3 of On the inference speed and video-compression robustness of DeepLabCut.
Enter the ever-hackable PS3Eye Camera, henceforth called PS3E (yeah, I’m too lazy to type the ‘ye’). We show how to get timestamped frames at 640x480 resolution at greater than 90 frames per second. This little powerhouse can probably meet your needs. At the time of writing, this camera is around $7.
Here we will look at the use of this camera, with special consideration for DeepLabCut (also see ONE Core notes on installation). We will look at several aspects in particular:
- Positioning the camera(s) using 3D printed and off-the-shelf parts
- Manual focusing with novel lenses
- Capturing video in Windows (please consider Ubuntu, you will be happy you did!)
- Using this camera in Windows in Matlab (note, it works natively in Ubuntu Matlab!)
- Capturing timestamped images in Python (on Ubuntu)
- Use in LabView
- Finally, syncing the cameras together so that they capture simultaneously (it’s like reverse Hebbian learning: those that wire together fire together!!!)
Of note, DeepLabCut can OUTPERFORM humans. Meaning, if you can see it in the video feed, DLC can too! See here on using multiple cameras (really breaking the bank with another $7 camera, eh?!) for 3D analysis.
What a neat little camera. Here we will go over:
- Lenses. The camera offers good resolution and timing, but I don’t like the lens that comes with it. We show you several lens options that will help you get better videos. But, this requires:
- Disassembly. The housing that comes with the camera holds onto a lens we don’t like. To hold onto new lenses (camera standards M12 and CM Mount), we show you how to break open the case. Then you can use:
- 3D Printed Housing. To hold onto your chosen lens(es) onto 1/4”-20 screws (industry standard in tripods and similar) or:
- Positioning. Hold the cameras in a flexible yet rigid manner with gooseneck tubing. Then consider:
- Syncing multiple cameras together for calibration, 3D analysis, and true distances (think meters instead of pixels). You can output the frame times of a camera via 3.3V TTL (kind of, it drops frames like most cameras). You can use this signal to sync multiple cameras together, but I doubt that is necessary because:
- The noise introduced is small
- Mirrors could be used with DLC, then post crop the EXACT same frames into ‘new videos’
- SOFTWARE: we provide software to capture timestamped videos of multiple cameras, ignoring duplicate frames. Play with camera settings (example: exposure) for your USB Camera. Only tested on Ubuntu, but we also provide some thoughts on:
- Not Ubuntu and outside of Python. But why?
Positioning the Cameras
The PS3E comes on a stand that allows for moving the camera in two dimensions. But that isn’t enough to position the camera at an arbitrary point in space and have it stay there. You have probably played with flexible tubing, called gooseneck tubing (think of those old lamps that you position into place and the light stays put). It’s perfect for this application, but rather difficult to find on its own. The designs below are all based on this kind of tubing (Part Number GSNECKRAW18”X.460 N/F Raw Flex.105OTMB/.062Galv.ASAP, 18” in length) from Uniprise (note, they come covered in machining oil, so first wipe them off). The gooseneck tubing can be connected to a variety of 3D prints.
Here are some 3D files you can print with an FDM printer (high speed, no supports, 20% infill, nGen or PLA) to hold the gooseneck tubing and the PS3Eye Camera (click on the .stl to view in 3D in your browser, download from there by clicking the ‘cloud down arrow’ icon):
|File|Description|
|---|---|
|BarePS3EyeBack.ipt|Holds the Gooseneck and PS3E when disassembled *1|
|BarePS3EyeBack.stl|Holds the Gooseneck and PS3E when disassembled *1|
|BarePS3EyeFrontHolder.iam|Cover for Gooseneck and PS3E when disassembled *1|
|BarePS3EyeFrontHolder.stl|Cover for Gooseneck and PS3E when disassembled *1|
|ClipBottom.ipt|Holds Gooseneck sandwiched in a binder clip|
|ClipBottom.stl|Holds Gooseneck sandwiched in a binder clip|
|ClipTop.ipt|Holds Gooseneck sandwiched in a binder clip|
|ClipTop.stl|Holds Gooseneck sandwiched in a binder clip|
|PSEye3.ipt|Holds the Gooseneck and PS3E|
|PSEye3.stl|Holds the Gooseneck and PS3E|
|GooseneckBreadboard.ipt|Holds the Gooseneck to a breadboard|
|GooseneckBreadboard.stl|Holds the Gooseneck to a breadboard|
|GoosneckholderTripod.ipt|Holds the Gooseneck to a tripod|
|GoosneckholderTripod.stl|Holds the Gooseneck to a tripod|
|M12_mount.scad|Mount an M12 lens onto the PS3E *2|
|M12_mount.stl|Mount an M12 lens onto the PS3E *2|
*1 As remixed from tax
*2 As remixed from jasonwebb
See Assembly instructions below to see some pictures of what these look like.
The gooseneck tubing should screw perfectly into the provided holes and need not be affixed with any glue or screws. But if you want extra holding power, there are places you can drop in two 6-32 nuts and screws (pan head, 3/4” McMaster 90272a151). If you don’t plan on disassembling the camera from its housing, you can print out the PSEye3.stl above and attach the camera with that rubber band you kicked under your desk yesterday.
The same gooseneck fitting threads into the GooseneckBreadboard.stl. This part can be attached to a standard optical breadboard with 1” hole spacing using basic 1/4”-20 screws. Or, if you would like a more general clip, print out the ClipBottom.stl and ClipTop.stl and find yourself a 1” binder clip. The two inner holes were originally intended to be screwed into the binder clip, but I broke a drill bit trying to use them. By simply screwing the top to the bottom with the outer holes (M3 or 6-32 screws work), you can sandwich the parts over the clip and it holds well.
If you are replacing the lens, print out the M12_mount.stl, don some gloves to protect the optics, then remove the two screws holding the lens holder in place, and replace with the lens of your choice (pictures below). The focal plane will move with how much it is screwed in. When you are all set up, adjust the focus plane to the desired plane by tightening or loosening these screws. This will require some guessing and checking.
The BarePS3EyeFrontHolder.stl allows the wires to be well held with the strain relief. If you have other wires to get to the outside, create a makeshift strain relief by creating a knot in the wires larger than the hole, then push the wires through the hole. If the wires get tugged, the knot will take the force, not the soldered connection. Two M3 screws affix the camera PCB to the 3D print.
The BarePS3EyeBack.stl simply snaps into the front, but you can also screw them together if you need.
Tracking larger animals? Try printing out the GoosneckholderTripod.stl and attaching it to a tripod.
There are three main reasons to disassemble your camera as soon as you get it.
- The stock focusing on the PS3E offers only two discrete settings. That is, it clicks between a ‘close up’ and a ‘further away’ focal plane. You may wish to focus even closer, at infinity, or adjust the field of view (wide angle). You could also get a lens without an IR filter, allowing you to capture in a spectrum invisible to a human. To do this, read below (Lens section).
- Syncing multiple cameras can be achieved with disassembly and soldering two wires onto each camera board. Tying several cameras into a ‘master’ and ‘follower’ configuration allows for all cameras to capture images simultaneously.
- It’s fun.
First, remove the plastic covers on the back. Then remove the 4 screws on the back.
Once you crack the outer cover, there is no going back. Or at least, it would be difficult. Expect that you will break the plastic. But that’s OK, you can 3D print parts (above). Start at the top of the camera with a flat object (screwdriver) and wiggle it in between the plastic halves. Work it around from the top, down one side, then the other.
It will eventually pop off. Remove all the screws holding the PS3E camera to the front. The two screws with arrows should remain (these keep the lens holder attached).
You can remove the covers for the microphones.
As stated, the lens that comes with the PS3E offers two focal planes, which may be sufficient for your needs. However, you may wish to focus even closer, at infinity, in between, or adjust the field of view (wide angle). You could also get a lens without an IR filter, allowing you to capture in a spectrum invisible to a human (IR LEDs for unnoticed timing and triggering?). You can find a HUGE variety of inexpensive M12 lenses available online.
If you are replacing the lens, print out the M12_mount.stl, don some gloves to protect the optics, then remove the two screws shown above with red arrows, and replace with the lens of your choice. The focal plane will move with how much it is screwed in. When you are all set up, adjust the focus plane to the desired plane by tightening or loosening these screws. The M12 mount part may not print with exact threading (especially if printed at the low resolution we suggest), but that doesn’t matter at all. The lens should still thread in nicely.
Is your lens causing distortion of the image? Check out Argus for ways to deal with this.
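For intuition, the radial (Brown–Conrady) distortion model that calibration tools like Argus or OpenCV fit for you can be sketched in a few lines of Python. This is only an illustration of the math, not code from any particular tool, and the coefficients in the usage below are made up:

```python
def distort(x, y, k1, k2):
    """Map an ideal (undistorted) normalized point to where the lens
    actually images it, using the radial model x_d = x(1 + k1*r^2 + k2*r^4)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def undistort(xd, yd, k1, k2, iterations=20):
    """Invert distort() by fixed-point iteration: repeatedly divide the
    observed point by the scale evaluated at the current estimate."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale
    return x, y
```

For mild (e.g. barrel, k1 < 0) distortion the iteration converges quickly; round-tripping a point through `distort` then `undistort` recovers the original coordinates.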
Use with Windows 10
This camera does not natively work with Windows 10. There are several drivers out there that claim to work, some that you even pay for, but we did not find success with them. However, the driver called PS3EyeDirectShow has worked for us. A simple download and running of the .msi and you can begin to use and test it (try to view or record it in VLC).
Use with Windows 10 and Matlab
Lots of places online claim that this camera does not work with Matlab in Windows, including Matlab support. However, we have gotten it working using the Image Acquisition Toolbox and the driver above. We are not going to post any code, because it is pretty straightforward using their functions.
Use with Ubuntu
It works. No driver install required.
Capture timestamped videos with the script found here.
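If you want a feel for what the script does, here is a minimal sketch (not the ONE Core script itself) of timestamping frames and dropping duplicates, assuming OpenCV (`cv2`) and a camera at device index 0:

```python
import time

def keep_new_frames(stamps, min_dt=1e-4):
    """Return indices of timestamps at least min_dt seconds apart,
    dropping near-duplicate frames the driver delivered twice."""
    kept, last = [], None
    for i, t in enumerate(stamps):
        if last is None or t - last >= min_dt:
            kept.append(i)
            last = t
    return kept

def capture(seconds=2.0, index=0):
    """Grab timestamped frames for a fixed duration (requires a camera)."""
    import cv2  # hardware-dependent; only needed for real capture
    cap = cv2.VideoCapture(index)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
    cap.set(cv2.CAP_PROP_FPS, 90)
    stamps, frames = [], []
    t_end = time.monotonic() + seconds
    while time.monotonic() < t_end:
        ok, frame = cap.read()
        if ok:
            stamps.append(time.monotonic())
            frames.append(frame)
    cap.release()
    keep = keep_new_frames(stamps)
    return [stamps[i] for i in keep], [frames[i] for i in keep]
```

`time.monotonic()` is used rather than `time.time()` so that system clock adjustments cannot produce negative frame intervals.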
Use with Labview
Want to use this camera with LabView? (It’s ok, I’m not here to judge). The ONE Core has not tested this, but has heard of good results from Austin Consultants.
It might be important to capture videos with the frames in sync, for example if you are trying to get 3D data. But probably not: in our testing of timestamped videos with two cameras not tied together, we achieved a frame rate of 92 frames per second for each camera (at 640x480 pixels), with an average of 0.000159 seconds between acquisition of a frame on one camera and the corresponding frame on the other. With three cameras, we had a frame rate of 69 frames per second, with an average of 0.00032 seconds between the first and third cameras. See here for data.
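The frame-rate and inter-camera offset figures above come straight from the timestamp logs; the arithmetic can be sketched like this (helper names and the numbers in the test are ours, not from our logs):

```python
def frame_rate(stamps):
    """Mean frames per second from a list of timestamps in seconds."""
    if len(stamps) < 2:
        return 0.0
    return (len(stamps) - 1) / (stamps[-1] - stamps[0])

def mean_offset(stamps_a, stamps_b):
    """Mean absolute time between paired frames from two cameras,
    assuming frame i on camera A corresponds to frame i on camera B."""
    n = min(len(stamps_a), len(stamps_b))
    return sum(abs(a - b) for a, b in zip(stamps_a[:n], stamps_b[:n])) / n
```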
Buuuuutttttt, if you still want to sync the cameras, check out this post. It requires some crazy hard soldering. Basically, you disassemble the housing, and scrape off a bit of PCB insulation covering a through hole, solder to that pad (VSYNC or trigger output), solder to the top of R26 (FSIN or trigger input), and solder another wire to ground. You then ground all cameras together, set one camera as the ‘master’ to send output on the VSYNC into the FSIN on all other cameras. Tying several cameras into a ‘master’ and ‘follower’ configuration allows for all cameras to capture images simultaneously (!! reverse Hebbian learning: cameras that are wired together fire together!!!!!). Hold the wires to the PCB with hot glue. Others claim to drive the frame rate with an Arduino, but our testing found it quite unreliable/noise inducing.
Although we soldered the hardware together and confirmed the sync signal on the oscilloscope, we haven’t dug deep enough to find a way for software to capture the two data streams simultaneously. It is probably possible with some sort of fancy threading, but I can live with an average delay of about 0.16 msec between two cameras. If you use our code to obtain data, you will see no advantage to soldering the cameras together; this code grabs the data sequentially.
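That “fancy threading” might look something like the sketch below: one grab thread per camera, so reads overlap instead of running sequentially. This is untested against synced hardware, and the OpenCV usage in the comment (device indices, `read` signature) is an assumption:

```python
import threading
import time

def grab_loop(read_fn, out, stop):
    """Repeatedly call read_fn() -> (ok, frame), timestamping each frame."""
    while not stop.is_set():
        ok, frame = read_fn()
        if ok:
            out.append((time.monotonic(), frame))

def capture_concurrently(read_fns, seconds=2.0):
    """Run one grab thread per camera for a fixed duration; return one
    list of (timestamp, frame) tuples per camera."""
    stop = threading.Event()
    buffers = [[] for _ in read_fns]
    threads = [threading.Thread(target=grab_loop, args=(fn, buf, stop))
               for fn, buf in zip(read_fns, buffers)]
    for t in threads:
        t.start()
    time.sleep(seconds)
    stop.set()
    for t in threads:
        t.join()
    return buffers

# Hypothetical usage with OpenCV:
#   caps = [cv2.VideoCapture(i) for i in (0, 1)]
#   bufs = capture_concurrently([c.read for c in caps], seconds=2.0)
```

Note that `cap.read()` releases the GIL during the blocking V4L2 read, so plain threads (no multiprocessing) should be enough to keep both cameras draining.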
Use a Raspberry Pi to stream video to…other computers/Data Storage
ONE Core acknowledgement
Please acknowledge the ONE Core facility in your publications. An appropriate wording would be:
“The Optogenetics and Neural Engineering (ONE) Core at the University of Colorado School of Medicine provided engineering support for this research. The ONE Core is part of the NeuroTechnology Center, funded in part by the School of Medicine and by the National Institute of Neurological Disorders and Stroke of the National Institutes of Health under award number P30NS048154.”