19.8.16

Orange Pi ONE and camera

I bought an Orange Pi One to drive my 3D printer.
The plan was to use OctoPrint, connect the printer over USB, connect to the network via the on-board Ethernet and use the Orange Pi camera board for video.


All was going OK with the installation.
I modified the Ubuntu Xenial installation to work in read-only mode first.
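For reference, the usual trick is an fstab along these lines (a sketch, not my exact file - the device name and tmpfs sizes here are assumptions):

  # /etc/fstab - root read-only, volatile directories on tmpfs
  /dev/mmcblk0p1  /         ext4   ro,noatime          0 1
  tmpfs           /tmp      tmpfs  defaults,size=64M   0 0
  tmpfs           /var/log  tmpfs  defaults,size=16M   0 0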
I mounted a network share from the Banana Pi server (1TB drive there) on the Orange Pi - no problem.
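Assuming an NFS export (the hostname and paths below are made up for illustration - the share could just as well be Samba), that mount is a one-liner:

  sudo mount -t nfs bananapi:/srv/storage /mnt/share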
Then I re-mounted the filesystem read/write and installed all the tools necessary to compile OctoPrint.
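The remount in both directions is simple:

  sudo mount -o remount,rw /   # make root writable for the installation
  sudo mount -o remount,ro /   # back to read-only afterwards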
No problem so far, everything went OK.
Now for the final task - the camera.
I followed the example here and was able to start motion. But there was a problem: even when there was no change in the scene (i.e. a static picture from the camera with nothing moving), motion took 25% of all CPU cores. And when motion occurred, despite the lowest possible detection settings, CPU usage spiked to 50%.
Unacceptable.
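To give an idea of what "lowest possible" means here, these are the relevant motion.conf knobs (the values are illustrative and option names vary a bit between motion versions):

  # motion.conf - keep detection as cheap as possible
  width 800
  height 600
  framerate 5
  threshold 32000   # very insensitive motion detection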
So, I decided to proceed further with the installation and get MJPG-streamer. I had already used the software successfully on multiple devices - an OpenWRT router based on a MediaTek chip, a Banana Pi, a Raspberry Pi and even standard x86. I liked it - it had a reputation as the least CPU-consuming streamer out there.
I compiled it no problem, fired it up and bang - no go.
The input_uvc.so was unable to work with the Orange Pi camera.
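The standard invocation that failed was along these lines (port and paths assumed):

  mjpg_streamer -i './input_uvc.so -d /dev/video0 -r 800x600 -f 5' \
                -o './output_http.so -p 8080'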
I managed to get fswebcam to work - I mounted the /tmp partition as a RAM disk and stored a capture there twice every second. I ran mjpg-streamer with input_file.so successfully, but the resulting framerate was 2 frames per second and, due to fswebcam, it used almost 30% of all CPUs. So not a real win, considering the CPU usage versus the framerate.
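The whole contraption looked roughly like this (a sketch from memory; input_file.so options differ slightly between mjpg-streamer forks):

  mount -t tmpfs -o size=16M tmpfs /tmp
  # capture twice a second into the RAM disk
  while true; do fswebcam -d /dev/video0 -r 800x600 --no-banner /tmp/cap.jpg; sleep 0.5; done &
  # serve whatever lands in /tmp
  mjpg_streamer -i './input_file.so -f /tmp' -o './output_http.so'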
I started to test gstreamer. It was not able to work with /dev/video0 directly, so I had to install v4l2loopback (using apt-get install v4l2loopback-dkms) and vidcopy.
As I am using the Orange Pi ONE in headless mode, it was VERY painful to try to construct a working pipeline.
I managed, but the RTP video stream had a huge delay (10+ seconds) and CPU usage was more than 40% on all cores, again. I considered this not working, again.
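For the record, the pipelines I fought with had this general shape (a sketch, not my exact command - the receiver address is made up):

  gst-launch-1.0 v4l2src device=/dev/video9 \
    ! video/x-raw,format=UYVY,width=800,height=600,framerate=5/1 \
    ! videoconvert ! jpegenc ! rtpjpegpay \
    ! udpsink host=192.168.1.100 port=5000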
My last hope was ffmpeg. Capturing from the webcam to a file in /tmp was no problem at all. But streaming was a different topic - again I had a huge delay, depending on the time difference between launching the server and the receiver. And the same CPU usage as gstreamer.
Again, no go.
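Roughly what that looked like (again a sketch; I am assuming the v4l2loopback device here, and the address is made up):

  # capture to a file - this part worked fine
  ffmpeg -f v4l2 -video_size 800x600 -framerate 5 -i /dev/video9 /tmp/capture.mkv
  # streaming attempt - huge delay
  ffmpeg -f v4l2 -video_size 800x600 -framerate 5 -i /dev/video9 \
         -f mpegts udp://192.168.1.100:5000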
So I was back to square one. I installed a USB webcam, disabled the gc2035 driver and used mjpg-streamer. All worked well: CPU usage was below 25% on one core and ~20% on a second, and the rest of the cores were idle. Fantastic!
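Disabling the driver is the usual blacklist dance (the file name is my choice):

  echo 'blacklist gc2035' | sudo tee /etc/modprobe.d/blacklist-gc2035.conf
  sudo rmmod gc2035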
But it got me worried - I still had the Orange Pi camera and it was useless.
So I continued experimenting. I found out that mjpg_streamer has an undocumented input option for input_uvc.so to use YUYV instead of MJPG. I tried that - no success.
Then I realized that vidcopy forces the mode not to YUYV, but to UYVY. That means the byte order of the video data differs: YUYV stores each pixel pair as Y0 U0 Y1 V0, while UYVY stores it as U0 Y0 V0 Y1. I was desperate and started to browse the internet for details on how to decode UYVY to RGB. The plan was to edit the source of mjpg_streamer and add the support somehow - although my programming skills are terrible (that's how desperate I was).
I found some details and decided to search for the code that manages the switch to YUYV and start from there. I found that it uses libjpeg to encode the RGB data back into an MJPG stream - so far so good; that part I could simply copy&paste.
So I decided to create my own input parameter - and THAT'S THE MOMENT I found out that there is another undocumented switch already in place. It's called -u and guess what - it forces mjpg_streamer to use UYVY!!!
A quick test - with the v4l2loopback module in place, vidcopy running and mjpg_streamer using the -u parameter for input_uvc.so, I was finally getting a picture!
mjpg_streamer complains about the camera when fired up - it tries to use the UVC controls and fails miserably, but I don't care :)
CPU usage was ~20% on the fourth core; the rest of them were idle!
Total success!!!
To streamline things, I modified /etc/modules and added "v4l2loopback video_nr=9" - this creates loopback video device number 9, so I have /dev/video9 after every reboot.
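One caveat I have seen mentioned: some systemd-based images ignore module arguments in /etc/modules. If that bites you, the equivalent is a bare module name plus an options file:

  # /etc/modules
  v4l2loopback
  # /etc/modprobe.d/v4l2loopback.conf
  options v4l2loopback video_nr=9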
From then on it was simple - running "./vidcopy -i /dev/video0 -o /dev/video9 -f UYVY -w 800 -h 600 -r 5 &" fired up vidcopy and moved it to the background.
And finally I ran "mjpg_streamer -i './input_uvc.so -r 800x600 -f 5 -n -u -d /dev/video9' -o './output_http.so'" - and I was done!
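To make the whole chain survive a reboot, something like this in /etc/rc.local (before the exit 0) should do - the paths are assumptions, adjust to wherever you built vidcopy and mjpg-streamer:

  /root/vidcopy -i /dev/video0 -o /dev/video9 -f UYVY -w 800 -h 600 -r 5 &
  sleep 2
  /root/mjpg-streamer/mjpg_streamer \
    -i '/root/mjpg-streamer/input_uvc.so -r 800x600 -f 5 -n -u -d /dev/video9' \
    -o '/root/mjpg-streamer/output_http.so' &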
Hope this helps folks out there.