Monthly Archives: July 2017

Trashbot upper body and neck servos revived

I recently split my Trashbot in half to finally get a grip on the walking patterns of the lower part, since I changed its controller from an Arduino Nano to a Raspberry Pi. Here’s the upper part:

Trashbot upper body incl. neck & head

With the recent progress of running the Oculus Rift from a Pi 3 and the experiments in streaming video from the Blackbird 2 stereo cam, I thought it would be a great idea to attach the camera to the upper part of Trashbot and send the Oculus head orientation to the neck-controlling Arduino Pro Mini.
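I haven’t fixed the protocol yet, but the neck side could be as simple as the following sketch. Everything here is an assumption for illustration: the servo pins, the baud rate and the tiny 0xFF/yaw/pitch framing are placeholders, not the final design.

// Hypothetical neck controller sketch for the Arduino Pro Mini:
// expects frames of 0xFF, yaw, pitch (one byte each, 0-180 deg).
#include <Servo.h>

Servo yawServo;   // assumed on pin 9
Servo pitchServo; // assumed on pin 10

void setup() {
  Serial.begin(57600);   // assumed baud rate
  yawServo.attach(9);
  pitchServo.attach(10);
}

void loop() {
  // Wait for a full frame; the 0xFF read doubles as a cheap resync.
  if (Serial.available() >= 3 && Serial.read() == 0xFF) {
    int yaw   = Serial.read();
    int pitch = Serial.read();
    yawServo.write(constrain(yaw, 0, 180));
    pitchServo.write(constrain(pitch, 0, 180));
  }
}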

The first step is to get the Arduino running with the PC again. But oh, that shitty servo:

So, before diving deeper into head synchronous robotic telepresence, I’ll need to fix that bugger…

Wireless streaming stereo video from an RC rover to the Oculus Pi

The next iteration was of course to simply try out the wireless transmission of video. And this doesn’t make sense if you still have the camera attached to yourself, so I “augmented” another project of mine, the rover:

Arduino / RC mixed autonomy Rover

I also built a little self-sufficient stereo video transmission pack, including the Blackbird 2 camera:

Battery, video sender, stereo camera

And attached the two. I love modular designs where you can recombine your projects easily. So the combined setup simply has two batteries etc. Here’s the full setup:

Oculus Pi wireless streaming video rover setup

Here’s the video:

(Don’t know what the heck Pinnacle Studio was thinking when it put the black frame around the video on export.)

Strangely, we seem to get some kind of “interlace distortion” when the car is moving too quickly. I don’t quite know whether that is an artefact of the analog video transmission or of the digitisation process itself:

interlace artifacts

When playing with this, you intuitively move your head to look around. So I could read out the head movements and send them to the Arduino on the rover to actually move the camera. I’d also need to add another servo to the camera to actually be able to move along at least two axes. Let’s see…
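If I go down that road, the Pi side could roughly look like the sketch below. To be clear, this is untested and full of assumptions: I’m assuming OpenHMD provides the tracking, that /dev/ttyUSB0 is the rover’s serial link, that the quaternion-to-yaw/pitch math matches OpenHMD’s axis conventions, and the 0xFF/yaw/pitch one-byte framing is the same made-up one as in the neck sketch above.

// Sketch: read DK2 orientation via OpenHMD, forward yaw/pitch
// bytes to the rover's Arduino over serial. Untested; assumptions
// marked inline.
#include <openhmd/openhmd.h>  // header path may vary per install
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <algorithm>
#include <cmath>

int main() {
  ohmd_context* ctx = ohmd_ctx_create();
  if (ohmd_ctx_probe(ctx) <= 0) return 1;      // no HMD found
  ohmd_device* hmd = ohmd_list_open_device(ctx, 0);

  int fd = open("/dev/ttyUSB0", O_WRONLY | O_NOCTTY); // assumed port
  termios tio{};
  tcgetattr(fd, &tio);
  cfmakeraw(&tio);
  cfsetospeed(&tio, B57600);                   // must match the Arduino
  tcsetattr(fd, TCSANOW, &tio);

  // Clamp a degree value into the 0-180 servo range as one byte.
  auto toByte = [](float deg) {
    return (unsigned char)std::max(0.0f, std::min(180.0f, 90.0f + deg));
  };

  while (true) {
    ohmd_ctx_update(ctx);
    float q[4];                                // x, y, z, w
    ohmd_device_getf(hmd, OHMD_ROTATION_QUAT, q);
    // Naive Euler extraction; the axis mapping may need swapping.
    float yaw   = std::atan2(2*(q[3]*q[1] + q[0]*q[2]),
                             1 - 2*(q[0]*q[0] + q[1]*q[1]));
    float pitch = std::asin(std::max(-1.0f, std::min(1.0f,
                             2*(q[3]*q[0] - q[1]*q[2]))));
    unsigned char frame[3] = { 0xFF,
                               toByte(yaw   * 57.29578f),
                               toByte(pitch * 57.29578f) };
    write(fd, frame, 3);
    usleep(20000);                             // ~50 Hz updates
  }
}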


2 3D cams & 1 2D cam on the DK2 @ Pi3

Cryptic title of a blog post, I know. But I urgently needed to try out my cameras with the Pi3 for the use on the Oculus Rift DK2 (video below). My last attempt to stream video locally into the Rift was successful, so I wanted more. The initial single 2D camera was a Logitech webcam C525 (for $60):

Logitech HD webcam C525

I have two stereoscopic cameras. The Minoru is “kind of” a cheap camera at about 70€, but then again not, because it’s just 2x 640×480 resolution… I stripped the camera down to reduce its size, weight and volume to actually use it on the Trashbot, since I know it is supported by Raspbian as a camera.

Minoru Stereoscopic Cam, around 70€
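As far as I know, the Minoru enumerates as two ordinary UVC cameras rather than one stereo device, so the quickest sanity check on Raspbian is grabbing both sensors separately. A minimal OpenCV sketch, assuming the two halves come up as device indices 0 and 1 (they may differ):

// Sketch: grab left/right frames from the Minoru's two UVC
// devices and show them side by side. Device indices assumed.
#include <opencv2/opencv.hpp>

int main() {
  cv::VideoCapture left(0), right(1);
  if (!left.isOpened() || !right.isOpened()) return 1;
  cv::Mat l, r, pair;
  while (left.read(l) && right.read(r)) {
    cv::hconcat(l, r, pair);          // 1280x480 from 2x 640x480
    cv::imshow("minoru", pair);
    if (cv::waitKey(1) == 27) break;  // Esc quits
  }
  return 0;
}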

I also already experimented with the Blackbird 2, an analog camera for streaming video from drones to video goggles. However, that attempt was not really successful, since the camera software on the PC (!) was laggy, and I suspected that it might be due to the Easycap video capture card (10-20€).

BlackBird 2, around 180 USD

But let’s see what happens when I run it on the Pi 3:

Learning: the 190€ combo (Blackbird 2 plus Easycap) beats the other two BY FAR in experience.

PS: In the video, you see that I attached the cameras to a different Pi than the one attached to the Oculus. This is because the Oculus Pi has display settings tuned towards the goggles, and I’d need to invest time to make the config switchable in software between the goggles and a real external display.

Oculus Pi untethered, pt. 2: a camera!

Last time, I was able to get the Oculus Rift DK2 to run on the Raspberry Pi 3, including the head tracking. However, the first interactions showed that it’s cumbersome to work with the desktop (since it’s not split across the two eyes but really uses the LCD as one screen) and also to use the keyboard.

Also, in the context of making the Oculus mobile and untethered, it is necessary to have a camera onboard, at least until I get the Xtion to work. Interestingly, it is not too easy to find software that can simply display the video stream of a local webcam; most blogs just describe how to stream from a remote webcam or make a local USB cam accessible via some web service.

My last attempt to get an analog stereo cam to work was not really satisfying, since the latencies on a PC plus some weird display software shipped as an .exe were not the ideal setting to really improve things.

So after some research, I found a git repository for streaming a local webcam to a dedicated view, independent of the desktop. You need to install CMake and libbsd-dev to make it run:

sudo apt-get install libbsd-dev
sudo apt-get install cmake
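I won’t reproduce the repository’s code here, but the core of such a viewer is small. As a rough point of comparison, here is my own minimal sketch with OpenCV; unlike the repo’s dedicated view, it still renders into a (fullscreen) desktop window:

// Sketch: fullscreen viewer for a local webcam with OpenCV.
// Not the code from the repository mentioned above.
#include <opencv2/opencv.hpp>

int main() {
  cv::VideoCapture cam(0);            // first local webcam
  if (!cam.isOpened()) return 1;
  cv::namedWindow("cam", cv::WINDOW_NORMAL);
  cv::setWindowProperty("cam", cv::WND_PROP_FULLSCREEN,
                        cv::WINDOW_FULLSCREEN);
  cv::Mat frame;
  while (cam.read(frame)) {
    cv::imshow("cam", frame);
    if (cv::waitKey(1) == 27) break;  // Esc quits
  }
  return 0;
}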

Also, I was able to mount the camera in a nice position without additional mechanics:

The cam fits nicely between the cables of the Rift, and there’s enough space on the straps to route cables.

Here’s the video walk-through with some live feed to see the latency:

Next, I may try to either position the video on one eye or even double the stream to both eyes. Or I may try to use the Minoru stereo cam that I was working on last year.
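Doubling the stream to both eyes is mostly a matter of putting the same frame side by side before display. A minimal sketch of that idea, assuming the DK2’s 1920×1080 panel (960×1080 per eye) and ignoring the lens distortion correction a proper Rift view would need:

// Sketch: duplicate one camera frame side by side so the same
// image lands on both eyes of the DK2. No lens warp applied.
#include <opencv2/opencv.hpp>

int main() {
  cv::VideoCapture cam(0);
  if (!cam.isOpened()) return 1;
  cv::Mat frame, eye, both;
  while (cam.read(frame)) {
    cv::resize(frame, eye, cv::Size(960, 1080)); // one eye's half
    cv::hconcat(eye, eye, both);                 // 1920x1080 total
    cv::imshow("rift", both);
    if (cv::waitKey(1) == 27) break;
  }
  return 0;
}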