Monthly Archives: November 2014

3d scanning trashbot for later comparison

so, over the last couple of days i invested more time into 3d scanning the first version of the trashbot before really getting into improving it. it turns out that scanning a 3d object using skanect is more complicated than i thought.

i’ll report three different methods here:

  1. build a rotating lego stand for the desktop
  2. walk around the object to scan
  3. build an arduino controlled stepper rotating stand

first, let’s see what the scene to scan looks like:

1) i thought i’d build a little rotating stand using lego, so i stole some legos from my kids and built something like this:

lego stand decomposed

lego stand composed

next, i scanned it with the motor running and went into the analysis:

2) next, i decided to put the robot on a stand that doesn’t move and move the camera around instead, also assuming that scanning from different heights might add information and give a more complete picture:

okay, i cut the video short so it doesn’t bore you too much. i think the result is more interesting:

(yeah, those are our garden plants in the background; we already had some snow in germany…)

then i found this skanect tutorial and thought that, obviously, a rotating stand was the way to go after all:

3) so i built a little stepper-motor / arduino driven rotating stand, so i could control the rotation speed much better than with the lego (check out this blog entry on a MUCH better stepper library for arduino), and isolated the object from nearby objects.
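to give an idea of how simple the stand’s firmware can be, here’s a minimal sketch assuming a library like AccelStepper and a cheap 4-wire geared stepper on pins 8-11 (pins, motor type and speeds are just placeholders, not necessarily my actual setup):

```cpp
// minimal sketch for a slow, constant-speed turntable (illustration only)
// assumes something like the AccelStepper library and a 4-wire stepper on pins 8-11;
// your motor, wiring and speeds will differ
#include <AccelStepper.h>

// HALF4WIRE works well for cheap geared steppers like the 28BYJ-48 (note the pin order)
AccelStepper stand(AccelStepper::HALF4WIRE, 8, 10, 9, 11);

void setup() {
  stand.setMaxSpeed(500);   // upper limit in steps per second
  stand.setSpeed(120);      // slow, constant rotation so the scanner can keep tracking
}

void loop() {
  stand.runSpeed();         // must be called as often as possible for smooth motion
}
```

the main point is keeping the rotation slow and steady so the tracking doesn’t lose the object.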

arduino stepper rotation

okay and here’s the result (sorry for the audio quality, don’t know what went wrong):

that’s currently my last attempt. i think for now, i got the best results by manually scanning the bot and not using the rotation / stepper.

TrashBot v1.0: my first four-servo biped robot that actually walks.

after the complicated start with ROFI, i decided to build my own biped from scratch but never really found the time and focus to do it. but then i came across “bob the biped” by bajdi over on letsmakerobots and i fell in love. immediately.

this guy has really done a great job designing a simple, working and cute biped that really is charming. the design principles fascinated me from the first second. he just used four servos, two on each side connected by an upside-down L. the foot servo is used to shift the center of gravity, the hip servo is used to move the bot forward. the other side works in a mirrored fashion, preparing the next move, so to speak.

also, the software is really nice as it produces the gait pattern parametrically, and he shared the source code as well. the gait pattern is actually one of the things that was too complicated with ROFI, especially since the original inventor’s design also included arduino, android and c# software.
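i won’t repost his code here, but the core of the mirrored four-servo gait can be sketched roughly like this (pins, center positions and amplitudes are made up for illustration; the signs of the offsets depend on how the servos are mounted, and this is not bajdi’s original code):

```cpp
// minimal parametric gait sketch (illustration only)
// assumptions: four standard servos on pins 2-5, all mechanically centered at 90 degrees
#include <Servo.h>

Servo leftFoot, rightFoot, leftHip, rightHip;

const int CENTER   = 90;   // neutral servo position
const int FOOT_AMP = 25;   // foot amplitude: leans the body sideways to shift the CoG
const int HIP_AMP  = 15;   // hip amplitude: swings the leg forward/backward
const int FRAMES   = 40;   // frames per gait cycle
const int FRAME_MS = 20;   // delay per frame -> cycle time = FRAMES * FRAME_MS

void setup() {
  leftFoot.attach(2);
  rightFoot.attach(3);
  leftHip.attach(4);
  rightHip.attach(5);
}

void loop() {
  for (int f = 0; f < FRAMES; f++) {
    float phase = 2 * PI * f / FRAMES;

    // the feet lead the hips by 90 degrees: lean onto one foot first,
    // then swing the unloaded leg forward with the hip servo
    int foot = FOOT_AMP * sin(phase + PI / 2);
    int hip  = HIP_AMP  * sin(phase);

    // both sides mirrored: while one leg supports, the other prepares the next step
    leftFoot.write(CENTER + foot);
    rightFoot.write(CENTER - foot);
    leftHip.write(CENTER + hip);
    rightHip.write(CENTER - hip);

    delay(FRAME_MS);
  }
}
```

the phase shift between feet and hips is the whole trick: lean onto one foot first, then swing the free hip forward.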

so, okay, i decided to build my own interpretation of bob’s core design. of course i don’t have a 3d printer yet, but to me the casing is just icing on the cake: i need to understand the mechanics, the center-of-gravity implications and the dynamics of the gait.

here’s how i started: one foot servo (a nice mg996r, metal-gear machine) connected to a hard-drive holder from an old pc:

feet and legs

next, i attached an L-shaped metal piece i had lying around to the servo horn of the hip:

leg and knee

then i found some metal pieces that fortunately had holes drilled at the right distance to fit the servos’ mounting holes:

metal hips

 

now i could do a test connection of legs and hips (in the end you have to take the assembly apart again to center the servos before finalising the build):

hip, servos and legs

 

now i could add the foot servos and voilà, we have a standing prototype:

standing prototype

of course the hard plastic of the feet is slippery, so i needed to add more “footage” and extend the standing area. i cut out some feet from polystyrene, which is hard enough to support stuff but soft enough to be cut with a knife (or actually scored and then broken):

rough cut outs of feet

 

cut out feet

and finally i attached the foot servos to them with little screws and L-shaped metal pieces i had found:

feet attached to the servos

 

 

next, it was time to build some electronics. this time i didn’t want to use the breadboard; i roughly knew what the layout should look like, so i chose circuit boards and connector strips i had found at amazon:

board and connectors

 

i cut them into the right pieces to fit the arduino nano, the four triple plugs of the servos, plus a lipo connection:

cut connector strips

 

 

minimal circuit design (the nano can be powered with up to 12v on vin and the lipo delivers 7.4v, so i can power the servos and the nano from the same battery; i left out the bec for the moment):

circuit

as you can see, the upper part gets power and gnd from the battery (on the lower part), which also powers vin and gnd of the arduino nano. here’s the front (battery input at the top, nano in the middle, servos at the lower right):

circuit front:

 

here’s the final bot from the side (a piece of polystyrene protects the circuit from shorts against the metal hip) and the lipo attached to the front. as you can see, he can stand on one foot, which is the most important feature…

first key learning: long legs make balancing the upper part delicate. not very elegant. this is why “bob” works so well: short legs, less risk of falling over…

final bot side

 

here’s mr trashbot from the front (electronics and lipo attached with a rubber band for fast component exchange and re-engineering. we’re learning!):

trashbot front view

 

finally, here’s the little guy walking. not too elegant, as the long legs and the heavy metal-gear servos make the legs shake. and there are some software issues, i guess (the number of animation frames vs the playback duration should be fine-tuned; letting him walk faster makes the shakes grow in amplitude and makes him fall over more easily).

 

i applied a couple of tweaks to the original source code. for example, i split the max angles the servos travel between hips and feet, so the feet can do larger angles than the hips. furthermore, i also gave the feet different max angles depending on whether they are up (less) or down (more). my feeling was that this way i would get the center of gravity closer to the standing foot’s center.
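just to illustrate the idea (the names and numbers are placeholders, not the original code):

```cpp
// illustration of the angle splitting (placeholder values)
const int HIP_MAX       = 12;   // hips swing less...
const int FOOT_MAX_DOWN = 28;   // ...the supporting foot leans more, to shift the CoG far enough
const int FOOT_MAX_UP   = 18;   // the lifted foot gets a smaller max angle than the supporting one

// pick the foot amplitude depending on whether that foot is currently up or down
int footAmplitude(bool footIsUp) {
  return footIsUp ? FOOT_MAX_UP : FOOT_MAX_DOWN;
}
```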

but there’s really room for experimentation here.

i think i’ll stiffen the legs next, possibly also add a gyro to the circuit and let the gait pattern adjust dynamically to the shaking of the hardware, so that he moves more slowly when the shakes become too strong. let’s see. finally, i may also add bluetooth for visualising the gyro’s recordings to learn a bit more.
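a rough sketch of what that gyro feedback could look like, assuming an MPU-6050 on i2c (the register addresses are the sensor’s standard ones, the thresholds are pure guesses):

```cpp
// rough sketch of gyro-based gait slowdown (assumes an MPU-6050 on i2c; illustrative only)
#include <Wire.h>

const int MPU_ADDR = 0x68;
int frameDelay = 20;               // ms per gait frame; larger = slower gait

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);                // PWR_MGMT_1 register
  Wire.write(0);                   // wake the sensor up
  Wire.endTransmission();
}

int readGyroZ() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x47);                // GYRO_ZOUT_H register
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 2);
  return (Wire.read() << 8) | Wire.read();
}

void loop() {
  int shake = abs(readGyroZ());

  // if the bot shakes too much, slow the gait down; speed up again once it calms down
  if (shake > 4000 && frameDelay < 40) frameDelay += 2;
  else if (shake < 1500 && frameDelay > 15) frameDelay -= 1;

  // ... run one gait frame here, then wait
  delay(frameDelay);
}
```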

or add knee servos. in that way, the lifted leg could retract a bit which would pull the center of gravity a bit more inwards. but it would also add weight. hm… decisions, decisions. what would you suggest?

 

make munich 14, pt 2: oculus rift and camera

one of the projects i’d also like to build some day is a camera mount that is moved by three servos along three axes. the servos are controlled by the head orientation data from the oculus rift, and the camera’s feed is displayed in the rift.

in this project, dynamixel servos were used, about 30€ apiece; they are networked serially, which reduces the complexity. however, only a 2d camera was mounted, i.e. just one camera chip instead of two, so probably no real 3d impression could be created in the rift.
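just to sketch the head-to-servo mapping with ordinary pwm servos instead of the dynamixels (pins, ranges and the serial input format are made up):

```cpp
// sketch of the head-tracking idea with plain PWM servos (the actual project used
// serially networked dynamixels; pins and the input format here are assumptions)
#include <Servo.h>

Servo yawServo, pitchServo, rollServo;

void setup() {
  Serial.begin(115200);    // head orientation arrives e.g. as "yaw,pitch,roll\n" in degrees
  yawServo.attach(9);
  pitchServo.attach(10);
  rollServo.attach(11);
}

void loop() {
  if (Serial.available()) {
    // parse three comma-separated angles sent by the host that reads the rift's tracker
    float yaw   = Serial.parseFloat();
    float pitch = Serial.parseFloat();
    float roll  = Serial.parseFloat();
    Serial.read();   // consume the trailing newline

    // map head angles (-90..90) onto the servo range (0..180), one axis per servo
    yawServo.write(constrain(yaw, -90, 90) + 90);
    pitchServo.write(constrain(pitch, -90, 90) + 90);
    rollServo.write(constrain(roll, -90, 90) + 90);
  }
}
```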

on top of that, a relatively slow laptop sat in between, so the latencies probably weren’t really convincing. a pity, actually, because in my opinion latency is considerably more important for a lively impression in virtual reality.

here’s the video:

 

this guy here took it a bit more seriously, just for comparison:

 

make munich 14, pt 1: urban farming

one of the most inspiring projects at make munich was “plants and machines”, who are developing an aquaponics system that is arduino compatible (a bit of raspberry pi is in the mix too) but connects the sensors via a bus to reduce the wiring complexity.

unfortunately the guys didn’t have a demo system with them, so the audience couldn’t really take notice of it. the current prototype apparently suffered a bit while travelling, and since the system is supposed to be commercially available in a few weeks, they didn’t want to scare anyone off with a half-working system. not to mention that the fish generally don’t travel well.

i think this is a very exciting project, and it also seems a bit more open to me than a similar project on kickstarter that failed this year (no aquaponics, “only” an automated greenhouse):

 

i’m always interested in projects like this. i also find public lab’s infrared camera exciting, which documents plant growth:

 

that thing is also lying around at my place and finally wants to be hooked up to a network-connected raspberry pi to document the demise of our garden…

anyone interested in more urban farming will also find material on my pinterest board.