Monthly Archives: October 2014

concepts for moving in virtual space while staying safe in the real world.

we all know the omni by virtuix, which allows for walking / running in virtual space while remaining on one spot in the real world. i’ve backed this project (i hope they will deliver soon) as the user experience seems conceptually well executed, using special shoes that can be tracked by the floor:

but just today i came across a “chair” that actually frees your body from the floor, i.e. it allows for moving more freely in the virtual space while the physical device is home-compatible, unlike “lawnmower man” kinds of interfaces. govert flint did this as a graduation project at the design school in eindhoven:

it seems that the chair has different ways of collecting data; i’ve read about accelerometers, and even potentiometers would probably do the job. on top of that, they added software to recognise gestures. i think we should go further than that and really use the whole body posture. that way, we could use the kinect to record the posture and simplify the design of the chair. it would be great to try that out (i do love the possibility to swing left and right, that’s going beyond a bar stool). actually, that guy should do a kickstarter project.
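just to make the idea concrete, here’s a minimal python sketch (all names and thresholds are made up by me, nothing from an actual kinect sdk) of how torso lean from a kinect-style skeleton could be mapped to movement commands:

```python
# hypothetical sketch: mapping upper-body lean (e.g. from kinect skeleton
# joints) to movement commands in virtual space. joint positions are
# assumed to be (x, y, z) tuples in metres: x = right, y = up, z = forward.

def lean_to_command(shoulder_center, hip_center, threshold=0.08):
    """derive a simple movement command from how far the torso leans."""
    dx = shoulder_center[0] - hip_center[0]   # sideways lean
    dz = shoulder_center[2] - hip_center[2]   # forward / backward lean
    if dz > threshold:
        return "forward"
    if dz < -threshold:
        return "backward"
    if dx > threshold:
        return "right"
    if dx < -threshold:
        return "left"
    return "idle"

# shoulders 12 cm in front of the hips:
print(lean_to_command((0.05, 1.4, 0.12), (0.0, 1.0, 0.0)))  # prints "forward"
```

with real skeleton data you’d low-pass filter the joint positions first, otherwise sensor noise makes the command flicker around the thresholds.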

btw: the original lawnmower man interface is akin to a gyroscope:

but that’s clearly not consumer compatible. so i’m sold on that chair thingy.

then there’s the zürcher hochschule der künste, where they let the user lie on a “bed” and use wings attached to the arms to fly like a bird. also quite inspiring but, in terms of practicality, again not for me at home:


what comes after whatsapp? in ten years?

a discussion just came up on google plus about whether whatsapp would still exist in 10 years. i allowed myself the thesis that we probably won’t be hacking text into phones anymore, whereupon i was asked what we would do instead. the answer somehow kept getting longer and longer, so i thought i’d write it up in my blog.

“no idea, otherwise i would be building it right now. but compare pinterest to facebook, for example: on pinterest you don’t have to write anything and can still express your “views” (sic!). besides, it’s not clear to me whether there will still be phones in 10 years. won’t we finally have dissociated the two use cases “browsing” and “calling” by then? i.e. a large display for browsing, yes, but a button in the ear for calling (one that tunes everything out when you’re not on a call)? so rather 7 inches or more in the pocket, as a tablet. hopefully rollable. or no tablet at all anymore, but an oculus rift as a pair of glasses (which turns transparent on demand)? ten years are so far away for me, i have no idea what can be developed in that time. we now have octo-core phones and several thousand shader cores in pc graphics cards. wearables live off sensors, but today they still have too few of them to be really meaningful or useful. how about self-adhering sensors over the wernicke and broca areas on top of the head? which record while we listen and speak? which, via deep learning, translate the brain signals ever better into text / audio / images / actions? keyboards are, viewed from a distance, the worst human-computer interface. we hominids took a hundred thousand years to learn to walk upright and thus free our hands, and now we use them to rest on a computer eight hours a day. isn’t that nuts?”

(oculus rift + leap motion) * unity3d = fun^2

okay, so i had some more time and actually tried to find out how to bring all the little hardware pieces together to form a working system. for the moment, it seems to me that the gaming engine unity3d is the best way to integrate most of the stuff i have, although there are some nice things with point clouds i’d like to do with the asus xtion sensor (built on technology from primesense, the company that developed the kinect together with microsoft).

but let’s start small, otherwise there’s just flat chaos. first of all, i’d like to show you what the current problem is and then describe how i got here. somehow i can’t find any information on the internet, so i think it’s worthwhile to write it up and get some other people’s minds around it.

here’s what i built in about ten iterations with unity3d, the leap motion and the oculus rift:

 

here’s me manipulating the cubes with the leap motion on the table and the hmd on my head. i had to remove the leap motion from the hmd (see my last post about taping it on) during experimentation, as at first i didn’t get any tracking and later the hands were somehow mirrored, so i decided to simplify the setup and get it running on the table first.

here are two things i learned in unity:

oculus leap unity3d

in the box (a prefab from the leap motion package), the rift camera has to be IN FRONT of the leap motion controller icon. the screenshot above shows the correct positioning from the top view (the leap motion icon is in the middle, to the right of the lamp icon). it may sound trivial, but it took me two hours with many different settings (the leap motion config menu, for instance, also allows for changing orientations, etc).

second learning:

z axis not mirrored

the z-axis must not be mirrored. secondly, i scaled the sandbox by 40; this allows you to move freely in the box, and your hands seem to be roughly in proportion to the cubes (default size from unity).
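for the curious, the mirrored hands are a handedness problem. a little python sketch (my own illustration, nothing from the leap sdk) shows that negating the z axis turns a right-handed coordinate frame into a left-handed one, which is exactly why the virtual hands look like mirror images of the real ones:

```python
# my own illustration (not leap sdk code): negating the z axis flips the
# handedness of a coordinate frame. a flipped frame turns every hand model
# into its mirror image.

def mirror_z(point):
    x, y, z = point
    return (x, y, -z)

def cross(a, b):
    # cross product; in a right-handed frame, x cross y = +z
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

x_axis, y_axis, z_axis = (1, 0, 0), (0, 1, 0), (0, 0, 1)
print(cross(x_axis, y_axis) == z_axis)            # True: frame is right-handed
print(cross(x_axis, y_axis) == mirror_z(z_axis))  # False: mirroring broke it
```

so if a config option mirrors z somewhere in the pipeline, the cross-product test fails and every right hand shows up as a left hand.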

one thing i still don’t know: how can i actually get a screen recording while i’m playing? this is unity’s output (ok, ok i did 11 iterations):

unity file outputs

“direct to rift” works for me with my hmd, but on the pc screen there’s just a black window (as in my second video above); without “direct to rift” i see a live window, but the hmd isn’t showing anything. of course, the hmd is then used as a second screen on the pc, and i could move the window to that screen, but then it won’t be full screen. the second problem is that the software captures the mouse for controlling navigation in the virtual space, so i can’t move the window as soon as the animation starts… any ideas? i haven’t found anything in the unity “build settings”.

here’s the software that i used:

i was inspired to do the cubes in the box by this video, a brief tutorial on unity3d and leap motion integration (but without oculus rift integration).

 

there’s a second very inspiring video that shows how to get unity running and install the oculus rift package for it (but no leap motion integration):

actually, a couple of things are easier now, as you can simply register right at first start after downloading unity3d.

Oculus Rift and the Leap Motion controller

okay, so i’ve tried the device on a couple of family members and since the virtual desktop example is fast enough for my current little computer, it was a great experience for everybody.

it seems, though, that the head tracking in the example only covers turning your head; somehow it’s not tracking your relative height, i.e. looking under the table doesn’t work.

Alzheimer’s Disease: amyloid-β vs tau tangles.

One hypothesis for why neurons die in Alzheimer’s disease is the accumulation of extracellular amyloid-β proteins. These proteins aggregate into plaques and consequently can no longer be dissolved and metabolised.

Here’s a great video (1+h) “Alzheimer’s Disease: From Genes to Novel Therapeutics”
NIH “Wednesday Afternoon Lecture” by Rudolph Tanzi (2011)

oculus rift shipping and what i plan to do with it!

oculus rift shipping!

YEAH! can’t wait to hook the oculus rift dk2 up to my wii fit board. i think that navigating virtual space via finger, keyboard, mouse, wii nunchuck, razer hydra etc. is suboptimal. you have to have your hands free to do the things that hands do in real space. walking is obviously not one of them…

so i would like to stand on the board and read out the pressure sensors of the wii fit board to accelerate ahead and steer left / right just by shifting my body, similar to riding a skateboard.
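a rough python sketch (function names, units and the dead zone are my own invention, not the actual wii fit protocol) of how the board’s four load cells could be turned into throttle and steering:

```python
# hypothetical sketch: turning the wii fit board's four load cells into
# accelerate / steer commands. inputs are the weights (kg) on the four
# corners: top-left, top-right, bottom-left, bottom-right.

def centre_of_pressure(tl, tr, bl, br):
    """normalised centre of pressure: x in [-1, 1] (left/right),
    y in [-1, 1] (heels/toes)."""
    total = tl + tr + bl + br
    if total == 0:
        return (0.0, 0.0)
    x = ((tr + br) - (tl + bl)) / total   # positive = leaning right
    y = ((tl + tr) - (bl + br)) / total   # positive = leaning forward
    return (x, y)

def board_to_command(tl, tr, bl, br, dead_zone=0.1):
    """map the centre of pressure to (throttle, steering), ignoring small
    shifts inside a dead zone so standing still means standing still."""
    x, y = centre_of_pressure(tl, tr, bl, br)
    throttle = y if abs(y) > dead_zone else 0.0   # lean forward to accelerate
    steering = x if abs(x) > dead_zone else 0.0   # lean sideways to steer
    return throttle, steering

# leaning forward: positive throttle, zero steering
print(board_to_command(25, 25, 10, 10))
```

the dead zone matters: without it, the natural sway of just standing on the board would already move you through the virtual space.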

and the leap motion: it is of utmost importance to use your hands in the space, and leap motion did a great pivot recently when adapting their SDK to the oculus rift, i.e. adding support for finding your hands even when the leap motion controller is not on the table but attached to your helmet. can’t wait to see this. plus, the leap motion controller / sdk has built-in support for gesture recognition, so you can easily implement a couple of basic interactions with objects in the virtual space.
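the sdk’s built-in recognisers aside, here’s a naive python sketch (entirely my own, not the leap api) of what a minimal swipe detector over palm positions could look like:

```python
# hypothetical sketch of a naive swipe detector, independent of the leap
# motion sdk's built-in recognisers: a swipe is a palm movement that covers
# enough distance at sufficient speed along one axis.

def detect_swipe(samples, min_distance=0.15, min_speed=0.5):
    """samples: list of (t_seconds, x_metres) palm positions along one axis.
    returns "left", "right" or None."""
    if len(samples) < 2:
        return None
    t0, x0 = samples[0]
    t1, x1 = samples[-1]
    dt = t1 - t0
    if dt <= 0:
        return None
    distance = x1 - x0
    speed = abs(distance) / dt
    if abs(distance) >= min_distance and speed >= min_speed:
        return "right" if distance > 0 else "left"
    return None

# palm moving 20 cm to the right in 0.2 s
print(detect_swipe([(0.0, 0.0), (0.1, 0.1), (0.2, 0.2)]))  # prints "right"
```

real recognisers are of course smarter (they segment the gesture and check straightness), but the distance-plus-speed idea is the core of it.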

and then there are my two asus xtions; the guy below uses three kinects. it will be a great thing to fuse the real world and the virtual world. i’d love to track my body via openni and / or see myself in the virtual space. the rift’s head tracking and possibly the xtion’s spatial position tracking will allow me to walk in space but also to have real out-of-body experiences in the helmet. soo cool, check out the video below:

 

BRING. IT. OOOOON!

next iteration of the quadruped lego walker

this is the initial modification of my original walker design, now with a spine that twists between the front and the hind legs so that the walker can change direction. also, i added the power functions IR remote control.

i find it quite problematic that the power functions motors and servos are all three studs long; imho they should be four. you always have to make the design more complex than necessary.

the spine also has to become more elegant and reduced, but at the same time more reliable. it’s also becoming obvious that we need more grip at the feet…

epigenetics introductory collection

To understand what epigenetics is, we need to understand what genes are and how they determine who and what we are. Luckily, when I started to dive into this topic, I found this wonderful BBC documentary that is worth every minute.

 

In the video, the story of Överkalix caught my interest and I simply had to dig deeper into this topic. Let’s recap what we just learned in the video: