
Update.

Just an update - while I had originally approached this with the intention of releasing the code as open source, my findings regarding...well, various aspects of this project (and, relevant to that aim, the code itself) mean that I'm putting software development on the back burner for the next few weeks while I run a study into how people naturally perform gestures. I'm also looking at options for improving certain show-stopping issues with the system (primarily the limited FOV of the webcam).

Any code that does emerge from the project, at least for version 0.1, is unlikely to be very robust, but I think that can be overcome: I'm currently thinking that broad colour segmentation followed by some form of object matching technique (e.g. SIFT/SURF) should give a fairly robust and reasonably fast algorithm for marker detection (a rough sketch of that pipeline is below). However, if the FOV problem can't be solved, I actually think that ANY vision-based system is inappropriate for this sort of interaction style.
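
To make the idea a bit more concrete, here's a minimal sketch of that two-stage pipeline in Python with OpenCV. It is not the project's actual code: the HSV colour range, the reference image "marker.png" and the ratio-test threshold are all placeholder assumptions, and SURF could stand in for SIFT depending on what your OpenCV build exposes.

import cv2
import numpy as np

def find_marker(frame, marker_descriptors, sift, lower_hsv, upper_hsv):
    # Stage 1: broad colour segmentation - cheaply throw away everything
    # outside the marker's colour range before doing anything expensive.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)
    candidate = cv2.bitwise_and(frame, frame, mask=mask)

    # Stage 2: object matching - SIFT features on the segmented region,
    # matched against descriptors from a reference image of the marker.
    gray = cv2.cvtColor(candidate, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = sift.detectAndCompute(gray, None)
    if descriptors is None:
        return []

    matcher = cv2.BFMatcher()
    matches = matcher.knnMatch(marker_descriptors, descriptors, k=2)

    # Lowe's ratio test to discard ambiguous matches.
    good = []
    for pair in matches:
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])
    return good

# Assumed setup (the marker image and green-ish HSV range are placeholders):
# sift = cv2.SIFT_create()
# marker_img = cv2.imread("marker.png", cv2.IMREAD_GRAYSCALE)
# _, marker_descriptors = sift.detectAndCompute(marker_img, None)
# lower_hsv = np.array([35, 80, 80]); upper_hsv = np.array([85, 255, 255])

The colour segmentation is only there so the comparatively expensive feature matching runs on a small candidate region rather than the whole frame, which is where the "reasonably fast" part would have to come from.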

Yes, that's a bit damning; however, I am doing HCI research here, not computer vision...and that doesn't mean I don't have other tricks (literally) up my sleeve :)
