
Some observations/questions

The study is delayed since I think I can make progress with the prototype and answer some of my questions while opening up new ones :/ I'm glad I know this sort of last-minute thing is quite common in research, or I might be panicking (3 months to go, omg!).

I'm still having problems with marker tracking due to varying lighting conditions. At home, my "reliable" green marker doesn't like my bedroom but is great downstairs and in my office. Blue, red, and yellow all tend to suffer from background noise. I may have to try pink! Basically, I know that colour-based segmentation and blob tracking is a quick and easy way of prototyping this, but in the real world? Terrible!
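For reference, the segmentation step amounts to thresholding in HSV space, which is exactly why lighting hurts: hue survives illumination changes reasonably well, but saturation and value don't. A minimal per-pixel sketch using only the standard library (the hue band and floor values here are assumptions for illustration, not the prototype's actual tuning):

```python
import colorsys

def is_green_marker(r, g, b,
                    hue_band=(0.25, 0.42),  # assumed green band; tune per scene
                    min_sat=0.4, min_val=0.35):
    """Classify one RGB pixel (floats in 0..1) as marker-coloured or not."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return hue_band[0] <= h <= hue_band[1] and s >= min_sat and v >= min_val

# A well-lit, saturated green passes the threshold...
print(is_green_marker(0.1, 0.9, 0.1))     # True
# ...the same marker in a dim bedroom falls below the value floor...
print(is_green_marker(0.05, 0.25, 0.05))  # False
# ...and under harsh light it washes out below the saturation floor.
print(is_green_marker(0.6, 0.8, 0.6))     # False
```

The dim-room and washed-out cases are precisely the bedroom-vs-office failures above: the hue is still "green", but fixed saturation/value floors reject it in one room and pass it in another.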

If using dynamic gestures, what are the best symbols to use? In fact, is any semiotic system useful for gesture interaction? One could also ask whether symbolic gestures are really that useful for a wearable system at all...

Where should a camera point? i.e., where should its focus be?
I've found myself starting gestures slightly left of any central line, so primarily using the right side of my body; how true does this hold for other people? Is there a specific camera angle that is useful? These questions are for the study.

I was trying out a mockup of the interface on my palm, and while I know the "natural" interaction style would be to make any projected UI elements into touch elements, my tracking just can't support it (in fact I have to wonder how anyone has done that, since it sends histogram-based trackers, e.g. CamShift, insane). Hence there is a gulf between the input modality's location and that of the visual output modality, which I don't believe represents an effective interaction paradigm. I've not tried it on other surfaces; I want to get a little further with the UI first.
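For what it's worth, the "insane" behaviour makes sense once you look at what these trackers actually do: CamShift is mean shift run over a histogram backprojection, so projecting UI elements onto the palm changes the colours under the track window and corrupts the very signal being chased. A toy mean-shift step in pure Python (the synthetic grid and names are mine, purely for illustration):

```python
def mean_shift(prob, cx, cy, half=2, iters=20):
    """Shift a (2*half+1)-square window to the weighted centroid of the
    backprojection probabilities under it, until it stops moving."""
    h, w = len(prob), len(prob[0])
    for _ in range(iters):
        sw = sx = sy = 0.0
        for y in range(max(0, cy - half), min(h, cy + half + 1)):
            for x in range(max(0, cx - half), min(w, cx + half + 1)):
                p = prob[y][x]
                sw += p; sx += p * x; sy += p * y
        if sw == 0:          # no marker-coloured mass under the window: track lost
            break
        nx, ny = round(sx / sw), round(sy / sw)
        if (nx, ny) == (cx, cy):
            break            # converged on the local mode
        cx, cy = nx, ny
    return cx, cy

# Synthetic 12x12 backprojection with a blob peaking at (x=7, y=6).
prob = [[0.0] * 12 for _ in range(12)]
for dy in (-1, 0, 1):
    for dx in (-1, 0, 1):
        prob[6 + dy][7 + dx] = 1.0
prob[6][7] = 9.0
print(mean_shift(prob, 5, 5))  # (7, 6)
```

Anything that rewrites `prob` under the window, which is exactly what projecting a UI onto the tracked surface does, pulls the centroid wherever the projected colours happen to score, so the window wanders off the hand.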


Still hopeful of putting together another demo video by Monday.
