This post is just a short collection of my thoughts on the ideas I have proposed for this challenge.
By the way, this week I received the kit from element14, and like everyone else I was very excited to unpack it and start working with it.
I am not going to describe the kit contents, as they are already listed on element14 and I don't want to duplicate that information here. All I am going to give is a link where you can find the details about these kits.
If you haven't read about my concepts yet, here is a snapshot of them:
- My first idea comes from the scene where Tony Stark interacts with his computers using hand gestures: he picks up a file from one monitor, drops it onto another, and starts interacting with it there. In my project I would love to do something similar with images first, and if possible extend it to videos and other supported file types. I feel it's a very cool project.
- My second idea is a surface table: we can find these in many movies, from Mission Impossible to The Amazing Spider-Man to, once again, Iron Man. Those are mostly touch-based, but I am looking to use the Microchip MGC3130 GestIC kit I have from element14 to interact with displays. Since I don't have such a big monitor, I would use my laptop display as the surface display.
- My third idea is a wrist computer: if time permits I would love to implement one, just like the personal assistants we have seen in movies. A Raspberry Pi + PiFace (though it's bulkier) would do this job. If possible, GPS integration with this computer would also enable a door-unlock mechanism based on my location, with the actual lock/unlock handled by a Gertboard.
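To make the location-based unlock idea in the third concept a bit more concrete, here is a minimal sketch of the decision logic: unlock only when the GPS fix is within a small radius of home. The coordinates, radius, and function names are all made-up example values of mine, and the actual Gertboard GPIO control is not shown.

```python
import math

# Hypothetical home location and unlock radius -- example values only
HOME = (51.5007, -0.1246)   # (latitude, longitude) in degrees
UNLOCK_RADIUS_M = 30.0      # unlock only within 30 metres of the door

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))  # Earth radius ~6371 km

def should_unlock(fix):
    """True if the current GPS fix is close enough to home to unlock."""
    return haversine_m(fix, HOME) <= UNLOCK_RADIUS_M

if __name__ == "__main__":
    print(should_unlock(HOME))             # True: standing at the door
    print(should_unlock((51.52, -0.12)))   # False: a couple of km away
```

In a real build, `fix` would come from a GPS module on the Pi, and `should_unlock` would gate the Gertboard output that drives the lock.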
IRON MAN Computer Interactions -
To start with, I am very excited to try the Iron Man movie computer interactions, and I am sure it will be the coolest thing to see. The kit includes two sensor boards: the XTRINSIC SENSE BOARD from Freescale and the MICROSTACK ACCELEROMETER board. The initial task is to recognise some gestures with these sensors; I will be posting the implementation details in the next few blogs. Once I am able to detect some natural gestures, the next step is to map each gesture to a computer action. There would be wireless communication between the interactive computer and the module worn on the wrist or in the palm.
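The detect-then-map pipeline described above can be sketched in a few lines. This is only an illustration under my own assumptions: a real detector would read (x, y, z) samples in g from the XTRINSIC or MICROSTACK accelerometer driver and analyse the full waveform, while here a simple magnitude threshold stands in for gesture recognition, and the action names are placeholders.

```python
import math

# Assumed threshold: accelerations above ~2 g count as a deliberate "flick"
FLICK_THRESHOLD_G = 2.0

def magnitude(sample):
    """Overall acceleration magnitude of one (x, y, z) reading in g."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_gesture(samples):
    """Crude stand-in for gesture recognition: any sample over the
    threshold is called a 'flick', otherwise the hand is 'idle'."""
    if any(magnitude(s) > FLICK_THRESHOLD_G for s in samples):
        return "flick"
    return "idle"

# Map each recognised gesture to a computer action (placeholder names);
# the mapped command would be sent over the wireless link to the PC.
ACTIONS = {
    "flick": "move_image_to_other_screen",
    "idle": None,
}

if __name__ == "__main__":
    still = [(0.0, 0.0, 1.0)] * 5          # resting: ~1 g on the z axis
    flick = still + [(2.5, 0.5, 1.0)]      # sudden sideways acceleration
    print(ACTIONS[detect_gesture(still)])  # None
    print(ACTIONS[detect_gesture(flick)])  # move_image_to_other_screen
```

The interesting work is replacing the threshold check with recognition of natural gestures, which is what the next few blogs will cover.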
The task described above will be my first task in this challenge, and I will post updates as I accomplish each step.
If you are still confused about what I will really be doing with the Iron Man computer interactions, then wait for my video.