
Aiko Chihara may have silicone skin but her arm movements are fluid enough for sign language (via Toshiba)

Aiko Chihara is looking for a job in the welfare or healthcare fields (or both) in the near future; however, she will need some software refinements before that happens. Like much of the younger generation, she, too, is working a temp job before she can advance into the career she was born for. Until then, she is working for tech giant Toshiba, manning the company's reception desk at its booth at this year's CEATEC 2014, held in Japan.


Aiko is unique, though, and not like everybody else looking for a career in the fields mentioned above. Instead of being flesh and blood, she's an android: internal circuitry and a mass of servos all wrapped up in silicone-based skin. Toshiba put her on display at this year's annual electronics trade show in Japan. What sets her apart from most other robots is the uncannily fluid movement of her upper limbs, which allows her to perform sign language while speaking fluent Japanese.


The android is actually the product of several different tech organizations: Shibaura Institute of Technology and Shonan Institute of Technology provided the robot's motion-sensing and teaching technologies; aLab Inc. and Osaka University created the robot's body and head, capable of showing human expressions; and Toshiba designed the algorithm that gives fluid motion to the 43 actuators in the android's joints. For now, Toshiba will use Aiko at trade shows and exhibitions coming up next year, but it hopes to develop the android further for use in welfare centers as well as healthcare institutions.



See more news at:


The McFly Hoverboard and High-top combo from ZBoards. Can these kicks work today? (via Zboards)

The latest from ZBoards is enough to get an old DeLorean purring with glee as it takes a classic ride back to the future. Ever want to feel the wind flowing through your hair as you ride a hoverboard like Marty McFly? Obviously, you'll be wearing your acid-wash jeans and shell jacket to match. Well, now you actually can. One lucky winner will get the whole ensemble, complete with hoverboard, high-tops, and more, for free!


The rest of us will have to settle for the hoverboard and replica McFly high-top shoes. Of course, you can always visit the local thrift shop to find your own 80's outfit that will make others sneer with envy as you zoom past on your hoverboard as fly as McFly. Oh yeah, they'll be jealous, maybe – if they are, in fact, old enough to appreciate the reference.


The ZBoards McFly Limited Edition hoverboard and high-top combo costs $700; the board alone will set you back about $600. This ZBoard is electric pink in design and, most importantly, literally electric. I must say, the design of this electric board is pretty awesome. The drive kit is housed underneath the deck. To ride, you press the front foot pad to accelerate and the back foot pad to brake. The ZBoard also comes with motion sensors that let you steer by shifting your weight and balance slightly. Hence, all you have to do is keep yourself balanced on the board, which may be easier to do in Marty McFly high-tops.


The top speed of the board is 17 mph, and the battery is supposed to last 5 miles on the highway (hopefully not literally on the highway) and 3 miles in the city. So, basically, if you can cruise at a steady pace, your battery should last longer. The battery takes about 5-6 hours to charge, which isn't great if you want to go on a longer trek, but it could get you to work in time for a top-up to fuel the ride back home. The board also comes with a handle for easy carrying when the battery dies on you and you're too lazy to ride it the old-fashioned way.


An electric backpack skateboard appeared on Kickstarter a couple of months ago, but it didn't have as sleek a design as this ZBoard model. Then again, this ZBoard is pretty expensive for a skateboard when you could get a really decent bike for the same price. But you wouldn't look as cool on a bike as you would cruising down the suburban streets on a McFly hoverboard.


If you live in Chicago, you can check out the Back to the Future DeLorean at the Wormhole Café in Wicker Park as you strut your stuff in full McFly gear. You would become a sort of hipster god in the eyes of the patrons. Then you can laugh and roll away with a cappuccino in hand: mission accomplished. Great Scott! You are too cool for school.



See more news at:


Carvey (via Kickstarter)

The Carvey is a CNC router that carves your design into wood, soft metal, or plastic. What's more, it's so simple a 5th grader could probably use it... or at least that's what the video suggests. The campaign blew up on Kickstarter, reaching its $50,000 goal in 1.5 hours! The Carvey campaign is now huge, having raised over $650,000 in about ten days (I know, I wish I were them right now). The early bird Carveys are all gone at $1,999; however, you can still score a Carvey of your own for $2,399 and up.


This CNC router looks pretty polished, so I guess there is some truth to the advantage of not being first to enter a market: I imagine the team has been able to learn from the mistakes and follies of others. While carving into plates of material is not a new concept (engraving is an art that has existed for ages), doing it in your own home is. The Carvey carves material away from objects rather than building them up through an extrusion process like a 3D printer does. But the important thing is that it works and seems able to make some pretty cool things. I remember using a laser cutter to make inlay designs in a wood panel... the issue was that the laser charred the inlay unevenly, making it look like it had accidentally caught fire at one point. Not the look I was going for.


With this 3D carver, you don't have that problem. It seems simple enough to use and creates clean, sleek designs. However, it must be noted that the Carvey doesn't actually carve in full 3D yet. The Kickstarter campaign notes that, although the Carvey hardware could carve in 3D, the software only supports 2.5D at the moment. The software paired with the Carvey is called Easel, and it is free to download, so at least you won't have to spend extra on that. Easel combines CAD, CAM, and machine control into one intuitive package that makes it easy to just hit print. The Carvey also calibrates itself, so you don't have to worry about the exact positioning of the workpiece.


Although I don't necessarily have a use for it, I kind of want one. If you are an arts-and-crafts maker or a carpenter, I imagine this will come in handy and actually pay for itself after a while. And even without a practical use for it, you could make the best gifts in town.


Carvey's build area is 12”x8”x2.75”, and the milling bit can be swapped for a variety of sizes. The controller and firmware are open source, so you can program the Carvey yourself and use your own software if you wish. Overall, it seems pretty cool: an easy plug-and-play solution for ultimate usability.



See more news at:


The Air Umbrella: using air to protect against water (via Kickstarter)

A Chinese inventor named Chuan Wang has drastically reinvented the umbrella. In late September, Wang headed a successful Kickstarter campaign for his Air Umbrella. The Air Umbrella looks like a wand and deflects water away from the user with high-pressure air, acting like an umbrella. It seems like a cool, futuristic idea, except for the poor person who happens to be standing next to you: they'll be pissed and soaking wet.


His original Kickstarter goal was $10,000; Wang raised over $86,000 for his idea. Now the idea seems set to become reality, with the Air Umbrella slated for release in December 2015. It will cost between $88 and $108, depending on which size wand you opt for. The Air Umbrella currently comes in three sizes. The smallest one can fit in a purse at only 30 cm long, but its battery lasts only 15 minutes. The largest Air Umbrella is 80 cm long and runs for 30 minutes. Needless to say, you shouldn't count on taking a nice stroll through the park with this thing, but you could possibly make it to the nearest corner store and back before it gives out on you.


I imagine this was created for the ideal metropolitan American who only walks from the house to the car and from the car to the office.


The Air Umbrella works via a strong motorized fan inside the top of the wand. The fan draws air in at the bottom of the wand and expels it from the top vents at high speed, creating an invisible umbrella made of air. The bottom of the wand has an on/off button and also controls the air pressure, for those light misty rains that don't require full throttle.


This design was created by Wang, together with a group of postgraduates in the Aeronautics and Astronautics departments at Nanjing University and Beijing University. Who knows, this idea may catch on and everyone will have an Air Umbrella.

Its niche is that it can easily fit in a purse or bag, and you don't need to worry about strong winds breaking your umbrella. While it does seem like a clever idea, I have a few umbrellas that still work just fine and cost significantly less than $88. What puts me off most is the battery life. You'd probably have to keep a few emergency umbrellas around for when your Air Umbrella runs out of juice... then you'll just be left in the middle of a storm, soaking wet, holding a phallic-shaped object in your hands.


Wang says he is currently working on increasing the battery life. I hope he does; this may be one of those inventions that becomes perfect with version 2.0. If you don't mind the short battery life, give the Air Umbrella a whirl in December (pun intended). If nothing else, it would be perfect for LARPing (live-action role-playing) in and out of the rain.



See more news at:


The previous post is at: [Building a quadcopter] [Part 1]


In the last post, I linked to various resources, bought the basics, and tested the 'air' by getting the motors to run. In this post, I will share my experience building my own frame.


The frame


Since the BLDCs and ESC tested OK at this point, I wanted to start with the frame. A look online shows that those carbon fibre frames are beautiful. Unfortunately, they are also very expensive, and for those who know me, I get an allowance that keeps my shopping habits in control. Hence, the next best thing is to build one. Easier said than done: what do I build it with? I have a lot of scrap wood lying around, so that would be my material of choice. But before we start, we need instructions, or at least some design rules to work with.


Using Google, I found these.


The interesting stuff is the dimensions. I was not able to find the exact theory behind frame design, but I did get an idea of what could be right. Here is a drawing from the links above.




I am using 10-inch props, so this design could have been used as is. But I increased the distance between props from 10.5 in to 16 in, because I wanted to give the electronics a little more room and wanted to experiment with larger frame dimensions. In my mind, if I increase the inter-prop distance, I will gain maneuverability: sensitivity to small changes in prop RPMs (Newton's levers, right?). On the other hand, I will lose stability. That's OK, since I am going to experiment, and if I find it to be too unstable, I can always shorten the arms.


With that, I took some scrap wood and sandwiched the joints in plywood in the center. The result is...


The motors needed to be mounted, hence a bit of drilling was necessary. I also made some recesses for the motor shafts: applicable in this case, but it may not be necessary for all motors.





Some zip ties and we have the ESC in place.


That's good. Now I need some landing gear. For that, I had some 8 in PVC pipe in the scrap, so I cut some rings out and stuck them to the arms of the quad. The result is easier to show than explain...


With that, we have a frame. I mounted the test electronics on the frame, wired a long lead to a pot, and tried to see if it could lift. Here is the result.

As you can see, we need a brain for this thing. In the next post, I will try some sensors and see if we can design the control logic for this thing.





RomoCart uses an overhead pico projector and RGB depth sensor, allowing users to play AR games (via 河本の実験室)


Romo debuted back in 2011 and has since gone through several revisions, giving it a sleeker design and better mobility. For those who don't already know what Romo is, it's a mobile dock from Romotive with tank-like treads that uses an iPhone as its brain, essentially turning the smartphone into an interactive robot. While using the iPhone for a robot is great in itself, a pair of Japanese hackers (Ken Kawamoto and Tomoaki Akitomi) has taken it to an all-new level by turning it into an AR gaming platform. Their RomoCart works with an overhead RGB depth sensor and pico projector that turn the room it's in into an AR game suspiciously similar to Nintendo's Mario Kart.



The RomoCart uses its RGB depth sensor to map the layout of obstacles and the position of other robots.


The gaming platform works by using the RGB depth sensor to scan the locations of floor obstacles in the room; it then generates an optimal racetrack, which is displayed with the pico projector. The platform also tracks the other players with the sensor and projects the game environment based on their locations. Just like in Mario Kart, power-ups positioned on the AR track can boost speed, fire projectiles, and even drop bananas to make other players spin out and wreck. The RomoCart gaming platform isn't currently available for those interested in replicating it; however, the pair plans to release the source code when they get around to it sometime in the future.


An ASUS Xtion sensor was used to garner the information the RomoCart needs to map out the track.




Ever since I saw a quadcopter for the first time, I have wanted one of my own. Unfortunately, I am financially, err... 'less apt' to actually buy one, and imports cost an arm and a leg. Hence I set out to build my own, either buying parts from the cheapest source or extracting what I could from scrap. I also blog about this stuff independently at and you are welcome to check that out as well.

This post is the first in a series where I will document the process I went through to make an actual flying quadcopter. I am open to suggestions and will incorporate what you give me. So let's get started.




A quadcopter is... "A quadcopter, also called a quadrotor helicopter or quadrotor, is a multirotor helicopter that is lifted and propelled by four rotors. Quadcopters are classified as rotorcraft, as opposed to fixed-wing aircraft, because their lift is generated by a set of rotors (vertically oriented propellers)." [1]


Simply put, it has four rotating blades that collectively produce thrust to lift the whole thing up. Two rotate clockwise and two anticlockwise, so the craft does not keep spinning. The interesting part is that all four rotors must be continuously controlled in speed for the system to stay stable in the air. It is not as simple as setting each to the same speed, since any imbalance in weight will cause the craft to drift to one side. Hence it is a control system whose inputs are the craft's orientation (tilt), movement, and acceleration, and whose outputs are the speeds of the four motors. I will discuss the theory in later sections, but first let's get some basic hardware out of the way.
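The control idea can be sketched as a simple motor mixer: the flight controller computes roll, pitch, and yaw corrections (from a PID loop, say) and distributes them across the four motors. This is a generic textbook sketch; the X-configuration layout and sign convention are assumptions, not something from this build:

```cpp
#include <algorithm>
#include <array>

// Minimal motor mixer for an X-configuration quad. Motors 0/2 spin one
// way and 1/3 the other, so the yaw correction enters with opposite
// signs on the two pairs. All commands are normalized to [0, 1].
std::array<float, 4> mixMotors(float throttle, float roll, float pitch, float yaw) {
    std::array<float, 4> m;
    m[0] = throttle + roll + pitch - yaw;  // front-left
    m[1] = throttle - roll + pitch + yaw;  // front-right
    m[2] = throttle - roll - pitch - yaw;  // rear-right
    m[3] = throttle + roll - pitch + yaw;  // rear-left
    for (float &v : m) v = std::min(1.0f, std::max(0.0f, v));  // ESC range
    return m;
}
```

With zero corrections, all four motors get the same command; a positive roll correction speeds up one side pair and slows the other, tilting the craft.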


The necessary

So what is required to build one? Here is the list.

  1. A frame: where you mount everything. Wood, carbon fiber, or anything else.
  2. Motors and propellers: what take you up!
  3. Battery: the juice.
  4. Speed controllers (ESCs): to control the motors.
  5. Control board: the brain.
  6. Radios: to communicate remotely.
  7. Helmet and bandages: for when things go wrong.


Selecting the hardware


There are a gazillion guides out there that tell you which motors, rotors, frame, and all that stuff to pick. Why so many? Because it is an art where you select what is right for you.

"A rule of thumb is Required Thrust per motor = (Weight x 2) / 4."[2] There are a number of things you need to consider, and the linked guide is a great place to start your research on motors and propellers. I chose the 10x4.5 DJI Style Props for multirotors, which I got from ebay. For the motors, again, there are a number of options. Motors are rated by their kV value (RPM per volt, not kilovolts): the higher the kV rating, the faster the motor spins at a given voltage.[3] I went with the EMAX MT 2213 935KV Brushless Outrunner Motor (For Multirotors), again from ebay. This prop and motor combo was recommended by a lot of people.
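Plugging numbers into that rule of thumb is straightforward; the 1 kg all-up weight in the usage note below is an assumed figure for illustration, not from this build:

```cpp
// Rule of thumb quoted above: required thrust per motor = (weight x 2) / 4,
// i.e. the four motors together should be able to produce twice the
// all-up weight, leaving a 2:1 thrust-to-weight margin for control.
double thrustPerMotorGrams(double allUpWeightGrams) {
    return (allUpWeightGrams * 2.0) / 4.0;
}
```

For an assumed 1 kg (1000 g) all-up weight (frame, battery, electronics), each motor needs to produce at least about 500 g of thrust.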



To control these, we need ESCs (electronic speed controllers), which are usually an AVR microcontroller plus MOSFETs that sequence out the pulses for BLDC motor operation. BLDCs are brushless motors that work somewhat like stepper motors, but we can run them at several thousand RPM. You can buy a single ESC for each motor, but I bought one 4-channel unit: the EMax 4X25A Brushless Quadcopter SimonK ESC (SimonK Firmware).
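The "sequence out pulses" part can be illustrated with the classic six-step (trapezoidal) commutation scheme that BLDC ESC firmware implements in some form: at each step, one phase is driven high, one low, and one floats. This is a generic textbook table, not SimonK's actual code:

```cpp
#include <array>

// Six-step commutation for a 3-phase BLDC: at each of the six electrical
// steps one phase is driven high (+1), one is pulled low (-1), and one
// floats (0). Stepping through the table in order rotates the stator
// field. Phases are ordered {A, B, C}.
constexpr std::array<std::array<int, 3>, 6> kCommutation = {{
    {+1, -1,  0},  // step 0: A high, B low, C floating
    {+1,  0, -1},  // step 1
    { 0, +1, -1},  // step 2
    {-1, +1,  0},  // step 3
    {-1,  0, +1},  // step 4
    { 0, -1, +1},  // step 5
}};

// Drive state for a given electrical step (wraps around every six).
std::array<int, 3> phaseDrive(int step) {
    return kCommutation[((step % 6) + 6) % 6];
}
```

The faster the firmware advances through this table (timed off the floating phase's back-EMF in sensorless ESCs), the faster the motor spins.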


The next obvious part is power. I initially thought I would run the motors from a bench PSU, but I was wrong: the ESC is rated at 25 A for a reason, and I don't have that kind of power supply. Instead, I bought the cheapest LiPo battery above 2000 mAh I could find on ebay: a Turnigy 2200mAh 3S 20C Lipo Pack (with XT60 Connector).


I'll cover the rest of the hardware details as we go along.


Testing the air


With that, I wanted to check whether things would work, so I set the ESCs up with an Arduino. The code is pretty simple.


#include <Servo.h>

Servo esc1, esc2, esc3, esc4;
int throttlePin = 0;   // potentiometer wiper on analog pin 0
char ser;
int throttle;

void setup() {
  // ESC signal wires on PWM-capable pins (pin choice shown here is an
  // example -- adjust to your wiring)
  esc1.attach(3);
  esc2.attach(5);
  esc3.attach(6);
  esc4.attach(9);
  Serial.begin(9600);
}

void loop() {
  // Throttle from the pot, scaled to the Servo library's 0-179 range
  throttle = analogRead(throttlePin);
  throttle = map(throttle, 0, 1023, 0, 179);

  // Optional serial override: '1' = full, '5' = half, '0' = stop
  if (Serial.available()) {
    ser = Serial.read();
    if (ser == '1') {
      throttle = 179;
    } else if (ser == '5') {
      throttle = 90;
    } else if (ser == '0') {
      throttle = 0;
    }
  }

  esc1.write(throttle);
  esc2.write(throttle);
  esc3.write(throttle);
  esc4.write(throttle);
}


There are four ESC channels and four BLDCs. The result is...


So that works.

In the next post, I will discuss the frame...



[1] Quadcopter - Wikipedia, the free encyclopedia




Cornell’s new soft robot walking across snow


If you've ever wondered what the military robots of the future will look like, Cornell researchers have a game changer in store for you: a robot that can withstand water, heat, pressure, and extreme temperatures, all in the form of a blob (a pink one, at that).


The new robot looks a lot like a squishy starfish. Mostly made of silicone, the robot was built tough. While it doesn't exactly fit our idea of robotic innovation, it is indeed one of the most resilient robots around: it can crawl across puddles, walk over fire (for up to 50 seconds), withstand temperatures of -20 degrees Celsius, and even survive being run over by a car. Boys, we've got ourselves a war machine.


The floppy robot was designed to function on rugged terrain. Unlike most soft robots, this one can function with or without a tether for up to two hours (thanks to its trusty battery pack). Users can also strap a video camera to the top of the robot; if it survives the battlefield, it will bring some wicked footage back to home base.


This soft bot has super strength for its size: it can carry up to 18 pounds on its back. At about a foot and a half wide, this bot is inexpensive, too; it costs just over $1,100 to make one of these creepy critters.


While its durability is impressive, its speed is not: this nearly indestructible robot is unfortunately painfully slow. Also, while its silicone shell is great for shrugging off potentially dangerous substances, its sensitive parts are exposed and easy for enemies to squash. All things considered, this bot still might make the cut for the next generation of robotic soldiers.


Soft robots may not look like the rave of the future, but their resilience, durability, and cost efficiency may mean the next army of bots will be, well, squishy.



See more news at:


Sound being recorded by MIT researchers via footage of vibration of nearby objects (via MIT)

Maybe you heard this already... it seems the far-fetched gadgets from James Bond films are becoming a reality. Novel technology created by researchers at MIT, Microsoft, and Adobe can convert the subtle vibrations of an object in a room into sound waves in order to spy on conversations. While they hope this technology may lead to a variety of unexpected inventions, they are currently relishing the cool factor.


Footage is captured by focusing on an object within the room. The footage is high-speed, typically between 2,000 and 6,000 frames per second, in order to get the best-quality audio. The high frame rate lets the footage capture the minuscule vibrations caused by a nearby conversation or music. The footage is then passed through an algorithm that converts the recorded object's motion into audio, which is spookily accurate compared with the original. In one experiment, they were able to accurately recover speech from behind soundproof glass, 15 feet away.
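Those frame rates directly bound the audio that can be recovered: by the Nyquist criterion, a camera sampling at F frames per second can only represent vibration frequencies up to F/2. This is a standard signal-processing fact, not a figure from the paper:

```cpp
// Nyquist limit: sampling at F frames per second can only represent
// vibration frequencies up to F/2 Hz, which bounds the audio bandwidth
// recoverable from the footage.
double maxRecoverableHz(double framesPerSecond) {
    return framesPerSecond / 2.0;
}
```

So at 2,000-6,000 fps the recoverable band tops out around 1-3 kHz, which happens to cover most of the energy in human speech.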


The masterminds behind this are MIT researchers Abe Davis, Frédo Durand, Bill Freeman, and Neal Wadhwa, joined by Michael Rubinstein from Microsoft and Gautham Mysore from Adobe. To capture the subtle vibrations of objects, they had to employ a unique method of detecting micro-vibrations imperceptible to the naked eye; the vibrating objects in a room move by about a tenth of a micrometer, according to the research team.


Hence, to capture movements this small, the team had to detect changes of about five-thousandths of a pixel. To do this, they take close-up footage and track the change in a single pixel's color over time, which reveals motions in the video smaller than a pixel.


The footage is processed with various filters, which note variations in pixels' colors over time at a variety of scales, positions, and orientations. An algorithm then interprets the data from each filter to reconstruct the movement of the object over time. At this point, clear boundaries can be seen in the variations of pixel color, which reveal the minute movements.


Their YouTube video demonstrates tests of the technology using a variety of filmed objects and different types of sound (including speech and music). While the technology is definitely not infallible and seems to slightly alter the timbre of the original, it is still quite accurate; for the most part, there are only occasional dropped sounds or notes. Overall, it is quite eerie how accurate this technology already is, considering they want to continue developing it. I guess now the NSA will have even less trouble spying on the public using surveillance video.


Their findings will be published as a paper at the SIGGRAPH computer graphics conference. Hopefully someone will catch wind of this innovation at the conference and develop new applications based upon the technology. While spying is an obvious application, Davis is hopeful about the possibilities it opens up in materials science: the properties of an object alter the vibrations it produces, and the group wonders whether an object's material properties could be determined solely from the manner in which it reacts to sound vibrations.



See more news at:


It begs to be admired... and craves to cut knees and feet. Part of the Gravity Collection (via

Playing with magnets is fun when you're a kid, but for some, like Dutch designer Jolan van der Wiel, the fascination never ends. That's why he created the gravity stool. Using magnets, iron shavings, dye, clay, and water, van der Wiel harnessed the pull of magnets to make a true one-of-a-kind masterpiece.


Imagine landing on Mars and seeing an upside-down icicle with a top that's just flat enough to sit on: that's the gravity stool. The galactic stool looks like something straight out of space, and it is shaped by natural forces. Van der Wiel designed the funky, futuristic stool and believes natural forces should be used to shape more things in our modern world. Move over, Frank Lloyd Wright.


The gravity stool is made by mixing iron shavings into a batch of his special clay mixture, at a ratio of 9 parts clay to 1 part metal. After van der Wiel mixes his metallic clay, he pours it into a pan. The pan (which eventually forms the seat of the stool) holds the mixture and sits below a structure, designed by van der Wiel himself, that holds three powerful magnets at just the right positions to create the legs of the stool. Slowly, the magnetized clay in the pan reaches up toward the magnets, bit by bit, until it completely dries. The final product is a stone-like stool largely shaped by magnetism, gravity, and air pressure. Each and every stool is unique, and although it's nothing you'd see at West Elm, it sure does make a great conversation piece.


Van der Wiel's project is called Architecture Meets Magnetism. While the first series of magnetic designs includes funky sculptures and the gravity stool, van der Wiel eventually envisions entire buildings being constructed using gigantic magnets. Inspired by Gaudí's La Sagrada Família, he is set on taking his "natural" designs to the streets, and the runway, as he has teamed up with a fashion designer to make new, fashionable attire for the next league of space cadets. Work it, R2, work it!



See more news at:


The Tempescope recreates the weather outside in a nifty box, and you can make your own (via Tempescope Project)


A nifty gadget called the Tempescope was unveiled at the beginning of October at the CEATEC trade show in Japan. It is an 'ambient weather display' you can put on your coffee table or in your office. The Tempescope recreates the weather outside in your own home, giving you a direct sense of the real conditions. Mostly, this new gadget is pretty cool and eye-catching, as it seems to set the tone for your home or office.


The Tempescope can recreate the level of cloud coverage, rain, and lightning occurring outside. It also uses blue and red LEDs to give a sense of the outside temperature. In a way, this device sidesteps the Fahrenheit-versus-Celsius issue, as anyone, from any culture, can gain an intuitive sense of the weather from within their own home.


You can watch a video of what the Tempescope can do in action.


I must admit that while it may not serve a necessary purpose, it is very aesthetically pleasing, even majestic, and seems like a fun thing to have around the house.


Not surprisingly, it was created by a designer, Ken Kawamoto, from Japan, and it will debut on Kickstarter next year. If you can't wait for the Kickstarter campaign, the designer has released plans and instructions for creating your own Tempescope on GitHub here:


To do so, you'll need an Arduino microcontroller, acrylic boards, a water diffuser, a remote transmitter, and some LEDs (among a few other gadgets and gizmos). He has also released source code that allows the Arduino to sync wirelessly with a smartphone or PC to access weather data. The instructions thus far are not entirely in-depth, but if you have some prior knowledge, you can tinker and figure out how to personalize your own Tempescope.


The water diffuser allows the Tempescope to create mist and clouds according to the weather data it receives. The LEDs indicate the temperature visually (on a red-to-blue scale) and can also recreate lightning during a thunderstorm. The remote transmitter sends and receives data from the PC or smartphone.
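The red-to-blue temperature indication can be sketched as a simple linear blend between two LED colors. The temperature endpoints below are assumptions for illustration, not values from the Tempescope plans:

```cpp
#include <algorithm>

struct Rgb { int r, g, b; };

// Map a temperature to an LED color by blending from blue (cold) to red
// (hot) over an assumed 0..35 degC range; out-of-range values saturate.
Rgb temperatureColor(double celsius) {
    double t = celsius / 35.0;                 // 0 = cold, 1 = hot
    t = std::min(1.0, std::max(0.0, t));       // clamp to [0, 1]
    return Rgb{static_cast<int>(255 * t), 0, static_cast<int>(255 * (1.0 - t))};
}
```

At the cold end the LED shows pure blue, at the hot end pure red, with purple-ish mixes in between; the actual Tempescope firmware may use a different mapping.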


The Tempescope can also recreate rainfall by piping water from its base up through plastic tubing and dripping it down the inside of the acrylic box. The design is up and running, but Kawamoto wants to add some finishing touches before mass-producing it for the Kickstarter campaign.


This concept is really cool and combines the beauty of design with a practical purpose. It also seems like a fun pet project that could make a cool gift or be a topic of constant conversation in your home. And if all else fails, you can always use the Tempescope as a really cool paperweight.



See more news at:


Ebola pathogen killing robots being rolled out into a hospital near you (via Xenex)


With the recent Ebola outbreaks and widespread fear, one company is capitalizing on its invention of a germ-zapping robot. Created by Xenex, these robots are now being used in over 250 hospitals in the USA, and the San Antonio company is trying to get them shipped to Africa to kill the Ebola virus there. Xenex is also trying to secure contracts with airlines to have germ-zapping robots used on planes that may transport infected passengers.


These germ-zapping robots work by blasting germs and pathogens with UV-C light. UV-A and UV-B light are what we experience on Earth as sunlight filters through the atmosphere; UV-C light, however, is blocked by the ozone layer, and rightfully so, as it is potentially lethal.


Zapping germs and viruses with UV light has been a disinfection method for a while, but the old approach was highly dangerous to humans: older disinfecting UV lamps used mercury to deliver their germ-killing rays, and these were hazardous to people as well.


The UV lamps Xenex employs use xenon gas to produce UV-C light that is 25,000 times more intense than the UV light produced by the sun. Hence, the light is very effective at killing the Ebola virus, as well as other germs, by damaging their DNA. Xenex has these virus-killing robots running around hospitals zapping patient rooms and hallways for 5 minutes at a time, which, according to Xenex representatives, is enough to kill all germs, including those hiding behind the cupboard (although that seems far-fetched). This may enable Ebola outbreaks to dwindle, or at least become manageable, but the robots should probably be used in conjunction with other cleansing methods, like liquid bleach.


Each of these germ-zapping robots costs around $115,000. That's a pretty penny for some to pay, but Xenex is appealing to the US Department of Defense to use the robots as a method of preventing further outbreaks. The Ebola outbreak continues to spread globally and, as travel bans have been deemed unsuitable in the long run by Press Secretary Josh Earnest, greater disinfection efforts may be the best bet to keep Ebola at bay.


Currently, there is no cure for Ebola, and the main countermeasure is prevention. Doctors are working on a drug that would allow Ebola victims to live longer while other treatments are developed, but it is not yet fully formed or in use.


The perk of using Ebola-bots is that they reduce the need for hospital personnel to disinfect large areas, which lowers the staff's risk of contracting Ebola themselves. If Ebola cases continue to rise instead of fall, Ebola-bots could end up everywhere, zapping the virus in cities and towns before it reaches more critical levels. For now, the Xenex germ-zapping robots will continue to be used in US hospitals. Hopefully they'll make a difference in reducing the number of Ebola victims.



See more news at:

burger in space.png

Burgers and fries in space... waiting to be devoured by alien life forms (via YouTube)

This one slipped past my recent "10 past and ideas for the future of Space Exploration" post... see those, and come on back!


Two crazy Brits have launched a burger and chips (french fries for you American types) into space as part of a marketing tactic for their Chosen Burger start-up. Fledgling entrepreneurs Andy Shovel and Peter Sharman had tried to send one of their products into space before but were unsuccessful in their attempt; the camera failed and the balloon didn't reach optimal altitude.


This time, they showed they had learned a thing or two. They succeeded in sending the first meal into space last week (without astronauts to accompany it), proudly featuring their company logo well above the stratosphere. The balloon carried the burger and chips to 112,000 feet, climbing at 35 feet per second. Upon landing, the balloon was about 35 miles away from its launch point in Fulham.
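For a rough sense of scale, the reported figures imply a climb of just under an hour. Here's a minimal back-of-the-envelope sketch, assuming a constant 35 ft/s ascent rate (real balloons speed up and slow down with altitude, so treat this as an estimate, not the actual flight time):

```python
# Rough estimate of the burger balloon's ascent time from the reported numbers.
# Assumption: constant climb rate, which real balloon flights don't maintain.

PEAK_ALTITUDE_FT = 112_000  # reported peak altitude in feet
ASCENT_RATE_FT_S = 35       # reported climb rate in feet per second

ascent_seconds = PEAK_ALTITUDE_FT / ASCENT_RATE_FT_S
ascent_minutes = ascent_seconds / 60

print(f"Estimated ascent time: {ascent_minutes:.0f} minutes")  # → about 53 minutes
```

Not bad for a sandwich: roughly 53 minutes to climb above 99% of the atmosphere.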


The journey of the spacefaring burger and fries was captured flawlessly by a GoPro camera, and the video was uploaded to YouTube. You can check it out here:


The meal is majestic as it flies over the Earth. At one point the meal seems to catch some space wind and rattle about, dropping fries back through the Earth's atmosphere and towards the ground. I can't help hoping that it didn't hit anyone in the head on the way down.


Their YouTube video has already had over 130,000 views since being published on October 8th, 2014. For a media stunt, I suppose this duo has been quite successful so far. The entire stunt only cost them about 2,000 Pounds Sterling (about US$3,200), which isn't bad either.


However, this stunt is a lot less magical than the Exobiotanica art project created by artist Makoto Azuma. He launched plants into space in July to create beautiful pictures of nature in the vast and desolate landscape of space.


From the Exobiotanica project (via Makoto Azuma)


New marketing efforts will probably follow suit and, before you know it, IKEA will send the first bedroom set into space. The whole display is possibly a cheap and silly ploy for attention, but here we are gawking at the beauty of the Earth, and the humor of a burger in space. Ah, humans – this would never happen in a world ruled by robots.



See more news at:


Imagine going to your local supermarket and being greeted, served and checked out by robots. Think it’s impossible? Guess again. The UK-based Tesco supermarket is currently experimenting with a line of robotic, wearable and cognitive computing innovations to change the way you shop.


Tesco is trying to stay ahead of the competition – and not just by slashing prices. Tesco believes that the retail experience should get a digital face-lift, and is working on the very technology to make it possible. Tesco’s Labs division is working hard on a number of innovations, all powered by IBM’s Watson.


Watson is a supercomputer that uses existing data to create new ideas (also known as thinking). The cognitive computer works a lot like the human mind, and Tesco is experimenting to see if artificial intelligence can enhance its business strategy. So far, the robotic brain has successfully created a number of “palatable” recipes to serve customers in-store (unless you are confident computers know enough about the human palate, you might want to make sure the food isn’t laced with robopoison. You never do know when robots will decide to take over the world).


Tesco’s eerie concept innovations include wearables that can keep track of inventory, cognitive computing woven into novel business strategies, and robotics that can do your job, only better. Tesco is currently working on a robotic system that can do all of the warehousing and inventory duties usually reserved for… humans. And it won’t be long before robots are employed by the giant supermarket chain. Tesco Chief Information Officer Mike McNamara said the company expects to begin using the technology within the next five years (and the job market for those with advanced degrees doesn’t look much better… time to start asking grandpa about subsistence farming).


Tesco is at the front of a technology wave that has yet to hit the retail sector. In a handful of years, the shopping experience could be entirely automated (which poses some interesting questions about the future of crime, and the job market of the future, or lack thereof). Hold onto your jobs while you can, but don’t be surprised if the HumanBot2000 replaces you. Who really wants to work 40 hours a week anyway? Now you can pursue that painting career you’ve always wanted… until, you know, RoboPicasso hits the scene. C’est la vie.



See more news at:



When people think of Disney, they don’t usually think, “Disney is one of the most innovative robotic companies,” but that is all about to change. The family-friendly brand is pouring tons of resources into its robotics division in the hopes of creating a new line of robots that interact with humans in a more, well, human, way.


One of the most impressive of Disney’s innovations is an air-pressurized system that allows robots to move like you. The system harnesses the power of pneumatic tubes and air pressure to enable mechanical puppets to mimic the motion of the puppeteer exactly. Whatever the puppeteer does, the robot will also do, simultaneously.


The system is pretty impressive. It functions without traditional valves, motors or pumps and is incredibly precise. In a video, the robotic arm was so precise that it was able to pick up a large roll of tape and balance it on top of a cylindrical object. It even successfully played a short round of catch with a small girl, because after all, this is Disney.


The innovation is intended to enhance the way robots move, making them move in a more fluid, and human, way than traditional bots. Considering how many animatronics Disney has in its parks, it’s probably a smart idea to develop a fleet of robots that don’t terrify your toddler.



Disney’s Controlling Humanoid Robots with Motion Capture Data project


Disney is working on a number of other robotic innovations, including robots that can mimic the movements of a person via a sensory body suit. Other projects include sitting robots, robots that can juggle and a robotic sensory system that can observe its environment. Disney is no MIT, but these innovations definitely put them on the radar as far as robotic innovation goes. See more at Disney's robotics page.



See more news at:

Filter Blog

By date:
By tag: