
News



Russell Rubman’s Gittler Guitar. (via Kickstarter & Russell Rubman)


Guitars, and stringed instruments in general, haven't changed much in the few thousand years since their inception. (A 3,300-year-old stone carving of a Hittite bard playing a stringed instrument is the oldest iconographic representation of a chordophone, per Wikipedia.) They still feature a body of some sort, a neck, a fretboard and a headstock, usually outfitted with six or more strings. On acoustic guitars, the hollow body produces the sound when the strings are strummed; on electric models, pickups convert the string vibrations into a signal that is sent to an external speaker or amplifier. Various designs have been produced over the years to give both kinds a distinctive look, but they still feature the traditional parts. Back in the 1970s, musician Allen Gittler set out to minimize his guitar's makeup, stripping away every unnecessary part while retaining the instrument's basic function and sound. His resulting design did away with the guitar's body and headstock but kept the frets (mounted on a single rod), the strings and a small strumming area.

 

Taking a page from Allen's minimalist design, Russell Rubman has given that layout a 21st-century makeover. His Gittler Guitar keeps the stripped-down form but is manufactured from aircraft-grade titanium, outfitted with 31 cylindrical frets (complete with LED lighting) and six string tuners positioned at the bottom of the instrument. Sound is captured by magnetically isolated transducers whose signal can feed any MIDI interface or computer before being piped to an amplifier. The bottom also houses an 'E-Box' with tone and volume controls for shaping the signal, much like an electric guitar. Russell is currently funding the Gittler Guitar on Kickstarter in an effort to get backing to manufacture his futuristic remake. Those looking to get their hands on one can pledge $2,000 or more, with delivery estimated for December of this year, just in time for the holidays.

 

C

See more news at:

http://twitter.com/Cabe_Atwell


University at Buffalo's 40 lb sensor system. (via UB and BBC)

 

Wi-Fi signals can be found almost anywhere: in large cities, rural towns and even in the mountains (next to ski resorts and ranger outposts). You can even find them on the ocean, especially on cruise ships; you will not, however, find them under the ocean. Although radio waves can penetrate water to a degree, their range underwater is very short (unless you have access to the US Navy's ELF frequencies), which rules out watching Netflix at 20 fathoms. A research team from the University at Buffalo is developing a way to overcome the problem of spotty connectivity beneath the waves. More precisely, the team hopes to create an underwater internet network for improved tsunami detection, submerged oil and natural gas exploration, military surveillance, pollution monitoring and other applications.

 

Instead of submerging Wi-Fi devices in waterproof enclosures, the system works much like existing tsunami-detection networks: submerged sensors send acoustic (sonar-based) data to buoys on the surface, which then relay that data over a radio link. The challenge isn't the underlying technology itself but the fact that different companies and organizations collect and transmit this data in incompatible ways, so the researchers are aiming to create a shared standard that would let anyone's equipment communicate. To find out whether their system is feasible, the team dropped two 40 lb weighted sensors into Lake Erie, then typed a command into a Wi-Fi-enabled laptop and sent it to the sensors, which successfully 'pinged' it back to the laptop after it bounced off a nearby concrete wall. When the system might be built into existing or newly designed submerged networks is unknown, but the platform is 'sound' and could be deployed sometime in the near future.

 

C

See more news at:

http://twitter.com/Cabe_Atwell


IBM 5150 PC... (via Wiki)


The late '70s saw the birth of the personal computer, which at the time was being developed in garages by home-brew enthusiasts (Bill Gates and Steve Jobs among them). Not long after, companies such as RadioShack, Commodore International and Apple were successfully selling their own affordable desktop PCs to both businesses and individual consumers. When 1980 rolled around, William Lowe, then a product test engineer at IBM, came up with an idea to get the company into the burgeoning personal computing market (IBM was, however, already the leading provider of corporate mainframes). He believed it was possible to conceive, engineer and manufacture a personal computer within one year's time, which was unheard of back then. The company took a chance on Lowe's idea, and he assembled an engineering team, known as the 'Dirty Dozen', to make the new project a reality. Instead of designing proprietary technology and software, Lowe and his team looked to the fledgling Silicon Valley companies for off-the-shelf parts.



William C. Lowe (via wiki)

 

A year later, the iconic beige box was born, to the surprise of many in the industry, including IBM itself. Known as the IBM 5150 Personal Computer, it featured an Intel 8088 processor clocked at 4.77 MHz and 16 KB to 256 KB of memory, running Microsoft's DOS 1.0 operating system. The setup cost consumers a mere $1,565, although that was without a monitor or even disk drives, which were available in other configurations of the 5150. After the PC's release, Apple took note of IBM's first offering and placed a full-page ad in the Wall Street Journal declaring 'Welcome, IBM. Seriously.' as something of a blasé taunt. Microsoft founder Bill Gates was apparently at Apple's headquarters at the time of IBM's unveiling and later said it took Apple a full year to realize what had just happened. The 5150 launched in August of 1981, and by October of that same year droves of people were putting down $1,000 deposits just to get their hands on one. By the end of the following year (1982), the company was selling roughly one PC a minute during business hours (9 to 5). The 5150 was so popular it simply became known as 'the PC', a term still widely used today. Thanks to Lowe's efforts and insight, the desktop PC is still going strong and can be found most anywhere on the planet. Unfortunately, we have lost yet another pioneer of the technology most of us take for granted: William C. Lowe, 72, passed away on October 19, 2013 of a heart attack. He is survived by his wife Cristina, his four children and ten grandchildren.

 

William C. Lowe

January 15, 1941 – October 19, 2013

 

C

See more news at:

http://twitter.com/Cabe_Atwell


Graphene concept art. Single layer...


Graphene is back in the spotlight, and its unusual properties are once again being harnessed, this time for stick-on memory. Yang-Fang Chen and his fellow researchers from the University of Taipei designed the memory for use in flexible electronics. Flexible memory has been created before, by researchers at the University of Cambridge using nanowires grown on plastic substrates, but practical applications of that approach are still a decade or more away. Instead of plastic, the Taipei team used graphene coated in a conductive polymer and topped with aluminum electrodes to create a flexible memory sticker. Using graphene as the memory's substrate gives it a natural attraction to other surfaces (via the van der Waals force), which allows the memory to be stuck just about anywhere.

 

In initial tests, the researchers applied their flexible memory sticker to various surfaces, including a business card and a medical bracelet, and found that data retention did not diminish even while the sticker was curved. They also found that the memory could be applied, removed and reapplied a number of times without losing any stored data. The team says that with a few more parts (like a Wi-Fi module) attached, the sticky memory could conceivably be used as a flexible flash drive. Think of it as a Post-it note that can download data from your computer or mobile device, then be peeled off and stuck to another device to upload that data. The possibilities are endless; the team even expects to be able to incorporate the memory into other flexible electronics in the near future.

 

C

See more news at:

http://twitter.com/Cabe_Atwell


Toyota’s Advanced Active Safety Research Vehicle. I look forward to computer controlled autos. Mostly to avoid traffic jams (via Toyota)


Anyone who's watched I, Robot (the Will Smith version) can recall the car chase scene with an Audi capable of driving itself and avoiding collisions. While that car was a piece of science fiction, several real autonomous vehicles are capable of the same feat, although none are mass-produced. According to a recent press release from Toyota Motor Corporation, the company plans to bring its advanced driving technology to consumers in only two years. Known as Automated Highway Driving Assist (AHDA), the technology uses a series of sensors that allow the vehicle to take control and avoid collisions: the system actively looks for vehicles and other obstacles in the car's path, and if dangerous conditions are detected, it swings into action by taking control of the brakes and steering to avoid the obstacle.

 

The system gives the driver a chance to react before it takes over, bringing up a visible notice on a display and sounding an alarm. AHDA actually comprises two separate technologies. The first, Cooperative-Adaptive Cruise Control, communicates wirelessly with surrounding vehicles so that they maintain a safe following distance from one another. The second, Toyota's Lane Trace Control, uses millimeter-wave radar along with HD cameras to assist steering and keep the vehicle in its lane. Toyota has already fielded the technology on a limited scale, with test vehicles driving on Tokyo's Shuto Expressway, and plans to expand trials over the next few years.
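
Toyota's press release doesn't describe the control law, but the core idea behind cooperative-adaptive cruise control is simple to sketch: hold a constant time gap to the car ahead, using the measured gap plus the lead car's speed received over the wireless link. Below is a minimal, illustrative controller in C; the gains, gap targets and acceleration limits are assumptions chosen purely for the example, not Toyota's values.

```c
#include <stdio.h>

/* Illustrative constant-time-gap cruise controller (not Toyota's algorithm).
 * All gains and limits below are assumptions picked for the example. */
typedef struct {
    double standstill_gap_m;   /* desired gap at zero speed          */
    double time_gap_s;         /* desired time gap to the lead car   */
    double kp, kv;             /* feedback gains                     */
} cacc_params;

/* Returns a commanded acceleration in m/s^2, clamped to comfortable limits. */
static double cacc_accel(const cacc_params *p,
                         double gap_m,        /* measured distance to lead car  */
                         double own_speed,    /* m/s, from the ego vehicle      */
                         double lead_speed)   /* m/s, received over the V2V link */
{
    double desired_gap = p->standstill_gap_m + p->time_gap_s * own_speed;
    double a = p->kp * (gap_m - desired_gap) + p->kv * (lead_speed - own_speed);
    if (a >  2.0) a =  2.0;    /* limit acceleration */
    if (a < -3.0) a = -3.0;    /* limit braking      */
    return a;
}

int main(void)
{
    cacc_params p = { 5.0, 1.5, 0.2, 0.6 };
    /* Ego car at 25 m/s, 30 m behind a lead car doing 22 m/s: expect braking. */
    printf("commanded accel: %.2f m/s^2\n", cacc_accel(&p, 30.0, 25.0, 22.0));
    return 0;
}
```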

 

C

See more news at:

http://twitter.com/Cabe_Atwell


Motorola’s Project Ara. Building your own smartphone seems like a dream. I hope this catches on. (via Motorola)

 

Every smartphone has features some users don't want or could do without. To that end, Motorola has turned to designing a modular smartphone, which would let users connect the hardware they prefer for the applications they use. Known as Project Ara, the idea is to design modular pieces containing particular hardware elements, such as Wi-Fi, connection ports or even keyboards, that connect to one another on a basic platform. The idea for a modular phone came from the Phonebloks concept, which would allow users to pick and choose their own hardware that fastens to a connectable base. In fact, Motorola has teamed up with the creators of Phonebloks for Project Ara and is currently looking to recruit what it calls 'Ara Scouts' to help design the project's modular pieces.

 

The possibility of a modular smartphone is certainly appealing. Imagine being able to swap in a newer camera when one comes along without replacing your entire phone, or easily replacing a damaged speaker (think of the EVO 4G) without sending the phone in for service. Unlike the original Phonebloks design, Ara will use an 'endoskeleton' (known as the endo) that holds the modules in place; modules could include everything from processors to different displays or an extra battery (incredibly convenient). While a truly modular phone is only in the planning stages at the moment, Motorola will be releasing an MDK (Module Developer's Kit) to those who signed up as Ara Scouts as early as the end of this year. That means a fully modular smartphone could hit the market as early as the middle of next year; until then, we will have to make do with our non-modular phones.

 

C

See more news at:

http://twitter.com/Cabe_Atwell


MaxLife concept diagram. Lifetime cycles are increased, but not a big improvement in energy density (via TI)

 

Texas Instruments is one of the most dominant technology companies around. Behind Intel and Samsung, it is the world's third-largest producer of semiconductors, and it is the largest manufacturer of digital signal processors and analog ICs. Young students may know TI only as the maker of their world-famous graphing calculators, but older, more experienced students quickly learn that TI technology can be found everywhere; many of the ICs used in basic electronics are made by TI.

 

There is one more area where TI's technology excels: energy-efficient electronics. One of its more popular offerings is the MSP430 microcontroller family. These MCUs let developers create embedded applications that manage power extremely efficiently. The CPU can run at speeds up to 25 MHz or be slowed down to save power, and, more importantly, the MCU has low-power idle modes in which it draws as little as 1 µA of current. Along with its low-power capabilities, the MSP430 supports all the usual embedded communication protocols and peripherals.
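
As a rough illustration of how that idle mode is used in practice, here is a minimal sketch (assuming an MSP430G2553-based Launchpad and the msp430-gcc toolchain; register and vector names are from that part's headers) that parks the CPU in low-power mode LPM3 and only wakes on a timer interrupt to toggle an LED:

```c
#include <msp430.h>

int main(void)
{
    WDTCTL = WDTPW | WDTHOLD;            // stop the watchdog timer
    BCSCTL3 |= LFXT1S_2;                 // source ACLK from the ~12 kHz VLO
    P1DIR |= BIT0;                       // P1.0 (the Launchpad LED) as output
    P1OUT &= ~BIT0;

    TA0CCR0 = 12000;                     // ~1 s period at ~12 kHz
    TA0CCTL0 = CCIE;                     // interrupt on compare match
    TA0CTL = TASSEL_1 | MC_1;            // ACLK, up mode

    __bis_SR_register(LPM3_bits | GIE);  // CPU sleeps here; only ACLK keeps running
    return 0;
}

// Wakes for a handful of cycles per second, toggles the LED, then sleeps again.
__attribute__((interrupt(TIMER0_A0_VECTOR)))
void timer0_a0_isr(void)
{
    P1OUT ^= BIT0;
}
```

Because the CPU spends nearly all of its time asleep, the average current draw approaches the microamp-level idle figure quoted above rather than the active-mode current.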

 

Lately, TI has been trying its hand at another energy-saving technology: battery-management chips. Back in March, it released the bq2419x family of chargers, which were claimed to have the potential to cut charging times in half. This is technology many companies want a piece of, largely because of the rise of tablets and smartphones. Android users are well aware of the battery-draining apps we all so often use, and TI is looking to provide a solution to ease that frustration.

 

More recently, TI announced a few more energy-efficient chips, collectively known as the MaxLife chipset. It includes the bq27530 and bq27531 fuel-gauge circuits, which work alongside the bq2416x and bq2419x chargers. Together they are expected to provide faster charging and extend battery longevity by up to 30 percent. The charger is controlled directly by the fuel gauge in an autonomous battery-management arrangement, which gives designers greater flexibility: the reduced software overhead makes the parts easier to integrate into a system, and the scheme provides better thermal management and battery safety.

 

MaxLife technology is now available in a development kit, which pairs a bq27531 fuel gauge, connected via I2C, with a bq24192 charger. With this combination, charging at up to 4.5 A can be achieved for single-cell lithium-ion batteries. It is one of the first technologies that promises to let batteries charge faster without damaging them, and I do not believe it will be long before we see these chips in consumer products.
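
For a sense of what talking to such a fuel gauge over I2C looks like from a host, here is a minimal Linux sketch. The 7-bit address 0x55 and the 0x2C state-of-charge command are assumptions based on TI's common bq27xxx command set, not values checked against the bq27531 datasheet, so treat them as placeholders:

```c
#include <fcntl.h>
#include <linux/i2c-dev.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/i2c-1", O_RDWR);          /* bus number depends on the board */
    if (fd < 0) { perror("open"); return 1; }

    if (ioctl(fd, I2C_SLAVE, 0x55) < 0) {         /* assumed bq27xxx gauge address */
        perror("ioctl"); return 1;
    }

    unsigned char cmd = 0x2C;                     /* assumed StateOfCharge() command */
    unsigned char buf[2];
    if (write(fd, &cmd, 1) != 1 || read(fd, buf, 2) != 2) {
        perror("i2c transfer"); return 1;
    }

    /* bq27xxx-style gauges return 16-bit values little-endian. */
    printf("state of charge: %u %%\n", buf[0] | (buf[1] << 8));
    close(fd);
    return 0;
}
```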

 

C

See more news at:

http://twitter.com/Cabe_Atwell


Artist concept of the finished castAR glasses (via castAR kickstarter page)

 

Augmented- and virtual-reality headwear has risen in popularity ever since Google Glass and the Oculus Rift appeared. These devices either overlay interactive applications on the environment being viewed or create a computer-generated environment that puts the user inside a simulated world. The two approaches are usually kept separate, each built into its own device rather than combined, mostly because there is too much hardware involved to pack into a small space. One company, however, appears to have managed to incorporate both AR and VR into a simple pair of glasses: just 20 hours after posting their project on Kickstarter, Technical Illusions reached their $400,000 funding goal for the castAR AR and VR system.


Ex-Valve employees Jeri Ellsworth and Rick Johnson designed the glasses (first developed as a project inside Valve) so that users see 3D holographic projections situated right in front of them. This is done using two micro-projectors, one on each side of the glasses' frame, each projecting a portion of the image onto a flat surface. The user's eyes focus naturally on the combined image, eliminating the dizziness and nausea experienced with other headsets while gaming (the Oculus is notorious for this). The system uses active-shutter glasses with a 120 Hz refresh rate, which is necessary to view 3D video and images (higher would be better, however). The projectors cast the images (at 720p resolution) onto a specialized retro-reflective surface mat, embedded with an RFID tracking grid and IR markers, which bounces them back to the glasses; video is piped to the system over an HDMI connection. A camera (connected via USB) housed in the center of the frame scans the surface for the IR LED markers built into the mat to track the user's head, and software then adjusts the image according to the viewing angle. A simple clip-on attachment converts the glasses from projected AR into true augmented reality (used without the mat) or full virtual reality similar to the Oculus Rift.

 

Another interesting add-on for the castAR system is the 'Magic Wand' controller, which has an IR marker on its tip that lets it serve as either a gaming joystick of sorts or a 3D input device. It is also outfitted with several buttons, an analog joystick and a trigger, allowing additional options across applications. Gaming with castAR isn't limited to video games, either: the RFID-equipped mat can be used for board games as well. Users can affix RFID bases to game tokens or miniatures, like those from Dungeons & Dragons or MechWarrior, and the virtual board can show vital or crucial information about each piece. Boards can be created and configured using the company's castAR software suite, which also supports online play, so friends can face each other over an internet connection. Those looking to get their hands on a castAR system can pledge $189 and up, which nets a pair of castAR glasses and a one-meter by one-meter surface pad; $285 gets the whole package, including glasses, mat, AR and VR clip-ons and the Magic Wand.

 

C

See more news at:

http://twitter.com/Cabe_Atwell


I would have loved a tablet when I was his age... (via LAUSD)

 

Earlier this month (October 2013), the Los Angeles Unified School District unveiled an ambitious proposal to introduce a series of tools designed to raise academic standards and give students the skills needed to succeed in college or a career. The $1 billion project aims to give every student in the district (more than 600,000 of them) an iPad in an effort to build 21st-century technology skills and better prepare them for the future. While the initiative is designed to 'level the playing field' between wealthy and underprivileged children by giving them access to the same opportunities, the rollout has had a few shortcomings. First, parents wanted to know why students were not also being taught traditional vocational skills such as machine shop, while others raised questions about school board politics and priorities rather than about the technology or the project itself.

 

The district's technology project ties in with the national Common Core State Standards Initiative (for children in grades K through 12), which sets common standards for mathematics and English language arts. The second issue stems from the tablets themselves (purchased at $678 each): some schools are not equipped with the Wi-Fi needed to download the educational material in class and will need to be upgraded so children have access to the content, an expense that may or may not have been included in the billion-dollar budget. When the first batch of tablets was rolled out to roughly 47,000 students who were allowed to take them home, the district found that some of those students were technology savvy and quickly disabled the tablets' security restrictions, allowing them to surf the web freely and visit sites deemed inappropriate by school standards (suffice it to say, they weren't learning math). This presented a liability to the district, as students could become the victims of sexual predators while using school property.

 

The schools quickly addressed that problem by restricting the tablets to in-school use until a way around such hacking is found. Another open question is how the district will repair or replace tablets that become damaged: Apple has said it will replace 5% of units that no longer function, leaving the schools to find their own solution for the rest. The problems don't stop there, as L.A. Unified failed to factor in the training some teachers need to use the iPads in their curriculum; some have never used an iPad, or any other tablet for that matter. Issues such as theft, and whether to supply keyboards for classwork, also still need to be addressed. While the district has slapped a Band-Aid on a few of these cracks, it will need to do a lot more, and soon, before the dam breaches and it takes more than $1 billion to fix the problems. Still, the thinking was in the right direction, as students will undoubtedly need technology-related skills if they're to succeed in the future.

 

C

See more news at:

http://twitter.com/Cabe_Atwell

First, I am going to suggest a soundtrack for reading this post… Please hit play below and move on:

 

 

Touchscreens are becoming the user interface of the future. It began with smartphones, then came the iPod Touch and then tablets; now touchscreens can be found in many places. Some restaurants have small touchscreen computers that let diners flip through the menu and check their current bill. Newer apartment and office buildings integrate touchscreen directories at the entrance for contacting the person of interest, and many computers and laptops now offer a touchscreen monitor for a richer user experience. With that said, many new touch technologies are beginning to emerge, so let's take a look at some of these upcoming innovations and catch a glimpse of the future of the touchscreen interface.

 


(via Ultrahaptics)

 

To begin with, we have a new haptic feedback system being developed by researchers at the University of Bristol in the UK. The system pairs a Leap Motion controller with ultrasound speakers placed behind a display: the Leap Motion tracks the user's fingers and hand gestures, while the ultrasound speakers project a sensation onto the fingertips to simulate the sense of touch. Called UltraHaptics, the system consists of an array of 320 of these ultrasound speakers.

 

Tom Carter, a lead researcher involved in the work stated, “What you feel is a vibration. The ultrasound exerts a force on your skin, slightly displacing it. We then turn this on and off at a frequency suited to the receptors in your hand so that you feel the vibration. A 4-Hertz vibration feels like heavy raindrops on your hand. At around 125Hz it feels like you are touching foam and at 250Hz you get a strong buzz.”
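
The article doesn't go into how the array steers those sensations, but the usual trick with an ultrasound phased array is to delay each speaker so that all the wavefronts arrive at the focal point in phase, then switch that focus on and off at the modulation frequencies Carter describes. Here is a rough sketch of the delay calculation; the array geometry and the focal point are made-up values for the example, not UltraHaptics specifics:

```c
#include <math.h>
#include <stdio.h>

#define N_ELEMENTS 320          /* array size mentioned in the article */
#define SPEED_OF_SOUND 343.0    /* m/s in air                          */

typedef struct { double x, y, z; } vec3;

/* Fill delays[] with the per-element emission delay (in seconds) that makes
 * every element's wavefront arrive at 'focus' at the same instant. */
static void focus_delays(const vec3 *elements, int n, vec3 focus, double *delays)
{
    double dist[N_ELEMENTS];
    double max_dist = 0.0;

    for (int i = 0; i < n; i++) {
        double dx = focus.x - elements[i].x;
        double dy = focus.y - elements[i].y;
        double dz = focus.z - elements[i].z;
        dist[i] = sqrt(dx * dx + dy * dy + dz * dz);
        if (dist[i] > max_dist) max_dist = dist[i];
    }
    /* The farthest element fires first (zero delay); nearer ones wait. */
    for (int i = 0; i < n; i++)
        delays[i] = (max_dist - dist[i]) / SPEED_OF_SOUND;
}

int main(void)
{
    /* Toy geometry: a 16 x 20 grid of elements on a 1 cm pitch at z = 0. */
    vec3 elements[N_ELEMENTS];
    for (int i = 0; i < N_ELEMENTS; i++)
        elements[i] = (vec3){ (i % 16) * 0.01, (i / 16) * 0.01, 0.0 };

    vec3 focus = { 0.08, 0.10, 0.20 };   /* a point ~20 cm above the array */
    double delays[N_ELEMENTS];
    focus_delays(elements, N_ELEMENTS, focus, delays);

    printf("element 0 delay: %.1f us\n", delays[0] * 1e6);
    /* To render the "foam" texture, this focal point would then be
     * amplitude-modulated at roughly 125 Hz, per the quote above. */
    return 0;
}
```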

 

 

A similar technology being developed by Disney Research uses electrostatic forces to simulate a sense of touch. Rather than using sound waves to compress the skin, Disney's researchers employ electrovibration, which can stretch and compress the skin to provide the feeling of a textured surface. The vibrations create the same kind of lateral friction one experiences when sliding a finger across a bump.

 

“Our brain perceives the 3D bump on a surface mostly from information that it receives via skin stretching,” said Ivan Poupyrev, who directs Disney Research, Pittsburgh's Interaction Group. “Therefore, if we can artificially stretch skin on a finger as it slides on the touch screen, the brain will be fooled into thinking an actual physical bump is on a touch screen even though the touch surface is completely smooth.”
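
Disney's exact rendering algorithm isn't given here, but the idea Poupyrev describes can be sketched simply: the lateral force a finger feels on a real bump is roughly proportional to the local slope of the surface, and the extra friction produced by electrostatic attraction grows roughly with the square of the drive voltage. A toy mapping from a virtual height profile to a drive voltage, with entirely made-up constants and not Disney's implementation, might look like this:

```c
#include <math.h>
#include <stdio.h>

/* Toy electrovibration renderer: map the slope of a virtual bump under the
 * finger to a drive voltage. All constants are illustrative assumptions. */

static double bump_height(double x)     /* virtual surface height, in mm */
{
    return 2.0 * exp(-(x - 25.0) * (x - 25.0) / 20.0);   /* a bump at x = 25 mm */
}

static double drive_voltage(double x)
{
    double dx = 0.1;                                     /* mm */
    double slope = fabs((bump_height(x + dx) - bump_height(x)) / dx);
    double k = 120.0;                                    /* made-up voltage scale */
    /* Electrostatic friction grows roughly with V^2, so take a square root
     * to get friction (and thus perceived slope) to track the profile. */
    return k * sqrt(slope);
}

int main(void)
{
    for (double x = 20.0; x <= 30.0; x += 2.0)   /* finger sliding over the bump */
        printf("x = %4.1f mm -> drive %5.1f V\n", x, drive_voltage(x));
    return 0;
}
```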

 

 

Both of the aforementioned technologies aim to enhance the user experience and create more tactile-rich displays. The Japanese company AsukaNet, on the other hand, is developing an Aerial Imaging Plate. Much like a hologram, the system projects an image that appears to float in front of the user, who can then navigate menus or do anything else a touchscreen interface is used for. To accomplish this, a tablet is used alongside reflective surfaces that project the image in front of the user at a 45-degree angle. The user must also stand in a specific position in front of the display; otherwise the image appears as an ordinary flat picture. The company says this can be an advantage in scenarios where privacy matters: when interacting with an ATM, for instance, only the user directly in front of the display can see which numbers they are poking at. The projected display also improves on the poor sanitation of many shared interfaces, since no physical contact is involved and germs and viruses cannot be passed from one person to another.

 


Printed toy with a little optical imaging sensor built in... (via Disney Research)

 

The last of the new and innovative touch technologies we can expect to see in the future is the curved touchscreen. One approach comes from Disney Research and is created during 3D printing: thanks to a light-sensitive plastic known as a photopolymer, optical fibers can be printed alongside the main structure of a 3D object. This lets engineers connect the fibers to an image source, which can then carry information to and from a curved surface. Disney has already created many prototypes, many of them cartoon-like creatures with large eyes that look around the room.

 


The bendy screen on the left. Possible image of the actual LG G Flex phone on its way to market. (via LG)

 

LG has also announced a flexible display, set to be released next year. It is a 6-inch flexible OLED screen that LG claims is indestructible. The display is made on a plastic substrate and consists of layers of film-type encapsulation and a protection film. Overall, the display is only 0.44 mm thick, weighs just 7.2 grams and has a vertical concave radius of 700 mm.

 

“LG Display is launching a new era of flexible displays for smartphones with its industry leading technology,” said Dr. Sang Deog Yeo, Executive Vice President and Chief Technology Officer of LG Display. “The flexible display market is expected to grow quickly as this technology is expected to expand further into diverse applications including automotive displays, tablets, and wearable devices. Our goal is to take an early lead in the flexible display market by introducing new products with enhanced performance and differentiated designs next year.”

 

LG will not be alone in the flexible display market: Samsung has also been showing off prototypes and experimenting with technologies of its own. We can expect these technologies to hit the market sometime late next year. CES 2014 is also right around the corner, and as the world's largest gathering of consumer electronics, it is almost certain to bring many more innovative display technologies.

 

C

See more news at:

http://twitter.com/Cabe_Atwell

Arduinos are awesome for what they have introduced to the world. They have allowed young and old alike to easily learn and adapt to working with microcontrollers, they have made computing and digital control available to everyone, and they have kept everything open source along the way. As a result, Arduino now has one of the largest communities in the world of DIY. However, Arduinos have their limitations, and with the 'internet of things' slowly working its way into our lives, Arduino does not want to be one step behind.

 

With that said, two new boards are on the way from Arduino: the Arduino Tre and the Galileo. Both feature processors capable of running Linux applications, which throws Arduino right into competition with single-board computers (SBCs).

 


Arduino Tre... I suspect this board may be on element14 some day soon. (via Arduino)

 

To start with the Tre: it will feature a 1 GHz Texas Instruments Sitara AM335x processor (ARM Cortex-A8). If that processor sounds familiar, it's because it is the same one found on the BeagleBone Black. Indeed, the Arduino Tre is the result of a close collaboration with the BeagleBoard.org foundation, so the board will benefit from the large body of support and help already available for the Cortex-A8.

 

“By choosing TI's Sitara AM335x processor to power the Arduino Tre, we're enabling customers to leverage the capabilities of an exponentially faster processor running full Linux. Our customers now have a scalable portfolio at their fingertips, from the microcontroller-based Uno to the Tre Linux computer,” commented Massimo Banzi, co-founder of Arduino.

 

The board's layout will look like a normal Uno in the middle, with expanded pins and connectivity around the outside. This design keeps current and previous shields compatible with the Tre and lets users expand on projects they have already built. In addition, it looks like the board will feature pins for ZigBee compatibility in the middle.

 

The board will offer plenty of GPIO pins along with 1080p HDMI output and high-definition audio input/output. A 10/100 Ethernet port will be available, along with a connector for LCD expansion. Furthermore, as with many other SBCs, an SD card slot will be available for storage, which usually holds the system image. One interesting note: this will be the first Arduino board manufactured in the United States.

 


Arduino Galileo, Intel joins the fray... (via Arduino)

 

Moving on to the Galileo, this board is a collaboration between Intel and Arduino. It will feature an Intel Quark SoC X1000 application processor, one of the first Intel chips designed for SBCs and aimed at a market previously dominated by ARM.

 

“Intel Galileo features the Intel Quark SoC X1000, the first product from the Intel Quark technology family of low-power, small-core products,” an Intel representative said. “Intel Quark technology will extend Intel architecture into rapidly growing areas, from the Internet of Things to wearable computing in the future.”

 

The Galileo board, like the Tre, features a pin-out that resembles the Arduino Uno's. All of the pins operate at 3.3 V, however, and a jumper is available to switch the pin voltages to 5 V when needed; since previous and current Arduino shields will be compatible, this feature will be needed when using a 5 V shield. Beyond the Uno's features, the Galileo has a mini PCI Express slot, a 100 Mb Ethernet port, a micro-SD slot, an RS-232 serial port and 8 MB of NAND flash memory.

 

Arduino will now be a major player in the SBC market. Arduino has already made a prominent name for itself in the microcontroller world, and that huge community will likely carry over to its SBC boards. One major advantage the Intel-based board brings is support for x86 architecture. In addition, Intel CEO Brian Krzanich has announced that Intel will donate 50,000 boards to over 1,000 universities worldwide, which should definitely kick things off in the right direction. The Galileo is set to be released in November and to sell for under $60; the Arduino Tre is due sometime in spring 2014, and a price has not yet been announced. Nevertheless, makers and the DIY community now have a plethora of boards to choose from and experiment with. Which one will you choose?

 

C

See more news at:

http://twitter.com/Cabe_Atwell


 

Nine Inch Nails (NIN) like to put on visually stunning shows. For their next tour around the world, they will be using some new tech to improve their visual effects. One of those new technologies is a Microsoft Kinect, which will track the band's movements and project that video onto mobile screens on stage. In addition, standard video cameras and thermal cameras will capture additional footage. The mobile screens will display all types of visuals and give the illusion of the band disappearing and reappearing throughout the show. The production is a collaboration between the band's leader, Trent Reznor, lighting designer Roy Bennett and artistic director Rob Sheridan.

 

Furthermore, NIN is also trying its hand at a different kind of release. Its latest album, Hesitation Marks, has been mastered twice: once in the standard 'loud' mastering and once in an alternate 'audiophile' mastering. The band says the differences will be subtle to most people, but the audiophile version will sound slightly different on high-end equipment and may be preferred by those with an understanding of the mastering process.

 

Mastering Engineer Tom Baker adds, “I believe it was Trent's idea to master the album two different ways, and to my knowledge it has never been done before. The standard version is “loud” and more aggressive and has more of a bite or edge to the sound with a tighter low end. The Audiophile Mastered Version highlights the mixes as they are without compromising the dynamics and low end, and not being concerned about how “loud” the album would be. The goal was to simply allow the mixes to retain the spatial relationship between instruments and the robust, grandiose sound.”

 

One of the band's more interesting moves in the past was crowdsourcing 400 GB of video footage from shows played throughout a tour. The tour, "Lights in the Sky," took place in 2009, and after plans for a concert film fell through, the band decided it would be better to put the footage online for people to mix up on their own. The HD footage was totally unedited and not meant for casual viewing; the idea was that people with spare time and some editing skills would cut together their own versions of a film for fans to watch. As we can see, NIN has some clever moves up its sleeve for keeping fans pleased. It will be interesting to see how the new tour works out with all the visual effects being employed, and it is likely that more bands will take notes and put technology to their own uses.

 

C

See more news at:

http://twitter.com/Cabe_Atwell


Searching for parts: easily one of the most important tasks when creating a new design. Get lazy here, and it will haunt the design later. That importance is paired with the significant amount of time it takes to select just the right parts, which is probably why engineers everywhere fight specification changes once the parts are defined. It's a fine balance to get every part working with the others in concert.

 

Given the importance of the part-selection process, surely there are many tools out there to help discover parts, keep track of candidates and manage the notes one makes while searching. Sadly, the only tools that exist are the search engines provided by the part distributors. These are great for finding parts across a wide array of specs, but each search is merely a way to find information. Some CAD vendors like Cadence and Synopsys have solutions, but they are expensive and generally only accessible to engineers at large companies. Beyond that, taking notes and remembering parts is a job usually left to Excel or a paper notebook. That's a great solution... for the year 1998!


That's where Frame-It steps in. Its team noticed that the wealth of information collected during part selection needed an organized home and a digital notepad, so they created Frame-It: a Google Chrome extension that saves the webpages or documents being browsed while letting the user take notes on the content. Saved content can be named and organized with folders and tags, which makes it easy to find later:

 

Frame-It capture view.

 

Where does the saved data go? Straight to the user's Frame-It account in the cloud, all without disrupting the part-exploration process. When users later say, “Oh! I was just looking at a part last week that might help!” they can refer to their Frame-It workbench and review all of their data:

Frame-It workbench view.

It's one of those products that is so simple, clean, and straightforward that one might wonder why it hasn't been mainstream for years.

 

As with any brand-new product, there are a few things that make it tricky to use. First, if you're taking notes and switch to another tab, the work in progress is erased (I use two separate browser windows to get around this). Also, scrolling through a document can only be done with a mouse scroll wheel rather than the side bar. A few quirks, but certainly easy to live with for now.

 

I have been using it for over a year on a few projects and have found it to be a remarkable tool for capturing information and accessing it later. It is fast, unobtrusive, and builds a wonderful personal repository for future projects. I encourage you to give it a shot!

If you've been in electronics for any time at all, it is easy to see a pattern develop between hardware and firmware/software engineers. As soon as a problem crops up, it is instantly clear to the hardware team that it is the code that needs fixing. Ask the software engineers, however, and they confidently state that the hardware is at fault. When pressed for details, both sides end up saying something like, “Well, I don't know what is wrong, but I know it can't be related to my part of the design because of _______.”

 

How is it that these assumptions about the problem coming from the other side are made time and time again? Much of it probably stems from human nature's inclination to toss any problem over the wall and hope someone else fixes it. This is reinforced by an engineer's confidence in a design they have spent untold hours creating. But another source is likely a lack of understanding of how the other side of the hardware/software divide works.

 

How can we prevent this unproductive back-and-forth? It might help to have anyone involved in electronics (on either the hardware or the software side) take at least one course in microcontroller design, to show the connections (and problems) that occur between hardware and software. Any piece of circuitry eventually needs to be controlled by or communicate with software, and software usually touches the real world at some point. The most remarkable A/D circuit is useless if the communication bus its digital samples must pass over doesn't have the required bandwidth. Similarly, a beautiful chunk of code written to control an RGB LED matrix won't work if the hardware isn't designed to supply the required amount of power. A course that forces the engineer to face problems on both sides can be humbling; a hardware engineer might spend hours troubleshooting his or her code only to find that the motor was connected to the wrong power rail.
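
That bus-bandwidth point is worth a quick back-of-the-envelope check, the kind of calculation such a course drills in. The numbers below (a 12-bit ADC at 10 kSPS streamed over a 115,200-baud UART) are just an illustration, not from any particular design:

```c
#include <stdio.h>

int main(void)
{
    /* Illustrative numbers only. */
    double sample_rate = 10000.0;   /* 10 kSPS from the ADC                */
    double bytes_per_sample = 2.0;  /* a 12-bit result padded to 16 bits   */
    double uart_overhead = 10.0;    /* 8 data bits + start + stop per byte */
    double uart_baud = 115200.0;

    double needed_bps = sample_rate * bytes_per_sample * uart_overhead;
    printf("needed: %.0f bps, available: %.0f bps -> %s\n",
           needed_bps, uart_baud,
           needed_bps <= uart_baud ? "fits" : "does not fit");
    return 0;
}
```

No amount of clever firmware squeezes 200 kbps through a 115.2 kbps link; either the sampling plan or the hardware has to change, and that is exactly the kind of cross-domain tradeoff both camps need to see.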

 

An example of such a course, ECE4760, taught by Prof. Batten at Cornell University, is now available on YouTube and offers a wealth of information that would greatly help anyone designing in the electrical engineering space. It could be replicated at home thanks to the $10 MSP430 Launchpad and the free development tools. And while an outside observer won't benefit from in-person instruction, they can easily engage communities like element14's MSP430 group or TI's E2E forum. The best part is that the course is project-based, giving students an opportunity to learn the difference between 'This should work' and 'This does work.'

 

Of course this is just one suggestion to give each side a glimpse of the other.  After 5 years in the field any lessons of humility will be mostly forgotten.  Have you noticed the “it's their problem” exchange before?  What do you think would help with it?  Please share your thoughts in the comments.  In the end, we're all on the same team!


Clinton Climate Initiative LED streetlight... Los Angeles - now the "City of LED Lights" (via LA Bureau of Street Lighting)

 

 

Street lighting has been around since the ancient Romans and Greeks used it for safety (to keep people from tripping over obstacles) and to keep thieves at bay. Those were of course oil lamps, and they remained in use all over the globe until 1875, when Russian inventor Pavel Yablochkov introduced his 'Yablochkov candle' to the world. His electric lamps were first used in Paris, where 80 of them were deployed to light the Grands Magasins du Louvre department store, which is how the city earned its nickname 'The City of Lights'. Since that time, electric street lights in one form or another have been used to illuminate streets, highways and on/off ramps for over 100 years; this, however, is about to change.

 

LEDs have proven to be much more efficient than traditional lamps when it comes to power use, so it was only natural that the Los Angeles Bureau of Street Lighting use them to replace the city's aging streetlights. The city's Light Emitting Diode Streetlight Retrofit (part of the Clinton Climate Initiative) has replaced over 140,000 streetlights with more efficient LED units over the past four years. As a result of the switch, the city's energy use for street lighting has been reduced by over 62% and carbon emissions have been cut by more than 47,000 metric tons per year. The city's light pollution has also been reduced thanks to the white LEDs, which is earning high praise from the LAPD as well as the Dark Skies Association (the premier authority on light pollution). Los Angeles isn't the only city making the switch to LED street lighting: Las Vegas replaced 42,000 lights back in May of this year, Austin is looking to replace 35,000 of its lights, and San Antonio is set to install 20,000 LED lights, as the trend to go green takes the US by storm.

 

The adoption of LED street lighting has been steadily increasing, from about 3 million units at the beginning of 2012 to a projected 17 million-plus by 2020. That is to be expected, as demand for energy has grown over the last few decades while more countries expand their infrastructure. The benefits of LED lighting are obvious: LA alone has cut its electric bill by $7 million, with an additional $2.5 million saved through reduced maintenance. The city isn't finished replacing its defunct high-pressure sodium fixtures either; it plans to switch out an additional 70,000 units in the second phase of the initiative, which is expected to be completed by 2014. The total budget for the replacement tops out at $56.9 million, which may sound like a lot of money, but the savings that accumulate year after year will pay it back and then save the city money in the long run.
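
A quick payback estimate using only the figures above (and assuming the $7 million and $2.5 million are recurring annual savings, which the article implies but doesn't state outright) shows why the up-front cost is easy to justify:

```c
#include <stdio.h>

int main(void)
{
    /* Figures quoted in the article; savings assumed to recur yearly. */
    double budget_musd = 56.9;
    double energy_savings_musd_per_yr = 7.0;
    double maintenance_savings_musd_per_yr = 2.5;

    double yearly = energy_savings_musd_per_yr + maintenance_savings_musd_per_yr;
    printf("simple payback: %.1f years\n", budget_musd / yearly);   /* ~6 years */
    return 0;
}
```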

 

C

See more news at:

http://twitter.com/Cabe_Atwell
