
Embedded


Memristor circuit.jpg

Memristor Circuit. Researchers at UC Santa Barbara and Stony Brook University successfully built a neural network out of memristors. The prototype successfully recognized small images and may be expanded to develop futuristic computers that simulate the human brain. (via UC Santa Barbara)


A team at the University of California, Santa Barbara and Stony Brook University may finally have cracked how to build memristors into working neural hardware, using classic perceptron technology. Memristor research has been a long time coming, but if the researchers succeed, the devices could help manage computer energy consumption and may eventually lead to thinking computers that mimic human neurons and synapses.

 

Memristors, or memory resistors, are thought to be a crucial component in developing computers that can really “think” like human brains. A human brain builds brand new synapses based on an individual’s need for a particular type of information. A mathematician, for example, would have a structurally different brain than a musician, because the parts of the brain most used become more developed over time. Computer scientists think memristors are the key to allowing computers to work in this way, as they can regulate the flow of electrical energy to various circuits based on which circuits are most frequently used.

 


Concept Blueprint (via UC Santa Barbara & Nature)

 

Although memristors are a common topic of conversation in future computer-building, scientists have struggled to build the neural hardware to house them. The new study published by UC Santa Barbara and Stony Brook University, however, may change that. The team built a 12 x 12 memristive crossbar array that functions as a single perceptron, an early type of neural network often used for pattern recognition and basic information organization. The team programmed the network to decipher things like letters and patterns; together, the micro-hardware functions as a collection of basic synapses.

 

The hardware is built using aluminum and titanium, but manufactured at low temperatures to allow for monolithic three-dimensional integration. This allows the memristor to “remember” the amount and direction of previously applied current for future use, even after the main device has been powered off. This kind of recognition is currently possible using other technology, but it is much more involved. Using memristors means simpler functionality while consuming no standby power.
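To make that “memory” concrete, here is a minimal sketch in Python of a toy linear ion-drift memristor model. The parameters are invented for illustration; this is not the UCSB/Stony Brook device’s actual physics:

# Toy memristor: the conductance state drifts with the charge that passes through it.
# All parameter values are made up for illustration.
class ToyMemristor:
    def __init__(self, r_on=100.0, r_off=16000.0, w=0.5, mobility=1e5):
        self.r_on, self.r_off = r_on, r_off   # fully-on / fully-off resistance (ohms)
        self.w = w                            # internal state, 0..1
        self.mobility = mobility              # state change per coulomb (toy value)

    def resistance(self):
        # Resistance interpolates between r_on (w = 1) and r_off (w = 0)
        return self.r_on * self.w + self.r_off * (1.0 - self.w)

    def apply(self, voltage, dt):
        current = voltage / self.resistance()
        # Signed charge shifts the state, so the sign of past current is "remembered"
        self.w = min(1.0, max(0.0, self.w + self.mobility * current * dt))
        return current

m = ToyMemristor()
for _ in range(100):        # positive pulses gradually lower the resistance...
    m.apply(1.0, 1e-3)
print(m.resistance())       # ...and the new resistance persists with no power applied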

 

In the trial, the memristor model was able to classify 3 x 3-pixel black-and-white patterns into three categories. The model had three outputs, ten inputs and 30 synapses. In the future, the team plans to shrink the device down to 30nm across, in the hope of simulating 100 billion synapses per square centimeter.
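In software terms, that network is tiny: nine pixel inputs plus one bias input, three outputs, and 30 weights, one per memristive synapse. A minimal software stand-in (with random weights instead of the device’s trained conductances) might look like this:

import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 10))       # 30 "synapses": 3 outputs x (9 pixels + 1 bias)

def classify(pattern_3x3):
    x = np.append(np.asarray(pattern_3x3).ravel(), 1.0)   # 9 pixels + bias input
    return int(np.argmax(np.tanh(W @ x)))                 # winning class, 0..2

# A 3x3 black-and-white pattern (a crude vertical bar)
print(classify([[0, 1, 0],
                [0, 1, 0],
                [0, 1, 0]]))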

 

While some argue computers will never have the real processing power of the human brain, others say memristors will still be useful as analog memory devices or as logic components for larger systems. Since they record the energy that passes through them while consuming none at rest, memristors may also be useful for energy management.

 

C

See more news at:

http://twitter.com/Cabe_Atwell

zylotech

LPC4357-EVB

Posted by zylotech May 25, 2015

Hi all,

 

I recently bought an LPC4357-EVB development board.

I don't know much about ARM programming and have no experience with ULINK2, Keil or the LPC4357-EVB.

 

I have downloaded the Examples2\GPIO\Gpio_LedBlinky project.

When I program Gpio_LedBlinky in InFlash mode, all goes well and I can see the LED flashing.

 

But when I remove the ULINK2 from the USB port, the LED stops flashing.

Shouldn't the code be programmed into the LPC4357 chip when selecting "InFlash"?

 

And why can't I program in SPIFI mode? (The flash is full-chip erased before I try flashing in SPIFI mode.)


The DIP switches are set to:

1. Down

2. UP

3. UP

4. UP

I get a pop-up error message: "ERROR: Flash Download failed - "Cortex-M4""

 

In the Keil Build Output window:

Load "C:\\LocalData\\LPC4357-EVB\\Examples2\\GPIO\\Gpio_LedBlinky\\Keil\\SPIFI 64MB Debug\\example.axf"

Erase Done.

Programming Failed!

Error: Flash Download failed  -  "Cortex-M4"

Flash Load finished at 14:32:41

 

Best Regards

 

Martin

Europa squid.jpg

NASA's eel bot may delve into the depths of the moon Europa. NASA recently announced the 15 winners of this year's NIAC funding, $100,000 for each candidate. Among them is a project to develop a robotic eel to explore Jupiter's moon Europa. (via NASA)

 

 

Anyone seen that movie Europa Report? It may have inspired NASA...

 

NASA recently announced the winners of its annual NASA Innovative Advanced Concepts (NIAC) program. There are 15 winners in total, all with far-out ideas (pun intended) about making science fiction a reality. NASA is hoping these highly innovative, and slightly crazy, ideas will lead to advances that extend its ability to delve further into space.

 

One crazy idea that just might work is NIAC 2015 winner Mason Peck's proposal to design a robotic eel that can explore the depths of Europa, one of Jupiter's many moons. The idea is highly innovative and calls for the invention of new technologies, including new power systems.

 

A mock-up of the robot design is seen above. It would be a soft-bodied robot that could swim and explore the aquatic depths of Europa. Peck describes the robot as more of a squid than the eel NASA calls it. The science behind it is pretty inspiring. The body of the eel/squid would have ‘tentacle’ structures that allow it to harvest power from changing electromagnetic fields. That energy would power the rover’s subsystems, one of which allows it to expand and change shape to propel itself in water and on land. It would do this by electrolysis of water, creating H2 and O2 gas that would be harvested for expansion and combusted internally to act as a propulsion system. To learn more about the other 14 winners who scored $100,000 to develop technology like this, see NASA’s extensive report.
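For a rough sense of scale, Faraday’s law pins down how much gas a given amount of harvested charge can produce. The numbers below are my own back-of-envelope assumptions, not figures from the NIAC proposal:

# Back-of-envelope: gas produced by electrolysis of water per unit of charge.
# All figures are illustrative assumptions, not from the NIAC study.
F = 96485.0                         # Faraday constant, C/mol
charge = 1000.0                     # assumed harvested charge, in coulombs
mol_h2 = charge / (2 * F)           # 2 electrons per H2 molecule
mol_o2 = charge / (4 * F)           # 4 electrons per O2 molecule
litres = (mol_h2 + mol_o2) * 22.4   # ideal-gas molar volume at STP, L/mol
print(f"~{litres:.2f} L of gas at STP")   # ~0.17 L: propulsion gas accumulates slowly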

 

C

See more news at:

http://twitter.com/Cabe_Atwell

CVD-Graphene-640x354.jpg

Chalmers University of Technology researchers have found that large area graphene helps prolong the spin of electrons over longer periods of time (via Chalmers)


Chances are you own a smartphone, tablet or PC/laptop that features some form of solid-state storage, typically RAM, a flash drive or an SSD. Those devices are faster than their mechanical counterparts, and new findings by researchers from Sweden’s Chalmers University of Technology could make the technology even faster and more energy efficient through the use of graphene.

 

Specifically, they found that large-area graphene is able to prolong the spin of electrons over a longer period of time than ferrous metals can. Spintronics deals with the intrinsic spin of electrons and its associated magnetic moment, the torque an electron experiences when an external magnetic field is applied. As mentioned above, there are already spintronic devices on the market; however, they use ferrous metals as their base platform. It’s the impurities in those metals that hold spintronics back from becoming a mainstream component in today’s electronic circuitry, limiting the size of the components themselves.

 

This is where graphene comes into play: the material extends the reach of spintronics from nanometers to millimeters, making the spin of those electrons last longer and travel farther than ever before. So why is that good? Data (in the form of 1s and 0s) can be encoded in those electrons as spin-up and spin-down, rather than relying on switching the electrical state of traditional circuits off and on. The problem with the traditional approach is that as process nodes become smaller, electrical ‘bleed’ across transistors in the off state increases, preventing us from building transistors that consume less power.

 

Using graphene as the substrate for spintronics allows the electrons to maintain their spin alignment for up to 1.2 nanoseconds and transmit the information they carry over channels up to 16 micrometers long without degradation. Of course, progress doesn’t come without its problems; in this case, it’s the graphene itself, or rather its manufacturing process. Producing large sheets of the one-atom-thick substance is still an issue for manufacturers, and the sheets that are produced usually have defects, wrinkles and roughness, which can negatively affect the electrons’ spin lifetime and decay.
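Those two numbers can be sanity-checked with the usual diffusion relation, relaxation length = sqrt(D x tau). The diffusion constant below is an assumed, typical value for CVD graphene, not one taken from the Chalmers paper:

import math

D = 0.02        # assumed spin diffusion constant for graphene, m^2/s
tau = 1.2e-9    # reported spin lifetime, s
lam = math.sqrt(D * tau)                          # spin relaxation length
print(f"relaxation length ~ {lam * 1e6:.1f} um")  # ~4.9 um
# In this toy estimate the spin signal decays roughly as exp(-L/lambda) over a
# channel of length L, so longer lifetimes buy longer usable channels.
print(f"signal left after 16 um ~ {math.exp(-16e-6 / lam):.3f}")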

 

The researchers, however, have found the CVD (chemical vapor deposition) method promising, and the team hopes to capitalize on it to produce a logic component in the short term, with the long-term goal of producing graphene/spintronic-based components that surpass today’s solid-state devices in both speed and energy efficiency.

 

See more news at:

http://twitter.com/Cabe_Atwell

microchip ceo.jpg

Microchip CEO Steve Sanghi (via Microchip)


Microchip Technology, Inc. is celebrating this week, as it was just named the number one provider of 8-bit microcontrollers (MCUs) globally. The title was awarded in the 2014 edition of Gartner’s annual ranking.

 

Microchip Technology is an innovation giant specializing in mixed-signal, Flash-IP and analog solutions. It has long been a leader in the microcontroller industry, and although the powerhouse is celebrating its reclaimed top spot in 8-bit MCUs, it is a leading provider of 16-bit and 32-bit MCUs as well.

 

Microchip is committed to growing its MCU technologies in all markets, including its 8-bit, 16-bit and 32-bit product lines, and its dedication and commitment to excellence is paying off. The technology innovator was ranked the fastest-growing MCU supplier among the top 10 providers in 2014, with a growth rate charted as double that of its competitors. With this, the company was also named one of the top 10 providers of 32-bit MCUs for the first time ever. While its stats across the MCU industry are impressive, what’s most striking is that Microchip closed a 41% revenue deficit to reclaim the top spot from Renesas.

 

Renesas is the company that resulted from the merger of NEC, Hitachi and Mitsubishi’s semiconductor businesses. These were the leading semiconductor companies of Japan, and when they merged, Microchip was knocked out of the top spot for 8-bit MCUs in 2010. At the time, Renesas’ business was 41% larger than Microchip’s, but Microchip worked tirelessly each year and finally won back the lead with a 10.5% advantage over the Japanese supplier in 2014.

 

MCUs are used in a number of different products, including watches, mobile phones and many digital household electronics. The need for MCUs is increasing as the consumer market and global technologies shift toward digitization. Internet of Things devices, “smart” household products and other digital devices will all rely on MCUs for their processing power as the demand for technologically advanced goods continues to rise, which is good news for Microchip.

 

Microchip offers a wide range of MCU products in its portfolio, including MCUs for analog peripherals, core independent peripherals, low-power products and more. If you’re interested in Microchip products, you can find a complete list of their solutions on their website.

 

C

See more news at:

http://twitter.com/Cabe_Atwell

DAB

TI MSP432 Webinar.

Posted by DAB Apr 30, 2015

Hi All,

 

I just saw the official TI Webinar on the new MSP432 processor.

 

The $13 LaunchPad is very impressive, but the new features of the TI software are awesome.

 

They evidently spent some time looking at the excellent Cypress Semiconductor software and have upgraded CCS with a lot of very nice user features and simplified controls.

 

Definitely worth looking at.

 

DAB

The world is getting smarter. We are surrounded by talk about smartphones, smartwatches and even smartfridges – all components of the much-heralded internet of things (IoT).

While these devices all incorporate sensors and processors to interpret and display data, there is a pressing need to store the data too.

Smart devices typically store data in NAND Flash memory chips, and the price per bit of these chips has fallen dramatically. This can be ascribed in a large part to the dramatic increases in memory density that have reduced the amount of silicon needed to store individual "bits" of data.

- You can read the rest of this article on the Toshiba innovation section: http://toshiba.semicon-storage.com/eu/design-support/innovationcentre/tcm0048_eMMC.html

DAB

My Pi is Alive!

Posted by DAB Apr 11, 2015

After watching everyone else explore the Raspberry Pi, I finally took the plunge with the RPi 2.

 

I finally got all the pieces in place, plugged it in and about 15 min later, my RPi 2 was alive and well.

 

My only complaint was the 6 point type used for the little guide included in the box.

 

Luckily I bought the camera kit, and it came with a full-sized guide so I could actually read the text.

 

Next step is to hook up the camera and wifi.

 

I have no idea how long these actions will take, but I will give you another post documenting my experience.

 

Meanwhile may all your Pi's be good.

 

DAB

Demand from consumer and mobile markets, automotive and industrial sectors, and emerging Internet of Things (IoT) applications is driving Flash storage technology to aggressively move to smaller and smaller process nodes.

 

 

Unfortunately, many chipsets and NAND host controllers are unable to perform error correction beyond 1-bit or 4-bit ECC, and the cost of updating them often makes moving to later process nodes prohibitive.
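The “1-bit ECC” such controllers implement is classically a Hamming code computed over each page. A toy single-error-correcting Hamming(12,8) codec over one byte shows the idea; real NAND controllers work over much larger sectors, and this layout is illustrative only:

# Toy Hamming(12,8): encodes one byte so any single bit-flip can be corrected.
DATA_POS = [3, 5, 6, 7, 9, 10, 11, 12]     # non-power-of-two positions hold data

def encode(byte):
    code = [0] * 13                        # positions 1..12; index 0 unused
    for i, pos in enumerate(DATA_POS):
        code[pos] = (byte >> i) & 1
    for p in (1, 2, 4, 8):                 # parity bit p covers positions with bit p set
        code[p] = sum(code[i] for i in range(1, 13) if i & p) % 2
    return code

def decode(code):
    syndrome = 0
    for p in (1, 2, 4, 8):
        if sum(code[i] for i in range(1, 13) if i & p) % 2:
            syndrome |= p
    if syndrome:                           # the syndrome is the flipped bit's position
        code[syndrome] ^= 1
    return sum(code[pos] << i for i, pos in enumerate(DATA_POS))

code = encode(0xA5)
code[6] ^= 1                               # simulate one worn flash cell flipping
assert decode(code) == 0xA5                # 1-bit ECC recovers the original byte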

 

 

Single-Level Cell NAND Flash (SLC NAND), which stores one bit per cell and can endure around 60,000 write/erase cycles, is currently the most widely used Flash technology for these applications.

 

 

Extending product lifetimes with new NAND flash technologies

Built-in ECC NAND (BENAND™) is a new type of SLC NAND Flash that has an embedded ECC function capable of offloading the burden of ECC from the host processor.

 

Toshiba’s engineers have enabled a number of customers to integrate BENAND into both existing and new designs, delivering the benefits of migrating to the latest device technology while helping avoid the high costs associated with significant system redesign or long-term use of legacy technology.

 

In one case, Toshiba helped a customer achieve a cost-effective lifetime extension for a Bluetooth® hands-free product aimed at the automotive aftermarket. The original design used SLC NAND with 1-bit ECC to store the boot code, OS image, application code, application parameters and user data. However, the availability of that SLC NAND was becoming uncertain.

 

To compound matters, the original custom SoC processor was unable to meet the increased ECC demands of more recent, more cost-effective SLC NAND generations.

 

Rather than redesign the device and upgrade the processor, or engage a longevity support program to guarantee the supply of the original NAND despite the increased cost, the OEM turned to Toshiba’s BENAND.

 

This solution avoided any need to change the design of the device, while enabling the OEM to enjoy the cost benefits of the more efficient 24nm BENAND chips.

 

Pulled from the Innovation centre by Toshiba. You can read the full article below.

 

Extending product lifetimes with new NAND flash technologies | Innovation Centre | TOSHIBA Semiconductor & Storage Produ…

NAND.png

 

Flash is the storage technology used inside the thinnest, lightest laptops and nearly every cellphone, tablet and mobile device. With users of these devices constantly demanding increased functionality, the amount of NAND flash memory needed has grown accordingly. Traditional planar NAND flash memory, however, is nearing its practical scaling limits, posing significant challenges for the memory industry.


Happily, once again technology is coming to the rescue. Last week, coincidentally on the same day and in separate announcements, Micron/Intel and Toshiba/SanDisk announced the availability of flash cells that are vertically stacked in multiple layers, known as 3D NAND technology. Products using 3D NAND are expected to keep flash storage solutions on track for continued performance gains and cost savings, driving more widespread use of flash storage. This is important because solid-state drives (SSDs) employing flash have had a significant impact on computing, but although prices have dropped, their capacities still lag far behind those of traditional magnetic hard drives.


The 3D NAND technology jointly developed by Intel and Micron (who have partnered on NAND flash since the formation of their joint venture in 2006) stacks 32 layers of data-storage cells vertically. It uses floating-gate cells, a universally utilized design refined through years of high-volume planar flash manufacturing, and enables what the companies say is the highest-density flash device ever developed, with three times higher capacity than other NAND dies in production. The immediate result will be gum-stick-sized SSDs with more than 3.5 terabytes (TB) of storage and standard 2.5-inch SSDs with greater than 10TB capacity.
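The die arithmetic behind those headline capacities is simple. A rough illustration follows; real SSDs add controllers, overprovisioning and packaging constraints, so the die counts are only indicative:

# Rough capacity arithmetic for the announced dies (illustrative only).
mlc_gb = 256 / 8      # 256Gb MLC die -> 32 GB per die
tlc_gb = 384 / 8      # 384Gb TLC die -> 48 GB per die
print(f"3.5 TB gum-stick SSD: ~{3500 / tlc_gb:.0f} TLC or ~{3500 / mlc_gb:.0f} MLC dies")
print(f"10 TB 2.5-inch SSD: ~{10000 / tlc_gb:.0f} TLC dies")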

Because capacity is achieved by stacking cells vertically, the individual cell dimensions can be considerably larger. This is expected to increase both performance and endurance and make the technology well suited for data-center storage. What’s more, in the Intel/Micron design, new sleep modes enable low-power use by cutting power to inactive NAND dies (even when other dies in the same package are active), dropping power consumption significantly in standby mode.


The 256Gb multilevel cell version of 3D NAND is sampling today with select partners, and the 384Gb triple-level cell design will begin sampling later this spring.


Toshiba's 3D NAND structure (which will also appear under the SanDisk label, since the two have a NAND joint venture) is called BiCS, for Bit Cost Scaling. Toshiba’s new flash memory stores two bits of data per transistor, meaning it's a multi-level cell (MLC) flash chip, and it can store 128Gbits (16GB) per chip. Toshiba said its 48-layer stacking process enhances the reliability of write/erase endurance, boosts write speed, and is suited for use in diverse applications, primarily solid-state drives (SSDs). Sample shipments of products using the new process technology began last Thursday, and Toshiba is preparing for mass production in its new Fab2 at Yokkaichi Operations.


Toshiba.jpg

For its part, last year Samsung became the first company to announce it was mass-producing 3D flash chips, which it calls V-NAND. Samsung’s chips stack 32 layers of transistors and cram in 3 bits per transistor, in what the industry refers to as triple-level cell (TLC) NAND. Because Samsung uses TLC memory, its chips are said to store as much as Toshiba's 48-layer 3D NAND: 128Gbits, or 16GB.


Going forward, these and subsequent 3D NAND announcements could give SSDs the density to eclipse hard drives as the primary storage medium in devices meeting most people’s needs.

silicene_Fig1a2.jpg

Silicene Structure concept art (via UT at Austin)

 

While some researchers are hard at work trying to achieve quantum computing on a chip, scientists from the University of Texas at Austin’s Cockrell School are busy making history. The research team recently created a transistor made from silicene, a one-atom-thick sheet of silicon, which may revolutionize computer chips.

 

There had been talk about the development of silicene, but it had never been constructed until recently. Deji Akinwande, an assistant professor in the Department of Electrical and Computer Engineering, and lead researcher Li Tao successfully built the first-ever silicene device last month. The team looked to current graphene-based chip development for guidance but discovered a major issue at the outset: silicene is sensitive to air.

 

To circumvent this issue, Akinwande and Tao worked with Alessandro Molle of the Institute for Microelectronics and Microsystems in Agrate Brianza, Italy, to construct the delicate material in an airtight space. The team formed a thin silicene sheet by condensing silicon vapor onto a crystalline silver block in a vacuum chamber. Once the sheet had formed on its thin silver base, it was covered with a layer of alumina one nanometer thick. Protected this way, the silicene sheet could be peeled off the base and moved to an oxidized-silicon substrate. The result was a functional silicene transistor joining two metal groups of electrodes.

 

The transistor was only functional for a few minutes before crumbling due to its instability in air. While the transistor’s capabilities were rather basic, the UT team successfully fabricated silicene devices for the first time ever through low-temperature manufacturing. Since silicon is a common base for computer chips, the researchers are confident the technology could be adopted relatively easily to make faster, low-energy digital chips.

 

The team of scientists plans to continue its research to develop a more stable silicene transistor. A super-thin silicene transistor could greatly enhance the speed of computing, but it isn’t without competition. Graphene-based transistors have been under development for quite some time and may also answer the question of how to enhance computing capabilities. Both technologies, however, may fail to surpass the potential power of the newest quantum chip from the Università degli Studi di Pavia in Italy. That chip features entanglement capabilities, potentially allowing an entire network to function as one unit. The new technology may also make cyber threats a thing of the past.

 

At present, emerging chip technologies are all still in need of further development before they are ready to hit the market. No one knows which technology will prevail, but it certainly is exciting.

 

The Cockrell School’s Southwest Academy of Nanoelectronics, the U.S. Army Research Laboratory’s Army Research Office and the European Commission’s Future and Emerging Technologies Programme funded the University of Texas at Austin-based project.

 

C

See more news at:

http://twitter.com/Cabe_Atwell

photon-entanglement-ring-resonator.jpg

Photon Entanglement Ring Resonator visualization (via Davide Grassani, Stefano Azzini, Marco Liscidini, Matteo Galli, Michael J. Strain, Marc Sorel, J. E. Sipe, and Daniele Bajoni)


As IBM readies its brain-like computer-on-a-chip for mass production, the Università degli Studi di Pavia in Italy is making history: it just built the very first chip capable of entangling individual light particles. The new technology may inspire a host of novel computing innovations and quite possibly put an end to cyber threats as we know them.

 

Entanglement is an essential quantum effect that links two particles, regardless of distance. This means that anything done to one particle will be instantaneously reflected in the other, even if it is at the other end of the universe. The entanglement of photons isn’t a new technology, but researchers at the Università degli Studi di Pavia, including paper co-author Daniele Bajoni, made history by successfully scaling the technology down to fit on a chip.
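The correlation at the heart of entanglement is easy to write down. The snippet below constructs the textbook two-photon Bell state and prints its measurement statistics; it illustrates the general effect, not the Pavia chip itself:

import numpy as np

# Bell state |Phi+> = (|00> + |11>) / sqrt(2): a maximally entangled pair.
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

probs = np.abs(phi_plus) ** 2      # Born rule: outcome probabilities
for outcome, p in zip(("00", "01", "10", "11"), probs):
    print(outcome, float(p))
# 00 and 11 each occur with probability 0.5; 01 and 10 never occur.
# Measuring one photon therefore fixes the other's outcome exactly.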

 

Researchers have been trying to scale down entanglement technology for years. Typically, the effect is harnessed through specialized crystals, but even the smallest set-up was still a few millimeters thick. Bajoni and his team took a different approach and instead built what are called micro-ring resonators onto an ordinary silicon chip: coils embedded in the silicon wafer that capture and re-release photons. The design achieves successful entanglement at an unparalleled width of 20 microns, or one-tenth the thickness of a strand of human hair.

 

The technology has huge implications for computing, as entanglement can exponentially increase computing power and speed. Computing communication could become instantaneous, as could other communication technologies. Tweeting at the speed of light, anyone? While these possibilities for advancing computing are impressive, the biggest impact may be in inhibiting cyber threats.

In entanglement, particles act as one cohesive unit. Hackers operate by identifying weaknesses in computer and information systems and exploiting them. If computing and information systems operated as one cohesive unit, there would be no weak point through which a hacker could breach the system, thus eliminating cyber threats. Sorry, Dshell analysts.

 

The new quantum chip could ultimately enable systems vastly more powerful than even the most cutting-edge supercomputers around today. It has the potential to revolutionize communication, computing and cybersecurity by enabling the adoption of quantum technologies such as quantum cryptography and quantum information processing. When we can expect to see this technology reign supreme, however, is another subject entirely.

 

Bajoni believes the technology is the bridge through which innovative technologies can begin harnessing quantum power on a small scale, but others disagree. Some believe ring resonators must first be produced at the nanoscale to compete with up-and-coming nano-processors. Only time will tell, but our bet is that cybersecurity stakeholders, at the least, will begin looking into the chip’s development. Until quantum mobile communication is available, however, you’ll just have to upload your social media photos like everybody else, 3-4GB at a time.

 

C

See more news at:

http://twitter.com/Cabe_Atwell

bb0.png

PowerBar installed (via Andice Labs)

 

If you've ever thought of designing a BeagleBone-based vigilante robot that fights crime in the rural Mojave Desert using only battery power, now you can, with Andice Labs' PowerBar. The PowerBar was designed exclusively for the BeagleBone open-hardware computer and enables it to run fully on DC, or battery, power. Portability is inspiring.

 

bb1.png

PowerBar attached to BeagleBone (via Andice Labs)

 

The PowerBar is a "micro cape" power supply that provides the low-power BeagleBone single-board computer (SBC) with enough energy to run from anywhere, even outer space (cue Twilight Zone theme song). The battery pack supplies 5V to the computer and even offers 15V over-voltage protection and reverse-voltage protection to guard against surges. It's a simple power pack that works for both the BeagleBone White and Black.

 

bb2.jpg

BeagleBone White (via BeagleBoard)

 

BeagleBoard's BeagleBone is a Linux-based single-board computer that can run Android and Ubuntu. The White version comes equipped with an AM335x 720MHz ARM processor, 256MB of DDR2 RAM, a 3D graphics chip, an ARM Cortex-M3 and two PRU 32-bit RISC CPUs. The BeagleBone Black was made with developers in mind and features roughly double the power, with 512MB of DDR3 RAM, 4GB of 8-bit onboard eMMC flash memory and a NEON floating-point accelerator. Both computers offer USB, Ethernet and HDMI connectivity, and the Black also runs the Cloud9 IDE and Debian. What makes the BeagleBone unique is its open-hardware design.

 

bb3.jpg

BeagleBone Black (via BeagleBone)

 

Open-hardware designs take open source to a whole new level. Not only are the software platforms completely open to developers, but the designs are too. That means you can buy a BeagleBone Black, or you can go directly to the BeagleBoard website and find the instructions for building your very own. Open hardware is developed for the love of innovation and for raising up the next generation of tinkerers. My only critique of this cape is that I could do the same with an external cell-phone battery pack; there are countless battery bricks out there, too.

 

The development of the PowerBar now allows us to take our innovations on the go. Remote locations all over the world can now gain access to the unscripted power of the BeagleBone. If you take the lead from one tinkerer, you can even power your very own brewery using the mini computer. The pirates of the Mojave Desert would raise a glass to that.

cpulse.jpg

The cPulse is seen in action being used as a home rave device (via Codlight)


The French company Codlight Inc. is currently seeking funding on Kickstarter to produce one of the first fully customizable LED smartphone cases. While the prospect of becoming a walking, breathing billboard advertisement doesn't particularly appeal to me, I must give Codlight credit for the multitude of features and uses it offers.

 

The company certainly left no stone unturned when programming the cPulse smartphone case for a variety of uses. The cPulse LED case can act as everything from a notification banner to a homemade rave device to a form of light therapy. It can even mimic a good old-fashioned analog clock radio.

 

The cPulse uses a panel of 128 high-efficiency LEDs powered by the smartphone's battery and controlled by a custom app that lets the user specify different commands, modes and notifications, and create customizable light shows set to music.

These light displays sap battery power at a rate of about 7% per hour, so you may want to have quarters on hand if you need to call someone on short notice. Remember payphones?
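Taken at face value, that drain rate puts a hard ceiling on the light show. A simple estimate, ignoring the phone's own baseline consumption:

# Runtime estimate from the quoted ~7%/hour drain (ignores the phone's own load).
drain_per_hour = 7.0
print(f"~{100 / drain_per_hour:.1f} hours of LEDs from a full charge")   # ~14.3 h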

 

The LED light panel and the smartphone case are 3D printed by Sketchfab and Sculpteo. Kickstarter backers who pledge at least $79 to the Codlight initiative will receive a kit that allows them to 3D print their very own cPulse case. Donors who are a bit more generous, pledging at least $89, will receive a fully functioning cPulse case delivered to their home.

 

At the moment, the case is made specifically for smartphones running Android 4.4; however, if the project gets off the ground, its easy customization could allow anyone to own a cPulse.

 

I must say, I am still pretty impressed by the functionality of this device, even though it is entirely unnecessary and a product of a culture of consumption and excess.

 

For now, Codlight is asking for no paltry sum, with a pledge goal of $350,000. They are currently nowhere near the goal but still have about a month left to raise over a quarter of a million dollars.

 

If you are obsessed with bright, shiny objects and want to blind and dazzle those around you, you can get your very own cPulse from Kickstarter.



C

See more news at:

http://twitter.com/Cabe_Atwell

onbeep.png

A real-life Star Trek communicator for $99 (via OnBeep)


OnBeep is a San Francisco start-up that recently unveiled its Onyx communicator to technocrats in New York, sparking buzz. OnBeep is only one year old, but it raised $6.25 million in early 2014 to develop the Onyx: a device that lets you communicate with groups of people at the touch of a button.

 

The working, finished product was only unveiled early last month, but Business Insider, CNN, Forbes and Wired already have something to say about it. The device is meant to be worn on any type of clothing, handbag or belt, or even carried in your pocket. The ease of talking at the push of a button was inspired by Star Trek, so your LARPing adventures can certainly be fortified by this device.

 

In practice, the Onyx seems like an expensive, stylish speakerphone in the style of a walkie-talkie; in terms of hardware and design, that's basically exactly what it is. But co-founder Jesse Robbins notes that it does more: it allows a group of people to work together and stay focused on the task at hand. Both Robbins and OnBeep CTO Greg Albrecht have experience in emergency situations as firefighters and EMTs. Hence, the Onyx really makes sense when you need to communicate in real time with a group of colleagues and can't afford to waste time messing around with a phone.

 

The cool thing about the Onyx is that, in theory, it allows you to collaborate with anyone around the world. For now, radio-frequency regulations mean people outside the US can't technically buy the Onyx, but considering the amount of funding OnBeep has raised, it seems like only a matter of time before the Onyx is available everywhere. The device can currently be pre-ordered for expected release in December 2014. The current cost of the Onyx is $99, which seems a bit steep for an extension of your smartphone, but I can see how it could be super helpful depending on your job environment.

 

I can certainly see businesses adopting this technology as a new part of team management, cutting the time and space between employees. Perhaps this is why so many business gurus are interested in the technology: it enables people to work together, in real time, outside of boring meetings.

 

The Onyx works by using Bluetooth to sync with your smartphone. To take advantage of the Onyx's capabilities, you must download the OnBeep smartphone app, currently available for iPhone and Android. The Onyx then uses your phone's wireless data/WiFi connection to reach your networks and stay connected. The app allows you to manage your groups, see who's available, and see where every member of your team is located, if you're worried that Tom forgot the dip, for instance.

 

You can talk to up to 15 people at once with the Onyx, and you can create as many groups as you like. The platform works regardless of network carrier; however, it is only compatible with iPhone and Android at the moment.

 

C

See more news at:

http://twitter.com/Cabe_Atwell
