Needless to say, the ability to photograph barcode-less items in the real world and get instant information on them could be huge, a sort of away-from-a-home-computer Google. What remains to be seen is if Sony can bring it to the masses in a palatable format and, of course, what Google will counteroffer if SmartAR takes off.
Designed and manufactured by Polymer Vision, the screen can be rolled and unrolled 25,000 times. The question, obviously, is why would you need a rollable display? Well, as ereaders become ubiquitous, the need for them to be almost indestructible grows. I could see a day when kids get their own ereaders for the nursery a la the Diamond Age. Interestingly, Polymer Vision isn’t the company of note when you think of e-ink displays, so either they will license this technology or they could start taking more and more market share from leaders like E Ink.
Jeb Corliss is a professional wingsuit pilot and BASE-jumper – so I think the following video pretty much speaks for itself. I don’t know about you, but I needed an extra injection of wonder and awesome today:
ThinkContacts is designed to allow a “Motor disabled person to make a phone call to a desired contact by himself/herself”. Requiring a special headset to read users’ brainwaves, it uses brain activity to determine which of three contacts on the screen the user wants to call.
While the app looks quite basic at present, the project’s wiki at Forum Nokia only opened six days ago, meaning this is likely an early-stage project.
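The selection mechanism described above can be sketched in a few lines. This is purely a hypothetical illustration, not ThinkContacts’ actual code: the headset reading is assumed to arrive as a normalized 0–1 activity level, and the contact names and thresholds are invented.

```python
# Illustrative sketch: map a measured brain-activity level to one of
# three on-screen contacts. Thresholds and contacts are invented.

CONTACTS = ["Alice", "Bob", "Carol"]

def select_contact(activity_level, low=0.33, high=0.66):
    """Pick a contact from a normalized (0..1) brain-activity reading."""
    if activity_level < low:
        return CONTACTS[0]
    if activity_level < high:
        return CONTACTS[1]
    return CONTACTS[2]
```

A real headset would feed this from noisy EEG electrodes, with the bands calibrated per user rather than fixed.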
From textually.org: Swedish software developer The Astonishing Tribe is testing an iPhone application called Recognizr that will enable the user to find the names and numbers of complete strangers.
The user simply has to take a picture of a person and hit the ‘Recognize’ button.
The photo is then compared to shots on social networking sites including Facebook and Twitter before personal information, which can include phone numbers, addresses and email addresses, is sent to the user.
The app works on phones with a camera of five megapixels or higher.
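The pipeline described above (photograph, match against social-network photos, return personal details) can be sketched roughly as follows. Everything here is illustrative: the trivial `embed` stand-in is not a real face-embedding model, and the match threshold is invented.

```python
# Hypothetical sketch of a Recognizr-style pipeline: compute a face
# embedding, compare against embeddings built from social profiles,
# and return the best match's public details if it is close enough.

import math

def embed(face_pixels):
    # Stand-in for a real face-embedding model: a trivial normalized
    # vector, so the example stays self-contained.
    total = sum(face_pixels) or 1
    return [p / total for p in face_pixels]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def identify(photo, profiles, threshold=0.9):
    """profiles: list of (embedding, contact_info) built from social sites."""
    query = embed(photo)
    best = max(profiles, key=lambda p: cosine(query, p[0]))
    return best[1] if cosine(query, best[0]) >= threshold else None
```

The privacy questions the post raises live entirely in where `profiles` comes from: the matching itself is straightforward.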
COULD your cellphone learn to predict what you are going to do before you’ve even started doing it?
Communications engineer Arjen Peddemors thinks so, and along with colleagues at the Technical University of Delft in the Netherlands he has devised a system that learns users’ behaviour patterns to provide them with an enhanced cellphone service. It could, for example, prevent the phone from starting large downloads, such as music tracks or podcasts, when your behaviour suggests you are about to go out of network range.
Such prediction has become possible because smartphones like the Nokia N97 and Apple iPhone contain accelerometers that sense motion. They are normally used to reorient images when the screen is flipped from vertical to horizontal, or by software that responds to a shake of the phone. But Peddemors realised that they also generate a data stream that reflects every move the phone’s owner makes.
Routine events such as going to work are likely always to involve similar sequences of actions: locking the front door, opening the garage, getting in the car, for instance. The Delft system uses telltale sequences and timings like this to create an electronic signature of particular events.
A neural network software app running on the phone is then trained to predict what happens next and act accordingly. So if your regular drive to work takes you through a particular phone cell, the “going to work” signature could trigger the software to negotiate with the cellphone network to ensure that the cell will have the 3G capacity to maintain your streaming music channel as you drive through it.
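The idea of learning telltale event sequences can be illustrated with a much simpler model than a neural network: an order-2 frequency table that predicts the most likely next event given the last two. The event names and training log below are invented for illustration.

```python
# Minimal sketch of sequence prediction: count which event follows each
# pair of observed events, then predict the most frequent successor.

from collections import Counter, defaultdict

class SequencePredictor:
    def __init__(self, order=2):
        self.order = order
        self.model = defaultdict(Counter)

    def train(self, event_log):
        for i in range(len(event_log) - self.order):
            key = tuple(event_log[i:i + self.order])
            self.model[key][event_log[i + self.order]] += 1

    def predict(self, recent_events):
        key = tuple(recent_events[-self.order:])
        counts = self.model.get(key)
        return counts.most_common(1)[0][0] if counts else None

# Train on a few mornings' worth of sensed events.
log = ["lock_door", "open_garage", "start_car", "drive_to_work"] * 3
p = SequencePredictor()
p.train(log)
print(p.predict(["lock_door", "open_garage"]))  # start_car
```

A phone could consult such a prediction before committing to a large download, deferring it whenever the predicted next event is one that historically ends in poor coverage.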
TechCrunch has all the gory details, but this video gives you the gist – the heavyweight that Google now is just entered the Augmented Reality world, with an Android-only (for now) application, Google Goggles:
Today’s addition to the ultimate cyberpunk-future-present kit list:
iKey’s rugged AK-39 keyboard is designed to be worn on the arm, providing a simple, compact data-input solution that does not restrict the user’s arm movements. It is designed to meet MIL-STD-461 standards and is intended for use in very harsh electromagnetic interference (EMI) environments. The AK-39’s small-footprint design features essential components for military and public safety applications, including an integrated Force Sensing Resistor™ (FSR) pointing device with left- and right-click functionality, and adjustable green LED backlighting that is also available in a night vision (NVIS) compatible configuration. Designed with gloved users in mind, the AK-39’s snap-on faceplate eliminates accidental keystrokes and can be easily removed to clean the pad.
Drool. And how we laughed at the cyberpunk cosplayers wearing Power Gloves.
If only I had friends on tour in undisclosed locations in Afghanistan… if only.
This video shows the effect of the high-energy laser beam from the Boeing Advanced Tactical Laser (ATL), fired at a stationary truck from a US Air Force NC-130H (Hercules) flying over White Sands Missile Range, New Mexico, on August 30, 2009. The ATL is a chemical oxygen iodine laser (COIL), and is a scaled-down version of the megawatt-class high-energy laser in the Boeing YAL-1 Airborne Laser (ABL).
Docomo has released a pair of earphones that read your eye movement and convert it into a control signal for your phone, MP3 player, or anything else you can dream up. Electrodes fitted to the padded portion of the headphones detect the direction of vision by measuring muscle movement in the same way as with electrocardiograms (ECG) and electro-oculograms (EOG). Simply by looking up, down, left, and right, users can turn phone switches on and off, as well as adjust volume.
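The look-up/down/left/right control scheme boils down to thresholding two deflection signals. This is a hedged sketch, not Docomo’s implementation: real EOG electrodes produce noisy analog voltages, and here they are assumed to have already been reduced to horizontal and vertical deflections, with made-up thresholds.

```python
# Illustrative mapping from EOG deflections (microvolts) to controls.
# Signal conditioning and threshold values are invented.

def gaze_command(horizontal_uv, vertical_uv, threshold_uv=50):
    """Map eye-movement deflections to a simple media-player action."""
    if vertical_uv > threshold_uv:
        return "volume_up"       # look up
    if vertical_uv < -threshold_uv:
        return "volume_down"     # look down
    if horizontal_uv > threshold_uv:
        return "next_track"      # look right
    if horizontal_uv < -threshold_uv:
        return "previous_track"  # look left
    return "idle"
```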
MIT Professor Missy Cummings (a former F-18 Hornet Navy Pilot), and her team of 30 students and undergrads, have successfully demonstrated how an iPhone could be used to control an Unmanned Aerial Vehicle, or UAV.
As part of their work at MIT’s Humans and Automation Lab (HAL, heh), the team thought about ways to improve on the suitcase-sized controller that soldiers must currently lug around to control hand-thrown Raven UAVs.
The iPhone app they developed sends GPS coordinates to the craft, which in turn can send photos and video back to the iPhone.
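The exchange described above can be sketched as a tiny command protocol. To be clear, this is an invented message format for illustration, not MIT’s actual link: the phone serializes a waypoint, and the UAV side parses it and reports its new target.

```python
# Illustrative waypoint command round-trip (invented message format).

import json

def make_waypoint_message(lat, lon, alt_m):
    """Phone side: serialize a 'go to this GPS coordinate' command."""
    return json.dumps({"cmd": "goto", "lat": lat, "lon": lon, "alt_m": alt_m})

def handle_message(raw):
    """UAV side: parse a command and report the new target."""
    msg = json.loads(raw)
    if msg.get("cmd") == "goto":
        return ("flying_to", msg["lat"], msg["lon"], msg["alt_m"])
    return ("ignored",)

print(handle_message(make_waypoint_message(42.3601, -71.0942, 120)))
# ('flying_to', 42.3601, -71.0942, 120)
```

The appeal over the suitcase-sized controller is exactly this thinness: the phone only needs to send coordinates and display imagery coming back.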
Those 140-character “microblog” posts to Twitter don’t constitute much more than links, dinner recipes, and bitching, right? Be careful with the bitching, though—a property management company in Chicago has filed a lawsuit against a tenant, Amanda Bonnen, who tweeted an off-the-cuff comment about the company. The company, Horizon Group Management, says that the Twitter user in question sent the message maliciously, and is now asking for $50,000 in damages.
There are several reasons why this lawsuit is breaking new ground, not the least of which is its Twitter origin. There is much debate as to whether people’s Twitter streams are more like blogs—which are increasingly being held to the same legal standards as regular media when it comes to defamation—or a giant chat room, where most people presume “anything goes.” It may actually be somewhere in between, but the one problem with trying to hold tweets to a higher journalistic standard is the hard character limitation—it’s difficult to back up your comments within 140 characters (or even within several 140-character tweets), plus links to sources or pictures of evidence.
The other question is: did Horizon make any effort to sort out this issue with Bonnen before filing the lawsuit? It doesn’t seem so, given Bonnen’s immediate deletion of her Twitter account after the lawsuit was filed, but we admittedly don’t know the answer (and Horizon did not respond to our request for comment by publication time). The lawsuit makes no mention of the company making any effort to ensure that Bonnen’s apartment doesn’t have mold or to work with her to address her concerns.
Either way, the company has now managed to position itself as one that a lot of renters and prospective homeowners wouldn’t want to do business with, unlike those that monitor their reputations on Twitter to address customer service issues. Zipcar, Boingo, one of my local pizza places, and even Allstate and Comcast have all swooped in to help out Ars staffers in need after we have aired some complaints. Even if Bonnen really had no mold and Horizon was technically innocent, the bad PR from this move will surely do more damage than Bonnen’s message to 20 of her best Twitter friends.
The ubiquitous barcodes found on product packaging provide information to the scanner at the checkout counter, but that’s about all they do. Now, researchers at the Media Lab have come up with a new kind of very tiny barcode that could provide a variety of useful information to shoppers as they scan the shelves — and could even lead to new devices for classroom presentations, business meetings, videogames or motion-capture systems.
The new system, called Bokode, is based on a new way of encoding visual information, explains Media Lab Associate Professor Ramesh Raskar, who leads the lab’s Camera Culture group. Until now, there have been three approaches to communicating data optically: through ordinary imaging (using two-dimensional space), through temporal variations such as a flashing light or moving image (using the time dimension), or through variations in the wavelength of light (used in fiber-optic systems to provide multiple channels of information simultaneously through a single fiber).
iPhone developers and users excited by the prospect of augmented reality apps, which overlay information and controls on top of real-world objects seen through a camera, have been told to sit tight until the next release of the iPhone OS exits beta.
Although iPhone 3.1 has so far only been known to expose some video camera controls for developers, third-party producer Acrossair was told by Apple that the future release would be needed for its Nearest Tube and future Nearest Subway apps to work properly.
The apps are already highly dependent on the built-in compass and autofocusing camera of the iPhone 3GS, which are needed, respectively, to recognize the direction the iPhone is facing and to get a detailed enough look at a subject to tag it with information. As a demonstration of the technology, Acrossair’s software can show the subway stops visible in a particular direction and their distance relative to the user.
Acrossair’s app looks very cool. If progress continues linearly, we’re really never going to get lost again.
How would you like to be able to point your iPhone at an object – the Eiffel Tower, for example – and instantly see the admission price, opening hours, its height, and other information?
A patent application called ID App does just that; it recognizes an object visually (through the iPhone’s camera), via an RFID reader, or through GPS, and then fetches the data from related databases…in the beginning it’ll probably just take you to a related Wikipedia page.
Another patent focuses on facial recognition… It could bring you info about a person (scary, I know) just by pointing a camera at him; or it could be used for security, enabling only recognized users to use the device.
Do you hear that sound? It’s the Augmented Reality Future knocking on your door…
Ever wondered where your trash goes to die? New Scientist is collaborating with the Massachusetts Institute of Technology in a ground-breaking experiment to electronically tag and follow ordinary trash as it travels from ordinary garbage cans to landfills, recycling plants, and possibly some extraordinary destinations.
The team behind the experiment, MIT’s Senseable City lab, led by Carlo Ratti, has made a device that is about the size of a small matchbox and that works like a cell phone – without the phone bit. A SIM card inside the chip blips out its location every 15 minutes; the signal is picked up by local cell phone antennae, and the chip’s location is relayed back to MIT.
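The tag’s job, as described, is a simple periodic report loop: wake up, estimate position from nearby cell antennae, and relay it home. The function names and reporting format below are assumptions for illustration, not the Senseable City lab’s firmware.

```python
# Rough sketch of one reporting cycle of a trash tag. The position
# source and uplink are passed in as callables so the sketch stays
# testable; a real tag would loop forever, sleeping between cycles.

import time

REPORT_INTERVAL_S = 15 * 60  # one blip every 15 minutes

def report_location(tag_id, get_cell_fix, send_home, now=time.time):
    """One cycle: fix position via cell towers, relay it back."""
    lat, lon = get_cell_fix()
    send_home({"tag": tag_id, "lat": lat, "lon": lon, "t": now()})

# Example wiring with stub sensors (coordinates are Seattle's, made up
# as a plausible fix for one of the tracked items).
sent = []
report_location("cup-042", lambda: (47.6062, -122.3321), sent.append)
print(sent[0]["tag"])  # cup-042
```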
Ratti’s team and New Scientist have already deployed a test run of 50 tracked items of trash ranging from paper cups to computers in Seattle. Several thousand more will be released in Seattle and New York garbage cans later this summer and we’ll chuck a batch into the London trash for good measure.
Swedish software and design company The Astonishing Tribe are currently developing Augmented ID, an augmented reality concept for mobile phones. This utilizes facial recognition software (supplied by Polar Rose) to visualize the digital identities of those around you.
By simply aiming your mobile device at someone, you would be able to access that individual’s pre-selected information through floating icons that would appear around their image. These could contain anything from a phone number and email address to links to their favorite content or social networking platforms.
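The “pre-selected information” aspect is essentially a per-user visibility filter: each person chooses which fields of their profile the floating icons may expose. A minimal sketch, with invented profile data:

```python
# Illustrative visibility filter for an Augmented ID-style profile.
# Field names and values are made up.

PROFILE = {
    "phone": "+46 70 000 0000",
    "email": "user@example.com",
    "twitter": "@someone",
}
SHARED = {"email", "twitter"}  # fields this user chose to expose

def visible_icons(profile, shared_fields):
    """Return only the fields the profile owner opted to share."""
    return {k: v for k, v in profile.items() if k in shared_fields}
```

Putting the filter on the profile owner’s side is what separates this concept from the find-anyone-by-face apps above.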