We see things differently now

Posted on May 15th, 2012

This photo, taken with Google’s HUD+ prototype Project Glass, has been circulating around the net over the past few days. It keeps coming back to mind, mostly when I’m at the park trying to get that shot of my dog being super cute to post on hipstergram.

The shorthand for HUDs has generally been “Terminator vision”, but this photo powerfully shows that their most compelling use is just the opposite.

Remember when we used to call buildings to see if people were in them? Remember when we sent taps along wires to talk to our distant loved ones? Remember when we covered fires with blankets to send signals? We see things differently now.


Google rumored to bring HUDs to market this year

Posted on February 21st, 2012

From NYTimes:

According to several Google employees familiar with the project who asked not to be named, the glasses will go on sale to the public by the end of the year. These people said they are expected “to cost around the price of current smartphones,” or $250 to $600.

The people familiar with the Google glasses said they would be Android-based, and will include a small screen that will sit a few inches from someone’s eye. They will also have a 3G or 4G data connection and a number of sensors including motion and GPS.

Seth Weintraub, a blogger for 9 to 5 Google, who first wrote about the glasses project in December, and then discovered more information about them this month, also said the glasses would be Android-based and cited a source that described their look as that of a pair of Oakley Thumps.

They will also have a unique navigation system. “The navigation system currently used is a head tilting to scroll and click,” Mr. Weintraub wrote this month. “We are told it is very quick to learn and once the user is adept at navigation, it becomes second nature and almost indistinguishable to outside users.”

The glasses will have a low-resolution built-in camera that will be able to monitor the world in real time and overlay information about locations, surrounding buildings and friends who might be nearby, according to the Google employees. The glasses are not designed to be worn constantly — although Google expects some of the nerdiest users will wear them a lot — but will be more like smartphones, used when needed.

Internally, the Google X team has been actively discussing the privacy implications of the glasses and the company wants to ensure that people know if they are being recorded by someone wearing a pair of glasses with a built-in camera.

One Google employee said the glasses would tap into a number of Google software products that are currently available and in use today, but will display the information in an augmented reality view, rather than as a Web browser page like those that people see on smartphones.

The glasses will send data to the cloud and then use things like Google Latitude to share location, Google Goggles to search images and figure out what is being looked at, and Google Maps to show other things nearby, the Google employee said. “You will be able to check in to locations with your friends through the glasses,” they added.
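
That head-tilt navigation scheme is easy to picture in code. Here’s a minimal sketch of a “tilt to scroll, nod to click” loop – to be clear, every threshold, gain and gesture choice below is my own invention for illustration, not anything Google has described:

```python
import math

SCROLL_DEADZONE_DEG = 8.0   # ignore small, involuntary head movements
SCROLL_GAIN = 0.5           # items scrolled per degree beyond the deadzone
NOD_CLICK_DEG = 25.0        # a sharp forward nod past this pitch = "click"

def pitch_from_accel(ax, ay, az):
    """Estimate head pitch in degrees from a 3-axis accelerometer at rest."""
    return math.degrees(math.atan2(ax, math.hypot(ay, az)))

def step(pitch_deg, cursor, n_items):
    """Map one pitch reading to a new cursor position and an optional event."""
    if pitch_deg > NOD_CLICK_DEG:
        return cursor, "click"
    if abs(pitch_deg) > SCROLL_DEADZONE_DEG:
        excess = pitch_deg - math.copysign(SCROLL_DEADZONE_DEG, pitch_deg)
        cursor = min(n_items - 1, max(0, cursor + SCROLL_GAIN * excess))
    return cursor, None

cursor = 0
for reading in (2.0, 12.0, 15.0, 30.0):   # fake pitch readings, in degrees
    cursor, event = step(reading, cursor, n_items=10)
    print(f"pitch {reading:5.1f} deg -> cursor {cursor:.1f}, event: {event}")
```

The deadzone is the part that would make or break it in practice – it’s what lets the scheme stay “almost indistinguishable to outside users”.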

thanks to thedaniel for the tip-off!


FuturePresent News Special – 1-11-11

Posted on November 1st, 2011

Here’s your menu for today’s FuturePresent news round-up:

  • MSFT’s “Productivity Vision 2011” video:
    [embedded video]
    via GeekWire, who give this nice description:

    As the new video opens, special eyeglasses translate audio into English in real-time for a business traveler in Johannesburg. A thin screen on a car window highlights a passing building to show where her meeting will be the next day, based on information from her calendar. Office workers gesture effortlessly to control and reroute text and charts as the screens around them morph and pulse with new information.

    And on and on from there, making our modern-day digital breakthroughs seem like mere baby steps on the road to a far more spectacular future.

    Now I want my fucking spex as much as the next cyberpunk, BUT… actual world problems solved here? ZERO. When the current estimate is that 80 million new jobs need to be created to replace the ones lost during this recent period of disaster capitalism, building a shinier operating system hardly seems likely to help.

  • In better cyberpunky news, from the very same Microsoft, there’s OMNITOUCH:
    [embedded video]
    via Design Taxi, who give us this succinct description:

    OmniTouch is a depth-sensing projection system worn on the shoulder.

    With the system, hands, legs, arms, walls, books and tabletops become interactive touch-screen surfaces—without any need for calibration.

    If only they didn’t look so terrible. Get ya mod on there, future-dwellers!

  • It may have over 5 million views, but let’s take a look at the QUANTUM LEVITATION video again:
    [embedded video]
    via Gizmodo. Advances in basic science and engineering – now we’re talking!

  • If you like SCIENCE!, you’ll love simulated pocket universes:

    Whether life exists elsewhere in our universe is a longstanding mystery. But for some scientists, there’s another interesting question: could there be life in a universe significantly different from our own?

    Some physicists have theorized that only universes in which the laws of physics are “just so” could support life, and that if things were even a little bit different from our world, intelligent life would be impossible. In that case, our physical laws might be explained “anthropically,” meaning that they are as they are because if they were otherwise, no one would be around to notice them.

    MIT physics professor Robert Jaffe and his collaborators felt that this proposed anthropic explanation should be subjected to more careful scrutiny, so they decided to explore whether universes with different physical laws could support life. Unlike most other studies, in which varying only one constant usually produces an inhospitable universe, they examined more than one constant.

    Some of these universes would collapse instants after forming; in others, the forces between particles would be so weak they could not give rise to atoms or molecules. However, if conditions were suitable, matter would coalesce into galaxies and planets, and if the right elements were present in those worlds, intelligent life could evolve.

    In work recently featured in a cover story in Scientific American, Jaffe, former MIT postdoc Alejandro Jenkins, and recent MIT graduate Itamar Kimchi showed that universes quite different from ours still have elements similar to carbon, hydrogen, and oxygen, and could therefore evolve life forms quite similar to us, even when the masses of the elementary particles called quarks are dramatically altered. Life may find a way.

    “You could change them by significant amounts without eliminating the possibility of organic chemistry in the universe,” says Jenkins.

    Keep reading… And if that’s not heavy enough for you, how about a paper on the mass of the universe in a black hole? (via reddit)

  • From the macro to the micro – Scientists create computing building blocks from bacteria and DNA [PhysOrg]:

    The scientists constructed a type of logic gate called an “AND gate” from the bacterium Escherichia coli (E. coli), which is normally found in the lower intestine. The team altered the E. coli with modified DNA, which reprogrammed it to perform the same switching on and off process as its electronic equivalent when stimulated by chemicals.

    The researchers were also able to demonstrate that the biological logic gates could be connected together to form more complex components, in a similar way to how electronic components are made. In another experiment, the researchers created a “NOT gate” and combined it with the AND gate to produce the more complex “NAND gate”.

    The next stage of the research will see the team trying to develop more complex circuitry comprising multiple logic gates. One of the challenges faced by the team is finding a way to link multiple biological logic gates together, similar to the way in which electronic logic gates are linked, to enable complex processing to be carried out.
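
The gate composition they describe maps directly onto ordinary Boolean logic, so here’s a trivial in-silico sketch of what the team built in cells – my illustration, obviously, not their wet-lab protocol:

```python
def AND(a: bool, b: bool) -> bool:
    # the engineered cell switches "on" only when both chemical inputs are present
    return a and b

def NOT(a: bool) -> bool:
    # the inverter cell: output is "on" exactly when the input is absent
    return not a

def NAND(a: bool, b: bool) -> bool:
    # composition, just like chaining the two biological gates
    return NOT(AND(a, b))

for a in (False, True):
    for b in (False, True):
        print(f"{a!r:5} NAND {b!r:5} -> {NAND(a, b)}")
```

NAND happens to be functionally complete, so in principle any Boolean circuit can be built by wiring more of them together – which is exactly why “more complex circuitry” is the natural next step.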


Link Dump 20-05-2011

Posted on May 19th, 2011

  • Bionic hand for ‘elective amputation’ patient

    “The operation will change my life. I live 10 years with this hand and it cannot be (made) better. The only way is to cut this down and I get a new arm,” Milo told BBC News prior to his surgery at Vienna’s General Hospital.

    Milo took the decision after using a hybrid hand fitted parallel to his dysfunctional hand with which he could experience controlling a prosthesis.

    Such bionic hands, manufactured by the German prosthetics company Otto Bock, can pinch and grasp in response to signals from the brain that are picked up by two sensors placed over the skin above nerves in the forearm.

  • Vuzix Announces New See-Through Augmented Reality Enabled Video Eyewear

    The STAR 1200 is a see-through AR-enabled binocular Video Eyewear that is expected to be used in a wide variety of industrial, commercial, defense and some consumer applications. Building from Vuzix’ award-winning technology in AR-enabled video eyewear, the new display will allow users to view the real world scene while also viewing relevant computer generated information, graphics and alerts. The AR glasses will provide connectivity to VGA, component and composite video sources. The STAR 1200 comes with 6 degrees of freedom (DOF) motion tracking sensors and a built-in camera for tracking and recognizing the real world. This allows 3D computer generated content to be locked in place when overlaid within the user’s real-world view.

  • Swiss Scientists Design a Turbine to Fit in Human Arteries

    “The heart produces around 1 or 1.5 watts of hydraulic power, and we want to take maybe one milliwatt,” Pfenniger explains. “A pacemaker only needs around 10 microwatts.” At the Microtechnologies in Medicine and Biology conference in Lucerne, Switzerland, earlier this month, Pfenniger presented results from a trial in which a tube is designed to mimic the internal thoracic artery, a millimeters-wide vessel that doctors sometimes cannibalize for surgery because it is redundant. The most efficient of the three off-the-shelf turbines he tested produced around 800 microwatts, which could run devices much more power-hungry than today’s pacemakers.

    (Those power figures hold up nicely – see the back-of-envelope check after the list.)

  • Sovereign Bleak installing magnets (in his fingertips) [VIDEO]
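
And that back-of-envelope check on the turbine item’s power figures, as promised (the numbers come from the article; the arithmetic is mine):

```python
heart_power_w  = 1.0     # "around 1 or 1.5 watts of hydraulic power"
harvest_w      = 1e-3    # "we want to take maybe one milliwatt"
pacemaker_w    = 10e-6   # "a pacemaker only needs around 10 microwatts"
turbine_best_w = 800e-6  # best of the three off-the-shelf turbines tested

print(f"fraction of cardiac power tapped: {harvest_w / heart_power_w:.1%}")        # 0.1%
print(f"pacemaker budgets the turbine covers: {turbine_best_w / pacemaker_w:.0f}x")  # 80x
```

Taking a tenth of a percent of the heart’s output for an 80x power margin over a pacemaker is a pretty good trade.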

Song of the Machine

Posted on April 23rd, 2011
http://www.vimeo.com/22616192

Song of the Machine is my favourite kind of design fiction, combining multiple forms of extrapolation from the present into the future.

Unlike the implants and electrodes used to achieve bionic vision, this science modifies the human body genetically from within. First, a virus is used to infect the degenerate eye with a light-sensitive protein, altering the biological capabilities of the subject. Then, the new biological capabilities are augmented with wearable (opto)electronics, which, by mimicking the eye’s neural song, establish a direct optical link to the brain. It’s as if the virus gives the body ears to hear the song of the machine, allowing it to sing the world into being.

So we’ve got advances in genetic engineering combined with electronic ones to overcome a biological disability, continuing man’s progress and his ongoing co-evolution with the tools he creates. Except this marks a Rubicon Moment, the crossing of a threshold into a merger between man and his technology, and the result is something far more: a step toward the posthuman.

Get used to this. Better living through upgrades.

For more details see this article in the Guardian by the consultant to this project, Dr Patrick Degenaar, optogenetics researcher at Newcastle University and leader of the OptoNeuro project.


We see things differently

Posted on January 11th, 2011

We’re 11 days into 2011 and I’m watching the north of my country drown on live television, as the networks switch between exhausted officials giving press conferences and reports taken straight from social media. In fact, they’re just sending viewers straight to #qldfloods. But, look.. SHINY!

Let’s face it, we’re going to need ever better methods to record disaster pr0n and navigate our way through it. OK, we don’t need them, but some kind of distraction is needed now and again. What have we got so far this year?

Augmented reality HUDS? Check. This was just released for skiers:

Introducing Transcend, Recon Instruments’ collaboration with Colorado’s Zeal Optics. Transcend is the world’s first GPS-enabled goggles with a head-mounted display system.

Minimal interaction is required during use; sleek graphics and smart optics are completely unobtrusive to front and peripheral vision, making it the ultimate solution for use in fast-paced environments.

Transcend provides real-time feedback including speed, latitude/longitude, altitude, vertical distance travelled, total distance travelled, chrono/stopwatch mode, a run-counter, temperature and time. It is also the only pair of goggles in the world that boasts GPS capabilities, USB charging and data transfer, and free post-processing software all with a user-friendly, addictive interface.

Just like the dashboard of a sports car or the instruments of a fighter jet, Transcend’s display provides performance-enhancing data, but only when you choose to view it. Safe, smart, fun…all wrapped up in the hottest goggle frame of 2010/11.

Now, of course you ask, but how will I best show my friends a panoramic, interactive recording of that sick black run (or train for the next one)? Sony has just the thing:

Besides looking über futuristic, Sony’s “virtual 3D cinematic experience” head-mounted display (aka ‘Headman’) sports some fairly impressive specs. The tiny OLED screens inside are half-HD resolution (1280 x 720), and the headphones integrated into the sides of the goggles output high-quality simulated 5.1-channel surround sound.

OK, that’s just a prototype. But something like it will be coming soon, so leave some space for it in your underground bunker.

But m1k3y, you say.. “those are great and all, but WHERE’S MY CLATTER?!” Well, I saved the best for last:

In 2008, as a proof of concept, Babak Parviz at the University of Washington in Seattle created a prototype contact lens containing a single red LED. Using the same technology, he has now created a lens capable of monitoring glucose levels in people with diabetes.

It works because glucose levels in tear fluid correspond directly to those found in the blood, making continuous measurement possible without the need for thumb pricks, he says. Parviz’s design calls for the contact lens to send this information wirelessly to a portable device worn by diabetics, allowing them to manage their diet and medication more accurately.

Lenses that also contain arrays of tiny LEDs may allow this or other types of digital information to be displayed directly to the wearer through the lens. This kind of augmented reality has already taken off in cellphones, with countless software apps superimposing digital data onto images of our surroundings, effectively blending the physical and online worlds.

Making it work on a contact lens won’t be easy, but the technology has begun to take shape. Last September, Sensimed, a Swiss spin-off from the Swiss Federal Institute of Technology in Lausanne, launched the very first commercial smart contact lens, designed to improve treatment for people with glaucoma.

The disease puts pressure on the optic nerve through fluid build-up, and can irreversibly damage vision if not properly treated. Highly sensitive platinum strain gauges embedded in Sensimed’s Triggerfish lens record changes in the curvature of the cornea, which correspond directly to the pressure inside the eye, says CEO Jean-Marc Wismer. The lens transmits this information wirelessly at regular intervals to a portable recording device worn by the patient, he says.

Like an RFID tag or London’s Oyster travel cards, the lens gets its power from a nearby loop antenna – in this case taped to the patient’s face. The powered antenna transmits electricity to the contact lens, which is used to interrogate the sensors, process the signals and transmit the readings back.

Each disposable contact lens is designed to be worn just once for 24 hours, and the patient repeats the process once or twice a year. This allows researchers to look for peaks in eye pressure which vary from patient to patient during the course of a day. This information is then used to schedule the timings of medication.

Parviz, however, has taken a different approach. His glucose sensor uses sets of electrodes to run tiny currents through the tear fluid and measures them to detect very small quantities of dissolved sugar. These electrodes, along with a computer chip that contains a radio frequency antenna, are fabricated on a flat substrate made of polyethylene terephthalate (PET), a transparent polymer commonly found in plastic bottles. This is then moulded into the shape of a contact lens to fit the eye.

Parviz plans to use a higher-powered antenna to get a better range, allowing patients to carry a single external device in their breast pocket or on their belt. Preliminary tests show that his sensors can accurately detect even very low glucose levels. Parviz is due to present his results later this month at the IEEE MEMS 2011 conference in Cancún, Mexico.

“There’s still a lot more testing we have to do,” says Parviz. In the meantime, his lab has made progress with contact lens displays. They have developed both red and blue miniature LEDs – leaving only green for full colour – and have separately built lenses with 3D optics that resemble the head-up visors used to view movies in 3D.

Parviz has yet to combine both the optics and the LEDs in the same contact lens, but he is confident that even images so close to the eye can be brought into focus. “You won’t necessarily have to shift your focus to see the image generated by the contact lens,” says Parviz. It will just appear in front of you, he says. The LEDs will be arranged in a grid pattern, and should not interfere with normal vision when the display is off.

For Sensimed, the circuitry is entirely around the edge of the lens. However, both have yet to address the fact that wearing these lenses might make you look like the robots in the Terminator movies. False irises could eventually solve this problem, says Parviz. “But that’s not something at the top of our priority list,” he says.

So close… And Terminator eyes? That’s a feature, not a bug. YES PLEASE!
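
As a coda: the read-out chain described above (electrodes sample a tiny current through the tear fluid, the chip maps current to glucose, the loop antenna powers the whole exchange) boils down to a very small loop. A toy model – the calibration constants and function names are invented for this sketch, not taken from Parviz’s actual design:

```python
CAL_OFFSET_UA = 2.0   # hypothetical baseline electrode current, microamps
CAL_SLOPE     = 40.0  # hypothetical (mg/dL) per microamp above baseline

def glucose_from_current(current_ua: float) -> float:
    """Map an electrode current (microamps) to a tear-glucose estimate (mg/dL)."""
    return max(0.0, (current_ua - CAL_OFFSET_UA) * CAL_SLOPE)

def interrogate(read_electrode, transmit):
    """One RFID-style cycle: the reader powers the lens, the lens samples and replies."""
    transmit(glucose_from_current(read_electrode()))

# Dry run with stubbed hardware:
interrogate(read_electrode=lambda: 4.5,
            transmit=lambda mg_dl: print(f"tear glucose ~ {mg_dl:.0f} mg/dL"))
```

The interesting design constraint is that, like the Triggerfish, the lens only computes while it’s being illuminated by the reader – no battery ever touches your eye.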


DARPA’s SCENICC to make future soldiers omniscient

Posted on December 23rd, 2010

From WIRED’s Danger Room:

In a solicitation released today, Darpa, the Pentagon’s far-out research branch, unveiled the Soldier Centric Imaging via Computational Cameras effort, or SCENICC. Imagine a suite of cameras that digitally capture a kilometer-wide, 360-degree sphere, representing the image in 3-D (!) onto a wearable eyepiece.

You’d be able to literally see all around you, including behind yourself, and zoom in at will, creating a “stereoscopic/binocular system, simultaneously providing 10x zoom to both eyes.” And you would do this all hands-free, apparently by barking out or pre-programming a command (the solicitation leaves it up to a designer’s imagination) to adjust focus.

Then comes the Terminator-vision. Darpa wants the eyepiece to include “high-resolution computer-enhanced imagery as well as task-specific non-image data products such as mission data overlays, threat warnings/alerts, targeting assistance, etc.” Target identified: Sarah Connor… The “Full Sphere Awareness” tool will provide soldiers with “muzzle flash detection,” “projectile tracking” and “object recognition/labeling,” basically pointing key information out to them.

And an “integrated weapon sighting” function locks your gun on your target when acquired. That’s far beyond an app mounted on your rifle that keeps track of where your friendlies and enemies are.

The imaging wouldn’t just be limited to what any individual soldier sees. SCENICC envisions a “networked optical sensing capability” that fuses images taken from nodes worn by “collections of soldiers and/or unmanned vehicles.” The Warrior-Alpha drone overhead? Its full-motion video and still images would be sent into your eyepiece.

Keep reading..


Pioneer’s prototype windshield HUD

Posted on October 21st, 2010

Pioneer have prototyped windshield HUD technology:

[embedded video]

It uses lasers. Therefore it is Science!


HUD pr0n in Iron Man 2

Posted on May 5th, 2010

We want HUDs, we love Iron Man.

[image: Iron Man’s in-suit HUD]

(pic nicked from io9)

Here’s a neat viral video that will make more sense once you’ve seen Iron Man 2.

(viral vid via Tech Digest)

Gimme!


Parrot – an AR drone you can pilot from your iDevice

Posted on January 6th, 2010

[embedded video]

Meet Parrot – ‘a wifi helicopter with two cameras’, or basically your own personal UAV.

A fantastic piece of tech. However, as Chris Arkenberg pointed out, “Compelling AR ultimately requires HUD glasses.” (Something I’ll be investigating personally this year.)

This hasn’t stopped Mr TheStreetFindsItsOwnUseForThings, William Gibson himself, from leading the discussion on just what cool uses this tech can be put to.

Welcome to 2010.


Gryphon tactical wingsuits = covert death from above

Posted on December 3rd, 2009

I really wasn’t sure what I wanted for Christmas until now.  In fact, this is just the sort of system a twenty-first century Santa needs.

From WIRED’s Danger Room:

…described as a modular upgrade for parachute systems for use in “high-altitude, high-opening” jump missions, typically carried out by Special Forces. This 6-foot wing gives a glide ratio of 5:1, which means that a drop from 30,000 feet will allow you to glide about 30 miles. The makers estimate that this would take around 15 minutes, giving an average speed of about 60 miles an hour.

“All equipment is hidden in a lifting body optimized for stealth, the radar-signature is extremely low,” says the Gryphon data sheet (PDF). “Detection of incoming Gryphon soldiers by airborne or ground radar will be extremely difficult.”

Gryphon has a guidance system and heads-up display navigation. Best of all, the company are looking at an option for bolting on small engines similar to those used in Yves Rossy’s setup. These will increase the range to more than 60 miles, but will also make it possible to cover long distances from low altitude so that the entire mission can be more stealthy.
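
A quick back-of-envelope on those figures – the 30-mile range follows straight from the glide ratio, but the 15-minute and 60 mph claims can’t both hold at once (arithmetic mine):

```python
# Checking the quoted numbers: a 5:1 glide ratio from a 30,000 ft drop.
drop_ft, glide_ratio = 30_000, 5.0

distance_mi = drop_ft * glide_ratio / 5280          # feet of glide -> miles
print(f"glide distance: {distance_mi:.1f} miles")   # ~28.4 -- "about 30"

# The time and speed claims disagree with each other for that distance:
print(f"at 60 mph it takes {distance_mi / 60 * 60:.0f} min")     # ~28, not 15
print(f"doing it in 15 min means {distance_mi / 0.25:.0f} mph")  # ~114, not 60
```

So either the trip takes closer to half an hour, or the incoming soldier is doing well over 100 mph. Knowing Special Forces, probably the latter.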

Yes, so while Yves Rossy (aka Fusion Man) won’t sell to the military, other companies are happy to.

thanks to my buddy Tone for the tip-off!



AUGMENTED REALITY!!1

Posted on September 6th, 2009

Rejoice, augmented reality is here! OK, now.. what’s next?!

Wait, first we’d better roll the video – Bruce Sterling’s keynote for the launch of the Layar Reality Browser, in which, Bruce being Bruce, he drops a metric ton of reality on these hip Dutch hackers. In fact, this is mandatory viewing for anyone in a tech scene faced with the dangerous prospect of imminent success:

Now, while there’s still a little hype juice left to squeeze out of the lemon, let’s run a bunch of clips showing just how now Augmented Reality is. Proof being, if it barely works, it’s cutting-edge tech, riiight?!

First, Layar’s main rival – The Acrossair iPhone 3GS AR Browser:

Then the Yelp ‘Easter Egg’, later revealed to be using Layar’s platform:

Further proof that one of the initial killer apps for AR will be tourism, the augmented London Bus app for the iPhone:

Finally, a little South Korean weirdness to finish things off – Maptor, an AR “torch” is, I guess, the best description:

Yes, I know, I know, I know.. AR isn’t really here until it’s Clatter, right. Or HUDs at a minimum. Well, be patient.. Lockheed Martin just dropped a cool $US 1M to Microvision “to develop a see-through eyewear system for ground soldiers.” Fingers crossed, we’ll be getting our grubby paws on those in a few years.

So where does this leave us fine citizens of this marvellous Future Present?

By our current measure of the state of Future – ie Japanese anime – the world of Eden of the East is just around the corner, but Dennō Coil might be a ways off yet. As fans of this show know, that’s your shining example of the realisation of technology as magic, which Bruce mentions in the keynote.

With such wonders on the horizon, I can’t help wondering what’s lying beyond it. Anyone care to take a guess?


Karl Schroeder on ‘Rewilding’

Posted on August 1st, 2009

The following speech by Karl Schroeder is an excellent summation of the future we’ve been documenting here, the world that lies just around the corner:

[embedded video]

His thoughts on what I guess you have to call Nature 2.0 are a nice progression on some of Kevin Kelly’s ideas in his book Out of Control.

via BoingBoing | Futurismic


CNN story on Virtual Vision

Posted on June 20th, 2009

The story here is that CNN is reporting this. Or, as reader Paul Luthy wrote, “It seems significant that a major network news source is treating this as science news…”

So not much progress has been made since we posted about this in Jan ’08, but more people are aware of it now. Thanks, CNN!


German students developing OLED data glasses

Posted on June 6th, 2009

From gizmag:

[image: OLED data glasses]

Students at the Fraunhofer Institute in Germany are developing a pair of interactive data eyeglasses that can project an image onto the retina from an organic light-emitting diode (OLED) micro-display, making the image appear as if it’s a meter in front of the wearer. While similar headwear only throws up a static image, the students are working on eye-tracking technology that allows wearers, with just the movement of the eyeball, to scroll through information or move elements about.

via Mac Tonnies
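
That eye-movement scrolling is the interesting bit. Here’s one purely hypothetical way such a scheme could work – glance at the edge of the virtual screen to pan. This is my own sketch, not the Fraunhofer design:

```python
EDGE_ZONE = 0.15  # outer 15% of the display height acts as a scroll zone

def scroll_delta(gaze_y: float, lines_per_tick: int = 1) -> int:
    """gaze_y: normalised vertical gaze position, 0.0 = top edge, 1.0 = bottom."""
    if gaze_y < EDGE_ZONE:
        return -lines_per_tick   # looking at the top edge: scroll up
    if gaze_y > 1.0 - EDGE_ZONE:
        return lines_per_tick    # looking at the bottom edge: scroll down
    return 0                     # gaze in the middle: hold still

for gaze in (0.05, 0.5, 0.95):
    print(f"gaze at {gaze:.2f} -> scroll {scroll_delta(gaze):+d}")
```

The hard part is everything this sketch leaves out: gaze jitter, blinks, and telling a deliberate glance from a reflex.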


Vuzix teases us with more details about its coming HUD

Posted on January 10th, 2009

As promised, these have been spotted at CES.

Pocket-lint has touched them and has more details:

Vuzix’ proprietary Quantum optics technology gives Wrap 920AV wearers a “see-thru” video experience in the form of a functional pair of sunglasses with prescription lenses optional.

Claiming to be capable of 3D, Vuzix mysteriously states that an “optional Bluetooth 6-Degree of Freedom tracking sensor and/or Stereo Camera Pair will enable users to upgrade their Wrap 920AV to experience virtual, augmented and mixed reality environments”.

Awesome. Now stop teasing and gimme!


Vuzix Teases Us With Awesome Virtual Reality Wrap 920AV Video Sunglasses

Posted on December 19th, 2008

Via gizmodo.com:

Apparently, the Wrap 920AV will be “the first to actually function as sunglasses or portable video eyewear. It’ll combine virtual reality (VR) capabilities as well as augmented reality (AR) features.” Holy crap that is awesome.


Photo via gizmodo.com.


Microvision have developed the first decent HUD!

Posted on December 4th, 2008

I am one happy little camper right now. I have been waiting for HUDs to de-lurk from the future since, well, forever really.

It looks like the tech has finally progressed to the point of making them practical. Witness Microvision’s new product:

The information being displayed in the eyewear optics would originate in the mobile device and arrive at the eyeglasses through a wired or wireless connection. The mobile device eyewear viewing experience could be completely see through, providing the wearer with a visual information overlay, while not losing awareness of their surroundings. Or, the viewing experience could be occluded, offering the wearer an immersive, visual experience where the wearer purposefully escapes their immediate surroundings.

So you can toggle between using them as a HUD or just a straight video player. Nice.

Far more information in their detailed spec.

But where is the Buy button? If I am reading this right, they are still looking for partners to pair up with and distribute this. Which is just achingly frustrating. I need this now!

thanks to G_A_P_S for the heads up on this!


Nikon’s Media Port UP, a HUD media player

Posted on October 18th, 2008

[image: oh sure, their ads are all HUD and shit.. but where is it?!]

So one of my things to do in Japan was to go hunting for HUD tech. Like a good little cyberpunk, I head straight for Akihabara and start asking around.

Despite the best efforts of my buddies who were translating for me, all I got was blank stares from the shop clerks. The best advice I got was to try Hong Kong, which is apparently now the home of edgy, crazy tech?!

Disappointed, I skulked off and had idle thoughts about sneaking over to HK. The next day, what do I see on the train? A very HUD-looking ad from Nikon, for the Media Port UP, which:

…incorporates display, headphones, mobile A/V player, Wi-Fi capability, high-capacity memory, and power source in a single compact unit, and is the first of its type. The UP allows users to easily enjoy high-quality images, videos, and music anywhere.

While it’s far from Clatter, this is available to buy right now. Sure, it isn’t on a contact lens, but “viewing with a sensation equivalent to that of viewing of a 50-inch large screen from a distance of three meters” isn’t too shabby either.
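
For a sense of scale, that “50-inch screen from a distance of three meters” claim converts into a field-of-view figure easily enough (arithmetic mine, not Nikon’s):

```python
import math

# How big is a 50-inch diagonal seen from 3 m, in field-of-view terms?
diag_m, dist_m = 50 * 0.0254, 3.0
fov_deg = 2 * math.degrees(math.atan(diag_m / 2 / dist_m))
print(f"diagonal field of view ~ {fov_deg:.0f} degrees")  # ~24 -- modest, but watchable
```

Roughly 24 degrees diagonal. Not immersive, but plenty for watching video on a train.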

I’ll be curious to see what the take-up rate on this is. As this photo from Akihabara News shows, they don’t actually look that bad on.


Where’s my Clatter?

Posted on January 4th, 2008

[image: Clatter screenshot]

I can hear you all asking: where’s my alien dancing girls?!

Well, Israeli company Lumus are just about to debut their HUD solution.

[image: Lumus Optics HUD screenshot]

Unfortunately, they’re dorky as hell.

[image: dorky HUD]

And while it’d probably get you kicked out of any decent Grinder Bar, it’s perfect for the next retro-future 90s cyberpunk party you get dragged along to.

via Charles Stross