24 Feb 2012

Google Glasses are real, will use two 0.52-inch micro displays



There’s no shortage of gossip in the tech community about how the so-called Google Glasses would work. The basic idea is a wearable computer of some kind that displays information on the lenses of a pair of glasses. As incredible as that sounds, there’s not a whole lot of tech out there that fits that description closely. The idea that these lenses would act as displays for information is really interesting, but there’s a need to combine a certain amount of function and fashion, while somehow not making all of this sound like a gadget from a spy thriller. But it’s not a spy thriller: we’ve been told the glasses are real, and that we will more than likely be seeing a prototype of them at Google’s developer conference this June.



Google always has their hands in a few cookie jars. When you’re working at a company full of geeks and your employer tells you that 20% of your time can be spent on whatever you want, the end result is going to be pretty awesome sometimes. Google Glasses are a pet project, an experiment to see whether or not people are actually ready for a paradigm shift in how computers are used.

Basically, you have glasses on your head with a camera facing out, and that camera’s job is to record your gestures. When you’re browsing, you reach your hand out and glide through the air like you are touching the content being shown to your eyes, and that controls your navigation. The display is transparent, so even if you are looking at something, you aren’t losing sight of the outside world.

How would that work exactly? We’ve seen micro LCD displays in the past — screens smaller than a dime with an impressive 800 x 600 resolution popped up a few years ago. We’ve learned from a source that the glasses use something very similar: a pair of micro LCD displays project an image onto a mirror, which redirects it to a small part of the glasses, close to the nose. This part of the glasses, just to the right of where your eyes would look to see through a normal pair of glasses, is slightly angled to catch the reflected image. The result is a 0.52-inch display on each side of your nose, combining to offer a 960×540 resolution right in a set of glasses. By splitting the hardware between the two sides of your face, you end up with two 1.5 x 1.5-inch squares handling the most complicated part of this whole thing: the display. To an onlooker, the surface of the actual displays your eyes are seeing isn’t much bigger than a dime.
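To get a rough sense of how dense such panels would have to be, here’s a quick back-of-the-envelope calculation in Python. Note one assumption on my part: I’m treating each 0.52-inch panel as carrying the full 960×540 image (one copy per eye); the article doesn’t actually say how the resolution is divided between the two displays.

```python
import math

# Figures from the article: 0.52-inch panels, 960x540 resolution.
diagonal_in = 0.52               # panel diagonal, in inches
width_px, height_px = 960, 540   # assumed per-panel resolution (one image per eye)

# Pixel density along the diagonal: diagonal pixel count / diagonal length.
diagonal_px = math.hypot(width_px, height_px)
ppi = diagonal_px / diagonal_in

print(f"Diagonal pixels: {diagonal_px:.0f}")
print(f"Implied density: {ppi:.0f} pixels per inch")
```

Under that assumption you land somewhere around 2,000 pixels per inch, which is why these micro displays are impressive: the sharpest phone screens of the day, like the iPhone 4’s Retina display, managed around 326 ppi.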

Nothing I have seen so far indicates that these devices are ready to hit shelves anytime soon. The expectation is that we will get a really cool demonstration of this technology at Google I/O in June, but right now the display tech is the only thing ready to be used. If you were to take the display technology and put it on a pair of glasses, you’d get a transparent version of those big goofy virtual reality goggles that were sold when I was a kid.

So, what would you do with a wearable computer? I’m sure we will find out soon, so you better start thinking.

Source: www.geek.com
