Isn't that something you should do at home instead of blocking the aisle in the damn store... like the people calling their SO to find out what they should buy because they didn't make a fucking list. :mad:
Flintos, the whole question is: do we want a projection system we interact with, is that going to be the way of the future? Or is it going to be a better version of what we already have: a small mobile device with a bright, beautiful display we interact with? It's the projection system I'm the most meh about; every projection system I've ever seen is weak, only usable in certain light, and very imprecise. Take it outdoors and it's useless, and I'm wagering there ain't a damn thing you can do about that, because you have to fight the light of the sun.
And "works on every surface" is probably not going to fly either, because it has to be the right color to interact with your colored fingers, and has to be relatively flat in order to do some of the things you'll want to do. You can project a button onto a toilet paper roll, but you can't sort your projected photos on one, nor can you look up the toilet paper allergen website for more information. Maybe special glasses would do the trick.
The projection system seems stupid, yes. But think about this...
I have always imagined the ideal peripheral to a human-integrated information system to be some type of heads-up display. That is a great way to present information, layered "on top" of the real-world environment, but what can't it do? It isn't an INPUT device (unless you added some eye-tracking capability).

What does the clunky, prototype "6th sense" device do? It combines an input AND an output system "on top" of the real-world environment. Input and output are accomplished, from the user's perspective, in one "place" (you don't have to type on an input device, a keyboard, and also look at an output device, a monitor), and furthermore, this "place" is the world: the actual, real world you are walking around in. In order to do this, it has to involve the dubious technique of creating a feedback loop between a projection system and a visual recognition engine (the details of which we are wholly ignorant of at this point).

Take a step back from the details, the specifics, and think about the idea here (what, in my mind, this demo was designed to communicate): a human-integrated interface that layers itself "on top" of the real world and allows seamless input and output, out there in front of your eyes, where we have been looking at things, and doing stuff to things, for millions of years. Compare this to our recent habit of pulling little boxes out of our pocket and punching little buttons on them. THAT, my friend, is not the way to go. That is a stopgap that will be archaic soon. They may not have solved the problem here, but they have realized the correct idea. This is the TED Conference - "ideas worth spreading" ...
Well OK, you're closer to convincing me.
The two holdups will be pattern recognition and sunlight. Both are really amazingly difficult. Right now, it can't locate the tip of the index finger of your right hand unless you have a pen cap on it.

What we'll have next is the ability to ask questions via speech recognition. That system allows you to leave the phone in your pocket (as does the TED prototype, which costs $350 but requires $350 worth of phone). You press a button on your earpiece and ask it: "What is my checking account balance?" It states the answer through your earpiece. How do you ask that question of this system? Hold up your empty wallet and draw a question mark?

Here's the prototype that you can use right now. It can only answer certain kinds of questions, and it can only answer them via screen, but you know they'll be branching that out, just as soon as they've used this system to hone their speech recognition on a broad spectrum of users. Oh, and also: "Compare this to our recent habit of pulling little boxes out of our pocket and punching little buttons on them"? We're past buttons now; we're touching the screens of high-quality motion-sensing displays...
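The "pen cap" bit is the telling detail: the prototype finds your fingertip by looking for a blob of a known marker color in the camera frame, not by recognizing the finger itself. Here's a minimal sketch of that kind of color-blob tracking in plain Python (the marker color, tolerance, and synthetic frame are my assumptions for illustration, not anything from the actual device):

```python
def find_marker(frame, target=(255, 0, 0), tol=40):
    """Return the (row, col) centroid of pixels whose color is within
    `tol` (Manhattan distance) of the marker color, or None if no pixel
    matches. `frame` is a list of rows, each a list of (r, g, b) tuples."""
    hits = [(r, c)
            for r, row in enumerate(frame)
            for c, px in enumerate(row)
            if sum(abs(a - b) for a, b in zip(px, target)) < tol]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

# Synthetic 100x100 "camera frame": black except a red 5x5 "pen cap"
frame = [[(0, 0, 0)] * 100 for _ in range(100)]
for r in range(40, 45):
    for c in range(60, 65):
        frame[r][c] = (255, 0, 0)

print(find_marker(frame))  # (42.0, 62.0)
```

Which also shows why sunlight kills it: outdoors, half the scene ends up within `tol` of any marker color you pick, and the centroid becomes meaningless.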
I don't like that one because it can only tell you about the Golden Gate Bridge and the size of giant squids. And you have to wear a blue shirt.
(instant thread reactivation)
Holy fucking shit alert! Google just gave us what we wanted: http://www.google.com/mobile/goggles

You're at the Sydney Opera House and you want to know more about it? Take a picture of it; Google figures out what you just shot and tells you the details. You're in a bookstore and you want the full Amazon information on a book? Take a picture of the cover... Somebody gives you their business card, and now you have to put their details in your contacts? Of course not, just take a picture of the card... You're in front of a local restaurant and you want ratings and reviews for it? Just point your phone at it; the GPS knows where you are and can fetch details. Sure, you can take a picture of the UPC code... or the label on the bottle... more and more things will just "work" if you ask Google. And soon: do you want to know what kind of tree that is? Take a picture of the leaf...

The pertinent question has already been asked by Flint: what to do with all this magic, when there is more magic every day?
Holy Fucking Shit!
Just gimme a tricorder already.
There is also an iPhone app called RedLaser that lets you take a pic of a UPC and get info, including numerous online sales prices, on your phone lickety-split.
We are closer to the infocalypse.
I love TED. They always have fun stuff.
Google Goggles update: now it translates.
http://googlemobile.blogspot.com/201...th-google.html
More apps needing a better human interface here.
Y'know those sci-fi movies where aliens come to earth and for good or bad decide to become a corporation and slowly release their technology........