bkdelong: (Default)
[personal profile] bkdelong

Dammit, dammit, dammit. I need to get better at writing down what's in my head.

When I go out to walk the dogs, am driving somewhere, or am trying to get off to sleep, I'm usually inside my head brainstorming. I'm pretty good with mind visualization and I swear I could live in there. It'd be better than TV if I could connect it to my.....what is it.....visual and auditory cortex?

Anyway, in the past year on one of my walks, I spent a lot of time looking at the stars. It was definitely late spring or summer and quite nice out. I was looking up and trying to identify constellations and stars with little success. Being a sci-fi geek who's always pondering the stars, space and other "star systems", and being a technologist, futurist and pseudo-transhumanist, I'm always thinking of ways to make life easier. The spiritual side of me is constantly fascinated by the correspondences in Earth- and astrology-based religions.

So I started dreaming.

We have plenty of handheld and mobile technology nowadays. MIT and other educational institutions are revolutionizing wearable computing. GPS and geolocation applications are all the rage, and augmented reality is quite the growing area of study. Here's my idea:

First you need the hardware system. Take a portable, handheld or wearable computer - it needs to have a decent graphics card and be capable of handling some pretty quick rendering. Input today would be via voice recognition or a single-handed keyboard. Video output would be via some sort of HUD-capable glasses like the EyeTap or other existing peripherals. On the glasses would be a GPS receiver and orientation sensor capable of determining the direction they're facing, altitude and, obviously, longitude and latitude. For the most part, all of this exists. It may not be cheap, but it's out there.

Then the application. Actually, I have several applications in mind for this platform, but this particular one is astronomy-related. On the computer is a database containing location information for as many stars, planets, planetoids, moons, satellites and other celestial objects as we know and love. When fed GPS, altitude and directional information, it will overlay labels and trace lines based on the user's preference for level of detail. These labels would appear next to stars and other astronomical objects, and the lines would be in the shapes of constellations - augmenting reality.
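The core of that overlay is a coordinate transform the system would run for every object in its database: given a star's catalogued right ascension and declination, plus the observer's GPS position and the current time, figure out where in the sky (altitude and azimuth) the label should be drawn. Here's a rough sketch in Python using the standard sidereal-time approximation - the function name and interface are my own invention, not from any real product or catalog:

```python
import math
from datetime import datetime, timezone

def radec_to_altaz(ra_deg, dec_deg, lat_deg, lon_deg, when_utc):
    """Convert a star's RA/Dec to altitude/azimuth for an observer.

    lon_deg is east-positive. Returns (altitude, azimuth) in degrees.
    """
    # Days since the J2000.0 epoch (2000-01-01 12:00 UTC)
    j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    d = (when_utc - j2000).total_seconds() / 86400.0
    # Greenwich mean sidereal time, then local sidereal time (degrees)
    gmst = (280.46061837 + 360.98564736629 * d) % 360.0
    lst = (gmst + lon_deg) % 360.0
    # Hour angle of the object, then the spherical-astronomy transforms
    ha = math.radians((lst - ra_deg) % 360.0)
    lat = math.radians(lat_deg)
    dec = math.radians(dec_deg)
    sin_alt = (math.sin(dec) * math.sin(lat)
               + math.cos(dec) * math.cos(lat) * math.cos(ha))
    alt = math.asin(sin_alt)
    cos_az = ((math.sin(dec) - sin_alt * math.sin(lat))
              / (math.cos(alt) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))  # clamp for rounding error
    if math.sin(ha) > 0:
        az = 2 * math.pi - az  # azimuth measured east from north
    return math.degrees(alt), math.degrees(az)
```

A quick sanity check: point it at Polaris from anywhere in the northern hemisphere and the returned altitude should come out within about a degree of the observer's latitude, with azimuth near due north, regardless of the time of day.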

I got this idea from the "Your Sky" application by Autodesk co-founder John Walker. It does exactly what I described above, except on one's laptop, and you have to manually feed it location information. Plus it renders a fairly simple graphical representation of the sky, and though it has a "from the horizon" view, it's pretty limited. Still - I actually tried to take it outside, stick it on the roof of my car and stargaze. Didn't work.

Anyway, the keyboard would only be needed to set preferences and launch the program. Other than that, you just need to have the glasses on and you can be barefoot, sitting on the grass with your hands planted on the ground, staring up at the sky - the ultimate stargazing experience.

I've mentioned something similar to this on mailing lists, IM chats, or IRC discussions. Unfortunately, I can't seem to find anything on Google, and since it never made it to the blog, I can't say I had the idea before the Celestron SkyScout was announced at CES this past week. It's handheld, has location-awareness via GPS and other factors, and audio capabilities. You point it at a celestial object and it gives you a text readout on what it is as well as, I'm guessing, an audio rendition of the text. With any luck it has a headphone jack as well. You can also use it to tell you where an object is, and even use a USB port to connect online and download software updates and information about new objects like the space shuttle or a comet passing through.

Hey! It's still behind what's possible today! I see no HUD and no real graphical display - maybe I'm still ahead of the curve. Still - it's DAMN cool.

Still, the full extent of my proposed system will most likely be the grandchild of today's emerging mobile, GPS/RFID location-based applications, but it's neat to see a lot of the idea behind this brainstorm start taking shape. Techniques like AJAX and projects like mod_pubsub allow for the constantly-updating display of Web-based data streams, so I could drive past something with a wireless, GPS-equipped laptop and be told various info about the objects I'm passing. As the Semantic Web continues its climb, more and more information around the world will be geotagged and slapped together with all sorts of valuable metadata.

Soon, this will become more and more ubiquitous on mobile devices, and as I'm walking past a grocery store, I'll be told of the specials for the day. Standing in front of a historical plaque, I'll be given a full history of the object being detailed, including pictures and possibly even audio and video. If I'm looking for a public bathroom or a Starbucks, I can have my device direct me there.
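That kind of proximity trigger is simple once everything is geotagged: keep a list of tagged points, compute the great-circle distance from the user's current fix to each, and surface whatever falls inside some radius. A toy sketch in Python - the data layout and function names here are my own, not any real service's API:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby(points_of_interest, lat, lon, radius_m=100.0):
    """Return geotagged entries within radius_m of the user, nearest first."""
    hits = [(haversine_m(lat, lon, p["lat"], p["lon"]), p)
            for p in points_of_interest]
    return [p for d, p in sorted(hits, key=lambda t: t[0]) if d <= radius_m]
```

So standing outside the grocery store, a query like `nearby(pois, my_lat, my_lon)` would return the store's entry (specials and all) while ignoring the plaque three blocks away; a real system would swap the linear scan for a spatial index once the dataset gets large.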

That's where the HUD glasses come in - we can't type, read, and walk at the same time. Not without walking into people, objects or - worse - traffic. A heads-up display, as opposed to screen-goggles, augments reality instead of showing me something completely virtual, so while I'm being fed data and information about my surroundings, I can still see where the hell I am going.

If I really wanted to go high-tech with the input, it would be equipped to handle subvocal recognition - that is, the ability to measure vibrations in your vocal cords, so even though you're not saying anything that can be heard, the vibrations can be converted to signals that can be translated into speech. Silent voice recognition. NASA is already doing some incredible research on subvocalization - check out page 12 of this 2005-2006 Intelligence Report from the space administration's Intelligent Systems Division.

In addition to that, the HUD would have a built-in HD camera (or whatever high-resolution video exists far into the future), allowing me to zoom in on particular objects or, in the case of my celestial application, just let me see a close-up picture of the star or part of the moon I am looking at.

The applications for such a platform are almost inconceivable. RPGs and FPSs could be taken outdoors, and people could be IN them. No more complex VR environment with force feedback, omni-directional treadmills, and gesture recognition. You'd walk on your own two feet; built-in sensors connected to your wearable would track limb movement and potentially provide appropriate simulated stimulus, while your HUD would track the environment around you and your system would render and project the people, monsters or objects you're interacting with. A stick on the ground? Augmented reality can project it as a +2 Bastard Sword with a giant encrusted ruby in the hilt.

Obviously such a system would have incredible military applications and, per usual, the military-industrial complex will most likely have been the one funding the research that produced the applications, hardware, smart clothing and peripherals.

Ah, the future. One can hardly wait.

Date: 2006-01-08 05:55 pm (UTC)
ivy: (polite raven)
From: [personal profile] ivy
That would be super cool. (I'm also really impressed by the SkyScout. I now want one. [grin]) A pity there are so few places near Seattle where one can go stargazing. I really like your ideas, though.

Date: 2006-01-08 06:00 pm (UTC)
From: [identity profile] bkdelong.livejournal.com
Wow, really? Thanks :) There are many, many times I wish I had the hardware/programming know-how to start hacking this stuff together.

What would be really interesting is augmenting reality to the point of disregarding weather - i.e., projecting a clear, starry sky on a cloudy, foggy night. However, I think tapping into the visual cortex, like we've done with hearing and cochlear implants, is still a ways off - otherwise, screw the glasses.....I'd just tell my brain what to see.

more organised version?

Date: 2006-01-09 08:43 pm (UTC)
From: (Anonymous)
Hi! Would you be interested in writing a slightly more organised and structured overview of your ideas for http://future.wikicities.com/wiki/Wearable_computing ? There is already some stuff at http://future.wikicities.com/wiki/Augmented_reality and http://future.wikicities.com/wiki/Virtual_reality that you might like to check out.

Re: more organised version?

Date: 2006-01-09 08:46 pm (UTC)
From: [identity profile] bkdelong.livejournal.com
Cool, thanks. If I forget, feel free to remind me. Out of curiosity, how did you find my post? I'm looking for good LJs or blogs on these topics.
