bkdelong: (Default)

Though I oft begrudge pulling my slothic body away from my Net Addiction to walk the dog (gods forbid that I get fresh air or do something that contributes to the household for a change), it's an exercise that provides me with enough stimuli to keep my normally racing brain busy, allowing me to brainstorm. It's similar to the massive amounts of multitasking I do on the computer, except it involves walking, feeling the air, smelling the smells, listening to a cacophony of noise, etc., instead of flitting between virtual Windows.

Before I get to my BrainStream of the evening, I wanted to note something my brother Nate ([livejournal.com profile] necr03) was thinking about. Until he truly starts posting to his own LiveJournal, I want to store his ideas somewhere. He was discussing the design of future vehicles. I'm not sure if he was referring to automobiles, airplanes, or spacecraft, as it was a few days ago. But he was pondering a day when transmitters and microprocessors would be so small and inexpensive that a vehicle could be absolutely covered with them and, instead of absorbing or tricking radar, simply retransmit the signal right through itself, and then back to the sender once it had bounced off whatever lay beyond, as if the vehicle were never there.

Taking that a step further, I could see hundreds of microcameras being mounted on a vehicle while, at the same time, the external body of the vehicle acts as one giant computer monitor. The image from one side of the vehicle is displayed on the opposite side, as if the vehicle were not there. That would be difficult to do with the tires and the undercarriage, though perhaps a bit easier to handle with the windows and windshield.
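
If I were to doodle that wiring in code, it might look something like the little Python sketch below. Everything in it - the capture() and show() calls, the side/row/column grid - is my own hypothetical shorthand, just to show the pass-through idea:

# Hypothetical sketch: route each microcamera's frame to the display panel
# on the geometrically opposite side of the vehicle, so the body appears transparent.

OPPOSITE_SIDE = {
    "left": "right", "right": "left",
    "front": "rear", "rear": "front",
    "top": "bottom", "bottom": "top",
}

def refresh_skin(cameras, panels):
    """cameras and panels are dicts keyed by (side, row, col) grid positions."""
    for (side, row, col), camera in cameras.items():
        frame = camera.capture()                  # grab the view looking out from this side
        target = (OPPOSITE_SIDE[side], row, col)  # matching cell on the far side
        if target in panels:                      # tires/undercarriage may have no panel there
            panels[target].show(frame)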

Anyway, back to my idea. I had my little light bulb moment earlier today while walking the dog. But while completing the final jaunt for the night, I started thinking about holography and the Star Wars scene where R2-D2 projects a 3D image of Princess Leia giving a message. I still think holography of that sort is a ways off. We have seen people display similar images on a screen of moving fog, but unless scientists get much better control over light in a space (I don't know, perhaps projecting dust or some sort of matter in the shape of the hologram so the light around it helps make the 3D effect or something), it'll be a long time before we see truly realistic, interactive holograms. Damn - and I wanted to play a magic user and cast Major Image and other illusion spells.

However, with all my ramblings on Augmented Reality, I think that technology is the middle ground between normal space and holography. As HMDs and wearable HUDs get simpler, less intrusive, and less expensive, wearable computers combined with these devices, along with mounted location-aware hardware, could completely create the effect of an interactive, quite realistic hologram.

I was thinking about how much of a PITA hand-mounted keyboards, or even voice/subvocal recognition, would be for navigating a GUI. Then I remembered all the research I did into current efforts in "gesture recognition" - that is, detecting hand/body movement and causing a GUI to react as a result. So on a screen in front of you, you would see a series of windows. Hold up your hands and make a grabbing movement: the "hand" cursor on the screen closes on the window and allows you to move it where you wish. Hold up a single finger and point in the general direction of the scroll bar, crook your finger, and make a scrolling motion - voila, you're moving down the screen.
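
Roughed out in the same hypothetical Python shorthand, that gesture-to-window mapping might look something like this - the gesture names and the tracker/desktop objects are pure invention on my part, just to show the shape of the idea:

# Hypothetical sketch: turn recognized hand gestures into GUI actions.
# The tracker is assumed to emit events like ("grab", x, y) or ("point_scroll", dy).

def run_gesture_loop(tracker, desktop):
    held_window = None
    for gesture, *args in tracker.events():
        if gesture == "grab":                    # closed fist over a window
            x, y = args
            held_window = desktop.window_at(x, y)
        elif gesture == "move" and held_window:  # drag while the fist stays closed
            x, y = args
            held_window.move_to(x, y)
        elif gesture == "release":               # open the hand to drop the window
            held_window = None
        elif gesture == "point_scroll":          # crooked finger makes a scrolling motion
            (dy,) = args
            desktop.focused_window().scroll(dy)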

My idea would become a part of my previously mentioned wearable computing environment: a multi-beam laser scanner mounted on the HMD (along with the GPS device) that would cascade several hundred to a thousand beams of light parallel to the front of your body. Your hands would then become the input device, and you could either bring up a keyboard to type on or simply use your hands to navigate windows, combined with your voice as input....or even alternate GUI navigation features. Activating the device would take a voice command or a button somewhere on your wearable, but once activated, it would immediately calibrate and allow you full control.
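
And the activation/calibration flow could wrap around that gesture loop, something like the sketch below - again, the scanner and tracker calls are hypothetical stand-ins, not any real hardware API:

# Hypothetical sketch: stay idle until a wake command, calibrate the laser plane
# against the user's resting hands, then hand control to the gesture loop above.

class WearableInput:
    def __init__(self, scanner, tracker, desktop):
        self.scanner = scanner    # HMD-mounted multi-beam laser scanner (assumed API)
        self.tracker = tracker    # gesture recognizer fed by the scanner
        self.desktop = desktop
        self.active = False

    def on_wake(self):
        """Triggered by a voice command or a button somewhere on the wearable."""
        self.scanner.emit_plane(beams=1000)          # cascade beams parallel to the body
        self.tracker.calibrate(self.scanner.read())  # learn where the hands sit at rest
        self.active = True

    def run(self):
        if self.active:
            run_gesture_loop(self.tracker, self.desktop)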

Examples of gesture-recognition navigation of a GUI environment can be seen in the movie "Minority Report" as well as in the television show "Earth: Final Conflict," where the technology was used to pilot the Taelon shuttles.

bkdelong: (Default)

Dammit, dammit, dammit. I need to get better at writing down what's in my head.

When I go out to walk the dogs, am driving somewhere, or am trying to get off to sleep, I'm usually inside my head brainstorming. I'm pretty good at mind visualization, and I swear I could live in there. It'd be better than TV if I could connect it to my.....what is it.....visual and auditory cortex?

Anyway, in the past year, on one of my walks, I spent a lot of time looking at the stars. It was definitely late spring or summer and quite nice out. I was looking up and trying to identify constellations and stars, with little success. Being a sci-fi geek who is always pondering the stars, space, and other "star systems," and being a technologist, futurist, and pseudo-transhumanist, I'm always thinking of ways to make life easier. The spiritual side of me is constantly fascinated by correspondences in more Earth- and astrology-based religious practice.

So I started dreaming.

Read more... )

Ah, the future. One can hardly wait.
