It reminds me of this video that I found four years ago.
Oculus VR Headset -> CCP Pilot Ship -> Convert ISK to real money -> Pay to be a flight pilot -> Actually have them control a real ship in space to mine -> OMG
I daydream about the future and I wonder why some people don’t any more. They are too present in what’s happening now. The best part about daydreaming is that you can root part of it in the facts and observations of the now and link it to what seems just within grasp.
People have analysed and scrutinised science fiction for predictions and facts. We have 2001: A Space Odyssey with its flat screens and its pretend gravity, some aspects of which we’re still waiting on. There was Jules Verne, perhaps not strictly seen as science fiction by some, but the use of submarines in his day wasn’t exactly common. Then there are tales such as Neuromancer and The Matrix, assuming that we’ll all be taken over by the machines or use virtual reality as if it’s real; but what you don’t often see, or perhaps no-one is noticing, is the virtual reality interacting with the real. It begs the question: what is real? Why isn’t it real just because it’s digital rather than physical? Are feelings nought but electrical impulses firing, and are we merely mimicking them?
I play Eve Online. Bear with me. The premise of the game is that you are a person flying a ship in space. Your role? Anything you want. It’s open; you’re in a sandbox environment where you can build and destroy and do as you please with money, goods and services. In fact, it is possible to sustain your subscription to the game by making in-game money to purchase more time cards, though you may have to sink a lot of time into the game to get that far.
It practically emulates life: you (can) have a job and be part of a clan, a corporation. To simplify, you can be a trader, a fighter or a miner. It’s the miner that matters most to the point I’m trying to make here.
CCP, the company behind Eve Online, has started to work with the Oculus Rift on a simulation that lets you virtually enter a spacecraft within the Eve Online universe. The craft is also directly controlled, as opposed to instructed as before.
So the thought I had was: what if CCP effectively became a bank, or at least could trade, buy and sell money like Second Life? In Second Life the name says it all: you’re meant to be able to have a job and create items, goods and clothing, mainly aesthetic items, but also to script and 3D-model items to be traded and sold. There are very likely people in the world who make a living being paid to work for others in the game, or at least selling their virtual goods, even purchasing land.
I’m saying that if CCP did that, you would have a virtual job, a futuristic job, flying a spacecraft.
I’m not done.
We can bridge this gap. We already use software to control real-life hardware, so there is a jump from the virtual to the physical. You write your code and upload it to a chip; that chip can then send out signals and control hardware. This is a high-level concept and there are many details which would need ironing out before what I’m about to suggest becomes realistic.
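The virtual-to-physical bridge can be sketched in a few lines. This is a toy simulation, not real firmware: the `Pin` class below is a made-up stand-in for a microcontroller GPIO pin, but the pattern of purely virtual instructions producing a physical signal sequence is the same one real hardware uses.

```python
class Pin:
    """Simulated GPIO output pin; real firmware would write to a hardware register."""

    def __init__(self, number):
        self.number = number
        self.state = False
        self.history = []  # every signal the 'hardware' would have seen

    def write(self, high):
        self.state = high
        self.history.append(high)


def blink(pin, times):
    """Purely virtual instructions that become physical signal changes."""
    for _ in range(times):
        pin.write(True)   # drive the pin high, e.g. motor or LED on
        pin.write(False)  # drive it low again, motor or LED off


led = Pin(13)
blink(led, 3)
print(led.history)  # the on/off pattern sent out to the hardware
```

On a real board the `write` calls would change an actual voltage; everything upstream of that call is just software, which is the whole point.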
What if we really mined asteroids and you controlled the mining ship from inside a rendered simulation, or a real-time feed from home, using similar or the same controls you already used inside a game such as Eve Online? Perhaps you could pre-configure the commands and route to do the job, or you could do it live, latency issues aside.
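The pre-configured route idea can be made concrete with a toy sketch. Everything here is hypothetical (there is no real drone API like this): a flight plan is uploaded as a queue of commands, so high latency only matters once, when the plan is sent, not during each manoeuvre.

```python
class MiningDrone:
    """Toy simulated drone; all commands and state are invented for illustration."""

    def __init__(self):
        self.position = (0, 0)
        self.cargo = 0
        self.log = []

    def goto(self, x, y):
        self.position = (x, y)
        self.log.append(f"goto {x},{y}")

    def mine(self, tonnes):
        self.cargo += tonnes
        self.log.append(f"mine {tonnes}")


def run_plan(drone, plan):
    """Execute a pre-uploaded list of (command, args) pairs in order."""
    for command, args in plan:
        getattr(drone, command)(*args)


plan = [
    ("goto", (40, 12)),  # fly out to the asteroid
    ("mine", (5,)),      # extract 5 tonnes of ore
    ("goto", (0, 0)),    # return to the station
]

drone = MiningDrone()
run_plan(drone, plan)
print(drone.position, drone.cargo)  # (0, 0) 5
```

Doing it live would simply mean feeding commands in one at a time instead of as a batch, which is where the latency problem bites.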
Mining asteroids is more than likely coming. How we actually mine them has yet to be determined. Reducing risk to life and cost by controlling drones to do it would, as I see it, be advantageous. Perhaps we’re already refining the controls and schemes to do it? Perhaps we’re already running the simulations; maybe we’ll be doing it for real within 300 years.
Electronic circuits: we use them everywhere; they’re almost an organism in themselves. In fact, we have names for when they take the form of an organism: robot, cyborg, artificial intelligence, computer. We use electronics to identify ourselves, to authorise who we are, to communicate with one another. In essence, they are an extension of ourselves and of our self-assertion.
We use them in pass cards, to grant access to areas where we’re allowed and deny it where we’re not; to control our heartbeats; and to augment our bodies in other ways, hopefully to improve them or at least prolong our short lives. Which, as it happens, are getting longer.
Some believe that the future is for us to be integrated with electronics, our consciousness transferred into or merged with them, to live immortal; to have circuitry embedded within our bodies so that we’re easily certified and identified as who we are.
I think the next jump will be organic circuitry and its understanding. What do I mean?
Stem cells can create organs, given a template (from my limited understanding), which can be of our own construction, as it were, so that the organ is not rejected. Currently, if you implant a circuit into your body, it will itch; the body wants to reject it. Metals will break down and poison us.
Let’s create an organic circuit, one made from cells or similar. Make it one with us but customised, perhaps grown from stem cells so that it’s not rejected by the body. All we have to figure out then is how to send and receive signals from these circuits, and how to power them. Heat perhaps? Or maybe bodily constructions that do a form of energy recovery: blood-hydro-energy plants as small as nanobots?
Imagine: we’re starting to be able to read pictures from people’s minds and also to control devices using the electrical impulses of the brain. Here’s the inevitable “what if”: suppose we have made these organic circuits that can receive and transmit data; they could then be extended to do so wirelessly.
I have seen technology for reading from the mind, and for sending some data back into it (such as hearing aids), but what this could potentially suggest is a form of telepathy amongst those who have such a combined system in place.
Is it really so far fetched, now?
Pretend that reading pictures from the mind is possible and extends to sound, touch and smell: the senses can be presented and transmitted.
How can you even teach the mind to control what you could send and receive over an organic circuit plugged into such a thing?
How could we receive the data that we’ve just read from someone else? Would it be implanted as a memory or, more likely, would it be injected into the receptors that already exist, such as the connections for the eyes, ears and tongue?
Would that mean that you would have to switch, between physical and ‘virtual’ senses?
How could you prevent someone from interfering with you, your transmissions, your reception?
What is more, would this finally mean you could interact physically with the current incarnation of the internet and the virtual world?
It depends: on what we can work out, on what breakthroughs we can make, on whether the discoveries are revealed to anyone who can do something with them and, mainly I think, on whether we would go crazy being able to do these things. It may just be the final bridge between the organic and the metal electronic.
In academia, advancements can sometimes be large leaps and very bold, depending on proof. Sometimes, though, I think that the best improvements come in increments as they’re tested and spread out to everyone. Then someone gets an idea, it gets shaken up again, becomes affordable and is taken up en masse.
I was looking at some old photographs, I forget where, of an office and it was full of rows upon rows of desks. Piles of paper stacked high and people sat at them with pen and ink, passing the paper around.
Move forward in time: the paper is reduced, typewriters take its place, calculators appear and the number of desks lessens.
Again another jump, fewer desks again and we have computer systems, still some paper (and a lot more post-it notes).
I wonder if anyone considers the next step in this? Perhaps the advancement of video-game entertainment technology can provide some insight, where the need to communicate regularly with friends and team-mates has produced interesting mixtures of text- and voice-based chat systems, with some video.
From what I have observed, though, video chat takes a back seat unless there is a real-world ‘meat space’ object that needs to be shown. If the item is something created in the ‘metaverse’ of the video-game space, then there is little to no requirement to actually see the person behind the virtual entity.
I had an idea. What if, instead of a monitor, keyboard and mouse, you used your body and voice, perhaps even your mind to make natural interactions with your surroundings which could then be anything from anywhere?
In fact, perhaps there could be entities which are not even really there. I thought that a combination of existing hardware could do this: there’s the Microsoft Kinect, Vuzix Eyewear and the OCZ Neural Impulse Actuator.
As I’ve said before, with comments about the ‘metaverse’ and technology such as Google’s Project Glass, I think that others are becoming aware of this kind of idea, including Microsoft with their leaked ‘Kinect Glasses’ concept.
This is not so strange: people coming up with similar or the same ideas at the same time. I think it happened with the television and the telephone. Amusing that it could be happening with another method of communication, or at least, interaction.
Perhaps there’s a natural evolution to this?
These alternative virtual-reality, 3D methods of interacting with the ‘metaverse’ or ‘computer world’ or ‘another Earth’ are fantastical, possible, realistic.
How does a blind person interact with it?
In the days when I mainly used my Nintendo Entertainment System, then the Commodore Amiga 1200, using them consisted of sitting on the floor by the television. This quickly evolved into having my own, smaller television, where I then sat at a desk with the Amiga 1200.
After many years I’ve been prone to the typical pains that blight many users: lower back pain, repetitive strain injury in the wrists, neck pain, headaches, etc. I think to myself:
“Wouldn’t it be nicer if using a computer felt more natural, closer to how we interact with everyday objects? Sure, using a mouse or a game-pad can be a fast way to use items, but is it not possible for me to walk, to sit, to just think and control a computer?”
I believe it is. Devices such as the OCZ Neural Impulse Actuator have come along, though doubted by many and picked up by few. This is a step in a good direction.
Not only this, though, for I also dislike computer monitors. In fact, they are such a hazard that there is written UK law on their correct installation and use in the workplace. So how, then, about using Vuzix Eyewear? Or Google’s Project Glass? It’s not entirely science fiction either, as some people have attempted to put it together.
Still, this is not quite full interaction with a computer system; it relies purely on voice. It feels to me that it requires something else, something tangible to interact with. How far away are we from the type of interaction you can get from Heavy Rain’s ‘ARI’?
How would such a system work? Is it possible that a ‘virtual reality’ system such as this could actually interact with the ‘real world’? How would it map it?
Microsoft’s Kinect is technology that can map an area in 3D. Perhaps if it could be made at a smaller scale, or implemented into standard rooms (or into something worn, so that mapping happens as you look around), then the world could be presented and interacted with as required via some form of eyewear or, as some are calling it, ‘wearable computing’, which, it appears, is not an entirely new concept but is being discussed.
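The core of that 3D mapping is simple geometry: a depth camera gives a distance for each pixel, and a pinhole camera model turns each pixel into a 3D point. Here is a minimal sketch of that back-projection; the tiny depth grid and the focal length and image-centre values are made-up examples, not real Kinect calibration data.

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Convert a 2D grid of depth values (metres) into (x, y, z) points.

    fx, fy are focal lengths in pixels; cx, cy the image centre.
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:  # no depth reading at this pixel
                continue
            # Pinhole model: a pixel (u, v) at depth z back-projects to:
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points


# A 2x2 toy 'depth image': one missing reading, three at 2 metres.
depth = [
    [0.0, 2.0],
    [2.0, 2.0],
]
points = depth_to_points(depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
print(points)
```

A real Kinect does this for hundreds of thousands of pixels per frame, producing the point cloud that a wearable display could then render and let you interact with.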
Still, we are a little way from the technology depicted in fiction, and while the current solutions do not entirely take into account control from the mind, I firmly believe that it is possible to get there and I would love to be able to work on it.
Another Earth is a film which I watched recently: the fantastical idea that there’s another planet on the other side of the Sun holding an exact duplicate of the planet we live on and every person on it, like the typical ‘parallel dimension’ theories often thrown around in TV shows such as Sliders.
What if, though, there is already ‘Another Earth’, but it isn’t populated yet? In fact, it’s still developing and evolving. Here’s an ‘out there’ concept.
It’s (on) the internet.
I’m pretty sure this isn’t a new idea, but I’ll attempt to collect my thoughts on the topic. I feel that the internet was born out of a need to communicate and to share information. This could be the key part of humanity: communicating, along with surviving, of course. So what do I mean when I say that ‘Another Earth’ is ‘the internet’?
In the virtual world Second Life (and in a lot of massively multiplayer games) we can see a synthetically created world mimicking physical reality. Items are bought and sold, people own land and social relations are developed. A key part which I found interesting, though, is that you can transfer in-game funds to physical reality: your in-game currency can be converted into dollars, or any other international currency. People, then, were creating characters, purchasing items and taking up jobs in Second Life.
Yes, jobs. Working, earning money in various ways, from the socially questionable to being a shop clerk or simply using a 3D-modelled persona to demonstrate attire. Perhaps, then, this is just the beginning.
This isn’t the only virtual reality which allows you to transfer between virtual and physical goods. CCP’s Eve Online allows you to purchase game time with in-game currency, potentially making it so that you never have to use ‘physical’ cash to lengthen your subscription. They’ve also enabled the ability to purchase computer graphics hardware, further extending this example.
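The play-for-free arithmetic is worth making explicit. The prices below are invented placeholders (the real in-game market price of game time fluctuates constantly), but the shape of the calculation is the point: in-game income per hour determines how many hours buy a month of subscription.

```python
# Assumed, illustrative numbers only; real market prices vary.
GAME_TIME_PRICE_ISK = 500_000_000      # price of 30 days of game time in ISK
MINING_INCOME_ISK_PER_HOUR = 20_000_000  # assumed in-game income per hour

hours_needed = GAME_TIME_PRICE_ISK / MINING_INCOME_ISK_PER_HOUR
print(f"Hours of mining per month of game time: {hours_needed:.0f}")
```

Under those assumptions it is about 25 hours of mining per month: a part-time virtual job paying for itself.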
So the idea is no longer new, but where can it go from here? Accessing this ‘other Earth’ is quite specialist: you’d have to know enough about computer systems to acquire hardware to run it suitably, know the language the software is presented in (which may well be mainly localised to English) and also have the communication network set up to access it.
So for it to advance I think there are two main obstacles to overcome: the first is the interface barrier, the second is the language barrier.
With the interface barrier, it seems apparent with Second Life as an example that a virtual world mimicking physical reality is the first stepping stone. To access this naturally we have seen ‘VR’, or ‘virtual reality’, equipment come into play, which in its early years was cumbersome, heavy and demanded a lot of computing and electrical power. Now, however, technologies such as Google Glass and Vuzix Eyewear are coming along to help overcome the tie of sitting at a computer monitor with a keyboard and mouse.
As for the language barrier: imagine you have two people who have a ‘virtual job’ (let’s call it) and they have to communicate with one another, but they speak different languages. Audio recognition is coming along in leaps and bounds and can translate what you say into text, at the very least when trained to your voice. All that is required then is for this text to be translated from one language to another and played back, or even just displayed. This interim technology breaks down the wall of communication, and with it being in the ‘internet virtual world’, it is nigh on instantaneous.
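The speech-to-text-to-translation flow can be sketched as a pipeline. Every stage below is a deliberate stub (real systems would plug in actual speech-recognition and machine-translation services, and the two-word dictionary is obviously a toy), but it shows how the stages chain together so that what one speaker says is what the other reads.

```python
def recognise(audio):
    """Stub speech recogniser: our 'audio' is already text in this sketch."""
    return audio.lower()


# Toy English-to-French dictionary standing in for a real translation service.
TOY_DICTIONARY = {"hello": "bonjour", "friend": "ami"}


def translate(text):
    """Stub translator: word-for-word lookup, unknown words passed through."""
    return " ".join(TOY_DICTIONARY.get(word, word) for word in text.split())


def relay(audio):
    """The whole pipeline: speech in one language, text out in another."""
    return translate(recognise(audio))


print(relay("Hello friend"))  # bonjour ami
```

Swap the stubs for real recognition and translation services and the pipeline is the instantaneous wall-breaker described above.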
Jump a few concepts ahead. You have a person from England and another from Japan, each walking around with their new method of accessing this ‘other Earth’ active from the physical world. They have a virtual job managing a Russian communication forum, and the systems translate into each person’s native language naturally from their speech.
This could just be a start.