Imagine that you are in a VR adventure where you enter a room full of characters. They are all minding their own business: talking to each other, playing games and, in general, being alive. When you approach a group, or a single character, you start talking with them normally and they talk back as if you were one of them. They are all more or less following the script of a story where you are the protagonist, and it is by interacting with these characters that you progress through your very personal adventure. How could we create this?
In March 2017 I was convinced by a friend to attend the Hack_Construct hackathon in Manchester. What made this hackathon interesting was that it was aimed at Health and Safety processes in the construction world. Teams were formed from a good mix of construction engineers, architects and programmers, and the goal was to use technology to find new ways to improve the industry's incredibly obsolete H&S procedures.
I am working on a new VR game for PC. The concept is quite simple: you fly a stunt kite that has to perform a choreography in time with the music.
I started thinking about what would make a good standing PC-VR experience, and kite flying is perfect: you usually face the same direction, taking at most one or two steps forward or backward while controlling the lines. No need to teleport around or use the controllers awkwardly; you just swing your arms, walk a little and look around!
This is a post rather than a proper page because I just wanted to highlight some of the features I am developing for the game. In this project I am taking care of the code and shading while a colleague does the 3D, so most (all) of the pictures are works in progress.
I have always liked to think of my house as a playground, which is why I build a pillow fort every now and then (I am getting incredibly good at it), complete with mazes, levers and riddles. But maybe I am finally getting old, because for the last one I decided that moving mattresses and tables and hanging ropes everywhere was not a great idea: once the puzzle is finished, I have to dismantle it and can never replay it again.
I recently went to my first “Escape the Room” game ever. In these games you are locked in a room with some friends for 60 minutes and have to solve puzzles to get out. I really had a great time; it adds so much more to the experience than the video-game versions! I was also pleasantly surprised by the difficulty: instead of 4 or 5 very hard riddles you get dozens of smaller ones, making the experience more satisfying than frustrating… I needed to make one of these Escape the Room games myself!
In December 2015 I was invited to Granada Gaming, a video-game festival held in my home town, to talk about VR and my interaction experiments. Very exciting times!
I had to give two talks. The first was aimed at professionals (coders, artists, journalists), where I explained some of the decisions I made while creating Apnea (my perpetually in-progress video game). The second was for a general audience, and for this one I wanted to talk about something that seems to concern a lot of people: VR's limitations, and why FPS games won't work very well at first.
I won't cover the whole talk here, as many of the interaction experiments showcased can already be found in the “VR Wireless” post and on my GitHub page, but I did create something I think is a cool hack to address one of the main trends in VR movement: the blink transition.
October 2014 came fast and I was ready for another HackManchester after having a blast the previous year. But this time, having worked 60+ hours that week, I decided to do something much simpler so I could get some sleep.
At this point I was starting to experiment with an idea for what would later become Apnea, my first ever commercial/experimental game (still in the making… but more on this in another post). One of Apnea's key features was detecting the user's steps using the HMD's accelerometer; another was detecting the user's breath with the microphone. Soon I realized I had a problem: every time the user walked, very strange signals appeared in the breath detector. Quite odd! I fixed those problems much later, but at the time I thought: what if I make a small interactive game out of this odd behaviour?
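The post doesn't show how the step detection actually worked, but the basic idea is simple enough to sketch: treat a step as a spike in the accelerometer's magnitude above the gravity baseline, with a short refractory window so one footfall isn't counted twice. This is a minimal illustration under my own assumptions (sample rate, threshold and function names are all hypothetical, not from the Apnea code):

```python
# Hypothetical sketch of step detection from an HMD accelerometer.
# Assumes samples arrive as (x, y, z) tuples in m/s^2 at a fixed rate;
# none of these names or values come from the original project.
import math

def detect_steps(samples, rate_hz=60, threshold=11.5, refractory_s=0.3):
    """Count steps as peaks in acceleration magnitude above a threshold,
    ignoring new peaks inside a short refractory window."""
    refractory = int(refractory_s * rate_hz)  # samples to skip after a step
    steps, cooldown = 0, 0
    for x, y, z in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if cooldown > 0:
            cooldown -= 1
        elif magnitude > threshold:  # spike above gravity baseline (~9.8)
            steps += 1
            cooldown = refractory
    return steps

# Fake signal: mostly gravity, with two well-separated walking spikes.
signal = [(0.0, 9.8, 0.0)] * 30 + [(0.0, 13.0, 0.0)] * 3 \
       + [(0.0, 9.8, 0.0)] * 30 + [(0.0, 13.0, 0.0)] * 3
print(detect_steps(signal))  # → 2
```

A detector this naive would of course also fire on head shakes and jumps, which is exactly the kind of cross-talk between sensors described above.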
Half a year after HackManchester I decided to give a data-based hackathon a go… this time joining a team of friends. The result was Light Raider, an Android app that encourages running among Mancunians by having them target street lamps. It went really well: we won the “quality of life” topic… I used that money to get myself an Oculus Rift DK2 (but that is a different story), and we even got showcased by some of the local media.
After leaving my VR Gun project aside for a while, I decided to go to HackManchester 2013 and give it the push it deserved by creating not just the gun but a full VR experience. In 25 hours I managed to finish the weapon and modify an existing game named Angry Bots to be playable with all the freedom of a wireless system!
A lot of things have happened since the last update, the most important being that I managed to get a job in the augmented-reality industry! This has kept me very busy for the last year, but I have learned a lot in the field, as well as Android, iOS, Unity3D and many, many more things. I love it!
Now that my “learning” curve has eased off, I finally have time to carry on with my personal projects / hacks. And this time I promise I will include some proper code and tutorials (by the way, I have enabled comments, so feel free to comment on the code in the older entries).
So this first new entry is about the fact that I just received an Oculus Rift. This HMD is expected to bring VR back from the trenches, and I wanted to try it first hand… It offers real-time head tracking and proper stereoscopic 3D with an epic field of view. You can move around the scenes, and after three minutes you truly believe you are inside. But head tracking is not everything in Virtual Reality, and there are still some rough edges. One of them is solving the problem of “moving around” with some kind of positional tracking, and I will talk about that one at some point; the interesting one today is interacting with the environment, and when I say interacting I mean shooting at things… time for some Virtual Reality FPS.
One of the advantages of the AR system I developed is the ability to couple it with the logic of a video game. While I was coding the system, my friend Carlos Torija was designing a video game: he created the artificial intelligence and logic, and then we both added simple graphics with OpenGL and gave it some AR. In this game you have to evade/attack an evil drone that follows you and tries to kill you. The game is designed to be played in an open space, and it has virtual walls! The next step is map recognition.
I also started another AR game using my system. I planned to release it for bada 2.0, but Samsung keeps delaying the platform, so the game remains unfinished. It is an augmented stunt-kite simulator; at the moment it has really simple physics and fixed wind, but I plan to add a wind system driven by weather forecasts, plus more advanced physics, so it can perform realistic tricks.