Steam launch of Freedom Locomotion VR with Oculus Rift + Touch compatibility

Link to the demo: http://store.steampowered.com/app/584170/

The Freedom Locomotion VR demo is finally ready to launch on Steam. It's worth noting that the HTC Vive-only version of this demo has been available on Itch.io for over a month now; I've held off on the Steam release mostly so that I could make it fully compatible with the Oculus Rift + Touch controllers.

Going by the events of the last week and a half, this is a timely release: Oculus has won back the favour of the VR consumer public by (mostly) fixing up long-standing tracking issues, providing a significant price drop for the Rift and Touch system, and releasing the first real 'Killer App' for VR in Epic's Robo Recall.

While the original Freedom Locomotion VR demo would 'run' on the Rift platform, it had significant compatibility issues: head motion wasn't properly tracked, the menu was inaccessible, hand rotation and grab functionality were off, and the Vive control scheme didn't gel well with the Touch controllers (for example, having to push and hold down the Vive touchpad to move).

All these issues have since been rectified, and significant effort has been put into ensuring that the locomotion controls are well suited to the Rift. Differences between the Touch and Vive controllers have made platform-specific options necessary: the Touch keeps full thumbstick directional input, while the Vive now defaults to 'simple touch', where the movement direction depends on the orientation of the controller itself (with either forward or backward movement based on that orientation).
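The Vive's 'simple touch' mapping can be sketched roughly as follows. This is a minimal illustration, not the demo's actual code: the function name, the radian input, and the front/back touch flag are all assumptions.

```python
import math

def vive_simple_touch_direction(controller_yaw_rad, touch_forward):
    """Hypothetical sketch of 'simple touch' steering: movement follows
    the controller's own yaw on the ground plane; touching the front of
    the pad moves along it, touching the back reverses it."""
    sign = 1.0 if touch_forward else -1.0
    # Unit direction vector on the ground plane (x = right, y = forward).
    return (sign * math.sin(controller_yaw_rad),
            sign * math.cos(controller_yaw_rad))
```

With the controller pointed straight ahead, a front touch yields a forward vector and a back touch the exact reverse, which is the forward/backward behaviour described above.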

Similarly, the HUD elements are repositioned slightly to account for differences in field of view between the two devices (side note: the Rift has a facial interface that biases it towards more angular European faces, while the Vive has a flatter facial interface that biases it towards flatter Asian faces like my own - which in turn alters the field of view achieved on either device). Additionally, all the graphics and tutorials have been customized to fit the Rift.

Just as much time has been poured into improving and iterating the original system based on the detailed feedback provided by many users. The system now has a calibration process that better tailors the sensitivity of the locomotion to the individual user. It also has complete control at the lowest movement ranges, so even a very tiny amount of head movement while movement is active will result in some small degree of motion.

The HTC side has also received various changes that will hopefully improve the experience for users. The most important is a small change in how movement activation works. Instead of press and holding the touchpad to keep the movement active, users can now simply click the touchpad and maintain thumb contact.

This has the benefit of rejecting erroneous thumb touch as movement input (the touchpad is a big surface with excellent thumb resting potential), while reducing the amount of stress and strain affecting both the user's thumb, and the touchpad itself - which is unfortunately afflicted by a design issue where a component is liable to slip out during usage, resulting in erroneous double clicking among other things.
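The click-to-start, touch-to-sustain scheme amounts to a small per-frame state update. The sketch below is illustrative only (the function and parameter names are assumptions): a click arms movement, resting the thumb on the pad keeps it going, and an idle thumb touch on its own never starts it.

```python
def movement_active(was_active, clicked_this_frame, thumb_touching):
    """Hypothetical sketch of the Vive movement-activation logic:
    a touchpad click starts movement, maintained thumb contact
    sustains it, and lifting the thumb ends it."""
    if clicked_this_frame:
        return True
    # A bare touch only counts if movement was already active,
    # so a resting thumb can't trigger movement by itself.
    return was_active and thumb_touching
```

This is how the scheme rejects erroneous resting-thumb input while sparing the touchpad from a sustained press.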

Dash and Blink step locomotion has been changed up as well, and made to match the Oculus version. Instead of clicking faster to move faster (which put unnecessary stress on the touchpad), you now use a combination of elbow/hand angle and distance of the hand from your body to determine step speed and step size respectively. Point the controller towards the ground and you'll go slow; point it towards the horizon with your arm stretched out and you'll go fast.

This has a nice effect: you can effect a wide range of motion in Dash or Blink step without uncertainty (i.e. you're not going to accidentally brush the outer portion of the touchpad and suddenly move 4x faster than you expected from touching the middle portion).
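The angle-and-reach mapping can be sketched as two simple linear ramps. All of the ranges and constants below are assumptions chosen for illustration, not the demo's tuned values.

```python
def dash_step_params(controller_pitch_deg, hand_distance_m,
                     min_speed=1.0, max_speed=6.0,
                     min_step=0.5, max_step=2.0):
    """Illustrative sketch of Dash/Blink step control: pointing at
    the ground (-90 deg) gives the slowest steps, pointing at the
    horizon (0 deg) the fastest; reaching further out takes bigger
    steps. All ranges here are assumed, not the demo's real values."""
    # Map controller pitch from [-90, 0] degrees onto [0, 1].
    t = min(max((controller_pitch_deg + 90.0) / 90.0, 0.0), 1.0)
    speed = min_speed + t * (max_speed - min_speed)
    # Map arm extension from [0.2, 0.7] metres onto [0, 1].
    u = min(max((hand_distance_m - 0.2) / 0.5, 0.0), 1.0)
    step = min_step + u * (max_step - min_step)
    return speed, step
```

Because both ramps are continuous, nothing in the input space produces a sudden jump in speed, unlike the old inner/outer touchpad zones.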

The Steam launch of this demo sees the inclusion of a new (optional) turning mechanic. The snap turn is familiar in name, and similar in function to other existing snap turns (in that the user will instantly rotate to face the new direction, avoiding vection from visual motion completely), but has a key difference that I feel helps to elevate it into a fantastic turning solution for VR.

Where other snap turn mechanisms have you push the analogue stick or press the touchpad to turn a fixed angle in the direction you indicate (sometimes 30 degrees, sometimes 45, sometimes 90), the Snap Turning in Freedom Locomotion is variable, based on the angle of the controller relative to the angle of the HMD.

This means if you want to Snap Turn directly to your right, you can point to the right and press the button. If you want to turn incrementally to the left, you just point left a bit and click the button a few times. And if you want to do an immediate 180, you point backwards and press the button.
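The core of the variable snap turn is just the controller's yaw relative to the HMD's, wrapped into a signed angle. A minimal sketch (function and parameter names are my own, not from the demo):

```python
def snap_turn_angle(controller_yaw_deg, hmd_yaw_deg):
    """Sketch of the variable snap turn: the turn angle is the
    controller's yaw relative to the HMD's, wrapped into (-180, 180]
    so that pointing behind you reads as a full 180-degree turn."""
    delta = (controller_yaw_deg - hmd_yaw_deg) % 360.0
    if delta > 180.0:
        delta -= 360.0
    return delta
```

Pointing 90 degrees right of where you're looking turns you 90 degrees right; pointing directly behind you produces the immediate 180 described above.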

Once you've played around with it for a few minutes, you'll be able to turn in the exact direction you want with very little error, and no nausea.

Finally, as a last minute tweak, I changed up the way that the system smooths out the head motion, making it more direct than before. It feels like a substantial improvement in terms of responsiveness, and gives you a much more direct 1 to 1 control over your foot and head motion and your motion in the virtual world. This change applies to both the Vive and Rift versions.
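The trade-off in head-motion smoothing can be illustrated with a simple one-pole low-pass filter. This is a generic sketch of the responsiveness/smoothness trade-off, not the demo's actual filter, and the alpha value is an assumption.

```python
def smooth_head_speed(prev, sample, alpha=0.6):
    """One-pole low-pass filter illustrating the smoothing trade-off:
    a higher alpha weights the newest head-motion sample more heavily,
    making movement feel more direct and 1-to-1 at the cost of passing
    through a little more jitter."""
    return alpha * sample + (1.0 - alpha) * prev
```

Pushing alpha towards 1.0 is the "more direct" end of the dial described above; alpha = 1.0 is no smoothing at all.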

Beyond the demo itself, work will continue on the locomotion system. The primary intent behind the Freedom Locomotion System is ultimately to give the widest range of users access to large scale continuous movement of the type we're familiar with from traditional games.

To that end, I have plans to include sliding locomotion... at this point, I think there is an entrenched user base that simply won't compromise on this one. While sliding locomotion isn't suitable for every potential VR user, it is the best solution for a pretty reasonable range of the existing (and probably future) users.

I'd also like to add in a proxy based teleportation system. A proxy teleportation system essentially has the user pilot an avatar to a location before the user teleports and assumes control of the avatar, which provides a nice compromise between the 'safeness' of pure teleportation and the game play preserving functionality of continuous locomotion (whatever its form), where the player must account for obstacle traversal and enemy avoidance.

We've already seen proxy teleportation in a couple of VR experiences - like VRChat and Skeleton Fighter VR (both available on Steam) - so there's ample evidence to show that this works very well. Of course we'll add our own twist on the formula to make it gel better with the rest of the system - but it'll essentially offer an option that allows any user who can only stomach room scale and teleportation VR to also play games that end up using the Freedom Locomotion System.

Speaking of which, the plan is still to develop the Freedom Locomotion System into a plugin - initially for the Unreal Engine, and eventually for Unity - so that other developers can tap into a ready-made, versatile, high quality locomotion solution that lets them design a game flexibly without having to worry too much about gameplay incompatibilities between the various locomotion solutions.

This is work that will happen following the optimization of the demo and will hopefully help to positively impact the VR industry as a whole in the years to come.

Working hard on improving the Freedom Locomotion System

I'm still here. I've received a lot of good feedback from users over the last week or so about the Freedom Locomotion System, and it's hugely appreciated. While I did what testing I could before release, it simply cannot compare to the range of feedback that you get from a public release.

A lot of the feedback was highly positive. People got what I was trying to do and, as I'd hoped, after playing around with it for a bit started to really get immersed in it. A lot of people are hoping to see this system, or one like it, in future VR games - and given that's my goal, that's very pleasing to hear indeed.

Not all of the feedback was singing high praise though. A good number of users commented on a number of issues (although many of them still enjoyed other elements of the demo despite the shortcomings that they identified).

A few key things have been identified from the demo that I'm currently working hard to resolve, remedy and improve:

Walking movement not sensitive enough. This seemed to affect a large number of users, while others seemed to be very happy with the way things moved. After watching a couple of videos from community members who tested the demo and posted their own experiences, I realized that the range of movement styles varied significantly, especially at lower speeds.

I mean, we all walk at roughly the same speed in the real world. But the specifics of how we walk can vary significantly from individual to individual. When you take out the horizontal component of movement (as on the spot movement does), that difference is quite significant in terms of head motion.

As a result, the required solution is to provide a system that can allow users to calibrate for a wide range of movement styles, and that's what I've been working on today. From my own early testing, it seems to have been very successful, providing reasonable motion for a wide variety of movement styles. Hopefully I can get some extra testing to prove it out, but I'm excited to get this fix out there - because if it works as well as I think it does, it'll mean that the rest of the people that weren't won over by CAOTS will start seeing what the fuss is about.
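One simple way to normalize across movement styles is to scale sensitivity by each user's measured head-bob amplitude. The sketch below is purely illustrative of that idea (the reference amplitude and clamp are assumed values, not the demo's calibration):

```python
def calibrate_gain(measured_bob_amplitude_m, reference_amplitude_m=0.03):
    """Illustrative calibration sketch: users whose on-the-spot
    walking produces smaller head bob get a proportionally higher
    gain, so a 'normal' walk reaches a similar in-game speed for
    everyone. The reference amplitude is an assumed value."""
    # Clamp the measurement so a near-zero reading can't explode the gain.
    amplitude = max(measured_bob_amplitude_m, 0.005)
    return reference_amplitude_m / amplitude
```

A subtle head-bobber ends up with a gain above 1, a vigorous one below 1, which is the kind of per-user tailoring the calibration process is after.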

Another issue is that some users were still getting motion sick, especially during the less standard movements, like going up or down a slope, or falling down. This is of significant concern to me, as it goes against the broad based solution I'm trying to achieve. I'm not just trying to make an immersive locomotion solution, but a widely accessible one.

In reality, I already had a couple of solutions among the comfort options that could help reduce the amount of motion sickness experienced. The comfort option that restricts field of view when the user is moved around artificially is a tried and proven technique, while the comfort boundary option is analogous to a 'virtual cockpit', which works well in other cockpit based VR experiences.

But neither of these were on by default, as the field of view restriction in the demo was not up to the level of quality I was hoping for. The main problem was that instead of a sharp boundary restricting the user's field of view, it was a vignette that darkened the outside and gradually got lighter towards the middle. This had the unfortunate side effect of making it feel like the user was wearing sunglasses every time the comfort functionality activated.

Over the last week, I've worked hard to improve this functionality, and have resolved the technical issues that prevented me from utilizing the better version of the field of view restriction. I've also made it more 'intelligent', so that it changes how much of the view it blocks depending on how much vection is expected from the motion (falling, for example, restricts the field of view a lot more). An extensive set of options has also been included to allow users to tweak the parameters individually to their heart's content if the defaults aren't to their preference.
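The vection-aware restriction can be sketched as a radius that shrinks with expected vection, with falling weighted more heavily. Every constant here (speeds, radii, the falling weight) is an assumption for illustration only:

```python
def vignette_radius(speed, fall_speed, max_speed=6.0,
                    min_radius=0.4, max_radius=1.0):
    """Hedged sketch of vection-aware FOV restriction: faster
    artificial motion closes the visible circle further, and falling
    (weighted double here, an assumed factor) closes it more
    aggressively still. Radii are fractions of the full view."""
    # Crude vection estimate, clamped to [0, 1].
    vection = min((speed + 2.0 * fall_speed) / max_speed, 1.0)
    return max_radius - vection * (max_radius - min_radius)
```

Standing still leaves the view fully open, full-speed movement closes it to the minimum, and any falling gets you there even sooner.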

Additionally, this new field of view restriction solution works extremely well with the comfort borders, allowing the user to see the borders in their full field of view, even while the view of the game world is restricted. 

Beyond the comfort options, I've also improved the climbing mechanics significantly. Users will now get a lot more feedback on what they can and can't grab at when climbing, and when they're successful in grabbing at something. Additionally, I've communicated more clearly in the tutorial how the climbing works (you put your virtual thumb and forefinger around an edge so that you can pinch it).

As a result, these changes should make climbing around a substantially easier task for many more users than before.

Finally, I'm currently working on a detailed first time calibration system. Instead of providing a set of defaults that just haven't worked for everyone, I'll be making first time users go through a process of finding the options that suit them best before they're put into the demo proper.

In reality, this amounts to explaining the options clearly and allowing users to select them for themselves. But this sort of communication and expectation setting is an important part of design, I feel. It should prepare them for some degree of complexity, and introduce those concepts beforehand rather than dumping them into it cold.

It also helps because many users simply accept the defaults as a given, and won't delve into an extensive range of options even if they're available. The calibration process thus simultaneously provides a better fit for the wide range of different users, and makes them aware that they can actually change a lot of settings in the system.

Once I'm done with this, I'll publish an update so that users can try it out. After this update, my plan is to resolve the Oculus Rift compatibility issue (i.e. it's barely compatible currently, as I've done nothing to make it so - and as a result there are a number of issues with running the current version of Freedom Locomotion VR Demo on the Rift). The Rift enabled version should see a launch on the Steam store once it's done.

Very excited to get these updates out there and into the hands of more users. I'm hoping to get this pre-Rift-compatibility update done in a few days. I strongly believe in what I'm doing here: making VR locomotion more immersive and more accessible, and I don't plan on resting until that's true for as many people as possible.

Demo is launched

https://hugerobot.itch.io/freedom-locomotion-vr

Well, you probably already know it if you're reading this.

It's compatible with the HTC-Vive. Proper Oculus Rift compatibility coming later.

There are a few outstanding bugs I know about (and probably many I don't), but it's in a 'good enough to see what it's all about' state.

From here, I'll be fixing up outstanding issues that public feedback provides me, and looking to make it Rift compatible.

This is just a tech demo at this point. I've been getting plenty of requests to make it available as a plugin and all that. And I've come around to thinking it's a good idea.

But to make it game ready - for other developers to put into their own games and systems and have them ship it - there'll be a lot more work that needs to be done, on refactoring the code, making it performant, ensuring that it's well documented, and all that good and necessary stuff needed to get it out more broadly into the end user's hands.

In the mean time, I can still be contacted and will be happy to assist other developers with more hands on implementation of the ideas and systems found within the Freedom Locomotion System.

The first blog post!

The first dev blog. It's been a fun ride getting here. This started after I took a break from writing a book about 'The Future of Virtual Reality'. A lot of the ideas I was putting in there have been folded into the work I've done on Freedom Locomotion VR. Similarly, I have a lot more ideas of how to proceed from here... but until recently, those ideas were fairly worthless.

Not because they weren't good ideas - they were great ideas, as I well knew. Just that ideas don't do much without the sweat equity to turn them into reality.

As a one man band, I'll be plodding away, continuing to iterate, refine and polish the Freedom Locomotion System while expanding its functionality more broadly into general interaction. As an ideas guy, I've generally got way more ideas than I have time or talent to execute on. But at the same time, with a design background, I can apply the attention to detail needed to make sure things are just right when executing an idea.

If you're interested in what I'm doing, and you're interested in VR or working in VR development, get in contact. We might be able to work together and create something amazing!