Lykta requires accurate indoor positioning, and the road to a functional solution has included a couple of (educational) dead ends. With the horror experience starting to come together, I think it's time to talk a bit about the challenges we've been facing.
Many location-aware apps use GPS, but this provides at best an accuracy of a couple of meters outdoors, and is even more imprecise indoors. Our current solution is based heavily on Sony's Playstation Move system and their Move.Me API, which provides sub-centimeter positioning accuracy in an area of about 3.5x4 meters, as well as very accurate orientation tracking. However, it is a rather closed system: the calculations on the data from the controllers and the tracking camera are all done on an actual PS3 running special server software, with the position and orientation data sent over the network to a computer. The picture below explains our current setup.
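Sony's actual Move.Me wire protocol is proprietary and not reproduced here, but the server-on-PS3, client-on-PC split can be sketched with an invented packet format. Everything in this snippet (the field layout, the seven-float format) is hypothetical, purely to illustrate how a client might unpack streamed pose data:

```python
import struct

# Hypothetical packet layout: position x, y, z (mm) followed by an
# orientation quaternion qw, qx, qy, qz, all little-endian 32-bit floats.
POSE_FORMAT = "<7f"

def read_pose(packet):
    """Unpack one pose packet into a (position, quaternion) pair."""
    x, y, z, qw, qx, qy, qz = struct.unpack(POSE_FORMAT, packet)
    return (x, y, z), (qw, qx, qy, qz)

# Example packet, as such a server might send it over the network
packet = struct.pack(POSE_FORMAT, 120.0, 80.0, 1500.0, 1.0, 0.0, 0.0, 0.0)
position, orientation = read_pose(packet)
```

In a real setup the packets would arrive over a TCP or UDP socket; the important point is only that the heavy lifting happens on the PS3 and the PC receives ready-made pose data.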
When we started the project we had heard about this cool thing called sensor fusion, where different sensor data, e.g. from the gyros, accelerometers and compass of a smartphone, is combined into a more accurate whole. We hoped we could use this to calculate the position and orientation of the phone. We managed to create an algorithm that made orientation tracking possible, but it was very sensitive to compass disturbances.
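A classic sensor-fusion building block in this vein is the complementary filter, which blends the fast-but-drifting integrated gyro rate with the noisy-but-stable tilt angle derived from the accelerometer's view of gravity. This is a minimal sketch of the general technique, not our actual algorithm:

```python
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One filter step: alpha weights the integrated gyro estimate,
    (1 - alpha) pulls the result toward the accelerometer's tilt angle."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

def accel_pitch(ax, ay, az):
    """Pitch angle (radians) estimated from the gravity vector."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

# Device held still and level: a bad initial guess decays toward zero,
# because the accelerometer term keeps correcting the estimate.
angle = 0.5  # radians, deliberately wrong starting value
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=accel_pitch(0.0, 0.0, 9.81),
                                 dt=0.01)
```

The compass would enter the same way for yaw, which is exactly where magnetic disturbances indoors become a problem.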
However, the data from the smartphone sensors was far from accurate enough to use for position tracking. Our first solution used a ceiling-mounted Wiimote as a wireless IR camera to track strong IR LEDs on the handheld unit, but it was hard to cover a wide enough area, as the Wiimote camera has a narrow field of view.
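The coverage problem is simple geometry: a downward-facing camera at ceiling height sees a strip whose width grows with height and field of view. The 41-degree figure below is an approximate, commonly quoted value for the Wiimote's horizontal IR field of view, used here only for illustration:

```python
import math

def coverage_width(mount_height_m, fov_deg):
    """Width of the floor strip visible to a downward-facing camera."""
    return 2 * mount_height_m * math.tan(math.radians(fov_deg) / 2)

# Even at a 2.5 m ceiling height, a ~41 degree horizontal field of view
# covers a strip under 2 m wide -- too small for a walkable play area.
width = coverage_width(2.5, 41)
```

Multiple overlapping Wiimotes could widen the area, but that adds calibration and hand-over complexity.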
We had a closer look at the PS Move and realized it had great accuracy for positioning. A camera detects the position and size of the glowing ball on the controller, and from this the controller's position relative to the camera can be calculated. Sony's Move.Me API, which we ultimately chose to use, did not seem to be accessible outside the US, so we started off with the open source PS Move API by Thomas Perl. We wrote a Unity plugin for it and got to try the experimental orientation tracking capabilities. They worked, but at the time they suffered from drift that required constant recalibration, something that would be tricky to do in a real use context.
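The size-to-distance idea can be sketched with a pinhole-camera model: the ball's apparent radius shrinks in proportion to distance, and its pixel offset from the image centre gives the lateral position. The constants here (focal length, the roughly 45 mm ball diameter) are illustrative approximations, not Sony's actual calibration:

```python
def move_position(u, v, r_px, f_px, cx, cy, ball_radius_m=0.0225):
    """Estimate the controller position (meters) from the glowing ball's
    image centre (u, v) and apparent radius r_px, via a pinhole model."""
    z = f_px * ball_radius_m / r_px      # depth: apparent size falls off with distance
    x = (u - cx) * z / f_px              # lateral offsets from pixel offset
    y = (v - cy) * z / f_px
    return x, y, z

# Ball imaged dead centre with a 30 px radius by a camera with a 600 px
# focal length: the controller sits straight ahead, about 0.45 m away.
pos = move_position(320, 240, 30, 600, 320, 240)
```

The real system also fuses the wand's inertial sensors, which is what makes the orientation tracking so robust.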
When we finally inquired about the Move.Me software we were lucky to get in touch with just the right person at Sony, and got hold of the server software surprisingly quickly. It turned out to surpass our expectations, as it continually calibrates orientation on the fly by comparing movement seen in the camera image with data from the sensors in the wand. An additional bonus is that position and orientation data is given relative to flat ground. However, one hard-coded limit hits us hard: the maximum range of 3.5 meters, which gives a pretty small play area.
Since early autumn, some classmates and I have been working to realize a sort of virtual reality flashlight concept. We've been trying out all kinds of solutions, including IR LEDs and Wiimotes, but around Christmas we finally found the perfect tracking technology... the oft-overlooked Playstation Move!
We just tried it out with a sort of "virtual art gallery" application at a local library; here's an uncut video from the event. I'm not providing any input at all, except pointing the "flashlight" around. There is some network lag, but we're working on reducing it.
Though similar concepts have been explored before, we haven't found any solutions that work this well without requiring expensive motion capture equipment, cables or AR tracking markers. We call it Project Lykta, from the Swedish for "lantern". The handheld unit is wireless, though when we shot this video it was still being charged for the main event.
I'm set to explore this concept further over the coming months in my master's thesis, Creating emotive experiences through handheld projection mapping. Basically, I'm aiming to create a horror experience for two players using this technology.
It was a really good learning experience to record and edit a whole episode, and the final version features music and cuts between segments of the program. The episode is in Swedish, but if you know the language and enjoy retro games, please have a listen!
On the weekend of the 8th of June, Erik Svedäng, Robert Edström and I arranged the first of hopefully many Enterhake game jams. We had over 20 participants, many from the #indieGBG group. The Interaction Design programme generously allowed us to use the whole second floor of Kuggen, so there was lots of space in a really creative environment.
The overarching theme was camera, but we also gave everyone Kinder eggs and handed out bonus points to those who managed to motivate their games using the contents of their eggs! In total, 5 original and widely varied games were made, spanning from networked multiplayer photo games to trippy Pokémon Snap-inspired experiences.
I worked together with Linus Nordgren from Hello There and Juha Kangas from Ludosity to make a mobile-augmented game that revolves around a green-sensitive mobile camera.
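The jam game's source isn't reproduced here, but the core of a "green-sensitive camera" mechanic could be as simple as counting pixels where the green channel clearly dominates. A hypothetical sketch, with a made-up dominance margin:

```python
def is_greenish(r, g, b, margin=30):
    """A pixel counts as 'green' when green clearly beats red and blue."""
    return g > r + margin and g > b + margin

def green_fraction(pixels):
    """Fraction of (r, g, b) pixels in a frame that read as green."""
    hits = sum(1 for (r, g, b) in pixels if is_greenish(r, g, b))
    return hits / len(pixels)

# Tiny four-pixel 'frame': two green pixels, one red, one grey
frame = [(10, 200, 20), (200, 10, 10), (0, 255, 0), (100, 100, 100)]
```

In a real mobile game this would run over camera frames each tick, with the green fraction driving the game logic.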
I did a walk-through of the jam late on the first night (and another the next afternoon).
More pictures and videos, as well as the games themselves, can be found on the jam website.