Review of the Lone Star Clash 2

December 6, 2012

Working at an eye tracking company and being eSports fans, we wanted to know if we could do something interesting in that field with the technology we have access to. We started by contacting teams and managed to get someone from vVv Gaming and two players from Quantic Gaming to try an eye tracker. The initial feedback seemed positive enough, so we decided to push on.
We got in touch with the Lone Star Clash 2 organizers and decided to attend and do some live eye tracking, which is really out of our element. Eye trackers are usually used in a nice experimental setting, with controlled lighting, all the time in the world, and participants who either get paid or simply have nothing better to do. Doing it on stage with progamers who need to concentrate on their game, with minimal downtime, is something else entirely.

What we did (aka the Tech part):

Since we had limited time, we went for the most obvious statistics we could think of: looks per minute and average look time for both the map and the resources, along with the number of fixations per minute and the average fixation time. These statistics can be calculated without any integration with the game; all they need is the screen coordinates of the map and resource regions. The last piece was the heatmap, which was generated from the data collected during a match.
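
To make that concrete, here is a minimal sketch (in Python, not our production code) of how such statistics can be computed from raw gaze samples. The sample fields, the region rectangle, and the heatmap cell size are all illustrative:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Sample:
    t: float      # timestamp in seconds
    x: float      # gaze x in screen pixels
    y: float      # gaze y in screen pixels
    valid: bool   # tracker confidence flag

def aoi_stats(samples: List[Sample], left: float, top: float,
              right: float, bottom: float) -> Tuple[float, float]:
    """Looks per minute and average look time for one screen region.

    A "look" is a maximal run of consecutive valid samples whose gaze
    falls inside the rectangle (e.g. the minimap or resource counter).
    """
    if not samples:
        return 0.0, 0.0
    looks = []               # (start, end) time of each look
    start = prev_t = None
    for s in samples:
        inside = s.valid and left <= s.x <= right and top <= s.y <= bottom
        if inside and start is None:
            start = s.t
        elif not inside and start is not None:
            looks.append((start, prev_t))
            start = None
        prev_t = s.t
    if start is not None:
        looks.append((start, prev_t))

    minutes = (samples[-1].t - samples[0].t) / 60.0
    lpm = len(looks) / minutes if minutes > 0 else 0.0
    avg_look = (sum(e - b for b, e in looks) / len(looks)) if looks else 0.0
    return lpm, avg_look

def heatmap(samples: List[Sample], width: int, height: int, cell: int = 20):
    """Coarse gaze heatmap: sample counts per `cell`-pixel screen tile."""
    grid = [[0] * (width // cell) for _ in range(height // cell)]
    for s in samples:
        if s.valid and 0 <= s.x < width and 0 <= s.y < height:
            grid[int(s.y) // cell][int(s.x) // cell] += 1
    return grid   # render with any colour map to get the stream image
```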

Our physical setup consisted of two Mirametrix S2 eye trackers, one for each player’s screen, and three laptops. Each eye tracker was plugged into its own laptop, placed beside the player’s computer. The eye tracker software ran on these machines so that we had no effect on the computers running the game. We have used the software on the same computer that runs StarCraft 2 numerous times without any problem (same thing for our gaming partners!), but since we had no time to do extensive testing on the Lone Star Clash setup, we preferred to be completely autonomous. The third laptop, the so-called server, sat in the control room with all the other technical equipment. It ran the software that calculated the statistics and generated the data, connecting to the software on the other laptops over TCP/IP.
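
For the curious, the server side of that connection can be sketched roughly as below. This assumes the tracker software speaks the Open Gaze API that Mirametrix published for its trackers (XML `<REC .../>` records over TCP, default port 4242, gaze normalized to [0, 1] of the calibrated screen); the host address is a made-up example:

```python
import re
import socket

def gaze_records(host="192.168.0.10", port=4242):
    """Yield (x, y, valid) gaze points from a tracker laptop.

    Sketch assuming the Open Gaze API: the host IP is hypothetical,
    and coordinates arrive normalized to [0, 1] of the player's screen.
    """
    sock = socket.create_connection((host, port))
    # Ask the server to start streaming fixation data.
    sock.sendall(b'<SET ID="ENABLE_SEND_POG_FIX" STATE="1" />\r\n')
    sock.sendall(b'<SET ID="ENABLE_SEND_DATA" STATE="1" />\r\n')
    buf = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            return                      # server closed the connection
        buf += chunk
        while b"\r\n" in buf:
            line, buf = buf.split(b"\r\n", 1)
            rec = line.decode("ascii", errors="replace")
            if not rec.startswith("<REC"):
                continue
            fields = dict(re.findall(r'(\w+)="([^"]*)"', rec))
            if "FPOGX" in fields and "FPOGY" in fields:
                yield (float(fields["FPOGX"]),
                       float(fields["FPOGY"]),
                       fields.get("FPOGV") == "1")
```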

Before each match the players needed to be calibrated on the screen. We did this by connecting the laptop running the eye tracker software to the secondary input of the player’s screen. All we had to do was get the player into his normal playing position, switch the input of the screen, run the calibration, and switch the screen input back to the gaming computer. The whole process usually took under 45 seconds.

On the laptop in the control room I could see the players’ faces from the eye tracker’s point of view, to make sure they stayed in its field of view and that the data was valid. I would start the recording when a match started, stop it at the end, and generate the result images that were shown on the stream.

The cursor shown during the match was an XSplit extension that connected to the laptops through TCP/IP. It was activated whenever we wanted to show one of the players’ gaze. The spectator view needed to be switched to that player’s perspective beforehand, though, because the screen coordinates shown by the cursor would make no sense otherwise.
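
The switching logic behind that can be sketched like this; it is a hypothetical stand-in for the real extension, with the port number and JSON line format invented for illustration:

```python
import json
import socket
import threading

class GazeRelay:
    """Forward one player's gaze to the overlay only when toggled on.

    The overlay connects as a TCP client and receives one JSON line
    per gaze point while `enabled` is True; the rest of the time the
    samples are silently dropped.
    """

    def __init__(self, port=5555):
        self.enabled = False
        self.clients = []
        self.lock = threading.Lock()
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("", port))
        srv.listen(5)
        threading.Thread(target=self._accept, args=(srv,), daemon=True).start()

    def _accept(self, srv):
        while True:
            conn, _ = srv.accept()
            with self.lock:
                self.clients.append(conn)

    def push(self, nx, ny):
        """Called for every gaze sample; forwards it only when enabled."""
        if not self.enabled:
            return
        msg = (json.dumps({"x": nx, "y": ny}) + "\n").encode()
        with self.lock:
            for c in list(self.clients):
                try:
                    c.sendall(msg)
                except OSError:
                    self.clients.remove(c)
```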

What we learned:

This exercise was a new experience for us. Not only was it our first time at such an event, it was our first time behind the scenes. The live broadcast really changed the way we did things: since we did not want to be a nuisance to the players, if the calibration was not working right for one reason or another we would simply not use the data for that player.

We also learned a lot about gamers! None of us realized that some of them don’t play in a full-HD, 16:9 format, which made certain data points invalid. The whole experience let us work out the proper way to set up for our next live event. We also managed to analyze the data and identify statistics that are more telling than the ones we had.

What we plan:

One of the first things that will change in the future is the data we give to the viewers. Things like the evolution of LPM (looks per minute) over the course of a match, which gives viewers great insight into the state of the game, are definitely going to make it into our next event.
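
As a rough sketch of how such a curve could be produced, assuming we log the start time of every look at a region (the minimap, say), a sliding window turns those timestamps into an LPM-over-time series; the window and step sizes here are illustrative:

```python
def lpm_series(look_starts, match_length, window=60.0, step=5.0):
    """Looks-per-minute curve over a match.

    look_starts: start times (in seconds) of every look at a region,
    e.g. the minimap. Each output point pairs a timestamp with the
    number of looks begun in the preceding `window` seconds, scaled
    to a per-minute rate.
    """
    series = []
    t = window
    while t <= match_length:
        n = sum(1 for s in look_starts if t - window <= s < t)
        series.append((t, n * 60.0 / window))
        t += step
    return series
```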

Other technical matters, like pre-calibrating, mounting the trackers permanently on the screens, and adapting to the various resolutions, are all things we hope to address for future live events. This would make everybody’s life easier and let the focus stay where it belongs: on the game.
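
For the resolution problem specifically, one plausible fix is a mapping from the tracker’s normalized screen coordinates into game pixels that accounts for letter- or pillarboxing. This sketch assumes the game is centred on the monitor with black bars rather than stretched, which will not match every player’s settings; a stretched display would instead need only a plain rescale:

```python
def screen_to_game(nx, ny, screen_w, screen_h, game_w, game_h):
    """Map a normalized screen gaze point to game-pixel coordinates.

    Handles players who don't run full-screen 16:9: if the game's
    aspect ratio differs from the monitor's, the game image is assumed
    centred with black bars. Returns None when the player is looking
    at the bars.
    """
    screen_aspect = screen_w / screen_h
    game_aspect = game_w / game_h
    if game_aspect < screen_aspect:         # pillarboxed (bars left/right)
        shown_w = screen_h * game_aspect
        x0 = (screen_w - shown_w) / 2
        px = nx * screen_w - x0
        if not 0 <= px < shown_w:
            return None
        return px * game_w / shown_w, ny * screen_h * game_h / screen_h
    else:                                   # letterboxed (bars top/bottom)
        shown_h = screen_w / game_aspect
        y0 = (screen_h - shown_h) / 2
        py = ny * screen_h - y0
        if not 0 <= py < shown_h:
            return None
        return nx * screen_w * game_w / screen_w, py * game_h / shown_h
```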

Spower Rock & Mad Gazer