For starters, analytics are rarely used in pre-launch development, unless it’s the rare case where intelligence gathered from one title informs the next. They are more typically seen post-launch. When they are used in a live service with creative iteration taking place, they can be very helpful so long as the dev team follows best practices.
At the highest level we can describe this with stock phrases like “analytics are no good without a soul,” but it’s more constructive to look at what they are and are not good for. Here are some things that game analytics are good for, without tampering with the creative process:

How much is an asset or a mechanic being used?
Knowing what’s actually consumed helps you align resources with actual use. If 80% of your team is working on content that only 15% of players ever see, there had better be a good reason.

Which mechanics or assets are related to outcomes our business team cares about, like spending or churn?
If there’s an asset that seems connected to conversion, it’d be good to know it—not to make copies of it, but so that the creative team can look more closely at why that asset is working. It may well be that doing “more” of it would break something else.

What are players going to do next?
Predictive analytics can be used to help CRM or community teams better manage the player base.

Why are players doing what they are doing?
This is the one question analytics have the hardest time answering. The best practice here is to talk to players and ask them, or observe their play and interview them afterward. Models can offer insights into the answer, but they typically require more training than developers have, and few automated tools exist.

Where are players going wrong?
Psychologists have long known that “flow” is the sense of losing yourself in an activity, and that it is highly pleasurable. So we want players to have a sense of flow. Flow occurs when the level of challenge hits the Goldilocks zone: not too hard, not too easy.
Analytics Used Badly
Consider a case of analytics used badly. A platform game has a level—let’s say level 3—in which the player needs to take three actions: beat the mini-boss, jump over a ditch, and beat the end boss. The BI team (or the software package) runs a correlation between churn rate and game level, sees that level 3 is highly correlated with churn, and a manager tells the team to fix the problem.
What’s wrong with this case? First, correlation isn’t always causation. But let’s say there really is a causal problem. Is it with “level 3,” or is it with something more specific? A more granular approach would be to set up logging events within level 3 and see where players are dropping off. In this case, we might imagine that the analytics show that 85% of players beat the mini-boss on their first attempt, only 30% ever clear the ditch jump, and 90% of those who make it to the end boss beat it on the first try. What’s the problem? Probably something to do with the ditch, but the other two assets are suspiciously easy. We’ll get to them in a minute.
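A drop-off funnel like this is straightforward to compute once the events are instrumented. The sketch below assumes a hypothetical log format—one list of completed step names per player—and the function name is illustrative, not from any particular analytics toolkit:

```python
from collections import Counter

def funnel_completion_rates(events, steps):
    """For each step, return the share of players who completed it
    among those who reached it (conditional completion rate).

    events: list of per-player lists of completed step names
    steps:  the steps in the order the level presents them
    """
    attempted = Counter()
    completed = Counter()
    for player_events in events:
        seen = set(player_events)
        for step in steps:
            attempted[step] += 1
            if step in seen:
                completed[step] += 1
            else:
                break  # player dropped off here; later steps never reached
    return {s: completed[s] / attempted[s] for s in steps if attempted[s]}
```

Run against the level 3 steps ("miniboss", "ditch", "boss"), a low conditional rate at the ditch step pinpoints where players stall, rather than flagging the whole level.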
What’s The Solution?
What’s the solution? It’s not to say “remove the ditch.” It’s to go to the level designer, the QA team, or—in the best case—the players themselves and ask what’s going on with the ditch. Imagine that players enjoy the jump because their character makes a funny sound, but the ditch is slightly too wide. You then A/B test a narrower ditch and see whether the clear rate climbs to something more reasonable.
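One common way to read such an A/B test is a two-proportion z-test on the clear rates of the original versus the narrower ditch. This is a minimal sketch, with made-up counts for illustration:

```python
from math import sqrt

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-statistic comparing clear rates between
    variant A (original ditch) and variant B (narrower ditch)."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: 300/1000 clears on the wide ditch,
# 450/1000 on the narrow one; |z| > 1.96 is significant at p < .05.
z = two_proportion_z(300, 1000, 450, 1000)
```

The statistics here are routine; the point of the surrounding process is that the hypothesis being tested came from talking to people, not from the dashboard alone.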
Then the analytics show that level 3 still has lousy ARPU or retention. So maybe the ditch was a problem for being too hard, but maybe the bosses were too easy as well. Tweak, test, rinse, repeat.
At no point in this process did analytics tell the developer the right solution, but they uncovered the needle in the haystack, and then the creative team took over to make a contextually appropriate fix.
Time Is Helpful In Getting At Causality
I also suggest that including the element of time is helpful in getting at causality. I include this in our systems so there’s some insight into whether the two “related” things are actually driving one another. Here’s an example: If players hitting level 4 convert 40% of the time, and do so within 30 minutes on average, and those who hit level 6 convert 40% of the time within 15 minutes, that tells you that there’s something appealing about level 6, or perhaps unappealing about level 4.
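This rate-plus-time summary is easy to produce from conversion records. The sketch below assumes a hypothetical record format of (milestone reached, converted, minutes-to-convert); the names are illustrative:

```python
from statistics import median

def conversion_summary(records):
    """Summarize conversion rate and time-to-convert per milestone.

    records: list of (milestone, converted: bool, minutes or None)
    Returns {milestone: (conversion_rate, median_minutes_among_converters)}
    """
    by_milestone = {}
    for milestone, converted, minutes in records:
        by_milestone.setdefault(milestone, []).append((converted, minutes))
    summary = {}
    for milestone, rows in by_milestone.items():
        rate = sum(1 for c, _ in rows if c) / len(rows)
        times = [m for c, m in rows if c]
        summary[milestone] = (rate, median(times) if times else None)
    return summary
```

Two milestones with the same conversion rate but very different median times—as in the level 4 versus level 6 example—are exactly the kind of asymmetry this surfaces.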
You can replace “level X” in this example with any kind of action you’ve instrumented. For example, the event could be a quest or a map or even a social event like “invited to friend list.” Again, think about analytics as helping find the needle in the haystack. Sometimes it points right to it, and other times it removes 99% of the hay. Either way, a dev should still pick the needle up and decide what to do next so that a data-driven decision doesn’t happen mindlessly.