This project focuses on discovering the effect of dynamically changing a game's difficulty based on player performance, using gameplay analytics.

2D Endless Runner Case Study

Concept

Santa Run is a 2D platform-based endless runner that uses simplified dynamic difficulty adjustment (DDA) to help players of all skill levels have an enjoyable, balanced experience. The system analyses each player's performance every time they play, building increasingly accurate assumptions over time about whether the game's difficulty should increase or decrease to suit their needs. The study showed that DDA systems can improve lower-skilled players' performance, and that the majority of those players found the game more interesting as a result. However, it did not provide sufficient evidence on the effect for players who outperform the game and need an increased difficulty, suggesting further research is needed.

My Role: Solo Developer/Researcher

The biggest challenge on this project was designing how the difficulty system would calculate the player's performance, and how it would then use that result to change the game's difficulty.

 

During my literature review for this project, I found similar systems that had been used in other 2D games. This gave me a good understanding of the metrics I could track to measure performance. Using this information, I developed the system to judge player performance based on survival time and the number of coins collected each run. It then compares these results against target values entered by the designer to judge whether the player performed well or not.
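As a rough sketch of how that per-run judgement could work (the class, member names, and target values here are my illustrative assumptions, not the project's actual code), the evaluation compares a run's results against the designer-entered targets:

```csharp
// Illustrative sketch of the per-run evaluation described above.
// The designer enters target values; the evaluator compares a run against them.
public class PerformanceEvaluator
{
    public float TargetSurvivalTime; // seconds the designer expects a typical run to last
    public int TargetCoins;          // coins the designer expects a typical run to collect

    // Returns +1 if the player beat both targets (performed well),
    // -1 if they missed both (performed poorly), and 0 for a mixed result.
    public int Evaluate(float survivalTime, int coinsCollected)
    {
        bool beatTime  = survivalTime   >= TargetSurvivalTime;
        bool beatCoins = coinsCollected >= TargetCoins;
        if (beatTime && beatCoins) return 1;
        if (!beatTime && !beatCoins) return -1;
        return 0;
    }
}
```

The mixed-result case is a design choice: when a player beats only one target, the sketch treats their performance as roughly on-target rather than forcing a difficulty change.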

 

Based on this evaluation, the system then increases or decreases the acceleration of the player's character, which in turn makes the level easier or harder.
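A minimal sketch of that adjustment step might look like the following, assuming the evaluation is expressed as +1/0/-1; the step size and clamp bounds are placeholder assumptions, not the project's real tuning values:

```csharp
using System;

// Illustrative sketch: nudge the runner's acceleration up or down based on
// an evaluation result (+1 = outperforming, -1 = underperforming, 0 = on target).
public class DifficultyController
{
    public float Acceleration = 1.0f;        // the value driving difficulty
    const float Step = 0.1f;                 // how far one evaluation shifts difficulty
    const float MinAccel = 0.5f, MaxAccel = 2.0f; // keep the game within a playable range

    public void Apply(int evaluation)
    {
        // Clamping prevents repeated poor (or strong) runs from pushing the
        // game into an unplayably easy or hard state.
        Acceleration = Math.Clamp(Acceleration + evaluation * Step, MinAccel, MaxAccel);
    }
}
```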

To track these changes and see how they affected player performance, I created a system that recorded the player's average results each time the difficulty was evaluated. At the end of the experiment, this data could be converted into a line graph showing the increase or decrease in difficulty throughout their session. It was expected that lowering the difficulty would increase time survived and coins collected.
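A recorder of that kind could be sketched as below; the class name, the choice of CSV output, and the columns are my assumptions for illustration:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical analytics recorder: stores the averages captured at each
// difficulty evaluation so a session can later be exported and graphed.
public class SessionLog
{
    private readonly List<(float AvgTime, float AvgCoins, float Acceleration)> samples = new();

    // Called once per difficulty evaluation with the player's running averages.
    public void Record(float avgTime, float avgCoins, float acceleration)
        => samples.Add((avgTime, avgCoins, acceleration));

    // One CSV row per evaluation, ready to plot as a line graph over the session.
    public IEnumerable<string> ToCsvRows()
        => samples.Select((s, i) => $"{i},{s.AvgTime},{s.AvgCoins},{s.Acceleration}");
}
```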

Secondly, to track how players reacted to the changes in difficulty, I had them complete a questionnaire about their enjoyment and the perceived difficulty of the challenge. These results were then correlated with their performance in the case study.

Finally, for a more accurate assessment of the system's effects, participants were separated into groups based on their skill level. The first three attempts at the level acted as a diagnostic, with each player's average result used to place them in a beginner, intermediate, or advanced group. This allowed me to review how the system interacted with different types of player.
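The diagnostic grouping step above can be sketched like this; the survival-time thresholds are illustrative assumptions, not the study's actual cut-offs:

```csharp
using System.Linq;

// Sketch of the diagnostic: average the first three runs and bucket the
// player by skill. Threshold values here are placeholders for illustration.
public static class SkillGrouping
{
    public static string Classify(float[] firstThreeSurvivalTimes)
    {
        float avg = firstThreeSurvivalTimes.Take(3).Average();
        if (avg < 30f) return "Beginner";
        if (avg < 60f) return "Intermediate";
        return "Advanced";
    }
}
```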

Click "See the Case Study" to view the full report and results.

Click "See the GDD" to see a full breakdown of the DDA system and how it works, as well as other features in the project.

Key Skills

  • Gameplay Analytics

  • Procedural Generation

  • Internal and External Playtesting 

  • Participant and Tester Gathering

  • UX Testing

  • C# and Unity

  • Games Research

Features

  • Dynamic Difficulty System

  • Procedural Platform Generation

  • Generated Game Analytics for study review and evidence