At King, our games are one of the company's major assets, since they are one of its main sources of revenue. Storyline, genre, and gameplay make each game distinctive and unique, but one characteristic they all share is high quality. Creating a King game takes a lot of time spent investigating, tweaking, and polishing before the final result – and one of the keys to delivering a good product is testing.
How do we test a game?
Testing a game happens at several levels, from simple, focused unit tests to acceptance tests on the final product. We run all of those at King, with the peculiarity that they must be performed on each one of the platforms where the game is distributed: Android, iOS, and Windows. But hey, aren’t all devices the same? Of course not… market fragmentation gives us a very wide range of devices! We must also cover a wide range of OS versions, since each one offers different features to its users… quite complex, isn’t it?
We also have to keep in mind that King games are not isolated. Most of them can run on their own, but their strength comes when you are connected, when you are online. Several services become available then: promotions, gifts, game missions, friends, leaderboards, etc. We have to test the game with those components in mind too. What can we do?
Docker on a single game
With Docker we can package all the external services into different containers and deploy them on a machine (or multiple machines), so we can test the game, out of the box, together with its connections. Do we need a server service (an application server)? Yes? Then pack it in a container. A database? Yes? Put another container there.
What we usually do is create containers that are configurable at runtime. Since a backend can be connected to several services, we want to control those connections. Sometimes we want to control the interactions with known, pre-loaded data, so we use a container; other times we just point to a shared staging environment for quick gameplay tests, or to a load-balanced container environment to check failover. All of that is controlled at runtime with parameters, so the deployment phase stays quick.
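As a sketch of that wiring, a single game's environment might look like the following hypothetical docker-compose file (service names, image names, and environment variables are all illustrative, not our real ones). The backend's downstream connections are driven by parameters, so the same containers can point at a local mock, a shared staging environment, or a balanced setup:

```yaml
# Hypothetical docker-compose file: a game backend plus its database,
# with runtime-switchable connections to downstream services.
version: "2"
services:
  game-backend:
    image: registry.internal/game-backend:latest   # illustrative image name
    ports:
      - "8080:8080"
    environment:
      # Switched at deploy time: a local container with controlled data,
      # a shared staging host, or a load-balanced environment.
      - DB_HOST=game-db
      - PROMOTIONS_URL=${PROMOTIONS_URL:-http://promotions-staging.internal}
    depends_on:
      - game-db
  game-db:
    image: registry.internal/game-db:latest        # pre-loaded test data
```

Overriding `PROMOTIONS_URL` at deploy time is what lets the same definition serve a fully local run, a staging gameplay test, or a failover check.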
Docker allows us to quickly test and control the different scenarios that a normal user might encounter, and all of them can be reproduced in a controlled environment just by switching services on and off.
Testing shared components
As stated before, making games is one of King’s biggest strengths, but another is creating services that can be shared throughout the King ecosystem. A game might introduce a new feature that is interesting to other games (gifts, promotions, sales, etc.); it does not make sense to reimplement it in every game, so we share that feature instead.
How do we test shared services?
In the process of developing and testing a shared component, we face the problem of interoperability. Will my new feature work in all games? Will games that don’t implement it yet have problems?
Shared services need to be able to test their interoperability with all the games connected to them. They also need to make sure that new games can integrate easily with the current API, and that new versions stay backwards compatible (up to an agreed point). How do we deal with that at King?
Docker for shared components
At King we also use Docker to prepare the environments of the shared components. As we saw with the games, we can deliver a container that can plug local or shared components in and out. With a shared component we might want to deploy our solution, connect two or three games to it (using the containerized Docker backends), and then run our automated test suite.
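To make that concrete, a shared-component test environment can be sketched the same way (again, every name here is hypothetical): the shared service under test in one container, and the containerized backends of two games plugged into it, at different stages of integration:

```yaml
# Hypothetical environment for testing a shared gifts service against
# two game backends, one integrated and one not yet.
version: "2"
services:
  gifts-service:
    image: registry.internal/gifts-service:2.0      # version under test
  game-a-backend:
    image: registry.internal/game-a-backend:latest
    environment:
      - GIFTS_URL=http://gifts-service:8080         # fully integrated
  game-b-backend:
    image: registry.internal/game-b-backend:latest
    environment:
      - GIFTS_ENABLED=false                         # feature switched off
```

Flipping `GIFTS_ENABLED` or swapping the `gifts-service` tag is how we exercise the "switch features on and off" scenarios described below.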
The pros of this solution are that we can test the interoperability of games that are at different stages of integration. We can switch features on and off and test how that affects the other connected games. Once we generalize the test suite, we can also share it among the games.
Package and share
As we have seen, Docker containers are most useful when used together, but each one is created by the game team or shared-technology team that owns it. This maximises the knowledge each team has of its own product, so they can create the best possible container – and the teams then share those containers with each other.
Databases, queues, caches, etc… – all those systems can be shared among different teams, and at King we distribute them through our internal Docker registry. Any team can get the latest version of a database container just by doing a docker pull. We also use base image containers to provide teams with shared scripts, folder structures, and deployment procedures, so teams can extend them and add their own customisations.
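A team's own Dockerfile can then start from such a base image. This is a hypothetical sketch (image names, paths, and the service jar are invented) of how a team extends the shared base and layers its customisations on top:

```dockerfile
# Hypothetical Dockerfile: extend the shared base image, which already
# provides the common folder structure, scripts, and deployment hooks.
FROM registry.internal/base-service:latest

# Team-specific customisations on top of the shared conventions.
COPY leaderboard-service.jar /opt/service/
COPY config/ /opt/service/config/

# The base image defines the standard entrypoint; the team only
# points it at its own artifact.
ENV SERVICE_JAR=/opt/service/leaderboard-service.jar
```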
All those things are cool, but automated things are cooler. Whenever a team creates a new application or tool, they can create the packaging script and push it to Git; our internal Continuous Integration system detects the change, prepares the package, tests it, creates the container, and distributes it through the registry. From then on, every team that needs that application can interact with it.
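The steps above can be sketched as a CI pipeline definition. This is hypothetical (job names, image names, and the GitLab-CI-style syntax are assumptions, not our actual setup), but it shows the shape: package, test, then build and distribute the container:

```yaml
# Hypothetical CI pipeline, triggered on every Git push.
stages:
  - package
  - test
  - distribute

package:
  stage: package
  script: ./gradlew assemble        # prepare the package

test:
  stage: test
  script: ./gradlew check           # run the automated tests

distribute:
  stage: distribute
  script:
    - docker build -t registry.internal/my-tool:latest .
    - docker push registry.internal/my-tool:latest
```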
All these tools have an impact on our development procedures and their performance. With Docker containers we have given everyone at King the opportunity to test all the features we want in a ‘real’ environment at the developer’s desk. With a simple Gradle or Maven task, they can execute an automated suite in an environment that mimics production, without the problems or overhead of deploying to staging.
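As a sketch of what such a task can look like (assuming Gradle, with hypothetical task names and file paths – this is not our actual build script), the suite can depend on the containerized environment being up and always tear it down afterwards:

```groovy
// Hypothetical Gradle tasks: bring up the containerized environment,
// run the integration suite against it, then tear it down.
task startEnv(type: Exec) {
    commandLine 'docker-compose', '-f', 'env/docker-compose.yml', 'up', '-d'
}

task stopEnv(type: Exec) {
    commandLine 'docker-compose', '-f', 'env/docker-compose.yml', 'down'
}

task integrationTest(type: Test) {
    dependsOn startEnv          // environment must be running first
    finalizedBy stopEnv         // always clean up, even on test failure
    systemProperty 'backend.url', 'http://localhost:8080'
}
```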
We have also gained the ability to isolate an environment to fully control its conditions, or to easily replicate the scenario of a bug for reproduction, fixing, and verification. Integrating features is also easier, since teams can request a specific version of a product, integrate against it, and check it against the latest version whenever they want.
What do you think? Might this work for you?