Fire up your Unreal Engine-based game on all Graviton cores

This post is written by Yahav Biran, Principal Solutions Architect, and Matt Trescot, Games, SA Leader – Americas

Historically, the art of creating and running complex game servers locked developers into a single CPU architecture, typically Intel/AMD. Our developers tell us it’s hard to introduce a different CPU architecture once game servers are built for a given processor. In this article we’ll show you how to build an Unreal Engine game with full support for the AWS Graviton processor. Plus, we’ll show you how to meet your performance requirements at a 42% lower cost than comparable current-generation x86-based instances. Let’s dive in.

Dedicated game servers need to support multiple players at a predictable tick rate per CPU core, and player events such as 3D calculations must be batched back to the connected players at that same tick rate for a fun and fair game experience. This requires CPU cores to be fully allocated to the game session, avoiding the context switching caused by simultaneous multi-threading. As a result, game server operators have asked us how to deactivate x86 Hyper-Threading Technology (HT) on x86-based servers. Additionally, during critical game scenarios, the server can use the CPU cache instead of main memory when relaying game state to all connected players. Unlike x86 processors with Hyper-Threading, AWS Graviton instances map every vCPU to a physical core and offer a large processor cache dedicated to players’ real-time game events.
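
If you want to verify this mapping on your own instances, lscpu shows how many hardware threads share each physical core; this is a quick sanity check rather than part of the build:

# Quick check of SMT vs. physical cores; run on each instance type.
# A Graviton instance (c7g.large) reports "Thread(s) per core: 1" -- every vCPU is a physical core.
# An x86 instance with simultaneous multi-threading enabled (such as c6a.large) reports "Thread(s) per core: 2".
lscpu | grep 'Thread(s) per core'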

With an Unreal Engine-based game, the Lyra Starter Game, the Graviton instance delivered 42% cost savings for the same performance: 22% more throughput at a 20% lower price than the x86 cores. We deployed two game sessions on the latest generation of Amazon EC2 compute-optimized instances: one on the Graviton-based c7g.large and the second on the x86-based c6a.large.

We set out to load both servers to their CPU capacity, so we used 30 shooter bots (B_ShooterBotSpawner); we found that the shooter bots, requiring 0.3 CPU cores, simulate a high-concurrency game session that consumes up to 95% of the machine’s CPU.

Figure 1 – Bots spawning during load test on both CPU types

The game servers started at 12:30, and we added 30 bots plus 5 players to each session until 12:40, when both servers (upper graph – lyra-X86 and lyra-Graviton) were at their CPU capacity. We let the game run for 10 game sessions of 11 minutes each and observed game server outbound traffic (middle graph – pod_network_tx_bytes) and memory allocation (bottom graph).

Figure 2 – Resource usage during the simulation

The simulation demonstrated that connected clients had a close-to-real gameplay experience, based on the steady outbound traffic rates and memory consumption on both servers. CPU usage was 97% on the x86 instance (lyra-X86) and 75% on the Graviton instance (lyra-Graviton).

The remainder of this post explains how you can take advantage of Graviton’s cost performance advantages. We will start with the game image build process that supports both CPU architectures. Next, we will describe how to deploy the game, play it, and observe the results.

How we built a multi-platform game image

The following code and configuration excerpts have been edited to better fit the blog format. The full sample code is published in the aws-samples GitHub repo. Below, we describe the changes we made to the game’s code and configuration in the continuous integration phase.

Figure 3 – Game continuous integration

We use AWS CodePipeline to automate the game build process. The pipeline builds a Docker image from the game code and assets in AWS CodeBuild and pushes it to Amazon Elastic Container Registry (Amazon ECR), from which it deploys as a container that players connect to and play.
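
For reference, a minimal CodeBuild buildspec for one of the two per-architecture image builds could look like the sketch below. The $(uname -m) tag suffix and environment variables are assumptions for illustration; the buildspec files in the sample repo are the source of truth.

version: 0.2
phases:
  pre_build:
    commands:
      # Authenticate the Docker client against the ECR registry
      - aws ecr get-login-password --region $AWS_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT.dkr.ecr.$AWS_REGION.amazonaws.com
  build:
    commands:
      # Build the game server image and tag it with the builder's architecture (aarch64 or x86_64)
      - docker build -t $AWS_ACCOUNT.dkr.ecr.$AWS_REGION.amazonaws.com/lyra:lyra_starter_game-$(uname -m) .
  post_build:
    commands:
      - docker push $AWS_ACCOUNT.dkr.ecr.$AWS_REGION.amazonaws.com/lyra:lyra_starter_game-$(uname -m)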

The build process needs to create two different images because AWS Graviton instances use the Arm architecture (RISC, Reduced Instruction Set Computer), while Intel and AMD processors use x86 (CISC, Complex Instruction Set Computer). We modified the existing continuous integration to use Docker’s multi-platform images, so a single game image reference covers both Graviton and x86 processors and simplifies configuration in the continuous delivery system.

The first step (1/StartBuild) compiles and packages the game binaries from the Lyra Starter Game, producing game server binary sets for Graviton and x86 using the docker build command.

Avoid using platform-specific package names like amd64 and aarch64 to simplify the build scripts. For example, instead of:

# ARCH is set per build: aarch64 for Graviton, x86_64 for Intel/AMD
ARG ARCH=aarch64
RUN curl "https://awscli.amazonaws.com/awscli-exe-linux-${ARCH}.zip" -o awscliv2.zip
RUN unzip awscliv2.zip
RUN ./aws/install

Use:

RUN pip install awscli

Next, we build the image (2/Pull code & build) by reusing the docker build step for x86 and running it on a Graviton-based instance (see the BuildARMAssets and BuildAMDAssets CodeBuild projects). We then create a Docker manifest that includes both images (see the AssembleAssetsBuilds CodeBuild project), so the Docker client (the K8s node that hosts the container app) pulls the platform-specific image at deploy time instead of us maintaining two Docker image configurations.
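
The manifest assembly step boils down to a few Docker CLI calls. Here is a sketch, assuming the per-architecture images were pushed with aarch64 and x86_64 tag suffixes as in the buildspec sketch earlier (the actual tags in the sample repo may differ):

REPO=$AWS_ACCOUNT.dkr.ecr.$AWS_REGION.amazonaws.com/lyra
# Create a manifest list that references both architecture-specific images
docker manifest create $REPO:lyra_starter_game \
  $REPO:lyra_starter_game-aarch64 \
  $REPO:lyra_starter_game-x86_64
# Push the manifest list; a client pulling lyra:lyra_starter_game now gets the image that matches its platform
docker manifest push $REPO:lyra_starter_game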

Note that our example uses CodeBuild images for Graviton and x86 to build the same code and configuration. The extra CodeBuild step (BuildARMAssets) runs at the same time as the original step (BuildAMDAssets), keeping the total build time at 12 minutes for a fresh build and 5 minutes when using the ECR cache for continuous incremental builds.
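
In CodePipeline, actions within a stage run in parallel when they share the same runOrder. A CloudFormation-style sketch of the build stage is shown below; the action names follow the text above, while the project references are illustrative assumptions.

# Sketch of the pipeline's build stage; both actions share RunOrder: 1, so they run in parallel
- Name: Build
  Actions:
    - Name: BuildARMAssets
      RunOrder: 1
      ActionTypeId:
        Category: Build
        Owner: AWS
        Provider: CodeBuild
        Version: "1"
      Configuration:
        ProjectName: !Ref BuildARMAssetsProject   # assumed resource name
    - Name: BuildAMDAssets
      RunOrder: 1
      ActionTypeId:
        Category: Build
        Owner: AWS
        Provider: CodeBuild
        Version: "1"
      Configuration:
        ProjectName: !Ref BuildAMDAssetsProject   # assumed resource name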

Figure 4 – CodePipeline that pulls code and builds two images with a single image URI in ECR

How we deployed the game

We deployed the Graviton and x86 variants of the game on Amazon Elastic Kubernetes Service (Amazon EKS) as a Kubernetes (K8s) deployment that controls the number of game servers (K8s pods). We controlled game server connectivity with a NodePort K8s service per pod to allow public access for the game clients. We used Karpenter to provision the Amazon EC2 instances and Amazon CloudWatch Container Insights to monitor CPU, network, and memory consumption.
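
A sketch of a Karpenter Provisioner that can launch both instance families is shown below. This uses the v1alpha5 API; newer Karpenter releases use NodePool/EC2NodeClass instead, and the names and exact requirements in the sample repo may differ.

apiVersion: karpenter.sh/v1alpha5
kind: Provisioner
metadata:
  name: lyra-game-servers             # illustrative name
spec:
  requirements:
    # Allow either architecture; the multi-platform image works on both
    - key: kubernetes.io/arch
      operator: In
      values: ["arm64", "amd64"]
    # Restrict to the two compute-optimized instance types compared in this post
    - key: node.kubernetes.io/instance-type
      operator: In
      values: ["c7g.large", "c6a.large"]
  providerRef:
    name: default                      # assumes an AWSNodeTemplate named "default" exists
  limits:
    resources:
      cpu: "100"                       # cap total provisioned vCPUs for the experiment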


We used a multi-platform Docker image to ensure both K8s deployment variants run the same code and configuration. We used the pod lifecycle postStart hook to create a pod-specific NodePort service that players use to connect to the game, and the preStop hook to clean up that service when the pod terminates.

image: $AWS_ACCOUNT.dkr.ecr.$AWS_REGION.amazonaws.com/lyra:lyra_starter_game
imagePullPolicy: Always
command: ["/usr/local/lyra_starter_game/LyraServer.sh"]
lifecycle:
  postStart:
    exec:
      command: ["/usr/local/lyra_starter_game/create_node_port_svc.sh"]
  preStop:
    exec:
      # Delete the pod-specific NodePort service; a shell is needed for the command substitution
      command: ["/bin/sh", "-c", "kubectl delete svc $(kubectl get svc | grep $POD_NAME | awk '{print $1}')"]

Try it yourself by following the tutorial, and deploy Amazon EKS Container Insights using the quick start instructions.

Play and observe the game server performance

The last step is to play the game: discover the game server endpoints and connect the game clients to the Lyra Starter Game server.

In the example below, our game server endpoints are:

34.216.42.162:32384, 35.82.31.15:32019

[$]kubectl get svc | grep NodePort| awk '{print $1,$5}'

lyramd64-6bf8cdd4db-ts7hq-svc-34-216-42-162 7777:32384/UDP

lyrarm64-5568b7bc6c-wtrqx-svc-35-82-31-15 7777:32019/UDP

If you haven’t done so in the build phase, compile and package the client binaries for your favorite OS and connect to the game servers. On Windows:

./Binaries/Win64/<PROJECT_NAME>Client.exe 34.216.42.162:32384 -WINDOWED -ResX=800 -ResY=450

./Binaries/Win64/<PROJECT_NAME>Client.exe 35.82.31.15:32019 -WINDOWED -ResX=800 -ResY=450

The -WINDOWED, -ResX=<HORIZONTAL_RESOLUTION>, and -ResY=<VERTICAL_RESOLUTION> command-line options are set here for convenience. They enable you to see both client windows on the same screen for testing purposes.

At this point you will have two game sessions open, so get ready to shoot 30 bots 😀

Conclusions

Graviton instances deliver the scalability, performance, and cost effectiveness needed to provide an excellent gaming experience for a large number of players. Epic Games and AWS have collaborated to build support for AWS Graviton instances into Unreal Engine. We have shown you a fun game powered by Unreal Engine that uses 22% less CPU on c7g.large and is 42% more cost-efficient than on the c6a.large EC2 instance. We’ve also shown you a method to benchmark your game using Amazon EKS, AWS CodePipeline, and Amazon CloudWatch. We invite you to try this with your game to benefit from high performance and optimized cost.
