For Adventure Ocean we had to develop and deploy not only the games but also a scalable architecture and network infrastructure spanning many technologies and stacks. Here’s how we built it.
At a high level the concept is straightforward: we needed to create a game server that would record player progress and scores. We broke this down into two pieces: the Arcades & the Server. The Arcades are responsible for gameplay, showing user progress & interfacing with the server. The Server is responsible for tracking user progress, managing Arcade configurations, and running health checks.
The idea was to create a number of independent services and software programs, each handling its own role and responsibilities; this ensured that any point of failure could be easily identified and fixed. These pieces shared a common communication channel, so we could work on each one independently, and because every piece had a defined spec and interface to implement, they all simply worked together when it came time for integration.
The most important piece of the application was the Arcade: this is what users interface with, and it needs to always be available. Our approach was to break it down into smaller pieces of software connected by a common communication channel, which gave us a sandbox-like experience when developing games as well as when developing the UI and the software that interfaces with any hardware peripherals. We designed the games to act like video game cartridges: you could plug them into any Arcade and they would just work. All an Arcade needed was a game package, and as long as that package implemented the correct messaging interface it was plug and play.
Electron acted as the primary orchestration layer. On launch it would do a system check of any connected hardware, such as the RFID / magstripe readers, and ensure they were functional. Once the hardware was verified, it would create a WebSocket server for internal communications between Unity and itself, then launch the Unity runtime. The Unity process had a health check applied to it to ensure constant uptime; if a crash occurred it would auto-restart and report any issues back to the server for further investigation.
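The crash-and-restart loop can be sketched roughly as follows. This is a minimal illustration, not the actual Electron code: `superviseUnity`, the launch callback, and the crash reporter are hypothetical names, and the real supervisor would also handle things like backoff and log collection.

```typescript
// Hypothetical sketch of the Unity health-check / auto-restart loop.
// launch() starts the Unity runtime and resolves with its exit code;
// reportCrash() sends the failure back to the server for investigation.
type LaunchFn = () => Promise<number>;

async function superviseUnity(
  launch: LaunchFn,
  reportCrash: (exitCode: number) => void,
  maxRestarts = 5
): Promise<void> {
  for (let attempt = 0; attempt <= maxRestarts; attempt++) {
    const exitCode = await launch();
    if (exitCode === 0) return; // clean shutdown: stop supervising
    reportCrash(exitCode);      // crash: report it, then loop to restart
  }
}
```

In the real app the launch callback would wrap something like `child_process.spawn` on the game executable; the loop above just captures the restart-and-report behavior.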
We used React for building the UI / presentation layer of the application. This allowed us to lean on the power of web tech to iterate faster and avoid building complex UI logic in Unity, which goes back to our guiding principle of separation of concerns for each piece of software.
We wanted to make the games completely free of external dependencies. This goes back to the overall idea of making them like game cartridges, where they did absolutely nothing but be games. They implemented a common communication interface over WebSockets so that the Electron app could pass data from the server to the games, and the games could pass data back to Electron for processing.
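A shared message envelope is one way such an interface could look. The shape below is an assumption for illustration (the field names and message types are hypothetical, not the production protocol); the point is that both Electron and every game validate frames against the same contract before acting on them.

```typescript
// Hypothetical message envelope shared by Electron and the game runtimes.
interface ArcadeMessage {
  type: "PLAYER_SCAN" | "SCORE_UPDATE" | "GAME_COMPLETE"; // illustrative types
  payload: Record<string, unknown>;
  timestamp: number;
}

// Both sides validate incoming WebSocket frames against the envelope,
// so a malformed message can't crash either side. A real implementation
// would also check `type` against the known set of message types.
function parseMessage(raw: string): ArcadeMessage | null {
  try {
    const msg = JSON.parse(raw);
    if (msg === null || typeof msg !== "object") return null;
    if (typeof msg.type !== "string" || typeof msg.timestamp !== "number") return null;
    return msg as ArcadeMessage;
  } catch {
    return null; // not valid JSON
  }
}
```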
The server’s primary responsibility was to track user progress and store configurations for each Arcade. The server implemented a REST API for all data transfers, as well as a WebSocket server for real-time communications with the Arcades to track status, ensure availability, and report any issues that came up during the day.
The server was built using Express as the web server, OpenRecord as the ORM for interfacing with PostgreSQL, and Socket.IO for WebSockets. It was packaged and deployed as a Docker image, which allowed for easy configuration on different ships; a central image meant we could push updates and have ships download them automatically as needed.
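As a rough illustration of that packaging, a minimal Dockerfile for a Node/Express server might look like the following. The base image, port, and entry point are assumptions for the sketch, not the production configuration:

```dockerfile
# Hypothetical Dockerfile for the game server (paths and versions assumed).
FROM node:12-alpine
WORKDIR /app

# Install production dependencies from the lockfile for reproducible ship deploys.
COPY package*.json ./
RUN npm ci --only=production

# Copy the server source into the image.
COPY . .

# The Express + Socket.IO server listens on one port for REST and WebSockets.
EXPOSE 3000
CMD ["node", "server.js"]
```

A single image like this is what makes the “push once, every ship pulls the update” flow work: each ship just pulls the new tag and restarts the container.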
All of this communication was encrypted and required access tokens, which were distributed to the Arcades when they were added to the system. On boot, the server would wait for connections from Arcades on the network; when one tried to connect, the server would verify whether a token was present and valid, and if not, wait for an admin to name the Arcade, assign it a game, and fill in other configuration details. When the Arcade was added it received its API key – this key could be revoked at any time and renewed if needed.
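The register / verify / revoke lifecycle can be sketched like this. `TokenStore` and its method names are hypothetical, and the `Math.random` token is a stand-in for a real cryptographic token generator; the sketch only shows the lifecycle described above.

```typescript
// Hypothetical sketch of the Arcade API-key lifecycle.
interface ArcadeRecord {
  name: string;
  game: string;
  revoked: boolean;
}

class TokenStore {
  private arcades = new Map<string, ArcadeRecord>();

  // Called when an admin names a new Arcade and assigns it a game;
  // returns the API key handed to that Arcade.
  register(name: string, game: string): string {
    // Stand-in for a real cryptographic token generator.
    const token = "key-" + Math.random().toString(36).slice(2);
    this.arcades.set(token, { name, game, revoked: false });
    return token;
  }

  // A connecting Arcade presents its token; unknown or revoked keys are rejected.
  verify(token: string): boolean {
    const rec = this.arcades.get(token);
    return rec !== undefined && !rec.revoked;
  }

  // Keys can be revoked at any time; the Arcade must then be re-registered.
  revoke(token: string): void {
    const rec = this.arcades.get(token);
    if (rec) rec.revoked = true;
  }
}
```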
As I mentioned before, the Games were designed to be plug and play with any Arcade. This meant that Admins could create any number of configurations for screens and games, and depending on the flow of users or changes to the physical space, they could put different games on different Arcades as they saw fit.
A Game package was a zip containing the main Unity executable and a configuration JSON file with paths to all of the assets that game would need (such as its artwork), plus a list of any specific hardware the game required, like a sound bar or Kinect.
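A package configuration along those lines might look like the sample below. The field names and values here are illustrative assumptions, not the real schema:

```json
{
  "name": "example-game",
  "version": "1.2.0",
  "executable": "Game.exe",
  "assets": {
    "artwork": "assets/artwork/",
    "audio": "assets/audio/"
  },
  "requiredHardware": ["kinect", "soundbar"]
}
```

Keeping the executable, asset paths, and hardware requirements in one manifest is what lets an Arcade treat every game as an interchangeable cartridge.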
When an Arcade was assigned a Game, it would download the zip from the server, unzip it, and place it in the Arcade’s local storage. Each time the Arcade connected to the server it would verify the version number; if the version had changed, it would automatically download and install the update. It would also check whether its assigned game had changed and, if so, automatically download and install the new one.
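The decision made on each connection boils down to a small check like this. The function and field names are hypothetical, and the version comparison is simplified to inequality since any version change triggers a reinstall:

```typescript
// Hypothetical sketch of the update check an Arcade runs on each
// connection to the server.
interface AssignedGame {
  gameId: string;
  version: string;
}

// Returns true when the Arcade should download and reinstall a package.
function needsUpdate(installed: AssignedGame | null, assigned: AssignedGame): boolean {
  if (installed === null) return true;                   // nothing installed yet
  if (installed.gameId !== assigned.gameId) return true; // admin swapped the game
  return installed.version !== assigned.version;         // new build of the same game
}
```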
On the hardware side we were running Origin Chronos PCs with RTX 2080 Ti cards – so the GPU power was awesome, and we really pushed the graphics and processing power of the machines for the gameplay.
We interfaced with RFID and magstripe card readers to scan users in for gameplay. These were just USB peripherals that Electron talked to directly, parsing their data and passing it back to the server for verification and data retrieval.
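Many USB card readers present themselves as HID keyboard devices that “type” the card data followed by Enter; assuming that style of reader here (the original doesn’t specify), the Electron side only needs to accumulate keystrokes into a scan event. The class name is hypothetical:

```typescript
// Hypothetical keystroke accumulator for a keyboard-wedge style
// RFID / magstripe reader: characters arrive one at a time, and
// Enter marks the end of a scan.
class ScanBuffer {
  private chars: string[] = [];

  // Feed each keystroke; returns the completed scan on Enter, else null.
  push(key: string): string | null {
    if (key === "Enter") {
      const scan = this.chars.join("");
      this.chars = [];
      return scan.length > 0 ? scan : null; // ignore a stray Enter
    }
    this.chars.push(key);
    return null;
  }
}
```

The completed scan string would then be sent to the server for verification and to pull the player’s profile.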
From a networking perspective we were given a VLAN for the system, along with a NAT for access when shoreside in case we needed VPN access to debug a machine or make larger updates to the system.
Working with Royal IT, we standardized the network infrastructure so that when deploying on new ships we could carry over much of the same setup (IPs, NATs, naming conventions).
Installing the Arcade system was an interesting task, as it had to be done after the ship had been upgraded in dry dock and was crossing the Atlantic Ocean. There are only a few dry docks in the world large enough to support cruise ships; one of these is in Cádiz, Spain. We would meet the ships there, hold orientation with the IT and fabrication teams, then prepare for our crossing.
Crossings took 1.5–2 weeks to complete. During this time we would work to ensure that the electrical requirements were met for each of the Arcades, that the wiring and network infrastructure were in place, and that each physical location made sense given the type of game intended for that particular Arcade.
After the initial install we need to do testing, which requires the machines to run 24/7 for five days uninterrupted. While this is going on we run tests for network failures, such as the server going down, recovery, and reconnection. We also test that the cooling systems are working properly for each machine. Since the Arcades are mounted in enclosures cut into the bulkhead, there needs to be specific space for airflow; if there are issues with this airflow, the machines will overheat and develop long-term problems.
So that’s how Adventure Ocean came together on the technical side. It was an incredibly fun project and technical challenge, and the final product, with everything working together, runs like a charm and scales as needed!