Q1 under wraps

Three months of full-time game dev complete.  At this point, some indie developers might have a game to show.  Not I!

Month 1: Draw something to the screen.  Slip & stumble over Rust

Month 2: Draw a bit more.  Light things up.  Simple asset pipeline

Month 3: Multithreading, networking, databases and architecture.  Rust is 50% less scary

Most of the technical pieces I needed to confront are understood.  Nothing compels me away from continuing to pursue this game.  Rust is not as mature as I'd like, at least where community libraries are concerned.  Documentation is sub-par and my overall development speed is still about 20% of what it is in C#/TypeScript.  Realistically speaking, the best I could hope for was to be at a place where things are moving forward.  I'm happy with where things stand.

Architecture is becoming more important to figure out.  In Rust, it's common to have multiple "crates" - aka projects - in separate folders within a single GitHub repo.  Cargo calls this a workspace.  It's not exactly a clean micro-service approach, but it's very convenient when writing a large project that relies on multiple local libraries.  A single repo for this game is probably fine.
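
As a sketch, the workspace manifest for this kind of layout might look like the following.  The member crate names are hypothetical, mapped onto the node types described below.

```toml
# Root Cargo.toml of the workspace -- crate names are illustrative only.
[workspace]
resolver = "2"
members = [
    "client",        # rendering, input, realm selection
    "orchestrator",  # manages the "server" experience
    "game",          # simulation / business logic
    "login",         # initial authentication endpoint
    "proxy",         # gateway between clients and game nodes
    "common",        # shared types used by every node
]
```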

I'm taking a multiplayer-first approach to architecture.  The client "node" will draw things to the screen, accept user input and present existing worlds/realms for the player to log into.  For a local single-player session, the player will launch an "orchestrator" node which manages the "server" experience.
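
To keep those roles straight, here is a minimal Rust sketch naming the node types described in this and the following paragraphs; the enum is purely illustrative.

```rust
/// The node roles in this architecture -- names are illustrative, not the
/// project's actual identifiers.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum NodeKind {
    Client,       // draws to the screen, accepts input, lists realms
    Orchestrator, // manages the "server" experience
    Game,         // runs the simulation's business logic
    Login,        // initial endpoint clients authenticate against
    Proxy,        // gateway between clients and game nodes
}
```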

That orchestrator will launch or connect to a local SQL instance and begin generating the world.
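
A hedged sketch of that bootstrap step, assuming SQLite via the rusqlite crate; the post doesn't say which database engine or schema is actually used.

```rust
// Hypothetical world-generation bootstrap.  The real database engine and
// schema aren't specified; SQLite via `rusqlite` is assumed here.
use rusqlite::{params, Connection, Result};

fn bootstrap_world(db_path: &str, seed: u64) -> Result<Connection> {
    let conn = Connection::open(db_path)?; // creates the file if it's missing
    conn.execute_batch(
        "CREATE TABLE IF NOT EXISTS regions (
             id    INTEGER PRIMARY KEY,
             seed  INTEGER NOT NULL,
             state BLOB
         );",
    )?;
    // Seed a first region; real world generation would be far more involved.
    conn.execute("INSERT INTO regions (seed) VALUES (?1)", params![seed as i64])?;
    Ok(conn)
}
```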

Once that's complete, the orchestrator will ask a "control plane" to launch one or more "game" nodes.  The control plane can differentiate between a local computer and a cloud network.
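
A minimal sketch of what that control-plane abstraction could look like in Rust; the trait and type names are assumptions, not the project's actual API.

```rust
use std::net::{SocketAddr, TcpListener};

/// Where a freshly launched game node can be reached.
struct NodeHandle {
    addr: SocketAddr,
}

/// The orchestrator asks for capacity without caring where it lives.
trait ControlPlane {
    fn spawn_game_node(&self) -> std::io::Result<NodeHandle>;
}

/// On a local machine, a game node is just another thread in the process.
struct LocalControlPlane;

impl ControlPlane for LocalControlPlane {
    fn spawn_game_node(&self) -> std::io::Result<NodeHandle> {
        let listener = TcpListener::bind("127.0.0.1:0")?; // OS picks a free port
        let addr = listener.local_addr()?;
        std::thread::spawn(move || run_game_node(listener));
        Ok(NodeHandle { addr })
    }
}

/// In the cloud, the same request would provision a VM or container instead.
struct CloudControlPlane; // a provider API client would live in here

fn run_game_node(listener: TcpListener) {
    // The simulation loop and event subscription would go here.
    let _ = listener;
}
```

The point of the trait is that the orchestrator's code path stays identical whether the node ends up as a local thread or a rented server.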

The game nodes will subscribe to events from the orchestrator and start synchronizing data from the database.  The game nodes run the business logic of the simulation.
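
As a rough sketch of that startup handshake, assuming a plain std::sync::mpsc channel between orchestrator and game node; the event names are invented.

```rust
use std::sync::mpsc;

/// Invented event names -- the real protocol isn't described in this post.
enum OrchestratorEvent {
    WorldReady { db_path: String },
    Shutdown,
}

fn game_node_main(events: mpsc::Receiver<OrchestratorEvent>) {
    for event in events {
        match event {
            OrchestratorEvent::WorldReady { db_path } => {
                // Pull the initial world state from the database, then start
                // running the simulation's business logic.
                println!("syncing world state from {db_path}");
            }
            OrchestratorEvent::Shutdown => break,
        }
    }
}
```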

The orchestrator will also launch a "login" node and one or more "proxy" nodes.  The login node gives the client an initial endpoint to authenticate against and hands off the connection to a proxy node.  That proxy node is the gateway between the game node(s) and clients.
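
A sketch of what the login-to-proxy handoff could look like on the wire, assuming serde for serialization; the message and field names are invented.

```rust
use serde::{Deserialize, Serialize};

/// What the login node sends back after checking credentials.
#[derive(Serialize, Deserialize)]
enum LoginResponse {
    /// Authenticated: reconnect to this proxy and present this ticket.
    Handoff { proxy_addr: String, ticket: u64 },
    Denied { reason: String },
}

/// First message a client sends to the proxy; the ticket proves it already
/// authenticated at the login node.
#[derive(Serialize, Deserialize)]
struct ProxyHello {
    ticket: u64,
}
```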

You end up with something like this (a rough sketch of the topology just described):
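
```
orchestrator
 ├─ SQL database      (world generation and persistence)
 ├─ control plane ──▶ game node(s)   (simulation logic)
 ├─ login node        (initial authentication, hands clients off to a proxy)
 └─ proxy node(s)     (gateway between clients and game nodes)

client ──▶ login node ──▶ proxy node ◀──▶ game node(s) ◀──▶ SQL database
```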

Something like this could scale within a cloud environment by adding hardware as the world grows and more players connect.  The control plane lets the orchestrator decide whether it needs to literally spin up a physical server or VM (in a cloud environment) or just a thread within a client process (on a local machine).

How is the world "partitioned" so that multiple game servers operate cooperatively within that single-world experience?  One for a future blog post, I suppose.
