Rambling about C# 9.0, games and networking

Back at it once again! Rambling about stuff, and not really even trying to make a point. I'm warning you. What else are you supposed to do when your train is late by several hours?

* * *

I recently got an urge to test out the new C# 9.0 language features and see if they could make it easier to write state-centric games (like USG:R, which I blogged about earlier). TypeScript is nice, but nothing really beats C#, so this is quite exciting. With the addition of data classes (now just called records), the promise was that it would be easier to use immutability, which is one of the core principles of React and Redux.

And things did work. And they worked just like with Redux. But that's the problem. The Redux way is that each reducer produces the new state by creating a new instance of the state with the changed part replaced: return {...oldState, counter: oldState.counter + 1}. Very simple, and the reducers stay pure by not mutating the input values. And each state object can be safely stored away in case there's a need for some kind of time travel debugging or state replaying, or anything like that. But the huge downside is that it gets very messy once the state hierarchy gets any deeper.
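To make that concrete in C# terms (a hypothetical state shape, not code from any actual game): with records the flat case is just as clean as the spread syntax, but a deep update means chaining with-expressions by hand.

```csharp
// Hypothetical state records, just to illustrate the pattern.
public record PlayerState(int Health, int Gold);
public record WorldState(PlayerState Player);
public record GameState(int Counter, WorldState World);

public static class Reducers
{
    // The flat case is as clean as the Redux spread syntax:
    public static GameState Increment(GameState state) =>
        state with { Counter = state.Counter + 1 };

    // But a deep update quickly turns into nested 'with' chains:
    public static GameState HealPlayer(GameState state) =>
        state with
        {
            World = state.World with
            {
                Player = state.World.Player with { Health = 100 }
            }
        };
}
```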

The alternative is something like ImmerJS, where the reducers are allowed to mutate the state directly: state.very.deep.someArray.push(value). The library takes care of efficiently making a copy of the state object as needed, meaning that if something isn't modified, it doesn't need to be copied either. So that 10 000 element array isn't copied each time the counter is incremented. And it works great. The code is A LOT simpler, and thanks to the on-demand copying the performance penalty isn't actually that huge.

Imagine my disappointment when I realized this. I'd been waiting for records for maybe a few years already, since before even hearing about ImmerJS. And then when I finally got them, the problem I wanted them to solve had changed... That isn't to say they aren't a good addition, or that they can't be used elsewhere. But state was what I was really waiting for them for. For carrying simpler things - especially events - they are great.
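For example, a one-line positional record makes a perfectly serviceable immutable event, with value equality for free (a made-up event type, for illustration):

```csharp
using System;

// A hypothetical game event: immutable, value-equal, one line.
public record PlayerDamaged(Guid PlayerId, int Amount, DateTimeOffset At);
```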

* * *

But back to the state games. The dream is to be able to write them in C# and surpass the productivity of React and Redux. So, what do? Why not just mutate the global state like all the other normal games do? The state can be explicitly copied if need be. Well. That is a good point. Can't really counter it. (There are also some very React-like bindings for Blazor and MAUI, solving the other part of the equation.)

Except if the domain is network games! If the game can be constructed in such a way that there aren't too many state changes (and preferably the state itself is small), it would be trivial to transmit those changed states over the network and render the state in the client. And the big thing: what if we took a page out of the ImmerJS playbook, and replaced the state object on the server with one that keeps track of the concrete changes made to it? Then just the changes themselves could then be transmitted over the network. No need for expensive copying, and no need to transmit the whole state. Also no need to "manually" compute the deltas, as the data structure itself does it. It sounds so cool!
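I haven't built this, but the rough idea would be a wrapper that applies every mutation locally while also appending it to a change log, so the logged operations themselves become the wire format. A minimal sketch, with all the names made up:

```csharp
using System.Collections.Generic;

// Sketch of a change-recording state wrapper (hypothetical API).
public record StateOp(string Path, object Value);

public class TrackedState
{
    private readonly Dictionary<string, object> _values = new();
    private readonly List<StateOp> _log = new();

    // Apply the mutation AND remember it.
    public void Set(string path, object value)
    {
        _values[path] = value;
        _log.Add(new StateOp(path, value));
    }

    // Returns the accumulated delta and clears the log; the delta is
    // what would be serialized and broadcast instead of the full state.
    public IReadOnlyList<StateOp> FlushDelta()
    {
        var delta = _log.ToArray();
        _log.Clear();
        return delta;
    }
}
```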

Although realistically I'm not sure if this has ever been a problem. The states in most games should be relatively small anyway, and especially with today's hardware the deltas can be computed with just brute force, even by relatively simple algorithms. Just like games like Quake 3 have been doing for ages. I highly doubt it will ever really be that prohibitively expensive. But one can always dream of doing things better. Especially when it comes to cloud scale and IoT, where every cycle counts.

Speaking of which. Related to the above, I've been building a general-purpose framework for state-based applications. I'm not sure if I'll ever really get around to using it, but it's been nice coding something of my own once in a while. And using Redis always evokes warm fuzzy feelings :) If executed well, a framework like that might have some money-making potential. Or just be a nice tool to easily create some multiplayer games. Or just research, as always. That's the most important point.

This framework I'm making consists of an ASP.NET Core SignalR WebSocket gateway that handles user registrations and authentication (with EdDSA JWTs! Although with an unsafe curve, because libraries...) and then connects the clients to sessions that are persisted on a sharded Redis pool. The clients receive state updates and can send inputs to a session's input queue. They can also optionally see the other clients participating in the session.

The sessions themselves are managed by server applications (for the lack of a better name). These connect to the session's Redis server and take ownership of the sessions assigned to them (or otherwise delegated upon them). Then they simply take inputs from the input queue, compute the new state based on the input and the current state (just like Redux), and finally publish the new state to the clients (in the future hopefully as a delta); the core loop is sketched below. They can also inject their own events (such as time passing) into the queue.

If a client needs to reconnect, it can simply read the current state from Redis. And if the server crashes or needs to restart, the state and the inputs are persisted in Redis, resulting in no data loss. Well, as long as Redis stays alive, of course.
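In StackExchange.Redis terms, the core loop of such a server application could look roughly like this (the key layout and the Reduce function are made up for the sketch):

```csharp
using System.Threading.Tasks;
using StackExchange.Redis;

public static class SessionServer
{
    // Rough sketch of the session-processing loop, not the actual framework code.
    // Assumed layout: "session:{id}:state" holds the serialized state,
    // "session:{id}:inputs" is a Redis list used as the input queue.
    public static async Task RunAsync(IDatabase db, ISubscriber sub, string sessionId)
    {
        var state = await db.StringGetAsync($"session:{sessionId}:state");
        while (true)
        {
            var input = await db.ListLeftPopAsync($"session:{sessionId}:inputs");
            if (input.IsNull)
            {
                await Task.Delay(10); // no blocking BLPOP exposed here, so poll
                continue;
            }

            state = Reduce(state, input); // (state, input) -> new state, Redux-style
            await db.StringSetAsync($"session:{sessionId}:state", state); // for reconnects/restarts
            await sub.PublishAsync($"session:{sessionId}:updates", state); // push to the gateway/clients
        }
    }

    private static RedisValue Reduce(RedisValue state, RedisValue input) => state; // placeholder
}
```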

The key point is that this kind of architecture enables a laughably easy way to upgrade the server code, as there isn't really any downside to killing the server application and having it restart with changed code. When it starts, it just reads the current state and starts processing inputs like before. Of course, this could also be achieved by co-operatively shutting down the old server instance and saving the state before it closes. But that isn't any fun. And it would be extra effort to support taking over individual sessions. With this system it comes for free. Not that it would really be of that much use, but how cool is that in theory! (Each session has a host serial, and only the application server holding the most recent serial can make changes to the state. When taking over a session, the serial is just incremented, invalidating the old server.)
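The fencing itself is a natural fit for a tiny Lua script, since the serial check and the state write have to happen atomically. Something along these lines (key names again made up):

```csharp
using System.Threading.Tasks;
using StackExchange.Redis;

public static class SessionFencing
{
    // The write succeeds only if the caller still holds the latest serial.
    // Taking over a session just INCRs the serial, which silently
    // invalidates the previous owner's writes from then on.
    private const string FencedWrite = @"
        if redis.call('GET', KEYS[1]) == ARGV[1] then
            redis.call('SET', KEYS[2], ARGV[2])
            return 1
        end
        return 0";

    public static async Task<bool> TryWriteStateAsync(
        IDatabase db, string sessionId, long mySerial, RedisValue newState)
    {
        var result = await db.ScriptEvaluateAsync(FencedWrite,
            new RedisKey[] { $"session:{sessionId}:serial", $"session:{sessionId}:state" },
            new RedisValue[] { mySerial, newState });
        return (int)result == 1;
    }
}
```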

Anyway. What I also like about this is how scalable it is. There isn't any application-specific code on the gateway, all the clients communicate only via sessions, and one client is connected to only one session at a time. This means that it is easy to spin up as many instances of the gateway as needed, and other than the Redis session backplane there really isn't any cross-communication between the gateways. Except, of course, user registration and login. Also, the sessions themselves don't need to talk to other sessions and are completely self-contained, except for the orchestration (which server instance starts serving a new session). This means that the Redis side of things can also be easily scaled by sharding the sessions by their id. And further, the server application nodes themselves can also be scaled infinitely. So should something happen and the games running on this framework become hugely popular, it's no problem architecture-wise to just spin up some more instances.
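The sharding can be completely stateless, too: any gateway or server instance can resolve a session id to the same Redis shard without coordinating with anyone. A sketch (FNV-1a chosen arbitrarily):

```csharp
public static class Sharding
{
    // string.GetHashCode() is randomized per process on .NET Core,
    // so use a deterministic hash (FNV-1a) for stable shard selection.
    public static int ShardFor(string sessionId, int shardCount)
    {
        uint hash = 2166136261;
        foreach (char c in sessionId)
            hash = (hash ^ c) * 16777619;
        return (int)(hash % (uint)shardCount);
    }
}
```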

The only problem is how reliant this all is on Redis. The initial prototype I've been building makes extensive use of Redis Lua scripting, combining a dozen operations into things like adding an input to the queue (there's a sketch of one below). It should be extremely unlikely, but should something go sideways during the execution of such a script, the recovery won't necessarily be easy. Although most of those operations are about checking consistency and updating expiries anyway, so it really isn't a problem. But what I am really interested in is the performance. I'm really curious to see what kind of performance characteristics this kind of system has. Also, scalability is nice and all, but as I've talked about before, single-node performance is what is really the name of the game. Fewer nodes, less cost. This isn't really going to end up in that category ':D But hey, we have to remember that all of this isn't that straightforward, as development time is also a factor. For once...
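For a flavor of what those scripts look like, here's a simplified sketch of the "add input" case: the consistency check, the push itself, and the expiry refreshes all happen in one atomic round-trip (the real script does considerably more):

```csharp
using System.Threading.Tasks;
using StackExchange.Redis;

public static class InputQueue
{
    // Validate, enqueue and refresh expiries in a single atomic step.
    private const string EnqueueScript = @"
        if redis.call('EXISTS', KEYS[1]) == 0 then
            return redis.error_reply('no such session')
        end
        redis.call('RPUSH', KEYS[2], ARGV[1])
        redis.call('EXPIRE', KEYS[1], ARGV[2])
        redis.call('EXPIRE', KEYS[2], ARGV[2])
        return redis.call('LLEN', KEYS[2])";

    public static Task<RedisResult> EnqueueAsync(
        IDatabase db, string sessionId, RedisValue input, int ttlSeconds) =>
        db.ScriptEvaluateAsync(EnqueueScript,
            new RedisKey[] { $"session:{sessionId}", $"session:{sessionId}:inputs" },
            new RedisValue[] { input, ttlSeconds });
}
```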

And this thing here is actually really simple and easy. I hope to prove it. At least to myself :d After that I'd like to test some other topologies, and dwell on all the missed performance and the system's dependency on Redis. Keep the loose coupling between the gateway and the servers, or try an even tighter one - or decouple them entirely.

But at least it'll be fun.

Dreaming about Ampere

Hello again,

quite a while since the last update. Again. But at this point you should know me well enough to expect it. Or maybe you just happened to read the previous post. Go figure.

Anyway... One lesser thing I've been pondering is how upgrading to an ultrawide display might have played a part in me not finding as much joy in gaming as I used to. Sure, this is a minor thing considering all the other factors, but a factor nonetheless. Or at least the lack of the best possible GPU is a good excuse not to play games. Regardless of that, I've been in the process of upgrading my GPU for a while now, and I was extremely happy with how good of a product Nvidia managed to launch with Ampere!

"A great generational leap in performance", good pricing, and even the Founder's Edition models seem very good. Typically FEs themselves have been a bit lacking, but now it seems that they might actually be the better product! I'll still of course have to wait for the reviews, but I don't remember being this exicted about hardware in quite a while. It's also one of the very few things I've really been excited about the whole year.

And that's not all. I recently got myself the Valve Index VR kit. The waiting list for it was so long that I'd almost forgotten about it. Now, I've mostly been using it to get some exercise in the form of Beat Saber, but I also did play through Arizona Sunshine. AS has some flaws, but the gunplay itself was rather nice. I also started Half-Life: Alyx a while ago elsewhere, but haven't really gotten around to playing it more. This is unfortunate. Instead, I've been busy playing Minecraft, and just simply keeping busy... This is hopefully maybe changing, but we'll see. Anyway. What saddens me a bit is how increasing the render resolution scale in AS made the game a lot clearer, but completely killed the FPS. Maybe the new GPU will change that! And most certainly I'll get that sweet 144 Hz in Destiny 2 again, too!

Hmh. It seems that "whiles" are the staple of my timeframes :p And while this post feels a bit lackluster, it feels good to produce some content once in a while. Maybe I'll even get motivated to write some development-themed post at some point, too. We can all hope, at least :) We'll see, we'll see...

And I know what you are thinking. About everything. We'll see.

Presenting USG:Rerolled


Despite the hardships, I’ve been able to at least occasionally dedicate some time to the continued development of this year’s Finnish Game Jam / Global Game Jam game. I worked alone this time, and still made a nice game. With TypeScript :o
Now, with the continued effort, it’s starting to look pretty good! The game itself is a mix of Dicey Dungeons, Slay the Spire and maybe even FTL: Faster Than Light. It’s about dice-rolling and loot in turn-based combat, from encounter to encounter. Originally it was to have gameplay that would have emulated Cultist Simulator in at least some questionable way, and hence got the name SDC: Slay Dicey Cultists. But then it occurred to me that this could be an excellent chance to carry on the torch from the project that is my unicorn, The Peli – or as more recently known, USG. So let’s give it up for USG:Rerolled!
The main focus of the game is the equipment, and the many effects the pieces can have. This made me choose to implement the effects as straight-up code, instead of trying to codify all the effects in some kind of standardized structural form. It's been a good choice for productivity. But I dread the day I need to make some kind of breaking change. I also did some snooping, and found that this is how other games have chosen to approach this problem, too. As the game will have (and already has) quite an assortment of equipment, it only made sense to also create an editor for the items. And the editor. Well… I guess I’ve spent as much time on it as on the game, or something `:D
The editor has some standard fields for the most basic attributes of the equipment. It also has an integrated code editor with syntax highlighting and auto-completion. This is achieved by embedding the very same text editor component (and diff viewer) that powers Visual Studio Code, the Monaco Editor. The changes made via the editor are versioned separately from the rest of the game, and the editor has an ASP.NET Core backend implementing the filesystem and code generation functionalities. Upon saving the data, the game itself is automatically reloaded with the new equipment data. I’m rather pleased with the setup. I’m planning on extending the editor to also create random encounters for the game, to balance out all the combat. Then there’s always some quality-of-life improvements to be done… But overall, it’s already in surprisingly good shape! The editor even has a graph of the saved versions and their relations...
The latest addition to the game was an encounter map, which brings some structure to the game. Compared to the work already put into the game, this was a relatively small addition, but it did take a few days to get the SVG-based drawing and random generation to work in a satisfactory way. Evening out the randomness would be the next step. Speaking of next steps, I’m kinda testing out whether I could bring ships with limited hardpoints into the mix, without everything getting too confusing. Then I’d be quite close to the dream that is USG. See you next time!




Plans


Everything I planned for. Crumbling before my very eyes. And I’m not even talking about the pandemic.

I kinda thought that now that I got my studies finished, I’d get a chance to focus on what is important and make time to do things. But that didn’t quite pan out, as some things surfaced that’ll be requiring my attention. And not just for a little while, but for several years :( And some things that directly contradict everything.

Though it (mostly, not completely) depends on whether I think about those things or not (: It would be wise to give some thought to one of them, but thinking about it doesn’t really help it at the moment… Was this vague enough? :s

But, as they say: it is important to try to keep sense of normality in a time of crisis. Remains to be seen how well that'll pan out.

NuGet package creation addendum

A short and informational blog text for once!

In preparation for some big plans™ I changed the way I generate my NuGet packages. The old way, where all the metadata was specified in the .nuspec file, was otherwise good, but it didn't specify version metadata for the actual DLL.

I've now changed the process so that the version is specified in the .csproj file, via the VersionPrefix element. I've also defined a VersionSuffix element with a value of dev. This way, when the project is built "locally", the assembly's ProductVersion field reads for example 0.1.0-dev. Whereas when the NuGet package is built, I pass the option -Properties VersionSuffix=, resulting in 0.1.0. And as a side effect of moving the version element away from the nuspec, I also had to specify -Version $version, where $version is a PowerShell variable parsed from the csproj file, and include a placeholder version in the nuspec file.
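In the .csproj that part looks something like this (version number just for illustration):

```xml
<PropertyGroup>
  <VersionPrefix>0.1.0</VersionPrefix>
  <VersionSuffix>dev</VersionSuffix>
</PropertyGroup>
```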

While slightly complex, this now allows me to fetch the version programmatically and know whether the library is an "official" release version via NuGet, or a locally compiled version with unversioned changes. I've also considered just injecting the version together with the prefix as a build step, but for now I'll try my luck with something to this effect, which needs the assembly to reside on disk:
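```csharp
using System.Diagnostics;
using System.Reflection;

// FileVersionInfo reads the metadata from the assembly file itself,
// which is why the assembly has to exist on disk.
var location = Assembly.GetExecutingAssembly().Location;
var version = FileVersionInfo.GetVersionInfo(location).ProductVersion;
// "0.1.0-dev" when built locally, "0.1.0" for the NuGet release build.
```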

PS: there's a ton of unresolved issues with the NuGet CLI program, many of them several years old. When investigating the above, I also noticed that my NuGet packages don't include external libraries as dependencies, even when using the option -IncludeReferencedProjects. It is a reported bug, but it hasn't been fixed. One alternative might be to use dotnet pack instead of nuget pack, but I'm not sure what other changes that would entail.

InfluxDB broke, again

It wasn't long ago that I blogged about experiencing my first fault with InfluxDB. And now it has happened again. A bit different this time, though. And much worse.

It all started with the familiar NATS alert, so I tried rebuilding the indexes again with buildtsi and thought that would be it. I was wrong.

This time there was an invalid CRC on one data block, which caused segfaults (lol) within the InfluxDB executable. Speaking of error checking... the verify command in influx_inspect has a bug: it erroneously reports a block as healthy, because the right counter is never incremented. But anyway... The block is faulty. What now? Where is the option to fix it, or remove the block? Removing the offending file manually doesn't help.

So in the end this made me abandon the data and start from scratch :'( And for some reason docker-compose kept doing some weird caching with the data (even though it was a filesystem mount), even after downing and removing the old container and emptying the old data directory, and it took some effort to get everything working again...

I should probably consider upgrading the hardware. If that is the real problem. For the lulz I also asked for a quote for InfluxDB Enterprise. If I cluster it, then one node can fail and it recovers, right? Right?? But I don't expect the quote to be realistic. And even if it was, it's probably still too much. One alternative might be Apache Druid. But it, too, seems a bit too young a product.