An application to record go games. Try it out at go.chiquit.ooo!
Because I want to review them after I play them.
I wanted a way to record games played in person on my phone, but the apps I tried were hard to use on a touchscreen. This application is built specifically to cater to my needs: recording, storing, reviewing, and exporting games. AI review is, tragically, out of scope.
There almost definitely is. My ulterior motive was to have a relatively simple web application that I could reimplement in various languages and platforms. I'm the only anticipated user, so I can set my own requirements and timelines, and I get to sharpen my skills through repetition. "Recording go games" is a pretty minimal set of features for a website, but it's still a meaningful step up from "Hello, World".
So far, I've finished five implementations:
- Python + Django - Written using Django and Django REST Framework. The original, reference implementation. It also owns the DB schema: I use Django's built-in migration system to manage schema changes, since I don't want to keep separate migrations synchronized across all the implementations.
- Python + FastAPI - Also written in Python, but using FastAPI for the REST API and SQLAlchemy for the DB. So far this is my favorite :) (There's a rough sketch of what an endpoint looks like just after this list.)
- Haskell - Written in Haskell using scotty for the REST API and hasql for the DB. It's a powerful language, but the build times are atrocious.
- Rust - Written in Rust using axum for the REST API and tokio-postgres for the DB. Architecturally it's practically identical to the Haskell implementation, but I'm much fonder of Rust as a language.
- Deno - Written in TypeScript using the Deno runtime as opposed to the more established Node.js. Deno ships with a surprisingly complete web server API, so I opted to use it directly instead of a more traditional web framework.
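To give a sense of the shape these implementations share, here is a rough sketch of a FastAPI endpoint. The route path, the GameRecord fields, and the in-memory store are illustrative stand-ins, not the actual code, which talks to Postgres through SQLAlchemy.

```python
# Illustrative sketch only: the route path, fields, and in-memory store are
# assumptions for this example; the real implementation uses SQLAlchemy.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()


class GameRecord(BaseModel):
    id: int
    black_player: str
    white_player: str
    moves: list[str]  # move coordinates, e.g. "pd", "dp"


# Stand-in for the SQLAlchemy-backed store in the real implementation.
GAMES: dict[int, GameRecord] = {}


@app.get("/games/{game_id}", response_model=GameRecord)
def get_game(game_id: int) -> GameRecord:
    # Look the game up and return it, or 404 if it was never recorded.
    if game_id not in GAMES:
        raise HTTPException(status_code=404, detail="game not found")
    return GAMES[game_id]
```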
OpenAPI specs are provided for FastAPI and Django. Because I took some shortcuts on the Django implementation, the FastAPI spec is the more canonical of the two.
I'm more interested in API services than traditional web templating, so I decided to make it a single-page application. The website consists of three parts:
- The web app, written in Vue.
- The API servers, in all their glory.
- The Postgres database.
The web app sends requests to the server through the REST API, and the server updates the database accordingly. The REST API and the database schema are both well-defined, so as long as the various server implementations conform to those expectations, they are interchangeable.
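For illustration, here is the kind of round trip the web app performs, written in Python for brevity rather than the Vue app's actual TypeScript. The endpoint paths and payload fields are assumptions, not the real API surface.

```python
# Hypothetical round trip against the REST API; paths and fields are guesses.
import requests

BASE_URL = "https://go.chiquit.ooo/api"  # assumed API prefix behind the proxy

# Create a new game record...
resp = requests.post(
    f"{BASE_URL}/games",
    json={"black_player": "me", "white_player": "a friend", "board_size": 19},
)
resp.raise_for_status()
game = resp.json()

# ...then append a move to it. Any conforming backend handles this identically.
requests.post(f"{BASE_URL}/games/{game['id']}/moves", json={"coordinate": "pd"})
```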
For some extra credit, I decided to manage the deployment myself on my own hardware rather than using the free tier of a PaaS like Heroku. I had some Raspberry Pis lying around that are ample for my user base. The "proper" tool to manage a containerized deployment would be something like Kubernetes, but my understanding is that the primary value proposition of heavy-duty orchestration tools is scalability, which is not part of my use case. Instead, I'm just using Docker Compose to manage the various API servers.
An interesting caveat of using Raspberry Pis is their truly abysmal processing power. It is more than sufficient for running the services, since there is practically no load on the API server, but it is quite problematic for the build process. Pis have ARM processors, so Docker images built for x86 are not compatible. So far I have just been building the images on the Pi itself, which is quite slow. I did decide to commit the built static resources directly into the repository, so that part of the build can happen in my development environment instead of on the Pi.
I'm also using nginx as a frontend proxy for all my locally hosted servers; it also handles SSL termination and static file hosting.
Each implementation maintains its own set of unit tests. I initially tried to unit test all the code, but it proved painfully tedious to rewrite the same exhaustive unit tests in each language. I ultimately settled on unit tests for just the core game logic, since that is the most intricate, algorithmic, and error-prone code.
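As an illustration of the kind of test that gets duplicated per implementation, here is a minimal sketch in Python. The count_liberties helper is a simplified stand-in written just for this example; the real rules code is more involved.

```python
# Sketch of a game-logic unit test; count_liberties is a toy stand-in.
import unittest


def count_liberties(stones: set[tuple[int, int]],
                    board: dict[tuple[int, int], str],
                    size: int = 19) -> int:
    """Count the empty points adjacent to a group of stones."""
    liberties = set()
    for x, y in stones:
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < size and 0 <= ny < size and (nx, ny) not in board:
                liberties.add((nx, ny))
    return len(liberties)


class LibertyTests(unittest.TestCase):
    def test_lone_corner_stone_has_two_liberties(self):
        board = {(0, 0): "black"}
        self.assertEqual(count_liberties({(0, 0)}, board), 2)

    def test_surrounded_stone_has_no_liberties(self):
        board = {(1, 1): "black", (0, 1): "white", (2, 1): "white",
                 (1, 0): "white", (1, 2): "white"}
        self.assertEqual(count_liberties({(1, 1)}, board), 0)


if __name__ == "__main__":
    unittest.main()
```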
Because all the server implementations have the same API surface, they can all share the same integration tests, which makes life much easier. The integration test harness uses Docker Compose to bring the various server containers up and down as needed, and to manage a Postgres container that provides the DB. The integration tests are the arbiter of implementation compliance, which is the only reason I feel comfortable treating exhaustive unit tests as redundant.
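Here is a sketch of how such a harness can drive Docker Compose, assuming a pytest-style setup; the service names, port, and health endpoint are guesses rather than the actual configuration.

```python
# Hypothetical harness: service names, port, and /health endpoint are assumed.
import subprocess
import time

import pytest
import requests

BASE_URL = "http://localhost:8000"  # assumed port the servers listen on


def wait_until_healthy(url: str, timeout: float = 60.0) -> None:
    """Poll the server until it answers, or give up after the timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            if requests.get(url, timeout=1).ok:
                return
        except requests.RequestException:
            pass
        time.sleep(1)
    raise RuntimeError(f"server at {url} never became healthy")


@pytest.fixture(scope="session",
                params=["django", "fastapi", "haskell", "rust", "deno"])
def api_server(request):
    """Bring up one server implementation (plus Postgres) for the test run."""
    subprocess.run(["docker", "compose", "up", "-d", "postgres", request.param],
                   check=True)
    wait_until_healthy(f"{BASE_URL}/health")
    yield BASE_URL
    subprocess.run(["docker", "compose", "down"], check=True)


def test_games_endpoint_responds(api_server):
    # The same assertion runs against every implementation in turn.
    resp = requests.get(f"{api_server}/games")
    assert resp.status_code == 200
```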
I did not and will not write any. There is a time and a place for browser testing, and this is neither. If the application were an order of magnitude larger, had more user stories than a single QA person could check in a reasonable time, had massively more development activity, and had a substantial user base who would be negatively impacted by regressions, then I would consider it.
I have also been keeping a blog chronicling my efforts. There is no intended audience for it; it's more of a meditative retrospective that helps me maintain focus on what I've been working on.