A sample project to use in testing workshops with the theme of testing and building "more testable" distributed (serverless) systems.
Relevant theoretical materials include:
Other good, practical material includes:
- *Understand Legacy Code* is great; see for example "A quick way to add tests when code has database or HTTP calls"
- "You Don't Hate Mocks; You Hate Side-Effects"
Based on the minimalist-serverless-starter project.
Configurations for ESLint and Prettier are reasonable starting points. The TypeScript configuration is very strict, to get the most out of TS features. The Serverless Framework configuration is optimized (ARM architecture, short log retention, no versioning), CORS-enabled, and set to safer-than-default settings.
The application starting point (the handler) is located at `src/handler.ts`, and a first demonstrational test is at `tests/unit/demo.test.ts`. The rest of the tests and other "finished" materials are in the `__finished__` folder and might need updates to their import paths when you place them in the root again.
- Recent Node.js (ideally 18+) installed.
- Amazon Web Services (AWS) account with sufficient permissions so that you can deploy infrastructure. A naive but simple policy would be full rights for CloudWatch, Lambda, API Gateway, and S3.
- Ideally some experience with Serverless Framework as that's what we will use to deploy the service and infrastructure.
Clone, fork, or download the repo as you normally would. Run `npm install`.
- `npm start`: Run application locally
- `npm test`: Test the business/application logic with Jest
- `npm run build`: Package application with Serverless Framework
- `npm run deploy`: Deploy application to AWS with Serverless Framework
- `npm run teardown`: Remove stack from AWS
Using `npm start` you can start using the local endpoint `http://localhost:3000/greet` to call the service:

```bash
curl http://localhost:3000/greet
```

which should respond back with:

```
"Hi there!"
```
The workshop is meant to be dynamic and interactive, but the below outlines an overall learning/experience flow for participants.
We need a service to greet people.
- Scope
- We use the concept generally, for a more extensive take see What is in your Testing Scope?
- Boundary
- See for example Defining Test Boundaries – An example and Avoid Test Duplication
- Contra-variant testing
- Determinism
- "Contra-variant testing": The benefits of testing the majority of code on a use-case level rather than per-function level.
- Confidence follows from determinism (in code), and determinism can be achieved by controlling side effects.
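As a minimal sketch of how controlling a side effect produces determinism, consider a greeting that depends on the time of day. The names below are illustrative, not part of the workshop code: instead of reading the real clock inside the function (a hidden side effect), the function takes the current time as a parameter, so a test can pass a fixed value and assert an exact result.

```typescript
// Hypothetical example: determinism through dependency injection.
// Instead of calling `new Date()` internally (a hidden side effect),
// the function receives the time, so the same input always yields
// the same output.
export function greetingForTime(now: Date): string {
  const hour = now.getUTCHours();
  return hour < 12 ? "Good morning!" : "Good day!";
}

// Deterministic: a fixed Date gives the same result on every run.
const fixed = new Date("2023-01-01T09:00:00Z");
console.log(greetingForTime(fixed)); // "Good morning!"
```

A test can now pin down behavior exactly, with no flakiness from the real clock.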
- Look at `tests/unit/demo.test.ts` to familiarize yourself with the structure of a typical unit test.
- Look at `src/handler.ts`. How can we test this? How might you think about the scope of a given test (bigger, smaller) and the pros/cons of each?
- Implement a unit test on the entire handler. What do you foresee as issues with this solution?
- Split out the "business logic" from the handler. How are testing, reliability, and confidence improved by doing this?
- Reimplement the unit test on the business logic, not on the handler.
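The shape of that split can be sketched as follows. This is a hedged illustration with hypothetical names, not the repo's actual `src/handler.ts`: the idea is a pure business-logic function that the Lambda handler merely wraps, so the logic can be tested without constructing API Gateway event fixtures.

```typescript
// Pure "business logic": no event shapes, no AWS types, trivially testable.
export function greet(): string {
  return "Hi there!";
}

// Thin handler: only maps the HTTP layer to the logic and back.
// Illustrative response shape for an API Gateway Lambda.
export async function handler(): Promise<{ statusCode: number; body: string }> {
  return { statusCode: 200, body: JSON.stringify(greet()) };
}
```

A unit test on `greet` now exercises the behavior directly, while the handler stays so thin that its own failure modes are mostly wiring.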
We need support for dynamic input/output, i.e. providing and responding with your name.
- Mutability
- Read Tiny Programming Principles: Immutability and Immutable object
- See also To mutate or not – on Entities and Value Objects and the approach in Elegant Objects
- Validation
- Invariant
- "Always valid" domain model
- Read Always-Valid Domain Model, also covers the above concepts
- The dangers of "dumb" POCOs/POJOs and mutability of data.
- Leverage prior logical validation if input remains unchanged/un-mutated.
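One hedged sketch of an immutable, "always valid" value object, assuming a `Name` concept for this service (the class and its rules are illustrative, not the workshop's actual code): validation happens exactly once, at construction, and the object cannot be mutated into an invalid state afterwards, which is what lets later code trust prior validation.

```typescript
// Illustrative "always valid" value object (DDD style).
export class Name {
  private constructor(public readonly value: string) {
    Object.freeze(this); // immutable after construction: no later mutation
  }

  // The only way to get a Name is through validation.
  public static from(input: unknown): Name {
    if (typeof input !== "string" || input.trim().length === 0 || input.length > 50) {
      throw new Error("Name must be a non-empty string of at most 50 characters");
    }
    return new Name(input.trim());
  }
}
```

Contrast this with a "dumb" POJO whose fields anyone can overwrite: there, every consumer must re-validate, because nothing guarantees the data is still in the state that was once checked.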
- Implement new functionality. How do we support both the new and old behaviors?
- Think about validation: At which levels can/should we validate? Once, or across all boundaries? How could we be supported by using API-level schema validation? (See `api/Greeter.validator.json` for an example.)
- Implement validation functions. Demonstrate both a structural/compositional ("functional") approach and an object-oriented (DDD-inspired: value objects, Data Transfer Objects, "always valid state") approach.
- Implement any additional tests.
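The structural/compositional ("functional") approach might look something like the sketch below. Names and rules are hypothetical, not the workshop's finished code: small, single-purpose checks are composed into one validator that returns a result value rather than throwing.

```typescript
// Illustrative "functional" validation: small composable predicates
// and a result type instead of exceptions.
type Validation =
  | { valid: true; value: string }
  | { valid: false; error: string };

const isString = (input: unknown): input is string => typeof input === "string";
const isNonEmpty = (s: string): boolean => s.trim().length > 0;
const isShortEnough = (s: string): boolean => s.length <= 50;

export function validateName(input: unknown): Validation {
  if (!isString(input)) return { valid: false, error: "Name must be a string" };
  if (!isNonEmpty(input)) return { valid: false, error: "Name must not be empty" };
  if (!isShortEnough(input)) return { valid: false, error: "Name is too long" };
  return { valid: true, value: input.trim() };
}
```

The caller branches on `valid` and gets a typed, already-checked value in the success case, which pairs naturally with the object-oriented value-object approach for the "always valid" discussion.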
For "un-named" requests, we want to send back a response that looks something like `Hi there, Luke Skywalker!`.
- Testing vs monitoring and observability.
- Fallacies of distributed computing - What is it really that we want to test when we conduct integration testing?
Make sure to uncomment `setupFilesAfterEnv` in `jest.config.js` to get the mocking capability working.
- How do we test an external service?
- Getting test data and storing it co-located to our code and tests.
- API response mocking using our test data. What about schema changes?
- Handling errors and problem states correctly.
- Implement tests.
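One way the above can be approached, sketched here with hypothetical names rather than the workshop's finished code: the code that calls the external service receives its HTTP function as a dependency, so a test can inject a stub that returns test data stored alongside the tests, and the same seam makes the "schema changed under us" failure mode easy to exercise.

```typescript
// Hedged sketch: inject the HTTP call so tests never touch the network.
type FetchJson = (url: string) => Promise<unknown>;

// Hypothetical URL and response shape, for illustration only.
export async function getRandomName(fetchJson: FetchJson): Promise<string> {
  const data = (await fetchJson("https://example.com/api/person")) as { name?: unknown };
  // Guard against schema drift: fail loudly on an unexpected shape.
  if (typeof data.name !== "string" || data.name.length === 0) {
    throw new Error("Unexpected response shape from name service");
  }
  return data.name;
}

// In a test, a stub built from stored test data stands in for the service.
const stub: FetchJson = async () => ({ name: "Luke Skywalker" });
```

The error path doubles as a test for handling problem states: feed the validator a response with the "wrong" schema and assert that it rejects rather than silently producing garbage.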
We need to communicate each request to our service by emitting an event.
- Side effects
- Read the high-level Side effect (computer science) and more practical Side effects
- How to test around boundaries of systems, especially when using managed products like message queues and databases.
- Implement an event emitter (see `__finished__/src/infrastructure/emitter/Emitter.start.ts`). What problems do we get when using this for our testing?
- Abstract the implementation into an interface and inject a dummy/mock/local variant for testing.
- Making an implementation testable ("test aware") up until the point of producing potentially adverse side effects.
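The interface-plus-injection idea can be sketched like this. The names are illustrative, not the actual `Emitter.start.ts` implementation: logic depends only on an `Emitter` interface, so tests inject an in-memory fake and assert on recorded events, while production wires in the real AWS-backed variant right up to the point where the adverse side effect would occur.

```typescript
// Illustrative abstraction: the rest of the code sees only this interface.
export interface Emitter {
  emit(eventName: string, payload: Record<string, unknown>): Promise<void>;
}

// Test double: records events in memory instead of producing the real
// side effect (e.g. publishing to a message bus).
export class InMemoryEmitter implements Emitter {
  public readonly events: Array<{ eventName: string; payload: Record<string, unknown> }> = [];

  public async emit(eventName: string, payload: Record<string, unknown>): Promise<void> {
    this.events.push({ eventName, payload });
  }
}

// Business logic takes the interface, so either variant can be injected.
export async function recordGreeting(emitter: Emitter, name: string): Promise<void> {
  await emitter.emit("GreetingSent", { name });
}
```

A test constructs an `InMemoryEmitter`, runs the logic, and inspects `events`; nothing leaves the process, yet everything short of the side effect itself is verified.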