A playground to build technology supporting the development of modular monolithic (modulithic) Java applications.
Note (tl;dr): Moduliths is a convention-based (but customizable) approach that uses Java packages to define logical modules in monolithic Spring Boot applications, accompanied by a Spring Boot integration test extension to bootstrap tests for individual modules only. The tests also structurally validate the module arrangement.
- Create a simple Spring Boot application (e.g. via Spring Initializr).
- Add the Moduliths dependencies to your project:
<dependencies>

  <!-- For the @Modulith annotation -->
  <dependency>
    <groupId>de.olivergierke.moduliths</groupId>
    <artifactId>moduliths-core</artifactId>
    <version>1.0.0-SNAPSHOT</version>
  </dependency>

  <!-- Test support -->
  <dependency>
    <groupId>de.olivergierke.moduliths</groupId>
    <artifactId>moduliths-test</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <scope>test</scope>
  </dependency>

</dependencies>

<repositories>
  <repository>
    <id>spring-snapshots</id>
    <url>https://repo.spring.io/libs-snapshot</url>
  </repository>
</repositories>
- Set up your package structure as described here.
- Create a module test as described here.
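With the dependencies in place, the application class might look like the following sketch. The import for the @Modulith annotation is omitted, as its package is not shown in this document; the class body otherwise follows the usual Spring Boot bootstrap pattern.

package com.acme.myapp;

@Modulith
public class MyApplication {

  public static void main(String... args) {
    SpringApplication.run(MyApplication.class, args);
  }
}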
When it comes to designing applications we currently deal with two architectural approaches: monolithic applications and microservices. While often presented as opposed approaches, in their extremes, they actually form the ends of a spectrum into which a particular application architecture can be positioned. The trend towards smaller systems is strongly driven by the fact that monolithic applications tend to architecturally degrade over time, even if – at the beginning of their lives – an architecture is defined. Architecture violations creep into the projects over time unnoticed. Evolvability suffers as systems become harder to change.
Microservices, on the other hand, promise stronger means of separation, but at the same time introduce a lot of complexity, as even for small applications teams have to deal with the challenges of distributed systems.
This repo acts as a playground to experiment with different approaches to defining modular monoliths, so that it’s easy to maintain modularity over time and detect violations early. This preserves the ability to modify and advance the codebase over time and eases the effort of splitting up the system when attempting to extract parts of it into dedicated projects.
In software projects, architectural design decisions and constraints are usually defined in some way and then have to be implemented in a codebase. Traditionally, the connection between the architectural decisions and the actual code has been naming conventions, which easily diverge and cause the architecture actually implemented in the codebase to slowly degrade over time. We’d like to explore stronger means of connecting architecture and code, and even look into advanced support by frameworks and libraries to e.g. allow testability of individual components within an overall system.
There already exists a variety of technologies that attempt to bridge that gap from the architectural definition side, mostly by trying to capture the architectural definitions in executable form (see jQAssistant and Existing tools) and verifying whether the codebase adheres to the conventions defined. In this playground, we’re going to explore the opposite way: providing conventions as well as library and framework means to express architectural definitions directly inside the codebase, with two major goals:
- Getting the validation of the conventions closer to the code / developer — If architectural decisions are driven by the development team, it might feel more natural to define architectural concepts in the codebase. The more seamlessly an architectural rule validation system integrates with the codebase, the more likely it is that the system is used. An architectural rule that can be verified by the compiler is preferable to a rule verified by executing a test, which in turn is preferable to verification via dedicated build steps.
- Integration with existing tools — Even in combination with existing tools, it might help to ship generic architectural rules out of the box, with the developer just following conventions or explicitly annotating code to trigger the actual validation.
- Enable developers to write architecturally evident code, i.e. provide means to express architectural concepts in code to close the gap between the two.
- Provide means to verify the defined architectural constraints as close as possible to the code (by the compiler, through tests, or via additional build tools).
- Be as non-invasive as possible technology-wise, i.e. we prefer documented conventions over annotations over required type dependencies.
At its very core, Moduliths assumes you center your application around a single Java package (let’s assume com.acme.myapp). The application base package is defined by declaring a class annotated with @Modulith. It’s basically equivalent to @SpringBootApplication but indicates you’re opting into the module programming model and package structures.
Note (notation conventions): in the package diagrams below, + denotes a public type, o denotes a package-private type hidden inside its package, and the names in parentheses are the types the listed component depends on.
Every direct sub-package of this package is considered to describe a module:
com.acme.myapp (1)
+ @Modulith ….MyApplication
com.acme.myapp.moduleA (2)
+ ….MyComponentA(MyComponentB)
com.acme.myapp.moduleB (3)
+ ….MyComponentB(MySupportingComponent)
o ….MySupportingComponent
com.acme.myapp.moduleC (4)
+ ….MyComponentC(MyComponentA)
1. The application root package.
2. moduleA, implicitly depending on moduleB, only public components.
3. moduleB, not depending on other modules, hiding an internal component.
4. moduleC, depending on moduleA and thus, in turn, on moduleB.
In this simple scenario, the only additional means of encapsulation is the Java package scope, which allows developers to hide internal components from other modules. This is surprisingly simple and effective. For more complex structural scenarios, see More complex modules.
An individual module can be run for tests using the @ModuleTest annotation as follows:
package com.acme.myapp.moduleB;
@RunWith(SpringRunner.class)
@ModuleTest
public class ModuleBTest { … }
Running the test like this will cause the root application class to be considered, as well as all explicit configuration inside it. The test run will customize the configuration to limit the component scanning, the auto-configuration, and the entity scan packages to the package of the module test. It will also verify the dependencies between the modules. See more on that in More complex modules.
For moduleB this is very simple, as it doesn’t depend on any other modules in the application.
Without any further configuration, running an integration test for a module that depends on other modules will cause the ApplicationContext to fail to start, as the Spring beans depended on are not available. One option to resolve this is to declare @MockBean instances for all required dependencies:
package com.acme.myapp.moduleA;
@RunWith(SpringRunner.class)
@ModuleTest
public class ModuleATest {
@MockBean MyComponentB myComponentB;
}
An alternative approach is to broaden the scope of the test by defining an alternative bootstrap mode of DIRECT_DEPENDENCIES:
package com.acme.myapp.moduleA;
@RunWith(SpringRunner.class)
@ModuleTest(mode = BootstrapMode.DIRECT_DEPENDENCIES)
public class ModuleATest { … }
This will now inspect the module structure of the system, detect the dependency of Module A on Module B, and include the latter in the component scan as well as the auto-configuration and entity scan packages.
If the direct dependency has dependencies in turn, you now need to mock those using @MockBean in the test setup.
In case you want to run all modules up the dependency chain of the module to be tested, use BootstrapMode.ALL_DEPENDENCIES. This will cause all modules in the dependency chain to be bootstrapped, but unrelated ones to be excluded.
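Analogous to the DIRECT_DEPENDENCIES example above, such a test might look as follows. This is a sketch: ModuleCTest is a hypothetical name for a test of moduleC from the earlier package structure.

package com.acme.myapp.moduleC;

@RunWith(SpringRunner.class)
@ModuleTest(mode = BootstrapMode.ALL_DEPENDENCIES)
public class ModuleCTest { … }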
If you find yourself having to mock too many components of upstream modules, or having to include too many modules in the test run, it usually indicates that your modules are too tightly coupled. You might want to look into replacing those direct invocations of beans in other modules with the publication of an application event from the source module that is then consumed by the other module. See [sos] for further details.
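To illustrate the event-based decoupling, here is a minimal plain-Java sketch. OrderCompleted, EventBus, and the hand-rolled listener registry are hypothetical stand-ins for Spring's ApplicationEventPublisher and event listener infrastructure; in a real Spring application you would use those instead.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// An event published by the source module instead of a direct bean invocation.
class OrderCompleted {
  final String orderId;
  OrderCompleted(String orderId) { this.orderId = orderId; }
}

// Hand-rolled stand-in for Spring's event publication mechanism.
class EventBus {
  private final List<Consumer<OrderCompleted>> listeners = new ArrayList<>();
  void register(Consumer<OrderCompleted> listener) { listeners.add(listener); }
  void publish(OrderCompleted event) { listeners.forEach(l -> l.accept(event)); }
}

class EventDecouplingDemo {
  public static void main(String[] args) {
    EventBus bus = new EventBus();
    // moduleB registers interest without moduleA knowing about it
    bus.register(event -> System.out.println("Invoicing order " + event.orderId));
    // moduleA publishes an event instead of calling into moduleB directly
    bus.publish(new OrderCompleted("4711"));
  }
}
```

The point of the indirection is that the publishing module no longer holds a type dependency on the consuming module, which keeps the module boundary intact.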
Sometimes, a single package is not enough to capture all components of a single module and developers would like to organize code into additional packages. Let’s assume Module B is using the following structure:
com.acme.myapp
+ @Modulith ….MyApplication
com.acme.myapp.moduleA
+ ….MyComponentA(MyComponentB)
com.acme.myapp.moduleB
+ ….MyComponentB(MySupportingComponent, MyInternal)
o ….MySupportingComponent
com.acme.myapp.moduleB.internal
+ ….MyInternal(MyOtherInternal, InternalSupporting)
o ….InternalSupporting
com.acme.myapp.moduleB.otherinternal
+ ….MyOtherInternal
In this case we have two supporting packages containing components that depend on each other (MyInternal depends on InternalSupporting in the same package as well as on MyOtherInternal in the other supporting package).
By convention, on the module level, only dependencies to the top-level module package are allowed. I.e. any type residing in another module that depends on types in either ….moduleB.internal or ….moduleB.otherinternal will cause an @ModuleTest to fail.
In case a single public package defining the module root is not enough, modules can define so-called named interface packages, which constitute packages that are eligible targets for dependencies from components of other modules.
com.acme.myapp
+ @Modulith ….MyApplication
com.acme.myapp.moduleA
+ ….MyComponentA(MyComponentB)
com.acme.myapp.complex.api
+ @NamedInterface("API") ….package-info.java
com.acme.myapp.complex.spi
+ @NamedInterface("SPI") ….package-info.java
com.acme.myapp.complex.internal
o ….MyInternal
As you can see, we have dedicated packages of the module annotated with @NamedInterface. The annotation causes each of those packages to be referable from other modules’ dependencies, whereas non-annotated packages of the module (internal) won’t be (including the module root package).
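For reference, a named interface declaration lives in the package's package-info.java. A sketch for the API package above, assuming @NamedInterface is provided by the moduliths-core artifact:

@NamedInterface("API")
package com.acme.myapp.complex.api;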
Note: in the conventions below, [check circle] marks rules that are already implemented, [question circle] those that are not yet implemented.
Given the module conventions we can already implement a couple of derived rules:
[check circle] Assume the top-level module package to be the API package — If sub-packages are used, we could assume that only the top-level one contains API to be referred to from other modules.
[check circle] Provide an annotation to be used on packages so that multiple different named interfaces to a module can be defined.
[check circle] Prevent invalid dependencies into module-internal packages. — All module sub-packages are internal by default unless explicitly declared as a named interface.
[question circle] allowedDependencies would then have to use moduleA.API, moduleB.SPI. If a single named interface exists, referring to the module implicitly refers to that single named interface.
[question circle] Verify module setup — We can verify the validity of the module setup to prevent configuration errors from going unnoticed:
- [question circle] Catch invalid module and named interface references in allowedDependencies.
[question circle] Derive default allowed dependencies based on the Spring bean component tree — By default we can inspect the Spring beans in the individual modules and their dependencies, and assume the bean structure describes the allowed dependency structure. This can be overridden by explicitly declaring @Module(allowedDependencies = …) on the package level.
[question circle] Correlate actual dependencies with the ones defined (implicit or explicit) — Even with dependencies only defined implicitly by the Spring bean structure, the code can contain ordinary type dependencies that violate the module structure.
[question circle] No cycles on the module level — We should generally disallow cycles on the module level.
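The explicit override of allowed dependencies mentioned above could be sketched as a package-info.java like the following. The attribute values reuse the moduleA.API notation from the conventions; the exact attribute syntax is an assumption.

@Module(allowedDependencies = { "moduleA.API", "moduleB" })
package com.acme.myapp.moduleC;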
- As Spring application events are a recommended means to implement inter-module interaction, we could register an ApplicationListener that exposes an API to easily verify that events are published, event listeners are invoked, etc.
- ArchUnit — Tool to define allowed dependencies on a type- and package-based level, usually executed via JUnit.
- jQAssistant — Broader tool to analyze projects using a Neo4j-based meta-model, with concepts and constraints described via Cypher queries.
- Structurizr — Software architecture description and visualization tool by Simon Brown. Includes Spring integration via automatic stereotype annotation detection.