Staging ground for the exploration and development of systems useful for infrastructure simulation toys and games, with a focus on the city simulator genre.
It contains various more or less loose parts that might prove useful for other projects and for reference.
To see which license applies to which parts, see Licensing.
- Paths or file names prefixed by "(*)" indicate that they are somewhere in the code base. This is useful if they're somewhat likely to change location and unlikely to be mixed up with other files when using some tool to search the code base (so, maybe not (*)`facade.tscn` or something. :p).
- When referencing object-oriented code, method and member names are prefixed by `.`.
- If `*` is used directly in a path, it's to be read like a wildcard, referring to all files that would match. Example: `main_scenes/root*` would refer to `main_scenes/root.tscn` and `main_scenes/root.gd`.
Mostly for development, there are a variety of scenes and scripts in `main_scenes` which can be picked using buttons. The buttons are generated by the dev-menu system, which can be looked at in the sub-folder `main_scenes_lib`, based on the contents of the folder `res://main_scenes`. By default, `F9` toggles the visibility of the dev-menu. The chosen scene is saved in `user://dev.config`, as defined in `res://global_defaults.gd`. For camera controls and how to save the camera's state, see PlayerWorldInterface.
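For a rough idea of the mechanism, a dev-menu like this could populate its buttons along these lines (a minimal sketch on a node script, assuming Godot 4 APIs; the actual `main_scenes_lib` implementation may differ):

```gdscript
# Hedged sketch: generate one button per scene in res://main_scenes.
func _populate_dev_menu(button_container: Container) -> void:
	var dir := DirAccess.open("res://main_scenes")
	if dir == null:
		return
	for file_name in dir.get_files():
		if file_name.ends_with(".tscn"):
			var button := Button.new()
			button.text = file_name.get_basename()
			button.pressed.connect(_on_scene_chosen.bind("res://main_scenes/" + file_name))
			button_container.add_child(button)


func _on_scene_chosen(scene_path: String) -> void:
	get_tree().change_scene_to_file(scene_path)
```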
In the global_libs
directory, there are a number of fits and starts of
libraries, as well as stubs that might not contain anything yet, but serve
as "sticky notes" of sorts for future units of code organization.
There's currently very little, but this is where stuff for orchestrating the generation and fitting-together of city parts is supposed to go, e.g. classes/data structures that encode the dimensions of a house and the nature of its sides, so it can be generated to fit its surroundings.
When messing around with vertices, some functionality is best plopped into a function, so it's simple to re-use and also easy for people to just copy and use in other projects. The "city_" prefix of this library means that its functions have some sort of bias towards use with city stuff, or at least things related to buildings. An example of this is `shear_line`, which provides the core functionality of `.shear_vertices` of `ALine` of `mesh_lib`. The city bias comes in with the y-axis getting special treatment.
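To make that y-axis bias concrete, here's a minimal sketch of a shear in that spirit; the name, signature and behaviour of the real `shear_line` may differ:

```gdscript
# Hedged sketch: shear every vertex above `pivot_y` along the x/z
# `direction`, scaled by its height above the pivot. The y values stay
# untouched - the kind of special treatment the y-axis gets here.
func shear_vertices_above(
	vertices: PackedVector3Array,
	pivot_y: float,
	direction: Vector2,
	factor: float
) -> PackedVector3Array:
	var result := PackedVector3Array()
	for vertex in vertices:
		var height_above_pivot := vertex.y - pivot_y
		if height_above_pivot > 0.0:
			vertex.x += direction.x * height_above_pivot * factor
			vertex.z += direction.y * height_above_pivot * factor
		result.append(vertex)
	return result
```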
Mostly classes helpful for building meshes based on Godot's ArrayMesh. The core of them is organized in a class hierarchy which builds on a common base class `ASegment`, a wrapper around three kinds of arrays: `ArrayMesh.ARRAY_VERTEX`, `ArrayMesh.ARRAY_NORMAL` and `ArrayMesh.ARRAY_TEX_UV`, which can each be obtained by calling `.get_array_vertex`, `.get_array_normal` and `.get_array_tex_uv` respectively (which, in the base implementation, return their underscore-prefixed counterparts (e.g. `._array_vertex`)). The `A` prefix in these classes stands for "Array", as in "ArrayMesh", as opposed to hypothetical `AM` classes, which would stand for "ArrayMesh".
Any class that builds on `ASegment` can override these methods and do what's necessary there for its functionality.
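For orientation, this is roughly how such a segment's arrays would feed into an `ArrayMesh` (a sketch assuming the getter names above; `mesh_lib` may wire this up differently):

```gdscript
# Hedged sketch: build an ArrayMesh surface from a segment's arrays.
func build_mesh_from_segment(segment) -> ArrayMesh:
	var arrays := []
	arrays.resize(Mesh.ARRAY_MAX)
	arrays[Mesh.ARRAY_VERTEX] = segment.get_array_vertex()
	arrays[Mesh.ARRAY_NORMAL] = segment.get_array_normal()
	arrays[Mesh.ARRAY_TEX_UV] = segment.get_array_tex_uv()
	var mesh := ArrayMesh.new()
	mesh.add_surface_from_arrays(Mesh.PRIMITIVE_TRIANGLES, arrays)
	return mesh
```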
`ATransformableSegment` is a particularly notable subclass, as it introduces transformability using `Transform3D`, making it easier to build segments out of other segments. Notably, it keeps an untransformed `ASegment` as a member, `.untransformed`, which it references whenever its values are (re-)calculated.
It also introduces an important method into the class hierarchy, `.apply_all`, which re-calculates the internal state of the object, in this case `._array_vertex`. It may be overridden by subclasses as needed.
This also introduces the concept of explicitly calling `.apply_...` methods after making alterations to segment objects. That makes it possible to alter segment objects without frivolous re-calculations happening before all the changes needed for an appropriate re-calculation are in place.
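In usage terms, the pattern looks roughly like this (only `.apply_all` is taken from the description above; everything else is illustrative):

```gdscript
# Hedged sketch of the explicit-apply pattern.
func alter_segment(segment: ATransformableSegment) -> void:
	# ...make several alterations to `segment` here; nothing gets
	# re-calculated in between...
	segment.apply_all()  # one re-calculation covering all the changes
```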
`AVertexTrackingSegment` introduces `.vertices: Array[Vertex]`, where every `Vertex` object maps to one or multiple `Vector3` entries of the vertex array. They act as delegates of sorts, knowing both an `.untransformed_segment` and a `.transformed_segment`, with the latter intended to hold a reference to the segment they belong to and the former its `.untransformed` segment. They're also explicitly permitted to make direct alterations to `._array_vertex` of both segments.
This enables the convenient referencing and alteration of vertices that overlap at the same position but are, for some higher-level purpose, the same vertex.
Built on that is `AModifiableSegment`, which features a modifier stack. Modifiers are subclasses of `Modifier` (coming with `M`-prefixed names), which implement the `.modify` method to define their behaviour. The method gets called in the `apply_all` pipeline, specifically by `apply_modifiers`, which passes an `IndexChangeTrackingSegmentMutator` to `.modify`, exposing the segment the pipeline is going over as `.segment`, as well as various methods to alter the segment whilst making sure that all the vertex tracking stuff introduced by `AVertexTrackingSegment` is still being kept in order. The `.segment` there corresponds to `.untransformed` as far as an `AModifiableSegment` object is concerned. A good example of what this looks like is `MShearVertices`.
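Structurally, a custom modifier then looks something like the following sketch (only `.modify`, the mutator's `.segment` property and `.vertices` are taken from the description above; the rest is illustrative):

```gdscript
# Hedged structural sketch of a custom modifier; not an actual mesh_lib
# class. The mutator's alteration methods are left out, since their
# names aren't documented here.
class_name MExampleModifier
extends Modifier


func modify(mutator) -> void:
	# `mutator.segment` is the (untransformed) segment the apply_all
	# pipeline is going over; alter it through the mutator's methods so
	# the vertex tracking from AVertexTrackingSegment stays consistent.
	print("vertices to modify: %d" % mutator.segment.vertices.size())
```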
Then come the segment classes for primitives, which are subclasses of `AModifiableSegment`, such as `AQuad` or `ALine`.
Also a subclass of it is `AMultiSegment`, which serves as a base class for segment classes designed to combine segment objects into more complex segments, such as `AHorizontallyFoldedTriangle`.
Also included in `mesh_lib` is `ADebugOverlay`, which can be used to visualize vertices and their values in `ArrayMesh.ARRAY_VERTEX` arrays.
The place for city-specific `mesh_lib`-like/based stuff. Currently contains `ATwoSidedRoof` (an `AMultiSegment`), an attempt at a procedural gabled roof with an ascending/descending ridge (the corresponding "Mess" workspace is (*)`a_side_roof_mess.*`).
A system to stick Node3Ds together by means of `PartConnector` objects. This is done using `PartControl` objects, which are passed a Node3D to act on upon instantiation. The connectors are gleaned from the Node3D's skeleton if available, or from the fallback skeleton the `PartConnector` generates for itself if the Node3D doesn't have one (it looks for a node named "Skeleton3D", which should be there if the object (in Blender) is parented to an armature (assuming gltf/glb export)).
`PartControl` is designed as a base class which is either fully usable out of the box for simple scenarios with no more than two bones, or to be subclassed to support other use cases. Importantly, it provides `default_receiving_connector` and `default_docking_connector`. If the fallback skeleton is used, they reference the same connector.
For subclassing, there are `._init_...` methods which can be overridden without having to re-noodle `._init`.
Notably, connector bones are modeled by `Node3DTailedBone`, which means the system expects each bone to have a counterpart suffixed by `_Tail` by default. The vector between a bone and its tail is to be understood as the direction it is facing, with the tail supposed to be the "away" point, as in the point the vector is pointing towards.
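That convention translates to something like this (a sketch assuming standard `Skeleton3D` APIs; `Node3DTailedBone`'s actual implementation may differ):

```gdscript
# Hedged sketch: derive a connector's facing direction from a bone and
# its _Tail counterpart, following the convention described above.
func get_bone_direction(skeleton: Skeleton3D, bone_name: String) -> Vector3:
	var bone_index := skeleton.find_bone(bone_name)
	var tail_index := skeleton.find_bone(bone_name + "_Tail")
	var bone_origin := skeleton.get_bone_global_pose(bone_index).origin
	var tail_origin := skeleton.get_bone_global_pose(tail_index).origin
	# The tail is the "away" point: the vector points from the bone
	# towards its tail.
	return (tail_origin - bone_origin).normalized()
```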
`BasicBuildingPartControl` is a subclass of `PartControl` from part_lib, and introduces two connectors that could serve as a basis for every city part expected to work with the part system: a bottom left front and a bottom right front connector - based on matching bones, of course.
A subclass of this is `BasicFacadePartControl`, which introduces a top left front connector, of course with the expectation of a matching (tailed) bone on the skeleton.
A skeleton used for a city part with this system should have the following bones:
- `Bottom_Left_Front` and `Bottom_Left_Front_Tail`,
- `Bottom_Right_Front` and `Bottom_Right_Front_Tail`,
- and, if it's a facade, also `Top_Left_Front` and `Top_Left_Front_Tail`.
There's also `PartTransformer`, which is concerned with things like changing the size of a part according to some other metric - the columns in a facade grid, for example. Currently, the function `respan_3_columns` makes use of it.
A very basic, experimental example of this can be looked at in (*)`blender_house_assembly_mess.gd`.
Currently in its early experimental stages and subject to wild changes, it enables the JSON export of meshes from Blender and their runtime import into Godot.
The Blender exporter addon lives in `res://utils/raw_export/blender_addon`, in the sub-dir `raw_export`. The sub-dir `dev` contains the files needed to develop the addon, such as a blend file with a test cube, a coordinate texture for it (which has been copied into `res://assets/parts/textures/`, since the `dev` dir has a `.gdignore` file) and `start.sh`, which creates a `blender_user_config` dir in `dev`, links the addon into it, and then starts Blender with the addon available.
Mind you that the script's current default config links the user's Blender config into it as well, which means activating the addon in Preferences is still required, and doing so will enable it for your Blender install in general. That will probably change, though.
!!! WARNING !!!: `start.sh` has still had little testing, and there were (now fixed) issues where the system user's hotkeys ended up getting overwritten. Considering untested constellations and future changes in Blender that it might not account for in time, it's probably best to take a backup of your Blender settings and hotkeys before using it.
Besides that, `start.sh` is highly configurable and written so that it could be used for Blender addon development in general. Unlike the addon itself, which is licensed as GPL-3.0-or-later, `start.sh` is licensed under the terms of the MIT license.
The (runtime) importer code is currently in `res://global_lib/raw_export`, and `res://main_scenes/raw_export_cube_assembly_mess.*` is the "mess" where the development of raw_export, as well as adjacent developments (such as relevant improvements of mesh_lib), are coming together.
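For a feel of what a runtime import boils down to, here's a minimal sketch; the JSON schema used here (flat `vertices`, `normals` and `uvs` arrays) is an assumption for illustration, not raw_export's actual format:

```gdscript
# Hedged sketch of a runtime JSON mesh import under an assumed schema.
func import_raw_mesh(path: String) -> ArrayMesh:
	var data: Dictionary = JSON.parse_string(FileAccess.get_file_as_string(path))
	var vertices := PackedVector3Array()
	var normals := PackedVector3Array()
	var uvs := PackedVector2Array()
	for i in range(0, data["vertices"].size(), 3):
		vertices.append(Vector3(data["vertices"][i], data["vertices"][i + 1], data["vertices"][i + 2]))
		normals.append(Vector3(data["normals"][i], data["normals"][i + 1], data["normals"][i + 2]))
	for i in range(0, data["uvs"].size(), 2):
		uvs.append(Vector2(data["uvs"][i], data["uvs"][i + 1]))
	var arrays := []
	arrays.resize(Mesh.ARRAY_MAX)
	arrays[Mesh.ARRAY_VERTEX] = vertices
	arrays[Mesh.ARRAY_NORMAL] = normals
	arrays[Mesh.ARRAY_TEX_UV] = uvs
	var mesh := ArrayMesh.new()
	mesh.add_surface_from_arrays(Mesh.PRIMITIVE_TRIANGLES, arrays)
	return mesh
```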
As of commit 310b3e214a1731f09121a142e00e6bc44970f36f, the mess is rendering vertices and UVs as expected. It also looks like the normals are correct, but the cube might not be the best model to tell. Some more work might have to be done to accommodate normals regarding transforms, too.
Also, note that the coordinates on the texture refer to Blender's coordinate system and aren't the same in Godot. What matters here is that an object's side is the same in Blender and Godot - e.g. the right, front or top side are the same in both - and that there is an out-of-the-box, no-further-ado way for that to be the case.
As such, Blender coordinates shouldn't be literally referenced in Godot to associate things with vertices. That should be the realm of the metadata feature, which isn't implemented yet.
Once the model-combination feature is implemented, it'll enable procedural mesh generation based on mesh pieces modeled in Blender. The idea is that, for example, one would be able to model and texture windows, doors, facade decorations and other architectural pieces that belong to facades in Blender, and then, in the game, at runtime, combine those JSON-exported pieces into one single `MeshInstance`, including UV maps, normals and texture mappings.
This leaves room for the idea of running the JSON exporter in a constantly running Blender process, serving as a standing asset generator whilst the game is running.
However, at least as of the time of this writing, citygamesystems isn't particularly committed to relying on that kind of "Blender server" (if you want to call it that), but its development is mindful of the option.
There's currently a (not yet published) experiment called "Noodleplosion" which contains a working proof-of-concept TCP/IP server in Blender. That could be built upon in order to tap into Blender's power, such as geometry nodes, so that a city game could request JSON meshes from a running Blender process according to basic frame meshes and/or curves (e.g. roads, adjacent building frames, etc.).
This architecture could also be useful for other types of generators, e.g. AI-based ones, or entirely different workflows, such as players using Blender to create ingame objects on the fly, or modders getting a much faster feedback loop during the asset creation process.
Basically a camera, focused on an RTS-style view and with development features in mind. It can be found in `res://world_objects/player_world_interface`. For copy-pasting it into another project, mind that it also needs some of the files in `res://global_lib/nilable_types`.
By default, `F4` saves the camera's state to `user://dev.config` and quits. Be aware that currently, the state isn't per scene. So, if you save the state in one scene, it'll be loaded by any `PlayerWorldInterface` that's being instantiated, regardless of scene.
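Saving state like that could look roughly as follows (a sketch using Godot's `ConfigFile`; the section and key names here are made up, not necessarily what `PlayerWorldInterface` uses):

```gdscript
# Hedged sketch of persisting camera state to user://dev.config.
func save_camera_state(camera: Camera3D) -> void:
	var config := ConfigFile.new()
	config.load("user://dev.config")  # keep existing entries, if any
	config.set_value("player_world_interface", "camera_transform", camera.global_transform)
	config.save("user://dev.config")
```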
It's built to work out of the box with sane defaults once copied into a project. Features are configurable in the inspector, and if you'd like to customize the keybindings, just add the relevant action(s) to the input map (it's designed to prioritize your project's action settings unless you check `Override Existing Actions` in the inspector. Caveat: if your project has actions with the same name but without a keybinding, and `PlayerWorldInterface` is not configured to override, the affected actions will currently end up without a keybinding).
It supports floating as well as collision-based "sliding" on a surface, switching between the modes by double-tapping space (by default).
`res://dev/visualization` was built to aid with debugging using visual indicators and "noodle" connections between them, which is useful when debugging node graphs (the transportation network kind, not the Blender kind xD). It features chainable methods, so the indicators can be set up in a (line-broken) "one-liner" wrapped in an `assert` statement. This enables debugging indicators that are only included in debug builds.
Be wary of the clunkiness of the builder-pattern-esque typing of the chainable methods and how it interacts with subclassing, though.
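The pattern relies on `assert` being compiled out of release builds; a sketch of the idea, with a made-up indicator class and chainable method names (not the library's actual API):

```gdscript
func _debug_visualize() -> void:
	# Since assert() is compiled out of release builds, the indicator
	# only ever exists in debug builds.
	assert(
		VisualIndicator.new()
			.at(Vector3(1.0, 0.0, 2.0))
			.with_label("junction 42")
			.added_to(get_tree().current_scene) != null
	)
```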
For quick visual debugging, simple CSG-based indicators constructed in functions have proven to be more wieldy than the Visualization library. Cavedig has become the place where these go, with the name inspired by the term "caveman debugging". ;)
However, in that spirit, it's also the place for any sort of whipped-up indicator serving such a purpose.
It can be found in `res://dev/cavedig`.
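In that caveman spirit, such an indicator can be as simple as this (a sketch, not Cavedig's actual API):

```gdscript
# Hedged sketch of a quick CSG-based debug indicator: drop a colored
# sphere at a global position.
func place_debug_sphere(parent: Node3D, position: Vector3, color := Color.RED) -> CSGSphere3D:
	var sphere := CSGSphere3D.new()
	sphere.radius = 0.1
	var material := StandardMaterial3D.new()
	material.albedo_color = color
	sphere.material = material
	parent.add_child(sphere)
	sphere.global_position = position
	return sphere
```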
Found in `res://dev/PlaneMap`, it's a basic plane with noise displacement, supporting mouse clicks for testing things.
The project files for this can be found here. They're CC0.
The 3D model itself, (*)`facade_b.gltf`, is used in `main_scenes/blender_house_assembly_mess.*`.
This is a first stab at a workflow for building parts (using `part_lib`) that can be used for procedural generation. In this case, it's a whole facade, which would still allow for a more hands-on approach to building design, but also allow for procedurally generated floor plans (according to which the facades would then be put together).
However, it also pays respect to a particular idea of sub-facade parts, where facades (or partial facades) would be organized in a grid and parts would be built with that grid in mind, so they could be combined procedurally to form facades or partial facades.
That is why in the .blend file, you'll see `*Grid*` objects (e.g. `QuarterGrid_Middle`) that are supposed to help with alignment when making facade parts. Facade D is thought to be partitioned as follows: Vertically, it consists of a left (4 quarters), middle (6 quarters) and right portion (4 quarters). Horizontally, it's divided into storeys, where one storey is 4 quarters high. The grid starts 2 quarters below the surface line, in order to accommodate the offset caused by the half-storey height of its foundation.
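Expressed in numbers (with a made-up quarter size, purely for illustration):

```gdscript
# Hedged sketch of the Facade D grid arithmetic described above; the
# quarter size is an assumed value, not taken from the project.
const QUARTER_SIZE := 0.5  # edge length of one grid quarter (assumption)
const FACADE_WIDTH := (4 + 6 + 4) * QUARTER_SIZE  # left + middle + right
const STOREY_HEIGHT := 4 * QUARTER_SIZE
const GRID_START_Y := -2 * QUARTER_SIZE  # half-storey foundation offset
```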
Armature bones serve as hints for `part_lib`-based procedural systems to understand the part's most basic "snap" points (and, beyond the scope of this particular implementation, any snap point really). Thus, Facade_D features the three minimally required bones for `CityPartLib.BasicFacadePartControl` on the skeleton, which is called `Facade_D_Skeleton` in Blender (you can see it when unfolding the `Facade_D` object) and `Skeleton3D` in Godot:
- `Bottom_Left_Front`: Assuming that parts would be assembled from left to right, bottom to top, this point would mostly be used to snap a facade part to the previous part in the assembly.
- `Bottom_Right_Front`: When assembling a storey (left to right), this is where the next facade part would be snapped to (using the `Bottom_Left_Front` bone of that next part), as sketched below.
- `Top_Left_Front`: For the first part in a storey, this is where the first part of the next storey (assuming a bottom-to-top assembly direction) would snap to. It's also the snap-to point for the first part of a roof assembly (or a whole roof part, if it was procedurally generated as a whole mesh).
This would also enable slightly different facade assembly algorithms, e.g. one where facade parts are stacked on top of each other, with the resulting stacks then fit together horizontally.
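A snap of that kind boils down to a translation; a minimal sketch (the function name is made up, and looking up the connector positions is left out):

```gdscript
# Hedged sketch: move `next_part` so that its docking connector (e.g. its
# Bottom_Left_Front point) lands on the receiving connector of the
# previous part (e.g. its Bottom_Right_Front point). Both positions are
# expected in global space.
func snap_part(next_part: Node3D, receiving_position: Vector3, docking_position: Vector3) -> void:
	next_part.global_position += receiving_position - docking_position
```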
Originally this was centered around the official Stable Diffusion model (1.4), which showed promising results even without any fine-tuning. With the advent of the Mitsua model, which is trained on CC0 data and some additional datasets with permission, the process was switched to Mitsua. As the original process, which involved generating a whole facade with one door and 9 windows in one go, didn't work very well with Mitsua (no fine-tuning), the process now involves various masks to inpaint the windows and the door into a (generated) facade.
The base facade image for this was generated using img2img, using a "template" image (see the `facade_d_1.*_grid_fixed.xcf` files). The files contain guides that help in placing the windows and the door, whereas the overall canvas is to be thought of as corresponding to the grid in the Blender file (the `QuarterGrid_Middle` object in Blender when viewed from the front (the `1` (numpad) view if you use the default shortcuts)). However, as the grid in the file is offset by two of its quads on Blender's z-axis, it doesn't correspond precisely here, as the texture doesn't come with a below-ground portion. Also, mind you that the windows and the door here only serve as a basis for Mitsua to generate window- and door-shaped approximations, which then serve as a base for inpainting.
The template idea could be expanded into fully automated facade generation at runtime, although it would either have to be exact enough to map to a 3D facade model, or such a model would have to be generated based on the generated facade texture (basically what's been done by hand here).
The default license for the project is MIT (refer to LICENSE-MIT). The license of the Blender addon raw_export is GPL-3.0-or-later (the LICENSE-GPL file in the repository's root is included for the sole purpose of GitHub picking it up (via licensee), so that it's clear that not everything is licensed under MIT).
Textures (the CC0 also applies to the corresponding PBR maps for each file):
- assets/parts/textures/01392-3252335151-tiled roof doubleroman grovebury interlocked tiles interocked roof tiles tiled roof regent renown rounded tiles medieva.png
- assets/parts/textures/dark_wood_a_1_4_from_mitsua_01488-4241078918-Ebony wood planks wood grain inside of tree dark wood wooden planks wenge dark wood slab anime manga watercolor drawing.png
- assets/parts/textures/facade_d_1024x1024_1_6.png
- assets/parts/textures/roof_facade.png
Misc: