An experiment/tool for dynamically generating Ecto and Absinthe schemas from a Postgres database. It relies heavily on ideas from Postgraphile, and very explicitly on the introspection query that Postgraphile uses to introspect the database.
This isn't on Hex, but if you want to try it out locally, you can clone the repo and use the path option to install it in a Phoenix app like so:
def deps do
  [
    {:pg_gen, path: "/path/to/local/pg_gen"}
  ]
end
You can run it once, via mix tasks, to generate files and call it a day.
# To generate the ecto schema
mix pg_gen.generate_ecto --schema app_public
# To generate the absinthe schema
mix pg_gen.generate_absinthe --schema app_public
You can also use it more like Postgraphile.
defp deps do
  [
    # add to your deps:
    {:pg_gen, path: "../path_to_pg_gen"},
    {:absinthe, "~> 1.6"},
    {:absinthe_plug, "~> 1.5"},
    {:absinthe_phoenix, "~> 2.0"},
    {:dataloader, "~> 1.0.0"},
    {:base62_uuid, "~> 2.0.0", github: "adampash/base62_uuid"},
    {:absinthe_error_payload, "~> 1.1"}
  ]
end
Then run:
mix deps.get
In this example, I'm using a forked UUID62 lib with an alphabet order that matches a corresponding Node library so they can interop.
This is relatively normal Phoenix+Ecto, with the exception that you configure it twice: once with "authenticator" credentials and once with your root credentials. Set up the db credentials, db name, etc.
The way this currently works is you configure two repos, like so:
config :example, Example.Repo,
  username: "postgres",
  password: "postgres",
  database: "example",
  hostname: "localhost",
  show_sensitive_data_on_connection_error: true,
  pool_size: 10

# Configure your database
config :example, Example.AuthenticatorRepo,
  username: "postgres_authenticator",
  password: "password",
  database: "example",
  hostname: "localhost",
  show_sensitive_data_on_connection_error: true,
  pool_size: 10
The AuthenticatorRepo is what PgGen uses to run the introspection query, which generates all the code for the GraphQL server (from Ecto to Absinthe). It assumes it's a db role which has the default permissions for RLS. If a user doesn't have permissions to read a table, that table won't be in your public-facing schema.
(If you weren't planning to use RLS, you could probably just duplicate the config between the two. It would be sensible to make this configurable.)
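For what an RLS-oriented role setup might look like, here's a purely hypothetical sketch written as an Ecto migration; the role names echo the config above and the as_user example below, but the grants are assumptions you'd adapt to your own schema:

defmodule Example.Repo.Migrations.SetUpAuthenticatorRole do
  use Ecto.Migration

  # Hypothetical sketch: tables the introspecting role can't SELECT from
  # simply won't show up in the generated schema.
  def up do
    execute("CREATE ROLE postgres_authenticator LOGIN PASSWORD 'password'")
    execute("CREATE ROLE authenticated_user_role")
    execute("GRANT authenticated_user_role TO postgres_authenticator")
    execute("GRANT USAGE ON SCHEMA app_public TO authenticated_user_role")
    execute("GRANT SELECT ON ALL TABLES IN SCHEMA app_public TO authenticated_user_role")
  end

  def down do
    execute("REVOKE ALL ON ALL TABLES IN SCHEMA app_public FROM authenticated_user_role")
    execute("REVOKE USAGE ON SCHEMA app_public FROM authenticated_user_role")
    execute("DROP ROLE authenticated_user_role")
    execute("DROP ROLE postgres_authenticator")
  end
end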
If you are using the AuthenticatorRepo, you'll want to add a very barebones module like:
defmodule Example.AuthenticatorRepo do
  use Ecto.Repo,
    otp_app: :example,
    adapter: Ecto.Adapters.Postgres
end
To the children in your application supervisor, add your elevated repo (if you're using one) and PgGen (your default repo should already be here):
children = [
  # ...others...
  # Start the Elevated Ecto repository
  Example.AuthenticatorRepo,
  # Start PgGen
  {PgGen.Supervisor, schema: "app_public", output_path: "../../aboard/data/elixir-schema.graphql"}
]
You'll notice that PgGen takes the schema you want to expose (it defaults to public if you don't set one) and a path to write a generated GraphQL schema file to. This is handy for dx tooling.
If you're coming from Postgraphile, you're probably using RLS (and already did the AuthenticatorRepo bit above). In order to execute queries using RLS with Ecto, you need to wrap your DB queries in transactions that set the user role. My code that does this is in Example.Repo and looks like:
alias Ecto.Adapters.SQL

@doc """
A function to wrap Ecto queries in transactions to work with Postgres
row-level security. Requires a session_id
"""
def as_user(session_id, txn)
    when is_binary(session_id) and is_function(txn) do
  transaction(fn ->
    SQL.query(
      Example.Repo,
      """
      SELECT
        set_config('role', 'authenticated_user_role', true),
        set_config('jwt.claims.session_id', $1, true)
      """,
      [session_id]
    )

    txn.()
  end)
end
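For what it's worth, the same wrapper works for any Ecto call, not just GraphQL requests. For example (Workflow here is a hypothetical Ecto schema module):

{:ok, workflows} = Example.Repo.as_user(session_id, fn -> Example.Repo.all(Workflow) end)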
Then to wrap your GraphQL requests in these transactions, you can use this very simple wrapper plug:
defmodule AbsintheAsUserPlug do
  require Logger

  def init(options), do: Absinthe.Plug.init(options)

  def call(%{assigns: %{session_id: session_id}} = conn, opts) do
    Logger.debug("Running absinthe in transaction")

    {:ok, conn} =
      Example.Repo.as_user(session_id, fn ->
        Absinthe.Plug.call(conn, opts)
      end)

    Logger.debug("Finished running transaction")
    conn
  end

  # No session_id in assigns: run the request without the RLS transaction.
  def call(conn, opts) do
    Absinthe.Plug.call(conn, opts)
  end
end
I'm not going to go into auth here, but the session id would, of course, be necessary for this whole thing to work.
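For the first clause above to match, something earlier in your pipeline has to put the session_id into conn.assigns. A purely hypothetical sketch (the plug name and the session lookup are assumptions, not part of this repo):

defmodule ExampleWeb.FetchSessionIdPlug do
  # Hypothetical: however you authenticate (session cookie, JWT, etc.), the
  # result needs to land in conn.assigns.session_id. This version assumes
  # Plug.Session/fetch_session has already run in the pipeline.
  import Plug.Conn

  def init(opts), do: opts

  def call(conn, _opts) do
    case get_session(conn, :session_id) do
      nil -> conn
      session_id -> assign(conn, :session_id, session_id)
    end
  end
end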
In router.ex:
scope "/" do
pipe_through(:api)
# forward("/graphiql", GraphiqlAsUserPlug, schema: ExampleWeb.Schema.Types)
# forward("/graphiql", Absinthe.Plug.GraphiQL,
forward("/graphiql", GraphiqlAsUserPlug,
schema: ExampleWeb.Schema,
interface: :simple,
socket: ExampleWeb.UserSocket
)
# forward("/graphql", Absinthe.Plug, schema: ExampleWeb.Schema)
forward("/graphql", AbsintheAsUserPlug, schema: ExampleWeb.Schema)
end
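The router also forwards /graphiql to GraphiqlAsUserPlug, which isn't shown above. A minimal sketch, assuming it simply mirrors AbsintheAsUserPlug but delegates to Absinthe.Plug.GraphiQL:

defmodule GraphiqlAsUserPlug do
  # Same idea as AbsintheAsUserPlug, but wraps the GraphiQL endpoint so the
  # in-browser IDE also runs its queries inside the RLS transaction.
  def init(options), do: Absinthe.Plug.GraphiQL.init(options)

  def call(%{assigns: %{session_id: session_id}} = conn, opts) do
    {:ok, conn} =
      Example.Repo.as_user(session_id, fn ->
        Absinthe.Plug.GraphiQL.call(conn, opts)
      end)

    conn
  end

  # No session_id: serve GraphiQL without the RLS transaction.
  def call(conn, opts), do: Absinthe.Plug.GraphiQL.call(conn, opts)
end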
In endpoint.ex, to set up subscriptions:
socket "/websocket", ExampleWeb.UserSocket,
websocket: true,
longpoll: false,
check_origin: ["http://localhost:5678"]
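ExampleWeb.UserSocket needs to pull in Absinthe.Phoenix.Socket so subscription data can flow over that socket. A minimal sketch (the connect/id behavior here is an assumption; plug in your own auth):

defmodule ExampleWeb.UserSocket do
  use Phoenix.Socket
  # Wires Absinthe subscriptions into this socket.
  use Absinthe.Phoenix.Socket, schema: ExampleWeb.Schema

  @impl true
  def connect(_params, socket, _connect_info), do: {:ok, socket}

  @impl true
  def id(_socket), do: nil
end

Per the absinthe_phoenix docs, the endpoint also needs use Absinthe.Phoenix.Endpoint, and {Absinthe.Subscription, ExampleWeb.Endpoint} should be started in your supervision tree after the endpoint.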
The code is not always pretty. I've been spelunking a lot, but it's been pretty interesting and fun. Please feel free to play around, submit PRs, etc.
If you're using row-level security to secure data access, you have to disable parallel preloads in Ecto. Leaving them on means certain queries will run outside the transaction that enforces RLS.
defmodule Example.Repo do
  use Ecto.Repo,
    otp_app: :example,
    adapter: Ecto.Adapters.Postgres,
    # setting in_parallel to false keeps all queries inside the same
    # transaction, which is required for RLS. From the ecto docs:
    #
    # :in_parallel - If the preloads must be done in parallel. It can only be
    # performed when we have more than one preload and the repository is not in
    # a transaction. Defaults to true. Setting to false ensures subqueries and
    # dataloader preloads stay inside the same transaction.
    # https://hexdocs.pm/ecto/Ecto.Repo.html#c:preload/3-options
    in_parallel: false

  alias Ecto.Adapters.SQL

  @doc """
  A function to wrap Ecto queries in transactions to work with Postgres
  row-level security. Requires a session_id
  """
  def as_user(session_id, txn)
      when is_binary(session_id) and is_function(txn) do
    transaction(fn ->
      SQL.query(
        Example.Repo,
        """
        SELECT
          set_config('role', 'authenticated_user', true),
          set_config('jwt.claims.session_id', $1, true)
        """,
        [session_id]
      )

      txn.()
    end)
  end

  # ...
end
Things that aren't supported yet, or ideas for future work:

- Mutations for the Absinthe schema. It currently only supports reads.
- Cursor-based pagination.
- Not Relay support, but support for the nodes structure Postgraphile supports. E.g.:
  query {
    workflows {
      pageInfo { startCursor endCursor }
      nodes {
        id
        name
        # ...etc.
      }
    }
  }
- Support for exposing Postgres functions.
- Options for how codegen works. E.g., optional resolve methods. Right now I'm just using dataloader, which helps batch db queries and avoids N+1 (see the sketch after this list).
- Overrides/customization. E.g., it'd be nice to be able to write some config or override code that allows you to provide your own resolver for a specific field, etc. Similar to the above.
- A version that doesn't generate/write code to files, but instead mounts an engine that updates dynamically (and could hot reload in dev mode). José Valim's answer to this StackOverflow question lays out how this could work. It doesn't seem that difficult tbh, and would allow you to use this similarly to how Postgraphile works.
- Subscriptions. I haven't toyed with this idea yet, but it seems like you could generate subscriptions if you wanted. Also maybe not worth doing programmatically, since they're more use-case based.
- RLS. I actually have this working in an example app by wrapping the Absinthe plug in an Ecto transaction, but it's not currently in the repo.
- RLS eject. It seems interesting and somewhat worthwhile to bring the RLS checks into generated app code using Ecto fragments.
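Since the generated resolvers lean on dataloader, here's a rough, hand-written sketch of that pattern. The module, object, and field names (ExampleWeb.Schema, :workflow, :step, Example.Workflow) are assumptions for illustration, not the literal generated output:

defmodule ExampleWeb.Schema do
  use Absinthe.Schema
  import Absinthe.Resolution.Helpers, only: [dataloader: 1]

  # A single Dataloader source backed by Ecto; Absinthe batches association
  # lookups through it instead of issuing one query per parent record.
  def context(ctx) do
    loader =
      Dataloader.new()
      |> Dataloader.add_source(Example.Repo, Dataloader.Ecto.new(Example.Repo))

    Map.put(ctx, :loader, loader)
  end

  def plugins do
    [Absinthe.Middleware.Dataloader | Absinthe.Plugin.defaults()]
  end

  object :step do
    field :id, :id
    field :name, :string
  end

  object :workflow do
    field :id, :id
    field :name, :string
    # Association resolved through the dataloader source, so fetching steps
    # for many workflows becomes one batched query instead of N+1.
    field :steps, list_of(:step), resolve: dataloader(Example.Repo)
  end

  query do
    field :workflows, list_of(:workflow) do
      # Example.Workflow is a hypothetical Ecto schema module.
      resolve(fn _, _, _ -> {:ok, Example.Repo.all(Example.Workflow)} end)
    end
  end
end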
Most of my testing is against the existing Aboard database.
mix test.watch
If available in Hex (again, it's not), the package can be installed by adding pg_gen to your list of dependencies in mix.exs:
def deps do
  [
    {:pg_gen, "~> 0.1.0"}
  ]
end
Documentation can be generated with ExDoc and published on HexDocs. Once published, the docs can be found at https://hexdocs.pm/absinthe_gen.