/sql-converter

Build SQL queries and export extracts in a dedicated format (Python dict, CSV, JSON). Mirrored from https://gitlab.com/p2m3ng/sql-converter

SQL Converter

Build SQL queries.
Export SQL extracts in a dedicated format.

Presentation

This tool can be used by developers to dump SQL data to a Python dict or JSON and export it to CSV or JSON files. Dumping databases to a readable format and exporting fixtures becomes easy.

Installation

Clone repository:

$ git clone git@gitlab.com:p2m3ng/sql-converter.git

Or install with pip:

$ pip install git+ssh://git@gitlab.com/p2m3ng/sql-converter

Configuration

If you use sql-converter as a package, you will need to create the configuration file directly in your virtualenv or use environment variables to create it.

The setup file has to be created at: sql_converter/settings/files/config.yaml.
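The layout of config.yaml is not shown in this README; the sketch below is only a hypothetical illustration, assuming the keys mirror the SQL_EXPORT_DB_* environment variables listed in the next section:

# Hypothetical sketch of sql_converter/settings/files/config.yaml
# (structure assumed, not taken from the project documentation)
database:
  host: "localhost"
  port: 1234
  user: "root"
  name: "my_database"
  password: "password"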

Environment Variables

SQL_EXPORT_DB_HOST="localhost"
SQL_EXPORT_DB_PORT=1234
SQL_EXPORT_DB_USER="root"
SQL_EXPORT_DB_NAME="my_database"
SQL_EXPORT_DB_PASSWORD="password"
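On a Unix-like shell, these variables can be exported before running sql-converter, for example:

$ export SQL_EXPORT_DB_HOST="localhost"
$ export SQL_EXPORT_DB_PORT=1234
$ export SQL_EXPORT_DB_USER="root"
$ export SQL_EXPORT_DB_NAME="my_database"
$ export SQL_EXPORT_DB_PASSWORD="password"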

Usage

SQL Query Builder

from sql_converter.query import Query, Table

First, declare Table objects, then build a Query.

name is the table name.
fields are the requested fields. If the parameter is not set, the query falls back to a SELECT *.
alias is a custom alias for the table. The system generates a default value, which can be overridden by setting the parameter.

authors = Table(
    name="author",
    fields=["id", "name", "first_name", "nationality"],
    alias="aut"
)
books = Table(
    name="books",
    fields=["id", "author_id", "title", "isbn"],
    alias="boo",
)

Query objects can be built by chaining method calls.

query = Query(prettify=False) \
    .add(table=authors) \
    .add(table=books) \
    .join(table1=authors, field1="id", table2=books, field2="author_id", type="") \
    .order_by(table=authors, field="-id") \
    .limit(number=5) \

query.build()

By default, table and field names are protected with backticks:

SELECT `aut`.`id`, `aut`.`name`, `aut`.`first_name`, `aut`.`nationality`, `boo`.`id`, `boo`.`author_id`, `boo`.`title`, 
`boo`.`isbn` 
FROM `author` AS `aut` 
    INNER JOIN `books` AS `boo` 
        ON `boo`.`author_id` = `aut`.`id` 
ORDER BY `aut`.`id` DESC 
LIMIT 5;

With prettify=True, the query is printed in a more readable format:

SELECT aut.id, aut.name, aut.first_name, aut.nationality, boo.id, boo.author_id, boo.title, boo.isbn 
FROM author AS aut 
    INNER JOIN books AS boo 
        ON boo.author_id = aut.id 
ORDER BY aut.id DESC 
LIMIT 5;

SQL Export

Parameters

from sql_converter.convert import SQLConvert

export = SQLConvert(
    query=query.build(),
    headers=query.headers,
)

query can be a raw SQL query or one generated by the Query builder.

headers is a list of field names. They can come from a Table declaration (via query.headers) or be provided directly as a list of strings.

export_to infers the file format from the extension (.csv or .json); if None, the data is returned as a list of dictionaries. Default: None

export.make(pprint=True, json=False)

pprint prints the chosen output to the console. Default: False

json forces the output to a usable JSON string. Default: False
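For instance, leaving export_to unset keeps everything in memory. A minimal sketch, assuming the returned dictionaries are keyed by the supplied headers:

from sql_converter.convert import SQLConvert

# export_to defaults to None, so make() returns the rows
# as a list of dictionaries instead of writing a file
export = SQLConvert(
    query=query.build(),
    headers=query.headers,
)
rows = export.make()              # list of dictionaries
as_json = export.make(json=True)  # same data as a JSON string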

With Query Builder

export = SQLConvert(
    query=query.build(),
    headers=query.headers,
    export_to=f"authors.csv",
)

data = export.make(pprint=True)

With Raw SQL

Headers are mandatory.

from sql_converter.convert import SQLConvert

query = """SELECT aut.id, aut.name, aut.first_name, aut.nationality, boo.id, boo.author_id, boo.title, boo.isbn 
FROM author AS aut 
    INNER JOIN books AS boo 
        ON boo.author_id = aut.id 
ORDER BY aut.id DESC 
LIMIT 5;"""

headers = ["id", "name", "first_name", "nationality", "id", "author_id", "title", "isbn"]

export = SQLConvert(
    query=query,
    headers=headers,
    export_to=f"authors.json",
)

data = export.make()
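Assuming make() still returns the rows as dictionaries when export_to is set (as the data = ... assignments above suggest), they can be consumed directly:

# Iterate over the returned rows (keys assumed to match the headers list)
for row in data:
    print(row["title"], row["isbn"])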