- Utilities to load, save, and zip files
- File Stream Writer
- File Stream Reader
- Implement ETL processes for data processing and business intelligence
Extract-Transform-Load (ETL) is a data integration process involving the extraction of data from various sources, transformation into a suitable format, and loading into a target database or data warehouse.
- Extracting data from various sources.
- Transforming the data into a suitable format/structure.
- Loading the transformed data into a target database or data warehouse.
- core-go/io is designed for batch processing, enabling the development of complex batch applications. It supports reading, processing, and writing large volumes of data.
- core-go/io is not an ETL tool itself; it provides the libraries needed to implement ETL processes, allowing developers to create jobs that extract data from sources, transform it, and load it into destinations.
Use Cases of core-go/io in ETL:
- Data Migration: Moving and transforming data from legacy systems to new systems.
- Data Processing: Handling large-scale data processing tasks such as data cleansing and transformation.
- Data Warehousing: Loading and transforming data into data warehouses.
- Business Intelligence: Transforming raw data into meaningful insights and trends for decision-making.
Specific Use Cases of core-go/io
- go-sql-export: export data from sql to fix-length or csv file.
- go-hive-export: export data from hive to fix-length or csv file.
- go-cassandra-export: export data from cassandra to fix-length or csv file.
- go-mongo-export: export data from mongo to fix-length or csv file.
- go-firestore-export: export data from firestore to fix-length or csv file.
- go-sql-import: import data from fix-length or csv file to sql.
- go-hive-import: import data from fix-length or csv file to hive.
- go-cassandra-import: import data from fix-length or csv file to cassandra.
- go-elasticsearch-import: import data from fix-length or csv file to elasticsearch.
- go-mongo-import: import data from fix-length or csv file to mongo.
- go-firestore-import: import data from fix-length or csv file to firestore.
- Popular for web development
- Suitable for the import flow
- Import flow components: Reader, Validator, Transformer, Writer
Reader Adapter Sample: File Reader. We provide 2 file reader adapters:
- Delimiter (CSV format) File Reader
- Fix Length File Reader
- Validator Adapter Sample: Schema Validator
- We provide a schema validator based on Go struct tags
We provide 2 transformer adapters:
- Delimiter Transformer (CSV)
- Fix Length Transformer
We provide many writer adapters:
SQL:
- SQL Writer: to insert or update data
- SQL Inserter: to insert data
- SQL Updater: to update data
- SQL Stream Writer: to insert or update data. Writes are buffered and only sent to the database when flush is called.
- SQL Stream Inserter: to insert data. Writes are buffered and only sent when flush is called; on flush, a single multi-row SQL statement is built to improve performance.
- SQL Stream Updater: to update data. Writes are buffered and only sent when flush is called.
Mongo:
- Mongo Writer: to insert or update data
- Mongo Inserter: to insert data
- Mongo Updater: to update data
- Mongo Stream Writer: to insert or update data. Writes are buffered and only sent to the database when flush is called.
- Mongo Stream Inserter: to insert data. Writes are buffered and only sent when flush is called.
- Mongo Stream Updater: to update data. Writes are buffered and only sent when flush is called.
Elastic Search:
- Elastic Search Writer: to insert or update data
- Elastic Search Creator: to create data
- Elastic Search Updater: to update data
- Elastic Search Stream Writer: to insert or update data. Writes are buffered and only sent when flush is called.
- Elastic Search Stream Creator: to create data. Writes are buffered and only sent when flush is called.
- Elastic Search Stream Updater: to update data. Writes are buffered and only sent when flush is called.
Firestore:
- Firestore Writer: to insert or update data
- Firestore Updater: to update data
Cassandra:
- Cassandra Writer: to insert or update data
- Cassandra Inserter: to insert data
- Cassandra Updater: to update data
Hive:
- Hive Writer: to insert or update data
- Hive Inserter: to insert data
- Hive Updater: to update data
- Hive Stream Updater: to update data. Writes are buffered and only sent when flush is called.
- File Stream Reader
- Delimiter (CSV format) File Reader
- Fix Length File Reader
- File Stream Writer
- Transform an object to Delimiter (CSV) format
- Transform an object to Fix Length format
- onecore: Standard interfaces for TypeScript to export data.
- io-one: File Stream Writer, to export data to CSV or fix-length files by stream.
- Postgres: pg-exporter to wrap pg, pg-query-stream, pg-promise.
- Oracle: oracle-core to wrap oracledb.
- MySQL: mysql2-core to wrap mysql2.
- MS SQL: mssql-core to wrap mssql.
- SQLite: sqlite3-core to wrap sqlite3.
- oracle-export-sample: export data from Oracle to fix-length or csv file.
- postgres-export-sample: export data from Postgres to fix-length or csv file.
- mysql-export-sample: export data from MySQL to fix-length or csv file.
- mssql-export-sample: export data from MS SQL to fix-length or csv file.
- onecore: Standard interfaces for TypeScript to export data.
- io-one: File Stream Reader, to read CSV or fix-length files by stream.
- xvalidators: Validate data
- import-service: Implement import flow
- query-core: Simple writer to insert, update, delete, insert batch for Postgres, MySQL, MS SQL
- Oracle: oracle-core to wrap oracledb, to build insert or update SQL statement, insert batch for Oracle.
- MySQL: mysql2-core to wrap mysql2, to build insert or update SQL statements.
- MS SQL: mssql-core to wrap mssql, to build insert or update SQL statement.
- SQLite: sqlite3-core to wrap sqlite3, to build insert or update SQL statement.
- Mongo: mongodb-extension to wrap mongodb, to insert, update, upsert, insert batch, update batch, upsert batch.
- import-sample: nodejs sample to import data from fix-length or csv file to sql (Oracle, Postgres, MySQL, MS SQL, SQLite)
Please make sure to initialize a Go module before installing core-go/io:
```shell
go get -u github.com/core-go/io
```
Import:
```go
import "github.com/core-go/io"
```