This project provides a utility for efficiently copying large amounts of data into a PostgreSQL database using PostgreSQL's bulk `COPY` feature. It is designed for high-volume data imports, making it well suited to scenarios such as data migration, data warehousing, and ETL (Extract, Transform, Load) processes.
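The bulk copy feature mentioned above is PostgreSQL's `COPY` command, which psycopg2 exposes through `cursor.copy_expert`. A minimal sketch of how such a loader might look (the function and variable names here are illustrative, not this project's actual code):

```python
def build_copy_sql(table_name, table_columns):
    """Build a COPY ... FROM STDIN statement for a headerless CSV file."""
    return "COPY {} ({}) FROM STDIN WITH (FORMAT csv)".format(
        table_name, ", ".join(table_columns)
    )

def bulk_copy(conn, file_name, table_name, table_columns):
    """Stream the CSV at file_name into table_name in a single COPY round trip."""
    sql = build_copy_sql(table_name, table_columns)
    with conn.cursor() as cur, open(file_name, "r") as f:
        cur.copy_expert(sql, f)  # psycopg2 streams the file to the server
    conn.commit()
```

`COPY` sends the whole file in one streaming operation, which is typically far faster than row-by-row `INSERT` statements.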
- Python 3.6+:

  ```shell
  brew install python3
  ```

- pip3 (installed alongside Python 3 by Homebrew)
- psycopg2 library:

  ```shell
  brew install psycopg2
  ```

  If it fails with `ssl not found`, run:

  ```shell
  env LDFLAGS="-I/opt/homebrew/opt/openssl/include -L/opt/homebrew/opt/openssl/lib" pip --no-cache install psycopg2
  ```

- If installation fails for another reason, install the dependencies from the requirements file:

  ```shell
  pip install -r requirements.txt
  ```
Update the connection details:

```python
conn = psycopg2.connect(
    host='<DB_HOST>',
    port=<DB_PORT>,
    user='<DB_USER>',
    password='<DB_PASSWORD>',
    database='<DB_NAME>'
)
```
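Hard-coding credentials in the script is convenient but risky. One alternative (a sketch, not part of this project) is to read them from environment variables, falling back to local defaults:

```python
import os

def connection_params():
    """Collect psycopg2 connection keyword arguments from the environment."""
    return {
        "host": os.environ.get("DB_HOST", "localhost"),
        "port": int(os.environ.get("DB_PORT", "5432")),
        "user": os.environ.get("DB_USER", "postgres"),
        "password": os.environ.get("DB_PASSWORD", ""),
        "database": os.environ.get("DB_NAME", "postgres"),
    }

# Usage: conn = psycopg2.connect(**connection_params())
```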
Update the following values:

- `file_name`: name of the CSV file, which should contain all the table columns and no CSV header
- `table_name`: table name
- `table_columns`: all the column names from the table to be copied
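Because the CSV must match the table's columns exactly and carry no header row, a quick pre-flight check can catch malformed rows before the copy starts. A standard-library sketch (the function name is hypothetical):

```python
import csv

def find_bad_rows(file_name, table_columns):
    """Return 1-based line numbers whose field count differs from len(table_columns)."""
    expected = len(table_columns)
    bad = []
    with open(file_name, newline="") as f:
        for lineno, row in enumerate(csv.reader(f), start=1):
            if len(row) != expected:
                bad.append(lineno)
    return bad
```

An empty result means every row has the right number of fields; any returned line numbers point at rows that would make `COPY` abort.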
Run the script:

```shell
arch -arm64 python3 bulk-copy.py
```
We would like to express our sincere gratitude to the creator of the original JavaScript version of this project:
This project is licensed under AGPL-3.0.