Apify act for inserting crawler results into a remote PostgreSQL table.
This act fetches all results from a specified Apifier crawler execution and inserts them into a table in a remote PostgreSQL database.
The act does not store its state; if it crashes, it restarts and fetches all the results again. Therefore, you should only use it for executions with a low number of results.
INPUT
Input is a JSON object with the following properties:
{
    // crawler execution ID
    "_id": "your_execution_id",

    // PostgreSQL connection credentials
    "data": {
        "connection": {
            "host": "host_name",
            "port": "port_number",
            "user": "user_name",
            "password": "user_password",
            "database": "database_name"
        },
        "table": "table_name"
    }
}
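To illustrate how this input drives the act, below is a minimal sketch in TypeScript using the node-postgres (pg) client. The results endpoint URL and the direct one-result-per-row mapping are assumptions for illustration, not the act's actual implementation:

import { Client } from 'pg';

// Hypothetical input matching the structure above.
const input = {
    "_id": "your_execution_id",
    "data": {
        "connection": { "host": "host_name", "port": 5432, "user": "user_name", "password": "user_password", "database": "database_name" },
        "table": "table_name"
    }
};

async function main() {
    // Fetch all results of the crawler execution as a JSON array
    // (assumed legacy Crawler API endpoint; requires Node 18+ global fetch).
    const res = await fetch(`https://api.apify.com/v1/execs/${input._id}/results?format=json`);
    const rows: Record<string, unknown>[] = await res.json();

    // Insert every result row into the target table.
    const client = new Client(input.data.connection);
    await client.connect();
    try {
        for (const row of rows) {
            const columns = Object.keys(row);
            const placeholders = columns.map((_, i) => `$${i + 1}`).join(', ');
            await client.query(
                `INSERT INTO ${input.data.table} (${columns.join(', ')}) VALUES (${placeholders})`,
                columns.map((c) => row[c])
            );
        }
    } finally {
        await client.end();
    }
}

main();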
The act can also be run from a crawler finish webhook; in that case, put just the contents of the data attribute into the crawler finish webhook data field.
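For example, the crawler finish webhook data would then contain just:

{
    "connection": {
        "host": "host_name",
        "port": "port_number",
        "user": "user_name",
        "password": "user_password",
        "database": "database_name"
    },
    "table": "table_name"
}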
Instead of crawler results, it is also possible to specify a dataset ID to fetch the rows from a dataset:
{
    // ID of the dataset to fetch rows from
    "datasetId": "dataset_id",

    // PostgreSQL connection credentials
    "data": "connection_credentials"
}
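Only the fetch step differs in this case. A sketch, assuming the Apify API v2 dataset items endpoint:

// Fetch all items of the dataset as a JSON array (Apify API v2 items endpoint).
async function fetchDatasetRows(datasetId: string): Promise<Record<string, unknown>[]> {
    const res = await fetch(`https://api.apify.com/v2/datasets/${datasetId}/items?format=json`);
    return await res.json();
}
// The rows are then inserted exactly as in the execution sketch above.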
Alternatively, you can specify the rows to be inserted directly (i.e. without fetching them from a crawler execution or dataset):
{
    // rows to be inserted
    "rows": [
        {"column_1": "value_1", "column_2": "value_2"},
        {"column_1": "value_3", "column_2": "value_4"},
        ...
    ],

    // PostgreSQL connection credentials
    "data": "connection_credentials"
}
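As a sketch of the assumed mapping: each row object's keys become column names and its values become the inserted values, so the first example row would translate to a parameterized statement like this (an illustration, not taken from the act's source):

const row = { "column_1": "value_1", "column_2": "value_2" };
const columns = Object.keys(row);                               // ['column_1', 'column_2']
// Equivalent to: INSERT INTO table_name (column_1, column_2) VALUES ($1, $2)
const text = `INSERT INTO table_name (${columns.join(', ')}) VALUES ($1, $2)`;
const values = columns.map((c) => row[c as keyof typeof row]);  // ['value_1', 'value_2']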