This project contains a Jenkinsfile for an Extract, Transform, Load (ETL) pipeline, designed to streamline data processing workflows.
The pipeline is structured to perform the following steps:
- Environment Setup: Specifies the agent and target environment for the data processing tasks.
- Extraction: Extracts data from multiple sources, using parallel stages so different sources are read concurrently.
- Transformation: Applies data transformation operations, such as cleaning, filtering, and aggregating the extracted data.
- Loading: Loads the transformed data into the target destination, such as a database or data warehouse.
- Input Check: Prompts the user to confirm that the data has been loaded correctly before the pipeline continues.
- Ambient Check: Reports the environment in which the pipeline is running, which is useful when the same pipeline is deployed to development, testing, or production.
- Post-Execution Cleanup: Ensures workspace cleanup after the pipeline execution, maintaining a clean environment for subsequent runs.
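The stages above can be sketched as a declarative Jenkinsfile. This is a minimal illustration only: the stage names, echoed commands, and the `TARGET_ENV` variable are assumptions, not the project's actual pipeline code.

```groovy
// Illustrative sketch -- stage names, commands, and TARGET_ENV are assumptions.
pipeline {
    agent any

    environment {
        TARGET_ENV = 'development'   // assumed variable name for the target environment
    }

    stages {
        stage('Extract') {
            // Parallel stages pull from different sources concurrently
            parallel {
                stage('Extract Source A') {
                    steps { echo "Extracting from source A" }
                }
                stage('Extract Source B') {
                    steps { echo "Extracting from source B" }
                }
            }
        }
        stage('Transform') {
            steps { echo 'Cleaning, filtering, and aggregating the extracted data' }
        }
        stage('Load') {
            steps { echo 'Loading transformed data into the target destination' }
        }
        stage('Input Check') {
            // Pauses the build until a user confirms the load
            steps { input message: 'Was the data loaded correctly?' }
        }
        stage('Ambient Check') {
            steps { echo "Pipeline is running in ${env.TARGET_ENV}" }
        }
    }

    post {
        always {
            cleanWs()   // workspace cleanup; requires the Workspace Cleanup plugin
        }
    }
}
```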
To use this pipeline:
- Configure the Jenkins job to point to this repository.
- Customize the pipeline parameters and options as per your requirements.
- Run the pipeline to execute the ETL tasks.
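One hedged example of how the pipeline could be parameterized: the parameter names below (`TARGET_ENV`, `SKIP_INPUT_CHECK`) are hypothetical and should be replaced with whatever your Jenkinsfile actually defines.

```groovy
// Hypothetical parameters block; names and choices are assumptions.
parameters {
    choice(name: 'TARGET_ENV',
           choices: ['development', 'testing', 'production'],
           description: 'Environment to run the ETL tasks against')
    booleanParam(name: 'SKIP_INPUT_CHECK',
                 defaultValue: false,
                 description: 'Skip the manual verification prompt after loading')
}
```

Parameters defined this way appear in the "Build with Parameters" form of the Jenkins job.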
Prerequisites:
- Jenkins server
- Jenkins Pipeline plugin