A REST API designed to convert Linked Open Data (LOD) from public services, formatted in RDF (Resource Description Framework) using the Core Public Service Vocabulary Application Profile (CPSV-AP), into JSON format. This API simplifies the integration of LOD into modern applications, making public service data more accessible for developers.
- Overview
- Features
- Tech Stack
- Setup and Installation
- API Endpoints
- Configuration
- Extending the API
- License
## Overview

Linked Open Data (LOD) has become crucial for promoting transparency and interoperability in public services. However, adoption of RDF-based LOD is hampered by its complexity. The CPSV-API addresses this by providing a user-friendly API that transforms RDF into JSON, making it easier to integrate public service data into modern software applications.
## Features

- SPARQL Querying: Retrieve public service data using SPARQL.
- JSON Output: Converts RDF data to JSON, making it easy to work with in most modern applications.
- RESTful Endpoints: Access public service data, organizations, legal resources, and more through REST API endpoints.
- Docker Integration: Deploy the API quickly and reliably using Docker.
- Extensible Architecture: Easily add support for new data sources by extending the provider interface.
## Tech Stack

- Python: Core language of the project.
- FastAPI: High-performance web framework for the API.
- RDFLib: Library for working with RDF data.
- SPARQLWrapper: Interface for querying SPARQL endpoints.
- YAML: Configuration files for flexibility.
- Docker: For containerized deployment and consistent environment setup.
## Setup and Installation

Prerequisites:

- Python 3.8+
- Docker (optional but recommended)

To install and run locally:

- Clone the repository:

  ```bash
  git clone https://github.com/Matzi24GR/Services-Api
  cd Services-Api
  ```

- Install the required dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Edit `config.yaml` to specify your data sources and SPARQL endpoints.

- Run the API:

  ```bash
  uvicorn app.main:app --reload
  ```

To deploy with Docker instead:

- Build the Docker image:

  ```bash
  docker build -t cpsv-api .
  ```

- Start the API using Docker Compose:

  ```bash
  docker-compose up
  ```
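Once the API is running, a quick way to confirm that everything is wired up is to call one of the endpoints listed below. A minimal sketch in Python, assuming the `requests` package is installed and the server is listening on uvicorn's default address (`http://127.0.0.1:8000`):

```python
import requests

# Assumes a local deployment on uvicorn's default port (8000).
BASE_URL = "http://127.0.0.1:8000"

# /providers should return the data providers configured in config.yaml.
response = requests.get(f"{BASE_URL}/providers")
response.raise_for_status()
print(response.json())
```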
## API Endpoints

The following RESTful endpoints are available:
- GET /providers: Lists all active data providers.
- GET /services: Returns a merged list of public services from all data providers.
- GET /services/{id}: Fetches details of a specific public service by id.
- GET /organizations: Returns all public organizations related to the services.
- GET /evidences: Lists all evidences/documents required for the services.
- GET /requirements: Lists all service requirements.
- GET /rules: Returns all rules governing the services.
- GET /legalResources: Lists legal resources linked to public services.
- GET /outputs: Lists the outputs or results of services.
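These endpoints can be composed from any HTTP client. As a hedged illustration, the Python sketch below lists all services and then fetches the detail view of one of them; the `id` field name is an assumption here, since the exact JSON shape depends on the CPSV-AP mapping configured for each provider:

```python
import requests

BASE_URL = "http://127.0.0.1:8000"  # assumed local deployment

# Merged list of public services from all configured providers.
services = requests.get(f"{BASE_URL}/services").json()
print(f"Found {len(services)} services")

if services:
    # 'id' is a hypothetical field name; inspect the actual payload to see
    # how service identifiers are exposed by your configuration.
    service_id = services[0].get("id")
    if service_id:
        detail = requests.get(f"{BASE_URL}/services/{service_id}").json()
        print(detail)
```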
## Configuration

The API can be configured using two YAML files:

- `config.yaml`: Defines the data sources and their SPARQL endpoints. Example configuration:

  ```yaml
  providers:
    - tag: "bdti-mitos"
      name: "BDTI Virtuoso With Mitos Data"
      cpsv_version: "3.1.1"
      type: sparql
      url: "https://virtuoso-1706142355.p1.bdti.dataplatform.tech.ec.europa.eu/sparql/"
      graph_uri: "https://mitos.gov.gr:8890/"
  ```

- `provider/sparql/data/{version}-config.yaml`: These files, located in the `provider/sparql/data` folder, provide version-specific configuration for each CPSV-AP version. Each `{version}-config.yaml` file maps the CPSV-AP JSON-LD vocabulary to the endpoints for data retrieval, which allows SPARQL queries to be constructed dynamically from the configuration. Example of a `3.1.1-config.yaml` file:

  ```yaml
  queries:
    getRequirements:
      target: Requirement
      elements:
        - Requirement.fulfils
        - Requirement.hasSupportingEvidence
        - Requirement.identifier
        - Requirement.name
        - Requirement.type
  ```
Each version of CPSV-AP is mapped using a corresponding JSON-LD file, allowing seamless integration of various vocabulary versions. You can configure and modify how each data element is retrieved through these files.
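To make the configuration-driven query building more concrete, the sketch below shows one way a `queries` entry like `getRequirements` could be expanded into a SPARQL SELECT. It is illustrative only: the function name, the placeholder `urn:cpsv-ap:` predicates, and the overall shape of the query are assumptions, since the actual builder resolves each element through the CPSV-AP JSON-LD vocabulary:

```python
import yaml


def build_select_query(config_path: str, query_name: str, graph_uri: str) -> str:
    """Build a naive SPARQL SELECT from a {version}-config.yaml entry (illustrative)."""
    with open(config_path) as f:
        config = yaml.safe_load(f)

    query = config["queries"][query_name]
    target = query["target"]        # e.g. "Requirement"
    elements = query["elements"]    # e.g. ["Requirement.fulfils", ...]

    # One result variable per configured element, e.g. ?Requirement_fulfils.
    variables = [f'?{element.replace(".", "_")}' for element in elements]

    # Placeholder predicates; the real mapping comes from the JSON-LD vocabulary.
    patterns = "\n  ".join(
        f"OPTIONAL {{ ?{target} <urn:cpsv-ap:{element}> {var} . }}"
        for element, var in zip(elements, variables)
    )

    return (
        f"SELECT ?{target} {' '.join(variables)}\n"
        f"FROM <{graph_uri}>\n"
        f"WHERE {{\n  ?{target} a <urn:cpsv-ap:{target}> .\n  {patterns}\n}}"
    )
```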
## Extending the API

To add a new data source:

- Implement a new provider class by extending the `Provider` interface.
- Define SPARQL queries in the corresponding YAML configuration file located in `provider/sparql/data`.
- Register the new provider in `config.yaml` and implement the corresponding endpoints in the API.
The current design is flexible and allows for future extension without altering the core logic of the API.
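As an illustration of that first step, a new provider might look roughly like the sketch below. The `Provider` base class name comes from this README, but the import path, constructor arguments, and `get_services` method are assumptions and should be aligned with the actual interface in the codebase:

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical import path; adjust to wherever the Provider interface lives.
from app.provider import Provider


class MySparqlProvider(Provider):
    """Example provider for an additional SPARQL endpoint (illustrative only)."""

    def __init__(self, tag, url, graph_uri):
        self.tag = tag
        self.graph_uri = graph_uri
        self.sparql = SPARQLWrapper(url)
        self.sparql.setReturnFormat(JSON)

    def get_services(self):
        # Hypothetical method name; the real interface may expose different
        # operations for services, organizations, evidences, and so on.
        self.sparql.setQuery(
            f"SELECT ?service FROM <{self.graph_uri}> "
            "WHERE { ?service a <http://purl.org/vocab/cpsv#PublicService> }"
        )
        results = self.sparql.query().convert()
        return results["results"]["bindings"]
```

After registering the new provider's tag in `config.yaml`, the existing endpoints can merge its data with that of the other providers.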
## License

This project is licensed under the European Union Public License (EUPL).