Welcome to the Kubiya Community Tools repository! This collection of tools, maintained by the Kubiya team and community, is designed to supercharge your automation capabilities with the Kubiya Platform. Our focus is on rapid time-to-automation, ease of use, and seamless integration with your existing infrastructure.
```mermaid
graph TD
    A[Copy Tool Source URL] -->|Paste in Kubiya Platform| B(Add New Source)
    B --> C{Automatic Tool Discovery}
    C -->|Connect to Teammates| D[Start Automating]
    D --> E[Local Development]
    D --> F[CI/CD Integration]
    D --> G[Kubernetes Deployment]
```
1. Copy the repository URL: `https://github.com/kubiya-org/community-tools.git`
2. Paste it into the Kubiya Platform under Resources -> Sources
3. Connect the source to your Teammates
4. Start automating!
- Flexible Tool Creation: Use Python SDK or YAML to define tool schemas
- Secrets Management: Declare and manage secrets required by your tools securely
- Language Agnostic: Run any code (Python, Go, Bash, etc.) within Docker containers
- File Handling: Add files, clone repositories, and manage dependencies
- Environment Variable Management: Declare and inject runtime variables
- Rapid Time-to-Automation: Go from idea to working automation in minutes
- Infrastructure Agnostic: Run on your own trusted Kubernetes-enabled infrastructure
- Local Development: Test and develop tools locally with ease
- CI/CD Ready: Seamlessly integrate with your existing CI/CD pipelines
- Secure by Design: Tools are designed with security and robustness in mind
Creating and associating tools with Teammates is a breeze:
```mermaid
graph LR
    A[Create Tool Schema] -->|Python SDK or YAML| B(Define Tool Structure)
    B --> C{Local Testing}
    C -->|Pass| D[Add to Repository]
    D --> E[Kubiya Discovery]
    E --> F[Associate with Teammates]
    F --> G[Automated Orchestration]
```
1. Create tool schemas using the Python SDK or YAML
2. Define the tool structure, including file specs, environment variables, secrets, and runtime requirements
3. Test locally with your own environment variables and secrets
4. Add the tool to the repository
5. Kubiya automatically discovers the tool and makes it available
6. Associate the tool with Teammates in the Kubiya Platform
7. Sit back and watch the magic happen!
Tools can be defined using:
- YAML Syntax: For simple, quick configurations
- Python SDK: For complex tools with greater flexibility
To handle sensitive data securely, tools declare their required secrets in their schema. Kubiya then injects these secrets at runtime based on the Teammate's configuration.
```python
from kubiya_sdk.tools import Tool, Arg, FileSpec

class MyCustomTool(Tool):
    def __init__(self):
        super().__init__(
            name="my_custom_tool",
            description="A custom tool that processes data",
            type="docker",
            image="python:3.9-slim",
            content="""
#!/bin/bash
python /tmp/process_data.py "$input_file" "$output_file"
""",
            args=[
                Arg(name="input_file", type="str", description="Input file path"),
                Arg(name="output_file", type="str", description="Output file path"),
            ],
            env=["API_ENDPOINT"],
            secrets=["API_KEY"],  # Declare required secrets
            with_files=[
                FileSpec(
                    destination="/tmp/process_data.py",
                    content="""
import os
import sys

def process_data(input_file, output_file):
    # Your data processing logic here
    print(f"Processing {input_file} to {output_file}")
    print(f"API Endpoint: {os.environ.get('API_ENDPOINT')}")
    print(f"API Key: {os.environ.get('API_KEY')}")

if __name__ == "__main__":
    process_data(sys.argv[1], sys.argv[2])
"""
                )
            ],
        )
```
This example demonstrates:
- Defining a Docker-based tool
- Adding a Python script as a file
- Declaring required environment variables and secrets
- Passing arguments to the tool
- Securely handling sensitive data
```yaml
name: my_custom_tool
description: A custom tool that processes data
type: docker
image: python:3.9-slim
content: |
  #!/bin/bash
  python /tmp/process_data.py $input_file $output_file
args:
  - name: input_file
    type: str
    description: Input file path
  - name: output_file
    type: str
    description: Output file path
env:
  - API_ENDPOINT
secrets:
  - API_KEY  # Declare required secrets
with_files:
  - destination: /tmp/process_data.py
    content: |
      import os
      import sys

      def process_data(input_file, output_file):
          # Your data processing logic here
          print(f"Processing {input_file} to {output_file}")
          print(f"API Endpoint: {os.environ.get('API_ENDPOINT')}")
          print(f"API Key: {os.environ.get('API_KEY')}")

      if __name__ == "__main__":
          process_data(sys.argv[1], sys.argv[2])
```
1. **Using Custom Dockerfiles**:

   You can use a custom Dockerfile for more complex setups:

   ```python
   from kubiya_sdk.tools import Tool

   class AdvancedTool(Tool):
       def __init__(self):
           super().__init__(
               name="advanced_tool",
               description="Tool with custom Dockerfile",
               type="docker",
               dockerfile="""
   FROM golang:1.16-alpine
   WORKDIR /app
   COPY . .
   RUN go build -o main .
   CMD ["./main"]
   """,
               content="./main",
               # ... other configurations
           )
   ```
2. **Cloning Repositories**:

   Clone and use external repositories in your tools:

   ```python:examples/repo_tool.py
   from kubiya_sdk.tools import Tool

   class RepoTool(Tool):
       def __init__(self):
           super().__init__(
               name="repo_tool",
               description="Tool that uses an external repo",
               type="docker",
               image="alpine/git",
               content="""
   git clone https://github.com/example/repo.git /repo
   cd /repo
   ./run_script.sh
   """,
               # ... other configurations
           )
   ```
3. **Using Different Languages**:

   You can use any language supported by Docker:

   ```python
   from kubiya_sdk.tools import Tool

   class GolangTool(Tool):
       def __init__(self):
           super().__init__(
               name="golang_tool",
               description="Tool written in Go",
               type="docker",
               image="golang:1.16",
               content="""
   cat << EOF > /app/main.go
   package main

   import "fmt"

   func main() {
       fmt.Println("Hello from Go!")
   }
   EOF
   go run /app/main.go
   """,
               # ... other configurations
           )
   ```
### Environment Variable and Secrets Handling
- **Environment Variables**: Tools should declare all required environment variables in their schema.
- **Secrets**: Sensitive data should be declared in the `secrets` field. Kubiya will securely inject these secrets at runtime based on the Teammate's configuration.
- **Integrations**: Kubiya injects environment variables and secrets based on the integrations enabled on the Teammate.
Example:
```python:examples/api_tool.py
from kubiya_sdk.tools import Tool

class ApiTool(Tool):
    def __init__(self):
        super().__init__(
            name="api_tool",
            description="Tool that uses an API",
            type="docker",
            image="curlimages/curl",
            # Double quotes so the shell expands the injected variables
            content='curl -H "Authorization: Bearer $API_TOKEN" "$API_ENDPOINT"',
            env=["API_ENDPOINT"],
            secrets=["API_TOKEN"],  # Declare required secrets
            # ... other configurations
        )
```
To test your tools locally:

1. Set up a virtual environment.

2. Install the Kubiya SDK:

   ```bash
   pip install kubiya-sdk
   ```

3. Create a test script (for example, `tests/test_my_custom_tool.py`):

   ```python
   import os

   from my_custom_tool import MyCustomTool

   # For testing purposes only -- never hard-code real credentials
   os.environ['API_ENDPOINT'] = 'https://api.example.com'
   os.environ['API_KEY'] = 'your_api_key'

   tool = MyCustomTool()
   result = tool.execute({
       "input_file": "test_input.txt",
       "output_file": "test_output.txt"
   })
   print(result)
   ```

4. Run your test script:

   ```bash
   python tests/test_my_custom_tool.py
   ```
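You can also sanity-check a tool's embedded script on its own, without the SDK or Docker, by executing the file content directly. The sketch below reuses the `process_data.py` body from the custom-tool example above; the helper name `run_script_locally` is ours for illustration, not part of the SDK:

```python
import os
import subprocess
import sys
import tempfile

# Same script body as the with_files entry in the custom-tool example above
SCRIPT = """
import os
import sys

def process_data(input_file, output_file):
    print(f"Processing {input_file} to {output_file}")
    print(f"API Endpoint: {os.environ.get('API_ENDPOINT')}")

if __name__ == "__main__":
    process_data(sys.argv[1], sys.argv[2])
"""

def run_script_locally(input_file, output_file):
    """Write the embedded script to disk and run it the way the container entrypoint would."""
    with tempfile.TemporaryDirectory() as tmp:
        path = os.path.join(tmp, "process_data.py")
        with open(path, "w") as f:
            f.write(SCRIPT)
        # Inject test values for the variables the tool declares
        env = dict(os.environ, API_ENDPOINT="https://api.example.com")
        result = subprocess.run(
            [sys.executable, path, input_file, output_file],
            capture_output=True, text=True, env=env, check=True,
        )
        return result.stdout

if __name__ == "__main__":
    print(run_script_locally("test_input.txt", "test_output.txt"))
```

This keeps the feedback loop tight: the script logic is verified before you ever package it into a tool schema.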
Integrate tool development into your CI/CD pipeline:
- Add tool schema validation tests
- Test tool execution in a controlled environment
- Automate deployment to the Kubiya Platform upon successful tests
- Use version control to manage tool updates
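The schema-validation step above can be sketched as a plain Python check. This is a hypothetical, stdlib-only validator using the field names from the examples in this README, not the platform's own validation logic:

```python
REQUIRED_FIELDS = ("name", "description", "type")

def validate_tool_schema(schema: dict) -> list:
    """Return a list of human-readable problems; an empty list means the schema passes."""
    problems = [f"missing required field: {f}" for f in REQUIRED_FIELDS if f not in schema]
    # Docker-based tools need either a base image or a custom Dockerfile
    if schema.get("type") == "docker" and not (schema.get("image") or schema.get("dockerfile")):
        problems.append("docker tools need an 'image' or a 'dockerfile'")
    # Every declared argument must at least be named
    for i, arg in enumerate(schema.get("args", [])):
        if "name" not in arg:
            problems.append(f"args[{i}] is missing a 'name'")
    return problems

# Example: the YAML tool definition above, loaded as a dict
schema = {
    "name": "my_custom_tool",
    "description": "A custom tool that processes data",
    "type": "docker",
    "image": "python:3.9-slim",
    "args": [{"name": "input_file"}, {"name": "output_file"}],
}
assert validate_tool_schema(schema) == []
```

Running a check like this in CI catches malformed tool definitions before Kubiya ever attempts discovery.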
- Modularity: Create small, focused tools that do one thing well
- Idempotency: Ensure tools can be run multiple times without side effects
- Error Handling: Provide clear error messages and handle exceptions gracefully
- Documentation: Include detailed descriptions and usage examples for each tool
- Version Control: Use semantic versioning for your tools
- Testing: Create comprehensive test suites for your tools
- Security: Always declare required secrets and handle them securely
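To make the idempotency point concrete, here is a stdlib-only sketch (not SDK code) of a write operation that is safe to run repeatedly:

```python
import os
import tempfile

def write_report(path: str, content: str) -> bool:
    """Idempotent write: only touch the file when the content would actually change.

    Returns True if the file was written, False if it was already up to date,
    so repeated runs converge on the same end state with no extra side effects.
    """
    try:
        with open(path) as f:
            if f.read() == content:
                return False  # already correct; do nothing
    except FileNotFoundError:
        pass  # first run: file does not exist yet
    with open(path, "w") as f:
        f.write(content)
    return True

# Running twice leaves the same end state; only the first call writes.
path = os.path.join(tempfile.mkdtemp(), "report.txt")
print(write_report(path, "ok"))  # True: file created
print(write_report(path, "ok"))  # False: nothing to do
```

The same check-before-act pattern applies to any tool side effect: creating resources, applying configuration, or posting notifications.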
- Kubiya Documentation
- Teammates Guide
- Tool Development Guide
- Sources Documentation
- Local Runners Guide
- Secrets Management
We enthusiastically welcome contributions! Check out our CONTRIBUTING.md for guidelines.
This project is under the MIT License. See LICENSE for details.
Embark on your automation journey with Kubiya! For questions or assistance, consult our documentation or reach out to our vibrant community.
Transform your workflows with Kubiya - Where automation meets innovation!