Thank you for considering contributing to Aignostics Python SDK!
Create a fork and clone it using `git clone URL_OF_YOUR_CLONE`. Then change into the directory of your local Aignostics Python SDK repository with `cd python-sdk`.

If you are one of the committers of https://github.com/aignostics/python-sdk you can directly clone via `git clone git@github.com:aignostics/python-sdk.git` and `cd python-sdk`.
Install or update development dependencies:

```shell
make install
```

Project layout:

```
├── Makefile                      # Central entrypoint for build, test, release and deploy
├── noxfile.py                    # Noxfile for running tests in multiple Python environments and other tasks
├── .pre-commit-config.yaml       # Definition of hooks run on commits
├── .github/                      # GitHub-specific files
│   ├── workflows/                # GitHub Actions workflows
│   ├── prompts/                  # Custom prompts for GitHub Copilot
│   └── copilot-instructions.md   # Instructions for GitHub Copilot
├── .vscode/                      # Recommended VSCode settings and extensions
├── .env                          # Environment variables, on .gitignore
├── .env.example                  # Example environment variables
src/aignostics/                   # Source code
├── __init__.py                   # Package initialization
├── utils/*.py                    # Infrastructure for logging, Sentry, Logfire etc.
├── system/*.py                   # Module for system management, including service, CLI commands and API operations
├── hello/*.py                    # Module for "Hello" functionality, including service, CLI commands and API operations
├── cli.py                        # CLI entrypoint with auto-registration of CLI commands of modules
├── api.py                        # Webservice API entrypoint with auto-registration of API operations of modules
└── constants.py                  # Package-specific constants such as major API versions and modules to instrument
tests/aignostics/                 # Tests
├── **/cli_tests.py               # Verifies the core and module-specific CLI commands
├── **/api_tests.py               # Verifies the core and module-specific API operations
└── fixtures/                     # Fixtures and mock data
docs/                             # Documentation
├── partials/*.md                 # Partials to compile README.md, _main partial included in HTML and PDF documentation
├── ../README.md                  # Compiled README.md shown on GitHub
├── source/*.rst                  # reStructuredText files used to generate HTML and PDF documentation
├── ../*.md                       # Markdown files shown on GitHub and imported by .rst files
├── source/conf.py                # Sphinx configuration used to generate HTML and PDF documentation
├── build/html/*                  # Generated HTML documentation as multiple pages
├── build/singlehtml/index.html   # HTML documentation as a single page
└── build/latex/aignostics.pdf    # PDF manual - generate with make docs pdf
examples/                         # Example code demonstrating use of the project
├── notebook.py                   # Marimo notebook
├── notebook.ipynb                # Jupyter notebook
└── script.py                     # Minimal script
reports/                          # Compliance reports for auditing
├── junit.xml                     # Report of test executions run with pytest
├── mypy_junit.xml                # Report of static typing validation run with mypy
├── coverage.xml                  # Test coverage in XML format generated by pytest-cov
├── coverage_html                 # Report of test coverage in HTML format
├── licenses.csv                  # List of dependencies and their license types
├── licenses.json                 # JSON file with dependencies and their license types
├── licenses_grouped.json         # JSON file with dependencies grouped by license type
├── vulnerabilities.json          # JSON file with vulnerabilities detected in dependencies by pip-audit
└── sbom.json                     # Software Bill of Materials in OWASP CycloneDX format
```
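The layout above notes that `cli.py` auto-registers the CLI commands of each module (`system`, `hello`, …). As a hedged illustration of how such discovery can work (the function name and the `cli` attribute convention are assumptions, not the SDK's actual mechanism), modules exposing a known attribute can be found with `pkgutil`:

```python
# Hypothetical sketch of module auto-discovery, as used by patterns like
# the cli.py/api.py entrypoints described above. Names are illustrative.
import importlib
import pkgutil
from types import ModuleType


def discover_subcommands(package: ModuleType, attr: str = "cli") -> dict:
    """Scan a package's direct submodules and collect those exposing `attr`."""
    found = {}
    for info in pkgutil.iter_modules(package.__path__):
        module = importlib.import_module(f"{package.__name__}.{info.name}")
        if hasattr(module, attr):
            # e.g. mount the module's CLI sub-app onto the root CLI here
            found[info.name] = getattr(module, attr)
    return found
```

An entrypoint would then iterate the returned mapping and mount each module's commands onto the root CLI application.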
Set up your environment:

```shell
make setup
```

Don't forget to configure your `.env` file with the required environment variables.
Notes:
- `.env.example` is provided as a template; use `cp .env.example .env` and edit `.env` to create your environment.
- `.env` is excluded from version control, so feel free to add secret values.
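For readers unfamiliar with the `.env` convention: tools such as python-dotenv read `KEY=VALUE` lines from the file and expose them as environment variables. A minimal, purely illustrative sketch of that behavior (not the SDK's actual loader):

```python
# Minimal sketch of .env loading: parse KEY=VALUE lines, skip blanks and
# comments, and export the result into the process environment.
import os


def load_dotenv_minimal(path: str = ".env") -> dict:
    loaded = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blank lines, comments, and malformed lines
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip().strip('"')
    os.environ.update(loaded)
    return loaded
```

In practice prefer an established library over hand-rolled parsing; this sketch only shows why missing `.env` values surface as missing environment variables.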
```shell
make       # Runs primary build steps, i.e. formatting, linting, testing, building HTML docs and distribution, auditing
make help  # Shows help with additional build targets, e.g. to build PDF documentation, bump the version to release etc.
```

Notes:
- Primary build steps are defined in `noxfile.py`.
- Distribution is dumped into `dist/`.
- Documentation is dumped into `docs/build/html/` and `docs/build/latex/`.
- Audit reports are dumped into `reports/`.
Use VSCode / Testing, or run tests from the command line:

```shell
make test                     # and flavors, see make help
uv run pytest -k <test_name>  # run a specific test
uv run aignostics             # shows help
```

Commit and push your changes:

```shell
git add .
git commit -m "feat(user): added new api endpoint to offboard user"
git push
```

Notes:
- pre-commit hooks will run automatically on commit to ensure code quality.
- We use the conventional commits format - see the code style guide for the mandatory commit message format.
- You can skip workflows using either commit messages or PR labels:
  - Commit message: Include skip markers like `skip:ci`, `skip:test:long-running`, `skip:test:unit`, `skip:test:integration`, `skip:test:e2e`, `skip:test:matrix-runner`, or `skip:test:all` in your commit message.
  - PR labels: Add labels with the same names (e.g., `skip:ci`, `skip:test:long-running`) to your pull request.
  - Both methods work independently - you can use either or both.
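The conventional commit format and the skip markers above can both be checked mechanically. A hedged sketch (the set of allowed commit types is an assumption based on common Conventional Commits usage, not necessarily the project's exact rule):

```python
# Illustrative checks for "type(scope): subject" commit messages and for
# the skip markers recognized by the CI workflows.
import re

COMMIT_RE = re.compile(
    r"^(feat|fix|docs|style|refactor|perf|test|build|ci|chore|revert)"
    r"(\([\w-]+\))?(!)?: .+"
)
SKIP_MARKERS = {
    "skip:ci", "skip:test:long-running", "skip:test:unit",
    "skip:test:integration", "skip:test:e2e",
    "skip:test:matrix-runner", "skip:test:all",
}


def is_conventional(message: str) -> bool:
    """Check the first line of a commit message against the format."""
    return COMMIT_RE.match(message.splitlines()[0]) is not None


def skip_markers_in(message: str) -> set:
    """Return all recognized skip markers contained in the message."""
    return {m for m in SKIP_MARKERS if m in message}
```

For example, the commit message shown earlier, `feat(user): added new api endpoint to offboard user`, passes the format check.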
```shell
make bump   # Patch release
make minor  # Minor release
make major  # Major release
make x.y.z  # Targeted release
```

Notes:
- Changelog is generated automatically.
- Publishes to PyPI, Docker registries, Read the Docs, Streamlit and auditing services.
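The four targets above follow semantic versioning. As a quick illustration of what each bump does to a `major.minor.patch` version string (the Makefile itself likely delegates to a release tool; this is only a model of the behavior):

```python
# Semver bump semantics: patch increments the last component, minor and
# major increment their component and reset everything to the right.
def bump(version: str, part: str = "patch") -> str:
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"
```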
To run the GUI in the browser with hot reloading, use the following command:

```shell
make gui_watch
```

To run the GitHub Actions workflows locally:

```shell
make act
```

Notes:
- Workflows are defined in `.github/workflows/*.yml`.
- `ci-cd.yml` calls all build steps defined in `noxfile.py`.
Build and run the Docker image with plain Docker:

```shell
# Build the Docker image
make docker_build  # builds targets all and slim

# Run the CLI
docker run --env THE_VAR=THE_VALUE -t aignostics --target all --help   # target with all extras
docker run --env THE_VAR=THE_VALUE -t aignostics --target slim --help  # slim flavor, no extras
```

Build and run the Docker image with docker compose:
```shell
echo "Building the Docker image with docker compose and running CLI..."
docker compose run --build aignostics --help

echo "Building the Docker image with docker compose and running API container as a daemon ..."
docker compose up --build -d

echo "Waiting for the API server to start..."
sleep 5

echo "Checking health of v1 API ..."
curl http://127.0.0.1:8000/api/v1/healthz
echo ""
echo "Saying hello world with v1 API ..."
curl http://127.0.0.1:8000/api/v1/hello/world
echo ""
echo "Swagger docs of v1 API ..."
curl http://127.0.0.1:8000/api/v1/docs
echo ""

echo "Checking health of v2 API ..."
curl http://127.0.0.1:8000/api/v2/healthz
echo ""
echo "Saying hello world with v2 API ..."
curl http://127.0.0.1:8000/api/v2/hello/world
echo ""
echo "Swagger docs of v2 API ..."
curl http://127.0.0.1:8000/api/v2/docs
echo ""

echo "Shutting down the API container ..."
docker compose down
```

Notes:
- The API service runs based on the slim Docker image. Change `compose.yaml` if you need the API service to run on the fat image.
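The fixed `sleep 5` in the script above is a race waiting to happen on slow machines. A more robust alternative is to poll the healthz endpoint until it answers. A sketch in Python (the URL and timings match the script above; the injectable `fetch` parameter is an assumption added here so the retry logic is testable without a running server):

```python
# Poll the API's healthz endpoint until it returns 200 or attempts run out.
import time
import urllib.request


def wait_until_healthy(url: str = "http://127.0.0.1:8000/api/v1/healthz",
                       attempts: int = 10, delay: float = 0.5,
                       fetch=None) -> bool:
    """Return True once the endpoint answers 200, False if it never does."""
    if fetch is None:
        fetch = lambda u: urllib.request.urlopen(u, timeout=2).status
    for _ in range(attempts):
        try:
            if fetch(url) == 200:
                return True
        except OSError:
            pass  # server not up yet; retry after the delay
        time.sleep(delay)
    return False
```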
Pin GitHub Actions to full-length commit SHAs for security:

```shell
pinact run  # See https://dev.to/suzukishunsuke/pin-github-actions-to-a-full-length-commit-sha-for-security-2n7p
```

Update the project to the latest version of the oe-python-template template:

```shell
make update_from_template
```

When submitting application runs to the Aignostics Platform, you can attach custom metadata to provide additional context, tracking information, or control flags. The SDK itself automatically attaches structured metadata to every run, which includes:
- SDK version and submission details: When and how the run was submitted (CLI, script, or GUI)
- User information: Organization and user details (when authenticated)
- CI/CD context: GitHub Actions workflow information, pytest test context
- Workflow control: Flags like `onboard_to_aignostics_portal`
- Scheduling: Due dates and deadlines for run completion
- Notes: Optional user-provided notes
The SDK metadata follows a strict JSON Schema to ensure data quality and consistency. You can:
- View the schema: SDK Metadata Schema (latest)
- Validate your metadata: The SDK automatically validates metadata before submission
- Extend with custom fields: Add your own metadata alongside the SDK-generated metadata
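Client-side validation of custom metadata before submission can catch mistakes early. A purely illustrative, stdlib-only sketch (the real SDK validates against its published JSON Schema; the specific rules checked here, including treating `sdk` as reserved, are assumptions drawn from the description above):

```python
# Hypothetical pre-submission checks on custom metadata. Returns a list
# of human-readable problems; an empty list means the checks passed.
def validate_custom_metadata(meta) -> list:
    errors = []
    if not isinstance(meta, dict):
        return ["metadata must be an object"]
    for key in meta:
        if not isinstance(key, str):
            errors.append(f"key {key!r} must be a string")
    if "sdk" in meta:
        # assumption: "sdk" is reserved for SDK-generated metadata
        errors.append('"sdk" is reserved for SDK-generated metadata')
    return errors
```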
When submitting runs programmatically, you can provide additional metadata:
```python
from aignostics.platform import Client

client = Client()

# Your custom metadata
custom_metadata = {
    "experiment_id": "exp-2025-001",
    "dataset_version": "v2.1",
    "custom_flags": {
        "enable_feature_x": True
    }
}

# Submit run with custom metadata
# SDK metadata is automatically added under the "sdk" key
run = client.runs.submit(
    application_id="your-app",
    items=[...],
    custom_metadata=custom_metadata
)
```

The SDK will merge your custom metadata with its own tracking metadata, ensuring both are included in the run submission. The SDK metadata is always placed under the `sdk` key to avoid conflicts with your custom fields.
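The merge contract just described (custom fields at the top level, SDK metadata isolated under `sdk`) can be sketched in a few lines; the helper name here is hypothetical, not part of the SDK's public API:

```python
# Sketch of the documented merge behavior: custom fields stay at the top
# level, SDK tracking metadata is namespaced under "sdk".
def merge_run_metadata(custom: dict, sdk: dict) -> dict:
    merged = dict(custom)  # copy so the caller's dict is untouched
    merged["sdk"] = sdk    # SDK metadata cannot collide with custom keys
    return merged
```

Because `sdk` is a single namespaced key, none of your custom top-level fields can be overwritten by SDK-generated values.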
- Before starting to write code, read the code style document for mandatory coding style requirements.
- Pre-commit hooks: We use pre-commit hooks to ensure code quality. Please install them by running `uv run pre-commit install`. This ensures all tests, linting etc. pass locally before you can commit.
- Squash commits: Before submitting a pull request, please squash your commits into a single commit.
- Signed commits: Use signed commits.
- Branch naming: Use descriptive branch names like `feature/your-feature` or `fix/issue-number`.
- Testing: Ensure new features have appropriate test coverage.
- Documentation: Update documentation to reflect any changes or new features.
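The branch naming convention above can be expressed as a simple pattern check. A sketch (the allowed prefixes are an assumption extrapolated from the two examples given, not an exhaustive project rule):

```python
# Illustrative branch-name check for names like feature/your-feature
# or fix/issue-number: a known prefix, a slash, then lowercase slug chars.
import re

BRANCH_RE = re.compile(r"^(feature|fix)/[a-z0-9][a-z0-9._-]*$")


def is_valid_branch(name: str) -> bool:
    return BRANCH_RE.match(name) is not None
```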