Contributing

Thank you for considering contributing to Aignostics Python SDK!

Setup

Create a fork and clone it using git clone URL_OF_YOUR_FORK. Then change into the directory of your local Aignostics Python SDK repository with cd python-sdk.

If you are one of the committers of https://github.com/aignostics/python-sdk you can directly clone via git clone git@github.com:aignostics/python-sdk.git and cd python-sdk.

Install or update development dependencies using

make install

Directory Layout

├── Makefile               # Central entrypoint for build, test, release and deploy
├── noxfile.py             # Noxfile for running tests in multiple python environments and other tasks
├── .pre-commit-config.yaml # Definition of hooks run on commits
├── .github/               # GitHub specific files
│   ├── workflows/         # GitHub Actions workflows
│   ├── prompts/           # Custom prompots for GitHub Copilot
│   └── copilot-instructions.md # Insructions for GitHub Copilot
├── .vscode/               # Recommended VSCode settings and extensions
├── .env                   # Environment variables, on .gitignore
├── .env.example           # Example environment variables
src/aignostics/  # Source code
├── __init__.py          # Package initialization
├── utils/*.py           # Infrastructure for logging, sentry, logfire etc.
├── system/*.py          # Module for system management, including service, CLI commands and API operations
├── hello/*.py           # Module for "Hello" functionality, including service, CLI commands and API operations
├── cli.py               # CLI entrypoint with auto-registration of CLI commands of modules
├── api.py               # Webservice API entrypoint with auto-registration of API operations of modules
└── constants.py         # Package specific constants such as major API versions and modules to instrument.
tests/aignostics/ # Tests
├── **/cli_tests.py      # Verifies the core and module specific CLI commands
├── **/api_tests.py      # Verifies the core and module specific API operations 
└── fixtures/            # Fixtures and mock data
docs/                    # Documentation
├── partials/*.md        # Partials compiled into README.md; the _main partial is also included in HTML and PDF documentation
├── ../README.md         # Compiled README.md shown on GitHub
├── source/*.rst         # reStructuredText files used to generate HTML and PDF documentation
├── ../*.md              # Markdown files shown on GitHub and imported by .rst files
├── source/conf.py       # Sphinx configuration used to generate HTML and PDF documentation
├── build/html/*         # Generated HTML documentation as multiple pages
├── build/singlehtml/index.html # HTML documentation as a single page
└── build/latex/aignostics.pdf # PDF manual - generate with make docs pdf
examples/                # Example code demonstrating use of the project
├── notebook.py          # Marimo notebook
├── notebook.ipynb       # Jupyter notebook
└── script.py            # Minimal script
reports/                 # Compliance reports for auditing
├── junit.xml            # Report of test executions run with pytest
├── mypy_junit.xml       # Report of static typing validation run with mypy
├── coverage.xml         # Test coverage in XML format generated by pytest-cov
├── coverage_html        # Report of test coverage in HTML format
├── licenses.csv         # List of dependencies and their license types
├── licenses.json        # .json file with dependencies and their license types
├── licenses_grouped.json  # .json file with dependencies grouped by license type
├── vulnerabilities.json # .json file with vulnerabilities detected in dependencies by pip-audit
└── sbom.json            # Software Bill of Materials in OWASP CycloneDX format
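The auto-registration mentioned for cli.py and api.py can be sketched as follows. This is an illustrative pattern only, not the actual SDK internals: each module would expose a register() hook, and the central entrypoint would discover and call them.

```python
import importlib
import pkgutil

def list_submodules(package_name: str) -> list[str]:
    """Names of all direct submodules of a package."""
    package = importlib.import_module(package_name)
    return sorted(info.name for info in pkgutil.iter_modules(package.__path__))

def discover_registrars(package_name: str) -> dict:
    """Import each submodule and collect those exposing a callable
    `register` attribute (the hypothetical registration hook)."""
    registrars = {}
    package = importlib.import_module(package_name)
    for info in pkgutil.iter_modules(package.__path__):
        module = importlib.import_module(f"{package_name}.{info.name}")
        hook = getattr(module, "register", None)
        if callable(hook):
            registrars[info.name] = hook
    return registrars
```

With this pattern, adding a new module under src/aignostics/ would make its CLI commands and API operations available without touching the entrypoints.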

Build, Run and Release

Setup project specific development environment

make setup

Don't forget to configure your .env file with the required environment variables.

Notes:

  1. .env.example is provided as a template, use cp .env.example .env and edit .env to create your environment.
  2. .env is excluded from version control, so feel free to add secret values.
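If you need the same variables in an ad-hoc script outside the SDK's own loading, a minimal .env parser looks like this. This is a simplified sketch; real loaders such as python-dotenv handle quoting, escapes and interpolation more carefully.

```python
import os

def load_env(path: str = ".env") -> dict[str, str]:
    """Parse simple KEY=VALUE lines from a .env file into the process
    environment, skipping blank lines and # comments (no quote handling)."""
    values: dict[str, str] = {}
    try:
        with open(path, encoding="utf-8") as handle:
            for line in handle:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                values[key.strip()] = value.strip()
    except FileNotFoundError:
        pass  # no .env present; fall back to the existing environment
    os.environ.update(values)
    return values
```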

Build

make        # Runs primary build steps, i.e. formatting, linting, testing, building HTML docs and distribution, auditing
make help   # Shows help with additional build targets, e.g. to build PDF documentation, bump the version to release etc.

Notes:

  1. Primary build steps defined in noxfile.py.
  2. Distribution dumped into dist/
  3. Documentation dumped into docs/build/html/ and docs/build/latex/
  4. Audit reports dumped into reports/

Run tests

  1. Use the Testing panel in VSCode
  2. make test and flavors, see make help
  3. uv run pytest -k <test_name>

Run the CLI

uv run aignostics # shows help

Commit and Push your changes

git add .
git commit -m "feat(user): added new api endpoint to offboard user"
git push

Notes:

  1. pre-commit hooks will run automatically on commit to ensure code quality.
  2. We use the conventional commits format - see the code style guide for the mandatory commit message format.
  3. You can skip workflows using either commit messages or PR labels:
    • Commit message: Include skip markers like skip:ci, skip:test:long-running, skip:test:unit, skip:test:integration, skip:test:e2e, skip:test:matrix-runner, or skip:test:all in your commit message
    • PR labels: Add labels with the same names (e.g., skip:ci, skip:test:long-running) to your pull request
    • Both methods work independently - you can use either or both

Publish Release

make bump   # Patch release
make minor  # Minor release
make major  # Major release
make x.y.z  # Targeted release

Notes:

  1. Changelog generated automatically
  2. Publishes to PyPI, Docker registries, Read the Docs, Streamlit and auditing services

Advanced usage

Developing the GUI

To run the GUI in the browser with hot reloading, use the following command:

make gui_watch

Running GitHub CI Workflow locally

make act

Notes:

  1. Workflow defined in .github/workflows/*.yml
  2. ci-cd.yml calls all build steps defined in noxfile.py

Docker

Build and run the Docker image with plain Docker

# Build from Dockerfile
make docker_build # builds targets all and slim

# Run the CLI
docker run --env THE_VAR=THE_VALUE -t aignostics --target all --help    # target with all extras
docker run --env THE_VAR=THE_VALUE -t aignostics --target slim --help   # slim flavor, no extras

Build and run the Docker image with docker compose:

echo "Building the Docker image with docker compose and running CLI..."
docker compose run --build aignostics --help
echo "Building the Docker image with docker compose and running API container as a daemon ..."
docker compose up --build -d
echo "Waiting for the API server to start..."
sleep 5
echo "Checking health of v1 API ..."
curl http://127.0.0.1:8000/api/v1/healthz
echo ""
echo "Saying hello world with v1 API ..."
curl http://127.0.0.1:8000/api/v1/hello/world
echo ""
echo "Swagger docs of v1 API ..."
curl http://127.0.0.1:8000/api/v1/docs
echo ""
echo "Checking health of v2 API ..."
curl http://127.0.0.1:8000/api/v2/healthz
echo ""
echo "Saying hello world with v2 API ..."
curl http://127.0.0.1:8000/api/v2/hello/world
echo ""
echo "Swagger docs of v2 API ..."
curl http://127.0.0.1:8000/api/v2/docs
echo ""
echo "Shutting down the API container ..."
docker compose down

Notes:

  1. The API service is run based on the slim Docker image. Change compose.yaml if you need the API service to run on the all (fat) image.
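The fixed sleep 5 in the script above races the server startup; a small readiness poll is more robust. Here is a generic, standard-library-only sketch (the default URL mirrors the healthz route shown above):

```python
import time
import urllib.error
import urllib.request

def wait_for_api(url: str = "http://127.0.0.1:8000/api/v1/healthz",
                 timeout: float = 30.0, interval: float = 0.5) -> bool:
    """Poll `url` until it returns HTTP 200 or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as response:
                if response.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # server not up yet; retry
        time.sleep(interval)
    return False
```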

Pinning GitHub Actions

pinact run  # See https://dev.to/suzukishunsuke/pin-github-actions-to-a-full-length-commit-sha-for-security-2n7p

Update from Template

Update the project to the latest version of the oe-python-template template.

make update_from_template

Custom Metadata

When submitting application runs to the Aignostics Platform, you can attach custom metadata to provide additional context, tracking information, or control flags. The SDK itself automatically attaches structured metadata to every run, which includes:

  • SDK version and submission details: When and how the run was submitted (CLI, script, or GUI)
  • User information: Organization and user details (when authenticated)
  • CI/CD context: GitHub Actions workflow information, pytest test context
  • Workflow control: Flags like onboard_to_aignostics_portal
  • Scheduling: Due dates and deadlines for run completion
  • Notes: Optional user-provided notes
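As an illustration only, the categories above might translate into a structure like the following. The field names here are hypothetical (except onboard_to_aignostics_portal, mentioned above); consult the published SDK Metadata Schema for the real ones.

```python
# Hypothetical illustration of the metadata categories listed above;
# actual field names are defined by the published SDK Metadata Schema.
example_sdk_metadata = {
    "sdk_version": "x.y.z",                           # SDK version
    "submitted_via": "cli",                           # cli, script, or gui
    "user": {"organization": "acme", "id": "jane"},   # when authenticated
    "ci": {"github_workflow": "ci-cd"},               # CI/CD context, if any
    "onboard_to_aignostics_portal": True,             # workflow control flag
    "due_date": "2025-12-31",                         # scheduling
    "note": "baseline run",                           # optional user note
}
```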

SDK Metadata Schema

The SDK metadata follows a strict JSON Schema to ensure data quality and consistency. You can:

  • View the schema: SDK Metadata Schema (latest)
  • Validate your metadata: The SDK automatically validates metadata before submission
  • Extend with custom fields: Add your own metadata alongside the SDK-generated metadata
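Since the platform stores metadata as JSON, it can be worth checking locally that your custom metadata is plain JSON-serializable data before submitting. This helper is illustrative, not part of the SDK:

```python
import json

def check_json_metadata(metadata: dict) -> dict:
    """Raise if `metadata` is not a dict of JSON-serializable values;
    return it unchanged otherwise."""
    if not isinstance(metadata, dict):
        raise TypeError("custom metadata must be a dict")
    json.dumps(metadata)  # raises TypeError on non-serializable values
    return metadata
```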

Adding Custom Metadata

When submitting runs programmatically, you can provide additional metadata:

from aignostics.platform import Client

client = Client()

# Your custom metadata
custom_metadata = {
    "experiment_id": "exp-2025-001",
    "dataset_version": "v2.1",
    "custom_flags": {
        "enable_feature_x": True
    }
}

# Submit run with custom metadata
# SDK metadata is automatically added under the "sdk" key
run = client.runs.submit(
    application_id="your-app",
    items=[...],
    custom_metadata=custom_metadata
)

The SDK will merge your custom metadata with its own tracking metadata, ensuring both are included in the run submission. The SDK metadata is always placed under the sdk key to avoid conflicts with your custom fields.
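The merge described above can be pictured like this. This is a conceptual sketch of the documented behavior, not the SDK's actual code; in particular, raising on a user-supplied sdk key is an assumption made here to illustrate why the key is reserved.

```python
def merge_metadata(custom: dict, sdk: dict) -> dict:
    """Combine user metadata with SDK tracking metadata; the SDK
    payload always lives under the reserved 'sdk' key."""
    if "sdk" in custom:
        raise ValueError("'sdk' is reserved for SDK-generated metadata")
    return {**custom, "sdk": sdk}
```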

Pull Request Guidelines

  1. Before starting to write code, read the code style document for the mandatory coding style requirements.
  2. Pre-Commit Hooks: We use pre-commit hooks to ensure code quality. Please install them by running uv run pre-commit install. This ensures tests, linting and other checks pass locally before you can commit.
  3. Squash Commits: Before submitting a pull request, please squash your commits into a single commit.
  4. Signed Commits: Use signed commits.
  5. Branch Naming: Use descriptive branch names like feature/your-feature or fix/issue-number.
  6. Testing: Ensure new features have appropriate test coverage.
  7. Documentation: Update documentation to reflect any changes or new features.