
detection-validator

Validates Sigma detection rules against paired JSON test samples — no Splunk required.

Given a Sigma rule and two JSON files (malicious events, benign events), it asserts:

  • At least one malicious event matches the detection logic
  • Zero benign events match the detection logic
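
The two assertions above can be sketched in a few lines. This is a hypothetical illustration, not the tool's internal API: `matches` stands in for the Sigma condition evaluator described later.

```python
# Hypothetical sketch of the two core assertions; `matches` stands in
# for the evaluator's "does this event satisfy the rule?" predicate.
def validate(matches, rule, malicious_events, benign_events):
    mal_hits = sum(1 for e in malicious_events if matches(rule, e))
    ben_hits = sum(1 for e in benign_events if matches(rule, e))
    ok = mal_hits >= 1 and ben_hits == 0   # >=1 malicious, 0 benign
    return ok, mal_hits, ben_hits
```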

Conversion through sigma-to-spl is also checked on every rule: if the rule fails to convert or produces MANUAL: warnings, that's surfaced in the output.

Install

pip install -e .
# sigma-to-spl must also be installed for conversion checks
pip install -e ../sigma-to-spl

Usage

# Test all rules, auto-discover validator.yml and test-data paths
detection-validator test rules/

# Test a single rule
detection-validator test rules/network/dns-tunneling-high-entropy-subdomains.yml

# Explicit sigma-to-spl config (Corelight/Zeek field mappings)
detection-validator test rules/ --config config/corelight.yml

# Explicit test-data directory (single rule only)
detection-validator test rules/network/dns-tunneling.yml --test-data ../detection-notes/detections/network/dns/test-data/

# Explicit validator.yml (overrides auto-discovery)
detection-validator test rules/ --mappings config/validator.yml

Output

PASS    network/dns-tunneling-high-entropy-subdomains
         conversion: clean
         malicious: 3/4 matched  benign: 0/5 matched

WARN    network/statistical-beaconing-zeek-conn-log
         conversion: clean
         manual: MANUAL: rule has SPL-only additions not expressible in Sigma — see detection writeup
         malicious: 10/10 matched  benign: 6/12 matched
         benign FPs: 6/12 — SPL additions handle these

SKIP    identity/oauth-device-code-phishing-sigmahq
         no test-data directory

FAIL    windows/some-rule
         malicious: 0/3 matched  <- detection gap

10/11 passed  1 skipped  0 failed

Exit codes: 0 = all pass/warn/skip, 1 = any FAIL or ERROR, 2 = configuration error.

Status codes

Status   Meaning
PASS     Conversion clean, malicious matched, no benign matched
WARN     Assertions pass but rule has MANUAL: warnings (SPL adds filtering beyond what Sigma expresses) — benign FPs are expected
SKIP     No test-data configured, or aggregation condition that requires a time window
FAIL     Detection gap (malicious not matched), false positive on a clean rule, or conversion error
ERROR    YAML parse failure or sample load failure

A WARN is not a CI failure. Detection gaps always fail, regardless of MANUAL: status.
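
The table above can be condensed into a small decision function. The name and parameters here are hypothetical, not the tool's internals; ERROR (YAML or sample load failures) is assumed to be raised earlier and is not modeled.

```python
# Illustrative mapping from the status table to code; names are
# hypothetical. ERROR (parse/load failures) is raised earlier.
def status(conversion_error, manual, mal_matched, ben_matched,
           has_test_data, needs_time_window):
    if not has_test_data or needs_time_window:
        return "SKIP"
    if conversion_error or mal_matched == 0:
        return "FAIL"   # conversion error, or a detection gap (always fails)
    if ben_matched > 0:
        return "WARN" if manual else "FAIL"  # FPs tolerated only on MANUAL: rules
    return "WARN" if manual else "PASS"
```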

How it works

Sigma-native evaluation

The evaluator works directly on Sigma's detection YAML — not on the SPL output. This keeps it infrastructure-free and tests the source of truth, not a derived artifact.

The detection block is parsed into a condition AST (recursive descent parser handling and, or, not, 1 of, all of). Each named selection block is evaluated against the event using the field modifiers from the rule:

Modifier        Behavior
(none)          Case-insensitive equality; * as value = field exists with any value
|contains       Substring match
|startswith     Prefix match
|endswith       Suffix match
|contains|all   All values must appear in field (AND)
|cidr           IP/subnet membership via Python ipaddress
|fieldref       Compare field value to another field's value in the same event
|re             Regex match
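
A simplified sketch of per-modifier matching, mirroring the table above. This is illustrative only: real Sigma modifiers also handle value lists, nulls, and wildcard escaping, and `|contains|all` and `|fieldref` need the full value list and event, so they are omitted here.

```python
import ipaddress
import re

# Simplified per-modifier matching (illustrative, not the tool's API).
# Omitted: value lists, nulls, escaping, |contains|all, |fieldref.
def match_value(modifier, field_value, rule_value):
    s = str(field_value)
    if modifier is None:
        if rule_value == "*":
            return True                       # field exists with any value
        return s.lower() == str(rule_value).lower()
    if modifier == "contains":
        return str(rule_value).lower() in s.lower()
    if modifier == "startswith":
        return s.lower().startswith(str(rule_value).lower())
    if modifier == "endswith":
        return s.lower().endswith(str(rule_value).lower())
    if modifier == "cidr":
        return ipaddress.ip_address(s) in ipaddress.ip_network(rule_value)
    if modifier == "re":
        return re.search(rule_value, s) is not None
    raise ValueError(f"unsupported modifier: {modifier}")
```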

Field resolution uses literal-key lookup first (Zeek flat fields like id.orig_h), then falls back to nested dict traversal (Azure structured fields like DeviceDetail.isCompliant). This handles both log formats correctly without special-casing.
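
The two-step lookup can be sketched as follows; the function name is an assumption, but the order (literal key first, then dotted traversal) follows the description above.

```python
# Sketch of two-step field resolution: literal key first (Zeek flat
# fields like "id.orig_h"), then dotted nested traversal (Azure
# structured fields like "DeviceDetail.isCompliant").
def resolve_field(event, field):
    if field in event:                  # literal key wins
        return event[field]
    cur = event
    for part in field.split("."):       # fall back to nested traversal
        if isinstance(cur, dict) and part in cur:
            cur = cur[part]
        else:
            return None                 # field absent
    return cur
```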

Conversion check

Before evaluation, each rule is run through sigma-to-spl's Converter. Two outcomes flag a rule:

  • Conversion error → FAIL with the exception message
  • MANUAL: in output → rule is marked for lenient evaluation (FPs become WARN instead of FAIL)

MANUAL: warnings are emitted by sigma-to-spl's PostProcessor in two cases:

  1. The rule's logsource category is dns (entropy scoring required)
  2. The raw rule YAML contains a # NOTE: comment (SPL-only logic documented inline)

Rules that document SPL-only additions with # NOTE: are treated as intentionally broad at the Sigma tier — their false positives are expected and handled by the SPL.
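
Because parsed YAML no longer contains comments, the marker has to be detected in the raw file text. A minimal sketch (the function name is hypothetical):

```python
# The "# NOTE:" marker survives only in the raw text; yaml.safe_load
# (and pySigma) drop comments during parsing, so check before parsing.
def has_manual_marker(raw_yaml: str) -> bool:
    return "# NOTE:" in raw_yaml
```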

What it cannot test

  • Aggregation conditions — rules using count by, stats, streamstats require a time window and are automatically skipped
  • SPL-only filtering — risk scoring, lookup-based suppression, eval/rex transformations; these are covered by the MANUAL: / WARN path
  • Field extraction — transforms applied by the PostProcessor (index routing, macro substitution) are not reflected in the Sigma condition
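
Spotting the aggregation conditions that force a SKIP could look like the sketch below. The token list is a guess based on the examples above (count by, stats, streamstats), not the tool's actual detection logic.

```python
# Illustrative check for conditions that need a time window and are
# therefore auto-skipped; the token list is an assumption.
AGG_TOKENS = ("count(", "count by", "stats")

def needs_time_window(condition: str) -> bool:
    c = condition.lower()
    return any(tok in c for tok in AGG_TOKENS)
```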

Test data format

Each detection needs two JSON files alongside the writeup:

test-data/
  malicious-sample.json   # events the rule MUST match (≥1 required)
  benign-sample.json      # events the rule MUST NOT match (0 allowed, unless MANUAL:)

Files can contain a single JSON object or a JSON array. Use _comment keys freely — the evaluator ignores unknown fields. Events that the Sigma rule legitimately fires on but the SPL suppresses (known FPs) belong in benign-sample.json only for rules that carry a # NOTE: / MANUAL: marker.
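
Loading a sample file tolerant of both shapes is a one-normalization affair; a minimal sketch, assuming a hypothetical `load_samples` helper:

```python
import json

# Sketch of tolerant sample loading: a file may hold one JSON object
# or an array of them. "_comment" keys need no stripping, because the
# evaluator ignores fields the rule does not reference.
def load_samples(path):
    with open(path) as f:
        data = json.load(f)
    return data if isinstance(data, list) else [data]
```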

Rule → test-data mapping

The tool auto-discovers config/validator.yml by walking up from the rules directory. The config maps rule slugs to test-data paths:

rules:
  network/dns-tunneling-high-entropy-subdomains:
    test_data: ../../detection-notes/detections/network/dns/test-data

  # Intentionally excluded (upstream mirror — no local test data)
  identity/oauth-device-code-phishing-sigmahq:
    test_data:

Paths are resolved relative to validator.yml. Rules with test_data: null are treated as explicitly excluded (not flagged as missing by the coverage check).
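
The resolution rules can be sketched as below, assuming the YAML has already been parsed into a dict; the function name is hypothetical.

```python
from pathlib import Path

# Sketch of mapping resolution: test-data paths are relative to
# validator.yml itself, and a null test_data marks the rule as
# intentionally excluded rather than missing.
def resolve_mapping(validator_yml: Path, rules_cfg: dict):
    base = validator_yml.parent
    resolved, excluded = {}, []
    for slug, entry in rules_cfg.items():
        td = (entry or {}).get("test_data")
        if td is None:
            excluded.append(slug)               # explicit exclusion
        else:
            resolved[slug] = (base / td).resolve()
    return resolved, excluded
```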

CI integration

In sigma-to-spl's GitHub Actions workflow, a validate job checks out all three repos as siblings so relative paths in validator.yml resolve identically to local development:

$GITHUB_WORKSPACE/
  sigma-to-spl/       ← main checkout (rules, config)
  detection-notes/    ← test-data source
  detection-validator/← tool

The validate job runs after the convert job and gates the PR. A failed detection assertion blocks merge.

Design decisions

Why Sigma-native instead of SPL evaluation? pySigma is a compiler, not a runtime — there's no built-in "does this event match this rule?" function. Writing a SPL subset evaluator would test a derived artifact (the conversion output) rather than the source of truth. The Sigma condition is what the engineer wrote; that's what should be tested.

Why WARN for MANUAL: rules instead of skipping? Skipping would hide detection gaps. Rules with SPL additions still need to fire on malicious events at the Sigma tier — the SPL only adds filtering of false positives. A WARN confirms the base detection works while acknowledging the SPL handles the precision gap.

Why the # NOTE: convention? pySigma strips YAML comments during parsing, so there's no other way to communicate "this rule is intentionally incomplete" from rule metadata. A # NOTE: in the detection block is a visible, low-ceremony signal that carries context for both the detection author and the tooling.
