
Avoid unsafe graph cloning in calc path#9610

Open
Nostrademous wants to merge 12 commits into PathOfBuildingCommunity:dev from Nostrademous:ChatGPT_Improvements

Conversation

@Nostrademous
Contributor

Patch set P1: make source assignment non-mutating

Files

  • src/Modules/ModTools.lua
  • src/Modules/CalcPerform.lua
  • src/Modules/CalcSetup.lua

Changes

  1. Add a new helper:
    • modLib.withSource(mod, source)
    • It returns a safe copy of the mod with source applied.
  2. Keep modLib.setSource() temporarily for compatibility, but mark it internal-only / deprecated.
  3. Convert all callers that may touch shared modifier objects to withSource():
    • applyEnemyModifiers()
    • socket/group property application
    • keystone merge path
    • item mirroring / copied item-mod paths
  4. Rule: never mutate in place a mod returned from parser data, skill data, or cached tables.

Suggested implementation

-- Return a copy of `mod` with `source` applied; the original mod is left untouched.
function modLib.withSource(mod, source)
    local copy = copyTableSafe(mod, false) -- deep copy (noRecurse = false), so nested tables are not shared
    copy.source = source
    -- Nested granted mods carry their own source label, so relabel those too
    if type(copy.value) == "table" and copy.value.mod then
        copy.value.mod.source = source
    end
    return copy
end
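
To make the non-mutating contract concrete, here is a self-contained sketch of the helper together with a caller-side before/after. The `copyTableSafe` body, the flat mod shape, and the "Keystone"/"Tree" labels are illustrative assumptions, not code from this branch; the real `copyTableSafe` must also guard against graph references, which this naive version does not.

```lua
-- Minimal sketch, assuming a plain-table mod shape; not the real PoB modules.
local modLib = { }

-- Naive deep copy for illustration only (assumes tree-shaped input, no cycles).
local function copyTableSafe(tbl, noRecurse)
    local out = { }
    for k, v in pairs(tbl) do
        if type(v) == "table" and not noRecurse then
            out[k] = copyTableSafe(v, false)
        else
            out[k] = v
        end
    end
    return out
end

function modLib.withSource(mod, source)
    local copy = copyTableSafe(mod, false)
    copy.source = source
    if type(copy.value) == "table" and copy.value.mod then
        copy.value.mod.source = source
    end
    return copy
end

-- Before: modLib.setSource(mod, "Keystone") mutated the shared table,
-- leaking the label into every other user of that mod.
-- After: the shared mod is left untouched.
local shared = { name = "Speed", value = 10, source = "Tree" }
local labeled = modLib.withSource(shared, "Keystone")
print(shared.source, labeled.source) --> Tree   Keystone
```

This is the property the P1 acceptance criteria test for: repeated calc runs read `shared` again and still see its original source.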

Acceptance

  • Source labels are still correct in breakdowns / UI.
  • Repeated calc runs do not leak source labels between skills or items.
  • Kalandra / socket property / keystone propagation still works.

Patch set P2: stop deep-copying graph objects blindly

Files

  • src/Modules/Common.lua
  • src/Classes/ModStore.lua
  • src/Modules/CalcActiveSkill.lua
  • src/Modules/CalcPerform.lua

Changes

  1. Establish a rule in code comments and callers:
    • env, activeSkill, ModDB, ModList, and cache entries are graph objects, not safe deep-copy targets.
  2. Replace copyTable() with copyTableSafe() only at graph-risk call sites:
    • ModStore:ScaleAddMod()
    • any copy of mod.value / mod.value.mod that may originate from shared data
  3. Do not globally replace every copyTable() call; keep hot-path tree copies as-is unless they can receive graph objects.
  4. Add one debug helper used only in development builds:
    • assertNoSelfReference(obj, label) or graphNodeTag checks on cache/skill objects.
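
The development-only helper in step 4 could look roughly like this. The name matches the bullet above, but the traversal strategy is an assumption; it flags cycles (a table reachable from itself), which is exactly what graph objects like `env` or `activeSkill` would trip, though it does not detect acyclic sharing and may be slow on large graphs.

```lua
-- Dev-only check: error out if `obj` contains a self-reference, i.e. some
-- table is reachable from itself along the current traversal path.
local function assertNoSelfReference(obj, label)
    local onPath = { } -- tables on the current descent path
    local function walk(t)
        if onPath[t] then
            error((label or "object") .. " contains a self-reference", 0)
        end
        onPath[t] = true
        for _, v in pairs(t) do
            if type(v) == "table" then
                walk(v)
            end
        end
        onPath[t] = nil
    end
    walk(obj)
end

-- A tree-shaped table passes; a cyclic one does not.
local safe = { a = { b = 1 } }
assertNoSelfReference(safe, "safe")        -- no error
local cyclic = { }
cyclic.self = cyclic
local ok = pcall(assertNoSelfReference, cyclic, "cyclic")
print(ok) --> false
```

Wiring this behind a development-build flag keeps it out of hot paths while still catching graph objects handed to copy routines.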

Acceptance

  • Scaling mods with nested value.mod tables no longer risks recursive copy overflow.
  • No measurable regression in common non-FullDPS single-skill calc time.

Patch set P3: replace copyActiveSkill() with a snapshot-based clone path

Files

  • src/Modules/CalcActiveSkill.lua
  • src/Modules/CalcTriggers.lua
  • src/Modules/CalcMirages.lua (if applicable in your branch)

Changes

  1. Introduce calcs.snapshotActiveSkill(skill) returning only immutable / scalar fields needed to rebuild a calc-local skill:
    • granted effect id/reference
    • gem level / quality / qualityId
    • selected part / socket selection
    • support identifiers (not live support objects)
    • trigger-related flags
  2. Introduce calcs.rebuildSkillFromSnapshot(env, snapshot, actor) that resolves the skill against the new env.
  3. Deprecate direct graph-copy semantics in copyActiveSkill().
  4. Keep srcInstance out of the snapshot unless you use only a stable scalar key; do not carry the live object reference across clone boundaries.
  5. Where callers only need a “temporary calc skill”, rebuild from snapshot instead of cloning the live skill table.
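
A rough shape for the snapshot pair in steps 1 and 2. Field names beyond those listed above, and the resolve step, are placeholders; the real rebuild would have to go through the env's normal skill-setup path rather than the stub shown here. The key property is that only scalars and identifiers cross the clone boundary.

```lua
local calcs = { }

-- Capture only scalar / identifier fields; no live tables cross the boundary.
function calcs.snapshotActiveSkill(skill)
    local supportIds = { }
    for i, support in ipairs(skill.supportList or { }) do
        supportIds[i] = support.id -- identifier only, not the live support object
    end
    return {
        grantedEffectId = skill.grantedEffectId,
        level = skill.level,
        quality = skill.quality,
        qualityId = skill.qualityId,
        skillPart = skill.skillPart,
        supportIds = supportIds,
        triggered = skill.triggered,
        -- deliberately no srcInstance, actor, socketGroup, or summonSkill references
    }
end

-- Placeholder resolve step: a real version would look granted effects and
-- supports up against the new env instead of carrying old objects across.
function calcs.rebuildSkillFromSnapshot(env, snapshot, actor)
    local skill = {
        grantedEffectId = snapshot.grantedEffectId,
        level = snapshot.level,
        quality = snapshot.quality,
        qualityId = snapshot.qualityId,
        skillPart = snapshot.skillPart,
        triggered = snapshot.triggered,
        actor = actor,
        supportList = { },
    }
    for i, id in ipairs(snapshot.supportIds) do
        skill.supportList[i] = env.resolveSupport and env.resolveSupport(id) or { id = id }
    end
    return skill
end

-- Round trip: the snapshot drops live references, the rebuild re-resolves them.
local fake = {
    grantedEffectId = "Fireball", level = 20, quality = 10,
    supportList = { { id = "SupportAoE", actor = { } } },
}
local snap = calcs.snapshotActiveSkill(fake)
local rebuilt = calcs.rebuildSkillFromSnapshot({ }, snap, { name = "player" })
```

Because the snapshot is plain data, it can be copied or cached freely without growing the object graph, which is what the P3 acceptance criteria check.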

Why

copyActiveSkill() currently carries supportList, actor, socketGroup, summonSkill, and srcInstance references into the clone path, which defeats the purpose of an isolated copy: the "clone" still shares live graph objects with the original skill.

Acceptance

  • Mirage / trigger / minion / multipart skills still calculate correctly.
  • Repeated clone/rebuild operations do not grow the object graph or overflow the stack.

