Merged
26 commits
833fb95  Isolate demo dependencies and pin orjson for CVE-2025-67221 mitigatio… (evcyen, Feb 17, 2026)
50b5540  feat: Add LIT integration for interactive model analysis (#121) (#1163) (HetanshWaghela, Feb 17, 2026)
affe77a  fix: set n_ctx=512 for TinyStories models (#1162) (puranikyashaswin, Feb 18, 2026)
ebbb965  Fix/tokenize and concatenate invalid token (#1179) (evcyen, Feb 19, 2026)
a494811  Remove spurious warning for tokenize_and_concatenate (#1177) (evcyen, Feb 20, 2026)
bdedee7  Add MMLU benchmark evaluation to evals (#1183) (CarlG0123, Mar 12, 2026)
8e53661  Fix/1076 logit lens layer norm (#1180) (evcyen, Mar 13, 2026)
8f6cdd3  Updating Interactive Neuroscope, CI to properly install demo (#1205) (jlarson4, Mar 16, 2026)
4a5cc6f  Fix tokenize_and_concatenate splitting tokens across chunk boundaries… (brainsnog, Mar 16, 2026)
7784be1  Fix deprecated IPython magic() calls in demo notebooks (issue #1036) … (brainsnog, Mar 16, 2026)
0199ef8  Expose n_ctx override in HookedTransformer.from_pretrained (issue #10… (brainsnog, Mar 16, 2026)
3f1b19b  Added warning flags for usages of MPS (#1182) (jlarson4, Mar 16, 2026)
7b3929b  Add GPT-OSS-20B model support (#1195) (CarlG0123, Mar 16, 2026)
28d7a56  fixed the logit lens implementation inside ActivationCache.accumulate… (hartigel, Mar 16, 2026)
e15d32d  Add Apertus model support with XIeLU activation (#1197) (sinievanderben, Mar 18, 2026)
aa59475  Updating Apertus setup to properly handle QK norm weights, and proper… (jlarson4, Mar 20, 2026)
cd07b6b  CI format checks (jlarson4, Mar 20, 2026)
b44d9e4  Updating Transformers version so we can run Apertus and GPT OSS (jlarson4, Mar 20, 2026)
1382b2e  lock file regenerated (jlarson4, Mar 20, 2026)
6a2e635  Docs cleanup (jlarson4, Mar 21, 2026)
f7325c6  Fix attention calculation on mps for torch 2.8.0 (#1068) (BrownianNotion, Mar 22, 2026)
a48c4e2  HuBERT support rollout (#1111) (david-wei-01001, Mar 23, 2026)
995f336  Adding additional testing for HuBERT, Apertus, and GPTOSS (#1210) (jlarson4, Mar 23, 2026)
1865f06  Fixing doc strings for Build Docs (jlarson4, Mar 23, 2026)
d367b7f  Fix backward hooks Runtime Error (#1175) (evcyen, Mar 23, 2026)
251bfe6  Gemma test fix (jlarson4, Mar 23, 2026)
2 changes: 1 addition & 1 deletion .github/workflows/checks.yml
@@ -174,7 +174,7 @@ jobs:
       - name: Install dependencies
         run: |
           poetry check --lock
-          poetry install --with dev,jupyter
+          poetry install --with dev,jupyter,demo
       - name: Install pandoc
         uses: awalsh128/cache-apt-pkgs-action@latest
         with:
4 changes: 2 additions & 2 deletions demos/ARENA_Content.ipynb
@@ -32,8 +32,8 @@
 "\n",
 " ipython = get_ipython()\n",
 " # Code to automatically update the HookedTransformer code as its edited without restarting the kernel\n",
-" ipython.magic(\"load_ext autoreload\")\n",
-" ipython.magic(\"autoreload 2\")\n",
+" ipython.run_line_magic(\"load_ext\", \"autoreload\")\n",
+" ipython.run_line_magic(\"autoreload\", \"2\")\n",
 "\n",
 "if IN_GITHUB or IN_COLAB:\n",
 " %pip install torch\n",
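The notebook diffs in this PR all apply the same mechanical migration away from the deprecated `ipython.magic()` API: the single magic string splits at its first space into the (name, argument) pair that `run_line_magic()` expects. A minimal sketch of that mapping — the `split_magic` helper is hypothetical, for illustration only, and is not part of IPython or this repository:

```python
def split_magic(line: str) -> tuple[str, str]:
    """Hypothetical helper: split a legacy ipython.magic("...") string
    into the (name, arg) pair taken by ipython.run_line_magic().

    Deprecated form:  ipython.magic("autoreload 2")
    Current form:     ipython.run_line_magic("autoreload", "2")
    """
    # Everything before the first space is the magic's name;
    # the remainder (possibly empty) is its argument string.
    name, _, arg = line.partition(" ")
    return name, arg
```

Inside an IPython session, `ipython.run_line_magic(*split_magic("load_ext autoreload"))` would then be equivalent to the old single-string call, which is exactly the substitution made in each notebook below.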
4 changes: 2 additions & 2 deletions demos/Activation_Patching_in_TL_Demo.ipynb
@@ -68,8 +68,8 @@
 "\n",
 " ipython = get_ipython()\n",
 " # Code to automatically update the HookedTransformer code as its edited without restarting the kernel\n",
-" ipython.magic(\"load_ext autoreload\")\n",
-" ipython.magic(\"autoreload 2\")"
+" ipython.run_line_magic(\"load_ext\", \"autoreload\")\n",
+" ipython.run_line_magic(\"autoreload\", \"2\")"
 ]
 },
 {
2 changes: 1 addition & 1 deletion demos/Attribution_Patching_Demo.ipynb

Large diffs are not rendered by default.

4 changes: 2 additions & 2 deletions demos/BERT.ipynb
@@ -80,8 +80,8 @@
 "\n",
 " ipython = get_ipython()\n",
 " # Code to automatically update the HookedTransformer code as its edited without restarting the kernel\n",
-" ipython.magic(\"load_ext autoreload\")\n",
-" ipython.magic(\"autoreload 2\")\n",
+" ipython.run_line_magic(\"load_ext\", \"autoreload\")\n",
+" ipython.run_line_magic(\"autoreload\", \"2\")\n",
 "\n",
 "if IN_COLAB:\n",
 " %pip install transformer_lens\n",