feat: route all @mentions (@copilot, @lucidia, @blackboxprogramming, @ollama) to local Ollama #10
Conversation
Co-authored-by: blackboxprogramming <118287761+blackboxprogramming@users.noreply.github.com>
This pull request sets up GitHub code scanning for this repository. Once the scans have completed and the checks have passed, the analysis results for this pull request branch will appear on this overview. Once you merge this pull request, the 'Security' tab will show more code scanning analysis results (for example, for the default branch). Depending on your configuration and choice of analysis tool, future pull requests will be annotated with code scanning analysis results. For more information about GitHub code scanning, check out the documentation.
Pull request overview
This PR adds a complete local Ollama chat integration to what was previously a repository with no runnable application. All @mentions (@copilot, @lucidia, @blackboxprogramming, @ollama) are routed to a local Ollama instance with zero dependency on external AI providers.
Changes:
- New `ollama.js` module with handle parsing (`parseHandle`), a recognized handles list (`OLLAMA_HANDLES`), and a streaming chat client (`ollamaChat`) targeting the local Ollama HTTP API.
- New `index.html` chat UI with a configurable Ollama base URL/model, a live online/offline status badge, an auto-populated model picker, streaming token-by-token response rendering, and conversation history.
- New `ollama.test.js` with 16 unit tests covering handle detection, trailing-dot stripping, case-insensitivity, unknown handles, and edge cases.
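The handle-parsing behaviour described above can be sketched roughly as follows. Only the names (`OLLAMA_HANDLES`, `parseHandle`) and the described behaviour (case-insensitive matching, trailing-dot stripping, removing the `@handle` from the prompt) come from this PR; the function body itself is an assumption:

```javascript
// Sketch of the routing module's handle parsing (hypothetical body;
// names and behaviour taken from the PR description above).
const OLLAMA_HANDLES = ["copilot", "lucidia", "blackboxprogramming", "ollama"];

function parseHandle(text) {
  const trimmed = text.trim();
  // Match a leading @mention case-insensitively, tolerating a trailing dot
  // (e.g. "@blackboxprogramming.").
  const match = /^@([a-z0-9_-]+)\.?\s*/i.exec(trimmed);
  if (!match) return { handle: null, prompt: trimmed };
  const handle = match[1].toLowerCase();
  // Unknown handles pass through untouched (assumed behaviour in this sketch).
  if (!OLLAMA_HANDLES.includes(handle)) return { handle: null, prompt: trimmed };
  // Strip the @handle so only the remaining prompt is sent to Ollama.
  return { handle, prompt: trimmed.slice(match[0].length).trim() };
}
```

Under these assumptions, `parseHandle("@Ollama. hello")` would yield `{ handle: "ollama", prompt: "hello" }`.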
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| `ollama.js` | Core routing module: canonical handle list, `parseHandle()` for stripping @mentions, and `ollamaChat()` streaming client for the local Ollama HTTP API |
| `index.html` | Full chat UI with settings bar, model picker, streaming responses, conversation history, and brand-compliant styling |
| `ollama.test.js` | 16 unit tests for `OLLAMA_HANDLES` and `parseHandle()` covering all handles, trailing dots, case-insensitivity, and edge cases |
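For context on what a streaming client like `ollamaChat()` does, here is a rough sketch against Ollama's local HTTP API. It assumes the standard `/api/chat` newline-delimited JSON response shape (one `{ "message": { "content": "…" }, "done": false }` object per line); everything beyond the endpoint path and the function name is illustrative, and `extractTokens` is a helper invented for this sketch:

```javascript
// Pure helper (hypothetical): pull streamed tokens out of a buffer of
// newline-delimited JSON chunks as returned by Ollama's /api/chat.
function extractTokens(ndjson) {
  return ndjson
    .split("\n")
    .filter((line) => line.trim())
    .map((line) => JSON.parse(line).message?.content ?? "");
}

// Hypothetical streaming client: POST to /api/chat and surface tokens
// to the UI as they arrive.
async function ollamaChat({ baseUrl = "http://localhost:11434", model = "llama3", messages, onToken }) {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: true }),
  });
  if (!res.ok) throw new Error(`Ollama responded with ${res.status}`);
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let full = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep any partial trailing line for the next chunk
    for (const token of extractTokens(lines.join("\n"))) {
      full += token;
      onToken?.(token);
    }
  }
  return full;
}
```

No API key or external SDK appears anywhere in this flow, which is the point of the PR: the browser talks only to the local Ollama daemon.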
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
The repo had no runnable application. The requirement: every `@copilot`, `@lucidia`, `@blackboxprogramming`, and `@ollama` mention must resolve to a local Ollama instance — zero dependency on external AI providers.

Changes

`ollama.js` — routing + streaming client
- `OLLAMA_HANDLES` — canonical list of recognised handles
- `parseHandle(text)` — case-insensitive, strips trailing dots (`@blackboxprogramming.`), returns `{ handle, prompt }` with the `@handle` removed from the prompt sent to Ollama
- `ollamaChat(...)` — streams directly to `/api/chat` on the local Ollama HTTP API; no external AI SDK or provider

`index.html` — chat UI
- configurable base URL and model (defaults `http://localhost:11434`, `llama3`)
- live online/offline status badge via `/api/tags`; model picker auto-populated from available local models
- every reply is tagged `@handle → ollama` so routing is always visible

`ollama.test.js` — unit tests
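The status badge and auto-populated model picker mentioned above presumably reduce to one probe of `/api/tags`. The sketch below assumes Ollama's documented payload shape (`{ "models": [{ "name": "…" }, …] }`); the helper names `modelNames` and `fetchLocalModels` are invented for illustration:

```javascript
// Pure helper (hypothetical name): list model names from an /api/tags payload.
function modelNames(tagsPayload) {
  return (tagsPayload.models ?? []).map((m) => m.name);
}

// Hypothetical status probe: a successful /api/tags response means Ollama is
// online, and its model list feeds the picker; any failure means offline.
async function fetchLocalModels(baseUrl = "http://localhost:11434") {
  try {
    const res = await fetch(`${baseUrl}/api/tags`);
    if (!res.ok) return { online: false, models: [] };
    return { online: true, models: modelNames(await res.json()) };
  } catch {
    return { online: false, models: [] };
  }
}
```

Polling this on an interval would keep the online/offline badge live without any external service.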