diff --git a/docs/reference/cli/evcc.md b/docs/reference/cli/evcc.md
index a53326d16..f28b4a354 100644
--- a/docs/reference/cli/evcc.md
+++ b/docs/reference/cli/evcc.md
@@ -18,7 +18,6 @@ evcc [flags]
       --ignore-db         Run command ignoring service database
   -l, --log string        Log level (fatal, error, warn, info, debug, trace) (default "info")
       --log-headers       Log headers
-      --mcp               Expose MCP service (experimental)
       --metrics           Expose metrics
       --profile           Expose pprof profiles
       --template string   Add custom template file (debug only)
diff --git a/i18n/en/docusaurus-plugin-content-blog/2025-07-30/highlights-config-ui-feedin-ai.mdx b/i18n/en/docusaurus-plugin-content-blog/2025-07-30/highlights-config-ui-feedin-ai.mdx
index f9933e321..a6bc849cf 100644
--- a/i18n/en/docusaurus-plugin-content-blog/2025-07-30/highlights-config-ui-feedin-ai.mdx
+++ b/i18n/en/docusaurus-plugin-content-blog/2025-07-30/highlights-config-ui-feedin-ai.mdx
@@ -177,7 +177,7 @@ More about this [here](https://github.com/evcc-io/evcc/issues/21747).
 With the [Model Context Protocol](https://en.wikipedia.org/wiki/Model_Context_Protocol) (MCP for short), it's possible to give LLMs like Claude, Gemini, and ChatGPT structured access to external systems, such as evcc.
-With the [CLI flag](https://docs.evcc.io/docs/reference/cli/evcc) `--mcp`, you can activate an experimental MCP server when starting evcc.
+When experimental features are activated, an experimental [MCP](https://docs.evcc.io/en/docs/integrations/mcp) server is enabled in evcc.
 You can include the new endpoint (e.g., `http://evcc.local:7070/mcp`) in your LLM's configuration.
 The topic of MCP and the available tools are still very young and constantly changing.
diff --git a/i18n/en/docusaurus-plugin-content-docs/current/integrations/mcp.mdx b/i18n/en/docusaurus-plugin-content-docs/current/integrations/mcp.mdx
index 3164b2014..0177b5618 100644
--- a/i18n/en/docusaurus-plugin-content-docs/current/integrations/mcp.mdx
+++ b/i18n/en/docusaurus-plugin-content-docs/current/integrations/mcp.mdx
@@ -15,11 +15,7 @@ The MCP server is experimental and may change at any time.
 With the [Model Context Protocol](https://en.wikipedia.org/wiki/Model_Context_Protocol) (MCP for short), it is possible to give LLMs like Claude, Gemini, and ChatGPT structured access to external systems, such as evcc.
 
-Start evcc with the MCP server enabled:
-
-```bash
-evcc --mcp
-```
+The MCP server is enabled on startup when experimental features are activated.
 
 ## Usage with Claude Code
@@ -42,11 +38,7 @@ Of course, you can also use other LLMs like Gemini or ChatGPT.
    claude mcp add --transport http evcc http://localhost:7070/mcp
    ```
 
-4. Make sure your evcc instance is running with the MCP server:
-
-   ```bash
-   evcc --mcp
-   ```
+4. Make sure your evcc instance is running with the MCP server by checking the MCP card in the UI configuration.
 
 5. Start Claude Code and enter a query.