[SKV-4350] Add OLLAMA_SUPPORTS_VISION env var, update docs (#4351)
Co-authored-by: Suchintan <suchintan@users.noreply.github.com>
@@ -546,10 +546,11 @@ Recommended `LLM_KEY`: `GEMINI_2.5_PRO_PREVIEW`, `GEMINI_2.5_FLASH_PREVIEW`
 | `ENABLE_OLLAMA` | Register local models via Ollama | Boolean | `true`, `false` |
 | `OLLAMA_SERVER_URL` | URL for your Ollama server | String | `http://host.docker.internal:11434` |
 | `OLLAMA_MODEL` | Ollama model name to load | String | `qwen2.5:7b-instruct` |
+| `OLLAMA_SUPPORTS_VISION` | Enable vision support | Boolean | `true`, `false` |
 
 Recommended `LLM_KEY`: `OLLAMA`
 
-Note: Ollama does not support vision yet.
+Note: Set `OLLAMA_SUPPORTS_VISION=true` for vision models like qwen3-vl, llava, etc.
 
 ##### OpenRouter
 
 | Variable | Description | Type | Sample Value |

@@ -97,6 +97,7 @@ services:
         # - ENABLE_OLLAMA=true
         # - OLLAMA_MODEL=qwen2.5:7b-instruct
         # - OLLAMA_SERVER_URL=http://host.docker.internal:11434
+        # - OLLAMA_SUPPORTS_VISION=false
         # Open Router Support:
         # - ENABLE_OPENROUTER=true
         # - LLM_KEY=OPENROUTER

@@ -2,6 +2,7 @@
 ENABLE_OLLAMA=true
 OLLAMA_SERVER_URL=http://localhost:11434
 OLLAMA_MODEL=llama3.1
+OLLAMA_SUPPORTS_VISION=false
 
 # --- DISABLE OTHER PROVIDERS ---
 ENABLE_OPENAI=false

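The `.env` fragment above can be exercised with a minimal parser. This is an illustrative sketch only — real deployments typically rely on python-dotenv or docker compose to load these variables, and `parse_env` is a hypothetical helper, not Skyvern code:

```python
import io

def parse_env(text: str) -> dict[str, str]:
    """Parse KEY=VALUE lines, skipping blanks and # comments (hypothetical helper)."""
    env: dict[str, str] = {}
    for line in io.StringIO(text):
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments like "# --- DISABLE OTHER PROVIDERS ---"
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """\
ENABLE_OLLAMA=true
OLLAMA_SERVER_URL=http://localhost:11434
OLLAMA_MODEL=llama3.1
OLLAMA_SUPPORTS_VISION=false
"""
print(parse_env(sample)["OLLAMA_MODEL"])  # llama3.1
```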
@@ -33,9 +33,13 @@ OLLAMA_SERVER_URL=http://localhost:11434
 
 # Model name in Ollama (check with `ollama list`)
 OLLAMA_MODEL=llama3.1
+
+# Enable vision support for multimodal models (qwen3-vl, llava, etc.)
+OLLAMA_SUPPORTS_VISION=false
 ```
 
 > Note: Ollama may not support `max_completion_tokens` — Skyvern handles this internally.
+> Set `OLLAMA_SUPPORTS_VISION=true` if using a vision model like qwen3-vl or llava.
 
 ---
 

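The note about `max_completion_tokens` can be illustrated with a parameter-filtering step. This is a hedged sketch of the general technique (dropping request parameters a backend may reject) — the `UNSUPPORTED_BY_OLLAMA` set and `sanitize_params` helper are assumptions for demonstration, not Skyvern's actual internals:

```python
# Hypothetical set of parameters the backend rejects (assumption for illustration).
UNSUPPORTED_BY_OLLAMA = frozenset({"max_completion_tokens"})

def sanitize_params(params: dict, unsupported: frozenset = UNSUPPORTED_BY_OLLAMA) -> dict:
    """Return a copy of params without keys the target backend rejects."""
    return {k: v for k, v in params.items() if k not in unsupported}

request = {"model": "ollama/llama3.1", "max_completion_tokens": 512, "temperature": 0.2}
print(sanitize_params(request))  # the max_completion_tokens key is dropped
```

Filtering into a copy (rather than mutating the caller's dict) keeps the original request intact for providers that do accept the parameter.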
@@ -119,5 +123,5 @@ curl -s http://localhost:4000/v1/models -H "Authorization: Bearer sk-test" | j
 
 ## Internal References
 
-- Ollama vars: `ENABLE_OLLAMA`, `OLLAMA_SERVER_URL`, `OLLAMA_MODEL`
+- Ollama vars: `ENABLE_OLLAMA`, `OLLAMA_SERVER_URL`, `OLLAMA_MODEL`, `OLLAMA_SUPPORTS_VISION`
 - OpenAI-compatible vars: `ENABLE_OPENAI_COMPATIBLE`, `OPENAI_COMPATIBLE_MODEL_NAME`, `OPENAI_COMPATIBLE_API_KEY`, `OPENAI_COMPATIBLE_API_BASE`, `OPENAI_COMPATIBLE_API_VERSION`, `OPENAI_COMPATIBLE_SUPPORTS_VISION`, `OPENAI_COMPATIBLE_REASONING_EFFORT`

@@ -318,6 +318,7 @@ class Settings(BaseSettings):
     ENABLE_OLLAMA: bool = False
     OLLAMA_SERVER_URL: str | None = None
     OLLAMA_MODEL: str | None = None
+    OLLAMA_SUPPORTS_VISION: bool = False
 
     # OPENROUTER
     ENABLE_OPENROUTER: bool = False

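A boolean setting like `OLLAMA_SUPPORTS_VISION: bool = False` is populated from the environment by pydantic's `BaseSettings`; the string-to-bool coercion it performs can be sketched with the stdlib alone. The `env_bool` helper below is illustrative, not Skyvern code:

```python
import os

# Accepted spellings, mirroring common boolean env-var conventions (assumption).
_TRUE = {"1", "true", "yes", "on"}
_FALSE = {"0", "false", "no", "off"}

def env_bool(name: str, default: bool = False) -> bool:
    """Read an environment variable and coerce it to a bool (illustrative helper)."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    value = raw.strip().lower()
    if value in _TRUE:
        return True
    if value in _FALSE:
        return False
    raise ValueError(f"{name} must be a boolean-like string, got {raw!r}")

os.environ["OLLAMA_SUPPORTS_VISION"] = "true"
print(env_bool("OLLAMA_SUPPORTS_VISION"))  # True
```

Falling back to a default when the variable is unset matches the `= False` defaults in the `Settings` class above.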
@@ -1424,7 +1424,7 @@ if settings.ENABLE_OLLAMA:
         LLMConfig(
             f"ollama/{ollama_model_name}",
             ["OLLAMA_SERVER_URL", "OLLAMA_MODEL"],
-            supports_vision=False,  # Ollama does not support vision yet
+            supports_vision=settings.OLLAMA_SUPPORTS_VISION,
             add_assistant_prefix=False,
             litellm_params=LiteLLMParams(
                 api_base=settings.OLLAMA_SERVER_URL,
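The effect of a `supports_vision` flag on a model config can be sketched as gating whether image content is attached to an outgoing message. This is a minimal illustration of the pattern, not Skyvern's actual `LLMConfig` or message-building code — the dataclass fields and `build_user_message` helper are assumptions:

```python
from dataclasses import dataclass

@dataclass
class LLMConfig:
    """Illustrative stand-in for a provider config (not Skyvern's real class)."""
    model_name: str
    required_env: list[str]
    supports_vision: bool = False

def build_user_message(text: str, screenshots: list[bytes], config: LLMConfig) -> dict:
    """Attach screenshots only when the configured model accepts images (sketch)."""
    if config.supports_vision and screenshots:
        content: list[dict] = [{"type": "text", "text": text}]
        content += [{"type": "image", "data": img} for img in screenshots]
        return {"role": "user", "content": content}
    # Text-only fallback when the model is not vision-capable.
    return {"role": "user", "content": text}

vision_cfg = LLMConfig("ollama/llava", ["OLLAMA_SERVER_URL"], supports_vision=True)
msg = build_user_message("Describe the page", [b"\x89PNG"], vision_cfg)
print(len(msg["content"]))  # 2: one text part, one image part
```

Wiring the flag through the config, as the hunk above does with `settings.OLLAMA_SUPPORTS_VISION`, lets one code path serve both text-only and multimodal Ollama models.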