Ollama API Connection Failure (404 Not Found) at api.llm.gestaltservers.com #30
The asset processor tool's Python script (Asset-Frameworker/gui/llm_prediction_handler.py) is unable to connect to the self-hosted Ollama instance expected at https://api.llm.gestaltservers.com. All attempts to call standard Ollama API endpoints return HTTP 404 errors.
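For reference, a minimal sketch of what "standard Ollama API endpoints" means here. The base URL is taken from this issue; the helper function and the model name "llama3" are illustrative assumptions, not values confirmed from the script.

```python
import json

BASE_URL = "https://api.llm.gestaltservers.com"

# Standard Ollama paths: the native API plus the OpenAI-compatible shim.
CANDIDATE_PATHS = [
    "/api/tags",             # GET: lists installed models; cheapest liveness check
    "/api/generate",         # POST: native completion endpoint
    "/v1/chat/completions",  # POST: OpenAI-compatible chat endpoint
]

def build_probe(path: str) -> dict:
    """Return the URL and a minimal valid request for one candidate endpoint.

    The model name "llama3" is a placeholder assumption.
    """
    if path == "/api/tags":
        return {"method": "GET", "url": BASE_URL + path, "body": None}
    if path == "/api/generate":
        body = {"model": "llama3", "prompt": "ping", "stream": False}
    else:  # /v1/chat/completions
        body = {"model": "llama3",
                "messages": [{"role": "user", "content": "ping"}]}
    return {"method": "POST", "url": BASE_URL + path, "body": json.dumps(body)}

for path in CANDIDATE_PATHS:
    probe = build_probe(path)
    print(probe["method"], probe["url"])
```

Probing /api/tags first is useful because a 404 on a bare GET rules out request-body problems entirely and points straight at routing.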
Environment:
Steps Tried & Observations:
Issue:
There is conflicting evidence. The script consistently receives 404 errors for every Ollama API path tested (/api/generate, /v1/chat/completions), which suggests the server is not routing these requests at all. However, an earlier (malformed) curl test against /v1/chat/completions returned a 400 rather than a 404, hinting that the route may in fact be configured and that the request body was simply rejected. It is unclear whether the problem is purely server-side (reverse proxy or Ollama service configuration) or whether the script's requests differ subtly from curl's.
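The 404-vs-400 distinction above can be encoded as a small triage helper: a 404 points at routing (reverse proxy or Ollama base path), while a 400 means the route exists but the body was rejected. This is a hypothetical sketch of the reasoning, not code from the script.

```python
def triage(status: int) -> str:
    """Map an HTTP status from the Ollama endpoint to a likely cause."""
    if status == 404:
        # The reverse proxy (or Ollama itself) never matched the path.
        return "route not found: check reverse-proxy path mapping and Ollama base path"
    if status == 400:
        # The path was routed, but the payload failed validation.
        return "route exists but body rejected: compare the script's JSON payload with a known-good curl body"
    if status in (401, 403):
        return "route exists but auth failed: check API key or proxy auth headers"
    if 200 <= status < 300:
        return "endpoint reachable"
    return f"unexpected status {status}: inspect server logs"

print(triage(404))  # the script's consistent result
print(triage(400))  # the malformed curl's result
```

Under this reading, the script's 404s and curl's 400 cannot come from the same request: either the paths differ (trailing slash, base-path prefix) or an intermediary routes the two clients differently, so capturing both raw requests side by side would be the decisive test.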
Next Diagnostic Steps:
Action Items: