ps method not working through OpenWebUI - ValidationError #497

@ckuethe

Description

As the subject says, the Client.ps method doesn't work through OpenWebUI. I'm using Python 3.10 under Jupyter on Ubuntu 22.04, with ollama-python 0.4.7. The server is running Ubuntu 24.04, ollama 0.6.4, and OpenWebUI 0.6.1 (the latest versions at the time of writing).

I think my library versions are all up to date:

>>> !pip install --upgrade ollama
Requirement already satisfied: ollama in /home/ckuethe/jupyter/lib/python3.10/site-packages (0.4.7)
Requirement already satisfied: pydantic<3.0.0,>=2.9.0 in /home/ckuethe/jupyter/lib/python3.10/site-packages (from ollama) (2.9.2)
Requirement already satisfied: httpx<0.29,>=0.27 in /home/ckuethe/jupyter/lib/python3.10/site-packages (from ollama) (0.28.1)
Requirement already satisfied: anyio in /home/ckuethe/jupyter/lib/python3.10/site-packages (from httpx<0.29,>=0.27->ollama) (4.6.0)
Requirement already satisfied: httpcore==1.* in /home/ckuethe/jupyter/lib/python3.10/site-packages (from httpx<0.29,>=0.27->ollama) (1.0.6)
Requirement already satisfied: certifi in /home/ckuethe/jupyter/lib/python3.10/site-packages (from httpx<0.29,>=0.27->ollama) (2025.1.31)
Requirement already satisfied: idna in /home/ckuethe/jupyter/lib/python3.10/site-packages (from httpx<0.29,>=0.27->ollama) (3.10)
Requirement already satisfied: h11<0.15,>=0.13 in /home/ckuethe/jupyter/lib/python3.10/site-packages (from httpcore==1.*->httpx<0.29,>=0.27->ollama) (0.14.0)
Requirement already satisfied: typing-extensions>=4.6.1 in /home/ckuethe/jupyter/lib/python3.10/site-packages (from pydantic<3.0.0,>=2.9.0->ollama) (4.12.2)
Requirement already satisfied: pydantic-core==2.23.4 in /home/ckuethe/jupyter/lib/python3.10/site-packages (from pydantic<3.0.0,>=2.9.0->ollama) (2.23.4)
Requirement already satisfied: annotated-types>=0.6.0 in /home/ckuethe/jupyter/lib/python3.10/site-packages (from pydantic<3.0.0,>=2.9.0->ollama) (0.7.0)
Requirement already satisfied: sniffio>=1.1 in /home/ckuethe/jupyter/lib/python3.10/site-packages (from anyio->httpx<0.29,>=0.27->ollama) (1.3.1)
Requirement already satisfied: exceptiongroup>=1.0.2 in /home/ckuethe/jupyter/lib/python3.10/site-packages (from anyio->httpx<0.29,>=0.27->ollama) (1.2.2)

I can connect to my LLM server and list the models I have downloaded... apparently 53 today.

>>> c = ollama.Client(
    host=f'https://{llm_svc}/ollama',
    headers={'Authorization': f'Bearer {apikey}'})
>>> len(c.list().models)
53

If I run a model directly on the ollama host I can see that it runs and consumes VRAM:

$ ollama run phi4 "why is the sky blue?" | wc
      6     199    1215
$ ollama ps
NAME           ID              SIZE     PROCESSOR    UNTIL              
phi4:latest    ac896e5b8b34    12 GB    100% GPU     4 minutes from now    

I can do the same thing with ollama-python and get some reasonable output:

>>> resp = c.generate("phi4", "why is the sky blue?")
>>> print((resp.done, resp.done_reason, len(resp.response), len(resp.context)))
(True, 'stop', 1393, 281)

It seems like OpenWebUI is correctly passing my commands through to the backend ollama instance, but...

>>> c.ps()
---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
Cell In[57], line 1
----> 1 c.ps()

File ~/jupyter/lib/python3.10/site-packages/ollama/_client.py:609, in Client.ps(self)
    608 def ps(self) -> ProcessResponse:
--> 609   return self._request(
    610     ProcessResponse,
    611     'GET',
    612     '/api/ps',
    613   )

File ~/jupyter/lib/python3.10/site-packages/ollama/_client.py:178, in Client._request(self, cls, stream, *args, **kwargs)
    174         yield cls(**part)
    176   return inner()
--> 178 return cls(**self._request_raw(*args, **kwargs).json())

File ~/jupyter/lib/python3.10/site-packages/pydantic/main.py:212, in BaseModel.__init__(self, **data)
    210 # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    211 __tracebackhide__ = True
--> 212 validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    213 if self is not validated_self:
    214     warnings.warn(
    215         'A custom validator is returning a value other than `self`.\n'
    216         "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
    217         'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
    218         category=None,
    219     )

ValidationError: 1 validation error for ProcessResponse
models
  Field required [type=missing, input_value={'http://llmhost:1143...e_vram/': 12298193578}]}}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.9/v/missing

But when I issue the same API call directly, I do get a meaningful response:

>>> resp = c._request_raw('GET', '/api/ps')
>>> resp.json()
{'http://llmhost:11434/': {'models': [{'name': 'phi4:latest',
    'model': 'phi4:latest',
    'size': 12298193578,
    'digest': 'ac896e5b8b34a1f4efa7b14d7520725140d5512484457fab45d2a4ea14c69dba',
    'details': {'parent_model': '',
     'format': 'gguf',
     'family': 'phi3',
     'families': ['phi3'],
     'parameter_size': '14.7B',
     'quantization_level': 'Q4_K_M'},
    'expires_at': '2025-04-06T13:59:14.013026708-07:00',
    'size_vram': 12298193578}]}}
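So the raw payload shows OpenWebUI namespacing the ollama response under the backend URL, while ollama-python's ProcessResponse model expects a top-level 'models' key, hence the ValidationError. As a stopgap, unwrapping the envelope before validation works for me; this is just a sketch (the helper name is mine, and it assumes the envelope shape shown above):

```python
def unwrap_openwebui_ps(raw: dict) -> dict:
    """Normalize a /api/ps payload to the shape ollama-python expects.

    OpenWebUI appears to key the response by backend URL, e.g.
    {'http://host:port/': {'models': [...]}}, while a plain ollama
    server returns {'models': [...]} directly. Merge all backends
    into a single model list.
    """
    if 'models' in raw:
        return raw  # plain ollama backend: already the expected shape
    models = [m for backend in raw.values() for m in backend.get('models', [])]
    return {'models': models}
```

Then something like ProcessResponse(**unwrap_openwebui_ps(c._request_raw('GET', '/api/ps').json())) should validate, assuming ProcessResponse is importable from ollama as in 0.4.x.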
