One of the awkward bits of having to create a `Chat` instance to access `.list_models()` is that `ChatOllama()` requires a model name... just to be able to list the models.
```python
import chatlas

chatlas.ChatOllama().list_models()
#> Traceback (most recent call last):
#>   File "<string>", line 1, in <module>
#>   File "/Users/garrick/Library/Caches/uv/archive-v0/gi-SY39HQeSIVUWk13RdX/lib/python3.12/site-packages/chatlas/_provider_ollama.py", line 95, in ChatOllama
#>     raise ValueError(
#> ValueError: Must specify model. Locally installed models: qwen3:14b, phi4-mini, granite3.3:8b, gemma3:4b, qwen3:4b, qwen2.5-coder:14b-instruct-q2_K, deepseek-r1:14b, nomic-embed-text, llama3.1:8b, llama3.2
```
It's helpful that the error message gives you a list of installed models! But a little awkward.
And note that you don't need to provide an installed model to access the model list.
```python
chatlas.ChatOllama("not-a-model").list_models()
#> [{'created_at': '2025-07-05T08:52:32.279311903-04:00',
#>   'id': 'qwen3:14b',
#>   'size': 9276198565},
#>  {'created_at': '2025-07-05T08:49:29.346200434-04:00',
#>   'id': 'phi4-mini',
#>   'size': 2491876774},
#>  {'created_at': '2025-07-03T16:01:14.509286837-04:00',
#>   'id': 'granite3.3:8b',
#>   'size': 4942891653},
#>  {'created_at': '2025-07-03T16:00:39.590685087-04:00',
#>   'id': 'gemma3:4b',
#>   'size': 3338801804},
#>  {'created_at': '2025-05-12T23:12:53.22791702-04:00',
#>   'id': 'qwen3:4b',
#>   'size': 2620788019},
#>  {'created_at': '2025-03-02T09:54:13.568493478-05:00',
#>   'id': 'qwen2.5-coder:14b-instruct-q2_K',
#>   'size': 5770511454},
#>  {'created_at': '2025-01-30T20:31:16.927061496-05:00',
#>   'id': 'deepseek-r1:14b',
#>   'size': 8988112040},
#>  {'created_at': '2024-10-14T17:01:39.820091269-04:00',
#>   'id': 'nomic-embed-text',
#>   'size': 274302450},
#>  {'created_at': '2024-10-08T09:03:40.302847808-04:00',
#>   'id': 'llama3.1:8b',
#>   'size': 4661230766},
#>  {'created_at': '2024-10-01T09:04:15.313810108-04:00',
#>   'id': 'llama3.2',
#>   'size': 2019393189}]
```
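Since the returned entries are plain dicts with `id`, `created_at`, and `size` (in bytes), you can post-process the list with ordinary Python, e.g. to pick a lightweight default model. A quick sketch, using a few of the entries shown above as sample data (nothing here is chatlas-specific):

```python
# Sample entries in the shape returned by .list_models() above
# (sizes are byte counts).
models = [
    {"id": "qwen3:14b", "created_at": "2025-07-05T08:52:32-04:00", "size": 9276198565},
    {"id": "phi4-mini", "created_at": "2025-07-05T08:49:29-04:00", "size": 2491876774},
    {"id": "nomic-embed-text", "created_at": "2024-10-14T17:01:39-04:00", "size": 274302450},
]

def human_size(n_bytes: int) -> str:
    """Format a byte count as a rounded GB/MB string."""
    if n_bytes >= 1e9:
        return f"{n_bytes / 1e9:.1f} GB"
    return f"{n_bytes / 1e6:.0f} MB"

# Smallest model first -- handy when choosing a quick local default.
for m in sorted(models, key=lambda m: m["size"]):
    print(f"{m['id']:32s} {human_size(m['size'])}")
```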
Created at 2025-08-31 08:54 EDT by reprexlite v1.0.0