
Conversation

imfing
Contributor

@imfing imfing commented Dec 16, 2024

This PR essentially reverts #279

When I followed the docs and ran

agent = Agent('llama3.2', result_type=CityLocation)

I got pydantic_ai.exceptions.UserError: Unknown model: llama3.2:

Traceback (most recent call last):
  File "/dev/github.com/pydantic/pydantic-ai/test.py", line 11, in <module>
    agent = Agent('llama3.2', result_type=CityLocation)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/dev/github.com/pydantic/pydantic-ai/.venv/lib/python3.12/site-packages/pydantic_ai/agent.py", line 160, in __init__
    self.model = models.infer_model(model)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/dev/github.com/pydantic/pydantic-ai/.venv/lib/python3.12/site-packages/pydantic_ai/models/__init__.py", line 302, in infer_model
    raise UserError(f'Unknown model: {model}')
pydantic_ai.exceptions.UserError: Unknown model: llama3.2

It turns out the Agent class initialization checks for the model prefix:

elif model.startswith('ollama:'):
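To illustrate the behaviour being discussed, here is a minimal sketch (not the actual pydantic_ai source; the prefix table and return values are simplified assumptions) of how string-based model inference dispatches on a provider prefix and raises for an unprefixed name:

```python
# Hypothetical sketch of prefix-based model dispatch, loosely modeled on
# pydantic_ai.models.infer_model. Names and return values are illustrative.
def infer_model(model: str) -> tuple[str, str]:
    known_prefixes = {
        'ollama:': 'OllamaModel',
        'openai:': 'OpenAIModel',
    }
    for prefix, model_cls in known_prefixes.items():
        if model.startswith(prefix):
            # Strip the provider prefix and hand the bare name to the model class.
            return (model_cls, model[len(prefix):])
    # A bare name like 'llama3.2' matches no prefix, so it is rejected here.
    raise ValueError(f'Unknown model: {model}')
```

This is why `Agent('llama3.2', ...)` fails while `Agent('ollama:llama3.2', ...)` works: only the prefixed string reaches the Ollama branch.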

@samuelcolvin samuelcolvin merged commit 339f89c into pydantic:main Dec 17, 2024
15 checks passed
@samuelcolvin
Member

Thanks, sorry for the mistake.

@asmith26
Contributor

asmith26 commented Dec 17, 2024

Apologies for the mistake/inconvenience; I've just realised this myself. I was exploring using Agent by itself vs. OllamaModel with Agent, and must have gotten into a bit of a muddle:

from pydantic_ai import Agent

# prefix needed here (parsed from https://github.com/pydantic/pydantic-ai/blob/339f89c76837c2264322e0e6e570c957516a6f79/pydantic_ai_slim/pydantic_ai/models/__init__.py#L66)
agent = Agent('ollama:llama3.2', result_type=CityLocation)  

vs

from pydantic_ai.models.ollama import OllamaModel

# no prefix needed here (parsed from: https://github.com/pydantic/pydantic-ai/blob/339f89c76837c2264322e0e6e570c957516a6f79/pydantic_ai_slim/pydantic_ai/models/ollama.py#L32)
ollama_model = OllamaModel(model_name='llama3.2')

from pydantic_ai import Agent
agent = Agent(ollama_model, result_type=CityLocation)

Apologies again for the mistake/inconvenience, and many thanks for the fix/amazing lib!

@imfing imfing deleted the fix-ollama-doc branch December 17, 2024 22:41
@srishrachamalla7

I was having some issues with:

agent = Agent('ollama:llama3.2:1b', result_type=CityLocation)

But after seeing this, I tried:

ollama_model = OllamaModel(model_name='llama3.2:1b')

which is now working fine.

Thanks so much for your help and the amazing library!

Here's my updated code:

from pydantic import BaseModel
from pydantic_ai import Agent
from pydantic_ai.models.ollama import OllamaModel

class CityLocation(BaseModel):
    city: str
    country: str

# agent = Agent('ollama:llama3.2 ', result_type=CityLocation)  # Original line
ollama_model = OllamaModel(model_name='llama3.2:1b')  # Updated line
agent = Agent(ollama_model, result_type=CityLocation)

result = agent.run_sync('Where were the olympics held in 2012?')
print(result.data)
#> city='London' country='United Kingdom'
print(result.usage())
"""
Usage(requests=1, request_tokens=57, response_tokens=8, total_tokens=65, details=None)
"""

Thanks again!
