Difference between BaseChatModel and BaseLLM for text extraction #315
konstantinoskalyfommatos started this conversation in General
In theory, would one of these two work better for text extraction if the underlying model is the same?

from langchain_community.llms.ollama import Ollama
llm = Ollama(model="llama3.1", temperature=0)

from langchain_ollama import ChatOllama
llm = ChatOllama(model="llama3.1", temperature=0)
Replies: 1 comment
I wrote kor before chat models were a big thing, so the prompts are probably better tuned for LLMs, all else being equal. Otherwise I'd use langchain_ollama, since that's the official integration for Ollama.
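For context, kor's create_extraction_chain takes a BaseLanguageModel, so either class from the question should plug in directly. Below is a minimal sketch of an extraction call; the person schema and sample sentences are invented for illustration, and the exact run/invoke call may vary with your kor and langchain versions:

from kor.extraction import create_extraction_chain
from kor.nodes import Object, Text
from langchain_ollama import ChatOllama

# Swap in Ollama(...) from langchain_community to compare the two classes.
llm = ChatOllama(model="llama3.1", temperature=0)

# Hypothetical schema: extract first names from free text.
schema = Object(
    id="person",
    description="Information about people mentioned in the text",
    attributes=[
        Text(id="first_name", description="The person's first name"),
    ],
    examples=[
        ("John went to the store", [{"first_name": "John"}]),
    ],
)

chain = create_extraction_chain(llm, schema)
# On newer versions, chain.invoke({"text": ...}) may replace chain.run.
print(chain.run("Alice and Bob met for lunch")["data"])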
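One more option: if you want both of the reply's suggestions at once (completion-style prompting and the official integration), langchain_ollama also provides OllamaLLM, a plain-LLM counterpart to ChatOllama:

from langchain_ollama import OllamaLLM

# Intended as a replacement for the langchain_community Ollama class,
# which recent langchain releases deprecate in favor of langchain_ollama.
llm = OllamaLLM(model="llama3.1", temperature=0)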