rag-system/llm/ollama.py
2025-05-01 12:21:47 -05:00


from langchain_ollama import OllamaLLM


def load_llm():
    """Return a LangChain LLM backed by a local Ollama server."""
    return OllamaLLM(
        model="llama3.2",
        base_url="http://localhost:11434",  # default Ollama HTTP port
        temperature=0,  # deterministic output for RAG answers
    )
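
# Usage sketch (assumptions: a local Ollama server is running on
# localhost:11434 and the "llama3.2" model has been pulled, e.g. via
# `ollama pull llama3.2`; the prompt below is illustrative):
#
#   llm = load_llm()
#   answer = llm.invoke("What is retrieval-augmented generation?")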