Documentation Index
Fetch the complete documentation index at: https://docs.pixeltable.com/llms.txt
Use this file to discover all available pages before exploring further.
module pixeltable.functions.ollama
Pixeltable UDFs for Ollama local models. Provides integration with Ollama for running large language models locally, including chat completions and embeddings.
udf chat()
Parameters
messages (pxt.Json): The messages of the chat.
model (pxt.String): The model name.
tools (pxt.Json | None): Tools for the model to use.
format (pxt.String | None): The format of the response; must be one of 'json' or None.
options (pxt.Json | None): Additional options to pass to the chat call, such as max_tokens, temperature, top_p, and top_k. For details, see the Valid Parameters and Values section of the Ollama documentation.
udf embed()
Parameters
input (pxt.String): The input text to generate embeddings for.
model (pxt.String): The model name.
truncate (pxt.Bool): Truncates the end of each input to fit within the context length. Returns an error if False and the context length is exceeded.
options (pxt.Json | None): Additional options to pass to the embed call. For details, see the Valid Parameters and Values section of the Ollama documentation.
udf generate()
Parameters
prompt (pxt.String): The prompt to generate a response for.
model (pxt.String): The model name.
suffix (pxt.String): The text after the model response.
format (pxt.String | None): The format of the response; must be one of 'json' or None.
system (pxt.String): System message.
template (pxt.String): Prompt template to use.
context (pxt.Json | None): The context parameter returned from a previous call to generate().
raw (pxt.Bool): If True, no formatting will be applied to the prompt.
options (pxt.Json | None): Additional options for the Ollama chat call, such as max_tokens, temperature, top_p, and top_k. For details, see the Valid Parameters and Values section of the Ollama documentation.