UDFs
chat() udf
Generate the next message in a chat with a provided model.
Signature:
- `messages` (Json): The messages of the chat.
- `model` (String): The model name.
- `tools` (Optional[Json]): Tools for the model to use.
- `format` (Optional[String]): The format of the response; must be one of `'json'` or `None`.
- `options` (Optional[Json]): Additional options to pass to the `chat` call, such as `max_tokens`, `temperature`, `top_p`, and `top_k`. For details, see the Valid Parameters and Values section of the Ollama documentation.
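As an illustrative sketch (not the actual UDF implementation), the arguments above map naturally onto an Ollama chat-style request payload; the helper name `build_chat_payload` is hypothetical:

```python
def build_chat_payload(messages, model, tools=None, format=None, options=None):
    """Assemble a chat request payload; optional arguments are omitted when None."""
    payload = {"model": model, "messages": messages}
    if tools is not None:
        payload["tools"] = tools
    if format is not None:
        payload["format"] = format  # e.g. 'json' to request JSON output
    if options is not None:
        payload["options"] = options  # e.g. {"temperature": 0.2, "top_k": 40}
    return payload

payload = build_chat_payload(
    messages=[{"role": "user", "content": "Hello!"}],
    model="llama3.2",
    options={"temperature": 0.2},
)
```

Omitting `tools` and `format` keeps them out of the payload entirely, so the model's defaults apply.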
embed() udf
Generate embeddings from a model.
Signature:
- `input` (String): The input text to generate embeddings for.
- `model` (String): The model name.
- `truncate` (Bool): Truncates the end of each input to fit within the context length. An error is returned if this is false and the context length is exceeded.
- `options` (Optional[Json]): Additional options to pass to the `embed` call. For details, see the Valid Parameters and Values section of the Ollama documentation.
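A minimal sketch of how these arguments might be assembled into an embedding request; `build_embed_payload` is a hypothetical helper, not the UDF itself:

```python
def build_embed_payload(input, model, truncate=True, options=None):
    """Assemble an embed request payload from the udf's arguments."""
    payload = {"model": model, "input": input, "truncate": truncate}
    if options is not None:
        payload["options"] = options
    return payload

payload = build_embed_payload(
    input="Pixel art of a llama",
    model="nomic-embed-text",
    truncate=True,
)
```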
generate() udf
Generate a response for a given prompt with a provided model.
Signature:
- `prompt` (String): The prompt to generate a response for.
- `model` (String): The model name.
- `suffix` (String): The text after the model response.
- `format` (Optional[String]): The format of the response; must be one of `'json'` or `None`.
- `system` (String): System message.
- `template` (String): Prompt template to use.
- `context` (Optional[Json]): The context parameter returned from a previous call to `generate()`.
- `raw` (Bool): If `True`, no formatting will be applied to the prompt.
- `options` (Optional[Json]): Additional options to pass to the `generate` call, such as `max_tokens`, `temperature`, `top_p`, and `top_k`. For details, see the Valid Parameters and Values section of the Ollama documentation.
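The `context` parameter allows a conversation to be continued across calls: the value returned by one `generate()` call is passed into the next. A hedged sketch of that flow, with `build_generate_payload` as an assumed helper and canned responses standing in for the model:

```python
def build_generate_payload(prompt, model, context=None, raw=False, options=None):
    """Assemble a generate request payload; context chains successive calls."""
    payload = {"model": model, "prompt": prompt, "raw": raw}
    if context is not None:
        payload["context"] = context
    if options is not None:
        payload["options"] = options
    return payload

# First call: no prior context.
first = build_generate_payload("Why is the sky blue?", model="llama3.2")

# A real response would include a 'context' value; here we stand one in.
returned_context = [1, 2, 3]

# Follow-up call: pass the returned context to continue the exchange.
second = build_generate_payload(
    "Explain that more simply.",
    model="llama3.2",
    context=returned_context,
)
```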