llm_chat()

Description:

Ask questions using the specified LLM model.

Syntax:

llm_chat(fd,question,[url])

Note:

LLMCli external library function (See External Library Guide).

 

If the answer returned by the current model is not relevant enough to the question, try asking the question again with a different model.

Parameter:

fd

LLM connection object.

question

A string, specifying the question you want to ask.

url

The reference source for the answer: either a URL or the absolute path of a local text file.

Option:

@s

Do not reference the context of previous exchanges for the current question, that is, perform a single-turn dialogue. By default, the function performs multi-turn dialogues.

@r

Clear all the previously asked questions and start a new round of dialogue.
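The two options above amount to managing a question history on the connection. A minimal Python sketch of that behavior, assuming the semantics described above (ChatSession and its ask() method are hypothetical stand-ins, not the LLMCli implementation; a real multi-turn dialogue would also carry the model's answers in the context):

```python
# Sketch of the dialogue-history behavior behind @s and @r.
class ChatSession:
    def __init__(self):
        self.history = []          # prior (question, answer) exchanges

    def chat(self, question, single_turn=False, reset=False):
        if reset:                  # @r: clear all previously asked questions
            self.history = []
        if single_turn:            # @s: send only the current question
            context = [question]
        else:                      # default: multi-turn, include prior questions
            context = [q for q, _ in self.history] + [question]
        answer = self.ask(context) # placeholder for the real model call
        self.history.append((question, answer))
        return answer

    def ask(self, context):        # hypothetical stub instead of an LLM call
        return f"answer to {len(context)} message(s)"

s = ChatSession()
s.chat("q1")                       # multi-turn: context is ["q1"]
s.chat("q2", single_turn=True)     # @s: context is ["q2"] only
s.chat("q3")                       # context is ["q1", "q2", "q3"]
s.chat("q4", reset=True)           # @r: history cleared, context is ["q4"]
```

Note that in this sketch a question asked with single_turn=True is still recorded in the history, so later multi-turn questions can reference it; whether LLMCli behaves the same way is an assumption.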

Return value:

String

Example:

 

	A	 
1	=llm_open("http://localhost:11434/api/chat","sk_adfsdfdf","deepseek-r1:1.5b")	Connect to the DeepSeek model in non-stream mode.
2	=llm_chat(A1,"Briefly introduce the esProc function A.sort(); no examples needed")	Ask a question in the default multi-turn mode.
3	=llm_chat@s(A1,"Introduce the parameterless T.create() function, with an example")	Perform a single-turn dialogue without referencing the context of previous exchanges.
4	=llm_chat(A1,"The detailed syntax of this function","http://d.raqsoft.com.cn:6999/esproc/func/asortxloc.html")	Ask a follow-up to A2's question while specifying the reference source; because A3 was a single-turn dialogue, "this function" refers to A.sort() in A2, not T.create().
5	Implement with an esProc SPL script: A.txt contains numbers separated by spaces; sort them from high to low and write the result to test.txt. The content of A.txt is: 80 80 70 60 85 75 95 60 85 70 75 65	A constant cell holding the question text.
6	=llm_chat@r(A1,A5)	Clear the question history and start a new round of dialogue with A5's question.
7	>llm_close(A1)	Close the connection.