LLM
class LLM extends AbstractObject implements IdAble
Properties
$lastQuery | from AbstractObject
Methods
__construct(AbstractClient $client, QueryBuilderChain $queryBuilderChain) | No description (from AbstractObject)
queryLeaf(QueryBuilder $leafQueryBuilder, string $leafKey) | No description (from AbstractObject)
attempt(int $number) | Create a branch in the LLM's history
bindResult(string $name) | Returns the type of the current state
env() | Return the LLM's current environment
hasPrompt() | Indicates whether there are any queued prompts or tool results to send to the model
history() | Return the LLM message history
historyJSON() | Return the raw LLM message history as JSON
id() | A unique identifier for this LLM.
lastReply() | Return the last LLM reply from the history
loop() | Submit the queued prompt, evaluate any tool calls, queue their results, and keep going until the model ends its turn
model() | Return the model used by the LLM
provider() | Return the provider used by the LLM
step() | Submit the queued prompt or tool call results, evaluate any tool calls, and queue their results
sync() | Synchronize the LLM state
tokenUsage() | Returns the token usage of the current state
tools() | Print documentation for the available tools
withBlockedFunction(string $typeName, string $function) | Return a new LLM with the specified function no longer exposed as a tool
withEnv(Env $env) | Allow the LLM to interact with an environment via MCP
withMCPServer(string $name, Service $service) | Add an external MCP server to the LLM
withModel(string $model) | Swap out the LLM model
withPrompt(string $prompt) | Append a prompt to the LLM context
withPromptFile(File $file) | Append the contents of a file to the LLM context
withStaticTools() | Use a static set of tools for method calls, e.g. for MCP clients that do not support dynamic tool registration
withSystemPrompt(string $prompt) | Add a system prompt to the LLM's environment
withoutDefaultSystemPrompt() | Disable the default system prompt
Details
in AbstractObject at line 13
__construct(AbstractClient $client, QueryBuilderChain $queryBuilderChain)
No description
in AbstractObject at line 19
protected null|array|string|int|float|bool
queryLeaf(QueryBuilder $leafQueryBuilder, string $leafKey)
No description
at line 16
LLM
attempt(int $number)
Create a branch in the LLM's history
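Example (a sketch only: the llm() constructor on the client and the SDK's dag() helper are assumed, and the prompt text is made up):

<?php

use function Dagger\dag;

// Build a shared base state, then fork it into two independent attempts.
$base = dag()->llm()->withPrompt('Propose a name for a CLI tool');

$first  = $base->attempt(1)->loop()->lastReply();
$second = $base->attempt(2)->loop()->lastReply();

// Each attempt keeps its own history, so the two replies can differ.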
at line 26
Binding
bindResult(string $name)
Returns the type of the current state
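Example (a sketch; typeName() is assumed to exist on the returned Binding, so adjust it to the accessors the generated Binding class actually provides):

<?php

use function Dagger\dag;

// Bind the LLM's current state under a name and inspect the Binding.
$binding = dag()->llm()
    ->withPrompt('Write a haiku about caching')
    ->loop()
    ->bindResult('haiku');

// typeName() is an assumed accessor on Binding.
$type = $binding->typeName();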
at line 36
Env
env()
Return the LLM's current environment
at line 45
bool
hasPrompt()
Indicates whether there are any queued prompts or tool results to send to the model
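Example (a sketch; dag()->llm() is assumed, as in the other examples on this page):

<?php

use function Dagger\dag;

$llm = dag()->llm()->withPrompt('Summarise the latest changelog');

// true here: the prompt is queued but has not yet been sent to the model.
if ($llm->hasPrompt()) {
    $llm = $llm->loop();
}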
at line 54
array
history()
Return the LLM message history
at line 63
Json
historyJSON()
Return the raw LLM message history as JSON
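Example reading the transcript after a completed turn (a sketch; history() is documented to return an array, and its entries are assumed here to be printable strings):

<?php

use function Dagger\dag;

$llm = dag()->llm()
    ->withPrompt('Explain what a content-addressed cache is')
    ->loop();

// Iterate the transcript entries (assumed to be strings).
foreach ($llm->history() as $entry) {
    echo $entry, PHP_EOL;
}

// The same transcript as a raw Json scalar.
$raw = $llm->historyJSON();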
at line 72
AbstractId
id()
A unique identifier for this LLM.
at line 81
string
lastReply()
Return the last LLM reply from the history
at line 90
LLM
loop()
Submit the queued prompt, evaluate any tool calls, queue their results, and keep going until the model ends its turn
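Example of the basic round trip: queue a prompt, let the model run its turn, read the reply (a sketch; dag()->llm() and the provider's default model are assumed):

<?php

use function Dagger\dag;

// loop() submits the prompt, evaluates any tool calls, and keeps going
// until the model ends its turn; lastReply() then returns the final text.
$reply = dag()->llm()
    ->withPrompt('List three uses for a scratch container')
    ->loop()
    ->lastReply();

echo $reply, PHP_EOL;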
at line 99
string
model()
Return the model used by the LLM
at line 108
string
provider()
Return the provider used by the LLM
at line 117
LLMId
step()
Submit the queued prompt or tool call results, evaluate any tool calls, and queue their results
at line 126
LLMId
sync()
Synchronize the LLM state
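Example (a sketch; per this reference sync() returns an LLMId identifying the evaluated state):

<?php

use function Dagger\dag;

// Force the queued work to be evaluated and get the resulting state's ID.
$llmId = dag()->llm()
    ->withPrompt('Draft a commit message for a typo fix')
    ->loop()
    ->sync();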
at line 135
LLMTokenUsage
tokenUsage()
Returns the token usage of the current state
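Example combining provider(), model() and tokenUsage() (a sketch; the totalTokens() accessor on LLMTokenUsage is an assumption, so check the generated class for the exact method names):

<?php

use function Dagger\dag;

$llm = dag()->llm()->withPrompt('Say hello')->loop();

echo 'provider: ', $llm->provider(), PHP_EOL;
echo 'model:    ', $llm->model(), PHP_EOL;

// totalTokens() is assumed from the LLMTokenUsage type.
$usage = $llm->tokenUsage();
echo 'tokens:   ', $usage->totalTokens(), PHP_EOL;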
at line 144
string
tools()
Print documentation for the available tools
at line 153
LLM
withBlockedFunction(string $typeName, string $function)
Return a new LLM with the specified function no longer exposed as a tool
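Example hiding one function from the tool set and printing what is still exposed (a sketch; 'Container'/'withExec' is only an illustrative type/function pair):

<?php

use function Dagger\dag;

// Hide Container.withExec from the model, then print the remaining tool docs.
$llm = dag()->llm()->withBlockedFunction('Container', 'withExec');

echo $llm->tools(), PHP_EOL;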
at line 164
LLM
withEnv(Env $env)
Allow the LLM to interact with an environment via MCP
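Example (a sketch; the env() constructor and the withStringInput()/withStringOutput() calls are assumptions about the Env type and may differ in the generated class):

<?php

use function Dagger\dag;

// Declare what the LLM may read and what it is expected to produce.
$env = dag()->env()
    ->withStringInput('topic', 'structured logging', 'the topic to cover')
    ->withStringOutput('summary', 'a three-sentence summary of the topic');

$llm = dag()->llm()
    ->withEnv($env)
    ->withPrompt('Write the requested summary of $topic')
    ->loop();

The updated environment, including any outputs the model filled in, can then be read back with env().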
at line 174
LLM
withMCPServer(string $name, Service $service)
Add an external MCP server to the LLM
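Example attaching a containerized MCP server (a sketch; the image name is made up, and Container's from()/asService() are used as provided elsewhere in the SDK):

<?php

use function Dagger\dag;

// Run an MCP server from a container image and expose it to the LLM as "docs".
$mcp = dag()->container()
    ->from('example/docs-mcp-server:latest')
    ->asService();

$llm = dag()->llm()
    ->withMCPServer('docs', $mcp)
    ->withPrompt('Use the docs server to look up the release process')
    ->loop();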
at line 185
LLM
withModel(string $model)
Swap out the LLM model
at line 195
LLM
withPrompt(string $prompt)
Append a prompt to the LLM context
at line 205
LLM
withPromptFile(File $file)
Append the contents of a file to the LLM context
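Example (a sketch; the File is built in-line with Directory's withNewFile()/file() purely for illustration, any File works):

<?php

use function Dagger\dag;

// Build a throwaway File holding the prompt text.
$promptFile = dag()->directory()
    ->withNewFile('prompt.md', 'Review the attached changes for typos.')
    ->file('prompt.md');

$llm = dag()->llm()
    ->withPromptFile($promptFile)
    ->loop();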
at line 215
LLM
withStaticTools()
Use a static set of tools for method calls, e.g. for MCP clients that do not support dynamic tool registration
at line 224
LLM
withSystemPrompt(string $prompt)
Add a system prompt to the LLM's environment
at line 234
LLM
withoutDefaultSystemPrompt()
Disable the default system prompt
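Example replacing the default system prompt with a custom one (a sketch; dag()->llm() and the prompt texts are illustrative):

<?php

use function Dagger\dag;

// Drop the built-in system prompt and install a custom one before prompting.
$reply = dag()->llm()
    ->withoutDefaultSystemPrompt()
    ->withSystemPrompt('You are a terse release-notes assistant.')
    ->withPrompt('Summarise the changes in one paragraph')
    ->loop()
    ->lastReply();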