LLM
class LLM extends AbstractObject implements IdAble
Properties
$lastQuery  (from AbstractObject)
Methods
__construct(AbstractClient $client, QueryBuilderChain $queryBuilderChain)  (from AbstractObject)
queryLeaf(QueryBuilder $leafQueryBuilder, string $leafKey)  (from AbstractObject)
attempt(int $number)  Create a branch in the LLM's history
bindResult(string $name)  Returns the type of the current state
env()  Return the LLM's current environment
history()  Return the LLM message history
historyJSON()  Return the raw LLM message history as JSON
id()  A unique identifier for this LLM
lastReply()  Return the last LLM reply from the history
loop()  Synchronize LLM state
model()  Return the model used by the LLM
provider()  Return the provider used by the LLM
sync()  Synchronize LLM state
tokenUsage()  Returns the token usage of the current state
tools()  Print documentation for available tools
withEnv(Env $env)  Allow the LLM to interact with an environment via MCP
withModel(string $model)  Swap out the LLM model
withPrompt(string $prompt)  Append a prompt to the LLM context
withPromptFile(File $file)  Append the contents of a file to the LLM context
withQuery()  Provide the entire Query object to the LLM
withSystemPrompt(string $prompt)  Add a system prompt to the LLM's environment
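The snippet below is a usage sketch, not part of the generated reference. It assumes the Dagger PHP SDK's dag() helper and module attributes (Dagger\Attribute\DaggerObject, Dagger\Attribute\DaggerFunction), and that the client exposes an llm() constructor returning this class; only withPrompt() and lastReply() come from the reference itself.

<?php

use Dagger\Attribute\DaggerFunction;
use Dagger\Attribute\DaggerObject;

use function Dagger\dag;

#[DaggerObject]
class Example
{
    // Ask the default model a question and return its final reply.
    #[DaggerFunction]
    public function ask(string $question): string
    {
        return dag()
            ->llm()                   // assumed constructor on the client
            ->withPrompt($question)   // append the user prompt to the context
            ->lastReply();            // last LLM reply from the history
    }
}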
Details
in AbstractObject at line 13
__construct(AbstractClient $client, QueryBuilderChain $queryBuilderChain)
No description
in AbstractObject at line 19
protected null|array|string|int|float|bool
queryLeaf(QueryBuilder $leafQueryBuilder, string $leafKey)
No description
at line 16
LLM
attempt(int $number)
Create a branch in the LLM's history
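A hedged sketch of one way attempt() might be used: forking several independent branches from the same prompt context. The dag()->llm() constructor and the prompt text are illustrative assumptions.

use function Dagger\dag;

// Fork three branches from the same context and collect one reply per branch.
$base = dag()->llm()->withPrompt('Propose a name for the new CLI tool.');

$candidates = [];
for ($i = 0; $i < 3; $i++) {
    $candidates[] = $base->attempt($i)->lastReply();  // each attempt() is a separate branch
}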
at line 26
Binding
bindResult(string $name)
Returns the type of the current state
at line 36
Env
env()
Return the LLM's current environment
at line 45
array
history()
Return the LLM message history
at line 54
string
historyJSON()
Return the raw LLM message history as JSON
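A short sketch of inspecting the conversation with history() and historyJSON(); dag()->llm() and the prompt are illustrative assumptions.

use function Dagger\dag;

$llm = dag()->llm()->withPrompt('Summarize the build logs.');  // illustrative prompt

// history() returns the messages as an array of strings.
foreach ($llm->history() as $message) {
    echo $message, PHP_EOL;
}

// historyJSON() returns the same history as a raw JSON string.
file_put_contents('history.json', $llm->historyJSON());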
at line 63
AbstractId
id()
A unique identifier for this LLM.
at line 72
string
lastReply()
Return the last LLM reply from the history
at line 81
LLM
loop()
Synchronize LLM state
at line 90
string
model()
Return the model used by the LLM
at line 99
string
provider()
Return the provider used by the LLM
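A sketch of reading back which backend the engine selected; dag()->llm() and the printed format are illustrative assumptions.

use function Dagger\dag;

$llm = dag()->llm();
// Report the resolved provider and model, e.g. "openai/gpt-4o" (illustrative).
echo $llm->provider() . '/' . $llm->model(), PHP_EOL;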
at line 108
LLMId
sync()
Synchronize LLM state
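A sketch of using sync() to force evaluation of queued prompts without requesting a reply; dag()->llm() and the prompt are illustrative assumptions.

use function Dagger\dag;

// Queue a prompt, then force the engine to evaluate it.
$llm = dag()->llm()->withPrompt('Run the plan.');  // illustrative prompt
$llm->sync();  // blocks until the LLM state is synchronized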
at line 117
LLMTokenUsage
tokenUsage()
Returns the token usage of the current state
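A hedged sketch of reading token usage after a prompt; dag()->llm(), the prompt, and the LLMTokenUsage accessor name are assumptions here.

use function Dagger\dag;

$llm = dag()->llm()->withPrompt('Explain the failing test.');  // illustrative prompt
$usage = $llm->tokenUsage();
// totalTokens() is an assumed accessor on LLMTokenUsage.
echo $usage->totalTokens(), ' tokens used', PHP_EOL;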
at line 126
string
tools()
Print documentation for available tools
at line 135
LLM
withEnv(Env $env)
Allow the LLM to interact with an environment via MCP
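A hedged sketch of wiring an environment into the LLM. The dag()->env() constructor and the Env builder methods (withStringInput, withStringOutput) follow the pattern of other Dagger SDKs and are assumptions here; only withEnv(), withPrompt(), and tools() come from this reference.

use function Dagger\dag;

// Declare what the LLM may read (inputs) and what it must produce (outputs),
// then let it work against that environment via MCP.
$env = dag()->env()
    ->withStringInput('topic', 'release notes', 'what to write about')  // assumed Env method
    ->withStringOutput('draft', 'the generated draft');                 // assumed Env method

$llm = dag()->llm()
    ->withEnv($env)
    ->withPrompt('Write the draft described by the environment.');

echo $llm->tools();  // print documentation for the tools exposed to the model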
at line 145
LLM
withModel(string $model)
Swap out the LLM model
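A sketch of pinning a specific model instead of the engine default; dag()->llm(), the model name, and the prompt are illustrative assumptions.

use function Dagger\dag;

$reply = dag()->llm()
    ->withModel('gpt-4o')  // illustrative model name
    ->withPrompt('One-line summary of this change, please.')
    ->lastReply();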
at line 155
LLM
withPrompt(string $prompt)
Append a prompt to the LLM context
at line 165
LLM
withPromptFile(File $file)
Append the contents of a file to the LLM context
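A sketch of feeding a file into the context from inside a Dagger module function; the Dagger\File type, attributes, and dag()->llm() are assumptions carried over from the earlier example.

use Dagger\Attribute\DaggerFunction;
use Dagger\Attribute\DaggerObject;
use Dagger\File;

use function Dagger\dag;

#[DaggerObject]
class Docs
{
    // Feed an existing file into the context, then ask about it.
    #[DaggerFunction]
    public function summarize(File $spec): string
    {
        return dag()
            ->llm()
            ->withPromptFile($spec)  // file contents become part of the context
            ->withPrompt('Summarize the document above.')
            ->lastReply();
    }
}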
at line 175
LLM
withQuery()
Provide the entire Query object to the LLM
at line 184
LLM
withSystemPrompt(string $prompt)
Add a system prompt to the LLM's environment
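A sketch combining a system prompt with a user prompt; dag()->llm() and both strings are illustrative assumptions.

use function Dagger\dag;

$reply = dag()->llm()
    ->withSystemPrompt('You are a terse release-notes assistant.')
    ->withPrompt("Summarize this week's merged PRs in three bullets.")
    ->lastReply();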