Class: LLM

Extends

  • BaseClient

Constructors

Constructor

new LLM(ctx?, _id?, _historyJSON?, _lastReply?, _model?, _provider?, _sync?, _tools?): LLM

This constructor is for internal use only; do not instantiate objects with it directly.

Parameters

ctx?

Context

_id?

LLMID

_historyJSON?

string

_lastReply?

string

_model?

string

_provider?

string

_sync?

LLMID

_tools?

string

Returns

LLM

Overrides

BaseClient.constructor

Methods

attempt()

attempt(number_): LLM

Create a branch in the LLM's history.

Parameters

number_

number

Returns

LLM
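For example, a hedged sketch of branching: per the description above, each `attempt(n)` forks the LLM's history so variations can be explored from the same starting state (assumes a Dagger client context with `dag` available and a configured provider):

```typescript
import { dag } from "@dagger.io/dagger"

const base = dag.llm().withPrompt("Propose a name for a CLI tool.")

// Two independent branches from the same history.
const first = await base.attempt(1).sync().then((llm) => llm.lastReply())
const second = await base.attempt(2).sync().then((llm) => llm.lastReply())
```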


bindResult()

bindResult(name): Binding

Return the result of the current state as a named Binding.

Parameters

name

string

Returns

Binding


env()

env(): Env

Return the LLM's current environment.

Returns

Env


history()

history(): Promise<string[]>

Return the LLM's message history.

Returns

Promise<string[]>
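For example, a minimal sketch of inspecting the conversation after a run (assumes a Dagger client context with `dag` available and a configured provider):

```typescript
import { dag } from "@dagger.io/dagger"

const llm = await dag.llm().withPrompt("Say hello.").sync()

// Each entry is one message in the conversation, in order.
for (const message of await llm.history()) {
  console.log(message)
}
```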


historyJSON()

historyJSON(): Promise<string>

Return the raw LLM message history as JSON.

Returns

Promise<string>


id()

id(): Promise<LLMID>

A unique identifier for this LLM.

Returns

Promise<LLMID>


lastReply()

lastReply(): Promise<string>

Return the last LLM reply from the history.

Returns

Promise<string>


loop()

loop(): LLM

Synchronize the LLM state.

Returns

LLM


model()

model(): Promise<string>

Return the model used by the LLM.

Returns

Promise<string>


provider()

provider(): Promise<string>

Return the provider used by the LLM.

Returns

Promise<string>


sync()

sync(): Promise<LLM>

Synchronize the LLM state.

Returns

Promise<LLM>


tokenUsage()

tokenUsage(): LLMTokenUsage

Return the token usage of the current state.

Returns

LLMTokenUsage
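For example, a sketch of reading usage after a run; the `inputTokens`/`outputTokens` accessors are assumptions about the `LLMTokenUsage` type, not confirmed by this page:

```typescript
import { dag } from "@dagger.io/dagger"

const llm = await dag.llm().withPrompt("Say hello.").sync()
const usage = llm.tokenUsage()

// Assumed accessors on LLMTokenUsage; adjust to the actual type's fields.
console.log("input:", await usage.inputTokens())
console.log("output:", await usage.outputTokens())
```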


tools()

tools(): Promise<string>

Print documentation for the available tools.

Returns

Promise<string>


with()

with(arg): LLM

Call the provided function with the current LLM.

This is useful for reusability and readability, as it does not break the calling chain.

Parameters

arg

(param) => LLM

Returns

LLM
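For example, a sketch of factoring shared configuration into a helper without breaking the chain; `withDefaults` is a hypothetical helper and the model name is illustrative:

```typescript
import { dag, LLM } from "@dagger.io/dagger"

// Hypothetical reusable configuration step.
function withDefaults(llm: LLM): LLM {
  return llm.withModel("gpt-4o").withSystemPrompt("Be concise.")
}

const reply = await dag
  .llm()
  .with(withDefaults) // applied inline, keeping the chain readable
  .withPrompt("What does withEnv do?")
  .sync()
  .then((llm) => llm.lastReply())
```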


withEnv()

withEnv(env): LLM

Allow the LLM to interact with an environment via MCP.

Parameters

env

Env

Returns

LLM
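For example, a hedged sketch of giving the LLM typed inputs and outputs; it assumes the `Env` builder exposes `withStringInput`/`withStringOutput` and that declared outputs can be read back via `env().output(...)` as Bindings after the run:

```typescript
import { dag } from "@dagger.io/dagger"

const env = dag
  .env()
  .withStringInput("article", "…some text…", "the text to classify")
  .withStringOutput("category", "a one-word category for the article")

const done = await dag
  .llm()
  .withEnv(env)
  .withPrompt("Read $article and save a one-word category to $category.")
  .sync()

// Read the declared output back out of the LLM's environment.
const category = await done.env().output("category").asString()
```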


withModel()

withModel(model): LLM

Swap out the LLM model.

Parameters

model

string

The model to use

Returns

LLM


withPrompt()

withPrompt(prompt): LLM

Append a prompt to the LLM context.

Parameters

prompt

string

The prompt to send

Returns

LLM
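For example, a minimal sketch of a full prompt-and-reply round trip, assuming a Dagger client context where `dag` is available and an LLM provider is configured (the model name is illustrative):

```typescript
import { dag } from "@dagger.io/dagger"

export async function ask(question: string): Promise<string> {
  const llm = await dag
    .llm()
    .withModel("gpt-4o") // illustrative; use any model your provider supports
    .withSystemPrompt("Answer in one short paragraph.")
    .withPrompt(question)
    .sync() // evaluate the pending prompts

  return llm.lastReply()
}
```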


withPromptFile()

withPromptFile(file): LLM

Append the contents of a file to the LLM context.

Parameters

file

File

The file to read the prompt from

Returns

LLM


withQuery()

withQuery(): LLM

Provide the entire Query object to the LLM.

Returns

LLM


withSystemPrompt()

withSystemPrompt(prompt): LLM

Add a system prompt to the LLM's environment.

Parameters

prompt

string

The system prompt to send

Returns

LLM