Class: BaseLLM

Unified language model interface

Hierarchy

Implements

LLM

Constructors

constructor

new BaseLLM()

Properties

metadata

Abstract metadata: LLMMetadata

Implementation of

LLM.metadata

Defined in

packages/core/src/llm/LLM.ts:128

Methods

chat

Abstract chat(params): Promise<AsyncIterable<ChatResponseChunk>>

Get a chat response from the LLM

Parameters

| Name | Type |
| :--- | :--- |
| params | LLMChatParamsStreaming |

Returns

Promise<AsyncIterable<ChatResponseChunk>>

Implementation of

LLM.chat

Defined in

packages/core/src/llm/LLM.ts:163

Abstract chat(params): Promise<ChatResponse>

Parameters

| Name | Type |
| :--- | :--- |
| params | LLMChatParamsNonStreaming |

Returns

Promise<ChatResponse>

Implementation of

LLM.chat

Defined in

packages/core/src/llm/LLM.ts:166
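
Subclasses provide both overloads through a single implementation whose behavior branches on the `stream` flag. A minimal sketch, using simplified stand-ins for the library's message and params types (the real `LLMChatParamsStreaming` and friends carry more fields):

```typescript
// Simplified stand-ins for the library types (illustrative, not the real definitions).
type ChatMessage = { role: "user" | "assistant" | "system"; content: string };
type ChatResponse = { message: ChatMessage };
type ChatResponseChunk = { delta: string };
type LLMChatParamsStreaming = { messages: ChatMessage[]; stream: true };
type LLMChatParamsNonStreaming = { messages: ChatMessage[]; stream?: false };

// A toy subclass-like implementation that echoes the conversation back.
class EchoLLM {
  chat(params: LLMChatParamsStreaming): Promise<AsyncIterable<ChatResponseChunk>>;
  chat(params: LLMChatParamsNonStreaming): Promise<ChatResponse>;
  async chat(
    params: LLMChatParamsStreaming | LLMChatParamsNonStreaming,
  ): Promise<AsyncIterable<ChatResponseChunk> | ChatResponse> {
    const text = params.messages.map((m) => m.content).join(" ");
    if (params.stream) {
      // Streaming overload: yield one chunk per word.
      async function* gen(): AsyncIterable<ChatResponseChunk> {
        for (const word of text.split(" ")) yield { delta: word };
      }
      return gen();
    }
    // Non-streaming overload: return a single response.
    return { message: { role: "assistant", content: text } };
  }
}
```

Callers select the overload by passing `stream: true` or omitting it, and consume either a single `ChatResponse` or an async iterable of chunks accordingly.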


chatToComplete

Private chatToComplete(stream): AsyncIterable<CompletionResponse>

Parameters

| Name | Type |
| :--- | :--- |
| stream | AsyncIterable&lt;ChatResponseChunk&gt; |

Returns

AsyncIterable<CompletionResponse>

Defined in

packages/core/src/llm/LLM.ts:130
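
This private helper adapts a streaming chat response into a streaming completion response. The actual implementation is internal to the library; as a sketch of what such an adapter might look like, assuming each chat chunk's `delta` maps directly onto a completion's `text`:

```typescript
// Illustrative stand-ins for the library's chunk and response types.
type ChatResponseChunk = { delta: string };
type CompletionResponse = { text: string };

// Sketch of a chat-to-completion adapter: each incoming chat delta is
// re-wrapped as a completion response carrying the same text.
async function* chatToComplete(
  stream: AsyncIterable<ChatResponseChunk>,
): AsyncIterable<CompletionResponse> {
  for await (const chunk of stream) {
    yield { text: chunk.delta };
  }
}
```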


complete

complete(params): Promise<AsyncIterable<CompletionResponse>>

Get a prompt completion from the LLM

Parameters

| Name | Type |
| :--- | :--- |
| params | LLMCompletionParamsStreaming |

Returns

Promise<AsyncIterable<CompletionResponse>>

Implementation of

LLM.complete

Defined in

packages/core/src/llm/LLM.ts:138

complete(params): Promise<CompletionResponse>

Parameters

| Name | Type |
| :--- | :--- |
| params | LLMCompletionParamsNonStreaming |

Returns

Promise<CompletionResponse>

Implementation of

LLM.complete

Defined in

packages/core/src/llm/LLM.ts:141
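
As with `chat`, the two `complete` overloads are distinguished by the `stream` flag. A minimal stub with the same two shapes, echoing the prompt, shows how each call site consumes its result (the param/response types here are simplified stand-ins, not the library's real definitions):

```typescript
// Illustrative stand-ins; the real params/response types live in the library.
type CompletionResponse = { text: string };
type LLMCompletionParamsStreaming = { prompt: string; stream: true };
type LLMCompletionParamsNonStreaming = { prompt: string; stream?: false };

// A stub exposing the two `complete` shapes; the real BaseLLM derives
// these from `chat` (via the private chatToComplete adapter).
class StubLLM {
  complete(params: LLMCompletionParamsStreaming): Promise<AsyncIterable<CompletionResponse>>;
  complete(params: LLMCompletionParamsNonStreaming): Promise<CompletionResponse>;
  async complete(
    params: LLMCompletionParamsStreaming | LLMCompletionParamsNonStreaming,
  ): Promise<AsyncIterable<CompletionResponse> | CompletionResponse> {
    const prompt = params.prompt;
    if (params.stream) {
      // Streaming overload: one response per word of the prompt.
      async function* gen(): AsyncIterable<CompletionResponse> {
        for (const word of prompt.split(" ")) yield { text: word };
      }
      return gen();
    }
    // Non-streaming overload: a single response with the full text.
    return { text: prompt };
  }
}
```

A non-streaming call awaits one `CompletionResponse`; a streaming call awaits an async iterable and loops over it with `for await`.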


tokens

Abstract tokens(messages): number

Calculates the number of tokens needed for the given chat messages

Parameters

| Name | Type |
| :--- | :--- |
| messages | ChatMessage[] |

Returns

number

Implementation of

LLM.tokens

Defined in

packages/core/src/llm/LLM.ts:168
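
Each subclass implements `tokens` with its model's actual tokenizer. As a purely illustrative sketch, a naive word-count estimator over a simplified `ChatMessage` shape might look like:

```typescript
// Simplified message shape for illustration only.
type ChatMessage = { role: string; content: string };

// Naive stand-in for a real tokenizer: counts whitespace-separated words
// across all messages. Actual subclasses use model-specific tokenization.
function tokens(messages: ChatMessage[]): number {
  return messages.reduce(
    (sum, m) => sum + m.content.split(/\s+/).filter(Boolean).length,
    0,
  );
}
```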