Class: BaseLLM
Unified language model interface
Hierarchy
Implements
Constructors
constructor
• new BaseLLM()
Properties
metadata
• Abstract metadata: LLMMetadata
Implementation of
Defined in
packages/core/src/llm/LLM.ts:128
Methods
chat
▸ Abstract chat(params): Promise<AsyncIterable<ChatResponseChunk>>
Get a chat response from the LLM
Parameters
Name | Type |
---|---|
params | LLMChatParamsStreaming |
Returns
Promise<AsyncIterable<ChatResponseChunk>>
Implementation of
Defined in
packages/core/src/llm/LLM.ts:163
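The streaming overload resolves to an async iterable of chunks rather than a single response. A minimal consuming sketch, using stand-in types (`delta`, `stream: true`, and the message shape are assumptions for illustration, not the library's actual definitions):

```typescript
// Stand-in types for illustration only; the real definitions live in the library.
type ChatResponseChunk = { delta: string };
type ChatMessage = { role: string; content: string };
type LLMChatParamsStreaming = { messages: ChatMessage[]; stream: true };

// Toy streaming `chat`: resolves to an AsyncIterable of chunks.
async function chat(
  params: LLMChatParamsStreaming,
): Promise<AsyncIterable<ChatResponseChunk>> {
  async function* chunks() {
    yield { delta: "Hello" };
    yield { delta: ", world" };
  }
  return chunks();
}

// Consuming pattern: await the promise, then `for await` over the chunks.
async function main(): Promise<string> {
  const stream = await chat({
    messages: [{ role: "user", content: "Hi" }],
    stream: true,
  });
  let text = "";
  for await (const chunk of stream) text += chunk.delta;
  return text;
}

main().then(console.log); // prints "Hello, world"
```

Note the two awaits: one for the promise wrapping the stream, then `for await` over each chunk.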
▸ Abstract chat(params): Promise<ChatResponse>
Parameters
Name | Type |
---|---|
params | LLMChatParamsNonStreaming |
Returns
Promise<ChatResponse>
Implementation of
Defined in
packages/core/src/llm/LLM.ts:166
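The non-streaming overload resolves to one complete response. A sketch with assumed stand-in types (the `message` field shape is an illustration, not the library's definition):

```typescript
// Stand-in types for illustration only; not the library's actual definitions.
type ChatMessage = { role: string; content: string };
type ChatResponse = { message: ChatMessage };
type LLMChatParamsNonStreaming = { messages: ChatMessage[] };

// Toy non-streaming `chat`: echoes the last user message back.
async function chat(params: LLMChatParamsNonStreaming): Promise<ChatResponse> {
  const last = params.messages[params.messages.length - 1];
  return { message: { role: "assistant", content: `echo: ${last.content}` } };
}

async function main(): Promise<string> {
  const response = await chat({ messages: [{ role: "user", content: "Hi" }] });
  return response.message.content;
}

main().then(console.log); // prints "echo: Hi"
```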
chatToComplete
▸ Private chatToComplete(stream): AsyncIterable<CompletionResponse>
Parameters
Name | Type |
---|---|
stream | AsyncIterable <ChatResponseChunk > |
Returns
AsyncIterable<CompletionResponse>
Defined in
packages/core/src/llm/LLM.ts:130
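The implementation is private, but the signature suggests an adapter from chat chunks to completion responses. A speculative sketch only, assuming `ChatResponseChunk` carries a `delta` string and `CompletionResponse` a `text` string (both assumed shapes):

```typescript
// Assumed shapes; the library's actual types may differ.
type ChatResponseChunk = { delta: string };
type CompletionResponse = { text: string };

// Speculative sketch: re-label each streamed chat delta as a completion
// response, so a chat stream can back the streaming `complete` API.
async function* chatToComplete(
  stream: AsyncIterable<ChatResponseChunk>,
): AsyncIterable<CompletionResponse> {
  for await (const chunk of stream) {
    yield { text: chunk.delta };
  }
}

// Example: wrap a small chat stream and collect the completion texts.
async function main(): Promise<string> {
  async function* chatStream() {
    yield { delta: "foo" };
    yield { delta: "bar" };
  }
  let out = "";
  for await (const r of chatToComplete(chatStream())) out += r.text;
  return out;
}

main().then(console.log); // prints "foobar"
```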
complete
▸ complete(params): Promise<AsyncIterable<CompletionResponse>>
Get a prompt completion from the LLM
Parameters
Name | Type |
---|---|
params | LLMCompletionParamsStreaming |
Returns
Promise<AsyncIterable<CompletionResponse>>
Implementation of
Defined in
packages/core/src/llm/LLM.ts:138
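Streaming completion mirrors streaming chat but takes a prompt instead of messages. A sketch with assumed stand-in types (`prompt`, `text`, and `stream: true` are illustrative guesses at the shapes):

```typescript
// Stand-in types for illustration only (assumed shapes, not the library's).
type CompletionResponse = { text: string };
type LLMCompletionParamsStreaming = { prompt: string; stream: true };

// Toy streaming `complete`: echoes the prompt back one word at a time.
async function complete(
  params: LLMCompletionParamsStreaming,
): Promise<AsyncIterable<CompletionResponse>> {
  async function* gen() {
    for (const word of params.prompt.split(" ")) {
      yield { text: word };
    }
  }
  return gen();
}

async function main(): Promise<string[]> {
  const stream = await complete({ prompt: "one two three", stream: true });
  const parts: string[] = [];
  for await (const r of stream) parts.push(r.text);
  return parts;
}

main().then((p) => console.log(p.join("|"))); // prints "one|two|three"
```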
▸ complete(params): Promise<CompletionResponse>
Parameters
Name | Type |
---|---|
params | LLMCompletionParamsNonStreaming |
Returns
Promise<CompletionResponse>
Implementation of
Defined in
packages/core/src/llm/LLM.ts:141
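The non-streaming completion overload resolves to a single full response. A sketch with assumed stand-in types:

```typescript
// Stand-in types for illustration only; not the library's actual definitions.
type CompletionResponse = { text: string };
type LLMCompletionParamsNonStreaming = { prompt: string };

// Toy non-streaming `complete`: resolves to one full response.
async function complete(
  params: LLMCompletionParamsNonStreaming,
): Promise<CompletionResponse> {
  return { text: `completed: ${params.prompt}` };
}

async function main(): Promise<string> {
  const response = await complete({ prompt: "Once upon a time" });
  return response.text;
}

main().then(console.log); // prints "completed: Once upon a time"
```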
tokens
▸ Abstract tokens(messages): number
Calculates the number of tokens needed for the given chat messages.
Parameters
Name | Type |
---|---|
messages | ChatMessage [] |
Returns
number
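The actual count is model-specific and left to each subclass. A naive illustrative stand-in that counts whitespace-separated words (the `ChatMessage` shape here is an assumption; real implementations use a model-specific tokenizer):

```typescript
// Stand-in type; the real ChatMessage comes from the library.
type ChatMessage = { role: string; content: string };

// Naive illustrative tokenizer: counts whitespace-separated words across
// all messages. Real subclasses use a model-specific tokenizer instead.
function tokens(messages: ChatMessage[]): number {
  return messages.reduce(
    (sum, m) => sum + m.content.split(/\s+/).filter(Boolean).length,
    0,
  );
}

console.log(tokens([{ role: "user", content: "How many tokens?" }])); // 3
```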