Module: core/conversation
Type Aliases
ConversationMessage
Ƭ ConversationMessage: ConversationMessageType<"user", typeof UserMessage> | ConversationMessageType<"assistant", typeof AssistantMessage> | ConversationMessageType<"system", typeof SystemMessage> | ConversationMessageType<"functionCall", typeof FunctionCall> | ConversationMessageType<"functionResponse", typeof FunctionResponse>
A type that represents a conversation message.
Defined in
ai-jsx/src/core/conversation.tsx:163
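To illustrate how this discriminated union is typically consumed, here is a standalone sketch. It models the union with plain objects (the real type carries AI.JSX elements in its `element` field) and narrows on the `type` discriminant; the `describe` helper and its strings are illustrative, not part of the library.

```typescript
// Standalone model of the ConversationMessage union. The real type wraps
// AI.JSX elements; only the "type" discriminant behavior is shown here.
type ConversationMessage =
  | { type: "user"; element: unknown }
  | { type: "assistant"; element: unknown }
  | { type: "system"; element: unknown }
  | { type: "functionCall"; element: unknown }
  | { type: "functionResponse"; element: unknown };

// Narrowing on the discriminant lets each message kind be handled exhaustively.
function describe(msg: ConversationMessage): string {
  switch (msg.type) {
    case "user":
      return "said by the user";
    case "assistant":
      return "said by the model";
    case "system":
      return "instructions for the model";
    case "functionCall":
      return "a function the model wants invoked";
    case "functionResponse":
      return "the result of a function call";
  }
}
```

Because the union is discriminated, TypeScript enforces that the `switch` covers every message kind.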
Variables
ConversationHistoryContext
• Const ConversationHistoryContext: Context<Node>
Sets the node that the ConversationHistory component will resolve to.
Defined in
ai-jsx/src/core/conversation.tsx:65
Functions
AssistantMessage
▸ AssistantMessage(«destructured»): Node
Provide an Assistant Message to the LLM, for use within a ChatCompletion.
The assistant message tells the model what it has previously said. See https://platform.openai.com/docs/guides/gpt/chat-completions-api for more detail.
Example
```tsx
<ChatCompletion>
  <UserMessage>I'd like to cancel my account.</UserMessage>
  <AssistantMessage>Sorry to hear that. Can you tell me why?</AssistantMessage>
  <UserMessage>It's too expensive.</UserMessage>
</ChatCompletion>
```
==> "Ok, thanks for that feedback. I'll cancel your account."
Parameters
Name | Type |
---|---|
«destructured» | Object |
› children | Node |
› metadata? | Record <string , Jsonifiable > |
Returns
Node
Defined in
ai-jsx/src/core/conversation.tsx:58
ConversationHistory
▸ ConversationHistory(_, «destructured»): null | string | number | boolean | Element<any> | Node[] | IndirectNode
Renders to the conversation history provided through ConversationHistoryContext.
Parameters
Name | Type |
---|---|
_ | Object |
«destructured» | ComponentContext |
Returns
null | string | number | boolean | Element<any> | Node[] | IndirectNode
Defined in
ai-jsx/src/core/conversation.tsx:70
Converse
▸ Converse(«destructured», «destructured»): RenderableStream
A component that appends messages to a conversation according to a reply function.
The reply function is invoked with the new messages produced from its own replies until it fails to return a conversational message (e.g. UserMessage or AssistantMessage). The first invocation includes the messages from the children prop.
Example
```tsx
<Converse reply={function (messages, fullConversation) {
  const lastMessage = messages[messages.length - 1];
  if (lastMessage.type === "user") {
    return (
      <ChatCompletion functions={functions}>
        {fullConversation.map(msg => msg.element)}
      </ChatCompletion>
    );
  }
  if (lastMessage.type === "functionCall") {
    return <EvaluateFunction name={lastMessage.element.name} args={lastMessage.element.args} />;
  }
  return null;
}}>
  <ConversationHistory />
</Converse>
```
==> 'Hello there!'
Parameters
Name | Type |
---|---|
«destructured» | Object |
› children | Node |
› reply | (messages : ConversationMessage [], fullConversation : ConversationMessage []) => Renderable |
«destructured» | ComponentContext |
Returns
RenderableStream
Defined in
ai-jsx/src/core/conversation.tsx:284
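The reply loop described above can be sketched synchronously. This is a minimal standalone model, not the real implementation: the real component operates on AI.JSX renderables and streams, while here each message is just a `{ type, text }` record. The `converse` name and `Msg` shape are illustrative; only the termination rule is the point, namely that reply keeps being invoked with its own newly produced messages until it yields none.

```typescript
// Minimal sketch of the Converse reply loop over plain message records.
type Msg = {
  type: "user" | "assistant" | "system" | "functionCall" | "functionResponse";
  text: string;
};

function converse(
  children: Msg[],
  reply: (newMessages: Msg[], fullConversation: Msg[]) => Msg[]
): Msg[] {
  const conversation = [...children]; // first invocation sees the children
  let latest = children;
  while (latest.length > 0) {
    // Each round, reply receives only the messages produced last round,
    // plus the full conversation so far.
    latest = reply(latest, conversation);
    conversation.push(...latest);
  }
  return conversation;
}
```

A reply that answers the user once and then returns an empty array terminates after two rounds, mirroring the example above.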
FunctionCall
▸ FunctionCall(«destructured»): string
Provide a function call to the LLM, for use within a ChatCompletion.
The function call tells the model that a function was previously invoked by the model. See https://platform.openai.com/docs/guides/gpt/chat-completions-api for more detail. When the model returns a function call, {@link ChatCompletion} returns a {@link FunctionCall} component.
Example
```tsx
<ChatCompletion>
  <UserMessage>What is 258 * 322?</UserMessage>
  <FunctionCall name="evaluate_math" args={{ expression: "258 * 322" }} />
  <FunctionResponse name="evaluate_math">83076</FunctionResponse>
</ChatCompletion>
```
==> "That would be 83,076."
Parameters
Name | Type |
---|---|
«destructured» | Object |
› args | Record <string , null | string | number | boolean > |
› name | string |
› metadata? | Record <string , Jsonifiable > |
› partial? | boolean |
Returns
string
Defined in
ai-jsx/src/core/conversation.tsx:101
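To show how an application might act on a function call message, here is a hypothetical dispatcher: it looks up the call's name in a local table and evaluates it with the call's args. The `respond` helper, the `functions` table, and the toy `evaluate_math` (which only handles a single "a * b" expression) are all illustrative assumptions, not ai-jsx API.

```typescript
// Hypothetical dispatch from a function call's name/args pair to a local
// implementation, producing the text of a FunctionResponse.
type FunctionCallMsg = {
  name: string;
  args: Record<string, string | number | boolean | null>;
};

const functions: Record<string, (args: FunctionCallMsg["args"]) => string> = {
  // Toy version of the "evaluate_math" function from the example above;
  // it only understands a single "a * b" product.
  evaluate_math: (args) => {
    const [a, b] = String(args.expression).split("*").map((s) => Number(s.trim()));
    return String(a * b);
  },
};

function respond(call: FunctionCallMsg): string {
  const impl = functions[call.name];
  if (!impl) throw new Error(`Unknown function: ${call.name}`);
  return impl(call.args);
}
```

The string this returns is what would be rendered inside the matching FunctionResponse.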
FunctionResponse
▸ FunctionResponse(«destructured»): Element
Renders to the output of a previous FunctionCall component, for use within a ChatCompletion.
See https://platform.openai.com/docs/guides/gpt/chat-completions-api for more detail.
Example
```tsx
<ChatCompletion>
  <UserMessage>What is 258 * 322?</UserMessage>
  <FunctionCall name="evaluate_math" args={{ expression: "258 * 322" }} />
  <FunctionResponse name="evaluate_math">83076</FunctionResponse>
</ChatCompletion>
```
==> "That would be 83,076."
Parameters
Name | Type |
---|---|
«destructured» | Object |
› children | Node |
› name | string |
› failed? | boolean |
› metadata? | Record <string , Jsonifiable > |
Returns
Element
Defined in
ai-jsx/src/core/conversation.tsx:130
ShowConversation
▸ ShowConversation(«destructured», «destructured»): RenderableStream
Allows the presentation of conversational components (UserMessage et al.) to be altered.
Also accepts an onComplete prop, which will be invoked once per render with the entire conversation.
Example
```tsx
<ShowConversation present={(msg) => msg.type === "assistant" && <>Assistant: {msg.element}</>}>
  <UserMessage>This is not visible.</UserMessage>
  <AssistantMessage>This is visible!</AssistantMessage>
</ShowConversation>
```
==> 'Assistant: This is visible!'
Parameters
Name | Type |
---|---|
«destructured» | Object |
› children | Node |
› onComplete? | (conversation : ConversationMessage [], render : { <TIntermediate>(renderable : Renderable , opts? : RenderOpts <TIntermediate , string >) => RenderResult <TIntermediate , string >; <TIntermediate>(renderable : Renderable , opts : RenderOpts <TIntermediate , PartiallyRendered []>) => RenderResult <TIntermediate , PartiallyRendered []> }) => void | Promise <void > |
› present? | (message : ConversationMessage , index : number ) => Node |
«destructured» | ComponentContext |
Returns
RenderableStream
Defined in
ai-jsx/src/core/conversation.tsx:327
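The present/onComplete contract can be sketched over plain message records. This standalone model (the `showConversation` function and `Msg` shape are illustrative, and the real component works over AI.JSX renderables) shows the two roles: `present` maps each message to output or hides it by returning null, and `onComplete` is invoked once with the entire conversation.

```typescript
// Sketch of ShowConversation's presentation pass over plain message records.
type Msg = { type: "user" | "assistant"; text: string };

function showConversation(
  messages: Msg[],
  present: (message: Msg, index: number) => string | null,
  onComplete?: (conversation: Msg[]) => void
): string {
  const shown = messages
    .map((message, index) => present(message, index))
    .filter((piece): piece is string => piece !== null); // null hides a message
  if (onComplete) onComplete(messages); // invoked once with the whole conversation
  return shown.join("");
}
```

Filtering to assistant messages reproduces the example above: only "Assistant: This is visible!" survives.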
SystemMessage
▸ SystemMessage(«destructured»): Node
Provide a System Message to the LLM, for use within a ChatCompletion.
The system message can be used to put the model in character. See https://platform.openai.com/docs/guides/gpt/chat-completions-api for more detail.
Example
```tsx
<ChatCompletion>
  <SystemMessage>You are a helpful customer service agent.</SystemMessage>
</ChatCompletion>
```
Parameters
Name | Type |
---|---|
«destructured» | Object |
› children | Node |
› metadata? | Record <string , Jsonifiable > |
Returns
Node
Defined in
ai-jsx/src/core/conversation.tsx:20
UserMessage
▸ UserMessage(«destructured»): Node
Provide a User Message to the LLM, for use within a ChatCompletion.
The user message tells the model what the user has said. See https://platform.openai.com/docs/guides/gpt/chat-completions-api for more detail.
Example
```tsx
<ChatCompletion>
  <UserMessage>I'd like to cancel my account.</UserMessage>
</ChatCompletion>
```
==> 'Sorry to hear that. Can you tell me why?'
Parameters
Name | Type |
---|---|
«destructured» | Object |
› children | Node |
› metadata? | Record <string , Jsonifiable > |
› name? | string |