Reformulate your system responses using LLM
Overview
The llm.reformulate() function generates reformulated responses based on a description of the intended speaker, creating dynamic and context-aware conversational experiences. With this function you can engage users through natural language while preserving the essence of your conversation design and providing personalized, relevant responses.
Function Signature
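The signature below is a sketch reconstructed from the parameter list that follows; the exact declaration in the SDK may differ (for example, in how the default LLMConfig is expressed).

```kotlin
fun llm.reformulate(
    context: Context,
    numTurns: Int = 5,
    personaName: String = "System",
    personaPersonality: String = "You are helpful, creative, clever, and very friendly.",
    defaultResponse: String = "System",
    prompt: String? = null,       // falls back to a default prompt when null
    config: LLMConfig = LLMConfig()
): String
```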
Parameters
context
(Context): The context of the conversation. It contains relevant information about the ongoing conversation, represented as a Context object.
numTurns
(Int, optional, default=5): The number of previous turns to consider when generating the response.
personaName
(String, optional, default="System"): The name of the persona roleplayed in the conversation. By defining a persona, you can give the AI a distinct personality to emulate during interactions.
personaPersonality
(String, optional, default="You are helpful, creative, clever, and very friendly."): The personality traits of the roleplayed persona. This parameter defines the tone and style of the AI's responses.
defaultResponse
(String, optional, default="System"): The default sentence that the AI will attempt to reformulate without changing its meaning. This allows you to set the context for the reformulation task.
prompt
(String, optional): A custom prompt for the AI persona. This prompt provides additional context, guiding the AI on how to approach the reformulation task. If not provided, a default prompt is used.
config
(LLMConfig, optional): A configuration object of type LLMConfig. Defaults to the default configuration values.
Return Value
The function returns a String, which is the reformulated response generated by the AI persona based on the input parameters and the provided context.
Example Use Cases
Use Case 1: Contextual Conversation
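A minimal sketch of a contextual call: only the required context plus numTurns and defaultResponse are set, so the AI rewrites a fixed system sentence in light of the recent conversation. The context value and the defaultResponse text are hypothetical placeholders, not part of the documented API.

```kotlin
// Assumes an existing Context object for the ongoing conversation.
val reply = llm.reformulate(
    context = context,          // hypothetical: the current conversation state
    numTurns = 3,               // consider only the last 3 turns
    defaultResponse = "I could not find that item."  // sentence to be rephrased
)
println(reply)  // e.g. a politer, context-aware variant of the default sentence
```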
Use Case 2: Personalized Response
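A sketch of a persona-driven call: personaName and personaPersonality shape the tone of the reformulation. The persona name, personality text, and defaultResponse below are illustrative assumptions.

```kotlin
val reply = llm.reformulate(
    context = context,          // hypothetical: the current conversation state
    personaName = "Ava",        // hypothetical persona name
    personaPersonality = "You are a cheerful travel assistant who keeps answers short.",
    defaultResponse = "Your booking has been confirmed."
)
println(reply)  // the confirmation, restated in Ava's voice without changing its meaning
```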