ResponsesResource

ResponsesResource(client: Client)

Bases: BaseResource

add

add(response: str, prompt_template: PromptTemplate, template_variables: TemplateVariables, metadata: LLMConfig | GenerationMetadata | None = None) -> PromptResponse

Add a response to a prompt template.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| response | str | The response to add. | required |
| prompt_template | PromptTemplate | The prompt template to add the response to. | required |
| template_variables | TemplateVariables | The template variables to use for the response. | required |
| metadata | LLMConfig \| GenerationMetadata \| None | Optional metadata to associate with the response. | None |

Returns:

| Name | Type | Description |
| --- | --- | --- |
| PromptResponse | PromptResponse | The newly created prompt response object. |
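
A minimal usage sketch (not taken from the SDK itself): it assumes the resource is exposed as client.responses and that prompt_template and template_variables were already created or fetched elsewhere.

```python
# Hypothetical setup: how client, prompt_template and template_variables are
# obtained depends on your project; only the add() call below is documented here.
prompt_response = client.responses.add(
    response="Paris is the capital of France.",
    prompt_template=prompt_template,        # an existing PromptTemplate
    template_variables=template_variables,  # the TemplateVariables this response answers
    metadata=None,                          # optionally an LLMConfig or GenerationMetadata
)
print(prompt_response)  # the newly created PromptResponse
```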

add_many

add_many(responses: List[str], prompt_template: PromptTemplate, template_variables: List[TemplateVariables], metadata: List[LLMConfig | GenerationMetadata | None] | None = None, timeout: float | None = None) -> List[PromptResponse]

Add multiple responses to a prompt template in bulk.

Use this method when you have a list of responses to add, instead of adding them one by one with the add() method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| responses | list[str] | List of responses to add. | required |
| prompt_template | PromptTemplate | The prompt template to add responses to. | required |
| template_variables | list[TemplateVariables] | List of template variables for each response. | required |
| metadata | list[LLMConfig \| GenerationMetadata \| None] \| None | Optional list of metadata for each response. | None |
| timeout | float \| None | Timeout in seconds for API requests. Defaults to no timeout. | None |

Returns:

| Type | Description |
| --- | --- |
| list[PromptResponse] | List of newly created prompt response objects. |
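
A bulk-insertion sketch, again assuming a client.responses handle; responses and template_variables are paired by position, so both lists should have the same length (variables_a and variables_b stand in for existing TemplateVariables objects).

```python
prompt_responses = client.responses.add_many(
    responses=["Answer for input A", "Answer for input B"],
    prompt_template=prompt_template,
    template_variables=[variables_a, variables_b],  # one TemplateVariables per response
    timeout=30.0,                                   # optional; defaults to no timeout
)
print(len(prompt_responses))  # one PromptResponse per submitted response
```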

delete

delete(prompt_response: PromptResponse) -> None

Delete a prompt response.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| prompt_response | PromptResponse | The prompt response to delete. | required |
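
A sketch that combines list() and delete(), assuming client.responses exists, to clear out all stored responses for one template.

```python
# Hypothetical clean-up: fetch the responses for a template, then delete each one.
for prompt_response in client.responses.list(prompt_template=prompt_template):
    client.responses.delete(prompt_response)  # returns None
```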

generate

generate(prompt_template: PromptTemplate, template_variables: TemplateVariables, llm_config: LLMConfig | None = None) -> PromptResponse

Generate a response for a prompt template using an LLM.

This method sends the prompt to an LLM for generation. If no LLM config is provided, the project's default LLM config will be used.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| prompt_template | PromptTemplate | The prompt template to generate a response for. | required |
| template_variables | TemplateVariables | The template variables to use for the response. | required |
| llm_config | LLMConfig \| None | Optional LLM configuration to use for generation. If not provided, the project's default config will be used. | None |

Returns:

| Name | Type | Description |
| --- | --- | --- |
| PromptResponse | PromptResponse | The generated response object. |

Raises:

| Type | Description |
| --- | --- |
| ValueError | If no template variables source is provided (either template_variables or template_variables_id). |
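
A sketch of both call styles, assuming client.responses and pre-existing prompt_template, template_variables, and llm_config objects.

```python
# With the project's default LLM config:
generated = client.responses.generate(
    prompt_template=prompt_template,
    template_variables=template_variables,
)

# With an explicit LLMConfig overriding the project default:
generated_custom = client.responses.generate(
    prompt_template=prompt_template,
    template_variables=template_variables,
    llm_config=llm_config,
)
```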

generate_many

generate_many(prompt_template: PromptTemplate, *, template_variables: List[TemplateVariables], llm_config: LLMConfig | None = None, timeout: float | None = None) -> List[PromptResponse]
generate_many(prompt_template: PromptTemplate, *, collection: TemplateVariablesCollection, llm_config: LLMConfig | None = None, timeout: float | None = None) -> List[PromptResponse]
generate_many(prompt_template: PromptTemplate, *, template_variables: List[TemplateVariables] | None = None, collection: TemplateVariablesCollection | None = None, llm_config: LLMConfig | None = None, timeout: float | None = None) -> List[PromptResponse]

Generate multiple responses for a prompt template.

Use this method when you have a list of responses to generate, instead of generating them one by one with the generate() method.

Either template_variables or collection can be provided:

- If template_variables is given, it will use the provided list of template variables for each response.
- If collection is given, it will use the template variables from the specified collection.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| prompt_template | PromptTemplate | The prompt template to use for generation. | required |
| template_variables | list[TemplateVariables] \| None | List of template variables for each response. | None |
| collection | TemplateVariablesCollection \| None | The collection to use for the template variables. | None |
| llm_config | LLMConfig \| None | Optional LLMConfig to use for generation. | None |
| timeout | float \| None | Timeout in seconds for API requests. Defaults to no timeout. | None |

Returns:

| Type | Description |
| --- | --- |
| list[PromptResponse] | List of newly created prompt response objects. |
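
A sketch of the two mutually exclusive input styles, assuming client.responses plus placeholder variables_a, variables_b, collection, and llm_config objects.

```python
# 1) An explicit list of TemplateVariables, one generation per entry:
prompt_responses = client.responses.generate_many(
    prompt_template,
    template_variables=[variables_a, variables_b],
)

# 2) A whole TemplateVariablesCollection:
prompt_responses = client.responses.generate_many(
    prompt_template,
    collection=collection,
    llm_config=llm_config,  # optional
    timeout=120.0,          # optional; defaults to no timeout
)
```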

list

list(prompt_template: PromptTemplate | None = None, template_variables: TemplateVariables | None = None, experiment: Experiment | None = None) -> list[PromptResponse]

Returns the responses belonging to a prompt template, a set of template variables, or both. Responses can also be filtered by experiment.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| prompt_template | PromptTemplate \| None | The prompt template to get responses for. | None |
| template_variables | TemplateVariables \| None | The template variables to get responses for. | None |
| experiment | Experiment \| None | The experiment to get responses for. | None |

Returns:

| Type | Description |
| --- | --- |
| list[PromptResponse] | The list of prompt responses. |
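
A sketch of the available filters, assuming client.responses and pre-existing prompt_template, template_variables, and experiment objects.

```python
# All responses for a template:
by_template = client.responses.list(prompt_template=prompt_template)

# Narrowed to one set of template variables:
by_input = client.responses.list(
    prompt_template=prompt_template,
    template_variables=template_variables,
)

# Responses belonging to an experiment:
by_experiment = client.responses.list(experiment=experiment)
```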