Model

A model represents a generative AI model configuration that can be used for inference tasks such as text generation, chat, or embeddings. Models define how to connect to and configure specific AI providers like OpenAI, Anthropic, AWS Bedrock, or others.

Each model must have a unique id and specify a provider. Models are defined at the application level and can be referenced by steps like LLMInference, Agent, or InvokeEmbedding.

Key Principles

Type Discriminator

All models must include a type field for proper schema validation:

  • type: Model for standard generative models
  • type: EmbeddingModel for embedding/vectorization models

Referencing Models

Steps reference models by their ID:

models:
  - type: Model
    id: gpt4
    provider: openai
    model_id: gpt-4-turbo

flows:
  - type: Flow
    id: my_flow
    steps:
      - type: LLMInference
        model: gpt4  # References the model by ID

Rules and Behaviors

  • Unique IDs: Each model must have a unique id within the application. Duplicate model IDs will result in a validation error.
  • Model ID Resolution: If model_id is not specified, the model's id field is used as the model identifier for the provider.
  • Provider Requirement: The provider field is required and specifies which AI service to use (e.g., "openai", "anthropic", "aws-bedrock").
  • Authentication: Models can reference an AuthorizationProvider by ID or as a string reference for API authentication.
  • Inference Parameters: The inference_params dictionary allows customization of model behavior (temperature, max_tokens, etc.).
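
The rules above can be sketched in a single definition. This example assumes an AuthorizationProvider with ID anthropic_key exists elsewhere in the application; the model name and parameter values are illustrative, not prescribed:

```yaml
models:
  - type: Model
    id: claude
    provider: anthropic
    model_id: claude-3-5-sonnet
    auth: anthropic_key        # string reference to an AuthorizationProvider by ID
    inference_params:          # provider-specific tuning knobs
      temperature: 0.2
      max_tokens: 1024
```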

Model Types

Model

Describes a generative model configuration, including provider and model ID.

  • type (Literal): Type discriminator; must be Model.
  • id (str): Unique ID for the model.
  • auth (Reference[AuthProviderType] | str | None): AuthorizationProvider used for model access.
  • inference_params (dict[str, Any]): Optional inference parameters like temperature or max_tokens.
  • model_id (str | None): The specific model name or ID for the provider. If None, id is used.
  • provider (Literal): Name of the provider, e.g., openai or anthropic.
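
As a minimal sketch of the model_id resolution behavior: when model_id is omitted, the id field itself is passed to the provider as the model identifier, so the id must then be a name the provider accepts:

```yaml
models:
  - type: Model
    id: gpt-4o        # no model_id given, so "gpt-4o" is sent to OpenAI as the model name
    provider: openai
```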

EmbeddingModel

Describes an embedding model configuration, extending the base Model class.

  • type (Literal): Type discriminator; must be EmbeddingModel.
  • dimensions (int): Dimensionality of the embedding vectors produced by this model.
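
A hedged sketch of an embedding model definition, assuming OpenAI's text-embedding-3-small model (whose default output dimensionality is 1536):

```yaml
models:
  - type: EmbeddingModel
    id: embedder
    provider: openai
    model_id: text-embedding-3-small
    dimensions: 1536   # dimensionality of the vectors this model produces
```

An InvokeEmbedding step would then reference this model by ID, e.g. model: embedder.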

Models can reference AuthorizationProvider for secure API access.

Example Usage