
LiteLLM Embeddings

Generate embeddings using LiteLLM

Overview

This node generates text embeddings using the LiteLLM API. It converts input text into vector representations (embeddings) for use in AI applications such as semantic search, recommendation systems, and natural language understanding. The node is typically connected to a Vector Store or AI Agent, which consumes the generated embeddings.
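Under the hood, the node sends the input text to LiteLLM's OpenAI-compatible embeddings endpoint. A minimal sketch of how such a request body might be assembled is shown below; the model ID and helper name are illustrative placeholders, not the node's actual internals:

```python
import json

def build_embedding_request(model: str, texts: list[str]) -> str:
    """Serialize an embeddings request body for the LiteLLM API.

    Illustrative sketch only; the node constructs this internally.
    """
    payload = {
        "model": model,   # must be an embedding-capable model (ID includes 'embed')
        "input": texts,   # one or more strings to embed
    }
    return json.dumps(payload)

# Example: embed a single string with a placeholder model ID
body = build_embedding_request("text-embedding-3-small", ["hello world"])
print(body)
```

The API responds with one embedding vector per input string, which the node exposes as its output.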

Use Case Examples

  1. Generating embeddings for a set of documents to enable semantic search in a knowledge base.
  2. Creating vector representations of user queries for improved matching in a recommendation engine.

Properties

  • Notice: A reminder that the node must be connected to a Vector Store or AI Agent.
  • Model: The LiteLLM model used to generate embeddings; only models whose ID includes 'embed' are selectable.
  • Options: Additional options for customizing embedding generation.

Output

JSON

  • Embeddings: The embedding vectors returned by the LiteLLM API.

Dependencies

  • LiteLLM API key credential for authentication

Troubleshooting

  • Ensure the API key credential is correctly configured and has access to the LiteLLM service.
  • If the node fails to generate embeddings, verify the selected model supports embedding generation (model IDs must include 'embed').
  • Check the batch size does not exceed the maximum allowed (2048).
  • If embeddings seem incorrect, try toggling the 'Strip New Lines' option to see if input formatting affects results.
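The last two points amount to simple preprocessing before the API call: optionally strip newlines from each text, then split the inputs into batches of at most 2048 items. A sketch of that logic, with illustrative helper and parameter names rather than the node's actual internals:

```python
def preprocess(texts, strip_new_lines=True, max_batch_size=2048):
    """Normalize inputs and split them into API-sized batches.

    Mirrors the 'Strip New Lines' option and the 2048-item batch
    limit described above; names here are illustrative.
    """
    if strip_new_lines:
        # Replace newlines with spaces so formatting does not skew embeddings
        texts = [t.replace("\n", " ") for t in texts]
    # Split into chunks no larger than max_batch_size
    return [texts[i:i + max_batch_size]
            for i in range(0, len(texts), max_batch_size)]

# 3000 inputs exceed one batch, so they are split into 2048 + 952
batches = preprocess(["line one\nline two"] * 3000)
print(len(batches), len(batches[0]))  # 2 2048
```

Each resulting batch can then be sent as a separate embeddings request.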
