llm_chain

Module tokens

Tokens Module

This module provides utilities for managing tokens in Large Language Models (LLMs), primarily focused on measuring the size of prompts. This is useful for ensuring that a prompt stays within the context window supported by a given model.
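To illustrate the kind of bookkeeping this module describes, here is a minimal, self-contained sketch of a token-count check. The `TokenCount` struct below is a stand-in modeled on the description in this page (maximum tokens allowed plus tokens used), and the method names `tokens_remaining` and `fits` are assumptions for illustration, not the crate's actual API.

```rust
/// Stand-in for the token-count bookkeeping described in this module:
/// the model's context window size and the tokens a prompt consumes.
struct TokenCount {
    max_tokens: usize,  // maximum tokens the model's context window allows
    tokens_used: usize, // tokens already consumed by the prompt
}

impl TokenCount {
    /// Tokens still available before the context window is exceeded.
    fn tokens_remaining(&self) -> usize {
        self.max_tokens.saturating_sub(self.tokens_used)
    }

    /// Whether an additional `n` tokens would still fit in the window.
    fn fits(&self, n: usize) -> bool {
        n <= self.tokens_remaining()
    }
}

fn main() {
    let count = TokenCount { max_tokens: 4096, tokens_used: 3900 };
    println!("remaining: {}", count.tokens_remaining());
    println!("fits 300 more: {}", count.fits(300));
}
```

A check like `fits` is what lets a caller truncate or summarize a prompt before sending it, instead of having the model reject it for exceeding the context window.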

Structs

Token
Represents a single token.
TokenCollection
A type-safe, enum-backed collection of tokens.
TokenCount
Struct representing token count information, including the maximum tokens allowed and the total number of tokens used.

Enums

PromptTokensError
Custom error type for handling prompt token-related errors.
TokenizerError
Error type returned when tokenization fails.

Traits

ExecutorTokenCountExt
An extension trait for the Executor trait that provides additional methods for working with token counts.
Tokenizer
Trait for tokenizers that convert text into tokens and back.
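The shape of a tokenizer trait like the one listed above can be sketched as follows. This is an illustrative stand-in, not the crate's actual signatures: the method names `tokenize` and `count_tokens` are assumptions, and the whitespace tokenizer is a toy, since real LLM tokenizers use subword vocabularies such as BPE.

```rust
/// Illustrative tokenizer interface: split text into tokens and count them.
trait Tokenizer {
    /// Split `text` into tokens.
    fn tokenize(&self, text: &str) -> Vec<String>;

    /// Number of tokens `text` occupies; this is what a prompt-size
    /// check compares against a model's context window.
    fn count_tokens(&self, text: &str) -> usize {
        self.tokenize(text).len()
    }
}

/// Toy tokenizer that splits on whitespace, for demonstration only.
struct WhitespaceTokenizer;

impl Tokenizer for WhitespaceTokenizer {
    fn tokenize(&self, text: &str) -> Vec<String> {
        text.split_whitespace().map(str::to_owned).collect()
    }
}

fn main() {
    let t = WhitespaceTokenizer;
    println!("{}", t.count_tokens("count the tokens here"));
}
```

Providing `count_tokens` as a default method keeps implementors down to a single required method while still giving callers the count they need for context-window checks.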