§Tokens Module
This module provides utilities for managing tokens in Large Language Models (LLMs), primarily focusing on measuring the size of prompts. This is useful for ensuring that prompts stay within the context window supported by a given model.
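To make the prompt-sizing use case concrete, here is a minimal, self-contained Rust sketch of checking whether a prompt fits a model's context window. The `count_tokens` whitespace heuristic and the `MAX_CONTEXT_TOKENS` constant are illustrative assumptions, not this module's API; real models tokenize text with a model-specific tokenizer.

```rust
/// Illustrative context-window limit; real models expose their own limits.
const MAX_CONTEXT_TOKENS: usize = 4096;

/// Crude stand-in for a real tokenizer: counts whitespace-separated words.
/// Actual tokenizers split text into model-specific subword tokens.
fn count_tokens(prompt: &str) -> usize {
    prompt.split_whitespace().count()
}

fn main() {
    let prompt = "Summarize the following document in three sentences: ...";
    let used = count_tokens(prompt);

    if used <= MAX_CONTEXT_TOKENS {
        println!("Prompt uses {used} of {MAX_CONTEXT_TOKENS} tokens; it fits.");
    } else {
        println!("Prompt is {} tokens over the limit.", used - MAX_CONTEXT_TOKENS);
    }
}
```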
Structs§
- Token - Represents a single token.
- TokenCollection - A type-safe, enum-backed collection of tokens.
- TokenCount - Struct representing token count information, including the maximum tokens allowed and the total number of tokens used (see the sketch after this list).
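The sketch below shows one plausible shape for a token-count record like the one described above: a maximum and a used total, plus helpers for the remaining budget. The field and method names are assumptions made for illustration, not the crate's actual TokenCount definition.

```rust
/// Illustrative token-count record; field and method names are assumed,
/// not taken from the crate's actual TokenCount definition.
#[derive(Debug, Clone, Copy)]
struct TokenCountInfo {
    /// Maximum tokens the model's context window allows.
    max_tokens: usize,
    /// Tokens already consumed by the prompt.
    tokens_used: usize,
}

impl TokenCountInfo {
    /// Tokens still available before the context window is exhausted.
    fn remaining(&self) -> usize {
        self.max_tokens.saturating_sub(self.tokens_used)
    }

    /// Whether the prompt fits within the context window.
    fn fits(&self) -> bool {
        self.tokens_used <= self.max_tokens
    }
}

fn main() {
    let count = TokenCountInfo { max_tokens: 4096, tokens_used: 312 };
    assert!(count.fits());
    println!("{} tokens remaining", count.remaining());
}
```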
Enums§
- PromptTokensError - Custom error type for handling prompt token-related errors (see the sketch after this list).
- TokenizerError
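As a rough illustration of what a prompt token error type can capture, the sketch below defines a standalone error enum with variants for an oversized prompt and a tokenizer failure. The variant names and fields are assumptions for the example, not the crate's actual PromptTokensError or TokenizerError definitions.

```rust
use std::fmt;

/// Illustrative error type for prompt-token problems; variants are assumed
/// for the example, not taken from the crate's PromptTokensError.
#[derive(Debug)]
enum PromptTokenError {
    /// The prompt needs more tokens than the context window provides.
    NotEnoughRoom { needed: usize, available: usize },
    /// The underlying tokenizer failed to process the text.
    TokenizerFailed(String),
}

impl fmt::Display for PromptTokenError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            Self::NotEnoughRoom { needed, available } => {
                write!(f, "prompt needs {needed} tokens but only {available} are available")
            }
            Self::TokenizerFailed(msg) => write!(f, "tokenizer error: {msg}"),
        }
    }
}

impl std::error::Error for PromptTokenError {}

fn main() {
    let err = PromptTokenError::NotEnoughRoom { needed: 5000, available: 4096 };
    println!("{err}");
}
```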
Traits§
- ExecutorTokenCountExt - An extension trait for the Executor trait that provides additional methods for working with token counts (see the sketch after this list).
- Tokenizer
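The sketch below illustrates the extension-trait pattern that the ExecutorTokenCountExt description refers to: extra token-count methods layered on top of an existing trait via a blanket implementation. The Executor and tokenizer shapes here are simplified assumptions, not the crate's actual trait signatures.

```rust
/// Simplified stand-in for an executor trait; the crate's actual Executor
/// trait has a different, richer interface.
trait Executor {
    /// Maximum number of tokens the backing model accepts.
    fn max_context_tokens(&self) -> usize;
    /// Count tokens in a prompt (here: a crude whitespace heuristic).
    fn count_tokens(&self, prompt: &str) -> usize;
}

/// Extension trait adding convenience methods for token budgeting.
/// Implemented for every Executor via the blanket impl below.
trait ExecutorTokenCountExt: Executor {
    /// Tokens left over after the prompt is accounted for.
    fn remaining_tokens(&self, prompt: &str) -> usize {
        self.max_context_tokens()
            .saturating_sub(self.count_tokens(prompt))
    }
}

impl<T: Executor> ExecutorTokenCountExt for T {}

/// Toy executor used only to exercise the extension trait.
struct WordCountExecutor;

impl Executor for WordCountExecutor {
    fn max_context_tokens(&self) -> usize {
        4096
    }
    fn count_tokens(&self, prompt: &str) -> usize {
        prompt.split_whitespace().count()
    }
}

fn main() {
    let exec = WordCountExecutor;
    println!("{} tokens remain", exec.remaining_tokens("Hello token counting world"));
}
```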