llm_client 0.0.6

The Easiest Rust Interface for Local LLMs, and an Interface for Deterministic Signals from Probabilistic LLM Vibes


There is currently very little structured metadata to build this page from. Check the main library docs, README, or Cargo.toml in case the author documented the features there.

This version has 4 feature flags, 3 of which are enabled by default.

- default
- llama_cpp_backend (default)
- all
- mistral_rs_backend
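As a sketch of how these flags are selected, the snippet below shows a Cargo.toml dependency entry that disables the default llama_cpp_backend feature and opts into mistral_rs_backend instead. The feature names and version come from this page; whether mistral_rs_backend is usable on its own depends on the crate's own documentation.

```toml
# Hypothetical Cargo.toml fragment: swap the default llama.cpp backend
# for the mistral.rs backend by opting out of default features.
[dependencies]
llm_client = { version = "0.0.6", default-features = false, features = ["mistral_rs_backend"] }
```

Enabling the `all` feature instead would turn on every optional backend at once, at the cost of a larger build.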