llama-cpp-2 0.1.91

llama.cpp bindings for Rust
Build #1690876 2025-01-30T18:31:43.984259+00:00
# rustc version
rustc 1.86.0-nightly (ae5de6c75 2025-01-29)

# docs.rs version
docsrs 0.6.0 (29a0e81b 2025-01-22)

# build log
[INFO] running `Command { std: "docker" "create" "-v" "/home/cratesfyi/workspace-builder/builds/llama-cpp-2-0.1.91/target:/opt/rustwide/target:rw,Z" "-v" "/home/cratesfyi/workspace-builder/builds/llama-cpp-2-0.1.91/source:/opt/rustwide/workdir:ro,Z" "-v" "/home/cratesfyi/workspace-builder/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/home/cratesfyi/workspace-builder/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "DOCS_RS=1" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "6442450944" "--cpus" "6" "--user" "1001:1001" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:c80049f3b88b82089a44e0f06d0d6029d44b96b7257e55a1cd63dbc9f4c33334" "/opt/rustwide/cargo-home/bin/cargo" "+nightly" "rustdoc" "--lib" "-Zrustdoc-map" "--features" "sampler" "--config" "build.rustdocflags=[\"--cfg\", \"docsrs\", \"-Z\", \"unstable-options\", \"--emit=invocation-specific\", \"--resource-suffix\", \"-20250129-1.86.0-nightly-ae5de6c75\", \"--static-root-path\", \"/-/rustdoc.static/\", \"--cap-lints\", \"warn\", \"--extern-html-root-takes-precedence\"]" "--offline" "-Zunstable-options" "--config=doc.extern-map.registries.crates-io=\"https://docs.rs/{pkg_name}/{version}/x86_64-unknown-linux-gnu\"" "-Zrustdoc-scrape-examples" "-j6" "--target" "x86_64-unknown-linux-gnu", kill_on_drop: false }`
[INFO] [stderr] WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
[INFO] [stdout] e7d269a32c44c2bde56f58b5f8fc500f58f1d14a5c627a03bed50e328b53b20e
[INFO] running `Command { std: "docker" "start" "-a" "e7d269a32c44c2bde56f58b5f8fc500f58f1d14a5c627a03bed50e328b53b20e", kill_on_drop: false }`
[INFO] [stderr] warning: target filter specified, but no targets matched; this is a no-op
[INFO] [stderr]    Compiling llama-cpp-sys-2 v0.1.91
[INFO] [stderr]  Documenting llama-cpp-2 v0.1.91 (/opt/rustwide/workdir)
[INFO] [stderr] warning: unresolved link to `context::sample::sampler`
[INFO] [stderr]   --> src/lib.rs:15:28
[INFO] [stderr]    |
[INFO] [stderr] 15 | //! - `sampler` adds the [`context::sample::sampler`] struct for a more rusty way of sampling.
[INFO] [stderr]    |                            ^^^^^^^^^^^^^^^^^^^^^^^^ no item named `sample` in module `context`
[INFO] [stderr]    |
[INFO] [stderr]    = note: `#[warn(rustdoc::broken_intra_doc_links)]` on by default
[INFO] [stderr] 
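The `broken_intra_doc_links` warning above fires because the doc comment links to `context::sample::sampler`, a path that no longer exists in the documented module tree. A minimal sketch of the fix is to point the link at a path that actually resolves; the module and type names below are stand-ins for illustration, not llama-cpp-2's real layout.

```rust
// Hedged sketch: an intra-doc link only resolves when its path names a real
// item. The `context` module and `Sampler` type here are hypothetical.

/// A stand-in module, playing the role of `context` in the real crate.
pub mod context {
    /// A stand-in for the feature-gated sampler type.
    pub struct Sampler;
}

/// - `sampler` adds the [`context::Sampler`] struct for a more rusty way of sampling.
pub fn feature_list_note() -> &'static str {
    "sampler feature documented with a resolvable intra-doc link"
}

fn main() {
    println!("{}", feature_list_note());
}
```

If the linked item is feature-gated out of the docs build, another option is a plain code span (`` `context::sample::sampler` `` without brackets), which documents the name without asking rustdoc to resolve it.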
[INFO] [stderr] warning: this URL is not a hyperlink
[INFO] [stderr]    --> src/model.rs:526:5
[INFO] [stderr]     |
[INFO] [stderr] 526 | /     /// Apply the models chat template to some messages.
[INFO] [stderr] 527 | |     /// See https://github.com/ggerganov/llama.cpp/wiki/Templates-supported-by-llama_chat_apply_template
[INFO] [stderr] 528 | |     ///
[INFO] [stderr] 529 | |     /// `tmpl` of None means to use the default template provided by llama.cpp for the model
[INFO] [stderr] 530 | |     ///
[INFO] [stderr] 531 | |     /// # Errors
[INFO] [stderr] 532 | |     /// There are many ways this can fail. See [`ApplyChatTemplateError`] for more information.
[INFO] [stderr]     | |_______________________________________________________________________________________________^
[INFO] [stderr]     |
[INFO] [stderr]     = note: bare URLs are not automatically turned into clickable links
[INFO] [stderr]     = note: `#[warn(rustdoc::bare_urls)]` on by default
[INFO] [stderr] help: use an automatic link instead
[INFO] [stderr]     |
[INFO] [stderr] 526 ~     </// Apply the models chat template to some messages.
[INFO] [stderr] 527 |     /// See https://github.com/ggerganov/llama.cpp/wiki/Templates-supported-by-llama_chat_apply_template
[INFO] [stderr] ...
[INFO] [stderr] 531 |     /// # Errors
[INFO] [stderr] 532 ~     /// There are many ways this can fail. See [`ApplyChatTemplateError`] for more information.>
[INFO] [stderr]     |
[INFO] [stderr] 
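The `bare_urls` warning is the one of the two that `cargo fix` can apply automatically: wrapping the URL in angle brackets turns it into an automatic link. A minimal sketch of the fixed doc comment follows; the function is a stand-in, not the crate's `apply_chat_template` API.

```rust
// Hedged sketch: `<https://…>` makes the URL a clickable automatic link and
// silences `rustdoc::bare_urls`. The function below is hypothetical.

/// Apply the model's chat template to some messages.
/// See <https://github.com/ggerganov/llama.cpp/wiki/Templates-supported-by-llama_chat_apply_template>
pub fn template_docs_url() -> &'static str {
    "https://github.com/ggerganov/llama.cpp/wiki/Templates-supported-by-llama_chat_apply_template"
}

fn main() {
    println!("{}", template_docs_url());
}
```

A Markdown link with visible text, `` [Templates wiki](https://…) ``, would silence the lint equally well.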
[INFO] [stderr] warning: `llama-cpp-2` (lib doc) generated 2 warnings (run `cargo fix --lib -p llama-cpp-2` to apply 1 suggestion)
[INFO] [stderr]     Finished `dev` profile [unoptimized + debuginfo] target(s) in 2.05s
[INFO] [stderr]    Generated /opt/rustwide/target/x86_64-unknown-linux-gnu/doc/llama_cpp_2/index.html
[INFO] running `Command { std: "docker" "inspect" "e7d269a32c44c2bde56f58b5f8fc500f58f1d14a5c627a03bed50e328b53b20e", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "e7d269a32c44c2bde56f58b5f8fc500f58f1d14a5c627a03bed50e328b53b20e", kill_on_drop: false }`
[INFO] [stdout] e7d269a32c44c2bde56f58b5f8fc500f58f1d14a5c627a03bed50e328b53b20e