llm_client 0.0.6

The Easiest Rust Interface for Local LLMs, and an Interface for Deterministic Signals from Probabilistic LLM Vibes
Build #1386735 2024-10-10T17:37:27.787546+00:00
# rustc version
rustc 1.83.0-nightly (eb4e23467 2024-10-09)
# docs.rs version
docsrs 0.6.0 (d5a37845 2024-09-24)
# build log
[INFO] running `Command { std: "docker" "create" "-v" "/home/cratesfyi/workspace-builder/builds/llm_client-0.0.6/target:/opt/rustwide/target:rw,Z" "-v" "/home/cratesfyi/workspace-builder/builds/llm_client-0.0.6/source:/opt/rustwide/workdir:ro,Z" "-v" "/home/cratesfyi/workspace-builder/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/home/cratesfyi/workspace-builder/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "DOCS_RS=1" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "6442450944" "--cpus" "6" "--user" "1001:1001" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:6252d7ea7fb8caaf7af6d800c5fb250a9cd862b9a7f9508afb3c54fa7fe1102e" "/opt/rustwide/cargo-home/bin/cargo" "+nightly" "rustdoc" "--lib" "-Zrustdoc-map" "--config" "build.rustdocflags=[\"--cfg\", \"docsrs\", \"-Z\", \"unstable-options\", \"--emit=invocation-specific\", \"--resource-suffix\", \"-20241009-1.83.0-nightly-eb4e23467\", \"--static-root-path\", \"/-/rustdoc.static/\", \"--cap-lints\", \"warn\", \"--extern-html-root-takes-precedence\"]" "--offline" "-Zunstable-options" "--config=doc.extern-map.registries.crates-io=\"https://docs.rs/{pkg_name}/{version}/x86_64-unknown-linux-gnu\"" "-Zrustdoc-scrape-examples" "-j6" "--target" "x86_64-unknown-linux-gnu", kill_on_drop: false }`
[INFO] [stdout] 8971b5a407705f1ee583b2898da7332d9582adb3c1b3cc9d05e13f930d2af73a
[INFO] [stderr] WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
[INFO] running `Command { std: "docker" "start" "-a" "8971b5a407705f1ee583b2898da7332d9582adb3c1b3cc9d05e13f930d2af73a", kill_on_drop: false }`
[INFO] [stderr] warning: Rustdoc did not scrape the following examples because they require dev-dependencies: basic_completion, basic_primitive, decision, device_config, extract_urls, reason
[INFO] [stderr]     If you want Rustdoc to scrape these examples, then add `doc-scrape-examples = true`
[INFO] [stderr]     to the [[example]] target configuration of at least one example.
[INFO] [stderr] warning: target filter specified, but no targets matched; this is a no-op
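The scrape-examples warning above is asking for a change in the crate's Cargo.toml. A minimal sketch of that change: `basic_completion` is one of the example names listed in the warning, but the rest of the `[[example]]` table here is an assumption about how the manifest is laid out:

```toml
# Hypothetical Cargo.toml excerpt: opt an example target into rustdoc's
# example scraping, as the warning above suggests.
[[example]]
name = "basic_completion"
doc-scrape-examples = true
```

With the flag set on at least one example, `-Zrustdoc-scrape-examples` should have a target to match, which would also quiet the "no targets matched" no-op warning.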
[INFO] [stderr]    Compiling llm_interface v0.0.2
[INFO] [stderr]     Checking readability v0.3.0
[INFO] [stderr]     Checking html2text v0.12.6
[INFO] [stderr]     Checking clap v4.5.20
[INFO] [stderr] The following warnings were emitted during compilation:
[INFO] [stderr] 
[INFO] [stderr] warning: llm_interface@0.0.2: Failed to build llama_cpp: Failed to build llama_cpp: Failed to execute make command: No such file or directory (os error 2) in 0.2682399 seconds
[INFO] [stderr] 
[INFO] [stderr] error: failed to run custom build command for `llm_interface v0.0.2`
[INFO] [stderr] 
[INFO] [stderr] Caused by:
[INFO] [stderr]   process didn't exit successfully: `/opt/rustwide/target/debug/build/llm_interface-d26a656f151d4e5f/build-script-build` (exit status: 1)
[INFO] [stderr]   --- stdout
[INFO] [stderr]   cargo:rerun-if-changed=build.rs
[INFO] [stderr]   Starting llama_cpp.build Logger
[INFO] [stderr]   2024-10-10T17:37:27.131067Z TRACE llm_devices::build::git: Directory does not exist: /opt/rustwide/target/llama_cpp
[INFO] [stderr]   2024-10-10T17:37:27.131101Z TRACE llm_devices::build: Directory /opt/rustwide/target/llama_cpp does not exist, skipping removal
[INFO] [stderr]   2024-10-10T17:37:27.131116Z TRACE llm_devices::build::git: Cloning https://github.com/ggerganov/llama.cpp  at tag b3848
[INFO] [stderr]   2024-10-10T17:37:27.141445Z TRACE llm_devices::build::git: Successfully cloned https://github.com/ggerganov/llama.cpp  at tag b3848
[INFO] [stderr]   2024-10-10T17:37:27.141787Z TRACE llm_devices::build::make: No CUDA detected - building without CUDA support
[INFO] [stderr]   2024-10-10T17:37:27.141817Z TRACE llm_devices::build::make: Running make command: cd "/opt/rustwide/target/llama_cpp" && "make" "llama-server" "BUILD_TYPE=Release" "-j"
[INFO] [stderr]   2024-10-10T17:37:27.142009Z TRACE llm_devices::build: Directory /opt/rustwide/target/llama_cpp does not exist, skipping removal
[INFO] [stderr]   cargo:warning=Failed to build llama_cpp: Failed to build llama_cpp: Failed to execute make command: No such file or directory (os error 2) in 0.2682399 seconds
[INFO] [stderr] 
[INFO] [stderr]   --- stderr
[INFO] [stderr]   Cloning into '/opt/rustwide/target/llama_cpp'...
[INFO] [stderr]   fatal: unable to access 'https://github.com/ggerganov/llama.cpp/': Could not resolve host: github.com
[INFO] [stderr] warning: build failed, waiting for other jobs to finish...
[INFO] running `Command { std: "docker" "inspect" "8971b5a407705f1ee583b2898da7332d9582adb3c1b3cc9d05e13f930d2af73a", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "8971b5a407705f1ee583b2898da7332d9582adb3c1b3cc9d05e13f930d2af73a", kill_on_drop: false }`
[INFO] [stdout] 8971b5a407705f1ee583b2898da7332d9582adb3c1b3cc9d05e13f930d2af73a
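Why the build failed: the docker invocation at the top of the log runs with `"--network" "none"`, so the build script's `git clone` of llama.cpp could not resolve github.com (see the `--- stderr` section), and the subsequent `make` step failed with `No such file or directory` because `/opt/rustwide/target/llama_cpp` was never created. Note that emitting `cargo:warning=...` did not save the build; the script still exited with status 1, which is what failed the docs build. The same docker invocation sets `"DOCS_RS=1"`, the environment variable docs.rs provides so build scripts can detect its sandbox and skip work that needs the network or a native toolchain. A minimal sketch of such a guard for llm_interface's build.rs; the structure of the real build script is an assumption:

```rust
// Hypothetical build.rs guard. docs.rs builds run sandboxed with
// networking disabled and DOCS_RS=1 set, so the clone-and-make step
// for llama.cpp can never succeed there; skipping it lets rustdoc
// generation proceed without the native backend.
fn main() {
    println!("cargo:rerun-if-changed=build.rs");

    // Detect the docs.rs environment and bail out early, successfully.
    if std::env::var("DOCS_RS").is_ok() {
        return;
    }

    // ... normal path: clone https://github.com/ggerganov/llama.cpp
    // at tag b3848 and run `make llama-server BUILD_TYPE=Release -j`,
    // as the TRACE lines above show ...
}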