Target
- Repo: CortexLM/vgrep
- Branch: main
- Commit: cb9689d1d375b46ff285756fddfbf434eff5a05f
- Multiplier: 0.25x
Summary
Automated check cargo-clippy failed for CortexLM/vgrep on branch main at commit cb9689d1d375b46ff285756fddfbf434eff5a05f.
- Command: cargo clippy --all-targets --all-features
- Exit code: 101
Steps to Reproduce
- Clone the repository and check out the commit:
  git clone https://github.com/CortexLM/vgrep.git
  cd vgrep
  git checkout cb9689d1d375b46ff285756fddfbf434eff5a05f
- Run:
  cargo clippy --all-targets --all-features
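Before re-running the steps above, it may help to verify the native toolchain is present. This is a hypothetical prerequisite check, not part of the original report: the log shows llama-cpp-sys-2 driving a CMake build of llama.cpp (note the CMakeLists.txt paths under rerun-if-changed), so cmake and a C/C++ compiler presumably need to be on PATH for the clippy run to get past the build script.

```shell
# Sketch: report which native build tools llama-cpp-sys-2 likely needs
# are available. "cmake", "cc", and "c++" are assumptions inferred from
# the CMake paths in the build-script log, not documented requirements.
status=""
for tool in cmake cc c++; do
  if command -v "$tool" >/dev/null 2>&1; then
    status="$status found:$tool"
  else
    status="$status MISSING:$tool"
  fi
done
echo "$status"
```

Any MISSING entry is a likely culprit for the exit-status-101 build-script failure and worth installing before retrying.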
Actual Behavior
The command exits non-zero: cargo aborts with exit code 101 because the custom build script for llama-cpp-sys-2 fails, so clippy never reaches lint analysis of the workspace crates.
stderr
Checking http-body v1.0.1
Checking regex-automata v0.4.13
Checking errno v0.3.14
Checking mio v1.1.1
Checking parking_lot_core v0.9.12
Checking socket2 v0.6.1
Checking getrandom v0.3.4
Checking serde v1.0.228
Checking openssl-sys v0.9.111
Checking getrandom v0.2.17
Checking serde_json v1.0.149
Checking icu_provider v2.1.1
Checking dirs-sys v0.5.0
Checking ahash v0.8.12
Checking ppv-lite86 v0.2.21
Checking socks v0.3.4
Compiling llama-cpp-sys-2 v0.1.132 (https://github.com/utilityai/llama-cpp-rs?branch=main#f4606237)
Checking block-buffer v0.10.4
Checking crypto-common v0.1.7
Checking console v0.15.11
Checking futures-executor v0.3.31
Checking inotify-sys v0.1.5
Checking http-body-util v0.1.3
Checking bitflags v1.3.2
Checking compact_str v0.8.1
Checking signal-hook-registry v1.4.8
Checking dirs v6.0.0
Checking ring v0.17.14
Checking rand_core v0.9.5
Checking tempfile v3.24.0
Checking inotify v0.10.2
Checking sharded-slab v0.1.7
Checking digest v0.10.7
Checking hashbrown v0.14.5
Checking futures v0.3.31
Checking clap_builder v4.5.54
Checking parking_lot v0.12.5
Checking unicode-truncate v1.1.0
Checking notify-types v1.0.1
Checking signal-hook v0.3.18
Checking icu_normalizer v2.1.1
Checking icu_properties v2.1.2
Checking axum-core v0.5.6
Checking enumflags2 v0.7.12
Checking thiserror v2.0.18
Checking strum v0.26.3
Checking thiserror v1.0.69
Checking libsqlite3-sys v0.30.1
Checking nix v0.30.1
Checking crossbeam-deque v0.8.6
Checking tokio v1.49.0
Checking rand_chacha v0.9.0
Checking signal-hook-mio v0.2.5
Checking lru v0.12.5
Checking console v0.16.2
Checking serde_path_to_error v0.1.20
Checking indicatif v0.17.11
Checking num_cpus v1.17.0
Checking crossterm v0.28.1
error: failed to run custom build command for `llama-cpp-sys-2 v0.1.132 (https://github.com/utilityai/llama-cpp-rs?branch=main#f4606237)`
Caused by:
process didn't exit successfully: `/opt/bounty-challenge/agent/workspace/CortexLM__vgrep/target/debug/build/llama-cpp-sys-2-53d06758b5321e3e/build-script-build` (exit status: 101)
--- stdout
cargo:rerun-if-changed=build.rs
cargo:rerun-if-env-changed=LLAMA_LIB_PROFILE
cargo:rerun-if-env-changed=LLAMA_BUILD_SHARED_LIBS
cargo:rerun-if-env-changed=LLAMA_STATIC_CRT
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/docs/backend/hexagon/CMakeUserPresets.json
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/tokenize/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/fit-params/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/quantize/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/tts/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/cli/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/completion/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/gguf-split/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/imatrix/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/cvector-generator/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/batched-bench/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/export-lora/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/perplexity/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/rpc/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/server/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/mtmd/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/tools/llama-bench/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/unicode-data.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-kv-cache-iswa.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-kv-cache.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/unicode.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-memory-recurrent.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-model-saver.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-vocab.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-vocab.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-adapter.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-impl.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-mmap.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-kv-cache.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-grammar.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-batch.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-context.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-graph.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-arch.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-quant.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-mmap.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-hparams.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-arch.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-cparams.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-io.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-chat.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-graph.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-hparams.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/CMakeLists.txt
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/unicode-data.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/unicode.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-chat.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-model-loader.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-memory.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/llama-io.h
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/minicpm3.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/bitnet.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/falcon-h1.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/starcoder2.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/afmoe.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/ernie4-5.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/deepseek2.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/orion.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/t5-enc.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/phi3.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/command-r.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/llada-moe.cpp
cargo:rerun-if-changed=/root/.cargo/git/checkouts/llama-cpp-rs-274405c613038803/f460623/llama-cpp-sys-2/llama.cpp/src/models/mpt.cpp
cargo:rerun-if-changed=/root/.cargo/g
...
[truncated]
stdout
Expected Behavior
The command should exit with status code 0.
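The captured stderr is truncated before the build script's actual error message, so the root cause is not visible above. A diagnostic sketch for surfacing it: -p and -vv are standard cargo flags (-p scopes the build to one package, -vv echoes build-script output), and llama-cpp-sys-2 is the crate named in the log. Run inside the vgrep checkout:

```shell
# Sketch: re-run only the failing crate's build verbosely to expose the
# underlying CMake/compiler error hidden by the truncated log. Guarded so
# the snippet is a no-op where cargo is not installed.
if command -v cargo >/dev/null 2>&1; then
  cargo check -p llama-cpp-sys-2 -vv 2>&1 | tail -n 40 || true
  ran="cargo"
else
  echo "cargo not installed; skipping"
  ran="skipped"
fi
```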
Environment
- OS: Linux 6.8.0-90-generic
- Python: 3.13.11
Additional Context
- This issue was generated by a local "bounty-agent" run.
- Please apply the "valid" label if this is accepted as a bounty-challenge valid issue.