[Build] Quick fix for break (guidance-ai#867)
It appears that the latest release of `llama-cpp-python` (0.2.76) has broken our build. The cause is most likely something in the tokeniser; pending a deeper investigation, exclude that release from the build.
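The fix works by appending `!=0.2.76` to the pip requirement specifier, so the resolver skips the broken release and installs the newest version that satisfies every `!=` clause. As an illustrative sketch only (not pip's actual resolver; the candidate list and helper below are hypothetical), the exclusion semantics look like this:

```python
# Illustrative sketch of how exclusion specifiers such as
# "llama-cpp-python!=0.2.58,!=0.2.75,!=0.2.76" filter candidate versions.
# Real resolution is handled by pip and the `packaging` library; the
# candidate list and helper here are hypothetical, for explanation only.

EXCLUDED = {"0.2.58", "0.2.75", "0.2.76"}  # releases known to break the build

def allowed(version: str) -> bool:
    """True if the candidate version is not ruled out by a != clause."""
    return version not in EXCLUDED

# pip considers all published versions and installs the newest one
# that satisfies every clause of the specifier.
candidates = ["0.2.57", "0.2.58", "0.2.75", "0.2.76", "0.2.77"]
installable = [v for v in candidates if allowed(v)]
print(installable)  # ['0.2.57', '0.2.77']
```

When the tokeniser issue is fixed upstream, removing the `!=0.2.76` clause re-admits that release without touching the rest of the specifier.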
riedgar-ms authored May 30, 2024
1 parent 7938024 commit cb263a8
Showing 4 changed files with 4 additions and 4 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/action_gpu_unit_tests.yml
@@ -57,7 +57,7 @@ jobs:
run: |
pip install accelerate
pip uninstall -y llama-cpp-python
-CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install "llama-cpp-python!=0.2.58,!=0.2.75"
+CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install "llama-cpp-python!=0.2.58,!=0.2.75,!=0.2.76"
- name: Check GPU available
run: |
python -c "import torch; assert torch.cuda.is_available()"
2 changes: 1 addition & 1 deletion .github/workflows/action_plain_unit_tests.yml
@@ -39,7 +39,7 @@ jobs:
run: |
pip install sentencepiece
pip uninstall -y llama-cpp-python
-pip install "llama-cpp-python!=0.2.58"
+pip install "llama-cpp-python!=0.2.58,!=0.2.76"
- name: Run tests (except server)
shell: bash
run: |
2 changes: 1 addition & 1 deletion .github/workflows/ci_tests.yml
@@ -53,7 +53,7 @@ jobs:
- name: GPU pip installs
run: |
pip install accelerate
-CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install "llama-cpp-python!=0.2.58,!=0.2.75"
+CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install "llama-cpp-python!=0.2.58,!=0.2.75,!=0.2.76"
- name: Check GPU available
run: |
python -c "import torch; assert torch.cuda.is_available()"
2 changes: 1 addition & 1 deletion .github/workflows/notebook_tests.yml
@@ -56,7 +56,7 @@ jobs:
- name: GPU pip installs
run: |
pip install accelerate
-CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install "llama-cpp-python!=0.2.58,!=0.2.75"
+CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install "llama-cpp-python!=0.2.58,!=0.2.75,!=0.2.76"
- name: Check GPU available
run: |
python -c "import torch; assert torch.cuda.is_available()"
