Bump requests from 2.31.0 to 2.32.0 in /libs/infinity_emb #3

Status: Open. Wants to merge 2 commits into base: main
4 changes: 4 additions & 0 deletions .gitignore
@@ -6,6 +6,10 @@ __pycache__/
# C extensions
*.so

# PyCharm
.idea
models/

# Distribution / packaging
.Python
build/
46 changes: 5 additions & 41 deletions libs/infinity_emb/Dockerfile
@@ -1,5 +1,6 @@
ARG BASE_IMAGE=nvidia/cuda:12.1.0-base-ubuntu22.04
# Use the Python base image
FROM nvidia/cuda:12.1.0-base-ubuntu22.04 AS base
FROM $BASE_IMAGE AS base

ENV PYTHONUNBUFFERED=1 \
\
@@ -47,51 +48,14 @@ RUN poetry install --no-interaction --no-ansi --extras "${EXTRAS}" --without li
# remove cache
RUN poetry cache clear pypi --all

FROM builder as testing
# install lint and test dependencies
RUN poetry install --no-interaction --no-ansi --extras "${EXTRAS}"

# lint
RUN poetry run ruff .
RUN poetry run black --check .
RUN poetry run mypy .
# pytest
COPY tests tests
# run end to end tests because of duration of build in github ci.
# Run tests/end_to_end on TARGETPLATFORM x86_64 otherwise run tests/end_to_end_gpu
# poetry run python -m pytest tests/end_to_end -x
RUN if [ "$TARGETPLATFORM" = "linux/amd64" ] ; then \
poetry run python -m pytest tests/end_to_end -x ; \
else \
poetry run python -m pytest tests/end_to_end/test_api_with_dummymodel.py -x ; \
fi
RUN echo "all tests passed" > "test_results.txt"

# Export with tensorrt, not recommended.
# docker buildx build --target=production-tensorrt -f Dockerfile .
FROM nvidia/cuda:11.8.0-cudnn8-devel-ubuntu22.04 AS production-tensorrt
ENV PYTHONUNBUFFERED=1 \
PIP_NO_CACHE_DIR=off \
PYTHON="python3.11"
RUN apt-get update && apt-get install python3-dev python3-pip $PYTHON build-essential curl -y
COPY --from=builder /app /app
# force testing stage to run
COPY --from=testing /app/test_results.txt /app/test_results.txt
ENV SENTENCE_TRANSFORMERS_HOME=/app/.cache/torch
ENV PATH=/app/.venv/bin:$PATH
RUN pip install --no-cache-dir "onnxruntime-gpu==1.17.0" "tensorrt==8.6.*"
ENV LD_LIBRARY_PATH /app/.venv/lib/$(PYTHON)/site-packages/tensorrt:/usr/lib/x86_64-linux-gnu:/app/.venv/lib/$(PYTHON)/site-packages/tensorrt_libs:${LD_LIBRARY_PATH}
ENV PATH /app/.venv/lib/$(PYTHON)/site-packages/tensorrt/bin:${PATH}
ENTRYPOINT ["infinity_emb"]

# Use a multi-stage build -> production version
FROM base AS production

COPY --from=builder /app /app
# force testing stage to run
COPY --from=testing /app/test_results.txt /app/test_results.txt
COPY /models /models
COPY environment_config.sh ./environment_config.sh

ENV SENTENCE_TRANSFORMERS_HOME=/app/.cache/torch
ENV PATH=/app/.venv/bin:$PATH

ENTRYPOINT ["infinity_emb"]
ENTRYPOINT ["/bin/bash", "-c", "source ./environment_config.sh"]
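
The new `ARG BASE_IMAGE` line makes the base image overridable at build time instead of being hard-coded to the CUDA image. A minimal sketch of how that would be invoked (the image tags and build context path here are assumptions, not taken from this PR):

```shell
# Default build: uses nvidia/cuda:12.1.0-base-ubuntu22.04 as declared in the Dockerfile
docker build -t infinity-emb libs/infinity_emb

# Override the base image, e.g. for a CPU-only variant (tag is hypothetical)
docker build \
  --build-arg BASE_IMAGE=ubuntu:22.04 \
  -t infinity-emb-cpu \
  libs/infinity_emb
```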
7 changes: 7 additions & 0 deletions libs/infinity_emb/environment_config.sh
@@ -0,0 +1,7 @@
#!/bin/bash
#export OTEL_TRACES_SAMPLER=parentbased_always_off
export OTEL_RESOURCE_ATTRIBUTES=service.name=${SHERLOCK_SERVICE_NAME},host.name=${POD_NAME},host.ip=${POD_IP}
export OTEL_EXPORTER_OTLP_ENDPOINT=http://${HOST_IP}:5680
echo "Set environment variables"
# TODO: opentelemetry-instrument
infinity_emb --model-name-or-path /models
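
The script above composes its OpenTelemetry settings from pod-level environment variables before launching the server. A sketch of the composition with placeholder values (`SHERLOCK_SERVICE_NAME`, `POD_NAME`, `POD_IP`, and `HOST_IP` values here are assumptions for illustration; in the PR they would come from the Kubernetes pod spec):

```shell
# Simulated pod environment (all values are illustrative assumptions)
export SHERLOCK_SERVICE_NAME=infinity-emb
export POD_NAME=infinity-emb-0
export POD_IP=10.0.0.12
export HOST_IP=10.0.0.1

# Same composition as environment_config.sh performs
export OTEL_RESOURCE_ATTRIBUTES="service.name=${SHERLOCK_SERVICE_NAME},host.name=${POD_NAME},host.ip=${POD_IP}"
export OTEL_EXPORTER_OTLP_ENDPOINT="http://${HOST_IP}:5680"

echo "$OTEL_RESOURCE_ATTRIBUTES"
# → service.name=infinity-emb,host.name=infinity-emb-0,host.ip=10.0.0.12
echo "$OTEL_EXPORTER_OTLP_ENDPOINT"
# → http://10.0.0.1:5680
```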
15 changes: 10 additions & 5 deletions libs/infinity_emb/poetry.lock
