Merged
47 changes: 45 additions & 2 deletions CodeGen/Dockerfile
@@ -1,8 +1,51 @@
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

ARG BASE_TAG=latest
FROM opea/comps-base:$BASE_TAG
# Stage 1: base setup used by other stages
FROM python:3.11-slim AS base

# get security updates
RUN apt-get update && apt-get upgrade -y && \
apt-get clean && rm -rf /var/lib/apt/lists/*

ENV HOME=/home/user

RUN useradd -m -s /bin/bash user && \
mkdir -p $HOME && \
chown -R user $HOME

WORKDIR $HOME


# Stage 2: latest GenAIComps sources
FROM base AS git

RUN apt-get update && apt-get install -y --no-install-recommends git
# RUN git clone --depth 1 https://github.com/opea-project/GenAIComps.git
COPY GenAIComps GenAIComps


# Stage 3: common layer shared by services using GenAIComps
FROM base AS comps-base

# copy just relevant parts
COPY --from=git $HOME/GenAIComps/comps $HOME/GenAIComps/comps
COPY --from=git $HOME/GenAIComps/*.* $HOME/GenAIComps/LICENSE $HOME/GenAIComps/

WORKDIR $HOME/GenAIComps
RUN pip install --no-cache-dir --upgrade pip setuptools && \
pip install --no-cache-dir -r $HOME/GenAIComps/requirements.txt
WORKDIR $HOME

ENV PYTHONPATH=$PYTHONPATH:$HOME/GenAIComps

USER user


# Stage 4: unique part
FROM comps-base

ENV LANG=C.UTF-8

COPY ./codegen.py $HOME/codegen.py

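The `comps-base` stage above extends `PYTHONPATH` so the copied `GenAIComps` sources are importable from anywhere in the image. The composition can be sketched outside Docker like this (the `HOME` value mirrors the image's `/home/user`; this is an illustration, not part of the PR):

```shell
# Mirror of the ENV line in the comps-base stage.
# HOME here is a placeholder standing in for the image's /home/user.
HOME=/home/user
PYTHONPATH="${PYTHONPATH}:${HOME}/GenAIComps"
echo "$PYTHONPATH"
```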
2 changes: 1 addition & 1 deletion CodeGen/codegen.py
@@ -313,4 +313,4 @@ def start(self):
if __name__ == "__main__":
chatqna = CodeGenService(port=MEGA_SERVICE_PORT)
chatqna.add_remote_service()
chatqna.start()
chatqna.start()
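The entrypoint above starts `CodeGenService` on `MEGA_SERVICE_PORT`, which serves the `/v1/codegen` route. As a rough client-side sketch, a request to that route might be assembled as follows (the `messages` field name and port 7778 are assumptions drawn from the README elsewhere in this PR, not from this file):

```python
import json

def build_codegen_request(prompt: str, host: str = "localhost", port: int = 7778):
    """Hypothetical helper: compose the URL and JSON body for a CodeGen call.

    The "messages" payload key is an assumption based on common OPEA
    examples; check the actual service schema before relying on it.
    """
    url = f"http://{host}:{port}/v1/codegen"
    payload = {"messages": prompt}
    return url, json.dumps(payload)

url, body = build_codegen_request("Write a hello-world in Go")
```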
2 changes: 1 addition & 1 deletion CodeGen/docker_compose/intel/cpu/xeon/README.md
@@ -330,4 +330,4 @@ Then run the command `docker images`, you will have the following Docker Images:
- `opea/llm-textgen:latest`
- `opea/codegen:latest`
- `opea/codegen-ui:latest`
- `opea/codegen-react-ui:latest` (optional)
- `opea/codegen-react-ui:latest` (optional)
8 changes: 4 additions & 4 deletions CodeGen/docker_compose/intel/cpu/xeon/compose.yaml
@@ -1,6 +1,3 @@
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

services:

tgi-service:
@@ -100,7 +97,7 @@ services:
ipc: host
restart: always
codegen-xeon-ui-server:
image: ${REGISTRY:-opea}/codegen-ui:${TAG:-latest}
image: ${REGISTRY:-opea}/codegen-gradio-ui:${TAG:-latest}
container_name: codegen-xeon-ui-server
depends_on:
- codegen-xeon-backend-server
@@ -111,6 +108,9 @@ services:
- https_proxy=${https_proxy}
- http_proxy=${http_proxy}
- BASIC_URL=${BACKEND_SERVICE_ENDPOINT}
- MEGA_SERVICE_PORT=${MEGA_SERVICE_PORT}
- host_ip=${host_ip}
- DATAPREP_ENDPOINT=${DATAPREP_ENDPOINT}
ipc: host
restart: always
redis-vector-db:
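The three environment variables added to the UI service above (`MEGA_SERVICE_PORT`, `host_ip`, `DATAPREP_ENDPOINT`), together with the existing `BASIC_URL`, are what the Gradio container receives at startup. A minimal sketch of how a UI process could read them follows; `load_ui_settings` and its default values are illustrative assumptions, not the actual UI code:

```python
import os

def load_ui_settings(environ=os.environ):
    """Hypothetical settings loader mirroring the compose-file variables.

    The fallback values are assumptions for illustration only.
    """
    return {
        "backend_url": environ.get("BASIC_URL", "http://localhost:7778/v1/codegen"),
        "mega_service_port": int(environ.get("MEGA_SERVICE_PORT", "7778")),
        "host_ip": environ.get("host_ip", "localhost"),
        "dataprep_endpoint": environ.get("DATAPREP_ENDPOINT", ""),
    }

settings = load_ui_settings(
    {"BASIC_URL": "http://10.0.0.5:7778/v1/codegen", "host_ip": "10.0.0.5"}
)
```

Passing an explicit dict, as in the last line, makes the loader easy to unit-test without touching the real environment.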
3 changes: 2 additions & 1 deletion CodeGen/docker_compose/set_env.sh
@@ -47,5 +47,6 @@ export TEI_EMBEDDING_HOST_IP=${host_ip}
export TEI_EMBEDDING_ENDPOINT="http://${host_ip}:${TEI_EMBEDDER_PORT}"

export DATAPREP_REDIS_PORT=6007
export DATAPREP_ENDPOINT="http://${host_ip}:${DATAPREP_REDIS_PORT}/v1/dataprep"
export LOGFLAG=false
export MODEL_CACHE="./data"
export MODEL_CACHE="./data"
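The new `DATAPREP_ENDPOINT` export composes the dataprep URL from `host_ip` and `DATAPREP_REDIS_PORT`. A quick sanity check of that composition, using a placeholder IP in place of the real host address:

```shell
# Reproduce the endpoint composition from set_env.sh.
# host_ip is a placeholder value here, not your actual host address.
host_ip=10.0.0.5
DATAPREP_REDIS_PORT=6007
DATAPREP_ENDPOINT="http://${host_ip}:${DATAPREP_REDIS_PORT}/v1/dataprep"
echo "$DATAPREP_ENDPOINT"
```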
6 changes: 6 additions & 0 deletions CodeGen/docker_image_build/build.yaml
@@ -23,6 +23,12 @@ services:
dockerfile: ./docker/Dockerfile.react
extends: codegen
image: ${REGISTRY:-opea}/codegen-react-ui:${TAG:-latest}
codegen-gradio-ui:
build:
context: ../ui
dockerfile: ./docker/Dockerfile.gradio
extends: codegen
image: ${REGISTRY:-opea}/codegen-gradio-ui:${TAG:-latest}
llm-textgen:
build:
context: GenAIComps
33 changes: 33 additions & 0 deletions CodeGen/ui/docker/Dockerfile.gradio
@@ -0,0 +1,33 @@
# Copyright (C) 2025 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

FROM python:3.11-slim

ENV LANG=C.UTF-8

ARG ARCH="cpu"

RUN apt-get update -y && apt-get install -y --no-install-recommends --fix-missing \
build-essential \
default-jre \
libgl1-mesa-glx \
libjemalloc-dev \
wget

# Install ffmpeg static build
WORKDIR /root
RUN wget https://johnvansickle.com/ffmpeg/builds/ffmpeg-git-amd64-static.tar.xz && \
mkdir ffmpeg-git-amd64-static && tar -xvf ffmpeg-git-amd64-static.tar.xz -C ffmpeg-git-amd64-static --strip-components 1 && \
export PATH=/root/ffmpeg-git-amd64-static:$PATH && \
cp /root/ffmpeg-git-amd64-static/ffmpeg /usr/local/bin/ && \
cp /root/ffmpeg-git-amd64-static/ffprobe /usr/local/bin/

RUN mkdir -p /home/user

COPY gradio /home/user/gradio

RUN pip install --no-cache-dir --upgrade pip setuptools && \
pip install --no-cache-dir -r /home/user/gradio/requirements.txt

WORKDIR /home/user/gradio
ENTRYPOINT ["python", "codegen_ui_gradio.py"]
65 changes: 65 additions & 0 deletions CodeGen/ui/gradio/README.md
@@ -0,0 +1,65 @@
# CodeGen Gradio UI

This project provides a Gradio-based user interface for the CodeGen service. Users can enter prompts, and optionally upload files, to generate code through the CodeGen backend.

## Docker

### Build UI Docker Image

To build the frontend Docker image, navigate to the `GenAIExamples/CodeGen/ui` directory and run the following command:

```bash
cd GenAIExamples/CodeGen/ui
docker build -t opea/codegen-gradio-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f docker/Dockerfile.gradio .
```

This command builds the Docker image with the tag `opea/codegen-gradio-ui:latest`. It also passes the proxy settings as build arguments to ensure that the build process can access the internet if you are behind a corporate firewall.

### Run UI Docker Image

To run the frontend Docker image, navigate to the `GenAIExamples/CodeGen/ui/gradio` directory and execute the following commands:

```bash
cd GenAIExamples/CodeGen/ui/gradio

ip_address=$(hostname -I | awk '{print $1}')
docker run -d -p 5173:5173 --ipc=host \
-e http_proxy=$http_proxy \
-e https_proxy=$https_proxy \
-e no_proxy=$no_proxy \
-e BACKEND_SERVICE_ENDPOINT=http://$ip_address:7778/v1/codegen \
opea/codegen-gradio-ui:latest
```

This command runs the Docker container in detached mode (`-d`), mapping port 5173 of the host to port 5173 of the container. It also sets several environment variables, including the backend service endpoint, which the frontend needs in order to reach the backend service.

### Python

To run the frontend application directly with Python, install the dependencies from `gradio/requirements.txt`, then navigate to the `GenAIExamples/CodeGen/ui/gradio` directory and run the following command:

```bash
cd GenAIExamples/CodeGen/ui/gradio
python codegen_ui_gradio.py
```

This command starts the frontend application using Python.

## Additional Information

### Prerequisites

Ensure you have Docker installed and running on your system. Also, make sure you have the necessary proxy settings configured if you are behind a corporate firewall.

### Environment Variables

- `http_proxy`: Proxy setting for HTTP connections.
- `https_proxy`: Proxy setting for HTTPS connections.
- `no_proxy`: Comma-separated list of hosts that should be excluded from proxying.
- `BACKEND_SERVICE_ENDPOINT`: The endpoint of the backend service that the frontend will communicate with.

### Troubleshooting

- Docker Build Issues: If you encounter issues while building the Docker image, ensure that your proxy settings are correctly configured and that you have internet access.
- Docker Run Issues: If the Docker container fails to start, check the environment variables and ensure that the backend service is running and accessible.

This README covers building and running the Dockerized Gradio frontend, running it directly with Python, and the environment variables needed to connect it to the CodeGen backend.
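As a quick check of the endpoint composition used in the run command above (a placeholder IP stands in for the real `hostname -I` output):

```shell
# Placeholder IP; in practice: ip_address=$(hostname -I | awk '{print $1}')
ip_address=10.0.0.5
BACKEND_SERVICE_ENDPOINT="http://${ip_address}:7778/v1/codegen"
echo "$BACKEND_SERVICE_ENDPOINT"
```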