
Commit 4c27a3d

Align faqgen to form input (opea-project#1089)
Signed-off-by: Xinyao Wang <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
1 parent 40386d9 commit 4c27a3d

2 files changed: +12 −8 lines


FaqGen/docker_compose/intel/cpu/xeon/README.md

Lines changed: 5 additions & 3 deletions
@@ -114,9 +114,11 @@ docker compose up -d
 3. MegaService
 
 ```bash
-curl http://${host_ip}:8888/v1/faqgen -H "Content-Type: application/json" -d '{
-  "messages": "Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5."
-  }'
+curl http://${host_ip}:8888/v1/faqgen \
+  -H "Content-Type: multipart/form-data" \
+  -F "messages=Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5." \
+  -F "max_tokens=32" \
+  -F "stream=false"
 ```
 
 Following the validation of all aforementioned microservices, we are now prepared to construct a mega-service.

FaqGen/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 7 additions & 5 deletions
@@ -28,7 +28,7 @@ To construct the Mega Service, we utilize the [GenAIComps](https://github.com/op
 
 ```bash
 git clone https://github.com/opea-project/GenAIExamples
-cd GenAIExamples/FaqGen/docker/
+cd GenAIExamples/FaqGen/
 docker build --no-cache -t opea/faqgen:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f Dockerfile .
 ```
 
@@ -37,7 +37,7 @@ docker build --no-cache -t opea/faqgen:latest --build-arg https_proxy=$https_pro
 Construct the frontend Docker image using the command below:
 
 ```bash
-cd GenAIExamples/FaqGen/
+cd GenAIExamples/FaqGen/ui
 docker build -t opea/faqgen-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f ./docker/Dockerfile .
 ```
 
@@ -115,9 +115,11 @@ docker compose up -d
 3. MegaService
 
 ```bash
-curl http://${host_ip}:8888/v1/faqgen -H "Content-Type: application/json" -d '{
-  "messages": "Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5."
-  }'
+curl http://${host_ip}:8888/v1/faqgen \
+  -H "Content-Type: multipart/form-data" \
+  -F "messages=Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5." \
+  -F "max_tokens=32" \
+  -F "stream=false"
 ```
 
 ## 🚀 Launch the UI
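
For readers calling the service outside the shell, the same form-style request that the updated READMEs issue with `curl -F` can be made programmatically. Below is a minimal Python sketch (not part of this commit) using the `requests` library; the endpoint, port, and field names mirror the diffs above, while the host value and response handling are illustrative assumptions.

```python
# Minimal sketch (not from this commit): the multipart/form-data request that the
# updated curl examples send, expressed with Python's requests library.
import requests

host_ip = "localhost"  # assumption: replace with the host running the FaqGen mega-service
url = f"http://{host_ip}:8888/v1/faqgen"

# (None, value) tuples make requests encode the body as multipart/form-data,
# matching curl's -F flags rather than a JSON payload.
form_fields = {
    "messages": (None, "Text Embeddings Inference (TEI) is a toolkit for deploying "
                       "and serving open source text embeddings and sequence "
                       "classification models."),
    "max_tokens": (None, "32"),
    "stream": (None, "false"),
}

response = requests.post(url, files=form_fields)
response.raise_for_status()
print(response.text)
```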
