Docker compose scrapydweb with scrapyd: the Log URL uses the Docker service name
Describe the bug
We use docker-compose to deploy Scrapyd and ScrapydWeb, and ScrapydWeb is configured to reach the Scrapyd service through its Docker-internal host name:
- SCRAPYD_SERVERS=scrapyd:6800
As a result, every Log URL in ScrapydWeb starts with scrapyd:6800, which is only resolvable inside the Docker network. Is there a way to configure a different host for the Log links and the other links related to Scrapyd?
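For context, a minimal docker-compose sketch of the kind of setup described above. The image names and published ports are placeholders (not necessarily the exact ones used in this deployment); only the SCRAPYD_SERVERS value is taken from the report.

```yaml
# Hypothetical compose file illustrating the reported setup.
version: "3"
services:
  scrapyd:
    image: vimagick/scrapyd          # placeholder: any Scrapyd image
    ports:
      - "6800:6800"
  scrapydweb:
    image: example/scrapydweb        # placeholder: any ScrapydWeb image
    environment:
      # ScrapydWeb reaches Scrapyd via the Compose service name, so the
      # Log/Items links it renders also point at scrapyd:6800, a host name
      # that only resolves inside the Compose network.
      - SCRAPYD_SERVERS=scrapyd:6800
    ports:
      - "5000:5000"                  # ScrapydWeb's default port
    depends_on:
      - scrapyd
```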
To Reproduce
Steps to reproduce the behavior:
Expected behavior
A clear and concise description of what you expected to happen.
Logs
Add logs of ScrapydWeb and Scrapyd (optional) when reproducing the bug.
(It's recommended to run ScrapydWeb with argument '--verbose' if its version >= 1.0.0)
Screenshots
If applicable, add screenshots to help explain your problem.
Environment (please complete the following information):
ScrapydWeb version: [e.g. 1.4.0 or latest code on GitHub]
ScrapydWeb related settings: [e.g. 'ENABLE_AUTH = True']
Scrapyd version: [e.g. 1.2.1 or latest code on GitHub]
Scrapyd amount: [e.g. 1 or 5]
Scrapy version: [e.g. 1.8.0, 2.0.0 or latest code on GitHub]
Browser: [e.g. Chrome 71, Firefox 64 or Safari 12]
Additional context
Add any other context about the problem here.