
After setting ENABLE_LOGPARSER=True in the config file and configuring LOGS_DIR, startup keeps failing with an error that the log directory cannot be found #251

Closed
mttccs opened this issue Dec 3, 2024 · 1 comment
Labels
insufficient info No action would be taken until more info is provided

Comments


mttccs commented Dec 3, 2024

[2024-12-03 08:11:16,484] ERROR in scrapydweb.run: Check app config fail:


Overriding custom settings from /code/scrapydweb_settings_v10.py


Index Group Scrapyd IP:Port Connectivity Auth
####################################################################################################
1____ dds_________________ xxx:52068___ True_______ None
2____ inrepo______________ xxx:52068___ True_______ None
3____ preprint____________ xxx:52068__ True_______ None
4____ report______________ xxx:52068___ True_______ ('admin', '111111')
####################################################################################################

LOCAL_SCRAPYD_LOGS_DIR not found: /work/local-scrapydserver/logs

Check and update your settings in /code/scrapydweb_settings_v10.py
The above is the error reported when deploying ScrapydWeb. I have checked it several times and it is exactly the same as the logs_dir configured in Scrapyd, which is driving me crazy.

my8100 (Owner) commented Dec 3, 2024

Set LOCAL_SCRAPYD_LOGS_DIR only when scrapyd is running on the scrapydweb server.

Does “/work/local-scrapydserver/logs” exist on the scrapydweb server?
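
A minimal way to verify this on the host (or container) where ScrapydWeb itself runs, using only the Python standard library; the path is the one from the error message above:

python -c "from os.path import isdir; print(isdir('/work/local-scrapydserver/logs'))"

If this prints False, the directory is not visible to ScrapydWeb (for example, it only exists on the Scrapyd machine, or the volume is not mounted into the ScrapydWeb container), which matches the startup error.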

# If ScrapydWeb and one of your Scrapyd servers run on the same host, it is recommended to update the following three settings.
# If both ScrapydWeb and one of your Scrapyd servers run on the same machine,
# ScrapydWeb would try to directly read Scrapy logfiles from disk, instead of making a request
# to the Scrapyd server.
# e.g. '127.0.0.1:6800' or 'localhost:6801', do not forget the port number.
LOCAL_SCRAPYD_SERVER = ''
# In the directory where you run Scrapyd, run the command below
# to find out where the Scrapy logs are stored:
# python -c "from os.path import abspath, isdir; from scrapyd.config import Config; path = abspath(Config().get('logs_dir')); print(path); print(isdir(path))"
# Check out https://scrapyd.readthedocs.io/en/stable/config.html#logs-dir for more info.
# e.g. 'C:/Users/username/logs' or '/home/username/logs'
LOCAL_SCRAPYD_LOGS_DIR = ''
# The default is False, set it to True to automatically run LogParser as a subprocess at startup.
# Note that you can run the LogParser service separately via command 'logparser' as you like.
# Run 'logparser -h' to find out the config file of LogParser for more advanced settings.
# Visit https://github.com/my8100/logparser for more info.
ENABLE_LOGPARSER = False
############################## QUICK SETUP end ################################
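
For illustration only, a sketch of how the three quick-setup items might look when Scrapyd runs on the same host as ScrapydWeb; the address and path below are placeholders based on this thread, not verified values:

LOCAL_SCRAPYD_SERVER = '127.0.0.1:6800'  # placeholder; use the actual local Scrapyd address, including the port
LOCAL_SCRAPYD_LOGS_DIR = '/work/local-scrapydserver/logs'  # must exist on the ScrapydWeb host itself
ENABLE_LOGPARSER = True  # run LogParser as a subprocess at startup

If Scrapyd runs on a different machine instead, leave LOCAL_SCRAPYD_SERVER and LOCAL_SCRAPYD_LOGS_DIR empty; ScrapydWeb will then fetch the logs by making requests to the Scrapyd server rather than reading them from the local disk.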

@my8100 my8100 added the insufficient info No action would be taken until more info is provided label Jan 12, 2025
@my8100 my8100 closed this as completed Jan 12, 2025