Only errors displayed on the report #79
It seems you are missing some Node.js dependencies. Try to "cd core/nodejs" and run "npm i".
asd@asd nodejs % npm i
audited 45 packages in 1.334s
1 package is looking for funding
found 0 vulnerabilities
asd@asd nodejs % npm fund
Please try to clone the whole project from scratch and run `htcap.py crawl` again.
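A minimal sketch of the suggested steps, assuming the upstream repository is the official htcap GitHub repository and that the Node.js dependencies live under core/nodejs as mentioned above:

```
# Clone the project from scratch (repository URL assumed)
git clone https://github.com/fcavallarin/htcap.git
cd htcap

# Reinstall the Node.js dependencies used by the crawler probe
cd core/nodejs && npm i && cd ../..

# Run a fresh crawl against the public test target
./htcap.py crawl https://htcap.org/scanme/ target.db
```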
I am getting the same error while crawling a website hosted locally:
crawl result for: link GET http://localhost:3000/home
Crawl command: python3.3 htcap.py crawl -vwl localhost:3000/home target.db
The npm dependencies are up-to-date.
Only the URLs with errors are displayed on the HTML report; no other URLs are visible. I tried crawling https://htcap.org/scanme/ but got the same output.
Errors (3):
probe_killed
probe_failure
HTTP Error 400: Bad Request
Command:
./htcap.py crawl https://htcap.org htcap.db -v
Initializing . . . done
Database htcap-2.db initialized, crawl started with 10 threads (^C to pause or change verbosity)
[================== ] 5 of 9 pages processed in 0 minutes^C
Crawler is paused.
r resume
v verbose mode
p show progress bar
q quiet mode
Hit ctrl-c again to exit
new request found link GET https://htcap.org/scanme/login/
crawl result for: redirect GET https://htcap.org/scanme/db_screen.png
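For reference, the HTML report discussed in this issue is produced from the crawl database after the crawl finishes; a minimal sketch, assuming htcap's "util report" subcommand as described in its usage docs and a hypothetical output file name:

```
# Generate the HTML report from the crawl database
# ("util report" and the file names are assumptions, not taken from this thread)
./htcap.py util report htcap.db report.html
```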