I tried crawling one website, but after it printed:

queue 1 | crawled 1
Error: cannot crawl page
1 links in the queue
Thread-1 crawling set(['set([\'set(["set([\\\'https://www.cracked.com/\\\'])"])\'])'])
queue 1 | crawled 2
Error: cannot crawl page

it kept calling crawl and create_jobs (I guess they call each other) many times, then from there called file_to_set, and finally raised a "maximum recursion depth exceeded" error.
How can I fix this?
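The triple-nested set(['set([...])']) in the log suggests the queue file is being written with str(some_set) and then read back as a single line, so every crawl cycle wraps the contents in another set([...]) layer. A minimal sketch of symmetric helpers, assuming the project stores one URL per line (file_to_set appears in the report; set_to_file and the one-URL-per-line format are my assumptions about the intended design):

```python
def set_to_file(links, file_name):
    # Write each link on its own line instead of dumping str(links),
    # so reading the file back cannot re-wrap the whole set as one string.
    with open(file_name, 'w') as f:
        for link in sorted(links):
            f.write(link + '\n')

def file_to_set(file_name):
    # Read the file back into a set, one link per line.
    results = set()
    with open(file_name, 'r') as f:
        for line in f:
            line = line.strip()
            if line:
                results.add(line)
    return results
```

With these two helpers kept symmetric, a round trip returns exactly the set that was written, and the nested set([...]) strings disappear from the queue.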
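The "recursion depth exceeded" error points at crawl and create_jobs re-entering each other, which grows the Python call stack by one frame per page. A hedged sketch of the usual fix, replacing the mutual recursion with an explicit work queue (process and fetch_links are illustrative names, not from the project):

```python
from collections import deque

def process(start_url, fetch_links):
    # Drive the crawl with an explicit queue and a loop, so the call
    # stack stays flat no matter how many pages are discovered.
    queue = deque([start_url])
    crawled = set()
    while queue:
        url = queue.popleft()
        if url in crawled:
            continue  # already visited, skip duplicates
        crawled.add(url)
        for link in fetch_links(url):
            if link not in crawled:
                queue.append(link)
    return crawled
```

The same idea applies inside the existing worker threads: instead of crawl calling create_jobs calling crawl, each thread loops, pulling the next URL from the queue until it is empty.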