Multiple users accessing system causes cracking job to never start #359
I've noticed a bit of load upon job start/enqueue. build_crack_cmd.rb is a bit clunky and is due for a major rewrite. Ideally, when completed, it'd be more efficient, and might address the issue you're experiencing.
Fair enough. Is there anything that can be done in the meantime to alleviate this? Other than not having so many people accessing the web interface simultaneously!
@GrepItAll WEBrick isn't a robust web server, which could be the cause. A couple of things I can think of:
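The specific suggestions were not captured in this copy of the thread, but a common remedy for WEBrick concurrency trouble is serving the app with a multi-threaded server such as Puma (which a later comment confirms was among the suggestions). A minimal sketch of a `config/puma.rb` is below; the port, worker count, and thread counts are illustrative assumptions, not hashview's actual settings:

```ruby
# config/puma.rb -- illustrative settings, not hashview's real config.
# Two worker processes, each with a pool of 4-16 threads, so one slow
# request (e.g. building the crack command) no longer blocks every
# other client the way a single blocked WEBrick instance can.
workers 2
threads 4, 16
port 4567          # assumed port; adjust to match your deployment
environment 'production'
```

With a rackup file in place, this would be started with `puma -C config/puma.rb`.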
Cheers @ccammilleri, I'll try both of these next time we have multiple users available to test it.
@GrepItAll just curious, did you get a chance to test the suggestions?
I haven't yet, but coincidentally this is the week when I'm most likely to need it! I'll have a room full of people all trying to access the server in a couple of days, so I can try the suggestions out then.
I tried following your instructions for Puma, but I think more changes are needed to get it to play nicely:
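The extra changes needed weren't captured here, but one typical gap when moving a Sinatra app from WEBrick to Puma is the lack of a rackup entry point: Puma expects a `config.ru`, while a classic-style Sinatra app is usually launched directly with `ruby app.rb`. A hypothetical sketch follows; the file name `hashview.rb` and the classic (top-level) Sinatra style are assumptions, not confirmed by the thread:

```ruby
# config.ru -- hypothetical rackup file so Puma can serve the app.
# Assumes the main application file is hashview.rb and that it uses
# Sinatra's classic style; a modular app would instead require the
# file and `run TheAppClass`.
require './hashview'
run Sinatra::Application
```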
I had 8 people accessing my hashview system simultaneously the other day. Everyone managed to queue a different job at the same time (at my request). However, once someone pressed play on their job, everyone's connection to hashview kept timing out, and the hashview log showed it trying to send the same pages out to everyone over and over again. The hashcat job never started because of this; I think we'd effectively caused a DoS on the system that prevented any work from starting.
Asking most of the users to close their tabs allowed hashview to carry on with its work.
I don't have a particular error to report, and I appreciate that having multiple users accessing the same system simultaneously is probably uncommon, but it seems the web server run by hashview is not particularly robust. Any ideas about solutions for this?