
Create web visualization tools #82

Open
wants to merge 27 commits into master

Conversation

EricZeiberg
Contributor

This PR creates a visualization tool (similar to the one in NeuralTalk) that lets you track the training of the model in the browser by visiting the monitor.html page. It works by writing various metrics to the data.txt and train.txt files in the web_utils folder via Lua's I/O. The JS on the monitor.html page then reads those two files every 3 seconds and updates various graphs and progress bars to show how the training score is decreasing, along with the completion of the current epoch / iteration.

Visualization tools are disabled by default, and can be enabled by passing the -visualize flag when running train.lua.
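The browser-side polling loop described above could be sketched roughly like this. This is a minimal sketch, not the actual monitor.html code; the file path comes from the PR description, while the line format (`<iteration>,<loss>`) and the `redraw` callback are assumptions for illustration:

```javascript
// Parse the metrics file, assuming one "<iteration>,<loss>" pair per line.
// The actual format written by train.lua in this PR may differ.
function parseMetrics(text) {
  return text
    .trim()
    .split("\n")
    .filter((line) => line.length > 0)
    .map((line) => {
      const [iter, loss] = line.split(",");
      return { iter: Number(iter), loss: Number(loss) };
    });
}

// Re-fetch the metrics every 3 seconds and hand them to a (hypothetical)
// redraw callback that updates the graphs and progress bars.
function startPolling(redraw) {
  setInterval(async () => {
    const resp = await fetch("web_utils/data.txt");
    redraw(parseMetrics(await resp.text()));
  }, 3000);
}
```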

Here's a link to a screenshot of the page: http://puu.sh/jycUj/254db9430c.png
(The black background of the graph was removed after this screenshot was taken, but otherwise the picture is accurate.)

Also, there are a lot of commits in this PR, so it's best to view it in combined mode instead of viewing each commit separately.

Thanks,
Eric

@EricZeiberg
Contributor Author

So after some testing, the page tends to crash if left open for an extended period of time. Working on averaging out data points to minimize memory usage right now.

Edit: actually, I'm just going to change it to record every other data point, cutting the data in half.
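Both ideas mentioned above (averaging points, or keeping every other point) could look something like this. These are hypothetical helpers illustrating the approach, not the PR's actual code:

```javascript
// Keep every other point, halving the data the chart has to hold.
function thinPoints(points) {
  return points.filter((_, i) => i % 2 === 0);
}

// Alternative: average adjacent pairs of points, which smooths the curve
// while also halving the data. A trailing odd point is kept as-is.
function averagePairs(points) {
  const out = [];
  for (let i = 0; i + 1 < points.length; i += 2) {
    out.push((points[i] + points[i + 1]) / 2);
  }
  if (points.length % 2 === 1) out.push(points[points.length - 1]);
  return out;
}
```

Thinning is cheaper and keeps raw values; averaging loses individual points but reduces noise in the plotted loss.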

@karpathy
Owner

Thanks! This looks great and I'd be happy to merge something like this when the kinks are ironed out.

@EricZeiberg
Contributor Author

Alright, after upgrading to CanvasJS, the graph can now handle hundreds of thousands of data points at once with minimal optimization. Feel free to merge; I'll let you know if I find any other bugs.

Oh, and here's an updated screenshot: http://puu.sh/jz8d9/89901ee9f6.png

@karpathy
Owner

Is it easy to also include, e.g., validation loss? Looking at it in the context of training loss is usually very useful. We can expand on this in the future, I suppose. I'll look through this PR in detail tomorrow; it's already near midnight here. Thanks!

@EricZeiberg
Contributor Author

It's easy to include anything, really. Would you like just some text displaying the most recent validation loss value, or a graph?

@alexkruegger

Nice work! Is it also possible to include the following graphs on the monitoring page?

  • training/validation accuracy on one graph (to check for overfitting/underfitting)
  • grad/param norm

These graphs would be useful for tuning hyperparameters.

@karpathy
Owner

Ok, I had a closer look at the code, and while I'm on board with the general idea of including a web-based visualization of training progress, I'm hesitant to merge this particular version of it.

Going down the path of writing arbitrary things to different lines of a text file doesn't usually lead anywhere nice. Fast hacks and conveniences usually lead to frustration down the line.

I think a clean solution would compile reports in Lua and export them as JSON. The JSON would contain a summarized full history of the training progress (e.g., current options, train/val loss as lists, example samples from the current model, etc.). The web interface would then read these files and draw them. The web interface should hold little state of its own; e.g., refreshing the page should give approximately the same view.
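A report along those lines might look something like the following. This is a hypothetical sketch of the proposed schema, not an agreed format; every field name and value here is illustrative:

```json
{
  "opts": { "rnn_size": 128, "num_layers": 2, "learning_rate": 0.002 },
  "train_loss": [4.21, 3.87, 3.52],
  "val_loss": [4.05, 3.71],
  "epoch": 1.35,
  "iteration": 540,
  "samples": ["example text sampled from the current model"]
}
```

Because the file carries the full summarized history rather than incremental lines, the page can be rendered from scratch on every refresh, which is what keeps the web interface nearly stateless.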

We can leave this pull request open so that anyone who stumbles on this and wants to use it can merge in your code, but I'll leave it out of master.

@EricZeiberg
Contributor Author

Yep, that's fine. I'll work on it when I get some free time.

@whackashoe

Just wanted to say I'm working on this now (in case someone else was thinking of starting on this).

@jtoy

jtoy commented Jan 30, 2016

+1
