
evaluation accuracy #6

Open · sunnymoon155 opened this issue Nov 21, 2018 · 2 comments

sunnymoon155 commented Nov 21, 2018

Hi, the eval result is mean_edit_distance. Could I use tf.metrics.accuracy to evaluate accuracy instead? Could you please tell me how I should modify the code? I tried

eval_metric_ops = {"accuracy": tf.metrics.accuracy(labels, predictions)}

but it is not working.

PS: I'm very interested in your post, but I have no experience in this area.
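For context: judging by the follow-up below, `labels` and the decoder output `decoded[0]` in this repository's model_fn are `tf.SparseTensor`s, while `tf.metrics.accuracy` only accepts dense `Tensor`s, so the one-liner fails before computing anything. A minimal sketch of the densifying step it would need, with hypothetical toy tensors standing in for the real labels and predictions (TF 1.x API, not the repository's code):

```python
import tensorflow as tf

# Toy stand-ins (hypothetical) for the sparse CTC labels and decoder
# output discussed in this thread.
labels_sp = tf.SparseTensor(indices=[[0, 0], [0, 1], [0, 2]],
                            values=[3, 1, 2], dense_shape=[1, 4])
preds_sp = tf.SparseTensor(indices=[[0, 0], [0, 1], [0, 2]],
                           values=[3, 1, 4], dense_shape=[1, 4])

# tf.metrics.accuracy needs dense tensors of identical shape, so the
# sparse tensors are densified first; default_value=0 fills the empty
# positions, and those pads are compared like real tokens.
accuracy, update_op = tf.metrics.accuracy(
    labels=tf.sparse_tensor_to_dense(labels_sp),
    predictions=tf.sparse_tensor_to_dense(preds_sp))

with tf.Session() as sess:
    sess.run(tf.local_variables_initializer())  # metric counters are local vars
    sess.run(update_op)
    print(sess.run(accuracy))  # 0.75: 3 of 4 positions match, pad included
```

Note that the zero padding is itself counted as a "correct prediction", which foreshadows the odd accuracy values reported in the next comment.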

sunnymoon155 (Author) commented

I'm using the code below to try to get the accuracy for the latest epoch, but something seems wrong: the accuracy only ever comes out as 0.5 or 0.75, never any other value.

dense_label = tf.sparse_tensor_to_dense(labels)
dense_prediction = tf.cast(tf.sparse_tensor_to_dense(decoded[0]), tf.int32)
eval_metric_ops = {"accuracy": tf.metrics.accuracy(dense_label, dense_prediction)}
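Those numbers have a likely explanation: `tf.metrics.accuracy` is a streaming, element-wise metric, so after densifying it also compares every zero-padded position, and the running ratio of matching positions to total positions then tends to settle on coarse fractions such as 0.5 or 0.75. A padding-free alternative is sequence-level exact-match accuracy via `tf.edit_distance`, which accepts the sparse tensors directly. A minimal sketch, assuming `labels` and `decoded[0]` are the `tf.SparseTensor`s from the model_fn (toy stand-ins here, TF 1.x API):

```python
import tensorflow as tf

# Toy stand-ins (hypothetical) for `labels` and `decoded[0]`:
# a batch of two sequences, the first decoded exactly, the second not.
labels = tf.SparseTensor(indices=[[0, 0], [0, 1], [1, 0]],
                         values=[3, 1, 2], dense_shape=[2, 2])
decoded_0 = tf.SparseTensor(indices=[[0, 0], [0, 1], [1, 0]],
                            values=tf.constant([3, 1, 4], dtype=tf.int64),
                            dense_shape=[2, 2])

# An edit distance of 0 means the decoded sequence equals its label
# exactly; both inputs stay sparse, so no padding is compared.
distances = tf.edit_distance(tf.cast(decoded_0, tf.int32), labels,
                             normalize=False)
exact_match = tf.cast(tf.equal(distances, 0.0), tf.float32)

# Streaming mean over eval batches: fraction of exactly-decoded sequences.
eval_metric_ops = {"sequence_accuracy": tf.metrics.mean(exact_match)}

with tf.Session() as sess:
    sess.run(tf.local_variables_initializer())
    sess.run(eval_metric_ops["sequence_accuracy"][1])         # update_op
    print(sess.run(eval_metric_ops["sequence_accuracy"][0]))  # 0.5
```

In the repository's model_fn, the real `labels` and `decoded[0]` would take the place of the toy tensors.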

breadbread1984 (Owner) commented Nov 22, 2018 via email
