90.04% accuracy with a regularized Support Vector Classifier. No feature selection applied.
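Below is a minimal scikit-learn sketch of this kind of setup: an RBF-kernel SVC with C-regularization fit directly on the flattened Fashion-MNIST pixels, with no feature selection step. The OpenML loader, the C and gamma values, the pixel scaling, and the random split are assumptions for illustration, not the exact configuration behind the 90.04% figure.

```python
# A minimal sketch (not the exact configuration behind the 90.04% result):
# an RBF-kernel SVC with C-regularization on raw Fashion-MNIST pixels,
# no feature selection. Dataset loading via OpenML, the C/gamma values,
# and the [0, 1] pixel scaling are assumptions for illustration.
from sklearn.datasets import fetch_openml
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# 70,000 28x28 images flattened to 784-dim vectors; rescale pixels to [0, 1].
X, y = fetch_openml("Fashion-MNIST", version=1, return_X_y=True, as_frame=False)
X = X / 255.0

# A 60,000 / 10,000 train/test split (random and stratified by class,
# not the official Fashion-MNIST split).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=10_000, stratify=y, random_state=0
)

# C controls the regularization strength: smaller C means a wider margin and
# stronger regularization, larger C fits the training data more closely.
clf = SVC(kernel="rbf", C=10, gamma="scale")
clf.fit(X_train, y_train)  # note: SVC training on the full 60k set is slow

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```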
- Source: Fashion MNIST
- U. Raghava, Deep Learning in Fashion Industry, Medium, April 2021
- X. Gu, F. Gao, M. Tan, P. Peng, Fashion analysis and understanding with artificial intelligence, Information Processing & Management, Volume 57, Issue 5, 2020
- J.J. Wen, W.K. Wong, Fundamentals of common computer vision techniques for fashion textile modeling, recognition, and retrieval, in: W.K. Wong (Ed.), Applications of Computer Vision in Fashion and Textiles, The Textile Institute Book Series, Woodhead Publishing, 2018, pp. 17-44
- Zalando Research, Fashion-MNIST repository, GitHub
- J.J. Wen, W.K. Wong, Fashion accessory segmentation, in: W.K. Wong (Ed.), Applications of Computer Vision in Fashion and Textiles, The Textile Institute Book Series, Woodhead Publishing, 2018, pp. 221-252
- G. Carbone, Fashion-MNIST challenge – kNN, Kaggle, 2021
- L. Buitinck et al., API design for machine learning software: experiences from the scikit-learn project, European Conference on Machine Learning and Principles and Practices of Knowledge Discovery in Databases, 2013
- P. Gupta, Regularization in Machine Learning, Medium, November 2017
- R. Wang, N. Xiu, S. Zhou, An extended Newton-type algorithm for ℓ2-regularized sparse logistic regression and its efficiency for classifying large-scale datasets, Journal of Computational and Applied Mathematics, Volume 397, 2021
- M. Mazuecos, How to check for Linear Separability, Medium, May 2019
- [S. Kumar, 3 Techniques to Avoid Overfitting of Decision Trees, Medium, May 2021](https://towardsdatascience.com/3-techniques-to-avoid-overfitting-of-decision-trees-1e7d3d985a09)
- G. Ke, Q. Meng, T. Finley, T. Wang, W. Chen, W. Ma, … T.-Y. Liu, LightGBM: A highly efficient gradient boosting decision tree, Advances in Neural Information Processing Systems, 30, 2017, pp. 3146–3154
- I. Goodfellow et al., Deep Learning (Adaptive Computation and Machine Learning series), MIT Press, 2016