Notebooks for Kaggle competition
This repository contains a project I completed for an NTU course titled CB4247 Statistics & Computational Inference to Big Data. In this project, I applied regression and machine learning techniques to predict house prices in India.
Dynamically adjusting ride costs in response to changing factors.
Predicting brain age from MRI scan data.
Math Score Predictor
Accident damage prediction using a CatBoost regressor.
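Since several of these projects revolve around CatBoost, here is a minimal sketch of fitting a `CatBoostRegressor`; the feature matrix and target below are synthetic placeholders, not data from any of the repositories listed on this page.

```python
# Minimal CatBoostRegressor sketch on synthetic placeholder data
# (X and y are stand-ins, not data from any repository listed here).
import numpy as np
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 6))                                 # placeholder features
y = X @ rng.normal(size=6) + rng.normal(scale=0.1, size=500)  # placeholder target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = CatBoostRegressor(iterations=300, learning_rate=0.1, depth=6, verbose=0)
model.fit(X_train, y_train)
preds = model.predict(X_test)
print(preds[:5])
```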
Linking Writing Processes to Writing Quality
Estimating abalone rings (age) from physical characteristics such as sex, length, height, diameter, and weight.
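One reason CatBoost suits tasks like the abalone one is its native handling of categorical columns. The sketch below assumes a synthetic DataFrame with a categorical "sex" column; the column names and values are illustrative, not the actual Kaggle abalone data.

```python
# Sketch: passing a categorical column to CatBoost via cat_features,
# so no manual one-hot encoding is needed. The DataFrame is synthetic.
import numpy as np
import pandas as pd
from catboost import CatBoostRegressor

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "sex": rng.choice(["M", "F", "I"], size=n),     # categorical feature
    "length": rng.uniform(0.1, 0.8, size=n),
    "diameter": rng.uniform(0.1, 0.6, size=n),
    "whole_weight": rng.uniform(0.05, 2.5, size=n),
})
rings = np.round(5 + 20 * df["length"] + rng.normal(scale=1.0, size=n))  # synthetic target

model = CatBoostRegressor(iterations=200, depth=4, verbose=0)
model.fit(df, rings, cat_features=["sex"])          # CatBoost encodes "sex" internally
print(model.predict(df.head()))
```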
Predicting house prices using advanced regression techniques (LightGBM, XGBoost, CatBoost, stacking) on Kaggle’s Ames dataset.
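As a rough illustration of the stacking approach named above, here is a hedged sketch using scikit-learn's `StackingRegressor` over LightGBM, XGBoost, and CatBoost base models; the synthetic data and all hyperparameters are illustrative, not taken from the repository.

```python
# Stacking sketch: three gradient-boosting regressors blended by a ridge
# meta-learner. Data and hyperparameters are illustrative only.
from catboost import CatBoostRegressor
from lightgbm import LGBMRegressor
from xgboost import XGBRegressor
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import RidgeCV

X, y = make_regression(n_samples=300, n_features=10, noise=0.1, random_state=0)

stack = StackingRegressor(
    estimators=[
        ("lgbm", LGBMRegressor(n_estimators=300, learning_rate=0.05)),
        ("xgb", XGBRegressor(n_estimators=300, learning_rate=0.05)),
        ("cat", CatBoostRegressor(iterations=300, verbose=0)),
    ],
    final_estimator=RidgeCV(),   # meta-learner blends the base predictions
    cv=5,
)
stack.fit(X, y)
print(stack.predict(X[:5]))
```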
A lightweight Kaggle challenge to predict crab age.
Predicting house prices using advanced regression algorithms
This project predicts flight arrival delays using various machine learning algorithms. It involves EDA, feature engineering, and model tuning with XGBoost, LightGBM, CatBoost, SVM, Lasso, Ridge, Decision Tree, and Random Forest regressors, with the goal of identifying the most accurate model.
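A typical way to run the model comparison described above is to score each candidate with the same cross-validation split; the sketch below uses synthetic data in place of the engineered flight-delay features, so the scores are only illustrative.

```python
# Sketch: compare several regressors with a shared 5-fold CV split.
# Synthetic data stands in for the real flight-delay features.
from catboost import CatBoostRegressor
from lightgbm import LGBMRegressor
from xgboost import XGBRegressor
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=15, noise=5.0, random_state=1)

models = {
    "ridge": Ridge(),
    "lasso": Lasso(),
    "rf": RandomForestRegressor(n_estimators=200, random_state=1),
    "xgb": XGBRegressor(n_estimators=300, learning_rate=0.05),
    "lgbm": LGBMRegressor(n_estimators=300, learning_rate=0.05),
    "cat": CatBoostRegressor(iterations=300, verbose=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error")
    print(f"{name:>5}: RMSE = {-scores.mean():.2f}")
```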
Code for the Kaggle single-cell competition (bronze medal).
Predicting the sale price of homes for sale and of rental homes in Barcelona.
Project developed for the thesis "AI in the reduction of cognitive biases (Anchoring Bias) in Brazilian consumption", built with Python and machine learning.