Here are some projects, seminars, and courses I have engaged in and learned a lot from.

Drug Sensitivity Ranking

Advisor: Professor Yuan Yao from Hong Kong University of Science and Technology

A Kaggle contest held by Prof. Yao on drug sensitivity ranking, based on pairwise comparison data on cancer cell lines whose (binary) genetic features are also provided. More details about the contest can be found here.

I used a DAG model and made improvements to a previous method, and ultimately won first place in the contest.
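For illustration, here is a minimal sketch of one standard way to turn pairwise comparison data into a global ranking: fitting a Bradley-Terry-style model with scikit-learn's logistic regression on indicator-difference features. This is not the DAG method I used in the contest; the data, names, and parameters below are all simulated and hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated pairwise comparisons: row (i, j) means item i was compared against item j.
rng = np.random.default_rng(0)
n_items = 20
pairs = rng.integers(0, n_items, size=(500, 2))
pairs = pairs[pairs[:, 0] != pairs[:, 1]]              # drop self-comparisons
true_scores = rng.normal(size=n_items)
# Bradley-Terry model: P(first item wins) = sigmoid(score_i - score_j)
p_win = 1 / (1 + np.exp(-(true_scores[pairs[:, 0]] - true_scores[pairs[:, 1]])))
y = (rng.random(len(pairs)) < p_win).astype(int)       # 1 if the first item wins

# Encode each comparison as a +1/-1 indicator-difference row.
X = np.zeros((len(pairs), n_items))
X[np.arange(len(pairs)), pairs[:, 0]] = 1.0
X[np.arange(len(pairs)), pairs[:, 1]] = -1.0

# Logistic regression without intercept recovers the scores up to shift and scale.
model = LogisticRegression(fit_intercept=False, C=10.0).fit(X, y)
ranking = np.argsort(-model.coef_[0])                  # highest-scored item first
print("estimated ranking:", ranking)
```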

I gained a lot of practical hands-on experience in processing data with Python, including numpy, pandas, scikit-learn, and other packages that are essential tools for data science. I also gained a deeper understanding of how research is done: I went through the whole procedure, from finding an entry point when facing a new problem, to finding references, to working the problem out in practice.

Lasso estimation in the Neyman-Rubin model

Advisor: Professor Hanzhong Liu from Tsinghua University

An application of the lasso in the Neyman-Rubin model was proposed by Bloniarz, Liu, et al. (2016), and a later result by Wager et al. (2016) improves on it, but that result had not been checked carefully. My work was to verify the later result and to try to sharpen an inequality used in the earlier paper, in order to obtain a better condition for the asymptotic efficiency of the lasso estimator.
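To make the setting concrete, here is a minimal simulated sketch of lasso regression adjustment for the average treatment effect in a completely randomized experiment, in the spirit of that line of work. The form of the adjusted estimator below is my rough summary of the general idea, not necessarily the exact estimator analyzed in the papers, and all names and numbers are hypothetical.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulated completely randomized experiment with many covariates.
rng = np.random.default_rng(0)
n, p = 400, 50
X = rng.normal(size=(n, p))                        # covariates
T = rng.integers(0, 2, size=n)                     # random treatment assignment
beta = np.zeros(p)
beta[:5] = 1.0                                     # sparse covariate effect
Y = 2.0 * T + X @ beta + rng.normal(size=n)        # true treatment effect = 2

# Unadjusted difference-in-means estimator.
tau_dm = Y[T == 1].mean() - Y[T == 0].mean()

# Lasso adjustment: fit the lasso within each arm, then correct each arm's mean
# outcome for the imbalance between its covariate mean and the overall mean.
xbar = X.mean(axis=0)
adjusted_mean = {}
for arm in (0, 1):
    idx = T == arm
    fit = Lasso(alpha=0.1).fit(X[idx], Y[idx])
    adjusted_mean[arm] = Y[idx].mean() - (X[idx].mean(axis=0) - xbar) @ fit.coef_
tau_lasso = adjusted_mean[1] - adjusted_mean[0]

print(f"difference in means: {tau_dm:.3f}, lasso-adjusted: {tau_lasso:.3f}")
```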

A deep learning approach to object detection

Advisors: Professor Ke Deng and Dr. Wanchuang Zhu from Tsinghua University

A temple from the Song dynasty was discovered, and researchers wanted to extract specific architectural elements from its frescoes. Our group applied machine learning methods to this problem; my work was to learn TensorFlow and the basics of deep learning and to discuss them in the workshop.

This project improved my Python coding ability. I became familiar with neural networks and TensorFlow, and I practiced writing neural networks in TensorFlow.
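As a flavor of what I practiced, here is a minimal TensorFlow/Keras sketch of a small convolutional image classifier. The input shape, number of classes, and data are hypothetical; this is not the actual model used on the fresco images.

```python
import tensorflow as tf

# A small convolutional classifier (hypothetical 64x64 RGB patches, 2 classes).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_patches, train_labels, epochs=5)   # hypothetical training data
```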

A seminar on non-parametric statistics and functional data analysis

Advisor: Professor Lijian Yang from Tsinghua University

This seminar covered the basic ideas and techniques of non-parametric smoothing, including the Nadaraya-Watson estimator and other kernel methods, as well as local polynomial regression. Asymptotic properties of these estimators, including consistency and uniform convergence, were examined. We also discussed the basics of functional data analysis, following the Springer book Linear Processes in Function Spaces by D. Bosq.
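As a small illustration of the kernel methods covered, here is a minimal numpy sketch of the Nadaraya-Watson estimator with a Gaussian kernel; the data and bandwidth below are simulated and hypothetical.

```python
import numpy as np

def nadaraya_watson(x_grid, x, y, h):
    """Nadaraya-Watson estimate: m_hat(x0) = sum_i K((x0 - x_i)/h) y_i / sum_i K((x0 - x_i)/h)."""
    diffs = (x_grid[:, None] - x[None, :]) / h
    weights = np.exp(-0.5 * diffs**2)              # Gaussian kernel (constants cancel)
    return (weights @ y) / weights.sum(axis=1)

# Toy example: recover a smooth curve from noisy samples.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + 0.3 * rng.normal(size=200)
x_grid = np.linspace(0.0, 2.0 * np.pi, 100)
m_hat = nadaraya_watson(x_grid, x, y, h=0.3)
print("max error vs true curve:", np.max(np.abs(m_hat - np.sin(x_grid))))
```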

The course Machine Learning

Teacher: Liwei Wang from Peking University

This course covered many topics in supervised learning. We learned a great deal about concentration inequalities, including the Chernoff bound and its many variants, and about VC theory, which is the foundation of statistical learning theory. Two practical algorithms and their properties were introduced in class. Other supervised-learning topics included PAC-Bayes and online learning. Another important topic in the course was reinforcement learning.
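As one representative example of the concentration inequalities covered, Hoeffding's inequality states that for independent random variables X_1, ..., X_n taking values in [a, b] with common mean mu,

```latex
\mathbb{P}\!\left( \left| \frac{1}{n}\sum_{i=1}^{n} X_i - \mu \right| \ge t \right)
  \le 2\exp\!\left( -\frac{2 n t^2}{(b - a)^2} \right).
```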

The course Stochastic Analysis

Teacher: Zongxia Liang from Tsinghua University

This is an advanced course in probability covering the following topics: continuous-time martingales (including the optional stopping theorem, martingale inequalities such as Doob's inequality, convergence theorems including uniformly integrable and backward martingales, the Doob-Meyer decomposition, and quadratic variation), Poisson processes (including Poisson random measures) and Brownian motion, stochastic integration with respect to discontinuous semimartingales with jumps, Itô's formula, Lévy's characterization of Brownian motion, the Burkholder-Davis-Gundy inequalities, local time for Brownian motion, Girsanov's theorem, the martingale representation theorem, stochastic differential equations, and more.
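As one concrete formula from the course, Itô's formula in its simplest form says that for a twice continuously differentiable function f and a standard Brownian motion B,

```latex
f(B_t) = f(B_0) + \int_0^t f'(B_s)\, dB_s + \frac{1}{2} \int_0^t f''(B_s)\, ds .
```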

After finishing this course, I feel like I will be invincible wherever probability is used!