This repository contains homework submissions for the course "System Identification," instructed by Dr. Babak Nadjar Araabi.
In this assignment, we delve into the fundamental concept of least squares estimation. The project covers the application of least squares in system identification, providing insights into its implementation and significance.
- Cramer-Rao Lower Bound Analysis: Explore the Cramer-Rao lower bound and its implications for the precision of parameter estimation, and understand the theoretical limits of estimation accuracy (a numerical sketch follows this list).
- Estimation Error Analysis: Investigate the sources and characteristics of estimation errors in the context of system identification.
- Model Error Examination: Explore the impact of model errors on the overall system identification process.
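
To make the Cramer-Rao bound concrete, here is a minimal NumPy sketch (not part of the original assignment; the model size, noise level `sigma`, and all variable names are illustrative assumptions). For the linear-in-parameters model y = Xθ + e with white Gaussian noise, the bound on any unbiased estimator's covariance is σ²(XᵀX)⁻¹, and the Monte-Carlo covariance of the OLS estimate should sit close to it.

```python
import numpy as np

# Hypothetical setup: y = X @ theta_true + e,  e ~ N(0, sigma^2 I).
# For this model the Cramer-Rao lower bound on any unbiased estimator's
# covariance is sigma^2 * (X^T X)^{-1}, and OLS attains it.
rng = np.random.default_rng(0)
N, sigma = 200, 0.5
theta_true = np.array([2.0, -1.0, 0.5])
X = rng.normal(size=(N, theta_true.size))          # regressor matrix (fixed design)

crlb = sigma**2 * np.linalg.inv(X.T @ X)           # theoretical lower bound

# Monte-Carlo estimate of the OLS covariance over many noise realizations
estimates = []
for _ in range(2000):
    y = X @ theta_true + sigma * rng.normal(size=N)
    theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    estimates.append(theta_hat)
emp_cov = np.cov(np.array(estimates), rowvar=False)

print("CRLB diagonal:      ", np.diag(crlb))
print("Empirical variances:", np.diag(emp_cov))    # should lie close to the bound
```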
In this assignment, we delve into advanced techniques in system identification, building upon the fundamental concept of least squares estimation. The project explores various methods for identifying significant regressors in a model and introduces key techniques such as regularization, recursive least squares (RLS), and the Kalman filter.
- Finding Important Regressors: Explore various methods for identifying significant regressors in your model:
  - Ordinary Least Squares (OLS): Implement the classic OLS method to estimate model parameters and assess the significance of regressors (see the OLS/ridge sketch after this list).
  - Ridge Regression: Employ ridge regression to handle multicollinearity and stabilize parameter estimates, particularly effective for ill-conditioned or over-parameterized models.
  - Forward Selection: Use forward selection to iteratively add the most significant regressors to the model, improving its predictive power while controlling complexity (a selection sketch follows this list).
  - Backward Elimination: Implement backward elimination to systematically remove less significant regressors from the model, simplifying it without sacrificing predictive performance.
- Regularization: Apply regularization in parameter estimation to penalize large parameter values and improve the conditioning of the estimation problem.
- Recursive Least Squares (RLS): Explore recursive least squares algorithms, including covariance resetting and RLS with a forgetting factor, for efficient online parameter estimation (see the RLS sketch after this list).
- Kalman Filter: Introduce the Kalman filter as a recursive estimator, particularly useful for systems with stochastic dynamics and noisy measurements (a Kalman-filter sketch follows this list).
- Parameter Estimation for Different Models: Explore parameter estimation techniques for a variety of model structures commonly used in system identification (an ARX estimation sketch follows this list), including:
  - Output Error (OE) Models
  - Box-Jenkins (BJ) Models
  - AutoRegressive with AutoRegressive noise and eXogenous inputs (ARARX) Models
  - AutoRegressive Moving Average with eXogenous inputs (ARMAX) Models
  - AutoRegressive with eXogenous inputs (ARX) Models
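
The OLS, ridge-regression, and regularization items above can be summarized in a few lines of NumPy. This is a minimal sketch, assuming a generic regressor matrix `Phi` with nearly collinear columns; the regularization weight `lam` and all variable names are illustrative, not prescribed by the assignment.

```python
import numpy as np

def ols(Phi, y):
    """Ordinary least squares: theta = (Phi^T Phi)^{-1} Phi^T y (solved via lstsq)."""
    theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return theta

def ridge(Phi, y, lam):
    """Ridge (Tikhonov-regularized) LS: theta = (Phi^T Phi + lam*I)^{-1} Phi^T y."""
    n = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(n), Phi.T @ y)

# Illustrative data with nearly collinear regressors (ill-conditioned Phi^T Phi)
rng = np.random.default_rng(1)
x = rng.normal(size=300)
Phi = np.column_stack([x, x + 1e-3 * rng.normal(size=x.size), rng.normal(size=x.size)])
y = Phi @ np.array([1.0, 1.0, -0.5]) + 0.1 * rng.normal(size=x.size)

print("OLS:  ", ols(Phi, y))              # high-variance estimates on the collinear pair
print("Ridge:", ridge(Phi, y, lam=1.0))   # shrunk, better-conditioned estimates
```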
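
For the regressor-selection items, a greedy forward-selection loop might look like the sketch below (the residual-sum-of-squares criterion and function name are my assumptions; statistical stopping rules such as F-tests are omitted). Backward elimination proceeds analogously, starting from the full regressor set and removing the column whose deletion increases the residual the least.

```python
import numpy as np

def forward_selection(Phi, y, n_select):
    """Greedily add the regressor (column of Phi) that most reduces the
    residual sum of squares of a least-squares fit at each step."""
    chosen, remaining = [], list(range(Phi.shape[1]))
    for _ in range(n_select):
        best_j, best_rss = None, np.inf
        for j in remaining:
            cols = chosen + [j]
            theta, *_ = np.linalg.lstsq(Phi[:, cols], y, rcond=None)
            rss = np.sum((y - Phi[:, cols] @ theta) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        chosen.append(best_j)
        remaining.remove(best_j)
    return chosen

# Illustrative use: only columns 0 and 2 of Phi truly enter y
rng = np.random.default_rng(6)
Phi = rng.normal(size=(200, 6))
y = 2.0 * Phi[:, 0] - 1.0 * Phi[:, 2] + 0.1 * rng.normal(size=200)
print(forward_selection(Phi, y, n_select=2))   # expected: columns 0 and 2
```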
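
A minimal RLS-with-forgetting sketch, assuming a first-order illustrative system and the usual gain/covariance recursions; the forgetting factor 0.98 and all names are placeholders. Covariance resetting corresponds to periodically re-initializing `P` to a large multiple of the identity so the estimator stays alert to parameter changes.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One recursive least-squares step with forgetting factor lam.
    theta: parameter estimate, P: covariance-like matrix,
    phi: regressor vector at this sample, y: measured output."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + (phi.T @ P @ phi).item())   # gain vector
    eps = y - phi.ravel() @ theta                    # a priori prediction error
    theta = theta + k.ravel() * eps
    P = (P - k @ phi.T @ P) / lam                    # forgetting inflates P
    return theta, P

# Illustrative use: identify y(t) = a*y(t-1) + b*u(t-1) + e(t) online
rng = np.random.default_rng(2)
a, b = 0.8, 1.5
u = rng.normal(size=500)
y = np.zeros(501)
for t in range(1, 501):
    y[t] = a * y[t - 1] + b * u[t - 1] + 0.05 * rng.normal()

theta, P = np.zeros(2), 1e3 * np.eye(2)              # large initial P: high uncertainty
for t in range(1, 501):
    theta, P = rls_update(theta, P, np.array([y[t - 1], u[t - 1]]), y[t])
print(theta)                                          # should approach [0.8, 1.5]
```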
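
The Kalman filter can be written as a one-step predict/update routine. The sketch below assumes the standard linear-Gaussian state-space model; the matrix names (A, C, Q, R) follow common textbook notation rather than anything specific to this assignment. With A = I, a small Q, and C set to the current regressor row, the same recursion tracks time-varying parameters much like RLS.

```python
import numpy as np

def kalman_step(x, P, y, A, C, Q, R):
    """One predict/update cycle of the discrete-time Kalman filter for
    x(t+1) = A x(t) + w(t),  y(t) = C x(t) + v(t),  w ~ N(0, Q), v ~ N(0, R)."""
    # Time update (predict)
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Measurement update (correct)
    S = C @ P_pred @ C.T + R                        # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)             # Kalman gain
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new

# Illustrative use: track a noisy scalar random walk
rng = np.random.default_rng(7)
A, C = np.eye(1), np.eye(1)
Q, R = 0.01 * np.eye(1), 0.5 * np.eye(1)
x_true, x_hat, P = np.zeros(1), np.zeros(1), np.eye(1)
for _ in range(300):
    x_true = x_true + 0.1 * rng.normal(size=1)            # random-walk state
    y = x_true + np.sqrt(0.5) * rng.normal(size=1)        # noisy measurement
    x_hat, P = kalman_step(x_hat, P, y, A, C, Q, R)
print(x_true, x_hat)                                       # estimate tracks the state
```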
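
Of the model structures listed above, only ARX is linear in its parameters, so it admits a direct least-squares solution; ARMAX, OE, BJ, and ARARX generally require iterative prediction-error or pseudo-linear regression schemes. Below is a hedged ARX estimation sketch; the data-generating system, model orders, and function name are illustrative assumptions.

```python
import numpy as np

def estimate_arx(y, u, na, nb):
    """Least-squares estimate of an ARX(na, nb) model
    y(t) = -a1*y(t-1) - ... - a_na*y(t-na) + b1*u(t-1) + ... + b_nb*u(t-nb) + e(t).
    Returns the estimated (a, b) coefficient vectors."""
    n0 = max(na, nb)
    rows = []
    for t in range(n0, len(y)):
        rows.append(np.concatenate([-y[t - na:t][::-1], u[t - nb:t][::-1]]))
    Phi = np.array(rows)
    theta, *_ = np.linalg.lstsq(Phi, y[n0:], rcond=None)
    return theta[:na], theta[na:]

# Illustrative use on data from y(t) = 0.7*y(t-1) + 0.5*u(t-1) + e(t)
rng = np.random.default_rng(3)
u = rng.normal(size=1000)
y = np.zeros(1000)
for t in range(1, 1000):
    y[t] = 0.7 * y[t - 1] + 0.5 * u[t - 1] + 0.02 * rng.normal()
a_hat, b_hat = estimate_arx(y, u, na=1, nb=1)
print(a_hat, b_hat)   # expect a_hat ≈ [-0.7] (note the sign convention), b_hat ≈ [0.5]
```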
Homework 2 delves into advanced techniques in neural networks and learning algorithms, extending beyond traditional approaches to explore cutting-edge methodologies in system identification.
- Perceptron Networks: Implement and analyze perceptron networks, foundational models in neural network theory, for system identification tasks (a perceptron-rule sketch follows this list).
- Radial Basis Function (RBF) Networks: Explore the use of RBF networks, which utilize radial basis functions as activation functions, for modeling complex relationships in system identification (an RBF/NRBF sketch follows this list).
- Normalized Radial Basis Function (NRBF) Networks: Investigate NRBF networks, an extension of RBF networks with normalized basis functions, for improved performance and stability in parameter estimation.
- LOLIMOT Algorithm: Introduce the LOLIMOT (Local Linear Model Tree) algorithm, a hybrid approach combining tree-based input-space partitioning with local linear models, for efficient and accurate system identification.
- LLNF Network: Explore the Local Linear Neuro-Fuzzy (LLNF) network, which integrates neuro-fuzzy techniques with local linear models, offering adaptive learning capabilities for system identification tasks.
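
As a starting point for the perceptron item, the classic perceptron learning rule fits in a few lines; the toy two-class data, learning rate, and function name below are illustrative assumptions, not the assignment's data.

```python
import numpy as np

def train_perceptron(X, t, epochs=50, lr=0.1):
    """Classic perceptron learning rule for binary targets t in {-1, +1}.
    Returns the weights w and bias b of the decision rule sign(w.x + b)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for x_i, t_i in zip(X, t):
            if t_i * (w @ x_i + b) <= 0:     # misclassified sample
                w += lr * t_i * x_i          # move the boundary toward the correct side
                b += lr * t_i
    return w, b

# Illustrative use on linearly separable toy data
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(loc=-2, size=(50, 2)), rng.normal(loc=2, size=(50, 2))])
t = np.array([-1] * 50 + [1] * 50)
w, b = train_perceptron(X, t)
print(np.mean(np.sign(X @ w + b) == t))      # expect 1.0 on separable data
```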
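
For the RBF and NRBF items, here is a minimal sketch with fixed, evenly spaced Gaussian centers: only the output-layer weights are fitted by least squares, and a `normalized` flag switches between RBF and NRBF basis functions. The center placement, widths, and sine toy data are illustrative assumptions; a full solution would also learn centers and widths (e.g., by clustering or gradient descent).

```python
import numpy as np

def rbf_design(x, centers, width):
    """Gaussian RBF activations for scalar inputs x."""
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

def fit_rbf(x, y, centers, width, normalized=False):
    """Fit the output-layer weights of an RBF (or NRBF if normalized=True)
    network by linear least squares; centers and widths are kept fixed here."""
    Phi = rbf_design(x, centers, width)
    if normalized:                               # NRBF: basis functions sum to 1
        Phi = Phi / Phi.sum(axis=1, keepdims=True)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

# Illustrative 1-D example: approximate a nonlinear static map
rng = np.random.default_rng(5)
x = np.sort(rng.uniform(-3, 3, size=200))
y = np.sin(x) + 0.05 * rng.normal(size=x.size)
centers = np.linspace(-3, 3, 9)
w_rbf = fit_rbf(x, y, centers, width=0.8)
w_nrbf = fit_rbf(x, y, centers, width=0.8, normalized=True)
```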