Update ex2.m
malakar-soham authored Jul 22, 2019
1 parent a505ce0 commit c7d1dae
Showing 1 changed file with 17 additions and 38 deletions.
55 changes: 17 additions & 38 deletions ex2.m
@@ -1,21 +1,3 @@
%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
% Instructions
% ------------
%
% This file contains code that helps you get started on the logistic
% regression exercise. You will need to complete the following functions
% in this exercise:
%
% sigmoid.m
% costFunction.m
% predict.m
% costFunctionReg.m
%
% For this exercise, you will not need to change any code in this file,
% or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

@@ -27,7 +9,7 @@
X = data(:, [1, 2]); y = data(:, 3);

%% ==================== Part 1: Plotting ====================
% We start the exercise by first plotting the data to understand the
% We start by first plotting the data to understand
% the problem we are working with.
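
The plotting call below relies on a helper that is not part of this diff (the call itself is collapsed in this view). In the standard version of this exercise the helper is plotData.m; a minimal sketch under that assumption, using + markers for the positive examples and o markers for the negative ones, could look like:

function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure, with + for the
%   positive examples and o for the negative examples.
figure; hold on;

% Indices of positive (y = 1) and negative (y = 0) examples
pos = find(y == 1);
neg = find(y == 0);

% Plot the two classes with distinct markers
plot(X(pos, 1), X(pos, 2), 'k+', 'LineWidth', 2, 'MarkerSize', 7);
plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);

hold off;
end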

fprintf(['Plotting data with + indicating (y = 1) examples and o ' ...
@@ -50,17 +32,16 @@


%% ============ Part 2: Compute Cost and Gradient ============
% In this part of the exercise, you will implement the cost and gradient
% for logistic regression. You need to complete the code in
% costFunction.m
% In this part, I have implemented the cost and gradient
% for logistic regression.

% Setup the data matrix appropriately, and add ones for the intercept term
% Setting up the data matrix appropriately, and adding ones for the intercept term
[m, n] = size(X);

% Add intercept term to x and X_test
X = [ones(m, 1) X];

% Initialize fitting parameters
% Initializing fitting parameters
initial_theta = zeros(n + 1, 1);

% Compute and display initial cost and gradient
@@ -72,7 +53,7 @@
fprintf(' %f \n', grad);
fprintf('Expected gradients (approx):\n -0.1000\n -12.0092\n -11.2628\n');
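
The cost and gradient printed here come from sigmoid.m and costFunction.m, which this script calls but which are not shown in the diff. A vectorized sketch of both (each in its own .m file), assuming the usual cross-entropy cost and its gradient, could be:

function g = sigmoid(z)
%SIGMOID Computes the sigmoid of z, element-wise for vectors and matrices
g = 1 ./ (1 + exp(-z));
end

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Computes the logistic regression cost and gradient
m = length(y);            % number of training examples
h = sigmoid(X * theta);   % hypothesis for all examples at once

% Cross-entropy cost: J = (1/m) * sum(-y.*log(h) - (1-y).*log(1-h))
J = (1 / m) * sum(-y .* log(h) - (1 - y) .* log(1 - h));

% Gradient of the cost with respect to each theta(j)
grad = (1 / m) * (X' * (h - y));
end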

% Compute and display cost and gradient with non-zero theta
% Computing and displaying cost and gradient with non-zero theta
test_theta = [-24; 0.2; 0.2];
[cost, grad] = costFunction(test_theta, X, y);

@@ -87,29 +68,29 @@


%% ============= Part 3: Optimizing using fminunc =============
% In this exercise, you will use a built-in function (fminunc) to find the
% I have used a built-in function (fminunc) to find the
% optimal parameters theta.

% Set options for fminunc
% Setting options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Run fminunc to obtain the optimal theta
% Running fminunc to obtain the optimal theta
% This function will return theta and the cost
[theta, cost] = ...
fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

% Print theta to screen
% Printing theta to screen
fprintf('Cost at theta found by fminunc: %f\n', cost);
fprintf('Expected cost (approx): 0.203\n');
fprintf('theta: \n');
fprintf(' %f \n', theta);
fprintf('Expected theta (approx):\n');
fprintf(' -25.161\n 0.206\n 0.201\n');

% Plot Boundary
% Plotting Boundary
plotDecisionBoundary(theta, X, y);

% Put some labels
% Putting some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
@@ -123,17 +104,15 @@
pause;
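
plotDecisionBoundary.m is likewise outside this diff. For the two-feature, linear case used in this part (theta has three entries and X carries the intercept column of ones), a sketch that draws the line where theta' * x = 0 could be:

function plotDecisionBoundary(theta, X, y)
%PLOTDECISIONBOUNDARY Plots the data and the line where theta' * x = 0
%   Only the linear, two-feature case is sketched here; X is assumed to
%   include the intercept column of ones as its first column.
plotData(X(:, 2:3), y);   % reuses the plotData helper sketched earlier
hold on;

% Two x1 endpoints spanning the plotted data
plot_x = [min(X(:, 2)) - 2, max(X(:, 2)) + 2];

% Solve theta(1) + theta(2)*x1 + theta(3)*x2 = 0 for x2
plot_y = (-1 / theta(3)) .* (theta(2) .* plot_x + theta(1));

plot(plot_x, plot_y);
hold off;
end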

%% ============== Part 4: Predict and Accuracies ==============
% After learning the parameters, you'll like to use it to predict the outcomes
% on unseen data. In this part, you will use the logistic regression model
% After learning the parameters, I have used them to predict the outcomes
% on unseen data. In this part, I have used the logistic regression model
% to predict the probability that a student with score 45 on exam 1 and
% score 85 on exam 2 will be admitted.
%
% Furthermore, you will compute the training and test set accuracies of
% our model.
%
% Your task is to complete the code in predict.m
% Furthermore, I have computed the training and test set accuracies of
% the model.

% Predict probability for a student with score 45 on exam 1
% Predicting probability for a student with score 45 on exam 1
% and score 85 on exam 2

prob = sigmoid([1 45 85] * theta);
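
The accuracy check in the collapsed remainder of the file relies on predict.m, named in the exercise instructions but not shown here. A minimal sketch, assuming a 0.5 threshold on the sigmoid output and an X that already includes the intercept column, is:

function p = predict(theta, X)
%PREDICT Predicts 0/1 labels using learned logistic regression parameters
h = sigmoid(X * theta);   % predicted probability for each example
p = h >= 0.5;             % threshold at 0.5 to get class labels
end

With a helper like this, the training accuracy is typically reported as mean(double(predict(theta, X) == y)) * 100.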
