big o complexity section edits #5

Open · wants to merge 1 commit into master
README.md: 49 changes (27 additions, 22 deletions)
@@ -273,6 +273,18 @@ Coding examples:
- print "hello";


**O(log n)**

O(log n) is slightly harder to explain.
A good example of an O(log n) problem is looking up a name in a phone book: even if the book is very thick, you do not need to check every name, because you can open it at the right letter and keep narrowing the range from there.

The problem size shrinks (typically by half) with every iteration.

Coding examples:
- Binary search
- Finding largest/smallest number in a binary search tree
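
As an illustrative sketch (not code from this repository), a minimal iterative binary search in Python halves the remaining range on every pass, which is where the log n comes from:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    Each iteration halves the remaining search range, giving O(log n) time.
    """
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1   # discard the lower half
        else:
            high = mid - 1  # discard the upper half
    return -1


print(binary_search([2, 5, 8, 12, 23, 38, 56, 72, 91], 23))  # -> 4
```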


**O(N)**

Performance will grow linearly and in direct proportion to the size of the input data set.
@@ -282,7 +294,18 @@ Coding examples:
- Traversing a linked list
- Linear Search
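
A minimal linear search sketch (our own illustration): in the worst case every element is inspected exactly once, so the work grows in step with the input size.

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent.

    Worst case: every element is inspected once, i.e. O(N) time.
    """
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1


print(linear_search([7, 3, 9, 1, 4], 9))  # -> 2
```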


**O(n log n)**

To understand it, think of it as doing O(N) work O(log n) times. Quick sort is a typical example: each level of recursion takes O(N) time to partition the array around a pivot element, and on average the array is split roughly in half about log n times.

Coding examples:
- Merge Sort
- Heap Sort
- Quick Sort
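
A rough merge sort sketch in Python (illustrative only, not code from the repository): the list is halved about log n times, and each level of recursion does O(N) merging work.

```python
def merge_sort(items):
    """Sort items in O(n log n) time: halve, recurse, then merge."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged


print(merge_sort([5, 2, 9, 1, 7, 3]))  # -> [1, 2, 3, 5, 7, 9]
```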


**O(N^2)**

Performance is directly proportional to the square of the size of the input data set.
This is common with nested iterations over the data set.
@@ -294,32 +317,14 @@ Coding examples:
- Insertion Sort
- Selection Sort
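
An illustrative sketch, assuming a simple bubble sort: the nested loops each run up to N times, so the worst case performs on the order of N * N comparisons.

```python
def bubble_sort(items):
    """Sort items in place; nested loops give O(N^2) comparisons in the worst case."""
    n = len(items)
    for i in range(n):                    # outer loop: N passes
        for j in range(0, n - i - 1):     # inner loop: up to N comparisons per pass
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items


print(bubble_sort([6, 4, 8, 2, 5]))  # -> [2, 4, 5, 6, 8]
```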

**O(2^N)**

Performance doubles with each addition to the input data set. The growth curve of an O(2^N) function is exponential.

Coding examples:
- Recursive calculation of Fibonacci numbers
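
A minimal sketch of the naive recursive Fibonacci (our own illustration): each call branches into two more calls, so the number of calls roughly doubles as N grows by one.

```python
def fib(n):
    """Naive recursive Fibonacci: roughly O(2^N) calls because each call branches twice."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)


print(fib(10))  # -> 55 (fine for small n, but grows explosively for larger n)
```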


## Big-Ω (Big-Omega) notation
It describes: