Big O
Big O is a metric for describing the efficiency of algorithms. Not only can it be used to judge an algorithm, it can also help you develop one.
Big O is also called the asymptotic runtime; it describes how the time or space an algorithm needs grows as the input size grows.
Time Complexity
If we take data transfer as an example, we might see the following runtimes:
- O(s), where s is the size of the file, for example sending the file over a network. The time grows as the file size grows.
- O(1), for example physically shipping a drive: the time is constant and does not grow as the file size grows.
There are other runtimes. Some of the most common ones are O(log N), O(N log N), O(N), O(N^2) and O(2^N). We can also have multiple variables in a runtime, such as O(ab) where a and b are the sizes of two arrays.
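As a small illustration of a multi-variable runtime, here is a hypothetical sketch (the method name countPairs is ours); the work is proportional to the product of the two array lengths, O(ab):
int countPairs(int[] first, int[] second) {
    int count = 0;
    for (int i = 0; i < first.length; i++) {       // runs a times
        for (int j = 0; j < second.length; j++) {  // runs b times for each i
            count++;                               // executed a * b times in total
        }
    }
    return count;
}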
Best Case, Worst Case and Average Case
Given an algorithm, there are three ways to describe its runtime for different inputs.
Let's use binary search to illustrate these.
- Best case is O(1): the element we check first (the middle of the array) is the one we want.
- Worst case is O(log N): we keep halving the range, for example when the target is at the first or last position, or not in the array at all.
- Average case is also O(log N).
Best case is not that useful because it requires an optimal situation. In practice, we care more about the other two.
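For reference, here is a minimal binary search sketch over a sorted int array (the method name binarySearch is ours), with the cases above annotated:
int binarySearch(int[] sorted, int target) {
    int lo = 0;
    int hi = sorted.length - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;            // probe the middle of the remaining range
        if (sorted[mid] == target) return mid;   // best case: found on the first probe, O(1)
        if (sorted[mid] < target) lo = mid + 1;  // otherwise drop half of the range each loop,
        else hi = mid - 1;                       // so the worst/average case is O(log N)
    }
    return -1;                                   // not found
}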
Space Complexity
How an algorithm uses space also matters. Space complexity is a parallel concept to time complexity. If we create an array of size N, it takes O(N) space; if we create a two-dimensional N x N array, it requires O(N^2) space.
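As a quick illustration (the method names are ours), the arrays allocated below dominate the space used:
int[] makeList(int n) {
    return new int[n];     // O(N) space
}

int[][] makeGrid(int n) {
    return new int[n][n];  // O(N^2) space: N rows of N ints each
}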
Rules to Follow
Drop the Constants
Because Big O describes the rate of increase, we write O(N) instead of O(2N). We also need to accept that an O(N) algorithm is not always faster in practice than an O(N^2) one: with large constant factors or small inputs, the O(N) code can be the slower of the two.
Drop the Non-Dominant Terms
For the same reason that we drop the constants, we also drop the non-dominant terms. For example, O(N^2 + N) becomes O(N^2).
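A small sketch of why this works (the method is hypothetical): the sequential passes contribute O(2N) and the nested loop contributes O(N^2), and the total O(N^2 + 2N) is still reported as O(N^2):
void work(int[] array) {
    for (int i = 0; i < array.length; i++) { }    // first pass, O(N)
    for (int i = 0; i < array.length; i++) { }    // second pass, another O(N)
    for (int i = 0; i < array.length; i++) {      // nested loops, O(N^2): the dominant term
        for (int j = 0; j < array.length; j++) { }
    }
}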
Amortized Time
Consider insertion into an ArrayList. What is the runtime?
Because an ArrayList is backed by an array, most insertions take constant time, O(1). But when the array is full, a larger array has to be allocated and all the old items copied into it, which is costly.
The insertions that trigger a resize do not happen very often, so we can amortize the cost of the expensive operation over the cheap ones.
If the array doubles its size whenever it is full and the initial capacity is 1, resizes happen around insertions 2, 4, 8, 16, …, X, and the resizes copy 1, 2, 4, 8, 16, …, X elements into the new arrays.
How many copies have we done after X insertions? Roughly 1 + 2 + 4 + … + X ≈ 2X. Therefore X insertions take O(2X) = O(X) time in total, and the amortized runtime of each insertion is O(1).
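Here is a stripped-down sketch of the doubling strategy; this is not the real java.util.ArrayList, just an illustration with a simplified growth policy:
class SimpleList {
    private int[] items = new int[1];  // initial capacity 1
    private int size = 0;

    void add(int value) {
        if (size == items.length) {
            // Resize: allocate double the capacity and copy the old items, O(size).
            int[] bigger = new int[items.length * 2];
            System.arraycopy(items, 0, bigger, 0, size);
            items = bigger;
        }
        items[size++] = value;         // the common case, O(1)
    }
}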
What Log N Runtime Means
Let's take binary search as an example again. Given N numbers in a sorted array, how many times can we halve N before we reach 1?
The sequence of remaining sizes is N, N/2, N/4, …, 1; read from right to left it is 1, 2, 4, …, N. If we write N = 2^X, where X is the number of halving steps, then X = log N, which is why the runtime is O(log N).
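Equivalently, we can count the halving steps directly (the helper below is ours); the loop runs about log2(N) times:
int halvingSteps(int n) {
    int steps = 0;
    while (n > 1) {
        n /= 2;        // cut the remaining size in half
        steps++;
    }
    return steps;      // e.g. halvingSteps(16) == 4, since 16 = 2^4
}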
Recursive Runtime
Recursive runtimes can be tricky. For example:
int fib(int n) {
    if (n <= 1) return n;               // base cases: fib(0) = 0, fib(1) = 1
    if (n == 2) return fib(1) + fib(0); // redundant shortcut, same as the general case below
    return fib(n - 1) + fib(n - 2);     // two recursive calls per invocation
}
We can't read the runtime off directly, so we derive it by walking through the code. Let's start with fib(5): it makes two recursive calls, fib(4) and fib(3), and each of those makes two calls as well, namely fib(3), fib(2) and fib(2), fib(1).
We can see the calls form a binary tree, and the number of nodes in that tree is the total amount of work.
In general, it is 2^0 + 2^1 + 2^2 + … + 2^N = 2^(N+1) - 1, so the runtime is O(2^N).
When a recursive function makes multiple calls, the runtime is often O(branches^depth).
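One way to convince yourself is to count the calls directly; the counter below is ours, and it counts the call tree of the simpler fib without the n == 2 shortcut. The count grows exponentially with n (the tight bound is closer to O(1.6^N), but O(2^N) is the simple upper bound the tree argument gives):
long countFibCalls(int n) {
    if (n <= 1) return 1;                                    // a base-case call, no recursion
    return 1 + countFibCalls(n - 1) + countFibCalls(n - 2);  // this call plus its two subtrees
}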
Examples
Tips
When facing a function whose runtime is not obvious, a few strategies help:
- Count the iterations
- Think about what the code means, not just its structure
- Estimate the average work per call
Example 1
void fun(int[] array) {
    for (int i = 0; i < array.length; i++) {
        for (int j = i + 1; j < array.length; j++) {  // j starts after i, not at 0
            System.out.println("");
        }
    }
}
At first glance we might guess the runtime is O(N^2), where N is the array size, but let's make sure.
The code differs slightly from the usual pair of nested loops (where the inner loop also starts at 0), whose runtime really is O(N^2). So let's count the iterations first.
The inner statement runs N - 1, N - 2, N - 3, …, 1 times as i increases.
Counting from the small end, that is 1 + 2 + 3 + … + (N - 1), which sums to N(N - 1)/2, so the runtime is indeed O(N^2).
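We can sanity-check the count with a small helper (ours, for illustration) that runs the same loops and compares against the formula:
int countIterations(int n) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        for (int j = i + 1; j < n; j++) {
            count++;
        }
    }
    return count;  // equals n * (n - 1) / 2
}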
Example 2
The following code sums the values of all the nodes in a balanced binary search tree.
int sum(Node node) {
    if (node == null) return 0;                          // an empty subtree contributes nothing
    return sum(node.left) + node.val + sum(node.right);  // left subtree + this node + right subtree
}
What is the runtime?
Several operations on a balanced binary search tree run in O(log N) time. But how about this one? Consider what the code means: summing all the nodes requires touching every node, so the runtime is O(N).
Since it is also recursive, we could try O(branches^depth): there are 2 branches and the depth of a balanced binary search tree is log N, which gives O(2^log N). So what is 2^log N?
Let P = 2^log N.
Then log P = log N,
so P = N, and the runtime is O(N).
Example 3
Let’s see a tricky one.
void permutation(String str) {
    permutation(str, "");
}

void permutation(String str, String prefix) {
    if (str.length() == 0) {
        System.out.println(prefix);  // prefix is now a complete permutation
    } else {
        for (int i = 0; i < str.length(); i++) {
            // Remove the i-th character and append it to the prefix.
            String rem = str.substring(0, i) + str.substring(i + 1);
            permutation(rem, prefix + str.charAt(i));
        }
    }
}
Given a string of length N, there are N! permutations: there are N options for the first slot, N - 1 for the second slot, and so on. For example, permutation("abc") prints 3! = 6 strings.
So how much work do we do for each of the permutations?
Each permutation is built by a chain of N recursive calls, and each call on that chain does O(N) work for it: the substring and the concatenation each take O(N) time. That is O(N^2) work per permutation, so the total runtime is O(N^2 * N!).
References
Cracking the Coding Interview