Javascript — Big O Notation: Time and Space Complexity

5beberson

--

In preparation for upcoming interviews, I am deep diving into JavaScript and algorithms. I’m taking a class on Udemy that is very intense but very informative. I learned more about Big-O Notation, so I decided to write a blog about my understanding of this tool for reasoning about how code performs as its input grows.

What is Big O Notation?

Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. Big O is a member of a family of notations invented by Paul Bachmann,[1] Edmund Landau,[2] and others, collectively called Bachmann–Landau notation or asymptotic notation.

Wikipedia’s definition

Big-O Notation is a concise way of describing, in algebraic terms, how the running time or memory use of your code grows as the input grows.
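As a quick sketch of what that means in practice (the helper names here are my own, just for illustration): reading one index of an array costs the same no matter how long the array is, while scanning for a value can take longer the bigger the array gets.

```javascript
// O(1): reading one index costs the same no matter how long the array is.
function firstItem(arr) {
  return arr[0];
}

// O(n): in the worst case we visit every element before we can answer.
function contains(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) return true;
  }
  return false;
}

console.log(firstItem(['A', 'B', 'C'])); // 'A'
console.log(contains(['A', 'B', 'C'], 'C')); // true
```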

Time complexity

Let’s take a simple example:

const data = ['A', 'B', 'C'];
for (let i = 0; i < data.length; i++) {
  console.log(data[i]);
}
A
B
C

The first thing we need to do is figure out what the core part of the algorithm is. In this case, the core portion is the console.log: it is the one thing our algorithm does over and over to get the end result. Then we determine how many times this happens based on how large our input is. Generally, we set some variable to represent the length of the input. This variable is almost always called ’n’, so you’ll see ’n’ a lot in Big-O Notation. Here, the length of our ‘data’ array is equal to ’n’.

Now we want to determine how our algorithm grows as ’n’ grows. In this case, as ’n’ grows from 3 towards infinity, the loop body runs from 3 towards infinity times. It scales 1 to 1 with the size of ’n’, so we give this a Big-O Notation of ’n’: O(n).
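To see that 1-to-1 scaling concretely, here is a sketch (the counter is my own addition) that counts how many times the core operation runs for inputs of different lengths:

```javascript
// Count how many times the core operation runs for a given input.
function countIterations(data) {
  let operations = 0;
  for (let i = 0; i < data.length; i++) {
    operations++; // stands in for the console.log
  }
  return operations;
}

console.log(countIterations(['A', 'B', 'C'])); // 3
console.log(countIterations(new Array(1000))); // 1000 — scales 1:1 with n
```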

If we were to put another loop like so:

const data = ['A', 'B', 'C'];
const data2 = [1, 2, 3, 4, 5];
for (let j = 0; j < data2.length; j++) {
  console.log(data2[j]);
}
for (let i = 0; i < data.length; i++) {
  console.log(data[i]);
}
1
2
3
4
5
A
B
C

Now there are 2 separate loops in our algorithm. First it loops over everything in data2, and then it loops over data. We now have 2 separate variables: ’n’ for the length of ‘data’, and ‘b’ for the length of data2. Our algorithm scales with the lengths of both arrays, so its Big-O Notation is O(n + b). If both arrays grow with the same size n, this becomes O(n + n) = O(2n), and since we drop constant factors, it simplifies to O(n).
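A small sketch (again with my own iteration counter) shows that back-to-back loops add their work rather than multiply it, which is why this is n + b:

```javascript
// Two sequential loops: total work is n + b, not n * b.
function countSequential(data, data2) {
  let operations = 0;
  for (let j = 0; j < data2.length; j++) operations++;
  for (let i = 0; i < data.length; i++) operations++;
  return operations;
}

console.log(countSequential(['A', 'B', 'C'], [1, 2, 3, 4, 5])); // 8 (3 + 5)
```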

Now let’s do a nested loop, where we iterate over data once for every element of data2.

const data = ['A', 'B', 'C'];
const data2 = [1, 2, 3, 4, 5];
for (let j = 0; j < data2.length; j++) {
  for (let i = 0; i < data.length; i++) {
    console.log(data[i] + data2[j]);
  }
}

The core operation is still the console.log, and we now run it ’n’ times for every single one of the ‘b’ passes through the outer loop. That gives O(n * b); when both inputs grow with the same size n, this is usually written as O(n²) (O(n squared)).

When you start to see these types of algorithms, you are going to run into problems, because they perform very slowly as the input grows. The work grows quadratically, as shown in the chart below:

Big-O Complexity Chart

Now let’s imagine we take the example above and call console.log 5 times instead of 1.

const data = ['A', 'B', 'C'];
const data2 = [1, 2, 3, 4, 5];
for (let j = 0; j < data2.length; j++) {
  for (let i = 0; i < data.length; i++) {
    console.log(data[i] + data2[j]);
    console.log(data[i] + data2[j]);
    console.log(data[i] + data2[j]);
    console.log(data[i] + data2[j]);
    console.log(data[i] + data2[j]);
  }
}

This takes us to O(5n²), but in Big-O Notation we drop constant factors like this, so it ends up being O(n²).
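Why can we drop the 5? A sketch (my own helper functions, purely illustrative): doubling n multiplies both n² and 5n² by the same factor of 4, so both expressions describe the same growth rate.

```javascript
// Constant factors don't change how an algorithm scales.
function work(n) {
  return n * n; // n² operations
}
function workTimes5(n) {
  return 5 * n * n; // 5n² operations
}

// Doubling n quadruples the work in BOTH cases.
console.log(work(20) / work(10)); // 4
console.log(workTimes5(20) / workTimes5(10)); // 4 — same growth rate
```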

A last example: what if we added another loop after the nested loop, like so:

const data = ['A', 'B', 'C'];
const data2 = [1, 2, 3, 4, 5];
for (let j = 0; j < data2.length; j++) {
  for (let i = 0; i < data.length; i++) {
    console.log(data[i] + data2[j]);
  }
}
for (let i = 0; i < data.length; i++) {
  console.log(data[i]);
}

We are looping n² times in the nested loop, and n times in the 2nd loop. The Big-O Notation is O(n² + n), but we can drop the lower-order term that scales more slowly and keep it as O(n²).
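To see why the + n term can be dropped, here is a sketch (my own function, just arithmetic) of what fraction of the total work the n² part accounts for as n grows:

```javascript
// As n grows, the n² term dwarfs the n term, so O(n² + n) behaves like O(n²).
function quadraticShare(n) {
  return (n * n) / (n * n + n); // fraction of total work done by the n² part
}

console.log(quadraticShare(10));   // ≈ 0.909
console.log(quadraticShare(1000)); // ≈ 0.999 — the +n term barely matters
```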

Space complexity

We want to ask ourselves how our algorithm’s memory usage grows as the input size scales.

Let’s go back to the 1st example of time complexity:

const data = ['A', 'B', 'C'];
for (let i = 0; i < data.length; i++) {
  console.log(data[i]);
}

The extra space this uses is constant. We are not allocating anything new inside the loop, so the memory used doesn’t change as our input size scales. This is called a Big-O of one — O(1).
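Another O(1)-space sketch (my own example): summing an array only ever needs one extra variable, no matter how long the array is.

```javascript
// O(1) extra space: one accumulator variable regardless of data.length.
function sum(data) {
  let total = 0; // the only extra memory we allocate
  for (let i = 0; i < data.length; i++) {
    total += data[i];
  }
  return total;
}

console.log(sum([1, 2, 3, 4, 5])); // 15
```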

Let’s take another example where we create a new array.

const data = ['A', 'B', 'C'];
const out = [];
for (let i = 0; i < data.length; i++) {
  out[i] = data[i];
}

Now we are creating a brand new array from the values in the data array. This gives a space complexity of ’n’ — O(n). The amount of extra memory this algorithm takes up is exactly the size of its input.

Let’s go further and add a nested loop:

const data = ['A', 'B', 'C'];
const out = [];
for (let i = 0; i < data.length; i++) {
  out[i] = [];
  for (let j = 0; j < data.length; j++) {
    out[i][j] = data[i];
  }
}

This outputs 3 new arrays, the 1st being the letter ‘A’ repeated 3 times, and likewise for ‘B’ and ‘C’. It has a space complexity of n squared — O(n²) — because for 3 inputs the output holds 3 × 3 = 9 items; with 4 inputs, it would hold 16.
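Wrapping the loop above in a function (a sketch, with a hypothetical name) lets us verify that the output really holds n × n items:

```javascript
// The nested loop builds n arrays of n items each: n² total memory.
function expand(data) {
  const out = [];
  for (let i = 0; i < data.length; i++) {
    out[i] = [];
    for (let j = 0; j < data.length; j++) {
      out[i][j] = data[i];
    }
  }
  return out;
}

console.log(expand(['A', 'B', 'C']).flat().length); // 9 = 3²
console.log(expand(['A', 'B', 'C', 'D']).flat().length); // 16 = 4²
```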

Conclusion

I still have a long road ahead before I master algorithms, but what I have learned so far is that we generally worry most about the time complexity of an algorithm. It is still good to know about space complexity and to be able to determine how much memory an algorithm is going to take.
