How fast does your code grow when input size grows? Master this and you'll analyse any algorithm.
See how different complexities grow as n increases.
```python
# Sequential → O(n) + O(n) = O(2n) → O(n)
for i in range(n): ...
for i in range(n): ...

# Nested → O(n) × O(n) = O(n²)
for i in range(n):
    for j in range(n): ...

# Log n: each step HALVES the problem
while n > 1:
    n = n // 2  # O(log n)
```
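The halving pattern is worth seeing in action. A quick sketch that literally counts the halvings (the `count_halvings` helper is illustrative, not from a library):

```python
def count_halvings(n: int) -> int:
    """Count how many times n can be floor-halved before reaching 1."""
    steps = 0
    while n > 1:
        n = n // 2
        steps += 1
    return steps

# log2(1,000,000) ≈ 19.93, so roughly 20 halvings
print(count_halvings(1_000_000))  # → 19
print(count_halvings(1024))       # → 10
```

This is why a binary search over a million sorted items needs only about 20 comparisons.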
| Complexity | Name | n = 1M steps | Verdict |
|---|---|---|---|
| O(1) | Constant | 1 | Perfect |
| O(log n) | Logarithmic | ~20 | Excellent |
| O(n) | Linear | 1,000,000 | Good |
| O(n log n) | Linearithmic | ~20,000,000 | Acceptable |
| O(n²) | Quadratic | 1,000,000,000,000 | Slow ❌ |
| O(2ⁿ) | Exponential | ∞ | Never use ❌ |
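You can reproduce the table's step counts for n = 1,000,000 with a few lines of arithmetic, which also shows why the exponential row is marked ∞ (even n = 1000 already overflows anything physical):

```python
import math

n = 1_000_000
print(1)                        # O(1):       1 step
print(round(math.log2(n)))      # O(log n):   ~20 steps
print(n)                        # O(n):       1,000,000 steps
print(round(n * math.log2(n)))  # O(n log n): ~20 million steps
print(n * n)                    # O(n²):      10^12 steps

# O(2ⁿ): even for n = 1000, 2**n has over 300 digits
print(len(str(2 ** 1000)))      # → 302 digits
```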
| What you create | Space |
|---|---|
| A few integer variables | O(1) |
| Array/list of size n | O(n) |
| 2D matrix n×n | O(n²) |
| Recursion depth n | O(n) call stack |
| HashMap with n entries | O(n) |
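The "recursion depth n" row deserves a demonstration: each recursive call adds a stack frame, so a linear recursion costs O(n) memory even when it creates no data structures. A minimal sketch (the two function names are illustrative):

```python
import sys

def depth_recursive(n: int) -> int:
    """O(n) call-stack space: one frame per level of recursion."""
    if n == 0:
        return 0
    return 1 + depth_recursive(n - 1)

def depth_iterative(n: int) -> int:
    """Same answer in O(1) space: just a counter."""
    count = 0
    while n > 0:
        n -= 1
        count += 1
    return count

# CPython's default recursion limit (~1000) caps the recursive version;
# the iterative one handles any n.
print(sys.getrecursionlimit())
print(depth_recursive(500), depth_iterative(1_000_000))
```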
`x in list` is O(n), `x in set` is O(1), `sorted()` is O(n log n).

| Pattern | Time | Why |
|---|---|---|
| Single loop over n | O(n) | Runs n times |
| Two nested loops | O(n²) | n × n |
| Halve input each step | O(log n) | log₂(n) steps |
| arr.sort() | O(n log n) | TimSort |
| dict / set lookup | O(1) | Hash function |
| Recursion branches 2× | O(2ⁿ) | Tree of calls |
| x in list | O(n) | Linear scan |
| list.pop(0) | O(n) | Shifts all elements |
| deque.popleft() | O(1) | Double-ended |
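The `x in list` vs `x in set` gap is easy to measure with the standard `timeit` module. A quick sketch (sizes and repeat counts are arbitrary):

```python
import timeit

n = 100_000
data_list = list(range(n))
data_set = set(data_list)
target = n - 1  # worst case for the list: last element, full scan

t_list = timeit.timeit(lambda: target in data_list, number=100)
t_set = timeit.timeit(lambda: target in data_set, number=100)
print(f"list membership: {t_list:.4f}s  set membership: {t_set:.6f}s")
```

On typical hardware the set lookup is thousands of times faster, and the gap widens as n grows.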
```python
# O(1) operations
arr[i]            # index access
arr.append(x)     # amortized O(1)
arr.pop()         # remove last
d[key]            # dict access
x in my_set       # set lookup
len(anything)     # stored attribute

# O(n) operations — be careful!
x in my_list      # linear scan ❌
arr.insert(0, x)  # shifts all right ❌
arr.pop(0)        # shifts all left ❌
arr.copy()        # copies n elements
" ".join(arr)     # O(total chars)

# O(n log n)
arr.sort()        # in-place TimSort
sorted(arr)       # returns new sorted list
```
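When you need O(1) pops from the front, swap the list for `collections.deque`, which supports constant-time operations at both ends:

```python
from collections import deque

# list.pop(0) shifts every remaining element; deque.popleft() does not
queue = deque([1, 2, 3, 4])
print(queue.popleft())  # → 1, in O(1)
queue.append(5)         # O(1) at either end
print(list(queue))      # → [2, 3, 4, 5]
```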