When we write f(n) = O(n), what we really should write is f(n) ∈ O(n).

That is because, technically, O(n) denotes a set.

This is only one of several imprecisions in how we use “Big Oh” (arguably the most widely adopted, and the only harmless imprecision I am about to discuss).

If O(n) is a set, what set is it? Is it the set of all algorithms that run in linear time? Not really. It also includes all algorithms that run in constant time. Do not confuse “Big Oh” with “Big Theta”.

Taking this pedantry one step further: technically, O(n) is not a set of algorithms. It is a set of functions. It is fundamentally a mathematical concept that can be understood without any computer science context. To clarify this, we need the formal definition of the “Big Oh” set:

f(n) ∈ O(g(n)) if and only if there exist positive constants c and k such that f(n) ≤ c · g(n) for all n > k.
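A minimal Python sketch of this definition (the function name and the cutoff n_max are my own choices; a numeric spot-check over a finite range like this is only evidence, never a proof, since the definition quantifies over all n > k):

```python
def looks_big_oh(f, g, c, k, n_max=10_000):
    """Check f(n) <= c * g(n) for every integer n with k < n <= n_max.

    A True result only suggests f ∈ O(g) with witnesses (c, k);
    we can test only finitely many n out of infinitely many.
    """
    return all(f(n) <= c * g(n) for n in range(k + 1, n_max + 1))

# f(n) = 3n + 9 appears to be in O(n), witnessed by c = 4, k = 10:
print(looks_big_oh(lambda n: 3 * n + 9, lambda n: n, c=4, k=10))  # True
```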

Noticing the parallel between this definition and that of a limit, we realize that this is really a property characterizing the limiting behavior of functions.

To make things less abstract: f(n) = 3n + 9 belongs to O(n), as witnessed by c = 4 and k = 10. Likewise, f(n) = 999 belongs to O(n) with c = 1 and k = 1000. However, if we try to do the same for f(n) = n^2, we fail: no constant c makes n^2 ≤ c · n hold for all large n.
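To see why n^2 must fail, here is a small sketch (the helper name is my own) that, for any proposed pair (c, k), produces a concrete n where the inequality breaks:

```python
def counterexample(c, k):
    """For f(n) = n^2 and g(n) = n, no witnesses (c, k) can work:
    whatever c we pick, any n past both c and k has n^2 = n * n > c * n."""
    n = max(c, k) + 1          # an n beyond both proposed bounds, with n > c
    assert n * n > c * n       # so f(n) > c * g(n): the definition fails here
    return n

# Even generous choices of c and k break down eventually:
for c, k in [(4, 10), (100, 1), (10**6, 10**9)]:
    print(f"c={c}, k={k} fails at n = {counterexample(c, k)}")
```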

Of course, “Big Omega” can be defined similarly (with the inequality reversed), and “Big Theta” is just the intersection of the two previous sets.
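For completeness, those two definitions can be written out (here in LaTeX notation, mirroring the “Big Oh” definition above term by term):

```latex
f(n) \in \Omega(g(n)) \iff \exists\, c, k > 0 \text{ such that } f(n) \ge c \cdot g(n) \text{ for all } n > k

f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \text{ and } f(n) \in \Omega(g(n))
```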