Big O?

Big O is a gigantic pile of shit. It's a pathetic attempt to recapture the hipness of Cowboy Bebop.

That is all.
 
Here's a link to a massive spoiler page that explains much of the symbolism. There are references to programming, mathematical problems, and religion.

http://www.sixfortyfive.com/bigo/wtf/index.html

Here are the summaries of possible interpretations:

Spoiler
 
Big O notation (with a capital letter O, not a zero), also called Landau's symbol, is a symbolism used in complexity theory, computer science, and mathematics to describe the asymptotic behavior of functions. It indicates how fast a function grows or declines.

Landau's symbol comes from the name of the German number theorist Edmund Landau who invented the notation. The letter O is used because the rate of growth of a function is also called its order.

For example, when analyzing some algorithm, one might find that the time (or the number of steps) it takes to complete a problem of size n is given by T(n) = 4n^2 − 2n + 2. If we ignore constants (which makes sense because those depend on the particular hardware the program is run on) and slower-growing terms, we could say "T(n) grows at the order of n^2" and write: T(n) = O(n^2).
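To see why dropping the constants and slower-growing terms is harmless, here's a quick Python sketch (my own, not from the quoted page): the ratio T(n)/n^2 settles toward the leading constant 4, so n^2 alone captures the growth.

```python
# T(n) = 4n^2 - 2n + 2 from the example above.
def T(n):
    return 4 * n**2 - 2 * n + 2

# The lower-order terms -2n + 2 stop mattering as n grows:
# T(n)/n^2 approaches the leading constant 4.
for n in [10, 100, 1000, 10000]:
    print(n, T(n) / n**2)
```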

In mathematics, it is often important to get a handle on the error term of an approximation. For instance, one may write

e^x = 1 + x + x^2/2 + O(x^3) for x → 0

to express the fact that the error is smaller in absolute value than some constant times x^3 if x is close enough to 0.
For the formal definition, suppose f(x) and g(x) are two functions defined on some subset of the real numbers. We write

f(x) = O(g(x)) as x → ∞
if and only if there exist constants N and C such that
|f(x)| ≤ C |g(x)| for all x > N.
Intuitively, this means that f does not grow faster than g.
If a is some real number, we write

f(x) = O(g(x)) for x → a
if and only if there exist constants d > 0 and C such that
|f(x)| ≤ C |g(x)| for all x with |x-a| < d.
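The first definition is easy to spot-check numerically. Here's a toy Python helper (my own sketch, not from the quoted page) that tests |f(x)| ≤ C|g(x)| on sampled points beyond N, using the T(n) example from earlier with witnesses C = 5, N = 1:

```python
# Toy check of the definition: |f(x)| <= C * |g(x)| for all sampled x > N.
def witnesses_big_o(f, g, C, N, samples):
    return all(abs(f(x)) <= C * abs(g(x)) for x in samples if x > N)

f = lambda n: 4 * n**2 - 2 * n + 2   # the T(n) from the earlier example
g = lambda n: n**2

# 4n^2 - 2n + 2 <= 5n^2 holds for every n >= 1, so C = 5, N = 1 work.
print(witnesses_big_o(f, g, C=5, N=1, samples=range(2, 1000)))  # True
```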

The first definition is the only one used in computer science (where typically only positive functions with a natural number n as argument are considered; the absolute values can then be ignored), while both usages appear in mathematics.

Here is a list of classes of functions that are commonly encountered when analyzing algorithms. The slower growing functions are listed first. c is some arbitrary constant.


notation        name
O(1)            constant
O(log(n))       logarithmic
O((log(n))^c)   polylogarithmic
O(n)            linear
O(n log(n))     sometimes called "linearithmic"
O(n^2)          quadratic
O(n^c)          polynomial, sometimes "geometric"
O(c^n)          exponential
O(n!)           factorial
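Plugging a couple of sizes into each class makes the ordering in the table concrete. A small Python sketch of mine (c = 2 picked arbitrarily for the exponential):

```python
import math

# Evaluate each growth class at n = 10 and n = 20 (same ordering as the table).
classes = [
    ("O(1)",        lambda n: 1),
    ("O(log(n))",   lambda n: math.log(n)),
    ("O(n)",        lambda n: n),
    ("O(n log(n))", lambda n: n * math.log(n)),
    ("O(n^2)",      lambda n: n**2),
    ("O(2^n)",      lambda n: 2**n),
    ("O(n!)",       lambda n: math.factorial(n)),
]

for name, fn in classes:
    print(f"{name:12} n=10: {fn(10):>12.0f}   n=20: {fn(20):>22.0f}")
```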

Note that O(n^c) and O(c^n) are very different. The latter grows much, much faster, no matter how big the constant c is. A function that grows faster than any power of n is called superpolynomial. One that grows slower than an exponential function of the form c^n is called subexponential. An algorithm can require time that is both superpolynomial and subexponential; examples of this include the fastest algorithms known for integer factorization.
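To see how lopsided n^c versus c^n really is, here's a brute-force Python sketch (mine, not from the text) that finds where a modest exponential 2^n finally overtakes the huge polynomial n^100:

```python
# Find the first n where 2^n exceeds n^100. It happens near n = 1000;
# before that point the polynomial is astronomically larger, but the
# exponential wins from there on and never looks back.
n = 2
while n**100 >= 2**n:
    n += 1
print("2^n overtakes n^100 at n =", n)
```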

Note, too, that O(log n) is exactly the same as O(log(n^c)). The logarithms differ only by a constant factor (since log(n^c) = c log(n)), and thus the big O notation ignores that. Similarly, logs with different constant bases are equivalent.

The above list is useful because of the following fact: if a function f(n) is a sum of functions, one of which grows faster than the others, then the faster growing one determines the order of f(n). Example: If f(n) = 10 log(n) + 5 (log(n))^3 + 7 n + 3 n^2 + 6 n^3, then f(n) = O(n^3). One caveat here: the number of summands has to be constant and may not depend on n.
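That fact is easy to watch numerically. A short Python sketch (my own) for the example f(n) above: the ratio f(n)/n^3 approaches the leading coefficient 6, consistent with f(n) = O(n^3).

```python
import math

# f(n) = 10 log(n) + 5 (log(n))^3 + 7n + 3n^2 + 6n^3 from the example.
def f(n):
    L = math.log(n)
    return 10 * L + 5 * L**3 + 7 * n + 3 * n**2 + 6 * n**3

# The fastest-growing summand 6n^3 determines the order: f(n)/n^3 -> 6.
for n in [10, 1000, 100000]:
    print(n, f(n) / n**3)
```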

This notation can also be used with multiple variables and with other expressions on the right side of the equal sign. The notation:

f(n,m) = n^2 + m^3 + O(n+m)
represents the statement:
∃C ∃N ∀n,m > N : f(n,m) ≤ n^2 + m^3 + C(n+m)
Obviously, this notation is abusing the equality symbol, since it violates the axiom of equality: "things equal to the same thing are equal to each other". To be more formally correct, some people (mostly mathematicians, as opposed to computer scientists) prefer to define O(g(x)) as the set of all functions that do not grow faster than g(x), and use set membership notation to indicate that a specific function is a member of the set thus defined. Both forms are in common use, but the sloppier equality notation is more common at present.

Another point of sloppiness is that the parameter whose asymptotic behaviour is being examined is not clear. A statement such as f(x,y) = O(g(x,y)) requires some additional explanation to make clear what is meant. Still, this problem is rare in practice.


Related notations
In addition to the big O notations, another Landau symbol is used in mathematics: the little o. Informally, f(x) = o(g(x)) means that f grows much slower than g and is insignificant in comparison.

Formally, we write f(x) = o(g(x)) (for x → ∞) if and only if for every C > 0 there exists a real number N such that for all x > N we have |f(x)| < C |g(x)|; if g(x) ≠ 0, this is equivalent to lim_{x→∞} f(x)/g(x) = 0.

Also, if a is some real number, we write f(x) = o(g(x)) for x → a if and only if for every C > 0 there exists a positive real number d such that for all x with |x − a| < d we have |f(x)| < C |g(x)|; if g(x) ≠ 0, this is equivalent to lim_{x→a} f(x)/g(x) = 0.
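A concrete little-o pair is log(n) = o(n): the ratio log(n)/n tends to 0, so for any fixed C > 0 it eventually stays below C. A quick Python sketch of mine:

```python
import math

# log(n)/n -> 0, i.e. log(n) = o(n): the ratio shrinks past any fixed C > 0.
for n in [10, 1000, 10**6, 10**9]:
    print(n, math.log(n) / n)
```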


Big O is the most commonly used of five notations for comparing functions:


Notation          Definition                          Analogy
f(n) = O(g(n))    see above                           ≤
f(n) = o(g(n))    see above                           <
f(n) = Ω(g(n))    g(n) = O(f(n))                      ≥
f(n) = ω(g(n))    g(n) = o(f(n))                      >
f(n) = Θ(g(n))    f(n) = O(g(n)) and g(n) = O(f(n))   =
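For positive functions, the Θ row amounts to a two-sided bound: c1·g(n) ≤ f(n) ≤ c2·g(n) for large n. A toy Python spot-check of mine, using n^2/2 + n = Θ(n^2) with witnesses c1 = 1/2, c2 = 2:

```python
# Spot-check the two-sided Theta bound c1*g(n) <= f(n) <= c2*g(n).
def sandwiched(f, g, c1, c2, samples):
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in samples)

f = lambda n: n**2 / 2 + n
g = lambda n: n**2

# n^2/2 + n is Theta(n^2): below 2n^2 and above n^2/2 for every n >= 1.
print(sandwiched(f, g, 0.5, 2, range(1, 1000)))  # True
```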

The notations Θ and Ω are often used in computer science; the lower-case o is common in mathematics but rare in computer science. The lower-case ω is rarely used.


A common error is to confuse these by using O when Θ is meant. For example, one might say "heapsort is O(n log n) in the average case" when the intended meaning was "heapsort is Θ(n log n) in the average case". Both statements are true, but the latter is a stronger claim.

Another notation sometimes used in computer science is Õ (read Soft-O).
f(n) = Õ(g(n)) is shorthand for f(n) = O(g(n) log^k n) for some k. Essentially, it is Big-O, ignoring logarithmic factors.

The notations described here are used for approximating formulas (e.g. those in the sum article), for analysis of algorithms (e.g. those in the heapsort article), and for the definitions of terms in complexity theory (e.g. polynomial time).
 
Nordramor said:
Here's a link to a massive spoiler page that explains much of the symbolism. There are references to programming, mathematical problems, and religion.

http://www.sixfortyfive.com/bigo/wtf/index.html

Here are the summaries of possible interpretations:

Spoiler
Wow, that's really interesting/cool. Kind of makes me wonder why they went for more of the "giant robots fighting" thing and didn't expand more upon the possibly very cool storyline.
 
Big O? I gave up trying to understand that show a long time ago. I don't know wtf is going on anymore- I just make popcorn and watch it. I'm always left wondering what in god's name I just watched, but I love it anyway.

I need to buy a DVD-set or something.. anyone know if they have one out?
 
Notation          Definition                          Analogy
f(n) = O(g(n))    see above                           ≤
f(n) = o(g(n))    see above                           <
f(n) = Ω(g(n))    g(n) = O(f(n))                      ≥
f(n) = ω(g(n))    g(n) = o(f(n))                      >
f(n) = Θ(g(n))    f(n) = O(g(n)) and g(n) = O(f(n))   =

i'm always laughing on the inside when profs write that out
 
Nordramor said:
Here's a link to a massive spoiler page that explains much of the symbolism. There are references to programming, mathematical problems, and religion.

http://www.sixfortyfive.com/bigo/wtf/index.html

Here are the summaries of possible interpretations:

Spoiler

Number One actually seems pretty plausible to me. Thanks for the link, Nordramor.
 
There's an audio clip up on the site I posted. It's one of Schwartzvaltz's monologues. Apparently, Schwartz was the first person to figure out 'the secret' of Paradigm City, and I suppose it kinda drove him mad.
 
Nordramor said:
There's an audio clip up on the site I posted. It's one of Schwartzvaltz's monologues. Apparently, Schwartz was the first person to figure out 'the secret' of Paradigm City, and I suppose it kinda drove him mad.
Which guy was Schwartz again? Was that the guy who thought he was "god" or was that the old guy?
 
Ben Reed said:
Number One actually seems pretty plausible to me. Thanks for the link, Nordramor.

I think both are equally plausible given the rest of the show's context but we'd need a season three to figure it out. ;)
 
FalseMyrmidon said:
Which guy was Schwartz again? Was that the guy who thought he was "god" or was that the old guy?

Schwarzwald was the guy in all the bandages. He figured prominently in Season One.
 