
Rarefied Heights Part Two, Plus Some Random Short Takes

A few short takes before I get into the actual subject of today's entry.

***************************************************************************

Kids Today

My teenaged cousin has apparently accidentally reinvented pseudolocalization.  I was chatting with him the other day and I noticed that his MSN Instant Messenger screen name was ŁıppэяŦ.

D00D, W00T! Kids today have it so easy. Back in my day we didn't have Unicode; when we wanted to be 3L1T3 on a BBS we had to be content with using numbers in the place of vowels, aggressive use of capitalization, and deliberate misspellings.

***************************************************************************

Same To You, Buddy

This true story happened, oh, about fifteen minutes ago.

I am walking into building 41. A guy is sitting by the front door. I smile and nod as I pass by. The guy says in response "You know what you should do? I'll tell you what you should do!"

I shoot him an extremely quizzical and rather startled look. Then I realize that he is in fact on a hands-free cellphone and only appears to be responding to me.

I'm still not used to modern technology.

***************************************************************************

A Natural Theory Of Programming Languages

Stanley Lippman has a wonderful post today that crams the history, theory and practice of language development into a small space. It's called Towards a Natural Theory of Programming Languages. The key takeaway is this: programming languages are tools built at a specific time to solve a specific class of problems on a specific class of hardware. Many flame wars could have been ended by both sides recognizing this rather basic fact. I'm looking forward to more posts in this vein, as I think there is a lot more to say on the subject.

***************************************************************************

Rarefied Heights Revisited

I miss mathematics. My degree was a joint degree in both applied mathematics and computer science, but over the last few years I've gradually been losing the math skills as they've gone unused. I think I might delve into some of the theoretical-yet-recreational aspects of mathematics in computer science from time to time. (And just to be clear, by mathematics I mean coming up with proofs of theorems, not solving practical problems -- that's arithmetic.)

Case in point: a while back I pulled the weak form of Stirling's Approximation out of thin air in order to justify the claim that any comparison sort of n elements requires O(n log n) comparisons.

n! > (n/e)^n

This is quite weak. The strong form of Stirling's Approximation is

n! = √(2nπ) (n/e)^n (1 + O(1/n))

Note the "big O" notation in there. We're saying that we're not sure exactly how close this approximation is, but that the approximation gets really good on a percentage basis for big values of n.
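If you want to see how good "really good" is, here's a quick Python sanity check I'm adding purely for illustration; nothing in it is load-bearing for the argument:

    import math

    # Compare n! against the weak bound (n/e)^n and against the strong
    # approximation sqrt(2*pi*n) * (n/e)^n for a handful of values of n.
    for n in (5, 10, 50, 100):
        exact = math.factorial(n)
        weak = (n / math.e) ** n
        strong = math.sqrt(2 * math.pi * n) * (n / math.e) ** n
        print(f"n={n:4}  n!/weak={exact / weak:8.2f}  n!/strong={exact / strong:.6f}")

The n!/strong ratio creeps towards one as n grows, which is just what that (1 + O(1/n)) factor promises; the weak bound, by contrast, is off by a factor that grows like √(2nπ).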

I feel bad about pulling that result out and using it with no justification whatsoever. Proving the strong form is kind of a pain in the rear but proving the weak form is really easy. Since that's all we need, just for the heck of it let's prove the weak form.

Instead of proving the inequality directly, we'll prove that

ln n! > ln (n/e)^n

Which I'm sure you'll agree is basically the same thing -- if one quantity is larger than another, then its logarithm is larger than the other's logarithm, and vice versa.

Let's start with the left side. Clearly

ln n! = ln 1 + ln 2 + ln 3 + ... + ln n   

which is a little cumbersome to write, so I'll use big-sigma notation for the sum.

= Σln x, for x from 1 to n.
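If you don't believe that the log of a factorial is the sum of the logs, it's a quick Python check -- again, purely illustrative:

    import math

    n = 10
    print(math.log(math.factorial(n)))                 # ln n!
    print(sum(math.log(x) for x in range(1, n + 1)))   # ln 1 + ln 2 + ... + ln n

Both lines print the same number, give or take a little floating-point noise.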

This next step is a bit of a hand-wave -- I want to use the fact that this sum is the area of a "stair step" function, and that the area of a stair-step function is always larger than the area under a function which it dominates. Rather than proving that formally, I'll give a graphical justification.

Let's consider n = 4. I've graphed out ln x, and I'm sure that you will agree that this sum is equal to the total area bounded by the three red boxes. (ln 1 is zero, so we can ignore it -- its "red box" would have zero height and hence zero area.)

[Figure: graph of ln x for n = 4, with the red stair-step boxes sitting above the blue area under the curve]

Let's call the area computed by adding together a bunch of rectangles the "quadrature". Clearly the quadrature is larger than the (blue) area under the curve not just for n = 4 but for all n. We can compute the blue area by integrating, so let's declare this inequality.

Σln x >= ∫ln x dx, for x from 1 to n.
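Before we do the calculus, we can spot-check that inequality numerically. Here's a little Python sketch of mine; the crude midpoint Riemann sum (and its made-up name integral_of_ln) is just a stand-in for the exact integral:

    import math

    def integral_of_ln(a, b, steps=100000):
        # Approximate the integral of ln x from a to b with a midpoint Riemann sum.
        width = (b - a) / steps
        return width * sum(math.log(a + (i + 0.5) * width) for i in range(steps))

    for n in (4, 10, 100):
        quadrature = sum(math.log(x) for x in range(1, n + 1))
        blue_area = integral_of_ln(1, n)
        print(n, quadrature, blue_area, quadrature >= blue_area)

The quadrature comes out on top every time, and by a wider margin as n grows.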

The antiderivative of ln x is x ln x - x, so we have

ln n!
= Σln x
>= ∫ln x dx
= (n ln n - n) - (1 ln 1 - 1)
= n ln n - n + 1
> n ln n - n
= ln (n/e)^n

And we're done! The weak form of Stirling's approximation is justified. 
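If you'd rather have a computer do the calculus, and you happen to have Python's sympy package installed (this is strictly an optional cross-check on the integration step, not part of the proof), it will confirm the antiderivative business:

    import sympy

    x, n = sympy.symbols("x n", positive=True)

    # The definite integral of ln x from 1 to n should come out to n ln n - n + 1.
    integral = sympy.integrate(sympy.log(x), (x, 1, n))
    print(integral)
    print(sympy.simplify(integral - (n * sympy.log(n) - n + 1)))   # 0 if they agree

The second line printing zero is sympy's way of agreeing with us.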

Didn't that feel good? How often do you get to use calculus to prove a fact about the performance characteristics of sorting lists?
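Pretty often, actually, if you're willing to instrument the sort. Here's one last Python sketch of my own -- not part of the argument above, just a sanity check -- that counts the comparisons a garden-variety merge sort performs and holds the count up against the ceil(log2 n!) worst-case lower bound that Stirling's approximation lets us estimate as roughly n log n:

    import math
    import random

    def merge_sort(items, counter):
        # A standard top-down merge sort that tallies element comparisons in counter[0].
        if len(items) <= 1:
            return items
        mid = len(items) // 2
        left = merge_sort(items[:mid], counter)
        right = merge_sort(items[mid:], counter)
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            counter[0] += 1
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    for n in (10, 100, 1000):
        counter = [0]
        merge_sort([random.random() for _ in range(n)], counter)
        lower_bound = math.ceil(math.log2(math.factorial(n)))
        print(f"n={n:5}  merge sort comparisons={counter[0]:6}  ceil(log2 n!)={lower_bound:6}")

Both columns grow like n log2 n. The bound is on the worst case, so a lucky run on a small list can occasionally sneak under it, but no comparison sort can beat it in general.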

Comments

  • Anonymous
    June 02, 2004
    I miss proving things too. It feels like all I ever do these days is google my way through MSDN.
  • Anonymous
    June 02, 2004
    I miss math.
  • Anonymous
    June 02, 2004
    "a while back I pulled the weak form of Stirling's Approximation out of thin air in order to justify the claim that any comparison sort of n elements requires O(n log n) comparisons."

    Here is a similar problem, and a solution to it.

    Claim: The best algorithm that can ever be found for sorting can only be O(n log n)!

    Hand waving proof:

    Hmm.... Let's see...

    Sorting n numbers is like picking one permutation out of all possible permutations of n numbers.

    There are n! permutations possible given a sample of n numbers.

    The algorithm with the best worst-case complexity for "picking an item out of n items", i.e., search, is binary search. Its complexity is O(log n) for a sample size of n.

    If we could apply binary search to our little problem, then the best one could do for sorting is O(log n!).

    From Stirling's approximation, O(log n!) ~ O(n log n).

    Thus, the best sorting algorithm will have a worst-case complexity of O(n log n), and we can't do any better than that!

    It is nice to think of mathematics once in a while...
  • Anonymous
    June 02, 2004
    If by "similar" you mean "exactly the same", then, yes. That's exactly the argument I made.
  • Anonymous
    June 02, 2004
    To eliminate the appeal to pictures:

    Choose an integer n > 1.

    ln is strictly increasing so ln n >= ln x for x in [n-1,n].

    Thus, int_{n-1}^n ln n dx >= int_{n-1}^n ln x dx.

    But int_{n-1}^n ln n dx is just (n - (n - 1)) * ln n = ln n.

    So ln n >= int_{n-1}^n ln x dx.

    Therefore sum_{x=2}^n ln x >= int_{1}^n ln x dx.

    And since ln 1 = 0, the same is true for sum_{x=1}^n ln x.
  • Anonymous
    June 02, 2004
    Indeed, though I think the picture communicates the result more clearly than the text.

    We could go on to justify each of your handwaves further.

    How do we know that if f dominates g, the integral of f dominates the integral of g? That has to be proven.

    How do we know that it's legal to roll up all those partial integrals into one integral? That's got to be proven as well.

    However, my aim is to get across the spirit of the argument rather than a letter-perfect proof.

    As an exercise for the reader, you can use similar techniques to prove another weak form of Stirling's approximation -- take the other side and find an upper bound on n! -- can you prove that

    n! < e n (n/e)^n

    ?
  • Anonymous
    June 03, 2004
    You say "Clearly the quadrature is larger than the (blue) area under the curve not just for n = 4 but for all n" when I've had more than one student try to argue the opposite based on pictures. It's amazing what that picture can look like for n=100 on the 80x80 display of a graphing calculator.
  • Anonymous
    June 03, 2004
    Hey, the "clearly" in there is supposed to prevent that objection. Isn't that the first rule of writing a convincing proof? Mark all dubious steps with "clearly", "obviously", "it follows immediately", or my favourite "as any moron can see", and you're golden!

    :-)