Division is too hard: can we get rid of it? :)
Just a bit of philosophical rumination on the most basic math operators as they relate to (graphics) programming. I don't know if this stuff is all obvious, laborious, redundant, or maybe incorrect. Or if there's anything interesting here. Let me know in the comments. :)
First the easy ones: addition seems to be about combining. And subtraction about separating.
Multiplication. Since I haven't thought carefully enough about math for much of my life, I suspect that there was a long time during which I thought of multiplication as making things bigger, or increase. It's used that way as a synonym for animal reproduction, and in economics (the money multiplier) and other sciences, to imply growth.
In graphics programming, I soon noticed that multiplication is great for scaling things (and since then I've read that multiplication is scaling). Scaling is about making things bigger or smaller. Imagine a unit cube with a corner at (0,0,0). You can scale it uniformly up and down by multiplying its vertex coordinates by some number S.
When it comes to multiplication (at least), there's something special about the number 1. It's special because if S > 1 then the cube scales up, if S < 1 then the cube scales down, and if S == 1 then multiplication has no effect and arguably there is no scaling, or you have "identity scaling", which I may have just made up but it amounts to the same thing.
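To make that concrete, here's a small sketch in C++ (the Vec3 struct and the function name are just made up for illustration, not anyone's real API): scale every vertex coordinate of the cube by S, and watch what S > 1, S < 1 and S == 1 do.

```cpp
#include <array>
#include <cstdio>

// Hypothetical minimal vertex type, just for this sketch.
struct Vec3 { double x, y, z; };

// Uniform scale: multiply every coordinate of every vertex by S.
void scaleUniform(std::array<Vec3, 8>& verts, double S)
{
    for (Vec3& v : verts) {
        v.x *= S;
        v.y *= S;
        v.z *= S;
    }
}

int main()
{
    // Unit cube with one corner at the origin.
    std::array<Vec3, 8> cube = {{
        {0,0,0}, {1,0,0}, {0,1,0}, {0,0,1},
        {1,1,0}, {1,0,1}, {0,1,1}, {1,1,1}
    }};

    scaleUniform(cube, 2.0);   // S > 1: the cube gets bigger
    scaleUniform(cube, 0.5);   // S < 1: the cube gets smaller (back to unit size here)
    scaleUniform(cube, 1.0);   // S == 1: "identity scaling", no effect

    std::printf("far corner: %g %g %g\n", cube[7].x, cube[7].y, cube[7].z);
    return 0;
}
```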
Outside of graphics, multiplication feels natural and extremely useful when, for example, scaling between a normalized value [0-1] in a data model and a much wider range on a UI slider, say [0-100]. Doing that scaling in the getter and setter (and raising a property change notification at the same time) nicely encapsulates it.
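Here's roughly what I mean, as a throwaway sketch (the class name and the callback are invented; the std::function is just a stand-in for a real property change notification): the model stores [0,1], and the getter and setter do the scaling to and from [0,100].

```cpp
#include <cstdio>
#include <functional>

// Hypothetical property wrapper: the data model stores a normalized value
// in [0,1], while the UI sees slider units in [0,100].
class BrightnessProperty
{
public:
    std::function<void()> onChanged;      // stand-in for a property change notification

    double normalized() const  { return normalized_; }
    double sliderValue() const { return normalized_ * 100.0; }   // model -> UI

    void setSliderValue(double uiValue)
    {
        normalized_ = uiValue * 0.01;     // UI -> model: multiply by the reciprocal of 100
        if (onChanged) onChanged();       // tell whoever is listening
    }

private:
    double normalized_ = 0.0;
};

int main()
{
    BrightnessProperty p;
    p.onChanged = [] { std::puts("property changed"); };

    p.setSliderValue(75.0);               // the UI writes 75
    std::printf("model stores %g, UI reads %g\n", p.normalized(), p.sliderValue());
    return 0;
}
```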
The one thing that you can't affect by scaling is 0. You can't scale zero. When it comes to zero, 1 loses its specialness. That idea might be worth returning to.
In the past I've been even more confused about division than about multiplication. I used to think about division as making things smaller, or decrease. Dividing by D > 1 does indeed make a thing smaller, but dividing by D < 1 makes it bigger. And division by D == 1 has no effect. There's that special number 1 again.
So division is multiplication-by-the-reciprocal (and multiplication is division-by-the-reciprocal). That makes division a form of scaling too, but a kind of through-the-looking-glass scaling that doesn't appeal to me. So, in my mental model of math, what if I eliminate division in general (whether or not I can eliminate the operator from my code) and replace it with multiplication-by-the-reciprocal? But the reciprocal is itself a special case of division, so what can I do about that?
The reciprocal is 1/x, but it's such a special case of division that I wonder whether you really need such a general notational device as division to represent it. For now, let's just say recip(x) instead. If you draw a graph of y = recip(x), you see that the graph avoids the x == 0 line like the plague. So recip(0) is undefined, but recip of any other x is fine, which is good to know. It's also interesting to note that the product of a number and its reciprocal is 1, so x * recip(x) == 1. The reason I find that interesting is that if you want to scale x to make it unit size, you just scale it by recip(x). And scaling something to make it unit size is very common in graphics programming: it's called normalizing the thing. For example, what you do to a vector to make it unit length is called normalizing it, and it's done by scaling each coordinate by the reciprocal of the vector's length. Once things are normalized in the graphics world, they become heaps more useful.
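In code that might look something like the sketch below (the Vec3 type and the recip helper are illustrative names I'm making up here, not any library's API). Normalizing scales each coordinate by recip(length), and recip(0) being undefined is exactly why you have to guard against zero-length vectors.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };             // hypothetical minimal vector type

double recip(double x) { return 1.0 / x; }   // undefined at x == 0 (IEEE gives infinity)

double length(const Vec3& v)
{
    return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
}

// Scale each coordinate by the reciprocal of the length, so the result has length 1.
Vec3 normalized(const Vec3& v)
{
    const double len = length(v);
    // recip(0) is undefined, so a zero-length vector can't be normalized; just return it.
    if (len == 0.0) return v;
    const double s = recip(len);
    return { v.x * s, v.y * s, v.z * s };
}

int main()
{
    Vec3 v = { 3.0, 0.0, 4.0 };              // length 5
    Vec3 n = normalized(v);
    std::printf("%g %g %g (length %g)\n", n.x, n.y, n.z, length(n));
    return 0;
}
```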
If using multiplication to scale works for you, then use it and you're done. If you want to scale down by the proportion that multiplying by S scales up, or scale up by the proportion that multiplying by S scales down, then multiply by recip(S). That way you're always using multiplication to scale, multiplication stays the same concept as scaling, and the mental model is simple. Another notation for 1/S is S⁻¹, so when it comes to writing code you could either write 1/S (but read that in your mind not as division but as recip(S)), or write pow(S,-1) so that you never need to use the division operator, and division isn't a concept any more. More generally, instead of dividing by D you can multiply by the reciprocal of D and thus avoid the division operator in every case. It's true that A / D is a little simpler and shorter as notation than A * pow(D, -1), but it does require having division in your mental model and, as I think I'm implying, I'm not entirely sure that I know, philosophically speaking, what division means. But scaling by a reciprocal, I think maybe I do.
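A tiny sketch of that equivalence, using nothing beyond the standard <cmath> header: the three lines below compute the same scaled value, and only the first spells out a division.

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    const double A = 12.0;
    const double D = 4.0;

    const double byDivision   = A / D;                  // classic division
    const double byReciprocal = A * (1.0 / D);          // read 1.0 / D as recip(D)
    const double byPow        = A * std::pow(D, -1.0);  // D to the power -1: no division operator

    std::printf("%g %g %g\n", byDivision, byReciprocal, byPow);  // all print 3
    return 0;
}
```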
The number 1 is showing up a lot, sometimes as -1, sometimes as the notion of normalizing. And I find it really interesting to think of the reciprocal as both the tool you use to normalize a value, and the tool you use to make multiplication the only scaling operator you need.
So far I've been thinking in terms of how the magnitude of the output relates to the magnitude of the input. But there's more to this than just making a value bigger or smaller. Division can sometimes be seen as a counting operation. If my hens lay a hundred eggs, how many dozen-egg boxes will that make? Is that the same as scaling? My mental model, what I mean in this case by 100 / 12, is that I'm calculating a count. The fact that I'm scaling 100 down to some smaller value feels incidental. Similarly, the number of pints in a gallon is conceptually about counting, even though it involves dividing the gallon into parts. So G / 8 feels right, but G * 1/8, G * 8⁻¹, and G * 0.125 don't.
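For what it's worth, here's how that counting reading looks as code (a throwaway sketch, nothing more): integer division answers "how many boxes?", and the remainder is a question that scaling never even asks.

```cpp
#include <cstdio>

int main()
{
    // Counting, not scaling: how many dozen-egg boxes do 100 eggs fill?
    const int eggs      = 100;
    const int fullBoxes = eggs / 12;   // 8 boxes: integer division as counting
    const int leftOver  = eggs % 12;   // 4 eggs left over, a purely counting-style leftover

    std::printf("%d full boxes, %d eggs left over\n", fullBoxes, leftOver);
    return 0;
}
```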
Comments
Anonymous
November 10, 2014
Thx for an intriguing post. It's tempting to try to extend the idea to 'subtraction can be thought of as some sort of variation on addition', but that would be a mistake I think, because the notions of 'giving' and 'taking away' are very different in their parallels in the physical world, which, as you point out towards the end, is not the case for division. That being said, division does seem to have a meaning/purpose in the following scenario: if I have a circular cake to share equally amongst 'n' people, how much does each person get if the entire cake is used up?
Anonymous
November 10, 2014
Yes, I think "dividing" is the right word to use to describe what one's doing with the cake. That's probably why I always thought (as a kid) that division was about diminishment. Say, you divide the cake into 3 pieces, I also have a suspicion that there's something wrong with counting each of those 3 pieces of cake as 1. Each piece contributes 1 to the total of 3, but they're not all identical: they can't be. Similarly, if I have 3 peanut MnMs (which I frequently do), they're not all exactly the same size, some are slightly bigger, some slightly smaller. How long are three MnMs meant to measure physically? If they don't match that, then which MnM's fault is it for not being the right size, and how unfair is it to allocate each one a contribution of a uniform (and kind of meaningless) 1 when you're not taking into account its idiosyncratic size in your expectation of sum length? :)