
Re: [Phys-L] widget rate puzzle ... reasoning, scaling, et cetera



On 12/31/2014 01:40 PM, Philip Keller wrote:

[scaling arguments]

Should I be embarrassed to admit that this was the first method I
used to get the answer?

No!

I am reminded of the Perl motto:
TIMTOWTDI.
There Is More Than One Way To Do It.

If this group collectively could come up with only one
way of doing it, /that/ would have been embarrassing.

===========

Here is how my friend Sagredo would formulate the scaling
argument. It's the same idea as others have suggested,
just with slightly fancier language and a bit more detail:

<quote>
First we invoke the pedagogical proverb that says
"learning proceeds from the known to the unknown."
In that spirit, we start by reviewing stuff that
all the students are "supposed" to know. We start
with distance = rate * time. If we divide both
sides by time, we get

distance
---------- = rate
time

We now focus attention on situations where rate is
constant, and consider two different instances of
the problem:

5 miles
------------- = (some rate)
5 minutes

100 miles
--------------- = (the same rate)
100 minutes

So we see that distance over time is an /invariant/.
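
Just to make the invariance concrete, here is a tiny
Python sketch (the helper name "rate" is mine, purely
for illustration):

  def rate(distance_miles, time_minutes):
      # distance over time: the invariant for constant-speed motion
      return distance_miles / time_minutes

  # the two instances above give the same rate
  assert rate(5, 5) == rate(100, 100) == 1.0   # 1 mile per minute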

Here's another motto: "Always be on the lookout for
invariants." This is a fundamental problem-solving
skill that you can use over and over and over.

Moving right along, let's consider something with more
variables, perhaps a recipe for raisin bread. Here is
an oversimplified version:
1 cup milk
1 cup water
10 cups flour
1 cup raisins
1 oz powdered cinnamon
10 g yeast
et cetera.

Now if we have two instances of the recipe, one of which
is scaled up by some factor, we find:
++ The ratio of milk to water is the same in both instances.
That's an invariant.
++ The ratio of milk to flour is the same in both instances.
That's also an invariant ... not the same invariant, but
still an invariant.
++ And so forth. If there are N ingredients, there are
N(N-1)/2 invariant ratios, one for each pair of ingredients,
as the little sketch below confirms.
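
Here is one way to spell that out numerically; the
amounts are the ones above (in the recipe's own units),
and the scale factor 3 is arbitrary:

  import math
  from itertools import combinations

  # oversimplified raisin-bread recipe, amounts in the recipe's own units
  recipe = {"milk": 1, "water": 1, "flour": 10, "raisins": 1,
            "cinnamon": 1, "yeast": 10}

  def pairwise_ratios(r):
      # one ratio for each pair of ingredients: N(N-1)/2 of them
      return {(a, b): r[a] / r[b] for a, b in combinations(sorted(r), 2)}

  k = 3                                    # scale the whole recipe up by 3
  scaled = {name: k * amount for name, amount in recipe.items()}

  r1, r2 = pairwise_ratios(recipe), pairwise_ratios(scaled)
  assert all(math.isclose(r1[pair], r2[pair]) for pair in r1)
  print(len(r1))                           # 6 ingredients --> 6*5/2 = 15 ratios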

Moving right along, let's consider today's question:

It takes 5 minutes for 5 machines to make 5 widgets.
So, how many minutes does it take for 100 machines
to make 100 widgets?

That is "almost" a rate * time problem, but it involves
a generalized notion of rate. The rate is the number of
widgets per machine *and* per unit time:

# widgets
------------------- = (generalized rate)
# machines * time

Now things get interesting, because scaling up everything
on the LHS by the same factor does not leave the rate
unchanged. We want the rate to be constant, so we need
to come up with another way of scaling things.
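
Spelled out: if we multiply the number of widgets, the
number of machines, and the time all by the same factor
k, the generalized rate gets divided by k:

        k * (# widgets)             1      # widgets
  ------------------------------  = -  *  -------------------
  (k * # machines) * (k * time)     k      # machines * time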

Instead, it suffices to scale the number of widgets with
the number of machines, leaving the time alone. Assuming
everything is nice and linear (or more properly speaking,
bilinear), that leaves the rate unchanged.
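
A small numerical sketch of that, with hypothetical helper
names (generalized_rate, time_required) chosen purely for
illustration:

  def generalized_rate(widgets, machines, minutes):
      # widgets per machine per minute
      return widgets / (machines * minutes)

  def time_required(widgets, machines, rate):
      # solve  widgets = rate * machines * minutes  for the time
      return widgets / (rate * machines)

  rate = generalized_rate(5, 5, 5)      # 5/(5*5) = 0.2 widget per machine-minute
  print(time_required(100, 100, rate))  # 5.0 minutes -- not the "raisin bread" 100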

At this point we begin to see why this problem is
notoriously deceptive. On first hearing it "sounds
like" a recipe scaling problem ... but really it isn't.
Rather than having N(N-1)/2 scaling laws involving two
variables apiece, we have one big scaling law involving
all N variables in nontrivial combinations.

One more proverb: "Check the work."

No problem is so easy that it is exempt from checking.
If it really is an easy problem, it should be easy to
check, so there is no excuse for not checking. If it
is a hard problem, there is a greater risk of mistakes,
so once again there is no excuse for not checking.

If you are doing something that is in principle absolutely
guaranteed to work (such as long division) check the work
anyway. (Multiply the quotient by the divisor and make
sure you recover the dividend.) If you are doing something
that is not guaranteed to work, such as arguing by analogy,
it is double-especially super-important to check the work.
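
For instance, here is a toy Python check in that spirit
(unrelated to the widget puzzle):

  dividend, divisor = 1457, 23
  quotient, remainder = divmod(dividend, divisor)     # 63 remainder 8
  assert quotient * divisor + remainder == dividend   # recover the dividend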

In this case, here is one incisive check: Does it make
sense that the time required should scale in direct
proportion to the number of machines in use? If not,
it tells you that this is not a simple "raisin bread"
scaling problem. At this point you should perceive
all sorts of red flags and alarm bells, telling you
that the analogy has failed. The problem has moved
from the "simple" category to the "seemingly simple but
viciously deceptive" category. Probably you are good
at doing raisin-bread type scaling in your head, but
this isn't that sort of problem, so maybe you shouldn't
be doing it in your head at all. Maybe it's time to
drag out the heavy artillery, or at least pencil and
paper, and proceed very cautiously.
</quote>

==========

Speaking for myself now: There remains another aspect
to this story that nobody has properly dealt with so
far. Even my friend Sagredo blew right past it. Can
anybody see what it is?