
[Phys-L] Re: Division via subtractions



Below is my rather computer-savvy, self-taught (pardon my French) reply to:


This is near the end of a long thread. Do you do subtraction [sic; he
meant division], and, if so, by subtraction?

bc

Stefan Jeglinski wrote:

Although I agree with most of what Denker has said on this topic, I take
some offense at his statement "...[division by subtraction] as a
practical algorithm it is laughable..."

I have stated that I have used division by subtraction in some
assembly-language programming, and John's statement could be construed
to mean that I was pretty stupid to do so.



Indeed, the two modern Digital Signal Processors I have worked with
both do division by subtraction. I doubt either TI or Analog Devices
would consider it laughable. I think if they had a better way to do
it, they would use it, as it might put substantially more money in
their pockets. I hesitate to say that perhaps all processors do it
this way at the machine level, because I don't want to be accused of
Bold Assertion Without Proof.

Of course, division is extremely expensive on a DSP because of this
method. Typically, a 16-bit division takes 16 clock cycles, whereas a
multiplication takes one clock cycle and can include an accumulation
if you like. Thus, if you can get away with some pre-processing,
division is done (much) faster by multiplying by a "magic" number and
down-shifting.
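
To make that concrete, here is a minimal C sketch of the magic-number
trick for unsigned 16-bit division by the constant 26 (the divisor
from the example further down). The constant 20165 = ceil(2^19 / 26)
and the shift of 19 are my own illustrative choices, checked by brute
force below, not values taken from any particular DSP toolchain.

#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Divide an unsigned 16-bit value by 26 with one multiply and one
 * shift.  20165 = ceil(2^19 / 26), and the pair (20165, 19) gives
 * the exact quotient for every 16-bit input. */
static uint16_t div26(uint16_t x)
{
    return (uint16_t)(((uint32_t)x * 20165u) >> 19);
}

int main(void)
{
    /* Exhaustively confirm the magic-number quotient matches x / 26. */
    for (uint32_t x = 0; x <= 0xFFFFu; x++)
        assert(div26((uint16_t)x) == x / 26u);
    printf("50000 / 26 = %u\n", div26(50000u)); /* prints 1923 */
    return 0;
}

The pre-processing, of course, is finding the magic constant and
shift for a given divisor ahead of time.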


Stefan Jeglinski




XX wrote:

X!

This is near the end of a long thread. Do you do subtraction, and,
if so, by subtraction?


If you had to divide, for example, 5,000,000 by 26, it would take
FOREVER to subtract 26 from 5,000,000 the required 192,307 times. No
one who was in a hurry would do division this way. But if you have
time, it's trivial to implement. If your numbers were always going to
be small, so that the iterations were few, you might choose to do
division this way to save programming time and code space.
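
For what it's worth, here is what that trivial implementation might
look like in C; the function name and interface are mine, not from
any particular codebase.

#include <stdint.h>
#include <stdio.h>

/* Naive division: subtract the divisor until it no longer fits,
 * counting the subtractions.  Runtime grows with the quotient, so
 * 5,000,000 / 26 really does loop 192,307 times. */
static uint32_t div_by_subtraction(uint32_t dividend, uint32_t divisor,
                                   uint32_t *remainder)
{
    uint32_t quotient = 0;
    while (dividend >= divisor) {
        dividend -= divisor;
        quotient++;
    }
    *remainder = dividend;
    return quotient;
}

int main(void)
{
    uint32_t r;
    uint32_t q = div_by_subtraction(5000000u, 26u, &r);
    printf("5000000 / 26 = %u remainder %u\n", q, r); /* 192307 r 18 */
    return 0;
}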

But faster algorithms still use subtraction, because chips don't
commonly have a divide instruction. There are ways to compress the
number of subtractions needed. Think of hand division: you don't
divide 26 into 5,000,000 in one step; you first divide 26 into 50,
and so on. The same goes for software.
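
Here is a sketch of that idea in C, using base 2 instead of base 10:
classic shift-and-subtract (restoring) division does one trial
subtraction per bit, so 32 steps replace 192,307. This is a generic
illustration, not the routine any particular chip uses.

#include <stdint.h>
#include <stdio.h>

/* Long division in binary: bring down one bit of the dividend at a
 * time, and subtract the divisor whenever it "goes into" the partial
 * remainder.  One conditional subtraction per bit, 32 in all.
 * (Assumes divisor > 0 and divisor < 2^31 so the shift can't
 * overflow the partial remainder.) */
static uint32_t shift_subtract_div(uint32_t dividend, uint32_t divisor,
                                   uint32_t *remainder)
{
    uint32_t quotient = 0, rem = 0;
    for (int bit = 31; bit >= 0; bit--) {
        rem = (rem << 1) | ((dividend >> bit) & 1u); /* bring down a bit */
        if (rem >= divisor) {       /* does the divisor "go into" it? */
            rem -= divisor;
            quotient |= 1u << bit;
        }
    }
    *remainder = rem;
    return quotient;
}

int main(void)
{
    uint32_t r;
    uint32_t q = shift_subtract_div(5000000u, 26u, &r);
    printf("5000000 / 26 = %u remainder %u\n", q, r); /* 192307 r 18 */
    return 0;
}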

To say that division by subtraction is laughable is silly because
there are times when it's appropriate. It's also naive, because faster
algorithms still use subtraction as part of the process. Machine code
doesn't have that many operations to choose from, and division is not
usually one of them.