
Re: Contour maps, etc.



On 5 May 1997 Roger A. Pruitt, referring to contour mapping, wrote:

... What I would like to know is the algorithm used to do this.
I thought that was what you were after too, Ludwik.
.......................................................................
Yes, I was, and I now have a good reference (see the added note). But instead
of first learning how the problem was solved by somebody else, I wanted to
find a solution by myself. Who said that teachers should not enjoy
"learning by discovery"? The more I think about contour mapping the
less complicated it becomes. It was just the opposite at the beginning;
I am a slow thinker. Today the subject became complicated again; I started
reading references borrowed from a geologist.

My most important step was to change a continuous Z(x,y) distribution
(an array over a mesh grid of chosen cell sizes) into a discrete one,
according to a desired step size in Z. Once this is done, a contour is
naturally defined by those grid locations at which sudden changes of
Z take place. What can be simpler than this? You can draw the
contours, one after another, starting, for example, with the highest.
That is what my program does for mathematically given functions.
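
In case that step sounds more mysterious than it is, here is a rough
sketch of it (Python with numpy assumed; an illustration of the idea
only, not a copy of my program):

import numpy as np

def quantize(Z, dz):
    # Replace continuous Z values by discrete level indices.
    return np.floor(Z / dz).astype(int)

def contour_cells(levels):
    # Mark grid cells where the discrete level jumps between neighbors.
    mask = np.zeros(levels.shape, dtype=bool)
    mask[:-1, :] |= levels[:-1, :] != levels[1:, :]   # jumps between rows
    mask[:, :-1] |= levels[:, :-1] != levels[:, 1:]   # jumps between columns
    return mask

# A mathematically given "hill" on a 63 by 63 mesh, step size 0.1 in Z.
x = np.linspace(-2.0, 2.0, 63)
X, Y = np.meshgrid(x, x)
Z = np.exp(-(X**2 + Y**2))
cells = contour_cells(quantize(Z, dz=0.1))

Every marked cell belongs to some contour; grouping the marked cells by
their level index, from the highest level down, gives the contours one
after another.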

One should recognize a big difference between functions Z(x,y) which are
defined by formulas (Z can be calculated for every cell) and those based
on experimental data (many cells are empty). In the first case the "error
bars" (visualized as contour line thicknesses) can often be as small as
a single cell. Pathological cases may develop for extremely steep slopes,
but one can deal with this by using smaller cells, ultimately as small
as the distance between adjacent pixels on the display device.

Scarcity of experimental data, however, can be a source of additional
errors. The amount of data does not have to be "the maximum possible" but it
should be "statistically significant" in each region. Interpolations in
three dimensions, as previously mentioned, can be quite uncertain. If I
had to reinvent them I would first try the so-called "spline method" on
a plane-by-plane basis. (Preliminary mapping of the raw data would be used
to choose a suitable set of parallel planes for that purpose.)
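
If I were to code the spline idea, one plane at a time, it might look
roughly like this (a Python sketch assuming scipy; the sensor readings
below are made up):

import numpy as np
from scipy.interpolate import CubicSpline

# Each chosen plane y = const carries a few measured (x, Z) points;
# a cubic spline fills in Z along that plane on a regular x mesh.
x_mesh = np.linspace(0.0, 10.0, 63)

planes = {   # invented sensor readings on two parallel planes
    1.0: ([0.0, 2.5, 5.0, 7.5, 10.0], [0.1, 0.9, 1.0, 0.8, 0.2]),
    2.0: ([0.0, 3.0, 6.0, 10.0],      [0.0, 0.7, 0.9, 0.1]),
}

filled = {}
for y, (xs, zs) in planes.items():
    spline = CubicSpline(xs, zs)
    filled[y] = spline(x_mesh)   # interpolated Z along the plane y = const

A second pass of splines across the planes, at each mesh value of x,
would then fill the whole grid.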

Let me illustrate this with two situations. (I hope the asterisk patterns
below will be displayed as a single screen page on your terminal, as they
are when printed.) Both patterns refer to a "rectangular roof" of a
discrete Z distribution at the highest level. The left picture consists
of complete data from a formula (Z values are assigned to all mesh points)
while the second consists of limited data from sensors; the sensor points
are repeated on the right with labels. One and only one contour
is defined by the borders of the first picture.

****************    *        **    *    1        IH    G
****************    *              *    2              F
****************    *                   3
****************    *    *              4    5
****************    *                   6
****************    *                   7
****************    *          * *      8          D E
****************    *            *      9            C
****************    *              *    A              B

But the second picture can be interpreted in many ways. Only one of them
(1ABG1 on the right side) is correct; others, such as 123456789ABCDEFGHI1,
are acceptable but not correct. Only lines with crossings would be
unacceptable in this case. Even a human genius cannot infer the real
contour line from the incomplete data shown above. She can "interpolate
and extrapolate on the basis of some reasonable assumptions" (to remove
ambiguities) but this is nothing more than an educated guess.
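
One way to make the "reasonable assumptions" concrete: if one assumes the
region is convex, the guessed contour is the convex hull of the sensor
points, which for the labeled picture above is exactly 1ABG1. A
self-contained Python sketch (the coordinates are invented to mimic the
picture):

def cross(o, a, b):
    # Positive if the turn o -> a -> b is counter-clockwise.
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    # Andrew's monotone chain; collinear boundary points are dropped.
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Corners 1, A, B, G plus edge and interior points 2..9, C..I.
sensors = [(0, 8), (0, 7), (0, 6), (0, 5), (2, 5), (0, 4), (0, 3),
           (0, 2), (0, 1), (0, 0), (15, 0), (13, 1), (11, 2), (13, 2),
           (15, 7), (15, 8), (10, 8), (11, 8)]
print(convex_hull(sensors))   # only the four corners survive

Of course the convexity assumption is itself an educated guess; a concave
roof would be mapped wrongly by it.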

It is time to stop working on my contour-making program. The playing time
is limited, and I will never be able to match professionals in terms of
using available shortcuts to maximize efficiency. My program, using a mesh
of only 63 by 63 bins, is too slow and too memory-intensive. It would
be totally unacceptable for a mesh matching the pixel spacing of our screens
and printers. Knowing how erroneous contour lines based on limited data
can be, we should test available software with artificial data before using
it in real research. How else can uncertainties be estimated? I am now ready
to select a tool for my summer project. Thanks to all who wrote about the
existing software packages.
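
If I were writing such a test it might look roughly like this (a Python
sketch assuming numpy and scipy; the test function, the 5 percent
sampling, and the linear interpolation merely stand in for the package
under test):

import numpy as np
from scipy.interpolate import griddata

# Contour a known function from the full grid and from a sparse sample,
# then count the cells whose discrete level changed.
n, dz = 63, 0.1
x = np.linspace(-2.0, 2.0, n)
X, Y = np.meshgrid(x, x)
Z_true = np.exp(-(X**2 + Y**2))

rng = np.random.default_rng(1)
keep = rng.random(n * n) < 0.05               # keep about 5% of the cells
pts = np.column_stack([X.ravel()[keep], Y.ravel()[keep]])
Z_back = griddata(pts, Z_true.ravel()[keep], (X, Y), method="linear")
Z_back = np.where(np.isnan(Z_back), 0.0, Z_back)   # outside the sampled hull

changed = np.floor(Z_true / dz) != np.floor(Z_back / dz)
print("fraction of cells whose level changed:", changed.mean())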
Ludwik Kowalski
P.S.
The reference text ("Statistics and Data Analysis for Geology" by John C.
Davis, John Wiley & Sons, 1986) has a long chapter on contouring in which
many algorithms are discussed. The general methodology is described as follows:

(a) Collect data
(b) Construct a mathematical model, based on these data and on reasonable
assumptions (--> create missing data through interpolations).
(c) Draw contour lines on the basis of your numerical model.
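
In miniature, with a made-up handful of points standing in for the book's
data, the three steps might read (a Python sketch assuming numpy, scipy
and matplotlib):

import numpy as np
from scipy.interpolate import griddata
import matplotlib.pyplot as plt

# (a) collected data: scattered (x, y, Z) triples
xyz = np.array([[0.1, 0.2, 1.0], [0.9, 0.1, 0.4], [0.5, 0.8, 0.7],
                [0.2, 0.9, 0.3], [0.7, 0.6, 0.9], [0.4, 0.4, 0.6]])

# (b) numerical model: interpolate the scattered data onto a regular grid
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
gz = griddata(xyz[:, :2], xyz[:, 2], (gx, gy), method="linear")

# (c) contour the model, not the raw data
plt.contour(gx, gy, gz, levels=8)
plt.show()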

On page 362 you may find a table of 52 (x,y,Z) data points. Three contour
maps based on these data, but produced by different methods, are not
identical. Why the differences? Because the essential part (b) was implemented
differently in each case. The first map (p. 365) was produced by a
triangulation algorithm, the second (p. 376) by a regular-grid algorithm,
while the third (p. 377) was drawn manually by a trained geologist. This
is a good illustration of my original concern.