Friday, September 23, 2011

Co-ordinate systems & projections

Geologists have in the past generally managed
to avoid dealing with different coordinate
systems in any detail, as the areas they
worked in were small. The advent of GPS
and computerized data management has
changed this. The plotting of real-world data on
a flat surface is known as projection; it arises
from the need to visualize data on a flat
surface when the shape of the earth is best
approximated by a spheroid, a flattened sphere.
For small areas the distortion is not important,
but for larger areas there will be a compromise
between preserving area and distance relationships.
For example, the well-known Mercator
projection emphasizes Europe at the expense of
Africa. The scale of the data also governs the
choice of projection. For maps of scales larger
than 1:250,000, either a national grid or a
Universal Transverse Mercator (UTM) grid is
generally used. In the latter projection, the
earth is divided into zones of 6 degrees of
longitude, with a value of 500,000 m E assigned to
the central meridian of each zone and a northing
origin of 0 m at the equator for points north of the
equator, or a large number, often 10,000,000 m,
for points south of the equator.
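The zone and false-origin conventions just described can be sketched in a few lines of Python. This is a simplified illustration of the bookkeeping only; a real conversion to UTM eastings and northings would use a projection library.

```python
def utm_zone(lon_deg):
    """UTM zone number (1-60) for a longitude; zones are 6 degrees wide."""
    return int((lon_deg + 180) // 6) + 1

def central_meridian_deg(zone):
    """Longitude of the zone's central meridian, which is assigned 500,000 m E."""
    return zone * 6 - 183

def false_northing_m(lat_deg):
    """Northing origin: 0 m at the equator in the north, 10,000,000 m in the south."""
    return 0 if lat_deg >= 0 else 10_000_000

# Example: a point at 30.5 deg E, 25 deg S (the coordinates are arbitrary)
zone = utm_zone(30.5)             # zone 36
cm = central_meridian_deg(zone)   # central meridian at 33 deg E
fn = false_northing_m(-25.0)      # southern hemisphere: 10,000,000 m
```

The false easting of 500,000 m and the southern false northing exist so that all coordinates within a zone remain positive.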

There are a variety of different values in use
for the ellipsoid that approximates the shape
of the earth; an ellipsoid, together with its
position relative to the earth, defines a datum. The most
commonly used datum for GPS work is the World
Geodetic System (WGS 1984), but the datum
used on the map must be carefully checked,
as the use of different datums can change coordinates
by up to 1500 m. The reader is advised
to read about the problems in more detail in
texts such as Longley et al. (2001) and Snyder
(1987).

Corporate solutions

As large amounts of money are invested in collecting
the data, it is crucial that the data are
safely archived and made available to those
who need them as easily as possible. Integrity
of data is paramount for any mining or exploration
company, from both a technical and
a legal viewpoint (acQuire 2004). However, this
integrity has often been lacking in the past, and
many organizations have had poor systems
giving rise to inconsistencies, lost data, and
errors. Increasingly, in the wake of incidents
such as the Bre-X fraud (see section 5.4), both
industry and government departments require
higher levels of reporting standards. Relational
databases provide the means by which data can
be stored with correct quality control procedures
and retrieved in a secure environment.

Many proprietary technical software products
provide such storage facilities. For example,
acQuire (acQuire 2004) offers a solution
for the storage and reporting of data that also
interfaces with text file formats such as csv,
dif, and txt (tab-delimited and fixed-width formats),
as well as with numerous proprietary formats.
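As an illustration of such text interchange, a comma-separated collar file can be read with standard tools; the column names and values here are invented for the example.

```python
import csv
import io

# Hypothetical drillhole collar records in csv interchange format.
collar_csv = """hole_id,easting_m,northing_m,elevation_m
DDH-001,451200.0,6250400.0,312.5
DDH-002,451350.0,6250520.0,308.1
"""

# Each row becomes a dict keyed by the header line.
collars = list(csv.DictReader(io.StringIO(collar_csv)))
```

In practice such files are exported from and imported into the database, so agreeing the column layout up front is part of the company procedure.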
The strategy for collection and evaluation
(checking) of data (Walters 1999) is often a matter
of company procedure. Most errors are gross
and can be easily filtered out. Each geologist
and mining or processing engineer knows what
the database should contain in terms of ranges,
values, and units. It is a simple matter of setting
up validation tables to check that the
data conform to the ranges, values, and units
expected. A simple example would be ensuring
that the dip of drillholes is between 0 and 90
degrees for surface drilling.
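A validation table of the kind described might be sketched as follows; the field names and ranges are illustrative and not taken from any particular package.

```python
# Hypothetical validation table: field name -> (minimum, maximum) accepted values.
VALIDATION = {
    "dip_deg": (0.0, 90.0),        # surface drillholes, as in the example above
    "azimuth_deg": (0.0, 360.0),
    "au_ppm": (0.0, 1000.0),
}

def validate(record):
    """Return a list of (field, value) pairs that fail the range checks."""
    failures = []
    for field, (lo, hi) in VALIDATION.items():
        value = record.get(field)
        if value is None or not (lo <= value <= hi):
            failures.append((field, value))
    return failures
```

A record that passes returns an empty list; anything else is flagged for checking before it enters the database.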

Data Capture & Storage

There are two major methods of representing
spatial data, raster and vector. In the vector
model the spatial element of the data is represented
by a series of coordinates, whereas in the
raster model space is divided into regular pixels,
usually square. Each model has advantages and
disadvantages, but the key
factors in deciding on a format are resolution
and the amount of storage required.

(Figure: a simple geological map in vector and raster format.)

The raster method is commonly
used for remote sensing, whereas the vector method is used
for drillholes and geological mapping. Most
modern systems allow for integration of the
two different types as well as conversion from
one model to the other, although raster to vector
conversion is much more difficult than that
from vector to raster.
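The asymmetry between the two conversions can be seen in a small sketch: burning vector points into a raster grid is a direct calculation, whereas recovering the original coordinates from the grid is not, because the position of each point within its cell is lost. The grid layout and cell size below are arbitrary.

```python
def rasterize_points(points, origin, cell_size, ncols, nrows):
    """Vector -> raster: mark each grid cell that contains a point.

    Rows are counted upward from the grid origin (south-west corner here).
    """
    ox, oy = origin
    grid = [[0] * ncols for _ in range(nrows)]
    for x, y in points:
        col = int((x - ox) // cell_size)
        row = int((y - oy) // cell_size)
        if 0 <= col < ncols and 0 <= row < nrows:
            grid[row][col] = 1  # exact position within the cell is lost here
    return grid

# Two points burned into a 4 x 4 grid of 10 m cells.
grid = rasterize_points([(15.0, 25.0), (35.0, 5.0)], (0.0, 0.0), 10.0, 4, 4)
```

Going the other way, from a filled grid back to point coordinates, only the cell centres can be recovered, which is one reason raster-to-vector conversion is the harder direction.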

In a simple (two-dimensional) vector model,
points are represented by x and y coordinates,
lines as a series of connected points (known as
vertices), and polygons as a series of connected
lines or strings. This simple model for polygons
is known as a spaghetti model and is that
adopted by computer-aided drawing (CAD)
packages. For more complex querying and
modeling of polygons, the relationship between
adjoining polygons must be established and
the entire space of the study area subdivided.
This is known as the topological model. In this
model, polygons are formed by the
software as a mesh of lines, often known as
arcs, that meet at nodes. Another variation of
this model, often used for height data, is the
triangulated irregular network (TIN), which is
used to visualize digital elevation surfaces or
construct digital terrain models (DTM). The
TIN model is similar to the polygons used in
ore resource and reserve calculation.
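In the spaghetti model a polygon is nothing more than a closed list of coordinates, which is nevertheless enough for simple measurements such as planar area, computed here with the standard shoelace formula; the coordinates are invented for the example.

```python
# A polygon in the spaghetti model: a closed ring of (x, y) vertices,
# with the first vertex repeated at the end.
ring = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0), (0.0, 50.0), (0.0, 0.0)]

def shoelace_area(ring):
    """Planar area enclosed by a closed ring of (x, y) coordinates."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(ring, ring[1:]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0
```

What the spaghetti model cannot answer without extra work is which polygons share an edge; that is precisely the information the topological model's arcs and nodes make explicit.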

Introduction


One of the major developments in mineral
exploration has been the increased use of computerized
data management. This has been used
to handle the flow of the large amounts of data
generated by modern instrumentation as well
as to speed up and improve decision making.
This chapter details some of the techniques
used to integrate data sets and to visualize this
integration. Two types of computer packages
have evolved to handle exploration and development
data: (i) Geographical Information
Systems (GIS) for early stage exploration data,
usually generic software developed for other,
nongeologic applications, discussed in the section
on data integration and geographical information
systems, and (ii) mining-specific packages
designed to enable mine planning and resource
calculations, discussed in the section on
integration with resource calculation and mine
planning software.
What must be emphasized is that the quality
of data is all important. The old adage “rubbish
in, rubbish out” unfortunately still applies.
It is essential that all data should be carefully
checked before interpretation, and the best
times to do this are during entry of the data into
the database and when the data are collected.
A clear record should also be maintained of
the origin of the data and of when, and by whom,
the data were edited. These data about data are known as
metadata.
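A minimal metadata record of this kind might look as follows; all the fields and values are invented for illustration.

```python
# Hypothetical metadata accompanying one data set in the database:
# origin of the data, and a log of when and by whom it was edited.
metadata = {
    "description": "drillhole collar and assay data",
    "origin": "field data capture by project geologist",
    "collected_on": "2010-07-14",
    "entered_on": "2010-07-16",
    "edits": [
        {"date": "2010-07-20", "editor": "a.jones",
         "change": "corrected collar elevation of one hole"},
    ],
}
```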