FIXDHD

Use Mapfile Generation to generate regular mapfiles using the same data selection procedures used by the fixing process.


Usage:

fixdhd.exe <fd_spec specification file>

or: fixdhd.exe <horizon file> <database> [qualifiers]


<> indicates a mandatory switch.
[ ] indicates an optional switch.
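For example, a typical run might look like this (the database, horizon file, and log file names here are purely hypothetical):-

fixdhd.exe splits.txt geology.dhd -l fix.log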


Where:

Option

Description

[-l <log file>]

turn on logging of output messages to given file.

[-ls <log file>]

turn on log and silent running (messages go only to log).

[-e <env>]

environment code; to override vulcan.chk value.

[-p <proj>]

project code; to override vulcan.chk value.

[-s <sel file>]

selection file; to limit drill hole selection.

[-x <exc file>]

exclusion file; to limit drill hole selection.

[-f]

feedback selection file listing to log.

[-d <design>]

design <DSN>; needed for ODBC or to override DB suffix.

[-no_dsr]

ignore any DSR; all holes considered vertical.

[-dsr <dsr db>]

use the specified DSR DB rather than the default.

[-tk <thick field>]

use the specified field as thickness on litho rec.

[-rb <rely field>]

use the specified field as reliability code on litho rec.

[-sp]

set for second or subsequent fixing pass

[-ep <precision>]

enforce precision on nos. in db to given no of d.p. (1-6)

[-u <no data value>]

set 'no data' value for numeric data (default -999)

[-use_ply]

set to append ply field to horizon field

[-new_end]

set so names not in horizon list end a seam (see notes)

[-rock <rock codes>]

set the list of codes as valid rock codes (see notes)

[-rock_file <file>]

set the contents of file as valid rock codes (see notes)

[-rock_perc <codes & %>]

set the list of codes and percentages (see notes)

[-rock_perc_file <file>]

set the contents of file as codes and % (see notes)

[-no_perc]

ignore percentages if -rock_perc or -rock_perc_file used

[-no_rock_code_ok]

intercepts with blank rock codes to count as good

[-no_perc_val_ok]

intercepts with no percentage value count as good

[-limit <dgd name> [<template>]]

set to use limit (mask) strings if they exist, where <dgd name> is the name of the design database containing them and the optional <template> is the layer name template for the layers containing limit strings (default: MASK_%S)

[-limit_type [<ALL|IND>]]

the type of limit strings (default: ALL, see notes)

[-limit_known [<ERROR|IGNORE|FLOOR|ROOF|FIX>]]

how to treat known data found outside the limit string (default: ERROR, see notes)

[-bow <BOW field>]

use the specified field as B.O.W. depth on locate rec.

[-bseam <BOW value>]

specified value in seam field of litho rec. is B.O.W.

[-bmodel <model name>]

specified model used to determine Base Of Weathering

[-barb <depth>]

specified vertical depth is considered Base Of Weathering

[-o_rel_b]

override rel code stat on top seams when checking B.O.W.

[-in_tol <distance>]

distance an interpolated surface for a missing horizon may be within a hole before the horizon is considered as being not there and will be fixed at zero thickness (see notes)

[-all_miss_abls_full]

all horizons missing outside the logged seams of a hole (above or below) to be fixed at full interpolated thickness and never set at zero thickness, even if the interpolated depth(s) are within the hole (see notes for a full description) (default T)

[-inv <pwr> <n_pts> <sep_f> <cell>]

set inverse distance interpolation and its associated parameters (see notes for full details)

[-ida <pwr> <n_pts> <sep_f> <cell>]

set inverse distance interpolation all and its associated parameters (see notes)

[-min_tk_pts <n_pts>]

set min no. of points allowed for thickness calc

[-best_by <AVE|SUM|IND>]

set best surface assessment technique

[-invert [<T|F>]]

set world to have inverted levels (default not inverted)

[-app_part [<T|F>]]

set to apportion, instead of correlating, any partings found in a merged seam that will be split (see notes)

[-int_part [<PRO|ABS>]]

set to interpolate partings for missing horizons PRO - proportionally, ABS - absolute (see notes)

[-csv]

set if CSV file output is wanted (see notes)

[-no_csv]

turn off CSV file output (default)

[-map [<suffix>]]

set if Mapfile output is wanted (see notes)

[-no_map]

turn off Mapfile output (default)

[-dbl [<dblname>][ADJ]]

set if DBL output is wanted (see notes)

[-no_dbl]

turn off DBL output (default)

[-out_int_thick]

use interpolated thickness for known 0 thick (see notes)

[-out_neg_thick [<mult>]]

use -ve interp thick for known 0 thick (see notes)

[-no_int_thick]

reset both of the above to default (see notes)

[-mod_param <params>]

model parameters, min&max E, min&max N, grid size, units

[-survey <horz><surf><model>]

survey pickup data for a surface (may be given repeatedly)

[-seed_name <name>]

survey point seed name (2 chars max, default DH)

[-fudge [<T|F>]]

set fixing by addition as a last resort (default T)

[-error_sum [<T|F>]]

set to output a final summary of data errors (default T)

[-zits [<T|F>]]

set to remove Zero Interpolated Thickness Splits (def. T)

[-fault_zone_field <fault field>]

use the specified field as fault zone tag on litho rec.

[-fault_zone_list_file <fault file>]

fault list file containing a list of fault zone triangulations

Specification file:

This is the specification file created by the FIXDHD Vulcan interface. It has the extension .fd_spec, which may be omitted from the command line.  If you want to use a specification file, do not add any other arguments to the command line.  The specification file contains all the parameters required for fixdhd to run.

Horizon file:

Either enter the name of a globals file (e.g. <proj>.gdc_glob) in which a SPLITS table has been defined, or the name of a text file in which the splits have been defined using the following syntax:-

<horizon> <merges into 1> <merges into 2> etc...

<horizon> <merges into 1> <merges into 2> etc...

etc...

For example:

A

B1 B

B2 B

C

D1 D

D2A D2 D

D2B D2 D

Note: horizon names may be separated by whitespace or commas.

Database:

The name of the database containing the drillhole data to fix.  Enter the name of the database file only, not the rax file.  For an ODBC database, enter just ODBC for the name (case insensitive) and specify the design using the appropriate qualifier (see above).  The environment and project codes will be read automatically from vulcan.chk if available; to override these values, or to specify them for library databases, use the appropriate qualifiers (see above).

Note:    Use selection and/or exclusion files to limit the drillholes used from a database; otherwise all holes will be included in the fix.  Wildcards may be used in either file.
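For example, to fix only the holes listed in a selection file, exclude a few problem holes, and echo the resulting hole listing to a log (all file names here are hypothetical):-

fixdhd.exe splits.txt geology.dhd -s wanted.sel -x bad.exc -f -l fix.log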

The enforce precision qualifier (-ep) also has the effect of setting the internal accuracy tolerance value; e.g. if precision is set to 3 decimal places, accuracy is set to 0.0001.  Two values with a difference less than this tolerance will be considered to be equal.

Default desurvey information will be used if it is available.  This may be turned off by using the -no_dsr qualifier, in which case all holes will be considered to be vertical.  The default desurvey data is as follows:-

  1. Library lithology DBs must have an associated DSR DB with the same name, except for the suffix, which will be .dsr, and there must be a DSR datasheet defined in the datasheet library.

  2. Headed and ODBC lithology DBs must contain survey records and must have desurvey type and desurvey synonyms defined in the design header for these records/fields.

Either of these may be overridden by specifying the name of a library DSR database using the -dsr qualifier (N.B. requires a DSR datasheet).

While assessing a horizon's extent, i.e. when at least one intercept with the horizon's name has been encountered, encountering an intercept with a different name will signify the end of our horizon.  If the new name is blank, it will either be treated as a parting for our horizon, or the end of that horizon, depending upon the next non-blank name encountered.  The system is also set by default to ignore any non-blank names encountered while assessing a horizon, if those names are not in the horizon list.  These names will be treated as though they were blank.  This default can be turned off using the -new_end qualifier.  The default situation allows for 'special horizons' that might have finite thickness to exist in the logged data.  Note: ALL zero thickness horizons, if their name is not in the horizon list, are considered to be 'special horizons'.  Thus, for example, names defining 'limit of oxidation' or 'water table' etc. can be inserted without affecting horizon definition or causing sequence errors when found within a seam range.

By default, all data in the database is considered to have the same reliability.  However, if reliability data is available, it can be used to refine the fixing process.  Use the -rb qualifier to specify the name of the reliability-code field on the lithology record (as there is no synonym for reliability).  This field will need to contain an integer value between 1 and 9, with the following meanings:-

                    Decreasing thickness reliability ->
                    -----------------------------------
  Decreasing        |     1           2           3
  depth (RL)        |
  reliability       |     4           5           6
      |             |
      V             |     7           8           9

If no reliability code value is available it will always be considered to be 1.  Obviously, if only depth data is used the reliability of thickness is dependent solely on depth reliability.

Note: it is possible to use thickness values from the database as well as depths, by using the -tk qualifier to specify the thickness field on the lithology record (for which there is no synonym).  However, where both top and bottom values for a lithology are used, if they have higher reliability than the thickness field provided, they will override the thickness field data and thickness will be calculated by subtraction.
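For example, to read reliability codes and logged thicknesses from the lithology record, assuming (purely for illustration) that these fields are called rel_code and thick:-

fixdhd.exe splits.txt geology.dhd -rb rel_code -tk thick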

It is possible to specify NO DATA values for fields on the database.  If, for any reason, it was not possible to record a value, enter the NO DATA value for that field; e.g. if top depth is known but bottom depth is not because of core loss.  By default the NO DATA value is -999, but a different value can be specified using the -u argument; for simplicity the value should be a large negative number, especially if the following feature is to be used.

Specifying a value as a negative number means the value is not reliable, but is known to be a minimum value.  When top and bottom values are used with a thickness field and the thickness value is given as a minimum (i.e. as a -ve number), it will be necessary to have one of the depth values specified as NO DATA (-999), because any DB values for depths will override the minimum thickness.  Note also that when using thickness and both top and bottom depths, the thickness will not be used to back-calculate depths.

If a thickness field is being used and only one depth has been logged, the corresponding depth will not be calculated when there is no thickness data in the field.  With no thickness field, it is calculated from the depth on the next or previous interval; with the thickness field, it will only be calculated from an available thickness value.
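As a purely hypothetical illustration of these conventions: an intercept logged with a top depth of 45.20, a bottom depth of -999 (NO DATA, e.g. because of core loss) and a thickness of -1.50 indicates that the roof is known at 45.20, the floor is unknown, and the seam is at least 1.50 thick; the floor and the true thickness may then be fixed by the process.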

Note: if continuation records are used (either cont or flag) the first record in the group is always considered to be the primary record that contains the lithological intercept information, and the following records as secondary records containing only additional descriptive information.  The only data that will be read from the secondary records is extra rock types and percentage values.

The data collected from the intercepts of a drill-hole database to define a horizon is matched using the intercept's seam code field.  Matches are made against the horizon names (seam codes) that are defined in the horizon table (in a case insensitive manner).  The first down-hole occurrence of an intercept with a matching seam code defines the top of the horizon and the last occurrence of an intercept with that seam code defines the bottom of the horizon.  Any intercepts found between the top and the bottom of the horizon which do not contain a seam code, or any gaps in logging between the top and bottom of the horizon, are counted as parting for that seam and their thickness does not count towards the product thickness reported (obviously, they do count towards structure thickness).  By default, intercepts with non-blank seam codes and a finite thickness that don't appear in the horizon list are treated as if they were blank and counted as parting.  But when the -new_end qualifier has been used, any intercept with a non-blank seam code and some thickness that occurs within the depth range of the horizon being collated will be considered as a mis-correlation and flagged as a fatal error.  The exceptions are the so-called special horizons, as defined above, which behave as parting or are ignored, depending upon thickness.

This process can be refined by the use of rock codes.  In this way the intercepts which will be used to form the product of the horizon have to match both the seam code and the lithology wanted (rock codes).  Intercepts which match the seam code only still form part of the structural thickness of the horizon (and may even indicate the top or bottom of the horizon) but do not contribute to the product thickness.

Obviously, to use rock codes there must be lithology data present on the intercept records and the 'RockType' database synonym must be defined to point to that field.  Also, the system must have been informed of the valid rock codes.  Where continuation records have been used in the lithology to allow more than one rock type to be logged for each intercept, a percentage of a wanted rock type can also be defined if required (obviously a field containing percentage data, with the 'Percentage' synonym, has to have been defined).

Control the use of this feature with the following qualifiers:-

The -rock qualifier indicates that rock codes are to be used (without percentage qualification) and is followed by a list of allowable rock codes (comma separated with no spaces).  They will be ignored if the 'RockType' synonym is not defined. E.g. -rock CO,CL,CD,CS.

The -rock_file qualifier is the same as -rock but indicates that the codes are found in the associated file name argument.  This file should contain rock codes, one code per line.  E.g. -rock_file codes.txt where codes.txt looks like:-

CO

CL

CD

.

.

The -rock_perc qualifier indicates that rock codes are to be used (with percentage qualification) and is followed by a list of data pairs (rock code and percent).  They will be ignored if the 'RockType' synonym is not defined.  The percentage information will be ignored if the 'Percentage' synonym is not defined.  E.g. -rock_perc CO,75.5,CL,25.  Note: the numerics can be fractional but don't need to be.

The -rock_perc_file qualifier is the same as -rock_perc but indicates that the codes are found in the associated file name argument.  This file should contain rock codes and percentage data, one code per line.  E.g. -rock_perc_file codes.txt where codes.txt looks like:-

CO 75.5

CL 25

.

.

The -no_perc qualifier indicates that percentage data is to be ignored even if it has been supplied.  This only has any effect if -rock_perc or -rock_perc_file qualifiers are being used.  It has the effect of assuming that no percentage data comparison is required.

The -no_rock_code_ok qualifier indicates that where a matching seam intercept has no data in the rock code field it is to be accepted as being a valid product intercept.  By default, it would be rejected as not being a valid rock code and would be treated as waste.

The -no_perc_val_ok qualifier indicates that where a matching seam intercept with matching rock code data has no data in the percentage field, it is to be accepted as being a valid product intercept.  By default, it would be rejected (as not having a valid percentage value) and would be treated as waste.

In all cases, the rock codes provided may contain wild-cards (e.g. C* will match all occurrences of CO, CD, or CL on an intercept record). But you must be careful if you use wild cards in conjunction with percentages.  E.g. if you want CL at 75% but any other C* match at 85%, you must put them in that order in the supplied list:-

CL 75%

C* 85%

.

.

The list is assessed in the order given, with the first match indicating a valid intercept.

Note: the use of rock codes to produce 'waste' partings will also cause the '-app_part T' qualifier to be brought into effect (i.e. partings will be simply apportioned across splits, rather than make an attempt at correlation to splits).  This can be reversed to the usual default situation (-app_part F) but care should be taken in this instance, because using rock codes will have the effect of producing non-structural partings (not physical gaps) which may cause spurious correlation.

If the -use_ply qualifier is used, the seam code and the ply code from the intercept are concatenated to form the horizon name.  Obviously, this can only happen if the 'Ply' synonym is defined in the database, and will only match if the horizon list contains concatenated names to match.

Limit strings may be used to indicate the known (or guessed) extent of a seam.  Holes outside the limit which do not contain the seam data will then always have that seam fixed at zero thickness.

Use the -limit qualifier to specify the use of limit strings.  This must be followed by an argument specifying the DGD database file name which contains the limit strings - one layer per seam (not all have to be present).  An optional argument can follow which specifies the format of the layer names.  The default for this argument is MASK_%S, where the %S will be substituted by the horizon name.

Limit string layers can be made in one of two ways where a seam splits.  They can either be made as limits of the split names as correlated in the original data, or they can be made for the lowest-level child split names only (as defined in the horizon table) but made to encompass all of their parent names' area as well.  E.g. if seam A splits into A1 and A2, you can either make limits for A, A1, and A2 separately, or you can make limits for A1 and A2 where these limits each include the area of A.  The former case is designated type ALL and the latter type IND (ALL is the default).  To switch types use the qualifier -limit_type <ALL or IND>.  With type ALL, if you want to limit A1, you must have limits available for both A1 and A, which will be checked together (if A is not present the effect will be the same as not having a mask at all).  With type IND, when you want to limit A1 you only need the A1 mask - but this is assumed to cover the whole area that A1 is being 'fixed' for, i.e. the union of the A1 and A data hulls.

If the limit string(s) are formed in such a way that a logged data value is found outside the limit of the seam in question, you can control how the system will deal with this data (by default it will cause an error condition and the fixing process will stop).  This is done with the -limit_known qualifier and an optional argument:-

ERROR - when data is found outside the limit an error condition is raised

IGNORE - ignore the mask for this point and preserve the data unchanged

FLOOR - preserve the floor value and fix at zero thickness

ROOF - preserve the roof value and fix at zero thickness

FIX - fix at zero thickness, letting FixDHD fix the position

If no argument is given ERROR is assumed.
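For example, to use individual (IND) type limit strings held in a design database, with the layer name template changed from the default, and with known data found outside a limit fixed at zero thickness while preserving its floor value (the DGD name and template are hypothetical):-

fixdhd.exe splits.txt geology.dhd -limit masks.dgd LIMIT_%S -limit_type IND -limit_known FLOOR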

If a horizon base is at the total hole depth, we will assume that the horizon may be thicker than is measured.

If the top horizon in the hole starts at the top of the hole, we may consider the same thing - the roof depth is not reliable.  Also, we may determine that the top horizon is inside the weathering and the roof depth is unreliable, even though it is not at the top of the hole.  This can be done in a number of ways:-

  1. Through the use of the reliability code data.  If reliability codes are in use they will take precedence.  However, they can be overridden by the use of the -o_rel_b argument.

  2. We can use a numeric field in the 'locate' record that sets the base of weathering depth.  This is not available via the normal synonyms, but you can use the -bow argument to indicate this field name.

  3. We can use information in the 'litho' records to indicate the position of the base of weathering; i.e. have a value in the seam name field that indicates that a record is being used to show the base of weathering depth (use the -bseam argument to indicate the value that is used in the seam name field).  If this value is encountered in the seam field of a litho record, the record will be ignored for all other purposes so as not to interrupt seam info.

  4. Use a model of the base of weathering (use the -bmodel argument to specify the model name).

  5. Use an arbitrary depth below which we can be sure there is not likely to be any weathering (use the -barb argument to specify this depth).  This depth is considered to be a vertical depth BELOW the collar, but will be converted to a down hole depth for each hole from the DSR 'shape'.

  6. Use the fact that the seam starts at depth 0; i.e. the top of the hole.  This is the default condition and will always be checked even if none of the others are used.

Note:   You can use all or any (or none) of these in conjunction with one another.  They will be applied in the order specified here.  No.6 will always be applied.
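For example, to take the base of weathering from a numeric field on the locate record, fall back to a weathering surface model, and also set an arbitrary base of weathering at 40 units of vertical depth below the collar (the field name, model name, and depth are purely illustrative):-

fixdhd.exe splits.txt geology.dhd -bow bow_depth -bmodel weathering.00t -barb 40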

When fixing horizons missing from the hole which are missing above the first logged horizon or below the last logged horizon, it is assumed that they are expected to be in the hole if any part of their interpolated position lies between the top of the hole (or base of weathering) and the total depth.  So, if the interpolated roof or floor overlaps these depths the missing horizon is given zero thickness - i.e. not found, but expected, so not there.  However, there are some occasions when this is too harsh a measure (e.g. where a seam is cropping out and just a fraction of the interpolated depths become expected in the hole), and this then causes un-geological looking dimples in the resultant models.  Bearing in mind that any interpolation is an inexact science, we have made it possible to specify a tolerance distance (with the -in_tol qualifier) that a surface may lie inside the hole before it is considered as being expected to have been found in the logging.  By default this distance is zero.

Alternatively, if required, all horizons missing above or below the logged seam data can be considered as 'not expected' even if all or some of their interpolated depth lies within the hole.  This way, these horizons will always receive their full interpolated thickness.  This condition is specified with the -all_miss_abls_full qualifier.

N.B. it is possible that the full interpolated thickness may be zero.  So, even when the -all_miss_abls_full or -in_tol qualifiers are used, a missing horizon may be interpolated as zero thick even if outside a hole's depth range.

N.B. all horizons missing within the logged seams are always considered as 'expected' and always given zero thickness.  They are missing from the hole, maybe because of pinching out, washing out, or faulting.  However, their expected interpolated thickness is available via a special column in the 'fixed' mapfile.

It is possible that you may wish to run multiple fixing passes on the same data.  In fact, if you have sparse seams within the horizon list it is desirable and best to do this.  Fix the main horizons first, then add the sparse horizon data to the fixed DB and fix again.  However, it may be that the first fixing pass has interpolated horizons above and/or below the extents of some holes.  The above actions, checking the total depth and base of weathering or top of hole, are then not desirable as they will undo the effects of the first fixing pass.  So, it is necessary to tell the system that you are running a second or subsequent fixing pass in order that these tests are not done.  This is achieved by using the -sp argument.
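For example, a second fixing pass over an already fixed database, with sparse seams added to the horizon file and a small tolerance on interpolated surfaces (all names and values here are illustrative):-

fixdhd.exe splits_sparse.txt geology_fixed.dhd -sp -in_tol 0.5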

The default interpolation model is inverse distance, with a power of 2 and searching for a maximum of 10 points (the same as FIXMAP).  This can be changed with the model setting arguments:-

-inv switches to inverse distance modelling and/or supplies a different power, number of interpolation points, stratigraphic separation factor for weighting thicknesses, and cell size for reducing coordinate numbers (suggest 100).  This interpolation technique finds the surface best suited to interpolate the missing surface from (i.e. with the best overall weight) and uses the n best values from this surface to make the inverse distance weighted value.

-ida switches to 'inverse distance all' modelling and/or supplies a different power, number of interpolation points, stratigraphic separation factor for weighting thicknesses, and cell size for reducing coordinate numbers (suggest 100).  This interpolation technique uses all surface pairs and builds a weight based on distance separation and stratigraphic sequence separation.  It uses the n best weighted thicknesses (from all available surfaces) and uses these to calculate the missing depth, each depth weighted by the thickness weight.

-grid ?   not implemented yet

Seam thickness is always calculated using just the available thickness values for that seam, rather than calculating the missing surface(s) of that seam from other known surfaces in the hole.  This is because strange effects can be seen near seam splits that end up with the floor above the roof.  To this end, you can nominate the minimum number of points that will be used to compute the seam thickness.

-min_tk_pts sets this number of points; the default is 10, but it can never be more than the number of points used for inverse distance when using -inv or -ida modelling.
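For example (the parameter values are illustrative only), the following would use inverse distance with a power of 3, a maximum of 12 interpolation points, a stratigraphic separation factor of 2, a coordinate-reduction cell size of 100, and allow thickness to be computed from as few as 5 points:-

fixdhd.exe splits.txt geology.dhd -inv 3 12 2 100 -min_tk_pts 5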

When deciding which known surface in a hole is the best to compute the level of a missing surface, it is necessary to determine which known surface has the best thickness data (between it and the unknown) available in surrounding holes.  This is done by finding the 'n' best holes for each thickness that could be used.  Determining the best of these groups which come from the closest holes (and are the closest in stratigraphic sequence) is done by comparing their inverse distance weights.  A group can be considered to be the best for the task in a number of ways:-

  1. The group which has the best average weight. This is the group of data that comes from the closest overall group of holes (and with the closest overall stratigraphic separation).

  2. The group which has the best sum of weights.  This is effectively the same as the average technique except that it compensates for groups where the full complement of n values were not available.

  3. The group which has the best individual weight.  This is the group which contains the actual hole which is closest to the hole in which the missing surface is being computed (and closest strat), irrespective of how close the other holes in the group are.

This can be controlled with:-

-best_by sets which of these methods you wish to use (the default is to use the group of values which contains the individual best weight).

-best_by AVE or -best_by SUM or -best_by IND <default>

Note:   the higher the inverse distance power being used, the less difference there is between the techniques.

Some systems have levels (R.L.) decreasing upwards instead of the more normal (default) situation of increasing upwards.  If you are using one of these systems use:-

-invert <inv> to set the type of world you have.  Using -invert T (or t) will set the system to use inverted levels, i.e. decreasing as you ascend, or you can do -invert F (or f) to reset to the default state of increasing as you ascend.

When assigning partings found in a merged parent horizon to the child splits that will be produced by this fixing process, by default the system will try to correlate these as best it can between and within the 'fixed' splits determined for that merge.  If the parting falls on a split boundary it is treated as interburden; if it falls within a split, as parting for that split.  However, to mimic fixmap, the system can be set to just apportion the partings across all of the splits proportionally to their thickness fraction of the merged horizon.  With this technique, no interburden will be computed between the splits, and any positional information from the partings is ignored.  In both cases, the thickness of product and waste is maintained, but the default approach attempts to also preserve the positional information from the partings to more accurately reflect the logged data.  Obviously, it would be better if the user did this correlation manually if these partings are actual gaps.  To change the way partings in merged horizons are treated use:-

-app_part [<app>] to change the setting; -app_part or -app_part T to apportion partings across all splits in a merged seam (and mimic fixmap), or -app_part F to reset to the default state and to correlate partings between and within splits according to their position.

Partings can also be interpolated for missing horizons if required.  By default, partings are not interpolated into missing horizons, because this is better done at the modelling phase for the data, where the user should be in full control.  But, if required, this feature can be turned on with the -int_part qualifier.  When turned on, partings will be interpolated, using the chosen interpolation technique, as a proportion of the horizon thickness (by default).  Using the optional argument to this qualifier <ABS>, the partings can be interpolated as an absolute thickness value.  For example: (i) Using the default proportional <PRO> technique with inverse distance interpolation, the proportion of the horizon structure-thickness taken by parting is calculated from the available data for a horizon.  The weighted average of this proportion is then calculated for the missing horizon, and this proportion of the missing horizon's interpolated thickness is assigned as parting.  (ii) Using the absolute thickness <ABS> technique, a weighted average of the actual parting thickness is calculated for the missing horizon, and this absolute thickness value is assigned as parting for the missing horizon.

Notes:

  1. It is possible using the <ABS> technique that parting thickness can be interpolated as being greater than the horizon thickness, where horizon thickness is very variable.  If this happens, parting thickness will be reduced to the horizon thickness.

  2. Where missing horizons are interpolated at zero thickness an interpolated parting value will obviously not be reported.  However, when an interp_thick value is reported in a map file substituted for the zero thickness (see notes below on -map), an interpolated parting will be reported if this option is turned on.

Turn off this option with -no_int_part qualifier.

Output can be to Map files, CSV files, DBL file, or to a DB.  To set the output type(s) required (they are not mutually exclusive) use the following arguments:-

-csv       Turn on CSV output.  Two files will be produced; a file containing header records and a file containing the litho records.  These will be named with the DB name (minus the extension) plus the suffices _fixed_header.csv and _fixed_litho.csv respectively.  If DSR data is used, a third file will be produced containing this data, named the same way with _fixed_dsr.csv as the suffix.

-no_csv    Turn off CSV output (the default situation).

-map       Turn on mapfile output.  This will produce a mapfile for each child seam (lowest-level split) in the SPLITS table.  The format of these is very similar to the FIXMAP mapfiles but with less wasted whitespace, and there are some extra values given that are set to zero in the FIXMAP version.  Plus there are some new fields:  FIX_TK - the interpolated seam thickness at a hole position when not given in the ST/TK columns because the seam is known, from the DB log, to be missing in the hole.

R_STATUS - the status of the data given for the roof values.

F_STATUS - the status of the data given for the floor values.

T_STATUS - the status of the data given for the thickness value.

P_STATUS - the status of the data given for the parting value.

These status values describe how data is derived, and can be one of the following values:-

DB - data derived directly from logged data in the database, without modification (i.e. data with good reliability).

DB_FIXED - data derived from logged data but which required some checking or 'fixing' by this process (e.g. data which extends to the top of the hole or above the base of weathering, or data which extends to the bottom of the hole, or data with poor logging reliability when reliability-codes are used).

FIXED - data derived entirely by interpolation within this process.

MODEL - data derived from an external reference control model.  (not yet implemented - may never be)

UNKNOWN - data item of unknown provenance.  This should never be seen; it probably indicates that an error has occurred in the process and should be reported.

Mapfile names will be <proj><seam><suffix>.map where <suffix> is _z by default to distinguish them from normal mapfiles (no _z) and FIXMAP mapfiles (just z no _).  However, a user definable suffix can be used if the optional suffix argument is added to the -map qualifier; e.g. -map _f.

Note:   Special characters can be protected by \ if necessary (e.g. \-f to protect the - from being treated as the start of the next qualifier, or \\-f in CSH in order to protect the \ itself from the shell), but beware of using invalid filename characters in the suffix!  Also the maximum allowed length of the suffix is three characters.

-no_map    Turn off mapfile output (the default situation).

-dbl       Turns on DBL file output (format the same as FIXMAP).  This has two optional arguments:-

<dblname> The name of the DBL; the filename is formatted as <proj><dblname>.dbl.  The default is 'fixdhd'.

The token ADJ if collar levels are to be adjusted to the top of the topmost seam when the upper seam is above the original hole collar (i.e. the same behaviour as FIXMAP).  The default is NOT to adjust.

-no_dbl       Turn off DBL output (the default situation).

Other output types not yet implemented.
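For example, to produce mapfiles with a user-defined suffix together with a DBL that has collar levels adjusted to the topmost seam (the suffix and DBL name are illustrative):-

fixdhd.exe splits.txt geology.dhd -map _f -dbl fixed ADJ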

By default this system will fix horizons KNOWN to be missing as zero thickness.  It does this to ensure the best looking stratigraphic picture, which is correct according to the data.  So, the default situation is similar to the fixmap -z qualifier.

The effect of modelling directly from the default data output is that a horizon will pinch to zero thickness at the first hole where an area of zero thickness is encountered.  Sometimes this picture is not what is required, although it is the most conservative method of using the data, making no assumptions whatever beyond the known data.

So, two special mapfile output controls can be obtained by using the qualifiers described below, which will affect the picture obtained by modelling directly from the data.  However, they should be used with caution because of their effects, which are explained below.

Where a known zero thickness horizon is encountered, the interpolated horizon thickness is stored in an extra column of the mapfile (see the description of mapfiles above).

  1. This interpolated value can be substituted for the structure thickness (and TK value) of a zero thickness horizon by using the -out_int_thick qualifier.  The effect of this is that models made directly from this data will then not pinch to the zero thickness value.  So, when a mask is applied to a modelled horizon, it will stop squarely at the mask boundary with no change in thickness.  This is NOT a conservative approach to modelling because it over-estimates resource which is not indicated by the data.  N.B.: this is applied solely at the output stage.  It does not guarantee that a horizon will not cross the surface(s) above it in the hole(s) where it is applied.  So, it MUST be used in conjunction with masking that will ensure that any area of possible overlap is masked out and a sensible picture is obtained.

  2. This interpolated value can be substituted as its negative value for the structure thickness (and TK value) of a zero thickness horizon by using the -out_neg_thick qualifier.  The effect of this is that models made directly from this data will pinch out (the roof will go below the floor) half way between a hole having the horizon and one not having the horizon - rather than pinching to zero at the hole (the roof will be adjusted by the -ve thickness and the floor will be unchanged).  A post-modelling GDCALC call to ensure that RF = MAX(RF,FL) is required to stop the roof going below the floor and to ensure that the picture looks correct.  An optional numerical argument can be used with this qualifier (e.g. -out_neg_thick 2.0) which will be used as a multiplier of the interpolated value (values < 1.0 will bring the cross-over pinch point closer to the hole with the zero thickness, values > 1.0 further away).  The default multiplier is 1.0, which has the effect of making the pinch approximately half-way between the holes (assuming the interpolated value is similar to the known value).  This technique will actually produce a lower, more conservative resource estimate than the default data, but again it is making assumptions about the horizons not supported by data.  N.B.: this technique MUST be accompanied by the RF = MAX(RF,FL) check to obtain a sensible picture in the modelling.

Notes:

a) Some holes may have horizon(s) for which the interpolated value for thickness is still zero, even though we did not 'know' from the data in the hole that it was missing.  Obviously, these two qualifiers will have no effect in these situations.  However, these places will tend to be found within an area of known zero thickness, so they will not affect the modelling pictures obtained - where the 'pinching out' or 'squaring off' is required at the edge of a zero thickness area.

b) Remember, the default data ONLY will be totally stratigraphically correct and is assured to contain no crossing horizons or other artifacts produced by the fixing process.  These options are intended for the purposes that are described above; to make particular modelling tasks easier to obtain required pictures.

c) Remember, these two qualifiers affect mapfile output only.
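For example, to write mapfiles in which known zero-thickness horizons carry the negative interpolated thickness, with the cross-over pinch point pulled closer to the zero-thickness hole (the multiplier value is illustrative):-

fixdhd.exe splits.txt geology.dhd -map -out_neg_thick 0.5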

Modelling Parameters

====================

Needed if you are going to use survey pickup data.  Specified using the -mod_param qualifier, followed by the modelling parameters in the following order:-

Minimum easting, maximum easting, minimum northing, maximum northing, grid cell size and units name (space separated).

To use survey pickup data use -survey <horz><surf><model> for each survey model you have, where <horz> is a name from the horizon list, <surf> is ROOF|FLOOR|TOPO, and <model> is the model file name (in full; it can have a path if the file is not in the project directory).  Although the topography model is not strictly necessary, it will provide better looking data to the mapfiles and ensure that all LC values are valid.

Note:   triangulation or grid models can be supplied in all cases.

The data will create points in the output mapfile called DHnnnnnn, where the numeric part will go from 000000 up to 999999.  If you want to use a different name part, use -seed_name <name> to change the seed name (2 chars max).
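For example (the extents, cell size, and model file names are all hypothetical), survey pickup for the roof and floor of seam A plus a topography model, with a two-character seed name, might be supplied as:-

fixdhd.exe splits.txt geology.dhd -mod_param 10000 15000 20000 26000 10 metres -survey A ROOF a_roof.00t -survey A FLOOR a_floor.00t -survey A TOPO topo.00t -seed_name SP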

When fixing data the system searches for all available thickness intervals to fix a missing surface, using the most statistically significant values to obtain a weighted average.  Sometimes (e.g. on a steeply dipping and extensive site) there may be no thickness set directly available to fix the missing value.  However, it may be possible to make a less satisfactory (less accurate) estimate by adding the weighted average values of what data there is for all intercepts between a known value in the hole and the missing surface (as FIXMAP does).  The system will do this as a last resort to fix missing surfaces by default, but this operation can be controlled with -fudge [<T|F>].  If turned off, holes will be rejected more often when a surface cannot be fixed.  Usually, the fixing by addition will obtain a result.  If not, it means that there are isolated horizons in the horizon table that must be ignored for the data set.  It might be that a horizon (or group of horizons) has been identified singly (or as a group) in holes with no reference to other horizons at the site.  This may be mis-correlation, or holes may need to be drilled deeper to establish a reference connection.

During a fixing run, any data-specific errors are reported ONCE only, as they are encountered.  E.g. if there is no interburden for a given seam, this will be reported in detail the first time it is needed, but then not again.  These messages will appear dotted about through the fixing log, which makes them less easy to follow.  There is an option (on by default) which will reissue all of these messages together as a summary at the end of the process.  This action is controllable with the -error_sum [<T|F>] qualifier.