MODULE(1)
NAME
module - do what the module does
SYNOPSIS
module [-adpqsV] [argument= value ...]
DESCRIPTION
Execute a module from the command line. When a module is
executed from the command line, all argument values listed
in the module are supplied from (in order of precedence):
values on the command line; values in the named (or default)
parameter file; the environment; or the default values
specified in the module, if any. If any arguments cannot be
otherwise evaluated, the user is prompted interactively to
supply the value(s). Once all the argument values exist,
the module is executed, and its diagnostic output is routed
to the appropriate channel.
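The resolution order described above can be sketched as follows. This is an illustrative sketch only, not the actual shell driver code; the function and argument names are invented for the example.

```python
# Illustrative sketch of the argument-resolution precedence described
# above: command line, then parameter file, then environment, then the
# module's own defaults, and finally an interactive prompt.
import os

def resolve(name, cmdline, param_file, defaults, env=os.environ):
    for source in (cmdline, param_file):
        if name in source:
            return source[name]
    if name in env:
        return env[name]
    if name in defaults:
        return defaults[name]
    return input("%s= " % name)   # last resort: prompt the user

# "alloc" missing from the command line falls through to the parameter
# file; "history" is satisfied directly on the command line.
print(resolve("alloc", {}, {"alloc": "200"}, {"alloc": "100"}))
print(resolve("history", {"history": "run.log"}, {}, {}))
```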
The -q option prints out the list of required arguments and
their default or environmental values, then quits. The -V
option prints the full parameter keylist before execution,
redirects the module's history channel from stdout or the
history file to stderr and errlog from the errlog file to
stderr, and prints the output keylist at the end of execution.
The -s option shuts off all messages from history and
errlog. All flags (except of course -q) may be passed down
to the module if they are requested in the argument list,
but certain flags should be avoided in modules, as discussed
below. White space between the '=' sign and the argument
value is optional.
Running a module with the -p option will cause a mapfile to
be created, once all the required argument values have been
satisfied, and then invoke a pipeline execution (pe) of the
module. The -d and -a switches, which only apply in this
case, will cause the DSDS services to be used to supply and
archive input and output datasets, respectively. The -V
option in this case will cause the (pe) to be run in its
verbose mode, with the full parameter keylist placed on the
history file.
OPTIONS
argument=value The value of the argument.
There are several argument and flag names reserved by both
the command line interface and by pe that should not be used
in module argument lists:
-a Use the DSDS to archive output when running in the
pipeline. (The a argument specifies archiving options
in pe; the equivalent argument in the shell is
archive_hold).
-d Use the DSDS to supply input when running in the
pipeline.
-q Force query-only mode in the shell.
alloc=n Space in megabytes to allocate for output in
the DSDS staging area. Default value is 100.
The corresponding environment variable is
DSDS_ALLOC. Only applicable when running in
pipeline.
archive_hold=n Number of days for which output datasets are
to be retained online in the DSDS staging
area. A negative value prevents the data
from being permanently archived. A value
greater than 1023 will cause the archive to
be to an appendable data set. Default value
is -5. The corresponding environment vari-
able is ARCHIVE_HOLD. Only applicable when
running in pipeline.
history=name The name of the file to which module history
output (normal diagnostic output) should be
written. There is no default value, but if
no value is supplied, history will be
directed to either stdout or stderr in the
shell. (The argument is ignored in the shell
if either the verbose or silent flag is
invoked.) In the pipeline, the default is
/tmp/ModuleName_UserName_PVMtaskID.log. The
corresponding environment variable is HISTORY.
errlog=name The name of the file to which module errlog
output (exception diagnostic output) should
be written. There is no default value, but
if no value is supplied, errlog will be
directed to stderr in the shell. (The argu-
ment is ignored in the shell if the silent
flag is invoked.) In the pipeline, the
default is
/tmp/ModuleName_UserName_PVMtaskID.log. The
corresponding environment variable is LOGFILE.
param=name The name of the parameter file to use for
resolving argument values not on the command
line. There is no default value. The
corresponding environment variable is PARAMFILE.
ABORT_ACTION=continue
Used in pe only, to indicate that pe should
continue executing the mapfile after a module
aborts.
COPY_HISTORY=no
Suppress copying of the history and errlog
files and the map file to the output directory
when running in the pipeline.
EXAMPLES
There are a number of sample modules in
~soi/CM/src/examples. These are examples of how (or how
not) to write strategy modules.
For purposes of running modules in the shell, it is often
convenient to think of dataset names (for arguments of type
ARG_DATA_IN and ARG_DATA_OUT) as referring to directories
and sets of files within the directories. The following
four sets of "dataset names" are all permissible.
in= /tmp/indir/ out= foo/
in= /tmp/indir/[0-7,2] out= /tmp/indir/new
in= /tmp/indir/[12] out= ./
in= /tmp/indir/foo out= in
The first will result in the default range of files named
NNNN.fits on the named directory being processed and simi-
larly named files written to the named subdirectory of the
current path. The second will process only even-numbered
files from 0000.fits through 0006.fits, and create files
named new.0000.fits etc. on the same directory. The third
form will process only a single file and create its output
file(s) on the working directory. The fourth form will
cause the module to process files named foo.NNNN.fits
rather than those named NNNN.fits, and will result in any
files created being placed in the same directory as the
input (possibly overwriting the input files if the names
are the same).
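The record-range expansion illustrated by these examples can be sketched as below. This is a hedged illustration of the [first-last,step] convention as described here, not the module's actual selector code; the function names are invented.

```python
# Sketch of the record-selector expansion used in the examples above:
# a bracketed "[first-last,step]" range selects files named NNNN.fits,
# optionally with an output-name prefix such as "new.".
def expand_records(first, last, step=1, prefix=""):
    # e.g. first=0, last=7, step=2 selects the even-numbered files
    # 0000.fits through 0006.fits
    return ["%s%04d.fits" % (prefix, n) for n in range(first, last + 1, step)]

print(expand_records(0, 7, 2))           # the [0-7,2] input selection
print(expand_records(0, 7, 2, "new."))   # output names new.0000.fits etc.
```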
It should be noted that the convention for selecting records
(files) as illustrated above differs from the convention for
selecting datasets (directories) in the dataset naming
scheme. In the latter case a comma-separated list is used
to present multiple datasets or ranges of datasets. Thus
the dataset name
in= prog:a,level:b,series:c[19-23,5,8-10],sel:[10-14,2]
would be used to select records (files) 10, 12, and 14 from
each of the directories corresponding to dataset numbers 19,
20, 21, 22, 23, 5, 8, 9, and 10, in that order.
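The difference between the two conventions can be sketched as follows. These parsers are illustrative only (the function names are invented), written to match the behavior described in this section rather than the actual dataset-name parser.

```python
# Sketch of the two bracket conventions noted above: for datasets the
# comma separates list items and ranges, while for records a trailing
# comma element is a step.
def expand_dataset_list(spec):
    # "19-23,5,8-10" -> 19..23, then 5, then 8..10, in that order
    out = []
    for item in spec.split(","):
        if "-" in item:
            lo, hi = item.split("-")
            out.extend(range(int(lo), int(hi) + 1))
        else:
            out.append(int(item))
    return out

def expand_record_range(spec):
    # "10-14,2" -> records 10, 12, 14 (first-last, optional step)
    rng, _, step = spec.partition(",")
    lo, _, hi = rng.partition("-")
    return list(range(int(lo), int(hi or lo) + 1, int(step or 1)))

print(expand_dataset_list("19-23,5,8-10"))
print(expand_record_range("10-14,2"))
```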
When entering bracketed ranges in the shell, one must of
course be careful to turn globbing off, to quote the entire
string, or to escape the opening bracket with a backslash.
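The hazard can be demonstrated directly in the shell (the scratch file names below are hypothetical): an unquoted bracket range is a glob pattern, expanded against matching file names before the command ever runs.

```shell
# Scratch demonstration of shell globbing eating a bracket range.
demo=$(mktemp -d)
cd "$demo"
touch 0 2                      # two one-character file names
unquoted=$(echo [0-7,2])       # shell globs: expands to "0 2"
quoted=$(echo '[0-7,2]')       # quoted: passed through literally
echo "$unquoted"
echo "$quoted"
```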
FILES
~soi/CM/include/module.h
~/history
~/logfile
Strategy modules must be linked with the common main()
driver, ~soi/CM/bin/src/cmd/_$MACHINE.b/main.o to run in the
shell, and with the driver
~soi/CM/bin/src/pipe/_$MACHINE.b/main_svc.o to run as pe's.
SEE ALSO
pe (1)
DIAGNOSTICS
BUGS
If a pe started from the shell requests DSDS services for
dataset name resolution (using the -p and -d and/or -a
flags), only one argument of type ARG_DATA_IN and one of
type ARG_DATA_OUT can be successfully parsed.
Strategy modules are an attempt to isolate environmental
dependencies (including I/O) from executable code. The
attempt has met with only limited success. There are so far
only two interfaces, and modules are not always exercised
under both.
The default values for first and last file selector numbers
are 0 and -1, respectively.
The option of using a dataset keyword (such as in) as the
value for another dataset works only in the shell interface.
It is potentially dangerous, as noted above.
The sample modules may be very much out of date with respect
to features available in strategy modules.
If a pe is started from the shell (using the -p flag)
without explicitly specifying rooted pathnames for history
and errlog, by default the history and errlog information
will be written to files named history and logfile, respec-
tively, on the user's home directory.
AUTHORS
Rick Bogart & Kay Leibrand are responsible for the shell
command line interface to modules.
HISTORY
1999-12-13 SOI Version 4.5