Corrections - Doppler (*)

Calibrated measurements of the line-of-sight component of velocity may be subject to correction for various known and modeled physical effects prior to further analysis. The corrections to be applied depend on the type of analysis to which the data will be subjected.
They may include:

- detrending: removal of constant or nearly constant contributions to the true velocity, particularly that due to global rotation, and possibly due to steady meridional flows;
- correction for limb shifts, or line shifts known or assumed to be associated with the angle of the line of sight to the local normal;
- displacement of the effective position due to intensity gradients across the pixel;
- correction for the angle of the line of sight to an assumed radial or other component of the real surface velocity vector field (???);
- correction for line shifts known or assumed to be caused by strong magnetic fields;
- interpolation of missing values from neighboring values when feasible;
- removal (and possible reinterpolation) of values from the dataset at pixels for which physical effects such as very strong magnetic fields suggest the unreliability of the velocity calibration.

Doppler Correction algorithms can be applied to Dopplergram datasets of any type (e.g. full-disk, high-resolution, single pixel), producing Dopplergram datasets of the same type. Requisite auxiliary data certainly include the heliocentric and heliographic coordinates associated with each datum. Additional data such as applicable information from Magnetograms may be required.
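As an illustration of the first correction listed above, detrending of the rotation signal can be sketched as follows. This is a hedged example, not SOI pipeline code: the function name, the solid-body rotation model, and the observer geometry (observer in the equatorial plane at longitude zero) are all simplifying assumptions.

```python
import numpy as np

def detrend_rotation(v_los, lon, lat, omega=2.87e-6, r_sun=6.96e8):
    """Subtract a modeled solid-body rotation signal from line-of-sight
    velocities (m/s).  lon/lat are heliographic angles in radians relative
    to the sub-observer point; the observer is assumed to sit in the
    equatorial plane (a deliberate simplification of the real geometry).
    """
    # Line-of-sight component of the rotational velocity at each pixel.
    v_rot = omega * r_sun * np.cos(lat) * np.sin(lon)
    return v_los - v_rot
```

Applied to a full-disk Dopplergram, the residual map would then be passed on to the remaining corrections (limb shifts, magnetic-field shifts, and so on).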
Corrections - Intensity

Corrections - Line Strength

Corrections - Zeeman

Correlation Tracking - Transverse Velocities

Decompression and Decoding (*)

Fourier Transform - Limb Parameters

Fourier Transform - SH Amplitudes (*)

Heliocentric Mapping (*)

Heliographic Mapping (*)

Low-Pass Spatial Filter

Peak Identification - p-modes (*)

Quality Parameters (*)

Ridge Fits - l-nu (*)

Sky Mapping (*)

Spatial Gap Fill

Spherical Harmonic Transform (*)

For the calculation of the frequencies of the eigenmodes of a spherical cavity such as the solar interior it is necessary to decompose the observed amplitude of oscillations into its orthogonal surface spherical harmonic components. These components contain the angular dependence of the spherical harmonics; no direct determination of the radial dependence is possible from observations made on a single surface. Because the surface spherical harmonics form an orthogonal basis set over the two-dimensional surface, the components of an arbitrary vector field over the surface are determined mathematically by the integral of V(θ,φ) Y*lm(θ,φ) dθ dφ. Ordinarily we are concerned either with scalar fields or individual components of the vector field. [problem with line-of-sight decomposition]

In order to perform the spherical harmonic transform it is of course necessary to define and refer to a fixed spherical coordinate system, for which we adopt the Carrington heliographic system. [definition]

The surface spherical harmonics being separable in θ and φ, with simple harmonic dependence on the longitude, it is expected that the integral is numerically performed by separate quadratures in longitude and latitude, the longitudinal quadrature being simply a Fourier transform.
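The separable quadrature scheme just described can be sketched numerically. This is an illustrative, hedged example (the grid sizes, the trapezoid rule, and the restriction to a single m = 0 component are choices made here, not SOI specifications): the longitudinal integral reduces to the m = 0 Fourier term, a mean over φ, and the latitudinal integral is a quadrature against the harmonic with the sin θ area weight.

```python
import numpy as np

def y20(theta):
    """Unit-normalized spherical harmonic Y_2^0 (theta = colatitude)."""
    return np.sqrt(5.0 / (16.0 * np.pi)) * (3.0 * np.cos(theta) ** 2 - 1.0)

def sh_coeff_m0(v, theta, y_func):
    """Project v[i_theta, i_phi] onto an m = 0 harmonic by separate
    quadratures: the m = 0 Fourier term in longitude (a mean over phi,
    times 2*pi), then a trapezoid rule in colatitude with the sin(theta)
    area weight."""
    v_m0 = v.mean(axis=1) * 2.0 * np.pi           # longitudinal quadrature
    f = v_m0 * y_func(theta) * np.sin(theta)      # latitudinal integrand
    return np.sum((f[1:] + f[:-1]) * np.diff(theta)) / 2.0  # trapezoid rule

theta = np.linspace(0.0, np.pi, 401)
nphi = 128
v = 3.0 * np.repeat(y20(theta)[:, None], nphi, axis=1)  # field = 3 * Y_2^0
c = sh_coeff_m0(v, theta, y20)                          # recovers ~3.0
```

Because the harmonics are orthonormal over the sphere, projecting a field built from 3 x Y_2^0 recovers the amplitude 3 to quadrature accuracy.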
[necessity of apodization in both dimensions]

The input datasets can include Dopplergrams and Photograms of any type, but typically the transform would only be applied to the Full Disc, ...

Temporal Gap Fill (*)

20-Minute Filter

2-Dimensional Fourier Transform

Data Product Descriptions

(Data products marked with an (*) are part of the Dynamics Program Pipeline. Description of the TelemetryStream data set, though obviously ancestral to any other derived data product, is not included. Also the ancillary MDI Status and SOHO Ephemeris data sets are not yet described.)

TelemetryStream (*)

RawImage (*)

RawImage data are data which have been mapped from individual pixels associated with their location in the telemetry stream to equivalent positions in the focal plane of the instrument, with no modifications to the data values or interpolation. Data values may represent raw uncalibrated and uncorrected Dopplergrams, Filtergrams, Photograms, Magnetograms, Polarigrams, or other imaged observables. RawImage datasets are produced from TelemetryStream datasets by the Camera Mapping suite.

RawImage data provide the input to any of the Calibration procedures, producing images of observables in the camera coordinate system. They should also be available for inspection and analysis for systematic instrumental effects. Whether they are maintained as permanent data products or generated at the time of use depends on the cost of the mapping algorithms (probably negligible), the relative sizes of the RawImage and the Telemetry datasets (probably comparable), and the complexity of extracting the desired data from Telemetry data (possibly considerable).
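The Camera Mapping step described above amounts to a pure index rearrangement. A minimal sketch, assuming a lookup-table representation (the function name and interface are inventions for illustration, not the SSSC design): each telemetry value is placed at its camera-frame position unchanged, with unfilled pixels flagged rather than interpolated.

```python
import numpy as np

def map_to_focal_plane(telemetry, rows, cols, shape):
    """Copy telemetry values to their focal-plane positions.
    rows/cols give, for the k-th value in the telemetry stream, its
    (row, col) in the camera frame.  Values are copied unchanged --
    no calibration, no interpolation; missing pixels stay NaN."""
    image = np.full(shape, np.nan)
    image[rows, cols] = telemetry
    return image
```

For example, three telemetry values scattered into a 2 x 2 frame leave the fourth pixel flagged as missing.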
Dopplergrams - Full Disc (*)

Dopplergrams represent maps of the line-of-sight component of the photospheric velocity inferred from the Doppler shift of the observed line. The mapping is an orthographic projection centred on the projection axis for the velocities, i.e. the sub-solar point of the observer. Data values are the speeds themselves, calibrated in m/sec (see Units, TN XXX). Additional independent data represent positions of a centroid on the map, with the camera, plate, and heliocentric (and heliographic?) coordinate systems all represented either explicitly or implicitly. The centroid is defined by the centroid of continuum intensity over the pixel, the mean of all intensity-weighted locations within the pixel.

The structure is similar to that of RawImage, except that there are additional identifiers associated with each datum representing the sky, heliocentric (and heliographic?) coordinates of the centroid of the mapped point. The independent coordinates may be converted to a distorted system in order to take advantage of the use of rectangular arrays while minimizing the number of data-points representing positions off the limb. For example, the independent coordinates may be radial position varying from 0 to 1 in equal steps (sky coordinates) and position angle varying from 0 to 360 deg in stepsizes dependent on the radial position. This is To Be Determined.

Dopplergram datasets are typically produced from RawImage datasets of uncalibrated velocities by the Sky Mapping, Heliocentric Mapping, (Heliographic Mapping,) the limited Velocity Calibration, and Velocity Correction suites.
They may also be produced (in certain observing ...

... will be accomplished by an SSSC subsystem(s) here referred to conceptually as a Data Storage and Distribution System, or DSDS.

Providing SOI Data to Scientists

Scientists will have access to selected data and catalogs administered by the DSDS via a DSDS front-end(s). It is planned that the scientists' DSDS interface will assist the scientist in composing queries and browsing the query results, and the scientist will then submit data distribution requests derived from query results. The archive data requested will be sanity checked, retrieved, optionally subsetted or transposed, and packaged in the export format. Then it will be either:

- written to permanent media and sent as a parcel to the scientist,
- transferred to temporary on-line storage for ftp by the scientist (after some time period, this on-line storage is reused by the SSSC), or
- transferred to the user's on-line storage via ftp (or perhaps AFS).

The scientist is notified by email that the data distribution is available.

Plans for the scientist's interface to the DSDS have been documented in detail and prototyped. See SOI-TN-044, Preliminary Functional Description of the SSSC Data Storage and Distribution System; SOI-TN-058, Data Storage and Distribution System Prototype Version 1; SOI-TN-060, Feedback on the DSDS Prototype Version 1; and the SOI DSDS Overview Video.

DSDS Interfaces to other SSSC Subsystems

Other SSSC subsystems will access the data managed by the SSSC via well-defined interfaces that include catalog queries and directives such as read, write and update. Care must be taken to ensure . . .
Technical note SOI-TN-044, Preliminary Functional Description of the SSSC Data Storage and Distribution System, Appendix 3, gives high level protocols for the proposed SSSC directives.

Data Sets

The data sets handled by the DSDS will include the SOI-MDI level 0 telemetry; selected derived scientific data products from level 1, 2 and 3 processing; ancillary data such as instrument operating modes and ephemerides; project documentation; software; contributed data sets; etc. Data sets may contain scientifically or operationally related data: for example, a full disk dopplergram, the spacecraft ephemeris for a time period, or a collection of consecutive low resolution averages from the structure program.

A preliminary list of scientific data sets is given in the PDMP. Several terabytes per year of data sets will be administered by the DSDS. Currently the functions of the DSDS described in this document are general and apply to all types of SOI-MDI data sets. Thus the functions of the DSDS may be described independently of data set sizes, volumes, etc.

The required DSDS performance will be described and modelled in later versions of the Data Management Plan and future documents. For preliminary estimates, see SOI-TN-048, Estimates of SOI-MDI Data Volumes and Reduction I/O Rates.

Data Set Identifiers, Names and Mappings

Each data set handled by the DSDS will have a unique identifier, the SOI ID. Construction of the SOI ID's has not been finalized; serial numbers may be used, for instance.

Each data set will also have a unique name.
The data set naming scheme UU Ds Ihas not been finalized, but an example of a fully expanded data set name eUU@ D p might be: 0UU E4SOHO - SOI - Dynamics Program Calibrated Full Disk e <UU@ Ell8Dopplergram -Level 1 - 7/5/1996 12:01:01 Version 2.1 agTUU` FurIts abbreviation might be: UUfUU` G)SOHO-SOI-CFDV-1-7.5.1996/12:01.01-V2.1 S~UU HLNote that neither of these names is the data set file name. DSDS catalogs IUU HanJwill be used to map from an SOI ID, or various forms of a data set name, UU@ HfiFto files and catalog information maintained internally by the DSDS. UU JedJWhen data sets are exported from the DSDS, for example to an ftp area for UU JGnetwork transfer, file and directory names will be derived from names 6ƪUU JIsuch as in the example above. Exported data will always be accompanied aҪUU J TIby a manifest or catalog detailing the correspondence between a data set uުUU JfuJname and its file name. For more information on SOI naming, see SOI-ꪛUU@ J- 8TN-038, Preliminary Strategies for SOI Data Set Naming. llUU` Mvel Metadata 9 UU OioLThe term metadata is used to describe data associated with a data set. for ,UU O-7Jexample its name, dimensions, data type, time of creation, etc. There 8UU O fMare several classes of metadata: some metadata is very specific to the data ODUU OorPset and highly likely to be required for processing (e.g. the data set name); ntPUU Oy Aother metadata may apply to many data sets and only sometimes be t\UU OleHneeded for processing (e.g. the spacecraft position associated with the cthUU OdeIobservable from which the dataset is derived); and some metadata is only xtUU OlwKloosely associated with the dataset and rarely needed for processing (e.g. theUU Otw>the detailed instrument command sequences associated with the UU@ Oat/observable from which the dataset is derived). 8TNUU QStPIn the DSDS, metadata of the first type is termed embedded metadata and OUU QadLis stored physically with the dataset (e.g. as a file header). 
Metadata of the second and third types is termed catalog metadata and is stored in catalogs, physically separately from the datasets. Catalog metadata may obviously also include some embedded metadata, such as the data set name. The catalogs are designed to facilitate access to the most frequently needed catalog metadata.

Data Product Formats

It is presently planned that the data product export format to scientists, and the delivery format from the SSSC to the NSSDC, will be FITS with SFDU's. It is planned that the SSSC reduction system will generate FITS files; the SSSC DSDS will add necessary SFDU information prior to SSSC archival; and the data products transferred to the NSSDC or scientists will be in export format.

Types of Data Sets

The DSDS will manage three types of data sets:

- intermediate data sets
- data products
- archive products

Intermediate data sets are produced by the SSSC reduction, mission support, interactive analysis or SSSC operations and are short-lived, temporary data sets. To preserve an intermediate data set, it is reclassified as an archive product or a data product.

Archive products are very long-lived and are carefully constructed and maintained. At least two copies on permanent media are made at the SSSC, and at least one copy is retained at the SSSC. Ultimately the collection of all archive products will comprise the SOI archive to be maintained by the NSSDC. Archive products are written in the SOI export format.

Data products are longer lived than intermediate data sets, but for various reasons are not deemed to be archive products.
Examples might be artificial data for testing, or preliminary data obtained from ground-based instrument tests. At least two copies of data products will be written to permanent media and these will be retained at the SSSC. Data products are written in the SOI export format.

Catalog Operations

All SSSC data administered by the DSDS will be cataloged. Current plans are to use a commercial relational DBMS(s) to implement the catalogs, which will be populated, accessed and manipulated using an ANSI/OSI compliant Structured Query Language (SQL). The catalogs will express relationships between data and metadata, and relationships between metadata.

The SSSC will use the catalogs for internal operations. For SSSC operations, generally any SQL construct to query, create, update or delete the catalog information will be permissible. (Some catalog information will, of course, be read-only or have special access restrictions.)

SSSC Catalog Types

The following categories are representative of the types of catalogs envisioned to support DSDS catalog, data management and operations functions. Most of the query support catalogs will be populated by the SSSC reduction and analysis systems. (Many additional catalogs will also be needed at the SSSC for operations, mission support, reduction and analysis, but they are not directly related to DSDS functions and are not described here.)

These categories and examples for each category are described in much more detail in SOI-TN-044, Preliminary Functional Description of the SSSC Data Storage and Distribution System.
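The catalog operations described above can be sketched with SQL. Here an in-memory SQLite database stands in for the planned commercial relational DBMS, and the table and column names are hypothetical, not the SSSC schema.

```python
import sqlite3

# In-memory SQLite as a stand-in for the commercial relational DBMS;
# table and column names are illustrative only.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE data_set (
    soi_id   INTEGER PRIMARY KEY,   -- the unique SOI ID
    name     TEXT NOT NULL,         -- the (unfinalized) data set name
    dms_path TEXT,                  -- physical location in the DMS
    created  TEXT)""")
db.execute("INSERT INTO data_set VALUES (1, 'example-dopplergram',"
           " '/dms/stage2/ds0001', '1996-07-05')")
# A query-support lookup: map an SOI ID to a name and physical location.
row = db.execute("SELECT name, dms_path FROM data_set WHERE soi_id = ?",
                 (1,)).fetchone()
```

Any standard SQL construct could be issued against such catalogs, subject to the access restrictions noted above.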
Dictionaries/General SOI Information

examples:
- SOI Archive dictionary -- high level SOI archive contents
- Instrument dictionary -- instrument modes
- Data product dictionary -- definitions

Data Catalogs

examples:
- Archive product -- definition
- Archive product -- reference time(s)
- Archive product -- ancillary data

Data Management System Catalogs

examples:
- Data set -- physical location(s) in DMS
- DMS media -- status (e.g. 50% full)

Archive Maintenance

examples:
- Archive increment list -- archive increments

SSSC DSDS - NSSDC Handoff

examples:
- Archive increment -- contents
- Archive increment -- statistics (e.g. QC results)

Distribution Requests

examples:
- DSDS user -- info (e.g. email address)
- Requests -- archive products requested
- Requests -- statistics
- Queries -- list

Data Management System

The data management system (DMS) refers conceptually to a mass storage management system(s) that stores, retrieves and migrates data across a hierarchy of physical media. [ref IEEE Mass Storage System Model] Thus the DMS is a (conceptual) subsystem or component of the DSDS, and is as well a subsystem of the SSSC reduction and analysis systems. The DMS uses the DSDS catalogs to map from data set descriptions to their physical location(s).

At any time, archive products may be stored on a variety of physical media managed by the DMS, depending on age, frequency of access, etc. For example, mode frequency tables will always be on media that can be accessed quickly.

The DMS manages storage as a hierarchy. Each media volume has a capacity limit, given in some measure. Assume stages 1...n, where stage 1 has the best performance (e.g.
1 = RAM, 2 = mag disk, 3 = optical disk, 4 = tape). Data migrates across this hierarchy when capacity limits are reached. This migration is managed by the DMS and recorded in catalogs managed by the DSDS (possibly the DMS has private catalogs).

See SOI-TN-044, Preliminary Functional Description of the SSSC Data Storage and Distribution System, for a more detailed description.

Data Analysis Environment

The data reduction and analysis procedures are assembled from individual modules at each of four distinct levels. This is done partly to make the task of designing and implementing the processing pipelines simpler by isolating the component parts. By requiring well-defined interfaces between the modules at different levels, however, this approach also allows us to provide a set of data analysis services that can be connected to a variety of available user environments. The detailed specifications of the modules and interfaces, which depend strongly on the design of the data and control structures, will be found in the Data Management Plan. Here, we provide a motivational description.

At the lowest level are library routines implementing specific algorithms, for example transforms, mappings, and calibrations. These routines can be written by scientists or programmers with a detailed knowledge of the algorithms and computational efficiency but with minimal knowledge of the data structures and forms of data organization. The interface to these routines, as with any standard library routines, is exclusively through their calling arguments, and their specifications and descriptions can be provided in the form of a unix man() page. The calling arguments are presumably either constants (in the case of parameters) or pointers to data for inputs and outputs.
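A hypothetical library-level routine in this spirit might look like the following sketch (the routine and its low-pass behaviour are inventions for illustration): its entire interface is its calling arguments, input data, output data, and a scalar parameter, with no knowledge of data sets, catalogs, or storage.

```python
import numpy as np

def lowpass(data, out, cutoff):
    """Hypothetical library-level routine: a generic 1-D low-pass filter.
    The interface is exclusively the calling arguments; the routine knows
    nothing about where the data come from or where they go."""
    spec = np.fft.rfft(data)
    spec[np.fft.rfftfreq(data.size) > cutoff] = 0.0
    out[:] = np.fft.irfft(spec, n=data.size)
    return 0   # status code, in the manner of a C library function
```

A compute module (next level up) would be responsible for unpacking a standard dataset structure into exactly these arguments.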
Allowance should be made for a small set of well-defined specific argument types, such as codes and pointers to files for error-processing for example, or to function-specific data structures such as would be defined in a c-language include file. The intent, however, is to keep the number of such arguments small, in the spirit of a true general-purpose library function. Likewise, an attempt should be made to make such library level routines as general as possible. It is not appropriate, for example, to implement separate Fourier transform routines at this level for transforms of spatial and temporal data. On the other hand, it is entirely appropriate to provide a suite of different Fourier transform routines applicable to different sizes of data or to data of specific symmetries, the choice of routine to be used resting at some higher level.

Functions at the next level up may be termed the compute modules. These modules are in essence the real building blocks of any analysis pipeline. The compute modules should take some (to be) well specified set of data and control structures and from these generate as necessary the appropriate argument lists to the set of library functions determined by the control information. They are in effect wrappers, binding the library functions to a standard data structure. Implementation of these modules should require no detailed knowledge of the algorithmic structure of the underlying functions, but does require detailed attention to the specification of their calling arguments, return values, and their exception handling behaviour.
Likewise, a full specification of the standard dataset structure and control structure on which the compute modules operate should provide a complete description of the calling interface of these modules. The implementer of the compute module need have no concern for operating system and data storage system details such as where the data come from or where they go.

The functions at the third level assemble appropriate datasets, parameters, and compute modules; they represent instances of actual analysis pipelines. These modules, which might be termed the command modules, provide a mapping from the actual operating data systems to the data structures required internally by the compute modules. Basically they provide both a classic Input/Output wrapper to the compute modules and the necessary control mechanisms for the calling sequence. It is at this level that operating system dependencies exist. Similarly, this is the level at which strategic decisions regarding the construction of analysis pipelines are made, e.g. whether data are "pushed" or "pulled" through the pipeline, and whether tasks are performed sequentially or concurrently. [N.B. It may make sense to separate these two functions, i.e. the I/O wrap and the strategy, into two distinct levels.] These tasks are invoked with parameters and with data identifiers, such as filenames or sockets. In essence the command modules are the equivalent of unix shell commands, and they are expected to be useful in the same way, that is by users constructing sequences of commands with flags and data names.

The highest level functions provide the interfaces to various standard analysis and operating systems or to specially designed user tools.
These are interface modules, and except in special cases (the design of a standard processing pipeline for example) it is not intended that such modules will be provided or supported by the Center. Rather we will provide well-documented and supported stub interfaces to the command functions, so that a designer need only implement the command structure in the target user interface system. Examples of such user interfaces would certainly include a unix operating system interactive shell itself (which presumably would in fact be implemented), virtual operating systems such as IRAF, OSF Motif and other applications built directly on the X client/server model, graphical user interface applications such as TAE and Explorer, and specialized graphical analysis packages such as IDL.

SSSC Operations

The SOI Science Support Center will require a significant operations infrastructure to schedule and monitor the SOI reduction in a daily production mode, and to ensure that other SSSC needs are met for local analysis and mission support.

These functions will utilize substantial storage, computing and (depending on the architecture) network resources. While some of these functions will be well determined and repetitive, scheduling changes must be accommodated if, for example, problems with the instrument arise and different analyses must be performed.

In order to schedule, control and monitor SSSC resources, resource usage and status information must be provided from many subsystems in the course of daily operations.
It is essential that the operations staff be able to schedule and reschedule SSSC activities, assess the status of scheduled and ongoing activities, be made aware of anomalies, and track current and historic storage and computing usage. The extent to which the SSSC operations scheduling will be automated is under study, but many operations functions must be automated, such as the aging of old data sets from on-line to near-online storage. It is anticipated that much of the status information will be stored in an operations database (relational) and accessed using SQL and other standard tools.

High Level Operations Capabilities

To illustrate required SSSC data operations capabilities, a scenario is used: tracking some aspects of telemetry receipt and Level 0 reduction.

- Generate and examine schedules for telemetry processing.
- Track progress of telemetry processing.
- Track classes of errors in telemetry processing.
- Quickly identify critical situations in telemetry processing (e.g. hardware failure).
- Initiate telemetry processing.
- Ability to allocate/negotiate needed disk storage and CPU time for telemetry processing.
- Ability to determine if expected telemetry has been received from JPL/GSFC.
- Monitor how much of the telemetry has been quality checked.
- Find out the error rates from quality checking.
- Review and specify how error conditions are handled.
- Review and specify discard policy.
- Request retransfer of bad data.
- Determine how much of the telemetry has been read and sorted.
- Handle telemetry gaps.
- Check telemetry against expected telemetry contents from associated command sequences. Also subrasters, or other intended anomalies.
- Catalog processed telemetry as LZD.
- Ability to reconcile telemetry with real-time telemetry previously received at EOF.
- Make archive copies and QA them.
- Confirm receipt with JPL/GSFC, only after safe archive copies are made.
- Ability to reprocess, recatalog, and correctly handle (delete at times) old versions.

It is clear from this scenario that various operational needs, such as the ability to track the telemetry processing, require that the processing functions periodically provide status information to an operations and monitoring subsystem, which can then display the information and (if desired) store the information in a database. See SOI-TN-046, Preliminary Functional Description of the SSSC Operations Interfaces, for more information.

An Overview of the SOI Science Support Center - 7 August 1992
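The status-reporting loop described in this section can be sketched as follows, with SQLite standing in for the operations database and all names hypothetical: processing functions post status rows, and the monitoring subsystem queries them for anomalies.

```python
import sqlite3
import time

# SQLite stands in for the planned relational operations database;
# subsystem and activity names are illustrative only.
ops = sqlite3.connect(":memory:")
ops.execute("CREATE TABLE status "
            "(subsystem TEXT, activity TEXT, state TEXT, posted REAL)")

def post_status(subsystem, activity, state):
    """Called periodically by a processing function to report status."""
    ops.execute("INSERT INTO status VALUES (?, ?, ?, ?)",
                (subsystem, activity, state, time.time()))

post_status("level0", "telemetry-sort", "running")
post_status("level0", "telemetry-sort", "error")
# The monitoring subsystem can now display anomalies and track progress.
errors = ops.execute(
    "SELECT subsystem, activity FROM status WHERE state = 'error'").fetchall()
```

The same table supports the historical usage tracking mentioned above, since every row carries a timestamp.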
An Overview of the SOI Science Support Center

R. S. Bogart, R. I. Bush, V. L. Johnson, and P. H. Scherrer

SOI Technical Note TN-079 - DRAFT

Apologia

This document provides a conceptual overview of a design for the SOI Science Support Center. Its purpose is to provide a set of guidelines within which prototyping will take place, with the immediate goal of developing detailed conceptual designs for both the structure and operations of the Center. The intent is to describe a set of functional requirements solely from the standpoint of the needs of scientific analysis, with minimal reference to the constraints imposed by particular implementations. The first section describes both the philosophy of management and the organization of the SOI team within which the development efforts will take place.

The Science Support Center has three principal components with different purposes. First, a Flight Operations Component must support the actual operation of the MDI instrument throughout the observational period, as well as providing communications with the SOHO Experiment Operations Facility and support for the Science Working Team. Second, a Data Reduction and Analysis Component provides support for the actual analysis of observational data by providing both a set of specified data products based on well-defined and carefully controlled analysis procedures, and help in the further analysis of data by individuals in the form of a limited collection of analysis procedures and tools for data access.
Finally, a Data Archive Component serves to organize and manage all the data produced during the mission for use by the Science Team, to provide catalogues and search mechanisms for the data, and to fulfill requests by both team members and guest investigators for online and offline access to data. These three components are described in sections 3, 4, and 5, respectively.

The successful integration of the SSSC components necessarily involves the creation of an analysis environment that is suitable to the envisioned operations of the entire data analysis effort. Given the required team development effort as outlined below, it is an express goal to design the analysis environment to fit easily into a wide array of available and suitable analysis tools. The way in which we plan to do so is outlined in section 6. Finally, section 7 describes in outline the operations of the SSSC.

This overview provides an introduction and outline to three separate detailed implementation documents: the Mission Operations Plan (sec. 3), the Data Analysis Plan (sec. 4), and the Data Management Plan (secs. 5 and 7). Together these documents will replace the Preliminary Data Management Plan as the description, requirements, and specifications for the components of the SOI Science Support Center.

SSSC Management and Team Organization

[to be supplied: Scherrer]

Flight Operations Component

[to be supplied: Bush]

Data Reduction and Analysis Component

The primary purpose of the SOI Science Support Center is to provide a well-defined set of data products based on MDI observations. Except for "raw" data, the definition of a data set can be thought of as a description of the processing algorithms and the input data sets used to produce it.
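The idea that a non-raw data set is defined by its processing algorithm together with its input data sets can be sketched as a small data structure. This is an illustrative sketch only; the class and field names are hypothetical, not part of the SSSC design.

```python
from dataclasses import dataclass
from typing import Tuple


@dataclass(frozen=True)
class DataSetDefinition:
    """Hypothetical: a data set defined, as in the text, by the
    algorithm that produces it and the definitions of its inputs.
    A "raw" data set is one with no inputs."""
    name: str
    algorithm: str  # e.g. "Calibration - Velocity" (illustrative)
    inputs: Tuple["DataSetDefinition", ...] = ()

    def lineage(self):
        """Yield every definition this data set depends on,
        depth-first, so provenance can be traced back to raw data."""
        for d in self.inputs:
            yield from d.lineage()
            yield d


raw = DataSetDefinition("RawImage", "raw")
dopp = DataSetDefinition("Dopplergram", "Calibration - Velocity", (raw,))
names = [d.name for d in dopp.lineage()]  # -> ['RawImage']
```

Making the definition immutable (frozen) reflects the text's emphasis on carefully controlled procedures: a product's definition is fixed; a changed algorithm yields a new definition.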
Many processing algorithms are to a great extent generic and applicable to the production of multiple data sets. This section accordingly is subdivided into a set of algorithm descriptions and a set of data product descriptions. In both cases these descriptions are introductory, providing a scientific rationale for our choice of supported data products and a general view of dependencies. Detailed descriptions, at the level of at least prototypical specifications, will be found in the Data Analysis Plan.

Algorithms

(Algorithms marked with an * are part of the Dynamics Program Pipeline.) Description of the Decompression and Decoding algorithms, though obviously included in any complete line of processing, is not included here.

Calibration - Magnetic

[Description of algorithm here]

None of the calibration and correction algorithms have an implicit structure or type for the input data, as the algorithms can reasonably be applied to the conversion of a single datum (as opposed to transforms, for example). Nonetheless, in the ordinary course of most analysis pipelines, RawImage data would be expected for the calibrations, and the output dataset would be a Magnetogram of a type determined by the auxiliary input. Required auxiliary input data are the spatial pixel location (for optical geometric effects), the local time of the observation (for instrument status), and the MDI Status (for filter characteristics).
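The per-datum calling convention described above can be sketched as follows. All names are hypothetical and the calibration arithmetic is a placeholder; the point is only that the function takes one datum plus the three auxiliary inputs named in the text, so callers may map it over any input structure.

```python
from typing import NamedTuple, Tuple, Dict


class Auxiliary(NamedTuple):
    """The auxiliary inputs named in the text; field types are
    illustrative assumptions."""
    pixel_xy: Tuple[int, int]  # spatial pixel location (optical geometry)
    local_time: float          # local time of observation (instrument status)
    mdi_status: Dict           # MDI Status (filter characteristics)


def calibrate_magnetic(datum: float, aux: Auxiliary,
                       gain: float = 1.0, offset: float = 0.0) -> float:
    """Placeholder per-datum calibration. It assumes nothing about the
    structure of the input, per the text: the same function serves a
    full image, a group of pixels, or a single isolated datum."""
    # A real implementation would derive gain/offset from aux;
    # here they are constants purely for illustration.
    return gain * (datum - offset)


aux = Auxiliary(pixel_xy=(512, 512), local_time=0.0, mdi_status={})
image_row = [10.0, 12.0, 14.0]                      # pixels in an image
calibrated = [calibrate_magnetic(d, aux) for d in image_row]
single = calibrate_magnetic(10.0, aux)              # an isolated datum
```

The structure-independence is what the text calls out: the dataset type (RawImage in, Magnetogram out) is a property of the pipeline, not of the calibration function itself.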
Calibration - Temperature

Calibration - Velocity (*)

The Velocity Calibration algorithm(s) provides for the conversion of velocity data from one measurement basis to another, specifically from sets of measured line intensities in known filter positions to the equivalent line-of-sight velocities required to produce a Doppler shift that would best reproduce the measured line profile. Velocity calibration thus rests on both models for the formation of the line and the measured or modeled response function of the filter system. In the ordinary case, the essential part of the velocity calibration is performed by the instrument in space, and the only information received in RawImages in velocity mode is very nearly calibrated velocity. True velocity calibrations must be performed from filtergram data, but these are required only during campaigns in which filtergrams may be observed or in case of failure of the onboard calibration processing (degraded science operations). In the latter case, the velocity calibration is based on the onboard calibration, with the same corrections as noted here. It thus makes sense to break the standard Velocity Calibration algorithm into two parts: the basic calibration, which is assumed generally to be performed on board the spacecraft, and a recalibration algorithm performed on the results of the first calibration. Development of the former is only necessary for use in verification experiments and as support for possible degraded science observations.

The Velocity Calibration operates on a per-datum basis, independent of the structure of the input dataset. Typically the data will represent pixels in an image, but they may also represent groups of pixels or isolated pixels.
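The two-part decomposition described above (onboard basic calibration, then ground recalibration) can be sketched as a composition of two functions. The phase-like estimator and the linear recalibration below are purely illustrative stand-ins, not the actual MDI algorithms; only the two-stage structure comes from the text.

```python
import math


def basic_calibration(intensities, filter_positions):
    """Stand-in for the onboard part: reduce measured line intensities
    at known filter positions to one encoded Doppler value. A real
    instrument uses its own estimator; this phase fit is illustrative."""
    num = sum(i * math.sin(p) for i, p in zip(intensities, filter_positions))
    den = sum(i * math.cos(p) for i, p in zip(intensities, filter_positions))
    return math.atan2(num, den)


def recalibration(encoded, scale=1000.0, passband_shift=0.0):
    """Stand-in for the ground part: map the encoded value to a
    line-of-sight velocity [m/s], with a (hypothetical) correction
    for filter passband variations."""
    return scale * encoded - passband_shift


def velocity_calibration(intensities, filter_positions):
    """Two-part structure from the text: the basic calibration is
    normally done on board; on the ground only recalibration runs,
    applied to the onboard result."""
    return recalibration(basic_calibration(intensities, filter_positions))


v = velocity_calibration([1.0, 1.0], [0.0, math.pi / 2])
```

Keeping the stages separate is what makes degraded-science operations tractable: if the onboard stage fails, a ground implementation of `basic_calibration` can be substituted without touching the recalibration step.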
[Primary calibration: same as onboard; refer to IPFR]

The secondary calibration, or recalibration of Doppler data, provides a one-to-one mapping of encoded Doppler values to floating-point velocities [m/sec] or MISSING, and corrects for known or inferred variations in the filter passbands due to changes in instrument state or gradients across the field. Details of the latter are To Be Determined. [mapping in IPFR?]

[...] In general most observations would presumably be performed with the CCD as near to the focal plane as possible in the high-resolution mode, and at a constant offset in the full-disc mode, so that in the latter case this effect is definitely to be included.

The Camera Mapping algorithm is performed on all raw data direct from the CCD camera prior to any other mappings. It is not performed when individual pixel behaviour is being assessed, nor is it performed on most "raw" data processed on board (Structure Program), except for possible correction of such data representing images. The information from the first part of the mapping, the column and row number of each pixel, must be retained through further processing steps for possible corrections based on CCD characteristics.

As a mapping scheme based on camera readout order, this algorithm is designed to work with an input dataset of type TelemetryStream, with certain essential auxiliary input: the camera readout order for the relevant frame (which can be associated with the input dataset) and a frame identifier which is associated with the input data. It also requires certain permanent or semi-permanent parameters specifying the plate geometry, the plate angle, and the focal position; the last presumably comes from instrument housekeeping data, while the others must be measured or inferred from analysis.
The output dataset is by definition of type RawImage, although in the case of single pixel or other special input the RawImage will not be one of the standard forms.
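The core of the Camera Mapping step (placing each transmitted value at its (column, row) position and retaining that position for later CCD-based corrections) can be sketched as below. The representation of the readout order as a list of linear CCD addresses is an assumption for the sketch; the real ordering is instrument-defined and carried as auxiliary input with the frame.

```python
def camera_mapping(stream, readout_order, width):
    """Illustrative Camera Mapping: place each value from a telemetry
    stream into a RawImage-like 2-D array according to the camera
    readout order. readout_order[i] is assumed to be the linear CCD
    address of the i-th transmitted value. The (column, row) of each
    datum is returned alongside the image, since the text requires it
    be retained for possible corrections based on CCD characteristics."""
    height = (max(readout_order) // width) + 1
    image = [[None] * width for _ in range(height)]
    col_row = []  # per-datum pixel location, kept for later steps
    for value, addr in zip(stream, readout_order):
        col, row = addr % width, addr // width
        image[row][col] = value
        col_row.append((col, row))
    return image, col_row


# Four values arriving out of raster order on a hypothetical 2x2 frame:
img, locs = camera_mapping([7, 8, 9, 10], [2, 0, 3, 1], width=2)
# img == [[8, 10], [7, 9]]; locs records (column, row) per datum
```

Because the mapping runs before any other processing, downstream steps can treat the result as a positionally meaningful RawImage while still tracing each pixel back to its CCD address.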