Programs
========

Held in: $HOME/bin

The contents of bin/ are (those marked with an * are deprecated):

CRON_DAILY*           cron csh script for running the daily telemetry fetch
DAILY*                csh script for running the daily telemetry fetch
DR                    csh script for fetching data recorder telemetry
LATE*                 csh script for running the late telemetry fetch
MERGE                 csh script for running the daily telemetry fetch
REAL                  csh script for the realtime telemetry fetch
SOCKFILE_LATE         socket description for non-realtime ops
SOCKFILE_REAL         socket description for realtime ops
ccsds_check           dumps all ccsds packet headers
ccsds_md_check        dumps mission data ccsds packet headers
ccsds_md_hdr_dump.pl  dumps mission data headers
ccsds_md_splice       joins mission data archives together (incomplete)
ccsds_md_split_check  determines which mission data archives are split
ccsds_time_check.pl   time check for ccsds packets
ccsds_time_check1.pl  ditto
ccsdshdrs             dumps the first & last ccsds packets into a file
                      called ccsds.txt in the current directory
cron_daily_status.pl* daily telemetry processing controlled by cron
daily*                daily telemetry fetch
daily.sh*             daily telemetry fetch
daily_merge           daily pipeline telemetry processing controlled by cron
daily_passes          telemetry fetch of all data recorder contacts for a
                      given day
daily_pdfs            produces daily plots of status (housekeeping) data
daily_pdfs.sh*        bash wrapper for the above
daily_status.pl*      daily status checking
djpeg                 jpeg decompression for mission data
dpcmtopnm             dpcm decompression for mission data
foo.pl                test program
htmlify.pl            turns ascii text into html
late                  C program for fetching telemetry from Sirius (called
                      from wrapper programs)
mdhdrs                dumps mission data headers into a file called
                      hdrs.txt in the current directory
mem2ascii             C program for dumping memory data
mem_recon.pl          converts memory packets into source form
memxtract             C program for extracting memory data
merge                 C program for getting merged telemetry from Sirius
                      (called from wrapper programs)
mkdartsdirs           cron script for making directories in DARTS
parse_contact_file    perl script for parsing the contacts file into
                      executable lines (for the sdtp program)
pass_list_cron.csh*   cron job for getting the contacts file
pass_list_cron.pl     cron job for getting the contacts file
prgram_hexit.pl*      generates program hex files
real                  C program for getting realtime telemetry from Sirius
                      (called from wrapper programs)
sbtime                link to J-side supplied time conversion
sdtp                  wrapper program for accessing the data distributor
                      and Sirius
sirius                wrapper program for accessing the Sirius telemetry
                      provider system

The source code for most programs is in: $HOME/src

Local directory structure
=========================

The root directory for local data is: $HOME/work/localdata/

Under the root directory are:

log/
pdfs/
pipeline/
sdtp/

All telemetry is stored under the sdtp directory:

cal/
ccsds/
cron/
daily/
decompressed/
dmp/
dr/
fits/
late/
md/
merge/
plots/
real/
scratch/
status/

The decompressed directory holds all ccsds mission data packet archives
after bit and image decompression has been performed - useful for a quick
look at the mission data. Sub-directories of the form yyyymmdd have been
made to keep things tidy - these are temporary and should be deleted from
time to time.

The fits directory holds:

mission/
status/
scratch/

The mission directory is the default directory for making mission data
fits files with xmkfits. To keep things organized, directories of the form
yyyymmdd have been created inside the mission/ directory. These are all
temporary and can be (should be) deleted from time to time so that we do
not run out of disk space.

The status/ directory holds the housekeeping fits data and is organised
along the same lines as the mission/ directory.

The scratch/ directory is for temporary files etc. and its contents should
be deleted regularly.

The cal/ directory could contain calibration archives - it is empty at the
moment.
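As a sketch, the tree above can be recreated with mkdir -p. The list of
sub-directories is taken from the text and should be checked against a
working installation before relying on it:

```shell
#!/bin/sh
# Sketch: (re)create the local data tree described above.  The directory
# list mirrors the text; verify it against a live system.
root="$HOME/work/localdata"

mkdir -p "$root/log" "$root/pdfs" "$root/pipeline"

for d in cal ccsds cron daily decompressed dmp dr late md merge \
         plots real scratch status fits/mission fits/status fits/scratch
do
    mkdir -p "$root/sdtp/$d"
done
```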
DARTS directories
=================

The DARTS directories are organised as follows:

$HOME/data/mission/yyyy/mm/dd
$HOME/data/status/yyyy/mm/dd

A staging area has also been made to hold temporary files for external
download:

$HOME/data/staging/
$HOME/data/staging/mission/
$HOME/data/staging/status/

Files are moved or copied to these areas in the normal way.

Daily operations
================

1. Getting the data
-------------------

To get the data generated on the previous day, on pg1 type:

[sbukeis@pg1 20061122]$ daily_passes yyyymmdd

where yyyymmdd is the previous day's date. For example:

[sbukeis@pg1 20061122]$ daily_passes 20061122

will fetch all the data recorder dump telemetry for 22 November 2006. The
program will fetch data for any day as long as there is a contacts file
(see later) available. The data are stored as ccsds packets.

The program will create 5 new directories (if they do not already exist):

~/work/localdata/sdtp/md/yyyymmdd
~/work/localdata/sdtp/status/yyyymmdd
~/work/localdata/sdtp/fits/mission/yyyymmdd
~/work/localdata/sdtp/fits/status/yyyymmdd
~/work/localdata/sdtp/plots/yyyymmdd

It will then look for a contacts file in ~/work/pass_list with the name
yyyymmdd_contacts.txt and read in the daily data recorder dump contacts.
For each contact it finds, the program starts the sdtp program, which gets
the telemetry from the distributor. All the telemetry packets are written
to the directory:

~/work/localdata/sdtp/dr

Once all the data recorder dump passes have been accessed, the
daily_passes program moves the data to the newly created directories:

mission data (eis_md_*) files are moved to sdtp/md
status data (eis_sts*_) files are moved to sdtp/status
memory dump data (if any) are moved to sdtp/status

2. Checking housekeeping data
-----------------------------

NOTE: due to irregular solarsoft updating and the ongoing development of
some IDL code, always cd to $HOME/work/idl and run SSWIDL from there.

If not running, start sswidl in the $HOME/idl directory. In IDL, run:

auto_mk_fits,'$HOME/work/localdata/sdtp/status/yyyymmdd','$HOME/work/localdata/sdtp/fits/status/yyyymmdd/'

(substituting the date used to run daily_passes for yyyymmdd).

To make the daily plots, in IDL run:

eis_daily_plots,'$HOME/work/localdata/sdtp/fits/status/yyyymmdd/','$HOME/work/localdata/sdtp/plots/yyyymmdd/'

To get an ascii listing of the status values, in IDL run:

eis_daily_check,'$HOME/work/localdata/sdtp/fits/status/yyyymmdd/'

This will print out a selection of housekeeping parameters. More later.

The above has been superseded by the command:

sts_ql,'yyyymmdd'

3. Checking the mission data, part 1
------------------------------------

Once the data have been retrieved, change directory to where daily_passes
has put the mission data:

cd ~/work/localdata/sdtp/md/yyyymmdd

The mission data files may or may not be complete, depending on what EIS
was doing during the data recorder dump. To see which archives are
complete, type:

ccsds_md_split_check .

This will print a list of files and instructions on what to do with them.
For example:

standalone ./eis_md_20061122_0641069187
join 1 ./eis_md_20061122_0645434307 ./eis_md_20061122_0809562667 ./eis_md_20061122_0809566143 ./eis_md_20061122_0819193253
discard 3 ./eis_md_20061122_1310445379 ./eis_md_20061122_1448482002
discard 3 ./eis_md_20061122_1448494795

The files marked standalone are complete and are ready for further
processing. The discard files are mission data files which do not start
with the beginning header packet, or are pixel data packets found after a
completed archive. The join archives, once joined together, will give a
complete data set. There is a problem here, as sometimes the first few
packets of a following archive are repeats of the last few of the previous
archive.

This information can be saved to a file using a redirect on the command
line.
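The saved output can then be processed by hand or with a small helper. The
sketch below assumes each join group appears on a single line of the
ccsds_md_split_check output ("join <n> file1 file2 ..."); check this
against the real output format before using it:

```shell
#!/bin/sh
# Sketch: join each "join" group from ccsds_md_split_check output read on
# stdin, writing the result to join/ under the first file's name.
# Assumes one group per line: "join <n> file1 file2 ..." - an assumption,
# not a documented format.
mkdir -p join
while read action count files; do
    [ "$action" = "join" ] || continue
    set -- $files                        # split the file names
    cat "$@" > "join/$(basename "$1")"   # joined archive keeps first name
done
```

Usage would be something like, from the sdtp/md/yyyymmdd directory:
ccsds_md_split_check . | sh join_groups.sh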
This information will be the input for ccsds_md_splice.pl, which will join
archives together if needed and move them to a directory specified on the
command line. This program has not yet been finished.

A workaround is to create a directory in the sdtp/md/yyyymmdd directory
(called join, for example) and at the command line issue the following
command:

cat file1 file2 ... > join/file1

(where file1, file2, etc. are the names of the files following the join
command in the output from ccsds_md_split_check).

The mdhdrs program will create a file called hdrs.txt in the current
directory containing information about the mission data archives. The
number for the sequence id (SiD) will be the same as the study id number
generated by the planning tool/cpt.

4. Decompressing the mission data
---------------------------------

To decompress the complete archives, in a sswidl window run:

eis_md_decomp,'$HOME/work/localdata/sdtp/md/yyyymmdd/filename'

This will write decompressed files (still in ccsds format) to:

$HOME/work/localdata/sdtp/decompressed

where they can be displayed using xfiles.

If anything goes wrong with the decompression of any file then it is
important to remove any output written before the program failed - remove
the partially decompressed file from the sdtp/decompressed/ directory.
This is because the output file is appended to during the decompression
process, so performing multiple decompressions of the same file will
result in incorrect output files.

5. Creating mission data fits
-----------------------------

In an sswidl session (for now in the $HOME/idl directory), run:

xmkfits

This will bring up a gui. The default source (ccsds archives) telemetry
catalog is $HOME/work/localdata/sdtp/decompressed/. The default output
directory is $HOME/work/localdata/sdtp/fits/mission/. Select a file from
the telemetry catalog list and press the 'Create FITS' button. If all goes
well a fits file will be written to the output directory. A message will
appear in the message box giving information about the new filename.

6. Pass lists
-------------

Go to the directory ~/work/pass_list. To create a new contacts file, type:

parse_contact_file yyyymmdd pass_list.txt [> yyyymmdd_contacts.txt]

This creates a text file which you can use as input to daily_passes or as
individual commands.

The pass list is copied from the eisco machine by a cron job (created by
someone - Narukage, I think). The cron job also parses the new pass_list
and creates the contact files for the next 6 days' contacts.

7. Getting the mission data, part 2
-----------------------------------

The daily_passes program is a convenience for getting the telemetry data.
Other programs are used for finer control of the fetch process.

The sirius program accesses the distributor in the same way as
daily_passes but does not create new directories or move data around.

Usage: sirius start_date end_date start_time

Example:

sirius 20061122 20061123 0845

will get all mission data, status and memory dump packets for 24 hours
from 0845, starting from the 22nd of November. This program accesses the
Sirius telemetry database and uses the merge telemetry stream, which means
that only data received a week previously or longer can be fetched.

The sdtp program is the low-level telemetry fetch program and is used to
get data from the merged, late buffering or realtime contacts telemetry
streams (merged is the week-old data; late buffering and realtime contacts
can be accessed from about half an hour after a pass).

Usage:

sdtp [real|late|merge|dr] ant [band=1|2|3] sdate=yyyymmdd [edate=yyyymmdd] stime=hhmm etime=hhmm [verbose] [-n] [help]

The command line arguments can be in any order.

Options:

[real|late|merge|dr] specifies which telemetry stream you want. merge is
the week-old data, properly sorted and spliced together (in the case of
large mission data sets). real is for realtime contacts, dr for data
recorder dump passes. late can access semi-merged data.

ant is currently one of [usc20|usc34|sval|gna|gnb|msp1|allant].
For the real, late and dr options the ant option must agree with what is
specified in the contacts file in $HOME/work/pass_list. For example, from
$HOME/work/pass_list/20061122_contacts.txt:

sdtp dr band=2 sval sdate=20061122 stime=0006 etime=0016

this pass is a data recorder dump pass using the Svalbard ground station.
Substituting real instead of dr will not get any data.

band=[1|2|3] specifies which data stream you want. 1 is S-band (usually
only status and memory dump packets), 2 is X-band (usually mission data
and status packets), 3 is all bands. 3 is the most useful.

sdate is the starting date for the telemetry access.

edate is the ending date for the telemetry access. This argument is
optional. If not specified the sdtp program will use the starting date by
default, limiting the access to 23 hours, 59 minutes and 59 seconds.

stime is the start time for the telemetry access.

etime is the end time for the telemetry access.

verbose prints out information about what the program is doing.

-n will print out what the sdtp program would do, but won't actually do
it.

help prints out the usage line.

Depending on the access mode ([real|late|merge|dr]), the data will end up
in one of the sdtp/real, sdtp/late, sdtp/merge or sdtp/dr directories.

Housekeeping
============

Remove directories in:

$HOME/work/localdata/sdtp/md
$HOME/work/localdata/sdtp/status
$HOME/work/localdata/sdtp/decompressed

and files in:

$HOME/work/localdata/sdtp/real
$HOME/work/localdata/sdtp/dr
$HOME/work/localdata/sdtp/merge
$HOME/work/localdata/sdtp/late
$HOME/work/localdata/sdtp/decompressed

every so often.
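The clean-up above could be put in a cron job. The sketch below uses find
with an assumed 30-day retention period; both the period and the exact
directory list are assumptions to adjust:

```shell
#!/bin/sh
# Sketch: periodic clean-up of the local telemetry areas listed above.
# The 30-day retention period is an assumption - adjust to taste.
base="$HOME/work/localdata/sdtp"
days=30

# temporary yyyymmdd directories
for d in md status decompressed; do
    if [ -d "$base/$d" ]; then
        find "$base/$d" -mindepth 1 -maxdepth 1 -type d \
             -mtime +"$days" -exec rm -rf {} +
    fi
done

# old raw telemetry files
for d in real dr merge late decompressed; do
    if [ -d "$base/$d" ]; then
        find "$base/$d" -maxdepth 1 -type f -mtime +"$days" -delete
    fi
done
```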