ROSA data reduction pipeline

Listed here are the codes required to reduce raw ROSA images taken at the DST into science-ready products for analysis. The code was originally written by David Jess, with improvements made by both Dave and Peter Keys over the years.

SOLARNET Compliant ROSA Reduction Pipeline

Significant efforts were made to modernise the reduction pipeline for ROSA under the SOLARNET project, which finished in March 2017. The code is now much more intuitive for new users and the output is more uniform, as it falls within the guidelines provided by the SOLARNET project and is in line with the outputs of other telescopes/instruments.

The process is structured as follows (a minimal sketch of the dark/flat calibration step is given after this list):

  • Generate directory structures for processing
  • Generate average dark
  • Generate average flat field
  • Generate noise file
  • Generate speckle bursts
  • Speckle the data (outside IDL with KISIP)
  • Convert the speckle products to .fits files (back to IDL)
  • Gross alignment of the images for a single bandpass
  • Destretch images
  • Second fine-scale alignment
  • Establish alignment parameters between cameras
  • Apply these to images if comparing images between cameras
  • Add SOLARNET compliant header information to finished data products
  • Output the calibration images in case of future use
  • Produce a sample thumbnail of images for each filter for the run of the reduced data
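
To make the dark/flat step more concrete, the hedged sketch below averages the calibration frames and builds a normalised gain table. The file locations, the use of readfits/writefits from the IDL Astronomy User's Library and the assumption of one frame per calibration file are illustrative only; they are not the pipeline's actual interface.

  ; Minimal sketch of the dark/flat idea only -- not the pipeline routine.
  ; Assumes the IDL Astronomy User's Library (readfits/writefits) and one
  ; 2D frame per calibration file; both are illustrative assumptions.

  dark_files = FILE_SEARCH('Raw/darks/*.fits', COUNT=ndark)
  flat_files = FILE_SEARCH('Raw/flats/*.fits', COUNT=nflat)

  ; Average the dark frames
  avg_dark = DOUBLE(readfits(dark_files[0]))
  FOR i = 1, ndark - 1 DO avg_dark = avg_dark + readfits(dark_files[i])
  avg_dark = avg_dark / ndark

  ; Average the flats and remove the dark signal
  avg_flat = DOUBLE(readfits(flat_files[0]))
  FOR i = 1, nflat - 1 DO avg_flat = avg_flat + readfits(flat_files[i])
  avg_flat = avg_flat / nflat - avg_dark

  ; Gain table normalised to unity; science frames are then (raw - dark) / gain
  gain = avg_flat / MEAN(avg_flat)

  FILE_MKDIR, 'calibration'
  writefits, 'calibration/avg_dark.fits', avg_dark
  writefits, 'calibration/avg_flat.fits', avg_flat
  writefits, 'calibration/gain_table.fits', gain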

The code is a mixture of command-line prompts in IDL and a C-based GUI for speckling the data with KISIP. It is designed to minimise input from the user: everything that needs to be edited within the file is edited at the start of processing. The code has also been updated to allow the user to run it from any directory, whereas previously the user had to switch directories upon completion of certain data products.

The output of the code now falls within the standards more commonly employed by other instruments. Complete Level 1 data are now stored as 'ROSA_filter_YYYYMMDD_HHMMSS.sss.fits', with the FITS metadata recording all the necessary information for the user (e.g. data PI, exposure, pointing, etc.).
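
To make the naming and metadata more concrete, the hedged sketch below writes a reconstructed frame with a timestamped name and a handful of standard FITS keywords using the IDL Astronomy User's Library. The keyword values, array size and filename are placeholders only; the actual SOLARNET-compliant keyword set is defined by the pipeline itself.

  ; Illustrative only: the real pipeline writes the full SOLARNET-compliant
  ; header. Placeholder values throughout; requires the IDL Astronomy User's
  ; Library (mkhdr, sxaddpar, writefits).

  recon_image = FLTARR(1004, 1002)         ; stand-in for a reconstructed frame

  mkhdr, hdr, recon_image                  ; basic FITS header for the array
  sxaddpar, hdr, 'TELESCOP', 'DST'
  sxaddpar, hdr, 'INSTRUME', 'ROSA'
  sxaddpar, hdr, 'DATE-OBS', '2016-06-20T14:05:32.123'
  sxaddpar, hdr, 'WAVELNTH', 4170, ' [Angstrom] placeholder filter'

  ; File name follows the ROSA_filter_YYYYMMDD_HHMMSS.sss.fits convention
  writefits, 'ROSA_4170_20160620_140532.123.fits', recon_image, hdr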

A manual for this new pipeline is nearly complete. It is intended as a step-by-step guide for taking raw ROSA data through to science-ready end products. It is written with a complete novice in mind, explaining why we do certain things within the process and showing sample screen output from running the code, as well as example errors (and how to fix them, where possible). The manual should be on this web-page in early May (so keep an eye out for it if interested).

Update: a rough manual can now be found here. There are some things that still need to be added and it could be tidied up a bit; however, it should be useful enough as is to help you process ROSA data.

The code is maintained on this page as well. We may migrate it to GitHub shortly to aid version control; at present, however, it is easier to keep it on our own web-page. The newest version is v2.1 and you can download it from this zip file.

Most things are run from 'ROSA_reduction_pipeline.pro'. Some settings will also need to be edited in the attached batch files, but what needs to be changed is outlined within the main reduction pipeline program. The code is fairly well annotated to explain the various sections.
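
For those new to IDL, a session might look something like the sketch below. The exact calling sequence depends on the version you download (and on whether the routine is written as a procedure or a main-level program), so treat this purely as an illustration and follow the comments in ROSA_reduction_pipeline.pro itself.

  ; Illustrative only -- follow the comments in ROSA_reduction_pipeline.pro
  ; for the actual calling sequence of the version you download.

  ; Make the unzipped pipeline directory visible from any working directory
  !PATH = EXPAND_PATH('+/path/to/rosa_pipeline') + PATH_SEP(/SEARCH_PATH) + !PATH

  ; If the file is a main-level program, compile and run it directly:
  .run ROSA_reduction_pipeline

  ; If it is a procedure, compile it and call it by name instead:
  ; .compile ROSA_reduction_pipeline
  ; ROSA_reduction_pipeline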

Older Reduction Pipeline

Below is information pertaining to the older version of the code, along with the codes themselves. They are included here for now as a form of version control on our part. If you are just starting out with processing ROSA data, please use the codes described above: they are more useful, easier to use, SOLARNET compliant and, therefore, the preferred practice as of April 2017. Have a look at the information below if you are interested in the development of the code or have used this version previously.

Please note that this older reduction pipeline is not automated and requires a good amount of user input. Peter is working on a more user-friendly, slightly more automated version of these codes that minimises user input. As such, the codes presented here are a bit rough around the edges and will likely require the user to work through the process themselves (if in doubt, contact Peter).

Alongside the user-friendly pipeline in development, a user manual is being produced in tandem with the 'new' codes. This will give a step-by-step guide to turning your raw images into science-ready data. The present codes contain some notes describing what is happening, but these may not have enough detail for some users; again, if in doubt, contact Peter. The newer processes will store the associated metadata with the final data products, in accordance with the guidelines of the SOLARNET project.

The process is structured as follows (a cross-correlation sketch of the alignment steps is given after this list):

  • Generate average dark
  • Generate average flat field
  • Generate noise file
  • Generate speckle bursts
  • Speckle the data (outside IDL with KISIP)
  • Convert the speckle products to .fits files (back to IDL)
  • Gross alignment of the images for a single bandpass
  • Destretch images
  • Second fine-scale alignment
  • Establish alignment parameters between cameras
  • Apply these to images if comparing images between cameras
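
The gross alignment and camera-to-camera steps essentially reduce to measuring the (x, y) offset between a reference frame and each subsequent frame. As a hedged illustration of the idea (not the routine actually called by the batch files), an integer-pixel offset can be estimated with a Fourier cross-correlation:

  ; Sketch only: estimate the integer (x, y) offset between two frames via
  ; FFT cross-correlation. Not the alignment routine used in the batch files.

  FUNCTION measure_shift, ref, img
    sz = SIZE(ref, /DIMENSIONS)

    ; Cross-correlation of the two frames via the Fourier domain
    cc = ABS(FFT(FFT(ref, -1) * CONJ(FFT(img, -1)), 1))

    ; The correlation peak encodes the offset of img relative to ref
    mx = MAX(cc, maxpos)
    offset = ARRAY_INDICES(cc, maxpos)

    ; Wrap offsets larger than half a frame into negative shifts
    IF offset[0] GT sz[0]/2 THEN offset[0] = offset[0] - sz[0]
    IF offset[1] GT sz[1]/2 THEN offset[1] = offset[1] - sz[1]

    RETURN, offset
  END

  ; Usage (illustrative): bring 'frame' into rough alignment with 'ref_frame'
  ; offset  = measure_shift(ref_frame, frame)
  ; aligned = SHIFT(frame, offset[0], offset[1])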

All of this is outlined in the batch file ROSA_data_prep_KISIP.bat. The user will be guided to external programs when the stage of the reduction pipeline requires it. The newer 512×512 cameras, which are now standard in ROSA setups, follow the Halpha_data_prep_KISIP.bat file until the data are speckled, before returning to the programs highlighted in ROSA_data_prep_KISIP.bat (taking care to remember the image dimensions of these cameras).
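
If, as the names suggest, these .bat files are IDL batch files, they are executed with the '@' command at the IDL prompt. The hypothetical session below is purely illustrative; the edits required beforehand (paths, image dimensions) are described in the batch files themselves.

  ; Hypothetical session: execute the main IDL batch file
  @ROSA_data_prep_KISIP.bat

  ; For the newer 512x512 cameras, follow the H-alpha batch file up to the
  ; speckle stage before returning to the main one:
  ; @Halpha_data_prep_KISIP.bat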

Another program of note is load_speckle_im.pro. Developed by the IBIS team, it loads the post-KISIP images so they can be inspected prior to conversion back to FITS files, to ensure that the speckle process completed successfully.

Finally, a note on how we usually structure our directories for ROSA images. Raw data (pre-KISIP) go into the Raw directory, while post-speckle data go into the Reconstructed folder before being sorted into further sub-directories (speckled, mid_processed, processed) depending on the stage of the reduction process the images have reached. The user will likely need to alter this for their own cluster/computer system.
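
As a small convenience, that layout can be created in one go from IDL; the base path below is a placeholder for your own observing-day directory.

  ; Create the directory layout described above (names are just our convention;
  ; the base path is a placeholder)
  base = '/path/to/ROSA/20160620'
  FILE_MKDIR, base + '/Raw'
  FILE_MKDIR, base + '/Reconstructed/speckled'
  FILE_MKDIR, base + '/Reconstructed/mid_processed'
  FILE_MKDIR, base + '/Reconstructed/processed'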

The necessary codes can be obtained from this zip file.
