Quality Control Monitoring for WFCAM
M. Riello and M. Irwin
Abstract. We present the data Quality Control (QC) infrastructure that has been put in place at the Cambridge Astronomical Survey Unit (CASU) to deal with the large data volume produced by WFCAM. QC measures are produced during pipeline processing, saved in the FITS headers and, afterwards, ingested into a relational database that serves a number of report-generating tools. The QC measures include: sky brightness and noise, average stellar ellipticity and seeing, astrometric calibration errors, per-image and nightly-averaged photometric zero points and their errors, and the number of detected sources per chip. The QC system has also been used to investigate the near-infrared sky brightness at Mauna Kea Observatory over a period of three semesters.
1 WFCAM Data Quality Control

The United Kingdom Infra-Red Telescope (UKIRT) Wide Field Camera (WFCAM) on Mauna Kea began operations in the first quarter of 2005 and is currently the most capable near-infrared (NIR) imaging survey instrument in the world [1]. The camera is equipped with four Rockwell Hawaii-II 2k × 2k detectors, each covering about 13.7 × 13.7 arcmin of the sky, with a separation of ≈ 95% of the detector size; a filled tile of four pointings covers ≈ 0.8 deg², see e.g. [2]. A large fraction of WFCAM time is dedicated to the implementation of the UKIRT Deep Infrared Sky Survey (UKIDSS), a set of five sky surveys with different filter sets, area coverage and scientific objectives [4].

Since WFCAM was commissioned in early 2005, all the raw data have been shipped to Cambridge to be processed by CASU as part of its involvement in the VISTA Data Flow System (VDFS) [5]. During the first three semesters of operations, over 30 TB of raw data have been pipeline-processed by CASU, producing ≈ 60 TB of reduced images (≈ 3.5 × 10¹³ pixels), catalogues (≈ 10⁹ detected objects) and confidence maps. To deal with such a large data volume in an effective way, we developed a quality control database, based on PostgreSQL, that stores all the data quality information for every data product and input science frame. In addition, several other relevant pieces of information are extracted from the FITS headers and ingested into the database: the complete WCS calibration, weather conditions, temperatures, fractional lunar illumination (FLI), Moon angular distance and elevation, dusk and dawn twilight start times, etc. The ingestion
of a pipeline-processed night into the QC database has also proved very effective in spotting a number of problems that may occur during the reduction. The QC database has a web-based front-end that is used internally by CASU members to rapidly search for images at a specific position on the sky, optionally satisfying user-specified constraints on any QC parameter. The front-end is complemented by an image cut-out service that creates on-the-fly object postage stamps and full-chip previews. The QC database is also used to keep track of the night status within the processing data flow and to flag missing files when the ra
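As a purely illustrative sketch of the workflow described above, the following Python snippet ingests per-chip QC measures into a relational table and then runs a front-end-style query combining a sky position with a QC constraint. SQLite stands in for the PostgreSQL database, and every keyword name, filename and value is invented for the example; none of it is taken from the actual CASU schema.

```python
import sqlite3

# Hypothetical per-chip QC measures, mimicking keywords that a pipeline
# might write into processed FITS headers (names are illustrative only).
headers = [
    {"FILE": "w0123_1.fit", "RA": 180.01, "DEC": 0.02, "SEEING": 0.65,
     "ELLIP": 0.08, "SKYLVL": 1520.3, "SKYNOI": 12.1, "MAGZPT": 22.41},
    {"FILE": "w0124_2.fit", "RA": 180.40, "DEC": 0.05, "SEEING": 1.10,
     "ELLIP": 0.11, "SKYLVL": 1601.9, "SKYNOI": 13.0, "MAGZPT": 22.35},
    {"FILE": "w0087_3.fit", "RA": 181.90, "DEC": 0.01, "SEEING": 0.70,
     "ELLIP": 0.07, "SKYLVL": 1498.7, "SKYNOI": 11.8, "MAGZPT": 22.39},
]

conn = sqlite3.connect(":memory:")  # stand-in for the PostgreSQL QC database
conn.execute("""CREATE TABLE qc (
    filename TEXT, ra REAL, dec REAL, seeing REAL, ellipticity REAL,
    sky REAL, sky_noise REAL, zeropoint REAL)""")
conn.executemany("INSERT INTO qc VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
                 [(h["FILE"], h["RA"], h["DEC"], h["SEEING"], h["ELLIP"],
                   h["SKYLVL"], h["SKYNOI"], h["MAGZPT"]) for h in headers])

# Front-end-style search: images near (ra0, dec0) that also satisfy a
# user-specified QC constraint (seeing better than 0.8 arcsec).  A simple
# box search is used here; a real cone search would scale RA by cos(dec).
ra0, dec0, r = 180.0, 0.0, 0.5
good = conn.execute(
    "SELECT filename FROM qc WHERE ra BETWEEN ? AND ? "
    "AND dec BETWEEN ? AND ? AND seeing < ?",
    (ra0 - r, ra0 + r, dec0 - r, dec0 + r, 0.8)).fetchall()
print([row[0] for row in good])

# A nightly-averaged photometric zero point, of the kind the
# report-generating tools would summarise per night.
(avg_zp,) = conn.execute("SELECT AVG(zeropoint) FROM qc").fetchone()
print(round(avg_zp, 2))
```

Keeping the QC measures in SQL rather than only in FITS headers is what makes such ad hoc combinations of positional and quality constraints cheap to express, which is the core design choice of the infrastructure described here.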