

Introduction

The correlator output is used for on-line monitoring of the health of the array (cross- and self-correlated band-shapes, closure phases, amplitude and phase of various baselines, antenna-based amplitude and phase, variations of the system temperature ($T^s$) and antenna temperature ($T^a$) as a function of time, etc.) as well as for on-line phasing of the array in the phased-array mode. Recorded data is extensively used for the measurement of various telescope parameters (baseline calibration, fixed-delay calibration, antenna pointing calibration, beam-shape measurements at various frequency bands, measurement of $T^s$, $T^a$ and antenna sensitivity ($\eta$) and their variation as a function of time/elevation, etc.). All this requires that extensive data analysis and data display capabilities be easily available for on-line and off-line use. The GMRT correlator produces 435 cross correlations plus 30 self correlations, corresponding to 465 complex numbers per integration cycle per IF per frequency channel. If all 128 frequency channels are recorded, this corresponds to 465 baselines $\times$ 128 channels $\times$ 2 IFs $\times$ 2 floating point numbers of size 4 bytes each $= 952320$ bytes of data per integration cycle. For a typical observing time of 8 hours with an integration time of $\sim 20$ seconds, this corresponds to a database of size $\sim 1.5$ GBytes. Hence the software should also efficiently handle such a large multi-dimensional database and allow easy browsing through it.
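As a sanity check, the data-volume arithmetic above can be sketched in a few lines; this is only an illustrative calculation (in Python), with all constants taken from the text:

```python
# Data volume of the GMRT correlator output, using the figures quoted above:
# 30 antennas -> 435 cross + 30 self correlations = 465 baselines,
# 128 frequency channels, 2 IFs, each visibility = 2 floats of 4 bytes.

n_ant = 30
n_baselines = n_ant * (n_ant - 1) // 2 + n_ant   # 435 cross + 30 self = 465
n_channels = 128
n_ifs = 2
bytes_per_vis = 2 * 4                            # real + imaginary parts

bytes_per_cycle = n_baselines * n_channels * n_ifs * bytes_per_vis
print(bytes_per_cycle)                           # 952320 bytes per cycle

# An 8-hour observation with a ~20-second integration time:
n_cycles = 8 * 3600 // 20                        # 1440 integration cycles
total_gb = bytes_per_cycle * n_cycles / 1e9
print(round(total_gb, 2))                        # ~1.37 GB, i.e. ~1.5 GBytes
```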

The visibility data is an explicit function of the baseline length (the projected separation between the antennas). Implicitly, however, it is a function of many other parameters, such as the local sidereal time (LST), the observing frequency, the antenna co-ordinates, the co-ordinates of the phase center, the compensating delays applied to the various antennas, etc. Most of the processing (on-line as well as off-line) requires efficient access to the visibility data as a function of potentially many of these parameters. During the debugging stage of the telescope, it is also important to have a short turn-around time between observations and results. This in turn demands a fairly sophisticated data analysis package that can analyze the data recorded in the native recording format and evolve continually with a potentially rapidly changing environment (including the recording format itself!). Preferably, such a data analysis package should also be usable on-line. The application programs must also provide a user interface for the software to be usable by a larger community.

Section 3.2 describes the design of the software system, which was designed to meet most of the above-mentioned requirements. Section 3.3 briefly describes the design of the low-level libraries used for accessing the visibility database. Section 3.4 describes the design of the user interface, while Section 3.5 describes some of the application programs used in the observations and data analysis for this dissertation.


Sanjay Bhatnagar 2005-07-07