The Terahertz Intensity Mapper (TIM) is a NASA-funded balloon-borne telescope that aims to measure [CII] emission from star-forming galaxies over an enormous cosmic volume. TIM's cryogenic receiver, based on the BLAST-TNG design, utilizes a three-stage He sorption refrigerator backed by a 280-liter liquid helium tank to achieve a base temperature of 250 mK, which enables photon noise-limited performance for its MKID detectors. Two low-impedance multi-channel heat exchangers enhance cooling efficiency, contributing to a designed hold time of 20 days under ground conditions. Preliminary simulations and assembly tests demonstrate the cryostat's reliability, and data validation is anticipated by the summer of 2024. We will present the design and current status of the TIM cryogenic receiver and our ongoing characterization effort toward an Antarctic flight in 2026.
We present preparations for the fabrication and deployment of science-grade kilo-pixel Kinetic Inductance Detector (KID) arrays for the Terahertz Intensity Mapper (TIM). TIM is a NASA-funded balloon-borne experiment planning its Antarctic flight for 2026. TIM employs two focal planes, each with four subarrays of ~900 hexagonally packed, horn-coupled aluminum KIDs. Fabrication yield is high, and we have successfully mapped KID resonant frequencies to spatial locations with our LED mapper. The spatial and frequency information associated with every yielded pixel allows a study of spatial coincidences as cosmic rays interact with the array, as well as interpretation of covariance analyses performed on the noise timestreams. We also describe the improvement in the science-usable yield of our 864-pixel array achieved by (1) lithographic trimming to de-collide resonators and (2) our characterization of inter-pixel crosstalk. This pioneering post-processing work will pave the way for science with our large KID arrays.
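The abstract mentions two analyses that the per-pixel frequency-to-position map enables: covariance analysis of noise timestreams and searches for spatially coincident cosmic-ray events. The following is a minimal illustrative sketch, not TIM's actual pipeline; the array shapes, the 5-sigma glitch threshold, and the coincidence cut are assumptions chosen only to show the idea on synthetic data.

```python
# Sketch: covariance of KID noise timestreams and a simple cosmic-ray
# coincidence search. All numbers below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_samples = 864, 10_000

# Synthetic stand-in data: white noise plus a few injected "cosmic rays"
# that deposit power in many pixels at the same time sample.
timestreams = rng.normal(0.0, 1.0, size=(n_pixels, n_samples))
for t in (1_000, 4_200, 7_777):
    hit_pixels = rng.choice(n_pixels, size=20, replace=False)
    timestreams[hit_pixels, t] += 10.0

# Covariance of the noise timestreams: off-diagonal structure reveals
# correlated noise and inter-pixel crosstalk between resonators.
cov = np.cov(timestreams)

# Simple coincidence search: count pixels exceeding 5 sigma per sample
# and flag samples where many pixels glitch simultaneously.
sigma = timestreams.std(axis=1, keepdims=True)
outliers = np.abs(timestreams) > 5.0 * sigma
coincident_samples = np.where(outliers.sum(axis=0) >= 10)[0]

print("covariance matrix shape:", cov.shape)
print("candidate cosmic-ray samples:", coincident_samples)
```

With the recovered pixel positions, the flagged samples could then be examined for spatial clustering on the focal plane, which is the signature of a cosmic-ray hit rather than uncorrelated detector noise.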
The Rubin Observatory's Data Butler is designed to abstract data file locations and file formats away from the people writing the science pipeline algorithms. The Butler works in conjunction with the workflow graph builder to allow pipelines to be constructed from algorithmic tasks. These pipelines can be executed at scale using object stores and multi-node clusters, or on a laptop using a local file system. The Butler and pipeline system are now in daily use during Rubin construction and early operations.
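To illustrate the abstraction described above, here is a minimal sketch using the Gen3 Butler API from lsst.daf.butler: datasets are retrieved by dataset type and data ID, and the Butler resolves where the file lives and how it is serialized. The repository path, collection name, and data ID values are placeholders, not a real Rubin repository.

```python
# Minimal sketch of Data Butler usage (lsst.daf.butler, Gen3 API).
# Repository path, collection, and data ID values are placeholders.
from lsst.daf.butler import Butler

butler = Butler("/path/to/repo", collections="HSC/runs/example")

# Fetch a calibrated exposure by dataset type and data ID; the Butler
# locates the file (local disk or object store), handles the on-disk
# format, and returns the in-memory Python object.
calexp = butler.get(
    "calexp",
    dataId={"instrument": "HSC", "visit": 903334, "detector": 16},
)

# The registry can be queried to discover which datasets exist, without
# ever dealing with file paths directly.
for ref in butler.registry.queryDatasets("calexp", collections="HSC/runs/example"):
    print(ref.dataId)
```

Because pipeline tasks only ever see dataset types and data IDs, the same pipeline graph can run against a local file-system repository on a laptop or an object-store-backed repository on a multi-node cluster.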