Measuring the Scatter in the Cluster Optical Richness-Mass Relation with Machine Learning
The distribution of massive clusters of galaxies depends strongly on the total cosmic mass density, the mass variance, and the dark energy equation of state. As such, measurements of galaxy clusters can constrain these parameters and even test models of gravity, but only if observations of clusters can lead to accurate estimates of their total masses. Here, we carry out a study to investigate the ability of a blind spectroscopic survey to recover accurate galaxy cluster masses through their line-of-sight velocity dispersions (LOSVDs) using probability-based and machine learning methods. We focus on the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX), which will employ the new Visible Integral-Field Replicable Unit Spectrographs (VIRUS) over 420 deg² on the sky with a 1/4.5 fill factor. VIRUS covers the blue/optical portion of the spectrum (3500–5500 Å), allowing surveys to measure redshifts for a large sample of galaxies out to z < 0.5 based on their absorption or emission features (e.g., [O II], Mg II, Ne V). We use a detailed mock galaxy catalog from a semi-analytic model to simulate surveys observed with VIRUS, including: (1) Survey, a blind, HETDEX-like survey with an incomplete but uniform spectroscopic selection function; and (2) Targeted, a survey that targets clusters directly, obtaining spectra of all galaxies in a VIRUS-sized field. For both surveys, we include realistic uncertainties from galaxy magnitude and line-flux limits. We benchmark both surveys against spectroscopic observations with "perfect" knowledge of galaxy line-of-sight velocities. With Survey observations, we recover cluster masses to ~0.1 dex, which improves to < 0.1 dex with Targeted observations. This level of cluster mass recovery provides important measurements of the intrinsic scatter in the optical richness-cluster mass relation, and enables constraints on the key cosmological parameter σ8 to < 20%.
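The LOSVD-based mass recovery described above can be illustrated with a minimal sketch: measure a cluster's line-of-sight velocity dispersion, then invert a power-law dispersion-mass scaling relation. The function names are hypothetical, and the normalization and slope used here are illustrative values in the spirit of dark-matter virial scaling relations, not the fitted parameters from this work (which also uses more robust dispersion estimators).

```python
import numpy as np

def losvd(velocities_kms):
    """Line-of-sight velocity dispersion from member galaxy velocities.
    A simple unbiased standard deviation; the dissertation uses more
    robust estimators (hypothetical simplification)."""
    return np.std(velocities_kms, ddof=1)

def mass_from_sigma(sigma_kms, hz=1.0, sigma15=1082.9, alpha=0.3361):
    """Invert an assumed power-law scaling relation
        sigma = sigma15 * (hz * M200 / 1e15 Msun)^alpha
    to get M200 in solar masses. sigma15 and alpha are illustrative
    (Evrard et al. 2008-style values), not this work's fit."""
    return 1e15 / hz * (sigma_kms / sigma15) ** (1.0 / alpha)

# Toy cluster: 50 member galaxies drawn with a 1000 km/s dispersion.
vlos = np.random.default_rng(0).normal(0.0, 1000.0, size=50)
sigma = losvd(vlos)
m200 = mass_from_sigma(sigma)  # of order 10^15 Msun for ~1000 km/s
```

Because the relation is steep (M ∝ σ^3), a ~10% dispersion error translates to a ~30% mass error, which is why the survey completeness modeled above matters so much.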
As a demonstration of the methods developed above, we present a pilot survey with integral field spectroscopy of ten galaxy clusters optically selected from the Sloan Digital Sky Survey's DR8 at z = 0.2–0.3. Eight of the clusters are rich (λ > 60) systems with total inferred masses of (1.58–17.37) × 10¹⁴ M☉ (M200c), and two are poor (λ < 15) systems with inferred total masses of ~0.5 × 10¹⁴ M☉ (M200c). We use the Mitchell Spectrograph (formerly the VIRUS-P spectrograph, a prototype of the HETDEX VIRUS instrument), located on the McDonald Observatory 2.7 m telescope, to measure spectroscopic redshifts and line-of-sight velocities of the galaxies in and around each cluster, determine cluster membership, and derive LOSVDs. We test both a LOSVD-cluster mass scaling relation and a machine learning based approach to infer total cluster mass. After comparing the cluster mass estimates to the literature, we use these independent cluster mass measurements to estimate the absolute cluster mass scale and the intrinsic scatter in the optical richness-mass relation. We measure the intrinsic scatter in cluster mass at fixed richness to be σ_M|λ = 0.27 ± 0.07 dex, in excellent agreement with previous estimates of σ_M|λ ~ 0.2–0.3 dex. We discuss the importance of the data used to train the machine learning methods and suggest strategies to improve the accuracy of the bias (offset) and scatter in the optical richness-cluster mass relation. This demonstrates the power of blind spectroscopic surveys such as HETDEX to provide robust cluster mass estimates, which can aid the determination of cosmological parameters and help calibrate the observable-mass relation for future large-area photometric sky surveys.
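The scatter measurement described above can be sketched as follows: fit a power law between richness and independently estimated log mass, then subtract the typical mass measurement error in quadrature from the residual scatter to isolate the intrinsic component. This is a hypothetical, simplified helper assuming Gaussian errors, not the dissertation's actual fitting procedure; the pivot richness lam0 is an illustrative choice.

```python
import numpy as np

def richness_mass_scatter(lam, logM, logM_err, lam0=60.0):
    """Fit log10 M200 = a + b * log10(lam / lam0) by least squares,
    then estimate intrinsic scatter in mass at fixed richness by
    subtracting the median measurement error in quadrature.
    Simplified sketch: assumes Gaussian, richness-independent errors."""
    x = np.log10(np.asarray(lam) / lam0)
    A = np.vstack([np.ones_like(x), x]).T
    (a, b), *_ = np.linalg.lstsq(A, logM, rcond=None)
    resid = logM - (a + b * x)
    total = np.std(resid, ddof=2)          # total residual scatter (dex)
    err = np.median(logM_err)              # typical measurement error (dex)
    intrinsic = np.sqrt(max(total**2 - err**2, 0.0))
    return a, b, intrinsic

# Synthetic demonstration: 200 clusters with 0.25 dex intrinsic scatter.
rng = np.random.default_rng(1)
lam = rng.uniform(20.0, 120.0, size=200)
logM = 14.0 + 1.0 * np.log10(lam / 60.0) + rng.normal(0.0, 0.25, size=200)
a, b, intr = richness_mass_scatter(lam, logM, np.full(200, 0.05))
```

With only ten clusters, as in the pilot survey, the uncertainty on the recovered scatter is substantial, which is consistent with the ±0.07 dex error bar quoted above.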
Boada, Steven Alvaro (2016). Measuring the Scatter in the Cluster Optical Richness-Mass Relation with Machine Learning. Doctoral dissertation, Texas A&M University. Available electronically from