Reconstructing 3D dust distributions from dust emission maps with an Invertible Neural Network

Victor Ksoll

Wednesday, Dec. 7th, 14:30 CET

The direct reconstruction of 3D distributions (e.g. density and temperature) of interstellar dust from observed dust emission could provide many insights into cloud properties, stellar feedback mechanisms and the star formation process, but it constitutes a difficult inverse problem due to its inherent degeneracy. Invertible Neural Networks (INNs) are a deep learning architecture particularly well suited to solving such degenerate inverse problems. Using a latent space to encode the information lost in the forward mapping, they can estimate the full posterior distributions of the target parameters of an inverse problem. In this talk, we present a proof-of-concept INN approach for the reconstruction of 3D dust distributions from dust emission maps, trained and tested on simulation data. As training data we employ dust cloud simulations from the Cloud Factory and emulate the corresponding dust emission observations for various infrared and sub-mm observatories using the POLARIS radiative transfer code. Specifically, we focus on dense dust distributions on subparsec length scales subject to the interstellar radiation field and a single star, mimicking conditions in the compact cloud Rho Ophiuchi A. Adopting a line-of-sight approach, in which we assume that the observed dust emission in a given pixel depends only on the material behind it, we train a conditional INN (cINN) to recover both the dust density and the temperature along each line of sight. Testing on withheld simulation data, we find very promising results for the recovery of both dust temperatures and densities, with the achievable accuracy depending on the adopted wavelength coverage.
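To illustrate the core idea behind the cINN, here is a minimal NumPy sketch of a single conditional affine coupling layer, the standard building block of such networks. All dimensions, the subnetwork, and variable names are illustrative assumptions for this sketch, not the architecture used in the talk: the key property shown is that the mapping from parameters to latent variables is exactly invertible given the conditioning input (here standing in for a pixel's observed emission).

```python
import numpy as np

# Sketch of one conditional affine coupling layer (assumed setup, not the
# talk's actual model). The input x plays the role of the line-of-sight
# dust parameters; c plays the role of the observed pixel fluxes.

rng = np.random.default_rng(0)

D = 8   # dimension of the target parameter vector (hypothetical)
C = 4   # dimension of the conditioning vector, e.g. fluxes (hypothetical)
H = 16  # hidden width of the small subnetwork

# Subnetwork predicting a log-scale s and shift t for the second half of
# the input, from the first half concatenated with the condition.
W1 = rng.normal(scale=0.1, size=(D // 2 + C, H))
W2 = rng.normal(scale=0.1, size=(H, D))  # output split into [s, t]

def subnet(x1, c):
    h = np.tanh(np.concatenate([x1, c]) @ W1)
    out = h @ W2
    return out[: D // 2], out[D // 2 :]  # s, t

def coupling_forward(x, c):
    x1, x2 = x[: D // 2], x[D // 2 :]
    s, t = subnet(x1, c)
    z2 = x2 * np.exp(s) + t      # affine transform of the second half
    return np.concatenate([x1, z2])

def coupling_inverse(z, c):
    z1, z2 = z[: D // 2], z[D // 2 :]
    s, t = subnet(z1, c)         # z1 equals x1, so s and t are recoverable
    x2 = (z2 - t) * np.exp(-s)
    return np.concatenate([z1, x2])

x = rng.normal(size=D)           # a line-of-sight parameter vector
c = rng.normal(size=C)           # the observed emission for that pixel
z = coupling_forward(x, c)
x_rec = coupling_inverse(z, c)
print(np.allclose(x, x_rec))     # True: the layer is exactly invertible
```

In a full cINN many such layers are stacked (with permutations between them), and at inference time one samples the latent vector z from a standard normal and runs the inverse pass conditioned on the observation, yielding samples from the posterior over the target parameters.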

Background image: Robert Hurt, IPAC