arXiv:2105.02512 [hep-ex]
Development of Convolutional Neural Networks for an Electron-Tracking Compton Camera
Tomonori Ikeda, Toru Tanimori, Atsushi Takada, Yoshitaka Mizumura, Kei Yoshikawa, Mitsuru Abe, Shingo Ogio, Yura Yoshida, Masaya Tsuda, Shinya Sonoda
Published 2021-05-06, Version 1
The electron-tracking Compton camera (ETCC), a complete Compton camera that tracks the Compton-scattered electron with a gaseous micro time projection chamber, is expected to open up MeV gamma-ray astronomy. The technical challenge in achieving a point spread function of a few degrees is the precise determination of the electron-recoil direction and the scattering position from track images. We attempted to reconstruct these parameters using convolutional neural networks. Two network models were designed, one predicting the recoil direction and the other the scattering position. These models achieved an angular resolution of 41 degrees and a position resolution of 2.1 mm for 75 keV electron simulation data in argon-based gas at 2 atm pressure. In addition, the point spread function of the ETCC was improved from 22 degrees to 15 degrees for experimental data from a 662 keV gamma-ray source. These performances greatly surpass those of the traditional analysis.
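The abstract describes two CNN regressors trained on electron-track images, one for the recoil direction and one for the scattering position. The paper's actual architectures are not given here, so the following is only a minimal PyTorch sketch under assumed choices (a small two-convolution backbone, 64x64 single-channel track images, a 3-vector direction output and a 2-vector position output); the class name `TrackCNN` and all layer sizes are hypothetical.

```python
import torch
import torch.nn as nn

class TrackCNN(nn.Module):
    """Toy CNN mapping a 2-D track image to a fixed-size regression target
    (e.g. a recoil-direction vector or a scattering position)."""
    def __init__(self, n_outputs: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(128), nn.ReLU(),
            nn.Linear(128, n_outputs),
        )

    def forward(self, x):
        return self.head(self.features(x))

# Hypothetical usage: one model per task, mirroring the two-model setup
# mentioned in the abstract.
direction_net = TrackCNN(n_outputs=3)      # recoil-direction vector
position_net = TrackCNN(n_outputs=2)       # scattering position (x, y)

tracks = torch.randn(8, 1, 64, 64)         # batch of assumed 64x64 track images
print(direction_net(tracks).shape)         # torch.Size([8, 3])
print(position_net(tracks).shape)          # torch.Size([8, 2])
```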