arXiv:hep-ph/9308249
Cancellation of Infrared Divergences in Thermal QED
Published 1993-08-09 (Version 1)
As a preliminary step, the problem of radiation produced by a classical charged current coupled to a quantized $A_{\mu}$ field is solved. At each order in $\alpha$, all infrared divergences cancel between the virtual $\gamma$'s and the real $\gamma$'s absorbed from, or emitted into, the plasma. When all orders of perturbation theory are summed, the finite answer predicts a suppression of radiation with $\omega< \alpha T$. The analysis of QED then consists of two steps. First, a general probability at $T\neq 0$ is organized so that all the virtual $e^{\pm},\gamma$ appear in the amplitudes and all the real $e^{\pm},\gamma$ appear in the phase-space integrations. Next, the cancellation of IR divergences between the virtual and real contributions is demonstrated.
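For orientation, a minimal sketch (standard thermal field theory, not taken from this paper) of the Bose-Einstein weighting that underlies such virtual/real cancellations; the cutoff $\Lambda$ below is introduced purely for illustration. Real photons of energy $\omega$ carry the thermal occupation
\[
n(\omega) \;=\; \frac{1}{e^{\omega/T}-1}\,, \qquad n(\omega)\;\approx\;\frac{T}{\omega} \quad (\omega \ll T),
\]
so emission into the plasma is weighted by $1+n(\omega)$ and absorption from the plasma by $n(\omega)$, giving a single real photon the infrared weight
\[
\int_{0}^{\Lambda}\frac{d\omega}{\omega}\,\Bigl[\,\bigl(1+n(\omega)\bigr)_{\text{emission}} \;+\; n(\omega)_{\text{absorption}}\,\Bigr].
\]
The $T/\omega$ enhancement of these real contributions must be matched, order by order in $\alpha$, by the thermal part of the virtual-photon contributions if the total probability is to remain finite, which is the structure of the cancellation described above.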