Gamma-ray Thermalization and Leakage from Millisecond Magnetar Nebulae: Towards a Self-Consistent Model for Superluminous Supernovae
Superluminous supernovae (SLSNe) are massive star explosions too luminous to
be powered by traditional energy sources, such as radioactive 56Ni. These
transients may instead be powered by a central engine, such as a millisecond
pulsar or magnetar, whose relativistic wind inflates a nebula of high-energy
particles and radiation behind the expanding ejecta. We present 3D Monte Carlo
radiative transfer calculations which follow the production of high-energy
radiation in the nebula and its thermalization into optical radiation and,
conversely, determine the gamma-ray emission that escapes the ejecta without
thermalizing. We track the evolution of photons and matter in a coupled
two-zone ("wind/nebula" and "ejecta") model, accounting for the range of
radiative processes. We identify a novel mechanism by which gamma-gamma pair
creation in the upstream pulsar wind regulates the mean energy of particles
entering the nebula over the first several years after the explosion, rendering
our results on this timescale insensitive to the (uncertain) intrinsic wind
pair multiplicity. To explain the observed late-time steepening of SLSN optical
light curves as the result of gamma-ray leakage, the nebular magnetization must
be very low, epsB <~ 1e-6 to 1e-4. For higher epsB, synchrotron emission quickly
comes to dominate the thermalized nebular radiation; because its lower photon
energies are readily absorbed, the SN optical light curve tracks the spin-down
power even at late times >~ 1 yr,
inconsistent with observations. For magnetars to remain viable contenders for
powering SLSNe, we conclude either that magnetic dissipation in the wind/nebula
is extremely efficient, or that the spin-down luminosity decays significantly
faster than the canonical dipole rate ~1/t^2 in a way that coincidentally
mimics gamma-ray escape.
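For reference, the canonical dipole spin-down law referred to above takes the
standard form below; this is a generic sketch for orientation, with L_0 (initial
spin-down luminosity) and t_sd (spin-down timescale) as standard symbols rather
than quantities derived in this work:

% Magnetar dipole spin-down luminosity (standard form, braking index n = 3);
% L_0 and t_sd are illustrative symbols, not fitted values from this paper.
L_{\rm sd}(t) = \frac{L_0}{\left(1 + t/t_{\rm sd}\right)^{2}}
\;\longrightarrow\;
L_{\rm sd} \simeq L_0 \left(\frac{t_{\rm sd}}{t}\right)^{2} \propto \frac{1}{t^{2}}
\quad (t \gg t_{\rm sd}),

so an optical light curve that fades faster than 1/t^2 at t >> t_sd requires
either an additional loss channel, such as the gamma-ray escape considered here,
or a non-dipolar braking law.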