Abstract
An analysis of the role of general relativistic effects on the decay of a neutron star’s magnetic field is presented. First, a generalized induction equation is derived on an arbitrary static background geometry; second, by a combination of analytical and numerical techniques, the time scales for the decay of an initial dipole magnetic field in flat and in curved spacetime are compared. For very simple neutron star models that neglect rotation and cooling effects, we find that the inclusion of general relativistic effects results, on average, in a lengthening of the decay time of the field relative to the flat-spacetime case. Numerically, we show that this enlargement factor depends upon the dimensionless compactness ratio ε; for ε in the range 0.3–0.5, corresponding to realistic neutron star models, the factor lies between 1.2 and 1.3. The present analysis shows that general relativistic effects on magnetic field decay ought to be examined more carefully than hitherto. A brief discussion of the impact of our findings on neutron star physics is also presented.
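As a rough orientation for the quoted range of ε (the abstract does not spell out the definition; a common convention, assumed here, takes the compactness ratio to be the Schwarzschild radius divided by the stellar radius):

$$
\varepsilon \;\equiv\; \frac{2GM}{Rc^{2}}, \qquad
\varepsilon \;\approx\; \frac{2.95\,\mathrm{km} \times 1.4}{10\,\mathrm{km}} \;\approx\; 0.41
\quad \text{for } M = 1.4\,M_{\odot},\; R = 10\,\mathrm{km}.
$$

Under this convention a canonical neutron star falls squarely within the stated 0.3–0.5 range, where the decay time is lengthened by a factor of roughly 1.2–1.3.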
Received 1 April 1999
DOI: https://doi.org/10.1103/PhysRevD.61.123004
©2000 American Physical Society