Using a linear stability analysis and two- and three-dimensional nonlinear
simulations, we study the physics of buoyancy instabilities in a combined
thermal and relativistic (cosmic ray) plasma, motivated by the application to
clusters of galaxies. We argue that cosmic ray diffusion is likely to be slow
compared to the buoyancy time on large length scales, so that cosmic rays are
effectively adiabatic. If the cosmic ray pressure $p_{cr}$ is $\gtrsim 25 %$ of
the thermal pressure, and the cosmic ray entropy ($p_{\rm cr}/\rho^{4/3}$;
$\rho$ is the thermal plasma density) decreases outwards, cosmic rays drive an
adiabatic convective instability analogous to Schwarzschild convection in
stars. Global simulations of galaxy cluster cores show that this instability
saturates by reducing the cosmic ray entropy gradient and driving efficient
convection and turbulent mixing. At larger radii in cluster cores, the thermal
plasma is unstable to the heat flux-driven buoyancy instability (HBI), a
convective instability generated by anisotropic thermal conduction and a
background conductive heat flux. Cosmic-ray driven convection and the HBI may
contribute to redistributing metals produced by Type 1a supernovae in clusters.
Our calculations demonstrate that adiabatic simulations of galaxy clusters can
artificially suppress the mixing of thermal and relativistic plasma;
anisotropic thermal conduction allows more efficient mixing, which may
contribute to cosmic rays being distributed throughout the cluster volume.
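
As a guide to the adiabatic criterion summarized above, the following is a minimal illustrative restatement (not the full derivation) of the Schwarzschild-like condition for cosmic-ray-driven convection, assuming a spherically symmetric background with radius $r$ increasing outwards; the symbol $p_{\rm th}$ for the thermal pressure is introduced here for compactness:
% Illustrative sketch only: a Schwarzschild-like instability criterion for
% adiabatic cosmic rays (adiabatic index 4/3), restating the abstract.
% The threshold p_cr >~ 0.25 p_th is the quoted requirement for cosmic ray
% pressure to be dynamically important.
\[
  s_{\rm cr} \equiv \frac{p_{\rm cr}}{\rho^{4/3}}, \qquad
  \frac{d \ln s_{\rm cr}}{dr} < 0
  \;\;\text{(with } p_{\rm cr} \gtrsim 0.25\, p_{\rm th}\text{)}
  \;\Longrightarrow\; \text{convective instability.}
\]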