We present a general framework for solving a fundamental problem in random matrix theory: describing the joint distribution of eigenvalues of the sum A + B of two independent random Hermitian matrices A and B. We focus on deriving the spectral density of a mixture of adjoint orbits of quantum states in terms of the Duistermaat-Heckman measure, which originates in symplectic geometry. Using this framework, we obtain the spectral density of a mixture of independent random states; in particular, we derive explicit formulas for a mixture of random qubits.
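To make the objects concrete, the following is a minimal numerical sketch, not the paper's analytic method: it Monte Carlo samples the spectrum of an equal-weight mixture (A + B)/2 of two independent random qubit density matrices. The choice of the Hilbert-Schmidt ensemble (Ginibre-induced states) and the equal mixing weights are illustrative assumptions, not taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_qubit_state(rng):
    """Hilbert-Schmidt random 2x2 density matrix: rho = G G^dag / tr(G G^dag)
    for a complex Ginibre matrix G (an assumed ensemble, for illustration)."""
    g = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def mixture_larger_eigenvalues(n_samples, rng):
    """Sample the larger eigenvalue of the equal-weight mixture (A + B)/2
    of two independent random qubit states."""
    out = np.empty(n_samples)
    for i in range(n_samples):
        a, b = random_qubit_state(rng), random_qubit_state(rng)
        out[i] = np.linalg.eigvalsh((a + b) / 2)[-1]  # eigvalsh sorts ascending
    return out

samples = mixture_larger_eigenvalues(20000, rng)
# For any qubit density matrix the larger eigenvalue lies in [1/2, 1],
# so the empirical spectral density is supported there.
print(samples.min(), samples.max())
```

A histogram of `samples` gives an empirical estimate of the spectral density whose closed form the paper derives for random qubits.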