We consider the variation, with ρ, of the eigenvalues, λ(ρ), and corresponding right and left eigenvectors, x(ρ) (≠ 0) and y(ρ) (≠ 0), of the parameter-dependent quadratic eigenvalue problem
(λ²M + λC + K)(ρ) x(ρ) = 0,   yᵀ(ρ) (λ²M + λC + K)(ρ) = 0,   (1)
where M, C and K are mappings from the real line to the space of (real or complex) n×n matrices. We assume that (i) (1) has a semisimple eigenvalue of multiplicity r > 1 when ρ = ρ₀, (ii) M, C and K are analytic in some open neighborhood D₀ of ρ₀, and (iii) λ, x and y are sufficiently differentiable at ρ₀. We develop new algorithms for computing derivatives of λ, x and y at ρ₀. In practical numerical computation there is no clear distinction between exact and approximate equality: standard methods for computing derivatives of λ, x and y in (1) break down when eigenvalues are merely very close, and methods requiring the eigenvalue derivatives to be distinct break down when those derivatives are also very close. By designing algorithms that remain valid when both the eigenvalues and their derivatives are multiple, we obtain methods that are much more accurate than classical methods for tightly clustered eigenvalues. In most applications M, C and K are known in closed form, so accurate values of their derivatives at ρ₀ are available. Extending ideas from the earlier work of [Andrew & Tan], we use these derivatives to convert what would otherwise be an ill-conditioned problem into a well-conditioned one.
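As a concrete illustration of (1) at a fixed parameter value, the following minimal sketch (an illustration only, not one of the algorithms developed here) solves the quadratic eigenvalue problem at ρ₀ by the standard first companion linearization; the function name quadratic_eig and the matrices M0, C0, K0 are illustrative assumptions, not quantities from this work.

```python
# Minimal sketch (illustration only): solve (lambda^2 M + lambda C + K) x = 0
# at a fixed parameter value rho0 via the first companion linearization
# A z = lambda B z with z = [x; lambda x].  M0, C0, K0 are placeholders.
import numpy as np
from scipy.linalg import eig

def quadratic_eig(M, C, K):
    """Eigenvalues and right eigenvectors of lambda^2 M + lambda C + K."""
    n = M.shape[0]
    I, Z = np.eye(n), np.zeros((n, n))
    A = np.block([[Z, I], [-K, -C]])   # companion linearization
    B = np.block([[I, Z], [Z, M]])
    lam, Zvec = eig(A, B)              # generalized eigenproblem A z = lam B z
    return lam, Zvec[:n, :]            # top block of z recovers x

M0 = np.eye(2)                              # M(rho0), illustrative
C0 = np.array([[0.2, 0.0], [0.0, 0.2]])     # C(rho0), illustrative
K0 = np.array([[2.0, -1.0], [-1.0, 2.0]])   # K(rho0), illustrative
eigvals, eigvecs = quadratic_eig(M0, C0, K0)
```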
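For contrast with the new algorithms, the sketch below evaluates the classical first-order formula for the derivative of a simple eigenvalue, λ'(ρ₀) = −yᵀ(λ²M′ + λC′ + K′)x / yᵀ(2λM + C)x, obtained by differentiating (1) with respect to ρ and premultiplying by yᵀ. It uses the closed-form derivatives of M, C and K referred to above, but it is exactly the kind of formula that loses accuracy for tightly clustered eigenvalues; the function name and arguments are illustrative assumptions.

```python
# Minimal sketch (classical approach, not this paper's algorithms): first-order
# derivative of a SIMPLE eigenvalue of (1),
#   lambda'(rho0) = - y^T (lambda^2 M' + lambda C' + K') x
#                     / ( y^T (2 lambda M + C) x ),
# where dM, dC, dK are closed-form derivatives of M, C, K at rho0 and
# (lam, x, y) is a computed eigentriple at rho0.  All names are illustrative.
import numpy as np

def simple_eigenvalue_derivative(lam, x, y, M, C, dM, dC, dK):
    """Classical first-order derivative formula for a simple eigenvalue."""
    num = y @ (lam**2 * dM + lam * dC + dK) @ x   # y^T (dQ/drho) x
    den = y @ (2.0 * lam * M + C) @ x             # y^T (dQ/dlambda) x
    # When eigenvalues (or their derivatives) are nearly multiple, the computed
    # x and y, and hence num and den, become unreliable -- the breakdown of
    # standard methods that motivates the algorithms of this work.
    return -num / den
```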