Structured backward errors in linearizations
Robol L.
2021
Abstract
A standard approach to calculate the roots of a univariate polynomial is to compute the eigenvalues of an associated confederate matrix instead, such as the companion or comrade matrix. The eigenvalues of the confederate matrix can be computed by Francis's QR algorithm. Unfortunately, even though the QR algorithm is provably backward stable, mapping the errors back to the original polynomial coefficients can still lead to large errors. However, the latter statement assumes the use of a non-structure-exploiting QR algorithm. In [J. L. Aurentz et al., Fast and backward stable computation of roots of polynomials, SIAM J. Matrix Anal. Appl., 36 (2015), pp. 942-973] it was shown that a structure-exploiting QR algorithm for companion matrices leads to a structured backward error in the companion matrix. The proof relied on decomposing the error into two parts: a part related to the recurrence coefficients of the basis (a monomial basis in that case) and a part linked to the coefficients of the original polynomial. In this article we prove that the analysis can be extended to other classes of comrade matrices. We first provide an alternative backward stability proof in the monomial basis using structured QR algorithms; our new point of view shows more explicitly how a structured, decoupled error in the confederate matrix gets mapped to the associated polynomial coefficients. This insight reveals which properties have to be preserved by a structure-exploiting QR algorithm to end up with a backward stable algorithm. We show that the previously formulated companion analysis fits into this framework, and we analyze in more detail Jacobi polynomials (comrade matrices) and Chebyshev polynomials (colleague matrices).
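As a concrete illustration of the confederate-matrix approach described in the abstract, the sketch below builds the companion matrix of a monic polynomial in the monomial basis and reads off its roots as eigenvalues, and then does the same for a polynomial expressed in the Chebyshev basis via NumPy's colleague-matrix routines. This is a minimal sketch, not the structured QR algorithms analyzed in the paper: it relies on NumPy's dense, unstructured eigensolver, and the helper name `companion_roots` and the example polynomials are ours, not taken from the article.

```python
import numpy as np
from numpy.polynomial import chebyshev as Cheb

def companion_roots(coeffs):
    """Roots of the monic polynomial x^n + c[n-1]*x^(n-1) + ... + c[0],
    computed as eigenvalues of its companion matrix.
    `coeffs` = [c0, ..., c_{n-1}]; illustrative helper only, using the
    dense, non-structure-exploiting QR eigensolver in LAPACK."""
    n = len(coeffs)
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)        # ones on the subdiagonal
    C[:, -1] = -np.asarray(coeffs)    # last column holds the negated coefficients
    return np.linalg.eigvals(C)

# Monomial basis: p(x) = x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3)
print(np.sort(companion_roots([-6.0, 11.0, -6.0])))   # roots 1, 2, 3 (up to rounding)

# Chebyshev basis: p(x) = 0.25*T_3(x) + 0.5*T_1(x) = x^3 - 0.25*x
c = [0.0, 0.5, 0.0, 0.25]                 # coefficients of T_0, T_1, T_2, T_3
A = Cheb.chebcompanion(c)                 # colleague matrix of the Chebyshev series
print(np.sort(np.linalg.eigvals(A)))      # roots -0.5, 0, 0.5 (up to rounding)
print(Cheb.chebroots(c))                  # same roots via NumPy's wrapper
```

NumPy's `np.roots` and `chebroots` follow this same recipe internally (companion and colleague matrix, respectively, followed by an unstructured eigenvalue computation), which is exactly the setting in which the backward-error question studied in the paper arises.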
| File | Description | Type | Access | Size | Format |
|---|---|---|---|---|---|
| prod_468672-doc_189437.pdf | Structured backward errors in linearizations | Published version (PDF) | Open access | 623.25 kB | Adobe PDF |