Pseudospectra are a helpful tool for analyzing the behavior of systems involving non-normal matrices or linear operators. In this paper, we present a new method to approximate the pseudospectra of large-scale matrices. Using the Induced Dimension Reduction (IDR) iteration, which was originally proposed for solving systems of linear equations, we obtain a Hessenberg decomposition from which we approximate the pseudospectra of a matrix; since IDR is a short-recurrence method, it is attractive for large-scale computations. Additionally, the IDR polynomial that creates this Hessenberg decomposition is also used as a filter to discard unwanted eigenvalues, which is especially useful for computing pseudospectra of large matrices. Numerical experiments and comparisons on test matrices from the literature show that the proposed method is much more efficient than the grid-SVD method, the inverse Lanczos method, and the implicitly restarted Arnoldi method (IRAM).
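The idea of approximating pseudospectra from a Hessenberg decomposition can be sketched as follows; this is a minimal illustration, not the paper's IDR-based method. We use a full Hessenberg reduction from SciPy in place of the IDR short recurrences, and a random non-normal test matrix of our own choosing. The ε-pseudospectrum is the set of points z where the smallest singular value of zI − A (here approximated by zI − H) is at most ε.

```python
import numpy as np
from scipy.linalg import hessenberg, svdvals

# Hypothetical small non-normal test matrix (shifted Jordan-like structure
# plus noise); the paper instead targets large-scale matrices.
n = 64
rng = np.random.default_rng(0)
A = np.diag(np.ones(n - 1), 1) + 0.3 * rng.standard_normal((n, n)) / np.sqrt(n)

# Reduce A to Hessenberg form H = Q^T A Q. This full reduction stands in
# for the (much cheaper) Hessenberg decomposition the IDR iteration builds
# via short recurrences.
H = hessenberg(A)

# Grid-based pseudospectrum approximation: evaluate sigma_min(zI - H)
# at each point z of a rectangular grid in the complex plane.
x = np.linspace(-2.0, 2.0, 50)
y = np.linspace(-2.0, 2.0, 50)
sigmin = np.empty((len(y), len(x)))
I = np.eye(n)
for i, yi in enumerate(y):
    for j, xj in enumerate(x):
        z = xj + 1j * yi
        # svdvals returns singular values in descending order,
        # so the last entry is sigma_min.
        sigmin[i, j] = svdvals(z * I - H)[-1]

# Contour lines of sigmin at levels eps trace the boundaries of the
# eps-pseudospectra (e.g. with matplotlib's contour on log10(sigmin)).
```

The grid-SVD method mentioned in the comparison evaluates σ_min(zI − A) on the full matrix at every grid point; replacing A by a small Hessenberg matrix is what makes the computation tractable at large scale.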