This paper provides Monte Carlo evidence on the performance of the method of moments (MM), maximum likelihood (ML), and ordinary least squares (OLS) estimators of the credit loss distribution implied by the Merton (1974) and Vasicek (1987, 2002) framework when the common or idiosyncratic asset-return factor is non-Gaussian and the true credit loss distribution therefore deviates from the theoretical one. We find that OLS and ML outperform MM in small samples when the true data-generating process includes a non-Gaussian common factor; this advantage widens as the sample size increases and holds in all cases considered. We also find that all three estimators exhibit large bias and variance when the true data-generating process includes a non-Gaussian idiosyncratic factor. This last result holds regardless of sample size and across asset correlation levels, and it intensifies for positive values of the shape parameter.
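The experimental design described above can be sketched in code. The snippet below is a minimal illustration, not the paper's implementation: it simulates per-period default rates under the one-factor Merton/Vasicek model, optionally replacing the Gaussian common factor with a standardized Student-t as one plausible non-Gaussian choice, and recovers the parameters with a simple method-of-moments estimator. All function names, parameter defaults, and the t-distribution choice are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal


def simulate_default_rates(p=0.02, rho=0.15, n_obligors=1000, n_periods=200,
                           common_dist="gaussian", df=5, seed=0):
    """Simulate per-period portfolio default rates under a one-factor model.

    Asset return: X_i = sqrt(rho) * Z + sqrt(1 - rho) * eps_i, with default
    when X_i < norm.ppf(p). Setting common_dist="student_t" swaps Z for a
    standardized Student-t factor (an illustrative non-Gaussian DGP).
    """
    rng = np.random.default_rng(seed)
    threshold = norm.ppf(p)
    if common_dist == "gaussian":
        Z = rng.standard_normal(n_periods)
    else:
        # Student-t rescaled to unit variance (requires df > 2).
        Z = rng.standard_t(df, n_periods) / np.sqrt(df / (df - 2))
    eps = rng.standard_normal((n_periods, n_obligors))
    X = np.sqrt(rho) * Z[:, None] + np.sqrt(1 - rho) * eps
    return (X < threshold).mean(axis=1)


def mm_estimates(rates):
    """Method-of-moments estimates of (p, rho).

    Matches the sample mean and variance of default rates to the Vasicek
    moments: E[rate] = p and Var[rate] = Phi2(k, k; rho) - p^2, where
    k = norm.ppf(p) and Phi2 is the bivariate normal CDF. The variance
    equation is inverted by a simple grid search over rho.
    """
    p_hat = rates.mean()
    sample_var = rates.var(ddof=1)
    k = norm.ppf(p_hat)
    grid = np.linspace(0.005, 0.995, 199)
    implied_var = np.array([
        multivariate_normal.cdf([k, k], cov=[[1.0, r], [r, 1.0]]) - p_hat**2
        for r in grid
    ])
    rho_hat = grid[np.argmin(np.abs(implied_var - sample_var))]
    return p_hat, rho_hat
```

Comparing `mm_estimates` under `common_dist="gaussian"` versus `"student_t"` across many replications would reproduce, in miniature, the kind of bias and variance comparison the paper carries out (the paper additionally studies ML and OLS estimators, which are omitted here for brevity).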