The asymptotic normality of the likelihood ratio goodness-of-fit statistic is established for testing the fit of log-linear models with closed-form maximum likelihood estimates in sparse contingency tables. In contrast to the traditional chi-squared theory, the number of categories in the table is allowed to increase with the sample size, and not all of the expected frequencies are required to become large. Some results of a small Monte Carlo study are presented. The traditional chi-squared approximation is reasonably accurate for the Pearson statistic in many sparse tables, but cases are presented in which it fails. The normal approximation can be much more accurate than the chi-squared approximation for the likelihood ratio statistic, although bias in the estimated moments is a potential problem for very sparse tables.
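As a concrete illustration (not taken from the paper), the following Python sketch computes the likelihood ratio statistic G² and the Pearson statistic X² for a simulated sparse two-way table under the independence model, whose maximum likelihood estimates have the closed form m̂ᵢⱼ = nᵢ₊n₊ⱼ/n, and compares the traditional chi-squared p-value with a normal approximation after standardization. The table dimensions, seed, and in particular the plug-in moments (df and 2·df) are illustrative assumptions, not the refined moment estimates developed in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulate a sparse 10 x 10 table: the number of cells is comparable
# to the total sample size, so many expected frequencies are small.
rows, cols = 10, 10
n = 100
p = np.full(rows * cols, 1.0 / (rows * cols))
table = rng.multinomial(n, p).reshape(rows, cols).astype(float)

# Closed-form MLEs of the expected frequencies under independence:
# m_hat[i, j] = (row total i) * (column total j) / n.
m_hat = np.outer(table.sum(axis=1), table.sum(axis=0)) / n

# Likelihood ratio statistic G^2 = 2 * sum n_ij * log(n_ij / m_ij),
# with 0 * log(0) treated as 0 for empty cells.
pos = table > 0
G2 = 2.0 * np.sum(table[pos] * np.log(table[pos] / m_hat[pos]))

# Pearson statistic X^2 = sum (n_ij - m_ij)^2 / m_ij.
X2 = np.sum((table - m_hat) ** 2 / m_hat)

df = (rows - 1) * (cols - 1)

# (a) Traditional chi-squared approximation for both statistics.
p_chi2_G2 = stats.chi2.sf(G2, df)
p_chi2_X2 = stats.chi2.sf(X2, df)

# (b) Normal approximation for G^2 after standardizing by estimated
# moments. The paper estimates these moments from the fitted
# frequencies; mu = df and sigma^2 = 2 * df are crude stand-ins here.
mu, sigma = df, np.sqrt(2.0 * df)
p_norm_G2 = stats.norm.sf((G2 - mu) / sigma)

print(f"G^2 = {G2:.2f}, X^2 = {X2:.2f}, df = {df}")
print(f"chi-squared p-values: G^2 -> {p_chi2_G2:.3f}, X^2 -> {p_chi2_X2:.3f}")
print(f"normal-approximation p-value for G^2: {p_norm_G2:.3f}")
```

Running the sketch for several seeds shows the qualitative pattern described above: the chi-squared reference tends to track X² more closely than G² when the table is sparse, while the quality of the normal approximation for G² hinges on how well the standardizing moments are estimated.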