NORM and Missing Data Options

During the 1980s, Rubin, Little, and others established the statistical foundations of missing-data problems. A Bayesian justification for multiple imputation provided a principled approach to "filling in" missing data and pooling estimates across the completed datasets. NORM, by Joseph Schafer, is an easy-to-use Windows-based program from the late 1990s that implements the methods of Rubin and Little. Major benefits of NORM are its ease of use and the author's excellent commentary, guidance, and theoretical contributions; its educational value should not be overlooked. One downside is that for some workflow styles the interface becomes a burden. On the other hand, the methods behind NORM have since been incorporated into packages such as SAS and Stata, where the missing-data routines are better integrated with the other statistical models and methods. That integration reduces the burden of use in a more general data-analysis setting.
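The "fill in and pool" idea can be illustrated with a toy sketch. This is not NORM's actual EM/data-augmentation algorithm; it simply draws each imputed value from a normal distribution fit to the observed cases, estimates the mean on each completed dataset, and combines the results with Rubin's rules (pooled estimate = average across imputations; total variance = within-imputation variance plus an inflated between-imputation variance). The function name and data are hypothetical.

```python
import random
import statistics

def multiply_impute_mean(data, m=20, seed=0):
    """Toy multiple imputation for the mean of a list, with None as missing.

    Each of the m imputations draws missing values from a normal
    distribution fit to the observed cases (a crude stand-in for a real
    imputation model), then the m estimates are pooled with Rubin's rules.
    """
    rng = random.Random(seed)
    observed = [x for x in data if x is not None]
    mu, sd = statistics.mean(observed), statistics.stdev(observed)
    estimates = []
    for _ in range(m):
        # "Fill in" the missing entries to form a completed dataset
        completed = [x if x is not None else rng.gauss(mu, sd) for x in data]
        estimates.append(statistics.mean(completed))
    # Rubin's rules for pooling across the m completed-data analyses
    q_bar = statistics.mean(estimates)          # pooled point estimate
    b = statistics.variance(estimates)          # between-imputation variance
    u_bar = sd ** 2 / len(data)                 # within-imputation variance (approx.)
    total_var = u_bar + (1 + 1 / m) * b
    return q_bar, total_var

est, var = multiply_impute_mean([1.2, None, 0.8, 1.5, None, 1.1, 0.9, 1.3])
```

A real analysis would use a proper imputation model (e.g., NORM's data augmentation under a multivariate normal model) and compute the within-imputation variance from each completed dataset, but the pooling step shown here is the same in spirit.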
