The Loyola Experience (1993-2009): Optimal Data Analysis in the Department of Psychology

Fred B. Bryant, Ph.D.

Loyola University Chicago

This article traces the origins and development of the use of optimal data analysis (ODA) within the Department of Psychology at Loyola University Chicago over the past 17 years. An initial set of ODA-based articles by Loyola faculty laid the groundwork for a sustained upsurge in the use of ODA among graduate students, one that has lasted for more than a decade and a half. These student projects subsequently fueled an increase in ODA-based publications by other Loyola Psychology faculty, who directly supervised the various student projects. Thus, ODA initially trickled down from faculty to students, but its influence later flowed back up in the opposite direction. The most frequent use of ODA in Loyola’s Psychology Department has been to conduct classification tree analysis; less common uses of ODA include optimal discriminant analysis and the iterative structural decomposition of transition tables. As more Loyola Psychology graduate students find academic jobs and continue using ODA in their research, we expect that they will replicate the Loyola experience in these new academic settings.

View journal article

Optimal Data Analysis: A General Statistical Analysis Paradigm

Paul R. Yarnold, Ph.D., and Robert C. Soltysik, M.S.

Optimal Data Analysis, LLC

Optimal data analysis (ODA) is a new paradigm in the general statistical analysis of data, one that explicitly maximizes the accuracy achieved by a model for every statistical analysis, in the context of exact distribution theory. This paper reviews optimal analogues of traditional statistical methods, as well as new special-purpose models for which no conventional alternatives exist.

View journal article

Maximizing Accuracy of Classification Trees by Optimal Pruning

Paul R. Yarnold, Ph.D., and Robert C. Soltysik, M.S.

Optimal Data Analysis, LLC

We describe a pruning methodology that maximizes the effect strength for sensitivity (ESS) of classification tree models. After the initial “Bonferroni-pruned” model is deconstructed into all possible nested sub-branches, the sub-branch that explicitly maximizes mean sensitivity is identified. This methodology is illustrated using models predicting in-hospital mortality of 1,193 (Study 1) and 1,660 (Study 2) patients with AIDS-related Pneumocystis carinii pneumonia.
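
To make the enumerate-and-select idea concrete, the Python sketch below (simulated data and a hand-built, hypothetical tree; not the authors’ ODA software) generates every nested sub-tree of a small classification tree and keeps the one with the greatest mean per-class sensitivity.

    from dataclasses import dataclass
    from itertools import product
    from typing import Optional
    import numpy as np

    @dataclass
    class Node:
        label: int                     # majority class at this node (prediction if it becomes a leaf)
        attr: Optional[int] = None     # attribute index used for the split (None for a leaf)
        cut: Optional[float] = None    # cutpoint: descend left if x[attr] <= cut
        left: Optional["Node"] = None
        right: Optional["Node"] = None

    def prunings(node):
        """Yield every nested sub-tree obtained by collapsing internal nodes into leaves."""
        yield Node(label=node.label)                  # option 1: collapse this node to a leaf
        if node.attr is not None:                     # option 2: keep the split, prune the children
            for l, r in product(prunings(node.left), prunings(node.right)):
                yield Node(node.label, node.attr, node.cut, l, r)

    def predict(tree, x):
        while tree.attr is not None:
            tree = tree.left if x[tree.attr] <= tree.cut else tree.right
        return tree.label

    def mean_sensitivity(tree, X, y):
        pred = np.array([predict(tree, x) for x in X])
        return np.mean([np.mean(pred[y == c] == c) for c in np.unique(y)])

    # Simulated two-attribute data and a hand-built two-split tree (both splits hypothetical).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + 0.2 * rng.normal(size=200) > 0).astype(int)
    tree = Node(label=0, attr=0, cut=0.0,
                left=Node(label=0, attr=1, cut=0.5, left=Node(0), right=Node(1)),
                right=Node(label=1))

    best = max(prunings(tree), key=lambda t: mean_sensitivity(t, X, y))
    print("best pruned model, mean sensitivity =", round(mean_sensitivity(best, X, y), 3))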

View journal article

Two-Group MultiODA: A Mixed-Integer Linear Programming Solution with Bounded M

Robert C. Soltysik, M.S., and Paul R. Yarnold, Ph.D.

Optimal Data Analysis, LLC

Prior mixed-integer linear programming procedures for obtaining two-group multivariable optimal discriminant analysis (MultiODA) models require estimation of the value of a parameter, M. A new formulation is presented that establishes a lower bound for M and executes more quickly than prior formulations. Also discussed are a sufficient condition for the nonexistence of classification gaps and ambiguous solutions, optimal weighted classification, the use of nonlinear terms, the selection of an optimal subset of attributes, and the aggregation of duplicate observations. When the design involves six or fewer binary attributes, MultiODA models may easily be obtained even for massive samples.
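
As a rough guide to the role that M plays, a generic big-M mixed-integer formulation of the two-group problem (reconstructed from the description above; not necessarily the article’s exact formulation or its bound on M) can be written as:

    \begin{aligned}
    \min_{\mathbf{w},\,c,\,\mathbf{d}} \quad & \sum_{i=1}^{n} d_i \\
    \text{subject to} \quad & \mathbf{w}^{\top}\mathbf{x}_i - c + M d_i \ge \varepsilon, \quad i \in \text{group 1},\\
    & c - \mathbf{w}^{\top}\mathbf{x}_i + M d_i \ge \varepsilon, \quad i \in \text{group 2},\\
    & d_i \in \{0, 1\}, \quad i = 1, \ldots, n.
    \end{aligned}

Here d_i = 1 permits observation i to be misclassified, ε > 0 keeps the two classification regions separated, and M need only be large enough that each constraint remains satisfiable whenever d_i = 1; the smaller M can safely be made, the tighter the formulation and the faster it tends to solve, which is the motivation for establishing a lower bound on M.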

View journal article

Unconstrained Covariate Adjustment in CTA

Paul R. Yarnold, Ph.D. and Robert C. Soltysik, M.S.

Optimal Data Analysis, LLC

In traditional statistical covariate analysis it is common practice to force entry of the covariate into the model first, in order to eliminate the effect of the covariate (i.e., to “equate the groups”) on the dependent measure. In contrast, in classification tree analysis (CTA) the covariate is treated as an ordinary attribute that must compete with other eligible attributes for selection into the model on the basis of operator-specified options. This paper illustrates optimal covariate analysis with an application in which CTA is used to predict patients’ in-hospital mortality.
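
The contrast can be illustrated with the Python sketch below (simulated data, hypothetical attribute names, and a simplified split criterion; not the ODA/CTA software): each candidate attribute, covariate included, is scanned for its best single cutpoint, and the attribute with the highest mean sensitivity claims the root split, rather than the covariate being forced in first.

    import numpy as np

    def best_cut(x, y):
        """Return (mean sensitivity, cutpoint) for the best binary split of x predicting y."""
        best = (0.0, None)
        for cut in np.unique(x)[:-1]:
            pred = (x > cut).astype(int)
            sens = np.mean([np.mean(pred[y == c] == c) for c in (0, 1)])
            sens = max(sens, 1.0 - sens)          # allow either class assignment for the split
            best = max(best, (sens, cut))
        return best

    rng = np.random.default_rng(1)
    n = 300
    mortality = rng.integers(0, 2, n)                         # hypothetical outcome
    attributes = {
        "age_covariate": mortality * 2 + rng.normal(size=n),  # the covariate
        "albumin":       -mortality + rng.normal(size=n),     # ordinary attributes
        "resp_rate":     rng.normal(size=n),
    }
    scores = {name: best_cut(x, mortality) for name, x in attributes.items()}
    winner = max(scores, key=lambda k: scores[k][0])
    print("root attribute:", winner, "| mean sensitivity:", round(scores[winner][0], 3))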

View journal article

Maximizing the Accuracy of Probit Models via UniODA

Barbara M. Yarnold, J.D., Ph.D. and Paul R. Yarnold, Ph.D.

Optimal Data Analysis, LLC

Paralleling the procedure used to maximize ESS of linear models derived using logistic regression analysis or Fisher’s discriminant analysis, univariate optimal discriminant analysis (UniODA) is applied to the predicted response function values provided by a model derived by probit analysis (PA), and returns an adjusted decision criterion for making classification decisions. ESS obtains its theoretical maximum value with this adjusted decision criterion, and the ability of the PA model to return accurate classifications is optimized. UniODA-refinement of a PA model is illustrated using an example involving political science analysis of federal courts.
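
The threshold-refinement step can be illustrated with the Python sketch below, which uses simulated data and a simplified two-class ESS (mean per-class sensitivity rescaled so that 0 is chance and 100 is perfect): every candidate cutoff on the probit model’s predicted response values is scanned, and the one that maximizes ESS is kept instead of the conventional 0.5 cutoff. This is a minimal sketch of the idea, not the authors’ UniODA software.

    import numpy as np
    from scipy.stats import norm

    def ess(pred, y):
        """Two-class effect strength for sensitivity: mean per-class accuracy, rescaled to 0-100."""
        mean_sens = np.mean([np.mean(pred[y == c] == c) for c in (0, 1)])
        return 100.0 * (mean_sens - 0.5) / 0.5

    # Simulated data whose true data-generating model is probit.
    rng = np.random.default_rng(2)
    n = 500
    x = rng.normal(size=n)
    y = (rng.uniform(size=n) < norm.cdf(1.5 * x - 0.3)).astype(int)
    p = norm.cdf(1.5 * x - 0.3)        # predicted response values from the (here, known) probit model

    # Scan every observed predicted value as a candidate decision criterion.
    cutoffs = np.unique(p)
    best_cut = max(cutoffs, key=lambda c: ess((p >= c).astype(int), y))
    print("ESS at 0.5 cutoff:    ", round(ess((p >= 0.5).astype(int), y), 1))
    print("ESS at refined cutoff:", round(ess((p >= best_cut).astype(int), y), 1),
          "(cutoff =", round(float(best_cut), 3), ")")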

View journal article