Diagnostic odds ratio

In medical testing with binary classification, the diagnostic odds ratio (DOR) is a measure of the effectiveness of a diagnostic test. It is defined as the ratio of the odds of the test being positive if the subject has the disease to the odds of the test being positive if the subject does not have the disease.

The diagnostic odds ratio is defined mathematically as

    DOR = (TP/FN) / (FP/TN) = (TP × TN) / (FP × FN),

where TP, FP, FN and TN are the numbers of true positives, false positives, false negatives and true negatives, respectively.

The diagnostic odds ratio ranges from zero to infinity, although for useful tests it is greater than one, and higher diagnostic odds ratios indicate better test performance. A diagnostic odds ratio less than one indicates that the test can be improved simply by inverting its outcome (the test points in the wrong direction), while a diagnostic odds ratio of exactly one means that the test is equally likely to be positive whatever the true condition, i.e. the test gives no information.

The diagnostic odds ratio may also be expressed in terms of the sensitivity and specificity of the test:

    DOR = [sensitivity / (1 − sensitivity)] / [(1 − specificity) / specificity].

The log diagnostic odds ratio is sometimes used in meta-analyses of diagnostic test accuracy studies because of its simplicity, being approximately normally distributed.

The four counts can be arranged in a 2×2 confusion matrix:

                     Condition positive      Condition negative
    Test positive    true positive (TP)      false positive (FP)
    Test negative    false negative (FN)     true negative (TN)

The diagnostic odds ratio is undefined when the number of false negatives or false positives is zero: if both false negatives and false positives are zero, the test is perfect, but if only one of them is zero, the ratio does not give a usable measure. The typical response to such a scenario is to add 0.5 to all cells of the contingency table, although this should not be seen as a correction, since it introduces a bias into the results. It has been suggested that the adjustment be made to all contingency tables, even those with no zero cells.
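As a concrete illustration (not part of the original article), the following Python sketch computes the diagnostic odds ratio from the four cells of a 2×2 confusion matrix, applies the 0.5 adjustment described above when a cell would make the ratio undefined, and checks the equivalent sensitivity/specificity form. The function names, the adjust flag, and the example counts are illustrative assumptions, not data from the article.

import math

def diagnostic_odds_ratio(tp, fp, fn, tn, adjust=False):
    """Diagnostic odds ratio from a 2x2 confusion matrix.

    tp, fp, fn, tn -- true positives, false positives, false negatives, true negatives.
    adjust         -- if True, add 0.5 to every cell (the usual response when a
                      cell is zero; note that this biases the estimate).
    """
    if adjust or 0 in (fp, fn):
        # Add 0.5 to all cells so the ratio is defined even with zero cells.
        tp, fp, fn, tn = (x + 0.5 for x in (tp, fp, fn, tn))
    # DOR = (TP/FN) / (FP/TN) = (TP * TN) / (FP * FN)
    return (tp * tn) / (fp * fn)

def dor_from_sensitivity_specificity(sensitivity, specificity):
    """Equivalent expression of the DOR in terms of sensitivity and specificity."""
    return (sensitivity / (1 - sensitivity)) / ((1 - specificity) / specificity)

# Illustrative counts for a hypothetical test on 100 subjects.
tp, fp, fn, tn = 40, 10, 5, 45
dor = diagnostic_odds_ratio(tp, fp, fn, tn)
log_dor = math.log(dor)  # log DOR, approximately normally distributed
print(f"DOR = {dor:.2f}, log DOR = {log_dor:.2f}")

# The same value obtained via sensitivity and specificity.
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"DOR (sens/spec form) = {dor_from_sensitivity_specificity(sensitivity, specificity):.2f}")

In this sketch the 0.5 adjustment is applied automatically only when a zero cell would make the ratio undefined; following the suggestion in the text, it could instead be applied to every table by always passing adjust=True.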

[ "Confidence interval", "Odds ratio", "Receiver operating characteristic", "Meta-analysis", "diagnostic accuracy" ]