- Mastering Machine Learning with R (Second Edition)
- Cory Lesmeister
Logistic Regression and Discriminant Analysis
"The true logic of this world is the calculus of probabilities."
- James Clerk Maxwell, Scottish physicist
In the previous chapter, we took a look at using Ordinary Least Squares (OLS) to predict a quantitative outcome, in other words, linear regression. It is now time to shift gears somewhat and examine how we can develop algorithms to predict qualitative outcomes. Such outcome variables could be binary (male versus female, purchase versus no purchase, benign versus malignant tumor) or multinomial categories (education level or eye color). Regardless of whether the outcome of interest is binary or multinomial, the analyst's task is to predict the probability of an observation belonging to a particular category of the outcome variable. In other words, we develop an algorithm to classify the observations.
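As a brief preview of what this looks like in R, the following is a minimal sketch of a logistic regression producing class-membership probabilities. It uses the built-in mtcars data purely as a stand-in; the chapter itself works through the breast cancer data.

```r
# A minimal sketch of logistic regression with glm(); mtcars is only a
# stand-in dataset, not the chapter's Wisconsin Breast Cancer Data.
data(mtcars)

# Model the probability that a car has a manual transmission (am = 1)
fit <- glm(am ~ wt + hp, data = mtcars, family = binomial)
summary(fit)

# Predicted probabilities of class membership
probs <- predict(fit, type = "response")

# Classify with a 0.5 probability cutoff and compare to the actual labels
preds <- ifelse(probs > 0.5, 1, 0)
table(predicted = preds, actual = mtcars$am)
```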
To begin exploring classification problems, we will discuss why applying OLS linear regression is not the correct technique and how the algorithms introduced in this chapter can solve these issues. We will then look at the problem of predicting whether a biopsied tumor mass is benign or malignant. The dataset is the well-known and widely available Wisconsin Breast Cancer Data. To tackle this problem, we will begin by building and interpreting logistic regression models. We will also begin examining methods for selecting features and the most appropriate model. Next, we will discuss linear and quadratic discriminant analysis, comparing and contrasting them with logistic regression, and then build predictive models on the breast cancer data. Finally, we will wrap up by looking at multivariate adaptive regression splines and ways to select the best overall algorithm to address the question at hand. These techniques, along with creating train/test datasets and using cross-validation, will set the stage for the more advanced machine learning methods in subsequent chapters.
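To give a flavor of the workflow that follows, here is a minimal sketch, not the chapter's actual analysis, of creating a train/test split and fitting a linear discriminant analysis model with MASS::lda(). The iris data is used as a placeholder; the Wisconsin Breast Cancer Data itself ships with the MASS package as biopsy.

```r
# A minimal sketch of a train/test split and linear discriminant analysis;
# iris is only a placeholder dataset for illustration.
library(MASS)

set.seed(123)                                   # reproducible split
idx   <- sample(seq_len(nrow(iris)), size = floor(0.7 * nrow(iris)))
train <- iris[idx, ]
test  <- iris[-idx, ]

# Fit LDA on the training data and predict the held-out classes
lda_fit  <- lda(Species ~ ., data = train)
lda_pred <- predict(lda_fit, newdata = test)$class
table(predicted = lda_pred, actual = test$Species)
```

The same split-then-fit-then-evaluate pattern carries over to logistic regression, quadratic discriminant analysis, and multivariate adaptive regression splines, with cross-validation layered on top for model selection.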