One vs One & One vs All

A Binary Classification Approach to Solving Multi-Class Classification Problems

akhil anand
Artificial Intelligence in Plain English


What is a Multi-Class Classification Problem?

When we predict one class out of many classes, it is known as multi-class classification. Suppose your mother gives you the task of picking a mango from a basket containing a variety of fruits; indirectly, she has asked you to solve a multi-class classification problem.

But our main goal here is to apply a binary classification approach to get the prediction for a multi-class problem.

Why do we need One vs Rest and One vs One?

Some classification algorithms, such as Logistic Regression and the Support Vector Classifier, were not designed to solve multi-class classification problems directly. By applying a heuristic approach on top of these algorithms, we can still solve multi-class problems. Let's get started.

One Vs Rest

Suppose we have three classes: [Machine Learning, Deep Learning, NLP]. We have to find the final prediction from among them.

Step 1: Take the multi-class data and split it into multiple binary classification problems. Suppose we have 1,000 classes. Using One vs Rest, one class becomes class 0 and the remaining 999 classes become class 1. This process is repeated until every class has its own model.

number of models created = number of classes

Figure 1
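
To make Step 1 concrete, here is a minimal sketch (an illustration, not the author's code) that splits a multi-class label vector into one binary target per class, using the three example classes above and the same convention as the text (the class itself becomes class 0, everything else becomes class 1).

import numpy as np

# hypothetical multi-class labels for the three classes from the example above
labels = np.array(["Machine Learning", "Deep Learning", "NLP", "Deep Learning", "NLP"])

# One vs Rest: build one binary target per class
# (the class itself becomes class 0, every other class becomes class 1)
binary_targets = {cls: np.where(labels == cls, 0, 1) for cls in np.unique(labels)}

for cls, target in binary_targets.items():
    print(cls, target)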

Step 2: Each model predicts a probability. Whichever model has the highest probability is used for the final binary classification to get the result.

Suppose the probabilities for Model 1, Model 2, and Model 3 are 0.3, 0.5, and 0.2 respectively. Model 2 has the highest probability, so we will use Model 2 for the final prediction of class 0 or class 1, i.e.:

Model 2: Deep Learning (Class 0) vs [Machine Learning, NLP] (Class 1)

If the model predicts class 0, the result will be Deep Learning; otherwise the result will come from class 1, i.e. [Machine Learning, NLP].
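
Carrying the numbers from the example through, the sketch below (again, an illustration rather than the author's code) applies the One vs Rest decision rule: the class whose own-vs-rest model reports the highest probability wins. The mapping of models to classes is assumed here, except for Model 2, which the text ties to Deep Learning.

# hypothetical mapping of models to classes; the text only fixes Model 2 = Deep Learning
model_probabilities = {
    "Machine Learning": 0.3,  # Model 1: Machine Learning vs rest
    "Deep Learning": 0.5,     # Model 2: Deep Learning vs rest
    "NLP": 0.2,               # Model 3: NLP vs rest
}

# One vs Rest picks the class whose own-vs-rest model gives the highest probability
prediction = max(model_probabilities, key=model_probabilities.get)
print(prediction)  # Deep Learning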

# logistic regression for multi-class classification using a one-vs-rest strategy
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
# define dataset
X, y = make_classification(n_samples=1000, n_features=10, n_informative=5, n_redundant=5, n_classes=3, random_state=1)
# define model
model = LogisticRegression()
# define the ovr strategy
ovr = OneVsRestClassifier(model)
# fit model
ovr.fit(X, y)
# make predictions
yhat = ovr.predict(X)

Disadvantage

  • Since it builds as many models as there are classes, prediction is slow; in other words, it has high time complexity.
  • If there are hundreds of classes, the task becomes very arduous.

One Vs One

Suppose you have n classes; One vs One builds a model for each class paired against another class. Let's say you have four classes in a dataset: A, B, C, and D.

Step 1: Convert the multi-class dataset into binary classification problems. Here the number of models will be n(n-1)/2, as shown below:

Figure 2

In total, 6 binary classification problems will be formed, as shown below:

Figure 3
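
As a quick sanity check on the model count (this snippet is an illustration, not from the original post), we can enumerate the pairs for the four classes A, B, C, and D; for n classes, One vs One needs n(n-1)/2 binary models.

from itertools import combinations

classes = ["A", "B", "C", "D"]

# One vs One: one binary model per unordered pair of classes
pairs = list(combinations(classes, 2))
print(pairs)       # [('A', 'B'), ('A', 'C'), ('A', 'D'), ('B', 'C'), ('B', 'D'), ('C', 'D')]
print(len(pairs))  # 6, i.e. n*(n-1)/2 for n = 4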

Step 2: Find the probability from each model. Whichever model has the highest probability gives the final predicted output.

Figure 4

Since Model 3 has the highest probability, predicting class 0 gives the result A and predicting class 1 gives the result D.
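
The sketch below illustrates the decision rule described in Step 2. The probability values are hypothetical stand-ins for the ones shown in Figure 4; the only thing fixed by the text is that Model 3, the A-vs-D model, has the highest probability.

# hypothetical per-model probabilities; models are numbered in the pair order listed above
pair_probabilities = {
    ("A", "B"): 0.55,  # Model 1
    ("A", "C"): 0.60,  # Model 2
    ("A", "D"): 0.80,  # Model 3 (highest)
    ("B", "C"): 0.40,  # Model 4
    ("B", "D"): 0.35,  # Model 5
    ("C", "D"): 0.50,  # Model 6
}

# pick the pair model with the highest probability; class 0 is the first class
# of the pair and class 1 is the second, matching the text
best_pair = max(pair_probabilities, key=pair_probabilities.get)
print(best_pair)  # ('A', 'D') -> class 0 gives A, class 1 gives D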

# SVM for multi-class classification using one-vs-one
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.multiclass import OneVsOneClassifier
# define dataset
X, y = make_classification(n_samples=1000, n_features=10, n_informative=5, n_redundant=5, n_classes=3, random_state=1)
# define model
model = SVC()
# define ovo strategy
ovo = OneVsOneClassifier(model)
# fit model
ovo.fit(X, y)
# make predictions
yhat = ovo.predict(X)
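
A small note on the example above: scikit-learn's SVC already handles multi-class data with a one-vs-one scheme internally, so wrapping it in OneVsOneClassifier mainly makes the strategy explicit and lets you reuse the same wrapper with other binary estimators.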

Conclusion

Hope you liked this blog. If you have any suggestions for further improvement, please comment below.

If you want to learn more about this topic, please click this link.
