As you saw, support vector machines are another classification tool to add to your repertoire. While computationally expensive, they can be powerful tools that provide substantial performance gains in many instances.
You will be able to:
- Recall the importance of the C parameter for SVMs
- Recall common kernel functions and their associated hyperparameters
Probably the number one piece of information worth reviewing is the set of kernel functions that you can apply, along with their hyperparameters:

- Radial Basis Function (RBF)
  - $\gamma$, which can be specified using the keyword `gamma` in scikit-learn
- Polynomial Kernel
  - $\gamma$, which can be specified using the keyword `gamma` in scikit-learn
  - $r$, which can be specified using the keyword `coef0` in scikit-learn
  - $d$, which can be specified using the keyword `degree` in scikit-learn
- Sigmoid Kernel
  - $\gamma$, which can be specified using the keyword `gamma` in scikit-learn
  - $r$, which can be specified using the keyword `coef0` in scikit-learn
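The mapping from kernel hyperparameters to keyword arguments can be sketched as follows. This is a minimal illustration using scikit-learn's `SVC` on a synthetic dataset (the dataset and hyperparameter values are arbitrary choices for demonstration, not recommendations):

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Synthetic toy data for illustration only
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

# RBF kernel: gamma controls how far each support vector's influence reaches
rbf_clf = SVC(kernel="rbf", gamma=0.5, C=1.0)

# Polynomial kernel: gamma, coef0 (the r term), and degree (the d term)
poly_clf = SVC(kernel="poly", gamma=0.5, coef0=1.0, degree=3, C=1.0)

# Sigmoid kernel: gamma and coef0 (the r term)
sig_clf = SVC(kernel="sigmoid", gamma=0.5, coef0=0.0, C=1.0)

for clf in (rbf_clf, poly_clf, sig_clf):
    clf.fit(X, y)
    print(clf.kernel, "training accuracy:", clf.score(X, y))
```

Note that `gamma` also accepts the string values `"scale"` (the default) and `"auto"`, which derive a value from the training data rather than fixing it manually.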
Also recall that, in general, C is the regularization parameter that balances correctly classifying training examples against maximizing the width of the decision boundary margin: a small C produces a wider margin at the cost of more misclassified training points, while a large C prioritizes fitting the training data.
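One way to see this trade-off in practice is to compare the number of support vectors at two extremes of C. This is a rough sketch on synthetic, deliberately noisy data (the specific C values are arbitrary illustrations); a soft margin (small C) typically leaves more points inside or on the margin, so more points become support vectors:

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Noisy synthetic data (flip_y adds label noise) for illustration only
X, y = make_classification(n_samples=200, n_features=4,
                           flip_y=0.1, random_state=0)

# Small C: wide margin, tolerant of misclassified training points
soft = SVC(kernel="rbf", C=0.01).fit(X, y)

# Large C: narrow margin, tries hard to classify every training point
hard = SVC(kernel="rbf", C=100).fit(X, y)

print("support vectors (C=0.01):", soft.n_support_.sum())
print("support vectors (C=100):", hard.n_support_.sum())
print("train accuracy (C=0.01):", soft.score(X, y))
print("train accuracy (C=100):", hard.score(X, y))
```

Beware that the higher training accuracy of the large-C model does not imply better generalization; on noisy data it often signals overfitting, which is why C is usually tuned with cross-validation.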
While it may appear that this section is a bit brief, support vector machines are a powerful algorithm that deserves attention. In many cases, SVMs offer the top performance among scikit-learn's out-of-the-box classifiers. Moreover, learning to properly tune SVMs using kernels and an appropriate C value is critical. In the upcoming labs and lessons you'll further investigate and apply these concepts.