Assignment 6: Support Vector Machine


1 Concepts

Explain the following terms and how they are related to SVM in your own words and with (visual) examples:

  1. Linear separability

  2. Slack variables

  3. Kernel functions

2 Perceptron

  1. Define the classification function for the perceptron classifier.

  2. The dataset for the OR function is given by:

$$X = \begin{pmatrix} 0 & 0 \\ 0 & 1 \\ 1 & 0 \\ 1 & 1 \end{pmatrix}, \qquad y = \begin{pmatrix} -1 & 1 & 1 & 1 \end{pmatrix}^\top$$

Given the initial weights $w = (1, 1, 0.5)^\top$, where $w_3$ is the bias, perform the perceptron algorithm (slide 10) with learning rate $\eta = 0.6$ until all data points are correctly classified. Show your computations for each training step. (Note: in the case of $w^\top x = 0$, output $1$.)

3. Prove that the XOR function cannot be represented by a (linear) perceptron.
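The training procedure described above can be sketched in code. This is a minimal illustration, not the assignment's required hand computation; it assumes labels in $\{-1, +1\}$, a bias implemented as an appended constant input of 1, and the stated convention that $w^\top x = 0$ outputs $+1$.

```python
import numpy as np

def perceptron(X, y, w, eta, max_epochs=100):
    """Perceptron learning: cycle over the data, updating w on each mistake."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append bias input of 1
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            pred = 1 if xi @ w >= 0 else -1         # sign(0) outputs +1
            if pred != yi:                          # misclassified point
                w = w + eta * yi * xi               # update rule
                errors += 1
        if errors == 0:                             # all points correct: stop
            break
    return w

# OR dataset from the assignment (labels assumed to be -1/+1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([-1, 1, 1, 1])
w = perceptron(X, y, w=np.array([1.0, 1.0, 0.5]), eta=0.6)
```

After convergence, the returned weight vector classifies all four OR patterns correctly.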

3 Polynomial Kernel

The second-order polynomial feature map for a two-dimensional vector $x_i = (x_{i1}, x_{i2})^\top$ is defined as:

$$\phi(x_i) = \begin{pmatrix} x_{i1}^2 \\ \sqrt{2}\, x_{i1} x_{i2} \\ x_{i2}^2 \end{pmatrix}$$

Show that the mapping of the two-dimensional vector to three dimensions is not necessary for calculating the scalar product $\langle \phi(x_i), \phi(x_j) \rangle$. (Note: Transform the equation such that it only uses the scalar product of two-dimensional vectors.)
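The identity the exercise asks for can be checked numerically: the scalar product of the explicit 3-D feature vectors coincides with the squared 2-D scalar product. This sketch is only a sanity check on example vectors, not the requested algebraic derivation.

```python
import numpy as np

def phi(x):
    """Explicit second-order polynomial feature map for a 2-D vector."""
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

xi = np.array([1.0, 2.0])    # arbitrary example vectors
xj = np.array([3.0, -1.0])

explicit = phi(xi) @ phi(xj)  # scalar product in the 3-D feature space
kernel = (xi @ xj) ** 2       # same value computed from 2-D vectors only
assert np.isclose(explicit, kernel)
```

Because the kernel value can be computed directly from $x_i^\top x_j$, the 3-D mapping never needs to be formed explicitly; this is the kernel trick.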

4 Gaussian Kernel

For all students other than B.Sc. Data Science.

Slide 69 mentions that the Gaussian kernel, also called Radial Basis Function (RBF), projects to an infinite-dimensional feature space. Give an intuition on why this is the case and prove it. (Note: Use the Taylor expansion of $e^x$ to show that the Gaussian kernel is an infinite sum over polynomial kernels.)
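The intuition behind the proof can be illustrated numerically: writing $e^{-\|x-z\|^2/(2\sigma^2)}$ as a normalization factor times the Taylor series of $e^{x^\top z / \sigma^2}$ expresses the RBF kernel as an infinite sum over polynomial kernels $(x^\top z)^n$. The sketch below (an illustration, not the requested proof) compares the exact kernel to a truncated series.

```python
import numpy as np
from math import factorial

def rbf(x, z, sigma=1.0):
    """Exact Gaussian (RBF) kernel."""
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def rbf_taylor(x, z, sigma=1.0, terms=20):
    """Truncated Taylor expansion: each summand is a polynomial kernel term."""
    scale = np.exp(-(x @ x + z @ z) / (2 * sigma ** 2))
    series = sum((x @ z) ** n / (sigma ** (2 * n) * factorial(n))
                 for n in range(terms))
    return scale * series

x = np.array([0.5, -0.2])
z = np.array([0.1, 0.4])
assert np.isclose(rbf(x, z), rbf_taylor(x, z))
```

Since the series has infinitely many terms, each contributing a polynomial feature space of growing degree, the implicit feature space is infinite-dimensional.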

