# Toy Kernels

There is a large body of literature on mapping data into higher-dimensional spaces; a closely related and popular term is the kernel trick. See, for example, the publications of Prof. Bernhard Schölkopf.

In this short entry, I start with linearly non-separable data in 2D, map it to 3D, and visualize the result. This is of course just scratching the surface. In the coming few days I will explore the use of RBF kernels for mapping to Hilbert spaces, and possibly use this for transfer learning. Here I use the mapping $\phi : \Re^2 \to \Re^3$ defined as $(x_1, x_2) \to (x_1^2, x_2^2, \sqrt{2}\, x_1 x_2)$. Plotting the resulting 3D data shows that even a simple mapping like this can transform the data so that it becomes separable by a standard hyperplane (linearly separable). In upcoming entries I will try to use RBFs (Radial Basis Functions) to map to infinite-dimensional spaces. The toy experiment described here can be accomplished with the following Python code.
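As a side note, this particular $\phi$ corresponds to the homogeneous polynomial kernel of degree 2, since $\phi(x)^\top \phi(y) = (x^\top y)^2$ — the inner product in 3D can be computed without ever forming $\phi$ explicitly, which is the essence of the kernel trick. A quick numerical sanity check (the vectors are chosen arbitrarily):

```python
import numpy as np

def phi(x):
    """Map (x1, x2) -> (x1^2, x2^2, sqrt(2)*x1*x2)."""
    return np.array([x[0]**2, x[1]**2, np.sqrt(2) * x[0] * x[1]])

x = np.array([1.5, -2.0])
y = np.array([0.3, 4.0])

lhs = phi(x) @ phi(y)   # inner product in the mapped 3D space
rhs = (x @ y) ** 2      # polynomial kernel k(x, y) = (x . y)^2
print(lhs, rhs)         # the two agree up to floating-point error
```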


```python
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D

#
# Generate the inner Gaussian cluster
Ds = np.random.multivariate_normal([0, 0], [[5, 2], [2, 5]], 1000)

#
# Generate the outer ring
r = np.random.uniform(15, 17, 100)
theta = np.random.uniform(-np.pi, np.pi, 100)
Dt = np.zeros((100, 2))
Dt[:, 0] = r * np.cos(theta)
Dt[:, 1] = r * np.sin(theta)

D = np.concatenate((Ds.T, Dt.T), axis=1)

#
# Mapping with phi : (x1, x2) --> (x1^2, x2^2, sqrt(2) x1*x2)
D_phi3 = np.zeros((3, D.shape[1]))
D_phi3[0, :] = D[0, :]**2
D_phi3[1, :] = D[1, :]**2
D_phi3[2, :] = np.sqrt(2) * D[0, :] * D[1, :]

#
# Plot the original 2D data
plt.plot(D[0, 0:1000], D[1, 0:1000], 'r.')
plt.plot(D[0, 1000:], D[1, 1000:], 'b.')
plt.show()

#
# Plot the mapped 3D data
fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.scatter(D_phi3[0, 0:1000], D_phi3[1, 0:1000], D_phi3[2, 0:1000], c='r', marker='.')
ax.scatter(D_phi3[0, 1000:], D_phi3[1, 1000:], D_phi3[2, 1000:], c='b', marker='.')
plt.show()
```
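The separability claim can also be checked numerically rather than just visually. A small sketch, assuming the same data-generation parameters as above (with a fixed seed for reproducibility): in the mapped space the first two coordinates sum to $x_1^2 + x_2^2 = r^2$, so a plane of the form $z_1 + z_2 = c$ separates the inner cluster from the ring for any $c$ between the two groups.

```python
import numpy as np

rng = np.random.default_rng(0)

# Same toy data as above, regenerated with a fixed seed
inner = rng.multivariate_normal([0, 0], [[5, 2], [2, 5]], 1000)
r = rng.uniform(15, 17, 100)
theta = rng.uniform(-np.pi, np.pi, 100)
ring = np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)

# In the mapped space, z1 + z2 = x1^2 + x2^2 (the squared radius)
inner_z = (inner ** 2).sum(axis=1)
ring_z = (ring ** 2).sum(axis=1)

# Pick a threshold halfway between the two groups: the plane z1 + z2 = c
c = (inner_z.max() + ring_z.min()) / 2.0
print(inner_z.max() < c < ring_z.min())   # the plane separates the classes
```

The ring has squared radius of at least $15^2 = 225$, while the Gaussian cluster (variances of 5) essentially never reaches that far out, so the gap between the groups is wide.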