# kernel PCA (under construction)

(55 minutes to learn)

## Summary

-No Summary-

## Notes

This concept is still under construction.

## Context

This concept has the prerequisites:

- principal component analysis
- the kernel trick
- spectral decomposition (The spectral decomposition is used in deriving kernel PCA.)

## Goals

- Derive kernel PCA as a generalization of PCA.
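As a preview of the derivation, kernel PCA amounts to centering a kernel matrix and taking its spectral decomposition, rather than decomposing the data covariance as in ordinary PCA. A minimal sketch, assuming an RBF kernel and using only NumPy (the function name and parameters are illustrative, not from any of the resources below):

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Illustrative kernel PCA with an RBF kernel.

    Returns the projections of the training points onto the top
    n_components principal components in feature space.
    """
    # Pairwise squared Euclidean distances, then the RBF kernel matrix
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    K = np.exp(-gamma * d2)

    # Center the kernel matrix in feature space:
    # K_c = K - 1_N K - K 1_N + 1_N K 1_N, where 1_N is the N x N
    # matrix with every entry equal to 1/N
    N = K.shape[0]
    one_n = np.ones((N, N)) / N
    K_c = K - one_n @ K - K @ one_n + one_n @ K @ one_n

    # Spectral decomposition of the centered kernel matrix; keep the
    # eigenvectors with the largest eigenvalues
    eigvals, eigvecs = np.linalg.eigh(K_c)
    idx = np.argsort(eigvals)[::-1][:n_components]
    eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]

    # Projections of the training points are the scaled eigenvectors
    return eigvecs * np.sqrt(np.maximum(eigvals, 0.0))
```

With a linear kernel this reduces to ordinary PCA (up to sign), which is the sense in which kernel PCA generalizes PCA.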

## Core resources (read/watch one of the following)

## -Paid-

→ Pattern Recognition and Machine Learning

A textbook for a graduate machine learning course, with a focus on Bayesian methods.

Location:
Section 12.3, "Kernel PCA," pages 586-590

## Supplemental resources (the following are optional, but you may find them useful)

## -Paid-

→ Machine Learning: a Probabilistic Perspective

A very comprehensive graduate-level machine learning textbook.

Location:
Section 14.4.4, "Kernel PCA," pages 493-496

## See also

- Some other ways of learning nonlinear representations of data:
  - feed-forward neural nets, which adapt the representations in the context of some other learning objective
  - spectral embeddings, a very similar algorithm based on spectral graph theory