## Stuck on many things

- Have not correctly defined the recursion in the sequential updating approach. Can I really calculate \(P(y_3 | (y_2 | y_1))\) in place of \(P(y_3 | y_2, y_1)\)? e.g. instead of calculating:

\[V_3 = K(y_3, y_3) - K(y_{1:2}, y_3) K(y_{1:2}, y_{1:2})^{-1} K(y_3, y_{1:2}) \]

Is there some scalar \(v_2\) such that

\[V_3 = K(y_3, y_3) - K(y_{1:2}, y_3) K(y_3, y_{1:2}) / v_2\]

e.g. a term like:

\[V_2 = K(y_2, y_2) - K(y_1, y_2) K(y_1, y_1)^{-1} K(y_2, y_1) \]

and a similar recursion for the mean?
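Whether such a scalar recursion exists can be checked numerically. Conditioning on \(y_1\) first and then on \(y_2\) does reproduce the joint (direct) conditional variance, but only if the recursion carries forward the *updated* cross-covariances, not the prior kernel values. A minimal NumPy sketch (the squared-exponential kernel and test points are illustrative assumptions, not taken from the model):

```python
import numpy as np

def k(a, b, l=1.0):
    # squared-exponential kernel (form and length-scale are illustrative assumptions)
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / l**2)

x = np.array([0.0, 0.5, 1.2])  # locations of y_1, y_2, y_3 (arbitrary test points)
K = k(x, x)

# Direct method: V_3 = K(y3,y3) - K(y_{1:2},y3)' K(y_{1:2},y_{1:2})^{-1} K(y_{1:2},y3)
k3 = K[:2, 2]
V3_direct = K[2, 2] - k3 @ np.linalg.solve(K[:2, :2], k3)

# Sequential method: condition on y_1 first, carrying the full updated
# covariance S of (y_2, y_3), then condition on y_2 under S.
S = K[1:, 1:] - np.outer(K[1:, 0], K[0, 1:]) / K[0, 0]
v2 = S[0, 0]                      # this is exactly V_2 from the recursion above
V3_seq = S[1, 1] - S[0, 1] ** 2 / v2

# The two agree -- but note that the numerator uses the UPDATED cross-covariance
# S[0, 1], not the prior kernel value K(y_3, y_2), and the leading term is the
# y_1-updated variance S[1, 1], not K(y_3, y_3).
print(V3_direct, V3_seq)
```

So the denominator \(v_2\) in the conjectured formula is right, but a recursion in scalars alone drops the update to the cross-covariance terms, which may be where the sequential method is going wrong.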

- Cannot get the posterior covariance matrix from kernlab's `gausspr`. Can it be reconstructed from `alpha`? (Question posted to Cross Validated.)
- Likelihood optimization does not handle the noise term. Examples in Rasmussen and Williams suggest this should work. (See commit log below.)
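My reading of the standard GP regression equations (Rasmussen & Williams, ch. 2) is that `alpha` alone should not be enough: if it corresponds to \(\alpha = (K + \sigma_n^2 I)^{-1} y\), it determines the posterior mean, while the posterior covariance needs the training kernel matrix itself, which would have to be recomputed from the stored kernel function and training inputs. A NumPy sketch of those standard formulas (kernel, data, and noise level below are illustrative; the correspondence to `gausspr`'s internal `alpha` is an assumption):

```python
import numpy as np

def se_kernel(a, b, l=1.0):
    # squared-exponential kernel; hyperparameters are illustrative
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / l**2)

rng = np.random.default_rng(0)
x = np.linspace(0, 2, 8)               # training inputs
y = np.sin(x) + 0.1 * rng.standard_normal(8)
x_star = np.linspace(0, 2, 5)          # test inputs
sigma_n = 0.1                          # noise sd, assumed known here

K = se_kernel(x, x) + sigma_n**2 * np.eye(len(x))
K_star = se_kernel(x, x_star)

# alpha = (K + sigma_n^2 I)^{-1} y -- enough for the posterior mean...
alpha = np.linalg.solve(K, y)
mu = K_star.T @ alpha

# ...but the posterior covariance needs K (or its Cholesky factor), not just alpha
Sigma = se_kernel(x_star, x_star) - K_star.T @ np.linalg.solve(K, K_star)
```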

### nonparametric-bayes log

- re-run example with updated plots, but otherwise as before. 07:40 pm 2012/11/15
- Question on how to get the covariance matrix back from the kernlab gausspr method. 07:39 pm 2012/11/15
- Attempted to fix the sequential method example. Still incorrect. See sequential-method.Rmd 07:37 pm 2012/11/15
- Attempting to write the sequential method algorithm. Not correct at the moment, need to understand the recursion properly. 07:36 pm 2012/11/15
- Stick with direct method. Current version of sequential method is not correct. 07:34 pm 2012/11/15
- Updated hyperparameter-optimization example. Shows that we can estimate l appropriately but fails to get sigma_n; instead estimating the upper boundary. 07:32 pm 2012/11/15
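For reference on the sigma_n problem: the log marginal likelihood with an explicit noise term (Rasmussen & Williams eq. 2.30) can be optimized jointly over l and sigma_n; working on the log scale keeps both positive without box constraints, which might avoid the estimate pinning to the boundary. A sketch on synthetic data (kernel form, data, and starting values are assumptions, not the examples above):

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_marginal_likelihood(theta, x, y):
    # theta = (log l, log sigma_n); log scale keeps both parameters positive
    l, sigma_n = np.exp(theta)
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / l**2)
    Ky = K + sigma_n**2 * np.eye(len(x))
    L = np.linalg.cholesky(Ky)
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # negative of R&W eq. 2.30, dropping the (n/2) log 2*pi constant
    return 0.5 * y @ a + np.sum(np.log(np.diag(L)))

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 40)
y = np.sin(x) + 0.2 * rng.standard_normal(40)   # true noise sd = 0.2

res = minimize(neg_log_marginal_likelihood, x0=np.log([1.0, 0.5]), args=(x, y))
l_hat, sigma_hat = np.exp(res.x)                # both strictly positive by construction
```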