Comments on The Geomblog: "Manifold learning and Geometry" (Suresh Venkatasubramanian)

Igor (http://nuit-blanche.blogspot.com), 2007-09-22 05:27:

It sure looks like there is plenty of overlap, especially in light of the recent developments in compressed sensing. I wrote something about it at http://hunch.net/?p=273 (and the attendant comments). I am also intrigued by the connection to the nuclear norm, http://arxiv.org/abs/0706.4138, and by the diffusion method on manifolds of Lafon, Maggioni, and Coifman.

Suresh, 2007-09-21 22:59:

Excellent. We need a translator for machine learning and geometry :)

Anonymous, 2007-09-21 12:38:

Here are some other kinds of rough connections between learning and geometry, well known to those who know them. Surely there are more.

What in learning is called "leave one out", or "deletion", in geometry is called "backwards analysis".

What in learning is called "VC-dimension" in geometry is called, well, "VC-dimension".

What in learning is called "sample compression", using maybe "combinatorial dimension", in geometry is called "random sampling".

What in learning is called "sparse greedy approximation" in geometry is called "the Frank-Wolfe algorithm" and "coresets".

What in learning is called "winnow" in geometry is sometimes called "iterative reweighting".

...plus of course robust estimators, k-means, doubling dimension, etc.

-Ken

(Apologies if this comment is given twice.)
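As an editorial aside, the "sparse greedy approximation" / Frank-Wolfe entry in the dictionary above can be illustrated with a minimal sketch. The toy objective below (squared distance over the probability simplex) and all parameter choices are my own illustrative assumptions, not part of the original comment:

```python
import numpy as np

def frank_wolfe(b, iters=500):
    """Minimize ||x - b||^2 over the probability simplex (toy example).

    Each iteration solves a *linear* problem over the feasible set --
    for the simplex, that just means picking the vertex e_i with the
    smallest gradient coordinate -- and moves toward that vertex.
    This is the "sparse greedy" view: after t steps the iterate is a
    convex combination of at most t + 1 vertices.
    """
    n = len(b)
    x = np.full(n, 1.0 / n)           # start at the simplex center
    for t in range(iters):
        grad = 2.0 * (x - b)          # gradient of ||x - b||^2
        i = int(np.argmin(grad))      # linear oracle: best vertex e_i
        gamma = 2.0 / (t + 2.0)       # standard O(1/t) step size
        x *= (1.0 - gamma)            # shrink current iterate...
        x[i] += gamma                 # ...and step toward vertex e_i
    return x
```

When `b` already lies in the simplex, the iterates approach `b` itself at the usual O(1/t) rate, staying feasible (nonnegative, summing to one) at every step.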
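Similarly, the "winnow" / iterative-reweighting entry can be sketched with the classic multiplicative-weight update. The disjunction-learning setup and the parameter values below are illustrative assumptions, not from the comment:

```python
import numpy as np

def winnow(examples, labels, n, threshold=None, alpha=2.0):
    """Winnow for 0/1 features: a multiplicative-weight update.

    Predict positive when w . x >= threshold. On a mistake, reweight
    only the active coordinates: multiply by alpha on a false negative
    (promote), divide by alpha on a false positive (demote). This
    "iterative reweighting" gives mistake bounds logarithmic in n
    when the target depends on few features (e.g. a small disjunction).
    """
    if threshold is None:
        threshold = n / 2.0
    w = np.ones(n)
    for x, y in zip(examples, labels):
        pred = 1 if w @ x >= threshold else 0
        if pred != y:
            if y == 1:
                w[x == 1] *= alpha    # promote active features
            else:
                w[x == 1] /= alpha    # demote active features
    return w
```

The multiplicative (rather than additive, as in perceptron) update is what makes the connection to geometric iterative-reweighting arguments.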