Comments on The Geomblog: "Clustering: Hierarchical methods" (Suresh Venkatasubramanian)

Nish (2010-04-26):
Hi Geomblog,
Nice posts, I enjoy them. I do have a question: how does one quantitatively compare two hierarchical clusterings of the same items? That is, how does one say, with statistical significance, that the items are more closely clustered in one than in the other? The items (let's say genes) are the same in both trees; it's just that two different experimental conditions produce two different dendrogram structures. Some genes cluster together in both dendrograms, but at different distances. So how do you say, with statistical significance, that those two clusters of the same set of genes differ in the distances between the genes in each cluster?

Thanks,
Nish

Axel (2009-08-05):
Re: "You'll notice that I didn't actually define a problem that these algorithms solve, keeping in with the grand tradition of clustering :)."

Nei's "Neighbor Joining" is a flavor of hierarchical clustering that solves a more well-defined problem: if the distance matrix can be embedded in a tree, then that tree is exactly the one computed by neighbor joining.
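A minimal sketch of what "embeddable in a tree" means operationally: a finite metric is a tree metric iff it satisfies the four-point condition. This is an illustration, not part of the original comment; the helper name `is_tree_metric` and the example matrices are made up, and the check assumes an exact (noise-free) distance matrix.

```python
# Four-point condition: a finite metric embeds in a tree iff, for every
# quadruple {i, j, k, l}, the two largest of the three pairwise sums
#   d(i,j)+d(k,l), d(i,k)+d(j,l), d(i,l)+d(j,k)
# are equal. Hypothetical helper; a sketch, not any library's API.
from itertools import combinations

def is_tree_metric(D):
    n = len(D)
    for i, j, k, l in combinations(range(n), 4):
        sums = sorted([D[i][j] + D[k][l],
                       D[i][k] + D[j][l],
                       D[i][l] + D[j][k]])
        if sums[1] != sums[2]:   # the maximum must be achieved at least twice
            return False
    return True

# Leaf-to-leaf distances in the unit-edge tree a-u, b-u, u-v, c-v, d-v:
tree_D = [[0, 2, 3, 3],
          [2, 0, 3, 3],
          [3, 3, 0, 2],
          [3, 3, 2, 0]]

# Shortest-path metric of a 4-cycle: not a tree metric.
cycle_D = [[0, 1, 2, 1],
           [1, 0, 1, 2],
           [2, 1, 0, 1],
           [1, 2, 1, 0]]

print(is_tree_metric(tree_D))   # -> True
print(is_tree_metric(cycle_D))  # -> False
```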
This takes you straight to the question of which matrices can be embedded in a tree, which leads to the four-point condition, split systems, and a lot of beautiful combinatorics, along with splits networks as a way to represent matrices that don't fit into a tree.

All of this comes from the world of phylogeny reconstruction in biology, but it should be of equal interest for clustering.

Suresh (2009-07-26):
My comment was referring to the idea of using a hierarchical algorithm to find a k-clustering. You're right that otherwise they're incompatible concepts.

Mariana Soffer (2009-07-26):
You say:
"Returning to hierarchical clustering, one major problem with this approach is that it's local: make the wrong choice of merge early on, and you'll never get to the optimal solution for a k-clustering problem."

I don't get it: aren't we talking about hierarchical clustering? If so, why do you say k-clustering, which is non-hierarchical and not comparable to a hierarchical clustering?

Anyway, I like the idea of measuring the whole process of clustering: not just keeping the final clustering, but also looking at the merge times and probably other things.
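The idea of keeping the whole merge history, rather than only the final clustering, can be sketched with a toy single-linkage routine that records each merge and the distance at which it happens. This is an illustration under my own assumptions (1-D points, made-up helper name), not any particular library's API.

```python
# Toy agglomerative (single-linkage) clustering on 1-D points that records
# every merge, so the whole merge history -- not just the final clustering --
# is kept. Hypothetical sketch, not a library implementation.

def single_linkage_history(points):
    clusters = [[p] for p in points]      # start: every point is its own cluster
    history = []                          # entries: (cluster_a, cluster_b, distance)
    while len(clusters) > 1:
        # find the closest pair of clusters (minimum over all point pairs)
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        history.append((tuple(clusters[i]), tuple(clusters[j]), d))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return history

merges = single_linkage_history([1.0, 1.1, 5.0, 5.2, 9.0])
for a, b, d in merges:
    print(a, "+", b, "at distance", round(d, 2))
```

Inspecting the recorded merge distances (here 0.1, 0.2, 3.8, 3.9) is one way to "see the merging times": a large jump between consecutive merge distances suggests a natural place to cut the dendrogram.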
Arvind Narayanan (2009-07-24):
Nice post, looking forward to more in the series.

I'm no expert, but I've found the neglect of soft clustering in the literature to be rather bizarre.