Comments on The Geomblog: SoCG 2007: Approximate Clustering (6 comments; feed last updated 2014-01-12)

Anonymous (Piotr), 2007-06-08 06:29:

Oh yes, another separation: it is known that you can solve c-approximate near neighbor with subquadratic space and d·n^{1/c^2} query time, while there is a d·n^{Omega(1/c^2)} query-time lower bound. However, the lower bound holds only for a restricted class of "hashing-based algorithms". See Andoni et al., FOCS'06, and Motwani et al., SoCG'06.

Piotr

Suresh, 2007-06-08 02:37:

To add to what Piotr pointed out: although this is not exactly the same thing, <A HREF="http://citeseer.ist.psu.edu/trevisan97when.html" REL="nofollow">Luca Trevisan showed</A> that for geometric TSP, the doubly exponential dependence on the dimension in approximation algorithms cannot be removed unless NP has subexponential algorithms.

There are other results showing that if the dimension is part of the input, various clustering problems are NP-hard to solve exactly, and in some cases there are approximation hardness results. But there is still no nuanced understanding of the relation between n, d, and eps.
One thing to keep in mind is that these aren't independent parameters: any lower bound will have to trade one parameter off against the others.

Suresh

Anonymous (Piotr), 2007-06-08 01:40:

Hi,

There are some provable separations between exact and approximate nearest neighbor. It is actually easier to talk about the *near* neighbor problem: you are given a parameter r, and for any query q the goal is to check whether there is any data point within distance r of q (the approximate version allows a wrong answer if the distance from q to its nearest neighbor lies in the range [r, (1+eps)r]).

What is known:

* (1+eps)-approximate near neighbor in d-dimensional spaces (Hamming, etc.) can be solved using only one query to a data structure of size n^{O(1/eps^2)}.

* Exact near neighbor in d-dimensional Hamming space requires roughly Omega(d) queries to any data structure of polynomial size.

For references, see "Nearest Neighbors in High-dimensional Spaces" at http://theory.lcs.mit.edu/~indyk/39.ps

Piotr

Andy D, 2007-06-07 13:33:

Good questions, Suresh.

Like Gareth, I am curious what complexity/lower-bound theory has to say about this mess of parameters, specifically in problems where NP-completeness is not the appropriate concept.
For instance, exact nearest-neighbor search can be done in time polynomial in the size of the database, but is there work suggesting that it must necessarily suffer more from high dimensionality than the clever algorithms for approximate nearest neighbor?

Andy D

gareth.rees, 2007-06-07 04:25:

<I>What I'd like to see is some kind of "taxonomization" or classification of the different "tricks of the trade" in high-dimensional geometric approximation</I>

Sounds like a job for <A HREF="http://en.wikipedia.org/wiki/Parameterized_complexity" REL="nofollow">parameterized complexity</A>.

Jeff, 2007-06-06 22:29:

Look at the notes from the first half of the class Pankaj taught last semester:
http://www.cs.duke.edu/education/courses/spring07/cps296.2/lectures.html

Also, Sariel's book-to-be should have some nice tricks as well:
http://valis.cs.uiuc.edu/~sariel/teach/notes/aprx/lec/

These two places should be a great place to start for someone who already has a basic understanding of algorithms and geometry.

Jeff
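[Editor's note: a toy illustration of the "hashing-based algorithms" Piotr refers to, sketched as bit-sampling locality-sensitive hashing for the Hamming near-neighbor problem. All function names and parameter choices here are illustrative, not taken from any of the cited papers; this is a minimal sketch of the collision idea, not a tuned implementation.]

```python
import random

def build_lsh_index(points, k=8, num_tables=10, seed=0):
    """Index binary vectors (tuples of 0/1) for approximate near neighbor.

    Each table hashes every point by k randomly sampled bit positions.
    Points close in Hamming distance agree on most bits, so they collide
    in at least one table with good probability.
    """
    rng = random.Random(seed)
    d = len(points[0])
    tables = []
    for _ in range(num_tables):
        positions = rng.sample(range(d), k)   # the k sampled coordinates
        buckets = {}
        for p in points:
            key = tuple(p[i] for i in positions)
            buckets.setdefault(key, []).append(p)
        tables.append((positions, buckets))
    return tables

def query(tables, q, r):
    """Return some point within Hamming distance r of q if it collides
    with q in any table; return None otherwise (may miss near points)."""
    for positions, buckets in tables:
        key = tuple(q[i] for i in positions)
        for p in buckets.get(key, []):
            if sum(a != b for a, b in zip(p, q)) <= r:
                return p
    return None
```

Querying with a point already in the index always succeeds (identical keys in every table), while a point far from everything falls in empty buckets; the interesting regime, points at distance between r and (1+eps)r, is exactly where the scheme is allowed to answer either way.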