Wednesday, June 28, 2006
Wednesday, June 21, 2006
Apparently, the soccer balls being used at the World Cup are no longer polyhedral. They consist of 14 pieces, many of which look a lot like the squashed oval shape you see on tennis balls and baseballs. This makes the ball rounder and faster, and apparently gives it baseball-like effects when moving through the air. Players (and especially goalkeepers) have been complaining about this, but then they complain every World Cup, so....
Tuesday, June 20, 2006
I highly recommend reading it, whether you work in this area or not. It addresses the main point that has always made me uncomfortable about power-law research: that almost anything looks like a power-law if you squint hard enough. Michael makes the argument that
while numerous models that yield power law behavior have been suggested, and in fact the number of such models continues to grow rapidly, no general mechanisms or approaches have been suggested that allow one to validate that a suggested model is appropriate.

There is a larger point here about "algorithms on data" and "algorithms on structures" that I want to talk about, but I'll keep that for a later post.
Wednesday, June 14, 2006
Tuesday, June 13, 2006
On Monday, a bunch of us had hiked out to the Bell Rock, a rather large bell-shaped rock near the hotel. On Wednesday (I was too cowardly to brave the elements on Tuesday), we decided to climb as far up the rock as we could go. Lest you think that I am some kind of rock climbing savant, I will assure you that no such thing is true. The rocks were rough enough and the trails well marked enough that we could get pretty high up without needing technical apparatus of any kind.
Here, in all its beauty, is Bell Rock:
From this angle, it's hard to explain how high we got, but it was high enough :). It started raining as we descended. The rocks proceeded to get very slick, but in every other way the rain was highly welcome. The rest of the morning was pleasantly cool as a result; so much so that it was hard to stay indoors and attend the first session.
Even though it was day 3 of the conference, which usually means my brain is fried and I am longing to return home, I really wanted to attend the first session. You see, this session was one of many on the burgeoning area of computational topology, a topic that now has a significant presence within computational geometry.
In a sense, there is no surprise that computational topology has become an important area. One of the main application areas of geometry is the understanding of shape, whether it be the triangulated meshes that constitute shapes in an animation, or the intricate surfaces induced by electrostatic forces at the surface of a protein molecule.
Topology provides much of the underlying mathematics of shape. Not all of it: metric properties of the shape are obviously important. But a good deal of it. A classic example of this is the following question:
given points sampled from a shape, under what conditions can I reconstruct a shape that is geometrically and topologically similar to the original shape ?

Many results over the years have attempted to answer this question via "sampling criteria" of the form, "if you sample points so that they satisfy some condition SC, then there's an algorithm that will correctly reconstruct the shape within some error bounds". The goal is to get as relaxed a condition SC as possible, given that we often don't have control over the sampling process.
Much work at this year's SoCG was on new sampling criteria and new ways of representing the medial axis (an important structure that acts as a kind of "skeleton" of a shape). In other words, the mathematics and the algorithmics of reconstructing shapes.
Another thread of work is on a notion called 'persistence'. Persistence is a little tricky to define intuitively (and I strongly recommend Afra Zomorodian's excellent monograph), but the basic idea is that you first create a sequence of evolving structures for a shape. One example of this could be the level sets of a terrain as you increase the "level". Then, you look for features that are "long-lasting", or "persistent" within this sequence. Such features are likely to be more permanent parts of the shape, and can be used to identify key parts of the shape. Much work has now gone into computing persistence, doing it in a stable fashion, and trying to generalize it in various settings.
In general, persistence is a way of identifying important elements of a shape, and can often be a "subroutine" in higher level topological analysis of shape.
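To make the "long-lasting features" idea concrete, here's a minimal sketch (my own code and naming, not the algorithm from any particular paper) of 0-dimensional persistence for a 1-D terrain profile: sweep the level upwards, track connected components of the sublevel sets with union-find, and pair each local minimum (a birth) with the level at which its component merges into an older one (its death).

```python
def persistence_pairs(values):
    """0-dim persistence of the sublevel-set filtration of a 1-D profile.

    Returns sorted (birth, death) pairs; the global minimum gets an
    infinite death (the "essential" component). Zero-persistence pairs
    are dropped for readability."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    parent = [None] * n          # union-find; None = not yet in a sublevel set
    birth = {}                   # component root -> birth value
    pairs = []

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]    # path halving
            i = parent[i]
        return i

    for i in order:              # sweep the "level" upwards
        parent[i] = i
        birth[i] = values[i]
        for j in (i - 1, i + 1):             # merge with neighbors already swept
            if 0 <= j < n and parent[j] is not None:
                ri, rj = find(i), find(j)
                if ri != rj:
                    # elder rule: the younger component (larger birth) dies now
                    old, young = (ri, rj) if birth[ri] <= birth[rj] else (rj, ri)
                    pairs.append((birth[young], values[i]))
                    parent[young] = old
    pairs = [(b, d) for b, d in pairs if d > b]          # drop trivial pairs
    pairs.append((birth[find(order[0])], float("inf")))  # the survivor
    return sorted(pairs)
```

On the profile [3, 1, 4, 0, 2], the valley at height 1 is born at 1 and dies when the level reaches 4, while the global minimum at 0 persists forever: the pair (1, 4) is the "persistent" feature.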
A third thread that's somewhat distinct from the above, but deals squarely with "algorithmic topology" is exemplified by work on computing paths on shapes of different topological characteristics. Applications of this work are less immediate, but one can imagine using such algorithms to deal with the manifolds generated via surface reconstruction etc. I'm sure Jeff Erickson has more to say about this.
To work in this area, you need a very good grasp of topology (combinatorial, algebraic), some Morse theory, and of course good algorithmic and geometric intuition. The first two topics are not usually covered in a graduate degree in algorithms, and that makes this field a bit harder to get into, but its ranks are growing.
Computational topology is definitely a new and challenging direction in geometry. I wouldn't be surprised if it ended up having its own conference sooner or later: I hope not though. The cross-fertilization between topology and more traditional computational geometry is one of the more interesting trends in the field right now.
(HT: David Poole via Chris Volinsky)
Monday, June 12, 2006
Though India's new batting sensation Mahendra Singh Dhoni is Jharkhand's most eligible bachelor, his family say that he has no intention of getting married for at least another three years as his mind right now is only on cricket.
According to family members, Dhoni started getting marriage proposals when he played for the country, and these proposals multiplied tenfold when he scored 148 against Pakistan at Vishakapatnam earlier this year.
After his unbeaten 183 against Sri Lanka at Jaipur and another unbeaten 45 at Pune he has been flooded with marriage proposals but he has no plans of settling down yet and will only think of marriage after another three years, a family member said.
On the other hand, Indians are savvy with cutting-edge technology. While senators in the US need to be sent iPods so they might understand what MP3 players are, Indian industrialists are way more advanced:
Concealing secret transactions in pen-drives and iPods is fast catching up as a trend, say income tax and Directorate of Revenue Intelligence officials, who now take special care to seize these devices before laying hands on account books and computers during a raid. The internet too is a favoured hideaway for tax-evaders, who post details of illegal accounts on the web before deleting these files from their records.

The land of contradictions indeed.... (p.s. Sariel, careful what you do with your iPod :))
Though these gizmos are small enough to fit into a shirt pocket, they have immense memory, ranging from 1 to 30 gigabytes. They are also very easy to discard. Officials believe that during many raids, pen-drives have been put into burning furnaces, crushed under car wheels or simply thrown into dustbins. In one case, a pen-drive was recovered from the driver of an accused.
(HT: India Uncut and Sepia Mutiny)
All that Andrew Wiles did, in his FLT proof, was solve one problem, using ad hoc arguments, that depend on historical contingencies of what was proved before. It is a huge Rube Goldberg contraption, that is unlikely to lead to anything further of any significance, just more of the same human drivel.

In context, he is talking about the importance of computer-generated proofs. Read the whole article.
Saturday, June 10, 2006
The jeeps are open-air seven-seaters, the kind of "SUV" that was really designed for off-roading. Trust me when I tell you that none of the wimpy SUVs you see on the road would have survived the bone-rattling trip we took. As I said to someone later on, it was like going on a roller coaster over a discontinuous function.
The usual tourist patter aside, it was a great ride. 5 of the jeeps took a more "rugged" ride, and although all my bones were aching the next day, it was worth it ! This ranks up there with the SoCG 2004 Manhattan cruise-with-open-bar (yes, geometers do have too much fun).
And now, back to business. The featured paper of the day is 'Overlays of Minimization Diagrams' by Vladlen Koltun and Micha Sharir.
The lower envelope of an arrangement of entities in R^d (the set of entities you hit first as you move upwards from z = -infinity) is a very important structure in geometry. Many optimization problems can be expressed as searches over a suitably defined lower (or upper) envelope, or combinations thereof. A minimization diagram is what you get when you project the lower envelope onto the underlying domain one dimension lower. The best example of this is a Voronoi diagram of points, which as we all know is the projection of the lower envelope of an arrangement of suitably defined cones.
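As a toy version of this, here's a sketch (my own code; lines in the plane standing in for hyperplanes in R^d) that computes the lower envelope of a set of lines y = a*x + b using the standard "convex hull trick". The minimization diagram is then just the induced subdivision of the x-axis: the sequence of intervals on which each surviving line is lowest.

```python
def lower_envelope(lines):
    """Lines given as (a, b) pairs for y = a*x + b. Returns the lines that
    appear on the lower envelope, left to right (decreasing slope)."""
    def x_cross(l1, l2):
        (a1, b1), (a2, b2) = l1, l2
        return (b2 - b1) / (a1 - a2)          # x where the two lines meet

    env = []
    # Sort by decreasing slope: as x increases, the envelope visits
    # slopes in decreasing order. Ties in slope: lower intercept first.
    for a, b in sorted(set(lines), key=lambda ab: (-ab[0], ab[1])):
        if env and env[-1][0] == a:
            continue                          # parallel but higher: never lowest
        # pop lines that the new one hides before they reach the envelope
        while len(env) >= 2 and x_cross(env[-1], (a, b)) <= x_cross(env[-2], env[-1]):
            env.pop()
        env.append((a, b))
    return env
```

For example, with the lines y = 2x, y = x + 10 and y = 0, the middle line never appears on the envelope and gets popped, leaving just [(2, 0), (0, 0)].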
What do you get if you take two such minimization diagrams and put them on each other ? This overlay, which itself is another subdivision of space, is also useful in geometric optimization. The canonical example here is the computation of a minimum width annulus of a set of points. I'd like to find a center, and two concentric circles of radius r and R, so that all the points in my input are enclosed between the two circles, and R-r is as small as possible.
This problem is a robust version of circle fitting, or checking how close points are to a circular shape. It's not hard to see that what you need is a large empty circle, and a small enclosing ball, both centered at the same point. Any vertex of a Voronoi diagram gives you a large empty circle (in fact, a maximal empty circle). If you computed a "farthest-point" Voronoi diagram, i.e a minimization diagram from the upper envelope, you'd get candidates for the smallest enclosing ball. It turns out that the center of the optimal minimum width annulus must be a vertex of the overlay of these two minimization diagrams.
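To see why the problem is easy once the center is fixed, here's a naive sketch (my own code, emphatically not the overlay algorithm): for a candidate center the annulus width is just the largest distance minus the smallest, and a brute-force grid scan stands in for searching the much smaller set of overlay vertices.

```python
from math import hypot

def annulus_width(center, points):
    """Width R - r of the thinnest annulus centered at `center` that
    contains all the points: (largest distance) - (smallest distance)."""
    d = [hypot(x - center[0], y - center[1]) for x, y in points]
    return max(d) - min(d)

def naive_min_width_center(points, grid=50):
    """Brute force: scan a grid over the bounding box for a good center.
    The structural result above says one could instead check only the
    vertices of the overlay of the two Voronoi diagrams."""
    xs, ys = zip(*points)
    best = None
    for i in range(grid + 1):
        for j in range(grid + 1):
            c = (min(xs) + (max(xs) - min(xs)) * i / grid,
                 min(ys) + (max(ys) - min(ys)) * j / grid)
            w = annulus_width(c, points)
            if best is None or w < best[0]:
                best = (w, c)
    return best   # (width, center)
```

On four points of a unit circle, the scan finds the circle's center and a width of zero, as it should.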
As with many geometry problems, we quickly go from searching to enumeration, from "find the optimal x satisfying property P" to "enumerate all the x that could satisfy property P". The question in this case is: what is the complexity of the overlay of two minimization diagrams ?
Of course, a more basic question is: what is the complexity of one minimization diagram ? For the case of hyperplane arrangements, the answer follows from the famous Upper Bound Theorem of McMullen from 1970:
The complexity of the lower envelope of a collection of n hyperplanes in d dimensions is n^⌊d/2⌋.

Given this, an easy bound on the overlay of two such diagrams would be n^(2⌊d/2⌋), merely by observing that every intersection of two features creates a new feature. The main result of this paper is that the correct answer is much smaller:
The complexity of the overlay of two minimization diagrams for collections of n hyperplanes in d dimensions is n^⌈d/2⌉.

Note the flip from floor to ceiling. This creates some unusual effects: the overlay of two minimization diagrams in odd dimensions has the same complexity as one such diagram, while in even dimensions the exponents differ by one (remember that the minimization diagram inhabits one fewer dimension than the arrangement of hyperplanes defining it).
The key lemma that establishes this fact is very simple; indeed the authors express surprise that this lemma hasn't been observed before. It involves a "reversal of operators": namely, that the overlay of minimization diagrams can be expressed as a minimization of overlays.
The overlay of two minimization diagrams of n d-variate functions is isomorphic to a subset of the minimization diagram of 2n (d+1)-variate functions.

The result above extends from overlays of 2 diagrams to overlays of m diagrams, for 2 <= m <= d. Similar results also hold for arrangements of simplices, or of constant-degree algebraic functions; they all follow almost directly from the above lemma.
Thursday, June 08, 2006
One of the more notable talks I heard on Tuesday was by Rina Panigrahy on a new lower bound for locality-sensitive hashing by Rajeev Motwani, Assaf Naor and himself.
Locality-sensitive hashing is a surprisingly effective "geometric" analogue of hashing first developed by Piotr Indyk and Rajeev Motwani. In standard hashing, the goal is to maintain a set of elements S drawn from a large universe U so that you can answer the question "is i in S" really quickly. Because U is typically much larger than S, what you'd like is a storage structure that uses space proportional to the size of S, rather than the size of U (which would be easy to do).
Hashing is one of the most elegant, profound and practical ideas to come out of the study of algorithms and data structures. It's almost impossible to find any serious program that doesn't need some kind of hash data structure. A detailed post on hashing will have to wait for another day though..
Locality-sensitive hashing is a way to take hashing to a geometric arena (you knew geometry had to show up sooner or later). Suppose that instead of merely having a set of elements from a universe, I had a set of points in a metric space. For example, maybe I have points situated in 10-dimensional space with the normal Euclidean distance.
My goal is the same: I want to store points from the space in a data structure so I can ask the question: "is i in S ?". But now that points have distances between them, I can ask the related question, "is i near some point of S ?". Locality-sensitive hashing gives you a data structure that answers such questions approximately. More precisely, given parameters c, r, p, q, the structure guarantees that if two elements are within distance r of each other, then they are hashed to the same bucket with probability p, and if they are further than c * r, then this happens with probability q. In general, you want p to be much larger than q of course, and you'd like c to be as close to 1 as possible.
The actual construction is quite simple (you create a collection of hash functions and map each point to a vector of hash values), and has had numerous applications. What controls the efficacy of the scheme is the exponent ρ = log(1/p)/log(1/q), which affects both the space and time bounds of the algorithm; the smaller ρ is, the better. Roughly speaking, the space required by the algorithm is n^(1+ρ) and the query time is n^ρ.

Since c controls the quality of the hash, and ρ controls its efficiency, these two parameters vary inversely. In fact, for l1, it was shown earlier that ρ <= 1/c. What this paper shows is that for any lp norm, ρ >= 0.462/c^p. Specifically, this means that the bound for l1 is almost tight. Together with new work by Andoni and Indyk showing that LSH for l2 can be done with ρ <= 1/c^2, this gives almost matching upper and lower bounds for LSH for a range of norms.
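A minimal illustration of the collision-probability idea, for Hamming distance over bit vectors (this is the standard textbook bit-sampling sketch, with my own names, not the construction from the paper): sample k coordinates at random and read them off. Two vectors at Hamming distance t then collide with probability exactly (1 - t/dim)^k, so near pairs collide far more often than far ones.

```python
import random

def make_hash(dim, k, seed=None):
    """One locality-sensitive hash for dim-bit vectors under Hamming
    distance: sample k coordinates (with replacement) and read them off."""
    rng = random.Random(seed)
    coords = [rng.randrange(dim) for _ in range(k)]
    return lambda v: tuple(v[c] for c in coords)

# Empirical check: a near pair should collide far more often than a far pair.
dim, k, trials = 100, 10, 2000
x = [0] * dim
near = x[:]
near[0] = 1                      # Hamming distance 1 from x
far = x[:]
for i in range(50):
    far[i] = 1                   # Hamming distance 50 from x

near_hits = far_hits = 0
for t in range(trials):
    h = make_hash(dim, k, seed=t)
    near_hits += h(x) == h(near)
    far_hits += h(x) == h(far)
# near pair: collision prob (1 - 1/100)^10, about 0.9
# far pair:  collision prob (1 - 1/2)^10, about 0.001
```

The gap between these two collision rates is exactly what the p and q of the previous paragraph capture, and the exponent ρ measures how fast that gap can be amplified.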
It's a very nice result, and quite simple. In fact, for those who grumble that papers with few pages are not viewed kindly by PCs, this paper is only 4 pages long.
Update (6/11/06): As Piotr points out in the comments, the bounds are not quite matched. There is a constant factor gap between the upper and lower bounds, and thus some more work to do. Damn you, big-O notation !!!
Tuesday, June 06, 2006
- Jeff, having spent many many years attending SoCG, has a keen sense of the pulse of the community and can distill out its essence.
- SoCG business meetings are really boring and predictable.
The Erickson-meeting was far more entertaining than the poor substitute for fiction that the real meeting was. However, since I am a dedicated worker bee in this community, I will attempt to add some truthiness to his description.
The heat: It is so hot that apparently someone stole the free sunscreen that the organizers had placed on the registration table. Carola made a plaintive plea for its return. Alon (very defensively) claimed that he didn't expect the weather. To which I say, NO ONE EXPECTS THE WEATHER. Its THREE main weapons are surprise, unpredictability, .... oh never mind.
The drinks: We had drink statistics ! As you might suspect, we are a bunch of effete snobs who only drink imported beer.
PC Stats: Papers get in if you write them on folding, sensors, blah blah, 3-4 authors good, blah blah.
Aren't PC stats fun ?
Next year: In Gyeongju, which we are told repeatedly is the old capital of Korea. It is "apparently" easier to get to than Sedona. You:
- Fly to Incheon airport (50 bajillion hours)
- Take a bus to Seoul (1 hour)
- Take a train to Gyeongju (3 hours by super-fast-your-teeth-will-ache train, 5 hours by lets-stop-and-pick-pickled-cabbage train)
- David Mount for DC: I hope someone comes up with a better bid than this
- Tamal Dey for OSU, looking at his PR material: Is this really Columbus ?
Who won ? Oh yeah, it was DC, 45/28.
The most listless general discussion section EVAH ! Pankaj, John Hershberger and Carola tried valiantly to create controversy over parallel sessions, while the rest of us drank our imported beers and said 'Pfeh'. Thankfully this ended quickly, with no resolution (What ! What did you expect ?), and this sad spectacle came to an end.
I feel nostalgic for short papers.
Sunday, June 04, 2006
"God made the Grand Canyon, but He lives in Sedona"

I heard this from a fellow passenger on the shuttle from Phoenix to Sedona. It might overstate the case slightly, but Sedona is indeed gorgeous. The Hilton Resort faces the main red rock formations, and my room has a gorgeous view (no pictures, because I neglected to bring my camera download dongle).
It is very very hot here. People think that because I'm Indian, I have no business complaining about the heat or worse, that I should enjoy the heat ! I point out that because I'm Indian, I have a healthy respect for heat and spend my time as far away from it as possible. The low temperature here is higher than the highs when I was in Zurich a week ago. Yikes !
As a consequence, even simple things like taking a walk are tricky. I wanted to walk up to the Bell Rock, one of the standout examples of the red rock formations. It's a mile away, and quite doable under normal circumstances.
Alas, these are anything but normal circumstances. The concierge gave me the 'crazy foreigner' look when I proposed my plan. She did recommend that I try getting up at dawn and walking over, when it's cooler. I might actually do that, for the second time in my life (the first time was in Zion National Park, and the trek was well worth it).
It's hard to get around without a car. In fact, I wonder whether this might not actually be an excellent place for the American equivalent of the Dagstuhl/Bertinoro/Oberwolfach workshops. It's hard to get around anyway, and in the summer you wouldn't want to !
Saturday, June 03, 2006
The problem itself is very easy to state. You're given points in a range, and a query from a family of ranges. Find all points contained in the range, and report the set of points (searching), the cardinality of points (counting) or even whether the range is non-empty (emptiness).
If one defines an appropriate commutative semigroup, then all of the above formulations can be captured by the same basic problem. Given a mapping w from points to the semigroup, compute the sum (with respect to the semigroup) of w(p) for all points in the range. Searching maps to the union operation, counting to addition, and emptiness to the OR function.
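Here's a sketch of that parameterization (my own code): a little segment tree over 1-D points that only ever uses the semigroup operation, never subtraction. The same structure then answers counting (op = +), range-minimum (op = min, idempotent) and emptiness (op = or, idempotent) queries.

```python
import bisect

class SemigroupRangeTree:
    """1-D range searching over a commutative semigroup: answer "combine
    w(p) over all points p with lo <= p <= hi" using only the semigroup
    operation op (crucially, no inverses/subtraction are assumed)."""

    def __init__(self, points, weights, op):
        order = sorted(range(len(points)), key=lambda i: points[i])
        self.xs = [points[i] for i in order]
        self.op = op
        self.n = n = len(points)
        self.tree = [None] * (2 * n)
        for i, j in enumerate(order):            # leaves, in sorted-x order
            self.tree[n + i] = weights[j]
        for i in range(n - 1, 0, -1):            # internal nodes, bottom-up
            self.tree[i] = op(self.tree[2 * i], self.tree[2 * i + 1])

    def query(self, lo, hi):
        """Semigroup sum over points in [lo, hi]; None if the range is empty."""
        l = bisect.bisect_left(self.xs, lo) + self.n
        r = bisect.bisect_right(self.xs, hi) + self.n
        acc = None
        while l < r:
            if l & 1:
                acc = self.tree[l] if acc is None else self.op(acc, self.tree[l])
                l += 1
            if r & 1:
                r -= 1
                acc = self.tree[r] if acc is None else self.op(acc, self.tree[r])
            l //= 2
            r //= 2
        return acc
```

For points [5, 1, 3, 9, 7] and the query range [2, 8], plugging in + with unit weights counts the 3 points in range, while plugging in min with the coordinates as weights reports the leftmost one.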
Most results in range searching are phrased in terms of space-time tradeoffs: if you are willing to pay m space, what is the query time as a function of m and n, the number of points ? The interesting points are at the extremes: If you want logarithmic query time (this ignores the time to output the search results, or you can consider counting queries), then you need polynomial space, and if you want linear space, then you'll end up with near-linear query time.
A flurry of results starting in the late 80s and culminating in the early 90s resolved most of the open problems in range searching, especially for the challenging simplex (or halfspace) range searching problems. The survey by Agarwal and Erickson has more details.
For all intents and purposes, this area appeared to be a closed one. More recently, there was work on tradeoffs for approximate range searching (when you might allow points that are "near" the range to be included in the answer). Here, it was possible to transfer the high complexity from terms involving n (or d) to terms involving epsilon, the error parameter (again, I oversimplify; see here and here for more details).
Recently though, two papers by Arya, Malamatos and Mount appear to have opened up some new angles (pun intended) to range searching. The first paper (which appeared in STOC this year) is titled, in Wildesque fashion, 'On the Importance of Idempotence".
The premise is neat: in all the upper and lower bounds for range searching, the group structure is not explored in any particular way (for technical reasons, the group must be a "faithful semigroup", but that's it). What they show in their paper is that for exact range searching to a degree, and more importantly for approximate range searching, the group structure can change the bounds. Specifically, if the group operation is idempotent, i.e a + a = a, then the (matching) upper and lower bounds for approximate range searching reduce to the square root of their previous values (i.e by a factor of 2 in the exponent). Idempotent group operators include min (used to find the smallest element in a range) and OR (to check range emptiness).
What's really interesting about this is that idempotence itself can be viewed as a limit of a "dequantization process" over integral domains. One example of this is how the semiring (R, max, +) can be retrieved from the field (R, +, *) (there are some technical details needed) by using the mapping x -> 1/h ln(x), as h tends to infinity. The question is then: can the difference between the idempotent and integral bounds on range searching be viewed as two points on a continuum parameterized by such an h ?
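A quick numerical check of this dequantization (my own code): conjugating ordinary addition by the map x -> (1/h) ln x gives a "deformed sum" that converges to max as h grows, so the idempotent operation really does sit at the end of a continuum of honest additions.

```python
from math import exp, log

def deformed_sum(a, b, h):
    """Ordinary addition conjugated by x -> (1/h) ln x, i.e.
    (1/h) * ln(e^(h*a) + e^(h*b)). As h -> infinity this tends to
    max(a, b), recovering the idempotent operation in the limit."""
    m = max(a, b)                  # factor out the max for numerical stability
    return m + log(exp(h * (a - m)) + exp(h * (b - m))) / h

for h in (1, 10, 100):
    print(h, deformed_sum(2.0, 3.0, h))   # marches down toward max(2, 3) = 3
```

At h = 1 the deformed sum of 2 and 3 is about 3.31; by h = 100 it is 3 to within rounding, which is exactly the "classical limit" flavor of the tropical-mathematics story below.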
There's a lot of work out there on idempotence, so-called "tropical mathematics", and the relation between quantum and classical phenomena. All very interesting stuff.
In a paper coming up at SoCG, Arya, Malamatos and Mount go one step further. They further show that the shape of the ranges is also very important. Their earlier results on idempotence were for ball queries: in this paper they show that if the queries are angular (like rectangles), then the separation between idempotent and integral semigroups doesn't quite show up, and that smooth, well rounded queries (like balls) are needed for this to work.
Friday, June 02, 2006
Of course, everyone is missing a true sign of the apocalypse: a non-Indian winning the National Spelling Bee !! Five out of the last seven winners (and all four top competitors last year) were of Indian origin. But this year, the top-placed finisher of Indian origin came in 4th (and boy did he have a game face) !
I was amused (and somewhat confused) to see a Hindi word with absolutely no connection to English being used. The word was 'izzat', which means 'pride', or 'honor'. Its pronunciation was of course mangled horribly: it's a wonder that the speller managed to get it right. And if one wanted to be pedantic, the word isn't even Hindi; it's Urdu.
p.s Having bought into the stereotype of spelling bee champs being these monomaniacal bottle-glass wearing loons, I was pleasantly surprised by the eventual champ, who displayed remarkable maturity and a sense of balance in her brief biopic.