The IBM Research | NYU | Columbia Theory Day
Friday, April 28, 2006


Davis Auditorium
412 Schapiro (CEPSR) Building,
Columbia University, New York

The IBM Research | NYU | Columbia Theory Day is a semi-annual conference that aims to bring together people in the New York metropolitan area for one day of interaction and discussion.  The Theory Day features several (usually 4-5) hour-long presentations by leading theoretical computer scientists about state-of-the-art advances in various areas.  Some presentations give a survey of the latest advances in an area, while others concentrate on a particular result. The meeting is free and open to everyone; in particular, students are encouraged to attend.



Program:
9:30 - 10:00    Coffee and bagels
10:00 - 10:55   Prof. Bernard Chazelle (Princeton University)
                Data-Driven Algorithm Design
10:55 - 11:05   Short break
11:05 - 12:00   Prof. Michael Saks (Rutgers University)
                A Lower Bound for Cooperative Broadcast in the Presence of Noise
12:00 - 2:00    Lunch break
2:00 - 2:55     Prof. Subhash Khot (Georgia Institute of Technology)
                Lower Bounds for Approximating MAX-CUT and Sparsest Cut
2:55 - 3:15     Coffee break
3:15 - 4:10     Dr. Corinna Cortes (Google Labs)
                The Next Algorithmic and Theoretical Challenges for Search Engines

To subscribe to our mailing list, follow the instructions here.

Organizers:

Yevgeniy Dodis (NYU)

Tal Malkin (Columbia)

Tal Rabin (IBM Research)

Baruch Schieber (IBM Research)



Abstracts:

Data-Driven Algorithm Design
Prof. Bernard Chazelle
(Princeton University)

Motivated by the need to cope with data that is massive, high-dimensional, noisy, uncertain, unevenly priced, low-entropic, or all of the above, algorithm design is undergoing profound changes. I will discuss some of these developments; in particular, sublinear algorithms, online reconstruction, self-improving algorithms, and dimension reduction.


A Lower Bound for Cooperative Broadcast in the Presence of Noise
Prof. Michael Saks
(Rutgers University)

In a noisy broadcast channel, processors communicate as follows: in each time step, one processor broadcasts a bit. Each of the other processors receives a bit, but the received bit is incorrect with some known probability p. Reception errors are assumed to be independent. In such an environment, a group of n broadcasters, each possessing a bit, wish to transmit all n bits to a receiver so that, with probability close to 1, the receiver learns all of the bits correctly. This can be done easily with O(n log n) broadcasts, by having each processor broadcast its input bit C log n times (for some sufficiently large constant C) and having the receiver decode each bit by majority vote. This naive algorithm was improved by Gallager who, in 1988, gave an algorithm that uses only O(n log log n) broadcasts. I'll describe a recent result, obtained with Navin Goyal and Guy Kindler, showing that Gallager's algorithm is optimal up to a constant factor.
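
As a concrete illustration, here is a minimal Python sketch of the naive repetition scheme described above (not Gallager's algorithm): each processor broadcasts its bit C log n times over a channel that flips each reception independently with probability p, and the receiver decodes each bit by majority vote. The function name and default parameters are illustrative.

import math
import random

def naive_noisy_broadcast(bits, p=0.1, C=4, rng=None):
    """Naive O(n log n) scheme: each processor broadcasts its bit
    C*log2(n) times; the receiver decodes each bit by majority vote.
    Every reception is flipped independently with probability p."""
    rng = rng or random.Random()
    n = len(bits)
    repetitions = max(1, math.ceil(C * math.log2(max(n, 2))))
    decoded = []
    for b in bits:
        # The receiver hears `repetitions` independent noisy copies of b.
        received = [b ^ (rng.random() < p) for _ in range(repetitions)]
        decoded.append(int(sum(received) > repetitions / 2))
    return decoded

# With n = 1000 and p = 0.1, all bits are recovered with high probability.
inputs = [random.randint(0, 1) for _ in range(1000)]
print(naive_noisy_broadcast(inputs) == inputs)

The total number of broadcasts is n * C log n = O(n log n), which is exactly the bound the abstract attributes to the naive algorithm.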



Lower Bounds for Approximating MAX-CUT and Sparsest Cut
Prof. Subhash Khot
(Georgia Institute of Technology)

Goemans and Williamson, in their seminal 1994 paper, introduced the use of semi-definite programming as an algorithmic tool. Using a natural SDP relaxation and a rounding procedure, they obtained a 0.878... approximation to the MAX-CUT problem. Since then, the technique has found several applications, including the recent breakthrough result of Arora, Rao and Vazirani that gave an O(sqrt{log n}) approximation to the Sparsest Cut problem.
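
To recall the rounding step, here is a minimal Python sketch of Goemans-Williamson random-hyperplane rounding. It assumes the SDP relaxation has already been solved by an external solver, so that `vectors` holds one unit vector per vertex; the function name and the `trials` parameter are illustrative.

import random

def gw_hyperplane_round(vectors, edges, trials=100, rng=None):
    """Round an SDP solution for MAX-CUT: a random hyperplane through
    the origin splits the vertices into two sides.  In expectation the
    resulting cut has weight at least 0.878... times the SDP optimum."""
    rng = rng or random.Random()
    dim = len(vectors[0])
    best_cut, best_side = -1, None
    for _ in range(trials):
        # Random hyperplane normal with i.i.d. standard Gaussian entries.
        r = [rng.gauss(0.0, 1.0) for _ in range(dim)]
        side = [sum(ri * vi for ri, vi in zip(r, v)) >= 0 for v in vectors]
        cut = sum(1 for (u, w) in edges if side[u] != side[w])
        if cut > best_cut:
            best_cut, best_side = cut, side
    return best_cut, best_side

A single random hyperplane already achieves the 0.878... guarantee in expectation; taking the best of several trials, as above, can only help.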

I will talk about recent results proving lower bounds on the approximability of MAX-CUT and Sparsest Cut. These lower bounds are of two types: first, hardness of approximation results assuming the Unique Games Conjecture, and second, unconditional and explicit integrality gaps for the SDP relaxations (even with the so-called triangle inequality constraints). The lower bound for MAX-CUT matches the Goemans-Williamson bound exactly. The integrality gap result for Sparsest Cut disproves a conjecture of Goemans and Linial that negative-type metrics embed into L_1 with constant distortion.

Based on joint works with [Guy Kindler, Elchanan Mossel and Ryan O'Donnell], [Nisheeth Vishnoi] and [Nikhil Devanur, Rishi Saket and Nisheeth Vishnoi].


The Next Algorithmic and Theoretical Challenges for Search Engines
Dr. Corinna Cortes
(Google Labs)

Search engines have been around for more than a decade. Some of the problems that at first seemed intractable have found common practical solutions, but much remains to be done to organize the world's information. What are the next challenges faced by search engines? How can search be further improved? This talk presents and discusses several algorithmic and theoretical problems that arise in the design of search engines.