To see the list of my publications, please visit my Google Scholar profile or my DBLP page. Below is a list of topics, with short descriptions, that I have explored in my research. The common thread between the different topics is that I have focused on understanding the performance of algorithms through an information-theoretic lens. Most of the results have a strong practical motivation, and the studied areas allow for a thorough understanding of a problem without relying on unproven assumptions, such as \mathcal{P} \neq \mathcal{NP}.

Research Topics

Online Algorithms

In the basic or vanilla formulation of an online problem, an input instance is given as a sequence of input items. After each input item is presented, an algorithm needs to output a decision, and that decision is final, i.e., it cannot be changed upon seeing any future items. The goal is to maximize or minimize an objective function, which is a function of all decisions for a given instance. The term “online” in “online algorithms” refers to this notion of irrevocable decisions and does not refer to the Internet, although many applications of online algorithms are in networking and some are specific to the Internet. The main limitation of an online algorithm is that it has to make each decision in the absence of the entire input. The value of the objective achieved by an online algorithm is compared against the optimal value of the objective achieved by an ideal “offline algorithm,” i.e., an algorithm having access to the entire input. The ratio of the two values is called the competitive ratio, and the analysis of online algorithms in terms of the competitive ratio is called competitive analysis.
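As a concrete illustration of these definitions (this example is not from the text above; it is the classic ski-rental problem, a textbook instance of competitive analysis), consider the following sketch. Renting skis costs 1 per day and buying costs B once; the number of ski days is revealed online, one day at a time, and the buy decision is irrevocable. The break-even strategy below achieves a competitive ratio of at most 2 - 1/B:

```python
def online_ski_rental(days, buy_cost):
    """Break-even strategy: rent until a total of buy_cost - 1 has been
    spent on rentals, then buy. The buy decision is irrevocable."""
    cost = 0
    for day in range(1, days + 1):
        if day < buy_cost:
            cost += 1          # rent for this day
        else:
            cost += buy_cost   # buy on day buy_cost and stop paying rent
            break
    return cost

def offline_optimum(days, buy_cost):
    """The ideal offline algorithm knows `days` in advance, so it simply
    picks the cheaper of renting throughout or buying immediately."""
    return min(days, buy_cost)

# Empirically check the competitive ratio for B = 10 over many instance lengths:
# the worst case is 2 - 1/B = 1.9, attained when the season lasts exactly B days.
B = 10
worst = max(online_ski_rental(d, B) / offline_optimum(d, B) for d in range(1, 100))
print(worst)  # 1.9
```

The break-even point is chosen so that neither a very short nor a very long season can make the online algorithm pay much more than the offline optimum.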

Most of my research in recent years has been dedicated to online algorithms. I have worked on various scheduling, matching, exploration, and packing and covering problems. I am co-writing a book on online algorithms with Allan Borodin.

Proof Complexity

The central question studied in proof complexity is the following: given a formal statement, what is the length of a shortest proof of the statement in a given formal proof system? Proof complexity is strongly intertwined with the Boolean satisfiability problem and the question of separating \mathcal{NP} from co-\mathcal{NP}. In this work, my co-authors and I showed that random CNFs are hard for the Cutting Planes proof system. In another work, we introduced a new proof system called Stabbing Planes.

Communication Complexity and Information Theory

Claude Shannon introduced information theory in his classic 1948 paper, and Andrew Yao introduced communication complexity in 1979. Shannon was concerned with the intrinsic information content of one-way communication, while Yao was concerned with minimizing the communication required to compute a function of a distributed input. No formal connections between information theory and communication complexity were known until 2004, when Bar-Yossef et al. gave a definition of the information content of interactive communication. This complexity measure was the central object of study in my PhD work.

In communication complexity, two players, Alice and Bob, receive inputs X and Y, respectively. The inputs are sampled from a joint distribution. The players communicate in accordance with a pre-agreed protocol \Pi in order to jointly compute the value of some function f at (X, Y). Communication complexity measures the number of bits exchanged during the execution of \Pi. Information complexity measures the information (in Shannon’s sense) that the protocol reveals about Y to Alice plus the information it reveals about X to Bob.
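In symbols, using the standard formulation (with \Pi also denoting the protocol transcript, \mu the joint input distribution, and I(\cdot\,;\cdot \mid \cdot) conditional mutual information), this quantity — the internal information complexity — is written as:

```latex
\mathrm{IC}_\mu(\Pi) \;=\; I(\Pi \,;\, Y \mid X) \;+\; I(\Pi \,;\, X \mid Y)
```

Since Alice already knows X, the first term captures what the transcript tells her about Y, and symmetrically for the second term; the information complexity of a protocol never exceeds its communication.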

Communication complexity is connected to many other branches of computer science, including VLSI design, multi-tape Turing machines, streaming algorithms, and the complexity of Boolean functions (through lifting theorems). To learn more, see my PhD dissertation.

Complexity of Boolean Functions

A Boolean function on n bits is a function of the form f: \{0,1\}^n \rightarrow \{0,1\}. The area of Boolean function complexity studies the following questions:

  • If the input x \in \{0,1\}^n to f is unknown, how many bits of x need to be queried in order to determine the value of f(x)?
  • Given x \in \{0,1\}^n, how “sensitive” is f on x? That is, how many positions i are there such that flipping x_i changes the output of f?
  • What is the minimum degree of a real polynomial that approximates f on inputs from \{0,1\}^n?
  • Are there any relationships between the complexity measures stated above?
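The second question above can be made concrete with a brute-force computation (an illustrative sketch, not code from any of the referenced works; the OR function is my choice of example). The sensitivity of f at x counts the bit positions whose flip changes f(x), and the sensitivity of f is the maximum of this quantity over all inputs:

```python
from itertools import product

def sensitivity_at(f, x):
    """Number of positions i such that flipping x_i changes f(x)."""
    return sum(
        f(x) != f(x[:i] + (1 - x[i],) + x[i + 1:])  # flip bit i and compare
        for i in range(len(x))
    )

def sensitivity(f, n):
    """Sensitivity of f: the maximum of sensitivity_at over all n-bit inputs."""
    return max(sensitivity_at(f, x) for x in product((0, 1), repeat=n))

# Example: OR on 3 bits. At the all-zeros input every single flip changes
# the output from 0 to 1, so the sensitivity of OR is n = 3.
OR = lambda x: int(any(x))
print(sensitivity(OR, 3))  # 3
```

By contrast, at an input with two or more ones, no single flip can change the output of OR, so the sensitivity there is 0 — sensitivity is very much a per-input notion before the maximum is taken.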

If you would like to learn more, see this survey I wrote with two of my co-authors on the sensitivity conjecture. Since that survey came out, the sensitivity conjecture has been resolved by Hao Huang in his 2019 breakthrough work.