New Techniques for Proving Impossibilities in Computer Science and Their Theoretical and Practical Consequences
The ultimate goal of an algorithm designer is to come up with the most efficient algorithm possible. To reach this goal, however, it is important to understand what limitations computational models impose on various algorithmic problems, thereby creating barriers to efficiency that cannot be overcome. In this talk we will survey exciting recently developed techniques for establishing algorithmic lower bounds (the computational barriers) in various models, including conditional lower bounds in the classic RAM model, showing that randomness is sometimes necessary in the LOCAL distributed model, and investigating the amount of energy used by devices in the shared resource model (which is strongly related to radio networks).
Interestingly, new lower bounds obtained using these techniques have served as an aid in designing efficient algorithms, both in theory and in practice. This aid comes in two forms: (1) understanding that the hard instances of an algorithmic problem are combinatorially very different from the instances that appear in practice has led to virus detection schemes that are efficient on real-world inputs, and (2) knowing the complexity one should aim for (due to a lower bound) often gives a sense of which tools one should use. In this talk we will demonstrate examples of both cases.