PODS 2021: Tutorials

Tutorial 1: Modern Lower Bound Techniques in Database Theory and Constraint Satisfaction

Speaker: Dániel Marx (Max-Planck-Institut für Informatik)

Abstract

Conditional lower bounds based on P ≠ NP, the Exponential-Time Hypothesis (ETH), or similar complexity assumptions can provide very useful information about what type of algorithms are likely to be possible. Ideally, such lower bounds would demonstrate that the best known algorithms are essentially optimal and cannot be improved further. In this tutorial, we give an overview of different types of lower bounds and show how they can be applied to problems in database theory and constraint satisfaction.
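
One canonical example of this style of reasoning, stated below purely as an illustration (it is a well-known result, not a summary of the tutorial's contents), is the Exponential-Time Hypothesis together with one of its consequences for k-Clique, a problem that can be phrased as a small conjunctive query or constraint satisfaction instance:

```latex
% Illustration only: ETH and a canonical ETH-based lower bound (requires amsmath).
\[
\textbf{ETH: } \text{$n$-variable 3-SAT admits no $2^{o(n)}$-time algorithm.}
\]
\[
\textbf{Consequence: } \text{assuming ETH, $k$-Clique admits no $f(k)\cdot n^{o(k)}$-time
algorithm for any computable function $f$.}
\]
```

Lower bounds of this form are what allow one to argue that the brute-force n^{O(k)}-time algorithm for k-Clique, and hence for evaluating the corresponding conjunctive query, is essentially optimal.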

Bio

Dániel Marx obtained his PhD in 2005 at the Budapest University of Technology and Economics in Hungary. After that, he held postdoctoral and visiting researcher positions in Berlin, Budapest, and Tel Aviv. From 2012 to 2019, he was at the Institute for Computer Science and Control of the Hungarian Academy of Sciences, where he founded the Parameterized Algorithms and Complexity group, funded by his European Research Council Starting and Consolidator Grants. In 2019-2020, he was a senior researcher at the Max Planck Institute for Informatics in Saarbrücken, and since 2020 he has been a faculty member at the CISPA Helmholtz Center for Information Security in Saarbrücken. Dániel is known for his theoretical work on algorithms and lower bounds for a wide range of problems.




Tutorial 2: Approximation Algorithms for Large Scale Data Analysis

Speaker: Barna Saha (University of California Berkeley)

Abstract

One of the greatest successes of computational complexity theory is the classification of countless fundamental computational problems into polynomial-time and NP-hard ones, two classes that are often referred to as tractable and intractable, respectively. However, this crude distinction of algorithmic efficiency is clearly insufficient when handling today's large-scale data. We need a finer-grained design and analysis of algorithms that pinpoints the exact exponent of the polynomial running time, and a better understanding of when a speed-up is not possible. Based on complexity assumptions stronger than P ≠ NP, such as the Strong Exponential Time Hypothesis (SETH), conditional lower bounds have recently been proposed for a variety of fundamental problems in P. Unfortunately, these conditional lower bounds often break down when one is willing to settle for a near-optimal solution. Indeed, approximation algorithms can play a significant role in designing fast algorithms not just for traditional NP-hard problems, but also for polynomial-time problems.
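
To make the contrast concrete, consider edit distance, a standard illustration from the fine-grained complexity literature (used here only as a minimal sketch, not necessarily the example treated in the tutorial): the classical dynamic program runs in quadratic time, under SETH no exact algorithm can run in time O(n^{2-ε}), and yet constant-factor approximations are known to be computable in truly subquadratic time.

```python
# Minimal sketch (illustration only): the classical O(n*m) dynamic program for
# exact edit distance. Under SETH this quadratic behaviour is essentially the
# best possible for exact computation, while approximation algorithms can run
# in truly subquadratic time.

def edit_distance(a: str, b: str) -> int:
    """Exact edit distance between strings a and b via the standard DP."""
    n, m = len(a), len(b)
    prev = list(range(m + 1))          # distances from the empty prefix of a
    for i in range(1, n + 1):
        curr = [i] + [0] * m           # distance from a[:i] to the empty string
        for j in range(1, m + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            curr[j] = min(prev[j] + 1,          # delete a[i-1]
                          curr[j - 1] + 1,      # insert b[j-1]
                          prev[j - 1] + cost)   # substitute / match
        prev = curr
    return prev[m]

if __name__ == "__main__":
    print(edit_distance("kitten", "sitting"))   # prints 3
```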

For some applications arising in machine learning, low time complexity of the underlying algorithms alone is not sufficient to ensure a fast solution: it is often necessary to collect side information about the data to ensure high accuracy, which in turn calls for algorithms with low query complexity.

In this presentation, we will cover new facets of fast algorithm design for large-scale data analysis, emphasizing the role of approximation algorithms in achieving better polynomial running time and query complexity.

Bio

Barna Saha is currently an Associate Professor in the Department of Industrial Engineering & Operations Research at the University of California, Berkeley. She received her Ph.D. from the University of Maryland, College Park. Prior to joining UC Berkeley, she was a Research Scientist at AT&T Shannon Laboratories, New Jersey, and was on the faculty of the College of Information & Computer Sciences, UMass Amherst. Her research interests include algorithm design and analysis, fine-grained complexity, the probabilistic method, and large-scale data analytics. She has received numerous awards, including the Presidential Early Career Award for Scientists and Engineers (PECASE, 2019), a Sloan Fellowship (2019), an NSF CAREER Award (2017), a Google Faculty Award (2016), a Yahoo Academic Career Enhancement Award (2015), a Simons-Berkeley Research Fellowship (2015), and an NSF CRII Award (2015). In 2020, she received the Young Alumnus Award from the Indian Institute of Technology Kanpur and was inducted as a Kavli Fellow of the National Academy of Sciences.
