Explore the complexity of dynamic least-squares regression in this comprehensive lecture from the Simons Institute. Delve into the sharp separations between amortized update times for fully versus partially dynamic 0.01-LSR, and between high- and low-accuracy LSR in the insertion-only setting. Examine the lower bounds derived from a gap-amplification reduction that connects the exact version of the Online Matrix-Vector (OMv) Conjecture to its constant-approximate version over the reals. Gain insights into the independent significance of this result in fine-grained complexity and its implications for the ongoing investigation of the OMv Conjecture. Learn about the challenges and advancements in maintaining ε-approximate solutions to min_{x^(t)} ||A^(t) x^(t) − b^(t)||_2 in dynamic environments where rows and labels can be adaptively inserted and/or deleted.
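To make the problem concrete, here is a minimal sketch of the insertion-only setting in the simplest (exact, non-accelerated) regime: maintaining the least-squares solution by incrementally updating the normal equations A^T A and A^T b as rows arrive. This is a baseline illustration of the problem statement, not the lecture's algorithm; the class name `DynamicLSR` and the restriction to dimension d = 2 (solved by Cramer's rule to keep the code dependency-free) are assumptions for the example. Each insertion costs O(d²) to update the accumulators plus a solve; the dynamic algorithms discussed in the lecture aim to beat naive recomputation, and the lower bounds show limits on how fast such updates can be.

```python
class DynamicLSR:
    """Maintain argmin_x ||A x - b||_2 under row insertions, for d = 2.

    Hypothetical illustrative baseline: stores the normal-equation
    accumulators M = A^T A and v = A^T b, updating them in O(d^2)
    per inserted (row, label) pair.
    """

    def __init__(self):
        self.M = [[0.0, 0.0], [0.0, 0.0]]  # A^T A
        self.v = [0.0, 0.0]                # A^T b

    def insert(self, row, label):
        """Insert one row of A and its label in b (rank-one update)."""
        a0, a1 = row
        self.M[0][0] += a0 * a0
        self.M[0][1] += a0 * a1
        self.M[1][0] += a1 * a0
        self.M[1][1] += a1 * a1
        self.v[0] += a0 * label
        self.v[1] += a1 * label

    def solve(self):
        """Solve the 2x2 normal system M x = v by Cramer's rule."""
        det = self.M[0][0] * self.M[1][1] - self.M[0][1] * self.M[1][0]
        if abs(det) < 1e-12:
            raise ValueError("A^T A is singular; rows span < 2 dimensions")
        x0 = (self.v[0] * self.M[1][1] - self.M[0][1] * self.v[1]) / det
        x1 = (self.M[0][0] * self.v[1] - self.v[0] * self.M[1][0]) / det
        return [x0, x1]


# Fit y = 2x + 1 by inserting rows (x, 1) one at a time.
lsr = DynamicLSR()
for x in [0, 1, 2, 3]:
    lsr.insert([float(x), 1.0], 2.0 * x + 1.0)
print(lsr.solve())  # recovers slope 2 and intercept 1
```

A fully dynamic variant would also support deletions (subtracting the same rank-one terms), but as the lecture's separations show, supporting deletions, or demanding only constant accuracy, changes the achievable update time fundamentally.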