Communication Complexity of Distributed Convex Learning and Optimization


We study the fundamental limits to communication-efficient distributed methods for convex learning and optimization, under different assumptions on the information available to individual machines, and the types of functions considered. We identify cases where existing algorithms are already worst-case optimal, as well as cases where room for further improvement is still possible. Among other things, our results indicate that without similarity between the local objective functions (due to statistical data similarity or otherwise) many communication rounds may be required, even if the machines have unbounded computational power.
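
For context, the setting behind the abstract is usually that each of m machines holds a local convex function F_i, and the machines jointly minimize the average F(w) = (1/m) * (F_1(w) + ... + F_m(w)) while exchanging limited information in discrete communication rounds. The sketch below is a minimal illustration of one such round, assuming synthetic quadratic local objectives and plain gradient averaging; the objectives, function names, and the averaging scheme are illustrative assumptions, not the algorithm discussed in the talk.

    import numpy as np

    # Illustrative setup (an assumption, not the paper's construction):
    # m machines, each holding a local quadratic objective
    # F_i(w) = 0.5 * ||A_i w - b_i||^2.
    rng = np.random.default_rng(0)
    m, d = 4, 5
    A = [rng.standard_normal((10, d)) for _ in range(m)]
    b = [rng.standard_normal(10) for _ in range(m)]

    def local_gradient(i, w):
        """Gradient of machine i's local objective at w."""
        return A[i].T @ (A[i] @ w - b[i])

    def communication_round(w, step_size=0.01):
        """One synchronous round: each machine sends its local gradient
        (one d-dimensional vector), and their average defines the global step."""
        grads = [local_gradient(i, w) for i in range(m)]  # computed locally, in parallel
        avg_grad = np.mean(grads, axis=0)                 # the communicated quantity
        return w - step_size * avg_grad

    w = np.zeros(d)
    for _ in range(200):      # number of communication rounds
        w = communication_round(w)

How many such rounds are needed to reach a given accuracy, under different assumptions on the local functions F_i and on what each machine knows, is the quantity whose fundamental limits the abstract refers to.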


Monday, December 7, 2015 • 19:00 - 23:59
210 C #99