2018-03-27

Computational Challenges in Machine Learning


source: Simons Institute        May 1, 2017
The aim of this workshop is to bring together a broad set of researchers looking at algorithmic questions that arise in machine learning. The primary target areas will be large-scale learning, including algorithms for Bayesian estimation and variational inference, nonlinear and nonparametric function estimation, reinforcement learning, and stochastic processes including diffusion, point processes, and MCMC. While many of these methods have been central to statistical modeling and machine learning, recent advances in their scope and applicability raise basic questions about their computational efficiency, which is often linked to modeling assumptions and objectives. The workshop will examine progress and challenges and include a set of tutorials on the state of the art by leading experts.
For more information, please visit https://simons.berkeley.edu/workshops...
These presentations were supported in part by an award from the Simons Foundation.
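Several of the talks below concern scaling up MCMC-style Bayesian inference. As a minimal illustration of the basic idea (not drawn from any particular talk; the target density, step size, and function names are my own choices), here is a random-walk Metropolis sampler targeting a standard normal:

```python
# Illustrative sketch only: a minimal random-walk Metropolis sampler
# targeting an unnormalized N(0, 1) density. All names and tuning
# constants here are hypothetical choices for the example.
import math
import random

def log_target(x):
    # Unnormalized log-density of a standard normal.
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

draws = metropolis(20000)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

The chain's empirical mean and variance should approach 0 and 1; the scaling questions the workshop addresses arise when the target density is expensive to evaluate (big data) or high-dimensional, motivating stochastic-gradient and geometry-aware variants discussed in the talks.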

1. 1:05:29 Variational Inference: Foundations and Innovations
2. 44:41 Representational and Optimization Properties of Deep Residual Networks
3. 41:36 Composing Graphical Models with Neural Networks for Structured Representations and Fast Inference
4. 46:48 Scaling Up Bayesian Inference for Big and Complex Data
5. 36:23 Unbiased Estimation of the Spectral Properties of Large Implicit Matrices
6. 53:37 Stochastic Gradient MCMC for Independent and Dependent Data Sources
7. 1:02:06 On Gradient-Based Optimization: Accelerated, Distributed, Asynchronous and Stochastic
8. 48:31 Sampling Polytopes: From Euclid to Riemann
9. 38:54 Understanding Generalization in Adaptive Data Analysis
10. 39:37 Your Neural Network Can't Learn Mine
11. Efficient Distributed Deep Learning Using MXNet
12. Machine Learning for Healthcare Data
13. Machine Learning Combinatorial Optimization Algorithms
14. A Cost Function for Similarity-Based Hierarchical Clustering
15. The Imitation Learning View of Structured Prediction
16. Exponential Computational Improvement by Reduction
17. Embedding as a Tool for Algorithm Design
18. Robust Estimation of Mean and Covariance
19. Computational Efficiency and Robust Statistics
20. 42:11 System and Algorithm Co-Design, Theory and Practice, for Distributed Machine Learning
21. 42:41 The Polytope Learning Problem
22. 1:03:24 Computational Challenges and the Future of ML Panel
23. 1:03:16 Computationally Tractable and Near Optimal Design of Experiments
24. 43:14 Learning from Untrusted Data
25. 41:35 The “Tell Me Something New” Model of Computation for Machine Learning
