data science

Tuesday, September 17, 2019 - 9:00am - 10:00am
Andrew Barron (Yale University)
For deep nets we examine contraction properties of complexity for each layer of the network. For any ReLU network there is, without loss of generality, a representation in which the sum of the absolute values of the weights into each node is exactly 1, and the input layer variables are multiplied by a value V coinciding with the total variation of the path weights. Implications are given for Gaussian complexity, Rademacher complexity, statistical risk, and metric entropy, all of which are shown to be proportional to V.
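The rescaling the abstract describes follows from the positive homogeneity of ReLU (ReLU(a·z) = a·ReLU(z) for a ≥ 0): each hidden node's weight scale can be pushed into the next layer without changing the function. A minimal NumPy sketch for a one-hidden-layer ReLU network (variable names are illustrative, not from the talk) showing that after normalizing the absolute weights into each node to sum to 1, the leftover factor V equals the sum of absolute path-weight products and rescales the network's output:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # weights into the 4 hidden nodes
W2 = rng.normal(size=(1, 4))   # weights into the output node

def relu(z):
    return np.maximum(z, 0.0)

def f(x, W1, W2, scale=1.0):
    return scale * (W2 @ relu(W1 @ x))

# Normalize each hidden node: divide its incoming weights by their
# l1 norm s_j, and absorb s_j into the outgoing weights (homogeneity).
s = np.abs(W1).sum(axis=1)
W1n = W1 / s[:, None]          # abs values of weights into each node sum to 1
W2s = W2 * s[None, :]

# Normalize the output node the same way; the factor pulled out is
# V = sum_j |W2_j| * s_j, the total variation of the path weights.
V = np.abs(W2s).sum()
W2n = W2s / V

# The normalized network times V computes the same function.
x = rng.normal(size=3)
assert np.allclose(f(x, W1, W2), f(x, W1n, W2n, scale=V))
```

By the same homogeneity, multiplying the output by V is equivalent to multiplying the input-layer variables by V, which is the form stated in the abstract.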
Wednesday, September 14, 2016 - 3:10pm - 4:00pm
Bin Yu (University of California, Berkeley)
In this talk, I'd like to discuss the intertwined importance of, and connections among, the three principles of data science in the title for data-driven decisions. The ultimate importance of prediction lies in the fact that the future holds the unique, and possibly the only, purpose of all human activities, in business, education, research, and government alike.
Making prediction as its central task and embracing computation as its core, machine learning has enabled