DSpace@MIT

CBMM Memo Series

Research and Teaching Output of the MIT Community

Recent Submissions

  • Cheney, Nicholas; Schrimpf, Martin; Kreiman, Gabriel (Center for Brains, Minds and Machines (CBMM), arXiv, 2017-04-03)
    Deep convolutional neural networks are generally regarded as robust function approximators. So far, this intuition is based on perturbations to external stimuli such as the images to be classified. Here we explore the ...
  • Zhang, Chiyuan; Liao, Qianli; Rakhlin, Alexander; Sridharan, Karthik; Miranda, Brando; Golowich, Noah; Poggio, Tomaso (Center for Brains, Minds and Machines (CBMM), 2017-04-04)
In Theory III we characterize, with a mix of theory and experiments, the generalization properties of Stochastic Gradient Descent in overparametrized deep convolutional networks. We show that Stochastic Gradient Descent (SGD) ...
  • Poggio, Tomaso; Liao, Qianli (Center for Brains, Minds and Machines (CBMM), arXiv, 2017-03-30)
Previous theoretical work on deep learning and neural network optimization tends to focus on avoiding saddle points and local minima. However, the practical observation is that, at least for the most successful Deep ...
  • Lotter, William; Kreiman, Gabriel; Cox, David (Center for Brains, Minds and Machines (CBMM), arXiv, 2017-03-01)
While great strides have been made in using deep learning algorithms to solve supervised learning tasks, the problem of unsupervised learning — leveraging unlabeled examples to learn about the structure of a domain — remains ...
  • Tacchetti, Andrea; Voinea, Stephen; Evangelopoulos, Georgios (Center for Brains, Minds and Machines (CBMM), arXiv, 2017-03-13)
    The complexity of a learning task is increased by transformations in the input space that preserve class identity. Visual object recognition for example is affected by changes in viewpoint, scale, illumination or planar ...