Top Papers in Neural Networks

Benign, Tempered, or Catastrophic: A Taxonomy of Overfitting

The practical success of overparameterized neural networks has motivated the recent scientific study of interpolating methods, which perfectly fit their training data. Certain interpolating methods …
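
The snippet is cut off above, but the core notion of an interpolating method is easy to demonstrate with 1-nearest-neighbour regression, which always achieves exactly zero training error even on noisy labels (a generic illustration of interpolation, not code from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy training data: y = sin(x) + noise.
X_train = rng.uniform(-3, 3, size=50)
y_train = np.sin(X_train) + rng.normal(scale=0.5, size=50)

def predict_1nn(x):
    """1-nearest-neighbour regression: an interpolating method."""
    return y_train[np.argmin(np.abs(X_train - x))]

# The method interpolates: training error is exactly zero,
# even though the labels are noisy.
train_preds = np.array([predict_1nn(x) for x in X_train])
print(np.allclose(train_preds, y_train))  # True
```

Whether such a perfect fit is benign, tempered, or catastrophic for test error is exactly the question the paper's taxonomy addresses.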

Binarized Neural Networks: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1

We introduce a method to train Binarized Neural Networks (BNNs) - neural networks with binary weights and activations at run-time. At training-time the binary weights and activations are used for computing …
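
A minimal numpy sketch of the binarize-in-forward, real-valued-in-backward idea behind BNNs. The straight-through gradient with hard-tanh clipping shown here is the commonly used approximation in this line of work; the details of this paper's exact scheme are truncated above, so treat this as an illustrative sketch:

```python
import numpy as np

def binarize(w):
    """Deterministic binarization to {-1, +1} used in the forward pass."""
    return np.where(w >= 0, 1.0, -1.0)

def straight_through_grad(w, upstream_grad):
    """Backward pass: pretend binarize() was the identity, but
    cancel gradients where |w| > 1 (hard-tanh clipping)."""
    return upstream_grad * (np.abs(w) <= 1.0)

w_real = np.array([-1.7, -0.3, 0.0, 0.4, 2.1])   # latent real-valued weights
w_bin = binarize(w_real)                          # used at run-time
print(w_bin)                                      # [-1. -1.  1.  1.  1.]

g = straight_through_grad(w_real, np.ones_like(w_real))
print(g)                                          # [0. 1. 1. 1. 0.]
```

The latent real-valued weights accumulate the small gradient updates; only their signs are used when the network actually runs.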

Variational Neural Networks

On the Impact of Boosting on Neural Networks

To Boost or not to Boost: On the Limits of Boosted Neural Networks

Truth or Backpropaganda? An Empirical Investigation of Deep Learning Theory

We empirically evaluate common assumptions about neural networks that are widely held by practitioners and theorists alike. In this work, we: (1) prove the widespread existence of suboptimal local minima …

Hyperbolic Neural Networks

Hyperbolic spaces have recently gained momentum in the context of machine learning due to their high capacity and tree-likeliness properties. However, the representational power of hyperbolic geometry …
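
For context, the basic object in such models is the Poincaré ball; its geodesic distance (standard hyperbolic geometry, not specific to this paper) already shows the tree-like behaviour, with distances blowing up near the boundary:

```python
import numpy as np

def poincare_distance(x, y):
    """Geodesic distance in the Poincare ball model (requires ||x||, ||y|| < 1)."""
    sq = np.sum((x - y) ** 2)
    denom = (1 - np.sum(x ** 2)) * (1 - np.sum(y ** 2))
    return np.arccosh(1 + 2 * sq / denom)

origin = np.zeros(2)
near = np.array([0.5, 0.0])
far = np.array([0.99, 0.0])

# Euclidean distances differ by ~2x; hyperbolic ones by far more:
print(poincare_distance(origin, near))  # ~1.10
print(poincare_distance(origin, far))   # ~5.29
```

This exploding volume near the boundary is what lets hyperbolic embeddings fit trees with low distortion, and it is why hyperbolic layers need replacements for the usual linear operations.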

Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes

Categories: Machine Learning, Neural and Evolutionary Computing, Disordered Systems and Neural Networks, Mathematical Physics

Wide neural networks with random weights and biases are Gaussian processes, as originally observed by Neal (1995) and more recently by Lee et al. (2018) and Matthews et al. (2018) for deep fully-connected …
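
A quick Monte Carlo check of the claim in its simplest case, a single wide ReLU layer with standard Gaussian weights (a common setup in this literature; the paper itself covers far more general architectures): the variance of the network output at an input x converges to ||x||²/2.

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.array([1.0, 1.0])          # ||x||^2 = 2
width = 200_000

# One random ReLU feature layer with Gaussian weights,
# scaled by 1/sqrt(width) as in the Gaussian-process limit.
W = rng.normal(size=(width, x.size))
v = rng.normal(size=width)
f_x = (v @ np.maximum(W @ x, 0.0)) / np.sqrt(width)

# The limiting variance of f(x) is E[relu(w.x)^2] = ||x||^2 / 2 = 1.
var_est = np.mean(np.maximum(W @ x, 0.0) ** 2)
print(round(var_est, 2))  # close to 1.0
```

As the width grows, the empirical variance concentrates on the analytic kernel value, consistent with the Gaussian-process limit the paper generalizes.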

Hidden Neural Networks

On Hiding Neural Networks Inside Neural Networks

Lagrangian Neural Networks

Accurate models of the world are built upon notions of its underlying symmetries. In physics, these symmetries correspond to conservation laws, such as for energy and momentum. …

Bitwise Neural Networks

Based on the assumption that there exists a neural network that efficiently represents a set of Boolean functions between all binary inputs and outputs, we propose a process for developing and deploying …
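
The snippet breaks off, but the payoff of bitwise networks is easy to sketch: with weights and activations in {-1, +1} packed into machine words, a dot product reduces to an XNOR-style comparison plus a popcount (a generic illustration of the trick, not this paper's exact pipeline):

```python
# Represent each +/-1 vector as a bitmask (bit i set <=> element i is +1).
# For n-element vectors a, b: dot(a, b) = n - 2 * popcount(a XOR b),
# since XOR counts the positions where the signs disagree.

def bitwise_dot(a_bits, b_bits, n):
    disagreements = bin(a_bits ^ b_bits).count("1")
    return n - 2 * disagreements

# Example: a = [+1, -1, +1, +1], b = [+1, +1, -1, +1]  (LSB = element 0)
a_bits = 0b1101
b_bits = 0b1011
print(bitwise_dot(a_bits, b_bits, 4))  # 0

# Check against the ordinary dot product:
a = [+1, -1, +1, +1]
b = [+1, +1, -1, +1]
print(sum(x * y for x, y in zip(a, b)))  # 0
```

One XOR plus one popcount replaces a whole row of multiply-accumulates, which is where the efficiency of bitwise inference comes from.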

Should Graph Neural Networks Use Features, Edges, Or Both?

Graph Neural Networks (GNNs) are the first choice for learning algorithms on graph data. GNNs promise to integrate (i) node features as well as (ii) edge information in an end-to-end learning algorithm …
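
A minimal sketch of the (i)+(ii) combination the abstract refers to: one graph-convolution step that mixes each node's own feature vector with its neighbours' features via the adjacency matrix (the generic message-passing pattern, not this paper's specific architecture):

```python
import numpy as np

# Toy graph: 3 nodes in a path 0 - 1 - 2.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.array([[1.0, 0.0],   # node features, one row per node
              [0.0, 1.0],
              [1.0, 1.0]])

W_self = np.eye(2)    # transform for the node's own features   (i)
W_neigh = np.eye(2)   # transform for aggregated neighbours     (ii)

# One message-passing layer: combine features and edge structure.
H = np.maximum(X @ W_self + A @ X @ W_neigh, 0.0)
print(H)
```

Dropping the `A @ X @ W_neigh` term gives a features-only model, and dropping `X @ W_self` gives an edges-only one; the paper's question is when each ingredient actually helps.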

Probabilistic Models of Multi-Layer Feedforward Neural Networks

Lifted Neural Networks

Transmission Neural Networks: From Virus Spread Models to Neural Networks

This work connects models for virus spread on networks with their equivalent neural network representations. Based on this connection, we propose a new neural network architecture, called Transmission Neural Networks …
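
The connection can be sketched with the standard discrete-time SIS (susceptible-infected-susceptible) update on a contact network, which already has the shape of a recurrent layer: a linear map through the adjacency matrix followed by an elementwise nonlinearity. This is a generic epidemic model, not necessarily the paper's exact formulation:

```python
import numpy as np

# Contact network (star with 3 nodes), infection rate beta, recovery rate delta.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
beta, delta = 0.3, 0.2

def sis_step(p):
    """One step of discrete-time SIS; p[i] = infection probability of node i.
    Structurally: an elementwise nonlinearity applied to (A @ p), like an RNN cell."""
    return np.clip((1 - delta) * p + beta * (1 - p) * (A @ p), 0.0, 1.0)

p = np.array([1.0, 0.0, 0.0])   # node 0 starts infected
for _ in range(50):
    p = sis_step(p)
print(p.round(3))  # steady infection levels on each node
```

Reading the adjacency matrix as a weight matrix and the infection update as an activation is the kind of correspondence the proposed architecture builds on.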

Bayesian Neural Networks

In recent times, neural networks have become a powerful tool for the analysis of complex and abstract data models. However, their introduction intrinsically increases our uncertainty about which features …
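
The snippet cuts off, but the basic mechanism Bayesian neural networks add is easy to sketch: keep a distribution over the weights and report the spread of predictions across weight samples as uncertainty (a toy illustration of the idea, not this survey's specific treatment; the Gaussian below simply stands in for a learned posterior):

```python
import numpy as np

rng = np.random.default_rng(0)

def net(x, w):
    """A tiny deterministic 'network': one tanh unit."""
    return np.tanh(w[0] * x + w[1])

# Instead of one weight vector, keep a distribution over weights.
w_samples = rng.normal(loc=[1.0, 0.0], scale=0.3, size=(1000, 2))

x = 2.0
preds = np.array([net(x, w) for w in w_samples])
print(preds.mean().round(3), preds.std().round(3))  # prediction +/- uncertainty
```

A point-estimate network would return a single number here; the predictive standard deviation is the extra information a Bayesian treatment buys.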

Constrained Monotonic Neural Networks

Deep neural networks are becoming increasingly popular in approximating arbitrary functions from noisy data. But wider adoption is being hindered by the need to explain such models and to impose additional …
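
One standard way to impose such a monotonicity constraint (a classical construction, not necessarily this paper's) is to force every weight to be positive, e.g. by exponentiating a free parameter; combined with monotone activations, the whole network becomes non-decreasing in its input:

```python
import numpy as np

rng = np.random.default_rng(0)

# Free (unconstrained) parameters for a 1-hidden-layer network R -> R.
a1, b1 = rng.normal(size=8), rng.normal(size=8)
a2 = rng.normal(size=8)

def monotone_net(x):
    """Weights are exp() of free parameters, hence > 0; with a monotone
    activation this makes the whole map non-decreasing in x."""
    h = np.tanh(np.exp(a1) * x + b1)
    return np.exp(a2) @ h

xs = np.linspace(-3, 3, 200)
ys = np.array([monotone_net(x) for x in xs])
print(bool(np.all(np.diff(ys) >= 0)))  # True: output never decreases
```

The known cost of this simple construction is reduced expressiveness, which is part of what motivates more refined constrained architectures.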

How Neural Networks Extrapolate: From Feedforward to Graph Neural Networks

We study how neural networks trained by gradient descent extrapolate, i.e., what they learn outside the support of the training distribution. Previous works report mixed empirical results when extrapolating …
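
One property relevant here, that ReLU networks become exactly affine far from the origin, can be checked directly: far enough along a ray, every ReLU gate stops switching, so equal steps in the input give equal steps in the output. This is a generic property of ReLU networks, demonstrated on a random untrained one:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 2-layer ReLU MLP: R -> R.
W1, b1 = rng.normal(size=16), rng.normal(size=16)
w2 = rng.normal(size=16)

def mlp(x):
    return w2 @ np.maximum(W1 * x + b1, 0.0)

# Far from the training-scale region, all ReLU gates are frozen,
# so the map is affine: equal input steps give equal output steps.
step1 = mlp(2000.0) - mlp(1000.0)
step2 = mlp(3000.0) - mlp(2000.0)
print(np.isclose(step1, step2))  # True
```

This piecewise-linear structure is why feedforward ReLU networks extrapolate linearly, one of the starting points for the paper's analysis of graph networks.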

Quantum neural networks

This PhD thesis combines two of the most exciting research areas of the last decades: quantum computing and machine learning. We introduce dissipative quantum neural networks (DQNNs), which are designed …

Training Deep Spiking Auto-encoders without Bursting or Dying Neurons through Regularization

Spiking neural networks are a promising approach towards next-generation models of the brain in computational neuroscience. Moreover, compared to classic artificial neural networks, they could serve …

E3nn: A Generalized Framework for Creating E(3) Equivariant Trainable Functions

e3nn: Euclidean Neural Networks

SCNN: An Architecture for Learning from Graphs

Simplicial Convolutional Neural Networks
