We propose a novel method, Atom Modeling, that discretizes a continuous latent space by drawing an analogy between a data point and an atom, which is naturally spaced away from other atoms at distances that depend on their internal structures.
Specifically, we model each data point as an atom composed of electrons, protons, and neutrons, and minimize the potential energy induced by the interatomic forces among data points.
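A minimal sketch of the energy-minimization idea: treat each latent embedding as an atom and penalize pairwise distances with a Lennard-Jones-style 12-6 potential, whose short-range repulsion pushes points apart. The abstract does not specify the potential; the 12-6 form and the equilibrium scale `sigma` are illustrative assumptions.

```python
import numpy as np

def interatomic_energy(z, sigma=1.0, eps=1e-8):
    """Lennard-Jones-style potential energy over pairwise distances
    of latent points z (shape: N points x D dims).

    NOTE: the 12-6 form and `sigma` are assumed here for illustration,
    not taken from the paper.
    """
    diff = z[:, None, :] - z[None, :, :]          # pairwise difference vectors
    d = np.sqrt((diff ** 2).sum(-1)) + eps        # pairwise distances, (N, N)
    r = d[~np.eye(len(z), dtype=bool)]            # drop self-distances
    # strong repulsion at short range, weak attraction at long range
    return float(((sigma / r) ** 12 - (sigma / r) ** 6).sum())

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 4))                       # 8 latent points in 4-D
e = interatomic_energy(z)
```

Using this energy as an auxiliary loss (with gradients flowing into the encoder) would spread latent points to characteristic separations, which is one plausible reading of the discretization effect described above.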
We propose a probe for analyzing deep learning architectures, grounded in machine learning and approximation-theoretic principles. Given a deep learning architecture and a training set, the sparsity probe assesses the performance of intermediate layers, during or after training, by quantifying geometric features of the training set's representations at those layers.
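One simple geometric statistic of an intermediate representation is its effective rank, the exponential of the entropy of the normalized singular-value spectrum. The abstract does not define the probe's actual quantities, so this stand-in is a hypothetical illustration of quantifying representation geometry layer by layer.

```python
import numpy as np

def effective_rank(X):
    """Effective rank of a representation matrix X (samples x features):
    exp of the entropy of the normalized singular-value spectrum.

    NOTE: a generic geometric measure used for illustration; the paper's
    sparsity probe is not specified in the abstract.
    """
    s = np.linalg.svd(X - X.mean(0), compute_uv=False)  # centered spectrum
    p = s / s.sum()                                     # normalize to a distribution
    p = p[p > 0]                                        # guard log(0)
    return float(np.exp(-(p * np.log(p)).sum()))

rng = np.random.default_rng(0)
acts = rng.normal(size=(100, 20))    # e.g. one layer's activations on the training set
layer_score = effective_rank(acts)
```

Computing such a score for each layer's activations yields a per-layer profile that can be tracked during or after training.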
The Graph Neural Networking Challenge 2021 highlights a practical limitation of existing networking solutions: they fail to generalize to larger networks with higher link capacities and aggregated traffic on links.
This paper addresses the scaling problem with a graph neural network-based solution that effectively scales to such larger networks.