Multiplication (e.g., convolution) is arguably a cornerstone of modern deep
neural networks (DNNs). However, intensive multiplications cause expensive
resource costs that challenge DNNs' deployment on
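The cost claim above can be made concrete with a quick multiplication count for a single convolutional layer (a generic back-of-the-envelope sketch, not taken from the abstract; the layer shape is an assumed example):

```python
def conv2d_mults(h, w, c_in, c_out, k):
    """Multiplications in one stride-1 'same' conv layer:
    each of the h*w output positions, for each of the c_out filters,
    performs k*k*c_in multiply-accumulates."""
    return h * w * c_out * k * k * c_in

# A single 3x3 conv with 64 filters on a 224x224 RGB input:
print(conv2d_mults(224, 224, 3, 64, 3))  # 86704128
```

Tens of millions of multiplications for one early layer alone illustrates why multiplication-heavy operators dominate the resource budget.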
Motion during image capture can result in undesired artifacts known as rolling shutter (rs) distortions in the captured image.
Existing single-image rs rectification methods attempt to account for these distortions either by using algorithms tailored for specific classes of scenes, which require knowledge of intrinsic camera parameters, or by using a learning-based framework with known ground-truth motion parameters.
Deep learning techniques lie at the heart of several significant AI advances
in recent years including object recognition and detection, image captioning,
machine translation, speech recognition and s
We develop new theoretical results on matrix perturbation to shed light on the impact of architecture on the performance of a deep network.
In particular, we explain analytically what deep learning practitioners have long observed empirically: the parameters of some deep architectures (e.g., residual networks, resnets) are easier to optimize than those of others (e.g., convolutional networks, convnets).
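The resnet-vs-convnet observation can be illustrated with a toy conditioning experiment (a generic sketch, not the paper's matrix-perturbation analysis): the Jacobian of a residual branch is I + W rather than W, and adding the identity dramatically improves conditioning when W is small at initialization.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
# A small random weight matrix, as in a freshly initialized layer.
W = rng.normal(0.0, 0.1 / np.sqrt(n), size=(n, n))

# Plain layer Jacobian: W.  Residual branch Jacobian: I + W.
cond_plain = np.linalg.cond(W)
cond_resnet = np.linalg.cond(np.eye(n) + W)

print(cond_plain, cond_resnet)
```

Because the singular values of I + W cluster near 1 at initialization, gradient information passes through a residual branch almost undistorted, while a plain random layer can be poorly conditioned.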
Deep neural networks (dns) provide superhuman performance in numerous computer vision tasks, yet it remains unclear exactly which of a dn's units contribute to a particular decision.
Neuroview is a new family of dn architectures that are interpretable/explainable by design.
Deep convolutional neural networks have demonstrated their capability of learning
a deterministic mapping for the desired imagery effect. However, the large
variety of user flavors motivates the possibili
The trend towards increasingly deep neural networks has been driven by a
general observation that increasing depth increases the performance of a
network. Recently, however, evidence has been amassing
It has been empirically observed that the flatness of minima obtained from
training deep networks seems to correlate with better generalization. However,
for deep networks with positively homogeneous
We present ApproxConv, a novel method for compressing the layers of a
convolutional neural network. Reframing conventional discrete convolution as
continuous convolution of parametrised functions over
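As a hedged sketch of the reframing the abstract describes (the Gaussian below is an assumed example of a parametrised function, not necessarily the one ApproxConv uses): a continuous function with a few parameters can be sampled on a discrete grid to produce convolution weights, so storing only the parameters compresses the layer.

```python
import numpy as np

def gaussian_kernel(size, sigma):
    # Sample a continuous parametrised function (here a 1-D Gaussian,
    # an assumed example) on a discrete grid to obtain conv weights.
    xs = np.arange(size) - (size - 1) / 2.0
    k = np.exp(-xs ** 2 / (2 * sigma ** 2))
    return k / k.sum()  # normalize so the kernel sums to 1

def conv1d(signal, kernel):
    # Plain discrete convolution ('same' padding) with the sampled kernel.
    return np.convolve(signal, kernel, mode="same")

# A unit impulse is smeared into the (normalized) kernel shape.
x = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
y = conv1d(x, gaussian_kernel(3, 1.0))
```

Here a kernel of any size is described by a single parameter (sigma) instead of size-many weights, which is the sense in which parametrised continuous kernels can compress a layer.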
In this paper, we study the importance of pruning in deep networks (dns) and motivate it based on the current absence of data-aware weight initialization.
Current initializations, which focus primarily on maintaining first-order statistics of the feature maps through depth, force practitioners to overparametrize a model in order to reach high performance.
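A standard example of such a statistics-preserving scheme is variance-scaling (He) initialization, sketched below as a generic illustration (not necessarily the specific initialization the abstract critiques): it keeps the scale of ReLU feature maps roughly constant through depth, yet uses no information about the data.

```python
import numpy as np

rng = np.random.default_rng(0)

def he_init(fan_in, fan_out):
    # Variance-scaling (He) initialization: weight variance 2/fan_in keeps
    # the scale of ReLU feature maps roughly constant through depth.
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

# Push a standard-normal batch through several ReLU layers; the
# root-mean-square activation should neither explode nor vanish.
x = rng.normal(size=(512, 256))
for _ in range(5):
    x = np.maximum(x @ he_init(256, 256), 0.0)

rms = float(np.sqrt(np.mean(x ** 2)))
print(rms)  # stays near 1 rather than exploding or vanishing
```

Note that the scheme depends only on the layer shapes, which is exactly the data-unawareness the abstract points to.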
Deep network-based image Compressed Sensing (CS) has attracted much attention
in recent years. However, the existing deep network-based CS schemes either
reconstruct the target image in a block-by-blo
Recent research developing neural network architectures with external memory
has often used the benchmark bAbI question-answering dataset, which
provides a challenging number of tasks requiring re
Custom and natural lighting conditions can be emulated in images of the scene
during post-editing. The extraordinary capabilities of deep learning frameworks
can be utilized for such a purpose. Deep imag
Deep image denoisers achieve state-of-the-art results but with a hidden cost.
As witnessed in recent literature, these deep networks are capable of overfitting their training distributions, adding inaccurate hallucinations to their outputs and generalizing poorly to varying data.
Despite the fact that the loss functions of deep neural networks are highly
non-convex, gradient-based optimization algorithms converge to approximately
the same performance from many random initial p
Traditional machine learning approaches may fail to perform satisfactorily when dealing with complex data. In this context, the importance of data mining grows with respect to building an efficient knowledge
Schizophrenia is a severe mental health condition that requires a long and
complicated diagnostic process. However, early diagnosis is vital to control
symptoms. Deep learning has recently become a po
Deep learning has shown astonishing performance in accelerated magnetic
resonance imaging (MRI). Most state-of-the-art deep learning reconstructions
adopt the powerful convolutional neural network and