About me
This is a page not in the main menu.
Published:
Various Neural Networks employ time-consuming matrix operations like matrix inversion. Many such matrix operations are faster to compute given the Singular Value Decomposition (SVD). Techniques from [1, 2] allow using the SVD in Neural Networks without computing it. In theory, these techniques can speed up matrix operations; in practice, however, they are not fast enough.
We present an algorithm which is up to $27 \times $ faster than a previous approach, fast enough to speed up several matrix operations.
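To illustrate why the SVD makes such operations cheap (this is a generic NumPy sketch, not the algorithm from the post): once $A = U \Sigma V^\top$ is available, the inverse is just $V \Sigma^{-1} U^\top$, which only requires inverting a diagonal.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Given the SVD A = U @ diag(S) @ Vt, the inverse is
# Vt.T @ diag(1/S) @ U.T -- inverting only a diagonal matrix.
U, S, Vt = np.linalg.svd(A)
A_inv = Vt.T @ (U / S).T  # (U / S).T equals diag(1/S) @ U.T

assert np.allclose(A_inv, np.linalg.inv(A))
```

The point of [1, 2] is to maintain such a factorization inside the network without ever paying for the SVD itself.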
Published:
Many local post-hoc explainability techniques, such as DeConvNet, Guided Backprop, Layer-wise Relevance Propagation, and Integrated Gradients, rely on "gradient-like" computations, where explanations are propagated backwards through Neural Networks, one layer at a time. One can alter this backward computation to include attention, which guides the explanation techniques to produce better explanations.
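As a concrete example of such an altered backward computation (a minimal sketch of the well-known Guided Backprop rule for a ReLU layer, not the attention-based modification from the post): the standard gradient masks by where the forward input was positive, while the guided variant additionally zeroes out negative incoming signal.

```python
import numpy as np

def relu_backward_plain(grad_out, pre_act):
    # Standard gradient: pass gradient only where the ReLU input was positive.
    return grad_out * (pre_act > 0)

def relu_backward_guided(grad_out, pre_act):
    # Guided Backprop: additionally zero out negative incoming signal,
    # so only positive evidence flows back into the explanation.
    return grad_out * (pre_act > 0) * (grad_out > 0)

grad_out = np.array([1.0, -1.0, 2.0])
pre_act = np.array([0.5, 0.5, -0.5])
print(relu_backward_plain(grad_out, pre_act))   # keeps the -1.0 term
print(relu_backward_guided(grad_out, pre_act))  # drops it
```

Techniques like the one described here swap in further layer-wise rules of this kind, one per layer type.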
Published:
"How can I make a minimal and realistic change to an input such that the predicted outcome changes?" This is the question I aim to answer using invertible Neural Networks. The idea is closely related to pre-image computation in, e.g., kernel methods.
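The basic mechanics can be sketched with a toy invertible map (a hypothetical illustration; the map, the latent-space classifier, and the boundary-crossing step are all placeholders, not the method from the post): encode the input, nudge the latent code just past the decision boundary, and invert to recover the counterfactual input, i.e. the exact pre-image.

```python
import numpy as np

# Toy invertible "network": an affine map z = W @ x + b with invertible W.
W = np.array([[2.0, 1.0],
              [0.0, 1.0]])
b = np.array([0.5, -0.5])
f = lambda x: W @ x + b
f_inv = lambda z: np.linalg.solve(W, z - b)

# Hypothetical predictor in latent space: class 1 iff z[0] > 0.
x = np.array([-1.0, 0.2])
z = f(x)

# Move z minimally across the decision boundary, then invert
# to obtain the counterfactual input x_cf (the pre-image of z_cf).
z_cf = z.copy()
z_cf[0] = 0.1  # just past the boundary
x_cf = f_inv(z_cf)

assert np.allclose(f(x_cf), z_cf)  # invertibility makes the pre-image exact
```

Unlike kernel pre-imaging, where the pre-image must be approximated, an invertible network gives it in closed form.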