Activity
-
In our new work TensorGRaD we use a robust decomposition of the gradient tensors into low-rank + sparse parts to reduce optimizer memory for Neural…
Shared by Anima Anandkumar
-
Can't believe it has been 3 years! What an amazing journey with Benedikt Jenik and so much more to come!
Shared by Anima Anandkumar
-
An in-depth physical analysis of #OceanNet, our high-resolution regional ocean digital twins' predictions in terms of eddy structure, shedding…
Shared by Anima Anandkumar
More activity by Anima
-
It was an honor to be part of the Google IO Dialogues stage with James Manyika, Pushmeet Kohli, and Joëlle Barral and talk about AI+Science. I talked about…
Posted by Anima Anandkumar
Other similar profiles
- Tanya Berger-Wolf
- Siddhartha Srinivasa
- Seyda Ertekin
- Yisong Yue
- Hana Khamfroush
- Devi Parikh
- Arash Vahdat
- Mahdi Imani, Assistant Professor at Northeastern University
- Mostafa Ardakani, Associate Professor at University of Utah
- Vijayan Asari, Professor and Endowed Chair