Hi,
This is Anand (アナンド, if you prefer). I spend most of my time trying to understand the nuances of Deep Neural Networks, if not AI at large. I currently work at Bodygram, Tokyo, as a Machine Learning Engineer. On a good day, I have at least a handful of failed ideas, or ideas that don't really work, which I realise while working out the math or while implementing them.
I can talk passionately for hours about Karnatik Music (South-Indian Classical music), the Philosophy of Art, and Science in general. Recently, I have picked up a passion for hiking and am working my way up the 日本百名山 (100 Famous Mountains in Japan). Before transitioning into Machine Learning, I worked on embedded systems and robotics.
Contact
Recent Posts
5 Feb 2024 | Olafur Eliasson @ Azabudai Hills Gallery
A trip to Olafur Eliasson's exhibition at Azabudai Hills Gallery, Tokyo.

29 Jun 2022 | The Back-Gradient Trick
Stochastic Gradient Descent can be (kind of) reversed and used to compute gradients with respect to its hyperparameters.

31 May 2022 | Parallelizing Kalman Filters
The associative property of Kalman (Bayesian) filters yields a parallel algorithm in O(log N).

24 Apr 2022 | Linearization is All You Need for an Autodiff Library
A complete autodiff library can be written using only linearization of the computational graph.

23 Sep 2021 | Fast Sample-Covariance Computation for Multidimensional Arrays
A quick discussion and a vectorized Python implementation for computing sample covariance matrices for multi-dimensional arrays.