Yesterday I came across a marketing piece claiming that JAX supports Autograd and JIT.
I immediately exclaimed in the group chat: impressive!
A guy in the group shot back: hasn't torch.autograd been around forever?
So here is the question: are these two the same thing?
I got up early this morning, dug through both projects' GitHub pages, and found the following:
What is JAX?
JAX is Autograd and XLA, brought together for high-performance machine learning research.
From: https://github.com/google/jax#what-is-jax
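To make that concrete, here is a minimal sketch of my own (not from the README) of what "Autograd and XLA" means in practice: jax.grad gives you the derivative of a plain Python function, and jax.jit compiles it with XLA.

```python
# My own minimal example: jax.grad for differentiation, jax.jit for XLA compilation.
import jax
import jax.numpy as jnp

def tanh_sum(x):
    return jnp.sum(jnp.tanh(x))

grad_fn = jax.grad(tanh_sum)     # reverse-mode gradient of tanh_sum
fast_grad_fn = jax.jit(grad_fn)  # same function, compiled by XLA

x = jnp.arange(3.0)
print(grad_fn(x))       # elementwise derivative 1 - tanh(x)**2
print(fast_grad_fn(x))  # identical values, just compiled
```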
Following the trail, I found that the Autograd hyperlink points to the HIPS/autograd project.
On the GitHub page of HIPS/autograd, I found another passage:
Note: Autograd is still being maintained but is no longer actively developed. The main developers (Dougal Maclaurin, David Duvenaud, Matt Johnson, and Jamie Townsend) are now working on JAX, with Dougal and Matt working on it full-time. JAX combines a new version of Autograd with extra features such as jit compilation.
From: https://github.com/hips/autograd
The main developers have all moved over to JAX! So the Autograd in JAX really is the direct heir of HIPS/autograd.
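For comparison, here is the same computation written with the original HIPS/autograd package (again my own sketch, not taken from either README). The grad-of-a-plain-Python-function style is exactly what jax.grad carries over; what JAX adds on top is jit/XLA compilation.

```python
# My own minimal example with the original HIPS/autograd package.
import autograd.numpy as np   # thinly wrapped NumPy that records operations
from autograd import grad

def tanh_sum(x):
    return np.sum(np.tanh(x))

grad_fn = grad(tanh_sum)
print(grad_fn(np.arange(3.0)))  # same 1 - tanh(x)**2 values as the JAX version
```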
What about the Autograd in PyTorch? Again, its GitHub page has the answer:
With PyTorch, we use a technique called reverse-mode auto-differentiation, which allows you to change the way your network behaves arbitrarily with zero lag or overhead. Our inspiration comes from several research papers on this topic, as well as current and past work such as torch-autograd, autograd, Chainer, etc.
From: https://github.com/pytorch/pytorch#dynamic-neural-networks-tape-based-autograd
So it was inspired by torch-autograd, HIPS/autograd, Chainer, and related projects.
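"Tape-based" here means define-by-run: rather than transforming a function the way grad(f) does, PyTorch records the operations you execute on tensors with requires_grad=True and then replays that tape backwards. A minimal sketch of my own, assuming a standard PyTorch install:

```python
# My own minimal example of PyTorch's tape-based reverse-mode autograd.
import torch

x = torch.arange(3.0, requires_grad=True)
y = torch.tanh(x).sum()   # forward pass: every op is recorded on the tape
y.backward()              # reverse pass: replay the tape to accumulate gradients
print(x.grad)             # elementwise 1 - tanh(x)**2, same math as above
```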
And torch-autograd, in turn, says:
Autograd automatically differentiates native Torch code. Inspired by the original Python version.
From: https://github.com/twitter-archive/torch-autograd#autograd
Click through that "original Python version" link, and guess what?
It's HIPS/autograd again!!!