Computer Vision News - February 2021
Computer Vision Tool

Using JAX to accelerate deep learning research
by Ioannis Valasakis, King's College London (Twitter/LinkedIn/GitHub as @wizofe)

Welcome to this month's code discussion. I hope that you are all staying safe and are aware of any local restrictions due to the pandemic. It is really important to be careful and keep the numbers down, since in many places (such as the UK) the death toll is still rising.

This month, I am presenting a tool that can accelerate deep learning research using Autograd and XLA. JAX is developed by Google Research teams for high-performance machine-learning research. In this article, I am going to give a short introduction to JAX and present some tools to use with it!

Differentiation

Autograd is a project that is no longer under active development, although it is still maintained. It can handle a large subset of Python's features. Importantly, it supports reverse-mode differentiation (i.e. backpropagation): it can take gradients of scalar-valued functions with respect to array-valued arguments. It also supports forward-mode differentiation, and the two can be composed arbitrarily. Here is an example from the documentation:

>>> import autograd.numpy as np  # Thinly-wrapped numpy
>>> from autograd import grad    # The only autograd function you may ever need
>>>
>>> def tanh(x):                 # Define a function
...     y = np.exp(-2.0 * x)
...     return (1.0 - y) / (1.0 + y)
...
>>> grad_tanh = grad(tanh)       # Obtain its gradient function
>>> grad_tanh(1.0)               # Evaluate the gradient at x = 1.0
0.41997434161402603
>>> (tanh(1.0001) - tanh(0.9999)) / 0.0002  # Compare to finite differences
0.41997434264973155

JAX supports an API very similar to Autograd's, while extending it with more automated features. Here is the respective example in JAX:
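What follows is a minimal sketch of that JAX version, mirroring the Autograd snippet above; the printed values are approximate, and by default JAX computes in float32 unless 64-bit mode is enabled.

import jax.numpy as jnp    # drop-in NumPy-style API, backed by XLA
from jax import grad       # the grad transform, now provided by JAX itself

def tanh(x):               # the same function as above
    y = jnp.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

grad_tanh = grad(tanh)     # obtain its gradient function
print(grad_tanh(1.0))      # roughly 0.4199743

# As with Autograd, differentiation composes arbitrarily,
# e.g. a second derivative via grad of grad:
print(grad(grad(tanh))(1.0))   # roughly -0.6397

Notice that the only real change from the Autograd snippet is the imports: autograd.numpy becomes jax.numpy and grad now comes from the jax package, while the function itself is untouched.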