Deep Learning with JAX

Sold by Simon and Schuster
5.0 · 1 review
Ebook · 408 pages

About this ebook

Accelerate deep learning and other number-intensive tasks with JAX, Google's awesome high-performance numerical computing library.

The JAX numerical computing library tackles the core performance challenges at the heart of deep learning and other scientific computing tasks. By combining Google's Accelerated Linear Algebra platform (XLA) with a hyper-optimized version of NumPy and a variety of other high-performance features, JAX delivers a huge performance boost in low-level computations and transformations.
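
As a minimal illustration of that combination (the function and shapes here are our own, not examples from the book), jax.numpy provides the NumPy-style API, and jax.jit hands the traced computation to XLA for compilation:

    import jax
    import jax.numpy as jnp

    # jnp mirrors NumPy's interface, but runs on CPU, GPU, or TPU.
    def predict(w, x):
        return jnp.tanh(x @ w)

    # jit traces predict once and compiles it with XLA.
    fast_predict = jax.jit(predict)

    w = jnp.ones((3, 2))
    x = jnp.ones((8, 3))
    print(fast_predict(w, x).shape)  # (8, 2)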

In Deep Learning with JAX you will learn how to:

• Use JAX for numerical calculations
• Build differentiable models with JAX primitives
• Run distributed and parallelized computations with JAX
• Use high-level neural network libraries such as Flax
• Leverage libraries and modules from the JAX ecosystem

Deep Learning with JAX is a hands-on guide to using JAX for deep learning and other mathematically intensive applications. Google Developer Expert Grigory Sapunov steadily builds your understanding of JAX's concepts. The engaging examples introduce the fundamental concepts on which JAX relies and then show you how to apply them to real-world tasks. You'll learn how to use JAX's ecosystem of high-level libraries and modules, and also how to combine TensorFlow and PyTorch with JAX for data loading and deployment.
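
To give a flavor of those high-level libraries, here is a small sketch using Flax's Linen API (the model and sizes are illustrative assumptions, not taken from the book). Unlike object-oriented frameworks, Flax keeps the parameters outside the module, in a pytree of arrays:

    import jax
    import jax.numpy as jnp
    import flax.linen as nn

    class MLP(nn.Module):
        # A tiny two-layer network defined with Linen modules.
        @nn.compact
        def __call__(self, x):
            x = nn.relu(nn.Dense(features=128)(x))
            return nn.Dense(features=10)(x)

    model = MLP()
    # init builds the parameter pytree; apply runs the forward pass.
    params = model.init(jax.random.PRNGKey(0), jnp.ones((1, 784)))
    logits = model.apply(params, jnp.ones((4, 784)))
    print(logits.shape)  # (4, 10)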

About the technology

Google's JAX offers a fresh vision for deep learning. This powerful library gives you fine control over low-level processes like gradient calculations, delivering fast and efficient model training and inference, especially on large datasets. JAX has transformed how research scientists approach deep learning. Now boasting a robust ecosystem of tools and libraries, JAX makes evolutionary computations, federated learning, and other performance-sensitive tasks approachable for all types of applications.
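
That fine control over gradient calculations is exposed through jax.grad, which turns a scalar-valued function into a function that computes its derivative. A minimal sketch, with an illustrative mean-squared-error loss of our own choosing:

    import jax
    import jax.numpy as jnp

    # Scalar loss: mean squared error of a linear model.
    def loss(w, x, y):
        return jnp.mean((x @ w - y) ** 2)

    # grad returns a new function computing d(loss)/d(w).
    grad_fn = jax.grad(loss)

    w = jnp.zeros(3)
    x = jnp.ones((4, 3))
    y = jnp.ones(4)
    print(grad_fn(w, x, y))  # gradient with the same shape as w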

About the book

Deep Learning with JAX teaches you to build effective neural networks with JAX. In this example-rich book, you'll discover how JAX's unique features help you tackle important deep learning performance challenges, like distributing computations across a cluster of TPUs. You'll put the library into action as you create an image classification tool, an image filter application, and other realistic projects. The nicely annotated code listings demonstrate how JAX's functional programming mindset improves composability and parallelization.
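
That composability comes from JAX transformations being functions on functions, so they stack. A hedged sketch (the loss and batch sizes are our own illustrations, not from the book): jax.vmap vectorizes a per-example gradient over a batch, and jax.jit compiles the whole pipeline:

    import jax
    import jax.numpy as jnp

    def loss(w, x, y):
        # Per-example squared error of a linear model.
        return (jnp.dot(x, w) - y) ** 2

    # Transformations compose: differentiate, vectorize over the
    # batch axes of x and y, then compile the result with XLA.
    per_example_grads = jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0, 0)))

    w = jnp.zeros(3)
    xs = jnp.ones((16, 3))  # batch of 16 examples
    ys = jnp.ones(16)
    print(per_example_grads(w, xs, ys).shape)  # (16, 3)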

What's inside

• Use JAX for numerical calculations
• Build differentiable models with JAX primitives
• Run distributed and parallelized computations with JAX
• Use high-level neural network libraries such as Flax

About the reader

For intermediate Python programmers who are familiar with deep learning.

About the author

Grigory Sapunov is a co-founder and CTO of Intento and a software engineer with more than twenty years of experience. He holds a Ph.D. in artificial intelligence and is a Google Developer Expert in Machine Learning.

The technical editor on this book was Nicholas McGreivy.

Table of Contents
Part 1
1 When and why to use JAX
2 Your first program in JAX
Part 2
3 Working with arrays
4 Calculating gradients
5 Compiling your code
6 Vectorizing your code
7 Parallelizing your computations
8 Using tensor sharding
9 Random numbers in JAX
10 Working with pytrees
Part 3
11 Higher-level neural network libraries
12 Other members of the JAX ecosystem
A Installing JAX
B Using Google Colab
C Using Google Cloud TPUs
D Experimental parallelization
