All posts (95)
4. Introduction to graphs and tf.function
This guide goes beneath the surface of TensorFlow and Keras to demonstrate how TensorFlow works. In this guide, you'll learn how TensorFlow allows you to make simple changes to your code to get graphs, how graphs are stored and represented, and how you can use them to accelerate your models. This is a big-picture overview that covers how tf.function allows you to switch from eager execution to graph execution...
2022.03.12
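A minimal sketch of the idea behind that post, assuming only the standard TensorFlow API: wrapping an ordinary Python function with tf.function lets the same computation be traced into a graph instead of running eagerly.

import tensorflow as tf

def dense_layer(x, w, b):
    # Plain eager computation
    return tf.matmul(x, w) + b

# Wrapping the function lets TensorFlow trace it into a graph
graph_layer = tf.function(dense_layer)

x = tf.constant([[1.0, 2.0]])
w = tf.constant([[1.0], [1.0]])
b = tf.constant([0.5])

print(dense_layer(x, w, b))   # eager execution
print(graph_layer(x, w, b))   # same result, executed as a graph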
3. Introduction to gradients and automatic differentiation
In this guide, you will explore ways to compute gradients with TensorFlow, especially in eager execution. import numpy as np import matplotlib.pyplot as plt import tensorflow as tf Computing gradients To differentiate automatically, TensorFlow needs to remember what operations happen in what order during the forward pass. Then, during the backward pass, TensorFlow traverses this list of operation..
2022.03.12
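A short sketch of that record-then-replay idea, using only tf.GradientTape from the standard API: the tape records operations during the forward pass and traverses them in reverse to compute gradients.

import tensorflow as tf

x = tf.Variable(3.0)

with tf.GradientTape() as tape:
    # Forward pass: operations on x are recorded by the tape
    y = x ** 2

# Backward pass: d(x^2)/dx = 2x, so 6.0 at x = 3.0
dy_dx = tape.gradient(y, x)
print(dy_dx)  # tf.Tensor(6.0, shape=(), dtype=float32)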
2. Introduction to Variables
A TensorFlow variable is the recommended way to represent shared, persistent state your program manipulates. This guide covers how to create, update, and manage instances of tf.Variable in TensorFlow. Higher-level libraries like tf.keras use tf.Variable to store model parameters. import tensorflow as tf Create a variable To create a variable, provide an initial value. The tf.Variable will have t..
2022.03.12
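A brief sketch of the basic tf.Variable workflow described there, using nothing beyond the standard API: create a variable from an initial value, then update it in place.

import tensorflow as tf

# The initial value fixes the variable's dtype and shape
v = tf.Variable([1.0, 2.0])
print(v.dtype, v.shape)  # <dtype: 'float32'> (2,)

# Unlike tensors, variables can be updated in place
v.assign([3.0, 4.0])
v.assign_add([1.0, 1.0])
print(v.numpy())  # [4. 5.]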
1. Introduction to Tensors
import numpy as np import tensorflow as tf Basics Tensors are multi-dimensional arrays with a uniform type (called a dtype). All tensors are immutable like Python numbers and strings: you can never update the contents of a tensor, only create a new one. # Python number num = 123 # TensorFlow Tensor tensor = tf.constant([1, 2]) # Before num address print("Before num address: ", id(num)) # Before t..
2022.03.12
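A small sketch of the immutability point in that excerpt: any "update" to a tensor actually creates a new tensor object, which the id() check makes visible.

import tensorflow as tf

tensor = tf.constant([1, 2])
print("Before:", id(tensor))

# There is no in-place update; adding produces a brand-new tensor
tensor = tensor + tf.constant([10, 10])
print("After: ", id(tensor))  # a different object id

print(tensor.numpy())  # [11 12]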
[Linear Algebra] 13. Eigenvectors and eigenvalues
Eigenvectors and eigenvalues Most vectors get knocked off their span (their line) during a transformation. But some special vectors remain on their line after the transformation, which means the transformation just stretches or squishes them. Moreover, any other vector on that line is also stretched or squished by the same scalar. These special vectors are called the "eigenvectors" of that transformation. And each eigen..
2022.03.12
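A small worked example of that definition, with a matrix chosen here only for illustration: eigenvectors satisfy $A\mathbf{\vec{v}} = \lambda\mathbf{\vec{v}}$, so for $A = \begin{bmatrix} 3 & 1 \\ 0 & 2 \end{bmatrix}$ the vector along the x-axis stays on its span and is only scaled,

$A \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 3 \\ 0 \end{bmatrix} = 3 \begin{bmatrix} 1 \\ 0 \end{bmatrix}$,

making $\begin{bmatrix} 1 \\ 0 \end{bmatrix}$ an eigenvector with eigenvalue $\lambda = 3$.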
[Linear Algebra] 12. Change of basis
The standard way to describe a vector is with coordinates $\begin{bmatrix} x \\ y \end{bmatrix}$. Each coordinate is a scalar that stretches or squishes the basis vectors, normally $\mathbf{\hat{i}}$ and $\mathbf{\hat{j}}$, the standard basis vectors. In this section, think about the idea of using a different set of basis vectors. Let's define new basis vectors $\mathbf{\vec{b_1}}$ and $\mathbf{\vec{b_2}}..
2022.03.12
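A minimal worked example of that idea, with basis vectors chosen here only for illustration: if $\mathbf{\vec{b_1}} = \begin{bmatrix} 2 \\ 1 \end{bmatrix}$ and $\mathbf{\vec{b_2}} = \begin{bmatrix} -1 \\ 1 \end{bmatrix}$, then the vector written as $\begin{bmatrix} 3 \\ 2 \end{bmatrix}$ in the new basis is translated into standard coordinates by the matrix whose columns are $\mathbf{\vec{b_1}}$ and $\mathbf{\vec{b_2}}$:

$\begin{bmatrix} 2 & -1 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} 3 \\ 2 \end{bmatrix} = 3\begin{bmatrix} 2 \\ 1 \end{bmatrix} + 2\begin{bmatrix} -1 \\ 1 \end{bmatrix} = \begin{bmatrix} 4 \\ 5 \end{bmatrix}$.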