This Day in TensorFlow 2: tf.tpu
Hello world, it’s Aaron! Today I’ll be talking about TensorFlow 2’s amazing tf.tpu API.
NOTE: The version of TensorFlow I’m using is TensorFlow 2.1. The Python version I’m using is 3.7. I’ll also periodically update the TensorFlow & Python version used here to match the most recent releases.
What is TensorFlow?
TensorFlow is Google’s machine learning library, and it’s free and open source for anyone to use. Its core is written in C++, so it runs close to the hardware and executes ridiculously fast.
What is TPU?
Well, a TPU is a Tensor Processing Unit: a custom chip Google developed to accelerate machine learning workloads, which you can rent through Google Cloud. It’s a scary-looking machine, and the newer versions are even liquid-cooled (see the picture below).
Ok, so now onto the meat: tf.tpu. tf.tpu is the TensorFlow namespace for operations related to Tensor Processing Units (TPUs).
Currently, according to TensorFlow’s tf.tpu documentation, there are only two functions available, both under tf.tpu.experimental.
1. tf.tpu.experimental.initialize_tpu_system
This initializes the TPU system (all of the TPU devices in the cluster).
Sample usage:
tf.tpu.experimental.initialize_tpu_system(cluster_resolver=None)
Arguments:
- cluster_resolver: a tf.distribute.cluster_resolver.TPUClusterResolver that provides information about the TPU cluster
Returns:
- a tf.tpu.Topology object describing the topology (arrangement) of the TPU cluster
Raises:
- RuntimeError if no TPU devices are found through eager execution
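To make this concrete, here’s a minimal sketch of initializing a TPU in TensorFlow 2.1. It assumes you’re running somewhere a TPU is actually reachable (for example a Colab TPU runtime or a Cloud TPU VM); the exact TPUClusterResolver arguments depend on your environment, so treat this as a sketch rather than copy-paste-ready code.
import tensorflow as tf
# Assumption: a TPU is reachable from this environment (e.g. a Colab TPU
# runtime). You may need TPUClusterResolver(tpu='grpc://<address>') instead.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
# Initialize the TPU system; this returns the Topology object mentioned above.
topology = tf.tpu.experimental.initialize_tpu_system(resolver)
print("TPU devices:", tf.config.list_logical_devices("TPU"))
# From here you'd typically build a TPUStrategy to actually run models on it.
strategy = tf.distribute.experimental.TPUStrategy(resolver)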
2. tf.tpu.experimental.shutdown_tpu_system
This shuts down the TPU device(s).
Sample usage:
tf.tpu.experimental.shutdown_tpu_system(cluster_resolver=None)
Arguments:
- cluster_resolver: a tf.distribute.cluster_resolver.TPUClusterResolver that provides information about the TPU cluster
Returns:
- nothing (unlike initialize_tpu_system, there is no Topology object here)
Raises:
- RuntimeError if no TPU devices are found through eager execution
NOTE: this clears all caches maintained by the TPU system, including the compilation cache.
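And here’s what a matching teardown might look like, again as a hedged sketch: it reuses the same hypothetical resolver setup as above, and shutting down is mostly useful in notebooks where you want to reset the TPU between experiments.
import tensorflow as tf
# Same assumption as before: a TPU is reachable and the resolver can find it.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
# ... run some TPU work here ...
# Shut the TPU system back down. Because this clears the compilation cache,
# the next initialize_tpu_system call starts from a cold state.
tf.tpu.experimental.shutdown_tpu_system(resolver)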
That’s all for today! Don’t forget to follow me on Medium to stay up to date on the latest in technology and more! Thanks for reading!