TensorFlow Tutorials

Before running a tutorial

You will run the tutorials on an inf1.6xlarge instance running the Deep Learning AMI (DLAMI), so that compilation and deployment (inference) can both happen on the same instance. In a production environment, we encourage you to try different instance sizes to optimize for your specific deployment needs.
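
As a rough illustration of that compile-then-deploy flow, the sketch below traces a small placeholder Keras model with the tensorflow-neuron 2.x API (tfn.trace) and then runs inference with the compiled artifact on the same instance. The model, shapes, and output directory are illustrative assumptions rather than part of any specific tutorial; the linked tutorials show the exact steps for each framework version.

    import tensorflow as tf
    import tensorflow.neuron as tfn  # provided by the tensorflow-neuron 2.x package on the DLAMI

    # Placeholder network standing in for whichever model a tutorial uses.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(224, 224, 3)),
        tf.keras.layers.Conv2D(8, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(10),
    ])

    # Compile for Inferentia by tracing the model with a representative input.
    example_input = tf.random.uniform([1, 224, 224, 3])
    model_neuron = tfn.trace(model, example_input)

    # Save the compiled model, then load it back and run inference on this same instance.
    model_neuron.save("compiled_model_neuron")  # placeholder output directory
    reloaded = tf.keras.models.load_model("compiled_model_neuron")
    print(reloaded(example_input))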

Follow the instructions in TensorFlow Tutorials Setup before running a TensorFlow tutorial on Inferentia. We recommend that new users start with the ResNet-50 tutorial.

Computer Vision

Natural Language Processing

  • Tensorflow 1.x - Running TensorFlow BERT-Large with AWS Neuron [html]

  • Tensorflow 2.x - HuggingFace Pipelines distilBERT with Tensorflow2 Neuron [html] [notebook]
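
As a loose sketch of what the TensorFlow 2.x distilBERT entry above covers, the snippet below tokenizes text to a fixed sequence length (Neuron compilation needs static input shapes) and traces a HuggingFace distilBERT checkpoint with tfn.trace. The checkpoint name, sequence length, and example text are illustrative assumptions; see the tutorial for the full pipeline-based version.

    import tensorflow as tf
    import tensorflow.neuron as tfn  # tensorflow-neuron 2.x
    from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

    # Illustrative checkpoint; the tutorial may use a different one.
    name = "distilbert-base-uncased-finetuned-sst-2-english"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = TFAutoModelForSequenceClassification.from_pretrained(name)

    # Pad to a fixed length so the traced graph has static shapes for Neuron.
    example_inputs = tokenizer("Neuron tutorials are easy to follow",
                               padding="max_length", max_length=128,
                               truncation=True, return_tensors="tf")

    # Compile the model for Inferentia and run it on the same example batch.
    model_neuron = tfn.trace(model, example_inputs)
    print(model_neuron(example_inputs))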

Utilizing Neuron Capabilities

  • Tensorflow 1.x - NeuronCore Groups tutorial [html]

  • Tensorflow 1.x - TensorFlow Serving tutorial [html]

  • Tensorflow 1.x - NeuronCore Groups with TensorFlow Serving tutorial [html]
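
The NeuronCore Groups tutorials above revolve around partitioning the NeuronCores visible to a process so that several compiled models can serve inference side by side. The sketch below shows the general idea using the NEURONCORE_GROUP_SIZES environment variable with TensorFlow 1.x; the group layout and model paths are illustrative assumptions, and the tutorials cover the details, including the TensorFlow Serving variants.

    import os

    # Partition the visible NeuronCores before TensorFlow-Neuron initializes.
    # Illustrative layout: four groups of one NeuronCore each.
    os.environ["NEURONCORE_GROUP_SIZES"] = "1,1,1,1"

    import tensorflow as tf  # TensorFlow 1.x with the tensorflow-neuron package

    # Each Neuron-compiled SavedModel loaded in this process is placed onto the
    # next available NeuronCore group (directories here are placeholders).
    predictor_a = tf.contrib.predictor.from_saved_model("model_a_neuron")
    predictor_b = tf.contrib.predictor.from_saved_model("model_b_neuron")

    # The two models can now handle inference requests in parallel on separate groups.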