# Deploy machine learning models on mobile and edge devices

TensorFlow Lite is a lightweight library for deploying machine learning models on mobile devices, microcontrollers, and other edge devices.
## How it works

### Convert
Convert a TensorFlow model into a compressed flat buffer with the TensorFlow Lite Converter.
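A minimal sketch of the conversion step. The trivial `tf.Module` here is only a stand-in so the example is self-contained; any TensorFlow model (SavedModel, Keras model, or concrete functions) can be converted the same way.

```python
import tensorflow as tf

# Stand-in model: a single ReLU op wrapped in a tf.Module.
class TinyModel(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([1, 8], tf.float32)])
    def __call__(self, x):
        return tf.nn.relu(x)

m = TinyModel()

# Create a converter from the model's concrete function.
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [m.__call__.get_concrete_function()], m)

# convert() returns the serialized FlatBuffer as bytes.
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

For a model exported to disk, `tf.lite.TFLiteConverter.from_saved_model(path)` is the more common entry point.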
### Deploy
Take the compressed .tflite file and load it into a mobile or embedded device.
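A sketch of loading and running a `.tflite` flat buffer with the TFLite interpreter. To keep the example self-contained, a tiny model is converted in memory and passed via `model_content`; on a real device you would pass `model_path="model.tflite"` instead.

```python
import numpy as np
import tensorflow as tf

# Stand-in model (doubles its input) so the example runs end to end.
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([1, 4], tf.float32)])
    def __call__(self, x):
        return x * 2.0

m = Doubler()
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [m.__call__.get_concrete_function()], m)
tflite_model = converter.convert()

# Load the flat buffer and allocate input/output tensors.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed an input, run inference, read the result.
interpreter.set_tensor(inp["index"], np.ones((1, 4), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
```

On Android, iOS, and microcontrollers the same interpreter API is exposed through the platform-specific TensorFlow Lite runtimes.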
### Optimize
Quantize by converting 32-bit floats to more efficient 8-bit integers, or run the model on the GPU.
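A sketch of post-training quantization during conversion. Setting `tf.lite.Optimize.DEFAULT` quantizes weights to 8-bit integers; supplying a representative dataset additionally lets the converter calibrate activations for integer execution. The model and random calibration data here are illustrative stand-ins.

```python
import numpy as np
import tensorflow as tf

# Stand-in model to quantize.
class TinyModel(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([1, 8], tf.float32)])
    def __call__(self, x):
        return tf.nn.relu(x)

m = TinyModel()
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [m.__call__.get_concrete_function()], m)

# Enable post-training quantization (32-bit floats -> 8-bit integers).
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Representative samples let the converter calibrate activation ranges;
# random data stands in for real inputs here.
def representative_data():
    for _ in range(10):
        yield [np.random.rand(1, 8).astype(np.float32)]

converter.representative_dataset = representative_data
quantized_model = converter.convert()
```

The quantized flat buffer is typically several times smaller than the float version and runs faster on integer-only hardware.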
## Solutions to common problems
Explore optimized TF Lite models and on-device ML solutions for mobile and edge use cases.
![Image classification](https://www.tensorflow.org/static/site-assets/images/marketing/cards/lite_solution_image_classification.jpg)
Identify hundreds of objects, including people, activities, animals, plants, and places.
![Object detection](https://www.tensorflow.org/static/site-assets/images/marketing/cards/lite_solution_object_detection.jpg)
Detect multiple objects within an image, with bounding boxes.
![Question answering](https://www.tensorflow.org/static/site-assets/images/marketing/cards/lite_solution_question_answering.jpg)
Use BERT, a state-of-the-art natural language model, to answer questions based on the content of a given passage of text.