Google Coral System-on-Module (SoM) – 1GB [Discontinued]
A fully-integrated system for accelerated ML applications (includes CPU, GPU, Edge TPU, Wi-Fi, Bluetooth, and Secure Element), in a 40mm x 48mm pluggable module.
The Google Coral System-on-Module (SoM) is a fully-integrated system that helps you build embedded devices that demand fast machine learning (ML) inferencing. It contains NXP’s i.MX 8M system-on-chip (SoC), eMMC memory, LPDDR4 RAM, Wi-Fi, and Bluetooth, but its unique power comes from Google’s Edge TPU coprocessor for high-speed machine learning inferencing.
Provides a complete system
The Coral SoM is a fully-integrated Linux system that includes NXP’s i.MX 8M system-on-chip (SoC), eMMC memory, LPDDR4 RAM, Wi-Fi, and Bluetooth, plus the Edge TPU coprocessor for ML acceleration. It runs Mendel, a derivative of Debian Linux.
Performs high-speed ML inferencing
The on-board Edge TPU coprocessor is capable of performing 4 trillion operations (tera-operations) per second (TOPS), using 0.5 watts for each TOPS (2 TOPS per watt). For example, it can execute state-of-the-art mobile vision models such as MobileNet v2 at almost 400 FPS, in a power-efficient manner. See more performance benchmarks.
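As a minimal sketch of what on-device inferencing looks like with the PyCoral API: the snippet below runs a classification model on the Edge TPU. The model, labels, and image file names are placeholders, and the model is assumed to have already been compiled for the Edge TPU.

```python
# Minimal classification sketch using the PyCoral API.
# File names are placeholders; the model must already be compiled for the Edge TPU.
from PIL import Image

from pycoral.adapters import classify, common
from pycoral.utils.dataset import read_label_file
from pycoral.utils.edgetpu import make_interpreter

# Load the Edge TPU-compiled model and allocate tensors.
interpreter = make_interpreter('mobilenet_v2_1.0_224_quant_edgetpu.tflite')
interpreter.allocate_tensors()

# Resize the input image to the dimensions the model expects and set it as input.
size = common.input_size(interpreter)
image = Image.open('image.jpg').convert('RGB').resize(size, Image.LANCZOS)
common.set_input(interpreter, image)

# Run inference on the Edge TPU and print the top result.
interpreter.invoke()
labels = read_label_file('imagenet_labels.txt')
for c in classify.get_classes(interpreter, top_k=1):
    print('%s: %.5f' % (labels.get(c.id, c.id), c.score))
```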
Integrates with your custom hardware
The SoM connects to your own baseboard hardware with three 100-pin connectors.
Supports TensorFlow Lite
No need to build models from the ground up. TensorFlow Lite models can be compiled to run on the Edge TPU.
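As a sketch of that workflow, assuming a quantized TensorFlow Lite model has already been passed through the Edge TPU compiler (see Downloads below): the compiled model can be loaded with the standard TensorFlow Lite Python API and the libedgetpu delegate. The model file name here is a placeholder.

```python
import tflite_runtime.interpreter as tflite

# Load an Edge TPU-compiled TensorFlow Lite model; the libedgetpu delegate
# dispatches the compiled ops to the Edge TPU. The file name is a placeholder.
interpreter = tflite.Interpreter(
    model_path='your_model_edgetpu.tflite',
    experimental_delegates=[tflite.load_delegate('libedgetpu.so.1')])
interpreter.allocate_tensors()

# From here, inference follows the normal TensorFlow Lite flow:
# set the input tensor, call interpreter.invoke(), and read the outputs.
```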
Supports AutoML Vision Edge
Easily build and deploy fast, high-accuracy custom image classification models to your device with AutoML Vision Edge.
Also available with a baseboard as part of the Coral Dev Board.
Specifications
CPU | NXP i.MX 8M SoC (quad Cortex-A53, Cortex-M4F)
GPU | Integrated GC7000 Lite Graphics
ML accelerator | Google Edge TPU coprocessor: 4 TOPS (int8); 2 TOPS per watt
RAM | 1 GB LPDDR4
Flash memory | 8 GB eMMC
Wireless | Wi-Fi 2×2 MIMO (802.11b/g/n/ac, 2.4/5 GHz) and Bluetooth 4.2
Dimensions | 48 mm x 40 mm x 5 mm
Resources
Datasheet
- System-on-Module datasheet
Application notes
- Get started with the System-on-Module
Software guides
- Model compatibility on the Edge TPU
- Edge TPU inferencing overview
- Run multiple models with multiple Edge TPUs
- Pipeline a model with multiple Edge TPUs
API references
- PyCoral API (Python)
- Libcoral API (C++)
- Libedgetpu API (C++)
Downloads
- Edge TPU compiler
- Pre-compiled models
- All software downloads