
“We’ve also made it possible to use the Dev Board’s GPU to convert YUV to RGB pixel data at up to 130 frames per second at 1080p resolution, which is one to two orders of magnitude faster than on the Mendel Linux 3.0 release (Chef). These changes make it possible to run inferences with YUV-producing sources such as cameras and hardware video decoders,” said Carlos Mendonça, Product Manager, Coral Team.
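For readers curious about what that conversion involves, here is a minimal sketch of the standard per-pixel YUV-to-RGB arithmetic using BT.601 coefficients. The NumPy implementation and function name are illustrative assumptions, not the Coral code itself: on the Dev Board this work runs in GPU shaders, and the exact color matrix may differ.

```python
import numpy as np

# BT.601 full-range YUV -> RGB matrix (standard coefficients; the
# Dev Board's GPU pipeline may use a different variant).
YUV_TO_RGB = np.array([
    [1.0,  0.0,       1.402],
    [1.0, -0.344136, -0.714136],
    [1.0,  1.772,     0.0],
])

def yuv_to_rgb(yuv: np.ndarray) -> np.ndarray:
    """Convert an (H, W, 3) float YUV image to RGB.

    Y is in [0, 1]; U and V are centered at 0.5.
    """
    yuv = yuv.astype(np.float32)
    yuv[..., 1:] -= 0.5              # re-center the chroma channels
    rgb = yuv @ YUV_TO_RGB.T         # one 3x3 matrix multiply per pixel
    return np.clip(rgb, 0.0, 1.0)

# Example: convert a synthetic 1080p frame.
frame = np.random.rand(1080, 1920, 3).astype(np.float32)
print(yuv_to_rgb(frame).shape)       # (1080, 1920, 3)
```

Doing this one matrix multiply per pixel on a 1080p frame is exactly the kind of uniform, data-parallel work a GPU handles far faster than a CPU, which is where the one-to-two-orders-of-magnitude speedup comes from.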
Google also announced MediaPipe for Coral. Developers and researchers can prototype their real-time perception use cases by first building a MediaPipe graph on desktop, then quickly converting and deploying that same graph to the Coral Dev Board, where the quantized TensorFlow Lite model is accelerated by the Edge TPU.
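For context on that final step, below is a minimal sketch of how a quantized TensorFlow Lite model is typically attached to the Edge TPU using the tflite_runtime delegate API. The model filename is a placeholder, and MediaPipe wires this up internally through its graph rather than through application code like this.

```python
import tflite_runtime.interpreter as tflite

EDGETPU_DELEGATE = "libedgetpu.so.1"       # Edge TPU runtime library on Linux
MODEL_PATH = "model_quant_edgetpu.tflite"  # placeholder: a model compiled for the Edge TPU

# Load the quantized model and attach the Edge TPU delegate so the
# compiled ops run on the accelerator instead of the CPU.
interpreter = tflite.Interpreter(
    model_path=MODEL_PATH,
    experimental_delegates=[tflite.load_delegate(EDGETPU_DELEGATE)],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
print(input_details[0]["shape"])  # e.g. [1, 300, 300, 3] for a detection model
```

Only models compiled for the Edge TPU benefit from the delegate; any unsupported ops fall back to the CPU.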
As part of this first release, MediaPipe is making available new experimental samples for both object and face detection, with support for the Coral Dev Board and SoM.
Google said the source code and instructions for compiling and running each sample are available on GitHub and on the MediaPipe documentation site.