Avnet PSOC Edge DEEPCRAFT Baby Monitor

This demo project is the integration of Infineon's PSOC™ Edge MCU DEEPCRAFT™ Deploy Audio project with Avnet /IOTCONNECT ModusToolbox SDK. For more details, see the README on GitHub.

Avnet PSOC Edge DEEPCRAFT Motion

This demo project is the integration of Infineon's PSOC™ Edge MCU DEEPCRAFT™ Deploy Motion project with Avnet /IOTCONNECT ModusToolbox SDK. For more details, see the README on GitHub.

Avnet PSOC Edge DEEPCRAFT Ready Models

This demo project is the integration of Infineon's PSOC™ Edge MCU DEEPCRAFT™ Ready Model Deployment project with Avnet /IOTCONNECT ModusToolbox SDK. For more details, see the README on GitHub.

PSOC Edge DEEPCRAFT Rock Paper Scissors Game

This code example demonstrates rock, paper, scissors, a hand game in which two players simultaneously show one of three hand gestures: a fist for rock, a flat hand for paper, or two fingers for scissors. For more details, see the README on GitHub.

PSOC Edge Machine Learning AI Hub Deploy Vision

This code example demonstrates real-time human segmentation using ModusToolbox™: a USB camera captures live video, and a person segmentation model from the DEEPCRAFT™ Model Zoo for PSOC™ detects people in the video feed. Each detected person is highlighted with a bounding box, and the corresponding text is shown in a text box on the display. For more details, see the README on GitHub.
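
As a rough sketch of the overlay step described above, the snippet below draws a rectangle outline around a detection. The framebuffer layout (RGB565, row-major) and the detection_t structure are assumptions for illustration only; the actual code example's display and drawing APIs may differ.

```c
#include <stdint.h>

/* Hypothetical detection result: pixel coordinates of the detected region. */
typedef struct {
    uint16_t x, y, w, h;   /* top-left corner and size of the bounding box */
} detection_t;

/* Draw a one-pixel rectangle outline into an assumed RGB565, row-major
 * framebuffer of fb_w x fb_h pixels. */
static void draw_bbox(uint16_t *fb, uint16_t fb_w, uint16_t fb_h,
                      const detection_t *det, uint16_t color)
{
    if ((det->x >= fb_w) || (det->y >= fb_h)) {
        return;                                  /* box starts off-screen */
    }

    uint32_t x_end = (uint32_t)det->x + det->w;
    uint32_t y_end = (uint32_t)det->y + det->h;
    if (x_end >= fb_w) { x_end = fb_w - 1u; }    /* clamp to the framebuffer */
    if (y_end >= fb_h) { y_end = fb_h - 1u; }

    for (uint32_t x = det->x; x <= x_end; x++) {
        fb[(uint32_t)det->y * fb_w + x] = color; /* top edge    */
        fb[y_end * fb_w + x] = color;            /* bottom edge */
    }
    for (uint32_t y = det->y; y <= y_end; y++) {
        fb[y * fb_w + det->x] = color;           /* left edge   */
        fb[y * fb_w + x_end] = color;            /* right edge  */
    }
}
```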

PSOC Edge Machine Learning DEEPCRAFT Data Collection

This code example demonstrates how to collect data from PSOC™ Edge E84 MCU boards using DEEPCRAFT™ Studio. The example implements the DEEPCRAFT™ streaming protocol v2, allowing sensor data and other information to be streamed from the board into DEEPCRAFT™ Studio over USB for development and testing of Edge AI models. For more details, see the README on GitHub.
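
The sketch below illustrates only the general idea of framing a sensor sample and sending it to the host over a USB virtual COM port. The frame layout, the sync byte, and the usb_cdc_write() transport are hypothetical placeholders; the actual DEEPCRAFT™ streaming protocol v2 framing is defined by DEEPCRAFT™ Studio and the code example itself.

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Placeholder transport: in a real project this would be the board's
 * USB CDC (virtual COM port) transmit routine. */
extern int usb_cdc_write(const uint8_t *buf, size_t len);

/* Hypothetical, simplified frame: sync byte, channel id, payload length,
 * then raw samples in native (little-endian on Cortex-M) byte order.
 * The real streaming protocol v2 framing differs and is defined by the
 * code example and DEEPCRAFT(TM) Studio. */
static int stream_imu_sample(uint8_t channel, const int16_t axes[3])
{
    uint8_t frame[3 + 3 * sizeof(int16_t)];

    frame[0] = 0xA5;                    /* sync / start-of-frame marker */
    frame[1] = channel;                 /* logical sensor channel       */
    frame[2] = 3 * sizeof(int16_t);     /* payload length in bytes      */
    memcpy(&frame[3], axes, 3 * sizeof(int16_t));

    return usb_cdc_write(frame, sizeof(frame));
}
```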

PSOC Edge Machine Learning DEEPCRAFT Deploy Audio

This code example demonstrates how to deploy a Machine Learning (ML) model for audio generated from DEEPCRAFT™ Studio on the PSOC™ Edge MCU. It comes pre-configured with the Baby Cry Detection Starter Model from DEEPCRAFT™ Studio, which uses data from the PDM microphone to detect whether a baby is crying. New models based on the microphone audio can be dropped into the project as-is. The model can run on either the Arm® Cortex® M33 (CM33) or Cortex® M55 (CM55) CPU, along with their accelerators. For more details, see the README on GitHub.
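
DEEPCRAFT™ Studio exports the model as a model.c/model.h pair that is dropped into the project. The loop below is a minimal sketch of how such a model is typically driven: the IMAI_* names and sizes follow the interface commonly emitted by DEEPCRAFT™ Studio, but the exact symbols, signatures, and return codes should be taken from the generated model.h, and pdm_pcm_read_frame() is a placeholder for the board's audio capture.

```c
#include <stdio.h>
#include <stddef.h>
#include "model.h"   /* generated by DEEPCRAFT(TM) Studio; swap in a new model.c/.h to change models */

/* Placeholder for the board's PDM/PCM capture: fills one frame of audio
 * samples sized and scaled as the model expects. */
extern void pdm_pcm_read_frame(float *samples, size_t count);

void audio_inference_loop(void)
{
    /* IMAI_* names follow the interface typically emitted by DEEPCRAFT Studio;
     * check the generated model.h for the exact symbols, sizes, and return codes. */
    static float data_in[IMAI_DATA_IN_COUNT];
    static float data_out[IMAI_DATA_OUT_COUNT];

    IMAI_init();

    for (;;)
    {
        pdm_pcm_read_frame(data_in, IMAI_DATA_IN_COUNT);
        IMAI_enqueue(data_in);              /* push one frame of audio into the model */

        if (IMAI_dequeue(data_out) == 0)    /* assumed: 0 means a new prediction is ready */
        {
            /* Which output index corresponds to "baby crying" is model-specific. */
            printf("baby cry score: %.3f\r\n", data_out[1]);
        }
    }
}
```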

PSOC Edge Machine Learning DEEPCRAFT Deploy Motion

This code example demonstrates how to deploy a Machine Learning (ML) model for motion generated from DEEPCRAFT™ Studio on the PSOC™ Edge MCU. It comes pre-configured with the Human Activity Detection Starter Model from DEEPCRAFT™ Studio, which uses data from the BMI270 IMU to detect motions such as standing, running, walking, sitting, and jumping. New models based on the IMU can be dropped into the project as-is. The model can run on either the Arm® Cortex® M33 (CM33) or Cortex® M55 (CM55) CPU, along with their accelerators. For more details, see the README on GitHub.
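
Once the generated model produces an output vector, the application still has to map the scores to an activity. The snippet below is a minimal sketch of that step; the label names, their order, and the confidence threshold are hypothetical and would come from the generated model files in the real project.

```c
#include <stddef.h>

/* Hypothetical label order; in the real project this comes from the
 * generated model files (e.g., a label list in model.h). */
static const char *const activity_labels[] = {
    "standing", "running", "walking", "sitting", "jumping"
};
#define ACTIVITY_COUNT (sizeof(activity_labels) / sizeof(activity_labels[0]))

/* Pick the most likely activity from the model's output scores, reporting
 * "unknown" when no class clears the confidence threshold. */
static const char *classify_activity(const float *scores, float threshold)
{
    size_t best = 0;
    for (size_t i = 1; i < ACTIVITY_COUNT; i++) {
        if (scores[i] > scores[best]) {
            best = i;
        }
    }
    return (scores[best] >= threshold) ? activity_labels[best] : "unknown";
}
```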

PSOC Edge Machine Learning DEEPCRAFT Deploy Radar

This code example demonstrates how to deploy a DEEPCRAFT™-generated machine learning model. It comes pre-configured with a radar-based model generated within DEEPCRAFT™ Studio. The code example collects radar data from the XENSIV™ 60 GHz radar sensor, which is then sent to the machine learning model to detect specific gestures (push and circle). It uses the model.c/h files generated within DEEPCRAFT™ Studio directly. New models based on the Gesture Detection project can be dropped into the project as-is. For more details, see the README on GitHub.

PSOC Edge Machine Learning DEEPCRAFT Deploy Ready Model

This code example demonstrates how to integrate a ready model library from DEEPCRAFT™ Studio on the PSOC™ Edge MCU. The code example includes five models: four detect sounds (baby cry, cough, alarm, and siren) using audio from the pulse-density modulation (PDM) to pulse-code modulation (PCM) converter, which is sent to the model for detection, and the fifth detects hand gestures using data from the XENSIV™ radar sensor. For more details, see the README on GitHub.

PSOC Edge Machine Learning DEEPCRAFT Deploy Vision

This code example demonstrates real-time hand gesture detection using ModusToolbox™: a USB camera captures live video, and a DEEPCRAFT™ Studio object detection model detects hand gestures (rock, paper, or scissors) in the video feed. Each detected gesture is highlighted with a bounding box, and the corresponding text (rock, paper, or scissors) is shown in a text box on the display. For more details, see the README on GitHub.

PSOC Edge Machine Learning DEEPCRAFT Profiler

This code example demonstrates how to use the DEEPCRAFT™ development flow on the PSOC™ Edge MCU, in which a pre-trained neural network (NN) model is profiled and validated on the target device. For more details, see the README on GitHub.

PSOC Edge Machine Learning Face ID Demo

This code example demonstrates Infineon's comprehensive solution for real-time Face ID, using a USB camera and a MIPI DSI display interfaced with the PSOC™ Edge MCU. It supports on-device face enrolment as well as recognition of enrolled users through the camera. Detected faces are highlighted with a bounding box and labeled with the corresponding enrolled user ID, or "unknown" for a non-enrolled user, on the live video streamed through the USB camera. For more details, see the README on GitHub.

PSOC Edge Machine Learning Profiler

This code example demonstrates how to use the ModusToolbox™ Machine Learning (ModusToolbox™-ML) development flow on the PSOC™ Edge MCU, where the user has a pre-trained neural network (NN) model that can be profiled and validated on the PC and on the target device. The user can import a pre-trained model using the ModusToolbox™-ML Configurator tool, create an embedded, optimized version of the model, and validate its performance on the PC. The validation data files can then be integrated with this code example or streamed to the device (the default option), so the same validation data can be used to profile performance once the ML model is deployed on the PSOC™ Edge MCU. The user can also select where the model is deployed and run. For more details, see the README on GitHub.