Event-Based Machine Vision Applications Toolkit Launched

Prophesee SA, inventor of advanced neuromorphic vision systems, has announced the availability of its Metavision Intelligence Suite. The suite offers a comprehensive set of development tools for accelerating the process of exploring and implementing Event-Based Vision technology in machine vision systems. The three components of the suite – Player, Designer and SDK – are each aimed at different stages of the design process and provide engineers and software developers with a means to easily iterate and customize designs that fully leverage the efficiency and performance of Event-Based Vision.

The Event-Based Vision approach to machine vision opens a new vision category that significantly reduces power, latency and data-processing requirements while revealing what was previously invisible to traditional frame-based sensors. Prophesee’s patented Metavision sensors and algorithms mimic how the human eye and brain work to dramatically improve efficiency in areas such as industrial automation, IoT, and AR/VR.

In total, the suite consists of 62 algorithms, 54 code samples and 11 ready-to-use applications. It provides users with both C++ and Python APIs as well as extensive documentation and a wide range of samples organized by increasing difficulty to incrementally introduce the fundamental concepts of event-based machine vision. Drawn from more than 5 years of production experience working with developers, this suite of tools fully enables users to experiment and quickly iterate to develop their own applications and products.
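As an illustration, the snippet below sketches how a recorded event stream might be read and sliced into time batches through the suite's Python API. The module path (metavision_core.event_io), the EventsIterator class, its parameters and the recording file name are assumptions based on Prophesee's public documentation, not a verified excerpt from the suite.

# Minimal sketch, assuming a Python API in the style of Prophesee's published
# documentation; EventsIterator, its parameters and the file name are assumptions.
from metavision_core.event_io import EventsIterator

# Slice the continuous event stream into 10 ms (10,000 microsecond) batches.
mv_iterator = EventsIterator(input_path="recording.raw", delta_t=10000)

for events in mv_iterator:
    # Each batch is a structured array with per-event fields:
    # x, y (pixel coordinates), p (polarity) and t (timestamp in microseconds).
    if events.size > 0:
        print(f"{events.size} events between t={events['t'][0]} us and t={events['t'][-1]} us")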

With this advanced toolkit, engineers can easily develop machine vision applications for a wide range of markets, including industrial automation, IoT, surveillance, mobile, medical, automotive and more. Provided “plug-and-play” algorithms include high-speed counting, vibration monitoring, spatter monitoring, object tracking, optical flow, ultra-slow-motion, machine learning and others.

Frame-Based – A conventional camera captures frames at a fixed rate, usually around 30 fps, in which all pixels record in synchrony regardless of what is happening in the scene.

Event-Based – In Prophesee Metavision, each pixel is independent and only records when it senses a change or movement. The information created does not arrive frame by frame. Rather, movement is captured as a continuous stream of information. Prophesee sees between the frames, where all traditional frame-based systems are blind.
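To make the distinction concrete, here is a minimal, self-contained Python sketch of the two data models. It is purely illustrative and does not reflect Prophesee's internal data structures or APIs.

from dataclasses import dataclass

@dataclass
class Event:
    x: int         # pixel column that detected the change
    y: int         # pixel row that detected the change
    polarity: int  # +1 for a brightness increase, -1 for a decrease
    t_us: int      # timestamp of the change, in microseconds

# Frame-based: every pixel is sampled in lockstep at a fixed rate
# (e.g. every 33 ms at 30 fps), whether or not the scene has changed.
def frame_based_capture(read_full_frame, fps=30, duration_s=1.0):
    period_us = int(1_000_000 / fps)
    return [read_full_frame(t_us)  # a full 2-D array of all pixel values
            for t_us in range(0, int(duration_s * 1_000_000), period_us)]

# Event-based: each pixel reports asynchronously, only when it senses a change,
# so the output is a sparse, continuous stream of events "between the frames".
def event_based_capture(pixel_changes):
    for x, y, polarity, t_us in pixel_changes:  # arrives in timestamp order
        yield Event(x, y, polarity, t_us)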

The software suite is compatible with Prophesee Metavision sensors and Evaluation Kits as well as compatible partners’ products such as Century Arks’ SilkyEvCam. This gives system solution providers a powerful design environment for building end-to-end product solutions and enables rapid time-to-market product deployments.

“We understand the importance of enabling the development ecosystem around event-based vision technology. This software toolkit is meant to accelerate engineers’ ability to take advantage of its unique benefits without having to start from scratch,” said Luca Verre, CEO and co-founder of Prophesee. “The tools offer productivity and learning features that are valuable regardless of where a development team is on the adoption curve of event-based vision and will jumpstart design projects with production ready design aids.”

The Metavision Intelligence Suite is available as both a time-unlimited free trial and a professional version, which provides access to source code, advanced modules, revision updates, full documentation and support.

For more information: www.prophesee.ai
