MathWorks Introduces Sensor Fusion and Tracking Toolbox

MathWorks extends MATLAB workflows to help engineers design, simulate, and analyze systems that fuse data from multiple sensors

Multiplatform radar detection generation capabilities in Sensor Fusion and Tracking Toolbox. Image courtesy MathWorks.

MathWorks unveils Sensor Fusion and Tracking Toolbox, now available as part of Release 2018b. The toolbox extends MATLAB-based workflows to help engineers develop accurate perception algorithms for autonomous systems.

The new toolbox equips engineers working on autonomous systems in aerospace and defence, automotive, consumer electronics, and other industries with algorithms and tools to maintain position, orientation, and situational awareness.

Engineers working on the perception stage of autonomous system development need to fuse inputs from various sensors to estimate the position of objects around these systems. Now, researchers, developers, and enthusiasts can use algorithms for localization and tracking, along with reference examples within the toolbox, as a starting point to implement components of airborne, ground-based, shipborne, and underwater surveillance, navigation, and autonomous systems. The toolbox provides a flexible and reusable environment that can be shared across developers. It provides capabilities to simulate sensor detections, perform localization, test sensor fusion architectures, and evaluate tracking results.
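
To give a feel for this workflow, the following minimal MATLAB sketch (an assumed usage pattern, not code taken from the announcement) builds a global nearest-neighbor multi-object tracker with the toolbox's trackerGNN, feeds it two hand-made objectDetection measurements, and retrieves the resulting tracks:

    % Minimal sketch of multi-object tracking with the toolbox.
    % trackerGNN, objectDetection, and initcvekf are toolbox functions;
    % the detection values below are made up for illustration.
    tracker = trackerGNN('FilterInitializationFcn', @initcvekf);

    % Two synthetic position measurements at t = 0 s ([x; y; z] in meters)
    detections = {objectDetection(0, [10; 0; 0]); ...
                  objectDetection(0, [0; 20; 0])};

    % Step the tracker once; in practice this call runs inside a loop over
    % detection times, with detections coming from sensor models or real data
    [confirmedTracks, tentativeTracks, allTracks] = tracker(detections, 0);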

“Algorithm designers working on tracking and navigation systems often use in-house tools that may be difficult to maintain and reuse,” said Paul Barnard, Marketing Director – Design Automation, MathWorks. “With Sensor Fusion and Tracking Toolbox, engineers can explore multiple designs and perform ‘what-if analysis’ without writing custom libraries. They can also simulate fusion architectures in software that can be shared across teams and organizations.”

Sensor Fusion and Tracking Toolbox Includes:

  • Algorithms and tools to design, simulate, and analyze systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness
  • Reference examples that provide a starting point for airborne, ground-based, shipborne, and underwater surveillance, navigation, and autonomous systems
  • Multi-object trackers, sensor fusion filters, motion and sensor models, and data association algorithms that can be used to evaluate fusion architectures using real and synthetic data
  • Scenario and trajectory generation tools
  • Synthetic data generation for active and passive sensors, including RF, acoustic, EO/IR, and GPS/IMU sensors (see the brief sketch after this list)
  • Standard benchmarks, metrics, and animated plots for evaluating system accuracy and performance
  • Deployment options for simulation acceleration or desktop prototyping using C-code generation
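
As an illustration of the scenario generation, synthetic sensor data, and fusion capabilities listed above, the following minimal MATLAB sketch is an assumed usage pattern rather than code from the announcement: it generates a waypoint trajectory, produces synthetic accelerometer and gyroscope readings with imuSensor, and fuses them into an orientation estimate with imufilter. The waypoints and timing are made up for illustration.

    % Minimal sketch (assumed usage): trajectory generation, synthetic IMU
    % data, and orientation fusion with the toolbox.
    fs   = 100;                                          % sample rate in Hz
    traj = waypointTrajectory([0 0 0; 10 0 0; 10 10 0], [0 5 10], ...
                              'SampleRate', fs);         % ground-truth motion
    imu  = imuSensor('accel-gyro', 'SampleRate', fs);    % synthetic IMU readings
    fuse = imufilter('SampleRate', fs);                  % accel + gyro fusion

    while ~isDone(traj)
        [~, orient, ~, acc, angvel] = traj();            % sample the trajectory
        [accelReading, gyroReading] = imu(acc, angvel, orient);
        estOrient = fuse(accelReading, gyroReading);     % estimated orientation
    end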



Niloy Banerjee

A movie buff and print-journalism professional serving editorial verticals in the technical and B2B segments, a roving writer on business happenings, a player of both physical and digital games in his spare time, with a perennial love of philosophy and a habit of collecting pebbles from the ocean of literature. Lastly, a connoisseur of making and eating palatable cuisines.
