Algolux announces Ion, a development platform for autonomous vision systems


Algolux, founded in 2015, isn’t exactly a household name in the already crowded world of automotive computer vision. But the Quebec-based startup has generated real interest among investors, raising $13.4 million to date, including a $10 million Series A led by General Motors Ventures last May. Not bad, given that it has remained a virtual unknown until now.

Today, Algolux is unveiling Ion, a platform that gives companies a set of tools and an embedded software stack to help them build their own perception systems. It’s essentially a plug-and-play solution, a departure from the common approach today, in which companies are confined to siloed systems that often don’t integrate easily with one another.

Algolux’s system brings the company’s machine learning and computer vision technologies to users looking to build an end-to-end solution, incorporating regulations from various governing bodies as well as safety features designed to help systems operate in tricky environments.

The company says Ion offers the opportunity to build more traditional systems, or “radical new designs.” The capability is applicable to any sensor type, processor type and perception task. Ion relies on Eos, a deep neural network, and Atlas, a collection of modules designed for camera tuning, offering developers a mix-and-match approach based on their individual needs.

In a statement to TechCrunch, VP Dave Tokic notes that the key differentiator between the company and its competition is a kind of brand agnosticism that lets companies use different products for different needs and keep costs down.

“Our Ion Platform consists of tools (Atlas) and embedded software stacks (Eos) to uniquely provide an end-to-end approach to teams building perception systems,” he tells TechCrunch. “This allows the team to optimize and apply deep learning across both sensing and perception (even up to planning and control) for significantly better performance and to break down today’s design process silos. This capability is applicable to any sensor type, processor type, and perception task.”

