This undated photo shows the brain-inspired complementary vision chip "Tianmouc" created by a group of Chinese scientists. (Tsinghua University/Handout via Xinhua)
BEIJING, May 30 (Xinhua) -- A team of Chinese scientists has developed the world's first brain-inspired complementary vision chip, endowing machines with human-like visual perception.
AI-driven visual perception is laying the groundwork for a transformative technological revolution, particularly in unmanned systems such as autonomous driving.
However, achieving efficient, accurate, and resilient visual perception in environments that are dynamic, varied, and inherently unpredictable continues to be a formidable challenge.
Drawing inspiration from the principles of the human visual system, the researchers from Tsinghua University crafted a vision-sensing paradigm that breaks visual information down into fundamental, primitive-based representations.
By combining these primitives, the technology emulates the features of the human visual system and establishes two complementary visual perception pathways, one cognition-oriented and one action-oriented, according to the study published as a cover story in the journal Nature on Thursday.
Building on this foundation, the team has engineered the "Tianmouc" vision chip, which is capable of high-speed visual information acquisition at a rate of 10,000 frames per second with 10-bit precision.
The chip also achieves a 90 percent reduction in bandwidth and operates with low power consumption, according to the study.
Utilizing the chip, the researchers designed high-performance software and algorithms and validated the system on a vehicle-mounted platform operating in diverse open environments.
The system has exhibited low-latency, high-performance real-time perception capabilities, highlighting its immense potential for deployment in the field of intelligent unmanned systems.
The Tianmouc chip paves a new path for pivotal applications such as autonomous driving and embodied intelligence, said Zhao Rong, one of the paper's corresponding authors.