An event camera, also known as a neuromorphic camera, silicon retina, or dynamic vision sensor, is an imaging sensor that responds to small local changes in luminance. Unlike conventional (frame) cameras, event cameras do not use a shutter to capture images.
Edge devices such as smartphones currently have to delegate computationally intensive tasks to a cloud-based system, which processes the query and returns the result to the device. With neuromorphic systems, that processing could be done on the device itself rather than being passed back and forth.
The Global Neuromorphic smartphone imaging market accounted for $XX Billion in 2022 and is anticipated to reach $XX Billion by 2030, registering a CAGR of XX% from 2023 to 2030.
Qualcomm and Prophesee have announced a collaboration on neuromorphic smartphone imaging. It will enable OEMs to take advantage of the speed, efficiency, and quality of event-based vision in mobile devices, greatly improving camera performance in low-light and fast-moving scenes.
Instead of every pixel sampling the scene at a fixed rate, each pixel operates independently and asynchronously at extremely high speed, reporting only changes in the scene (such as variations in brightness) as they happen. The sensor's acquisition rate therefore always matches the dynamics of the scene, so it never collects redundant data when nothing changes.
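The per-pixel behavior described above can be illustrated with the standard event-generation model used in the event-camera literature: a pixel emits an event whenever its log-intensity has changed by more than a contrast threshold since the last event. This is a simplified sketch for illustration, not Prophesee's actual pixel circuit; the threshold value and function names here are assumptions.

```python
import math

# Illustrative contrast threshold (an assumed value, not a real sensor spec).
CONTRAST_THRESHOLD = 0.2

def generate_events(samples, threshold=CONTRAST_THRESHOLD):
    """Simulate one event-camera pixel.

    samples: list of (timestamp, intensity) readings for a single pixel.
    Returns a list of (timestamp, polarity) events, where polarity is
    +1 for a brightness increase and -1 for a decrease.
    """
    events = []
    ref = math.log(samples[0][1])  # log-intensity at the last emitted event
    for t, intensity in samples[1:]:
        log_i = math.log(intensity)
        # Emit one event per threshold crossing; a large jump can
        # produce several events at the same timestamp.
        while abs(log_i - ref) >= threshold:
            polarity = 1 if log_i > ref else -1
            events.append((t, polarity))
            ref += polarity * threshold
        # If the pixel's brightness did not change enough, no event is
        # emitted -- this is why a static scene produces no data.
    return events

# A static pixel produces no events; a brightening pixel produces +1 events.
static = [(t, 100.0) for t in range(5)]
brightening = [(0, 100.0), (1, 150.0), (2, 300.0)]
print(generate_events(static))       # []
print(generate_events(brightening))  # only +1 polarity events
```

The key property the sketch demonstrates is data sparsity: the static input yields an empty event stream, while only the changing input generates output, which is the source of the power and bandwidth savings discussed below.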
This new category of vision sensor significantly reduces power, latency, and data-processing requirements, achieving an acquisition-speed-to-power trade-off up to three orders of magnitude better than that of traditional imaging technologies.
As a result, event-based sensors are proving dramatically more efficient across a variety of applications, including industrial automation and monitoring, mobility, medicine, and AR/VR.
Prophesee’s own Metavision sensor is expected to bring these advantages to smartphone cameras as a result of the new collaboration; the company is currently working on a development kit to support the integration of its sensors into products based on next-generation Snapdragon platforms.
With the goal of integrating event-based vision into the Snapdragon ecosystem, Prophesee says it is thrilled to be collaborating with the maker of one of the most widely used mobile platforms worldwide. Product developers will be able to significantly improve the user experience with cameras that deliver image quality and operational performance not achievable with conventional frame-based techniques alone.