The human eye is a wonderful nightmare. With it, we can see the world...but only a very tiny sliver of the world. Namely, the part that exists right in front of us. We can't see to the sides, behind us, or around a corner (or around any other object, for that matter). What's more, the things that we can see are limited to a very narrow band of the electromagnetic spectrum. We can't see heat or cold. We can't see radio signals or many kinds of energy released by our Sun. We can't see things at a distance.
In short, our vision is a sense so limited that, as author David Wong notes, "we might as well not have it."
Yet, for almost all people, sight is the primary mode of analyzing and understanding the world. We've evolved as a species to depend on it, and sadly, other senses and forms of perception have atrophied. As Wong concludes, "we have wound up with the utterly mad and often fatal delusion that if we can't see something, it doesn't exist."
Case in point: If seeing is believing, then what does this mean when it comes to combating climate change, ending the ongoing extinction crisis, or fixing problems related to food waste and contamination? After all, carbon dioxide is invisible. We can’t look to the sky and see it proliferating throughout the atmosphere and smothering our planet. Likewise, changes to species occur over decades or even centuries. It’s not something that we actually see happening as we go about our day-to-day lives. And in many instances, it is near impossible to tell if food is contaminated or outdated just by looking at it.
Abi Ramanan, Founder & CEO of ImpactVision, started her company in order to use a new way of seeing to help solve some of these problems. The first problem that Ramanan and her team are tackling is global food waste. Their weapon of choice? Hyperspectral imaging.
Hyperspectral imaging was originally developed for NASA. It’s a technology that allows scientists and researchers to investigate the spectra that are produced when matter interacts with parts of the electromagnetic spectrum. Put simply, hyperspectral imaging collects and processes how a particular object reflects light. This allows us to analyze and better understand the object’s chemical composition and identify differences in the signatures of similar objects, even when these differences are not detectable by the human eye.
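To make the idea concrete, here is a minimal sketch of one common way to compare spectral signatures: the spectral angle between two per-band reflectance profiles. The band values below are hypothetical and purely illustrative; real hyperspectral sensors capture hundreds of bands, and this is not ImpactVision's actual algorithm.

```python
import math

def spectral_angle(a, b):
    """Angle (radians) between two reflectance spectra.

    A smaller angle means more similar signatures. This is the classic
    "spectral angle mapper" similarity used in hyperspectral analysis;
    inputs are per-band reflectance values in [0, 1].
    """
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (norm_a * norm_b))))

# Hypothetical 5-band signatures for fresh vs. aged produce.
fresh = [0.12, 0.18, 0.35, 0.60, 0.55]
aged = [0.15, 0.20, 0.30, 0.45, 0.40]

angle = spectral_angle(fresh, aged)  # > 0: the signatures differ
```

Because the angle compares the shape of a spectrum rather than its overall brightness, it is relatively insensitive to lighting variation, which is one reason this style of similarity is popular in remote sensing.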
By combining hyperspectral imaging with machine learning technologies, ImpactVision created a system that allows food companies to scan food, analyze the unique spectral reflections of the light, and determine nutritional content as well as the level of freshness or contamination. What's more, the system operates without human input or interference. Ramanan asserts that this automated method is ultimately a vast improvement on current practice, in which human laborers either examine food by sight, an imperfect and error-prone approach, or take physical samples using various probing methods, which damages the food.
Ultimately, ImpactVision hopes to improve how we assess food and deepen our understanding of the supply chain in order to identify issues and inefficiencies in our current systems. “Soon, we will have to feed a population of 10 billion people. That means that we need to conserve resources and improve the way that food is distributed through the supply chain,” Ramanan said.
To learn more about how ImpactVision is helping end our food waste problem, and to uncover the future uses for this technology, see Ramanan’s talk from the 2019 Boma Grow Summit.