Teaching AI the scent of cancer
With Akash Kulgod
Understanding olfaction is a powerful way to understand sensory information processing more broadly: the Berkeley neuroscientist Walter Freeman argued that the other sensory systems "moved in and co-opted the [olfactory] operating system, changing the details but taking advantage of the main thrust of the dynamics" (Freeman, 2000). Yet artificial intelligence struggles with odors, which resist parameterization along a single physical dimension such as wavelength or frequency. Odor percepts arise from a combinatorial encoding of molecules by olfactory receptors, and the complexity of this structure-to-percept relationship is exemplified by Sell's triplets, in which structurally similar molecules produce dissimilar percepts, and vice versa. This challenge underlies the limitations of current machine olfaction systems, such as gas-sensor arrays for cancer detection, which fail to generalize reliably. We propose a percept-first approach: trained biomedical detection dogs wearing EEG-based BCIs can serve as perceptual labelers, providing high-dimensional neurobehavioral data from which the mapping between molecular patterns and percepts can be reverse-engineered, transforming machine olfaction.
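To make the percept-first framing concrete, the sketch below shows one minimal way such data could be used: a supervised model that predicts a dog-derived percept label (e.g. alert vs. no alert on a sample) from molecular descriptors of the odorant. Everything here is illustrative and assumed, not the actual pipeline: the data are synthetic, the descriptor count and label scheme are placeholders, and the random-forest classifier is just one plausible baseline.

```python
# Illustrative sketch only: learn a mapping from molecular descriptors to
# percept labels supplied by trained detection dogs. All data are synthetic;
# the feature dimensionality, label scheme, and model choice are assumptions.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical dataset: one row per odorant sample, columns are
# physicochemical descriptors of the constituent molecules.
n_samples, n_descriptors = 200, 32
X = rng.normal(size=(n_samples, n_descriptors))

# Hypothetical percept labels aggregated from dog behavior and EEG-based
# BCI readouts (1 = alert / cancer-positive percept, 0 = no alert).
y = rng.integers(0, 2, size=n_samples)

# Baseline classifier standing in for whatever model would actually be
# fit to the neurobehavioral labels.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

On synthetic labels the score hovers around chance; the point is only to show where dog-derived percept labels would enter a machine olfaction training loop, replacing labels derived from sensor calibration alone.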