The Active Sensing Thesis
Every non-invasive BCI listens. What changes if we start asking instead?
lobirus
The Listening Problem
Every non-invasive BCI built to date is a passive receiver. EEG reads electrical fields at the scalp. MEG reads magnetic fields. fNIRS measures blood oxygenation via near-infrared light. fMRI tracks hemodynamic responses. The instruments differ. The paradigm is the same: put a sensor on the outside and try to pick up whatever the brain is emitting.
After decades of work, the results are consistent. EEG-based BCIs achieve roughly 5 to 25 bits per minute for spelling tasks[1]. For context, that is slower than blinking in Morse code. Invasive implants (BrainGate, Neuralink) do dramatically better because they sit directly on cortical tissue and read individual neurons. The skull is the problem. It acts as a low-pass spatial filter, smearing signals from billions of neurons into an unresolvable blur by the time they reach the scalp[2].
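The 5 to 25 bits per minute figure is not hand-waving; it follows from the Wolpaw information transfer rate used throughout the BCI literature[1]. A minimal sketch (the speller parameters below are illustrative, not taken from any specific study):

```python
import math

def wolpaw_itr(n_targets: int, accuracy: float, selections_per_min: float) -> float:
    """Wolpaw information transfer rate in bits per minute.

    n_targets: number of possible selections (e.g. cells on a speller grid)
    accuracy: probability of a correct selection, in (1/n_targets, 1]
    selections_per_min: how many selections the interface makes per minute
    """
    n, p = n_targets, accuracy
    if p >= 1.0:
        bits_per_selection = math.log2(n)
    else:
        bits_per_selection = (
            math.log2(n)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1))
        )
    return bits_per_selection * selections_per_min

# A typical P300 speller: 36-cell grid, 90% accuracy, ~4 selections/min
print(round(wolpaw_itr(36, 0.90, 4.0), 1))  # → 16.8
```

Even a well-tuned speller lands squarely inside that 5-25 bits/min band; the ceiling is the signal, not the decoder.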
The field's response has been to compensate computationally. More electrodes. Better amplifiers. Smarter decoders. Tang et al. (2023) at UT Austin showed that large language models used as Bayesian priors can reconstruct continuous language from fMRI signals with surprising accuracy[3]. Impressive work. But the underlying assumption is unchanged: the signal is fixed, the skull degrades it, and our job is to recover what we can from the wreckage.
I think the assumption is wrong. Not the physics of it. The framing.
Stop Listening. Start Asking.
There is a fundamental difference between passive detection and active sensing. Passive detection means sitting still and hoping to catch whatever the source emits. Active sensing means sending a controlled signal into the medium and measuring what comes back.
This distinction is not new. It is how most of our best sensing technologies work. MRI does not listen to the brain. It sends radiofrequency pulses into tissue sitting in a strong magnetic field and measures the nuclear spin response[4]. Ultrasound sends acoustic waves and reads reflections. Radar sends electromagnetic pulses and analyzes returns. In each case, the SNR advantage over passive detection is not incremental. It is orders of magnitude. Radar detects objects that emit nothing at all.
Nobody has systematically applied this paradigm to BCI. That is the core of what I am proposing.
Send a controlled wave into the brain. Measure how neural activity modifies its properties on the way through. The information is not in what the brain emits. It is in how the brain changes a signal you control.
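The advantage of controlling the carrier is easy to show numerically. In the toy sketch below (every number is invented for illustration), a known 1 kHz carrier passes through a "medium" that imposes a 1% amplitude modulation, noise ten times larger than the modulation is added, and coherent demodulation against the transmitted reference still pulls the envelope out. A passive receiver has no reference to demodulate against.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f_c = 100_000, 1_000.0                 # sample rate and carrier (Hz)
t = np.arange(0, 1.0, 1 / fs)

# Invented "tissue" response: a slow 5 Hz, 1% amplitude modulation
modulation = 0.01 * np.sin(2 * np.pi * 5 * t)

# Received signal: modulated carrier buried in noise 10x the modulation depth
received = (1 + modulation) * np.sin(2 * np.pi * f_c * t)
received += 0.1 * rng.standard_normal(t.size)

# Because WE transmitted the carrier, we hold a phase-locked reference.
# Mixing shifts the modulation band to DC; averaging rejects noise and
# the 2*f_c mixing product.
reference = np.sin(2 * np.pi * f_c * t)
mixed = 2 * received * reference

win = fs // 100                            # 10 ms windows = 10 carrier cycles
envelope = mixed.reshape(-1, win).mean(axis=1)

# envelope now tracks 1 + modulation; its mean recovers the carrier amplitude
print(round(float(envelope.mean()), 2))    # → 1.0
```

This is the lock-in amplifier principle, the same trick that lets MRI, radar, and ultrasound win their orders-of-magnitude SNR advantage over passive listening.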
Six Channels, Two Explored
When a neuron fires, it is not just an electrical event. It is a cascade of simultaneous physical changes:
Electrical: Membrane potential swings by ~100 mV. Ion channels open and close. Local charge distributions shift. (This is what EEG reads.)
Magnetic: Current loops from ion flow create local field changes in the femtotesla range[5]. (This is what MEG reads.)
Mechanical: The cell membrane physically deforms during depolarization. Cortical surface displacement is measurable at nanometer to micrometer scale[6].
Thermal: Metabolic activity produces local temperature changes on the order of microkelvins[7].
Optical: Refractive index changes. Scattering properties shift. Neurons emit ultra-weak biophotons in the visible and near-UV spectrum, on the order of tens to hundreds of photons/s/cm²[8].
Dielectric: Ion concentration shifts alter the local dielectric constant. Water molecules reorient around changing ion distributions, restructuring hydration shells[9].
Decades of BCI research have exploited the first channel and partially the second. The other four are essentially untouched. Each one could be probed by an active carrier wave designed to be sensitive to that class of physical change. Nobody has done this in a systematic way.
Candidate Approaches
Millimeter-wave radar (76-81 GHz). Automotive radar chips (Texas Instruments IWR/AWR series) transmit FMCW chirps with up to 4 GHz of bandwidth, giving centimeter-scale range resolution and, through the phase of the return, micrometer-scale displacement sensitivity. At 77 GHz, partial skull penetration is feasible through thinner temporal bone. Neural activity produces microscale tissue displacement and dielectric shifts that would modulate radar returns. The hardware exists as commodity components. Nobody has pointed a mmWave radar array at a skull and applied modern ML decoding to the raw IQ data. The prototype hardware costs under EUR 5,000.
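A back-of-envelope sketch of why the phase channel, not the range bins, is what matters at neural scales (the 1-degree resolvable phase step is an illustrative assumption, not a chip specification):

```python
import math

C = 3.0e8            # speed of light (m/s)
bandwidth = 4.0e9    # Hz, full 77-81 GHz chirp bandwidth
f_center = 79.0e9    # Hz

# FMCW range resolution c/(2B): the size of a range bin
range_res = C / (2 * bandwidth)               # far too coarse on its own

# Displacement sensitivity: the return phase rotates by 4*pi*d/lambda,
# so a resolvable phase step maps to motion much finer than one bin
wavelength = C / f_center                     # ~3.8 mm at 79 GHz
phase_step = math.radians(1.0)                # assumed resolvable phase step
displacement = phase_step * wavelength / (4 * math.pi)

print(f"{range_res * 100:.2f} cm")            # → 3.75 cm
print(f"{displacement * 1e6:.1f} um")         # → 5.3 um
```

The same phase trick is how commodity mmWave chips already track chest motion for heart-rate sensing; the open question is whether it survives the skull with enough SNR for cortical displacement.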
Electrical impedance tomography (EIT). Instead of passively recording voltages, inject small alternating currents through scalp electrodes and measure impedance changes across the volume. Neural firing changes membrane impedance by roughly 1%[10]. Holder's group at UCL has demonstrated fast neural EIT in animal models and preliminary human experiments[11]. The measurement is active, the SNR is better than passive EEG, and the physical quantity being measured (impedance) carries different information than voltage.
Low-THz transmission (0.1-0.3 THz). Terahertz radiation is extremely sensitive to water state and dielectric environment. Neural firing restructures local hydration shells around shifting ion concentrations. At the low end of the THz gap, tissue penetration improves enough to potentially reach cortical tissue through thin bone windows. THz time-domain spectroscopy systems have become practical outside specialized physics labs in the last decade[12].
Focused ultrasound echo analysis. Transcranial focused ultrasound (tFUS) can target specific cortical regions with millimeter precision using phased arrays with skull aberration correction. At diagnostic (non-therapeutic) intensities, the acoustic echo is modulated by neural-activity-driven changes in tissue stiffness and blood volume. Functional ultrasound imaging (fUS) has already demonstrated neural activity detection with resolution superior to fMRI in some dimensions[13].
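The phased-array focusing itself is simple geometry: delay each element so all wavefronts arrive at the target simultaneously. A minimal sketch (toy 5-element array; real tFUS additionally needs per-element skull aberration correction, which this ignores):

```python
import math

C_TISSUE = 1540.0  # m/s, average speed of sound in soft tissue

def focusing_delays(elements, focus):
    """Per-element transmit delays (s) so every element's wavefront
    arrives at `focus` at the same instant. elements: (x, y, z) in meters."""
    dists = [math.dist(e, focus) for e in elements]
    farthest = max(dists)  # the farthest element fires first (delay 0)
    return [(farthest - d) / C_TISSUE for d in dists]

# Toy linear array: 5 elements, 5 mm pitch, focusing 40 mm deep on-axis
elements = [(i * 0.005 - 0.01, 0.0, 0.0) for i in range(5)]
delays = focusing_delays(elements, (0.0, 0.0, 0.04))
print([round(d * 1e6, 2) for d in delays])  # → [0.0, 0.6, 0.8, 0.6, 0.0]
```

The symmetric microsecond-scale delay profile is the whole mechanism; electronic steering is just recomputing this list per focus point.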
The Quantum Layer
This is where the argument goes deeper, and where I expect the most resistance. I am going to lay out the evidence and let it speak.
The orthodox position in neuroscience is that the brain operates classically. Quantum effects decohere too fast in warm, wet biological tissue to play any computational role. This was reasonable until experimental evidence started contradicting it.
In 2007, Engel et al. demonstrated long-lived quantum coherence in photosynthetic light-harvesting complexes[14]. This was not marginal. The coherence was functionally useful: it improved energy transfer efficiency. Similar quantum effects have since been identified in avian magnetoreception (the radical pair mechanism[15]) and enzyme catalysis[16]. The blanket claim that biology is too noisy for quantum mechanics is dead.
In the brain specifically: Fisher (2015) proposed that phosphorus-31 nuclear spins in Posner molecules (calcium phosphate clusters that occur naturally in neural tissue) could maintain quantum coherence for hours, not femtoseconds, under biological conditions[17]. This is a specific, testable mechanism with specific predictions. His group at UCSB has been running experiments since.
Kerskens and Lopez Perez (2022) at Trinity College Dublin published MRI results showing proton spin dynamics in brain water consistent with quantum entanglement[18]. The interpretation is contested. The experimental observation has not been refuted.
General anesthesia still has no complete classical mechanistic explanation. We know what predicts anesthetic potency (the Meyer-Overton correlation with lipid solubility) but not how consciousness is actually disrupted. Turin et al. proposed that anesthetics work by interfering with quantum electron transport in microtubules[19]. If true, consciousness has a quantum component that can be chemically switched off.
Biophoton emission from neurons is experimentally established[8]. Whether these photons carry quantum information (coherence, entanglement) has not been tested because the experiments were not designed for it. Nobody has measured single-photon correlations between functionally connected brain regions.
Here is what this means for BCI: if neural computation has a quantum component, classical sensors are not just limited in resolution. They are categorically unable to access the quantum information, no matter how many electrodes you add or how good your decoder gets. You need sensors that couple to quantum states. You need a fundamentally different measurement paradigm.
A Proposed Measurement Paradigm
I have a theory for how to do this. It exploits a specific property of quantum mechanics to probe neural tissue in a way that classical instruments cannot replicate.
The measurement technique exploits a fundamental property of quantum mechanics related to how physical systems reveal different information depending on xxxxxxxxxxxxxxxxx observation strategy. By controlling xxxxxxxxxxxxxxxxx at the detection stage, it becomes possible to extract information about neural tissue that is xxxxxxxxxxxxxxxxx inaccessible to classical sensing, regardless of sensitivity or resolution.
What I can say: the approach does not require cryogenic cooling, magnetically shielded rooms, or surgical implantation. The theoretical basis is well-established quantum mechanics, not speculative extensions. The hardware components exist today in quantum optics and photonics laboratories. The gap is that nobody has assembled them for this purpose.
A second element involves xxxxxxxxxxxxxxxxx that allows the same physical probe to yield xxxxxxxxxxxxxxxxx classes of neural information depending on xxxxxxxxxxxxxxxxx. This is not an incremental improvement. It is a qualitatively different observation regime made possible by xxxxxxxxxxxxxxxxx.
Proposed Research Program
If this theory is correct, here is how to test it. The sequence starts cheap and fast, and only escalates when earlier phases show signal.
The decoder architecture should be modality-agnostic from the start: it accepts time-series sensor data from any input channel and outputs cognitive state classifications. This is the data platform I am building. When a new sensor produces data, it plugs in and gets tested immediately. No new pipeline, no new software, no months of integration work. This is the part I know how to do.
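As a sketch of what "modality-agnostic" means in practice (every name below is hypothetical, not an existing codebase): sensors reduce their output to one time-series container, feature extractors register themselves per modality, and the downstream classifier never changes when a sensor does.

```python
from dataclasses import dataclass
from typing import Callable, Dict
import numpy as np

@dataclass
class Recording:
    """Common container every sensor reduces to."""
    data: np.ndarray        # shape (channels, samples)
    sample_rate: float      # Hz
    modality: str           # "eeg", "eit", "mmwave_iq", ...

# Feature extractors are registered per modality; the classifier
# downstream never needs to know where the time series came from.
FEATURIZERS: Dict[str, Callable[["Recording"], np.ndarray]] = {}

def featurizer(modality: str):
    def register(fn):
        FEATURIZERS[modality] = fn
        return fn
    return register

@featurizer("eeg")
def eeg_power(rec: Recording) -> np.ndarray:
    # Placeholder feature: per-channel mean signal power
    return (rec.data ** 2).mean(axis=1)

def featurize(rec: Recording) -> np.ndarray:
    """Dispatch on modality; output feeds one shared classifier."""
    return FEATURIZERS[rec.modality](rec)

rec = Recording(np.random.default_rng(0).standard_normal((8, 256)), 256.0, "eeg")
print(featurize(rec).shape)  # one feature vector per recording, any sensor
```

Plugging in a new sensor means writing one registered featurizer, nothing else, which is exactly the "no months of integration work" property the platform needs.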
The Pattern
Radio waves existed for all of human history before Hertz built a detector in 1887. The cosmic microwave background filled the universe for 13.8 billion years before Penzias and Wilson stumbled into it in 1964 while trying to fix what they thought was antenna noise[20]. Gravitational waves were predicted in 1916 and measured a century later[21]. Every time, the signal was there. The instrument was not.
Neurons produce physical changes in at least six domains every time they fire. We built instruments for two of them and declared the problem hard. It is not hard. It is incomplete.
I am not going to listen harder. I am going to ask different questions.
References
[1] Wolpaw, J.R. & Wolpaw, E.W. (2012). Brain-Computer Interfaces: Principles and Practice. Oxford University Press.
[2] Nunez, P.L. & Srinivasan, R. (2006). Electric Fields of the Brain: The Neurophysics of EEG. Oxford University Press.
[3] Tang, J. et al. (2023). "Semantic reconstruction of continuous language from non-invasive brain recordings." Nature Neuroscience, 26, 858-866.
[4] Lauterbur, P.C. (1973). "Image formation by induced local interactions." Nature, 242, 190-191.
[5] Hämäläinen, M. et al. (1993). "Magnetoencephalography: theory, instrumentation, and applications." Reviews of Modern Physics, 65(2), 413-497.
[6] Bhatt, M.B. et al. (2018). "Imaging brain activity with fast optical signals." NeuroImage, 171, 302-309.
[7] Yablonskiy, D.A. et al. (2000). "Theory of NMR signal behavior in magnetically inhomogeneous tissues." Magnetic Resonance in Medicine, 43(6), 820-834.
[8] Bókkon, I. (2009). "Visual perception and imagery: a new molecular hypothesis." BioSystems, 96(2), 178-184. See also: Salari, V. et al. (2015). "Ultraweak photon emission in the brain." Journal of Integrative Neuroscience, 14(3), 419-429.
[9] Shiraga, K. et al. (2017). "Characterization of dielectric responses of human cancer cells in the terahertz region." Journal of Infrared, Millimeter, and Terahertz Waves, 38, 25-37.
[10] Gilad, O. & Holder, D.S. (2009). "Impedance changes recorded with scalp electrodes during visual evoked responses." NeuroImage, 47(2), 514-522.
[11] Aristovich, K.Y. et al. (2016). "Imaging fast neural traffic at fascicular level with electrical impedance tomography." NeuroImage, 124, 204-213.
[12] Jepsen, P.U. et al. (2011). "Terahertz spectroscopy and imaging: Modern techniques and applications." Laser and Photonics Reviews, 5(1), 124-166.
[13] Macé, E. et al. (2011). "Functional ultrasound imaging of the brain." Nature Methods, 8, 662-664.
[14] Engel, G.S. et al. (2007). "Evidence for wavelike energy transfer through quantum coherence in photosynthetic systems." Nature, 446, 782-786.
[15] Ritz, T. et al. (2000). "A model for photoreceptor-based magnetoreception in birds." Biophysical Journal, 78(2), 707-718.
[16] Hay, S. & Scrutton, N.S. (2012). "Good vibrations in enzyme-catalysed reactions." Nature Chemistry, 4, 161-168.
[17] Fisher, M.P.A. (2015). "Quantum cognition: The possibility of processing with nuclear spins in the brain." Annals of Physics, 362, 593-602.
[18] Kerskens, C.M. & Lopez Perez, D. (2022). "Experimental indications of non-classical brain functions." Journal of Physics Communications, 6(10), 105001.
[19] Turin, L. et al. (2014). "Electron spin changes during general anesthesia in Drosophila." Proceedings of the National Academy of Sciences, 111(34), E3524-E3533.
[20] Penzias, A.A. & Wilson, R.W. (1965). "A measurement of excess antenna temperature at 4080 Mc/s." The Astrophysical Journal, 142, 419-421.
[21] Abbott, B.P. et al. (LIGO Scientific Collaboration) (2016). "Observation of gravitational waves from a binary black hole merger." Physical Review Letters, 116(6), 061102.
Seeking Collaborators
I am looking for quantum physicists, photonics engineers, and neuroscientists who find this interesting enough to explore together. EU-based collaborators preferred for funding alignment, but I am open to anything that moves this forward.