
Telepathic Infrastructures

Mary Lou Jepsen

Diagram from Raymond Damadian’s “Apparatus and method for detecting cancer in tissue,” US Patent 3,789,832 (granted February 5, 1974), the first patent ever filed for what is today known as a Magnetic Resonance Imaging machine.

Digital ×
May 2019

Nick Axel You launched Openwater a few years ago. Can you tell us about it? What’s it trying to do?

Mary Lou Jepsen The idea of Openwater is to put the functionality of an MRI machine into a lightweight bandage that you can wrap anywhere around your body, at a massively lower cost and, as it turns out, with much higher imaging resolution than a two-ton, room-sized piece of equipment.

NA It sounds like it could change the way we think of healthcare and medical infrastructure quite dramatically.

MLJ That’s the goal at least. About forty years ago, three companies—Philips, Siemens, and GE Healthcare—pooled their intellectual property together to bring magnetic resonance imaging (MRI) to the masses. And for the past few decades, MRI machines have enabled the early diagnosis of all kinds of diseases. They are the primary way we diagnose cancer today, and have helped to reduce mortality and morbidity rates around the world. But these three companies effectively created an oligopoly, and the technology hasn’t really gotten much better since they came together. You still have a two-ton magnet, it takes two hours just to scan my head, and it costs thousands of dollars per scan.

NA Essential, life-saving medical imaging technology could become accessible to places that currently live without it…

Schematic diagram of Openwater’s imaging technology placed on a human head, as if worn like a hat.

MLJ Two-thirds of humanity currently lacks access to this type of medical imaging. But if we really think about bringing medical imaging to places that don’t currently have it, what about having it in your home? Angelina Jolie cut off her breasts because she found she carries a mutation in a gene called BRCA, which gives a very high risk of breast cancer. But why can’t we monitor what’s going on in our bodies before taking preventative measures? It’s a bit like if you have a cold, you take your temperature. You know that 98.6℉ is considered normal, if you’re over a hundred there’s some concern, and if you’re over 104 you should probably go to the hospital, or at least take a lot of ibuprofen. This isn’t just applicable to underdeveloped regions of the world. If you think about it, none of us really have access to medical imaging.

NA What is it that allows this type of technology to be made today?

MLJ This is happening today for a number of reasons. The first, perhaps most pragmatic one, is that while a lot of people talk about the use of machine learning and big data, there’s another, often overlooked tool unique to our time: the trillion-dollar manufacturing infrastructure of consumer electronics, primarily in Asia, which can be harnessed to produce massively disruptive, insurgent technologies in established industries. Then there’s Moore’s Law, which has for decades predicted the doubling of transistor density roughly every eighteen months. On the optical side of things, this means that the pixel size of camera sensors is now approaching the wavelength of light. I started out at the MIT Media Lab studying synthetic holography, and for holography to work, you need your pixel size to be about the wavelength of light. Back in the 80s we had to do all kinds of things to work around the fact that the pixel size of computer monitors was a couple of millimeters, but today, most of the cameras we carry around in our pockets allow us to use the imaging properties of holography in radically new ways.
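To put rough numbers on that comparison (these are typical published figures, not values from the interview): the pixel pitch of current smartphone camera sensors is on the order of a micron, which is comparable to the wavelength of red and near-infrared light, whereas a millimeter-scale display pixel of the 1980s was thousands of wavelengths across:

\[ p_{\text{camera}} \approx 1\ \mu\text{m} \;\sim\; \lambda_{\text{red/NIR}} \approx 0.6\text{–}0.9\ \mu\text{m}, \qquad p_{\text{1980s display}} \approx 2\ \text{mm} \approx 3000\,\lambda. \]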

X-ray diffraction image of DNA taken by Rosalind Franklin and Raymond Gosling in 1952, a key piece of data that led to James Watson and Francis Crick’s discovery of DNA’s structure.

NA How does it work?

MLJ There are three keys to it. Gamma rays go through our bodies; X-rays go through our bodies; the magnetic fields of two-ton MRI machine magnets go through our bodies; but so does red light. The reason why red light hasn’t been able to see deep inside our bodies, or at high resolution, until now is that it scatters as it passes through. If I hold a laser up to my hand, you don’t see a point, but rather a field of light. It turns out that scattering isn’t random; if you can record the light at the scale of its wavelength—which is why the small pixel size on camera sensors is so important—it’s deterministic and reversible. That’s the first principle. The second is that we can record not only the intensity of red light as it scatters through our body but also, because light is also a wave, its phase. We’ve created chips that can direct ultrasonic pings anywhere in the body. We launch a ping into the body, and where it focuses, we bring red light in. When the light travels through that specific point where the sound is directed, the color of the light changes, ever so slightly. The Doppler effect works not just with sound and pitch, but also with light and color. Then we exploit another property of holography. Holography is basically the interference of beams of light, so if you can “beat” one beam of light off another, you can obtain and decode the information content of the spot where they meet. This is how Rosalind Franklin decoded an X-ray diffraction image and revealed the structure of DNA for the first time. Back then she had to stay up all night, but now we have computers, so we can do it super fast. By moving from spot to spot, we can scan and even monitor a selected region of interest, like the speech or visual center of the brain, or other parts of the body. It also has the potential to be put in reverse, to conduct surgery without knives, for instance.
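To spell out the frequency bookkeeping behind that “slight change of color” (a schematic sketch; the symbols are illustrative and not from the interview): light that passes through the ultrasound focus is shifted in frequency by the ultrasound frequency, and interfering it with an unshifted reference beam produces a beat at exactly that frequency, which is what singles out the light from the selected spot in the recorded hologram:

\[ f_{\text{tagged}} = f_{\text{laser}} \pm f_{\text{ultrasound}}, \qquad I_{\text{interference}}(t) \propto 1 + \cos\!\big(2\pi f_{\text{ultrasound}}\, t + \Delta\phi\big). \]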

NA As you said, one of the main things MRI does is identify tumors. How can Openwater be used to do that?

MLJ Any tumor bigger than a cubic millimeter has five times the amount of blood as the surrounding tissue. The shape of tumorous veins and structures is very different from the rest of your body, and the density of blood is very different too. We can see blood very well, because blood absorbs red light, whereas flesh scatters it.

Sample diagnostic imagery from an Openwater scan detecting a concentration of blood in synthetic flesh.

NA Can it find anything else?

MLJ The color of blood is different depending on whether it’s carrying oxygen or not, and you can get a lot of information by looking at the use of oxygen in your body. We think oxygen allows us to understand our brains. Traditional MRI equipment provides an image resolution of one millimeter per pixel, and fMRI ten millimeters per pixel, whereas we can provide an image resolution of one micron per pixel, which in three dimensions is one billion times better. We’ve been able not just to see oxygen, but to focus on and image what’s happening in specific neurons. When a neuron is about to fire, before an electrical pulse goes down the axon, the optical properties of that neuron change. Ion channels open up to build up the charge, and when that happens, the membranes roughen, which scatters light more. So we can see the differential scattering of neurons based on the roughening of their membranes. This can, in short, allow us to read your mind, or communicate with telepathy.
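As a sanity check on the “one billion times” figure: moving from millimeter-scale to micron-scale resolution is a factor of a thousand along each axis, and an image element is a three-dimensional volume, so:

\[ \left(\frac{1\ \text{mm}}{1\ \mu\text{m}}\right)^{3} = 1000^{3} = 10^{9}. \]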

NA Telepathy?!

MLJ Neuroscientists ask how we can do that when we don’t even know what a thought is. We think we know what it is when we have one, but we can’t define it. There’s this issue in neuroscience of causal vs non-causal brain activity. Looking at the oxygen use of your brain is non-causal, but by doing that I can still tell you what words you’re about to say, what images you’re thinking of, what music you have in your head, etc. In a hierarchical way, we still don’t understand how the neurons and the other parts of the brain work, but we can infer a lot of other information. But the most important thing to mention is, we really don’t know how the brain works. We’ve never been able to measure our brain at this resolution. We’re mapping out non-causal brain activity in an attempt to get to the causal.

Screenshot of a brain viewer designed to represent the semantic maps that tile human cerebral cortexes. Source: Alexander Huth, Wendy de Heer, Tom Griffiths, Frederic Theunissen, and Jack Gallant, 2016.

NA Can you explain how that might work?

MLJ There is no difference in your brain’s use of oxygen when you look at an image versus when you imagine that same image. Seven or eight years ago, Jack Gallant at UC Berkeley had graduate students spend hundreds of hours in an fMRI scanner watching YouTube videos while he recorded the oxygen use of their brains. Then the graduate students were presented with a new clip, with images they’d never seen before, and the computer, using the image store of those hundreds of hours of YouTube videos and the oxygen use of the brain reacting to those images, inferred what it thought the student was looking at. The result is pretty grainy, but even with a scanning resolution of ten cubic millimeters, it’s pretty amazing. Last year a Japanese group put graduate students in an fMRI machine and woke them up three minutes after they had fallen asleep, asked them what they were dreaming about, let them go back to sleep, woke them up three minutes later again, and from that repeated process created a data store of the graduate students’ dreams. Then, using deep learning trained with backpropagation, sleeping graduate students could dump their dreams into a computer and see images of them when they woke up. There are lots of other experiments like this, but imagine if we could up the resolution and get more data.

NA The implications of this could be enormous.

MLJ There’s an incredible potential for this to work for people who have lost the ability to speak. There’s less than a 5% false positive rate for identifying language using MRI. You could also reverse the process and think of filters. But it could also allow us to transcend language. I think we’re all a bit like Stephen Hawking. Our brains take in a very high bandwidth through our senses, but what comes out is pretty low resolution, just moving our tongues and mouths and typing. What if we could increase the bandwidth out and communicate raw emotion; get the images, get the music directly out of our brains without having to play the guitar just so? What could that enable us to do? How would that change things like diplomacy, conflict and consensus; ambition, ego, power, and control; or even peace, love, and understanding? Will we all become like Spock from Star Trek, who had to recover for three days after mind-melding with another person? Or is that just what’s called empathy?


A segment of a Hollywood movie trailer that the subject viewed while in an fMRI machine (top), and the reconstruction of this segment from brain activity (left), based on similar previous brain activity recorded while watching other random YouTube videos that did not include the movies used as stimuli (right). Jack Gallant, “Movie reconstructions from human brain activity: 3 subjects,” YouTube (September 21, 2011).

NA That’s awesome, but it’s hard to think of technology as something neutral these days. What’s the disposition of this technology?

MLJ It ultimately comes down to the question—it’s a yes or no question—as to whether we, as humanity, want to understand our brain. It’s a Pandora’s box. The most expensive form of healthcare in every single country in the world is for brain disease. We can say no, but we’ve got the national academies of most developed countries saying that understanding the brain is one of the top five priorities. So I can’t help but think that it’s kind of inevitable. You can stop things for a while, as with the Cultural Revolution, stem cell research, or Galileo, but in the end it’s going to happen because it can. The ethics of this are profound. And how do we handle it? I don’t really know.

NA Beatriz Colomina recently published a book on the relationship between the invention of the X-ray and the aesthetics of the modern movement in architecture. It’s much more nuanced than this, but one of the arguments she makes is that we can only think, imagine, and relate to our built environment in the ways we are able to diagnose and see inside the body. What you’re doing with Openwater is not only creating a new form of imaging, but also a new relationship that we have to images; where a medical image isn’t something we only encounter once every year or so, but rather becomes a part of our daily life. I find it fascinating to think what the potentials of this practice of care might be if applied to or incorporated within the logics that drive the design and development of our cities.

MLJ We’ve had these kinds of images for a while, and we can make them higher resolution, but the real paradigm shift is the ability to look at what specific neurons are doing and create an image of thought itself.

Digital × is a collaboration between e-flux Architecture and the Norman Foster Foundation within the context of its 2019 educational program.


Mary Lou Jepsen is a technical executive and inventor in the fields of display, imaging, and computer hardware. She is the founder of Openwater, a startup working on fMRI-type imaging of the body using holographic, infrared techniques. She was the co-founder and first Chief Technology Officer of One Laptop per Child (OLPC).
