Covering the Land of Lincoln

Ethan Simmons | My trip inside my mind | University of Illinois

I’d venture to say that most people, even in our university town, have no idea what happens inside an academic research study.

In the spirit of truth, science and immersive journalism, I’ve spent about five hours in the last month lying still in an MRI scanner nestled in the basement of the University of Illinois’ Beckman Institute, lending my brain waves to ongoing research.

I hope this first-person account clears some fog for you about what it takes to participate in science and teaches you a few new things about functional magnetic resonance imaging and our mysterious brains.

Of course, all research is different, and there were plenty of aspects about the study I participated in that make it unique. What it entailed for me: three four-hour sessions at the Control & Network Connectivity Team Lab at Beckman and its Biomedical Imaging Center.

I left $200 richer, well-acquainted with the strange, banging symphony of an MRI scanner and ever more curious about what happens between our ears.

Subject 61

The CONNECT Lab’s call for research participants caught my eye for three reasons.

A) I’d never been inside an MRI machine, and I wanted to know what it was like.

B) I’d never seen the inside of my head.

C) I fit the narrow criteria for participation.

What they’re looking for in this study: Native English speakers ages 18 to 35 with at least a high school diploma and without any history of neurological disorders or psychiatric prescriptions, nor any heavy-drinking or drug-abuse habits.

Check, check, check and check.

As in other fMRI studies, participants must be right-handed and have no metallic material embedded in their bodies, such as a pacemaker, implant, metal prosthesis, pins or plates. (Regular dental work is fine.)

I learn that I’m “Subject 61”; the researchers hope to enroll at least 100 participants.

“Retention is a very important thing, because MRI is expensive, and the time we’re investing is really heavy,” says lab founder and principal investigator Sepideh Sadaghiani, associate professor in the UI Department of Psychology.

How expensive? About $630 an hour, just to run the MRI scanner. That’s not counting all the labor costs associated with each study.

Her lab earned a five-year, $2.3 million grant from the National Institutes of Health to conduct this research, although Sadaghiani has requested a no-cost two-year extension to finish up data collection.

What are they trying to learn?

The CONNECT Lab, housed in the Beckman Institute, focuses its research on a few key concepts within the brain.

One emphasis is cognitive control: How does the brain control its own capabilities when it’s given a goal?

Next is connectivity: How do neurons from distant brain areas talk to one another across the connectome — the brain’s “map” of neural connections?

The CONNECT Lab is particularly interested in so-called higher-order control functions.

The brain is able to control the way it perceives things when given a specific, goal-oriented task — say, looking for a blue purse in the clutter of your living room, Sadaghiani said.

“You’re essentially able to tune the neurons that are representing that particular color in a way that you can more easily find the purse among other things,” Sadaghiani said.

Many of the CONNECT Lab experiments I went through were sensory tasks that tested the limits of my selective attention: detecting a faint sound, recognizing a subtle emotion on a face or completing a tough memory game. (More on this later.)

These stimuli create evoked activity in the brain: responses tied to external events. Evoked activity is what most published neuroscience research examines.

This experiment, by contrast, is interested in the brain’s intrinsic, or ongoing, activity, which is not linked to the outside world. Your brain’s responses to sights and sounds are mere droplets in an ocean of ongoing, spontaneous neural activity.

In these repeated tasks, CONNECT Lab researchers are trying to find out how the ongoing activity influences how your brain perceives each of these exercises.

“When we look at the brain right before we even show the picture or play the sound, if you look at that part of the activity and apply some simple machine learning to it, the computer can essentially look at your brain patterns and we can predict what you will perceive before we’ve even shown you the picture, before we’ve even played the sound,” Sadaghiani said.

These predictions are far from 100 percent correct, but they’re far above chance. Here’s how they test it.
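To make the idea concrete, here is a toy sketch of my own, with invented numbers and not the lab’s actual pipeline: simulated “pre-stimulus” brain patterns carry a weak bias toward one percept, and even a very simple classifier recovers it at better-than-chance rates.

```python
import numpy as np

# Toy illustration only -- invented data, not the CONNECT Lab's pipeline.
# Simulate "pre-stimulus" brain patterns whose hidden state weakly biases
# whether the upcoming stimulus will be perceived.
rng = np.random.default_rng(0)

n_trials, n_features = 400, 20
percept = rng.integers(0, 2, n_trials)       # 1 = will perceive it, 0 = won't
direction = rng.standard_normal(n_features)  # the pattern carrying the bias
X = rng.standard_normal((n_trials, n_features)) + 0.6 * percept[:, None] * direction

# "Simple machine learning": a nearest-centroid classifier trained on 300
# trials, tested on the remaining 100.
X_train, y_train = X[:300], percept[:300]
X_test, y_test = X[300:], percept[300:]
centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
accuracy = (dists.argmin(axis=1) == y_test).mean()

print(f"accuracy: {accuracy:.2f}")  # imperfect, but well above the 0.5 chance level
```

The point of the sketch is the same one Sadaghiani makes: the prediction is far from perfect, but because the pre-stimulus state genuinely carries information, even a crude classifier beats a coin flip.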

The setup

Setting up in the scanner with the help of a kind MRI technologist and careful research assistants at the lab feels a lot like a doctor’s appointment.

I remove all metallic objects from my clothes — phone, keys and belt — before entering the room. Then the staff strap me onto the table and fit me with earplugs and earmuffs wired to a microphone outside the room, so the researchers can speak to me while I’m in the machine. Cushions next to my head and under my arms and legs keep me comfortable and still.

A mirrored fixture comes into the scanner with me right above my head. The mirror gives my eyes a path to see a TV in the room where visual prompts will appear.

On my right arm: a hand piece with a button at the tip of each of my five fingers. I’ve practiced pressing certain buttons in response to the stimuli I’ll see during the MRI.

On my left hand: a finger piece to track my pulse, and a device to squeeze if I absolutely need to get out of the MRI for any reason.

For my final two-hour segments in the scanner, I wore an electroencephalogram cap on my head. I had to shampoo thoroughly and comb my hair to remove excess oils and static from my scalp; then lab assistants filled each of the cap’s 64 electrodes with a conductive gel. The process took about an hour.

This is a rare combo; though the EEG and fMRI both measure brain activity in some way, the methods are hostile to each other.

When neurons talk to each other, electrical current flows through them; the EEG’s metallic discs, or electrodes, pick up those very fine signals at the scalp. But MRI scanners rapidly shift magnetic fields to conjure accurate spatial data from the brain, and those shifting fields can drown out the faint signals traveling from my brain to the electrodes.

After the experiments, the researchers will use an algorithm to clean the EEG data, subtracting the noise from the fMRI scan and even my heartbeat, whose pulsing signals faintly affect the readings from my head.
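One standard cleanup idea — average-artifact subtraction, which I’m using here as an illustrative assumption rather than this lab’s confirmed algorithm — exploits the fact that the scanner’s interference repeats identically with every imaging volume, while brain activity doesn’t. Averaging across repetitions yields an artifact template that can be subtracted back out:

```python
import numpy as np

# Toy sketch of average-artifact subtraction (invented numbers; one common
# cleanup approach, not necessarily the CONNECT Lab's algorithm).
rng = np.random.default_rng(1)

n_reps, samples_per_rep = 50, 200  # 50 imaging volumes, 200 EEG samples each
t = np.linspace(0, 1, samples_per_rep)

gradient_artifact = 40 * np.sin(2 * np.pi * 25 * t)     # repeats every volume
brain = rng.standard_normal((n_reps, samples_per_rep))  # true EEG, varies by volume
recorded = brain + gradient_artifact                    # what the electrode records

# The artifact survives averaging across volumes, while the uncorrelated
# brain signal mostly cancels -- so the average is an artifact template.
template = recorded.mean(axis=0)
cleaned = recorded - template

artifact_power_before = np.var(recorded - brain)  # large: dominated by the artifact
artifact_power_after = np.var(cleaned - brain)    # tiny residual
```

A similar averaging trick works for the heartbeat artifact: because each pulse leaves a repeatable signature time-locked to the pulse reading, it too can be estimated and subtracted.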

When the data is cleaned, the researchers will combine the temporal data from the EEG with the spatial data from the MRI for a clear picture of what happened in my brain during the experiment.

“The methods are really complementary,” Sadaghiani said. “Having them at the same time gives us a more complete view of what the whole brain communication organization, connectome, is doing at every given moment, at rapid and slow channels.”

What they’re looking for are changes in blood oxygenation levels in different areas of my brain. Oxygen is delivered in the brain by hemoglobin, which contains iron and whose magnetic properties change as it gives up its oxygen. The MRI scanner detects those tiny changes as certain clumps of neurons demand more oxygen.

The experiment

To participate in this study, I couldn’t be claustrophobic. I’m not, or at least I didn’t think I was.

My first time ferrying into the MRI scanner was a bit psychologically jarring, knowing I’d need to keep my head still for at least an hour, and that I wouldn’t get any bathroom breaks. The feeling started to fade once I got something to do.

And my first time hearing the rhythmic bangs of the MRI through earmuffs was accompanied by a light tingling on my sternum. This may have been from the mechanical vibrations of the scanner.

Day 1 was a basic scan to get the layout of my brain in a relaxed state. Once I exited the machine, what followed was a cognitive battery, a sequence of mental tests that stretched my working memory and my creativity.

I learn that this multi-hour exercise is meant to measure my cognitive flexibility, or how well my brain can shift behavior between different tasks. One of the hypotheses this study is testing is whether a participant’s cognitive flexibility is associated with their “connectome flexibility,” or how well the brain expresses a variety of different network patterns.

Days 2 and 3 were the long haul: I booked three consecutive 19-minute scans in the MRI. By Day 2, the strange physical sensations were far easier to handle, and I wasn’t nearly as nervous.

After a few minutes of setup, the actual experiment begins. If I hear a faint sound, sort of like radio static, I press down with my index finger. It’s hard to detect within the whir of the MRI.

Occasionally, a photo of a face will show up. If I think it’s emotive in some way, I click my middle finger. If I think the face is neutral, I click my ring finger. After a while, this task goes from easy to second-guessable.

Finally, an array of six arrows occasionally pops up on the screen, each independently pointing up, down, left or right. After a brief flash of the arrows, one box is highlighted. The task: press the button indicating which direction the arrow in the highlighted box was pointing.

Even when I paid extremely close attention, it was easy to forget or miss which direction the arrow pointed. For every hit, there were at least a few misses.

Each of these exercises occurred multiple times, and they were always spaced out. The time gaps allow researchers to get a finer picture of the intrinsic activity happening in my brain between the stimuli.

My first emotion after exiting the scanner was, admittedly, relief. The lab manager assured me the data would be usable, that I didn’t twitch too much or answer anything too wildly.

After I removed the EEG cap and washed my scalp of the conductive gel, one of the researchers handed me the $200 in cash, and I was on my merry way.

When the study is finished, the data from my visits and other participants’ sessions will be publicly shared — there’s very little concurrent MRI and EEG data for other researchers to use. Of course, it will be fully de-identified — no names or faces, just brain signals and behavioral scores attached to basic demographics like age and sex.

“We want this data to support a larger scientific effort beyond just the questions we have,” Sadaghiani said. “We in our lab will use this data for years and years to come, and we hope others will, too.”

It was good to hear how those 12 hours I spent at Beckman will contribute to the scientific world. A big thanks to Holly, Samar, Grace, Aiman and others for making this new experience so comfortable and transparent, and for allowing me to write about your work.

Sound like a good use of your time? There are more than a half-dozen ongoing studies at Beckman and countless more at departments across the UI.

So, if you have some hours to spare and want to sate your curiosity or earn some extra cash, I say give it a shot. Who knows? Your data could contribute to the next scientific breakthrough.

