Resolving Interfacial Protein Dynamics by STReM

An interview with Prof. Christy Landes, conducted by Stuart Milne, BA.

Can you give a brief introduction to your work in single-molecule spectroscopy, and how it can help examine the link between structure and function in biological molecules?

I’ve been working in single-molecule spectroscopy since my postdoc at UT Austin in 2004. My postdoc training threw me straight into the deep end. I showed up at UT and my boss said, “You’re going to work on this protein that is important for HIV’s virulence”, and I had to go and learn what a virus was, and then I had to learn what single-molecule spectroscopy was.

The moment that I really got into it, I understood, first of all, how different biomolecules and biopolymers are from normal molecules.

For any material that exhibits intrinsic structural or dynamic heterogeneity, only single molecule methods make it possible to directly correlate conformational changes with dynamic information like reaction rate constants or product specificity. Ensemble methods would average out all of those properties.

The clearest example is an enzyme that switches between bound (active) and unbound (inactive) conformations in the presence of its target. If we measured using standard methods, we could extract, at best, the average conformation (which would be something in between the bound and unbound forms, and neither useful nor accurate) and the equilibrium binding constant. With single-molecule methods, in the best case we could extract each conformation and the rate constants for the forward and reverse processes.
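
As a rough illustration of that difference, here is a minimal sketch (the rate constants are invented for the example, and this is not code or data from the Landes lab): a simulated two-state trajectory recovers the forward and reverse rate constants from the dwell times in each state, whereas an ensemble average would only give their ratio.

```python
# Hypothetical sketch: why single-molecule trajectories recover forward/reverse
# rate constants that an ensemble average hides. Rates are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
k_on, k_off = 2.0, 0.5          # assumed "true" rate constants (per second)

def simulate_trajectory(n_transitions=5000):
    """Simulate a two-state telegraph signal; return dwell times per state."""
    dwells = {"unbound": [], "bound": []}
    state = "unbound"
    for _ in range(n_transitions):
        rate = k_on if state == "unbound" else k_off
        dwells[state].append(rng.exponential(1.0 / rate))  # exponential dwell time
        state = "bound" if state == "unbound" else "unbound"
    return dwells

dwells = simulate_trajectory()
k_on_est = 1.0 / np.mean(dwells["unbound"])   # rate = 1 / mean dwell time
k_off_est = 1.0 / np.mean(dwells["bound"])

# An ensemble experiment only sees the average occupancy, i.e. K_eq = k_on / k_off.
K_eq = k_on_est / k_off_est
print(f"k_on ~ {k_on_est:.2f}/s, k_off ~ {k_off_est:.2f}/s, K_eq ~ {K_eq:.2f}")
```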

As a spectroscopist, if I’m studying, say, acetone, every molecule of acetone is exactly the same. If you’re studying vibrations, there’s obviously a thermal distribution of vibrational energies among all the acetone molecules in a bottle or a vial. But if you’re studying biopolymers or biomolecules like proteins or DNA, their structure is constantly changing depending on their targets. You can’t capture the intricacies of those structural changes as a function of the reaction happening with standard spectroscopy. All you ever get is an average.

It’s very similar if you want to understand the structure of a protein. Protein crystallography is incredibly powerful, but it involves freezing the protein into what is usually considered the most probable conformation, which is one of the low-energy structures that the protein can form. But if you want to actually understand how the protein performs its function, that function is intimately related to its structure and to how the structure changes. To see that dynamically, the only way to do it, really, is with single-molecule spectroscopy.

What first interested you about single-molecule spectroscopy, and how did you come to work in the field?

I became fascinated with the challenges associated with the method, but also its strengths and how it fundamentally links statistical mechanics to experimental observables. With ensemble methods, you measure average values and extract statistical relevance, but with single molecule methods, it is possible to design an experiment in which the probability distribution is your direct observable.

What are the main challenges in measuring the fluorescence signals from a single molecule, or a single chemical reaction event?

I’m a spectroscopist at heart and so of course learning something new and unprecedented about a biomolecule is important and interesting. But the thing that really gets me excited is tackling the challenges that are associated with doing these types of measurements and figuring out new and interesting ways to solve those problems.

There are three main challenges. The first is signal. In single-molecule fluorescence spectroscopy or microscopy, the quantum yield of a fluorescent molecule can approach 100%, but if you’re only measuring the fluorescence from a single molecule, that is still very few photon counts per unit of time and space. Collecting those photons accurately and precisely, and distinguishing the emitted photons from the noise and background that accompany every type of measurement, is a very interesting and challenging problem that single-molecule spectroscopists are working on.

The second is data processing: most measurements involve high volumes of data that are almost empty of information. Twenty, even ten years ago, a spectroscopist would work all day and all night aligning the laser so that they could collect a few data points at 3:00 in the morning, and be thankful for the 10 data points they would then try to fit to an exponential or a Gaussian function. Single-molecule spectroscopy has advanced so quickly that we’re able to collect terabytes of data in just a couple of hours, volumes that weren’t even possible to store just a few years ago.

Almost by definition, most of your sample has to be nothing but background in order to get single-molecule resolution. Processing a terabyte of data in which most of the bits are background, but hidden within which are small amounts of incredibly rich detail, is a very interesting information theory problem. We’re active in developing new data analysis methods based on information theoretic principles.

The third is that labelling and handling single biomolecules introduces the possibility for the biomolecule to alter the label’s quantum yield and for the tagging molecule to alter the biofunctionality. We can’t totally solve this problem, but we must minimize it and quantify it, which involves a lot of tedious but important control measurements.

How has the hardware and/or software influenced the ability to process these large data sets?

I really think it’s both, at least in my opinion. One of the reasons this is possible is that data storage is cheap now. I remember how much I prized my floppy disk. I kept rewriting it and reformatting it to make sure that I would have my 1.44 megabytes’ worth of information. A lot of this comes from incredible advances in hardware, but then there’s also the data-processing piece, which is what I’m really excited about.

Everyone throws around the term big data, but it really is true: we can now collect and save such huge volumes of data that we need to come up with new ways to process it. Actually, I think in the future we will need to come up with new ways to collect it. The ideal spectroscopy experiment is adaptive, in that you use smart algorithms while you’re collecting the data.

If you can get to the point where the software and algorithms are intelligent enough to do what you want without you even asking, then processing the data becomes even quicker.

I think in the future collection software will be smart enough to know when it has a good molecule and needs better spatial resolution and better time resolution and will adapt to collect better, finer grained information when you need it, and coarser grained information when you don’t.

We’re working on using some machine learning to teach our collection software when things look good and when things don’t look good, when we need better resolution and when we don’t.

Your talk at Pittcon 2018, “Resolving Interfacial Protein Dynamics by Super Temporal-Resolved Microscopy”, will revolve around this technique – can you explain what STReM is, and how it allows you to observe the dynamics of single molecules?

STReM stands for Super Temporal-Resolved Microscopy, and just as STORM, PALM, and other methods are designed to improve the spatial resolution of optical microscopy, we want to improve the time resolution. STReM makes use of point spread function engineering to encode fast events into each camera frame.

To some extent, it’s very related to some of the data science principles I mentioned above. The double-helix point spread function method for 3D super-resolution microscopy was developed by a group at the University of Colorado, and now lots of other people are designing their own phase masks and point spread functions. The thing we all have in common is that signal processing principles are at the center.

One of the things that happens in signal processing is that you go back and forth between real space and Fourier space all the time, because a lot of, say, noise filtering happens in Fourier space: you can separate noise from signal in Fourier space in a way that you can’t in real space. That’s why, for example, FT-IR or FT-NMR is so much better than regular IR or regular NMR.
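
As a simple illustration of that point (a generic sketch, not the group’s code; the signal frequency, cutoff, and noise level are invented), a noisy trace can be cleaned by zeroing high-frequency components in Fourier space, a separation that is much harder to do point by point in real space.

```python
# Minimal sketch of Fourier-space noise filtering with invented parameters.
import numpy as np

t = np.linspace(0, 1, 2048, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)                      # slow "real" signal at 5 Hz
noisy = clean + 0.8 * np.random.default_rng(1).normal(size=t.size)

spectrum = np.fft.rfft(noisy)
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
spectrum[freqs > 20] = 0                               # discard everything above 20 Hz
filtered = np.fft.irfft(spectrum, n=t.size)

print("RMS error before filtering:", np.sqrt(np.mean((noisy - clean) ** 2)))
print("RMS error after filtering: ", np.sqrt(np.mean((filtered - clean) ** 2)))
```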

What these guys at Colorado recognized is that if they manipulated the phase of the light from their microscope, they could compress three-dimensional information about their sample into the two-dimensional point spread function that they collected on their camera at the image plane. It’s super smart. You’re doing exactly the same measurements with exactly the same laser or lamp on exactly the same sample in the microscope.

The only difference is that you allow yourself some room at the detection end of your microscope to manipulate the phase of the light, such that your point spread function is no longer an Airy disk but this two-lobed double-helix point spread function, and the angle of that point spread function tells you the three-dimensional position of your emitter in the sample even though you’re only collecting a 2D image. It’s genius. It’s super cool.

STReM is built on that principle. I’m interested in dynamics. I want to see how proteins interact with interfaces, and a lot of the time that happens faster than my camera frame. Cameras are great and they’re getting better all the time, but a camera is very expensive, and you can’t buy a new camera every six months to get slightly better time resolution. What we realized we could do is manipulate the phase of the fluorescence light to get better time resolution.

In other words, very similar to the double-helix 3D method, we are encoding faster information into a slower image. The point spread function is no longer a Gaussian or an Airy disk. It’s these two lobes, and now the two lobes rotate around the center during the exposure, and the angle tells you the arrival time of your molecule faster than the camera frame. You can get 20 times better time resolution with the same old camera that everybody uses.
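
A toy sketch of that encoding idea (not the group’s actual reconstruction code; the frame time, rotation rate, and sub-bin count are assumed for illustration) shows how a measured lobe angle maps back to an arrival time finer than the camera frame.

```python
# Toy sketch of the STReM idea: the phase mask makes the two-lobed PSF rotate at a
# known rate during a camera exposure, so the measured lobe angle maps back to an
# arrival time finer than the frame time. Assumed numbers: 100 ms frame, one
# 180-degree rotation per frame, 20 effective sub-bins.
import numpy as np

FRAME_TIME = 0.100          # seconds per camera frame (assumed)
ROTATION_PER_FRAME = np.pi  # PSF rotates pi radians over one frame (assumed)
N_SUB_BINS = 20             # assumed effective gain in time resolution

def arrival_time_from_angle(lobe_angle, frame_index):
    """Convert a measured lobe angle (radians, 0..pi) into an absolute arrival time."""
    fraction = (lobe_angle % ROTATION_PER_FRAME) / ROTATION_PER_FRAME
    return frame_index * FRAME_TIME + fraction * FRAME_TIME

def time_resolution():
    """Effective temporal bin width once the angle is quantized into sub-bins."""
    return FRAME_TIME / N_SUB_BINS

# Example: a molecule whose PSF sits at 45 degrees in frame 3 arrived ~25% into it.
print(arrival_time_from_angle(np.deg2rad(45), frame_index=3))  # -> 0.325 s
print(time_resolution())                                        # -> 0.005 s
```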

What molecules/reactions have you been able to observe so far using this technique?

We are interested in understanding protein dynamics at polymer interfaces for the purpose of driving predictive protein separations. The improved time resolution of STReM is revealing intermediates that are suggestive of adsorption-mediated conformational rearrangements.

Protein separations are getting increasingly important as the pharmaceutical industry moves away from small organic molecules as therapeutics. The industry is moving towards therapeutics that are often proteins or peptides, or sometimes DNA or RNA aptamer-based. Separating a protein therapeutic from all the other material in the growth medium is hugely time-intensive, because at the moment there’s no way to optimize a separation predictively. It’s done empirically: there’s a bunch of empirical parameters that are constantly tweaked to try to optimize it.

Our long-term goal is to find a way to be predictive about protein separations. What that means, underneath, is being able to quantify exactly what a protein does at that membrane interface in terms of its structure, its chemistry, its physics, and how it interacts with other proteins in a competitive manner. Some of those things are faster than our camera can measure. Ideally we want to catch everything, and that’s why we need faster time resolution.

How does STReM compare to other single-molecule techniques you’ve employed before, such as FRET or super-resolution microscopy?

STReM is compatible with a wide range of super-resolution methods, in that it is possible to combine 2D or 3D super-localization with the faster time resolution of STReM. The remaining challenge is fitting, or localizing, the more complicated point spread functions.

FRET is a way to measure distance. You would like to understand how protein structures change on the 1 to 10 nanometer scale, but we still don’t have a microscope that can resolve 1 to 10 nanometers. Essentially, you take advantage of a dipole-induced dipole interaction that just happens to operate over that length scale, and the math of that energy transfer process actually gets you the distance, so you don’t have to resolve it. FRET is also a math trick, to me anyway.
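
For reference, the standard Förster relation behind that “math trick” can be written in a couple of lines (the Förster radius R0 used here is an assumed, typical value for a dye pair, not a value from the interview).

```python
# Minimal sketch of the standard Förster relation: the measured transfer efficiency E
# gives the donor-acceptor distance r without resolving it optically.
# R0 is dye-pair specific; 5 nm is an assumed, typical value.
def fret_efficiency(r_nm, r0_nm=5.0):
    """Transfer efficiency for a donor-acceptor pair separated by r_nm nanometers."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

def distance_from_efficiency(e, r0_nm=5.0):
    """Invert the efficiency to recover the separation distance in nanometers."""
    return r0_nm * (1.0 / e - 1.0) ** (1.0 / 6.0)

e = fret_efficiency(4.0)            # a pair 4 nm apart transfers ~79% of the energy
print(e)                            # ~0.79
print(distance_from_efficiency(e))  # recovers ~4.0 nm
```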

The STReM technique, because it involves fluorescence, is very compatible with FRET or any other type of super-resolution microscopy. The only challenge is that traditional microscopy approximates the Airy point spread function you get at the image plane as a Gaussian, and it’s easy to fit a Gaussian.

Any undergrad with Excel can fit a Gaussian and identify the center and width of the peak, which is what you need for super-resolution microscopy, and it’s also what you need for FRET, for that matter. The point spread functions you get with these more complicated techniques, like STReM or the 3D double helix, are more complicated to fit. Again, the math is more interesting and challenging.
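
For comparison, the routine Gaussian localization step looks like this in practice (synthetic data and parameters, for illustration only): fit a Gaussian to a line profile through the spot and read off its center and width.

```python
# Standard Gaussian fit alluded to in the interview (synthetic data, illustrative only):
# localizing a PSF center amounts to fitting a Gaussian and reading off its center.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, center, width, offset):
    return amplitude * np.exp(-((x - center) ** 2) / (2 * width ** 2)) + offset

# Synthetic line profile through a diffraction-limited spot (pixels), plus shot noise.
x = np.arange(0, 21, dtype=float)
rng = np.random.default_rng(2)
profile = gaussian(x, amplitude=100, center=10.3, width=1.8, offset=5)
profile = rng.poisson(profile).astype(float)

popt, _ = curve_fit(gaussian, x, profile,
                    p0=[profile.max(), x.mean(), 2.0, profile.min()])
amplitude, center, width, offset = popt
print(f"fitted center = {center:.2f} px, width = {abs(width):.2f} px")
```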

You could do STReM with FRET. You could do STReM with PALM or STORM. People are using that double helix with STORM, I think, to get 3D images, but your point spread function fitting and localization becomes more complicated.

What are the next steps for your own research? And how do you think your findings will contribute to a better understanding in the wider field of biochemistry?

Our big-picture goal is to achieve enough mechanistic understanding of protein dynamics at polymer interfaces to transition to a predictive theory of protein chromatography. Protein separation and purification accounts for billions of dollars every year in industrial and bench-scale efforts, because each separation must be optimized empirically.

Where can our readers learn more about your work?

Our newest work on protein dynamics is still under review, but our paper introducing STReM can be found here:

How important is Pittcon as an event for people like yourself to share your research and collaborate on new ideas?

In the US, for general science and, I would say, medical science, Pittcon is the place to go to get information about the newest ways to measure things. It’s the only meeting that’s all about measurement. If I have something I need to measure, I go to Pittcon to learn the newest ways to measure it. If I have a new way to measure, I go to Pittcon to make sure that everybody knows I have a new instrument or method to measure something. It’s the only place.

About Christy F. Landes

Christy F. Landes is a Professor in the Departments of Chemistry and Electrical and Computer Engineering at Rice University in Houston, TX. After graduating from George Mason University in 1998, she completed a Ph.D. in Physical Chemistry from the Georgia Institute of Technology in 2003 under the direction of National Academy member Prof. Mostafa El-Sayed.

She was a postdoctoral researcher at the University of Oregon and an NIH postdoctoral fellow at the University of Texas at Austin, under the direction of National Academy members Prof. Geraldine Richmond and Prof. Paul Barbara, respectively, before joining the University of Houston as an assistant professor in 2006.

She moved to her current position at Rice in 2009, earning an NSF CAREER award for her tenure-track work in 2011 and the ACS Early Career Award in Experimental Physical Chemistry in 2016.
