MIT: Nanoparticles Take a “Fantastic (Magnetic) Voyage” – Helping Drug-Delivery Nanoparticles Reach Their Targets (with Video)


MIT engineers have designed a magnetic microrobot that can help push drug-delivery particles into tumor tissue (left). They also employed swarms of naturally magnetic bacteria to achieve the same effect (right). Image courtesy of the researchers.

Tiny robots powered by magnetic fields could help drug-delivery nanoparticles reach their targets.

MIT engineers have designed tiny robots that can help drug-delivery nanoparticles push their way out of the bloodstream and into a tumor or another disease site. Like crafts in “Fantastic Voyage” — a 1960s science fiction film in which a submarine crew shrinks in size and roams a body to repair damaged cells — the robots swim through the bloodstream, creating a current that drags nanoparticles along with them.

The magnetic microrobots, inspired by bacterial propulsion, could help to overcome one of the biggest obstacles to delivering drugs with nanoparticles: getting the particles to exit blood vessels and accumulate in the right place.

“When you put nanomaterials in the bloodstream and target them to diseased tissue, the biggest barrier to that kind of payload getting into the tissue is the lining of the blood vessel,” says Sangeeta Bhatia, the John and Dorothy Wilson Professor of Health Sciences and Technology and Electrical Engineering and Computer Science, a member of MIT’s Koch Institute for Integrative Cancer Research and the Institute for Medical Engineering and Science, and the senior author of the study.

“Our idea was to see if you can use magnetism to create fluid forces that push nanoparticles into the tissue,” adds Simone Schuerle, a former MIT postdoc and lead author of the paper, which appears in the April 26 issue of Science Advances.

In the same study, the researchers also showed that they could achieve a similar effect using swarms of living bacteria that are naturally magnetic. Each of these approaches could be suited for different types of drug delivery, the researchers say.

Tiny robots

Schuerle, who is now an assistant professor at the Swiss Federal Institute of Technology (ETH Zurich), first began working on tiny magnetic robots as a graduate student in Brad Nelson’s Multi-scale Robotics Lab at ETH Zurich. When she came to Bhatia’s lab as a postdoc in 2014, she began investigating whether this kind of bot could help to make nanoparticle drug delivery more efficient.

In most cases, researchers target their nanoparticles to disease sites that are surrounded by “leaky” blood vessels, such as tumors. This makes it easier for the particles to get into the tissue, but the delivery process is still not as effective as it needs to be.

The MIT team decided to explore whether the forces generated by magnetic robots might offer a better way to push the particles out of the bloodstream and into the target site.

The robots that Schuerle used in this study are 0.35 millimeters long, similar in size to a single cell, and can be steered by applying an external magnetic field. The bio-inspired design, which the researchers call an “artificial bacterial flagellum,” consists of a tiny helix that resembles the flagella many bacteria use to propel themselves. The robots are printed with a high-resolution 3-D printer and then coated with nickel, which makes them magnetic.

To test a single robot’s ability to control nearby nanoparticles, the researchers created a microfluidic system that mimics the blood vessels that surround tumors. The channel in their system, between 50 and 200 microns wide, is lined with a gel that has holes to simulate the broken blood vessels seen near tumors.

Using external magnets, the researchers applied a magnetic field to the robot, making the helix rotate and swim through the channel. Because fluid flows through the channel in the opposite direction, the robot remains stationary and creates a convection current, which pushes 200-nanometer polystyrene particles into the model tissue. These particles penetrated twice as far into the tissue as nanoparticles delivered without the aid of the magnetic robot.

This type of system could potentially be incorporated into stents, which are stationary and would be easy to target with an externally applied magnetic field. Such an approach could be useful for delivering drugs to help reduce inflammation at the site of the stent, Bhatia says.

Bacterial swarms

The researchers also developed a variant of this approach that relies on swarms of naturally magnetotactic bacteria instead of microrobots. Bhatia has previously developed bacteria that can be used to deliver cancer-fighting drugs and to diagnose cancer, exploiting bacteria’s natural tendency to accumulate at disease sites.

For this study, the researchers used a type of bacteria called Magnetospirillum magneticum, which naturally produces chains of iron oxide. These magnetic particles, known as magnetosomes, help bacteria orient themselves and find their preferred environments.

The researchers discovered that when they put these bacteria into the microfluidic system and applied rotating magnetic fields in certain orientations, the bacteria began to rotate in synchrony and move in the same direction, pulling along any nanoparticles that were nearby. In this case, the researchers found that nanoparticles were pushed into the model tissue three times faster than when the nanoparticles were delivered without any magnetic assistance.

This bacterial approach could be better suited for drug delivery in situations such as a tumor, where the swarm, controlled externally without the need for visual feedback, could generate fluidic forces in vessels throughout the tumor.

The particles that the researchers used in this study are big enough to carry large payloads, including the components required for the CRISPR genome-editing system, Bhatia says. She now plans to collaborate with Schuerle to further develop both of these magnetic approaches for testing in animal models.

The research was funded by the Swiss National Science Foundation, the Branco Weiss Fellowship, the National Institutes of Health, the National Science Foundation, and the Howard Hughes Medical Institute.



MIT Review: Borophene (not graphene) is the new wonder material that’s got everyone excited


Stronger and more flexible than graphene, a single-atom layer of boron could revolutionize sensors, batteries, and catalytic chemistry.

Not so long ago, graphene was the great new wonder material. A super-strong, atom-thick sheet of carbon “chicken wire,” it can form tubes, balls, and other curious shapes.

And because it conducts electricity, materials scientists raised the prospect of a new era of graphene-based computer processing and a lucrative graphene chip industry to boot. The European Union invested €1 billion to kick-start a graphene industry.

This brave new graphene-based world has yet to materialize. But it has triggered an interest in other two-dimensional materials. And the most exciting of all is borophene: a single layer of boron atoms that form various crystalline structures.

The reason for the excitement is the extraordinary range of applications that borophene looks good for. Electrochemists think borophene could become the anode material in a new generation of more powerful lithium-ion batteries.

Read More: Borophene Discoveries at Rice University

Chemists are entranced by its catalytic capabilities. And physicists are testing its abilities as a sensor to detect numerous kinds of atoms and molecules.

Today, Zhi-Qiang Wang at Xiamen University in China and a number of colleagues review the remarkable properties of borophene and the applications they might lead to.

Borophene has a short history. Physicists first predicted its existence in the 1990s using computer simulations to show how boron atoms could form a monolayer.

But this exotic substance wasn’t synthesized until 2015, using chemical vapor deposition. This is a process in which a hot gas of boron atoms condenses onto a cool surface of pure silver.

The regular arrangement of silver atoms forces boron atoms into a similar pattern, each binding to as many as six other atoms to create a flat hexagonal structure. However, a significant proportion of boron atoms bind only with four or five other atoms, and this creates vacancies in the structure. The pattern of vacancies is what gives borophene crystals their unique properties.

Since borophene’s synthesis, chemists have been eagerly characterizing its properties. Borophene turns out to be stronger than graphene, and more flexible. It is a good conductor of both electricity and heat, and it is also a superconductor. These properties vary depending on the material’s orientation and the arrangement of vacancies. This makes it “tunable,” at least in principle. That’s one reason chemists are so excited.

Borophene is also light and fairly reactive. That makes it a good candidate for storing metal ions in batteries. “Borophene is a promising anode material for Li, Na, and Mg ion batteries due to high theoretical specific capacities, excellent electronic conductivity and outstanding ion transport properties,” say Wang and co.

Hydrogen atoms also stick easily to borophene’s single-layer structure, and this adsorption property, combined with the huge surface area of atomic layers, makes borophene a promising material for hydrogen storage. Theoretical studies suggest borophene could store over 15% of its weight in hydrogen, significantly outperforming other materials.

Then there is borophene’s ability to catalyze the breakdown of molecular hydrogen into hydrogen ions, and water into hydrogen and oxygen ions.

“Outstanding catalytic performances of borophene have been found in hydrogen evolution reaction, oxygen reduction reaction, oxygen evolution reaction, and CO2 electroreduction reaction,” say the team. That could usher in a new era of water-based energy cycles.

Nevertheless, chemists have some work to do before borophene can be more widely used. For a start, they have yet to find a way to make borophene in large quantities.

And the material’s reactivity means it is vulnerable to oxidation, so it needs to be carefully protected. Both factors make borophene expensive to make and hard to handle. So there is work ahead.

But chemists have great faith. Borophene may just become the next wonder material to entrance the world.

Ref: arxiv.org/abs/1903.11304: “Review of borophene and its potential applications”

From MIT Technology Review March 2019

 

MIT’s 10 Breakthrough Technologies for 2019 – Part II, with Guest Curator Bill Gates



This is Part II of MIT’s “10 Breakthrough Technologies for 2019,” re-posted from MIT Technology Review, with guest curator Bill Gates. You can read Part I here.

Part I Intro from Bill Gates: How We’ll Invent the Future

I was honored when MIT Technology Review invited me to be the first guest curator of its 10 Breakthrough Technologies. Narrowing down the list was difficult. I wanted to choose things that not only will create headlines in 2019 but also capture this moment in technological history—which got me thinking about how innovation has evolved over time.

 

Robot dexterity

NICOLAS ORTEGA

  • Why it matters If robots could learn to deal with the messiness of the real world, they could do many more tasks.
  • Key Players OpenAI
    Carnegie Mellon University
    University of Michigan
    UC Berkeley
  • Availability 3-5 years

Robots are teaching themselves to handle the physical world.

For all the talk about machines taking jobs, industrial robots are still clumsy and inflexible. A robot can repeatedly pick up a component on an assembly line with amazing precision and without ever getting bored—but move the object half an inch, or replace it with something slightly different, and the machine will fumble ineptly or paw at thin air.

But while a robot can’t yet be programmed to figure out how to grasp any object just by looking at it, as people do, it can now learn to manipulate the object on its own through virtual trial and error.

One such project is Dactyl, a robot that taught itself to flip a toy building block in its fingers. Dactyl, which comes from the San Francisco nonprofit OpenAI, consists of an off-the-shelf robot hand surrounded by an array of lights and cameras. Using what’s known as reinforcement learning, neural-network software learns how to grasp and turn the block within a simulated environment before the hand tries it out for real. The software experiments, randomly at first, strengthening connections within the network over time as it gets closer to its goal.

It usually isn’t possible to transfer that type of virtual practice to the real world, because things like friction or the varied properties of different materials are so difficult to simulate. The OpenAI team got around this by adding randomness to the virtual training, giving the robot a proxy for the messiness of reality.
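The trick described above is known as domain randomization: instead of training in one fixed simulator, each practice episode samples slightly different physics, so the learned policy has to work across the whole family of worlds. A minimal sketch of the idea follows; the parameter names and ranges are illustrative placeholders, not OpenAI's actual training values.

```python
import random

def randomized_sim_params():
    """Sample a fresh physics configuration for one training episode.

    The ranges below are invented for illustration; the point is only
    that the policy never trains against a single fixed simulator.
    """
    return {
        "friction":     random.uniform(0.5, 1.5),   # surface friction multiplier
        "object_mass":  random.uniform(0.03, 0.30), # block mass in kg
        "motor_gain":   random.uniform(0.8, 1.2),   # actuator strength multiplier
        "sensor_noise": random.uniform(0.0, 0.05),  # observation noise std-dev
    }

# Each reinforcement-learning episode runs in a differently perturbed
# world, so a grasping policy that succeeds everywhere cannot overfit
# to the quirks of any one simulation of reality.
for episode in range(3):
    params = randomized_sim_params()
    assert 0.5 <= params["friction"] <= 1.5
```

In the real system, the sampled parameters would configure the physics engine before each simulated grasp attempt; here they are simply returned as a dictionary.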

We’ll need further breakthroughs for robots to master the advanced dexterity needed in a real warehouse or factory. But if researchers can reliably employ this kind of learning, robots might eventually assemble our gadgets, load our dishwashers, and even help Grandma out of bed. —Will Knight

New-wave nuclear power

BOB MUMGAARD/PLASMA SCIENCE AND FUSION CENTER/MIT

Advanced fusion and fission reactors are edging closer to reality. 

New nuclear designs that have gained momentum in the past year are promising to make this power source safer and cheaper. Among them are generation IV fission reactors, an evolution of traditional designs; small modular reactors; and fusion reactors, a technology that has seemed eternally just out of reach. Developers of generation IV fission designs, such as Canada’s Terrestrial Energy and Washington-based TerraPower, have entered into R&D partnerships with utilities, aiming for grid supply (somewhat optimistically, maybe) by the 2020s.

Small modular reactors typically produce in the tens of megawatts of power (for comparison, a traditional nuclear reactor produces around 1,000 MW). Companies like Oregon’s NuScale say the miniaturized reactors can save money and reduce environmental and financial risks.

There has even been progress on fusion. Though no one expects delivery before 2030, companies like General Fusion and Commonwealth Fusion Systems, an MIT spinout, are making some headway. Many consider fusion a pipe dream, but because the reactors can’t melt down and don’t create long-lived, high-level waste, it should face much less public resistance than conventional nuclear. (Bill Gates is an investor in TerraPower and Commonwealth Fusion Systems.) —Leigh Phillips

NENOV | GETTY

Predicting preemies

  • Why it matters 15 million babies are born prematurely every year; it’s the leading cause of death for children under age five
  • Key player Akna Dx
  • Availability A test could be offered in doctors’ offices within five years

A simple blood test can predict if a pregnant woman is at risk of giving birth prematurely.

Our genetic material lives mostly inside our cells. But small amounts of “cell-free” DNA and RNA also float in our blood, often released by dying cells. In pregnant women, that cell-free material is an alphabet soup of nucleic acids from the fetus, the placenta, and the mother.

Stephen Quake, a bioengineer at Stanford, has found a way to use that to tackle one of medicine’s most intractable problems: the roughly one in 10 babies born prematurely.

Free-floating DNA and RNA can yield information that previously required invasive ways of grabbing cells, such as taking a biopsy of a tumor or puncturing a pregnant woman’s belly to perform an amniocentesis. What’s changed is that it’s now easier to detect and sequence the small amounts of cell-free genetic material in the blood. In the last few years researchers have begun developing blood tests for cancer (by spotting the telltale DNA from tumor cells) and for prenatal screening of conditions like Down syndrome.

The tests for these conditions rely on looking for genetic mutations in the DNA. RNA, on the other hand, is the molecule that regulates gene expression—how much of a protein is produced from a gene. By sequencing the free-floating RNA in the mother’s blood, Quake can spot fluctuations in the expression of seven genes that he singles out as associated with preterm birth. That lets him identify women likely to deliver too early. Once alerted, doctors can take measures to stave off an early birth and give the child a better chance of survival.

The technology behind the blood test, Quake says, is quick, easy, and less than $10 a measurement. He and his collaborators have launched a startup, Akna Dx, to commercialize it. —Bonnie Rochman

BRUCE PETERSON

Gut probe in a pill

  • Why it matters The device makes it easier to screen for and study gut diseases, including one that keeps millions of children in poor countries from growing properly

  • Key player Massachusetts General Hospital
  • Availability Now used in adults; testing in infants begins in 2019

A small, swallowable device captures detailed images of the gut without anesthesia, even in infants and children.

Environmental enteric dysfunction (EED) may be one of the costliest diseases you’ve never heard of. Marked by inflamed intestines that are leaky and absorb nutrients poorly, it’s widespread in poor countries and is one reason why many people there are malnourished, have developmental delays, and never reach a normal height. No one knows exactly what causes EED and how it could be prevented or treated.

Practical screening to detect it would help medical workers know when to intervene and how. Therapies are already available for infants, but diagnosing and studying illnesses in the guts of such young children often requires anesthetizing them and inserting a tube called an endoscope down the throat. It’s expensive, uncomfortable, and not practical in areas of the world where EED is prevalent.

So Guillermo Tearney, a pathologist and engineer at Massachusetts General Hospital (MGH) in Boston, is developing small devices that can be used to inspect the gut for signs of EED and even obtain tissue biopsies. Unlike endoscopes, they are simple to use at a primary care visit.

Tearney’s swallowable capsules contain miniature microscopes. They’re attached to a flexible string-like tether that provides power and light while sending images to a briefcase-like console with a monitor. This lets the health-care worker pause the capsule at points of interest and pull it out when finished, allowing it to be sterilized and reused. (Though it sounds gag-inducing, Tearney’s team has developed a technique that they say doesn’t cause discomfort.) It can also carry technologies that image the entire surface of the digestive tract at the resolution of a single cell or capture three-dimensional cross sections a couple of millimeters deep.

The technology has several applications; at MGH it’s being used to screen for Barrett’s esophagus, a precursor of esophageal cancer. For EED, Tearney’s team has developed an even smaller version for use in infants who can’t swallow a pill. It’s been tested on adolescents in Pakistan, where EED is prevalent, and infant testing is planned for 2019.

The little probe will help researchers answer questions about EED’s development—such as which cells it affects and whether bacteria are involved—and evaluate interventions and potential treatments. —Courtney Humphries

PAPER BOAT CREATIVE | GETTY

Custom cancer vaccines

  • Why it matters Conventional chemotherapies take a heavy toll on healthy cells and aren’t always effective against tumors
  • Key players BioNTech
    Genentech
  • Availability In human testing

The treatment incites the body’s natural defenses to destroy only cancer cells by identifying mutations unique to each tumor.

Scientists are on the cusp of commercializing the first personalized cancer vaccine. If it works as hoped, the vaccine, which triggers a person’s immune system to identify a tumor by its unique mutations, could effectively shut down many types of cancers.

By using the body’s natural defenses to selectively destroy only tumor cells, the vaccine, unlike conventional chemotherapies, limits damage to healthy cells. The attacking immune cells could also be vigilant in spotting any stray cancer cells after the initial treatment.

The possibility of such vaccines began to take shape in 2008, five years after the Human Genome Project was completed, when geneticists published the first sequence of a cancerous tumor cell.

Soon after, investigators began to compare the DNA of tumor cells with that of healthy cells—and other tumor cells. These studies confirmed that all cancer cells contain hundreds if not thousands of specific mutations, most of which are unique to each tumor.

A few years later, a German startup called BioNTech provided compelling evidence that a vaccine containing copies of these mutations could catalyze the body’s immune system to produce T cells primed to seek out, attack, and destroy all cancer cells harboring them.

In December 2017, BioNTech began a large test of the vaccine in cancer patients, in collaboration with the biotech giant Genentech. The ongoing trial is targeting at least 10 solid cancers and aims to enroll upwards of 560 patients at sites around the globe.

The two companies are designing new manufacturing techniques to produce thousands of personally customized vaccines cheaply and quickly. That will be tricky because creating the vaccine involves performing a biopsy on the patient’s tumor, sequencing and analyzing its DNA, and rushing that information to the production site. Once produced, the vaccine needs to be promptly delivered to the hospital; delays could be deadly. —Adam Piore

BRUCE PETERSON/STYLING: MONICA MARIANO

The cow-free burger

  • Why it matters Livestock production causes catastrophic deforestation, water pollution, and greenhouse-gas emissions
  • Key players Beyond Meat
    Impossible Foods
  • Availability Plant-based now; lab-grown around 2020

Both lab-grown and plant-based alternatives approximate the taste and nutritional value of real meat without the environmental devastation.

The UN expects the world to have 9.8 billion people by 2050. And those people are getting richer. Neither trend bodes well for climate change—especially because as people escape poverty, they tend to eat more meat.

By that date, according to the predictions, humans will consume 70% more meat than they did in 2005. And it turns out that raising animals for human consumption is among the worst things we do to the environment.

Depending on the animal, producing a pound of meat protein with Western industrialized methods requires 4 to 25 times more water, 6 to 17 times more land, and 6 to 20 times more fossil fuels than producing a pound of plant protein.

The problem is that people aren’t likely to stop eating meat anytime soon. Which means lab-grown and plant-based alternatives might be the best way to limit the destruction.

Making lab-grown meat involves extracting muscle tissue from animals and growing it in bioreactors. The end product looks much like what you’d get from an animal, although researchers are still working on the taste. Researchers at Maastricht University in the Netherlands, who are working to produce lab-grown meat at scale, believe they’ll have a lab-grown burger available by next year. One drawback of lab-grown meat is that the environmental benefits are still sketchy at best—a recent World Economic Forum report says the emissions from lab-grown meat would be only around 7% less than emissions from beef production.

The better environmental case can be made for plant-based meats from companies like Beyond Meat and Impossible Foods (Bill Gates is an investor in both companies), which use pea proteins, soy, wheat, potatoes, and plant oils to mimic the texture and taste of animal meat.

Beyond Meat has a new 26,000-square-foot (2,400-square-meter) plant in California and has already sold upwards of 25 million burgers from 30,000 stores and restaurants. According to an analysis by the Center for Sustainable Systems at the University of Michigan, a Beyond Meat patty would probably generate 90% less in greenhouse-gas emissions than a conventional burger made from a cow. —Markkus Rovito

 

NICO ORTEGA

Carbon dioxide catcher

  • Why it matters Removing CO2 from the atmosphere might be one of the last viable ways to stop catastrophic climate change
  • Key players Carbon Engineering
    Climeworks
    Global Thermostat
  • Availability 5-10 years

 

Practical and affordable ways to capture carbon dioxide from the air can soak up excess greenhouse-gas emissions.

Even if we slow carbon dioxide emissions, the warming effect of the greenhouse gas can persist for thousands of years. To prevent a dangerous rise in temperatures, the UN’s climate panel now concludes, the world will need to remove as much as 1 trillion tons of carbon dioxide from the atmosphere this century.

In a surprise finding last summer, Harvard climate scientist David Keith calculated that machines could, in theory, pull this off for less than $100 a ton, through an approach known as direct air capture. That’s an order of magnitude cheaper than earlier estimates that led many scientists to dismiss the technology as far too expensive—though it will still take years for costs to fall to anywhere near that level.
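Combining the panel's removal target with Keith's cost estimate gives a sense of the scale involved; a minimal back-of-envelope calculation:

```python
# Back-of-envelope cost of direct air capture at the scale the
# UN climate panel describes: up to 1 trillion tons of CO2 this century.
tons_to_remove = 1e12        # metric tons of CO2
cost_per_ton = 100           # dollars, Keith's estimated lower bound
total_cost = tons_to_remove * cost_per_ton

print(f"${total_cost:,.0f}")  # $100,000,000,000,000 — on the order of $100 trillion
```

Even at the optimistic $100-a-ton figure, removing the full trillion tons would cost roughly $100 trillion over the century, which is why burying the captured CO2 with no revenue stream is such a hard business case.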

But once you capture the carbon, you still need to figure out what to do with it.

Carbon Engineering, the Canadian startup Keith cofounded in 2009, plans to expand its pilot plant to ramp up production of its synthetic fuels, using the captured carbon dioxide as a key ingredient. (Bill Gates is an investor in Carbon Engineering.)

Zurich-based Climeworks’s direct air capture plant in Italy will produce methane from captured carbon dioxide and hydrogen, while a second plant in Switzerland will sell carbon dioxide to the soft-drinks industry. So will Global Thermostat of New York, which finished constructing its first commercial plant in Alabama last year.

Still, if it’s used in synthetic fuels or sodas, the carbon dioxide will mostly end up back in the atmosphere. The ultimate goal is to lock greenhouse gases away forever. Some could be nested within products like carbon fiber, polymers, or concrete, but far more will simply need to be buried underground, a costly job that no business model seems likely to support.

In fact, pulling CO2 out of the air is, from an engineering perspective, one of the most difficult and expensive ways of dealing with climate change. But given how slowly we’re reducing emissions, there are no good options left. —James Temple

BRUCE PETERSON

An ECG on your wrist

Regulatory approval and technological advances are making it easier for people to continuously monitor their hearts with wearable devices.

Fitness trackers aren’t serious medical devices. An intense workout or loose band can mess with the sensors that read your pulse. But an electrocardiogram—the kind doctors use to diagnose abnormalities before they cause a stroke or heart attack—requires a visit to a clinic, and people often fail to take the test in time.

ECG-enabled smart watches, made possible by new regulations and innovations in hardware and software, offer the convenience of a wearable device with something closer to the precision of a medical one.

An Apple Watch–compatible band from Silicon Valley startup AliveCor that can detect atrial fibrillation, a frequent cause of blood clots and stroke, received clearance from the FDA in 2017. Last year, Apple released its own FDA-cleared ECG feature, embedded in the watch itself.

The health-device company Withings also announced plans for an ECG-equipped watch shortly after.

Current wearables still employ only a single sensor, whereas a real ECG has 12. And no wearable can yet detect a heart attack as it’s happening.

But this might change soon. Last fall, AliveCor presented preliminary results to the American Heart Association on an app and two-sensor system that can detect a certain type of heart attack. —Karen Hao

THEDMAN | GETTY

Sanitation without sewers

  • Why it matters 2.3 billion people lack safe sanitation, and many die as a result
  • Key players Duke University
    University of South Florida
    Biomass Controls
    California Institute of Technology
  • Availability 1-2 years

 

Energy-efficient toilets can operate without a sewer system and treat waste on the spot.

About 2.3 billion people don’t have good sanitation. The lack of proper toilets encourages people to dump fecal matter into nearby ponds and streams, spreading bacteria, viruses, and parasites that can cause diarrhea and cholera. Diarrhea causes one in nine child deaths worldwide.

Now researchers are working to build a new kind of toilet that’s cheap enough for the developing world and can not only dispose of waste but treat it as well.

In 2011 Bill Gates created what was essentially the X Prize in this area—the Reinvent the Toilet Challenge. Since the contest’s launch, several teams have put prototypes in the field. All process the waste locally, so there’s no need for large amounts of water to carry it to a distant treatment plant.

Most of the prototypes are self-contained and don’t need sewers, but they look like traditional toilets housed in small buildings or storage containers. The NEWgenerator toilet, designed at the University of South Florida, filters out pollutants with an anaerobic membrane, which has pores smaller than bacteria and viruses. Another project, from Connecticut-based Biomass Controls, is a refinery the size of a shipping container; it heats the waste to produce a carbon-rich material that can, among other things, fertilize soil.

One drawback is that the toilets don’t work at every scale. The Biomass Controls product, for example, is designed primarily for tens of thousands of users per day, which makes it less well suited for smaller villages. Another system, developed at Duke University, is meant to be used only by a few nearby homes.

So the challenge now is to make these toilets cheaper and more adaptable to communities of different sizes. “It’s great to build one or two units,” says Daniel Yeh, an associate professor at the University of South Florida, who led the NEWgenerator team. “But to really have the technology impact the world, the only way to do that is mass-produce the units.” —Erin Winick

BRUCE PETERSON

Smooth-talking AI assistants

  • Why it matters AI assistants can now perform conversation-based tasks like booking a restaurant reservation or coordinating a package drop-off rather than just obey simple commands
  • Key players Google
    Alibaba
    Amazon
  • Availability 1-2 years

 

New techniques that capture semantic relationships between words are making machines better at understanding natural language.

We’re used to AI assistants—Alexa playing music in the living room, Siri setting alarms on your phone—but they haven’t really lived up to their alleged smarts. They were supposed to have simplified our lives, but they’ve barely made a dent. They recognize only a narrow range of directives and are easily tripped up by deviations.

But some recent advances are about to expand your digital assistant’s repertoire. In June 2018, researchers at OpenAI developed a technique that trains an AI on unlabeled text to avoid the expense and time of categorizing and tagging all the data manually. A few months later, a team at Google unveiled a system called BERT that learned how to predict missing words by studying millions of sentences. In a multiple-choice test, it did as well as humans at filling in gaps.

These improvements, coupled with better speech synthesis, are letting us move from giving AI assistants simple commands to having conversations with them. They’ll be able to deal with daily minutiae like taking meeting notes, finding information, or shopping online.

Some are already here. Google Duplex, the eerily human-like upgrade of Google Assistant, can pick up your calls to screen for spammers and telemarketers. It can also make calls for you to schedule restaurant reservations or salon appointments.

In China, consumers are getting used to Alibaba’s AliMe, which coordinates package deliveries over the phone and haggles about the price of goods over chat.

But while AI programs have gotten better at figuring out what you want, they still can’t understand a sentence. Lines are scripted or generated statistically, reflecting how hard it is to imbue machines with true language understanding. Once we cross that hurdle, we’ll see yet another evolution, perhaps from logistics coordinator to babysitter, teacher—or even friend? —Karen Hao

MIT’s 10 Breakthrough Technologies for 2019 – Introduction by Bill Gates: Part I


In this two-part re-post of MIT Technology Review's 10 Breakthrough Technologies for 2019, guest curator Bill Gates was asked to choose this year's list of inventions that will change the world for the better.

Part I: Bill Gates: How we’ll Invent the Future

I was honored when MIT Technology Review invited me to be the first guest curator of its 10 Breakthrough Technologies. Narrowing down the list was difficult. I wanted to choose things that not only will create headlines in 2019 but capture this moment in technological history—which got me thinking about how innovation has evolved over time.

My mind went to—of all things—the plow. Plows are an excellent embodiment of the history of innovation. Humans have been using them since 4000 BCE, when Mesopotamian farmers aerated soil with sharpened sticks. We’ve been slowly tinkering with and improving them ever since, and today’s plows are technological marvels.

 

But what exactly is the purpose of a plow? It’s a tool that creates more: more seeds planted, more crops harvested, more food to go around. In places where nutrition is hard to come by, it’s no exaggeration to say that a plow gives people more years of life. The plow—like many technologies, both ancient and modern—is about creating more of something and doing it more efficiently, so that more people can benefit.

Contrast that with lab-grown meat, one of the innovations I picked for this year’s 10 Breakthrough Technologies list. Growing animal protein in a lab isn’t about feeding more people. There’s enough livestock to feed the world already, even as demand for meat goes up. Next-generation protein isn’t about creating more—it’s about making meat better. It lets us provide for a growing and wealthier world without contributing to deforestation or emitting methane. It also allows us to enjoy hamburgers without killing any animals.

Put another way, the plow improves our quantity of life, and lab-grown meat improves our quality of life. For most of human history, we’ve put most of our innovative capacity into the former. And our efforts have paid off: worldwide life expectancy rose from 34 years in 1913 to 60 in 1973 and has reached 71 today.

Because we’re living longer, our focus is starting to shift toward well-being. This transformation is happening slowly. If you divide scientific breakthroughs into these two categories—things that improve quantity of life and things that improve quality of life—the 2009 list looks not so different from this year’s. Like most forms of progress, the change is so gradual that it’s hard to perceive. It’s a matter of decades, not years—and I believe we’re only at the midpoint of the transition.

To be clear, I don’t think humanity will stop trying to extend life spans anytime soon. We’re still far from a world where everyone everywhere lives to old age in perfect health, and it’s going to take a lot of innovation to get us there. Plus, “quantity of life” and “quality of life” are not mutually exclusive. A malaria vaccine would both save lives and make life better for children who might otherwise have been left with developmental delays from the disease.

We’ve reached a point where we’re tackling both ideas at once, and that’s what makes this moment in history so interesting. If I had to predict what this list will look like a few years from now, I’d bet technologies that alleviate chronic disease will be a big theme. This won’t just include new drugs (although I would love to see new treatments for diseases like Alzheimer’s on the list). The innovations might look like a mechanical glove that helps a person with arthritis maintain flexibility, or an app that connects people experiencing major depressive episodes with the help they need.

If we could look even further out—let’s say the list 20 years from now—I would hope to see technologies that center almost entirely on well-being. I think the brilliant minds of the future will focus on more metaphysical questions: How do we make people happier? How do we create meaningful connections? How do we help everyone live a fulfilling life?

I would love to see these questions shape the 2039 list, because it would mean that we’ve successfully fought back disease (and dealt with climate change). I can’t imagine a greater sign of progress than that. For now, though, the innovations driving change are a mix of things that extend life and things that make it better. My picks reflect both. Each one gives me a different reason to be optimistic for the future, and I hope they inspire you, too.

My selections include amazing new tools that will one day save lives, from simple blood tests that predict premature birth to toilets that destroy deadly pathogens. I’m equally excited by how other technologies on the list will improve our lives. Wearable health monitors like the wrist-based ECG will warn heart patients of impending problems, while others let diabetics not only track glucose levels but manage their disease. Advanced nuclear reactors could provide carbon-free, safe, secure energy to the world.

One of my choices even offers us a peek at a future where society’s primary goal is personal fulfillment. Among many other applications, AI-driven personal agents might one day make your e-mail in-box more manageable—something that sounds trivial until you consider what possibilities open up when you have more free time.

The 30 minutes you used to spend reading e-mail could be spent doing other things. I know some people would use that time to get more work done—but I hope most would use it for pursuits like connecting with a friend over coffee, helping your child with homework, or even volunteering in your community.

That, I think, is a future worth working toward.


 

You can read Part II Here

MIT: How Tumors Behave on Acid: Acidic Environment Triggers Genes that Help Cancer Metastasize



In these tumor cells, acidic regions are labeled in red. Invasive regions of the cells, which express a protein called MMP14, are labeled in green. Image: Nazanin Rohani

Acidic environment triggers genes that help cancer cells metastasize.

Scientists have long known that tumors have many pockets of high acidity, usually found deep within the tumor where little oxygen is available. However, a new study from MIT researchers has found that tumor surfaces are also highly acidic, and that this acidity helps tumors to become more invasive and metastatic.

The study found that the acidic environment helps tumor cells to produce proteins that make them more aggressive. The researchers also showed that they could reverse this process in mice by making the tumor environment less acidic.

“Our findings reinforce the view that tumor acidification is an important driver of aggressive tumor phenotypes, and it indicates that methods that target this acidity could be of value therapeutically,” says Frank Gertler, an MIT professor of biology, a member of MIT’s Koch Institute for Integrative Cancer Research, and the senior author of the study.

Former MIT postdoc Nazanin Rohani is the lead author of the study, which appears in the journal Cancer Research.

Mapping acidity

Scientists usually attribute a tumor’s high acidity to the lack of oxygen, or hypoxia, that often occurs in tumors because they don’t have an adequate blood supply. However, until now, it has been difficult to precisely map tumor acidity and determine whether it overlaps with hypoxic regions.

In this study, the MIT team used a probe called pH (Low) Insertion Peptide (pHLIP), originally developed by researchers at the University of Rhode Island, to map the acidic regions of breast tumors in mice. This peptide is floppy at normal pH but becomes more stable at low, acidic pH. When this happens, the peptide can insert itself into cell membranes. This allows the researchers to determine which cells have been exposed to acidic conditions, by identifying cells that have been tagged with the peptide.

To their surprise, the researchers found that not only were cells in the oxygen-deprived interior of the tumor acidic, but there were also acidic regions at the boundary of the tumor and the structural tissue that surrounds it, known as the stroma.

“There was a great deal of tumor tissue that did not have any hallmarks of hypoxia that was quite clearly exposed to acidosis,” Gertler says. “We started looking at that, and we realized hypoxia probably wouldn’t explain the majority of regions of the tumor that were acidic.”

A new study explores how an acidic environment drives tumor spread.

Read More: How Does Tumor Acidity Help Cancer Spread

Further investigation revealed that many of the cells at the tumor surface had shifted to a type of cell metabolism known as aerobic glycolysis. This process generates lactic acid as a byproduct, which could account for the high acidity, Gertler says. The researchers also discovered that in these acidic regions, cells had turned on gene expression programs associated with invasion and metastasis. Nearly 3,000 genes showed pH-dependent changes in activity, and close to 300 displayed changes in how the genes are assembled, or spliced.

“Tumor acidosis gives rise to the expression of molecules involved in cell invasion and migration. This reprogramming, which is an intracellular response to a drop in extracellular pH, gives the cancer cells the ability to survive under low-pH conditions and proliferate,” Rohani says.

Those activated genes include Mena, which codes for a protein that normally plays a key role in embryonic development. Gertler’s lab had previously discovered that in some tumors, Mena is spliced differently, producing an alternative form of the protein known as MenaINV (invasive). This protein helps cells to migrate into blood vessels and spread through the body.

Another key protein that undergoes alternative splicing in acidic conditions is CD44, which also helps tumor cells to become more aggressive and break through the extracellular tissues that normally surround them. This study marks the first time that acidity has been shown to trigger alternative splicing for these two genes.

Reducing acidity

The researchers then decided to study how these genes would respond to decreasing the acidity of the tumor microenvironment. To do that, they added sodium bicarbonate to the mice’s drinking water. This treatment reduced tumor acidity and shifted gene expression closer to the normal state. In other studies, sodium bicarbonate has also been shown to reduce metastasis in mouse models.

Sodium bicarbonate would not be a feasible cancer treatment because it is not well-tolerated by humans, but other approaches that lower acidity could be worth exploring, Gertler says. Because the alternative splicing programs switched on by the acidic microenvironment help tumor cells survive, reversing those programs could perturb tumor growth and potentially metastasis.

“Other methods that would more focally target acidification could be of great value,” he says.

The research was funded by the Koch Institute Support (core) Grant from the National Cancer Institute, the Howard Hughes Medical Institute, the National Institutes of Health, the KI Quinquennial Cancer Research Fellowship, and MIT’s Undergraduate Research Opportunities Program.

Other authors of the paper include Liangliang Hao, a former MIT postdoc; Maria Alexis and Konstantin Krismer, MIT graduate students; Brian Joughin, a lead research modeler at the Koch Institute; Mira Moufarrej, a recent graduate of MIT; Anthony Soltis, a recent MIT PhD recipient; Douglas Lauffenburger, head of MIT’s Department of Biological Engineering; Michael Yaffe, a David H. Koch Professor of Science; Christopher Burge, an MIT professor of biology; and Sangeeta Bhatia, the John and Dorothy Wilson Professor of Health Sciences and Technology and Electrical Engineering and Computer Science.

MIT: New Optical Imaging System could be Deployed to find Tiny Tumors and Detect Cancer Earlier – “A Game Changing Method”


 


Near-infrared technology pinpoints fluorescent probes deep within living tissue; may be used to detect cancer earlier. MIT researchers have devised a way to simultaneously image in multiple wavelengths of near-infrared light, allowing them to determine the depth of particles emitting different wavelengths. Image courtesy of the researchers

Many types of cancer could be more easily treated if they were detected at an earlier stage. MIT researchers have now developed an imaging system, named “DOLPHIN,” which could enable them to find tiny tumors, as small as a couple of hundred cells, deep within the body. 

In a new study, the researchers used their imaging system, which relies on near-infrared light, to track a 0.1-millimeter fluorescent probe through the digestive tract of a living mouse. They also showed that they can detect a signal to a tissue depth of 8 centimeters, far deeper than any existing biomedical optical imaging technique.

The researchers hope to adapt their imaging technology for early diagnosis of ovarian and other cancers that are currently difficult to detect until late stages.

“We want to be able to find cancer much earlier,” says Angela Belcher, the James Mason Crafts Professor of Biological Engineering and Materials Science at MIT and a member of the Koch Institute for Integrative Cancer Research, and the newly-appointed head of MIT’s Department of Biological Engineering. “Our goal is to find tiny tumors, and do so in a noninvasive way.”

Belcher is the senior author of the study, which appears in the March 7 issue of Scientific Reports. Xiangnan Dang, a former MIT postdoc, and Neelkanth Bardhan, a Mazumdar-Shaw International Oncology Fellow, are the lead authors of the study. Other authors include research scientists Jifa Qi and Ngozi Eze, former postdoc Li Gu, postdoc Ching-Wei Lin, graduate student Swati Kataria, and Paula Hammond, the David H. Koch Professor of Engineering, head of MIT’s Department of Chemical Engineering, and a member of the Koch Institute.

Deeper imaging

Existing methods for imaging tumors all have limitations that prevent them from being useful for early cancer diagnosis. Most have a tradeoff between resolution and depth of imaging, and none of the optical imaging techniques can image deeper than about 3 centimeters into tissue. Commonly used scans such as X-ray computed tomography (CT) and magnetic resonance imaging (MRI) can image through the whole body; however, they can’t reliably identify tumors until they reach about 1 centimeter in size.

Belcher’s lab set out to develop new optical methods for cancer imaging several years ago, when they joined the Koch Institute. They wanted to develop technology that could image very small groups of cells deep within tissue and do so without any kind of radioactive labeling.

Near-infrared light, which has wavelengths from 900 to 1700 nanometers, is well-suited to tissue imaging because longer-wavelength light scatters less when it strikes objects, allowing it to penetrate deeper into the tissue. To take advantage of this, the researchers used an approach known as hyperspectral imaging, which enables simultaneous imaging in multiple wavelengths of light.

The researchers tested their system with a variety of near-infrared fluorescent light-emitting probes, mainly sodium yttrium fluoride nanoparticles that have rare earth elements such as erbium, holmium, or praseodymium added through a process called doping. Depending on the choice of the doping element, each of these particles emits near-infrared fluorescent light of different wavelengths.

Using algorithms that they developed, the researchers can analyze the data from the hyperspectral scan to identify the sources of fluorescent light of different wavelengths, which allows them to determine the location of a particular probe. By further analyzing light from narrower wavelength bands within the entire near-IR spectrum, the researchers can also determine the depth at which a probe is located. The researchers call their system “DOLPHIN”, which stands for “Detection of Optically Luminescent Probes using Hyperspectral and diffuse Imaging in Near-infrared.”
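
This is not the DOLPHIN algorithm itself, but the depth idea can be illustrated with a minimal two-band sketch: if tissue attenuates each wavelength band at a different rate, the ratio of detected intensities encodes depth. All band names and coefficients below are invented for illustration, following a simple Beer-Lambert falloff.

```python
import math

# Illustrative attenuation coefficients (per cm) for two near-IR bands;
# the values are made up for this sketch, not measured tissue properties.
MU = {"1300nm": 0.9, "1550nm": 1.4}

def emitted_intensities(depth_cm, i0=1.0):
    """Signal reaching the detector from a probe at the given depth,
    assuming exponential (Beer-Lambert) falloff in each band."""
    return {band: i0 * math.exp(-mu * depth_cm) for band, mu in MU.items()}

def estimate_depth(intensities):
    """Invert the two-band ratio: ln(I_a / I_b) = (mu_b - mu_a) * d."""
    i_a, i_b = intensities["1300nm"], intensities["1550nm"]
    return math.log(i_a / i_b) / (MU["1550nm"] - MU["1300nm"])

observed = emitted_intensities(depth_cm=4.0)
print(round(estimate_depth(observed), 2))  # recovers 4.0
```

The appeal of the ratio trick is that the probe's unknown brightness i0 cancels out, which is why comparing narrower wavelength bands, rather than a single absolute intensity, can localize a probe in depth.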

To demonstrate the potential usefulness of this system, the researchers tracked a 0.1-millimeter-sized cluster of fluorescent nanoparticles that was swallowed and then traveled through the digestive tract of a living mouse. These probes could be modified so that they target and fluorescently label specific cancer cells.

“In terms of practical applications, this technique would allow us to non-invasively track a 0.1-millimeter-sized fluorescently-labeled tumor, which is a cluster of about a few hundred cells. To our knowledge, no one has been able to do this previously using optical imaging techniques,” Bardhan says.

Earlier detection

The researchers also demonstrated that they could inject fluorescent particles into the body of a mouse or a rat and then image through the entire animal, which requires imaging to a depth of about 4 centimeters, to determine where the particles ended up. And in tests with human tissue-mimics and animal tissue, they were able to locate the probes to a depth of up to 8 centimeters, depending on the type of tissue.

Guosong Hong, an assistant professor of materials science and engineering at Stanford University, described the new method as “game-changing.”

“This is really amazing work,” says Hong, who was not involved in the research. “For the first time, fluorescent imaging has approached the penetration depth of CT and MRI, while preserving its naturally high resolution, making it suitable to scan the entire human body.”

Read More: The Importance of Early Detection

This kind of system could be used with any fluorescent probe that emits light in the near-infrared spectrum, including some that are already FDA-approved, the researchers say. The researchers are also working on adapting the imaging system so that it could reveal intrinsic differences in tissue contrast, including signatures of tumor cells, without any kind of fluorescent label.

In ongoing work, they are using a related version of this imaging system to try to detect ovarian tumors at an early stage. Ovarian cancer is usually diagnosed very late because there is no easy way to detect it when the tumors are still small.

“Ovarian cancer is a terrible disease, and it gets diagnosed so late because the symptoms are so nondescript,” Belcher says. “We want a way to follow recurrence of the tumors, and eventually a way to find and follow early tumors when they first go down the path to cancer or metastasis. This is one of the first steps along the way in terms of developing this technology.”

The researchers have also begun working on adapting this type of imaging to detect other types of cancer such as pancreatic cancer, brain cancer, and melanoma.

The research was funded by the Koch Institute Frontier Research Program, the Marble Center for Cancer Nanomedicine, the Koch Institute Support (core) Grant from the National Cancer Institute, the NCI Center for Cancer Nanotechnology Excellence, and the Bridge Project.

EPFL and MIT Researchers Discover the ‘Holy Grail’ of Nanowire Production



EPFL researchers have found a way to control and standardize the production of nanowires on silicon surfaces. Credit: Ecole Polytechnique Federale de Lausanne (EPFL)

Nanowires have the potential to revolutionize the technology around us. Measuring just 5-100 nanometers in diameter (a nanometer is a millionth of a millimeter), these tiny, needle-shaped crystalline structures can alter how electricity or light passes through them.

They can emit, concentrate and absorb light and could therefore be used to add optical functionalities to electronic chips. They could, for example, make it possible to generate lasers directly on silicon chips and to integrate single-photon emitters for coding purposes. They could even be applied in photovoltaics to improve how sunlight is converted into electrical energy.

Up until now, it was impossible to reproduce the process of growing nanowires on silicon semiconductors – there was no way to repeatedly produce homogeneous nanowires in specific positions.

But researchers from EPFL’s Laboratory of Semiconductor Materials, run by Anna Fontcuberta i Morral, together with colleagues from MIT and the IOFFE Institute, have come up with a way of growing nanowire networks in a highly controlled and fully reproducible manner. The key was to understand what happens at the onset of nanowire growth, which goes against currently accepted theories. Their work has been published in Nature Communications.

“We think that this discovery will make it possible to realistically integrate a series of nanowires on silicon substrates,” says Fontcuberta i Morral. “Up to now, these nanowires had to be grown individually, and the process couldn’t be reproduced.”

The holy grail of nanowire production
Two different configurations of the droplet within the opening (hole fully filled and partially filled) and, below, an illustration of GaAs crystals forming a full ring or a step underneath the large and small gallium droplets. Credit: Ecole Polytechnique Federale de Lausanne (EPFL)

 

Getting the right ratio

The standard process for producing nanowires is to make tiny holes in silicon monoxide and fill them with a nanodrop of liquid gallium. This substance then solidifies when it comes into contact with arsenic. But with this process, the substance tends to harden at the corners of the nanoholes, which means that the angle at which the nanowires will grow can’t be predicted. The search was on for a way to produce homogeneous nanowires and control their position.

Research aimed at controlling the growth process has tended to focus on the diameter of the hole, but this approach has not paid off. Now EPFL researchers have shown that by altering the diameter-to-height ratio of the hole, they can perfectly control how the nanowires grow. At the right ratio, the substance will solidify in a ring around the edge of the hole, which prevents the nanowires from growing at a non-perpendicular angle. And the researchers’ process should work for all types of nanowires.

“It’s kind of like growing a plant. They need water and sunlight, but you have to get the quantities right,” says Fontcuberta i Morral.

This new production technique will be a boon for nanowire research, and further samples should soon be developed.

 Explore further: Nanowires have the power to revolutionize solar energy (w/ video)

More information: J. Vukajlovic-Plestina et al. Fundamental aspects to localize self-catalyzed III-V nanowires on silicon, Nature Communications (2019). DOI: 10.1038/s41467-019-08807-9

 

A Path to Cheaper Flexible Solar Cells – Researchers at Georgia Tech and MIT are Developing the Potential of Perovskite-Based Solar Cells


A researcher at Georgia Tech holds a perovskite-based solar cell, which is flexible and lighter than silicon-based versions. Credit: Rob Felt, Georgia Tech

There’s a lot to like about perovskite-based solar cells. They are simple and cheap to produce, offer flexibility that could unlock a wide new range of installation methods and places, and in recent years have reached energy efficiencies approaching those of traditional silicon-based cells.

But figuring out how to produce perovskite-based energy devices that last longer than a couple of months has been a challenge.

Now researchers from Georgia Institute of Technology, University of California San Diego and Massachusetts Institute of Technology have reported new findings about perovskite solar cells that could lead the way to devices that perform better.

“Perovskite solar cells offer a lot of potential advantages because they are extremely lightweight and can be made with flexible plastic substrates,” said Juan-Pablo Correa-Baena, an assistant professor in the Georgia Tech School of Materials Science and Engineering. “To be able to compete in the marketplace with silicon-based solar cells, however, they need to be more efficient.”

In a study that was published February 8 in the journal Science and was sponsored by the U.S. Department of Energy and the National Science Foundation, the researchers described in greater detail the mechanisms of how adding alkali metals to traditional perovskites leads to better performance.

“Perovskites could really change the game in solar,” said David Fenning, a professor of nanoengineering at the University of California San Diego. “They have the potential to reduce costs without giving up performance. But there’s still a lot to learn fundamentally about these materials.”

To understand perovskite crystals, it’s helpful to think of their crystalline structure as a triad. One part of the triad is typically formed from the element lead. The second is typically made up of an organic component such as methylammonium, and the third is often composed of halides such as bromine and iodine.

In recent years, researchers have focused on testing different recipes to achieve better efficiencies, such as adding iodine and bromine to the lead component of the structure. Later, they tried substituting cesium and rubidium into the part of the perovskite typically occupied by organic molecules.

“We knew from earlier work that adding cesium and rubidium to a mixed bromine and iodine lead perovskite leads to better stability and higher performance,” Correa-Baena said.

But little was known about why adding those alkali metals improved performance of the perovskites.

To understand exactly why that seemed to work, the researchers used high-intensity X-ray mapping to examine the perovskites at the nanoscale.

Structure of perovskite solar cells: (a) device architecture and (b) energy band diagram

“By looking at the composition within the perovskite material, we can see how each individual element plays a role in improving the performance of the device,” said Yanqi (Grace) Luo, a nanoengineering PhD student at UC San Diego.

They discovered that when the cesium and rubidium were added to the mixed bromine and iodine lead perovskite, it caused the bromine and iodine to mix together more homogeneously, resulting in up to 2 percent higher conversion efficiency than the materials without these additives.

“We found that uniformity in the chemistry and structure is what helps a perovskite solar cell operate at its fullest potential,” Fenning said. “Any heterogeneity in that backbone is like a weak link in the chain.”

Even so, the researchers also observed that while adding rubidium or cesium caused the bromine and iodine to become more homogeneous, the added alkali metals themselves remained fairly clustered, creating inactive “dead zones” in the solar cell that produce no current.

“This was surprising,” Fenning said. “Having these dead zones would typically kill a solar cell. In other materials, they act like black holes that suck in electrons from other regions and never let them go, so you lose current and voltage.

“But in these perovskites, we saw that the dead zones around rubidium and cesium weren’t too detrimental to solar cell performance, though there was some current loss,” Fenning said. “This shows how robust these materials are but also that there’s even more opportunity for improvement.”

The findings add to the understanding of how the perovskite-based devices work at the nanoscale and could lay the groundwork for future improvements.

“These materials promise to be very cost effective and high performing, which is pretty much what we need to make sure photovoltaic panels are deployed widely,” Correa-Baena said. “We want to try to offset issues of climate change, so the idea is to have photovoltaic cells that are as cheap as possible.”

Story Source:

Materials provided by Georgia Institute of Technology. Note: Content may be edited for style and length.

MIT: Unleashing perovskites’ potential for solar cells


Solar cells made of perovskite have great promise, in part because they can easily be made on flexible substrates, like this experimental cell. Image: Ken Richardson

New results show how varying the recipe could bring these materials closer to commercialization.

Perovskites — a broad category of compounds that share a certain crystal structure — have attracted a great deal of attention as potential new solar-cell materials because of their low cost, flexibility, and relatively easy manufacturing process.

But much remains unknown about the details of their structure and the effects of substituting different metals or other elements within the material.

Conventional solar cells made of silicon must be processed at temperatures above 1,400 degrees Celsius, using expensive equipment that limits their potential for production scaleup.

In contrast, perovskites can be processed in a liquid solution at temperatures as low as 100 degrees, using inexpensive equipment. What’s more, perovskites can be deposited on a variety of substrates, including flexible plastics, enabling a variety of new uses that would be impossible with thicker, stiffer silicon wafers.

Now, researchers have been able to decipher a key aspect of the behavior of perovskites made with different formulations:

With certain additives there is a kind of “sweet spot” where greater amounts will enhance performance and beyond which further amounts begin to degrade it.

The findings are detailed this week in the journal Science, in a paper by former MIT postdoc Juan-Pablo Correa-Baena, MIT professors Tonio Buonassisi and Moungi Bawendi, and 18 others at MIT, the University of California at San Diego, and other institutions.

Perovskite solar cells are thought to have great potential, and new understanding of how changes in composition affect their behavior could help to make them practical. Image: Ken Richardson

Perovskites are a family of compounds that share a three-part crystal structure. Each part can be made from any of a number of different elements or compounds — leading to a very broad range of possible formulations. Buonassisi compares designing a new perovskite to ordering from a menu, picking one (or more) from each of column A, column B, and (by convention) column X.

“You can mix and match,” he says, but until now all the variations could only be studied by trial and error, since researchers had no basic understanding of what was going on in the material.
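The scale of this mix-and-match problem can be illustrated with a short sketch. The constituent lists below are a small, hypothetical sample of commonly used perovskite building blocks, chosen only to show how quickly the combinations multiply; they are not the specific formulations studied in the paper:

```python
from itertools import product

# Illustrative (not exhaustive) options for each "column" of the
# ABX3 perovskite menu: A-site cations, B-site metals, X-site halides.
a_site = ["methylammonium", "formamidinium", "Cs"]
b_site = ["Pb", "Sn"]
x_site = ["I", "Br", "Cl"]

# Each pick of one entry per column is a candidate formulation.
formulations = list(product(a_site, b_site, x_site))
print(len(formulations))  # 3 * 2 * 3 = 18 candidates
```

Even this toy menu yields 18 compounds, and real formulations can mix several elements at each site, which is why trial-and-error exploration without a structural understanding has been so slow.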

Previous research by a team from the Swiss École Polytechnique Fédérale de Lausanne, in which Correa-Baena participated, had found that adding certain alkali metals to the perovskite mix could improve the material’s efficiency at converting solar energy to electricity, from about 19 percent to about 22 percent.

But at the time there was no explanation for this improvement, and no understanding of exactly what these metals were doing inside the compound. “Very little was known about how the microstructure affects the performance,” Buonassisi says.

Now, detailed mapping using high-resolution synchrotron nano-X-ray fluorescence measurements, which can probe the material with a beam just one-thousandth the width of a hair, has revealed the details of the process, with potential clues for how to improve the material’s performance even further.

It turns out that adding these alkali metals, such as cesium or rubidium, to the perovskite compound helps some of the other constituents to mix together more smoothly. As the team describes it, these additives help to “homogenize” the mixture, making it conduct electricity more easily and thus improving its efficiency as a solar cell.

But, they found, that only works up to a point. Beyond a certain concentration, these added metals clump together, forming regions that interfere with the material’s conductivity and partly counteract the initial advantage. In between, for any given formulation of these complex compounds, lies the sweet spot that provides the best performance.
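The sweet-spot behavior can be sketched with a toy model (an assumption for illustration, not a fit to the paper’s data): a homogenization benefit that saturates as additive concentration rises, minus a clumping penalty that keeps growing. The peak of the resulting curve is the optimum concentration:

```python
import math

def toy_performance(c, gain=1.0, penalty=0.5):
    """Toy model: benefit from homogenization saturates as (1 - e^-c),
    while the clumping penalty grows linearly with concentration c."""
    return gain * (1.0 - math.exp(-c)) - penalty * c

# Scan a grid of concentrations to locate this toy curve's sweet spot.
concentrations = [i * 0.05 for i in range(100)]
best = max(concentrations, key=toy_performance)
print(round(best, 2))  # peak near c = ln(2) ~ 0.69 for these parameters
```

The curve rises while the saturating benefit dominates and falls once the linear penalty takes over, reproducing the qualitative “improve, then degrade” pattern the researchers observed.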

“It’s a big finding,” says Correa-Baena, who in January became an assistant professor of materials science and engineering at Georgia Tech.

What the researchers found, after about three years of work at MIT and with collaborators at UCSD, was “what happens when you add those alkali metals, and why the performance improves.” They were able to directly observe the changes in the composition of the material, and reveal, among other things, these countervailing effects of homogenizing and clumping.

“The idea is that, based on these findings, we now know we should be looking into similar systems, in terms of adding alkali metals or other metals,” or varying other parts of the recipe, Correa-Baena says.

While perovskites can have major benefits over conventional silicon solar cells, especially in terms of the low cost of setting up factories to produce them, they still require further work to boost their overall efficiency and improve their longevity, which lags significantly behind that of silicon cells.

Although the researchers have clarified the structural changes that take place in the perovskite material when adding different metals, and the resulting changes in performance, “we still don’t understand the chemistry behind this,” Correa-Baena says. That’s the subject of ongoing research by the team. The theoretical maximum efficiency of these perovskite solar cells is about 31 percent, according to Correa-Baena, and the best performance to date is around 23 percent, so there remains a significant margin for potential improvement.

Although it may take years for perovskites to realize their full potential, at least two companies are already in the process of setting up production lines, and they expect to begin selling their first modules within the next year or so. Some of these are small, transparent and colorful solar cells designed to be integrated into a building’s façade. “It’s already happening,” Correa-Baena says, “but there’s still work to do in making these more durable.”

Once issues of large-scale manufacturability, efficiency, and durability are addressed, Buonassisi says, perovskites could become a major player in the renewable energy industry. “If they succeed in making sustainable, high-efficiency modules while preserving the low cost of the manufacturing, that could be game-changing,” he says. “It could allow expansion of solar power much faster than we’ve seen.”

Perovskite solar cells “are now primary candidates for commercialization. Thus, providing deeper insights, as done in this work, contributes to future development,” says Michael Saliba, a senior researcher on the physics of soft matter at the University of Fribourg, Switzerland, who was not involved in this research.

Saliba adds, “This is great work that is shedding light on some of the most investigated materials. The use of synchrotron-based, novel techniques in combination with novel material engineering is of the highest quality, and is deserving of appearing in such a high-ranking journal.” He adds that work in this field “is rapidly progressing. Thus, having more detailed knowledge will be important for addressing future engineering challenges.”

The study, which included researchers at Purdue University and Argonne National Laboratory, in addition to those at MIT and UCSD, was supported by the U.S. Department of Energy, the National Science Foundation, the Skolkovo Institute of Science and Technology, and the California Energy Commission.