What if Green Energy Isn’t the Future?


A gas-filtration system atop a well managed by Anadarko in Pennsylvania, Sept. 8, 2012. Photo: Robert Nickelsberg/Getty Images

What’s Warren Buffett doing with a $10 billion bet on the future of oil and gas, helping old-school Occidental Petroleum buy Anadarko, a U.S. shale leader? For pundits promoting the all-green future, this looks like betting on horse farms circa 1919.

Meanwhile, broad market sentiment is decidedly bearish on hydrocarbons. The oil and gas share of the S&P 500 is at a 40-year low, and the first quarter of 2019 saw the Nasdaq Clean Edge Green Energy Index and “clean tech” exchange-traded funds outperform the S&P.

A week doesn’t pass without a mayor, governor or policy maker joining the headlong rush to pledge or demand a green energy future.

Some 100 U.S. cities have made such promises. Hydrocarbons may be the source of 80% of America’s and the world’s energy, but to say they are currently out of favor is a dramatic understatement. 

Yet it’s both reasonable and, for contrarian investors, potentially lucrative to ask: What happens if renewables fail to deliver?

The prevailing wisdom has wind and solar, paired with batteries, adding 250% more energy to the world over the next two decades than American shale has added over the past 15 years.

Is that realistic? The shale revolution has been the single biggest addition to the world energy supply in the past century. And even bullish green scenarios still see global demand for oil and gas rising, if more slowly.

If the favored alternatives fall short of delivering what growing economies need, will markets tolerate energy starvation? Not likely. Nations everywhere will necessarily turn to hydrocarbons.

And just how big could the call on oil and natural gas—and coal, for that matter—become if, say, only half as much green-tech energy gets produced as is now forecast? Keep in mind that a 50% “haircut” would still mean unprecedented growth in green-tech.

If the three hydrocarbons were each to supply one-third of such a posited green shortfall, global petroleum output would have to increase by an amount equal to doubling the production of the Permian shale field (Anadarko’s home). The world supply of liquefied natural gas would need to increase by an amount equal to twice Qatar’s current exports, plus coal supply would have to grow by almost double what the top global exporter, Australia, now ships.

Green forecasters are likely out over their skis. All the predictions assume that emerging economies—the least wealthy nations—will account for nearly three-fourths of total new spending on renewables. That won’t happen unless the promised radical cost reductions occur.

For a bellwether reality-check, note that none of the wealthy nations that are parties to the Paris Accord—or any of the poor ones, for that matter—have come close to meeting the green pledges called for. In fact, let’s quote the International Energy Agency on what has actually happened: “Energy demand worldwide [in 2018] grew by . . . its fastest pace this decade . . . driven by a robust global economy . . . with fossil fuels meeting nearly 70% of the growth for the second year running.”

The reason? Using wind, solar and batteries as the primary sources of a nation’s energy supply remains far too expensive. You don’t need science or economics to know that. Simply propose taking away subsidies or mandates, and you’ll unleash the full fury of the green lobby.

Meanwhile, there are already signs that the green vision is losing luster. Sweden’s big shift to wind power has not only created alarm over inadequate electricity supplies; it’s depressing economic growth and may imperil that nation’s bid for the 2026 Winter Olympics. China, although adept at green virtue-signaling, has quietly restarted massive domestic coal-power construction and is building hundreds of coal plants for emerging economies around the world.

In the U.S., utilities, furiously but without fanfare, have been adding billions of dollars of massive oil- and natural-gas-burning diesel engines to the grid. Over the past two decades, three times as much grid-class reciprocating engine capacity has been added to the U.S. grid as in the entire half-century before. It’s the only practical way to produce grid-scale electricity fast enough when the wind dies off. Sweden will doubtless be forced to do the same.

A common response to all of the above: Make more electric cars. But mere arithmetic reveals that even the optimists’ 100-fold growth in electric vehicles wouldn’t displace more than 5% of global oil demand in two decades. Tepid growth in gasoline demand would be more than offset by growing economies’ appetites for air travel and manufactured goods. Goodness knows what would happen if Trump-like economic growth were to take hold in the rest of the developed world. As Mr. Buffett knows, the IEA foresees the U.S. supplying nearly three-fourths of the world’s net new demand for oil and gas.

Green advocates can hope to persuade governments—and thus taxpayers—to deploy a huge tax on hydrocarbons to ensure more green construction. But there’s no chance that wealthy nations will agree to subsidize expensive green tech for the rest of the world. And we know where the Oracle of Omaha has placed a bet.

Re-Posted from the Wall Street Journal – Mr. Mills is a senior fellow at the Manhattan Institute and a partner in Cottonwood Venture Partners, an energy-tech venture fund, and author of the recent report, “The ‘New Energy Economy’: An Exercise in Magical Thinking.”


Say What? U.S. cancer institute (NCI) cancels nanotech research centers – Why?


The U.S. National Cancer Institute (NCI) in Bethesda, Maryland, will halt funding next year for its long-running Centers of Cancer Nanotechnology Excellence (CCNEs), which are focused on steering advances in nanotechnology to detect and treat cancer.

The shift marks nanotechnology’s “natural transition” from an emerging field requiring dedicated support to a more mature enterprise able to compete head to head with other types of cancer research, says Piotr Grodzinski, who heads NCI’s Nanodelivery Systems and Devices Branch, which oversees the CCNEs. “This doesn’t mean NCI’s interest in nanotechnology is decreasing.”

Nevertheless, cancer nanotechnology experts see the decision as a blow. “It’s disappointing and very shortsighted given the emergence of nanotechnology and medicine,” says Chad Mirkin, who directs a CCNE at Northwestern University in Evanston, Illinois.

CCNEs have spawned dozens of clinical trials for new drugs and drug delivery devices, as well as novel technologies for diagnosing disease, he says. “Cancer research needs new ways of making new types of medicines. Nanotechnology represents a way to do that,” he says.

Nanotechnology also has a unique place in cancer research, where making advances requires multiple disciplines, including chemistry, physics, cell biology, and patient care, to design novel drugs and drug carriers that can navigate the body and seek out and destroy tumors.

“We’re talking about a different beast here,” says Michelle Bradbury, a radiologist at Memorial Sloan Kettering Cancer Center in New York City, who co-directs the Sloan Kettering-Cornell University CCNE. “The center format is perfect for that.”

NCI launched eight CCNEs in 2005 for an initial 5-year term. Nine received funding in 2010 for the project’s second phase, and six in 2015 for phase three. In total, CCNEs received about $330 million over 15 years, Grodzinski says, with an additional $70 million in funding for training and other types of nanotechnology research centers.

That, he says, represents between 10% and 20% of NCI’s funding for nanotechnology research, depending on the specific 5-year phase. NCI will continue to support nanotechnology through R01 and other grant mechanisms, Grodzinski says. But Bradbury and others are concerned that a more piecemeal funding approach might be less successful. “You might not see the integration between disciplines,” she says.

Army research may be used to treat cancer, heal combat wounds


RESEARCH TRIANGLE PARK, N.C. — Army research is the first to develop computational models using a microbiology procedure that may be used to improve novel cancer treatments and treat combat wounds.

Using the technique, known as electroporation, an electrical field is applied to cells in order to increase the permeability of the cell membrane, allowing chemicals, drugs, or DNA to be introduced into the cell.

For example, electro-chemotherapy is a cutting-edge cancer treatment that uses electroporation as a means to deliver chemotherapy into cancerous cells.
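
For a rough sense of the cell-scale physics, the voltage that an applied field induces across the membrane of a spherical cell is often estimated with the classical Schwan approximation, roughly 1.5 x E x r x cos(theta). This is a textbook relation offered for orientation, not necessarily the model used in the Army-funded study, and the field strength and cell size below are illustrative assumptions.

```python
import math

def induced_transmembrane_potential(e_field, radius, theta):
    """Steady-state Schwan approximation for a spherical cell:
    delta_V = 1.5 * E * r * cos(theta), where theta is the angle between
    the applied field and the point on the membrane."""
    return 1.5 * e_field * radius * math.cos(theta)

# Illustrative values (assumptions, not figures from the article):
E = 1.0e5    # applied field, V/m (about 1 kV/cm, a typical electroporation pulse)
r = 7.5e-6   # cell radius, m (~15 micron diameter cell)

for deg in (0, 45, 90):
    dv = induced_transmembrane_potential(E, r, math.radians(deg))
    print(f"theta = {deg:3d} deg -> induced potential ~ {dv * 1e3:.0f} mV")
```

Membrane pores are generally associated with induced potentials of a few hundred millivolts up to about a volt, which is why clinically used pulses are in the kilovolt-per-centimeter range.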

The research, funded by the U.S. Army and conducted by researchers at the University of California, Santa Barbara, and the Université de Bordeaux in France, has developed a computational approach for parallel simulations that models the complex bioelectrical interactions at the tissue scale.

Previously, most research had been conducted on individual cells, each of which behaves according to certain rules.

“When you consider a large number of them together, the aggregate exhibits novel coherent behaviors,” said Pouria Mistani, a researcher at UCSB. “It is this emergent phenomenon that is crucial for developing effective theories at the tissue-scale — novel behaviors that emerge from the coupling of many individual elements.”

This new research, published in the Journal of Computational Physics, is funded by the U.S. Combat Capabilities Development Command’s Army Research Lab, the Army’s corporate research laboratory known as ARL, through its Army Research Office.

“Mathematical research enables us to study the bioelectric effects of cells in order to develop new anti-cancer strategies,” said Dr. Joseph Myers, Army Research Office mathematical sciences division chief.

“This new research will enable more accurate and capable virtual experiments of the evolution and treatment of cells, cancerous or healthy, in response to a variety of candidate drugs.”

Researchers said a crucial element in making this possible is the development of advanced computational algorithms.

“There is quite a lot of mathematics that goes into the design of algorithms that can consider tens of thousands of well-resolved cells,” said Frederic Gibou, a faculty member in the departments of Mechanical Engineering and Computer Science at UCSB.

Another potential application is accelerating combat wound healing using electric pulsation.

“It’s an exciting, but mainly unexplored area that stems from a deeper discussion at the frontier of developmental biology, namely how electricity influences morphogenesis,” — or the biological process that causes an organism to develop its shape — Gibou said. “In wound healing, the goal is to externally manipulate electric cues to guide cells to grow faster in the wounded region and accelerate the healing process.”

The common factor among these applications is their bioelectric physical nature. In recent years, it has been established that the bioelectric nature of living organisms plays a pivotal role in the development of their form and growth.

To understand bioelectric phenomena, Gibou’s group considered computer experiments on multicellular spheroids in 3-D. Spheroids are aggregates of a few tens of thousands of cells that are used in biology because of their structural and functional similarity with tumors.

“We started from the phenomenological cell-scale model that was developed in the research group of our colleague, Clair Poignard, at the Université de Bordeaux, France, with whom we have collaborated for several years,” Gibou said.

This model, which describes the evolution of transmembrane potential on an isolated cell, has been compared and validated with the response of a single cell in experiments.
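
To give a feel for what such a single-cell model computes, a minimal caricature is the standard first-order membrane-charging picture: the transmembrane potential relaxes toward the Schwan steady state with a time constant set by the cell radius, the membrane capacitance and the conductivities of the inner and outer media. The sketch below uses textbook parameter values and is not the Poignard-group model itself.

```python
import math

# Textbook parameter values (assumptions, not taken from the paper)
r = 7.5e-6       # cell radius, m
C_m = 1e-2       # membrane capacitance, F/m^2 (1 uF/cm^2)
sigma_in = 0.5   # cytoplasm conductivity, S/m
sigma_out = 1.2  # extracellular conductivity, S/m
E = 1.0e5        # applied field, V/m

# Membrane charging time constant and steady-state potential at the pole
tau = r * C_m * (1.0 / sigma_in + 1.0 / (2.0 * sigma_out))
v_ss = 1.5 * E * r

print(f"charging time constant ~ {tau * 1e6:.2f} microseconds")
for t in (0.1e-6, 0.5e-6, 2.0e-6):  # time after the pulse is switched on, s
    v = v_ss * (1.0 - math.exp(-t / tau))
    print(f"t = {t * 1e6:.1f} us -> transmembrane potential ~ {v:.2f} V")
```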

“From there, we developed the first computational framework that is able to consider a cell aggregate of tens of thousands of cells and to simulate their interactions,” he said. “The end goal is to develop an effective tissue-scale theory for electroporation.”

One of the main reasons for the absence of an effective theory at the tissue scale is the lack of data, according to Gibou and Mistani. Specifically, the missing data in the case of electroporation is the time evolution of the transmembrane potential of each individual cell in a tissue environment. Experiments are not able to make those measurements, they said.

“Currently, experimental limitations prevent the development of an effective tissue-level electroporation theory,” Mistani said. “Our work has developed a computational approach that can simulate the response of individual cells in a spheroid to an electric field as well as their mutual interactions.”


The effects of electroporation used in cancer treatment, for example, depend on many factors, such as the strength of the electric field, its pulse and frequency.

“This work could bring an effective theory that helps understand the tissue response to these parameters and thus optimize such treatments,” Mistani said. “Before our work, the largest existing simulations of cell aggregate electroporation only considered about one hundred cells in 3-D, or were limited to 2-D simulations. Those simulations either ignored the real 3-D nature of spheroids or considered too few cells for tissue-scale emergent behaviors to manifest.”

The researchers are currently mining this unique dataset to develop an effective tissue-scale theory of cell aggregate electroporation.

_______________________________________

The CCDC Army Research Laboratory (ARL) is an element of the U.S. Army Combat Capabilities Development Command. As the Army’s corporate research laboratory, ARL discovers, innovates and transitions science and technology to ensure dominant strategic land power. Through collaboration across the command’s core technical competencies, CCDC leads in the discovery, development and delivery of the technology-based capabilities required to make Soldiers more effective to win our Nation’s wars and come home safely. CCDC is a major subordinate command of the U.S. Army Futures Command.

____________________________

Rice University – Flexible insulator offers high strength and superior thermal conduction – Applications for Flexible Electronics and Energy Storage


 

Rice University research scientist M.M. Rahman holds a flexible dielectric made of a polymer nanofiber layer and boron nitride. The new material stands up to high temperatures and could be ideal for flexible electronics, energy storage and electric devices where heat is a factor. Credit: Jeff Fitlow/Rice University

A nanocomposite invented at Rice University’s Brown School of Engineering promises to be a superior high-temperature dielectric material for flexible electronics, energy storage and electric devices.

The nanocomposite combines one-dimensional polyaramid nanofibers and two-dimensional boron nitride nanosheets. The nanofibers reinforce the self-assembling material while the “white graphene” nanosheets provide a thermally conductive network that allows it to withstand the heat that breaks down common dielectrics, the polarized insulators in batteries and other devices that separate positive and negative electrodes.

The discovery by the lab of Rice materials scientist Pulickel Ajayan is detailed in Advanced Functional Materials.

Research scientist M.M. Rahman and postdoctoral researcher Anand Puthirath of the Ajayan lab led the study to meet the challenge posed by next-generation electronics: Dielectrics must be thin, tough, flexible and able to withstand high temperatures.

“Ceramic is a very good dielectric, but it is mechanically brittle,” Rahman said of the common material. “On the other hand, polymer is a good dielectric with good mechanical properties, but its thermal tolerance is very low.”

Boron nitride is an electrical insulator, but happily disperses heat, he said. “When we combined the polymer nanofiber with boron nitride, we got a material that’s mechanically exceptional, and thermally and chemically very stable,” Rahman said.

A lab video shows how quickly heat disperses from a composite of a polymer nanoscale fiber layer and boron nitride nanosheets. When exposed to light, both materials heat up, but the plain polymer nanofiber layer on the left retains the heat far longer than the composite at right. Credit: Ajayan Research Group/Rice University

The 12-to-15-micron-thick material acts as an effective heat sink up to 250 degrees Celsius (482 degrees Fahrenheit), according to the researchers. Tests showed the polymer nanofibers-boron nitride combination dispersed heat four times better than the polymer alone.
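
A rough way to see what a four-fold improvement in heat dispersal means in practice is a lumped (Newton-style) cooling model: the film’s excess temperature relaxes toward ambient with a time constant inversely proportional to its effective thermal conductance. The heat capacity and conductance values below are illustrative assumptions, not measurements from the Rice paper; only the 4x ratio comes from the article.

```python
import math

def cooldown_time(conductance_w_per_k, heat_capacity_j_per_k, ratio=0.05):
    """Time for the excess temperature of a lumped body to fall to `ratio`
    of its initial value: t = (C / G) * ln(1 / ratio)."""
    tau = heat_capacity_j_per_k / conductance_w_per_k
    return tau * math.log(1.0 / ratio)

C = 0.5                       # assumed heat capacity of the film, J/K (illustrative)
G_polymer = 0.02              # assumed thermal conductance to surroundings, W/K
G_composite = 4 * G_polymer   # article: composite disperses heat about 4x better

print(f"plain polyaramid film  : {cooldown_time(G_polymer, C):.0f} s to shed 95% of the heat")
print(f"BN/polyaramid composite: {cooldown_time(G_composite, C):.0f} s to shed 95% of the heat")
```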

In its simplest form, a single layer of polyaramid nanofibers binds via van der Waals forces to a sprinkling of boron nitride flakes, 10% by weight of the final product. The flakes are just dense enough to form a heat-dissipating network that still allows the composite to retain its flexibility, and even foldability, while maintaining its robustness. Layering polyaramid and boron nitride can make the material thicker while still retaining flexibility, according to the researchers.

“The 1D polyaramid nanofiber has many interesting properties except thermal conductivity,” Rahman said. “And boron nitride is a very interesting 2-D material right now. They both have different independent properties, but when they are together, they make something very unique.”

Rahman said the material is scalable and should be easy to incorporate into manufacturing.




More information: Muhammad M. Rahman et al. Fiber Reinforced Layered Dielectric Nanocomposite, Advanced Functional Materials (2019). DOI: 10.1002/adfm.201900056

Journal information: Advanced Functional Materials
Provided by Rice University

Magnetic Nanoparticles ease Removal of ‘microcontaminants’ from Wastewater


Many wastewater treatment plants do not completely remove chemical substances from wastewater. Credit: Shutterstock (symbol image)

Microcontaminants place a considerable burden on our water courses, but removing them from wastewater requires considerable technical resources. Now, ETH researchers have developed an approach that allows the efficient removal of these problematic substances.

In our daily lives, we all use a multitude of chemical substances, including cosmetics, medications, contraceptive pills, plant fertilisers and detergents—all of which help to make our lives easier. However, the use of such products has an adverse effect on the environment, because many of them cannot be fully removed from wastewater at today’s treatment plants. As a result, they ultimately end up in the environment, where they place a burden on fauna and flora in our water courses.

As part of a revision of the Waters Protection Act, parliament therefore decided in 2014 to fit an additional purification stage to selected water treatment plants by 2040 with a view to removing microcontaminants. Although the funding for this has in principle been secured, the project presents a challenge for plant operators because it is only possible to remove the critical substances using complex procedures, which are typically based on ozone, activated carbon or light.

Nanoparticles aid degradation

Now, researchers at ETH Zurich’s Institute of Robotics and Intelligent Systems have developed an elegant approach that could allow these substances to be removed more easily. Using multiferroic nanoparticles, they have succeeded in inducing the decomposition of chemical residues in contaminated water. Here, the nanoparticles are not directly involved in the chemical reaction but rather act as a catalyst, speeding up the conversion of the substances into harmless compounds.

“Nanoparticles such as these are already used as catalysts in numerous areas of industry,” explains Salvador Pané, who has played a key role in advancing this research in his capacity as Senior Scientist. “Now, we’ve managed to show that they can also be useful for wastewater purification.”

Efficient removal of problem substances
Based on the example of various organic pigments, such as those used in the textile industry, the researchers are able to demonstrate the effectiveness of their approach. Picture left before treatment, right after treatment. Credit: ETH Zurich / Fajer Mushtaq

An 80 percent reduction

For their experiments, the researchers used aqueous solutions containing trace quantities of five common medications. The experiments confirmed that the nanoparticles can reduce the concentration of these substances in water by at least 80 percent. Fajer Mushtaq, a doctoral student in the group, underlines the importance of these results: “These substances also included two compounds that can’t be removed using the conventional ozone-based method.”

“Remarkably, we’re able to precisely tune the catalytic output of the nanoparticles using magnetic fields,” explains Xiangzhong Chen, a postdoc who also participated in the project. The particles have a cobalt ferrite core surrounded by a bismuth ferrite shell. If an external alternating magnetic field is applied, some regions of the particle surface adopt positive electric charges, while others become negatively charged. These charges lead to the formation of reactive oxygen species in water, which break down the organic pollutants into harmless compounds. The magnetic nanoparticles can then be easily removed from water using magnets, says Chen.
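
The reported removal efficiency is easiest to picture with simple first-order kinetics: if the catalytically generated reactive oxygen species break a pollutant down at a constant per-molecule rate, the concentration decays exponentially. The rate constant and treatment times below are assumptions for illustration only; the ETH work reports the 80 percent figure, not these kinetics.

```python
import math

def remaining_fraction(k_per_min, t_min):
    """First-order decay: C(t) / C0 = exp(-k * t)."""
    return math.exp(-k_per_min * t_min)

k = 0.03  # assumed effective degradation rate constant, 1/min (illustrative)
for t in (15, 30, 60):  # treatment time, minutes
    removed = 1.0 - remaining_fraction(k, t)
    print(f"after {t:3d} min: {removed:5.1%} of the micropollutant removed")

# Time needed to reach the ~80% removal reported in the experiments:
t80 = math.log(1.0 / 0.2) / k
print(f"80% removal would take ~{t80:.0f} min at this assumed rate")
```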

Positive responses from industry

The researchers believe that the new approach is a promising one, citing its easier technical implementation compared with ozone-based treatment, for example. “The wastewater industry is very interested in our findings,” says Pané.

However, it will be some time before the method can be applied in practice, as it has been investigated only in the laboratory so far. At any rate, Mushtaq says that approval has already been given for a BRIDGE project, jointly funded by the Swiss National Science Foundation and Innosuisse, with a view to supporting the method’s transfer into practical applications. In addition, plans are already in place to establish a spin-off company, in which the researchers intend to develop their idea to market maturity.




More information: Fajer Mushtaq et al. Magnetoelectrically Driven Catalytic Degradation of Organics, Advanced Materials (2019). DOI: 10.1002/adma.201901378

Journal information: Advanced Materials
Provided by ETH Zurich

University of Georgia – Microfluidic device may help researchers better understand (and isolate) Metastatic Cancer – “Finding the Needle in the Haystack”



Instead of searching for a needle in a haystack, what if you were able to sweep the entire haystack to one side, leaving only the needle behind? That’s the strategy researchers in the University of Georgia College of Engineering followed in developing a new microfluidic device that separates elusive circulating tumor cells (CTCs) from a sample of whole blood.

CTCs break away from cancerous tumors and flow through the bloodstream, potentially leading to new metastatic tumors. The isolation of CTCs from the blood provides a minimally invasive alternative for basic understanding, diagnosis and prognosis of metastatic cancer. But most studies are limited by technical challenges in capturing intact and viable CTCs with minimal contamination.
“A typical sample of 7 to 10 milliliters of blood may contain only a few CTCs,” said Leidong Mao, a professor in UGA’s School of Electrical and Computer Engineering and the project’s principal investigator. “They’re hiding in whole blood with millions of white blood cells. It’s a challenge to get our hands on enough CTCs so scientists can study them and understand them.”

Leidong Mao (right) and graduate student Yang Liu stand in Mao’s lab at UGA.

Circulating tumor cells are also difficult to isolate because within a sample of a few hundred CTCs, the individual cells may present many characteristics. Some resemble skin cells while others resemble muscle cells. They can also vary greatly in size.

“People often compare finding CTCs to finding a needle in a haystack,” said Mao. “But sometimes the needle isn’t even a needle.”

To more quickly and efficiently isolate these rare cells for analysis, Mao and his team have created a new microfluidic chip that captures nearly every CTC in a sample of blood (more than 99%), a considerably higher percentage than most existing technologies.
The team calls its novel approach to CTC detection “integrated ferrohydrodynamic cell separation,” or iFCS. They outline their findings in a study published in Lab on a Chip (“Tumor antigen-independent and cell size variation-inclusive enrichment of viable circulating tumor cells”).

The new device could be “transformative” in the treatment of breast cancer, according to Melissa Davis, an assistant professor of cell and developmental biology at Weill Cornell Medicine and a collaborator on the project.

“Physicians can only treat what they can detect,” Davis said. “We often can’t detect certain subtypes of CTCs, but with the iFCS device we will capture all the subtypes of CTCs and even determine which subtypes are the most informative concerning relapse and disease progression.”

Davis believes the device may ultimately allow physicians to gauge a patient’s response to specific treatments much earlier than is currently possible.

While most efforts to capture circulating tumor cells focus on identifying and isolating the few CTCs lurking in a blood sample, the iFCS takes a completely different approach by eliminating everything in the sample that’s not a circulating tumor cell.

The device, about the size of a USB drive, works by funneling blood through channels smaller in diameter than a human hair. To prepare blood for analysis, the team adds micron-sized magnetic beads to the samples. The white blood cells in the sample attach themselves to these beads. As blood flows through the device, magnets on the top and bottom of the chip draw the white blood cells and their magnetic beads down a specific channel while the circulating tumor cells continue into another channel.

The device combines three steps in one microfluidic chip, another advance over existing technologies that require separate devices for various steps in the process.

“The first step is a filter that removes large debris in the blood,” said Yang Liu, a doctoral student in UGA’s department of chemistry and the paper’s co-lead author. “The second part depletes extra magnetic beads and the majority of the white blood cells. The third part is designed to focus remaining white blood cells to the middle of the channel and to push CTCs to the side walls.”
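
To get a feel for what “sweeping the haystack aside” means numerically, the sketch below works through the enrichment arithmetic for a hypothetical sample using the rough figures quoted in the article (a few CTCs and millions of white blood cells per 7 to 10 milliliters of blood, with more than 99% CTC recovery). The white-blood-cell depletion factor is an assumed illustrative value, not a published specification of the iFCS chip.

```python
# Hypothetical input sample (order-of-magnitude figures from the article)
ctc_in = 10            # circulating tumor cells in the sample
wbc_in = 5_000_000     # white blood cells in the same sample

ctc_recovery = 0.99    # article: the device captures more than 99% of CTCs
wbc_depletion = 1e4    # ASSUMED 10,000-fold white-blood-cell depletion (illustrative)

ctc_out = ctc_in * ctc_recovery
wbc_out = wbc_in / wbc_depletion

purity_in = ctc_in / (ctc_in + wbc_in)
purity_out = ctc_out / (ctc_out + wbc_out)

print(f"CTC purity before separation: {purity_in:.6%}")
print(f"CTC purity after separation : {purity_out:.2%}")
print(f"enrichment factor           : {purity_out / purity_in:,.0f}x")
```
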
Wujun Zhao is the paper’s other lead author. Zhao, a postdoctoral scholar at Lawrence Berkeley National Laboratory, worked on the project while completing his doctorate in chemistry at UGA.

“The success of our integrated device is that it has the capability to enrich almost all CTCs regardless of their size profile or antigen expression,” said Zhao. “Our findings have the potential to provide the cancer research community with key information that may be missed by current protein-based or size-based enrichment technologies.”

The researchers say their next steps include automating the iFCS and making it more user-friendly for clinical settings. They also need to put the device through its paces in patient trials. Mao and his colleagues hope additional collaborators will join them and lend their expertise to the project.

Source: University of Georgia

Argonne National Laboratory – New coating could have big implications for lithium batteries


Argonne scientists have developed a new coating (shown in blue) for battery cathodes that can improve the electronic and ionic conductivity of a battery while improving its safety and cycling performance. Credit: Argonne National Laboratory

Building a better lithium-ion battery involves addressing a myriad of factors simultaneously, from keeping the battery’s cathode electrically and ionically conductive to making sure that the battery stays safe after many cycles.

In a new discovery, scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory have developed a new cathode coating by using an oxidative chemical vapor deposition technique that can help solve these and several other potential issues with lithium-ion batteries all in one stroke.

“The coating we’ve discovered really hits five or six birds with one stone,” said Khalil Amine, Argonne distinguished fellow and battery scientist.

In the research, Amine and his fellow researchers took particles of Argonne’s pioneering nickel-manganese-cobalt (NMC) cathode material and encapsulated them with a sulfur-containing polymer called PEDOT. This polymer provides the cathode a layer of protection from the battery’s electrolyte as the battery charges and discharges.

Unlike conventional coatings, which only protect the exterior surface of the micron-sized cathode particles and leave the interior vulnerable to cracking, the PEDOT coating had the ability to penetrate to the cathode particle’s interior, adding an additional layer of shielding.

In addition, although PEDOT prevents the chemical interaction between the battery and the electrolyte, it does allow for the necessary transport of lithium ions and electrons that the battery requires in order to function.

“This coating is essentially friendly to all of the processes and chemistry that makes the battery work and unfriendly to all of the potential reactions that would cause the battery to degrade or malfunction,” said Argonne chemist Guiliang Xu, the first author of the research.

The coating also largely prevents another reaction that causes the battery’s cathode to deactivate. In this reaction, the cathode material converts to another form called spinel. “The combination of almost no spinel formation with its other properties makes this coating a very exciting material,” Amine said.

The PEDOT material also demonstrated the ability to prevent oxygen release, a major factor in the degradation of NMC cathode materials at high voltages. “This PEDOT coating was also found to be able to suppress oxygen release during charging, which leads to better cycling performance and also improves safety,” Amine said.

Amine indicated that battery scientists could likely scale up the coating for use in nickel-rich NMC-containing batteries. “This polymer has been around for a while, but we were still surprised to see that it has all of the encouraging effects that it does,” he said.

With the coating applied, the researchers believe that the NMC-containing batteries could either run at higher voltages—thus increasing their energy density—or have longer lifetimes, or both.
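
The “higher voltage or longer lifetime” trade-off comes down to simple arithmetic: a cell’s stored energy is roughly its capacity times its average voltage, so widening the usable voltage window raises energy, provided the cathode survives it. The capacity and voltage figures below are generic illustrative values for NMC-type cells, not numbers from the Argonne paper.

```python
def cell_energy_wh(capacity_ah, avg_voltage_v):
    """Approximate stored energy: E [Wh] = Q [Ah] * V_avg [V]."""
    return capacity_ah * avg_voltage_v

capacity = 5.0  # Ah, assumed cell capacity (illustrative)
for v_avg in (3.7, 3.9):  # assumed average discharge voltages, V
    print(f"average voltage {v_avg} V -> about {cell_energy_wh(capacity, v_avg):.1f} Wh per cell")
```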

To perform the research, the scientists relied on two DOE Office of Science User Facilities located at Argonne: the Advanced Photon Source (APS) and the Center for Nanoscale Materials (CNM). In situ high-energy X-ray diffraction measurements were taken at beamline 11-ID-C of the APS, and focused ion beam lithography and electron microscopy were performed at the CNM.

A paper based on the study, “Building ultra-conformal protective layers on both secondary and primary particles of layered lithium transition metal oxide cathodes,” appeared in the May 13 online edition of Nature Energy.

More information: Gui-Liang Xu et al, Building ultraconformal protective layers on both secondary and primary particles of layered lithium transition metal oxide cathodes, Nature Energy (2019). DOI: 10.1038/s41560-019-0387-1

Journal information: Nature Energy

Provided by Argonne National Laboratory

KAUST – Putting the ‘Sense’ in Materials – Say Goodbye to Batteries as You Know Them


An interdisciplinary initiative is helping KAUST be at the forefront of a digital revolution, where sensors can find a use just about anywhere.

 

The ability to track minuscule but important changes across a range of systems—from the body to the borough and beyond—seems limitless with the emerging array of novel devices that are tiny, self-powering and wirelessly connected. KAUST’s Sensor Initiative comprises a broad range of experts, from marine scientists to electrical engineers, who are innovating solutions to some of the most challenging obstacles in sensor technology. Together, they are powering up to transform the exciting intersection between small interconnected devices and the world around us.

The capacity to monitor our surroundings also reveals new potential in environmental and community protection. For example, a sensor that can detect a flood or a fire can save lives; a sensor that can track animals could help to better manage an ecosystem; and a sensor that can read plant condition could promote sustainable farming.

To take advantage of the market opportunities for sensors in both medical and environmental fields, KAUST holds an annual meeting of biologists, engineers and chemists to discuss technology development. Since 2015, these meetings have produced ambitious collaborations that aim to improve the science that underpins next-gen sensors as well as to take them to the market.

Get ready to plug and play

Khaled Salama, professor of electrical engineering and director of the Sensor Initiative, explains that what sets KAUST apart are the University’s human resources and outstanding lab facilities that underpin its innovative sensor technologies. With the onslaught of data coming from the hundreds of billions of sensors in our cities, cars, homes and offices, we need machine learning to help us understand the data, the supercomputing power to manage it and the expertise to make sure the machines do it all effectively.

“KAUST has strength in materials research, which is where our expertise can be used for developing sensors with transducer components that can be quickly swapped out and replaced with ones customized for different biological or environmental applications,” says Salama.

“Some can stick to your skin and monitor your vital signs through changes in your sweat while others can be placed in petroleum installations to monitor hazardous gases,” says Salama. “We’re not bound to one specific application, and each new development gives us a chance to answer some fundamental scientific questions along the way.”

Say goodbye to batteries, as you know them

KAUST is deploying tiny sensors across the University’s campus to model future smart cities that can continuously monitor air quality or help self-driving cars navigate. Implementing this vision means making devices that are as self-sufficient as possible.

“If you have sensors containing regular batteries, they might last a thousand cycles,” says Husam Alshareef, professor of materials science. “We have to get them to last millions of times longer.”

Alshareef and several international collaborators are building a technology known as microsupercapacitors—next-generation batteries—to resolve challenges around energy storage. Through a special vacuum deposition process, the team has transformed ruthenium oxide into a thin-film electrode that can hold massive amounts of charge and quickly release it on demand.
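
For a rough sense of scale of what a supercapacitor-style store holds and delivers, the usual relations are E = 1/2 C V^2 for energy and P = V^2 / (4 R_ESR) for peak matched-load power. The capacitance, voltage and equivalent series resistance below are assumed illustrative values, not specifications of the KAUST ruthenium-oxide devices.

```python
def supercap_energy_j(capacitance_f, voltage_v):
    """Energy stored in a capacitor: E = 0.5 * C * V^2 (joules)."""
    return 0.5 * capacitance_f * voltage_v ** 2

def supercap_peak_power_w(voltage_v, esr_ohm):
    """Matched-load peak power: P = V^2 / (4 * ESR) (watts)."""
    return voltage_v ** 2 / (4 * esr_ohm)

C = 0.025   # farads, assumed microsupercapacitor capacitance (illustrative)
V = 1.0     # volts, assumed operating voltage
ESR = 2.0   # ohms, assumed equivalent series resistance

print(f"stored energy: {supercap_energy_j(C, V) * 1e3:.1f} mJ")
print(f"peak power   : {supercap_peak_power_w(V, ESR) * 1e3:.0f} mW")
```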

Get plant smart with winged sensors

Professor Muhammad Hussain is a strong believer in the importance of availability in the sensor market. He insists that his sensors not only provide solutions to everyday problems but also that they be affordable to all. That said, he does not forgo creativity for affordability. Hussain’s plant sensors are flexible, inexpensive and range in size from 1-20 mm in diameter. When placed on a plant leaf, they can detect temperature, humidity and growth, data that can be used to help farmers farm smart—minimizing nutrient and water waste. But what makes them especially remarkable is their beautiful butterfly shape. When asked why he chose the butterfly shape Hussain told us, “Butterflies are aesthetically beautiful and natural in a plant environment. Their large wings allow us to integrate many different sensors, which is especially useful for the artificial intelligence chip we are currently integrating into the system. Ultimately, we aim to create a fully interactive system such that the butterfly can deliver nutrients or gather more data.”


Sherjeel Khan, a Ph.D. student with Muhammad Hussain, fabricates the plant-monitoring sensors shaped like butterflies. © 2018 KAUST

Learn to talk effectively

One of the advanced sensors being developed at KAUST is the smart bandage from the group of Atif Shamim in the electrical engineering program. This gadget uses carbon-based transducers to directly contact chronic wounds and to predict signs of infection based on blood pH levels.

Shamim notes that wireless communication is crucial if sensors and other components of the Internet of things are to be integrated with everyday items. His team has pioneered the use of low-energy Bluetooth radio networks to help connect smart devices with each other and also with network servers.

“Even though the Internet of things is about inanimate objects, they have to make decisions for you,” says Shamim. “They need to sense and they need to communicate.”


Shamim’s smart bandage may help to predict signs of infection in a wound by reading blood pH levels. © 2018 KAUST 

Be prepared to dive deep

Shamim is partnering with other KAUST researchers, including Jürgen Kosel, who specializes in using the property of magnetism in his sensor work to track animal behavior in the Red Sea. The team created stickers—each containing a self-powered, Bluetooth-connected position sensor—that are small enough to be attached to crabs, turtles and giant clams in the Red Sea.

Kosel and his group aimed to tackle the primary challenge associated with remote tracking of marine life—the tendency for water to scatter the radiofrequency waves used by most sensors for geolocation. Working with the KAUST Nanofabrication Core Lab to fabricate thin-film structures, the team created flexible sensors that reveal their global position using magnetic signals that easily access subsurface environments.

“Magnetic fields can penetrate many materials without affecting them, and that includes humans and other animals,” says Kosel. “We’ve shown that you can even derive how much energy a marine animal consumes using magnetic sensors that monitor water flow.”

Sense the future of sensors

For the Emeritus Senior Vice President for Research, Jean Frechet, the possibilities are great: “With our expertise and resources, we have built bridges across disciplines by bringing together researchers from KAUST and other institutions. They inspire each other to solve challenges as diverse as the survival of marine life, communications for the 21st century, and the exploitation of big data. The KAUST Sensor Initiative will stimulate the next generation and contribute to diversifying the country’s economy as we design and engineer sensors that collect the data we need to address global challenges.”

 

Colorful solution to a chemical industry bottleneck – KAUST Researchers Develop an “Hourglass-Shaped” Graphene-Oxide Membrane to Rapidly Separate Chemical Mixtures – Applications: Pharmaceuticals and Other Chemical Mixtures



A graphene-oxide membrane design inspired by nature swiftly separates solvent molecules.

The nanoscale water channels that nature has evolved to rapidly shuttle water molecules into and out of cells could inspire new materials to clean up chemical and pharmaceutical production. KAUST researchers have tailored the structure of graphene-oxide layers to mimic the hourglass shape of these biological channels, creating ultrathin membranes to rapidly separate chemical mixtures.

“In making pharmaceuticals and other chemicals, separating mixtures of organic molecules is an essential and tedious task,” says Shaofei Wang, a postdoctoral researcher in Suzana Nuñes’ lab at KAUST. One option to make these chemical separations faster and more efficient is through selectively permeable membranes, which feature tailored nanoscale channels that separate molecules by size.

But these membranes typically suffer from a compromise known as the permeance-rejection tradeoff: narrow channels may effectively separate different-sized molecules but allow only an unacceptably low flow of solvent through the membrane, while wider channels let solvent flow fast enough but perform poorly at separation.

Nuñes, Wang and the team have taken inspiration from nature to overcome this limitation. Aquaporins, the protein water channels of living cells, have an hourglass-shaped channel: wide at each end and narrow at the hydrophobic middle section. This structure combines high solvent permeance with high selectivity. Improving on nature, the team has created channels that widen and narrow in a synthetic membrane.

The membrane is made from flakes of a two-dimensional carbon nanomaterial called graphene oxide. The flakes are stacked into sheets several layers thick. Organic solvent molecules are small enough to pass through the narrow channels between the flakes to cross the membrane, but organic molecules dissolved in the solvent are too large to take the same path. The molecules can therefore be separated from the solvent.

To boost solvent flow without compromising selectivity, the team introduced spacers between the graphene-oxide layers to widen sections of the channel, mimicking the aquaporin structure. The spacers were formed by adding a silicon-based molecule into the channels that, when treated with sodium hydroxide, reacted in situ to form silicon-dioxide nanoparticles. “The hydrophilic nanoparticles locally widen the interlayer channels to enhance the solvent permeance,” Wang explains.

When the team tested the membrane’s performance with solutions of organic dyes, they found that it rejected at least 90 percent of dye molecules above a threshold size of 1.5 nanometers. Incorporating the nanoparticles enhanced solvent permeance 10-fold, without impairing selectivity. The team also found there was enhanced membrane strength and longevity when chemical cross-links formed between the graphene-oxide sheets and the nanoparticles.
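
A rough way to see how a well-defined channel width produces the sharp size cut-off reported above is a simple steric-exclusion (Ferry-type) estimate of rejection, R = 1 - (1 - d/D)^2 for solutes smaller than the channel. This is a generic hindered-transport picture, not the model used in the KAUST paper, and the 2-nanometer effective channel width below is an assumed illustrative value.

```python
def steric_rejection(solute_d_nm, channel_d_nm):
    """Ferry-type steric estimate: rejection R = 1 - (1 - d/D)^2 for d < D,
    and complete rejection once the solute is as large as the channel."""
    lam = solute_d_nm / channel_d_nm
    if lam >= 1.0:
        return 1.0
    return 1.0 - (1.0 - lam) ** 2

channel = 2.0  # nm, assumed effective channel width (illustrative)
for d in (0.5, 1.0, 1.5, 1.8):  # solute diameters, nm
    print(f"solute {d:.1f} nm -> rejection ~ {steric_rejection(d, channel):.0%}")
```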

“The next step will be to formulate the nanoparticle graphene-oxide material into hollow-fiber membranes suitable for industrial applications,” Nuñes says.

References

Wang, S., Mahalingam, D., Sutisna, B. & Nunes, S.P. 2D-dual-spacing channel membranes for high performance organic solvent nanofiltration. Journal of Materials Chemistry A, advance online publication, 10 January 2019.

 

Batteries can’t solve the world’s biggest energy-storage problem – One startup in Denmark has a solution – Solving CO2 Emissions and … Energy Storage


Copenhagen, Denmark

Sometimes, there can be too much of a good thing.

Every so often, from California to Germany, there’s news of “negative electricity prices,” a peculiar side effect of global efforts to generate clean energy.

Solar farms and wind turbines produce varying amounts of power based on the vagaries of the weather. So we build electrical grids to handle only the power levels we expect in a given location.

But in some cases, there’s more sun or wind than expected, and these renewable energy sources pump in more power than the grid can handle. The producers of that power then have to pay customers to use up the excess electricity; otherwise, the grid would be overloaded and fail.

As we build more and more renewable-power capacity in efforts to meet the emissions-reduction goals of the Paris climate agreement, these situations will become more common. Startups led by entrepreneurs who see this future on the horizon are now looking for ways to make money off the inevitable excess clean electricity.

On a mildly chilly day in April, with the smell of poo in the air, I met one of these startups at a sewage-water treatment plant in Copenhagen, Denmark. 

Electrochaea takes carbon dioxide produced during the process of cleaning wastewater, and converts it into natural gas. That alone would be impressive enough; if we want to stop global warming in its tracks, we need to do everything we can to keep CO2 from entering the atmosphere. But Electrochaea has also figured out a way to power the whole enterprise with the excess green energy produced during particularly sunny and windy days that otherwise would have gone to waste, because there would have been no way to store it.

In other words, when scaled up, Electrochaea’s process could be an answer to one of the biggest problems of the 21st century: energy storage, while also making a dent in cutting emissions.

The Electrochaea pilot plant in Copenhagen, with the Biofos water-treatment plant in the background.

The battery problem

The biggest problem with wind and solar energy is that they’re intermittent. There might be violent winds one day, and calm skies the next; broiling sunshine on Monday and 100% cloud cover on Tuesday. Some argue this problem is easily overcome by storing any excess energy in batteries until it’s needed at a later time. Further, battery advocates say, even though the bookcase-sized batteries required to store solar energy for a small home are expensive today, prices are falling and will continue to fall for some time.

Except it’s not that easy. The batteries on the market for these applications are, essentially, large versions of the lithium-ion batteries found in mobile phones. They can only store energy for a certain amount of time—weeks, at most. As soon as the charging source is removed, they start to lose the charge.

That’s not a problem if the batteries are for ironing out the peaks and troughs of daily use. The trouble is that humanity’s energy demand is skewed based on local seasons, which requires sometimes drawing on every available source, and sometimes not using much energy at all.

Mumbai’s peak energy demand is during the hottest days of summer, when people run air conditioners to survive. London’s peak energy demand comes during the coldest days of winter, when people burn natural gas to heat their homes and offices.

Peak energy demand, whether for heating or cooling, can be as much as 20 times the energy consumed on an average day. Today, we shovel more coal or pump more natural gas into fossil-fuel power plants on those high-demand days. Some places, like Bridgeport in Connecticut, have old fossil-fuel plants, often coal, that they keep shut down most of the year and fire up only during peak demand. Obviously, that won’t work in a future powered by renewable energy.

There are two solutions on the table for inter-seasonal energy storage, and they both involve massive investment in infrastructure: First, you could build so many solar panel fields or so many wind turbines that you could produce much more than 20 times the power of an average day. The upshot: you’d have much more excess energy on a low-demand day, but would at least be able to fill demand on peak-demand days. The second option is to get so many batteries that they can store up enough excess energy that, even as they lose their charge, there’s still enough power to get the grid through peak-demand days.

Even if both renewable generation and storage were affordable enough for these plans—and they’re not, yet—there’s still another economic wall that might be impossible to traverse: Most of the time, your new gigantic power plant and fleet of batteries would be useless, because peak demand happens only a few times each year. No government can waste the money needed to build something with so little utility.

Store it another way

Beyond batteries, there are other mechanical ways to store energy. One is to pump water into elevated lakes. Another is to compress air with excess energy. Yet another is to store energy in the form of a high-speed rotating disk. But, like batteries, none of these options can store energy between seasons.

There is one option for the inter-seasonal problem called underground thermal-energy storage. It works on a simple principle: no matter the temperature above ground, at a depth of about 15 meters, temperature in most places on Earth is about the same: 10°C (or 50°F). The planet’s soil provides natural insulation, and, in theory, we could use that insulation to store energy.

There have been successful pilot projects around the world showing you can set up solar panels that, after filling the grid, use any excess electricity to heat gravel, heat-carrying chemicals, or water stored in tanks deep underground. With enough insulation, the heat could be stored for months, until it’s needed in homes nearby, and delivered to them via pipes and heat pumps. (This heat energy can also be converted to run air conditioners, where cooling is needed instead of heating.)
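
The appeal of storing heat this way is easy to see with the basic sensible-heat relation Q = m · c · ΔT. The tank size, temperature swing and per-home heat demand below are illustrative assumptions, not figures from any specific pilot project.

```python
water_volume_m3 = 500.0   # assumed insulated underground water store
density = 1000.0          # kg per m^3 of water
specific_heat = 4186.0    # J per kg per K
delta_t = 40.0            # assumed usable temperature swing, K (e.g. 90 C down to 50 C)

energy_j = water_volume_m3 * density * specific_heat * delta_t
energy_kwh = energy_j / 3.6e6
print(f"stored heat: ~{energy_kwh:,.0f} kWh")

# Rough number of homes this could heat for a winter month, assuming
# ~1,500 kWh of heat demand per home per month (an assumption):
print(f"~{energy_kwh / 1500:.0f} home-months of heating, before storage losses")
```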

There are just two problems when it comes to scaling up: First, it’s expensive to build. Even if the cost of construction and management were to come down, if cities and towns haven’t already planned for underground reservoirs (and most haven’t), it can be prohibitively expensive to find and secure the space. Second, the solution only works at a local scale, because transporting heat comes with natural losses. So the farther you need to move it from the storage site, the more loss you have to deal with.

Electrochaea provides another option, where renewable energy could be stored indefinitely and transported without losses.

The green goo

If you don’t mind the smell, wastewater treatment plants are fascinating. The Copenhagen plant takes in all the water sent down toilets, bathrooms, and kitchen sinks, and puts out H2O that’s nearly clean enough to drink—just one more step would be required, the plant operator told me. But since the city has no shortage of water, the treatment plant dumps its clean, but non-potable water into the North Sea.

Before that, the water goes through dozens of steps, including one where organic matter is allowed to settle to the bottom of large, open tanks. This sludge, rich in carbon-containing molecules, is transferred to a sealed bioreactor where microbes filtered from local soil are added. If this were done in the open tanks, the microbes would break the matter down slowly to produce carbon dioxide. But, in the bioreactor, in the absence of oxygen, a different set of microbes springs into action. They produce methane—the primary component of natural gas—instead.

The facility takes the methane (and any remaining sludge that can’t be broken down) and then burns it in a biomass power plant. “We produce more energy than we consume to clean up the water coming into our plant,” says Dines Thornberg, a manager at Biofos, the part-government-owned company that operates the treatment plant.


That may be true. But the process still does pollute, since some chemical degradation happens independent of microbes and creates carbon dioxide. To help Denmark hit its Paris agreement goals, Biofos wants to cut its own carbon footprint. That’s why the company gave Electrochaea valuable space at its plant to build the pilot-scale chemical plant that completes the job that Biofos’s microbes couldn’t do: convert the carbon dioxide released in the bioreactor into methane. To achieve this amazing transformation, Electrochaea gets help from microbial life called archaea.

Archaea are the oldest of the three branches of life, which also include bacteria and eukaryotes (consisting of all other more advanced organisms, including humans). Their ancient survival skills include one that we humans can now put to good use: an ability to naturally take in CO2 and turn it into methane.

Most scientists believe life on Earth was formed in hydrothermal vents, created by underwater volcanoes. The temperatures there can reach 400°C (750°F), far higher than that of boiling water. But the water in the vents doesn’t boil and the gases the vents release—including carbon dioxide and hydrogen—don’t explode thanks to the tremendous pressure applied by miles of seawater above. Some archaea that live in the vents have learned to use carbon dioxide as food, combining the carbon (C) from carbon dioxide (CO2) with hydrogen (H2) to form and release methane (CH4). Humans do the reverse, consuming carbon-rich food and combining it with oxygen—both readily available on land—to make and expel carbon dioxide.
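
Written out, the reaction these archaea run is textbook hydrogenotrophic methanogenesis: CO2 + 4 H2 → CH4 + 2 H2O, which also releases heat. That stoichiometry is where the methane, water and warmth described at the bioreactor below come from, and it shows why roughly four hydrogen molecules have to be supplied for every captured molecule of carbon dioxide.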

Electrochaea’s pilot plant on Biofos’s premises takes up an area the size of a tennis court. As usual for a chemical factory, there’s a tangle of stainless steel pipes with sensors and control valves. The pipes all lead to a big cylindrical bioreactor, about 10 meters (30 ft) tall, kept at 60°C (140°F) and eight times the atmospheric pressure. Through a small glass window at the bottom of the bioreactor, I can see a bubbling mixture the color of an avocado milkshake. That’s Electrochaea’s proprietary species of archaea, cultured and bred to efficiently combine carbon dioxide and hydrogen to produce methane.

           Electrochaea’s lab-scale bioreactor with archaea producing methane.

The gases injected into the bioreactor come from two sources. The wastewater-treatment plant sends a mixture of carbon dioxide and methane. Meanwhile, two shipping container-sized electrolyzers use renewable electricity to split water into hydrogen and oxygen. The oxygen is sent off into the atmosphere, and the hydrogen to the Electrochaea bioreactor.

The microbes are so effective that in the time the mixture of gases travels from the bottom of the bioreactor to its top, 99 out of every 100 molecules of carbon dioxide and hydrogen are converted into methane, water, and heat. The heat is useful: it helps keep the temperature inside the reactor steady. Meanwhile, as the archaea consume the CO2 and H2, they multiply. Because they are naturally occurring species, extra archaea can be simply dumped into sewers, flushed out by the water also created in the process.

The valuable product from the reaction is methane. In fact, the stuff created through this process is even more valuable than typical methane. It’s “renewable methane” or “biomethane,” because the gas was produced from non-fossil-fuel sources and without any fossil-fuel power. This methane can be used to run boilers in homes, power plants, and even cars or buses. It’s a cleaner fuel than coal and oil, producing the fewest emissions for each unit of energy released. Methane can also be very easily stored. In fact, across Europe, there are large underground stores, connected by pipes that can safely transport the gas.

When the system runs at full capacity, the archaea produce about 50 cubic meters of natural gas per hour. For every unit of energy of electricity fed into the system, it produces about 0.75 units of energy stored in the form of methane, according to Doris Hafenbradl, Electrochaea’s chief scientist. That’s not as good as lithium-ion batteries, which can reach near 100% efficiency. But unlike the energy stored in batteries, once methane is produced it can be stored indefinitely, because it doesn’t spontaneously degrade into other chemicals. If this process could be scaled up, it could solve renewable energy’s inter-seasonal storage problem.
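
Those two figures, 50 cubic meters of methane per hour and roughly 0.75 units of chemical energy out per unit of electricity in, imply an electrical input of several hundred kilowatts for the pilot plant. The sketch below works that through using a typical calorific value for methane of about 10 kWh per cubic meter, which is an assumption on my part since the article does not state one.

```python
methane_rate_m3_per_h = 50.0   # article: output at full capacity
energy_per_m3_kwh = 10.0       # assumed heating value of methane, kWh per m^3
round_trip_efficiency = 0.75   # article: ~0.75 units of methane energy per unit of electricity

methane_power_kw = methane_rate_m3_per_h * energy_per_m3_kwh
electric_input_kw = methane_power_kw / round_trip_efficiency

print(f"chemical energy in the gas: ~{methane_power_kw:.0f} kW")
print(f"implied electrical input  : ~{electric_input_kw:.0f} kW")
```

In other words, under that assumed calorific value, the Copenhagen pilot absorbs on the order of two-thirds of a megawatt of (ideally surplus) renewable electricity when running flat out.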

Electrochaea’s plant does not need to be close to solar farms or wind turbines, because excess electricity can be extracted from anywhere on the grid. The limitations on location come based on access to CO2. Water treatment plants are fair game and so are ethanol producers (such as beer and liquor factories). Exhaust gases from power plants could be used, but they will need to be cleaned to remove sulfur and particulate emissions, which could harm archaea.

Solving problems at scale

The Copenhagen plant is one of three where Electrochaea has successfully installed its technology. The US National Renewable Energy Laboratory has one installation on its campus in Boulder, Colorado, and the last is part of a European Commission-funded project in Solothurn, Switzerland. Ultimately, Electrochaea’s core business model is to license the technology. The company is currently courting a Hungarian power company that wants to build a plant 10 times the size of the one in Copenhagen. It would be the startup’s biggest yet. And the carmaker Audi has shown interest in Electrochaea’s technology as a way to use biomethane in its natural-gas-powered cars.

Electrochaea is one of a growing number of players in the “power-to-gas” industry. ITM Power and Hydrogenics, for example, make electrolyzers that convert excess renewable energy into storable hydrogen. But hydrogen is not as widely used as natural gas. That’s why companies such as Electrochaea, MicrobEnergy, and ETOGas are betting that it’s worth the extra step of turning that hydrogen into methane. MicrobEnergy, as the name suggests, uses microbes for the conversion, just like Electrochaea. ETOGas, a Hitachi-supported startup, uses metal catalysts.

Doing it this way also gives these companies a potential second business model: Because these processes all involve injecting carbon dioxide into the system, it could turn out to be the case that they are hired to install their power-to-gas systems simply to reduce a facility’s CO2 emissions, whether it has an excess of renewable energy or not.

Electrochaea hasn’t secured a carbon-capture customer yet. And to be sure, its technology is carbon-capture and recycling—not storage or removal. The methane produced in the process will eventually be burned again, and that will put carbon dioxide into the atmosphere. In other words, it simply delays the production of greenhouse gases, instead of eliminating them.

That said, there is some value in delaying emissions. Every CO2 molecule not dumped in the atmosphere right now is a CO2 molecule that doesn’t absorb and retain the sun’s heat. If nothing else, Electrochaea and companies like it may do their part in saving the planet simply by helping change the conversation around carbon dioxide, from treating it as a byproduct to considering it as a feedstock.
