CREATING AN INJECTABLE SWARM OF BRAIN READING NANO-SENSORS – POTENTIAL FOR DIAGNOSING NEUROLOGICAL DISORDERS & AS A POWERFUL BRAIN-COMPUTER INTERFACE – UC SANTA CRUZ


USING NANO-SENSORS TO DIAGNOSE NEUROLOGICAL DISORDERS & FOR POWERFUL BRAIN-COMPUTER INTERFACES.

A team of scientists has developed a new kind of biosensor that can be injected straight into the bloodstream and travel to the brain, where, according to the scientists behind the project, it will monitor neural activity and potentially even thoughts.

The cell-sized nanosensors, aptly named NeuroSWARM3, can cross the blood-brain barrier into the brain, where they convert neural activity into electrical signals that machinery can read and interpret, according to work by a team of University of California, Santa Cruz scientists that will be presented next week at a virtual Optical Society conference.

The tech could, the researchers say, help grant extra mobility to people with disabilities, in addition to helping scientists understand human thought better than before. However, the sensors haven’t yet been tested on humans or even animals.

“NeuroSWARM3 can convert the signals that accompany thoughts to remotely measurable signals for high precision brain-machine interfacing,” lead study author A. Ali Yanik said in a press release. “It will enable people suffering from physical disabilities to effectively interact with the external world and control wearable exoskeleton technology to overcome limitations of the body. It could also pick up early signatures of neural diseases.”

It’s also a notably different approach to the problem of brain-computer interfaces from most high-profile attempts, including Elon Musk’s Neuralink, which are working on implant-based solutions rather than nanosensor swarms.

During tests, the team found that their nanosensor swarm is sensitive enough to pick up on the activity of individual brain cells. Single-neuron readings aren’t new, but the ability to detect them with free-floating sensors, and especially the ability to wirelessly broadcast them through a patient’s thick skull, is an impressive technological development. If further tests continue to pan out, those capabilities could make real-time neuroscientific research simpler and neurological medicine more sophisticated.

“We are just at the beginning stages of this novel technology, but I think we have a good foundation to build on,” Yanik added. “Our next goal is to start experiments in animals.”

Mystery Solved? The hidden culprit killing lithium-metal batteries from the inside – Sandia National Laboratories


In this new, false-color image of a lithium-metal test battery produced by Sandia National Laboratories, lithium metal (red) deposited during high-rate charging and recharging greatly distorts the separator (green), creating reaction byproducts (tan), to the surprise of scientists. Credit: Katie Jungjohann

For decades, scientists have tried to make reliable lithium-metal batteries. These high-performance storage cells hold 50% more energy than their prolific, lithium-ion cousins, but higher failure rates and safety problems like fires and explosions have crippled commercialization efforts. Researchers have hypothesized why the devices fail, but direct evidence has been sparse.

Now, the first nanoscale images ever taken inside intact, lithium-metal coin batteries (also called button cells or watch batteries) challenge prevailing theories and could help make future high-performance batteries, such as for electric vehicles, safer, more powerful and longer lasting.

“We’re learning that we should be using separator materials tuned for lithium metal,” said battery scientist Katie Harrison, who leads Sandia National Laboratories’ team for improving the performance of lithium-metal batteries.

Sandia scientists, in collaboration with Thermo Fisher Scientific Inc., the University of Oregon and Lawrence Berkeley National Laboratory, published the images recently in ACS Energy Letters. The research was funded by Sandia’s Laboratory Directed Research and Development program and the Department of Energy.

Internal byproduct builds up, kills batteries

The team repeatedly charged and discharged lithium coin cells with the same high-intensity electric current that electric vehicles need to charge. Some cells went through a few cycles, while others went through more than a hundred cycles. Then, the cells were shipped to Thermo Fisher Scientific in Hillsboro, Oregon, for analysis.

When the team reviewed images of the batteries’ insides, they expected to find needle-shaped deposits of lithium spanning the battery. Most battery researchers think that a lithium spike forms after repetitive cycling and that it punches through a plastic separator between the anode and the cathode, forming a bridge that causes a short. But lithium is a soft metal, so scientists have not understood how it could get through the separator.

Harrison’s team found a surprising second culprit: a hard buildup formed as a byproduct of the battery’s internal chemical reactions. Every time the battery recharged, the byproduct, called solid electrolyte interphase, grew. Capping the lithium, it tore holes in the separator, creating openings for metal deposits to spread and form a short. Together, the lithium deposits and the byproduct were much more destructive than previously believed, acting less like a needle and more like a snowplow.

“The separator is completely shredded,” Harrison said, adding that this mechanism has been observed only at the fast charging rates needed for electric vehicle technologies, not at slower charging rates.

As Sandia scientists think about how to modify separator materials, Harrison says that further research also will be needed to reduce the formation of byproducts.

Scientists pair lasers with cryogenics to take “cool” images

Determining cause-of-death for a coin battery is surprisingly difficult. The trouble comes from its stainless-steel casing. The metal shell limits what diagnostics, like X-rays, can see from the outside, while removing parts of the cell for analysis rips apart the battery’s layers and distorts whatever evidence might be inside.

“We have different tools that can study different components of a battery, but really we haven’t had a tool that can resolve everything in one image,” said Katie Jungjohann, a Sandia nanoscale imaging scientist at the Center for Integrated Technologies. The center is a user facility jointly operated by Sandia and Los Alamos national laboratories.

She and her collaborators used a microscope that has a laser to mill through a battery’s outer casing. They paired it with a sample holder that keeps the cell’s liquid electrolyte frozen at temperatures between minus 148 and minus 184 degrees Fahrenheit (minus 100 and minus 120 degrees Celsius, respectively). The laser creates an opening just large enough for a narrow electron beam to enter and bounce back onto a detector, delivering a high-resolution image of the battery’s internal cross section with enough detail to distinguish the different materials.
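The two temperature ranges quoted above are consistent with the standard scale conversion; a quick illustrative check:

```python
def fahrenheit_to_celsius(f: float) -> float:
    """Standard Fahrenheit-to-Celsius conversion."""
    return (f - 32) * 5 / 9

# The cryogenic sample holder's operating range, as quoted above:
print(fahrenheit_to_celsius(-148))  # -100.0
print(fahrenheit_to_celsius(-184))  # -120.0
```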

The original demonstration instrument, which was the only such tool in the United States at the time, was built and still resides at a Thermo Fisher Scientific laboratory in Oregon. An updated duplicate now resides at Sandia. The tool will be used broadly across Sandia to help solve many materials and failure-analysis problems.

“This is what battery researchers have always wanted to see,” Jungjohann said.

More information: Katherine L. Jungjohann et al, Cryogenic Laser Ablation Reveals Short-Circuit Mechanism in Lithium Metal Batteries, ACS Energy Letters (2021). DOI: 10.1021/acsenergylett.1c00509

Provided by Sandia National Laboratories

A high-energy density and long-life initial-anode-free lithium battery


Cathode and electrolyte design strategies for the researchers’ anode-free Li cell system. Credit: Qiao et al.

Lithium-metal batteries (LMBs), an emerging type of rechargeable lithium-based battery that uses a lithium-metal anode in place of the graphite anode found in lithium-ion cells, are among the most promising high-energy-density rechargeable battery technologies. Despite these advantageous characteristics, conventional LMBs have several limitations, including a short cycle life and safety-related issues.

In recent years, researchers have tried to overcome these limitations by introducing an alternative, anode-free lithium battery cell design. This anode-free design could help to increase the energy density and safety of lithium batteries.

Researchers at the National Institute of Advanced Industrial Science and Technology recently carried out a study aimed at increasing the energy density of anode-free lithium batteries. Their paper, published in Nature Energy, introduces a new high-energy-density and long-life anode-free lithium battery based on the use of a Li2O sacrificial agent.

Anode-free full-cell battery architectures are typically based on a fully lithiated cathode with a bare anode copper current collector. Remarkably, both the gravimetric and volumetric energy densities of anode-free lithium batteries can be extended to their maximum limit. Anode-free cell architectures have several other advantages over more conventional LMB designs, including a lower cost, greater safety and simpler cell assembly procedures.

To unlock the full potential of anode-free LMBs, researchers should first figure out how to achieve the reversibility/stability of Li-metal plating. While many have tried to solve this problem by engineering and selecting more favorable electrolytes, most of these efforts have so far been unsuccessful.

Others have also explored the potential of using salts or additives that could improve the Li-metal plating/stripping reversibility. After reviewing these previous attempts, the researchers at the National Institute of Advanced Industrial Science and Technology proposed the use of Li2O as a sacrificial agent, which is pre-loaded onto a LiNi0.8Co0.1Mn0.1O2 surface.

“It is challenging to realize high Li reversibility, especially considering the limited Li reservoir (typically zero lithium excess) in the cell configuration,” the researchers wrote in their paper. “In this study we have introduced Li2O as a preloaded sacrificial agent on a LiNi0.8Co0.1Mn0.1O2 cathode, providing an additional Li source to offset the irreversible loss of Li during long-term cycling in an initial-anode-free cell.”

In addition to employing Li2O as a sacrificial agent, the researchers proposed the use of a fluoropropyl ether additive to neutralize the nucleophilic O2-, which is released during the oxidation of Li2O, preventing the additional evolution of gaseous O2 and resulting in the formation of a LiF-based layer on the surface of the battery’s cathode.

“We show that O2– species, released through Li2O oxidation, are synergistically neutralized by a fluorinated ether additive,” the researchers explained in their paper. “This leads to the construction of a LiF-based layer at the cathode/electrolyte interface, which passivates the cathode surface and restrains the detrimental oxidative decomposition of ether solvents.”

Based on the design they devised, Yu Qiao and the rest of the team at the National Institute of Advanced Industrial Science and Technology were able to realize a long-life 2.46 Ah initial-anode-free pouch cell. This cell exhibited a gravimetric energy density of 320 Wh kg-1, maintaining 80% of its capacity after 300 cycles.
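As a rough back-of-the-envelope check (not a figure from the paper), retaining 80% of capacity after 300 cycles implies a very small average fade per cycle, assuming the fade is uniform and exponential:

```python
def per_cycle_retention(final_fraction: float, cycles: int) -> float:
    """Average fraction of capacity retained per cycle, assuming
    uniform exponential fade over the whole cycling test."""
    return final_fraction ** (1 / cycles)

r = per_cycle_retention(0.80, 300)
print(f"average retention per cycle: {r:.5f}")           # ~0.99926
print(f"average capacity loss per cycle: {(1 - r):.4%}")  # ~0.07%
```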

In the future, the anode-free lithium battery introduced by this research group could help to overcome some of the commonly reported limitations of LMBs. In addition, its design could inspire the creation of safer lithium-based rechargeable batteries with higher energy densities and longer lifetimes.


More information: A high-energy-density and long-life initial-anode-free lithium battery enabled by a Li2O sacrificial agent. Nature Energy (2021). DOI: 10.1038/s41560-021-00839-0

Journal information: Nature Energy

Bloom Energy unveils new Electrolyser with more efficiency than market alternatives



Bloom Energy has unveiled its new “Bloom Electrolyser” today (July 14), said to be the most energy-efficient electrolyser for producing clean hydrogen to date, with 15 to 45% greater efficiency than other products on the market today.

The new electrolyser relies on the same commercially proven and proprietary solid oxide technology platform used by Bloom Energy Servers to provide on-site electricity at high fuel efficiency.

In being highly flexible, it offers unique advantages for deployment across a broad variety of hydrogen applications, using multiple energy sources including intermittent renewable energy and excess heat.

Read more: Bloom Energy reveals strategy for hydrogen market entry

Read more: 100kW of Bloom Energy SOFCs deployed in South Korea

The Bloom Electrolyser operates at high temperatures, meaning it requires less energy to break up water molecules and produce hydrogen.

As a result, Bloom Energy’s electrolyser consumes 15% less electricity than other electrolyser technologies to make hydrogen when electricity is the sole input source.

Unlike low-temperature PEM and alkaline electrolysers that predominantly require electricity to make hydrogen, the Bloom Electrolyser can leverage both electricity and heat to produce hydrogen.

Bloom Energy’s high-temperature electrolyser technology has the potential to use up to 45% less electricity when integrated with external heat sources than low-temperature PEM and alkaline electrolysers.
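The article's percentage claims can be put in rough per-kilogram terms. The ~55 kWh/kg baseline for low-temperature electrolysis below is an assumed ballpark figure of mine, not a number from Bloom Energy; only the 15% and 45% reductions come from the article:

```python
# Illustrative electricity consumption per kg of hydrogen produced.
# BASELINE is an assumed typical figure for low-temperature
# (PEM/alkaline) electrolysis, not a vendor specification.
BASELINE_KWH_PER_KG = 55.0

electricity_only = BASELINE_KWH_PER_KG * (1 - 0.15)    # electricity as sole input
with_external_heat = BASELINE_KWH_PER_KG * (1 - 0.45)  # integrated external heat

print(f"electricity only:   {electricity_only:.2f} kWh/kg")    # 46.75
print(f"with external heat: {with_external_heat:.2f} kWh/kg")  # 30.25
```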

When the Bloom Electrolyser is paired with intermittent renewable resources, such as wind and solar, the resulting green hydrogen provides an important storage mechanism.

Hydrogen can be stored for long periods of time and transported over long distances.

Alternatively, Bloom Energy’s fuel cells can convert this hydrogen to electricity, thereby providing continuous, reliable power.

KR Sridhar, founder, Chairman, and CEO of Bloom Energy, said, “The launch of the Bloom Electrolyser is a big leap forward in our mission to enable and empower the global hydrogen economy and a decarbonised society.

“Hydrogen enables us to leverage abundant and inexpensive renewable energy to provide zero-carbon power, reliably—instead of intermittently.

“Given its efficiency and input options to make hydrogen, Bloom Energy’s electrolyser is expected to produce hydrogen at a lower price than any alternative on the market today.”

New mechanism of superconductivity discovered in graphene



Superconductivity is a physical phenomenon where the electrical resistance of a material drops to zero under a certain critical temperature. Bardeen-Cooper-Schrieffer (BCS) theory is a well-established explanation that describes superconductivity in most materials. It states that Cooper pairs of electrons are formed in the lattice under sufficiently low temperature and that BCS superconductivity arises from their condensation.
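For reference, the standard weak-coupling BCS result ties the critical temperature to the lattice (Debye) frequency and the strength of the electron-phonon coupling; this is the textbook form, not an equation from the article:

```latex
% Weak-coupling BCS estimate of the critical temperature:
k_B T_c \approx 1.13\,\hbar\omega_D\, e^{-1/(N(0)V)}
```

Here ω_D is the Debye frequency, N(0) the density of states at the Fermi level, and V the effective attractive interaction; when the coupling N(0)V is weak, T_c is exponentially suppressed.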

While graphene itself is an excellent conductor of electricity, it does not exhibit BCS superconductivity due to the suppression of electron-phonon interactions. This is also the reason that most ‘good’ conductors such as gold and copper are ‘bad’ superconductors.

Researchers at the Center for Theoretical Physics of Complex Systems (PCS), within the Institute for Basic Science (IBS, South Korea) have reported on a novel alternative mechanism to achieve superconductivity in graphene. They achieved this feat by proposing a hybrid system consisting of graphene and 2D Bose-Einstein condensate (BEC).

Along with superconductivity, BEC is another phenomenon that arises at low temperatures. It is the fifth state of matter first predicted by Einstein in 1924. The formation of BEC occurs when low-energy atoms clump together and enter the same energy state, and it is an area that is widely studied in condensed matter physics.

A hybrid Bose-Fermi system essentially represents a layer of electrons interacting with a layer of bosons, such as indirect excitons or exciton-polaritons. The interaction between Bose and Fermi particles leads to a variety of novel phenomena, which piques interest from both fundamental and application-oriented perspectives.

In this work, the researchers report a new mechanism of superconductivity in graphene, which arises from interactions between electrons and “bogolons”, rather than phonons as in typical BCS systems. Bogolons, or Bogoliubov quasiparticles, are excitations within a BEC that have some characteristics of particles. In certain parameter ranges, this mechanism permits superconducting critical temperatures of up to 70 kelvin in graphene.

The researchers also developed a new microscopic BCS theory which focuses specifically on the novel hybrid graphene-based system. Their proposed model also predicts that superconducting properties can be enhanced with temperature, resulting in the non-monotonous temperature dependence of the superconducting gap.

Furthermore, the research showed that the Dirac dispersion of graphene is preserved in this bogolon-mediated scheme. This indicates that this superconducting mechanism involves electrons with relativistic dispersion — a phenomenon that is not so well-explored in condensed matter physics.

“This work sheds light on an alternative way to achieve high-temperature superconductivity. Meanwhile, by controlling the properties of a condensate, we can tune the superconductivity of graphene. This suggests another channel to control superconductor devices in the future,” explains Ivan Savenko, leader of the Light-Matter Interaction in Nanostructures (LUMIN) team at the PCS IBS.

Carbon Nanotubes: The Next Generation of Global Water Purification?



A typical day (maybe like yours) involves waking up, taking a 10-minute shower, cooking breakfast, running the dishwasher if it’s full, going to work, eating dinner with a refreshing glass of filtered water, and maybe tackling a load of laundry in the evening. None of these actions feel extravagant, but when I look at statistics on global water usage and the lack of fresh water availability, it’s obvious that we Americans consume significantly more water per day than people almost anywhere else in the world. In fact, on average each American uses about 152 gallons of water daily, while people in some other countries, such as Uganda and Haiti, use only about 4 gallons.1

That extremely low usage in some countries is not just because people are very conservation-minded – it is largely because there is not enough clean water to go around. Living in Wisconsin, I am conscientious about my water use, but I am fortunate not to be in constant fear of turning on the faucet to see no water, or even dirty water, pouring out. Unfortunately, as we’ve learned from the recent experience of people in Flint, Michigan, it is not only countries outside the U.S. that have to worry about the availability of clean water.

Personally, a trip to Israel last winter was what forced me to step out of my typical routine and experience firsthand how precious water is to their nation as a natural resource. My guide on the trip encouraged us to shower efficiently, never leave the water running while doing dishes, and to purchase bottled water, but to never waste a drop.

Enjoying the availability of fresh, clean water. (image by Elvert Barnes)

Being a scientist and an advocate for reducing my footprint on the environment, I wanted to learn more about what is being done to solve the problem of global freshwater availability. Considering what other recent medical and energy-related advancements have been made with the use of nanotechnology, it came as no surprise to me that carbon nanotubes (CNTs) are being studied as a means of water purification.

What exactly is a carbon nanotube? Picture a flat sheet of carbon atoms rolled into an incredibly tiny cylindrical tube with a diameter on the nanoscale, as shown here:

Carbon nanotube. (image from wikimedia commons)

These tiny, flexible, and surprisingly resilient materials may very well be our new direction in sustainable, large-scale water purification. The major advantage of CNTs is that water passes through them in a nearly frictionless manner due to something called hydrophobicity (i.e. preference to be away from water). But if CNTs are hydrophobic, you might wonder why water would even come near or enter them in the first place. This is an important point because water is a polar molecule and CNTs are non-polar, meaning they typically don’t want to mix together. One option for overcoming this is to coat the top of each tube with specific molecules that will initially attract water to the opening. Then, when water enters the nanotube, it flows through very quickly because it is being repelled by the hydrophobic tube walls. Conversely, most salts, ions, and pollutants won’t flow through the nanotube because they are attracted to and captured by the coating at the opening.


Because water passes through the CNTs so easily and pollutants do not, CNTs have potential to be a great solution for water purification. If we’re able to use CNTs in the next generation of water purification technology, then we may be able to both desalinate (get rid of salt) and remove pollutants from otherwise unusable water around the world with less energy and less money than current methods require.

Improving water purification really is a global issue. On my trip last winter, I learned that Israel, along with other nations in the Middle East and Northern Africa, was already considered “water stressed” in 1995. This phrase indicates that a nation is withdrawing more than 25% of its renewable freshwater resources for agricultural, domestic, and industrial uses annually. By the year 2025 it is predicted that over 2.8 billion people in 48 countries will be “water stressed” or worse.2 To combat this growing issue, countries are turning to unconventional options for wastewater treatment and desalination.
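The “water stressed” definition is simple enough to state precisely. A minimal sketch, using only the 25% threshold given above (the function name and units are mine, for illustration):

```python
def is_water_stressed(annual_withdrawal: float, renewable_freshwater: float) -> bool:
    """A nation is 'water stressed' when its annual withdrawals exceed
    25% of its renewable freshwater resources (same units for both)."""
    return annual_withdrawal / renewable_freshwater > 0.25

print(is_water_stressed(30.0, 100.0))  # True
print(is_water_stressed(10.0, 100.0))  # False
```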

Global map of freshwater stress
Freshwater stress map showing percentage ranges of how much water will be withdrawn with respect to amount naturally available.2

With a greater understanding of the problem at hand, let’s take a closer look at current techniques for water purification. Right now, the most common practice for cleansing salt water to produce freshwater is the use of desalination plants. These plants operate by the technique of reverse osmosis, which is shown in the figure below. Osmosis is when a less concentrated liquid (like fresh water) moves through a membrane toward a more concentrated liquid (water containing salt, in this case). This process happens naturally without any need for applying external pressure. For reverse osmosis, we want the exact opposite to happen, and unlike osmosis, that requires an input of external energy. The semi-permeable reverse osmosis membranes that separate the two liquids are usually made of common organic materials with pores ranging from 0.3 to 0.6 nanometers in diameter. (As a comparison, the diameter of a dime is 18,008,600 nanometers while the diameter of a water molecule is a mere 0.1 nm.)
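To see why reverse osmosis takes so much energy, the van ’t Hoff relation gives a rough lower bound on the pressure that must be overcome, which is also the thermodynamic minimum work per unit of water. The seawater concentration below is an assumed ballpark of mine, not a figure from this article:

```python
# Rough estimate of seawater osmotic pressure via the van 't Hoff
# relation, pi = i * c * R * T (an idealization for dilute solutions).
# Assumptions: ~0.6 mol/L NaCl, full dissociation (i = 2), 25 C.
R = 8.314       # gas constant, J/(mol*K)
T = 298.15      # temperature, K
c = 600.0       # salt concentration, mol/m^3 (~0.6 M NaCl)
i = 2           # ions per NaCl formula unit

pi_pa = i * c * R * T                 # osmotic pressure, pascals
print(f"osmotic pressure: {pi_pa / 1e6:.2f} MPa")  # ~3 MPa (~29 atm)

# The same figure is the minimum work to push 1 m^3 of pure water
# through the membrane against that pressure:
kwh_per_m3 = pi_pa / 3.6e6
print(f"minimum work: {kwh_per_m3:.2f} kWh per m^3")  # ~0.8 kWh/m^3
```

Real plants use several times this ideal minimum, which is where the incentive for lower-friction membranes like CNTs comes from.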

diagram of osmosis & reverse osmosis
Differences between osmosis and reverse osmosis. (image by Emily Caudill)

As of 2013, there were over 17,000 plants worldwide providing more than 300 million people with desalinated water, many using reverse osmosis technology.3 Israel alone, with its relatively small land mass, has four plants with a fifth currently being built, while the U.S. houses about 250 plants.4

Desalination plants are effective, yet they are costly monetarily, energetically, and environmentally speaking. These plants require a great deal of energy input daily, and most desalination plants run on non-renewable energy sources (like fossil fuels and nuclear energy).5 With global freshwater availability declining, there is a need for cheaper, more efficient, and more environmentally sustainable desalination technology, and CNTs may be our most viable option looking forward. Their desalination capacity and frictionless water interaction means they require less energy than reverse osmosis to produce the same amount of fresh water.

reverse osmosis desalination plant
Inside a reverse osmosis desalination plant  (image by James Grellier)

There are a few hurdles that must be overcome before we hear about the next “CNT desalination plant,” though. For example, we need to develop a method for large scale synthesis of CNTs that ensures consistent shape, size, and function. However, it is evident that our current methods of water purification aren’t enough and motivation is strong to develop this new use for CNTs.6

With increasing resources and technologies, we are closer than ever to a newer, faster, cheaper, and overall better way to provide fresh water on the global scale. Nanotechnology has proven to be a wonderful alternative to previously used methods in energy and medicine, for example, and we may be close to seeing a similar improvement in the case of water purification. As part of the Center for Sustainable Nanotechnology, I want to mention that it will be absolutely vital to produce and dispose of CNTs in a way that doesn’t lead to greater issues in the future (such as their own toxicity to the environment). This is an exciting time in science because we have groundbreaking approaches, but we are also in a day and age where we need to be more thoughtful about how our actions impact our finite supply of water, precious metals, and other vital resources.




REFERENCES (may require subscription for full access)

  1. The Water Information Program. Water Facts http://www.waterinfo.org/resources/water-facts.
  2. United Nations Environment Program. Vital Water Graphics, retrieved from http://www.unep.org/dewa/vitalwater/article141.html.
  3. International Desalination Association. Desalination by the Numbers http://idadesal.org/desalination-101/desalination-by-the-numbers/.
  4. Texas Water Development Board. Seawater FAQs http://www.twdb.texas.gov/innovativewater/desal/faqseawater.asp.
  5. Nuclear Energy Institute. Water Desalination http://www.nei.org/Knowledge-Center/Other-Nuclear-Energy-Applications/Water-Desalination
  6. Das, R.; Ali, M. E.; Hamid, S. B. A.; Ramakrishna, S.; Chowdhury, Z. Z. Carbon nanotube membranes for water purification: a bright future in water desalination. Desalination 2014, 336, 97–109. doi: 10.1016/j.desal.2013.12.026

China’s Dominance Of Clean Energy Supply Chains Raises Concerns



Over the past decade, no major energy source has grown faster than solar power. According to the 2020 BP Statistical Review of World Energy, installed solar photovoltaic (PV) capacity has grown at an average annual rate of over 42% over the past 10 years, translating into a doubling of global capacity every 1.7 years on average.
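Growth rates and doubling times are related by a simple formula that is worth keeping in mind when reading figures like these (a 1.7-year doubling actually corresponds to roughly 50% annual growth, while 42% growth doubles in about two years):

```python
import math

def doubling_time(annual_growth_rate: float) -> float:
    """Years for capacity to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

print(f"{doubling_time(0.42):.2f} years")  # ~1.98
print(f"{doubling_time(0.50):.2f} years")  # ~1.71
```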

Although that blistering pace could start slowing down as installed capacity grows, solar will likely remain the fastest-growing energy source for the foreseeable future. Much as with other energy sources, however, solar growth is giving rise to a number of thorny questions regarding geopolitics, supply chains, and national security.

The Path to Decarbonization

From a North American perspective, the election of Joe Biden as U.S. President has breathed new life into the Paris climate agreement — the most significant global effort to rein in carbon dioxide emissions to date. Fulfilling a key campaign promise, President Biden officially rejoined the Paris accord last month. At the same time, following meetings between President Biden and Canadian Prime Minister Justin Trudeau, Canada also pledged to submit its own new target under the Paris pact, with the two leaders insisting on a joint approach to climate issues.

The European Union, for its part, has consistently maintained an aggressive stance toward carbon emission reductions. The EU is on a path to surpass its goal of generating a third of its energy from renewable sources by 2030. Last September, the European Commission presented its plan to reduce EU greenhouse gas emissions by at least 55% by 2030, compared to 1990 levels. That would put the EU on a path to reach climate neutrality by 2050.

All of these efforts point to one inescapable conclusion: installed renewable energy capacity will continue to rise as governments on both sides of the Atlantic pour money into decarbonization efforts.

At the same time, many of these countries are understandably sensitive about energy security. Political leaders don’t like to depend on other countries for their energy supplies, but this is frequently an accepted trade-off due to economic considerations.

That pattern has long held true for fossil fuels, with OPEC maintaining a stranglehold on the world’s oil supplies until the U.S. fracking boom somewhat weakened its monopoly. Now, as the renewable revolution picks up steam, one country – China – has built up a clear advantage around certain key renewable technologies, in particular the components needed to construct solar energy infrastructure in the West.

Huawei in the Spotlight

China’s own energy consumption continues to grow rapidly, making the Chinese economy the world’s largest energy consumer. As a result, Beijing invested aggressively in renewables and has now achieved predominant market shares in solar photovoltaics as well as lithium-ion batteries, another key renewable technology.

Chinese state-linked company Huawei, better known for telecommunications equipment and consumer electronics, has also become one of the world’s largest suppliers of solar inverters, a critical part of solar PV systems that converts direct current power generated by solar panels into alternating current electricity to power electronics in homes and businesses.

Huawei’s dominant position in the inverter market, coupled with the backing it enjoys from the Chinese government, has raised concerns in the U.S. In 2019, a bipartisan group of U.S. Senators sent letters to Energy Secretary Rick Perry and Department of Homeland Security Secretary Kirstjen Nielsen, urging them to ban the sale of all Huawei solar products in the U.S., citing a national security threat to U.S. energy infrastructure.

Noting that Congress had previously blocked Huawei from the U.S. telecommunications equipment market due to concerns over its links to China’s intelligence services, the letter stated in part:

“Both large-scale photovoltaic systems and those used by homeowners, school districts, and businesses are equally vulnerable to cyberattacks. Our federal government should consider a ban on the use of Huawei inverters in the United States and work with state and local regulators to raise awareness and mitigate potential threats.”


Comparing finite and renewable planetary energy reserves (Terawatt-years). Total recoverable reserves are shown for the finite resources. Yearly potential is shown for the renewables.

The concern is that if the U.S. power grid becomes dependent on a critical piece of state-linked Chinese electronic equipment, it could render that grid especially vulnerable to outside disruption or manipulation. This dynamic mirrors concerns in the U.S. about reliance on OPEC for oil supplies. Huawei responded to what it called an “unwelcoming climate being fostered in the United States” by closing its U.S. inverter business.

Europe’s Diverging Approach

A more ambivalent approach toward Huawei was initially adopted in the EU, which only agreed to reduce its dependency on equipment susceptible to Chinese government influence for future 5G networks. However, officials in a number of EU countries are now sounding an alarm over the Chinese state’s role in sectors of their economies that represent key national security interests, including banking, energy, and infrastructure.

Those concerns extend to solar energy, with EU policymakers also expressing concern over China’s use of Muslim forced labor in solar PV module supply chains. That issue has given additional impetus to the European Parliament, which is now pushing for trade bans on Chinese solar module equipment if human rights abuses are involved in their manufacture.

These factors all create major incentives for Western countries to address Chinese state dominance in the clean energy sector. That imbalance didn’t arise overnight, and it will take some time to address.

President Biden took a step in that direction by signing an executive order aimed at making U.S. supply chains more resilient. Among other things, the order calls for identifying “risks in the supply chain for high-capacity batteries, including electric-vehicle batteries, and policy recommendations to address these risks.”

The EU will now have to decide whether it is ready to pursue a similar approach. Clean energy supply chains haven’t received a lot of policy attention until recently, but governments are increasingly under pressure to ensure potential threats to those supply chains don’t derail global efforts to decarbonize.

Article from The Energy Collective Group: Robert Rapier

How Quantum Computing Could Remake Chemistry


Photo credit: Andriy Onufriyenko. Article by Jeanette Garcia

In my career as a chemist, I owe a huge debt to serendipity. In 2012, I was in the right place (IBM’s Almaden research lab in California) at the right time—and I did the “wrong” thing. I was supposed to be mixing three components in a beaker, systematically searching for a combination in which one of the chemicals was replaced with a version derived from plastic waste, in an effort to make thermoset polymers more sustainable.

Instead, when I mixed two of the reagents together, a hard, white plastic substance formed in the beaker. It was so tough I had to smash the beaker to get it out. Furthermore, when it sat in dilute acid overnight, it reverted to its starting materials. Without meaning to, I had discovered a whole new family of recyclable thermoset polymers. Had I considered it a failed experiment, and not followed up, we would have never known what we had made. It was scientific serendipity at its best, in the noble tradition of Roy Plunkett, who invented Teflon by accident while working on the chemistry of coolant gases.

Today, I have a new goal: to reduce the need for serendipity in chemical discovery. Nature is posing some real challenges in the world, from the ongoing climate crisis to the wake-up call of COVID-19. These challenges are simply too big to rely on serendipity. Nature is complex and powerful, and we need to be able to accurately model it if we want to make the necessary scientific advances.

Specifically, we need to be able to understand the energetics of chemical reactions with a high level of confidence if we want to push the field of chemistry forward. This is not a new insight, but it is one that highlights a major constraint: accurately predicting the behavior of even simple molecules is beyond the capabilities of even the most powerful computers.

This is where quantum computing offers the possibility of major advances in the coming years. Modeling energetic reactions on classical computers requires approximations, since they can’t model the quantum behavior of electrons over a certain system size. Each approximation reduces the value of the model and increases the amount of lab work that chemists have to do to validate and guide the model. Quantum computing, however, is now at the point where it can begin to model the energetics and properties of small molecules such as lithium hydride, LiH—offering the possibility of models that will provide clearer pathways to discovery than we have now.
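The scaling wall can be made concrete with a toy calculation (an illustration only; the function name and the 16-bytes-per-amplitude figure are assumptions, not from the article): the exact quantum state of n spin-orbitals has 2^n complex amplitudes, so classical storage grows exponentially.

```python
# Toy illustration of why classical machines can't exactly model quantum
# behavior past a certain system size: an n-spin-orbital wavefunction has
# 2**n complex amplitudes, so exact storage grows exponentially.
def state_vector_bytes(n_spin_orbitals, bytes_per_amplitude=16):
    """Memory needed to store one exact state vector (complex128)."""
    return (2 ** n_spin_orbitals) * bytes_per_amplitude

# LiH in a minimal basis is commonly mapped to roughly a dozen qubits;
# larger molecules quickly exceed any conceivable classical memory.
for n in (12, 50, 100):
    print(f"{n:>3} spin-orbitals -> {state_vector_bytes(n) / 1e9:.3g} GB")
```

Twelve spin-orbitals fit in kilobytes; a hundred would require more memory than exists on Earth, which is the gap quantum hardware aims to close.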

THE QUANTUM CHEMISTRY LEGACY

Of course, quantum chemistry as a field is nothing new. In the early 20th century, German physicists such as Walter Heitler and Fritz London showed that the covalent bond could be understood using quantum mechanics. In the late 20th century, the growth in computing power available to chemists meant it was practical to do some basic modeling on classical systems.

Even so, when I was getting my Ph.D. in the mid-2000s at Boston College, it was relatively rare that bench chemists had a working knowledge of the kind of chemical modeling that was available via computational approaches such as density functional theory (DFT). The disciplines (and skill sets involved) were orthogonal. Instead of exploring the insights of DFT, bench chemists stuck to systematic approaches combined with a hope for an educated but often lucky discovery. I was fortunate enough to work in the research group of Professor Amir Hoveyda, who was early to recognize the value of combining experimental research with theoretical research.

THE DISCONTENTS OF COARSE DATA

Today, theoretical research and modeling chemical reactions to understand experimental results is commonplace, as the theoretical discipline became more sophisticated and bench chemists gradually began to incorporate these models into their work. The output of the models provides a useful feedback loop for in-lab discovery. To take one example, the explosion of available chemical data from high throughput screening has allowed for the creation of well-developed chemical models. Industrial uses of these models include drug discovery and material experimentation.

The limiting factor of these models, however, is the need to simplify. At each stage of the simulation, you have to pick a certain area where you want to make your compromise on accuracy in order to stay within the bounds of what the computer can practically handle. In the terminology of the field, you are working with “coarse-grained” models—where you deliberately simplify the known elements of the reaction in order to prioritize accuracy in the areas you are investigating. Each simplification reduces the overall accuracy of your model and limits its usefulness in the pursuit of discovery. To put it bluntly, the coarser your data, the more labor intensive your lab work.

The quantum approach is different. At its purest, quantum computing lets you model nature as it is; no approximations. In the oft-quoted words of Richard Feynman, “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical.”

We’ve seen rapid advances in the power of quantum computers in recent years. IBM doubled its quantum volume not once but twice in 2020 and is on course to reach quantum volume of more than 1,000, compared with single-digit figures in 2016. Others in the industry have also made bold claims about the power and capabilities of their machines.

So far, we have extended the use of quantum computers to model energies related to the ground states and excited states of molecules. These types of calculations will lead us to be able to explore reaction energy landscapes and photo-reactive molecules. In addition, we’ve explored using them to model the dipole moment in small molecules, a step in the direction of understanding electronic distribution and polarizability of molecules, which can also tell us something about how they react.

Looking ahead, we’ve started laying the foundation for future modeling of chemical systems using quantum computers and have been exploring different types of calculations on different types of molecules solvable on a quantum computer today. For example, what happens when you have an unpaired electron in the system? Do the calculations lose fidelity, and how can we adjust the algorithm to get them to match the expected results? This type of work will enable us to someday look at radical species, which can be notoriously difficult to analyze in the lab or simulate classically.

To be sure, this work is all replicable on classical computers. Still, none of it would have been possible with the quantum technology that existed five years ago. The progress in recent years holds out the promise that quantum computing can serve as a powerful catalyst for chemical discovery in the near future.

QUANTUM MEETS CLASSICAL

I don’t envision a future where chemists simply plug algorithms into a quantum device and are given a clear set of data for immediate discovery in the lab. What is feasible—and may already be possible—is incorporating quantum models as a step in the existing processes that currently rely on classical computers.

In this approach, we use classical methods for the computationally intensive part of a model. This could include an enzyme, a polymer chain or a metal surface. We then apply a quantum method to model distinct interactions—such as the chemistry in the enzyme pocket, explicit interactions between a solvent molecule and a polymer chain, or hydrogen bonding in a small molecule. We would still accept approximations in certain parts of the model but would achieve much greater accuracy in the most distinct parts of the reaction. We have already made important progress through studying the possibility of embedding quantum electronic structure calculation into a classically computed environment obtained at the Hartree-Fock (HF) or DFT level of theory.
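The division of labor can be sketched with a deliberately crude toy model (an assumption for illustration: real embedding treats the environment at the HF or DFT level, not as a diagonal average, and the “quantum” step would run on quantum hardware rather than NumPy):

```python
import numpy as np

# Toy sketch of the embedding idea: solve a small "active" block of a
# Hamiltonian exactly, while the large "environment" contributes only a
# crude mean-field-like energy shift. Purely illustrative, not HF/DFT.
def toy_embedding_energy(H, active):
    """H: full Hermitian matrix; active: indices treated 'quantum'."""
    env = [i for i in range(H.shape[0]) if i not in active]
    # Environment handled approximately: average of its diagonal energies.
    e_env = np.mean(np.diag(H)[env])
    # Active region handled exactly: lowest eigenvalue of its sub-block.
    e_act = np.linalg.eigvalsh(H[np.ix_(active, active)])[0]
    return e_act + e_env

rng = np.random.default_rng(0)
A = rng.normal(size=(8, 8))
H = (A + A.T) / 2              # symmetric, hence Hermitian
print(toy_embedding_energy(H, active=[0, 1, 2]))
```

The design point is that the expensive exact solve touches only the small active block, while the environment costs almost nothing, which is exactly the trade the hybrid quantum-classical workflow makes.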

The practical applications of advancing this approach are numerous and impactful. More rapid advances in the field of polymer chains could help address the problem of plastic pollution, which has grown more acute since China has cut its imports of recyclable material. The energy costs of domestic plastic recycling remain relatively high; if we can develop plastics that are easier to recycle, we could make a major dent in plastic waste. Beyond the field of plastics, the need for materials with lower carbon emissions is ever more pressing, and the ability to manufacture substances such as jet fuel and concrete with a smaller carbon footprint is crucial to reducing total global emissions.

MODELING THE FUTURE

The next generation of chemists emerging from grad schools across the world brings a level of data fluency that would have been unimaginable in the 2000s. But the constraints on this fluency are physical: classically built computers simply cannot handle the level of complexity of substances as commonplace as caffeine. In this dynamic, no amount of data fluency can obviate the need for serendipity: you will be working in a world where you need luck on your side to make important advances. The development of— and embrace of—quantum computers is therefore crucial to the future practice of chemists.

This is an opinion and analysis article.

Oil & Gas Supermajor Total Orders Hydrogen Re-Fueling Station from HRS – A Hydrogen Fuel Success Story



Hydrogen Refueling Solutions (HRS) has announced that it has received an order from Total for the supply and installation of a hydrogen station at the site of one of its customers.

A European designer and manufacturer of hydrogen fueling stations, HRS is a success story. Recently listed on the stock exchange, the Isère-based company has just formalized an order for the supply and installation of a hydrogen station for one of the Total group’s customers.


A hydrogen station delivered by June 2021

While HRS does not specify the location of this future station, it says the station will be delivered and commissioned by June 2021.
Specially designed to meet the needs of Total’s teams, this station will be able to distribute up to 200 kilograms of hydrogen per day. Accessible to all types of vehicles, it will offer two levels of pressure: 350 and 700 bars. With a storage capacity of 190 kilos, it can be easily dismantled and transported.
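A quick back-of-envelope reading of those specs (the ~5 kg fill size is an assumption for a typical fuel-cell passenger car, not a figure from the announcement):

```python
# Illustrative capacity math for the station described above.
daily_capacity_kg = 200        # dispensing capacity per day (from article)
storage_kg = 190               # on-site storage (from article)
kg_per_car_fill = 5            # ASSUMED typical passenger-car fill

fills_per_day = daily_capacity_kg // kg_per_car_fill
print(f"~{fills_per_day} passenger-car fills per day")
print(f"storage covers ~{storage_kg / daily_capacity_kg:.0%} of a full day")
```

In other words, the unit is sized for fleet trials of a few dozen light vehicles a day, or a handful of buses or trucks, consistent with the experimental uses Total describes below.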

Philippe Callejon, Director of Mobility and New Energies at Total Marketing France, said:

“With this high-capacity, transportable HRS solution, Total is able to offer its customers an innovative, turnkey and rapidly deployable temporary rental solution to address their experimental operational needs (bus fleets, refuse collection trucks, heavy trucks, commercial vehicles …).”

Hydrogen projects worth $300 billion are dropping green H2 prices fast


 
Key hydrogen projects that have been announced globally – Hydrogen Council

A new Hydrogen Council report sheds some light on Hydrogen’s rise as a green fuel source. More than 30 countries now have a national H2 strategy and budget in place, and there are 228 projects in the pipeline on both the production and usage sides.

Europe is leading the way, with 126 projects announced to date, followed by Asia with 46, Oceania with 24 and North America with 19. In terms of gigawatt-scale H2 production projects, there are 17 projects planned, with the largest in Europe, Australia, the Middle East and Chile.

Overall, projects seem fairly well balanced between hydrogen production and end-use applications, with a smaller number focusing on distribution.

European projects are balanced between production and usage initiatives, while Korea and Japan are developing much more on the usage side, for both transport and industrial applications. Australia and the Middle East are more active on the supply side, working to position themselves as hydrogen exporters.

The majority of these projects – some 75 percent, it should be noted – have been announced but do not yet have funding committed. This figure includes budgets committed by governments for spending, for which no project has yet been identified.

Only US$45 billion worth of projects are at the “mature” stage, having reached the feasibility study or engineering and design stage, and $38 billion are at the “realized” stage, with a final investment decision made, construction started, or already operational. 

Hydrogen production projections for 2030 have leapt up in the last year. The previous report estimated that 2.3 million tons will be produced annually by 2030, and this report revises that figure up to 6.7 million tons. To put that another way, two-thirds of the global hydrogen production expected to be operational in 2030 has been announced in the last year.
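The “two-thirds” framing follows directly from the two forecasts; a quick check of the arithmetic (figures taken from the report as quoted above):

```python
# Quick check of the report's arithmetic on 2030 production forecasts.
prev_2030_forecast = 2.3   # million tonnes/year, previous report
new_2030_forecast = 6.7    # million tonnes/year, current report

announced_last_year = new_2030_forecast - prev_2030_forecast
share = announced_last_year / new_2030_forecast
print(f"{share:.0%} of expected 2030 output was announced in the past year")
```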

Government decarbonization initiatives are a huge driving force behind the hydrogen wave, with some $70 billion committed globally. Carbon pricing is helping, with some 80 percent of global GDP covered by some kind of CO2 pricing mechanism. 

Japan and Korea, as you’d expect, are leading the charge on fuel cell vehicles, and globally the report projects some 4.5 million FCVs on the road by 2030, with 10,500 hydrogen fuel stations targeted to meet that demand.


Green hydrogen production prices are dropping faster than previously expected, with optimal operations beginning to achieve price parity by 2030 even without carbon taxes on gray hydrogen  – Hydrogen Council

There’s good news too in terms of production costs, with prices for green, renewable hydrogen falling faster than expected. Partially, this is because electrolyzer supply chains are ramping up faster than expected, bringing the price of electrolyzers down 30-50 percent lower than anticipated.

Other factors include a declining cost of energy, with renewable energy costs revised down by 15 percent, and green hydrogen production companies figuring out their mix of renewable inputs more effectively to keep the electrolyzers up and running longer.

So while “gray” hydrogen costs are expected to remain stable at around $1.59 per kg, green hydrogen is expected to drop from its current price around $4-5.50 per kilogram to hit an average of $1.50 by 2050, with green supply potentially becoming cheaper than gray hydrogen in optimal areas as soon as 2030. Low-carbon hydrogen production will start coming online around 2025, with prices sitting roughly between the two. Adding carbon taxes to the gray production could bring green hydrogen to price parity by 2030. 
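A rough sketch of the average-cost trajectory those figures imply, assuming a straight-line decline between the endpoints the report gives (the $4.75 starting point is the midpoint of the $4–5.50 range; optimal sites run well below this average, which is how they can reach parity with gray hydrogen at ~$1.59/kg as early as 2030):

```python
# Linear interpolation between the report's green-hydrogen cost endpoints.
green_2021, green_2050 = 4.75, 1.50   # $/kg; start is midpoint of $4-5.50

def green_avg_price(year):
    """Assumed straight-line average cost between 2021 and 2050."""
    frac = (year - 2021) / (2050 - 2021)
    return green_2021 + frac * (green_2050 - green_2021)

for year in (2030, 2040, 2050):
    print(year, f"${green_avg_price(year):.2f}/kg")
```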

Hydrogen transport is going to become a big deal, with major demand centers likely to look at imports. The cheapest way to do it for short to medium distances is through retrofitted pipelines, provided you’ve got a guaranteed demand to fill.

If demand fluctuates, trucks become more attractive. For longer distances, some routes have undersea pipelines that could be used, but much of the rest will have to be done using ships, which will add around $1-2 to the cost per kilogram.

Long-range overland pipelines also look like an interesting opportunity, with the report pointing out that hydrogen pipelines can transport 10 times more energy than a long-distance electricity transmission line at one eighth the cost. And existing pipelines can be retrofitted to handle hydrogen to vastly reduce the cost of pipeline projects.
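Taken together, those two ratios compound: ten times the energy at one-eighth the cost works out to roughly an 80-fold advantage per unit of energy moved.

```python
# Back-of-envelope check of the report's pipeline-vs-transmission claim.
energy_ratio = 10       # pipeline moves 10x the energy of a power line
cost_ratio = 1 / 8      # at 1/8 the cost

advantage = energy_ratio / cost_ratio
print(f"cost per unit energy: pipeline is ~{advantage:.0f}x cheaper")
```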

The report makes further long-term projections for hydrogen vehicles, trucks, ships and aircraft. In aviation, the report projects hydrogen will become a cost-effective way to decarbonize short- and medium-range flights (sub-10,000 km, or 6,200 mi) by around 2040, but there’ll need to be significant advances in storage to make it practical for longer-range flights.

The report should not be taken as gospel, having been written by the H2 industry itself, but it makes for some interesting reading if you’re interested in the development of the clean energy economy.

Source: Hydrogen Council
