“On the Rebound”: The quest to introduce self-healing behaviors in nanoparticles – Stanford University


In a newly discovered twist, Argonne scientists and collaborators found that palladium nanoparticles can repair atomic dislocations in their crystal structure. This self-healing behavior could be worth exploring in other materials. (Image by Argonne National Laboratory.)

Our bodies have a remarkable ability to heal from broken ankles or dislocated wrists. Now, a new study has shown that some nanoparticles can also “self-heal” after experiencing intense strain, once that strain is removed.

New research from the U.S. Department of Energy’s (DOE) Argonne National Laboratory and Stanford University has found that palladium nanoparticles can repair atomic dislocations in their crystal structure. This newly discovered twist could ultimately advance the quest to introduce self-healing behaviors in other materials.

“It turns out that these nanoparticles function much more like the human body healing from an injury than like a broken machine that can’t fix itself.” – Andrew Ulvestad, Argonne materials scientist

The research follows a study from last year, in which Argonne researchers looked at the sponge-like way that palladium nanoparticles absorb hydrogen.

When palladium particles absorb hydrogen, their spongy surfaces swell. However, the interiors of the palladium particles remain less flexible. As the process continues, something eventually cracks in a particle’s crystal structure, dislocating one or more atoms.

“One would never expect the dislocation to come out under normal conditions,” said Argonne materials scientist Andrew Ulvestad, the lead author of the study. “But it turns out that these nanoparticles function much more like the human body healing from an injury than like a broken machine that can’t fix itself.”

Ulvestad explained that the dislocations form as a way for the material to relieve the stress placed on its atoms by the infusion of additional hydrogen. When scientists remove the hydrogen from the nanoparticle, the dislocations have room to mend.

Using the X-rays provided by Argonne’s Advanced Photon Source, a DOE Office of Science User Facility, Ulvestad was able to track the motion of the dislocations before and after the healing process. To do so, he used a technique called Bragg coherent diffraction imaging, which identifies a dislocation by the ripple effects it produces in the rest of the particle’s crystal lattice.

In some particles, the stress of the hydrogen absorption introduced multiple dislocations. But even particles that dislocated in multiple places could heal to the point where they were almost pristine.

“In some cases, we saw five to eight original dislocations, and some of those were deep in the particle,” Ulvestad said. “After the particle healed, there would be maybe one or two close to the surface.”

Researchers are still unsure exactly how the material heals, Ulvestad said, but it likely involves the relationship between the material’s surface and its interior.

By better understanding how the material heals, Ulvestad and his colleagues hope to tailor the dislocations to improve material properties. “Dislocations aren’t necessarily bad, but we want to control how they form and how they can be removed,” he said.

The study, entitled “The self-healing of defects induced by the hydriding phase transformation in palladium nanoparticles,” appeared November 9 in Nature Communications.

The work was supported by DOE’s Office of Science and the National Science Foundation.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the Office of Science website.


The Death Of Oil: Scientists Eyeball 2X EV Battery Range


For someone who’s all in for fossil fuels, President* Trump sure has a thing for electric vehicles.

Last October the US Energy Department announced $15 million in funding to jumpstart the next generation of “extremely” fast charging systems, and last week the agency’s SLAC National Accelerator Laboratory announced a breakthrough discovery for doubling the range of EV batteries.

Put the two together, and you have EVs that can go farther than any old car, and fuel up just about as quickly.

Add the convenience factor of charging up at home or at work, and there’s your recipe for killing oil. #ThanksTrump!

A Breakthrough Energy Storage Discovery For Electric Vehicles

Did you know that it’s possible to double the range of today’s electric vehicles?

No, really! The current crop of lithium-ion batteries uses just half of its theoretical capacity, so there is much room for improvement.

To get closer to 100%, all you have to do is “overstuff” the positive electrode — the cathode — with more lithium. Theoretically, that would enable the battery to absorb more ions in the same space. Theoretically.

Unfortunately, previous researchers have demonstrated that supercharged cathodes lose voltage too quickly to be useful in EVs, because their atomic structure changes.

During the charge cycle, lithium ions leave the supercharged cathode and transition metal atoms move in. When the battery discharges, not all of the transition metal atoms go back to where they came from, leaving less space for the lithium ions to return.

It’s kind of like letting two friends crash on your couch, and one of them never leaves.
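The couch-crasher picture can be sketched as a toy simulation. Everything here is illustrative — the hop and “stuck” fractions are invented parameters, not measurements from the study:

```python
# Toy model of the "couch crasher" problem: each charge cycle, some
# transition-metal (TM) atoms hop into vacated lithium sites, and a
# fraction of those never hop back, permanently blocking the sites.

def blocked_sites_after(cycles, hop_frac=0.05, stuck_frac=0.10, sites=1000):
    """Lithium sites permanently blocked after `cycles` charge/discharge cycles."""
    blocked = 0.0
    for _ in range(cycles):
        hopped = (sites - blocked) * hop_frac    # TM atoms entering open Li sites
        blocked += hopped * stuck_frac           # the "friend who never leaves"
    return blocked

for n in (1, 10, 100):
    left = 100 * (1000 - blocked_sites_after(n)) / 1000
    print(f"after {n:3d} cycles: {left:.1f}% of lithium sites still usable")
```

The point of the sketch: even a small per-cycle fraction of stuck atoms compounds into a steady loss of usable sites, which is why controlling the migration pathway matters.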

That’s the problem tackled by a research team based at the SLAC National Accelerator Laboratory (SLAC is located at Stanford University and the name is a long story involving some trademark issues, but apparently it’s all good now).

Here’s Stanford grad student and study leader William E. Gent enthusing over the new breakthrough:

It gives us a promising new pathway for optimizing the voltage performance of lithium-rich cathodes by controlling the way their atomic structure evolves as a battery charges and discharges.

Umm, okay.

In other words, the SLAC team discovered a way to manipulate the atomic structure of supercharged cathodes, so the battery doesn’t lose voltage during the charge/discharge cycle.

And, here’s where oil dies:

The more ions an electrode can absorb and release in relation to its size and weight — a factor known as capacity — the more energy it can store and the smaller and lighter a battery can be, allowing batteries to shrink and electric cars to travel more miles between charges.
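The quoted definition can be put in numbers. The relation (specific energy is roughly specific capacity times average voltage) is standard electrochemistry; the capacity and voltage figures below are typical textbook values for cathode materials, not numbers from the SLAC study:

```python
# Specific energy (Wh/kg) ~ specific capacity (mAh/g) x average voltage (V),
# since mAh/g x V = mWh/g = Wh/kg. Illustrative textbook-style values only.

def specific_energy_wh_per_kg(capacity_mah_per_g, avg_voltage_v):
    return capacity_mah_per_g * avg_voltage_v

conventional = specific_energy_wh_per_kg(160, 3.7)   # standard layered-oxide cathode
lithium_rich = specific_energy_wh_per_kg(300, 3.5)   # "overstuffed" Li-rich cathode

print(f"conventional cathode: ~{conventional:.0f} Wh/kg")
print(f"lithium-rich cathode: ~{lithium_rich:.0f} Wh/kg")
```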

No, Really — How Does It Work?

To get to the root of the problem, the research team deployed some fancy equipment at SLAC’s SSRL (Stanford Synchrotron Radiation Lightsource) to track the atomic-level changes that a lithium-rich battery undergoes during charging cycles.

First, they defined the problem:

…clarifying the nature of anion redox and its effect on electrochemical stability requires an approach that simultaneously probes the spatial distribution of anion redox chemistry and the evolution of local structure.

The research team “unambiguously confirmed” the interplay between oxygen and the transition metal, along with the mechanism for controlling that reaction:

Our results further suggest that anion redox chemistry can be tuned through control of the crystal structure and resulting TM migration pathways, providing an alternative route to improve Li-rich materials without altering TM–O bond covalency through substitution with heavier 4d and 5d TMs.

The equipment angle is essential, btw. Apparently, until the new SLAC study nailed it down there was widespread disagreement on the root cause of the problem.

The new research was made possible in part by a new soft X-ray RIXS system, which was just installed at the lab last year (RIXS stands for resonant inelastic X-ray scattering).

You can get all the details from the study, “Coupling between oxygen redox and cation migration explains unusual electrochemistry in lithium-rich layered oxides,” which just came out in the journal Nature Communications.

So, Now What?

The SLAC team points out that until now, RIXS has been mainly used in foundational research. The new study goes a long way to confirming the practical application of RIXS.

In other words, the floodgates are open to a new wave of advanced energy storage research leading to better, cheaper EV batteries and faster charging systems.

In that regard, it’s worth noting that along with the Energy Department, Samsung partnered in the new study and chipped in some of the funding.

That’s a pretty clear indication that Samsung is looking to crack open the Panasonic/Tesla partnership and take over global leadership of the energy storage field. Last September, Samsung unveiled a new 600-km (373-mile) battery for EVs, but the company was mum on the details.

Last November, Samsung unveiled a new version of its SM3 ZE sedan, in which the size of the battery was doubled without increasing the weight of the vehicle. That’s a significant achievement and the company has been tight-lipped on that score, too.

Samsung is also exploring graphene for advanced, long range batteries, so there’s that.

“Perhaps News of My (Oil’s) Premature Death Has Been Misreported”

As for the death of oil, electric vehicles are getting their place in the sun, no matter how much Trump talks up fossil fuels.

That still leaves the issue of petrochemicals.

Although the green chemistry movement is gathering steam, the US petrochemical industry has been taking off like a rocket in recent years. That means both oil and natural gas production could continue apace for the foreseeable future, with or without gasmobiles.

ExxonMobil has been making huge moves into the Texas epicenter of US petrochemicals, and just last week the Houston Chronicle noted this development:

Michigan- and Delaware-based DowDuPont announced earlier this year that it would spend $4 billion expanding its industrial campus in Freeport.

The expansion will give Freeport the largest ethylene plant in the world. Houston-based Freeport LNG is also building an LNG export terminal in the area.

Then there’s this:

By 2019, Freeport’s power demand is expected to be 92 percent higher than it was in 2016, according to ERCOT.

The $246.7 million project will include a new 48-mile transmission line and upgrades to shorter lines in the area, according to ERCOT.

Stanford University: Lithium-Graphene “FOIL” makes for a GREAT Battery Electrode


Scientists fabricate defect-free graphene, set record reversible capacity for Co3O4 anode in Li-ion batteries

Lithium ion batteries, as the name implies, work by shuffling lithium atoms between a battery’s two electrodes. So, increasing a battery’s capacity is largely about finding ways to put more lithium into those electrodes. These efforts, however, have run into significant problems. If lithium is a large fraction of your electrode material, then moving it out can cause the electrode to shrink. Moving it back in can lead to lithium deposits in the wrong places, shorting out the battery.

Now, a research team from Stanford has figured out how to wrap lots of lithium in graphene. The resulting structure holds a place open for lithium when it leaves, allowing it to flow back to where it started. Tests of the resulting material, which they call a lithium-graphene foil, show it could enable batteries with close to twice the energy density of existing lithium batteries.

Lithium behaving badly

One obvious solution to increasing the amount of lithium in an electrode is simply to use lithium metal itself. But that’s not the easiest thing to do. Lithium metal is less reactive than the other members of its column of the periodic table (I’m looking at you, sodium and potassium), but it still reacts with air, water, and many electrolyte materials. In addition, when lithium leaves the electrode and returns, there’s no way to control where it re-forms metal. After a few charge/discharge cycles, the lithium electrode starts to form sharp spikes that can ultimately grow large enough to short out the battery.

To have better control over how lithium behaves at the electrode, the Stanford group has looked into the use of some lithium-rich alloys. Lithium, for example, forms a complex with silicon where there are typically over four lithium atoms for each atom of silicon. When the lithium leaves the electrode, the silicon stays behind, providing a structure to incorporate the lithium when it returns on the other half of the charge/discharge cycle.
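The appeal of that lithium-silicon ratio is easy to quantify with Faraday’s law. The constants below are standard; the ~4.4 Li-per-Si ratio is the one mentioned above, and the outputs should be read as textbook theoretical limits, not the paper’s measured values:

```python
# Theoretical specific capacity of an electrode material:
#     Q = n * F / (3.6 * M)   [mAh/g]
# with n = Li stored per formula unit, F = Faraday constant (C/mol),
# M = molar mass of the host formula unit (g/mol).

F = 96485.0  # C/mol

def theoretical_capacity_mah_per_g(n_li, molar_mass):
    return n_li * F / (3.6 * molar_mass)

graphite = theoretical_capacity_mah_per_g(1, 6 * 12.011)   # LiC6: one Li per six C
silicon = theoretical_capacity_mah_per_g(4.4, 28.086)      # ~Li4.4Si

print(f"graphite anode: ~{graphite:.0f} mAh/g")
print(f"silicon anode:  ~{silicon:.0f} mAh/g (roughly 11x graphite)")
```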

While this solves the problems with lithium metal, it creates a new one: volume changes. The silicon left behind when the lithium runs to the other electrode simply doesn’t take up as much volume as it does when the same electrode is filled with the lithium-silicon mix. As a result, the electrode expands and contracts dramatically during a charge-discharge cycle, putting the battery under physical stress. (Mind you, a lithium metal electrode disappears entirely, possibly causing an even larger mechanical stress.)

And that would seem to leave us stuck. Limiting the expansion/contraction of the electrode material would seem to require limiting the amount of lithium that moves into and out of it. Which would, of course, mean limiting the energy density of the battery.

Between the sheets

In the new work, the researchers take their earlier lithium-silicon work and combine it with graphene. Graphene is a single-atom-thick sheet of carbon atoms linked together, and it has a number of properties that make it good for batteries. It conducts electricity well, making it easy to shift charges to and from the lithium when the battery charges and discharges. It’s also extremely thin, which means that packing a lot of graphene sheets into the electrode doesn’t take up much space. And critically for this work, graphene is mechanically tough.

To make their electrode material, the team made nanoparticles of the lithium-silicon material. These were then mixed in with graphene sheets in an eight-to-one ratio. A small amount of a plastic precursor was added, and the whole mixture was spread across a plastic block. Once spread, the polymer precursor created a thin film of polymer on top of the graphene-nanoparticle mix. This could be peeled off, and then the graphene-nanoparticle mix could be peeled off the block as a sheet.

The resulting material, which they call a foil, contains large clusters of the nanoparticles typically surrounded by three to five layers of graphene. Depending on how thick you make the foil, there can be several layers of nanoparticle clusters, each separated by graphene.

The graphene sheets make the material pretty robust, as you can fold and unfold it and then still use it as a battery electrode. They also help keep the air from reacting with the lithium inside. Even after two weeks of being exposed to the air, the foil retained about 95 percent of its capacity as an electrode. Lower the fraction of graphene used in the starting mix and air becomes a problem, with the electrode losing nearly half of its capacity in the same two weeks.

And it worked pretty well as an electrode. When the lithium left, the nanoparticles did shrink, but the graphene sheets held the overall structure together. The material retained 98 percent of its original capacity even after 400 charge-discharge cycles. Perhaps most importantly, when paired with a vanadium oxide cathode, the energy density was just over 500 watt-hours per kilogram. Current lithium-ion batteries top out at about half that.
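To see what that energy density could mean for range, here’s a back-of-the-envelope sketch. The 500 Wh/kg figure is the cell-level number reported above; the 250 Wh/kg baseline, the 300 kg pack, and the 180 Wh/km consumption are illustrative assumptions, and real packs carry packaging overhead that cell-level numbers ignore:

```python
# At fixed pack mass, driving range scales linearly with energy density.

def range_km(pack_mass_kg, energy_density_wh_per_kg, consumption_wh_per_km):
    return pack_mass_kg * energy_density_wh_per_kg / consumption_wh_per_km

today = range_km(300, 250, 180)   # illustrative baseline cells
foil = range_km(300, 500, 180)    # foil-based cells, same pack mass

print(f"today's cells: ~{today:.0f} km per charge")
print(f"foil-based cells: ~{foil:.0f} km per charge")
```

Doubling the cell-level number roughly doubles the range, which is the whole pitch of the foil.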

Normally, work like this can take a while to get out of an academic lab and attract commercial interest. In this case, however, the head of the research group, Yi Cui, already has a startup company with batteries on the market, so a thorough commercial evaluation could come relatively quickly. The biggest sticking point may be the cost of the graphene. A quick search suggests that graphene still costs thousands of dollars per kilogram, although the price has come down, and lots of people are looking for ways to make it even less expensive.

If they succeed, then the rest of the components of this electrode are pretty cheap. And the process for making it seems pretty simple.

Nature Nanotechnology, 2017. DOI: 10.1038/NNANO.2017.129

Stanford University: Graphene takes the strain in all-carbon stretchable transistors: Applications for displays, functional sensors and digital circuits for electronic skin


Graphene’s unique combination of electrical and physical properties marks it out as a potential candidate for transparent, stretchable electronics, which could enable a new generation of sophisticated displays, wearable health monitors, or soft robotic devices. But, although graphene is atomically thin, highly transparent, conductive, and more stretchable than conventional indium tin oxide electrodes, it still tends to crack at small strains.

Now researchers from Stanford University believe they have found a way to overcome this shortcoming and have created the most stretchable carbon-based transistors to date [Liu et al., Science Advances 3 (2017) e1700159].

“To enable excellent strain-dependent performance of transparent graphene conductors, we created graphene nanoscrolls in between stacked graphene layers,” explains first author of the study, Nan Liu.

Illustration of the stacked graphene MGG structure.

The team led by Zhenan Bao dub their combination of rolled up sheets of graphene sandwiched in between stacked graphene layers ‘multi-layer G/G scrolls’ or MGG. The scrolls, which are 1–20 microns long, 0.1–1 microns wide, and 10–100 nm high, form naturally during the wet transfer process as graphene is moved from one substrate to another.

“By using MGG graphene stretchable electrodes (source/drain and gate) and semiconducting carbon nanotubes, we were able to demonstrate highly transparent and highly stretchable all-carbon transistors,” says Liu.

The all-carbon devices fabricated by the team retain 60% of their original current output when stretched to 120% strain (parallel to the direction of charge transport). This is the most stretchable carbon-based transistor reported to date, believe the researchers.

The graphene scrolls are key to the stretchable electrode’s remarkable properties because they seem to provide a conductive path even when graphene sheets start to crack at high strain levels.

“Taking into account the electronic and optical properties as well as the cost, our MGG exhibits substantial strengths over other conductors, such as carbon nanotubes and metal nanowires,” says Liu.

Transparent, stretchable graphene electrodes could be useful as contacts in flexible electronic circuits such as backplane control units for displays, as well as functional sensors and digital circuits for electronic skin.

“This is a very important area of research with a variety of possible applications,” comments Andrea C. Ferrari of the University of Cambridge. “The approach taken by Bao et al. is an interesting one that could be quite general.”

The concept of using a mixture of graphene scrolls and platelets to enable an electrode to stretch without significant losses in transmittance or conductivity is a good one and should, in principle, not be too complicated to scale up for real devices, he adds.

“We are now seeking to extend this method to other two-dimensional materials, such as MoS2, to enable stretchable two-dimensional semiconductors,” says Liu.

This article was originally published in Nano Today (2017), doi: 10.1016/j.nantod.2017.10.005.

“Not Just for a Select Few”: Quantum Computing Is Growing toward Commercialization, Disrupting EVERYTHING



Consider three hair-pulling problems: 1 percent of the world’s energy is used every year just to produce fertilizer; solar panels aren’t powerful enough to provide all the power for most homes; investing in stocks often feels like a game of Russian roulette.

Those seemingly disparate issues can be solved by the same tool, according to some scientists: quantum computing. Quantum computers use quantum systems, such as superconducting circuits, to perform tasks, and they have long been seen as a luxury for the top academic echelon, far removed from the common individual. But that’s quickly changing.

IBM had been dabbling with commercial possibilities when last year it released Quantum Experience, a cloud-based quantum computing service researchers could use to run experiments without having to buy a quantum system. In early March, IBM took that program further and announced IBM Q, the first cloud quantum computing system for commercial use. Companies will be able to buy time on IBM’s quantum computers in New York state, though IBM has not set a release date or price, and it is expected to be financially prohibitive for smaller companies at first.

Jarrod McClean, a computing sciences fellow at Lawrence Berkeley National Laboratory, says the announcement is exciting because quantum computing wasn’t expected to hit commercial markets for decades. Last year, some experts estimated commercial experimentation could be five to 40 years away, yet here we are, and the potential applications could disrupt the way pharmaceutical companies make medicine, the way logistics companies schedule trains and the way hedge fund managers gain an edge in the stock market. “We’re seeing more application areas start to develop all the time, now that people are looking at quantum,” McClean says.

Quantum computing is as different from traditional computing as an abacus is from a MacBook. “Classical computing was [invented] in the 1940s. This is like [that creation], but even beyond it,” says Scott Crowder, IBM Systems vice president and chief technology officer of quantum computing, technical strategy and transformation. “Take everything you know about how a class of computers works and forget it.”

Besting supercomputers

Quantum computers are made up of parts called qubits, also known as quantum bits. On some problems, they leverage the strange physics of quantum mechanics to work faster than chips on a traditional computer. (Just as a plane cannot exactly compare to a race car, a classical computer will still be able to do some things better than quantum, and vice versa. They’re just different.)

Explaining how qubits work requires jumping into quantum mechanics, which doesn’t follow the same rules of physics we’re used to in our everyday lives. Quantum entanglement and quantum superposition are particularly important; they defy common sense but take place only in environments that are incredibly tiny.


IBM Quantum Computing Scientists Hanhee Paik, left, and Sarah Sheldon, right, examine the hardware inside an open dilution fridge at the IBM Q Lab at IBM’s T. J. Watson Research Center in Yorktown, New York. IBM Q quantum systems and services will be delivered via the IBM Cloud platform and will be designed to tackle problems that are too complex and exponential in nature for classical computing systems to handle. One of the first and most promising applications for quantum computing will be in the area of chemistry and could lead to the discovery of new medicines and materials. (Photo: Connie Zhou/IBM)

Quantum superposition is important because it allows the qubit to do two things at once. Technically, it allows the qubit to be two things at once. While traditional computers put bits in 0 and 1 configurations to calculate steps, a qubit can be a 0 and a 1 at the same time. Quantum entanglement, another purely quantum property, takes the possibilities a step further by intertwining the characteristics of two different qubits, allowing for even more calculations. Calculations that would take longer than a human’s life span to work out on a classical computer can be completed in a matter of days or hours.
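For the curious, both ideas can be sketched in a few lines of plain Python — no quantum libraries, just bookkeeping: an n-qubit state is a list of 2**n complex amplitudes, and gates are small linear maps over that list. This is a classical simulation of the math, not how real quantum hardware works internally:

```python
import math

def hadamard(state, target):
    """Apply a Hadamard gate to qubit `target`: mixes each |0>/|1> amplitude pair."""
    s = 1 / math.sqrt(2)
    out = list(state)
    for i in range(len(state)):
        if not (i >> target) & 1:          # visit each pair once, from the |0> side
            j = i | (1 << target)
            a, b = state[i], state[j]
            out[i] = s * (a + b)
            out[j] = s * (a - b)
    return out

def cnot(state, control, target):
    """Flip qubit `target` in every basis state where `control` is 1."""
    out = list(state)
    for i in range(len(state)):
        if (i >> control) & 1:
            out[i] = state[i ^ (1 << target)]
    return out

# Start in |00>, put qubit 0 into superposition, then entangle it with qubit 1.
state = [1 + 0j, 0j, 0j, 0j]               # amplitudes for |00>, |01>, |10>, |11>
state = hadamard(state, 0)
state = cnot(state, 0, 1)

# The result is the Bell state (|00> + |11>)/sqrt(2): each qubit alone is a
# 50/50 coin flip, but the two always agree when measured.
print([round(abs(a) ** 2, 3) for a in state])   # → [0.5, 0.0, 0.0, 0.5]
```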

Eventually, quantum computing could outperform the world’s fastest supercomputer—and then all computers ever made, combined. We aren’t there yet, but at 50 qubits, universal quantum computing would reach that inflection point and be able to solve problems existing computers can’t handle, says Jerry Chow, a member of IBM’s experimental quantum computing department. He added that IBM plans to build and distribute a 50-qubit system “in the next few years.” Google aims to complete a 49-qubit system by the end of 2017.
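That exponential growth is also why roughly 50 qubits is the commonly cited crossover: simulating the state classically takes memory that doubles with every added qubit. A quick sketch (16 bytes per amplitude is an assumption based on double-precision complex storage):

```python
def sim_memory_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to hold a full n-qubit statevector on a classical machine."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    print(f"{n} qubits: {sim_memory_bytes(n):,} bytes")
# 10 qubits fit in kilobytes; 30 need ~17 GB; 50 need ~18 petabytes.
```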

Some experts aren’t convinced IBM’s move into the commercial market is significant. Yoshihisa Yamamoto, a Stanford University physics professor, says, “I expect the IBM quantum computer has a long way to go before it is commercialized to change our everyday life.”

Caltech assistant professor of computing and mathematical sciences Thomas Vidick says IBM’s commercialization of quantum computing feels “a bit premature” and estimates it will still be 10 to 20 years before commercial applications are mainstream. “The point is that quantum hardware hasn’t reached maturity yet,” he explains. “These are large machines, but they are hard to control. There is a big overhead in the transformation that maps the problem you want to solve to a problem that the machine can solve, one that fits its architecture.”

Despite the skepticism, many researchers are pumped. “While the current systems aren’t likely to solve a computational problem that regular computers can’t already solve, preparing the software layer in advance will help us hit the ground running when systems large enough to be useful become available,” says Michele Mosca, co-founder of the Institute for Quantum Computing at Ontario’s University of Waterloo. “Everyday life will start to get affected once larger-scale quantum computers are built and they are used to solve important design and optimization problems.”

A company called D-Wave Systems already sells 2,000-qubit systems, but its systems are different from IBM’s and other forms of universal quantum computers, so many experts don’t consider their development to have reached that quantum finish line. D-Wave’s computers are a type of quantum computer called quantum annealers, and they are limited because they can be used only on optimization problems. There is a roaring scientific debate about whether quantum annealers could eventually outpace traditional supercomputers, but regardless, this type of quantum computer is really good at one niche problem and can’t expand beyond that right now.

Practical Applications

What problems could be so complicated they would require a quantum computer? Take fertilizer production, McClean says. The power-hungry process to make mass-produced fertilizer accounts for 1 percent to 2 percent of the world’s energy use per year. But there’s a type of cyanobacteria that uses an enzyme to do nitrogen fixation at room temperature, which means it uses energy far more efficiently than industrial methods. “It’s been too challenging for classical systems to date,” McClean says, but he notes that quantum computers would probably be able to reveal the enzyme’s secrets so researchers could re-create the process synthetically. “It’s such an interesting problem from a point of view of how nature is able to do this particular type of catalysis,” he adds.

Pharmaceutical science could also benefit. One of the limitations to developing better, cheaper drugs is problems that arise when dealing with electronic structures, McClean says. Except with the simplest structures, like hydrogen-based molecules, understanding atomic and subatomic motion requires running computer simulations. But even that breaks down with more complex molecules. “You don’t even ask those questions on a classical computer because you know you’re going to get it wrong,” Crowder says.

The ability to predict how molecules react with other drugs, and the efficacy of certain catalysts in drug development, could drastically speed up the pace of pharmaceutical development and, ideally, lower prices, McClean says.

Finance is also plagued by complicated problems with multiple moving parts, says Marcos López de Prado, a research fellow in the Computational Research Department at Lawrence Berkeley National Laboratory. Creating a dynamic investment portfolio that can adjust to the markets with artificial intelligence, or running simulations with multiple variables, would be ideal, but current computers aren’t advanced enough to make this method possible. “The problem is that a portfolio that is optimal today may not be optimal tomorrow,” López de Prado says, “and the rebalance between the two can be so costly as to defeat its purpose.”

Quantum computing could figure out the optimal way to rebalance portfolios day by day (or minute by minute) since that “will require a computing power beyond the current potential of digital computers,” says López de Prado, who is also a senior managing director at Guggenheim Partners. “Instead of listening to gurus or watching TV shows with Wall Street connections, we could finally get the tools needed to replace guesswork with science.”

While business applications within quantum computing are mostly hopeful theories, there’s one area where experts agree quantum could be valuable: optimization. Using quantum computing to create a program that “thinks” through how to make business operations faster, smarter and cheaper could revolutionize countless industries, López de Prado says.

For example, quantum computers could be used to organize delivery truck routes so holiday gifts arrive faster during the rush before Christmas. They could take thousands of self-driving cars and organize them on the highway so all the drivers get to their destination via the fastest route. They could create automated translating software so international businesses don’t have to bother with delays caused from translating emails. “Optimization is just a generic hammer they can use on all these nails,” McClean says.
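The delivery-route example is a classic optimization problem (the traveling salesman problem), and a brute-force sketch shows why classical machines struggle as it scales. The coordinates below are made up:

```python
import itertools
import math

# Five delivery stops (depot first); coordinates are arbitrary.
stops = [(0, 0), (2, 1), (5, 3), (1, 4), (6, 0)]

def route_length(order):
    """Total distance of depot -> stops in `order` -> back to depot."""
    path = [stops[0]] + [stops[i] for i in order] + [stops[0]]
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

# Brute force: try every ordering of the non-depot stops.
best = min(itertools.permutations(range(1, len(stops))), key=route_length)
print("best order:", best, f"(length {route_length(best):.2f})")

# The catch: with k stops there are (k-1)! orderings.
# 5 stops -> 24 routes; 15 stops -> over 87 billion.
print("routes checked:", math.factorial(len(stops) - 1))
```

Real fleets rely on heuristics that give good-but-not-optimal answers; the hope described here is that quantum hardware could push closer to true optima.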

One day, quantum might even be used for nationwide problems, like optimizing the entire U.S. economy or organizing a national power grid.

Just as computers presented a huge advantage to the handful of companies that could afford them when they first came on the commercial market, it’s possible that a few companies might gain a tactical advantage by using quantum computing now. For example, if only a few investors use quantum computing to balance portfolios, the rest of the market will probably lose money. “But what happens when quantum computing goes mainstream?” asks López de Prado. “That tactical disadvantage disappears. Instead, everyone will be able to make better investment decisions. People will rely on science rather than stories.”

Link to Original Article in Newsweek Tech & Science

Stanford and Oxford scientists report new low-cost perovskite solar cell design could outperform existing commercial technologies: Video


Researchers have created a new type of solar cell that replaces silicon with a crystal called perovskite. This design converts sunlight to electricity at efficiencies similar to current technology but at much lower cost.

A new design for solar cells that uses inexpensive, commonly available materials could rival and even outperform conventional cells made of silicon.

Stanford and Oxford have created novel solar cells from crystalline perovskite that could outperform existing silicon cells on the market today. This design converts sunlight to electricity at efficiencies of 20 percent, similar to current technology but at much lower cost.

 

Writing in the Oct. 21 edition of Science, researchers from Stanford and Oxford describe using tin and other abundant elements to create novel forms of perovskite – a photovoltaic crystalline material that’s thinner, more flexible and easier to manufacture than silicon crystals.

Video: Stanford and Oxford scientists have created novel solar cells from crystalline perovskite that could rival and even outperform existing silicon cells on the market today. The new design converts sunlight to electricity at efficiencies of 20 percent, similar to current technology but at much lower cost.

In the video, Professor Michael McGehee and postdoctoral scholar Tomas Leijtens of Stanford describe the discovery, which could lead to thin-film solar cells with a record-setting 30% efficiency.

“Perovskite semiconductors have shown great promise for making high-efficiency solar cells at low cost,” said study co-author Michael McGehee, a professor of materials science and engineering at Stanford. “We have designed a robust, all-perovskite device that converts sunlight into electricity with an efficiency of 20.3 percent, a rate comparable to silicon solar cells on the market today.”

The new device consists of two perovskite solar cells stacked in tandem. Each cell is printed on glass, but the same technology could be used to print the cells on plastic, McGehee added.

“The all-perovskite tandem cells we have demonstrated clearly outline a roadmap for thin-film solar cells to deliver over 30 percent efficiency,” said co-author Henry Snaith, a professor of physics at Oxford. “This is just the beginning.”

Tandem technology

Previous studies showed that adding a layer of perovskite can improve the efficiency of silicon solar cells. But a tandem device consisting of two all-perovskite cells would be cheaper and less energy-intensive to build, the authors said.

Stanford post-doctoral scholar Tomas Leijtens and Professor Mike McGehee examine perovskite tandem solar cells. (Image credit: L.A. Cicero)

“A silicon solar panel begins by converting silica rock into silicon crystals through a process that involves temperatures above 3,000 degrees Fahrenheit (1,600 degrees Celsius),” said co-lead author Tomas Leijtens, a postdoctoral scholar at Stanford. “Perovskite cells can be processed in a laboratory from common materials like lead, tin and bromine, then printed on glass at room temperature.”

But building an all-perovskite tandem device has been a difficult challenge. The main problem is creating stable perovskite materials capable of capturing enough energy from the sun to produce a decent voltage.

A typical perovskite cell harvests photons from the visible part of the solar spectrum. Higher-energy photons can cause electrons in the perovskite crystal to jump across an “energy gap” and create an electric current.

A solar cell with a small energy gap can absorb most photons but produces a very low voltage. A cell with a larger energy gap generates a higher voltage, but lower-energy photons pass right through it.

An efficient tandem device would consist of two ideally matched cells, said co-lead author Giles Eperon, an Oxford postdoctoral scholar currently at the University of Washington.

“The cell with the larger energy gap would absorb higher-energy photons and generate an additional voltage,” Eperon said. “The cell with the smaller energy gap can harvest photons that aren’t collected by the first cell and still produce a voltage.”
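The trade-off Eperon describes can be caricatured in a toy model. The photon energies and the one-photon-one-unit-of-current rule below are illustrative assumptions, not data from the paper:

```python
# Crude stand-in for the solar spectrum: photon energies in eV.
PHOTONS = [0.7, 0.9, 1.1, 1.3, 1.5, 1.7, 1.9, 2.1, 2.3, 2.5]

def single_cell_output(gap_ev):
    """One cell: each photon at or above the energy gap contributes one
    unit of current, delivered at a voltage proportional to the gap."""
    absorbed = sum(1 for e in PHOTONS if e >= gap_ev)
    return absorbed * gap_ev

def tandem_output(large_gap, small_gap):
    """Two stacked cells: the large-gap cell takes the high-energy photons
    at high voltage; the small-gap cell catches what passes through."""
    top = sum(1 for e in PHOTONS if e >= large_gap) * large_gap
    bottom = sum(1 for e in PHOTONS if small_gap <= e < large_gap) * small_gap
    return top + bottom

# A small gap wastes voltage, a large gap wastes photons,
# and the tandem beats either single cell:
print(single_cell_output(0.8))   # about 7.2 units
print(single_cell_output(1.8))   # about 7.2 units
print(tandem_output(1.8, 0.9))   # about 11.7 units
```

In the real device, the large-gap perovskite supplies the voltage boost while the small-gap tin-lead perovskite harvests the infrared, which is the pairing behind the 20.3 percent tandem.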

Cross-section of a new tandem solar cell designed by Stanford and Oxford scientists. The brown upper layer of perovskite captures low-energy lightwaves, and the red perovskite layer captures high-energy waves. (Image credit: Scanning electron microscopy image by Rebecca Belisle and Giles Eperon)

The smaller gap has proven to be the bigger challenge for scientists. Working together, Eperon and Leijtens used a unique combination of tin, lead, cesium, iodine and organic materials to create an efficient cell with a small energy gap.

“We developed a novel perovskite that absorbs lower-energy infrared light and delivers a 14.8 percent conversion efficiency,” Eperon said. “We then combined it with a perovskite cell composed of similar materials but with a larger energy gap.”

The result: A tandem device consisting of two perovskite cells with a combined efficiency of 20.3 percent.

“There are thousands of possible compounds for perovskites,” Leijtens added, “but this one works very well, quite a bit better than anything before it.”

Seeking stability

One concern with perovskites is stability. Rooftop solar panels made of silicon typically last 25 years or more. But some perovskites degrade quickly when exposed to moisture or light. In previous experiments, perovskites made with tin were found to be particularly unstable.

To assess stability, the research team subjected both experimental cells to temperatures of 212 degrees Fahrenheit (100 degrees Celsius) for four days.

“Crucially, we found that our cells exhibit excellent thermal and atmospheric stability, unprecedented for tin-based perovskites,” the authors wrote.

“The efficiency of our tandem device is already far in excess of the best tandem solar cells made with other low-cost semiconductors, such as organic small molecules and microcrystalline silicon,” McGehee said. “Those who see the potential realize that these results are amazing.”

The next step is to optimize the composition of the materials to absorb more light and generate an even higher current, Snaith said.

“The versatility of perovskites, the low cost of materials and manufacturing, now coupled with the potential to achieve very high efficiencies, will be transformative to the photovoltaic industry once manufacturability and acceptable stability are also proven,” he said.

Co-author Stacey Bent, a professor of chemical engineering at Stanford, provided key insights on tandem-fabrication techniques. Other Stanford coauthors are Kevin Bush, Rohit Prasanna, Richard May, Axel Palmstrom, Daniel J. Slotcavage and Rebecca Belisle. Oxford co-authors are Thomas Green, Jacob Tse-Wei Wang, David McMeekin, George Volonakis, Rebecca Milot, Jay Patel, Elizabeth S. Parrott, Rebecca Sutton, Laura Herz, Michael Johnston and Henry Snaith. Other co-authors are Bert Conings, Aslihan Babayigit and Hans-Gerd Boyen of Hasselt University in Belgium, and Wen Ma and Farhad Moghadam of SunPreme Inc.

Funding was provided by the Graphene Flagship, The Leverhulme Trust, U.K. Engineering and Physical Sciences Research Council, European Union Seventh Framework Programme, Horizon 2020, U.S. Office of Naval Research and the Global Climate and Energy Project at Stanford.

 

ASU and Stanford Researchers Achieve Record-Breaking Efficiency with Tandem (Perovskite + Silicon) Solar Cell


Above: A perovskite/silicon tandem solar cell, created by research teams from Arizona State University and Stanford University, capable of record-breaking sunlight-to-electricity conversion efficiency. Photographer: Pete Zrioka/ASU

Some pairs are better together than their individual counterparts — peanut butter and chocolate, warm weather and ice cream, and now, in the realm of photovoltaic technology, silicon and perovskite.

As existing solar energy technologies near their theoretical efficiency limits, researchers are exploring new methods to improve performance — such as stacking two photovoltaic materials in a tandem cell. Collaboration between researchers at Arizona State University and Stanford University has birthed such a cell with record-breaking conversion efficiency — effectively finding the peanut butter to silicon’s chocolate.


The results of their work, published February 17 in Nature Energy, outline the use of perovskite and silicon to create a tandem solar cell capable of converting sunlight to energy with an efficiency of 23.6 percent, just shy of the all-time silicon efficiency record.

“The best silicon solar cell alone has achieved 26.3 percent efficiency,” says Zachary Holman, an assistant professor of electrical engineering at the Ira A. Fulton Schools of Engineering. “Now we’re gunning for 30 percent with these tandem cells, and I think we could be there within two years.”

Assistant Professor Zachary Holman holds one of the many solar cells his research group has created. Photographer: Jessica Hochreiter/ASU

Silicon solar cells are the backbone of a $30 billion a year industry, and this breakthrough shows that there’s room for significant improvement within such devices by finding partner materials to boost efficiency.

The high-performance tandem cell’s layers are each specially tuned to capture different wavelengths of light. The top layer, composed of a perovskite compound, was designed to excel at absorbing visible light. The cell’s silicon base is tuned to capture infrared light.

Perovskite, a cheap, easily manufacturable photovoltaic material, has emerged as a challenger to silicon’s dominance in the solar market. Since its introduction to solar technology in 2009, the efficiency of perovskite solar cells has increased from 3.8 percent to 22.1 percent in early 2016, according to the National Renewable Energy Laboratory.

The perovskite used in the tandem cell came courtesy of Stanford researchers Professor Michael McGehee and doctoral student Kevin Bush, who fabricated the compound and tested the materials.

The research team at ASU provided the silicon base and modeling to determine other material candidates for use in the tandem cell’s supporting layers.

OVERCOMING CHALLENGES WITH PEROVSKITES

Zhengshan (Jason) Yu, an electrical engineering doctoral student at ASU, holds up a tiny black solar cell flanked by two conductors. The small cell has yielded big improvements, resulting in a record-breaking 23.6 percent efficiency rate.

Though low-cost and highly efficient, perovskites have been limited by poor stability, degrading at a much faster rate than silicon in hot and humid environments. Additionally, perovskite solar cells have suffered from parasitic absorption, in which light is absorbed by supporting layers in the cell that don’t generate electricity.

“We have improved the stability of the perovskite solar cells in two ways,” says McGehee, a materials science and engineering professor at Stanford’s School of Engineering. “First, we replaced an organic cation with cesium. Second, we protected the perovskite with an impermeable indium tin oxide layer that also functions as an electrode.”

Though McGehee’s compound achieves record stability, perovskites remain delicate materials, making them difficult to employ in tandem solar technology.

“In many solar cells, we put a layer on top that is both transparent and conductive,” says Holman, a faculty member in the School of Electrical, Computer and Energy Engineering. “It’s transparent so light can go through and conductive so we can take electrical charges off it.”

This top conductive layer is applied using a process called sputtering deposition, which historically has led to damaged perovskite cells. However, McGehee was able to apply a tin oxide layer with help from chemical engineering Professor Stacey Bent and doctoral student Axel Palmstrom of Stanford. The pair developed a thin layer that protects the delicate perovskite from the deposition of the final conductive layer without contributing to parasitic absorption, further boosting the cell’s efficiency.

The deposition of the final conductive layer wasn’t the only engineering challenge posed by integrating perovskites and silicon.

“It was difficult to apply the perovskite itself without compromising the performance of the silicon cell,” says Zhengshan (Jason) Yu, an electrical engineering doctoral student at ASU.

Silicon wafers are placed in a potassium hydroxide solution during fabrication, which creates a rough, jagged surface. This texture, ideal for trapping light and generating more energy, works well for silicon, but perovskite prefers a smooth — and unfortunately reflective — surface for deposition.

Additionally, the perovskite layer of the tandem cell is less than a micron thick, as opposed to the 250-micron-thick silicon layer. This means that when the thin perovskite layer was deposited, it was applied unevenly, pooling in the rough silicon’s low points and failing to adhere to its peaks.

Yu developed a method to create a planar surface only on the front of the silicon solar cell using a removable, protective layer. This resulted in a smooth surface on one side of the cell, ideal for applying the perovskite, while leaving the backside rough, to trap the weakly-absorbed near-infrared light in the silicon.

“With the incorporation of a silicon nanoparticle rear reflector, this infrared-tuned silicon cell becomes an excellent bottom cell for tandems,” says Yu.

BUILDING ON PREVIOUS SUCCESSES

The success of the tandem cell is built on existing achievements from both teams of researchers. In October 2016, McGehee and post-doctoral scholar Tomas Leijtens fabricated an all-perovskite cell capable of 20.3 percent efficiency. The high-performance cell was achieved in part by creating a perovskite with record stability, marking McGehee’s group as one of the first teams to devote research efforts to fabricating stable perovskite compounds.

Likewise, Holman has considerable experience working with silicon and tandem cells.

“We’ve tried to position our research group as the go-to group in the U.S. for silicon bottom cells for tandems,” says Holman, who’s been pursuing additional avenues to create high-efficiency tandem solar cells.

In fact, Holman and Yu published a comment in Nature Energy in September 2016 outlining the projected efficiencies of different cell combinations in tandems.

“People often ask, ‘given the fundamental laws of physics, what’s the best you can do?’” says Holman. “We’ve asked and answered a different, more useful question: Given two existing materials, if you could put them together, ideally, what would you get?”

The publication is a sensible guide to designing a tandem solar cell, specifically with silicon as the bottom solar cell, according to Holman.

It calculates what the maximum efficiency would be if you could pair two existing solar cells in a tandem without any performance loss. The guide has proven useful in directing research efforts to pursue the best partner materials for silicon.

“We have eight projects with different universities and organizations, looking at different types of top cells that go on top of silicon,” says Holman. “So far out of all our projects, our perovskite/silicon tandem cell with Stanford is the leader.”

Stanford University: Solving the “Storage Problem” for Renewable Energies: A New Cost-Effective Rechargeable Aluminum Battery



One of the biggest missing links in renewable energy is affordable and high performance energy storage, but a new type of battery developed at Stanford University could be the solution.

Solar energy generation works great when the sun is shining, and wind energy is awesome when it’s windy, but neither is much help to the grid after dark or when the air is still. That has long been one of the arguments against renewable energy, even though there are plenty of arguments for building additional solar and wind installations before large-scale energy storage is in place. If low-cost, high-performance batteries were readily available, they could go a long way toward a cleaner, more sustainable grid, and a pair of Stanford engineers have developed what could be a viable option for grid-scale energy storage.

With three relatively abundant and low-cost materials, namely aluminum, graphite, and urea, Stanford chemistry Professor Hongjie Dai and doctoral candidate Michael Angell have created a rechargeable battery that is nonflammable, very efficient, and has a long lifecycle.

“So essentially, what you have is a battery made with some of the cheapest and most abundant materials you can find on Earth. And it actually has good performance. Who would have thought you could take graphite, aluminum, urea, and actually make a battery that can cycle for a pretty long time?” – Dai

A previous version of this rechargeable aluminum battery was found to be efficient and long-lived, but it employed an expensive electrolyte. The latest iteration instead uses urea as the base for the electrolyte, a material already produced in large quantities for fertilizer and other uses. (It’s also a component of urine, but while a pee-based home battery might seem like just the ticket, it’s probably not going to happen any time soon.)

According to Stanford, the new development marks the first time urea has been used in a battery, and because the urea-based cell isn’t flammable (unlike lithium-ion batteries, which can catch fire), it’s a great choice for home energy storage, where safety is of utmost importance. The fact that the new battery is also efficient and affordable makes it a serious contender for large-scale energy storage applications as well.

“I would feel safe if my backup battery in my house is made of urea with little chance of causing fire.” – Dai

According to Angell, using the new battery as grid storage “is the main goal,” thanks to its high efficiency and long life cycle, coupled with the low cost of its components. By one metric, called Coulombic efficiency, which measures the ratio of the charge extracted from the battery on discharge to the charge put in while charging, the new battery rates at 99.7 percent, which is high.

To meet the needs of a grid-scale energy storage system, a battery would need to last at least a decade. The current urea-based aluminum-ion batteries have lasted through about 1,500 charge cycles, and the team is still working to improve that lifetime as it develops a commercial version.
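As back-of-the-envelope arithmetic, the two figures above can be checked in a few lines. The once-a-day cycling assumption is ours for illustration, not a figure from the study:

```python
def coulombic_efficiency(charge_out, charge_in):
    """Ratio of charge recovered on discharge to charge stored while charging."""
    return charge_out / charge_in

# The 99.7 percent figure quoted above: 0.997 Ah out for every 1 Ah in.
ce = coulombic_efficiency(0.997, 1.0)
print(f"{ce:.1%}")  # prints 99.7%

# How far 1,500 cycles gets toward "at least a decade" of grid duty,
# assuming one full charge/discharge cycle per day (our assumption):
cycles_needed = 10 * 365       # 3650 cycles in ten years
shortfall = cycles_needed / 1500
print(f"lifetime must improve roughly {shortfall:.1f}x")  # roughly 2.4x
```

On that simple reckoning, daily cycling would require the reported cycle life to improve by a bit under a factor of three, which matches the team’s stated focus on lifetime.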

The team has published some of its results in the Proceedings of the National Academy of Sciences, under the title “High Coulombic efficiency aluminum-ion battery using an AlCl3-urea ionic liquid analog electrolyte.”

 

Grid-scale energy storage to manage our electricity supply would benefit from batteries that can withstand repeated cycles of discharging and charging. Current lithium-ion batteries have lifetimes of only 1,000-3,000 cycles. Now a team of researchers from Stanford University, Taiwan, and China has made a research prototype of an inexpensive, safe aluminum-ion battery that can withstand 7,500 cycles. In the aluminum-ion battery, one electrode is made from affordable aluminum, and the other is composed of carbon in the form of graphite.

Read: A step towards new, faster-charging, and safer batteries

 

Stanford University: Researchers Make Wires That Are Just Three Atoms Wide


 

Researchers at Stanford University and the Department of Energy’s SLAC National Accelerator Laboratory have discovered a way to use diamondoids — the smallest possible bits of diamond — to self-assemble atoms, LEGO-style, into the thinnest possible electrical wires, just three atoms wide.

 

Scientists at Stanford University and the US Department of Energy’s SLAC National Accelerator Laboratory have discovered a way to use diamondoids – the smallest possible bits of diamond – to assemble atoms into the thinnest possible electrical wires, just three atoms wide.

By grabbing various types of atoms and putting them together building-block style, the new technique could potentially be used to build tiny wires for a wide range of applications, including fabrics that generate electricity, optoelectronic devices that employ both electricity and light, and superconducting materials that conduct electricity without any loss. The scientists reported their results in Nature Materials.

The animation above shows molecular building blocks joining the tip of a growing nanowire. Each block consists of a diamondoid – the smallest possible bit of diamond – attached to sulfur and copper atoms (yellow and brown spheres). Like LEGO blocks, they only fit together in certain ways that are determined by their size and shape. The copper and sulfur atoms form a conductive wire in the middle, and the diamondoids form an insulating outer shell.

“What we have shown here is that we can make tiny, conductive wires of the smallest possible size that essentially assemble themselves,” said Hao Yan, a Stanford postdoctoral researcher and lead author of the paper. “The process is a simple, one-pot synthesis. You dump the ingredients together and you can get results in half an hour. It’s almost as if the diamondoids know where they want to go.”


Fuzzy white clusters of nanowires on a lab bench, with a penny for scale. (Image source: Hao Yan/SIMES; photo by SLAC National Accelerator Laboratory)

In the image above, the microscopic nanowires, assembled with the help of diamondoids, can be seen with the naked eye because the strong mutual attraction between their diamondoid shells makes them clump together, in this case by the millions. At top right, an image made with a scanning electron microscope shows nanowire clusters magnified 10,000 times.

 

There are other methods to get materials to self-assemble, but this is the first one shown to make a nanowire with a solid, crystalline core that has good electronic properties, said study co-author Nicholas Melosh, an associate professor at SLAC and Stanford and investigator with SIMES, the Stanford Institute for Materials and Energy Sciences at SLAC.

Over the past decade, a SIMES research program led by Melosh and SLAC/Stanford Professor Zhi-Xun Shen has found a number of potential uses for the little diamonds, including improving electron microscope images and making tiny electronic gadgets.

The needle-like wires have a semiconducting core – a combination of copper and sulfur known as a chalcogenide – surrounded by the attached diamondoids, which form an insulating shell.

The wires’ minuscule size is important, Melosh said, because a material that exists in just one or two dimensions – as atomic-scale dots, wires or sheets – can have very different, extraordinary properties compared to the same material made in bulk. The new method allows researchers to assemble those materials with atom-by-atom precision and control.

“You can imagine weaving those into fabrics to generate energy,” Melosh said. “This method gives us a versatile toolkit where we can tinker with a number of ingredients and experimental conditions to create new materials with finely tuned electronic properties and interesting physics.”


Read Genesis Nanotech Online ~ Stanford team demonstrates a graphene-based thermal-to-electricity conversion technology + More News


 


 

This Week: Articles Like …

 

 

The Stanford team demonstrates a graphene-based thermal-to-electricity conversion technology: Researchers at Stanford University have recently demonstrated a graphene-based high efficiency thermal-t…

 

 


A team of researchers at MIT has designed one of the strongest lightweight materials known, by compressing and fusing flakes of graphene, a two-dimensional form of carbon. The new material, a spong…

 

 

 

Advanced nanostructures are well on their way to revolutionising quantum technology for quantum networks based on light. Researchers from the Niels Bohr Institute have now developed the first build…

 

 

 


Read All The News Here: Genesis Nanotech: “Great Things from Small Things”