A new nanophotonic material has broken records for high-temperature stability, potentially ushering in more efficient electricity production and opening a variety of new possibilities in the control and conversion of thermal radiation.
Developed by a University of Michigan-led team of chemical and materials science engineers, the material controls the flow of infrared radiation and is stable at temperatures of 2,000 degrees Fahrenheit in air, a nearly twofold improvement over existing approaches.
The material uses a phenomenon called destructive interference to reflect infrared energy while letting shorter wavelengths pass through. This could potentially reduce heat waste in thermophotovoltaic cells, which convert heat into electricity but can’t use infrared energy, by reflecting infrared waves back into the system. The material could also be useful in optical photovoltaics, thermal imaging, environmental barrier coatings, sensing, camouflage from infrared surveillance devices and other applications.
“It’s similar to the way butterfly wings use wave interference to get their color. Butterfly wings are made up of colorless materials, but those materials are structured and patterned in a way that absorbs some wavelengths of white light but reflects others, producing the appearance of color,” said Andrej Lenert, U-M assistant professor of chemical engineering and co-corresponding author of the study in Nature Photonics (“Nanophotonic control of thermal emission under extreme conditions”).
“This material does something similar with infrared energy. The challenging part has been preventing breakdown of that color-producing structure under high heat.”
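The interference effect Lenert describes can be sketched with a textbook transfer-matrix calculation for a quarter-wave stack, the standard structure behind wavelength-selective mirrors of this kind. The refractive indices, layer count and wavelengths below are illustrative placeholders, not the oxide pair reported in the study:

```python
import cmath
import math

def layer_matrix(n, d, lam):
    """Characteristic matrix of one dielectric layer at normal incidence."""
    delta = 2 * math.pi * n * d / lam          # phase thickness of the layer
    return [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
            [1j * n * cmath.sin(delta), cmath.cos(delta)]]

def mat_mul(a, b):
    return [[a[0][0] * b[0][0] + a[0][1] * b[1][0],
             a[0][0] * b[0][1] + a[0][1] * b[1][1]],
            [a[1][0] * b[0][0] + a[1][1] * b[1][0],
             a[1][0] * b[0][1] + a[1][1] * b[1][1]]]

def reflectance(layers, lam, n0=1.0, ns=1.5):
    """layers: list of (index, thickness) from the incidence side;
    n0 and ns are the ambient and substrate indices."""
    m = [[1, 0], [0, 1]]
    for n, d in layers:
        m = mat_mul(m, layer_matrix(n, d, lam))
    b = m[0][0] + m[0][1] * ns
    c = m[1][0] + m[1][1] * ns
    r = (n0 * b - c) / (n0 * b + c)            # amplitude reflection coefficient
    return abs(r) ** 2

# Six pairs of alternating high/low-index quarter-wave layers,
# tuned to reflect a 10-micrometre (mid-infrared) design wavelength.
lam0 = 10e-6
n_hi, n_lo = 2.4, 1.5                          # illustrative indices only
pairs = [(n_hi, lam0 / (4 * n_hi)), (n_lo, lam0 / (4 * n_lo))] * 6

print(reflectance(pairs, lam0))        # near 1: infrared strongly reflected
print(reflectance(pairs, lam0 / 2))    # near the bare-substrate value: light passes
```

At the design wavelength, each layer is a quarter-wave thick, so reflections from successive interfaces add in phase and the stack reflects strongly; at half the design wavelength, the layers are half-wave "absentee" layers and the stack is nearly transparent.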
The approach is a major departure from the current state of engineered thermal emitters, which typically use foams and ceramics to limit infrared emissions. These materials are stable at high temperature but offer very limited control over which wavelengths they let through. Nanophotonics could offer much more tunable control, but past efforts haven’t been stable at high temperatures, often melting or oxidizing (the process that forms rust on iron). In addition, many nanophotonic materials only maintain their stability in a vacuum.
The new material works toward solving that problem, besting the previous record for heat resistance among air-stable photonic crystals by more than 900 degrees Fahrenheit in open air. The material is also tunable, enabling researchers to adjust which wavelengths it reflects or transmits for a wide variety of potential applications. The research team predicted that applying this material to existing thermophotovoltaic (TPV) cells would increase their efficiency by 10%, and believes that much greater gains will be possible with further optimization.
The team developed the solution by combining chemical engineering and materials science expertise. Lenert’s chemical engineering team began by looking for materials that wouldn’t mix even if they started to melt.
“The goal is to find materials that will maintain nice, crisp layers that reflect light in the way we want, even when things get very hot,” Lenert said. “So we looked for materials with very different crystal structures, because they tend not to want to mix.”
They hypothesized that a combination of rock salt and perovskite, a calcium titanium oxide mineral, would fit the bill. Collaborators at U-M and the University of Virginia ran supercomputer simulations to confirm that the combination was a good bet.
John Heron, co-corresponding author of the study and an assistant professor of materials science and engineering at U-M, and Matthew Webb, a doctoral student in materials science and engineering, then carefully deposited the material using pulsed laser deposition to achieve precise layers with smooth interfaces. To make the material even more durable, they used oxides rather than conventional photonic materials; the oxides can be layered more precisely and are less likely to degrade under high heat.
“In previous work, traditional materials oxidized under high heat, losing their orderly layered structure,” Heron said. “But when you start out with oxides, that degradation has essentially already taken place. That produces increased stability in the final layered structure.”
After testing confirmed that the material worked as designed, Sean McSherry, first author of the study and a doctoral student in materials science and engineering at U-M, used computer modeling to identify hundreds of other material combinations that are also likely to work. While commercial implementation of the material tested in the study is likely years away, the core discovery opens a new line of research into nanophotonic materials that future researchers could develop for a wide range of applications.
Credit: Evan Doughtry
Your knees and your smartphone battery have some surprisingly similar needs, a University of Michigan professor has discovered, and that new insight has led to a “structural battery” prototype that incorporates a cartilage-like material to make the batteries highly durable and easy to shape.
The idea behind structural batteries is to store energy in structural components — the wing of a drone or the bumper of an electric vehicle, for example. They’ve been a long-term goal for researchers and industry because they could reduce weight and extend range. But structural batteries have so far been heavy, short-lived or unsafe.
In a study published in ACS Nano, the researchers describe how they made a damage-resistant rechargeable zinc battery with a cartilage-like solid electrolyte. They showed that the batteries can replace the top casings of several commercial drones. The prototype cells can run for more than 100 cycles at 90 percent capacity, and withstand hard impacts and even stabbing without losing voltage or starting a fire.
“A battery that is also a structural component has to be light, strong, safe and have high capacity. Unfortunately, these requirements are often mutually exclusive,” said Nicholas Kotov, the Joseph B. and Florence V. Cejka Professor of Engineering, who led the research.
Harnessing the properties of cartilage
To sidestep these trade-offs, the researchers used zinc — a legitimate structural material — and branched nanofibers that resemble the collagen fibers of cartilage.
“Nature does not have zinc batteries, but it had to solve a similar problem,” Kotov said. “Cartilage turned out to be a perfect prototype for an ion-transporting material in batteries. It has amazing mechanics, and it serves us for a very long time compared to how thin it is. The same qualities are needed from solid electrolytes separating cathodes and anodes in batteries.”
In our bodies, cartilage combines mechanical strength and durability with the ability to let water, nutrients and other materials move through it. These qualities are nearly identical to those of a good solid electrolyte, which has to resist damage from dendrites while also letting ions flow from one electrode to the other.
Dendrites are tendrils of metal that pierce the separator between the electrodes and create a fast lane for electrons, shorting the circuit and potentially causing a fire. Zinc has previously been overlooked for rechargeable batteries because it tends to short out after just a few charge/discharge cycles.
Not only can the membranes made by Kotov’s team ferry zinc ions between the electrodes, they can also stop zinc’s piercing dendrites. Like cartilage, the membranes are composed of ultrastrong nanofibers interwoven with a softer ion-friendly material.
In the batteries, aramid nanofibers — the stuff in bulletproof vests — stand in for collagen, with polyethylene oxide (a chain-like, carbon-based molecule) and a zinc salt replacing soft components of cartilage.
Demonstrating safety and utility
To make working cells, the team paired the zinc electrodes with manganese oxide — the combination found in standard alkaline batteries. But in the rechargeable batteries, the cartilage-like membrane replaces the standard separator and alkaline electrolyte. As secondary batteries on drones, the zinc cells can extend the flight time by 5 to 25 percent — depending on the battery size, mass of the drone and flight conditions.
Safety is critical to structural batteries, so the team deliberately damaged their cells by stabbing them with a knife. In spite of multiple “wounds,” the battery continued to discharge close to its design voltage. This is possible because there is no liquid to leak out.
For now, the zinc batteries are best as secondary power sources because they can’t charge and discharge as quickly as their lithium ion brethren. But Kotov’s team intends to explore whether there is a better partner electrode that could improve the speed and longevity of zinc rechargeable batteries.
The research was supported by the Air Force Office of Scientific Research and National Science Foundation. Kotov teaches in the Department of Chemical Engineering. He is also a professor of materials science and engineering, and macromolecular science and engineering.
Topological transitions of a deformed kagome lattice by uniform soft twisting. Credit: Nature Communications (2017). DOI: 10.1038/ncomms14201
When a material is made, you typically cannot change whether it is hard or soft. But a group of University of Michigan researchers has developed a new way to design a “metamaterial” that allows the material to switch between being hard and soft without damaging or altering the material itself.
Metamaterials are man-made materials that get their properties—in this case, whether a material is hard or soft—from the way the material is constructed rather than the material that constructs it. This allows researchers to manipulate a metamaterial’s structure in order to make the material exhibit a certain property.
In the group’s study, published in the journal Nature Communications, the U-M researchers discovered a way to compose a metamaterial that can be easily manipulated to increase the stiffness of its surface by orders of magnitude—the difference between rubber and steel.
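The "rubber and steel" span can be sanity-checked with generic handbook values for Young's modulus; the figures below are typical textbook numbers, not measurements from the study:

```python
import math

# Generic handbook values for Young's modulus (illustrative only)
E_rubber = 0.01e9    # soft rubber, roughly 0.01 GPa
E_steel = 200e9      # structural steel, roughly 200 GPa

orders = math.log10(E_steel / E_rubber)
print(f"rubber-to-steel stiffness spans about {orders:.1f} orders of magnitude")
```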
Since these properties are “topologically protected,” meaning that the material’s properties come from its total structure, they’re easily maintained even as the material shifts repeatedly between its hard and soft states.
“The novel aspect of this metamaterial is that its surface can change between hard and soft,” said Xiaoming Mao, assistant professor of physics. “Usually, it’s hard to change the stiffness of a traditional material. It’s either hard or soft after the material is made.”
For example, a dental filling cannot be changed after the dentist has set the filling without causing stress, either by drilling or grinding, to the original filling. A guitar string cannot be tightened without putting stress on the string itself, according to Mao.
Mao says the way an object comes into contact with the edge of the metamaterial changes the geometry of the material’s structure, and therefore how the material responds to stress at the edge. But the metamaterial’s topological protection allows the inside of the material to remain damage-free.
The material could one day be used to build cars or rocket launch systems. In cars, the material could help absorb impacts from a crash.
“When you’re driving a car, you want the car to be stiff and to support a load,” Mao said. “During a collision, you want components to become softer to absorb the energy from the collision and protect the passenger in the car.”
The researchers also suggest the material could be used to make bicycle tires that could self-adjust to ride more easily on soft surfaces such as sand, or to make damage-resistant, reusable rockets.
More information: D. Zeb Rocklin et al. Transformable topological mechanical metamaterials, Nature Communications (2017). DOI: 10.1038/ncomms14201
In research that could one day lead to advances against neurodegenerative diseases like Alzheimer’s and Parkinson’s, University of Michigan engineering researchers have demonstrated a technique for precisely measuring the properties of individual protein molecules floating in a liquid.
Proteins are essential to the function of every cell. Measuring their properties in blood and other body fluids could unlock valuable information, as the molecules are a vital building block in the body. The body manufactures them in a variety of complex shapes that can transmit messages between cells, carry oxygen and perform other important functions.
Sometimes, however, proteins don’t form properly. Scientists believe that some types of these misshapen proteins, called amyloids, can clump together into masses in the brain. The sticky tangles block normal cell function, leading to brain cell degeneration and disease.
But the processes of how amyloids form and clump together are not well understood. This is due in part to the fact that there’s currently not a good way to study them. Researchers say current methods are expensive, time-consuming and difficult to interpret, and can only provide a broad picture of the overall level of amyloids in a patient’s system.
The University of Michigan and University of Fribourg researchers who developed the new technique believe that it could help solve the problem by measuring an individual molecule’s shape, volume, electrical charge, rotation speed and propensity for binding to other molecules.
They call this information a “5-D fingerprint” and believe that it could uncover new information that may one day help doctors track the status of patients with neurodegenerative diseases and possibly even develop new treatments. Their work is detailed in a paper published in Nature Nanotechnology.
“Imagine the challenge of identifying a specific person based only on their height and weight,” said David Sept, a U-M biomedical engineering professor who worked on the project. “That’s essentially the challenge we face with current techniques. Imagine how much easier it would be with additional descriptors like gender, hair color and clothing. That’s the kind of new information 5-D fingerprinting provides, making it much easier to identify specific proteins.”
Michael Mayer, the lead author on the study and a former U-M researcher who’s now a biophysics professor at Switzerland’s Adolphe Merkle Institute, says identifying individual proteins could help doctors keep better tabs on the status of a patient’s disease, and it could also help researchers gain a better understanding of exactly how amyloid proteins are involved with neurodegenerative disease.
To take the detailed measurements, the research team uses a nanopore 10-30 nanometers wide—so small that only one protein molecule can fit through at a time. The researchers filled the nanopore with a salt solution and passed an electric current through the solution.
As a protein molecule tumbles through the nanopore, its movement causes tiny, measurable fluctuations in the electric current. By carefully measuring this current, the researchers can determine the protein’s unique five-dimensional signature and identify it nearly instantaneously.
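The size of such a current dip can be estimated with a first-order resistive-pulse formula (in the style of the classic DeBlois–Bean analysis), which treats the fractional current drop as roughly the particle's volume over the pore's effective sensing volume. The dimensions below are illustrative, not the study's, and this simple estimate ignores the shape, charge and rotation information the 5-D technique actually extracts:

```python
def blockade_fraction(d_particle, d_pore, l_pore):
    """First-order resistive-pulse estimate: fractional drop in ionic
    current while a small sphere transits a cylindrical pore.
    All lengths in the same unit (here nanometres); valid only when
    the particle is much smaller than the pore."""
    l_eff = l_pore + 0.8 * d_pore     # common end-effect correction
    return d_particle**3 / (d_pore**2 * l_eff)

# Illustrative geometry (not the study's): a 5 nm protein passing
# through a pore 20 nm in diameter and 30 nm long.
frac = blockade_fraction(5.0, 20.0, 30.0)
print(f"current blocked: {100 * frac:.2f}%")   # a sub-percent dip per molecule
```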
“Amyloid molecules not only vary widely in size, but they tend to clump together into masses that are even more difficult to study,” Mayer said. “Because it can analyze each particle one by one, this new method gives us a much better window to how amyloids behave inside the body.”
Ultimately, the team aims to develop a device that doctors and researchers could use to quickly measure proteins in a sample of blood or other body fluid. This goal is likely several years off; in the meantime, they are working to improve the technique’s accuracy, honing it to get a better approximation of each protein’s shape. They believe that in the future, the technology could also be useful for measuring proteins associated with heart disease and in a variety of other applications.
“I think the possibilities are pretty vast,” Sept said. “Antibodies, larger hormones, perhaps pathogens could all be detected. Synthetic nanoparticles could also be easily characterized to see how uniform they are.”
The study is titled “Real-time shape approximation and fingerprinting of single proteins using a nanopore.”
Shape has a pervasive but often overlooked impact on how natural systems are ordered. At the same time, entropy (the probabilistic measure of the degree of energy delocalization in a system, often misunderstood as a measure of a system’s disorder) and emergence (the sometimes controversial observation of macroscopic behaviors not seen in isolated systems of a few constituents) are two areas of research that have long received, and are likely to continue receiving, significant scientific attention. Now, materials science and chemical engineering researchers at the University of Michigan, Ann Arbor, working with computer simulations of colloidal suspensions of hard nanoparticles, have linked entropy and emergence through a little-understood property they refer to as shape entropy. Shape entropy is an emergent, entropic effect, unrelated to geometric or topological entropy, that differs from and competes with intrinsic shape properties, which arise from both the shape’s geometry and the material itself and affect surface, chemical and other intrinsic properties.
According to the researchers, shape entropy directly affects system structure through directional entropic forces (DEFs) that align neighboring particles and thereby optimize local packing density. Interestingly, the scientists demonstrate that shape entropy drives the emergence of DEFs in a wide class of soft matter systems, as particles adopt local dense packing configurations when crowded. Through these DEFs, shape entropy also drives the phase behavior of systems of anisotropic shapes, producing complex crystals, liquid crystals and even ordered but non-periodic structures called quasicrystals. (Anisotropy refers to a difference in a material’s physical or mechanical properties, such as absorbance, refractive index, conductivity or tensile strength, when measured along different axes.)
Prof. Sharon C. Glotzer discussed the paper that she, lead author and Research Investigator Dr. Greg van Anders and their co-authors published in Proceedings of the National Academy of Sciences, noting that one of the fundamental issues they faced was the historical problem of linking microscopic mechanisms with macroscopic emergent behavior. “This is a difficult problem that was really, to our knowledge, only brought into sharp contrast for physical systems by Philip Warren Anderson in his 1972 essay More is Different – and really, the title says it all,” van Anders tells Phys.org. (Anderson is a physicist and Nobel laureate who in his essay addressed emergent phenomena and the limitations of reductionism.) “We’re interested in the type of systems that are dominated by entropy – meaning that their behavior originates from effects of the system as a whole,” Glotzer points out. “In a way, we’re grappling with the problem of how things that operate with basic rules can produce complicated behavior.” For Glotzer and her team, the rules are shapes, and the behavior takes the form of complex crystals. “It’s very important to understand shape effects in nanosystems,” she adds, “because nanoparticles tend to have a natural shape to them because of how they grow.”
In addressing this problem, the scientists – in addition to isolating shape entropy in model systems – had to precisely delineate between and correlate the relative influences of shape entropy and intrinsic shape effects. This can be formidable: While the intrinsic shape of a cell or nanoparticle affects a range of other intrinsic properties, such as its surface and chemical characteristics, shape entropy is an effect that emerges from the geometry of the shape itself in the context of other shapes crowded around it. “Intrinsic shape effects are conceptually straightforward because they’re forces that originate from van der Waals, Coulomb, and other electrostatic and other forces, though in practice they may not be easy to measure experimentally,” Glotzer explains. “However, comparing intrinsic shape effects to shape entropy is a bit like comparing apples and oranges: there are many ways to characterize shapes, but forces aren’t typically one of them.” Moreover, research has historically focused on shape effects in specific systems, so a general solution was elusive, and there were no rules specifying the types of systems where shape effects might be seen.
Not surprisingly, then, a significant obstacle was quantitatively demonstrating that shape drives the phase behavior of systems of anisotropic particles upon crowding through directional entropic forces. “Our main problem here was trying to understand how there could be a local mechanism for global ordering that acts through entropy – which is a global construct,” Glotzer says. “It took us a while to realize that other investigators had already been asking this question for systems containing mixtures of large particles and very small particles.” (The latter, known as depletants, induce assembly or crystallization of larger particles.) “However,” she continues, “it was more challenging to determine how to pose and interpret this question mathematically when all particles are the same.” Glotzer adds that the technique van Anders and the rest of her team used to understand these systems – the potential of mean force and torque (PMFT), a treatment of isotropic entropic forces first given in 1949 by Jan de Boer at the Institute for Theoretical Physics, University of Amsterdam – is in many ways rather basic. Nevertheless, and somewhat remarkably, PMFT provided them with the key by allowing them to quantify directional entropic forces between anisotropic particles at arbitrary density. (PMFT is related to the potential of mean force, or PMF, an earlier approach that – unlike PMFT – has no concept of relative orientation between particles, and regarding shapes would only provide insight into radial, but not angular, dependence.)
The paper also addresses the relationships between shape entropy, self-assembly and packing behavior. (Self-assembly refers to thermodynamically stable or metastable phases that arise from systems maximizing their generalized entropy through spontaneous self-assembly in the presence of energetic and volumetric constraints, such as temperature and pressure; or through directed self-assembly due to other constraints, such as electromagnetic fields.) “Once we had determined how to measure the directional entropic forces,” van Anders explains, “the entropy/self-assembly connection became evident: On the systems we studied, the forces we were able to measure between particles were exactly in the range they should be to contribute to self-assembly (several kBT), which is on the order of intrinsic interactions between nanoparticles and on the scale of temperature-induced random motion.” (The metric kBT is the product of the Boltzmann constant, kB, and the temperature, T, used in physics as a scaling factor for energy values or as a unit of energy in molecular-scale systems.)
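As a quick reference for the kBT scale mentioned above, the thermal energy at room temperature works out as follows:

```python
# Boltzmann constant (exact SI value) and an approximate room temperature
kB = 1.380649e-23      # J/K
T = 300.0              # K

kBT_joules = kB * T
kBT_eV = kBT_joules / 1.602176634e-19   # 1 eV in joules (exact SI value)

print(f"kBT = {kBT_joules:.3e} J = {1000 * kBT_eV:.1f} meV")
# "Several kBT" therefore means a few times ~4e-21 J, comparable (as the
# researchers note) to intrinsic interactions between nanoparticles.
```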
That said, the scientists were able to use directional entropic forces to draw a distinction between self-assembly and packing behavior. “This was puzzling: For a long time, global density packing arguments have been used to predict assembly behavior in a range of systems,” Glotzer continues. “However, in the last few years – especially as researchers began looking more seriously at the anisotropic shapes being fabricated in the lab – these packing arguments started failing. Around the same time my group wrote a paper that showed that the assembled behavior can often be predicted by looking at the structure of a dense fluid of particles that hasn’t yet assembled.” The researchers realized that the forces they were seeing in their calculations were coming from local dense packing that happens in the fluid and the assembled systems. This showed that self-assembly and packing behavior were related, but not by global dense packing.
An important implication of understanding how shape entropy drives both self-assembly and packing despite their observable differences, Glotzer points out, is that there is growing interest in making ordered materials for various optical, electronic and other applications. “We’ve shown that, in general, it’s possible to use shape to control the structure of these materials,” she explains. “Now that we understand why particles are doing what they do when they form these materials, it becomes much easier to determine how to design them to generate desired materials rather than just going by trial-and-error.”
Another dramatic realization was that shape entropy drives the phase behavior of systems of anisotropic shapes through directional entropic forces. “We already knew from prior work in my group that you can quite often predict what crystal structure will form by looking at the fluid and the particle shape,” Glotzer tells Phys.org. “The problem for us was identifying what caused particles to arrange into the local structures they did in the fluid, and to show that they had the same sort of structure when they assembled.” Van Anders adds that the scientists were able to find the forces that induced and kept the particles in their preferred structures. “When they turned out to be in the right range we knew that we had it right.”
To date, the researchers have conducted their simulation studies only on idealized model systems. “Still,” says Glotzer, “our simulations capture what we believe to be the most important features of real colloidal systems.” Indeed, a growing number of published experimental studies now report the same structures her team predicted, and no counter-results have yet been observed. “We’re working closely with collaborators to leverage existing experimental techniques that will allow us to measure the strength of these forces and compare them with our predictions.” One such approach is measuring directional entropic forces in the lab by using confocal microscopy to determine the location and orientation of particles in assembling systems.
Moreover, Glotzer’s research group is collaborating with several experimental groups to investigate potential approaches to exploiting shape effects in the laboratory. “Now that we understand how local entropic forces work,” she tells Phys.org, “we can begin to think about designing particles so that entropy and internal energy balance in just the right way to yield complex target structures.”
Glotzer and van Anders conclude that “Researchers have been thinking about different kinds of entropy-driven systems since the 1930s, and since the 1950s have done a lot of work in systems in so-called depletant mixtures – but to our knowledge most people tend to think of those systems as having little to do with densely crowded, single-particle systems. Our work helps to tie these different lines of research together – and we hope that the decades of work done by the community in trying to understand depletant systems can help us get a deeper understanding of pure, dense systems, so that we can narrow our search for interesting new materials.”
Flexible electronics are the gateway to a new generation of phones, brain implants, artificial limbs, solar cells, and limitless other devices that benefit from the ability to bend, fold, and roll up.
The problem is figuring out how to make them.
Stretchability and conductivity are difficult properties to combine. Materials that are good conductors do not stretch well and materials that do stretch well are not good conductors.
This happens because stretching a solid material lengthens its chemical bonds, increasing the distance between atoms and, in turn, decreasing conductivity. Meanwhile, the crystalline structures of metals, which make them good conductors of heat and electricity, are hard to mold because their internal bonds are not very forgiving.
“This is the story throughout the entire family of stretchable conductors,” said study researcher Nicholas Kotov, a professor of engineering at the University of Michigan, who may have developed the best stretchy conductor yet.
The new material is made from gold nanoparticles that are embedded in a flexible synthetic material called polyurethane. The bendy film, described in a paper published in Nature on Wednesday, July 17, can conduct electricity even when stretched to more than twice its original length.
Scientists used electron microscope images to see what happened when the material was stretched. It turns out that the gold nanoparticles aligned into chains when pulled — instead of becoming disorganized — creating a good conducting pathway. Importantly, the nanoparticles rearranged themselves when the strain was released, meaning the process is reversible.
The gold nanoparticles produced in the lab appear as a deep purple substance.
The secret lies in the gold nanoparticles, which were made in the lab so that they would have very thin shells on their surfaces. These thin shells perform much better than the thicker shells of conventional nanoparticles.
“This is important because the shell stabilizes the particles and typically prevents the transfer of electrons from one nanoparticle to the other,” Kotov told Business Insider.
Without a thick shell, the electrons can hop from one nanoparticle to another more easily and are able to conduct electricity very well.
The practical applications of elastic metal are far-reaching, but Kotov is particularly interested in how his material can be used to improve medical devices.
There are a number of implantable devices for the brain, heart, and muscles. The problem with these rigid electrodes is that the human tissue easily recognizes them as foreign materials and generates scar tissue as a response, explains Kotov. The scar tissue reduces the performance of implantable devices. A pliable material that is more akin to our soft tissue is key to longer-term implants.
The search for a material that has the unusual combination of stretchability and electrical conductivity is ongoing, but this is a critical step forward.