NIST Research Suggests Graphene Can Stretch to be a Tunable Ion Filter – Applications for nanoscale sensors, drug delivery and water purification


 

 

Researchers at the National Institute of Standards and Technology (NIST) have conducted simulations suggesting that graphene, in addition to its many other useful features, can be modified with special pores to act as a tunable filter or strainer for ions (charged atoms) in a liquid.

The concept, which may also work with other membrane materials, could have applications such as nanoscale mechanical sensors, drug delivery, water purification, and sieves or pumps for ion mixtures similar to biological ion channels, which are critical to the function of living cells. The research was published online November 26, 2018, in Nature Materials.

“Imagine something like a fine-mesh kitchen strainer with sugar flowing through it,” project leader Alex Smolyanitsky said. “You stretch that strainer in such a way that every hole in the mesh becomes 1-2 percent larger. You’d expect that the flow through that mesh will be increased by roughly the same amount. Well, here it actually increases 1,000 percent. I think that’s pretty cool, with tons of applications.”

If it can be achieved experimentally, this graphene sieve would be the first artificial ion channel to offer an exponential increase in ion flow when stretched, opening possibilities for fast ion separations, pumps or precise salinity control. Collaborators plan laboratory studies of these systems, Smolyanitsky said.

Graphene is a layer of carbon atoms arranged in hexagons, similar in shape to chicken wire, that conducts electricity. The NIST molecular dynamics simulations focused on a graphene sheet 5.5 by 6.4 nanometers (nm) in size and featuring small holes lined with oxygen atoms. These pores are crown ethers—electrically neutral circular molecules known to trap metal ions. A previous NIST simulation study showed this type of graphene membrane might be used for nanofluidic computing.

In the simulations, the graphene was suspended in water containing potassium chloride, a salt that dissociates into potassium and chloride ions. The crown ether pores can trap potassium ions, which have a positive charge, and the trapping and release rates can be controlled electrically. Electric fields of various strengths were applied to drive an ion current through the membrane.

Researchers then simulated tugging on the membrane with various degrees of force to stretch and dilate the pores, greatly increasing the flow of potassium ions through the membrane. Stretching in all directions had the biggest effect, but even tugging in just one direction had a partial effect.

Researchers found that the unexpectedly large increase in ion flow was due to a subtle interplay of a number of factors, including the thinness of graphene; interactions between ions and the surrounding liquid; and the ion-pore interactions, which weaken when pores are slightly stretched. There is a very sensitive balance between ions and their surroundings, Smolyanitsky said.
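That sensitive balance can be pictured with a back-of-the-envelope Arrhenius-style model (an illustration of the idea, not the NIST molecular dynamics simulation): if stretching the pore lowers the effective ion-pore binding barrier roughly linearly, the rate at which trapped ions hop through, and with it the current, grows exponentially with strain. In the sketch below, the barrier height, thermal energy and strain coefficient are assumed, illustrative values, with the coefficient chosen so that a 1 percent stretch gives roughly a tenfold (about 1,000 percent) increase.

```python
# Toy Arrhenius-style estimate (illustrative only; not the NIST molecular
# dynamics model): a small pore strain lowers an assumed ion-pore barrier
# linearly, so the crossing rate, and with it the ion current, rises exponentially.
import math

KT = 0.0257      # thermal energy at ~298 K, in eV
E0 = 0.30        # assumed unstrained barrier height, eV (hypothetical)
ALPHA = 6.0      # assumed barrier reduction per unit strain, eV (hypothetical)

def relative_current(strain):
    """Ion current relative to the unstrained membrane, from the ratio of rates."""
    barrier = E0 - ALPHA * strain
    return math.exp(-(barrier - E0) / KT)

for strain in (0.0, 0.01, 0.02):   # 0, 1 and 2 percent stretch
    print(f"strain {strain:.0%}: current x{relative_current(strain):,.1f}")
```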

The research was funded by the Materials Genome Initiative.


Paper: A. Fang, K. Kroenlein, D. Riccardi and A. Smolyanitsky. Highly mechanosensitive ion channels from graphene-embedded crown ethers. Nature Materials. Published online November 26, 2018. DOI: 10.1038/s41563-018-0220-4

DNA and Nanotechnology: Protein ‘Rebar’ Could Help Make Error-Free Nanostructures (NIST)


The protein RecA (purple units) wraps around and fortifies double-stranded DNA, enabling scientists to build large structures with the genetic material. Credit: NIST

DNA is the stuff of life, but it is also the stuff of nanotechnology. Because molecules of DNA with complementary chemical structures recognize and bind to one another, strands of DNA can fit together like Lego blocks to make nanoscale objects of complex shape and structure.

But researchers need to work with much larger assemblages of DNA to realize a key goal: building durable miniature devices such as biosensors and drug-delivery containers. That’s been difficult because long chains of DNA are floppy and the standard method of assembling long chains is prone to error.

Using a DNA-binding protein called RecA as a kind of nanoscale rebar, or reinforcing bar, to support the floppy DNA scaffolding, researchers at the National Institute of Standards and Technology (NIST) have constructed several of the largest rectangular, linear and other shapes ever assembled from DNA. The structures can be two to three times larger than those built using standard DNA self-assembly techniques.

In addition, because the new method requires fewer chemically distinct pieces to build organized structures than the standard technique, known as DNA origami, it is likely to reduce the number of errors in constructing the shapes. That’s a big plus for the effort to produce reliable DNA-based devices in large quantities, said NIST researcher Alex Liddle.

Although RecA’s ability to bind to double-stranded DNA has been known for years, the NIST team is the first to integrate filaments of this protein into the assembly of DNA structures. The addition of RecA offers a particular advantage: Once one unit of the protein binds to a small segment of double-stranded DNA, it automatically attracts other units to line up alongside it, in the same way that bar magnets will join end-to-end. Like bricks filling out a foundation, RecA lines the entire length of the DNA strand, stretching, widening and strengthening it. A floppy, 2-nanometer-wide strand of DNA can transform into a rigid structure more than four times as wide.

“The RecA method greatly extends the ability of DNA self-assembly methods to build larger and more sophisticated structures,” said NIST’s Daniel Schiffels.

Schiffels, Liddle and their colleague Veronika Szalai describe their work in a recent article in ACS Nano.

The new method incorporates the DNA origami technique and goes beyond it, according to Liddle. In DNA origami, short strands of DNA with specific base sequences are used as staples to tie together long sections of DNA. To make the skinny DNA skeleton stronger and thicker, the long strand may loop back on itself, quickly using itself up.

If DNA origami is all about the folding, Liddle likened his team’s new method to building a room, starting with a floor plan. The locations of the short, single-stranded pieces of DNA that act as staples mark the corners of the room. Between the corners lies a long, skinny piece of single-stranded DNA. The enzyme DNA polymerase transforms a section of that long single strand into the double-stranded version of the molecule, a necessary step because RecA only binds strongly to double-stranded DNA. Then RecA assembles all along the double strand, reinforcing the DNA and limiting the need for extra staples to maintain its shape.

With fewer staples required, the RecA method is likely able to build organized structures with fewer errors than DNA origami, Liddle said.


More information: Daniel Schiffels et al. Molecular Precision at Micrometer Length Scales: Hierarchical Assembly of DNA–Protein Nanostructures, ACS Nano (2017). DOI: 10.1021/acsnano.7b00320

 

IBM Scientists: Quantum Transport Goes Ballistic – A Q&A with NIST


 


IBM scientists have shot an electron through a III-V semiconductor nanowire integrated on silicon for the first time (Nano Letters, “Ballistic One-Dimensional InAs Nanowire Cross-Junction Interconnects”). This achievement will form the basis for sophisticated quantum-wire devices for future integrated circuits used in advanced, powerful computational systems. (A Q&A with NIST)

NIST: The title of your paper is “Ballistic One-Dimensional InAs Nanowire Cross-Junction Interconnects.” When I read “ballistic,” rather large missiles come to mind, but here you are doing this at the nanoscale. Can you talk about the challenges this presents?
Johannes Gooth (JG): Yes, this is very similar, but of course at a much different scale. Electrons are fired from one contact electrode and fly through the nanowire without being scattered until they hit the opposite electrode. The nanowire acts as a perfect guide for electrons, such that the full quantum information of the electron (energy, momentum, spin) can be transferred without losses.

IBM scientist Johannes Gooth is focused on nanoscale electronics and quantum physics.

 

We can now do this in cross-junctions, which allows us to build up electron pipe networks in which quantum information can be transmitted perfectly. The challenge is to fabricate, at the nanoscale, a geometrically very well defined material with no scatterers inside. The template-assisted selective epitaxy (TASE) process, which was developed here at the IBM Zurich Lab by my colleagues, makes this possible for the first time.
NIST: How does this research compare to other activities underway elsewhere?
JG: Most importantly, compared to optical and superconducting quantum applications, the technique is scalable and compatible with standard electronics and CMOS processes.
NIST: What role do you see for quantum transport as we look to build a universal quantum computer?
JG: I see quantum transport as an essential piece. If you want to exercise the full power of quantum information technology, you need to connect everything ballistically: a quantum system that is fully ballistically (quantum) connected has an exponentially larger computational state space than a classically connected system.
Also, as stated above, the electronics are scalable. Moreover, combining our nanowire structures with superconductors allows for topologically protected quantum computing, which enables fault-tolerant computation. These are major advantages compared to other techniques.
NIST: How easily can this be manufactured using existing processes and what’s the next step?
JG: This is a major advantage of our technique because our devices are fully integrated into existing CMOS processes and technology.
NIST: What’s next for your research?
JG: The next steps will be the functionalization of the crosses, by means of attaching electronic quantum computational parts. We will start to build superconducting/nanowire hybrid devices for Majorana braiding, and attach quantum dots.
Source: NIST

 

New Silk Nanosensor Could Speed Development of New Infrastructure, Aerospace and Consumer Materials



(Middle) Mechanophore-labeled silk fiber fluoresces in response to damage or stress. (Right) Control sample without the mechanophore. (Image: Chelsea Davis and Jeremiah Woodcock/NIST)

Posted: Mar 17, 2017

Consumers want fuel-efficient vehicles and high-performance sporting goods, municipalities want weather-resistant bridges, and manufacturers want more efficient ways to make reliable cars and aircraft. 
What’s needed are new lightweight, energy-saving composites that won’t crack or break even after prolonged exposure to environmental or structural stress. 

To help make that possible, researchers working at the National Institute of Standards and Technology (NIST) have developed a way to embed a nanoscale damage-sensing probe into a lightweight composite made of epoxy and silk.

The probe, known as a mechanophore, could speed up product testing and potentially reduce the amount of time and materials needed for the development of many kinds of new composites.

The NIST team created their probe from a dye known as rhodamine spirolactam (RS), which changes from a dark state to a light state in reaction to an applied force. In this experiment, the molecule was attached to silk fibers contained inside an epoxy-based composite. 

As more and more force was applied to the composite, the stress and strain activated the RS, causing it to fluoresce when excited with a laser. Although the change was not visible to the naked eye, a red laser and a microscope built and designed by NIST were used to take photos inside the composite, showing even the most minute breaks and fissures to its interior, and revealing points where the fiber had fractured.

The results were published today in the journal Advanced Materials Interfaces (“Observation of Interfacial Damage in a Silk-Epoxy Composite, Using a Simple Mechanoresponsive Fluorescent Probe”).

The materials used in the design of composites are diverse. In nature, composites such as crab shell or elephant tusk (bone) are made of proteins and polysaccharides. In this study, epoxy was combined with silk filaments prepared by Professor Fritz Vollrath’s group at Oxford University using Bombyx mori silkworms. Fiber-reinforced polymer composites such as the one used in this study combine the most beneficial aspects of the main components–the strength of the fiber and the toughness of the polymer.

What all composites have in common, though, is the presence of an interface where the components meet. The resilience of that interface is critical to a composite’s ability to withstand damage. Interfaces that are thin but flexible are often favored by designers and manufacturers, but it is very challenging to measure the interfacial properties in a composite.

“There have long been ways to measure the macroscopic properties of composites,” said researcher Jeffrey Gilman, who led the team doing the work at NIST. “But for decades the challenge has been to determine what was happening inside, at the interface.”

One option is optical imaging. However, conventional methods for optical imaging are only able to record images at scales as small as 200-400 nanometers. Some interfaces are only 10 to 100 nanometers in thickness, making such techniques somewhat ineffective for imaging the interphase in composites. 

By installing the RS probe at the interface, the researchers were able to “see” damage exclusively at the interface using optical microscopy.

The NIST research team is planning to expand their research to explore how such probes could be used in other kinds of composites as well. They also would like to use such sensors to enhance the capability of these composites to withstand extreme cold and heat. 
There’s a tremendous demand for composites that can withstand prolonged exposure to water, too, especially for use in building more resilient infrastructure components such as bridges and giant blades for wind turbines.

The research team plans to continue searching for more ways that damage sensors such as the one in this study could be used to improve standards for existing composites and create new standards for the composites of the future, ensuring that those materials are safe, strong and reliable.

“We now have a damage sensor to help optimize the composite for different applications,” Gilman said. “If you attempt a design change, you can figure out if the change you made improved the interface of a composite, or weakened it.”

Source: National Institute of Standards and Technology (NIST)

NIST-made ‘sun and rain’ used to study nanoparticle release from polymers


 

NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY (NIST)

If the 1967 film “The Graduate” were remade today, Mr. McGuire’s famous advice to young Benjamin Braddock would probably be updated to “Plastics … with nanoparticles.” These days, the mechanical, electrical and durability properties of polymers–the class of materials that includes plastics–are often enhanced by adding miniature particles (smaller than 100 nanometers or billionths of a meter) made of elements such as silicon or silver. But could those nanoparticles be released into the environment after the polymers are exposed to years of sun and water–and if so, what might be the health and ecological consequences?

In a recently published paper, researchers from the National Institute of Standards and Technology (NIST) describe how they subjected a commercial nanoparticle-infused coating to NIST-developed methods for accelerating the effects of weathering from ultraviolet (UV) radiation and simulated washings of rainwater. Their results indicate that humidity and exposure time are contributing factors for nanoparticle release, findings that may be useful in designing future studies to determine potential impacts.

In their recent experiment, the researchers exposed multiple samples of a commercially available polyurethane coating containing silicon dioxide nanoparticles to intense UV radiation for 100 days inside the NIST SPHERE (Simulated Photodegradation via High-Energy Radiant Exposure), a hollow, 2-meter (7-foot) diameter black aluminum chamber lined with highly UV reflective material that bears a casual resemblance to the Death Star in the film “Star Wars.” For this study, one day in the SPHERE was equivalent to 10 to 15 days outdoors. All samples were weathered at a constant temperature of 50 degrees Celsius (122 degrees Fahrenheit) with one group done in extremely dry conditions (approximately 0 percent humidity) and the other in humid conditions (75 percent humidity).

To determine if any nanoparticles were released from the polymer coating during UV exposure, the researchers used a technique they created and dubbed “NIST simulated rain.” Filtered water was converted into tiny droplets, sprayed under pressure onto the individual samples, and then the runoff–with any loose nanoparticles–was collected in a bottle. This procedure was conducted at the beginning of the UV exposure, every two weeks during the weathering run and at the end. All of the runoff fluids were then analyzed by NIST chemists for the presence of silicon and in what amounts. Additionally, the weathered coatings were examined with atomic force microscopy (AFM) and scanning electron microscopy (SEM) to reveal surface changes resulting from UV exposure.

Both sets of coating samples–those weathered in very low humidity and the others in very humid conditions–degraded but released only small amounts of nanoparticles. The researchers found that more silicon was recovered from the samples weathered in humid conditions and that nanoparticle release increased as the UV exposure time increased. Microscopic examination showed that deformations in the coating surface became more numerous with longer exposure time, and that nanoparticles left behind after the coating degraded often bound together in clusters.

“These data, and the data from future experiments of this type, are valuable for developing computer models to predict the long-term release of nanoparticles from commercial coatings used outdoors, and in turn, help manufacturers, regulatory officials and others assess any health and environmental impacts from them,” said NIST research chemist Deborah Jacobs, lead author on the study published in the Journal of Coatings Technology and Research.

This project resulted from a collaboration between NIST’s Engineering Laboratory and Material Measurement Laboratory. It is part of NIST’s work to help characterize the potential environmental, health and safety (EHS) risks of nanomaterials, and develop methods for identifying and measuring them.


D.S. Jacobs, S-R Huang, Y-L Cheng, S.A. Rabb, J.M. Gorham, P.J. Krommenhoek, L.L. Yu, T. Nguyen and L. Sung. Surface degradation and nanoparticle release of a commercial nanosilica/polyurethane coating under UV exposure. Journal of Coatings Technology and Research, September 2016. DOI: 10.1007/s11998-016-9796-2

Inventing a Fleet, Fast Test for Nanomanufacturing (Nanomaterials) Quality Control


Measurements of electrical properties of a plastic tape (yellow), taken using a specially designed microwave cavity (the white cylinder at center) and accompanying electrical circuit, change quickly and consistently in response to changes in the tape’s thickness. The setup is inspired by high-volume roll-to-roll manufacturing devices used to mass-produce nanomaterials. The changes in the tape’s thickness spell NIST in Morse code.
Credit: NIST/Nathan Orloff

Manufacturers may soon have a speedy and nondestructive way to test a wide array of materials under real-world conditions, thanks to an advance that researchers at the National Institute of Standards and Technology (NIST) have made in roll-to-roll measurements. These are typically optical measurements made during roll-to-roll manufacturing, a term for any method that uses conveyor belts for continuous processing of items, from tires to nanotechnology components.

In order for new materials such as carbon nanotubes and graphene to play an increasingly important role in electronic devices, high-tech composites and other applications, manufacturers will need quality-control tests to ensure that products have the desired characteristics and lack flaws. Current test procedures often require cutting, scratching or otherwise touching a product, which slows the manufacturing process and can damage or even destroy the sample being tested.

To add to existing non-contact testing methods, NIST physicists Nathan Orloff, Christian Long and Jan Obrzut measured properties of films by passing them through a specially designed metal box known as a microwave cavity. Electromagnetic waves build up inside the cavity at a specific “resonance” frequency determined by the box’s size and shape, similar to how a guitar string vibrates at a specific pitch depending on its length and tension. When an object is placed inside the cavity, the resonance frequency changes in a way that depends on the object’s size, electrical resistance and dielectric constant, a measure of an object’s ability to store energy in an electric field. The frequency change is reminiscent of how shortening or tightening a guitar string makes it resonate at a higher pitch, says Orloff.

The researchers also built an electrical circuit to measure these changes. They first tested their device by running a strip of plastic tape known as polyimide through the cavity, using a roll-to-roll setup resembling the high-volume roll-to-roll manufacturing devices used to mass-produce nanomaterials. As the tape’s thickness increased and decreased–the researchers made the changes in tape thickness spell “NIST” in Morse code–the cavity’s resonant frequency changed in tandem. So did another parameter called the “quality factor,” which is the ratio of the energy stored in the cavity to the energy lost per frequency cycle. Because polyimide’s electrical properties are well known, a manufacturer could use the cavity measurements to monitor whether tape is coming off the production line at a consistent thickness–and even feed information from the measurements back to control the thickness.
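To get a feel for the scale of those shifts, a textbook cavity-perturbation estimate (a simplification for illustration, not NIST's calibration procedure) says that a small, low-loss dielectric sample pulls the resonance down roughly in proportion to its dielectric constant and to the fraction of the cavity volume it occupies, so a thicker tape produces a larger shift. The empty-cavity frequency, cavity volume and tape dimensions below are assumed, hypothetical values.

```python
# Textbook cavity-perturbation estimate (illustrative; not NIST's calibration):
# for a small, low-loss sample sitting in a region of strong electric field,
#   df / f0  ~  -(eps_r - 1) * V_sample / (2 * V_cavity),
# while losses in the sample show up as a drop in the quality factor.
def fractional_shift(eps_r, v_sample, v_cavity):
    """Approximate fractional resonance-frequency shift for a small sample."""
    return -(eps_r - 1.0) * v_sample / (2.0 * v_cavity)

F0 = 7.4e9            # assumed empty-cavity resonance, Hz (hypothetical)
EPS_POLYIMIDE = 3.4   # typical dielectric constant of polyimide tape
V_CAVITY = 1.0e-4     # assumed cavity volume, m^3 (hypothetical)

for thickness_um in (25, 50):                         # two tape thicknesses to compare
    v_tape = 0.02 * 0.01 * (thickness_um * 1e-6)      # assumed 2 cm x 1 cm strip in the cavity
    shift_hz = fractional_shift(EPS_POLYIMIDE, v_tape, V_CAVITY) * F0
    print(f"{thickness_um} um tape: shift of about {shift_hz / 1e3:.0f} kHz")
```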

Alternatively, a manufacturer could use the new method to monitor the electrical properties of a less well-characterized material of known dimensions. Orloff and Long demonstrated this by passing 12- and 15-centimeter-long films of carbon nanotubes deposited on sheets of plastic through the cavity and measuring the films’ electrical resistance. The entire process took “less than a second,” says Orloff. He added that with industry-standard equipment, the measurements could be taken at speeds beyond 10 meters per second, more than enough for many present-day manufacturing operations.

The new method has several advantages for a thin-film manufacturer, says Orloff. One, “You can measure the entire thing, not just a small sample,” he said. Such real-time measurements could be used to tune the manufacturing process without shutting it down, or to discard a faulty batch of product before it gets out the factory door. “This method could significantly boost prospects of not making a faulty batch in the first place,” Long noted.

And because the method is nondestructive, Orloff added, “If a batch passes the test, manufacturers can sell it.”

Films of carbon nanotubes and graphene are just starting to be manufactured in bulk for potential applications such as composite airplane materials, smartphone screens and wearable electronic devices.

Orloff, Long and Obrzut submitted a patent application for this technique in December 2015.

A producer of such materials has already expressed interest in the new method, said Orloff. “They’re really excited about it.” He added that the method is not specific to nanomanufacturing, and with a properly designed cavity, could also help with quality control of many other kinds of products, including tires, pharmaceuticals and even beer.


Story Source:

The above post is reprinted from materials provided by National Institute of Standards and Technology (NIST). Note: Materials may be edited for content and length.


Journal Reference:

  1. Nathan D. Orloff, Christian J. Long, Jan Obrzut, Laurent Maillaud, Francesca Mirri, Thomas P. Kole, Robert D. McMichael, Matteo Pasquali, Stephan J. Stranick, J. Alexander Liddle. Noncontact conductivity and dielectric measurement for high throughput roll-to-roll nanomanufacturing. Scientific Reports, 2015; 5: 17019 DOI: 10.1038/srep17019

NIST Method for Spotting Quantum Dots Could Help Make High-Performance Nanophotonic Devices


Life may be as unpredictable as a box of chocolates, but ideally, you always know what you’re going to get from a quantum dot. A quantum dot should produce one, and only one, photon—the smallest constituent of light—each time it is energized. This characteristic makes it attractive for use in various quantum technologies such as secure communications. Oftentimes, however, the trick is in finding the dots.

[Clockwise from top left] Circular grating for extracting single photons from a quantum dot. For optimal performance, the quantum dot must be located at the center of the grating. Image taken with the camera-based optical location technique. A single quantum dot appears as a bright spot within an area defined by four alignment marks. Electron-beam lithography is used to define a circular grating at the quantum dot’s location. Image of the emission of the quantum dot within the grating. The bright spot appears in the center of the device, as desired.

Credit: NIST

“Self-assembled, epitaxially grown” quantum dots have the highest optical quality. They randomly emerge (self-assemble) at the interface between two layers of a semiconductor crystal as it is built up layer-by-layer (epitaxially grown).

They grow randomly, but in order for the dots to be useful, they need to be located in a precise relation to some other photonic structure, be it a grating, resonator or waveguide, that can control the photons that the quantum dot generates. However, finding the dots—they’re just about 10 nanometers across—is no small feat.

Always up for a challenge, researchers working at the National Institute of Standards and Technology (NIST) have developed a simple new technique for locating them, and used it to create high-performance single photon sources.

This new development, which appeared in Nature Communications,* may make the manufacture of high-performance photonic devices using quantum dots much more efficient. Such devices are usually made in regular arrays using standard nanofabrication techniques for the control structures. However, because of the random distribution of the dots, only a small percentage of them will line up correctly with the control structures, so this process produces very few working devices.

“This is a first step towards providing accurate location information for the manufacture of high performance quantum dot devices,” says NIST physicist Kartik Srinivasan. “So far, the general approach has been statistical—make a lot of devices and end up with a small fraction that work. Our camera-based imaging technique maps the location of the quantum dots first, and then uses that knowledge to build optimized light-control devices in the right place.”

According to co-lead researcher Luca Sapienza of the University of Southampton in the United Kingdom, the new technique is sort of a twist on a red-eye reducing camera flash, where the first flash causes the subject’s pupils to close and the second illuminates the scene. Instead of a xenon-powered flash, the NIST team uses two LEDs.

In their setup, one LED activates the quantum dots when it flashes (so the LED gives the quantum dots red-eye). At the same time, a second, different color LED flash illuminates metallic orientation marks placed on the surface of the semiconductor wafer the dots are embedded in. Then a sensitive camera snaps a 100-micrometer by 100-micrometer picture.

By cross-referencing the glowing dots with the orientation marks, the researchers can determine the dots’ locations with an uncertainty of less than 30 nanometers. The coordinates in hand, scientists can then tell the computer-controlled electron beam lithography tool to place the control structures in the correct places, with the result being many more usable devices.
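The cross-referencing step itself is straightforward bookkeeping once the centroids are known. The sketch below is a stripped-down, hypothetical version: the pixel coordinates, image scale and mark layout are invented, and the actual NIST analysis fits each spot profile to reach the quoted sub-30-nanometer uncertainty.

```python
# Hypothetical illustration of cross-referencing a quantum dot against four
# alignment marks (invented coordinates; the NIST analysis fits spot profiles
# for sub-pixel precision). The dot position is reported in the marks' frame,
# which is the frame the electron-beam lithography tool can return to.
import numpy as np

PIX_PER_UM = 10.0                        # assumed image scale, pixels per micrometer

marks_px = np.array([[120.0, 118.0],     # measured centroids of the four marks (made up)
                     [620.0, 121.0],
                     [123.0, 619.0],
                     [624.0, 622.0]])
dot_px = np.array([402.5, 355.0])        # measured centroid of the glowing dot (made up)

origin_px = marks_px.mean(axis=0)        # center of the alignment-mark pattern
dot_um = (dot_px - origin_px) / PIX_PER_UM
print(f"dot offset from pattern center: ({dot_um[0]:+.2f}, {dot_um[1]:+.2f}) micrometers")
```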

Using this technique, the researchers demonstrated grating-based single photon sources in which they were able to collect 50 percent of the quantum dot’s emitted photons, the theoretical limit for this type of structure.

They also demonstrated that more than 99 percent of the light produced from their source came out as single photons. Such high purity is partly due to the fact that the location technique helps the researchers to quickly survey the wafer (10,000 square micrometers at a time) to find regions where the quantum dot density is especially low, only about one per 1,000 square micrometers. This makes it far more likely that each grating device contains one—and only one—quantum dot.
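A back-of-the-envelope Poisson estimate (not from the paper, and using an assumed device footprint) shows why that low density matters: because each grating is built around a dot that has already been located, the only risk is that a second, unwanted dot happens to fall inside the same footprint.

```python
# Back-of-the-envelope Poisson estimate (assumed footprint; not from the paper):
# with about one dot per 1,000 square micrometers, the chance that a second,
# unwanted dot lands inside a small grating footprint is only about 1 percent.
import math

DENSITY = 1.0 / 1000.0     # dots per square micrometer, from the article
FOOTPRINT = 10.0           # assumed grating footprint in square micrometers (hypothetical)

expected_extras = DENSITY * FOOTPRINT
p_only_intended_dot = math.exp(-expected_extras)   # Poisson probability of zero extras
print(f"chance the device holds only the intended dot: {p_only_intended_dot:.1%}")
```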

This work was performed in part at NIST’s Center for Nanoscale Science and Technology (CNST), a national user facility available to researchers from industry, academia and government. In addition to NIST and the University of Southampton, researchers from the University of Rochester contributed to this work.

* L. Sapienza, M. Davanço, A. Badolato and K. Srinivasan. Nanoscale optical positioning of single quantum dots for bright and pure single-photon emission. Nature Communications 6, 7833 (2015). DOI: 10.1038/ncomms8833. Published July 27, 2015.

NIST: Reducing the High Costs of Hydrogen (Fuel) Pipelines


The National Institute of Standards and Technology (NIST) has put firm numbers on the high costs of installing pipelines to transport hydrogen fuel–and also found a way to reduce those costs.

Samples of pipeline steel instrumented for fatigue testing in a pressurized hydrogen chamber (the vertical tube). NIST researchers used data from such tests to develop a model for hydrogen effects on pipeline lifetime, to support a federal effort to reduce overall costs of hydrogen fuel. (Image: NIST)
Pipelines to carry hydrogen cost more than other gas pipelines because of the measures required to combat the damage hydrogen does to steel’s mechanical properties over time. NIST researchers calculated that hydrogen-specific steel pipelines can cost as much as 68 percent more than natural gas pipelines, depending on pipe diameter and operating pressure.* By contrast, a widely used cost model** suggests a cost penalty of only about 10 percent.

But the good news, according to the new NIST study, is that hydrogen transport costs could be reduced for most pipeline sizes and pressures by modifying industry codes*** to allow the use of a higher-strength grade of steel alloy without requiring thicker pipe walls. The stronger steel is more expensive, but dropping the requirement for thicker walls would reduce materials use and related welding and labor costs, resulting in a net cost reduction. The code modifications, which NIST has proposed to the American Society of Mechanical Engineers (ASME), would not lower pipeline performance or safety, the NIST authors say.

“The cost savings comes from using less–because of thinner walls–of the more expensive material,” says NIST materials scientist James Fekete, a co-author of the study. “The current code does not allow you to reduce thickness when using higher-strength material, so costs would increase. With the proposed code, in most cases, you can get a net savings with a thinner pipe wall, because the net reduction in material exceeds the higher cost per unit weight.”

The NIST study is part of a federal effort to reduce the overall costs of hydrogen fuel, which is renewable, nontoxic and produces no harmful emissions. Much of the cost is for distribution, which likely would be most economical by pipeline. The U.S. contains more than 300,000 miles of pipelines for natural gas but very little customized for hydrogen. Existing codes for hydrogen pipelines are based on decades-old data. NIST researchers are studying hydrogen’s effects on steel to find ways to reduce pipeline costs without compromising safety or performance.

As an example, the new code would allow a 24-inch pipe made of high-strength X70 steel to be manufactured with a thickness of 0.375 inches for transporting hydrogen gas at 1500 pounds per square inch (psi). (In line with industry practice, ASME pipeline standards are expressed in customary units.) According to the new NIST study, this would reduce costs by 31 percent compared to the baseline X52 steel with a thickness of 0.562 inches, as required by the current code. In addition, thanks to its higher strength, X70 would make it possible to safely transport hydrogen through bigger pipelines at higher pressure (36-inch diameter pipe to transport hydrogen at 1500 psi) than is allowed with X52, enabling transport and storage of greater fuel volumes. This diameter-pressure combination is not possible under the current code.
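Most of that saving is simple geometry: for a thin-walled pipe, the steel volume per unit length scales with wall thickness, so dropping from 0.562 to 0.375 inches removes roughly a third of the material. The sketch below redoes that arithmetic; the X70 price premium is an assumed placeholder, and the study's 31 percent figure also folds in welding and labor costs.

```python
# Rough cost arithmetic for the 24-inch example (thin-wall approximation; the
# X70 price premium is a hypothetical placeholder, not a value from the study).
import math

DIAMETER_IN = 24.0     # nominal pipe diameter from the example
T_X52 = 0.562          # wall thickness required for X52 under the current code, inches
T_X70 = 0.375          # wall thickness allowed for X70 under the proposed code, inches
X70_PREMIUM = 1.10     # assumed 10 percent higher price per unit weight (hypothetical)

def steel_volume_per_foot(diameter_in, thickness_in):
    """Thin-wall approximation: circumference times wall thickness times 12 inches."""
    return math.pi * diameter_in * thickness_in * 12.0

v52 = steel_volume_per_foot(DIAMETER_IN, T_X52)
v70 = steel_volume_per_foot(DIAMETER_IN, T_X70)

print(f"steel volume saved by the thinner X70 wall: {1 - v70 / v52:.0%}")
print(f"net change in material cost with the assumed premium: {1 - X70_PREMIUM * v70 / v52:.0%}")
```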

The proposed code modifications were developed through research into the fatigue properties of high-strength steel at NIST’s Hydrogen Pipeline Material Testing Facility. In actual use, pipelines are subjected to cycles of pressurization at stresses far below the failure point, but high enough to result in fatigue damage. Unfortunately, it is difficult and expensive to determine steel fatigue properties in pressurized hydrogen. As a result, industry has historically used tension testing data as the basis for pipeline design, and higher-strength steels lose ductility in such tests in pressurized hydrogen. But this type of testing, which involves steadily increasing stress to the failure point, does not predict fatigue performance in hydrogen pipeline materials, Fekete says.
NIST research has shown that under realistic conditions, steel alloys with higher strengths (such as X70) do not have higher fatigue crack growth rates than lower grades (X52). The data have been used to develop a model**** for hydrogen effects on pipeline steel fatigue crack growth, which can predict pipeline lifetime based on operating conditions.
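The published NIST model is specific to hydrogen-assisted fatigue, but the general logic of turning crack-growth data into a lifetime estimate can be sketched with a generic Paris-law integration. Every constant in the sketch below is an assumed placeholder rather than a value from the NIST work.

```python
# Generic Paris-law lifetime sketch (placeholder constants; not the NIST model):
# integrate da/dN = C * (dK)^m from an initial flaw size to a critical size,
# where dK is the stress-intensity range for an idealized through-crack.
import math

C = 1.0e-11        # assumed Paris coefficient, m/cycle per (MPa*sqrt(m))^m
M_EXP = 3.0        # assumed Paris exponent
DSIGMA = 50.0      # assumed cyclic hoop-stress range, MPa
A0 = 0.5e-3        # assumed initial flaw depth, m
A_CRIT = 5.0e-3    # assumed critical crack depth, m

def delta_k(stress_range_mpa, crack_m):
    """Stress-intensity range for an idealized through-crack: dK = dsigma * sqrt(pi * a)."""
    return stress_range_mpa * math.sqrt(math.pi * crack_m)

a, cycles, step = A0, 0.0, 1.0e-5            # grow the crack in small increments
while a < A_CRIT:
    growth_per_cycle = C * delta_k(DSIGMA, a) ** M_EXP
    cycles += step / growth_per_cycle        # cycles spent growing the crack by one step
    a += step

print(f"estimated life with these placeholder inputs: {cycles:,.0f} pressure cycles")
```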
Notes
* J.W. Sowards, J.R. Fekete and R.L. Amaro. Economic impact of applying high strength steels in hydrogen gas pipelines. International Journal of Hydrogen Energy. 2015. In press, corrected proof available online. DOI:10.1016/j.ijhydene.2015.06.090
** DOE H2A Delivery Analysis. U.S. Department of Energy. Available online at http://www.hydrogen.energy.gov/h2a_delivery.html.
*** ASME B31.12 Hydrogen Piping and Pipeline Code (ASME B31.12). Industry groups such as ASME commonly rely on NIST data in developing codes.
**** R.L. Amaro, N. Rustagi, K.O. Findley, E.S. Drexler and A.J. Slifka. Modeling the fatigue crack growth of X100 pipeline steel in gaseous hydrogen. Int. J. Fatigue, 59 (2014). pp 262-271.
Source: NIST

NIST’s ‘Nano-Raspberries’ Could Bear Fruit in Fuel Cells


Researchers at the National Institute of Standards and Technology (NIST) have developed a fast, simple process for making platinum “nano-raspberries”—microscopic clusters of nanoscale particles of the precious metal. The berry-like shape is significant because it has a high surface area, which is helpful in the design of catalysts. Even better news for industrial chemists: the researchers figured out when and why the berry clusters clump into larger bunches of “nano-grapes.”

Colorized micrographs of platinum nanoparticles made at NIST. The raspberry color suggests the particles’ corrugated shape, which offers high surface area for catalyzing reactions in fuel cells. Individual particles are 3-4 nm in diameter but can clump into bunches of 100 nm or more under specific conditions discovered in a NIST study.
Credit: Curtin/NIST

The research could help make fuel cells more practical. Nanoparticles can act as catalysts to help convert methanol to electricity in fuel cells. NIST’s 40-minute process for making nano-raspberries, described in a new paper,* has several advantages. The high surface area of the berries encourages efficient reactions. In addition, the NIST process uses water, a benign or “green” solvent. And the bunches catalyze methanol reactions consistently and are stable at room temperature for at least eight weeks.

Although the berries were made of platinum, the metal is expensive and was used only as a model. The study will actually help guide the search for alternative catalyst materials, and clumping behavior in solvents is a key issue. For fuel cells, nanoparticles often are mixed with solvents to bind them to an electrode. To learn how such formulas affect particle properties, the NIST team measured particle clumping in four different solvents for the first time. For applications such as liquid methanol fuel cells, catalyst particles should remain separated and dispersed in the liquid, not clumped.

“Our innovation has little to do with the platinum and everything to do with how new materials are tested in the laboratory,” project leader Kavita Jeerage says. “Our critical contribution is that after you make a new material you need to make choices. Our paper is about one choice: what solvent to use. We made the particles in water and tested whether you could put them in other solvents. We found out that this choice is a big deal.”

The NIST team measured conditions under which platinum particles, ranging in size from 3 to 4 nanometers (nm) in diameter, agglomerated into bunches 100 nm wide or larger. They found that clumping depends on the electrical properties of the solvent. The raspberries form bigger bunches of grapes in solvents that are less “polar,” that is, where solvent molecules lack regions with strongly positive or negative charges. (Water is a strongly polar molecule.)

The researchers expected that. What they didn’t expect is that the trend doesn’t scale in a predictable way. The four solvents studied were water, methanol, ethanol and isopropanol, ordered by decreasing polarity. There wasn’t much agglomeration in methanol; bunches got about 30 percent bigger than they were in water. But in ethanol and isopropanol, the clumps got 400 percent and 600 percent bigger, respectively—really humongous bunches. This is a very poor suspension quality for catalytic purposes.

Because the nanoparticles clumped up slowly and not too much in methanol, the researchers concluded that the particles could be transferred to that solvent, assuming they were to be used within a few days—effectively putting an expiration date on the catalyst.

Two college students in NIST’s Summer Undergraduate Research Fellowship (SURF) program helped with the extensive data collection required for the study.

* I. Sriram, A.E. Curtin, A.N. Chiaramonti, J.H. Cuchiaro, A.D. Weidner, T.M. Tingley, L.F. Greenlee and K.M. Jeerage. Stability and phase transfer of catalytically active platinum nanoparticle suspensions. Journal of Nanoparticle Research 17:230. DOI: 10.1007/s11051-015-3034-1. Published online May 22, 2015.

Scientists Are Beaming Over Quantum Teleportation Record: Video


A new distance record has been set in the strange world of quantum teleportation.

In the record-setting experiment, the quantum spin of a light particle was teleported 15.5 miles (25 kilometers) across an optical fiber, making the operation the farthest quantum feat of its kind.

About five years ago, researchers could teleport quantum information, such as the direction in which a particle is spinning, across only a few meters. Now they can beam that information across several miles.

Quantum teleportation doesn’t mean it’s possible to beam someone aboard a spacecraft, as in “Star Trek.” Physicists can’t transport matter this way, but they can transfer quantum information from one place to another through quantum teleportation. This works thanks to a bizarre quantum property called entanglement.

Entanglement happens when the states of two subatomic particles remain connected no matter how far apart they are. Measure or disturb one particle, and the state of its entangled partner is instantly correlated with that outcome.

In the record-breaking experiment, researchers from the University of Geneva, NASA’s Jet Propulsion Laboratory and the National Institute of Standards and Technology used a super-fast laser to pump out photons. Every once in a while, two photons would become entangled. Once the researchers had an entangled pair, they sent one down the optical fiber and stored the other in a crystal at the end of the cable. Then the researchers shot a third particle of light at the photon traveling down the cable. When the two collided, they obliterated each other.

Though both photons vanished, the quantum information from the collision appeared in the crystal that held the second entangled photon.
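For readers who want to see the logic of the protocol itself, the sketch below simulates textbook qubit teleportation with state vectors: an unknown state, a shared entangled pair, a joint measurement on the sender's two qubits, two classical bits, and a correction on the receiver's qubit. It is a generic illustration of the scheme described above, not a model of the photon-and-crystal experiment, and every quantity in it is ordinary numerical simulation.

```python
# Textbook teleportation of one qubit, simulated with state vectors (a generic
# illustration, not a model of the fiber-and-crystal experiment described above).
import numpy as np

rng = np.random.default_rng(7)

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
P0 = np.array([[1, 0], [0, 0]], dtype=complex)      # projector onto |0>
P1 = np.array([[0, 0], [0, 1]], dtype=complex)      # projector onto |1>

def on(qubit, gate):
    """Lift a single-qubit operator onto qubit 0, 1 or 2 of a three-qubit register."""
    ops = [I2, I2, I2]
    ops[qubit] = gate
    return np.kron(np.kron(ops[0], ops[1]), ops[2])

# Random unknown state to teleport, held by qubit 0 (the sender's particle).
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)

# Qubits 1 and 2 share the entangled Bell pair (|00> + |11>) / sqrt(2).
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)

# Sender's joint operation: CNOT (qubit 0 controls qubit 1), then H on qubit 0.
cnot_01 = np.kron(np.kron(P0, I2), I2) + np.kron(np.kron(P1, X), I2)
state = on(0, H) @ (cnot_01 @ state)

# Measure qubits 0 and 1: sample an outcome and collapse the state accordingly.
probs = (np.abs(state.reshape(4, 2)) ** 2).sum(axis=1)   # one row per (m0, m1) outcome
outcome = rng.choice(4, p=probs)
m0, m1 = outcome >> 1, outcome & 1
state = on(0, P1 if m0 else P0) @ on(1, P1 if m1 else P0) @ state
state /= np.linalg.norm(state)

# Receiver fixes up qubit 2 using the two classical measurement bits.
if m1:
    state = on(2, X) @ state
if m0:
    state = on(2, Z) @ state

# Qubit 2 now carries the original state; check the overlap with what was sent.
received = state.reshape(4, 2)[outcome]
fidelity = abs(np.vdot(psi, received)) ** 2
print(f"classical bits sent: {m0}{m1}; fidelity of teleported state: {fidelity:.6f}")
```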

Quantum information has been transferred dozens of miles, but this is the farthest it’s been transported using an optical fiber, and then recorded and stored at the other end. Other quantum teleportation experiments that beamed photons farther used free-space laser links instead of optical fibers to send the information. Unlike the laser method, the optical-fiber method could be used to develop super-secure quantum communication channels, or quantum computers capable of extremely fast computing.

The research was published Sept. 21 in Nature Photonics.

This is a condensed version of a report from LiveScience.