IBM Scientists: Quantum Transport Goes Ballistic – A Q&A with NIST



IBM scientists have, for the first time, shot an electron through a III-V semiconductor nanowire integrated on silicon (Nano Letters, “Ballistic One-Dimensional InAs Nanowire Cross-Junction Interconnects”). This achievement will form the basis for sophisticated quantum wire devices in future integrated circuits for advanced, powerful computational systems. (A Q&A with NIST)

NIST: The title of your paper is “Ballistic One-Dimensional InAs Nanowire Cross-Junction Interconnects.” When I read “ballistic,” rather large missiles come to mind, but here you are doing this at the nanoscale. Can you talk about the challenges this presents?
Johannes Gooth (JG): Yes, this is very similar, but of course at a much different scale. Electrons are fired from one contact electrode and fly through the nanowire without being scattered until they hit the opposing electrode. The nanowire acts as a perfect guide for electrons, so that the full quantum information of the electron (energy, momentum, spin) can be transferred without loss.

IBM scientist Johannes Gooth is focused on nanoscale electronics and quantum physics.


We can now do this in cross-junctions, which allows us to build up networks of electron pipes through which quantum information can be transmitted perfectly. The challenge is to fabricate a geometrically well-defined material at the nanoscale with no scatterers inside. The template-assisted selective epitaxy (TASE) process, which was developed here at the IBM Zurich Lab by my colleagues, makes this possible for the first time.
NIST: How does this research compare to other activities underway elsewhere?
JG: Most importantly, compared to optical and superconducting quantum applications, the technique is scalable and compatible with standard electronics and CMOS processes.
NIST: What role do you see for quantum transport as we look to build a universal quantum computer?
JG: I see quantum transport as an essential piece. If you want to exercise the full power of quantum information technology, you need to connect everything ballistically: a quantum system that is fully ballistically (quantum) connected has an exponentially larger computational state space than a classically connected system.
Also, as stated above, the electronics are scalable. Moreover, combining our nanowire structures with superconductors allows for topologically protected quantum computing, which enables fault-tolerant computation. These are major advantages compared to other techniques.
NIST: How easily can this be manufactured using existing processes and what’s the next step?
JG: This is a major advantage of our technique because our devices are fully integrated into existing CMOS processes and technology.
NIST: What’s next for your research?
JG: The next steps will be the functionalization of the crosses, by means of attaching electronic quantum computational parts. We will start to build superconducting/nanowire hybrid devices for Majorana braiding, and attach quantum dots.
Source: NIST


New Silk Nanosensor Could Speed Development of New Infrastructure, Aerospace and Consumer Materials

(Middle) Mechanophore-labeled silk fiber fluoresces in response to damage or stress. (Right) Control sample without the mechanophore. (Image: Chelsea Davis and Jeremiah Woodcock/NIST)

Posted: Mar 17, 2017

Consumers want fuel-efficient vehicles and high-performance sporting goods, municipalities want weather-resistant bridges, and manufacturers want more efficient ways to make reliable cars and aircraft. 
What’s needed are new lightweight, energy-saving composites that won’t crack or break even after prolonged exposure to environmental or structural stress. 

To help make that possible, researchers working at the National Institute of Standards and Technology (NIST) have developed a way to embed a nanoscale damage-sensing probe into a lightweight composite made of epoxy and silk.

The probe, known as a mechanophore, could speed up product testing and potentially reduce the amount of time and materials needed for the development of many kinds of new composites.

The NIST team created their probe from a dye known as rhodamine spirolactam (RS), which changes from a dark state to a light state in reaction to an applied force. In this experiment, the molecule was attached to silk fibers contained inside an epoxy-based composite. 

As more and more force was applied to the composite, the stress and strain activated the RS, causing it to fluoresce when excited with a laser. Although the change was not visible to the naked eye, a red laser and a microscope designed and built at NIST were used to take photos inside the composite, revealing even the most minute breaks and fissures in its interior and the points where the fiber had fractured.

The results were published today in the journal Advanced Materials Interfaces (“Observation of Interfacial Damage in a Silk-Epoxy Composite, Using a Simple Mechanoresponsive Fluorescent Probe”).

The materials used in the design of composites are diverse. In nature, composites such as crab shell or elephant tusk (bone) are made of proteins and polysaccharides. In this study, epoxy was combined with silk filaments prepared by Professor Fritz Vollrath’s group at Oxford University using Bombyx mori silk worms. Fiber-reinforced polymer composites such as the one used in this study combine the most beneficial aspects of their main components: the strength of the fiber and the toughness of the polymer.

What all composites have in common, though, is the presence of an interface where the components meet. The resilience of that interface is critical to a composite’s ability to withstand damage. Interfaces that are thin but flexible are often favored by designers and manufacturers, but it is very challenging to measure the interfacial properties in a composite.

“There have long been ways to measure the macroscopic properties of composites,” said researcher Jeffrey Gilman, who led the team doing the work at NIST. “But for decades the challenge has been to determine what was happening inside, at the interface.”

One option is optical imaging. However, conventional optical imaging methods can only resolve features down to 200-400 nanometers. Some interfaces are only 10 to 100 nanometers thick, making such techniques ineffective for imaging the interphase in composites.

By installing the RS probe at the interface, the researchers were able to “see” damage exclusively at the interface using optical microscopy.

The NIST research team is planning to expand their research to explore how such probes could be used in other kinds of composites as well. They also would like to use such sensors to enhance the capability of these composites to withstand extreme cold and heat. 
There’s a tremendous demand for composites that can withstand prolonged exposure to water, too, especially for use in building more resilient infrastructure components such as bridges and giant blades for wind turbines.

The research team plans to continue searching for more ways that damage sensors such as the one in this study could be used to improve standards for existing composites and create new standards for the composites of the future, ensuring that those materials are safe, strong and reliable.

“We now have a damage sensor to help optimize the composite for different applications,” Gilman said. “If you attempt a design change, you can figure out if the change you made improved the interface of a composite, or weakened it.”

Source: National Institute of Standards and Technology (NIST)

NIST-made ‘sun and rain’ used to study nanoparticle release from polymers



If the 1967 film “The Graduate” were remade today, Mr. McGuire’s famous advice to young Benjamin Braddock would probably be updated to “Plastics … with nanoparticles.” These days, the mechanical, electrical and durability properties of polymers–the class of materials that includes plastics–are often enhanced by adding miniature particles (smaller than 100 nanometers or billionths of a meter) made of elements such as silicon or silver. But could those nanoparticles be released into the environment after the polymers are exposed to years of sun and water–and if so, what might be the health and ecological consequences?

In a recently published paper, researchers from the National Institute of Standards and Technology (NIST) describe how they subjected a commercial nanoparticle-infused coating to NIST-developed methods for accelerating the effects of weathering from ultraviolet (UV) radiation and simulated washings of rainwater. Their results indicate that humidity and exposure time are contributing factors for nanoparticle release, findings that may be useful in designing future studies to determine potential impacts.

In their recent experiment, the researchers exposed multiple samples of a commercially available polyurethane coating containing silicon dioxide nanoparticles to intense UV radiation for 100 days inside the NIST SPHERE (Simulated Photodegradation via High-Energy Radiant Exposure), a hollow, 2-meter (7-foot) diameter black aluminum chamber lined with highly UV reflective material that bears a casual resemblance to the Death Star in the film “Star Wars.” For this study, one day in the SPHERE was equivalent to 10 to 15 days outdoors. All samples were weathered at a constant temperature of 50 degrees Celsius (122 degrees Fahrenheit) with one group done in extremely dry conditions (approximately 0 percent humidity) and the other in humid conditions (75 percent humidity).

To determine if any nanoparticles were released from the polymer coating during UV exposure, the researchers used a technique they created and dubbed “NIST simulated rain.” Filtered water was converted into tiny droplets, sprayed under pressure onto the individual samples, and then the runoff–with any loose nanoparticles–was collected in a bottle. This procedure was conducted at the beginning of the UV exposure, every two weeks during the weathering run, and at the end. All of the runoff fluids were then analyzed by NIST chemists for the presence of silicon and in what amounts. Additionally, the weathered coatings were examined with atomic force microscopy (AFM) and scanning electron microscopy (SEM) to reveal surface changes resulting from UV exposure.

Both sets of coating samples–those weathered in very low humidity and the others in very humid conditions–degraded but released only small amounts of nanoparticles. The researchers found that more silicon was recovered from the samples weathered in humid conditions and that nanoparticle release increased as the UV exposure time increased. Microscopic examination showed that deformations in the coating surface became more numerous with longer exposure time, and that nanoparticles left behind after the coating degraded often bound together in clusters.

“These data, and the data from future experiments of this type, are valuable for developing computer models to predict the long-term release of nanoparticles from commercial coatings used outdoors, and in turn, help manufacturers, regulatory officials and others assess any health and environmental impacts from them,” said NIST research chemist Deborah Jacobs, lead author on the study published in the Journal of Coatings Technology and Research.

This project resulted from a collaboration between NIST’s Engineering Laboratory and Material Measurement Laboratory. It is part of NIST’s work to help characterize the potential environmental, health and safety (EHS) risks of nanomaterials, and develop methods for identifying and measuring them.


D.S. Jacobs, S-R. Huang, Y-L. Cheng, S.A. Rabb, J.M. Gorham, P.J. Krommenhoek, L.L. Yu, T. Nguyen and L. Sung. Surface degradation and nanoparticle release of a commercial nanosilica/polyurethane coating under UV exposure. Journal of Coatings Technology and Research, September 2016. DOI: 10.1007/s11998-016-9796-2

Inventing a Fleet and Fast Test for Nanomanufacturing Quality Control

Measurements of electrical properties of a plastic tape (yellow), taken using a specially designed microwave cavity (the white cylinder at center) and accompanying electrical circuit, change quickly and consistently in response to changes in the tape’s thickness. The setup is inspired by high-volume roll-to-roll manufacturing devices used to mass-produce nanomaterials. The changes in the tape’s thickness spell NIST in Morse code.
Credit: NIST/Nathan Orloff

Manufacturers may soon have a speedy and nondestructive way to test a wide array of materials under real-world conditions, thanks to an advance that researchers at the National Institute of Standards and Technology (NIST) have made in roll-to-roll measurements. Roll-to-roll measurements, which are typically optical, are made during roll-to-roll manufacturing: any method that uses conveyor belts for continuous processing of items, from tires to nanotechnology components.

In order for new materials such as carbon nanotubes and graphene to play an increasingly important role in electronic devices, high-tech composites and other applications, manufacturers will need quality-control tests to ensure that products have desired characteristics, and lack flaws. Current test procedures often require cutting, scratching or otherwise touching a product, which slows the manufacturing process and can damage or even destroy the sample being tested.

To add to existing non-contact testing methods, NIST physicists Nathan Orloff, Christian Long and Jan Obrzut measured properties of films by passing them through a specially designed metal box known as a microwave cavity. Electromagnetic waves build up inside the cavity at a specific “resonance” frequency determined by the box’s size and shape, similar to how a guitar string vibrates at a specific pitch depending on its length and tension. When an object is placed inside the cavity, the resonance frequency changes in a way that depends on the object’s size, electrical resistance and dielectric constant, a measure of an object’s ability to store energy in an electric field. The frequency change is reminiscent of how shortening or tightening a guitar string makes it resonate at a higher pitch, says Orloff.
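The cavity-perturbation idea can be sketched numerically. The minimal first-order model below, and every number in it (cavity frequency, cavity volume, tape dimensions), is an illustrative assumption, not the NIST team's actual analysis:

```python
# Illustrative cavity-perturbation estimate (assumed numbers throughout).
# For a small dielectric sample near the E-field maximum of a cavity, a
# common first-order approximation for the downward frequency shift is:
#     (f0 - fs) / f0  ~  (eps_r - 1) / 2 * (V_sample / V_cavity)

def resonance_shift(f0_hz, eps_r, v_sample, v_cavity):
    """Return the perturbed resonance frequency in Hz."""
    fractional_shift = (eps_r - 1) / 2 * (v_sample / v_cavity)
    return f0_hz * (1 - fractional_shift)

f0 = 7.4e9           # empty-cavity resonance, Hz (assumed)
eps_polyimide = 3.5  # typical dielectric constant of polyimide
v_cav = 1.0e-4       # cavity volume, m^3 (assumed)

# Thicker tape -> more displaced field volume -> larger downward shift.
for thickness_um in (25, 50):
    v_s = 0.05 * 0.01 * thickness_um * 1e-6  # 5 cm x 1 cm strip, m^3
    fs = resonance_shift(f0, eps_polyimide, v_s, v_cav)
    print(f"{thickness_um} um tape: f = {fs / 1e9:.6f} GHz")
```

In this approximation a thicker tape pulls the resonance further down, which is what lets thickness changes register in the frequency record as the tape rolls through.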

The researchers also built an electrical circuit to measure these changes. They first tested their device by running a strip of plastic tape known as polyimide through the cavity, using a roll-to-roll setup resembling the high-volume devices used to mass-produce nanomaterials. As the tape’s thickness increased and decreased–the researchers made the changes in thickness spell “NIST” in Morse code–the cavity’s resonant frequency changed in tandem. So did another parameter called the “quality factor,” the ratio of the energy stored in the cavity to the energy lost per frequency cycle. Because polyimide’s electrical properties are well known, a manufacturer could use the cavity measurements to monitor whether tape is coming off the production line at a consistent thickness, and even feed back information from the measurements to control the thickness.

Alternatively, a manufacturer could use the new method to monitor the electrical properties of a less well-characterized material of known dimensions. Orloff and Long demonstrated this by passing 12- and 15-centimeter-long films of carbon nanotubes deposited on sheets of plastic through the cavity and measuring the films’ electrical resistance. The entire process took “less than a second,” says Orloff. He added that with industry-standard equipment, the measurements could be taken at speeds beyond 10 meters per second, more than enough for many present-day manufacturing operations.

The new method has several advantages for a thin-film manufacturer, says Orloff. One, “You can measure the entire thing, not just a small sample,” he said. Such real-time measurements could be used to tune the manufacturing process without shutting it down, or to discard a faulty batch of product before it gets out the factory door. “This method could significantly boost prospects of not making a faulty batch in the first place,” Long noted.

And because the method is nondestructive, Orloff added, “If a batch passes the test, manufacturers can sell it.”

Films of carbon nanotubes and graphene are just starting to be manufactured in bulk for potential applications such as composite airplane materials, smartphone screens and wearable electronic devices.

Orloff, Long and Obrzut submitted a patent application for this technique in December 2015.

A producer of such materials has already expressed interest in the new method, said Orloff. “They’re really excited about it.” He added that the method is not specific to nanomanufacturing, and with a properly designed cavity, could also help with quality control of many other kinds of products, including tires, pharmaceuticals and even beer.

Story Source:

The above post is reprinted from materials provided by National Institute of Standards and Technology (NIST). Note: Materials may be edited for content and length.

Journal Reference:

  1. Nathan D. Orloff, Christian J. Long, Jan Obrzut, Laurent Maillaud, Francesca Mirri, Thomas P. Kole, Robert D. McMichael, Matteo Pasquali, Stephan J. Stranick, J. Alexander Liddle. Noncontact conductivity and dielectric measurement for high throughput roll-to-roll nanomanufacturing. Scientific Reports, 2015; 5: 17019 DOI: 10.1038/srep17019

NIST Method for Spotting Quantum Dots Could Help Make High-Performance Nanophotonic Devices

Life may be as unpredictable as a box of chocolates, but ideally, you always know what you’re going to get from a quantum dot. A quantum dot should produce one, and only one, photon—the smallest constituent of light—each time it is energized. This characteristic makes it attractive for use in various quantum technologies such as secure communications. Oftentimes, however, the trick is in finding the dots.

[Clockwise from top left] Circular grating for extracting single photons from a quantum dot. For optimal performance, the quantum dot must be located at the center of the grating. Image taken with the camera-based optical location technique. A single quantum dot appears as a bright spot within an area defined by four alignment marks. Electron-beam lithography is used to define a circular grating at the quantum dot’s location. Image of the emission of the quantum dot within the grating. The bright spot appears in the center of the device, as desired.

Credit: NIST

“Self-assembled, epitaxially grown” quantum dots have the highest optical quality. They randomly emerge (self-assemble) at the interface between two layers of a semiconductor crystal as it is built up layer-by-layer (epitaxially grown).

They grow randomly, but in order for the dots to be useful, they need to be located in a precise relation to some other photonic structure, be it a grating, resonator or waveguide, that can control the photons that the quantum dot generates. However, finding the dots—they’re just about 10 nanometers across—is no small feat.

Always up for a challenge, researchers working at the National Institute of Standards and Technology (NIST) have developed a simple new technique for locating them, and used it to create high-performance single photon sources.

This new development, which appeared in Nature Communications,* may make the manufacture of high-performance photonic devices using quantum dots much more efficient. Such devices are usually made in regular arrays using standard nanofabrication techniques for the control structures. However, because the dots are randomly distributed, only a small percentage of them line up correctly with the control structures, so the process yields very few working devices.

“This is a first step towards providing accurate location information for the manufacture of high performance quantum dot devices,” says NIST physicist Kartik Srinivasan. “So far, the general approach has been statistical—make a lot of devices and end up with a small fraction that work. Our camera-based imaging technique maps the location of the quantum dots first, and then uses that knowledge to build optimized light-control devices in the right place.”

According to co-lead researcher Luca Sapienza of the University of Southampton in the United Kingdom, the new technique is sort of a twist on a red-eye reducing camera flash, where the first flash causes the subject’s pupils to close and the second illuminates the scene. Instead of a xenon-powered flash, the NIST team uses two LEDs.

In their setup, one LED activates the quantum dots when it flashes (so the LED gives the quantum dots red-eye). At the same time, a second, different color LED flash illuminates metallic orientation marks placed on the surface of the semiconductor wafer the dots are embedded in. Then a sensitive camera snaps a 100-micrometer by 100-micrometer picture.

By cross-referencing the glowing dots with the orientation marks, the researchers can determine the dots’ locations with an uncertainty of less than 30 nanometers. With the coordinates in hand, scientists can then tell the computer-controlled electron-beam lithography tool to place the control structures in the correct places, with the result being many more usable devices.
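As a rough illustration of the cross-referencing step, the toy sketch below finds the intensity-weighted centroid of a bright spot and maps camera pixels into wafer coordinates with a least-squares affine fit to the alignment marks. All numbers are hypothetical, and the published analysis is more sophisticated than this:

```python
# Toy sketch of dot localization via alignment marks (hypothetical data).
import numpy as np

def centroid(image):
    """Intensity-weighted centroid (row, col) of an image patch."""
    img = np.asarray(image, dtype=float)
    rows, cols = np.indices(img.shape)
    total = img.sum()
    return (rows * img).sum() / total, (cols * img).sum() / total

def pixel_to_wafer(px, marks_px, marks_um):
    """Least-squares affine map from pixel coordinates to wafer
    coordinates (um), using >= 3 alignment marks seen in the image."""
    A = np.hstack([marks_px, np.ones((len(marks_px), 1))])
    coeffs, *_ = np.linalg.lstsq(A, marks_um, rcond=None)
    return np.array([*px, 1.0]) @ coeffs

# Toy data: a 5x5 patch with one bright pixel, four marks at corners.
patch = np.zeros((5, 5))
patch[2, 3] = 10.0
marks_px = np.array([[0, 0], [0, 99], [99, 0], [99, 99]], dtype=float)
marks_um = marks_px.copy()  # assume 1 px = 1 um for this toy example

print(centroid(patch))                              # dot centroid in pixels
print(pixel_to_wafer((50, 50), marks_px, marks_um))  # mapped to wafer frame
```

In practice sub-pixel centroiding of both the dot and the marks is what pushes the uncertainty below the camera's pixel size.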

Using this technique, the researchers demonstrated grating-based single photon sources in which they were able to collect 50 percent of the quantum dot’s emitted photons, the theoretical limit for this type of structure.

They also demonstrated that more than 99 percent of the light produced from their source came out as single photons. Such high purity is partly due to the fact that the location technique helps the researchers to quickly survey the wafer (10,000 square micrometers at a time) to find regions where the quantum dot density is especially low, only about one per 1,000 square micrometers. This makes it far more likely that each grating device contains one—and only one—quantum dot.
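The benefit of a sparse region can be made concrete with Poisson statistics: if dots self-assemble at random positions, then once a grating is centered on a chosen dot, the probability that no second dot lands inside the device footprint is exp(-lambda). The footprint area below is an assumed number for illustration only:

```python
# Why low dot density matters, sketched with Poisson statistics.
# The device footprint area is an assumption for illustration.
from math import exp

def prob_no_extra_dot(density_per_um2, area_um2):
    """Probability that no *second* dot falls inside the footprint,
    given random (Poisson) dot positions: P(N = 0) = exp(-lambda)."""
    return exp(-density_per_um2 * area_um2)

area = 10.0  # assumed grating-center footprint, um^2
sparse = prob_no_extra_dot(1 / 1000, area)  # ~1 dot per 1,000 um^2
dense = prob_no_extra_dot(1 / 10, area)     # a 100x denser region

print(f"sparse region: P(no extra dot) = {sparse:.3f}")
print(f"dense region:  P(no extra dot) = {dense:.3f}")
```

With the article's quoted density of about one dot per 1,000 square micrometers, almost every grating built on a located dot contains that dot alone; at 100 times the density, most devices would pick up unwanted neighbors.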

This work was performed in part at NIST’s Center for Nanoscale Science and Technology (CNST), a national user facility available to researchers from industry, academia and government. In addition to NIST and the University of Southampton, researchers from the University of Rochester contributed to this work.

* L. Sapienza, M. Davanço, A. Badolato and K. Srinivasan. Nanoscale optical positioning of single quantum dots for bright and pure single-photon emission. Nature Communications 6, 7833 (2015). DOI: 10.1038/ncomms8833. Published 27 July 2015.

NIST: Reducing the High Costs of Hydrogen (Fuel) Pipelines

The National Institute of Standards and Technology (NIST) has put firm numbers on the high costs of installing pipelines to transport hydrogen fuel–and also found a way to reduce those costs.

Samples of pipeline steel instrumented for fatigue testing in a pressurized hydrogen chamber (the vertical tube). NIST researchers used data from such tests to develop a model for hydrogen effects on pipeline lifetime, to support a federal effort to reduce overall costs of hydrogen fuel. (Image: NIST)
Pipelines to carry hydrogen cost more than other gas pipelines because of the measures required to combat the damage hydrogen does to steel’s mechanical properties over time. NIST researchers calculated that hydrogen-specific steel pipelines can cost as much as 68 percent more than natural gas pipelines, depending on pipe diameter and operating pressure.* By contrast, a widely used cost model** suggests a cost penalty of only about 10 percent.

But the good news, according to the new NIST study, is that hydrogen transport costs could be reduced for most pipeline sizes and pressures by modifying industry codes*** to allow the use of a higher-strength grade of steel alloy without requiring thicker pipe walls. The stronger steel is more expensive, but dropping the requirement for thicker walls would reduce materials use and related welding and labor costs, resulting in a net cost reduction. The code modifications, which NIST has proposed to the American Society of Mechanical Engineers (ASME), would not lower pipeline performance or safety, the NIST authors say.

“The cost savings comes from using less–because of thinner walls–of the more expensive material,” says NIST materials scientist James Fekete, a co-author of the study. “The current code does not allow you to reduce thickness when using higher-strength material, so costs would increase. With the proposed code, in most cases, you can get a net savings with a thinner pipe wall, because the net reduction in material exceeds the higher cost per unit weight.”

The NIST study is part of a federal effort to reduce the overall costs of hydrogen fuel, which is renewable, nontoxic and produces no harmful emissions. Much of the cost is for distribution, which likely would be most economical by pipeline. The U.S. contains more than 300,000 miles of pipelines for natural gas but very little customized for hydrogen. Existing codes for hydrogen pipelines are based on decades-old data. NIST researchers are studying hydrogen’s effects on steel to find ways to reduce pipeline costs without compromising safety or performance.

As an example, the new code would allow a 24-inch pipe made of high-strength X70 steel to be manufactured with a thickness of 0.375 inches for transporting hydrogen gas at 1500 pounds per square inch (psi). (In line with industry practice, ASME pipeline standards are expressed in customary units.) According to the new NIST study, this would reduce costs by 31 percent compared to the baseline X52 steel with a thickness of 0.562 inches, as required by the current code. In addition, thanks to its higher strength, X70 would make it possible to safely transport hydrogen through bigger pipelines at higher pressure (36-inch diameter pipe to transport hydrogen at 1500 psi) than is allowed with X52, enabling transport and storage of greater fuel volumes. This diameter-pressure combination is not possible under the current code.
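A back-of-the-envelope version of this trade-off uses Barlow's formula for pipe wall thickness, t = PD/(2SF), where S is the steel's specified minimum yield strength and F a design factor. The factor below is chosen only so the X52 case lands near the article's 0.562-inch baseline; the real ASME B31.12 calculation also applies hydrogen material-performance factors, which is why the code permits 0.375 inches for X70 rather than the simple scaling shown here:

```python
# Rough illustration (not the ASME B31.12 method) of why a
# higher-strength steel grade permits a thinner pipe wall.

def wall_thickness(p_psi, d_in, smys_psi, design_factor):
    """Barlow's formula: t = P * D / (2 * S * F), in inches."""
    return p_psi * d_in / (2 * smys_psi * design_factor)

P, D = 1500, 24   # operating pressure (psi) and pipe diameter (in)
F = 0.62          # assumed combined design factor, chosen to roughly
                  # reproduce the article's X52 baseline of 0.562 in

t_x52 = wall_thickness(P, D, 52_000, F)  # X52: 52,000 psi yield strength
t_x70 = wall_thickness(P, D, 70_000, F)  # X70: 70,000 psi yield strength
print(f"X52 wall: {t_x52:.3f} in, X70 wall: {t_x70:.3f} in")
```

Even this crude sketch shows the direction of the savings: at fixed pressure and diameter, wall thickness scales inversely with yield strength, so the stronger grade needs less steel per foot of pipe.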

The proposed code modifications were developed through research into the fatigue properties of high-strength steel at NIST’s Hydrogen Pipeline Material Testing Facility. In actual use, pipelines are subjected to cycles of pressurization at stresses far below the failure point, but high enough to result in fatigue damage. Unfortunately, it is difficult and expensive to determine steel fatigue properties in pressurized hydrogen. As a result, industry has historically used tension testing data as the basis for pipeline design, and higher-strength steels lose ductility in such tests in pressurized hydrogen. But this type of testing, which involves steadily increasing stress to the failure point, does not predict fatigue performance in hydrogen pipeline materials, Fekete says.
NIST research has shown that under realistic conditions, steel alloys with higher strengths (such as X70) do not have higher fatigue crack growth rates than lower grades (X52). The data have been used to develop a model**** for hydrogen effects on pipeline steel fatigue crack growth, which can predict pipeline lifetime based on operating conditions.
* J.W. Sowards, J.R. Fekete and R.L. Amaro. Economic impact of applying high strength steels in hydrogen gas pipelines. International Journal of Hydrogen Energy. 2015. In press, corrected proof available online. DOI:10.1016/j.ijhydene.2015.06.090
** DOE H2A Delivery Analysis. U.S. Department of Energy. Available online.
*** ASME B31.12 Hydrogen Piping and Pipeline Code (ASME B31.12). Industry groups such as ASME commonly rely on NIST data in developing codes.
**** R.L. Amaro, N. Rustagi, K.O. Findley, E.S. Drexler and A.J. Slifka. Modeling the fatigue crack growth of X100 pipeline steel in gaseous hydrogen. Int. J. Fatigue, 59 (2014). pp 262-271.
Source: NIST

NIST’s ‘Nano-Raspberries’ Could Bear Fruit in Fuel Cells

Researchers at the National Institute of Standards and Technology (NIST) have developed a fast, simple process for making platinum “nano-raspberries”—microscopic clusters of nanoscale particles of the precious metal. The berry-like shape is significant because it has a high surface area, which is helpful in the design of catalysts. Even better news for industrial chemists: the researchers figured out when and why the berry clusters clump into larger bunches of “nano-grapes.”

Colorized micrographs of platinum nanoparticles made at NIST. The raspberry color suggests the particles’ corrugated shape, which offers high surface area for catalyzing reactions in fuel cells. Individual particles are 3-4 nm in diameter but can clump into bunches of 100 nm or more under specific conditions discovered in a NIST study.
Credit: Curtin/NIST

The research could help make fuel cells more practical. Nanoparticles can act as catalysts to help convert methanol to electricity in fuel cells. NIST’s 40-minute process for making nano-raspberries, described in a new paper,* has several advantages. The high surface area of the berries encourages efficient reactions. In addition, the NIST process uses water, a benign or “green” solvent. And the bunches catalyze methanol reactions consistently and are stable at room temperature for at least eight weeks.

Although the berries were made of platinum, the metal is expensive and was used only as a model. The study will actually help guide the search for alternative catalyst materials, and clumping behavior in solvents is a key issue. For fuel cells, nanoparticles often are mixed with solvents to bind them to an electrode. To learn how such formulas affect particle properties, the NIST team measured particle clumping in four different solvents for the first time. For applications such as liquid methanol fuel cells, catalyst particles should remain separated and dispersed in the liquid, not clumped.

“Our innovation has little to do with the platinum and everything to do with how new materials are tested in the laboratory,” project leader Kavita Jeerage says. “Our critical contribution is that after you make a new material you need to make choices. Our paper is about one choice: what solvent to use. We made the particles in water and tested whether you could put them in other solvents. We found out that this choice is a big deal.”

The NIST team measured conditions under which platinum particles, ranging in size from 3 to 4 nanometers (nm) in diameter, agglomerated into bunches 100 nm wide or larger. They found that clumping depends on the electrical properties of the solvent. The raspberries form bigger bunches of grapes in solvents that are less “polar,” that is, where solvent molecules lack regions with strongly positive or negative charges. (Water is a strongly polar molecule.)

The researchers expected that. What they didn’t expect is that the trend doesn’t scale in a predictable way. The four solvents studied were water, methanol, ethanol and isopropanol, ordered by decreasing polarity. There wasn’t much agglomeration in methanol; bunches got about 30 percent bigger than they were in water. But in ethanol and isopropanol, the clumps got 400 percent and 600 percent bigger, respectively. Such large agglomerates make for very poor suspension quality for catalytic purposes.

Because the nanoparticles clumped up slowly and not too much in methanol, the researchers concluded that the particles could be transferred to that solvent, assuming they were to be used within a few days—effectively putting an expiration date on the catalyst.

Two college students in NIST’s Summer Undergraduate Research Fellowship (SURF) program helped with the extensive data collection required for the study.

* I. Sriram, A.E. Curtin, A.N. Chiaramonti, J.H. Cuchiaro, A.D. Weidner, T.M. Tingley, L.F. Greenlee and K.M. Jeerage. Stability and phase transfer of catalytically active platinum nanoparticle suspensions. Journal of Nanoparticle Research 17:230. DOI: 10.1007/s11051-015-3034-1. Published online May 22, 2015.

Scientists Are Beaming Over Quantum Teleportation Record: Video

A new distance record has been set in the strange world of quantum teleportation.

In the record-setting experiment, the quantum spin of a light particle was teleported 15.5 miles (25 kilometers) across an optical fiber, making the operation the farthest quantum feat of its kind.

About five years ago, researchers could teleport quantum information, such as the direction in which a particle is spinning, across only a few meters. Now they can beam that information across several miles.

Quantum teleportation doesn’t mean it’s possible to beam someone aboard a spacecraft, as in “Star Trek.” Physicists can’t transport matter itself, but they can transfer quantum information from one particle to another through quantum teleportation. This works thanks to a bizarre quantum property called entanglement.

Entanglement happens when the states of two subatomic particles remain linked no matter how far apart the particles are: a measurement on one particle is immediately reflected in the state of its entangled partner.

In the record-breaking experiment, researchers from the University of Geneva, NASA’s Jet Propulsion Laboratory and the National Institute of Standards and Technology used a super-fast laser to pump out photons. Every once in a while, two photons would become entangled. Once the researchers had an entangled pair, they sent one down the optical fiber and stored the other in a crystal at the end of the cable. Then the researchers shot a third particle of light at the photon traveling down the cable. When the two collided, they obliterated each other.

Though both photons vanished, the quantum information from the collision appeared in the crystal that held the second entangled photon.
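The experiment itself used photons, crystals and optical fiber, but the underlying logic is the textbook teleportation protocol, which can be sketched with plain linear algebra. The toy numpy simulation below (an illustrative sketch, not the Geneva group's actual setup) teleports an arbitrary qubit state: qubit 0 holds the state to send, qubits 1 and 2 play the roles of the entangled pair, and the "collision" in the story corresponds to the Bell-state measurement on qubits 0 and 1.

```python
import numpy as np

# Single-qubit gates
I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def kron3(a, b, c):
    """Tensor product of three single-qubit operators."""
    return np.kron(np.kron(a, b), c)

def cnot(control, target, n=3):
    """CNOT on an n-qubit register (qubit 0 is the most significant bit)."""
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (n - 1 - k)) & 1 for k in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(bit << (n - 1 - k) for k, bit in enumerate(bits))
        U[j, i] = 1.0
    return U

# State to teleport on qubit 0: |psi> = a|0> + b|1>
a, b = 0.6, 0.8
psi = np.array([a, b])

# Full register: |psi> on qubit 0, qubits 1 and 2 start in |00>
state = np.kron(psi, np.array([1.0, 0, 0, 0]))
state = kron3(I, H, I) @ state   # Hadamard on qubit 1 ...
state = cnot(1, 2) @ state       # ... then CNOT makes qubits 1, 2 a Bell pair

# Sender's Bell-measurement circuit on qubits 0 and 1
state = cnot(0, 1) @ state
state = kron3(H, I, I) @ state

# Measure qubits 0 and 1: pick a 2-bit outcome with the right probability
p = np.array([np.sum(np.abs(state[2 * m: 2 * m + 2]) ** 2) for m in range(4)])
m = np.random.choice(4, p=p / p.sum())
m0, m1 = m >> 1, m & 1

# Qubit 2 collapses to a state that depends on the measured outcome
bob = state[2 * m: 2 * m + 2] / np.sqrt(p[m])

# Receiver applies the classically communicated corrections
if m1:
    bob = X @ bob
if m0:
    bob = Z @ bob

print(np.round(bob, 6))  # matches psi = [0.6, 0.8] for every outcome
```

Note that the receiver cannot finish until the two measurement bits arrive over an ordinary classical channel, which is why teleportation transfers quantum states without transmitting information faster than light.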

Quantum information has been transferred dozens of miles, but this is the farthest it’s been transported using an optical fiber, and then recorded and stored at the other end. Other quantum teleportation experiments that beamed photons farther used lasers instead of optical fibers to send the information. Unlike the laser method, the optical-fiber method could be used to develop super-secure quantum communication channels, or quantum computers that are capable of extremely fast computing.

The research was published Sept. 21 in Nature Photonics.

This is a condensed version of a report from LiveScience.

NIST Study Suggests Light May Be Skewing Lab Tests on Nanoparticles’ Health Effects

Truth shines a light into dark places. But sometimes to find that truth in the first place, it’s better to stay in the dark. That’s what recent findings* at the National Institute of Standards and Technology (NIST) show about methods for testing the safety of nanoparticles. It turns out that previous tests indicating that some nanoparticles can damage our DNA may have been skewed by inadvertent light exposure in the lab.

Nanoparticles made of titanium dioxide are a common ingredient in paint, and they also are considered safe for use both on the body (in sunscreen, where they help block ultraviolet light) and even within it (in foodstuffs such as salad dressings to make them appear whiter). It is well known that in the presence of light and water, these particles can form dangerous, highly reactive chemicals called free radicals that can damage DNA. Because light does not reach the human body’s interior, scientists have long accepted that these nanoparticles would not damage cells by forming free radicals from light activation.

Titanium dioxide nanoparticles are widely used not only in paints but in sunscreen and even salad dressing.
Credit: © GoodMood, Photo-Gabriele

However, some recent studies using cells suggest that titanium dioxide can damage DNA even in darkness—a disturbing possibility. Because such findings could have major health implications, the NIST team set out to determine whether light was indeed required for the nanoparticles to cause DNA damage.

“We didn’t set out to test the safety of the particles themselves—that’s for someone else to determine,” says NIST’s Elijah Petersen. “Our main concern is to ensure that scientists have enough knowledge to make accurate measurements. That way, tests will give accurate representations of reality.”

The NIST team exposed samples of DNA to titanium dioxide nanoparticles under three different conditions: some samples were exposed in the presence of visible light, some in the presence of ultraviolet light, and others were kept carefully and intentionally in complete darkness from the moment of exposure to the time the DNA damage was measured. The team found that only when exposed to laboratory or ultraviolet light did the DNA form base lesions, a form of DNA damage associated with attack by radicals. Their conclusion? The culprit in earlier studies may be ambient light from the laboratory that inadvertently caused DNA damage.

“The results suggest that titanium dioxide nanoparticles do not damage DNA when kept in the dark,” Petersen says. “These findings show that experimental conditions, such as lighting, must be carefully controlled before drawing conclusions about nanoparticle effects on DNA.”

* E.J. Petersen, V. Reipa, S.S. Watson, D.L. Stanley, S.A. Rabb and B.C. Nelson. The DNA damaging potential of photoactivated P25 titanium dioxide nanoparticles. Chemical Research in Toxicology, October 2014 issue. DOI: 10.1021/tx500340v.

UMD and NIST Announce the Creation of the Joint Center for Quantum Information and Computer Science

Center researchers aim to understand how quantum systems can store, transport and process information.

The University of Maryland (UMD) and the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) announced today the creation of the Joint Center for Quantum Information and Computer Science (QuICS), with the support and participation of the Research Directorate of the National Security Agency/Central Security Service (NSA/CSS). Scientists at the center will conduct basic research to understand how quantum systems can be effectively used to store, transport and process information.

This new center complements the fundamental quantum research performed at the Joint Quantum Institute (JQI), which was established in 2006 by UMD, NIST and the NSA. Focusing on one of JQI’s original objectives, to fully understand quantum information, QuICS will bring together computer scientists—who have expertise in algorithms, computational complexity theory and computer architecture—with quantum information and communications scientists.

“This new endeavor builds on an already successful and fruitful collaboration at JQI,” said Acting Under Secretary of Commerce for Standards and Technology and Acting Director of NIST Willie May. “The new center will be a venue for groundbreaking basic research that will help to build our capacity for quantum research and train the next generation of researchers.”

UMD and NIST have a shared history of collaboration and cooperation in education, research and public service. They have long cooperated in building collaborative research consortia and programs that have resulted in extensive personal, professional and institutional relationships.

“By deepening our partnership with NIST, we now have all the ingredients in place to make major advances in quantum science,” said UMD President Wallace Loh. “This superb, world-class quantum program will team some of the best minds in physics, computer science and engineering to overcome the limitations of current computing systems.”

Dianne O’Leary, Distinguished University Professor Emerita in computer science at UMD, and Jacob Taylor, a NIST physicist and JQI Fellow, will serve as co-directors of the new center. Like the JQI, QuICS will be located on the UMD campus in College Park, Md.

The capabilities of today’s embedded and high-performance computer architectures have limited advances in critical areas, such as modeling the physical world, improving sensors and securing communications. Quantum computing could enable us to break through some of these barriers.

QuICS’ objectives will be to:

  • Develop a world-class research center that will build the scientific foundation for quantum information science to enable understanding of the relationships between information theory, computational complexity theory and nature, as well as the advances in computer science necessary to support potential quantum computing and communication devices and systems;
  • Maintain and enhance the nation’s leading role in quantum information science by expanding an already-powerful collaboration between UMD, NIST and NSA/CSS; and
  • Establish a unique, interdisciplinary center for the interchange of ideas among computer scientists, physicists and quantum information researchers.

Some of the topics QuICS researchers will initially examine include understanding how quantum mechanics informs computation and communication theories, determining what insights computer science can shed on quantum computing, investigating the consequences of quantum information theory for fundamental physics, and developing practical applications for theoretical advances in quantum computation and communication.

QuICS is expected to train scientists for future industrial and academic opportunities and provide U.S. industry with cutting-edge research results. By combining the strengths of UMD and NIST, QuICS will become an international center for excellence in quantum computer and information science.

QuICS will be the newest of 16 centers and labs within the University of Maryland Institute for Advanced Computer Studies (UMIACS). The center will bring together researchers from UMIACS; the UMD Departments of Physics and Computer Science; and the UMD Applied Mathematics & Statistics, and Scientific Computation program with NIST’s Information Technology and Physical Measurement laboratories.

About the University of Maryland

The University of Maryland is home to three quantum science research centers: the Joint Center for Quantum Information and Computer Science, the Joint Quantum Institute, and the Quantum Engineering Center. UMD has nation-leading computer science, physics and math departments, with particular strengths in the areas relevant to quantum science research.

In the 2015 Best Graduate Schools ranking by U.S. News & World Report, UMD’s Department of Physics ranked 14th, the Department of Computer Science ranked 15th, and the Department of Mathematics ranked 17th. The atomic/molecular/optical physics specialty ranked 6th, the quantum physics specialty ranked 8th, and the applied math specialty ranked 10th. Visit UMD’s website to learn more.

About NIST

As a non-regulatory agency of the U.S. Department of Commerce, NIST promotes U.S. innovation and industrial competitiveness by advancing measurement science, standards and technology in ways that enhance economic security and improve our quality of life. Visit NIST’s website for more information.