Advance could aid development of nanoscale biosensors


Imagine a hand-held environmental sensor that can instantly test water for lead, E. coli, and pesticides all at the same time, or a biosensor that can perform a complete blood workup from just a single drop. That’s the promise of nanoscale plasmonic interferometry, a technique that combines nanotechnology with plasmonics—the interaction between electrons in a metal and light.

Now researchers from Brown University’s School of Engineering have made an important fundamental advance that could make such devices more practical. The research team has developed a technique that eliminates the need for highly specialized external sources that deliver coherent light, which the technique normally requires. The advance could enable more versatile and more compact devices.

“It has always been assumed that coherent light was necessary for plasmonic interferometry,” said Domenico Pacifici, a professor of engineering who oversaw the work with his postdoctoral researcher Dongfang Li, and graduate student Jing Feng. “But we were able to disprove that assumption.”

The research is described in Scientific Reports.

Plasmonic interferometers make use of the interaction between light and surface plasmon polaritons, density waves created when light energy rattles free electrons in a metal. One type of interferometer looks like a bull’s-eye structure etched into a thin layer of metal. In the center is a hole poked through the metal layer with a diameter of about 300 nanometers—about 1,000 times smaller than the diameter of a human hair. The hole is encircled by a series of etched grooves, with diameters of a few micrometers. Thousands of these bull’s-eyes can be placed on a chip the size of a fingernail.

When light from an external source is shone onto the surface of an interferometer, some of the photons go through the central hole, while others are scattered by the grooves. Those scattered photons generate surface plasmons, which propagate through the metal inward toward the hole, where they interact with photons passing through the hole. That creates an interference pattern in the light emitted from the hole, which can be recorded by a detector beneath the metal surface.

When a liquid is deposited on top of an interferometer, the light and the surface plasmons propagate through that liquid before they interfere with each other. That alters the interference patterns picked up by the detector depending on the chemical makeup of the liquid or compounds present in it. By using different sizes of groove rings around the hole, the interferometers can be tuned to detect the signature of specific compounds or molecules. With the ability to put many differently tuned interferometers on one chip, engineers can hypothetically make a versatile detector.
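The sensing principle described above can be sketched with a simple two-beam interference model. The function and every numerical value below (refractive indices, groove radius, wavelength, intensities) are illustrative assumptions, not parameters from the Brown device:

```python
import math

def interference_intensity(n_sample, groove_radius_um, wavelength_um,
                           i_ref=1.0, i_plasmon=0.5):
    """Two-beam interference of light passing directly through the hole
    and a surface plasmon launched at a groove some distance away.

    The phase difference grows with the optical path length, which depends
    on the refractive index of the liquid sitting on the sensor surface.
    """
    # Optical path of the plasmon from groove to hole, through the sample medium
    delta_phi = 2 * math.pi * n_sample * groove_radius_um / wavelength_um
    # Standard two-beam interference formula
    return i_ref + i_plasmon + 2 * math.sqrt(i_ref * i_plasmon) * math.cos(delta_phi)

# A small change in refractive index (e.g. a contaminant in water) shifts the pattern
water = interference_intensity(n_sample=1.333, groove_radius_um=5.0, wavelength_um=0.633)
contaminated = interference_intensity(n_sample=1.340, groove_radius_um=5.0, wavelength_um=0.633)
print(abs(contaminated - water) > 1e-3)  # a detectable intensity shift
```

In practice each differently sized groove ring maps a particular refractive-index change onto a different part of the interference fringe, which is what allows a chip full of differently tuned bull's-eyes to distinguish compounds.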

Up to now, all plasmonic interferometers have required the use of highly specialized external light sources that can deliver coherent light—beams in which light waves are parallel, have the same wavelength, and travel in-phase (meaning the peaks and valleys of the waves are aligned). Without coherent light sources, the interferometers cannot produce usable interference patterns. Those kinds of light sources, however, tend to be bulky, expensive, and require careful alignment and periodic recalibration to obtain a reliable optical response.

But Pacifici and his group have come up with a way to eliminate the need for external coherent light sources. In the new method, fluorescent light-emitting atoms are integrated directly within the tiny hole in the center of the interferometer. An external light source is still necessary to excite the internal emitters, but it need not be a specialized coherent source.

“This is a whole new concept for optical interferometry,” Pacifici said, “an entirely new device.”

In this new device, incoherent light shone on the interferometer causes the fluorescent atoms inside the center hole to generate surface plasmons. Those plasmons propagate outward from the hole, bounce off the groove rings, and then propagate back toward the hole. Once a plasmon propagates back, it interacts with the atom that released it, causing interference with the directly transmitted photon. Because the emission of a photon and the generation of a plasmon are indistinguishable alternative paths originating from the same emitter, the process is naturally coherent, and interference can therefore occur even though the emitters are excited incoherently.

“The important thing here is that this is a self-interference process,” Pacifici said. “It doesn’t matter that you’re using incoherent light to excite the emitters, you still get a coherent process.”

In addition to eliminating the need for specialized external light sources, the approach has several advantages, Pacifici said. Because the surface plasmons travel out from the hole and back again, they probe the sample on top of the interferometer surface twice. That makes the device more sensitive.

But that’s not the only advantage. In the new device, external light can be projected from underneath the metal surface containing the interferometers instead of from above. That eliminates the need for complex illumination architectures on top of the sensing surface, which could make for easier integration into compact devices.

The embedded light emitters also eliminate the need to control the amount of sample liquid deposited on the interferometer’s surface. Large droplets of liquid can cause lensing effects, a bending of light that can scramble the results from the interferometer. Most plasmonic sensors make use of tiny microfluidic channels to deliver a thin film of liquid to avoid lensing problems. But with internal light emitters excited from the bottom surface, the external light never comes in contact with the sample, so lensing effects are negated, as is the need for microfluidics.

Finally, the internal emitters produce a low-intensity light. That’s good for probing delicate samples, such as proteins, that can be damaged by high-intensity light.

More work is required to get the system out of the lab and into devices, and Pacifici and his team plan to continue to refine the idea. The next step will be to try eliminating the external light source altogether. It might be possible, the researchers say, to eventually excite the internal emitters using tiny fiber optic lines, or perhaps electric current.

Still, this initial proof-of-concept is promising, Pacifici said.

“From a fundamental standpoint, we think this new device represents a significant step forward,” he said, “a first demonstration of plasmonic interferometry with incoherent light.”



Graphene-Coated e-Fabrics Detect Noxious Gases

Scientists in Korea have developed wearable, graphene-coated fabrics that can detect dangerous gases present in the air, alerting the wearer by turning on a light-emitting diode (LED).

The researchers, from the Electronics and Telecommunications Research Institute and Konkuk Univ. in the Republic of Korea, coated cotton and polyester yarn with a nanoglue called bovine serum albumin (BSA). The yarns were then wrapped in graphene oxide sheets.

Graphene is an incredibly strong, one-atom-thick layer of carbon, known for its excellent thermal and electrical conductivity. The graphene sheets stuck very well to the nanoglue—so much so that further testing showed the fabrics retained their electrical conducting properties after 1,000 consecutive cycles of bending and straightening and ten washing tests with various chemical detergents. Finally, the graphene oxide yarns underwent a chemical reduction process, in which the material gains electrons.

The reduced-graphene-oxide-coated materials were found to be particularly sensitive to detecting nitrogen dioxide, a pollutant gas commonly found in vehicle exhaust that also results from fossil fuel combustion. Prolonged exposure to nitrogen dioxide can be dangerous to human health, causing many respiratory-related illnesses. Exposure of these specially treated fabrics to nitrogen dioxide led to a change in the electrical resistance of the reduced graphene oxide.

The fabrics were so sensitive that 30 minutes of exposure to 0.25 ppm of nitrogen dioxide (just under five times the acceptable standard set by the U.S. Environmental Protection Agency) elicited a response. The fabrics were three times as sensitive to nitrogen dioxide in air as another reduced graphene oxide sensor previously prepared on a flat material.
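The alerting scheme, a resistance change past a calibrated threshold switching on the LED, can be sketched as below. The 5 percent threshold and the resistance values are hypothetical placeholders for illustration, not figures from the study:

```python
def no2_alert(baseline_ohms, measured_ohms, threshold_fraction=0.05):
    """Trigger an alert when the relative resistance change of the
    reduced-graphene-oxide yarn exceeds a calibration threshold.

    The 5% threshold is an assumed calibration value; a real sensor
    would map resistance change to NO2 concentration empirically.
    """
    change = abs(measured_ohms - baseline_ohms) / baseline_ohms
    return change >= threshold_fraction  # True -> turn on the warning LED

print(no2_alert(1000.0, 1080.0))  # 8% resistance change -> True
print(no2_alert(1000.0, 1010.0))  # 1% resistance change -> False
```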

The new technology, according to the researchers, can be immediately adopted in related industries because the coating process is a simple one, making it suitable for mass production. It would allow outdoor wearers to receive relevant information about air quality. The materials could also be incorporated with air-purifying filters to act as “smart filters” that can both detect and filter harmful gas from air.

“This sensor can bring a significant change to our daily life since it was developed with flexible and widely used fibers, unlike the gas sensors invariably developed with the existing solid substrates,” says Dr. Hyung-Kun Lee, who led this research initiative. The study was published online in Scientific Reports.

Source: Electronics and Telecommunications Research Institute

A New Quantum Dot Could Make Quantum Communications Possible

A new form of quantum dot has been developed by an international team of researchers that can produce identical photons at will, paving the way for multiple revolutionary new uses for light.

Many upcoming quantum technologies will require a source of multiple lone photons with identical properties, and for the first time these researchers may have an efficient way to make them. With these quantum dots at their disposal, engineers might be able to start thinking about new, large-scale quantum communications networks.

The reason we need identical photons for quantum communication comes back to the non-quantum idea of key distribution. From a mathematics perspective, it’s trivially easy to encrypt any message so that nobody can read it, but very hard to encrypt a message so only some select individuals can read it, and nobody else.

The problem is key distribution: if everybody who needs to decrypt a message has the associated key, then there is no problem. But how do you get the key to everyone who needs it?


Quantum key distribution uses the ability of quantum physics to provide evidence of surveillance. Rather than making it impossible to intercept the key, and thus decrypt the message, quantum key distribution simply makes it impossible to secretly intercept the key, thus giving the sender of the message warning that they should try again with a new key until one gets through successfully. Once you’re sure that your intended recipient has the key, and just as importantly that nobody else has it, then you could send the actual encrypted file via smoke signal if you really wanted to — at that point, the security of the transmission itself really shouldn’t matter.
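The surveillance-detection idea can be illustrated with a toy simulation in the spirit of the BB84 protocol (a standard quantum key distribution scheme, named here as general background rather than taken from the article). An intercept-and-resend eavesdropper who guesses measurement bases at random disturbs roughly half the photons she intercepts, producing about a 25 percent error rate in the positions where sender and receiver happened to use the same basis:

```python
import random

def bb84_error_rate(n_photons=4000, eavesdropper=False, seed=7):
    """Toy BB84 model: count errors in the sifted key (positions where
    Alice's and Bob's bases match). An eavesdropper measuring in random
    bases shows up as a ~25% error rate, revealing the surveillance."""
    rng = random.Random(seed)
    errors = sifted = 0
    for _ in range(n_photons):
        bit = rng.randint(0, 1)
        basis_a = rng.randint(0, 1)            # Alice's encoding basis
        state_bit, state_basis = bit, basis_a
        if eavesdropper:
            basis_e = rng.randint(0, 1)
            if basis_e != state_basis:         # wrong basis: outcome is random
                state_bit = rng.randint(0, 1)
            state_basis = basis_e              # photon is re-sent in Eve's basis
        basis_b = rng.randint(0, 1)            # Bob's measurement basis
        if basis_b != state_basis:             # basis mismatch: random result
            result = rng.randint(0, 1)
        else:
            result = state_bit
        if basis_b == basis_a:                 # kept (sifted) key positions
            sifted += 1
            errors += (result != bit)
    return errors / sifted

print(bb84_error_rate(eavesdropper=False))        # 0.0 -> clean channel
print(bb84_error_rate(eavesdropper=True) > 0.15)  # True -> elevated error rate
```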

There has been some promising research in this field — it’s not to be confused with the much more preliminary work on using quantum entanglement to transfer information in such a way that it literally does not traverse the intervening space. That may come along someday, but not for a long, long time.

Regardless, one of the big problems with implementing quantum key distribution is that the optical technology necessary to get these surveillance-aware signals from sender to recipient just isn’t there. In particular, the wavelength of photons changes as they move down an optical fiber — not good, since creating photons with precise attributes is the whole source of quantum security.

Texas Tech University: Nano-HPLC: Nano Scale Security & Sensors: Advanced Surveillance and Testing

Getting smaller can be a good thing for high performance liquid chromatography (HPLC). Nano-HPLC delivers lab-changing benefits to scientists.

The United States government—nearly 14 years after September 11, 2001—continues to invest heavily in homeland security. In fact, the president’s 2015 budget calls for US$38.2 billion for the Department of Homeland Security. To keep ahead of and be able to manage security threats, governments require increasingly advanced methods for surveillance and testing. One of those is nanoscale high performance liquid chromatography (nano-HPLC).

Getting smaller can be a good thing for high performance liquid chromatography (HPLC). In fact, nano-HPLC delivers some lab-changing benefits to scientists. The key is all-around scale. Whereas traditional HPLC flow rates go as low as a few hundred microliters per minute, that metric drops to the nanoliter range in nano-HPLC. “That smaller scale means that everything needs to be downscaled,” explains Remco Swart, director of LC/MS technologies at Thermo Fisher Scientific in Waltham, Massachusetts.

The LC columns reveal some of the key features of the changes in scale. For example, HPLC’s traditional columns have internal diameters of a couple of millimeters, while nano-HPLC relies on columns with an internal diameter of just 75 micrometers. Consequently, the injection and flow rates must be adjusted accordingly. “You can see the stream of the eluent in traditional HPLC,” says Pat Young, senior product manager at Waters Corporation in Milford, Massachusetts. “If you examine the end of a nano-HPLC column as the sample elutes, you never actually see even a drop formed.” That shows just how much the flow rate is decreased in nano-HPLC.
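The scale-down logic is simple geometry: to keep the same linear velocity through a narrower column, the volumetric flow must shrink with the cross-sectional area, i.e., with the square of the internal diameter. A minimal sketch, using a hypothetical 200 µL/min method on a 2.1 mm column as the starting point:

```python
def scaled_flow_rate(flow_ul_min, d_from_um, d_to_um):
    """Keep the same linear velocity when moving to a narrower column:
    volumetric flow scales with cross-sectional area, i.e. with the
    square of the internal diameter."""
    return flow_ul_min * (d_to_um / d_from_um) ** 2

# Scaling a hypothetical 200 uL/min method from a 2.1 mm column to 75 um
nano_flow_ul_min = scaled_flow_rate(200.0, d_from_um=2100.0, d_to_um=75.0)
print(round(nano_flow_ul_min * 1000, 1))  # ~255.1 nanoliters per minute
```

The (2100/75)² = 784-fold reduction is why a stream you can see becomes a flow that never even forms a drop.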

Pushing down the HPLC scale, however, pushes up the possibilities. “The main reason for going to nano-HPLC is if a customer is limited in the amount of sample available,” says Swart. “There are areas where scientists want lots of information from limited samples.” Nano-HPLC makes that possible.

Down to the details

With a smaller internal-diameter column, nano-HPLC keeps the sample more concentrated, rather than its being relatively diluted across a wider column. Nonetheless, the smaller scale creates complexities in connecting the parts. To simplify this process, Thermo Fisher Scientific developed its nanoViper fittings. As Swart says, “With these, you make tool-free connections and eliminate dead volume easily to maintain the separation quality all the way into the detector.” He adds, “It lets you make nano-LC measurements in an easier way.”

Waters also created technology that makes it easier for scientists to use nano-HPLC. “We created the ACQUITY UPLC M-Class to do nano-flow for 75-micrometer columns all the way up to 100 microliters per minute for 1-millimeter internal diameter columns,” says Young. “It’s a direct nano-flow system so you don’t have to fuss around with things.” In addition, Waters’ ionKey/MS system integrates a 150-micrometer internal diameter separation into the source of the mass spectrometer.

Other vendors also offer systems that simplify nano-HPLC. For example, Gurmil Gendeh, director of biopharmaceutical segment marketing for Agilent in Santa Clara, California, says, “The Agilent 1260 Infinity HPLC-Chip/MS system is a reusable microfluidic chip-based technology for high-sensitivity nanospray LC/MS that seamlessly integrates the sample preparation, sample enrichment and separation nano columns, tubing, connections, and spray needle of a traditional nano-electrospray LC/MS system into a biocompatible polymer chip.” This system can also be adapted to specific applications by using different chips. As Gendeh says, “A wide variety of chips with specific chemistries enables a broad spectrum of applications, including intact monoclonal antibody characterization, monoclonal antibody sequence confirmation, and in-depth characterization of the myriad post-translational modifications of these complex molecules.” For example, Agilent makes a chip for N-linked glycans. Gendeh adds, “Complete glycan release, analysis, and data processing can typically be completed in 15 minutes per sample and requires very little hands-on time.”

Scaled-up sensitivity

The decreased size in nano-HPLC delivers increased sensitivity. Although the magnitude of the sensitivity increase depends on the experimental conditions and analytes, in comparing traditional and nano-HPLC technology, says Swart, “a ballpark improvement in sensitivity of a few hundred times should be able to be realized.”

In brief, nano-HPLC’s ability to keep the sample concentrated delivers more of the sample to the detector, which is typically mass spectrometry (MS). “The flow rates of nano-HPLC couple extremely well with MS,” says Pete Claise, senior product manager at Waters Corporation. “At the flow rates of nano-HPLC, 40 percent to 50 percent of the sample gets to the detector, compared with only two percent to three percent at the rates of traditional HPLC.”
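Using the midpoints of the figures Claise quotes (40 to 50 percent of the sample reaching the detector for nano-HPLC versus 2 to 3 percent for traditional HPLC), the improvement in sample delivery alone can be estimated:

```python
def ms_delivery_gain(nano_fraction=0.45, conventional_fraction=0.025):
    """Ratio of sample reaching the MS detector, using the midpoints of
    the 40-50% (nano) and 2-3% (conventional) figures quoted above."""
    return nano_fraction / conventional_fraction

print(round(ms_delivery_gain()))  # roughly an 18-fold improvement in delivery
```

Combined with the higher peak concentration from the narrower column, this is consistent with Swart's ballpark of a few-hundred-fold sensitivity improvement, though real gains depend on the experimental conditions and analytes.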

In addition, nano-HPLC creates more efficient ionization of the sample and allows some simplifications. “Because of less flow, there is no need for nebulization gas,” Claise explains.

Proteomics and beyond

When a scientist wants to identify and quantify the proteins in complex samples, like biological fluids, nano-HPLC provides the needed sensitivity. As Young says, “Proteomics is the sweet spot for micro- and nano-HPLC.” She adds, “Whether targeted or untargeted, this application requires a wide dynamic range.”

In fact, Susan San Francisco, research associate professor at the Center for Biotechnology and Genomics at Texas Tech University in Lubbock, says, “Our lab has used nano-HPLC mostly in proteomic work.” She says that nano-HPLC offers many useful advantages for her lab’s work, including greater sensitivity and the use of less solvent. Nonetheless, she also points out several shortcomings of nano-HPLC, including the limited kinds of columns available. She adds, “Nanospray can be a bit less stable for downstream MS, and you must have a low-volume injector or trap system.” So nano-HPLC won’t always be the best choice, although it is worth considering in an increasing number of applications.

Beyond proteomics, San Francisco says that her lab has also used nano-HPLC for “a small amount of targeted small molecule separation.” Others also point out that nano-HPLC works out well in various applications. As an example, Swart says, “Biopharma needs to find biological drugs in complex matrices, like serum, when studying the impact of a drug.” Nano-HPLC works well in this bioanalytical application. As Claise says, “Over the past decade, lots of papers have shown the benefits of low-flow LC for bioanalytical work.” He adds, “Work in lipidomics and metabolomics is also pushing toward micro- and nano-flow HPLC.”

Although nano-HPLC started as a technology for experts, new commercial products carry it to other scientists. As Swart says, “A lot of universities these days have this technology in their labs.” In fact, the available technology brings nano-HPLC to nearly any lab. In addition, the technology is now easy and robust enough for other environments, such as industrial LC-MS laboratories. With any technology that goes from expert-based to the scientific masses, you never know how far it will go, and that is currently the case for nano-HPLC.


DARPA: New Nano-Material Could Change How We Work and Play

Serendipity has as much a place in science as in love.

That’s what Northeastern Univ. physicists Swastik Kar and Srinivas Sridhar found during their four-year project to modify graphene, a stronger-than-steel, infinitesimally thin lattice of tightly packed carbon atoms. Primarily funded by the Army Research Laboratory and Defense Advanced Research Projects Agency, or DARPA, the researchers were charged with imbuing the decade-old material with thermal sensitivity for use in infrared imaging devices such as night-vision goggles for the military.

What they unearthed, published in Science Advances, was so much more: an entirely new material spun out of boron, nitrogen, carbon and oxygen that shows evidence of magnetic, optical and electrical properties, as well as DARPA’s sought-after thermal ones. Its potential applications run the gamut: from 20-megapixel arrays for cellphone cameras to photodetectors to atomically thin transistors that, when multiplied by the billions, could fuel computers.

“We had to start from scratch and build everything,” says Kar, an assistant professor of physics in the College of Science. “We were on a journey, creating a new path, a new direction of research.”


An artistic rendering of novel magnetism in 2D-BNCO sheets, the new material Swastik Kar and Srinivas Sridhar created. Image: Northeastern Univ.

The pair was familiar with “alloys,” controlled combinations of elements that resulted in materials with properties that surpassed graphene’s—for example, the addition of boron and nitrogen to graphene’s carbon to tune its conductivity and produce an electrical insulator. But no one had ever thought of choosing oxygen to add to the mix.

What led the North­eastern researchers to do so?

“Well, we didn’t choose oxygen,” says Kar, smiling broadly. “Oxygen chose us.”

Oxygen, of course, is everywhere. Indeed, Kar and Sridhar spent a lot of time trying to get rid of the oxygen seeping into their brew, worried that it would contaminate the “pure” material they were seeking to develop.

“That’s where the Aha! moment happened for us,” says Kar. “We realized we could not ignore the role that oxygen plays in the way these elements mix together.”

“So instead of trying to remove oxygen, we thought: Let’s control its introduction,” adds Sridhar, the Arts and Sciences Distinguished Professor of Physics and director of Northeastern’s Electronic Materials Research Institute.

Oxygen, it turned out, was behaving in the reaction chamber in a way the scientists had never anticipated: It was determining how the other elements—the boron, carbon and nitrogen—combined in a solid, crystal form, while also inserting itself into the lattice. The trace amounts of oxygen were, metaphorically, “etching away” some of the patches of carbon, explains Kar, making room for the boron and nitrogen to fill the gaps.

“It was as if the oxygen was controlling the geometric structure,” says Sridhar.

They named the new material, sensibly, 2D-BNCO, representing the four elements in the mix and the two-dimensionality of the super-thin, lightweight material, and set about characterizing and manufacturing it, to ensure it was both reproducible and scalable. That meant investigating the myriad permutations of the four ingredients, holding three constant while varying the measurement of the remaining one, and vice versa, multiple times over.

After each trial, they analyzed the structure and the functional properties of the product—electrical, optical—using electron microscopes and spectroscopic tools, and collaborated with computational physicists, who created models of the structures to see if the configurations would be feasible in the real world.

Next they will examine the new material’s mechanical properties and begin to experimentally validate the magnetic ones conferred, surprisingly, by the intermingling of these four nonmagnetic elements. “You begin to see very quickly how complicated that process is,” says Kar.

Helping with that complexity were collaborators from around the globe. In addition to Northeastern associate research scientists, postdoctoral fellows, and graduate students, contributors included researchers in government, industry, and academia from the U.S., Mexico and India.

“There is still a long way to go, but there are clear indications that we can tune the electrical properties of these materials,” says Sridhar. “And if we find the right combination, we will very likely get to that point where we reach the thermal sensitivity that DARPA was initially looking for, as well as many as-yet unforeseen applications.”

Source: Northeastern Univ.

Nanotechnology Sensor for One-Step Detection: Combining Biology & Nanoscale Technology into Sensors

Detection of very small amounts of a chemical contaminant, virus or bacteria in food systems is an important potential application of nanotechnology. The exciting possibility of combining biology and nanoscale technology into sensors holds the potential of increased sensitivity and therefore a significantly reduced response time in sensing potential problems.

“Graphene oxide has potential applications in a variety of biological fields because of its unique characteristics. In addition, due to large absorption cross-sections and the non-radiative electronic excitation energy transfer from a fluorophore to GO, GO has been employed to construct fluorescence resonance energy transfer (FRET) biosensors,” Professor Chuanlai Xu from the State Key Lab of Food Science & Technology, and Director, Joint Lab of Biointerface and Biodetection, JiangNan University, tells Nanowerk. Xu and his collaborators have developed a novel, rapid, and sensitive fluorescence sensor to detect BPA. The team reported their findings in ACS Applied Materials & Interfaces (“Building An Aptamer/Graphene Oxide FRET Biosensor for One-Step Detection of Bisphenol A”).

Schematic illustration of the biosensor for BPA based on the target-induced conformational change of the anti-BPA aptamer and the interactions between the FAM-ssDNA probe and graphene oxide (GO). (Reprinted with permission by American Chemical Society)

Bisphenol A (BPA) is a chemical produced in large quantities, primarily for use in the production of polycarbonate plastics and epoxy resins. Polycarbonate plastics have many applications, including some food and drink packaging (e.g., water bottles), food packaging materials, impact-resistant safety equipment, and medical devices. Epoxy resins are used as lacquers to coat metal products such as food cans, bottle tops, and water supply pipes. Some research has shown that BPA can seep into food or beverages from containers made with BPA. Exposure to BPA is a concern because of possible health effects on the brain, behavior and prostate gland of fetuses, infants and children.

While the actual toxicity of BPA is still debated, direct measurement of BPA is difficult because of the weak response given by conventional electrochemical sensors, and current optical analysis methods are susceptible to the influence of interfering substances. The novel BPA biosensor developed by the Chinese team now provides a method for the rapid detection and risk assessment of BPA with high sensitivity and selectivity.

Aptamers—single-stranded oligonucleotides that can be generated for a target molecule with high affinity—are highly suitable receptors for the selective and efficient detection of a wide range of molecular targets. For instance, researchers have previously shown that aptamer-functionalized graphene can detect mercury in mussels.

“Our sensor is based on water-soluble and well-dispersed graphene oxide, which was used as the fluorescence quenching agent, and a specific anti-BPA aptamer labeled with FAM (FAM-ssDNA),” Xu explains. “In the absence of BPA, FAM-ssDNA can be adsorbed onto the GO surface, leading to FRET between GO and FAM-ssDNA. Subsequently, the fluorescence is quenched quickly. Conversely, BPA can interact with FAM-ssDNA and switch its conformation to prevent the adsorption onto GO, resulting in fluorescence recovery in the sensing system.”

Under different concentrations of BPA, based on the target-induced conformational change of the anti-BPA aptamer and the interactions between the fluorescently modified aptamer (FAM-ssDNA) and GO, the team’s experimental results show that the intensity of the fluorescence signal changed accordingly. They say these results are comparable to traditional ELISA as well as other instrument-based methods, suggesting that this novel sensor might find applications in food safety testing and the monitoring of industrial production processes. The researchers suggest that their GO-based assay offers several advantages:

  • Fluorescent sensors tend to have higher sensitivity than most colorimetric sensors;
  • The well-characterized relationship between GO and FAM-ssDNA provides theoretical support for the experiment;
  • GO can easily be chemically synthesized in large quantities. In addition, the method avoids dual labeling of ssDNA with fluorophore and quencher units, which significantly lowers the detection cost.
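The quench-and-recover behavior Xu describes can be sketched with a simple Langmuir-type binding model. The dissociation constant and the signal levels below are hypothetical illustration values, not measurements from the paper:

```python
def fret_fluorescence(bpa_nM, kd_nM=50.0, f_quenched=0.05, f_free=1.0):
    """Simple binding model of the aptamer/GO FRET sensor: BPA pulls the
    FAM-ssDNA probe off the graphene oxide surface, so fluorescence
    recovers from the quenched level toward the free-probe level.

    kd_nM is an assumed dissociation constant for illustration only.
    """
    bound_fraction = bpa_nM / (bpa_nM + kd_nM)   # Langmuir-type binding curve
    return f_quenched + (f_free - f_quenched) * bound_fraction

print(round(fret_fluorescence(0.0), 2))   # 0.05 -> fully quenched, no BPA present
print(fret_fluorescence(500.0) > 0.8)     # True -> strong recovery at high BPA
```

Reading out the fluorescence intensity against a calibration curve of this shape is what turns the conformational switch into a quantitative one-step assay.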

By Michael Berger

Researchers build atomically thin gas and chemical sensors

The relatively recent discovery of graphene, a two-dimensional layered material with unusual and attractive electronic, optical and thermal properties, led scientists to search for other atomically thin materials with unique properties.

Molybdenum disulfide (MoS2) has proved to be one of the most promising. Single-layer and few-layer devices have been proposed for electronic, optoelectronic and energy applications. A team of researchers, led by engineers at the University of California, Riverside’s Bourns College of Engineering, has developed another potential application: sensors.

“The sensors are everywhere now, including in smart phones and other devices,” said Alexander Balandin, UC Presidential Chair and professor of electrical and computer engineering at UC Riverside, who is the lead author of the paper. “The sensors we developed are small, thin, highly sensitive and selective, making them potentially ideal for many applications.”

Balandin and the graduate students in his lab built the atomically thin gas and chemical vapor sensors from molybdenum disulfide and tested them in collaboration with researchers at the Rensselaer Polytechnic Institute in Troy, N.Y. The devices have two-dimensional channels, which are great for sensor applications because of the high surface-to-volume ratio and widely tunable concentration of electrons.

The researchers demonstrated that the sensors, which they call molybdenum disulfide thin-film field-effect transistors (TF-FET), can selectively detect ethanol, acetonitrile, toluene, chloroform and methanol vapors.

The findings were published in a recent paper, “Selective chemical vapor sensing with few-layer MoS2 thin-film transistors: Comparison with graphene devices,” in the journal Applied Physics Letters. In addition to Balandin, co-authors were Rameez Samnakay and Chenglong Jiang, both Ph.D. students in Balandin’s lab, and Michael Shur and Sergey Rumyantsev, both of Rensselaer Polytechnic Institute.

The selective detection did not require prior functionalization of the surface for specific vapors. The tests were conducted with both as-fabricated devices and intentionally aged devices. The molybdenum disulfide sensors used in the study were aged for two months because practical applications require that sensors remain stable and operational for at least a month.

Sensors made with atomically thin layers of MoS2 revealed better selectivity to certain gases owing to the electron energy band gap in this material, which resulted in strong suppression of electrical current upon exposure to some of the gases. Graphene devices, on the other hand, demonstrated selectivity when current fluctuations were used as a sensing parameter.


Schematic of the molybdenum disulfide (MoS2) thin-film sensor with the deposited molecules that create additional charge. Credit: UC Riverside

“Sensors implemented with atomically thin MoS2 layers are complementary to graphene devices, which is good news,” Balandin said. “Graphene has very high electron mobility while MoS2 has the energy band gap.”

The uniqueness of the UC Riverside-built atomically thin gas sensors – both graphene and MoS2 – is the use of low-frequency current fluctuations as an additional sensing signal. Conventionally, such chemical sensors use only the change in the electrical current through, or the resistance of, the device's active channel.
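The idea of using current fluctuations, rather than only the DC current shift, as a sensing signal can be illustrated with a short sketch. The function below is a hypothetical analysis, not the authors' actual pipeline: it estimates the low-frequency (1/f-dominated) noise power of a simulated drain-current trace, a quantity that can change upon vapor exposure even when the mean current barely moves. The sampling rate, frequency band and trace values are all invented for illustration.

```python
import numpy as np

def lowfreq_noise_power(trace, fs=1000.0, f_lo=1.0, f_hi=100.0):
    """Mean power spectral density of a current trace in [f_lo, f_hi] Hz.

    Hypothetical noise metric: the low-frequency fluctuation
    spectrum, not just the DC current, carries a gas-specific
    signature. fs is the sampling rate in Hz.
    """
    x = np.asarray(trace) - np.mean(trace)          # remove DC component
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))  # simple periodogram
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[band].mean()

# Two simulated drain-current traces with the SAME mean current:
# vapor exposure here only changes the fluctuation amplitude.
rng = np.random.default_rng(0)
baseline = 1e-6 + 1e-9 * rng.standard_normal(8192)
exposed = 1e-6 + 5e-9 * rng.standard_normal(8192)
print(lowfreq_noise_power(exposed) > lowfreq_noise_power(baseline))  # True
```

A DC-current readout would report nearly identical values for both traces; the noise-power metric separates them cleanly, which is the essence of using fluctuations as an additional sensing channel.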

In a separate paper, the same researchers demonstrated high temperature operation of the molybdenum disulfide atomically thin film transistors. The work was described in a paper, “High-temperature performance of MoS2 thin-film transistors: Direct current and pulse current-voltage characteristics,” that was just published in the Journal of Applied Physics.

Many electronic components for control systems and sensors are required to operate at temperatures above 200 degrees Celsius. Examples of high-temperature applications include turbine engine control in aerospace, energy generation and oil field instrumentation.

The availability of transistors and circuits to operate at temperatures above 200 degrees Celsius is limited. Devices made of silicon carbide and gallium nitride – conventional semiconductors – hold promise for extended high-temperature operation but are still not cost-effective for high volume applications. There is a need for new material systems that can be used to make field-effect transistors that work at high temperatures.


Multi-million-dollar research program will help diversify economy, improve industrial productivity

New NSERC-AITF Associate Industrial Research Chair aims to improve industry efficiency through intelligent wireless technology.

(Edmonton) How does leading-edge research work to improve industrial productivity and diversify the economy? A new industrial research chair program at the University of Alberta’s Faculty of Engineering is aiming to achieve both goals.

By helping the oil and gas industry operate more efficiently through the use of new intelligent wireless sensors and antennas, mechanical engineering professor Pedram Mousavi is also helping to expand the province’s information and communications technology industry.

Although the announcement of his appointment as the NSERC-AITF Associate Industrial Research Chair in Intelligent Integrated Sensors and Antennas was made Feb. 10, he and his research team have been working with industry partners for the past 18 months—and they’ve already accomplished a lot.

Mousavi’s team has licensed three new technologies to one of its industrial partners (Titan Logix) and is in the process of signing two more technologies to another (Pason Systems). An antenna coupler developed by the research team is being sold by yet another partner (Testforce) to clients around the world, and a spinoff company based on the team’s work is being created by his graduate students.

Mousavi says he was unsurprised that oil prices have dropped so dramatically this year—it is the natural cyclical character of resource markets. But he hopes his research program can diversify the economy enough to reduce the province’s reliance on the oil and gas sectors. At the same time, he wants to help the industry operate in more cost-effective and efficient ways.

Present-day technology only allows for a certain percentage of oil or gas to be brought to the surface. But Mousavi and his team, which also includes leading researchers in the Department of Electrical and Computer Engineering,  are developing intelligent wireless devices connecting multiple sensors that could make resource operations much more cost-effective.


Data gathered by these systems could improve productivity by informing off-site experts about an oil well’s operation and the quality of oil it is pumping from the ground. These analysts would be able to respond to the information and make real-time decisions about how the well should be operating.

Working in some of Canada’s top research facilities, the team is also developing an ultra-wideband radar system to monitor oil in transit on rail cars or in tankers hauled by trucks or being stored in tanks. It is also building antennas and front-end circuits for an emerging 5G network, working on ways to wirelessly power remote sensors, and developing a new type of 3-D printer capable of manufacturing electronic devices, sensors and antennas in one integrated process.

The research program has been funded for five years with a possibility of renewal. Funding for Mousavi’s research program totals $2.8 million, including $925,000 each from NSERC, Alberta Innovates – Technology Futures and industry partners (Telus, TRTech, Pason Systems Corp., Titan Logix Corp., ITS Electronics Inc., EMSCAN, InfoChip and TestForce). The remainder of the funding comes from the U of A.

Crucial in-kind support in time and expertise from industry partners is valued at about $900,000, and key equipment funded through the Canada Foundation for Innovation and Alberta Innovates – Technology Futures is valued at $1.5 million.

David Lynch, dean of the Faculty of Engineering, says collaborative partnerships such as the NSERC Industrial Research Chairs benefit everyone involved by bringing industry challenges to university classrooms and labs. This forms connections between education, research and application of new knowledge.

“When industry and universities work together, students gain a deeper understanding of the concepts they’re learning because they can see the connection between their studies and the practical applications of engineering principles,” Lynch said. “At the same time, industry and the engineering profession benefit as we educate a highly qualified new generation of engineers.”

Lynch said the team’s “incredible productivity” proves that such partnerships work.

“This demonstrates that these programs very quickly have significant outcomes for a receptive industrial community,” he said, adding that the relationship between university researchers and industry connects the “pull” of industry needs with the “push” of university research, so the two become aligned.

Pamela Moss, NSERC’s interim vice-president of partnership programs, said Mousavi has the right blend of private-sector and academic experience and success to make the partnership work, and that the technologies he is developing would help industry react more quickly to field conditions, avoiding costly shutdowns or errors.

“And these days, with the price of oil down, efficiency becomes even more important,” she said, adding that the fact that Mousavi has more than 20 graduate students as part of his team demonstrates the educational impact of the funding.

Tiny Sensors Inspired by Butterfly Wings Could Improve Bomb Detection

Engineers in GE labs have built a penny-sized sensor that can detect the faintest traces of explosives and needs no power to operate.

The device uses a special film a tenth the thickness of a human hair to detect chemicals. The team was inspired by their research into the unique iridescence of Morpho butterflies, which is caused by the jagged, forest-like scales on their wings. (They applied data analytics developed for their bio-inspired Morpho light and temperature sensors to the new radiofrequency (RF) bomb sensors.)

“Our sensor could be placed as a sticker inside of a cargo container on a ship or on packaging for shipped goods,” says Radislav Potyrailo, a chemical sensing principal scientist who is leading development of the detector at GE Global Research. “It’s a stick-it-and-forget-it kind of thing. This advance brings us closer to a future of ubiquitous testing of chemical explosives.”

The tiny device might be a game changer in detecting hazardous materials like chemical oxidizers and explosives, a process that today requires large and expensive equipment like spectrometers and chromatographs. Instead, the new sensor, which should cost a few cents to produce, is 300 times smaller and consumes 100 times less power than desktop detectors found at airports and other inspection areas.

Heat and chemicals can alter how the jagged structures on Morpho wings reflect light and change the butterfly’s color.

The device uses a radio frequency identification (RFID) tag coated with an advanced chemical detection film. The scientists designed the film by pooling their knowledge of materials science, nanotechnology, chemistry and data analytics.

Potyrailo, for example, has been studying the scales on the wings of Morpho butterflies for several years. These complex structures absorb and bend light and give the butterflies their trademark shimmering coats. He found that when chemical molecules lodge themselves in the scales on the wings, the structures cause the iridescence to change.

“We analyze optical spectra from our bio-inspired Morpho sensors and spectra coming from the RF sensors using the same methods,” Potyrailo says. “Light and radio waves are very similar, after all. They are just different portions of the electromagnetic spectrum.”

The detector is made of two parts: the RFID sensor tag and a battery-powered, cellphone-size handheld tag reader. Commuters will be familiar with the RFID tag component. It’s similar to the technology they stick on their windshield for automatic highway toll collection but without a battery.

The tag is composed of a flat, coiled antenna attached to a microchip in the center. The antenna harvests power from the reader when it is nearby to operate. Layered on top of the antenna and chip is the special film. This film and sensor combination is designed to respond only to molecules or particles of explosives or oxidizers that are used to make improvised bombs.

Morpho butterfly wings change their natural color (A) after exposure to ethanol (top B) and toluene (bottom B).

The portable reader hits the tag with radio frequencies, just like light hitting the butterfly’s wing. When workers hold it up to the sensor tag, the radio frequency spectrum is predictably altered by the presence of hazardous materials trapped in the film. This altered radio spectrum response is transmitted back through the tag’s antenna and picked up by the reader.
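The readout principle described above amounts to comparing a measured spectrum against a clean baseline and flagging a shift. The toy sketch below illustrates that comparison; the resonance frequency, dip shape, shift magnitude and threshold are all invented values, not GE's actual signal processing.

```python
import numpy as np

def resonance_freq(freqs, reflection):
    """Frequency of the deepest reflection dip (the tag's resonance)."""
    return freqs[np.argmin(reflection)]

def analyte_present(freqs, baseline, measured, min_shift_hz=200e3):
    """Flag a tag whose resonance moved by more than min_shift_hz."""
    shift = abs(resonance_freq(freqs, measured)
                - resonance_freq(freqs, baseline))
    return shift > min_shift_hz

# Toy spectra: a Lorentzian reflection dip at 13.56 MHz that shifts
# down by 500 kHz when molecules load the sensing film (all values
# invented for illustration).
f = np.linspace(12e6, 15e6, 3001)
def dip(f0):
    return 1.0 - 0.8 / (1.0 + ((f - f0) / 100e3) ** 2)

clean = dip(13.56e6)
loaded = dip(13.06e6)
print(analyte_present(f, clean, loaded))  # True
```

In practice the reader would also compare the dip's depth and width, since the analytics methods Potyrailo mentions use the full spectral shape rather than a single peak position; the sketch keeps only the simplest feature.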

The GE Global Research team behind the RFID sensor. Potyrailo is second from the left.

Potyrailo says the technology’s sensing range will expand into an assortment of applications in the future, including detection of passive gas leaks, electrical insulation degradation and bacterial contamination.

Potyrailo’s group has been working on the detector for several years. They have partnered with a number of GE labs as well as the Technical Support Working Group (TSWG), a U.S. interagency program for research and development into counterterrorism measures, and other companies to pull in expertise from a range of fields. Their device is designed to meet tough requirements for field deployment on ships and in punishing environments.

“It’s a very attractive device – reliable, robust, cost-effective, low power and high performance,” Potyrailo says. “Chemical threats can be detected and quantified at very low levels with a single sensor, even improvised explosive devices—crazy devices made out of common grocery or pharmacy stuff—we can detect them.”