Will Your ‘Electronic skin’ Come Equipped with Memory?


Nano Skin Sensors

Researchers have created a wearable device that is as thin as a temporary tattoo and can store and transmit data about a person’s movements, receive diagnostic information and release drugs into the skin.

Similar efforts to develop ‘electronic skin’ abound, but the device is the first that can store information and also deliver medicine — combining patient treatment and monitoring. Its creators, who report their findings today in Nature Nanotechnology, say that the technology could one day aid patients with movement disorders such as Parkinson’s disease or epilepsy.

The researchers constructed the device by layering a package of stretchable nanomaterials — sensors that detect temperature and motion, resistive RAM for data storage, microheaters and drugs — onto a material that mimics the softness and flexibility of the skin. The result was a sticky patch containing a device roughly 4 centimetres long, 2 cm wide and 0.003 millimetres thick, says study co-author Nanshu Lu, a mechanical engineer at the University of Texas at Austin.

“The novelty is really in the integration of the memory device,” says Stéphanie Lacour, an engineer at the Swiss Federal Institute of Technology in Lausanne, who was not involved in the work. No other device can store data locally, she adds.


The trade-off for that memory milestone is that the device works only if it is connected to a power supply and data transmitter, both of which need to be made similarly compact and flexible before the prototype can be used routinely in patients. Although some commercially available components, such as lithium batteries and radio-frequency identification tags, can do this work, they are too rigid for the soft-as-skin brand of electronic device, Lu says.

Even if softer components were available, data transmitted wirelessly would need to be converted into a readable digital format, and the signal might need to be amplified. “It’s a pretty complicated system to integrate onto a piece of tattoo material,” she says. “It’s still pretty far away.”

Novel Water Treatment Technology Surfaces at Ingenuity Lab: “Aquaporins”


Surfer at Peahi Bay on Maui, Hawaii

(Nanowerk Spotlight) Concern about the depletion of global water resources has grown rapidly in the past decade, driven by the increasing global population and the growing demand for water across diverse applications.

Since only 2.5% of the Earth’s water is fresh, it has been reported that almost half of the world’s population will be at risk of a water crisis by the year 2025 [1]. Accordingly, significant research effort has been focused on the desalination of brackish water and seawater and on the remediation and reuse of wastewater to meet agricultural, industrial, and domestic water demands. The advent of membrane desalination techniques over fifty years ago gave significant impetus to the advancement of water purification technology, and much progress has been made since. However, the need for improved membrane performance and lower operating costs has remained a barrier for researchers and consumers alike.

Current water treatment technology

Water purification membranes are typically divided into four categories according to pore size (see the sketch after the list):

  • microfiltration (MF, < a few microns)
  • ultrafiltration (UF, < 100 nm)
  • nanofiltration (NF, < 10 nm)
  • reverse osmosis (RO, < 1 nm)
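The thresholds above can be read as a simple decision rule. The sketch below is purely illustrative (the function name and the 5 µm cut-off standing in for “a few microns” are assumptions, not figures from the article); it maps a nominal pore size to the corresponding membrane class.

```python
# Illustrative classification of water-treatment membranes by nominal pore size,
# using the thresholds quoted above (RO < 1 nm, NF < 10 nm, UF < 100 nm, MF up to a few microns).
def classify_membrane(pore_size_nm: float) -> str:
    if pore_size_nm < 1:
        return "reverse osmosis (RO)"
    if pore_size_nm < 10:
        return "nanofiltration (NF)"
    if pore_size_nm < 100:
        return "ultrafiltration (UF)"
    if pore_size_nm < 5000:          # "a few microns", taken here as 5 um
        return "microfiltration (MF)"
    return "coarser than the MF range"

print(classify_membrane(0.5))   # reverse osmosis (RO)
print(classify_membrane(50))    # ultrafiltration (UF)
```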

Feed water quality is an important consideration when selecting a suitable membrane. NF and RO membranes have typically been designed for treating brackish water (2-5 g/L of salt), seawater (35 g/L of salt), and wastewater from agriculture and industry, owing to their capacity to separate ions (mono- and divalent) and organic materials (macromolecules, proteins, glucose, and amino acids) from the water.

Of the various candidate materials, conventional desalination membranes are mainly fabricated from aromatic polyamide (PA) thin-film composites on a polysulfone support or from Loeb-Sourirajan-type cellulose acetate (CA) membranes to create the desired architecture.

CA membranes exhibit a specific water flux of 1-20 L/m2/day/bar with an average NaCl rejection of > 98%. Their advantages are that they are easy to make, reasonably priced, and highly stable against mechanical stress and chlorine. An inherent weakness of CA membranes, however, is that their performance degrades with changes in pH and temperature and through hydrolysis and fouling.

Reverse osmosis membrane

PA membranes, on the other hand, exhibit a high flux (20-200 L/m2/day/bar) and high salt rejection (> 99%), as well as greater stability across a wide range of pH and temperature. Despite these benefits, their extremely low resistance to chlorine and to membrane fouling is a major obstacle: the expensive pre-treatment steps required ahead of the desalination membranes can render PA membranes uneconomical.

Most current desalination technologies on the market are based on energy-intensive processes such as multi-stage flash distillation (MSF; 35 kWh/m3) or pressure-driven RO membranes (> 3 kWh/m3 for seawater and < 1 kWh/m3 for brackish water).

While membrane-based technology is more cost-effective than heat-based technology, the high costs of installation, operation, and maintenance are still major constraints on the general use of membranes in water treatment.

These costs (> 0.5 $/m3 for seawater and 0.2-0.3 $/m3 for brackish water [2]) are higher than the costs of obtaining fresh water from other sources. Furthermore, it has been predicted that current membrane technology is approaching the maximum performance achievable from CA and PA-based materials [3]. Considering that water treatment costs are directly related to the membrane performance, there is an increasing demand for innovative solutions that move beyond the modification of conventional materials, in order to meet scientific and economic requirements.
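For a rough sense of what those energy figures mean in money terms, the back-of-the-envelope sketch below compares the energy cost of treating one cubic metre of seawater by MSF and by RO, using the kWh/m3 values quoted above; the electricity price of $0.10/kWh is an assumption for illustration, not a figure from the article.

```python
# Back-of-the-envelope energy-cost comparison for 1 m3 of seawater,
# using the figures quoted above; the electricity price is an assumption.
MSF_KWH_PER_M3 = 35.0          # multi-stage flash distillation
RO_SEAWATER_KWH_PER_M3 = 3.0   # lower bound quoted for seawater RO ("> 3 kWh/m3")
PRICE_USD_PER_KWH = 0.10       # assumed electricity price

for name, kwh in [("MSF", MSF_KWH_PER_M3), ("Seawater RO", RO_SEAWATER_KWH_PER_M3)]:
    print(f"{name}: {kwh:.0f} kWh/m3 -> energy cost ~${kwh * PRICE_USD_PER_KWH:.2f}/m3")
# MSF: 35 kWh/m3 -> energy cost ~$3.50/m3
# Seawater RO: 3 kWh/m3 -> energy cost ~$0.30/m3
```

At the assumed price, energy alone accounts for roughly $0.30/m3 of the > 0.5 $/m3 total quoted for seawater RO, with capital, pre-treatment and maintenance making up the rest, while the MSF energy bill by itself is already several times the RO total.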

Aquaporin-embedded biomimetic membrane

At Ingenuity Lab in Edmonton, Alberta, Dr. Carlo Montemagno and a team of world-class researchers have been investigating plausible solutions to existing water purification challenges. They are building on Dr. Montemagno’s earlier patented discoveries by using a naturally occurring water channel protein as the functional unit in water purification membranes [4].


Aquaporins are water-transport proteins that play an important role in osmoregulation in living organisms [5]. These proteins boast exceptionally high water permeability (~10^10 water molecules per second per channel), high selectivity for pure water, and a low energy cost, which makes aquaporin-embedded membranes well suited as an alternative to conventional RO membranes. Unlike synthetic polymeric membranes, in which water is driven by high-pressure diffusion through size-selective pores, this technology exploits the biological osmosis mechanism that controls the flow of water in cellular systems at low energy.
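To put the ~10^10 molecules-per-second figure in membrane terms, the rough estimate below converts one channel’s throughput into litres per day and asks how many channels a square metre of membrane would need to reach a design flux of 100 L/m2/day; the target flux is an assumed round number, not a specification from Ingenuity Lab.

```python
# Rough estimate of aquaporin throughput at the membrane scale.
# The 100 L/m2/day target flux is an assumed illustrative value.
AVOGADRO = 6.022e23          # molecules per mole
MOLAR_MASS_WATER = 18.0      # g/mol
WATER_DENSITY = 1000.0       # g/L

molecules_per_second = 1e10                                   # per channel, as quoted above
litres_per_molecule = MOLAR_MASS_WATER / (AVOGADRO * WATER_DENSITY)
litres_per_day_per_channel = molecules_per_second * litres_per_molecule * 86400
print(f"~{litres_per_day_per_channel:.1e} L/day per channel")   # ~2.6e-11 L/day

target_flux = 100.0                                            # L/m2/day, assumed
channels_per_m2 = target_flux / litres_per_day_per_channel
print(f"~{channels_per_m2:.1e} channels per m2 (~{channels_per_m2 / 1e12:.1f} per square micrometre)")
```

Even under this assumption, only a few channels per square micrometre are needed, far below what protein packing allows, which hints at why the protein’s permeability is so attractive.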

In nature, the direction of osmotic water flow is determined by the osmotic pressure difference between compartments: water flows toward the compartment with the higher osmotic pressure (the salty or contaminated solution). This direction can, however, be reversed by applying pressure to the salty solution, which is the basis of reverse osmosis.

The principle of RO rests on the semipermeable character of the separating membrane, which allows only water molecules to pass, with the direction of transport set by the osmotic gradient. Therefore, as envisioned in the recent publication “Recent Progress in Advanced Nanobiological Materials for Energy and Environmental Applications”, the core of Ingenuity Lab’s approach is to control the direction of water flow through aquaporin channels with a minimum level of applied pressure and to use aquaporin-embedded biomimetic membranes as an alternative to conventional RO membranes.

Ingenuity Lab’s ongoing research efforts

Although the concept was introduced a decade ago, the proof-of-concept for aquaporin-based water purification membranes has only recently been demonstrated. Their ultimate success depends on improved membrane performance and functionality, which are affected by:

1) Activity of aquaporin in the membrane (rate of water transport)

2) Design concept of the protein-incorporated membrane matrix

3) Membrane manufacturability.

Since aquaporin-incorporated membranes are the key component to attaining higher levels of salt rejection and water flux, Ingenuity Lab has two intense research efforts underway.

The first is to improve the quality and properties of the materials used to produce aquaporin-based membranes. Both the production yield and the stability of the aquaporin are being improved through genetic modification. Additionally, new materials are being developed for use as the matrix that houses the aquaporin molecules and forms stable, biocompatible membranes, providing structural support for the protein and eliminating leakage around it. Efficient production of functional aquaporin, combined with biomimetic materials of optimal protein compatibility, should deliver the highest water purification capacity and the greatest economic benefit from the invention.

Ingenuity Lab’s second major research effort focuses on the development of new methods for assembling and fabricating water purification membranes using novel design concepts. The goal is to develop a platform that protects aquaporin from mechanical and chemical stresses while maintaining its functionality and enabling low-cost, scalable production.

The work being done at Ingenuity Lab holds great promise for our generation and those to come. Ingenuity Lab’s water purification membranes will be applied to treat wastewater and seawater at much lower pressures than current membranes require. The low energy requirement and high water flow rate of aquaporins are essential to realizing cost-effective water purification membranes. In addition to enhanced energy efficiency, a membrane design driven by manufacturability contributes to cost-competitiveness, setting these membranes apart from traditional, more expensive desalination processes.

In a unique approach, Ingenuity Lab is applying technology and expertise from a variety of disciplines to actively solve some of the world’s most pressing environmental challenges, including here in Alberta, where this technology could be used to reduce the environmental impact of extracting bitumen from the oil sands. Reducing the environmental impact of oil sands mining will allow this valuable resource to continue to be used for years to come.

References

[1] Kulshreshtha, S. N. A global outlook for water resources to the year 2025. Water Resour. Manag. 1998, 12(3), 167-184.
[2] Fritzmann, C.; Löwenberg, J.; Wintgens, T.; Melin, T. State-of-the-art of reverse osmosis desalination. Desalination 2007, 216, 1-76.
[3] Elimelech, M.; Phillip, W. A. The future of seawater desalination: energy, technology, and the environment. Science 2011, 333(6043), 712-717.
[4] Montemagno, C. D.; Schmidt, J. J.; Tozzi, S. P. Biomimetic Membranes. U.S. Patent 7,208,089 B2, 24 April 2007.
[5] Borgnia, M.; Nielsen, S.; Engel, A.; Agre, P. Cellular and molecular biology of the aquaporin water channels. Annu. Rev. Biochem. 1999, 68, 425-458.

Source: Ingenuity Lab
Read more: Novel water treatment technology surfaces at Ingenuity Lab: http://www.nanowerk.com/spotlight/spotid=34964.php

 

 

High-tech materials purify water with sunlight


Presented at a meeting of the American Chemical Society, March 16, 2014

 

DALLAS, March 16, 2014 — Sunlight plus a common titanium pigment might be the secret recipe for ridding pharmaceuticals, pesticides and other potentially harmful pollutants from drinking water. Scientists combined several high-tech components to make an easy-to-use water purifier that could work with the world’s most basic form of energy, sunlight, in a boon for water purification in rural areas or developing countries.

The talk was one of more than 10,000 presentations at the 247th National Meeting & Exposition of the American Chemical Society (ACS), the world’s largest scientific society, taking place here through Thursday.

Anne Morrissey, Ph.D., explained that the new technology could someday be incorporated into an easy-to-use consumer product that would remove these stubborn pollutants from drinking water as a final step after it has already been treated with conventional methods.

Her group at Dublin City University in Ireland started with a compound called titanium dioxide (TiO2), a powder used to whiten paints, paper, toothpaste, food and other products. With the right energy, TiO2 can also act as a catalyst — a molecule that encourages chemical reactions — breaking down unwanted compounds in drinking water like pesticides and pharmaceuticals. Morrissey explained that modifying current water treatment methods to get rid of these potentially harmful species can be costly and energy-intensive, and often, these modifications don’t completely eliminate the pollutants.

But Morrissey said TiO2 is usually only activated by ultraviolet light, which is produced by special bulbs. To access titanium dioxide’s properties with the sun’s light, Morrissey and her group experimented with different shapes of TiO2 that would better absorb visible light. She found that nanotubes about 1,000 times thinner than a human hair were best, but they couldn’t do it on their own.

That’s why she turned to graphene, a material made of sheets of carbon just one atom thick. “Graphene is the magic material, but its use for water treatment hasn’t been fully developed,” she said. “It has great potential.” Morrissey put the TiO2 nanotubes on these graphene sheets. Pollutants stuck to the surface of the graphene as they passed by, allowing TiO2 to get close enough to break them down.

Her research group successfully tested the system on diclofenac, an anti-inflammatory drug notorious for wiping out nearly an entire vulture population in India.

“We’re looking at using the graphene composite in a cartridge for one-step drinking water treatment,” said Morrissey. “You could just buy a cartridge off the shelf and plop it into the pipe where the drinking water comes into your house.” The cartridge system would also ensure that the graphene stays immobilized and does its job without contaminating the clean water.

Morrissey noted, however, that the technology will never be strong enough to completely clean drinking water on its own. Rather, she sees it as a polishing step after traditional water treatment processes to mop up the most insidious pollutants.

That could be especially useful in her home country, where she said many rural communities use small water treatment systems that only supply a few dozen homes. Because they don’t have the infrastructure that large-scale urban treatment plants do, she thinks that a cartridge that could clean with only the sun’s energy could help make their water safer.

Ultimately, Morrissey said there are still many questions to answer before declaring her TiO2-graphene system a success. One of the biggest is making sure that when it breaks down pollutants, it is producing harmless byproducts. She also wants to make sure that the energy required for the system compares favorably to simply using TiO2 with ultraviolet light. But so far, she reported, her design seems to be easier to make and dispose of than other visible-light activated TiO2 purifiers.

The authors wish to acknowledge the financial support of the Marie Curie Initial Training Network funded by the EC FP7 People Programme, ATWARM (Advanced Technologies for Water Resource Management); Tyndall National Institute, Ireland, for their support through the SFI-funded National Access Programme (NAP407); and the Environmental Protection Agency STRIVE program.

The American Chemical Society is a nonprofit organization chartered by the U.S. Congress. With more than 161,000 members, ACS is the world’s largest scientific society and a global leader in providing access to chemistry-related research through its multiple databases, peer-reviewed journals and scientific conferences. Its main offices are in Washington, D.C., and Columbus, Ohio.

Green Fracking? 5 Technologies for Cleaner Shale Energy



Patrick J. Kiger for National Geographic

Published March 19, 2014

It may seem strange to hear the words “fracking” and “environmentally friendly” in the same sentence.

After all, hydraulic fracturing, or fracking, in which high-pressure chemically treated water is used to crack rock formations and release trapped oil and gas, is a dirty term to many environmentalists. Critics decry the practice for consuming vast amounts of fresh water, creating toxic liquid waste, and adding to the atmosphere’s greenhouse gas burden, mostly because of increased risk of leaks of the potent heat-trapping gas, methane. (See related quiz, “What You Don’t Know About Natural Gas.”)

James Hill, chief executive of the Calgary, Alberta-based energy services firm GasFrac, is one of a handful of technology pioneers determined to change that. Hill’s company has introduced a new fracking method that uses no water at all. Instead, GasFrac uses a gel made from propane—a hydrocarbon that’s already naturally present underground—and a combination of what it says are relatively benign chemicals, such as magnesium oxide and ferric sulfate, a chemical used in water treatment plants. Over the past few years, GasFrac has used the process 2,500 times at 700 wells in Canada and the United States.

“We’re actually using hydrocarbons to produce hydrocarbons,” Hill said. “It’s a cycle that’s more sustainable.”

GasFrac is one of a growing number of companies, including giant GE and the oil services firm Halliburton, that are pioneering technological improvements to mitigate some of the environmental downsides to the process that has spurred a North American energy boom. (See Interactive, “Breaking Fuel From Rock.”) Besides GasFrac’s water-free method, other companies are working on ways to use recycled frack water or non-potable brine in fracking. Some are working on replacing harsh chemicals used in the process with more benign mixtures, or on cleansing water that’s been used in fracking. Other innovators are looking to replace diesel-powered drilling equipment with engines or motors powered by natural gas or solar energy, and to find ways to detect and seal leaks that allow methane, a potent greenhouse gas, to escape.

Such efforts have even won cautious support from some environmental activists, who’ve decided that it may be more realistic to mitigate the consequences of fracking than to fight its use.

“Natural gas is a potential energy bounty for the country, and development is probably inevitable,” said Ben Ratner, a project manager for the nonprofit Environmental Defense Fund.  (See related “Interactive: Breaking Fuel From Rock” and “The Great Shale Gas Rush.”) “That’s why we’re investing our energy into doing everything, from science to policy to working with companies, to maximize the potential climate advantage that gas has over coal, and minimize the risk to public health and the environment. We think natural gas can be an exit ramp from coal, but we have to do it right.” (See related, “U.S. Energy-Related Carbon Emissions Fall to an 18-Year Low,” and Natural Gas Nation: EIA Sees U.S. Future Shaped by Fracking.”)

Here are a few of the efforts to make fracking greener:

Water-Free Fracking: GasFrac’s fracking system, which uses a gelled fluid containing propane, has other advantages besides eliminating the need for water, according to Hill. Because the gel retains sand better than water, it’s possible to get the same results with one-eighth the liquid and to pump at a slower rate. Because GasFrac says the amount of hydrocarbon in the gel is comparable to what’s in the ground, the fluid can simply merge into the flow being extracted from the ground, eliminating the need to drain contaminated wastewater and haul it away in trucks for disposal, usually at deep-well injection sites. “We present a much smaller footprint,” he said. (See related, “Fracking Waste Wells Linked to Ohio Earthquakes.”)

Using Recycled Water or Brine: While fracking typically uses freshwater, industry researchers have worked to perfect friction-reducing additives that would allow operators to use recycled “gray” water or brine pumped from underground. Halliburton’s UniStim, which went on the market about a year ago, can create a highly viscous fluid from any quality of water, according to Stephen Ingram, the company’s technology manager for North America. In northeastern Canada, one producer has tapped into a deep subsurface saline water aquifer for a portion of its supplies for hydraulic fracturing.

Eliminating Diesel Fumes: The diesel-powered equipment used in drilling and pumping wells can be a worrisome source of harmful pollutants such as particulates, as well as carbon emissions that contribute to global warming. And diesel fuel is expensive. Last year, Apache, a Houston-based oil and gas operator, announced it would become the first company to power an entire fracking job with engines using natural gas. In addition to reducing emissions, the company cut its fuel costs by 40 percent. Halliburton has introduced another innovation, the SandCastle vertical storage silo for the sand used in fracking, which is powered by solar panels. The company also has developed natural-gas-powered pump trucks, which Ingram said can reduce diesel consumption on a site by 60 to 70 percent, resulting in “a sizable reduction in both emissions and cost.”

Drainage water pond, Texas

PHOTOGRAPH BY DENNIS DIMICK, NATIONAL GEOGRAPHIC
Drainage water pours into a settling pond near the booming oil fields of the Midland-Odessa region of West Texas.

Treating Wastewater: At hydraulic fracturing sites, the amount of wastewater typically far exceeds the amount of oil produced. The fluid that returns to the surface through the well bore is not only the chemically treated frack water, but water from the rock formation that can contain brines, metals, and radionuclides. (See related, “Forcing Gas Out of Rock With Water.”) That wastewater must be captured and stored on site, and then often is shipped long distances to deep well injection underground storage facilities. There have been few treatment options. But Halliburton has developed the CleanWave treatment system, which uses positively charged ions and bubbles to remove particles from the water at the fracking site. Last September, GE and its partner Memsys also tested a new on-site treatment system that allows the water to be reused without being diluted with freshwater, by employing a desalination process called membrane distillation. (See related Quiz: What You Don’t Know About Water and Energy.)

Plugging Methane Leaks: A major fracking concern has been whether companies are allowing a significant amount of natural gas to escape, because methane—the main component of natural gas—is a potent greenhouse gas, 34 times stronger than carbon dioxide (CO2). A recent study concluded U.S. methane emissions are likely 50 percent higher than official government estimates. (See related, “Methane Emissions Far Worse Than U.S. Estimates.“) New U.S. Environmental Protection Agency regulations that go into effect next year will require that all U.S. oil and gas sites have equipment designed to cut a wide range of pollutants, a step that the agency expects will cut methane. (See related, “Air Pollution From Fracked Wells Will Be Regulated Under New U.S. Rules.”)

Methane emissions from onshore oil and natural gas production could be reduced by 40 percent by 2018, at a cost that’s the equivalent of just one cent per thousand cubic feet of natural gas produced, concludes a just-released study, conducted by Fairfax, Va.-based consulting firm ICF International for the Environmental Defense Fund. EDF’s Ratner said that inspectors equipped with infrared cameras can spot leaks at fracking sites, which can then be plugged. “The cameras cost about $80,000 to $100,000 apiece,” he noted. “But that can pay for itself, because the more leaks you fix, the more gas you have to sell.” (See related blog post: “Simple Fixes Could Plug Methane Leaks From Energy Industry, Study Finds.”)

Another improvement that can reduce methane emissions: Replacing conventional pressure-monitoring pneumatic controllers, which are driven by gas pressure and vent gas when they operate. A U.S.-wide move to lower-bleed designs could reduce emissions by 35 billion cubic feet annually. And by switching out the conventional chemical injection pumps used in the fracking process, which are powered by gas pressure from the wells, and replacing them with solar-powered pumps, operators could eliminate an estimated 5.9 billion cubic feet of methane emissions annually, the EDF report concludes.

The Cost-Benefit Equation

Some solutions do not require advanced technology. A study released Wednesday by the Boston-based Clean Air Task Force suggests that almost all of the methane leaks from the oil and gas infrastructure could be reduced at relatively little expense, often by simply tightening bolts or replacing worn seals.

A number of greener fracking technologies already are being implemented, according to industry officials. But one obstacle is economic. The newer, more environmentally friendly technologies generally cost more than the legacy equipment they would replace. Extracting natural gas with water-free fracking, for example, could cost 25 percent more than conventional fracking, according to David Burnett, a professor of petroleum engineering at Texas A&M University who heads that school’s Environmentally Friendly Drilling Systems Program. He said that switching fracking equipment from diesel to natural gas is the innovation that’s catching on most rapidly, because it provides a clear economic benefit as well as helping to lower carbon emissions. With the rising cost of renting fracking rigs, companies are eager to find improvements that will reduce their costs, he said.

Green fracking is “the same as with any industry—if you come out with a game-changing technology, you can get in the market first and ride that,” Burnett said.  (See related, “Can Natural Gas Bring Back U.S. Factory Jobs?“)

But Halliburton’s Ingram said that innovations such as chemical treatments to make brine usable will drop in price as the technology is perfected. “Eventually it will become the lower-cost chemistry,” he said.

A more difficult hurdle might be overcoming what Ingram calls “sociopolitical constraints” around the country. One major issue that reduces incentives to invest in green fracking innovations: the generally low price of freshwater. (See related, “Water Demand for Energy to Double by 2035.”)

This story is part of a special series that explores energy issues. For more, visit The Great Energy Challenge.

Renewable Energy Sources Vs. Cheap NG from Fracking .. And the Winner Is?


Has the Advanced Research Projects Agency–Energy failed in its mission to create alternative energy breakthroughs?

By David Biello

ARTIFICIAL LEAF: Sun Catalytix hoped to turn its sunlight-and-split-water system into a cheap source of power for homes.
A single bottle of dirty water transformed into the power source for a home—such was the promise of a technology package that became known as the “artificial leaf.” And such was the vision introduced by its inventor, Daniel Nocera, at the inaugural summit of the Advanced Research Projects Agency–Energy in 2010.

The artificial leaf pledged to store 30 kilowatt-hours of electricity after a mere four hours of splitting water into oxygen and hydrogen, or enough to power an average American “McMansion” for a day. It was exactly the kind of “high-risk, high-reward” technology touted by President Obama when he launched the agency in 2009 (an idea carried over from the George W. Bush era).
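As a back-of-the-envelope check on that claim, the sketch below converts 30 kWh into the hydrogen and water involved, using the standard lower heating value of hydrogen (~120 MJ/kg) and the stoichiometry of water splitting; it ignores the losses of a real fuel cell, so an actual system would need somewhat more water.

```python
# Rough check of the 30 kWh "bottle of water" claim (fuel-cell losses ignored).
KWH_TO_MJ = 3.6
H2_LHV_MJ_PER_KG = 120.0   # lower heating value of hydrogen
WATER_KG_PER_KG_H2 = 8.94  # 2 H2O -> 2 H2 + O2: ~8.94 kg of water per kg of H2

energy_mj = 30 * KWH_TO_MJ                 # 108 MJ
h2_kg = energy_mj / H2_LHV_MJ_PER_KG       # ~0.9 kg of hydrogen
water_litres = h2_kg * WATER_KG_PER_KG_H2  # ~8 kg of water, i.e. ~8 litres
print(f"~{h2_kg:.1f} kg of H2 from ~{water_litres:.0f} litres of water")
```

So the “single bottle” is really a few large bottles’ worth of water even before conversion losses, but the order of magnitude of the claim holds.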

Such technologies could help with the country’s energy, environmental and economic security by creating new industries and jobs as well as by reducing the pollution associated with energy production and use today. More succinctly, “ARPA–E turns things that are plausible into things that are possible,” proclaimed Acting Director Cheryl Martin at the 2014 summit.

Out of 37 projects that received initial ARPA–E funding, Sun Catalytix, a company founded by Nocera, was the poster child—or rather video favorite—featured in a U.S. Department of Energy (DoE) clip talking up the potential of transformational change. “Almost all the solar energy is stored in water-splitting,” intoned Nocera, a Massachusetts Institute of Technology chemist, at the inaugural ARPA–E summit. “Shift happens.”

The artificial leaf proved to be possible but implausible, however. It won’t be splitting water using sunlight on a mass scale anytime soon, its hydrogen dreams blown away by a gale of cheap natural gas that can also be easily converted to the lightest element.

So Sun Catalytix has set the artificial leaf aside and shifted focus to flow batteries, rechargeable fuel cells that use liquid chemistry to store electricity. A better flow battery might not shift the fundamental fuel of the American dream but it could help utilities cope with the vagaries of wind and solar power—and is more likely to become a salable product in the near future.

Five years in, ARPA–E’s priorities have shifted, too, for the same reason. The cheap natural gas freed from shale by horizontal drilling and hydraulic fracturing (or fracking) has helped kill off bleeding-edge programs like Electrofuels, a bid to use microbes to turn cheap electricity into liquid fuels, and ushered in programs like REMOTE, a bid to use microbes to turn cheap natural gas into liquid fuels. Even at the first summit in 2010, so full of alternative energy promise, this gassy revolution was becoming apparent.

Consulting firm Black & Veatch predicted that burning natural gas would provide nearly half of all U.S. electricity by 2034, a forecast fulfilled a few decades early in 2012. “We’ve got a lot of cheap gas,” said ARPA–E Program Director Dane Boysen at the 2014 summit. “The question is: What should we do with it?”

Boysen’s answer is the Methane Opportunities for Vehicular Energy, or MOVE, program: cars that run on natural gas or better batteries. Is enabling the energy predominance of another fossil fuel the kind of transformation ARPA–E was created to deliver, or a sign that the agency is failing?

The measure of success

ARPA–E points to follow-on funding from other entities (whether corporate, government or venture capital) as an early measure of its success. So far, the agency has invested more than $900 million in 362 different research projects. Of those projects, 22 have garnered an additional $625 million from capitalists of one type or another; it is a group that includes Sun Catalytix.

ARPA–E funding has also allowed 24 projects to form spin-off companies whereas 16 projects have found a new funding source from other government agencies, including the DoE, which runs ARPA–E, and the Department of Defense.

The biggest successes include Makani Power, which makes souped-up kites for wind power and was acquired by Google after ARPA–E invested $6 million in developing the technology. There’s also Ambri, which makes liquid-metal batteries for cheap energy storage on a massive scale and is now developing units capable of storing 20 kilowatt-hours for testing later this year.

And there’s 1366 Technologies, which became the first (and only, at that time in 2009) ARPA–E grantee in photovoltaics with a new manufacturing method that wastes less silicon. The company will begin construction this year on its first factory.

The outright failures have been mostly less prominent: algae breeding for biofuels and various carbon dioxide capture technologies, along with efforts to knit together hydrocarbons from sunshine, carbon dioxide and water. But some have proved more conspicuous. ARPA–E feted a would-be breakthrough battery maker named Envia in 2012. But by 2014, while at least one of the entrepreneurs backing the company still mingled in the summit’s halls at the Gaylord National Resort & Convention Center in Maryland, Envia was mired in lawsuits and failed to deliver the energy-dense batteries it promised to General Motors.

“I don’t call them failures, I call them opportunities to learn,” argued ARPA–E’s first director, Arun Majumdar, in a 2012 interview with Scientific American about failed projects in general. “If 100 percent of these projects worked out, we’re not doing our job.”

ARPA–E is definitely doing its job then: Biofuels haven’t quite delivered on their promise, even engineering tobacco plants for oil, while electrofuels were a “crazy-ass idea,” to use a term employed by William Caesar, president of the recycling business at Waste Management, at the 2014 summit to describe some of the concepts his company has evaluated for investment.

And ARPA–E’s budget has always been too small to tackle innovation in certain areas. “My real hope was to have enough of a budget to try out something different than what we are doing in the nuclear field today,” such as a prototype for a new kind of reactor, Majumdar said in a 2013 interview with Scientific American. “If you’re solving for climate change and you’re a serious person, your strategy starts with nuclear,” said David Crane, CEO of the electric utility NRG, at this year’s summit.

But ARPA–E’s budget has always been too small to encompass, for example, the hundreds of millions of dollars Crane lost during his tenure in a failed bid to build new standard nuclear reactors in Texas.

An analysis of the biggest programs by year and funding shows that electrofuels drew the biggest investment (at more than $41 million) in fiscal year 2010, followed by better hardware and software for the U.S. grid to help integrate renewables in 2011.

But in fiscal year 2012 the biggest tranche of funding to a single program went to Boysen’s MOVE projects (roughly $30 million) and, in fiscal year 2013, just behind the $36 million invested in better batteries for electric cars, was the REMOTE program of projects garnering $34 million. “It could have a small environmental footprint,” argues Ramon Gonzalez, program director for Reducing Emissions using Methanotrophic Organisms for Transportation Energy (REMOTE). “We can develop something that is a bridge to renewable energy or even is renewable itself in the future.”

Natural gas hardly seems to need ARPA–E’s help to become ubiquitous. And although natural gas can help with climate change in the short term—displacing coal that emits even more pollution when burned to generate electricity—in the long run it, too, is a fossil fuel and a greenhouse gas itself. Burning methane for electricity will also one day require capturing and storing the resulting carbon dioxide in order to combat climate change.

ARPA–E has not succeeded in delivering a technological breakthrough that would allow that to happen cheaply or efficiently, despite investing more than $30 million in its Innovative Materials and Processes for Advanced Carbon Capture Technologies (IMPACCT) back in 2010. “ARPA–E needs to revisit carbon capture and storage,” said Michael Matuszewski of the National Energy Technology Laboratory at this year’s summit.  

Long game

Significant changes in energy sources—say from wood to coal or the current shift from coal to gas—take at least 50 years, judging by the record to date. “Looking at the climate risk mitigation agenda, we don’t have 50 to 60 years,” U.S. Secretary of Energy Ernest Moniz argued at the 2014 summit. “We have to cut [that time] in half,” and that will require breakthrough technologies that are cheaper, cleaner and faster to scale.

It is also exactly in times of overreliance on one energy source that funding into alternatives is not only necessary, but required. ARPA–E should continue to focus on transformational energy technologies that can be clean and cheap, even if political pressures incline the still young and potentially vulnerable agency to look for a better gas tank. After all, if ARPA–E and others succeed in finding ways to use ever-more natural gas, new shale supplies touted to last for a century at present consumption rates could be exhausted much sooner.

“Before this so-called ‘shale gale’ came upon us, groupthink had most of us focusing on energy scarcity,” warned Alaska Sen. Lisa Murkowski (R) at the 2013 summit. “The consensus now is one of abundant energy. Don’t fall into the trap of groupthink again.”

Failure is a necessary part of research at the boundaries of plausibility. As ARPA–E’s Martin said at this year’s summit: “It’s part of the process.” Many of the ideas the agency first funded were ideas that had sat unused on a shelf since the oil crisis of the 1970s.

And the ideas that go back on the shelf now, like the artificial leaf, provide the basic concepts—designer, metal-based molecules—for new applications, like flow batteries.

The artificial leaf, for one, could benefit from ARPA–E or other research to bring down the cost of the photovoltaic cells that provide the electricity to allow the leaf’s designer molecules to do their water-splitting work. Already, cheaper photovoltaics may be ushering in an energy transition of their own, cropping up on roofs across the country from California to New Jersey.

When such renewable sources of energy become a significant source of electricity, more storage will be needed for when the sun doesn’t shine or the wind doesn’t blow—and that storage needs to be cheap and abundant. In Germany, where the wind and sun now provide roughly one quarter of all that nation’s electricity, the long-term plan is to convert any excess to gas that can then be burned in times of deficit—so-called power to gas, which is a fledgling technology at best. And why couldn’t clean hydrogen be that gas, as Nocera has suggested?

So the artificial leaf bides its time, while research continues at the Joint Center for Artificial Photosynthesis established with DoE money in California. Failure is an investment in future success. “The challenge is not that the technology doesn’t work, but the economics don’t work,” observed Waste Management’s Caesar at the 2014 ARPA–E Summit. “I don’t like to talk about dead ends. There are things that their time just hasn’t come yet.”

Read “The Clean Energy Wars” here:

http://www.scientificamerican.com/report/the-clean-energy-wars/

University of Alberta: Oil Sands: The “Art” of Remediation and Sustainability



Aaron Veldstra is performing March 19 as part of U of Alberta Water Week 2014.

Performance art project explores remediation and sustainability in the resource industry.

By Michael Brown

(Edmonton) At some point along the creative process, the waste left over from Aaron Veldstra’s various projects began to weigh on him. “Through creating, I realized I was making all these buckets of dirty water and just pouring them down the sink. I sort of became uncomfortable with the fact that I was just pouring my waste down the sink and it was disappearing, making it someone else’s problem,” said the first-year master of fine arts student in the Department of Art and Design. “Out of sight, out of mind—I was basically pushing the problem to a different space so I didn’t have to see it anymore.”

Veldstra, who has spent a decade of summers in the reclamation industry as a tree planter, says his first instinct was to find a way to reclaim his waste water, an interesting process in its own right that engaged his artistic side further. What has emerged is an early incarnation of a performance art project entitled Experiments in Artistic Hydrology, in which Veldstra attempts to engage people in a conversation about oil and the oil industry in Alberta using the concepts of remediation and sustainability.

“These terms get thrown around quite a bit, but what do they really mean? That’s the question that I’m asking or provoking through these acts.” The acts in question start with Veldstra marking his wall-sized canvas—two sheets of drywall—with a series of lines representing geographical data sets, such as pipelines, roads, cutlines and power lines, related to oil exploration and the resource industry in northern Alberta.

Then, Veldstra applies thick beads of black ink using a syringe to trace along the data set lines. The resulting lines and drips are then sponged off using a combination of water and baking soda. “What I have is a bucket of dirty water, which I then filter using sand in a series of buckets,” said Veldstra, who models the filtration system on the way most municipalities filter their citizens’ drinking water. “In the end, what I have is essentially clean water and a bunch of dirty sand. The end result is that when you clean something, you always make something else dirty.”

Veldstra says the project isn’t solely a critique of remediation and sustainability of oil producers, but also our reliance on oil in general. “We’re all implicated in our use of oil. It’s not specifically about oil companies, it’s about everybody: everybody wants to drive a car, everything we do involves oil in some way,” he said. “I’m using the oilsands as this contestable thing, but I’m not specifically talking about the giant holes we see north of Fort McMurray—it’s all of us.

“I’m just trying to broaden this conversation a little bit more, engage people through the act of doing something weird.”

Watch Aaron Veldstra’s performance

Veldstra is performing as part of UAlberta Water Week in the PCL Lounge of the Centennial Centre for Interdisciplinary Science on March 19 from 5–9 p.m.

March 17–22 is UAlberta Water Week, a campus celebration of water leading up to UN World Water Day. The theme of this year’s events is “Exploring Sustainable Practices for Water and Energy.” Events are free and open to the public.

Learn more about UAlberta Water Week:

http://water.ualberta.ca/WaterWeek.aspx

New Nano-Composite Material Has Great Potential Application as a Sensitive and Selective Biosensor for Glucose


Ni3S2/carbon nanotube nanocomposite as an electrode material for the hydrogen evolution reaction in alkaline electrolyte and for enzyme-free glucose detection.

 

 

Highlights

  • The Ni3S2/CNT nanocomposite can be easily synthesized using hydrothermal method.
  • The HER performance of Ni3S2/CNT nanocomposite is activated by base treatment.
  • The HER activity of nanocomposite is correlated to Ni3S2 morphology on CNTs.
  • Ni3S2 and conductive CNTs impart high HER activity and stability to the catalyst.
  • The composite electrode exhibits high catalytic activity toward glucose oxidation.


Abstract

In this study, a nanocomposite of Ni3S2 and multi-walled carbon nanotubes (MWCNTs) with high catalytic activity toward the hydrogen evolution reaction (HER) and glucose oxidation was synthesized using a glucose-assisted hydrothermal method. Ni3S2 nanoparticles with diameters ranging from 10 to 80 nm were highly dispersed over the conductive MWCNT surface.

A series of linear polarization measurements suggested that the HER activity of the Ni3S2/MWCNT nanocomposite increased as the Ni3S2 loading on the MWCNTs decreased, with an optimal loading of 55 wt%. Furthermore, immersing the composite catalyst in a concentrated KOH solution induced a morphological change in the Ni3S2 nanoparticles on the MWCNTs, which increased the active surface area of the composite electrode.

As a result, the KOH-treated composite electrode showed a higher HER activity than the other electrodes. For example, its exchange current density was ca. 395 times and 1.6 times larger than that of the bare Ni3S2 electrode and the as-synthesized composite, respectively. Furthermore, impedance measurements showed that the KOH-treated composite electrode had a smaller HER charge-transfer resistance than the Ni3S2 electrode.

Based on the slopes of the electrodes’ Arrhenius curves, the estimated HER activation energy of the KOH-treated composite electrode (71.8 kJ/mol) was only about one-third that of the pure Ni3S2 electrode. The high catalytic activity of the KOH-treated composite electrode stemmed from the synergistic effect of the large active surface area of the Ni3S2 nanoparticles and their excellent electrical coupling to the conductive MWCNT network.
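The activation energy quoted above is the kind of number extracted from an Arrhenius plot: the exchange current density i0 is measured at several temperatures, and the slope of ln(i0) versus 1/T equals -Ea/R. The sketch below illustrates that procedure on synthetic data generated from the reported 71.8 kJ/mol; the temperature points and pre-exponential factor are invented for illustration and are not the paper’s measurements.

```python
# Illustrative Arrhenius analysis: the slope of ln(i0) vs 1/T gives -Ea/R.
# The "data" below are synthetic, generated from Ea = 71.8 kJ/mol.
import numpy as np

R = 8.314                  # gas constant, J/(mol K)
EA_TRUE = 71.8e3           # J/mol, the value reported in the abstract
A = 1.0e8                  # arbitrary pre-exponential factor for the synthetic data

T = np.array([298.0, 308.0, 318.0, 328.0])   # temperatures, K
i0 = A * np.exp(-EA_TRUE / (R * T))          # synthetic exchange current densities

slope, _ = np.polyfit(1.0 / T, np.log(i0), 1)
print(f"Estimated Ea = {-slope * R / 1e3:.1f} kJ/mol")   # recovers ~71.8
```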

More importantly, the current density of the KOH-treated composite electrode showed no sign of degradation after 1000 continuous cycles in a 1 M KOH solution at 323 K. In addition, the nanocomposite of Ni3S2 and MWCNTs was proposed for the first time as an enzyme-free glucose sensor.

The Ni3S2 nanoparticles on the MWCNTs exhibited high electrocatalytic activity toward glucose oxidation and were insensitive to uric acid and ascorbic acid. Furthermore, the catalytic current of the composite electrode was linearly dependent on glucose concentration in the range from 30 to 500 μM, with a sensitivity as high as 3345 μA/mM.
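Within that linear range the reported sensitivity acts as a simple calibration factor, current ≈ sensitivity × concentration (background current neglected). The sketch below applies it to a hypothetical 100 μM sample; the excerpt does not say whether the sensitivity is normalized per electrode area, so the figure is treated here simply as μA per mM.

```python
# Hypothetical use of the reported calibration: I ~ sensitivity * concentration.
SENSITIVITY_UA_PER_MM = 3345.0     # reported sensitivity, in uA per mM of glucose
LINEAR_RANGE_MM = (0.030, 0.500)   # reported linear range, 30-500 uM expressed in mM

def expected_current_ua(glucose_mm: float) -> float:
    lo, hi = LINEAR_RANGE_MM
    if not lo <= glucose_mm <= hi:
        raise ValueError("concentration outside the reported linear range")
    return SENSITIVITY_UA_PER_MM * glucose_mm   # background current neglected

print(f"~{expected_current_ua(0.100):.0f} uA expected at 100 uM glucose")   # ~334 uA
```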

The present work suggested that the nanocomposite of Ni3S2 and MWCNTs not only serves as an inexpensive, highly active and stable electrode material for alkaline water electrolysis, but also shows great potential as a highly sensitive and selective glucose biosensor.


Keywords

  • Nickel sulfide;
  • Carbon nanotubes;
  • Nanocomposite;
  • Electrocatalyst;
  • Hydrogen evolution

‘Vanishing’ Electronics and Unlocking Nanomaterials’ Power Potential


Brain sensors and electronic tags that dissolve. Boosting the potential of renewable energy sources. These are examples of the latest research from two pioneering scientists selected as this year’s Kavli lecturers at the 247th National Meeting & Exposition of the American Chemical Society (ACS).

 

The meeting features more than 10,000 presentations from the frontiers of chemical research, and is being held here through Thursday. Two of these talks are supported by The Kavli Foundation, a philanthropic organization that encourages basic scientific innovation. These lectures, which are a highlight of the conference, shine a spotlight on the work of both young and established researchers who are pushing the boundaries of science to address some of the world’s most pressing problems.

Tackling health and sustainability issues simultaneously, John Rogers, Ph.D., is developing a vast toolbox of materials—from magnesium and silicon to silk and even rice paper—to make biodegradable electronics that can potentially be used in a range of applications. He will deliver “The Fred Kavli Innovations in Chemistry Lecture.”

“What we’re finding is that there’s a robust and diverse palette of material options at every level,” said Rogers, who’s with the University of Illinois, Urbana-Champaign. “For the conductor, for the semiconductor, for the insulating layer and the package and the substrate, one can pick and choose materials depending on the application’s requirements.”

Rogers’ team is working to incorporate some of these elements in sensors that can, for example, detect the early onset of swelling and temperature changes in the brain after head injuries and then vanish when they’re no longer needed. Today, devices designed for these purposes are wired—they have to be implanted and later completely removed once they’re no longer needed. Rogers’ sensor could be implanted but work wirelessly and, after use, “simply disappear.” That eliminates the risk of infection and other complications associated with having to remove devices surgically. Rogers has successfully tested early prototypes of sensors in laboratory animals and envisions that such devices could be used one day in human patients.

His group is also working on biodegradable radio-frequency identification tags, or RFID tags. Currently, RFIDs are produced by the billions and used in everything from jeans, where they help track inventory, to smart cards; they are even injected into pets. They are also found in product packaging that ends up in landfills. Using cellulose, zinc and silicon, Rogers has successfully made dissolvable RFID tags in the lab. The next step is figuring out how to scale up production and commercialize the technology.

“We’re quite optimistic,” Rogers said. “We see the way forward and are about halfway there.”

Delivering “The Kavli Foundation Emerging Leader in Chemistry Lecture” is Emily Weiss, Ph.D., of Northwestern University. Her lab is focused on getting the most power possible out of mixed and matched nanomaterials that are being developed to maximize energy conversion. Scientists can now engineer these materials with unprecedented precision to capture large amounts of energy—for example, from the sun and heat sources. But getting all that energy from these materials and pushing it out into the world to power up homes and gadgets have been major obstacles.

“Electric current originates from the movement of electrons through a material,” Weiss explained. “But as they move through a material or device, they encounter places where they have to jump from one type of material to another at what’s called an interface. By interfaces, I mean places where portions of the material that are not exactly alike meet up. The problem is when an electron has to cross from one material to another, it loses energy.”

As structures in materials get smaller, the interface problem becomes amplified because nanomaterials have more surface area compared to their volume. So electrons in these advanced devices have to travel across more and more interfaces, and they lose energy as heat every time.

But thanks to the latest advances in analytical instruments and computing power, Weiss’ group is poised to turn this disadvantage into a plus. “Rather than seeing all these interfaces as a negative, now we don’t need to consider it a drawback,” she said. “We can design an interface such that we can get rid of defects and get rid of this slowdown. We can actually use carefully designed interfaces to enhance the properties of your device. That sort of philosophy is starting to take hold.”


More information: 1. Biodegradable electronics, John Rogers, Ph.D.:

Abstract

A remarkable feature of the modern integrated circuit is its ability to operate in a stable fashion, with almost perfect reliability. Recently developed classes of electronic materials create an opportunity to engineer the opposite outcome, in the form of devices that dissolve completely in water, with harmless end products. The enabled applications range from ‘green’ consumer electronics to bio-resorbable medical implants – none of which would be possible with technologies that exist today. This talk summarizes recent work on this physically ‘transient’ type of electronics, from basic advances in materials chemistry, to fundamental studies of dissolution reactions, to engineering development of complete sets of device components, sensors and integrated systems. An ‘electroceutical’ bacteriocide designed for treatment of surgical site infections provides an application example.

2. Behavior of electrons at nanoscopic organic/inorganic interfaces, Emily Weiss, Ph.D.:

Abstract

The behavior of electrons and energy at interfaces between different types or phases of materials is an active research area of both fundamental and technological importance. Such interfaces often result in sharp free energy gradients that provide the thermodynamic driving force for some of the most crucial processes for energy conversion: migration of energy and charge carriers, conversion of excited states to mobile charge carriers, and redox-driven chemical reactions. Nanostructured materials are defined by high surface area-to-volume ratios, and should therefore be ideal for the job of energy conversion; however, they have a structural and chemical complexity that does not exist in bulk materials, and which presents a formidable challenge: mitigate or eliminate energy barriers to electron and energy flux that inevitably result from forcing dissimilar materials to meet in a spatial region of atomic dimensions. Chemical functionalization of nanostructured materials is perhaps the most versatile and powerful strategy for controlling the potential energy landscape of their interfaces, and for minimizing losses in energy conversion efficiency due to interfacial structural and electronic defects. Using metal and semiconductor nanoparticles as model systems, this talk will explore the power of tuning the chemistry at the organic-inorganic interface within colloidal semiconductor and metal nanoparticles as a strategy for controlling their structure and properties.

 

Read more at: http://phys.org/news/2014-03-electronics-nanomaterials-power-potential.html#jCp

New nanoparticle that only attacks cervical cancer cells


One of the most promising technologies for the treatment of various cancers is nanotechnology: creating drugs that directly attack cancer cells without damaging other tissues. The Laboratory of Cellular Oncology at the Research Unit in Cell Differentiation and Cancer of the Faculty of Higher Studies (FES) Zaragoza, UNAM (National Autonomous University of Mexico), has developed a therapy to attack cervical cancer tumors.

The treatment, which has been tested in animal models, consists of a nanostructured composition encapsulating a protein called interleukin-2 (IL-2) that is lethal to cancer cells.

According to the researcher Rosalva Rangel Corona, head of the project, the antitumor effect of interleukin-2 in cervical cancer arises because the tumor cells express receptors for interleukin-2 that “fit together” with the protein like puzzle pieces, activating an antitumor response.

The scientist explains that the nanoparticle works as a bridge of antitumor activation between tumor cells and T lymphocytes. The nanoparticle carries interleukin-2 on its surface, so when the protein is nearby it acts as a switch, making contact with the cancer cell, binding to its receptor and carrying out its biological action.

Furthermore, the nanoparticle concentrates interleukin-2 at the tumor site, which allows it to accumulate near the growing tumor. Rather than circulating in the bloodstream, it stays where it is needed, “out there” in action.

Administering IL-2 with the nanovector reduces the side effects this protein causes when given to the body in large amounts. These effects can include fever, low blood pressure, fluid retention and effects on the central nervous system, among others.

Interleukin-2 is a protein (a cytokine, a product of the cell) generated by activated T cells. The nanoparticle, acting as a vector for IL-2, carries the protein to the receptors on cancer cells, saturates them and kills them, while also forming a bridge to the T cells that activate the organism’s immune response. It works like a guided missile, acting on tumor cells and activating the immune cells that kill them.

A woman immunosuppressed by the disease produces even less interleukin, so the nanoparticle would be especially beneficial for these patients. The researcher emphasized that her group must meet pharmaceutical regulations to carry the research beyond published studies and thus benefit the population.


Story Source:

The above story is based on materials provided by Investigación y Desarrollo. Note: Materials may be edited for content and length.

MIT: A “Nano” Solution for Desalination – No Heat (Low Energy) Required?


March 16, 2014
Source: Massachusetts Institute of Technology
Summary:
Consider the nearest water surface: a half-full glass on your desk, a puddle outside your window, or a lake across town. All of these surfaces represent liquid-vapor interfaces, where liquid meets air. Molecules of water vapor constantly collide with these liquid surfaces: Some make it through the surface and condense, while others simply bounce off. The probability that a vapor molecule will bounce, or reflect, off a liquid surface is a fundamental property of water, much like its boiling point. And yet, in the last century, there has been little agreement on the likelihood that a water molecule will bounce off the liquid surface.
“When a water vapor molecule hits a surface, does it immediately go into the liquid? Or does it come off and hit again and again, then eventually go in?” says Rohit Karnik, an associate professor of mechanical engineering at MIT. “There’s a lot of controversy, and there’s no easy way to measure this basic property.”

Knowing this bouncing probability would give scientists an essential understanding of a variety of applications that involve water flow: the movement of water through soil, the formation of clouds and fog, and the efficiency of water-filtration devices.

This last application spurred Karnik and his colleagues — Jongho Lee, an MIT graduate student in mechanical engineering, and Tahar Laoui, a professor at the King Fahd University of Petroleum and Minerals (KFUPM) in Saudi Arabia — to study water’s probability of bouncing. The group is developing membranes for water desalination; this technology’s success depends, in part, on the ability of water vapor to flow through the membrane and condense on the other side as purified water.

By observing water transport through membranes with pores of various sizes, the group has measured a water molecule’s probability of condensing or bouncing off a liquid surface at the nanoscale. The results, published in Nature Nanotechnology, could help in designing more efficient desalination membranes, and may also expand scientists’ understanding of the flow of water at the nanoscale.

“Wherever you have a liquid-vapor surface, there is going to be evaporation and condensation,” Karnik says. “So this probability is pretty universal, as it defines what water molecules do at all such surfaces.”

Getting in the way of flow

One of the simplest ways to remove salt from water is by boiling and evaporating the water — separating it from salts, then condensing it as purified water. But this method is energy-intensive, requiring a great deal of heat.

Karnik’s group developed a desalination membrane that mimics the boiling process, but without the need for heat. The razor-thin membrane contains nanoscale pores that, seen from the side, resemble tiny tubes. Half of each tube is hydrophilic, or water-attracting, while the other half is hydrophobic, or water-repellant.

As water flows from the hydrophilic to the hydrophobic side, it turns from liquid to vapor at the liquid-vapor interface, simulating water’s transition during the boiling process. Vapor molecules that travel to the liquid solution on the other end of the nanopore can either condense into it or bounce off of it. The membrane allows higher water-flow rates if more molecules condense, rather than bounce.

Designing an efficient desalination membrane requires an understanding of what might keep water from flowing through it. In the case of the researchers’ membrane, they found that resistance to water flow came from two factors: the length of the nanopores in the membrane and the probability that a molecule would bounce, rather than condense.

In experiments with membranes whose nanopores varied in length, the team observed that greater pore length was the main factor impeding water flow — that is, the greater the distance a molecule has to travel, the less likely it is to traverse the membrane. As pores get shorter, bringing the two liquid solutions closer together, this effect subsides, and water molecules stand a better chance of getting through.

But at a certain length, the researchers found that resistance to water flow comes primarily from a molecule’s probability of bouncing. In other words, in very short pores, the flow of water is constrained by the chance of water molecules bouncing off the liquid surface, rather than their traveling across the nanopores. When the researchers quantified this effect, they found that only 20 to 30 percent of water vapor molecules hitting the liquid surface actually condense, with the majority bouncing away.
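One qualitative way to see why the bounce probability only matters for short pores is to treat the membrane as two transport resistances in series: a pore term that grows with length and an interfacial term that grows as the condensation probability falls. The sketch below is a toy model under those assumptions, not the analysis in the paper; the constants are arbitrary and chosen only to illustrate the trend.

```python
# Toy series-resistance model (not the paper's analysis; constants are arbitrary).
# Total resistance = pore term (grows with length) + interfacial term
# (grows as the condensation probability sigma shrinks); flux ~ 1/resistance.
def relative_flux(pore_length_nm: float, sigma: float,
                  k_pore: float = 1.0, k_interface: float = 20.0) -> float:
    r_pore = k_pore * pore_length_nm      # vapour transport across the pore
    r_interface = k_interface / sigma     # penalty for molecules that bounce
    return 1.0 / (r_pore + r_interface)

for length_nm in (1000, 100, 10):
    gain = relative_flux(length_nm, 1.0) / relative_flux(length_nm, 0.25) - 1
    print(f"{length_nm:5d} nm pore: perfect condensation would raise flux by {100 * gain:.0f}%")
```

In this toy picture the bounce penalty is negligible for micrometre-long pores but dominates once the pores shrink to tens of nanometres, which is the regime the researchers quantified with the 20 to 30 percent condensation figure.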

A no-bounce design

They also found that a molecule’s bouncing probability depends on temperature: 64 percent of molecules will bounce at 90 degrees Fahrenheit, while 82 percent of molecules will bounce at 140 degrees. The group charted water’s probability of bouncing in relation to temperature, producing a graph that Karnik says researchers can refer to in computing nanoscale flows in many systems.

“This probability tells us how different pore structures will perform in terms of flux,” Karnik says. “How short do we have to make the pore and what flow rates will we get? This parameter directly impacts the design considerations of our filtration membrane.”

Jan Eijkel, a professor of microfluidics and nanofluidics at the University of Twente in the Netherlands, says the group’s work may be useful in understanding a wide range of phenomena, including the microphysics and chemistry of clouds, fluids, aerosols, and the atmosphere.

“Their main contribution is the introduction of an entirely new method, which has the very nice flexibility of being able to adjust the distance between water surfaces down to very short distances,” says Eijkel, who did not contribute to the work. “Also, the innovation of changing the composition of the two solutions independently is elegant.”

Lee says that knowing the bouncing probability of water may also help control moisture levels in fuel cells.

“One of the problems with proton exchange membrane fuel cells is, after hydrogen and oxygen react, water is generated. But if you have poor control of the flow of water, you’ll flood the fuel cell itself,” Lee says. “That kind of fuel cell involves nanoscale membranes and structures. If you understand the correct behavior of water condensation or evaporation at the nanoscale, you can control the humidity of the fuel cell and maintain good performance all the time.”

The research was funded by the Center for Clean Water and Clean Energy at MIT and KFUPM.


Story Source:

The above story is based on materials provided by Massachusetts Institute of Technology. The original article was written by Jennifer Chu. Note: Materials may be edited for content and length.


Journal Reference:

  1. Jongho Lee, Tahar Laoui, Rohit Karnik. Nanofluidic transport governed by the liquid/vapour interface. Nature Nanotechnology, 2014; DOI: 10.1038/nnano.2014.28