The World Of Tomorrow: Nanotechnology: Interview with PhD and Attorney D.M. Vernon

The Editor interviews Deborah M. Vernon, PhD, a partner in McCarter & English, LLP’s Boston office.




Why It Matters –

“… I would say the two most interesting areas in the last year or two have been in 3-D printing and nanotechnology. 3-D printing is an additive technology in which one is able to make a three-dimensional product, such as a screw, by adding material rather than using a traditional reduction process, like a CNC (milling) process or a grinding-away process.

The other interesting area has been nanotechnology. Nanotechnology is the science of materials and structures that have a dimension in the nanometer range (1-1,000 nm) – that is, on the atomic or molecular scale. A fascinating aspect of nanomaterials is that they can have vastly different material properties (e.g., chemical, electrical, mechanical properties) than their larger-scale counterparts. As a result, these materials can be used in applications where their larger-scale counterparts have traditionally not been utilized.”


Editor: Deborah, please tell us about the specific practice areas of intellectual property in which you participate.



Vernon: My practice has been directed to helping clients assess, build, maintain and enforce their intellectual property, especially in the technology areas of material science, analytical chemistry and mechanical engineering. Prior to entering the practice of law, I studied mechanical engineering as an undergraduate and I obtained a PhD in material science engineering, where I focused on creating composite materials with improved mechanical properties.

Editor: Please describe some of the new areas of biological and chemical research into which your practice takes you, such as nanotechnology, three-dimensional printing technology, and other areas.

Vernon: I would say the two most interesting areas in the last year or two have been in 3-D printing and nanotechnology. 3-D printing is an additive technology in which one is able to make a three-dimensional product, such as a screw, by adding material rather than using a traditional reduction process, like a CNC (milling) process or a grinding-away process. The other interesting area has been nanotechnology. Nanotechnology is the science of materials and structures that have a dimension in the nanometer range (1-1,000 nm) – that is, on the atomic or molecular scale.

A fascinating aspect of nanomaterials is that they can have vastly different material properties (e.g., chemical, electrical, mechanical properties) than their larger-scale counterparts. As a result, these materials can be used in applications where their larger-scale counterparts have traditionally not been utilized.


I was fortunate to work in the nanotech field in graduate school. During this time, I investigated and developed methods for forming ceramic composites, which maintain a nanoscale grain size even after sintering. Sintering is the process used to form fully dense ceramic materials. The problem with sintering is that it adds energy to a system, resulting in grain growth of the ceramic materials. In order to maintain the advantageous properties of the nanosized grains, I worked on methods that pinned the ceramic grain boundaries to reduce growth during sintering.

The methods I developed involved not only the handling of nanosized ceramic particles but also the deposition of nanofilms into a porous ceramic material to create nanocomposites. I have been able to apply this experience in my IP practice to assist clients in obtaining and assessing IP in the areas of nanolaminates and coatings; nanosized particles and nanostructures, such as carbon nanotubes; nanofluidic devices, which transport very small volumes of fluid; and 3-D structures formed from nanomaterials, such as woven nanofibers.

Editor: I understand that some of the components of the new Boeing 787 are examples of nanotechnology.

Vernon: The design objective behind the 787 is that lighter, better-performing materials will reduce the weight of the aircraft, resulting in longer possible flight times and decreased operating costs. Boeing reports that approximately 50 percent of the materials in the 787 are composite materials, and that nanotechnology will play an important role in achieving and exceeding the design objective.

While it is believed that nanocomposite materials are used in the fuselage of the 787, Boeing is investigating applying nanotechnology to reduce costs and increase performance not only in fuselage and aircraft structures, but also within energy, sensor and system controls of the aircraft.

Editor: What products have incorporated nanotechnology? What products are anticipated to incorporate its processes in the future?

Vernon: The products that people are the most familiar with are cosmetic products, such as hair products for thinning hair that deliver nutrients deep into the scalp, and sunscreen, which includes nanosized titanium dioxide and zinc oxide to eliminate the white, pasty look of sunscreens. Sports products, such as fishing rods and tennis rackets, have incorporated a composite of carbon fiber and silica nanoparticles to add strength. Nano products are used in paints and coatings to prevent algae and corrosion on the hulls of boats and to help reduce mold and kill bacteria. We’re seeing nanotechnology used in filters to separate chemicals and in water filtration.

The textile industry has also started to use nano coatings to repel water and make fabrics flame resistant. The medical imaging industry is starting to use nanoparticles to tag certain areas of the body, allowing for enhanced MRI imaging. Developing areas include drug delivery, disease detection and therapeutics for oncology. Obviously, those are definitely in the future, but it is the direction of scientific thinking.

Editor: What liabilities can product manufacturers incur who are incorporating nanotechnology into their products? What kinds of health and safety risks are incurred in their manufacture or consumption?

Vernon: There are three different areas that we should think about: the manufacturing process, consumer use and environmental issues. In manufacturing there are potential safety issues with respect to the incorporation or delivery of nanomaterials. For example, inhalation of nanoparticles can cause serious respiratory issues, and contact of some nanoparticles with the skin or eyes may result in irritation. In terms of consumer use, nanomaterials may have different material properties from their larger counterparts.

As a result, we are not quite sure how these materials will affect the human body insofar as they might have a higher toxicity level than their larger counterparts. With respect to environmental impact, waste or recycled products may lead to the release of nanoparticles into bodies of water or may affect wildlife. The National Institute for Occupational Safety and Health (NIOSH) has established the Nanotechnology Research Center to develop a strategic direction with respect to occupational safety and nanotechnology. NIOSH guidance and publications are available on its website.

Editor: The European Union requires the labeling of foods containing nanomaterials. What has been the position of the Food & Drug Administration and the EPA in the United States about food labeling?

Vernon: So far the FDA has taken the position that just because nanomaterials are smaller, they are not materially different from their larger counterparts, and therefore there have been no labeling requirements on food products. The FDA believes that their current standards for safety assessment are robust and flexible enough to handle a variety of different materials. That being said, the FDA has issued some guidelines for the food and cosmetic industries, but there has not been any requirement for food labeling as of now. The EPA has a nanotechnology division, which is also studying nanomaterials and their impact, but I haven’t seen anything that specifically requires a special registration process for nanomaterials.

Editor: What new regulations regarding nanotech products are expected? Should governmental regulations be adopted to prevent nanoparticles in foods and cosmetics from causing toxicity?

Vernon: The FDA has not telegraphed that any new regulations will be put into place. The agency is currently in the data collection stage to make sure that these materials are being safely delivered to people using current FDA standards – that materials are safe for human consumption or contact with humans. We won’t really understand whether or not regulations will be coming into place until we see data coming out that indicates that there are issues that are directly associated with nanomaterials. Rather than expecting regulations, I would suggest that we examine the data regarding nano products to optimize safe handling and use procedures.

Editor: Have there ever been any cases involving toxicity resulting from nano products?

Vernon: There are current investigations into the toxicity of carbon nanotubes, but the research is in its infancy. There is no evidence to date showing any harm from this technology. Unlike asbestos or silica exposure, the science is not there yet to demonstrate any toxicity link. The general understanding is that it may take decades for any potential harm to manifest. I believe my colleague, Patrick J. Comerford, head of McCarter’s product liability team in Boston, summarizes the situation well by noting that “if any supportable science was available, plaintiff’s bar would have already made this a high-profile target.”

Editor: While some biotech cases have failed the test of patentability before the courts, such as the case of Mayo v. Prometheus, what standard has been set forth for a biotech process to pass the test for patentability?

Vernon: There is no specified bright-line test for determining if a biotech process is patentable. But what the U.S. Patent and Trademark Office has done is to issue some new examination guidelines with respect to the Mayo decision that help examiners figure out whether a biotech process is patent eligible. Specifically, the guidelines look to see if the biotech process (i.e., a process incorporating a law of nature) also includes at least one additional element or step. That additional element needs to be significant and not just a mental or correlation step. If a biotech process patent claim includes this significant additional step, there still needs to be a determination if the process is novel and non-obvious over the prior art. So while this might not be a bright-line test to help us figure out whether a biotech process is patentable, it at least gives us some direction about what the examiners are looking for in the patent claims.

Editor: What effect do you think the new America Invents Act will have in encouraging biotech companies to file early in the first stages of product development? Might that not run the risk that the courts could deny patentability as in the Ariad case where functional results of a process were described rather than the specific invention?

Vernon: The AIA goes into effect next month. What companies, especially biotech companies, need to do is file early. Companies need to submit applications supported by their research to include both a written description and enablement of the invention. Companies will need to be more focused on making sure that they are not only inventing in a timely manner but are also involving their patent counsel in planned and well-thought-out experiments to make sure that the supporting information is available in a timely fashion for patenting.

Editor: Have there been any recent cases relating to biotechnology or nanotechnology that our readers should be informed about?

Vernon: The Supreme Court will hear oral arguments in April in the Myriad case. This case involves the BRCA gene, the breast cancer gene – and the issue is whether isolating a portion of a gene is patentable. While I am not a biotechnologist, I think this case will also impact nanotechnology as a whole. Applying for a patent on a portion of a gene is not too far distant from applying for a patent on a nanoparticle of a material that already exists but which has different properties from the original, larger-counterpart material. Would this nanosize material be patentable? This will be an important case to see what guidance the Supreme Court delivers this coming term.

Editor: Is there anything else you’d like to add?

Vernon: I think the next couple of years for nanotech will be very interesting. As I mentioned, I did my PhD thesis in the nanotechnology area a few years ago. My studies, like those of many other students, were funded in part with government grants. There is a great deal of government money being poured into nanotechnology. In the next ten years we will start seeing more and more of this research commercialized and adopted into our lives. To keep current on developments, readers can visit

The Metropolitan Corporate Counsel
The Leading Resource For Corporate Counsel

As a leading publication in the corporate counsel community, MCC offers unique editorial content covering legal, regulatory, legislative and business developments, featuring original articles and interviews from experts at prestigious law firms, bar associations, accounting firms and legal service providers, as well as educators, business executives and high-level state, national and international officials.


Genesis Nanotech Headlines Are Out!





Chairman Terry: “Nanotech is a true science race between the nations, and we should be encouraging the transition from research breakthroughs to commercial development.”

WASHINGTON, DC – The Subcommittee on Commerce, Manufacturing, and Trade, chaired by Rep. Lee Terry (R-NE), today held a hearing on:

“Nanotechnology: Understanding How Small Solutions Drive Big Innovation.”




“Great Things from Small Things!” … We Couldn’t Agree More!


Nanotech for Oil, Gas Applications – A “Smoother Flow”

A “smart coating” initially developed to help U.S. Navy ships ply through water more efficiently could help pipeline operators transport more crude oil without using costlier larger-diameter pipe or adding horsepower to pumps, according to the head of a Hawaii-based science, engineering and technology firm.

“The use of nanomaterials opens up a whole new dimension,” said Patrick Sullivan, founder and CEO of Oceanit. The company’s “Anhydra” coating technology manipulates the properties of a surface at the nanoscale – 1,000 times smaller than a human hair, he noted.

“If you can control surfaces at that scale, you can create structures with specific performances that would otherwise be impossible,” continued Sullivan. “Being able to control something on that scale and then scaling it out creates tremendous efficiencies.” In the case of its original application as an antifouling coating to reduce drag on the hulls of naval vessels, Anhydra enables ships to go faster without expending extra energy for propulsion, Sullivan said.


Nanotech surface treatment could boost pipeline flow assurance, says exec.

The coating helps surfaces to behave differently and actually extends the service life of the material to which it is applied, he explained.

Applications in Oil, Gas

Oceanit is researching and developing new formulations of Anhydra for the military as well as the aerospace, healthcare and oil and gas industries. In the latter case, the company sees considerable potential for the technology to enhance and protect metallic surfaces exposed to a wide range of temperatures and pressures both offshore and onshore.

One potential application is an internal pipeline coating that repels crude oil – and the water and other constituents in it – in order to improve flow and prevent corrosion, Sullivan said. The technology’s “ice-phobic” properties could also prevent methane hydrates from accumulating in subsea pipelines, he added.

“In the oil and gas industry it’s a huge thing because if you can reduce the drag in a pipeline, that means for the same pump you get more distance or you can move material with the same amount of energy.” Aside from easing product movement inside pipelines, Oceanit’s nanotech coating could also protect the exterior surfaces of pipelines, offshore platforms and myriad other oil and gas infrastructure from corrosion, added Sullivan. Coatings could be designed to repair scratches and abrasions, protect a metallic surface from the elements and preempt the onset of corrosion, he explained.

Oceanit’s work on oil and gas applications of Anhydra has been limited to the laboratory, but the company has been actively courting industry players to partner in the critical step to scale up the technology. “We’ll develop the technology in a lab setting and then work collaboratively with an operator that will use it” in the field, Sullivan explained.

In addition to opening an office in the world’s energy capital Houston, Oceanit has stepped up its presence at major oil and gas events such as the recent Offshore Technology Conference and will be on-hand at International Association of Drilling Contractors and Society of Petroleum Engineers events this fall. The company’s outreach efforts to date have been fruitful, Sullivan noted.

“We’re in some discussions right now, we’re testing with some operators and going to scale with some others,” he said, adding that Oceanit has been in “very preliminary” talks with manufacturers. “We’re always looking at how to go to scale because this industry is all about scale.” Oceanit also is in the process of deploying one of its high-performance coatings in the field, said Vinod Veedu, the company’s Houston-based director of strategic initiatives.

“We’ve quickly scaled up from the laboratory to the field in a matter of months,” Veedu concluded. “It’s an exciting time to be supporting this fast-moving industry.”




U of Alberta PhD Researcher seeks New Solutions for Cleaner Oilsands


PhD student wins scholarship to help find environmentally friendly ways of producing hydrogen for energy industry.

(Edmonton) We live in a province rich in fossil fuel resources, and great profits can be made from them. However, the use of these fossil fuels comes at a significant environmental cost. The greenhouse gas emissions footprint of Alberta’s oilsands industry is one of its most formidable challenges in the context of environmental stewardship.

Babatunde Olateju, a PhD candidate in the University of Alberta’s Department of Mechanical Engineering and a recipient of this year’s $13,000 Sadler Graduate Scholarships in Mechanical Engineering, is researching ways to mitigate an energy-intensive aspect of oilsands activities: hydrogen production.

Huge amounts of hydrogen are consumed in upgrading bitumen to synthetic crude oil, and considerable energy is consumed simply to produce usable hydrogen. (Hydrogen use in the oilsands industry is expected to reach 3.1 million tonnes per year by 2023.) Hydrogen is an abundant element and a potential source of emissions-free fuel, but it doesn’t exist on its own; it is locked up in compounds such as water and hydrocarbons, including coal.


Alberta ‘Oil Sands’ Projects

Most of the hydrogen used as a fuel in North America is extracted through a process known as steam methane reforming. This process results in considerable greenhouse gas emissions. Olateju is building computer models that consider both the technology and the costs of producing hydrogen through more environmentally friendly means.

His models consider two alternatives to current methods of producing hydrogen: one is using energy produced from renewable sources such as wind and hydro power, and the other is finding ways to mitigate the effects of hydrogen production as it is currently produced (with natural gas and coal) through carbon capture and sequestration (CCS) or underground coal gasification.

CCS is the geological storage (landfilling) of carbon dioxide generated from use of fossil fuels. CCS is still in the early stages of development. Underground coal gasification is a method of converting coal to gas (syngas) underground, and can be used in combination with CCS. Even if used without CCS, underground gasification results in a lower greenhouse gas footprint than traditional methods of coal combustion.

Olateju’s computer models assess large-scale, environmentally sustainable hydrogen production systems (and their costs) for the bitumen upgrading industry in Western Canada. This is data-intensive work; he uses data sourced mainly from refereed journals but also from government and industry. Even so, very little research has been done in Western Canada on producing hydrogen in ways that are environmentally sustainable yet economically feasible. Olateju says the work is time-consuming but remains a stimulating endeavour, especially considering the insight that can be gained from the model results. The oilsands industry is expanding, and it’s imperative that we find ways to make its growth sustainable.

There’s a need for environmental stewardship to balance the growth. Given the considerable amount of hydrogen used in the upgrading of bitumen, finding ways to produce hydrogen with lower or no greenhouse gas emissions will make a huge impact.

Olateju is seeing his papers published in high-impact journals and receiving academic awards. In addition to the Sadler Graduate Scholarship, he received the Government of Alberta’s Graduate Citizenship Award. This is not surprising for the former co-president of the U of A’s Energy Club and, until December 2013, president of the university’s Nigerian Students’ Association.

Olateju is part of a research program led by Amit Kumar, who holds the Industrial Research Chair in Energy and Environmental Systems Engineering funded by the Natural Sciences and Engineering Research Council of Canada, Cenovus Energy, Alberta Innovates – Energy and Environment Solutions, and Alberta Innovates – Bio Solutions.

Dave Hassan, Cenovus’s director of technology investments, said, “We believe that it is imperative for society to understand how to make the best use of our energy and water resources. The research pursued by Olateju and his colleagues at the U of A is critical to developing this understanding, and we look forward to learning more about his findings.” Olateju says he feels “profound gratitude” toward the U of A and especially toward the Department of Mechanical Engineering.

He also feels “a strong sense of fulfilment and motivation to sustain and deepen my intellectual pursuits, within and beyond the confines of academia. My journey to the University of Alberta was eventful, and not without its fair share of challenges and sacrifices.” Olateju values his relationships with his colleagues in the sustainable energy research group and adds that his relationship with Kumar “has been the most influential factor for my intellectual growth and research success.”

Asia-Pacific to Invest $2.5 trillion in Renewables to Build New Power Capacity Needed by 2030

5 Terawatts of New Power Needed Worldwide by 2030

The Asia-Pacific region will invest a massive $3.6 trillion over the years ahead to equip itself with the power capacity it needs for 2030. Two thirds of that sum will go on renewable generation technologies such as wind, solar and hydro-electric, according to a major report from research company Bloomberg New Energy Finance.

The report, BNEF 2030 Market Outlook, based on modelling of electricity market supply and demand, technology cost evolution and policy development in individual countries and regions, forecasts that Asia-Pacific will account for more than half of the 5TW of net new power capacity that will be added worldwide in the next decade and a half.
This will equate to $3.6 trillion of investment in Asia-Pacific.[1] Fossil fuel sources such as coal-fired and gas-fired generation will continue to grow in the region, despite rising concerns about pollution and climate change, but the biggest growth will be in renewables, with some $2.5 trillion invested and 1.7TW of capacity added.
Milo Sjardin, head of Asia Pacific for Bloomberg New Energy Finance, said: “The period to 2030 is going to see spectacular growth in solar in this region, with nearly 800GW of rooftop and utility-scale PV added. This will be driven by economics, not subsidies – our analysis suggests that solar will be fully competitive with other power sources by 2020, only six years from now.
“However, that does not mean that the days of fossil-fuel power are over. Far from it – rapid economic growth in Asia will still drive net increases of 434GW in coal-fired capacity and 314GW in gas-fired plant between now and 2030. That means that emissions will continue to increase for many years to come.”
Looking at individual countries in the region, China is forecast to add a net 1.4TW of new generating capacity between now and 2030 to meet power demand that is double that of today. This will require capital investment of around $2 trillion, of which 72% will go to renewables such as wind, solar and hydro.
Japan’s power sector will experience a very different trajectory in the next 16 years, with electricity demand only regaining its 2010 levels in 2021 and then growing at a modest 1% a year, as efficiency gains partially offset economic growth. Some $203bn is expected to be invested in new power generation capacity by 2030, with $116bn of that going to rooftop solar and $72bn to other renewable technologies.
India is forecast to see a quadrupling of its power generation capacity, from 236GW in 2013 to 887GW in 2030, with 169GW of the additions taking the form of utility-scale solar and 98GW onshore wind. Hydro will see capacity boosted by 95GW, coal by 155GW and gas by 55GW. Total investment to 2030 will be $754bn, with $477bn of that in renewables.



Global Numbers
Globally, Bloomberg New Energy Finance expects $7.7 trillion to be invested in new generating capacity by 2030, with 66% of that going on renewable technologies including hydro. Out of the $5.1 trillion to be spent on renewables, Asia-Pacific will account for $2.5 trillion, the Americas $816bn, Europe $967bn and the rest of the world including Middle East and Africa $818bn.
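The regional split reported above can be sanity-checked with a few lines of arithmetic. This is purely an illustrative check of the published figures (all values in billions of dollars):

```python
# BNEF's reported regional split of renewables investment to 2030, in $bn.
regional_renewables = {
    "Asia-Pacific": 2500,
    "Americas": 816,
    "Europe": 967,
    "Middle East, Africa and rest of world": 818,
}

total_renewables = sum(regional_renewables.values())  # 5,101 -> the reported $5.1tn
total_generation = 7700                               # all new generating capacity, $bn
share = total_renewables / total_generation

print(total_renewables)    # 5101
print(round(share * 100))  # 66 -> matches the reported 66% renewables share
```

The regional figures do indeed sum to roughly $5.1 trillion, which is 66 percent of the $7.7 trillion total, consistent with the report.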
Fossil fuels will retain the biggest share of power generation by 2030 at 44%, albeit down from 64% in 2013. Some 1,073GW of new coal, gas and oil capacity worldwide will be added over the next 16 years, excluding replacement plant. The vast majority will be in developing countries seeking to meet the increased power demand that comes with industrialisation, and also to balance variable generation sources such as wind and solar. Solar PV and wind will increase their combined share of global generation from 3% last year to 16% in 2030.
Michael Liebreich, chairman of the advisory board for Bloomberg New Energy Finance, commented: “This country-by-country, technology-by-technology forecast of power market investment is more bullish on renewable energy’s future share of total generation than some of the other major forecasts, largely because we have a more bullish view of continuing cost reductions. What we are seeing is global CO2 emissions on track to stop growing by the end of next decade, with the peak only pushed back because of fast-growing developing countries, which continue adding fossil fuel capacity as well as renewables.”
More on the data and the methodology can be found here.
[1] The actual period in which these investment allocations are made will be 2013-26, in order for the equivalent generating capacity to be commissioned by 2030.
Source: Bloomberg New Energy Finance

Accelerating Innovation in Alberta

UAlberta partnership with TEC Edmonton, Innovate Calgary receives federal funding to help grow promising startups. By TEC Edmonton Staff on June 24, 2014 (Edmonton)

A partnership of the University of Alberta, TEC Edmonton and Innovate Calgary has been selected by the Canadian Accelerator and Incubator Program to help business accelerators and incubators deliver their services to promising Canadian firms.

TEC Edmonton, Edmonton’s leading business incubator and accelerator, will offer additional business services to health-based startup companies, including new companies spun off from medical research at the U of A. Innovate Calgary, TEC Edmonton’s counterpart in Calgary, will focus its funding on energy-related high-tech startups.

With the U of A, the two business incubator/accelerators will also put the new funding to work by linking investment-ready new companies to existing investor networks focused on new, made-in-Alberta technologies.


“This is fantastic news,” said Lorne Babiuk, vice-president (research) at the U of A. “It’s another example of how the University of Alberta continues to transfer its knowledge, discoveries and technologies into the community via commercialization to benefit society, the economy and Canada as a whole. We are delighted to be partnering with Innovate Calgary and TEC Edmonton, which are Alberta’s largest and most successful incubators, and among the best in the country. I thank the Government of Canada for their support and for this valuable program.”

“CAIP funding allows us and our partners to enhance and expand our services supporting the innovation community and Alberta’s overall economic prosperity,” said Peter Garrett, president of Innovate Calgary.

“With our shareholders the University of Calgary, the Calgary Chamber and the City of Calgary, Innovate Calgary is committed to accelerating the growth of early-stage companies and entrepreneurs.”

“TEC Edmonton is a true community partnership,” said TEC Edmonton CEO Chris Lumb. “We were created by the University of Alberta and the City of Edmonton (through the Edmonton Economic Development Corporation) with strong support from the regional entrepreneurial community, technology investors, the Province of Alberta, the Canadian government and hundreds of volunteers. With such support, TEC Edmonton has grown into one of Canada’s best tech accelerators.

“This new federal funding strengthens TEC Edmonton and Innovate Calgary’s ability to help grow great new companies and to further commercialize research at Alberta’s post-secondary institutions.”


Algorithms are Becoming Key to Designing New Materials







Summary: Materials science is being transformed by algorithms, and computers are now selecting new material combinations to test in the lab.

In the future, materials that could make a super-efficient solar panel or a breakthrough battery probably won’t be discovered by a lone scientist’s intuition. As in so many other fields, computers and software are increasingly identifying the best combinations of materials to deliver a desired result, and human researchers are then testing those choices in the lab.

For University of Colorado professor Alex Zunger, that idea is a fundamental change in materials research. Zunger is the chief theorist at the Center for Inverse Design, and at the SunShot Summit last week he spoke about how “inverse design” — identifying specific properties that are desired in a material, then determining that material’s required atomic structure — could transform sectors like solar.


For decades, materials for new applications have been selected to be tested “rather casually,” said Zunger, based on “simple ideas,” or even “availability in the lab.” But now, thanks to sophisticated algorithms, scientists can use computer intelligence to make these choices.

Zunger is particularly interested in using inverse design and computer intelligence to figure out the optimal materials for quantum dots in solar applications. Quantum dots are little pieces of semiconductor crystal — less than 10 nanometers across — that are so small they have different properties and characteristics than larger semiconductor pieces. But so far, Zunger says, there hasn’t been an obvious winning combination for solar quantum dots.

Zunger isn’t the only one doing this. It’s actually a hot trend for some of the most cutting-edge materials startups out there.


For example, a startup called Pellion Technologies, which was spun out of MIT, developed advanced algorithms and computer modeling that enabled it to test out 10,000 potential cathode materials to fit with a magnesium anode for a battery. Now the startup is developing a magnesium battery, which could have a very high energy density, and if it works could be important for electric vehicles and grid storage.
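Computational screening of this sort can be sketched in a few lines: compute a surrogate figure of merit for every candidate, rank them, and reserve lab time for the top of the list. The compound names and property values below are illustrative placeholders, not Pellion’s actual candidates, data or method.

```python
# Toy sketch of computational screening: rank candidate cathode materials by a
# surrogate figure of merit before any lab work. Compounds and numbers here are
# illustrative placeholders, not Pellion's actual candidates or data.
candidates = {
    "MgMn2O4": {"voltage_V": 2.9, "capacity_mAh_per_g": 270},
    "MgCr2O4": {"voltage_V": 2.5, "capacity_mAh_per_g": 190},
    "MgV2O5":  {"voltage_V": 2.3, "capacity_mAh_per_g": 300},
}

def specific_energy(props):
    # Specific energy in Wh/kg ~ average voltage (V) x specific capacity (mAh/g)
    return props["voltage_V"] * props["capacity_mAh_per_g"]

# Rank all candidates; only the best few would go on to synthesis and testing.
ranked = sorted(candidates, key=lambda name: specific_energy(candidates[name]),
                reverse=True)
for name in ranked:
    print(f"{name}: ~{specific_energy(candidates[name]):.0f} Wh/kg")
```

In a real pipeline the surrogate would be a physics simulation (e.g., density functional theory) rather than a one-line formula, but the shape of the loop is the same: score thousands of candidates cheaply, then test the leaders.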

A founder of Pellion, MIT professor Gerbrand Ceder, helped develop the Materials Genome Project at MIT, which is a program that uses computer modeling and virtual simulations to deliver innovation in materials. The Economist once described Ceder’s work with the Materials Genome Project as “a short cut” for discovering electrodes and the interactions of inorganic chemical compounds.


Other smart people are also working on this idea. Columbia University’s Institute for Data Sciences and Engineering spearheaded important work in the area, and professors Venkat Venkatasubramanian and Sanat Kumar recently published research on their work designing nanostructured materials with an inverse design framework and genetic algorithms.
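The inverse-design idea paired with a genetic algorithm can be illustrated in miniature: fix a target property value, score candidate compositions by how closely a property model predicts that target, and evolve a population toward it. Everything here (the three-component composition, the linear property model, the 1.34 eV target) is a made-up stand-in for a real simulation or measured dataset, not the Center for Inverse Design’s or Columbia’s actual framework.

```python
import random

random.seed(0)  # deterministic toy run

TARGET = 1.34  # desired property value (illustrative, e.g. a band gap in eV)

def predicted_property(composition):
    """Hypothetical surrogate model: property is linear in the fractions."""
    weights = (0.9, 1.7, 0.4)
    return sum(c * w for c, w in zip(composition, weights))

def fitness(composition):
    # Inverse design: reward compositions whose predicted property hits TARGET.
    return -abs(predicted_property(composition) - TARGET)

def mutate(composition, scale=0.05):
    # Jitter each fraction, then renormalize so the fractions sum to 1.
    comp = [max(0.0, c + random.uniform(-scale, scale)) for c in composition]
    total = sum(comp) or 1.0
    return [c / total for c in comp]

def genetic_search(generations=200, pop_size=40):
    population = [mutate([1/3, 1/3, 1/3], scale=0.3) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 4]  # keep the fittest quarter
        children = [mutate(random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = genetic_search()
print(best, predicted_property(best))
```

The search runs "backwards" relative to traditional materials work: the property comes first and the structure is whatever the optimizer converges on, which is the essence of inverse design.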

While this trend might seem like yet another way that computers are replacing humans, it’s actually an example of ways that computers can leverage massive data sets (that humans can’t) to advance society and make life better — for humans. It’s similar to the way that automated vehicles will make driving more efficient, safer and more productive. Odds are that the material breakthroughs of the future will come from this combination of artificial intelligence and human intelligence.

How Soon Will West Antarctic Ice Sheet’s Retreat Trigger A Sea Level Warning?


Two teams of scientists say the long-feared collapse of the West Antarctic Ice Sheet has begun, kicking off what they say will be a centuries-long, “unstoppable” process that could raise sea levels by as much as 15 feet.

“There’s been a lot of speculation about the stability of marine ice sheets, and many scientists suspected that this kind of behavior is under way,” Ian Joughin, a glaciologist at the University of Washington in Seattle, said in a news release about one of the studies. “This study provides a more quantitative idea of the rates at which the collapse could take place.”

The findings from Joughin and his colleagues, to be published this week in the journal Science, indicate that in some places, Antarctica’s Thwaites Glacier is losing tens of feet, or several meters, of ice elevation every year.

They estimate that Thwaites Glacier would probably disappear entirely in somewhere between 200 and 1,000 years. That loss would raise global sea levels by nearly 2 feet (60 centimeters). The glacier serves as a linchpin for the rest of the West Antarctic Ice Sheet, which has enough frozen mass to cause another 10 to 13 feet (3 to 4 meters) of sea level rise.
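The figures quoted in these paragraphs can be cross-checked with simple arithmetic: roughly 2 feet from Thwaites plus 10 to 13 feet from the rest of the sheet gives a 12 to 15 foot range, the upper end of which matches the "as much as 15 feet" cited at the top of this story.

```python
# Tallying the sea-level contributions quoted in the article, in feet.
thwaites_ft = 2              # ~2 ft (60 cm) from Thwaites Glacier alone
rest_of_wais_ft = (10, 13)   # 10-13 ft from the rest of the West Antarctic Ice Sheet

low = thwaites_ft + rest_of_wais_ft[0]
high = thwaites_ft + rest_of_wais_ft[1]
print(f"Combined potential rise: {low}-{high} ft")  # upper end matches ~15 ft
```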

A second study, published Monday in Geophysical Research Letters, reports the widespread retreat of Thwaites and other glaciers on the West Antarctic Ice Sheet — and says the retreat can’t help but continue.

“It has passed the point of no return,” the research team’s leader, Eric Rignot of the University of California at Irvine, told reporters during a NASA teleconference on Monday. The second study projected that the glacial retreat in Antarctica’s Amundsen Sea Embayment, which includes Thwaites Glacier, would result in 4 feet (1.2 meters) of sea level rise — and open the way to more widespread retreats.

Rignot’s team based their findings on a detailed analysis of radar data from two European Earth Remote Sensing satellites, ERS-1 and ERS-2. Joughin’s team relied on radar maps primarily derived from an aerial NASA survey called Operation IceBridge.

Scientists have been warning for decades that the West Antarctic Ice Sheet was in peril due to climate change, and recent readings have shown that the region is warming more quickly than expected. The loss of ice that is floating on the seas surrounding the continent would not contribute significantly to sea level rise. However, losing the ice that’s currently grounded on the continent would.

A moment of ‘wow’

The two studies released on Monday document the glaciers’ retreat and project what’s likely to happen in the future. “We finally have hit this point where we have enough observations to put this all together, to say, ‘Wow, we really are in this state,’” NASA scientist Tom Wagner told reporters.

The key findings in both studies relate to what’s happening to the “grounding line” for Antarctica’s glaciers. That’s the subsurface boundary between ice that is floating on the sea and ice that is anchored to land.

“The grounding line is buried under a thousand or more meters of ice, so it is incredibly challenging for a human observer on the ice sheet surface to figure out exactly where the transition is,” Rignot explained in a NASA news release. “This analysis is best done using satellite techniques.”

The radar readings from both teams show that the grounding line for some areas of the West Antarctic Ice Sheet has retreated by as much as 20 miles (37 kilometers) over the past couple of decades, apparently due to the interaction with warmer seas. The main worry is that there appears to be no submerged hill or mountain that could slow down further retreat.

Rignot said that means the glacial retreat has triggered a process of “positive feedback.”

“We feel that this is at the point where even if the ocean is not providing additional heat, the system is in a chain reaction that is unstoppable,” he told reporters.

Joughin said computer models produce a wide range of scenarios for the collapse of Thwaites Glacier. Some scenarios suggest that the glacier could last more than a millennium longer, but the most likely scenarios predict that rapid collapse would occur somewhere between 200 and 500 years from now.

What lies ahead

Higher greenhouse-gas emissions would lead to faster ice loss, and lower emissions could slow down the meltdown. But in any case, the loss of Thwaites Glacier appears inevitable, Joughin said: “All of our simulations show it will retreat at less than a millimeter of sea level rise per year for a couple of hundred years, and then, boom, it just starts to really go.”

In its most recent assessment, the Intergovernmental Panel on Climate Change estimated that global sea levels were likely to rise between 4 inches and 3 feet (10 to 90 centimeters) by the year 2100. Sridhar Anandakrishnan, a geoscientist at Penn State University who didn’t play a role in either study, said future IPCC estimates “will almost certainly be revised, and revised upwards.”

“The IPCC projections don’t really include Antarctic contributions to any great measure,” he told reporters. “The results are just now starting to come together.”

Anandakrishnan said future middle-of-the-road estimates for 2100 may well zero in on the top end of the current IPCC projection, around 3 feet. Without mitigating measures, that amount of sea level rise would inundate significant areas of coastal cities including Miami Beach, New Orleans and New York.

A high-resolution radar map shows Thwaites Glacier’s thinning ice shelf. Warm circumpolar deep water is melting the underside of this floating shelf, leading to a speedup in the glacier’s retreat. This glacier now appears to be in the early stages of collapse, with full collapse potentially occurring within a few centuries.

In addition to Rignot, the authors of the paper in Geophysical Research Letters, “Widespread, Rapid Grounding Line Retreat of Pine Island, Thwaites, Smith and Kohler Glaciers, West Antarctica From 1992 to 2011,” include Jeremie Mouginot, Mathieu Morlighem, Helene Seroussi and Bernd Scheuchl.


Renewable Energy Sources vs. Cheap Natural Gas from Fracking … And the Winner Is?

Has the Advanced Research Projects Agency–Energy failed in its mission to create alternative energy breakthroughs? By David Biello

ARTIFICIAL LEAF: Sun Catalytix hoped to turn its sunlight-and-split-water system into a cheap source of power for homes.
A single bottle of dirty water transformed into the power source for a home—such was the promise of a technology package that became known as the “artificial leaf.” And such was the vision introduced by its inventor, Daniel Nocera, at the inaugural summit of the Advanced Research Projects Agency–Energy in 2010.

The artificial leaf pledged to store 30 kilowatt-hours of electricity after a mere four hours of splitting water into oxygen and hydrogen, or enough to power an average American “McMansion” for a day. It was exactly the kind of “high-risk, high-reward” technology touted by President Obama when he launched the agency in 2009 (an idea carried over from the George W. Bush era).
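The storage claim implies a sustained splitting power that is easy to check. A back-of-envelope sketch (the 30 kWh/day household figure is a rough U.S. average assumed here, not a number from the article):

```python
# Back-of-envelope check of the artificial-leaf claim: 30 kWh stored after
# four hours of water splitting, enough to run a home for a day.
energy_kwh = 30.0
splitting_hours = 4.0

avg_power_kw = energy_kwh / splitting_hours
print(avg_power_kw)  # 7.5 kW of sustained water-splitting power

# Assumed figure, not from the article: a rough U.S. average household draw.
home_daily_kwh = 30.0
days_powered = energy_kwh / home_daily_kwh
print(days_powered)  # 1.0 day, consistent with the "McMansion for a day" claim
```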

Such technologies could help with the country’s energy, environmental and economic security by creating new industries and jobs as well as by reducing the pollution associated with energy production and use today. More succinctly, “ARPA–E turns things that are plausible into things that are possible,” proclaimed Acting Director Cheryl Martin at the 2014 summit.

Out of 37 projects that received initial ARPA–E funding, Sun Catalytix, a company founded by Nocera, was the poster child—or rather video favorite—featured in a U.S. Department of Energy (DoE) clip talking up the potential of transformational change. “Almost all the solar energy is stored in water-splitting,” intoned Nocera, a Massachusetts Institute of Technology chemist, at the inaugural ARPA–E summit. “Shift happens.”

The artificial leaf proved to be possible but implausible, however. It won’t be splitting water using sunlight on a mass scale anytime soon, its hydrogen dreams blown away by a gale of cheap natural gas that can also be easily converted to the lightest element.

So Sun Catalytix has set the artificial leaf aside and shifted focus to flow batteries, rechargeable fuel cells that use liquid chemistry to store electricity. A better flow battery might not shift the fundamental fuel of the American dream but it could help utilities cope with the vagaries of wind and solar power—and is more likely to become a salable product in the near future.

Five years in, ARPA–E’s priorities have shifted, too, for the same reason. The cheap natural gas freed from shale by horizontal drilling and hydraulic fracturing (or fracking) has helped kill off bleeding-edge programs like Electrofuels, a bid to use microbes to turn cheap electricity into liquid fuels, and ushered in programs like REMOTE, a bid to use microbes to turn cheap natural gas into liquid fuels. Even at the first summit in 2010, so full of alternative energy promise, this gassy revolution was becoming apparent.

Consulting firm Black & Veatch predicted that burning natural gas would provide nearly half of all U.S. electricity by 2034, a forecast fulfilled a few decades early in 2012. “We’ve got a lot of cheap gas,” said ARPA–E Program Director Dane Boysen at the 2014 summit. “The question is: What should we do with it?”

The Methane Opportunities for Vehicular Energy, or MOVE, program backs cars that run on natural gas rather than on better batteries. Is enabling the energy predominance of another fossil fuel the kind of transformation ARPA–E was created to deliver, or a sign that it is failing?

ARPA–E points to follow-on funding from other entities (whether corporate, government or venture capital) as an early measure of its success. So far, the agency has invested more than $900 million in 362 different research projects. Of those projects, 22 have garnered an additional $625 million from capitalists of one type or another; it is a group that includes Sun Catalytix.

ARPA–E funding has also allowed 24 projects to form spin-off companies whereas 16 projects have found a new funding source from other government agencies, including the DoE, which runs ARPA–E, and the Department of Defense.

The biggest successes include Makani Power, which makes souped-up kites for wind power and was acquired by Google after ARPA–E invested $6 million in developing the technology. There’s also Ambri, which makes liquid-metal batteries for cheap energy storage on a massive scale and is now developing units capable of storing 20 kilowatt-hours for testing later this year.

And there’s 1366 Technologies, which became the first (and only, at that time in 2009) ARPA–E grantee in photovoltaics with a new manufacturing method that wastes less silicon. The company will begin construction this year on its first factory.

The outright failures have been mostly less prominent: algae breeding for biofuels and various carbon dioxide capture technologies, along with efforts to knit together hydrocarbons from sunshine, carbon dioxide and water. But some have proved more conspicuous. ARPA–E feted a would-be breakthrough battery maker named Envia in 2012. But by 2014, while at least one of the entrepreneurs backing the company still mingled in the summit’s halls at the Gaylord National Resort & Convention Center in Maryland, Envia was mired in lawsuits and failed to deliver the energy-dense batteries it promised to General Motors.

“I don’t call them failures, I call them opportunities to learn,” argued ARPA–E’s first director, Arun Majumdar, in a 2012 interview with Scientific American about failed projects in general. “If 100 percent of these projects worked out, we’re not doing our job.”

ARPA–E is definitely doing its job then: Biofuels haven’t quite delivered on their promise, even engineering tobacco plants for oil, while electrofuels were a “crazy-ass idea,” to use a term employed by William Caesar, president of the recycling business at Waste Management, at the 2014 summit to describe some of the concepts his company has evaluated for investment.

And ARPA–E’s budget has always been too small to tackle innovation in certain areas. “My real hope was to have enough of a budget to try out something different than what we are doing in the nuclear field today,” such as a prototype for a new kind of reactor, Majumdar said in a 2013 interview with Scientific American. “If you’re solving for climate change and you’re a serious person, your strategy starts with nuclear,” said David Crane, CEO of the electric utility NRG, at this year’s summit.

But ARPA–E’s budget has always been too small to encompass, for example, the hundreds of millions of dollars Crane lost during his tenure in a failed bid to build new standard nuclear reactors in Texas.

An analysis of the biggest programs by year and funding shows that electrofuels drew the biggest investment (at more than $41 million) in fiscal year 2010, followed by better hardware and software for the U.S. grid to help integrate renewables in 2011.

But in fiscal year 2012 the biggest tranche of funding to a single program went to Boysen’s MOVE projects (roughly $30 million), and in fiscal year 2013 the REMOTE program garnered $34 million, just behind the $36 million invested in better batteries for electric cars. “It could have a small environmental footprint,” argues Ramon Gonzalez, program director for Reducing Emissions using Methanotrophic Organisms for Transportation Energy (REMOTE). “We can develop something that is a bridge to renewable energy or even is renewable itself in the future.”

Natural gas hardly seems to need ARPA–E’s help to become ubiquitous. And although natural gas can help with climate change in the short term—displacing coal that emits even more pollution when burned to generate electricity—in the long run it, too, is a fossil fuel and a greenhouse gas itself. Burning methane for electricity will also one day require capturing and storing the resulting carbon dioxide in order to combat climate change.

ARPA–E has not succeeded in delivering a technological breakthrough that would allow that to happen cheaply or efficiently, despite investing more than $30 million in its Innovative Materials and Processes for Advanced Carbon Capture Technologies (IMPACCT) back in 2010. “ARPA–E needs to revisit carbon capture and storage,” said Michael Matuszewski of the National Energy Technology Laboratory at this year’s summit.  

Long game

Significant changes in energy sources—say, from wood to coal, or the current shift from coal to gas—take at least 50 years, judging by the record to date. “Looking at the climate risk mitigation agenda, we don’t have 50 to 60 years,” U.S. Secretary of Energy Ernest Moniz argued at the 2014 summit. “We have to cut [that time] in half,” and that will require breakthrough technologies that are cheaper, cleaner and faster to scale.

It is also exactly in times of overreliance on one energy source that funding into alternatives is not only necessary, but required. ARPA–E should continue to focus on transformational energy technologies that can be clean and cheap, even if political pressures incline the still young and potentially vulnerable agency to look for a better gas tank. After all, if ARPA–E and others succeed in finding ways to use ever-more natural gas, new shale supplies touted to last for a century at present consumption rates could be exhausted much sooner.

“Before this so-called ‘shale gale’ came upon us, groupthink had most of us focusing on energy scarcity,” warned Alaska Sen. Lisa Murkowski (R) at the 2013 summit. “The consensus now is one of abundant energy. Don’t fall into the trap of groupthink again.”

Failure is a necessary part of research at the boundaries of plausibility. As ARPA–E’s Martin said at this year’s summit: “It’s part of the process.” Many of the ideas the agency first funded were ideas that had sat unused on a shelf since the oil crisis of the 1970s.

And the ideas that go back on the shelf now, like the artificial leaf, provide the basic concepts—designer, metal-based molecules—for new applications, like flow batteries.

The artificial leaf, for one, could benefit from ARPA–E or other research to bring down the cost of the photovoltaic cells that provide the electricity to allow the leaf’s designer molecules to do their water-splitting work. Already, cheaper photovoltaics may be ushering in an energy transition of their own, cropping up on roofs across the country from California to New Jersey.

When such renewable sources of energy become a significant source of electricity, more storage will be needed for when the sun doesn’t shine or the wind doesn’t blow—and that storage needs to be cheap and abundant. In Germany, where the wind and sun now provide roughly one quarter of all that nation’s electricity, the long-term plan is to convert any excess to gas that can then be burned in times of deficit—so-called power to gas, which is a fledgling technology at best. And why couldn’t clean hydrogen be that gas, as Nocera has suggested?

So the artificial leaf bides its time, while research continues at the Joint Center for Artificial Photosynthesis established with DoE money in California. Failure is an investment in future success. “The challenge is not that the technology doesn’t work, but the economics don’t work,” observed Waste Management’s Caesar at the 2014 ARPA–E Summit. “I don’t like to talk about dead ends. There are things that their time just hasn’t come yet.”

Read “The Clean Energy Wars” here:

University of Alberta: Oil Sands: The “Art” of Remediation and Sustainability


Aaron Veldstra is performing March 19 as part of U of Alberta Water Week 2014.

Performance art project explores remediation and sustainability in the resource industry.

By Michael Brown

(Edmonton) At some point along the creative process, the waste left over from Aaron Veldstra’s various projects began to weigh on him. “Through creating, I realized I was making all these buckets of dirty water and just pouring them down the sink. I sort of became uncomfortable with the fact that I was just pouring my waste down the sink and it was disappearing, making it someone else’s problem,” said the first-year master of fine arts student in the Department of Art and Design. “Out of sight, out of mind—I was basically pushing the problem to a different space so I didn’t have to see it anymore.”

Veldstra, who has spent a decade of summers in the reclamation industry as a tree planter, says his first instinct was to find a way to reclaim his waste water, an interesting process in its own right that engaged his artistic side further. What has emerged is an early incarnation of a performance art project entitled Experiments in Artistic Hydrology, in which Veldstra attempts to engage people in a conversation about oil and the oil industry in Alberta using the concepts of remediation and sustainability.

“These terms get thrown around quite a bit, but what do they really mean? That’s the question that I’m asking or provoking through these acts.” The acts in question start with Veldstra marking his wall-sized canvas—two sheets of drywall—with a series of lines representing geographical data sets, such as pipelines, roads, cutlines and power lines, related to oil exploration and the resource industry in northern Alberta.

“When you clean something, you always make something else dirty.” Then, Veldstra applies thick beads of black ink using a syringe to trace along the data set lines. The resulting lines and drips are then sponged off using a combination of water and baking soda. “What I have is a bucket of dirty water, which I then filter using sand in a series of buckets,” said Veldstra, who models the filtration system after how most municipalities filter their citizens’ drinking water. “In the end, what I have is essentially clean water and a bunch of dirty sand. “The end result is when you clean something, you always make something else dirty.”

Veldstra says the project isn’t solely a critique of remediation and sustainability of oil producers, but also our reliance on oil in general. “We’re all implicated in our use of oil. It’s not specifically about oil companies, it’s about everybody: everybody wants to drive a car, everything we do involves oil in some way,” he said. “I’m using the oilsands as this contestable thing, but I’m not specifically talking about the giant holes we see north of Fort McMurray—it’s all of us.

“I’m just trying to broaden this conversation a little bit more, engage people through the act of doing something weird.”

Watch Aaron Veldstra’s performance

Veldstra is performing as part of UAlberta Water Week in the PCL Lounge of the Centennial Centre for Interdisciplinary Science March 19 from 5–9 p.m.

March 17–22 is UAlberta Water Week, a campus celebration of water leading up to UN World Water Day. The theme of this year’s events is “Exploring Sustainable Practices for Water and Energy.” Events are free and open to the public.

Learn more about UAlberta Water Week.