One Technology That Is Already Changing the Future Of Energy


 

Daniel Burrus

Best-Selling Author, Global Futurist & Innovation Expert, Entrepreneur, Strategic Advisor & Keynote Speaker

DANIEL BURRUS is considered one of the world’s leading technology forecasters and innovation experts, and is the founder and CEO of Burrus Research, a research and consulting firm that monitors global advancements in technology-driven trends to help clients understand how technological, social and business forces are converging to create enormous untapped opportunities. He is the author of six books including The New York Times best seller Flash Foresight.

The accelerating pace of change in the technology we use commercially and personally is dramatically increasing the global demand for electric power. As consumers, we’re gulping power at an alarming rate, from air conditioning systems, heating systems, household appliances, and all forms of home entertainment devices to cloud computing, computers, and consumer electronics. Over the past few years, we’ve also been plugging in electric vehicles at an ever-increasing rate. And let’s not forget the industrial power needed to churn out all these products, as well as keep the other wheels of industry turning. In fact, global electricity demand has been projected to nearly double from 2010 to 2030.

It’s clear we’re already close to consuming more electricity than we can generate or distribute, as manifested by the rolling blackouts and brownouts frequently seen during summer months, when peak power demand is highest. The problem is that we’re adding demand for electricity (from everything mentioned above and more) faster than we’re adding capacity to supply it. With that said, we still need to stay cool and to turn on lights to see at night … and we’re certainly not going to turn off our home theater and gaming systems.

So what’s the answer? Expand power generation to meet growing demand? Not so fast. Investment in electric power generation and distribution is a slow, long-term proposition, and therefore has trailed well below the increase in GDP in most developed countries. In other words, no one has the appetite (or the capital) to build enough power plants and expand the grid to meet the rising demand for electricity.

A quick point of fact: Power generation—and the grid to distribute it—has to be scaled to meet peak demand. On average, power grids operate at around 80 percent of capacity, so they’re ready to cover peak demand when those hot summer days roll around. If demand rises above that peak capacity, we experience those blackouts and brownouts.
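The headroom arithmetic above is simple enough to sketch in a few lines. The figures below are hypothetical, illustrative numbers, not data from any real grid:

```python
# Illustrative sketch: how much headroom a grid running below its
# peak-rated capacity has before blackouts or brownouts set in.
# All numbers here are hypothetical examples.

def headroom_pct(average_load_mw, peak_capacity_mw):
    """Percent of peak capacity still available above average load."""
    return 100.0 * (peak_capacity_mw - average_load_mw) / peak_capacity_mw

# A grid rated for 100,000 MW at peak, averaging 80,000 MW of load,
# has 20 percent headroom -- the buffer that hot summer days consume.
print(headroom_pct(80_000, 100_000))  # -> 20.0
```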

Additionally, 75 percent of the electricity generating capacity in the United States depends on the combustion of fossil fuels. This raises a multitude of other concerns, perhaps foremost that dependence on fossil fuels for electricity is causing severe environmental and health hazards, including large emissions of toxic air pollutants and greenhouse gases.

Over the past few years, thanks to technology developments such as fracking that were impossible just a decade ago, we can now extract natural gas in very large quantities, and that has positioned the United States to become an exporter of energy. The good news is that natural gas is far less polluting than other fossil fuels, such as coal, and the U.S. has very large reserves. On the other hand, the United States does not have an infrastructure for capitalizing on natural-gas-powered vehicles, and natural gas is still a fossil fuel with harmful emissions, even if fewer than the others.

When we look at renewable energy sources such as wind, solar, and waves, great strides have been taken, but until we find a way to store electricity for use at a later time, these will help but not be game changing. The good news here is that there is a technology that is already changing the game.

What if we could increase energy production without adding new capacity? What if we could use the power we already generate more efficiently, rather than have to dramatically expand power generation? Enter the work that is being done to enable smart grids, smart homes, and smart cities to help us accomplish this. But will peak power demand modeling and technology that turns lights off in empty rooms be enough? Probably not for some time. That’s where promising energy storage technology comes in as a key change accelerator to help us use the electrical power we have now more efficiently.

One of the companies leading the way is Maxwell Technologies in San Diego, California. They have developed and are manufacturing one of the most promising clean-energy power storage technologies available: ultracapacitors, which use an electrostatic field to quickly capture energy and then rapidly release it when needed. Conventional batteries and advanced lithium-ion batteries that rely on a chemical reaction cannot efficiently do this because they charge slowly and discharge slowly. When batteries are asked to charge and discharge quickly—which is the case in many applications today—they begin to fail and ultimately need to be replaced.

Ultracapacitors are being incorporated (where batteries cannot) into renewable energy power generation from solar, wind, and waves to improve efficiency and reliability. Because there are many disruptions in renewable energy output from clouds, wind fluctuations, and tides that last from a few seconds to a few minutes, output can swing as much as 50 percent at any time. This variability in power supply presents issues with power grid stability, causing the grid to disconnect from the renewable energy source.

The unique quick charge/discharge ability of ultracapacitors allows renewable energy installations to quickly store power and then deliver it back to the power grid, “firming” output capacity and providing “ride-through” during short-term disruptions. This can increase renewable energy utilization by 30 to 50 percent, so the power grid doesn’t need to be built out to such a large scale (at incremental cost) as demand for electrical power grows. Additionally, we could further increase our use of clean energy and decrease reliance on fossil fuels for power generation.

From a very broad perspective, this is a major example of how ultracapacitors can help us use the energy we already generate more efficiently. But what about places off the grid where we waste energy every day? How about planes, trains, automobiles, trucks, and busses?

Regenerative braking systems in electric and hybrid vehicles are being used to generate and quickly store electrical energy when the brakes are applied, then rapidly release it for acceleration. Conventional friction-based braking systems simply lose all this kinetic energy as heat. Ultracapacitors are being used to quickly capture and release this energy to improve fuel economy and extend battery life. Regenerative braking systems improve fuel efficiency by an average of 7 percent and would save 12 million gallons of fuel in the U.S. each year.
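A back-of-envelope sketch shows how much energy a single stop puts on the table. The vehicle mass, speed, and capture efficiency below are hypothetical, illustrative values:

```python
# Kinetic energy available to a regenerative braking system per stop:
# KE = 1/2 * m * v^2, scaled by the fraction actually captured.
# Mass, speed, and capture efficiency are hypothetical examples.

def recoverable_energy_j(mass_kg, speed_m_s, capture_efficiency):
    """Joules a regenerative system could recover in one stop."""
    return 0.5 * mass_kg * speed_m_s ** 2 * capture_efficiency

# A 1500 kg car braking from roughly 60 km/h (16.7 m/s), with an
# assumed 60% capture rate -- on the order of 125 kJ per stop,
# energy a friction brake would simply dump as heat.
print(recoverable_energy_j(1500, 16.7, 0.60))
```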

Conventional internal-combustion vehicles are incorporating start-stop systems that kill the engine at stoplights and stop signs, and then restart it when the accelerator is pressed. Ultracapacitors are being designed into these vehicles to stabilize starter systems, electrical systems, power steering, and onboard electronics. These start-stop systems improve fuel efficiency by up to 15 percent. In the U.S. alone we could save 25.5 million gallons of fuel annually if every conventional vehicle had this type of system. Imagine how much energy we could save and utilize if every vehicle on the planet had a start-stop system.

Finally, let’s think at the micro-level. Small ultracapacitors can be combined with batteries in laptops, tablets, smart phones, and electronic toys to use electric power more efficiently. Unlike ultracapacitors, batteries begin to degrade when they are tasked to quickly charge and discharge, but they are great sources of long-term power. Because ultracapacitors can quickly be charged and discharged up to a million times without loss of performance, they are ideal for providing the bursts of power required by today’s electronic devices, helping them perform better and batteries last longer.

There are a multitude of other applications where ultracapacitors can—and are starting to—help us use the power we’re already generating more efficiently, instead of simply generating more power. Clean-energy ultracapacitors are a change accelerating technology that will enable energy’s future and not inhibit the dizzying rate of technological, commercial, and social change we’ve come to expect and rely on.

Meanwhile, a question: Are there obvious or obscure places you can imagine where innovative power storage technology could help us use the electrical power we have now more efficiently?

###

 

The five most important names in renewable energy that you’ve never heard of


By Bill White


Five people will make a decision soon that will have an outsized impact on the future of renewable energy in America. I’m not talking about big shots like Obama, Koch, Boehner, Bloomberg, or Steyer. I’m talking about names many have never heard of: Moeller, Norris, LaFleur, Clark, and Binz (if he is confirmed). These are the chief electricity officers of the United States of America — they are the commissioners of the Federal Energy Regulatory Commission (FERC).

You’ve probably heard this before: “Scientists agree that in order to avoid the worst consequences of climate change, we must generate 80 percent of our energy from renewable sources by 2050.”  No single entity will play as crucial a role as FERC in ensuring that the infrastructure exists to handle new renewable energy generation.

President Obama’s climate plan is a courageous step forward and deserves the widespread media coverage it has received. But only the acceleration of utility-scale renewable energy projects can take us where we need to go.

Modernizing our nation’s power system is a daunting task, but there are good reasons to be optimistic. America has enough wind and solar to power the entire country more than a dozen times over. And with the cost of wind and solar going down every day, rapid development of large-scale generation projects appears inevitable.

But if you place the map of regions with the best wind and solar energy on top of a map of our current transmission system, you won’t find too much overlap. Transmission is the key to unlocking America’s virtually unlimited renewable resources and delivering their energy to users.

Unlike our interstate highway system, which is funded by taxpayers, high-voltage transmission lines are built with private capital. Investors will put money into transmission projects as long as they generate returns that are attractive relative to similar types of investments. This is where FERC steps in. They set the return on equity (ROE) for transmission projects across the nation.

As you might imagine, the higher the ROE, the more incentive there is to build transmission. A company would never invest in our grid if the maximum ROE was 1 percent — meaning it would take 100 years to recoup the costs of a project. And if it was 100 percent, we would end up building much more transmission than we need and sticking consumers with the bill.
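The "1 percent ROE means 100 years to recoup" figure above is a simple-payback approximation, which can be sketched directly (ignoring compounding, inflation, and depreciation, as the article's own example does):

```python
# Simple-payback sketch of the article's ROE arithmetic:
# years to recoup invested equity ~= 100 / ROE(%), ignoring
# compounding, inflation, and depreciation.

def simple_payback_years(roe_percent):
    """Approximate years to recoup equity at a given annual ROE (%)."""
    return 100.0 / roe_percent

print(simple_payback_years(1))   # -> 100.0 years: no one would invest
print(simple_payback_years(10))  # -> 10.0 years: far more attractive
```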

Recent history also tells us that the cost of inadequate transmission is steep. Electric customers are still paying billions of dollars per year for congestion, poor reliability, and overpriced power from dirty, outdated, and inefficient power plants — all of which are the direct result of three decades of underinvestment in transmission. Renewable energy was locked out of a strained and inadequate grid. In the mid-2000s, FERC recognized the chronic neglect of transmission investments as a major burden on ratepayers and a barrier to modernizing our electric system, and stepped in to raise transmission ROEs.

That decision helped spur a wave of new transmission investments that are reducing costs to consumers and expanding access to renewable energy. For example, the Midwest ISO has begun a new set of transmission lines called the MVP projects. The average consumer is seeing $23 in savings for every $11 spent on these new lines.

Why is transmission such a great deal for electric customers? It’s the smallest part of an electric bill — 11 percent on average — compared with 58 percent for generation and 31 percent for distribution. Transmission pays for itself quickly by relieving costly congestion, moving cheap and clean renewable power to customers, making the grid more reliable and secure, and putting old and inefficient power plants out of business. Simply put, transmission is essential infrastructure for competition, consumer choice, economic efficiency, and environmental protection.

Despite the well-documented value that transmission investments deliver to ratepayers and the environment, FERC has been hearing complaints recently that ROEs for transmission projects are too high, and that ratepayers need relief. These complaints are misguided, and their timing could not be worse. Never in our history has so much depended on expanding and modernizing our electric transmission system.

Our chief electricity officers may never get the ROE for transmission “just right”; the uncertainty of markets, interest rates, and the economy probably make that lofty goal impossible to achieve. But they can — and they must — ensure that ROEs remain at levels that ensure a steady and stable flow of private capital into urgently needed transmission investments. Failing to do so would stall renewable energy development and with it progress on reducing emissions, and would increase the cost of electricity for everyone.

The president’s climate plan is moving forward. State renewable energy standards are helping expedite that progress. The falling costs of wind and solar are driving growth. But none of that will matter if the infrastructure to deliver renewable energy to customers is not built.

Five FERC commissioners will make a little-noticed decision in the near future, one that will either keep us on the right track, or throw a major obstacle — one that we can ill-afford — on the road to achieving our nation’s renewable energy future and stabilizing our world’s climate.

Bill White manages the National Clean Energy Transmission Initiative for the Energy Future Coalition. During the Clinton administration, he served as senior advisor to EPA Administrator Carol Browner.

A 2013 Perspective on the “War on Cancer”: From December 23, 1971, to ‘Where Are We Now?’


WORLD WAR CANCER


Richard Nixon launched the so-called War on Cancer on December 23, 1971, in what was supposed to be a “moonshot” effort to cure the disease. Two years later, a Time magazine cover read, “Toward Control of Cancer.” Two decades after that, it announced, in bold red letters, “Hope in the War Against Cancer,” surmising that “a turning point” may have been reached. In 2001, its cover asked if the blood cancer drug Gleevec “is the breakthrough we’ve been waiting for.” And this past April, the newsweekly pronounced “How to Cure Cancer.” Yet roughly one hundred and forty thousand Americans have died from the disease in the last three months.

Outrage over our paltry victories against cancer informs the forthcoming book, “The Truth in Small Doses: Why We’re Losing the War on Cancer—and How to Win It,” by Clifton Leaf, who wrote a much-discussed essay on the same topic for Fortune in 2004. The title comes from a 1959 pamphlet that tells doctors to trickle out information to cancer-stricken patients, since most of them “couldn’t stand” to know the truth: the disease would kill them and there was little that could be done about it. Today, draped in ribbons of every hue, blinded by the promises of targeted therapies and antioxidants, we have, according to Leaf, neglected a basic truth: “‘the cancer problem’ is, in reality, as formidable a challenge as ever.” (Jerome Groopman discussed the progress in cancer cures, particularly immune therapy, in the magazine last year.)

Leaf is not an oncologist, but he became acquainted with the profession at an early age; he was diagnosed with Hodgkin’s disease at fifteen years old. In the book’s most poignant moment, Leaf orders his father into the corner of his hospital room to atone for having dozed off while sitting bedside. When Leaf woke up the next morning, “the biggest man I had ever known” was still standing in the corner.

As an editor at Fortune, Leaf became enthralled by the promise of Gleevec, an enzyme inhibitor that, since its release in 2001, has proven highly effective at battling chronic myeloid leukemia. Many thought a new age was coming, in which the chaotic spread of cancer would be hindered by drugs that would be precision-targeted to block the replication of rogue cells. It seemed far better than indiscriminately killing both cancerous and healthy cells, as chemotherapy had been doing for the past half-century.

But Gleevec is the exception, not the rule—and C.M.L. is a relatively simple cancer compared to solid tumors of the lung, colon, pancreas, or breast. Once they metastasize, most cannot be cured. Those, like Leaf, who have faced cancer have good reason for their impatience: it takes an average of thirteen years to bring a new cancer drug to market. Many of these drugs are pellets fired into cancer’s flank. A recent article in the New York Times titled “Promising New Cancer Drugs Empower the Body’s Own Defense” hailed a new melanoma drug whose median survival was 16.8 months. An editorial this winter in The Lancet, the august British medical journal, put the matter even more bluntly: “Has cancer medicine failed patients? In the words of cancer experts, the answer is yes.”

Watch this video from “Nanobiotix” on the use of nanotechnology for treating cancer alongside established treatment methods here:

http://www.youtube.com/watch?feature=player_detailpage&v=kxSX6YJTS2I&list=PL9C30814198614279

 

 

Leaf argues we should be closer to an all-out cure, considering our investment in the effort. The National Cancer Institute receives roughly five billion dollars per year from the federal government. If both public and private investments are accounted for, Leaf estimates the United States spends about sixteen billion dollars a year on cancer research. Nor is there a lack of political will to eradicate cancer, as there is for, say, reducing carbon emissions. Leaf calls it a “bipartisan disease” that a Republican from Alabama would want defeated as much as a Democrat from Illinois. President Barack Obama said in 2009 that he would “launch a new effort to conquer a disease that has touched the life of nearly every American, including me, by seeking a cure for cancer in our time.”

In Leaf’s telling, oncology is a hidebound field averse to risk, a culture that “has grown progressively less hospitable to new voices and ideas over the past four decades.” He yearns for the likes of Sidney Farber, the unorthodox pathologist who invented chemotherapy in the late nineteen forties at Boston Children’s Hospital by injecting children stricken with acute lymphoblastic leukemia with aminopterin, which prevents cancer cells from replicating. A hero in Siddhartha Mukherjee’s “The Emperor of All Maladies,” Farber is largely responsible for the fact that childhood A.L.L. is a manageable disease today. But his methods had a high cost: he disobeyed superiors, conducted his own trial-and-error studies, and foisted unproven drugs on sick, vulnerable children.

What made Farber an iconoclast is that he wanted to cure cancer even more than he wanted to understand it. As he would come to argue, “The three hundred and twenty-five thousand patients with cancer who are going to die this year cannot wait; nor is it necessary, in order to make great progress in the cure for cancer, for us to have the full solution of all the problems of basic research…the history of Medicine is replete with examples of cures obtained years, decades, and even centuries before the mechanism of action was understood for these cures.”

Few bold new projects are being funded now, writes Leaf, noting that in 2010, the N.C.I. used the bulk of its two billion dollars in research grants on existing projects. He is equally incensed that the same institutions get most of the money, writing that “in 2011, the top 43 research centers got more funding ($12 billion) than did the bottom 2,574 institutions receiving any kind of NIH support.” To some, this is the price of science that is both sound and safe. To others, it is a culture of scientific inefficiency, an I.B.M. mindset in a field that desperately yearns for Apple.

Oncologists in the field with whom I spoke agreed with this overall assessment of the War on Cancer. Andrea Hayes-Jordan, a pediatric surgical oncologist at the M. D. Anderson Cancer Center in Houston, told me that “Our strategic attacks are improving, and we are winning some battles, but not the war yet.” Silvia Formenti, who chairs the radiation oncology department at New York University’s Langone Medical Center, was even more negative in her assessment of the War on Cancer. She wrote to me in an e-mail, “We have managed to make cancer a huge business, and a national ‘terror,’ but the progress in reducing mortality is quite questionable.”

The book suggests some remedies, foremost among them preventing cancer before it strikes. At Stage 0, a cancerous growth can be detected and removed before it has diversified and spread. By the time a tumor is the size of a grape, it has as many as a billion cells. Those cells become increasingly heterogeneous, and once they break through the basement membrane that acts as a final barrier between organs and tissues, they are free to metastasize throughout the body via the bloodstream or the lymphatic system.
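The "billion cells" figure above follows from simple doubling arithmetic: starting from a single cell, about thirty doublings reach a billion, since 2³⁰ ≈ 1.07 billion. A sketch (illustrative only; real tumor growth is far less uniform than clean doubling):

```python
# Rough arithmetic behind "a grape-sized tumor has as many as a
# billion cells": repeated doubling from one cell reaches a billion
# in about 30 divisions (2**30 ~= 1.07 billion). Illustrative only --
# real tumor growth is not this uniform.
import math

def doublings_to_reach(cell_count):
    """Doublings needed for a single cell to reach cell_count cells."""
    return math.ceil(math.log2(cell_count))

print(doublings_to_reach(1_000_000_000))  # -> 30
```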

The book finds great promise in the chemoprevention pioneered by Dartmouth researcher Michael Sporn, who wants to treat pre-invasive lesions as seriously as full-blown cancers. This seems to fly in the face of the cautious watch-and-wait philosophy popular with many oncologists, who have become convinced (not without reason) that the cure—toxic chemotherapy, high doses of radiation—could be worse than the disease.

However, other than the breast cancer drug tamoxifen and the H.P.V. vaccine—both of which can reduce the risk of getting cancer, not cure the disease—the promise of chemoprevention remains largely unrealized. A recent paper by two preventative oncologists concluded, “There have been numerous chemoprevention trials in the past 10 years, but the number of approved chemoprevention drugs is still quite small.” Another recent study on older men with prostate cancer suggested that “watchful waiting” was often the best route, noting that many patients opted for expensive treatments they didn’t need, which sometimes led to impotence and incontinence. And a federal task force ruled four years ago that women should delay getting mammograms until age fifty (ten years later than the previous recommendation) because of the procedure’s own potential dangers.

Leaf acknowledges these dangers, and also points out an even more serious problem with chemoprevention: biomarkers that would signal carcinogenesis in its earliest stages have not been found. So while he is correct to highlight the potential promise of a prophylactic approach, Leaf’s own description of “the failed biomarker hunt” is, indirectly, a defense of why oncologists today are left with no choice but to wait until the disease develops.

The desire for an accelerated approach to cancer has antecedents in the AIDS activism of the nineteen-eighties. As Mukherjee describes in his book, organizations like ACT UP “made the FDA out to be a woolly bureaucratic grandfather—exacting but maddeningly slow.” That had repercussions in cancer medicine, where patients also demanded quicker access to potentially life-saving therapies. Especially en vogue by the early nineties was “megadose chemotherapy” for breast cancer, complemented by a bone marrow transplant. (The original marrow would have been destroyed by the high toxicity of the purported cure.) Yet as Mukherjee notes, by early 2000, the procedure was discovered to have been supported by fictional studies. One of its main proponents, a South African oncologist named Werner Bezwoda, had charmed his fellow practitioners with astounding results that masked the true, fatal dangers of this excessive approach. Mukherjee calls Bezwoda’s influential drug trials “a fraud, an invention, a sham,” yet he was hardly the lone cheerleader for megadose chemotherapy. Any urge to hasten the War on Cancer—however justified that urge may be—must grapple with the risk of promising anecdotes curdling into hideous truths.

Of course, some approaches are neither terribly controversial nor difficult, at least from a medical standpoint: Debu Tripathy of the University of Southern California’s Norris Cancer Center told me that he believes that ninety per cent of all lung cancers could be eliminated through the cessation of cigarette smoking. Studies have shown a link between red meat consumption and an elevated risk of cancer. Here, then, may be cancer prevention in its simplest form.

On the whole, Leaf is much less optimistic than Mukherjee. Surveying the state of cancer medicine as it was in 2005, Mukherjee concludes, “The empire of cancer was still indubitably vast…but it was losing power, fraying at its borders.” Surveying some three thousand years of humanity’s battle with cancer, Mukherjee’s is the more meditative work. Leaf’s book is more urgent, more insistent—the voice of a frightened patient who yearns for a cure, rather than of the sober oncologist concerned with getting the science right. “Emperor” is a story; “Truth” is an argument.

Earlier in June, researchers discovered a tumor in the rib bone of a Neanderthal believed to be a hundred and twenty thousand years old. What plagued him then still plagues us today, much as it plagued Atossa, the ancient Persian queen who is believed to have suffered from breast cancer, as well as the London chimney sweeps stricken with scrotal malignancies. This war has been a long one.

Alexander Nazaryan is a writer living in Brooklyn.

Photograph by Biophoto Associates/Science Source.

SOURCE

http://www.newyorker.com/online/blogs/elements/2013/07/world-war-cancer.html

NIST seeks proposals to establish new Center of Excellence on Advanced Materials Research


(Nanowerk News) The National Institute of Standards and Technology (NIST) has announced a competition to create an Advanced Materials Center of Excellence to foster interdisciplinary collaborations between NIST researchers and scientists and engineers from academia and industry. The new center will focus on accelerating the discovery and development of advanced materials through innovations in measurement science and in new modeling, simulation, data and informatics tools.
Block copolymer: computer models of polymer mixtures studied at NIST can help develop improved lithography resists for nanomanufacturing.
NIST anticipates funding the new center at approximately $5 million per year for five years, with the possibility of renewing the award for an additional five years. Funding is subject to the availability of funds through NIST’s appropriations. The competition is open to accredited institutions of higher education and nonprofit organizations located in the United States and its territories. The proposing institution may work as part of a consortium that could include other academic institutions; nonprofit organizations; companies; or state, tribal or local governments.
Advanced materials, such as new high-performance alloys or ceramics, polymers, glasses, nanocomposites or biomaterials, are a key factor in global competitiveness. They drive the development of new products and new technical capabilities, and can create whole new industries. However, currently, the average time from laboratory discovery of a new material to its first commercial use can take up to 20 years. Reducing that lag by half is one of the primary goals of the administration’s Materials Genome Initiative, announced in 2011.
In many cases, the lengthy time for materials development is due to a repetitive process of trial-and-error experimentation that would be familiar to Thomas Edison. The Materials Genome Initiative and the new NIST center focus on dramatically reducing this through the use of measurement and data-based research tools: massive materials databases, computer models and computer simulations. The new center will provide a mechanism to merge NIST expertise and resources in materials science, materials characterization, reference data and standards with leading research capabilities in industry and academia for designing, producing and processing advanced materials.
Full details of the solicitation, including eligibility requirements, selection criteria, legal requirements and the mechanism for submitting proposals, are found in an announcement of Federal Funding Opportunity (FFO) posted at Grants.gov under funding opportunity number 2013-NIST-ADV-MAT-COE-01.
Applications will only be accepted through the Grants.gov website. The deadline for applications is 11:59 p.m. Eastern time, Aug. 12, 2013.
NIST will offer a webinar presentation on the Advanced Materials Center of Excellence on July 15, 2013, at 2 p.m. Eastern time. The webinar will offer general guidance on preparing proposals and provide an opportunity to answer questions from the public about the program. Participation in the webinar is not required to apply. There is no cost for the webinar, but participants must register in advance. Information on, and registration for, the webinar is available at www.nist.gov/mgi.
Source: NIST

Read more: http://www.nanowerk.com/news2/newsid=31095.php

NREL’s Keith Emery Awarded Prestigious Cherry Award: Efficiency of Solar Cells


Top PV award goes to researcher who brought credibility to testing of solar cells and modules

June 19, 2013

An engineer from the Energy Department’s National Renewable Energy Laboratory (NREL) whose testing and characterization laboratory brings credibility to the measurement of efficiency of solar cells and modules has been awarded the prestigious William R. Cherry Award by the Institute of Electrical and Electronics Engineers (IEEE).

Keith Emery, a principal scientist at NREL, received the award at the 39th IEEE Photovoltaic Specialists Conference in Tampa, Florida.

“Accredited measurements from Emery’s laboratories are considered the gold standard by the U.S. and international PV communities,” said NREL colleague Pete Sheldon, Deputy Director of the National Center for Photovoltaics on the NREL campus in Golden, CO. “His leadership in the development of cell and module performance measurement techniques and standards has set the foundation for the PV community for the last 25 years.”

The award is named in honor of William R. Cherry, a founder of the photovoltaic community. In the 1950s, Cherry was instrumental in establishing solar cells as the ideal power source for space satellites and for recognizing, advocating and nurturing the use of photovoltaic systems for terrestrial applications. The purpose of the award is to recognize an individual engineer or scientist who devoted a part of their professional life to the advancement of the science and technology of photovoltaic energy conversion.

Emery is the third consecutive Cherry Award winner from NREL. In 2011, Jerry Olson, who developed the multi-junction solar cell, won the award. Last year, Sarah Kurtz, who helped Olson develop the multi-junction cell and now is a global leader in solar module reliability, won the award. Three other NREL scientists won the Cherry Award previously – Paul Rappaport (1980), Larry Kazmerski (1993), and Tim Coutts (2005).

Emery says he was floored by the award, considered among the top one or two annual awards globally in the photovoltaic community.

Others aren’t surprised, citing his work to bring iron-clad certainty to the claims made by solar companies about the efficiency of their photovoltaic cells and modules – not to mention the 320 scientific publications he has authored.

He has spent his career building the capabilities of that testing and characterization lab, making it one of a handful of premier measurement labs in the world – and the only place in the United States that calibrates primary terrestrial standards for solar-cell characterization.

Without independent verification, implausible claims of high efficiency would circulate unchecked in the literature. “We decided that independent verification was critical for credibility,” Emery said.

“We have to thank DOE for this,” Emery said. “They’ve funded it. We’ve been able to offer the service to all terrestrial PV groups in the U.S. from national labs to universities to low-budget startups. They all get the same quality of service.”

The service is kept readily available so that researchers and companies have equal access to the resources needed for independent efficiency measurement, he said. “We provide the same playing field for everyone.”

Emery spent the first 25 years of his life in Lansing, Michigan, attending public schools, then going on to Lansing Community College and Michigan State University where he earned his bachelor’s and master’s degrees. From there he went to Colorado State University to fabricate and test ITO on silicon solar cells, and then was hired at NREL. At NREL, in the 1980s, Emery developed the test equipment and put together the data-acquisition system for characterizing and measuring the efficiency of solar cells.

Emery gives much of the credit to the colleagues who work in his lab and who have on average about 16 years at NREL. “Take my team away and I wouldn’t have gotten this award – it’s that simple.”

Sheldon said Emery’s work “brings scientific credibility to the entire photovoltaic field, ensuring global uniformity in cell and module measurements. His getting the award is certainly well deserved.”

NREL is the U.S. Department of Energy’s primary national laboratory for renewable energy and energy efficiency research and development. NREL is operated for the Energy Department by the Alliance for Sustainable Energy, LLC.

###

Visit NREL online at www.nrel.gov

$244 BILLION in Renewable Energy Investments: 2012


Jun 14, 2013

REN21 and Frankfurt School – UNEP report on global renewable energy investments
Around $244 billion was invested in renewable energy in 2012, with a geographic shift toward developing countries, according to sister reports released by REN21, a global renewable energy policy network, and the Frankfurt School – UNEP Collaborating Centre for Climate & Sustainable Energy Finance.

According to the reports, 115 GW of new renewable capacity was installed globally in 2012, although global investments fell 12 percent from 2011. The drop in investment was mainly because of dramatically lower solar prices and weakened U.S. and European markets, according to the report from the Frankfurt School.

Despite that, 2012 remains the second highest year in history for renewable energy investments, with a continuing upward trend in developing countries. The reports note that investments in developing countries were around $112 billion, compared to $132 billion in developed countries.

The reports noted that renewable energy made up about half of total electric capacity additions in the U.S. in 2012, with more capacity added from wind power than from any other technology. Total U.S. investment in renewable energy, however, fell 34 percent to $36 billion. The reports cite uncertainty over U.S. policy as the main reason for the decrease.

The largest investment in renewable energy was made in China, where a 22 percent increase raised the total investment in 2012 to $67 billion, largely due to a jump in solar investment, according to the reports.

Wind power accounted for about 39 percent of the renewable energy capacity added in 2012, followed by hydropower and solar photovoltaic power, which each accounted for about 26 percent of new capacity.
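Those shares can be turned into rough capacity figures with back-of-envelope arithmetic against the 115 GW total reported above (the shares are approximate, so the results are indicative only):

```python
# Back-of-envelope arithmetic from the shares reported above.
# 115 GW of new renewable capacity was installed globally in 2012.
total_new_gw = 115
shares = {"wind": 0.39, "hydropower": 0.26, "solar PV": 0.26}

added_gw = {tech: total_new_gw * share for tech, share in shares.items()}
for tech, gw in added_gw.items():
    print(f"{tech}: ~{gw:.0f} GW")
# The remaining ~9% covers other technologies (biomass, geothermal, etc.).
```

That works out to roughly 45 GW of wind and about 30 GW each of hydropower and solar PV.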

Renewable Energy: World Invests $244 billion in 2012, Geographic Shift to Developing Countries. Installed capacity continues to grow as solar prices drop 30-40%, new wind installations surge

For only the second time since 2006, global investments in renewable energy in 2012 failed to top the year before, falling 12% mainly due to dramatically lower solar prices and weakened US and EU markets.

There was a continuing upward trend in developing countries in 2012, with investments in the South reaching $112 billion, against $132 billion in developed countries – a dramatic change from 2007, when developed economies invested 2.5 times more in renewables (excluding large hydro) than developing countries; that gap has since closed to just 18%.
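The 18% figure follows directly from the two investment totals; a quick check (amounts in $ billions):

```python
# Quick check of the investment gap cited above (all amounts in $ billions).
developing = 112   # 2012 renewable energy investment in developing countries
developed = 132    # 2012 renewable energy investment in developed countries

gap = (developed - developing) / developing
print(f"developed-country lead: {gap:.0%}")   # ~18%, matching the report
```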

The main issue holding back investment last year was instability in the policy regime for renewable energy in important developed-economy markets. Future investment is likely to coalesce in countries whose policies command investor confidence and that combine a need for extra generating capacity with strong renewable power resources.

After being neck-and-neck with the US in 2011, China was the dominant country in 2012 for investment in renewable energy, its commitments rising 22% to $67 billion, thanks to a jump in solar investment. But there were also sharp increases in investment for several other emerging economies, including South Africa, Morocco, Mexico, Chile and Kenya.

The other major theme of 2012 was a further, significant reduction in the costs of solar photovoltaic technology. The levelised cost of generating a MWh of electricity from PV was around one third lower last year than the 2011 average. This took small-scale residential PV power, in particular, much closer to competitiveness.
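The “levelised cost” quoted above is conventionally computed by annualising capital costs over a plant’s lifetime and dividing total annual cost by annual output. A minimal sketch follows, using the standard textbook formula; the parameter values are hypothetical placeholders, not figures from the report. It illustrates why a roughly one-third drop in module prices translates into a comparable drop in cost per MWh when capital costs dominate:

```python
# Minimal LCOE sketch (standard annualised-cost formula).
# All parameter values below are hypothetical, chosen only for illustration.
def lcoe(capex, om_per_year, annual_mwh, discount_rate, lifetime_years):
    """Levelised cost of electricity in $/MWh."""
    # Capital recovery factor: annualises the upfront capital cost.
    crf = discount_rate * (1 + discount_rate) ** lifetime_years / (
        (1 + discount_rate) ** lifetime_years - 1
    )
    return (capex * crf + om_per_year) / annual_mwh

# Hypothetical 1 MW PV plant: $2M capex, $20k/yr O&M, 1,600 MWh/yr output.
before = lcoe(2_000_000, 20_000, 1_600, discount_rate=0.06, lifetime_years=25)
# Same plant with capex cut by one third:
after = lcoe(2_000_000 * 2 / 3, 20_000, 1_600, discount_rate=0.06, lifetime_years=25)
print(f"${before:.0f}/MWh -> ${after:.0f}/MWh")
```

With capital dominating the cost structure, the one-third capex cut lowers the cost per MWh by roughly 30%, consistent in spirit with the trend the report describes.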

Quantum Dots that Assemble Themselves


A paper on the new technology, “Self-assembled Quantum Dots in a Nanowire System for Quantum Photonics,” appears in the current issue of the scientific journal Nature Materials. Quantum dots are tiny crystals of semiconductor a few billionths of a meter in diameter. At that size they exhibit beneficial behaviors of quantum physics such as forming electron-hole pairs and harvesting excess energy.

The scientists demonstrated how quantum dots can self-assemble at the apex of the gallium arsenide/aluminum gallium arsenide core/shell nanowire interface. Crucially, the quantum dots, besides being highly stable, can be positioned precisely relative to the nanowire’s center. That precision, combined with the materials’ ability to provide quantum confinement for both the electrons and the holes, makes the approach a potential game-changer.

Electrons and holes typically settle into the lowest-energy positions available within a nanostructure – in this case, the gallium arsenide core. But in the new demonstration, the electron and hole, overlapping in a near-ideal way, are confined at higher energy within the quantum dot itself rather than in those lowest-energy states. It’s like hitting the bulls-eye rather than the periphery.

The quantum dots, as a result, are very bright, spectrally narrow and highly anti-bunched, displaying excellent optical properties even when they are located just a few nanometers from the surface – a feature that surprised even the scientists. “Some Swiss scientists announced that they had achieved this, but scientists at the conference had a hard time believing it,” said NREL senior scientist Jun-Wei Luo, one of the co-authors of the study. Luo got to work constructing a quantum-dot-in-nanowire system using NREL’s supercomputer and was able to demonstrate that, although the overall band edges are formed by the gallium arsenide core, the thin aluminum-rich barriers provide quantum confinement for both the electrons and the holes inside the aluminum-poor quantum dot. That explains the origin of the highly unusual optical transitions.

Several practical applications are possible. The fact that stable quantum dots can be placed very close to the surface of the nanowires raises a huge potential for their use in detecting local electric and magnetic fields. The quantum dots could also serve as charge converters for better light harvesting, as in photovoltaic cells.

The team of scientists working on the project came from universities and laboratories in Sweden, Switzerland, Spain, and the United States.

More information: http://www.nature.com/nmat/journal/vaop/ncurrent/fig_tab/nmat3557_F4.html
Journal reference: Nature Materials
Provided by National Renewable Energy Laboratory

Read more at: http://phys.org/news/2013-04-team-quantum-dots.html

Making the Most of Big Data


Aiming to make the most of the explosion of Big Data and the tools needed to analyze it, the Obama Administration announced a “National Big Data Research and Development Initiative” on March 29, 2012. To launch the initiative, six Federal departments and agencies announced more than $200 million in new commitments that, together, promise to greatly improve and develop the tools, techniques, and human capital needed to move from data to knowledge to action. The Administration is also working to “liberate” government data and voluntarily-contributed corporate data to fuel entrepreneurship, create jobs, and improve the lives of Americans in tangible ways.

As we enter the second year of the Big Data Initiative, the Administration is encouraging multiple stakeholders including federal agencies, private industry, academia, state and local government, non-profits, and foundations, to develop and participate in Big Data innovation projects across the country. Later this year, the Office of Science and Technology Policy (OSTP), NSF, and other agencies in the Networking and Information Technology R&D (NITRD) program plan to convene an event that highlights high-impact collaborations and identifies areas for expanded collaboration between the public and private sectors.  The Administration is particularly interested in projects and initiatives that:

  • Advance technologies that support Big Data and data analytics;
  • Educate and expand the Big Data workforce;
  • Develop, demonstrate and evaluate applications of Big Data that improve key outcomes in economic growth, job creation, education, health, energy, sustainability, public safety, advanced manufacturing, science and engineering, and global development;
  • Demonstrate the role that prizes and challenges can play in deriving new insights from Big Data; and
  • Foster regional innovation.

Please submit a two-page summary of projects to BIGDATA@nsf.gov.  The summary should identify:

  1. The goal of the project, with metrics for evaluating the success or failure of the project;
  2. The multiple stakeholders that will participate in the project and their respective roles and responsibilities;
  3. Initial financial and in-kind resources that the stakeholders are prepared to commit to this project; and
  4. A principal point of contact for the partnership.

The submission should also indicate whether the NSF can post the project description to a public website.  This announcement is posted solely for information and planning purposes; it does not constitute a formal solicitation for grants, contracts, or cooperative agreements.

Deadline Date for Submission of Summaries: April 22, 2013

International partnership between New York State and the State of Israel to grow nanotechnology industry


(Nanowerk News) Governor Andrew M. Cuomo today announced the signing of a Memorandum of Understanding (MOU) to establish an international partnership between New York State and the State of Israel, through a collaboration involving the College of Nanoscale Science and Engineering (CNSE) and the Israeli Industry Center for Research & Development (MATIMOP), that will significantly expand business, technology, and economic relations in the burgeoning field of nanotechnology, while enabling billions of dollars in new investments and the creation of thousands of high-tech jobs in New York and Israel.
“I am so proud of the partnership between the State of Israel  and College of Nanoscale Science and Engineering, which continues to be the  leader in the global nanotechnology industry,” said Governor Cuomo. “This  partnership will strengthen our state’s relationship with the State of Israel,  while also investing in a thriving industry that will create jobs and expand the  economy right here in New York.”
Lieutenant Governor Robert Duffy said, “This partnership is yet  another example of how Governor Cuomo has strengthened New York State’s global  reputation as an attractive place to do business and create jobs. I thank the  State of Israel for partnering with New York State to ensure the continued  growth of the global nanotechnology industry. New York State is at the forefront  of this industry, and I commend Dr. Alain Kaloyeros for his leadership and hard  work on this agreement. Through this partnership, the College of Nanoscale  Science and Engineering can continue to drive this emerging and rapidly growing  field.”
Nili Shalev, Israel’s Economic Minister to North America, said, “This agreement is the first significant step to stimulate scientific and  industrial collaboration in areas where both states excel. The partnership will  enable Israeli companies access to CNSE’s renowned facilities and collaborate  with leading American and multinational companies on campus. It introduces many  other opportunities, including industrial R&D and commercialization joint  ventures, natural synergy between the two G450 Consortia of both states, and the  enhancement of academic research in Nano scaling. I would like to congratulate  Governor Cuomo, Lt. Governor Duffy and Dr. Alain Kaloyeros, the CEO of CNSE, for  supporting this initiative.”
Dan Vilenski, Former Chairman of Applied Materials’ Israeli  subsidiary and Board member of the Israeli National Nanotechnology Initiative,  said, “Nanotechnology is one of the major areas in which both Israel and New  York have a great deal to offer. Israel is the leader in metrology and  inspection in the semiconductor market, and the State of New York has built one  of the leading facilities in the world for Nano scaling research and will play a  significant role in shaping the future of this industry.”
State University of New York Chancellor Nancy Zimpher said, “The  governor has fostered an innovation environment in New York that is drawing top  scientists from around the world, and through SUNY’s globally renowned  NanoCollege, the potential for advancement and discovery is limitless. Only a  world-class university system like SUNY can generate an international  collaboration and investment of this magnitude. I want to welcome our new  Israeli partners to New York and to the SUNY system. I am sure our combined  expertise and passion for academic excellence and high tech innovation will  yield tremendous results.”
CNSE Senior Vice President and CEO Dr. Alain Kaloyeros said, “As  further testament to the pioneering leadership, strategic vision, and critical  investments of Governor Andrew Cuomo, which have truly established New York as  the epicenter for the global nanotechnology industry, the NanoCollege is  delighted to enter into this partnership with the most prestigious Israeli  Industry Center. In harnessing the power of nanotechnology innovation to bring  together corporate and university partners from the U.S. and Israel, this  collaboration sets the stage for leading-edge advances in nanoscale  technologies, and opens the door for high-tech growth that will provide exciting  career and economic opportunities for individuals and companies across the New  New York.”
The partnership announced today between CNSE and MATIMOP, acting  on behalf of the Office of the Chief Scientist (OCS) in the ministry of Industry  Trade and Labor, builds on and leverages the multi-billion dollar investments in  New York’s nanotechnology industry under the leadership of Governor Cuomo. This  partnership will facilitate and promote bilateral and multilateral research,  development, and commercialization programs in innovative nanoscale technologies  between corporations and academic institutions in the U.S. and Israel.
Through the agreement, the Israeli government has allocated up  to $300 million a year to fund access for Israeli companies and universities to  CNSE’s state-of-the-art 300mm wafer and 450mm wafer infrastructure, facilities,  resources, and know-how, which are unparalleled worldwide. In addition, a  publicity and marketing campaign is being prepared to generate interest and  participation from Israel’s corporate and academic entities.
The centerpiece of the collaboration is the NanoCollege, the  most advanced nanotechnology education, research, development, and deployment  enterprise in the world. With more than $14 billion in high-tech investments,  over 300 global corporate partners, and a footprint that spans upstate New York,  CNSE is uniquely positioned to support this first-of-its-kind partnership.
The agreement is designed to enable a host of nanotechnology  research and development (R&D), prototyping, demonstration and  commercialization activities, including the facilitation of partnerships to spur  collaborative projects targeting industrial R&D and commercialization;  exchange of technical information and expertise to promote global development of  next-generation nanoscale technologies; and the organization of joint seminars  and workshops to enhance cooperation between corporate and academic entities in  New York and Israel.
Specific technology areas targeted for initial collaboration  include sub-systems, sensors and accessories for deployment in the nanoscale  cleanroom environment; simulation and modeling for next-generation tools and  technologies; and tools, processes, and testing technologies essential to  accelerate critical innovations in the multiple fields enabled by  nanotechnology, including nanoelectronics, energy, and health care, among  others.
Congressman Paul Tonko said, “This partnership between New York  State and Israel is yet further proof that the Capital Region is not only  renowned on a national stage, but indeed on the world stage. Clean energy  innovation jobs and long-term economic growth require investments, and Tech  Valley laid that foundation years ago. As a fast-growing region for high tech  jobs and all the ancillary benefits that follow, these sorts of partnerships led  by Governor Cuomo will ensure we remain a bright spot for continued education,  research, development and deployment by some of the most cutting edge  innovators, entrepreneurs, small businesses and large companies in the world.”
Senator Neil D. Breslin said, “This is a fantastic partnership  between New York State and the State of Israel that will create jobs, further  leverage a proven investment, and continue to let the Capital Region shine as  the forefront of the nanotechnology industry. I commend Governor Cuomo for  championing the growth in nanotechnology in New York, and the State of Israel  for choosing to enter into this great partnership.”
Assemblywoman Patricia Fahy said, “I am pleased that this  partnership between New York State and the State of Israel will not only create  jobs, but add immensely to a much-needed boost of economic development in the  Capital Region. I congratulate the Governor for bringing global attention to New  York State in the field of nanotechnology, and the State of Israel for choosing  to do business in New York.”
Mayor Jerry D. Jennings said, “I applaud Governor Cuomo for his  leadership in developing Albany’s nanotechnology sector, and thank the State of  Israel and the College of Nanoscale Science and Engineering for their hard work  in making this partnership a reality. This is great news for the Capital Region,  which has already seen immense growth in this industry, and I look forward to  ensuring that this progress continues.”
Albany County Executive Daniel McCoy said, “Governor Cuomo has  done a great job leading the way toward greater economic development, and this  partnership between New York State and the State of Israel is just another  example. I applaud the Governor, the State of Israel, and the College of  Nanoscale Science and Engineering for their hard work in developing this  partnership that will spur job creation, economic development, and greater  international attention for our state.”
Source: College of Nanoscale Science and  Engineering

Read more: http://www.nanowerk.com/news2/newsid=29643.php

Saudi Money Shaping U.S. Research


Susan Schmidt | February 11, 2013

Saudi Arabia’s oil reserves are expected to run dry in fifty years. This prospect has encouraged the Saudis to go shopping for cutting-edge science that can secure the kingdom’s future—at elite American research universities.

 

King Abdullah and Saudi Aramco are spending tens of billions on technology research to make the oil last longer and develop other energy resources that future Saudi generations can someday export.


King Abdullah University of Science and Technology opened its doors in 2009 and already has lavished more than $200 million on top U.S. university scientists. Stanford, Cornell, Texas A&M, UC Berkeley, CalTech, Georgia Tech—all are awash in new millions of Saudi cash for research directed at advancing solutions for Saudi energy and water needs. The new university, known as KAUST, has similar partnerships with scientists at Peking University and Oxford.

Many American universities and their scientists, lured by research grants of as much as $25 million, have jumped at the chance to partner with KAUST. Some of those scientists do research at their universities here and spend a small part of their time in Saudi Arabia creating “mirror” labs.

The arrangement with KAUST raises novel and largely unaddressed issues for American universities. With the United States determined to become energy self-sufficient, what are the ramifications of having scientists at top university labs—many of them recipients of U.S. government research dollars—devoting their efforts to energy pursuits selected by Saudi Arabia?

KAUST funding for U.S. scientists is geared to helping the Saudis cut their own heavy oil use at home to lengthen the life of their much more lucrative exports. It’s aimed at getting more oil per well with new technology, finding new reserves and developing new methods of carbon capture for continued use of fossil fuels. American scientists are also working to develop solar technology, including solar panels that can survive sandstorms and power desalinization of the Red Sea for water and electricity.

Among the areas KAUST is not funding is research on biofuels—which compete with oil—except for work on Red Sea algae.

KAUST’s mission statement lays out a plan to rapidly become a top international institution that “will play a crucial role in the development of Saudi Arabia and the world.” KAUST’s goal is not only to find new energy sources, but to create a Silicon Valley-like commercial hub of jobs and innovation. King Abdullah provided a whopping $20 billion endowment to launch the graduate-level research institution, and named the Saudi oil minister chairman of the board of trustees. Aramco built the campus, funds current operating costs and provided administrative leadership.

“It’s an important research lab for Aramco with a university façade,” said Alyn Rockwood, one of several scientists who say they want KAUST to succeed but believe a corporate ethos is stifling academic autonomy.

Some have bridled over changes that require them to get administrative approval in spending their research funds. KAUST officials declined interview requests, but in a Science magazine story late last year that cited some of those complaints, the former Aramco executive who runs KAUST, Nadhmi al-Nasr, acknowledged that he comes from a “top-down” corporate culture and is adjusting to academia.

Scientific research at universities is a key driver of debate over how to meet global energy needs. Often of late, it is the research itself that gets debated. Dueling studies about the environmental impact of biofuels and the safety of hydraulic fracking for natural gas, for example, have spurred charges and countercharges about the role of commercial interests in biasing the science.

The impact of published studies is not lost on the leaders at KAUST. In fact, the top of its mission statement sets out very specific goals for getting its research published in “prestigious professional journals.” By that measure, KAUST-funded scientists have been highly successful, with stacks of prestigious journal publications and patents to their credit.

One of them is William J. Koros, a Georgia Tech professor who was awarded a $10 million research grant for his work there on hydrocarbons. “They are very generous to home universities,” he said. Koros is working on technology that would help capture impurities from natural gas. “The Middle East is loaded with natural gas. They viewed this as a world problem that intersected with their interests,” he said.

Experts in issues related to academic research funding say KAUST’s relationship with U.S. scientists is unusual, posing pitfalls as well as opportunities.

“I don’t think there is a framework for dealing with foreign governments or corporations who invest in American universities to compete,” said Tufts professor Sheldon Krimsky, who has studied conflicts of interest in academic research. Where American researchers get their money does not by itself mean the science produced will be anything less than honest. But, he said, scientific inquiry is shaped by the scope of the questions asked.

James Luyten, former director of Woods Hole Oceanographic Institution, sees the creation of a specific research agenda as a problem at KAUST. KAUST awarded Woods Hole $25 million and Luyten spent three years helping set up their Red Sea research center.

“They are using their money to limit and constrain where people put their energy as research scientists,” said Luyten, something that corporate sponsors often try to achieve by carefully choosing which science to fund and which to ignore.

Luyten said he was under “enormous pressure” to devote resources to algae biofuels research, for example, but was discouraged from research on the effect of carbon emissions on Red Sea coral. “A group of us wanted to hold a symposium on climate change,” he said, but the university president rejected the idea. “We were told that was not in the interest of Saudi Arabia,” he said.

KAUST reserves the right to review studies before publication, something that is not generally done by U.S. universities, though scientists and administrators who’ve worked at KAUST say so far it has been pro forma.

American universities, faced with a shrinking pool of research dollars at home, have welcomed the Saudi partnership as a way to fund important science, including in the area of carbon capture, an issue that has global implications. Creating jobs and educating the Saudi populace is seen as vital to making theirs a stable society, something that may benefit the rest of the world, though aiding a repressive regime has drawn objections from faculty on a few U.S. campuses. To bring in foreign scientists, the Saudi king has made KAUST an oasis of modernity, where male and female students are allowed to mix.

Several prominent scientists said KAUST has the resources to have a big impact on scientific research.

“I don’t think there is any university in the world that has as advanced equipment as they have,” said Stanford solar cell researcher Mike McGeehee. He spent a month helping set up a lab at KAUST and leads Stanford’s Center for Advanced Molecular Photovoltaics, created with a $25 million KAUST grant.

Science at KAUST is directed more toward commercial application. “Things are different there. There’s a tighter connection to industry,’’ said McGeehee.

“You can’t do certain kinds of research at US universities—you can’t have industry come in and do experiments because federal dollars are paying for it, and you can’t give one company an advantage over another. But there, the king says I’m paying for it, I want [commercial] spin-offs.”

American university relationships with corporate research sponsors are a hotly debated topic, notably because of controversy over biased drug studies paid for by pharmaceutical companies. Many universities encourage professors to find corporate as well as government funders, but they keep those contractual arrangements confidential, including terms for industry access to research as well as intellectual-property arrangements. The American Association of University Professors is completing a major study on how universities should structure industry relationships.

To date, in fact, KAUST’s website has publicized its grants to a greater degree than the U.S. universities and scientists receiving them. Universities here have reported very few of the KAUST grants and contracts to the U.S. Department of Education, which maintains a public database of foreign funds to American colleges.

AAUP president Cary Nelson, who is working on the report on corporate-sponsored research, said he was not previously aware of the KAUST grants. “What you are looking at is the touchiest area. All funded research should be reviewed by faculty senate or faculty committee. It should be transparent,” he said.

Cornell University campus publications contain more information about its work with KAUST than is available from other universities, but even there administrators are circumspect about the terms of Cornell’s $28 million in KAUST grants and contracts.

“It’s not public,” said Celia Szczepura, administrator of the KAUST-Cornell Center for Energy and Sustainability. As for the work Cornell does that may end up aiding the Saudi oil industry, she said: “KAUST isn’t an industry sponsor—it’s a university. What they share with Aramco and what they don’t, you’d have to ask KAUST.”

But separating the Saudi king’s new university from the kingdom’s oil industry is all but impossible. For now, Saudi Arabia’s petroleum interests have a key role in choosing what energy research is pursued by some of America’s leading scientists.

Susan Schmidt is a longtime Washington journalist and a visiting fellow with the Foundation for Defense of Democracies.