Finding Ways to Cure the Energy Dense but Short-Lived Lithium-Sulfur Battery – A ‘First-Time Look’



Everyone’s heard the phrase about seeing both the details and the big picture, and that struggle comes into sharp relief for those studying how to create batteries that hold more energy and cost less. It’s difficult to see the details of atomic and topographical changes as a battery operates.

For DOE’s Joint Center for Energy Storage Research (JCESR), Vijay Murugesan and his colleagues at Pacific Northwest National Laboratory and Texas A&M University found a way. The result? They saw the products of the parasitic electrolyte decomposition reactions. Those reactions build up a layer that smothers the electrode in energy-dense but short-lived lithium-sulfur batteries (Chemistry of Materials, “In-Situ Chemical Imaging of Solid-Electrolyte Interphase Layer Evolution in Li-S Batteries”).
This research is thanks, in part, to a new device that let the team track the progression of sulfur in a vacuum inside a powerful scientific instrument. “We can now realistically probe the reactions happening and view how the products actually spread,” said Murugesan, a PNNL researcher.
The Forest, the Trees and Parasitic Reactions in Batteries
Researchers built a new stage and created a designer electrolyte to obtain both detailed and broad overviews of a troubling layer that causes promising lithium-sulfur batteries to fail. (Image: Nathan Johnson, PNNL)
 

Better batteries affect everything from how you get to work to how long you can work on your laptop computer before finding an outlet. The results from this fundamental study benefit energy storage in two ways. First, to do the work, the team created a new “stage.” This device let scientists determine the atomic composition and electronic and chemical state of the atoms on the electrode while the battery was running. Scientists can use this device to obtain a detailed view of other batteries.

“Doing this measurement is challenging,” said Vaithiyalingam Shutthanandan, a PNNL scientist who worked on the research. “This is the first time we could access this level of quantity and quality data while batteries were charging and discharging.”
The second benefit of this study is the potential to solve the fading issue in lithium-sulfur batteries. “Sulfur is significantly cheaper than current cathode materials in lithium-ion batteries,” said Murugesan. “So the total cost of a lithium-sulfur battery will be low. Simultaneously, the energy density will be a huge advantage, approximately five times more than lithium-ion batteries.”
The team achieved the results thanks to a combination of scientific innovation and serendipity. The innovation came in building the unique stage for the X-ray photoelectron spectroscopy (XPS) instrument. The researchers needed to track the sulfur in the battery, but sulfur volatilizes in a vacuum. All samples in an XPS are studied under vacuum. Combining the newly designed stage and ionic liquids as electrolyte media let the team operate the battery inside the XPS and monitor the growth of sulfur-based compounds to see the parasitic reactions.
“We designed a completely new capability for the XPS system,” said Ashleigh Schwarz, who performed many of the XPS scans on the battery and helped determine the electrolyte to use on the stage.
The electrolyte’s composition is crucial, as it must survive the vacuum used by XPS. Schwarz and her colleagues tested different compositions to see how well the electrolyte performed in the XPS. The team’s choice contained 20 percent of the traditional solvent (DOL/DME) combined with an ionic solvent.
Using the XPS in analysis or spectroscopy mode, the team obtained the atomic information, including the atoms present and the chemical bonds between them. Switching over to an imaging or microscopic mode, the researchers acquired topographical views of the solid-electrolyte interphase (SEI) layer forming. This view let them see where the elements were on the surface and more. The combination of views let them obtain critical information over a wide range of spatial resolutions, spanning from angstroms to micrometers, as the battery drained and charged.
The XPS resides in EMSL, the Environmental Molecular Sciences Laboratory, a DOE Office of Science user facility at PNNL.
In addition, the team benefited from a serendipitous meeting at a national scientific conference. Murugesan was talking with Perla Balbuena, Texas A&M University, about her research into lithium-sulfur batteries. The pair quickly realized that her work on ab initio molecular dynamics modeling would benefit the experiments. Balbuena and her colleague Luis Camacho-Forero worked with the experimentalists to interpret the results and test new ideas about how the SEI layer forms. Knowing how the layer forms could lead to options that stop its formation altogether and greatly extend the battery life cycle.
As part of JCESR, the team is continuing to answer tough questions necessary to create the next generation of energy storage technologies.
Source: Pacific Northwest National Laboratory

 

Long-lasting flow battery could run for more than a decade with minimum upkeep – Harvard Paulson School of Engineering 


Battery stores energy in nontoxic, noncorrosive aqueous solutions

Researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a new flow battery that stores energy in organic molecules dissolved in neutral pH water.

This new chemistry allows for a non-toxic, non-corrosive battery with an exceptionally long lifetime and offers the potential to significantly decrease the costs of production.

The research, published in ACS Energy Letters, was led by Michael Aziz, the Gene and Tracy Sykes Professor of Materials and Energy Technologies and Roy Gordon, the Thomas Dudley Cabot Professor of Chemistry and Professor of Materials Science.

Flow batteries store energy in liquid solutions in external tanks — the bigger the tanks, the more energy they store.


Flow batteries are a promising storage solution for renewable, intermittent energy sources like wind and solar, but today’s flow batteries often suffer degraded energy storage capacity after many charge-discharge cycles, requiring periodic maintenance of the electrolyte to restore that capacity.




By modifying the structures of molecules used in the positive and negative electrolyte solutions, and making them water soluble, the Harvard team was able to engineer a battery that loses only one percent of its capacity per 1000 cycles.
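To put that fade rate in perspective, the projection below is back-of-the-envelope arithmetic of our own (not a calculation from the paper), assuming the roughly one percent per 1,000 cycles loss compounds at a constant per-cycle rate:

```python
# Illustrative projection of capacity retention from the reported fade rate
# of ~1% of capacity lost per 1,000 charge/discharge cycles.
# The cycle counts and the constant per-cycle fade are our assumptions.

fade_per_cycle = 0.01 / 1000  # 1% per 1,000 cycles

for cycles in (1_000, 5_000, 10_000):
    # Compound the per-cycle loss; at rates this small it is nearly linear.
    retention = (1 - fade_per_cycle) ** cycles
    print(f"{cycles:>6} cycles: ~{retention:.1%} of original capacity")
```

At one full cycle per day, 10,000 cycles is more than 27 years of operation, which is what makes the “more than a decade with minimum upkeep” claim plausible.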


“Lithium ion batteries don’t even survive 1000 complete charge/discharge cycles,” said Aziz.

“Because we were able to dissolve the electrolytes in neutral water, this is a long-lasting battery that you could put in your basement,” said Gordon.

 

 

“If it spilled on the floor, it wouldn’t eat the concrete and since the medium is noncorrosive, you can use cheaper materials to build the components of the batteries, like the tanks and pumps.”

This reduction of cost is important. The Department of Energy (DOE) has set a goal of building a battery that can store energy for less than $100 per kilowatt-hour, which would make stored wind and solar energy competitive with energy produced from traditional power plants.


“If you can get anywhere near this cost target then you change the world,” said Aziz. “It becomes cost effective to put batteries in so many places. This research puts us one step closer to reaching that target.”


“This work on aqueous soluble organic electrolytes is of high significance in pointing the way towards future batteries with vastly improved cycle life and considerably lower cost,” said Imre Gyuk, Director of Energy Storage Research at the Office of Electricity of the DOE.

“I expect that efficient, long duration flow batteries will become standard as part of the infrastructure of the electric grid.”

The key to designing the battery was to first figure out why previous molecules were degrading so quickly in neutral solutions, said Eugene Beh, a postdoctoral fellow and first author of the paper.

By first identifying how the molecule viologen in the negative electrolyte was decomposing, Beh was able to modify its molecular structure to make it more resilient.

Next, the team turned to ferrocene, a molecule well known for its electrochemical properties, for the positive electrolyte.

“Ferrocene is great for storing charge but is completely insoluble in water,” said Beh. “It has been used in other batteries with organic solvents, which are flammable and expensive.”

But by functionalizing ferrocene molecules the same way as the viologen, the team was able to turn an insoluble molecule into a highly soluble one that could be cycled stably.

“Aqueous soluble ferrocenes represent a whole new class of molecules for flow batteries,” said Aziz.

The neutral pH should be especially helpful in lowering the cost of the ion-selective membrane that separates the two sides of the battery.

Most flow batteries today use expensive polymers that can withstand the aggressive chemistry inside the battery. They can account for up to one-third of the total cost of the device. 


With essentially salt water on both sides of the membrane, expensive polymers can be replaced by cheap hydrocarbons. 

This research was coauthored by Diana De Porcellinis, Rebecca Gracia, and Kay Xia. It was supported by the Office of Electricity Delivery and Energy Reliability of the DOE and by the DOE’s Advanced Research Projects Agency-Energy.

With assistance from Harvard’s Office of Technology Development (OTD), the researchers are working with several companies to scale up the technology for industrial applications and to optimize the interactions between the membrane and the electrolyte.

Harvard OTD has filed a portfolio of pending patents on innovations in flow battery technology.

DOE: One small change makes Quantum Dot solar cells more efficient



The quest for more efficient solar cells has led to a search for new materials. For years, scientists have explored using tiny particles of designer materials, called quantum dots.




Now, we know that adding small amounts of manganese decreases the ability of quantum dots to absorb light but increases the current produced by an average of 300%. Under certain conditions, the current produced increased by 700%.

The enhancement is due to the faster rate that the electrons move from the quantum dot to the balance of the solar cell (what the scientists call the electron tunneling rate) in the presence of the manganese atoms at the interface.

Importantly, this observation is confirmed by theory, opening up possibilities for applying this approach to other systems (Applied Physics Letters, “Giant photocurrent enhancement by transition metal doping in quantum dot sensitized solar cells”).

The power conversion efficiency of quantum dot solar cells has reached about 12%. However, the overall efficiency of quantum dot solar cells is relatively low compared to photovoltaic systems in use today that are based on silicon. In addition, quantum dot solar cells are not as efficient as emerging next-generation solar cells.

The results obtained in this work point to a surprisingly straightforward alternative route. Scientists can significantly improve the performance of this family of solar cells by adding small amounts of alternate metals.

In the quest to replace more traditional solar materials, such as silicon, with more efficient and high-performing options, scientists have been studying quantum dot solar cells as an alternative to harvest sunlight for conversion to electricity.

In this solar cell design, quantum dots are used as the material that absorbs sunlight and converts it to electricity. Quantum dots are very small, nanometer-sized particles whose solar conversion properties, in this case a characteristic gap in the energy levels of the electrons called the “bandgap,” are tunable by changing the size or chemical composition.

This is in contrast to bulk materials whose bandgap is fixed by the chemical composition or choice of material(s) alone. This size dependence of bandgap makes quantum dots attractive for multi-junction solar cells, whose efficiency is enhanced by using a variety of materials that absorb different parts of the “rainbow” of wavelengths of light found in the solar spectrum.
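For readers who want the quantitative picture behind that size dependence, a common textbook first-order estimate is the Brus effective-mass model, shown below. We include it only for context; it is not part of the DOE summary, and the bulk gap, effective masses, and dielectric constant are material-specific inputs.

```latex
% Brus-type estimate of a quantum dot's bandgap as a function of radius R
% (first-order effective-mass approximation; material parameters are inputs)
E_g^{\mathrm{QD}}(R) \approx E_g^{\mathrm{bulk}}
  + \frac{\hbar^2 \pi^2}{2R^2}\left(\frac{1}{m_e^*} + \frac{1}{m_h^*}\right)
  - \frac{1.786\, e^2}{4\pi \varepsilon \varepsilon_0 R}
```

Shrinking the radius R widens the gap through the confinement term, which is the lever that lets quantum dots be tuned to absorb different slices of the solar spectrum.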

This research team discovered that adding small amounts of the transition metal manganese (Mn), or “doping,” resulted in a huge enhancement in the efficiency rate of changing light to electricity for lead sulfide (PbS) quantum dot sensitized solar cells.

Relatively small concentrations of Mn (4 atomic percent) cause the current to increase by an average of 300% with a maximum increase of up to 700%.

Moreover, the mechanism by which this occurs cannot be explained by light absorption alone, because both the experimental and theoretical absorption spectra show a several-fold decrease in the absorption coefficient upon the addition of Mn.

The team proposes that the dramatic increase is due to a mechanism of increased electron tunneling through the atom pairs at the quantum dot interface with the next layer of the solar cell.

The team used ab initio calculations, which is a computational approach that can describe new phenomena without the need to fit or extrapolate experimental data, to confirm this mechanism.

While typical doping approaches focus on improving exciton lifetime and light absorption channels, results obtained in this study provide an alternative route for significant improvement on the efficiency of quantum dot sensitized solar cells.

Source: U.S. Department of Energy, Office of Science

NREL Wins Award for Isothermal Battery Calorimeters – Measuring Battery Heat Levels and Energy Efficiency with 98% Accuracy – Video


NREL engineer Matthew Keyser holds an A123 battery module over the calorimeter he designed and built with the help of his staff.

” …. The IBCs can determine heat levels and battery energy efficiency with 98% accuracy and provide precise measurements through complete thermal isolation.”

NREL’s R&D 100 Award-winning Isothermal Battery Calorimeters (IBCs) are the only calorimeters in the world capable of providing the precise thermal measurements needed for safer, longer-lasting, and more cost-effective electric-drive vehicle (EDV) batteries. For EDVs, including hybrids (HEVs), plug-in hybrids (PHEVs), and all-electric vehicles (EVs), to realize their ultimate market penetration, their batteries need to operate at maximum efficiency, performing at optimal temperatures in a wide range of driving conditions and climates, and through numerous charging cycles.

Cutaway showing battery in the test chamber, heat flux gauges, isothermal fluid surrounding the test chamber, and outside container with insulation holding the bath fluid and the test chamber. Image: Courtesy of NETZSCH

 

NREL’s IBCs make it possible to accurately measure the heat generated by electric-drive vehicle batteries, analyze the effects of temperature on battery systems, and pinpoint ways to manage temperatures for the best performance and maximum life. Three models, the IBC 284, the Module IBC, and the Large-Volume IBC, make it possible to test energy devices at a full range of scales.

The World’s Most Precise Battery Calorimeters

Development of precisely calibrated battery systems relies on accurate measurements of heat generated by battery modules during the full range of charge/discharge cycles, as well as determination of whether the heat was generated electrochemically or resistively. The IBCs can determine heat levels and battery energy efficiency with 98% accuracy and provide precise measurements through complete thermal isolation. These are the first calorimeters designed to analyze heat loads generated by complete battery systems.
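As a rough sketch of the bookkeeping involved, the electrical energy a cell fails to return over a full cycle should reconcile with the heat the calorimeter collects. The code below is our own illustration with hypothetical numbers; it is not NREL’s published procedure.

```python
# Minimal sketch (our construction, not NREL's measurement procedure):
# round-trip energy efficiency from electrical energy in/out, cross-checked
# against the heat an isothermal calorimeter collects over the same cycle.

def cycle_efficiency(e_charge_wh: float, e_discharge_wh: float) -> float:
    """Round-trip energy efficiency of one full charge/discharge cycle."""
    return e_discharge_wh / e_charge_wh

def expected_heat_wh(e_charge_wh: float, e_discharge_wh: float) -> float:
    """Electrical energy not returned by the cell; ideally this matches
    the heat measured by the calorimeter."""
    return e_charge_wh - e_discharge_wh

# Hypothetical numbers for illustration only.
e_in, e_out = 105.0, 98.7   # Wh charged vs. Wh discharged
q_measured = 6.1            # Wh of heat reported by the calorimeter

print(f"efficiency: {cycle_efficiency(e_in, e_out):.1%}")
print(f"heat expected: {expected_heat_wh(e_in, e_out):.1f} Wh, measured: {q_measured:.1f} Wh")
```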

This video describes NREL’s R&D 100 Award-winning Isothermal Battery Calorimeters, the only calorimeters in the world capable of providing the precise thermal measurements needed for safer, longer-lasting, and more cost-effective electric-drive vehicle batteries.

Calorimeter Specifications
Specification | IBC 284 (Cell) | Module IBC | Large-Volume IBC (Pack)
Maximum Voltage (V) | 50 | 500 | 600
Sustained Maximum Current (A) | 250 | 250 | 450
Excursion Currents (A) | 300 | 300 | 1,000
Volume (liters) | 9.4 | 14.7 | 96
Maximum Dimensions (cm) | 20.3 x 20.3 x 15.2 | 35 x 21 x 20 | 60 x 40 x 40
Operating Temperature (°C) | -30 to 60 | -30 to 60 | -40 to 100
Maximum Constant Heat Generation (W) | 50 | 150 | 4,000

Working with Industry to Fine-Tune Energy Storage Designs

The IBCs’ capabilities make it possible for battery developers to predict thermal performance before installing batteries in vehicles. Manufacturers use these metrics to compare battery performance to industry averages, troubleshoot thermal issues, and fine-tune designs.

NREL, in partnership with NETZSCH Instruments North America and with support from the U.S. Department of Energy, is using IBCs to help industry design better thermal management systems for EDV battery cells, modules, and packs. The U.S. Advanced Battery Consortium (USABC) and its partners rely on NREL for precise measurement of energy storage devices’ heat generation and efficiency under different states of charge, power profiles, and temperatures.

ORNL: Nano-Ribbons may hold Key to ‘on-off’ states for Graphene ~ Applications in Electronics


Image: zigzag graphene nanoribbon.

“Using electrons like photons could provide the basis for a new electronic device that could carry current with virtually no resistance, even at room temperature.”

 

A new way to grow narrow ribbons of graphene, a lightweight and strong material made of a single-atom-thick layer of carbon atoms linked into hexagons, may address a shortcoming that has prevented the material from achieving its full potential in electronic applications.

Graphene nanoribbons, mere billionths of a meter wide, exhibit different electronic properties than two-dimensional sheets of the material.   “Confinement changes graphene’s behavior,” said An-Ping Li, a physicist at the Department of Energy’s Oak Ridge National Laboratory. Graphene in sheets is an excellent electrical conductor, but narrowing graphene can turn the material into a semiconductor if the ribbons are made with a specific edge shape.

Previous efforts to make graphene nanoribbons employed a metal substrate that hindered the ribbons’ useful electronic properties.   Now, scientists at ORNL and North Carolina State University report in the journal Nature Communications that they are the first to grow graphene nanoribbons without a metal substrate. Instead, they injected charge carriers that promote a chemical reaction that converts a polymer precursor into a graphene nanoribbon.

At selected sites, this new technique can create interfaces between materials with different electronic properties. Such interfaces are the basis of semiconductor electronic devices from integrated circuits and transistors to light-emitting diodes and solar cells.   “Graphene is wonderful, but it has limits,” said Li.

“In wide sheets, it doesn’t have an energy gap–an energy range in a solid where no electronic states can exist. That means you cannot turn it on or off.”   When a voltage is applied to a sheet of graphene in a device, electrons flow freely as they do in metals, severely limiting graphene’s application in digital electronics.

“When graphene becomes very narrow, it creates an energy gap,” Li said. “The narrower the ribbon is, the wider is the energy gap.” In very narrow graphene nanoribbons, with a width of a nanometer or even less, how structures terminate at the edge of the ribbon is important too. For example, cutting graphene along the side of a hexagon creates an edge that resembles an armchair; this material can act like a semiconductor. Excising triangles from graphene creates a zigzag edge, and a material with metallic behavior.
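A commonly quoted rule of thumb for armchair ribbons captures Li’s point about width. It is not from the ORNL paper, and the prefactor varies with the edge family and how it is measured, but the inverse scaling is the key idea:

```latex
% Approximate inverse scaling of bandgap with ribbon width w (armchair ribbons)
E_g \approx \frac{\alpha}{w}, \qquad \alpha \sim 1\ \mathrm{eV \cdot nm}\ \text{(order of magnitude)}
```

By this estimate, a ribbon about one nanometer wide has a gap on the order of an electronvolt, comparable to conventional semiconductors, while a very wide sheet recovers gapless graphene.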

To grow graphene nanoribbons with controlled width and edge structure from polymer precursors, previous researchers had used a metal substrate to catalyze a chemical reaction. However, the metal substrate suppresses useful edge states and shrinks the desired band gap.   Li and colleagues set out to get rid of this troublesome metal substrate. At the Center for Nanophase Materials Sciences, a DOE Office of Science User Facility at ORNL, they used the tip of a scanning tunneling microscope to inject either negative charge carriers (electrons) or positive charge carriers (“holes”) to try to trigger the key chemical reaction. They discovered that only holes triggered it. They were subsequently able to make a ribbon that was only seven carbon atoms wide–less than one nanometer wide–with edges in the armchair conformation.

“We figured out the fundamental mechanism, that is, how charge injection can lower the reaction barrier to promote this chemical reaction,” Li said. Moving the tip along the polymer chain, the researchers could select where they triggered this reaction and convert one hexagon of the graphene lattice at a time.   Next, the researchers will make heterojunctions with different precursor molecules and explore functionalities. They are also eager to see how long electrons can travel in these ribbons before scattering, and will compare it with a graphene nanoribbon made another way and known to conduct electrons extremely well.

Using electrons like photons could provide the basis for a new electronic device that could carry current with virtually no resistance, even at room temperature.

“It’s a way to tailor physical properties for energy applications,” Li said. “This is an excellent example of direct writing. You can direct the transformation process at the molecular or atomic level.” Plus, the process could be scaled up and automated.  

Source and top image: Oak Ridge National Laboratory

“An Energy Miracle” ~ Making Solar Fuel to Power Our Energy Needs



*** Bill Gates: Original Post From gatesnotes.com  

The sun was out in full force the fall morning I arrived at Caltech to visit Professor Nate Lewis’s research laboratory. Temperatures in southern California had soared to 20 degrees above normal, prompting the National Weather Service to issue warnings for extreme fire danger and heat-related illnesses.

The weather was a fitting introduction to what I had come to see inside Nate’s lab—how we might be able to tap the sun’s tremendous energy to make fuels to power cars, trucks, ships, and airplanes.

Stepping into the lab cluttered with computer screens, jars of chemicals, beakers, and other equipment, Nate handed me a pair of safety goggles and offered some advice for what I was about to see. “Everything we do is simple in the end, even though there’s lots of complicated stuff,” he said.

What’s simple is the idea behind all of his team’s research: The sun is the most reliable, plentiful source of renewable energy we have. In fact, more energy from the sun hits the Earth in one hour than humans use in an entire year. If we can find cheap and efficient ways to tap just a fraction of its power, we will go a long way toward finding a clean, affordable, and reliable energy source for the future.

We are all familiar with solar panels, which convert sunlight into electricity. As solar panel costs continue to fall, it’s been encouraging to see how they are becoming a growing source of clean energy around the world. Of course, there’s one major challenge of solar power. The sun sets each night and there are cloudy days. That’s why we need to find efficient ways to store the energy from sunlight so it’s available on demand. 

Batteries are one solution. Even better would be a solar fuel. Fuels have a much higher energy density than batteries, making them far easier to use for storage and transportation. For example, one ton of gasoline stores the same amount of energy as 60 tons of batteries. That’s why, barring a major breakthrough in battery technology, it’s hard to imagine flying from Seattle to Tokyo on a plug-in airplane.
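The 60-to-1 comparison is easy to sanity-check with round numbers. The specific-energy figures below are commonly cited approximations that we are assuming; they are not given in the post.

```python
# Back-of-the-envelope check of "1 ton of gasoline ~ 60 tons of batteries".
# Both specific-energy values are rough, commonly cited figures (our assumption).

gasoline_wh_per_kg = 12_000   # ~12 kWh per kg of chemical energy in gasoline
li_ion_wh_per_kg   = 200      # ~0.2 kWh per kg for packaged lithium-ion cells

ratio = gasoline_wh_per_kg / li_ion_wh_per_kg
print(f"Battery mass per ton of gasoline-equivalent energy: ~{ratio:.0f} tons")
```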

I’ve written before about the need for an energy miracle to halt climate change and provide access to electricity to millions of the poorest families who live without it. Making solar fuel would be one of those miracles. It would solve the energy storage problem for when the sun isn’t shining. And it would provide an easy-to-use power source for our existing transportation infrastructure. We could continue to drive the cars we have now. Instead of running on fossil fuels from the ground, they would be powered by fuel made from sunlight. And because it wouldn’t contribute additional greenhouse gases to the atmosphere, it would be carbon neutral.

Imagining such a future is tantalizing. Realizing it will require a lot of hard work. No one knows if there’s a practical way to turn sunlight into fuel. Thanks to the U.S. Department of Energy, Nate and a group of other researchers around the U.S. are receiving research support to find out if it is possible.

We live in a time when new discoveries and innovations are so commonplace that it’s easy to take the cutting-edge research I saw at Caltech for granted. But most breakthroughs that improve our lives—from new health interventions to new clean energy ideas—get their start as government-sponsored research like Nate’s. If successful, that research leads to new innovations, that spawn new industries, that create new jobs, that spur economic growth. It’s impossible to overemphasize the importance of government support in this process. Without it, human progress would not come as far as it has.

Nate and his team are still at the first stage of this process. But they have reason to be optimistic about what lies ahead. After all, turning sunlight into chemical energy is what plants do every day. Through the process of photosynthesis, plants combine sunlight, water, and carbon dioxide to store solar energy in chemical bonds. At Nate’s lab, his team is working with the same ingredients. The difference is that they need to figure out how to do it even better and beat nature at its own game.

“We want to create a solar fuel inspired by what nature does, in the same way that man built aircraft inspired by birds that fly,” Nate said. “But you don’t build an airplane out of feathers. And we’re not going to build an artificial photosynthetic system out of chlorophylls and living systems, because we can do better than that.”

One of Nate’s students showed me how light can be used to split water into oxygen and hydrogen—a critical first step in the path to solar fuels. The next step would involve combining hydrogen with carbon dioxide to make fuels. Using current technologies, however, it is too costly to produce a fuel from sunlight. To make it cheaper, much more research needs to be done to understand the materials and systems that could create a dependable source of solar fuel.
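For reference, the chemistry can be written compactly. The first reaction is the water-splitting step demonstrated in the lab; the second, the Sabatier reaction that produces methane, is just one commonly discussed example of combining hydrogen with carbon dioxide and is our illustration, not a process described in the post.

```latex
% Solar-driven water splitting (the step demonstrated in Lewis's lab)
2\,\mathrm{H_2O} \xrightarrow{\;h\nu\;} 2\,\mathrm{H_2} + \mathrm{O_2}

% One illustrative downstream route: CO2 hydrogenation (Sabatier reaction)
\mathrm{CO_2} + 4\,\mathrm{H_2} \longrightarrow \mathrm{CH_4} + 2\,\mathrm{H_2O}
```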

One idea his team is working on is a kind of artificial turf made of plastic cells that could be easily rolled out to capture sunlight to make fuel. Each plastic cell would contain water, light absorbers, and a catalyst. The catalyst helps accelerate the chemical reactions so each cell can produce hydrogen or carbon-based fuels more efficiently. Unfortunately, the best catalysts are among the rarest and most expensive elements, like platinum. A key focus of Nate’s research is finding other catalysts that are not only effective and durable, but also economical.

Nate’s interest in clean energy research started during the oil crisis in the 1970s, when he waited for hours in gas lines with his dad. He says he knew then that he wanted to dedicate his life to energy research. Now, he is helping to train a new generation of scientists to help solve our world’s energy challenge. Seeing the number of young people working in Nate’s lab was inspiring. The pace of innovation for them is now much faster than ever before. “We do experiments now in a day that would once take a year or an entire Ph.D. thesis to do,” Nate said.

Still, I believe we should be doing a lot more. We need thousands of scientists following all paths that might lead us to a clean energy future. That’s why a group of investors and I recently launched Breakthrough Energy Ventures, a fund that will invest more than $1 billion in scientific discoveries that have the potential to deliver cheap and reliable clean energy to the world.

While we won’t be filling up our cars with solar fuels next week or next year, Nate’s team has already made valuable contributions to our understanding of how we might achieve this bold goal. With increased government and private sector support, we will make it possible for them to move ahead with their research at full speed.

This originally appeared on gatesnotes.com.

MIT: New ‘SolarSkin’ solar panels get a face-lift with custom display capabilities



Startup aims for wider U.S. solar adoption with photovoltaic panels that can display any image.

Founded at the MIT Sloan School of Management, Sistine Solar creates custom solar panels designed to mimic home facades and other environments, as well as display custom designs, with aims of enticing more homeowners to install photovoltaic systems. Courtesy of Sistine Solar

Residential solar power is on a sharp rise in the United States as photovoltaic systems become cheaper and more powerful for homeowners. A 2012 study by the U.S. Department of Energy (DOE) predicts that solar could reach 1 million to 3.8 million homes by 2020, a big leap from just 30,000 homes in 2006.

But that adoption rate could still use a boost, according to MIT spinout Sistine Solar. “If you look at the landscape today, less than 1 percent of U.S. households have gone solar, so it’s nowhere near mass adoption,” says co-founder Senthil Balasubramanian MBA ’13.

Founded at the MIT Sloan School of Management, Sistine creates custom solar panels designed to mimic home facades and other environments, with aims of enticing more homeowners to install photovoltaic systems.

Sistine’s novel technology, SolarSkin, is a layer that can be imprinted with any image and embedded into a solar panel without interfering with the panel’s efficacy. Homeowners can match their rooftop or a grassy lawn. Panels can also be fitted with business logos, advertisements, or even a country’s flag. SolarSkin systems cost about 10 percent more than traditional panel installations. But over the life of the system, a homeowner can still expect to save more than $30,000, according to the startup.

A winner of a 2013 MIT Clean Energy Prize, Sistine has recently garnered significant media attention as a rising “aesthetic solar” startup. Last summer, one of its pilot projects was featured on the Lifetime television series “Designing Spaces,” where the panels blended in with the shingle roof of a log cabin in Hubbardston, Massachusetts.

In December, the startup installed its first residential SolarSkin panels, in a 10-kilowatt system that matches a cedar pattern on a house in Norwell, Massachusetts. Now, the Cambridge-based startup says it has 200 homes seeking installations, primarily in Massachusetts and California, where solar is in high demand.

“We think SolarSkin is going to catch on like wildfire,” Balasubramanian says. “There is a tremendous desire by homeowners to cut utility bills, and solar is finding reception with them — and homeowners care a lot about aesthetics.”


 

 

 


 

Captivating people with solar – Who Said Solar Can’t Be Beautiful?

SolarSkin is the product of the co-founders’ unique vision, combined with MIT talent that helped make the product a reality.

Balasubramanian came to MIT Sloan in 2011, after several years in the solar-power industry, with hopes of starting his own solar-power startup — a passion shared by classmate and Sistine co-founder Ido Salama MBA ’13.

One day, the two were brainstorming at the Muddy Charles Pub, when a surprisingly overlooked issue popped up: Homeowners, they heard, don’t really like the look of solar panels. That began a nebulous business mission to “captivate people’s imaginations and connect people on an emotional level with solar,” Balasubramanian says.

Recruiting Jonathan Mailoa, then a PhD student in MIT’s Photovoltaic Research Laboratory, and Samantha Holmes, a mosaic artist trained in Italy who is still with the startup, the four designed solar panels that could be embedded on massive sculptures and other 3-D objects. They took the idea to 15.366 (Energy Ventures), where “it was drilled into our heads that you have to do a lot of market testing before you build a product,” Balasubramanian says.

That was a good thing, too, he adds, because they realized their product wasn’t scalable. “We didn’t want to make a few installations that people talk about. … We [wanted to] make solar so prevalent that within our lifetime we can see the entire world convert to 100 percent clean energy,” Balasubramanian says.

The team’s focus then shifted to manufacturing solar panels that could match building facades or street fixtures such as bus shelters and information kiosks. In 2013, the idea earned the team — then officially Sistine Solar — a modest DOE grant and a $20,000 prize from the MIT Clean Energy Prize competition, “which was a game-changer for us,” Balasubramanian says.

But, while trying to construct custom-designed panels, another idea struck: Why not just make a layer to embed into existing solar panels? Recruiting MIT mechanical engineering student Jody Fu, Sistine created the first SolarSkin prototype in 2015, leading to pilot projects for Microsoft, Starwood Hotels, and other companies in the region.

That summer, after earning another DOE grant for $1 million, Sistine recruited Anthony Occidentale, an MIT mechanical engineering student who has since helped further advance SolarSkin. “We benefited from the incredible talent at MIT,” Balasubramanian says. “Anthony is a shining example of someone who resonates with our vision and has all the tools to make this a reality.”

Imagination is the limit

SolarSkin is a layer that employs selective light filtration to display an image while still transmitting light to the underlying solar cells. The ad wraps displayed on bus windows offer a good analogy: The wraps reflect some light to display an image, while allowing the remaining light through so passengers inside the bus can see out. SolarSkin achieves a similar effect — “but the innovation lies in using a minute amount of light to reflect an image [and preserve] a high-efficiency solar module,” Balasubramanian says.
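A toy model makes the trade-off concrete. This is our sketch only; Sistine’s color-correction process is proprietary, and the reflected fractions and panel rating below are hypothetical.

```python
# Toy model of selective light filtration (our sketch, hypothetical numbers):
# if a fraction r of incoming light is reflected to render the image,
# roughly (1 - r) reaches the cells, so output scales by about (1 - r).

panel_rated_w = 300            # hypothetical rated output of a bare panel, in watts
for r in (0.05, 0.10, 0.20):   # assumed fractions of light reflected for the image
    print(f"reflect {r:.0%} of light -> ~{panel_rated_w * (1 - r):.0f} W output")
```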

To achieve this, Occidentale and others at Sistine have developed undisclosed innovations in color science and human visual perception. “We’ve come up with a process where we color-correct the minimal information we have of the image on the panels to make that image appear, to the human eye, to be similar to the surrounding backdrop of roof shingles,” Occidentale says.

As for designs, Sistine has amassed a database of common rooftop patterns in the United States, such as asphalt shingles, clay tiles, and slate, in a wide variety of colors. “So if a homeowner says, for instance, ‘We have manufactured shingles in a barkwood pattern,’ we have a matching design for that,” he says. Custom designs aren’t as popular, but test projects include commercial prints for major companies, and even Occidentale’s face on a panel.

Currently, Sistine is testing SolarSkin for efficiency, durability, and longevity at the U.S. National Renewable Energy Laboratory under a DOE grant.

The field of aesthetic solar is still nascent, but it’s growing, with major companies such as Tesla designing entire solar-panel roofs. But, as far as Balasubramanian knows, Sistine is the only company that’s made a layer that can be integrated into any solar panel, and that can display any color as well as intricate patterns and actual images.

Companies could thus use SolarSkin solar panels to double as business signs. Municipalities could install light-powering solar panels on highways that blend in with the surrounding nature. Panels with changeable advertisements could be placed on bus shelters to charge cell phones, information kiosks, and other devices. “You can start putting solar in places you typically didn’t think of before,” Balasubramanian says. “Imagination is really the only limit with this technology.”

Argonne National Laboratory-led projects among $39.8 million in first-round “Exascale” Computing Project awards, including Exascale-Enabled Precision Medicine for Cancer


The U.S. Department of Energy’s (DOE’s) Exascale Computing Project (ECP) today announced its first round of funding with the selection of 15 application development proposals for full funding and seven proposals for seed funding, representing teams from 45 research and academic organizations.

Exascale refers to high-performance computing systems capable of at least a billion billion calculations per second, or a factor of 50 to 100 times faster than the nation’s most powerful supercomputers in use today.
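In concrete numbers (standard prefix definitions, plus our inference about the baseline the 50-to-100x comparison implies):

```python
# Standard prefix arithmetic; no specific machine's benchmark is implied.
peta = 10**15   # one petaflop/s = 10^15 floating-point operations per second
exa  = 10**18   # one exaflop/s  = 10^18 operations per second

assert exa == 1_000_000_000 * 1_000_000_000   # "a billion billion"

# A "50 to 100 times faster" ratio implies today's (circa 2016) leading systems
# deliver roughly 10-20 petaflop/s sustained (our inference from the stated ratio).
for pflops in (10, 20):
    print(f"vs. a {pflops}-petaflop/s system, exascale is ~{exa / (pflops * peta):.0f}x faster")
```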

The 15 awards being announced total $39.8 million, targeting advanced modeling and simulation solutions to specific challenges supporting key DOE missions in science, clean energy and national security, as well as collaborations such as the Precision Medicine Initiative with the National Institutes of Health’s National Cancer Institute.

Of the proposals announced that are receiving full funding, two are being led by principal investigators at the DOE’s Argonne National Laboratory:

  1. Computing the Sky at Extreme Scales equips cosmologists with the ability to design foundational simulations to create “virtual universes” on demand at the extreme fidelities demanded by future multi-wavelength sky surveys. The new discoveries that will emerge from the combination of sky surveys and advanced simulation provided by the ECP will shed more light on three key ingredients of our universe: dark energy, dark matter and inflation. All three of these concepts reach beyond the known boundaries of the Standard Model of particle physics. Salman Habib, Principal Investigator, Argonne National Laboratory, with Los Alamos National Laboratory and Lawrence Berkeley National Laboratory.
  2. Exascale Deep Learning and Simulation Enabled Precision Medicine for Cancer focuses on building a scalable deep neural network code called the CANcer Distributed Learning Environment (CANDLE) that addresses three top challenges of the National Cancer Institute: understanding the molecular basis of key protein interactions, developing predictive models for drug response and automating the analysis and extraction of information from millions of cancer patient records to determine optimal cancer treatment strategies. Rick Stevens, Principal Investigator, Argonne National Laboratory, with Los Alamos National Laboratory, Lawrence Livermore National Laboratory, Oak Ridge National Laboratory and the National Cancer Institute.

Additionally, a third project led by Argonne will be receiving seed funding:

  1. Multiscale Coupled Urban Systems will create an integrated modeling framework comprising data curation, analytics, modeling and simulation components that will equip city designers, planners and managers to scientifically develop and evaluate solutions to issues that affect cities now and in the future. The framework will focus first on integrating urban atmosphere and infrastructure heat exchange and air flow; building energy demand at district or city scale, generation and use; urban dynamics and socioeconomic models; population mobility and transportation; and hooks to expand to include energy systems (biofuels, electricity and natural gas) and water resources. Charlie Catlett, Principal Investigator, Argonne National Laboratory, with Lawrence Berkeley National Laboratory, National Renewable Energy Laboratory, Oak Ridge National Laboratory and Pacific Northwest National Laboratory.

The application efforts will help guide DOE’s development of a U.S. exascale ecosystem as part of President Obama’s National Strategic Computing Initiative. DOE, the U.S. Department of Defense and the National Science Foundation have been designated as lead agencies, and ECP is the primary DOE contribution to the initiative.

The ECP’s multiyear mission is to maximize the benefits of high-performance computing for U.S. economic competitiveness, national security and scientific discovery. In addition to applications, the DOE project addresses hardware, software, platforms and workforce development needs critical to the effective development and deployment of future exascale systems.

Leadership of the ECP comes from six DOE national laboratories: the Office of Science’s Oak Ridge, Argonne and Lawrence Berkeley national labs and the National Nuclear Security Administration’s (NNSA’s) Lawrence Livermore, Los Alamos and Sandia national labs.

The Exascale Computing Project is a collaborative effort of two DOE organizations — the Office of Science and the NNSA. As part of President Obama’s National Strategic Computing initiative, ECP was established to develop a capable exascale ecosystem, encompassing applications, system software, hardware technologies and architectures, and workforce development to meet the scientific and national security mission needs of DOE in the mid-2020s timeframe.

Established by Congress in 2000, NNSA is a semi-autonomous agency within DOE responsible for enhancing national security through the military application of nuclear science. NNSA maintains and enhances the safety, security, and effectiveness of the U.S. nuclear weapons stockpile without nuclear explosive testing; works to reduce the global danger from weapons of mass destruction; provides the U.S. Navy with safe and effective nuclear propulsion; and responds to nuclear and radiological emergencies in the United States and abroad.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the Office of Science website.

DOE to invest $16M in computational design of new materials for alternative and renewable energy, electronics and other fields



17 August 2016

The US Department of Energy will invest $16 million over the next four years to accelerate the design of new materials through use of supercomputers.

Two four-year projects—one team led by DOE’s Lawrence Berkeley National Laboratory (Berkeley Lab), the other team led by DOE’s Oak Ridge National Laboratory (ORNL)—will leverage the labs’ expertise in materials and take advantage of lab supercomputers to develop software for designing fundamentally new functional materials destined to revolutionize applications in alternative and renewable energy, electronics, and a wide range of other fields. The research teams include experts from universities and other national labs.

The new grants—part of DOE’s Computational Materials Sciences (CMS) program launched in 2015 as part of the US Materials Genome Initiative—reflect the enormous recent growth in computing power and the increasing capability of high-performance computers to model and simulate the behavior of matter at the atomic and molecular scales.

The teams are expected to develop sophisticated and user-friendly open-source software that captures the essential physics of relevant systems and can be used by the broader research community and by industry to accelerate the design of new functional materials.

The Berkeley Lab team will be led by Steven G. Louie, an internationally recognized expert in materials science and condensed matter physics. A longtime user of NERSC supercomputers, Louie has a dual appointment as Senior Faculty Scientist in the Materials Sciences Division at Berkeley Lab and Professor of Physics at the University of California, Berkeley. Other team members are Jack Deslippe, Jeffrey B. Neaton, Eran Rabani, Feng Wang, Lin-Wang Wang and Chao Yang, Lawrence Berkeley National Laboratory; and partners Daniel Neuhauser, University of California at Los Angeles, and James R. Chelikowsky, University of Texas, Austin.

This investment in the study of excited-state phenomena in energy materials will, in addition to pushing the frontiers of science, have wide-ranging applications in areas such as electronics, photovoltaics, light-emitting diodes, information storage and energy storage. We expect this work to spur major advances in how we produce cleaner energy, how we store it for use, and to improve the efficiency of devices that use energy.

—Steven Louie

ORNL researchers will partner with scientists from national labs and universities to develop software to accurately predict the properties of quantum materials with novel magnetism, optical properties and exotic quantum phases that make them well-suited to energy applications, said Paul Kent of ORNL, director of the Center for Predictive Simulation of Functional Materials, which includes partners from Argonne, Lawrence Livermore, Oak Ridge and Sandia National Laboratories and North Carolina State University and the University of California–Berkeley.

Our simulations will rely on current petascale and future exascale capabilities at DOE supercomputing centers. To validate the predictions about material behavior, we’ll conduct experiments and use the facilities of the Advanced Photon Source, Spallation Neutron Source and the Nanoscale Science Research Centers.

—Paul Kent

At Argonne, our expertise in combining state-of-the-art, oxide molecular beam epitaxy growth of new materials with characterization at the Advanced Photon Source and the Center for Nanoscale Materials will enable us to offer new and precise insight into the complex properties important to materials design. We are excited to bring our particular capabilities in materials, as well as expertise in software, to the center so that the labs can comprehensively tackle this challenge.

—Olle Heinonen, Argonne materials scientist

Researchers are expected to make use of the 30-petaflop/s Cori supercomputer now being installed at the National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab, the 27-petaflop/s Titan computer at the Oak Ridge Leadership Computing Facility (OLCF) and the 10-petaflop/s Mira computer at the Argonne Leadership Computing Facility (ALCF). OLCF, ALCF, and NERSC are all DOE Office of Science User Facilities. One petaflop/s is 10^15, or a million times a billion, floating-point operations per second.

In addition, a new generation of machines is scheduled for deployment between 2016 and 2019 that will take peak performance as high as 200 petaflops. Ultimately the software produced by these projects is expected to evolve to run on exascale machines, capable of 1000 petaflops and projected for deployment in the mid-2020s.

Research will combine theory and software development with experimental validation, drawing on the resources of multiple DOE Office of Science User Facilities, including the Molecular Foundry and Advanced Light Source at Berkeley Lab, the Center for Nanoscale Materials and the Advanced Photon Source at Argonne National Laboratory (ANL), and the Center for Nanophase Materials Sciences and the Spallation Neutron Source at ORNL, as well as the other Nanoscience Research Centers across the DOE national laboratory complex.

The new research projects will begin in Fiscal Year 2016. Subsequent annual funding will be contingent on available appropriations and project performance.

The two new projects expand the ongoing CMS research effort, which began in FY 2015 with three initial projects, led respectively by ANL, Brookhaven National Laboratory and the University of Southern California.

DOE – Brookhaven: Smarter self-assembly opens new pathways for nanotechnology – Essential to Fully Exploit the Nanoscale for ‘Next Generation’ Electronic Devices


To continue advancing, next-generation electronic devices must fully exploit the nanoscale, where materials span just billionths of a meter. But balancing complexity, precision, and manufacturing scalability on such fantastically small scales is inevitably difficult. Fortunately, some nanomaterials can be coaxed into snapping themselves into desired formations, a process called self-assembly.

Scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory have just developed a way to direct the self-assembly of multiple molecular patterns within a single material, producing new nanoscale architectures. The results were published in the journal Nature Communications.

“This is a significant conceptual leap in self-assembly,” said Brookhaven Lab physicist Aaron Stein, lead author on the study. “In the past, we were limited to a single emergent pattern, but this technique breaks that barrier with relative ease. This is significant for basic research, certainly, but it could also change the way we design and manufacture electronics.”

Microchips, for example, use meticulously patterned templates to produce the nanoscale structures that process and store information. Through self-assembly, however, these structures can spontaneously form without that exhaustive preliminary patterning. And now, self-assembly can generate multiple distinct patterns, greatly increasing the complexity of nanostructures that can be formed in a single step.

“This technique fits quite easily into existing microchip fabrication workflows,” said study coauthor Kevin Yager, also a Brookhaven physicist. “It’s exciting to make a fundamental discovery that could one day find its way into our computers.”

The experimental work was conducted entirely at Brookhaven Lab’s Center for Functional Nanomaterials (CFN), a DOE Office of Science User Facility, leveraging in-house expertise and instrumentation.

Electron beam lithography is used to adjust the spacing and thickness of line patterns etched onto a template (lower layer). These patterns drive a self-assembling block copolymer (top layer) to locally form different types of patterns.

 

Cooking up organized complexity

The collaboration used block copolymers, chains of two distinct molecules linked together, because of their intrinsic ability to self-assemble.

“As powerful as self-assembly is, we suspected that guiding the process would enhance it to create truly ‘responsive’ self-assembly,” said study coauthor Greg Doerk of Brookhaven. “That’s exactly where we pushed it.”

To guide self-assembly, scientists create precise but simple substrate templates. Using a method called electron beam lithography, Stein’s specialty, they etch patterns thousands of times thinner than a human hair on the template surface. They then add a solution containing a set of block copolymers onto the template, spin the substrate to create a thin coating, and “bake” it all in an oven to kick the molecules into formation. Thermal energy drives interaction between the block copolymers and the template, setting the final configuration, in this instance parallel lines or dots in a grid.

“In conventional self-assembly, the final nanostructures follow the template’s guiding lines, but are of a single pattern type,” Stein said. “But that all just changed.”

Brookhaven National Laboratory Center for Functional Nanomaterials researchers Gwen Wright and Aaron Stein are at the electron beam lithography writer in the CFN cleanroom. Credit: Brookhaven National Laboratory

Lines and dots, living together

The collaboration had previously discovered that mixing together different block copolymers allowed multiple, co-existing line and dot nanostructures to form.

“We had discovered an exciting phenomenon, but couldn’t select which morphology would emerge,” Yager said. But then the team found that tweaking the substrate changed the structures that emerged. By simply adjusting the spacing and thickness of the lithographic line patterns, which are easy to fabricate using modern tools, the self-assembling blocks can be locally converted into ultra-thin lines, or high-density arrays of nano-dots.

“We realized that combining our self-assembling materials with nanofabricated guides gave us that elusive control. And, of course, these new geometries are achieved on an incredibly small scale,” said Yager.

“In essence,” said Stein, “we’ve created ‘smart’ templates for nanomaterial self-assembly. How far we can push the technique remains to be seen, but it opens some very promising pathways.”

Gwen Wright, another CFN coauthor, added, “Many nano-fabrication labs should be able to do this tomorrow with their in-house tools-the trick was discovering it was even possible.”

The scientists plan to increase the sophistication of the process, using more complex materials in order to move toward more device-like architectures.

“The ongoing and open collaboration within the CFN made this possible,” said Charles Black, director of the CFN. “We had experts in self-assembly, electron beam lithography, and even electron microscopy to characterize the materials, all under one roof, all pushing the limits of nanoscience.”


More information: A. Stein et al, Selective directed self-assembly of coexisting morphologies using block copolymer blends, Nature Communications (2016). DOI: 10.1038/ncomms12366