MIT Energy Conference: Making energy storage so simple it’s “boring”

At the Friday evening “Energy Showcase,” participants tried out a virtual reality demonstration put on by Shell, one of the conference sponsors. Courtesy of the MIT Energy Conference

Inexpensive, reliable, and forgettable storage systems could enable boom in renewables, say speakers at MIT Energy Conference.

One of the key technologies needed to transform world energy supplies away from fossil fuels and toward clean, renewable sources is a cheap and reliable way of storing and releasing energy. That will enable intermittent supplies such as solar and wind power, with their variable and often unpredictable outputs, to store energy that’s produced when it’s not needed and to release it when it’s needed most (or can be sold for the best price).

That was a message that came up repeatedly over the two days of talks and panel discussions at the 12th annual MIT Energy Conference, held on March 3 and 4. But while better, cheaper storage is essential, implementing it faces technical, economic, and policy challenges, several speakers noted.

Massachusetts, home to a number of leading startup ventures in the energy storage area, has “a huge opportunity to be a leader” in this burgeoning industry, said Judith Judson, the commissioner of the Massachusetts Department of Energy Resources, in one of the conference’s panel discussions. The state is already home to a number of companies working on innovative battery technologies, several of which are based on research from MIT labs.

But getting such technologies out into the world is more than just a matter of building a better mousetrap. For one thing, “the role of the utilities to push on this technology is so important,” said Belen Linares, the global research and development director for Spain-based energy company Acciona. “It’s collaborations between schools and industry that are going to give these technologies the real boost they need.”

Ravi Manghani, energy storage director for GTM Research, a solar-market analysis firm, who moderated that panel, concluded that what researchers really need to do now is “work on making energy storage less complicated and more boring.” It needs to be the kind of simple technology that can be purchased and then forgotten, he suggested, somewhat like a home water heater.

One promising new entry in that sector is Ambri, a company developing grid-scale batteries based on all-liquid active components, a technology developed at MIT in the lab of Donald Sadoway, the John F. Elliott Professor in Materials Science and Engineering. Dana Guernsey, Ambri’s director of corporate development, said that “one of the exciting reasons to be in this industry” is the potential to open up electrification “in places where there is no grid at all. Storage systems will electrify parts of the world that have been dark,” by enabling the construction of small, localized “microgrids” that can serve villages, schools, or small businesses.

Consumers can help to bring about that transformation, a number of speakers said. A panel discussion on “the engaged ratepayer” made it clear that utilities are becoming ever more responsive to the needs and wants of their customers, as the business has become more competitive. “The rate of change now in the industry is extraordinary,” said Terry Sobolewski, the chief customer officer of the utility company National Grid. “And it’s almost entirely driven by customers.”

In fact, the companies’ customer-centric orientation is now so strong that most of them have officially retired the once-universal term “ratepayer,” Sobolewski said, because it implies a one-way relationship, while the companies now want to stress their responsiveness and engagement.

Part of that responsiveness includes working with business and industrial customers. “More and more industries are asking for help,” he said, “on ‘how do I integrate renewables into my business?’” The change in attitude, he said, suggests that “we are on the precipice of this transformation.”

Accelerating the transformation to non-fossil-fuel energy sources will also require improved analytical tools for gathering data about the performance of buildings and industrial plants using different combinations of energy and efficiency systems, several speakers said.

Southern California Edison is making inroads in this area, said Andre Ramirez, a principal advisor to the company. Its 5 million customers, he said, are all now on a “time-of-use” rate system, by default. That kind of rate system allows customers to shift energy-intensive activities, such as running a clothes dryer or charging an electric car, to off-peak periods when the rates may be much lower. Such shifting, if done on a large scale, can greatly reduce the utility’s need to build expensive peak-power generators to handle those loads. “We’re at the leading edge of that change,” Ramirez said.
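As a rough sketch of how time-of-use pricing rewards load shifting, the following compares a flat rate with a peak/off-peak tariff. All rates and the 300 kWh usage figure are hypothetical illustrations, not Southern California Edison’s actual tariff.

```python
# Sketch: flat rate vs. a time-of-use (TOU) tariff for a shiftable load
# such as an EV charger. All numbers are hypothetical, chosen only to
# illustrate the incentive to shift load off-peak.

FLAT_RATE = 0.20     # $/kWh, hypothetical flat tariff
TOU_PEAK = 0.35      # $/kWh during peak hours, hypothetical
TOU_OFF_PEAK = 0.12  # $/kWh off-peak, hypothetical

def monthly_cost(kwh_peak, kwh_off_peak):
    """Return (flat-rate bill, TOU bill) for the given usage split."""
    flat = (kwh_peak + kwh_off_peak) * FLAT_RATE
    tou = kwh_peak * TOU_PEAK + kwh_off_peak * TOU_OFF_PEAK
    return flat, tou

# 300 kWh of monthly charging: first all at peak, then shifted off-peak.
flat_bill, tou_at_peak = monthly_cost(300, 0)
_, tou_shifted = monthly_cost(0, 300)
print(f"flat rate:             ${flat_bill:.2f}")
print(f"TOU, charging at peak: ${tou_at_peak:.2f}")
print(f"TOU, shifted off-peak: ${tou_shifted:.2f}")
```

Shifting the same consumption off-peak cuts the TOU bill well below the flat-rate bill, which is the customer-side incentive behind the rate design Ramirez describes.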

The kind of detailed information that can be gathered over time about residential use can lead to specific, targeted suggestions for energy improvements for a given home or business. But information can also influence other kinds of decisions. For example, “millennials want to work for companies that have a conscience,” including what their energy strategies are and what their values are, said Tim Healy, CEO and chairman of the energy data analysis and management company EnerNOC. Transparency about a company’s business practices, including the details of its actual energy production sources, can be an important part of that, he said.

As for the energy systems themselves, innovations in the regulatory system that establishes rates and governs the interactions among power producers, distribution networks, and end-users are a major need at this point. “Regulatory affairs is where the innovation needs to occur,” Healy said.

MIT’s Energy Conference is organized annually under the auspices of the MIT Energy Club, which with its 5,000 members, including students, faculty, staff, and alumni, is “the largest energy club anywhere,” said Robert Armstrong, director of the MIT Energy Initiative, in his opening remarks at this year’s event.

PNNL ~ One-pot technique creates structures with more efficient energy storage and manufacturing

Highly ordered sodium silicate particles (bottom right) with a regular array of spherical pores (bottom left) form on a silicon surface. The one-step synthesis is directed by the atomic ordering of the substrate.

To create more efficient catalysts, sensing and separation membranes, and energy storage devices, scientists often start with particles containing tiny pore channels. Defects between the particles can hamper performance. At Pacific Northwest National Laboratory, a team created a one-pot method that produces complex, well-structured microscopic pyramids. This approach offers control over three-dimensional material growth similar to that seen in nature, a vital benchmark for materials synthesis.

“It’s relatively easy to grow thin layers of material,” said Dr. Maria Sushko, a PNNL materials scientist who worked on the study. “Now, we can grow supported three-dimensional crystals that have a larger ordered structure on the inside as well—a crystal within a crystal.”

Energy storage materials that are more efficient could change the way we use renewable energy. More efficient catalysts, sensors, and separators that last longer and work harder could reduce the energy demands and waste of manufacturing plants and refineries. These technologies require innovative materials, and the team’s technique offers a new way to create them: scientists can now grow well-defined three-dimensional structures on a surface in a single step. Growing a material directly on the surface eliminates steps in testing new ideas for electrodes or catalysts.

In the simplest terms, the team’s approach takes advantage of a relationship among the atomic ordering of the silicon substrate, the structure of the organic template, and the atomic structure of sodium silicate. When organic molecules and a sodium silicate precursor are combined in the right proportions and the solution is heated in the presence of the silicon surface, the substrate directs the template’s self-assembly along a specific crystallographic direction. The template, in turn, directs the formation of sodium silicate along the same crystallographic direction as the substrate, ensuring near-perfect lattice matching between silicon and sodium silicate.

After a series of transformations, the organic template forms an array of well-defined spherical micelles several nanometers in diameter. The micelles are arranged in a cubic lattice and encapsulated into sodium silicate. The result is an array of oriented ordered porous pyramids with a well-defined cubic lattice of pores, confirmed by electron microscopes at the U.S. Department of Energy’s (DOE’s) EMSL, a scientific user facility.

In nature, proteins direct the growth of complex structures, such as shells, bone and tooth enamel. The team’s novel approach provides precise control over materials architecture similar to that seen in nature. The scientists can vary the structure and size of the particles. Their system makes different structures, with different sizes and compositions, as needed. This level of control in the laboratory is a significant benchmark for materials synthesis.

The team’s technique is an important addition to the methods for synthesizing supported three-dimensional structures. The team is exploring ways to extend the technique to other materials.


More information: Yongsoon Shin et al. Double Epitaxy as a Paradigm for Templated Growth of Highly Ordered Three-Dimensional Mesophase Crystals, ACS Nano (2016). DOI: 10.1021/acsnano.6b03999


University of Colorado: Engineers transform brewery wastewater into energy storage

Equipment at a brewery. Credit: FTGallo / Wikipedia.


University of Colorado Boulder engineers have developed an innovative bio-manufacturing process that uses a biological organism cultivated in brewery wastewater to create the carbon-based materials needed to make energy storage cells.

This unique pairing of breweries and batteries could set up a win-win opportunity by reducing expensive wastewater treatment costs for beer makers while providing manufacturers with a more cost-effective means of creating renewable, naturally-derived fuel cell technologies.

“Breweries use about seven barrels of water for every barrel of beer produced,” said Tyler Huggins, a graduate student in CU Boulder’s Department of Civil, Environmental and Architectural Engineering and lead author of the new study. “And they can’t just dump it into the sewer because it requires extra filtration.”

The process of converting biological materials, or biomass, such as timber into carbon-based battery electrodes is currently used in some energy industry sectors. But naturally occurring biomass is inherently limited by its short supply, the impact of its extraction, and its intrinsic chemical makeup, rendering it expensive and difficult to optimize.

However, the CU Boulder researchers utilize the unsurpassed efficiency of biological systems to produce sophisticated structures and unique chemistries by cultivating a fast-growing fungus, Neurospora crassa, in the sugar-rich wastewater produced by a similarly fast-growing Colorado industry: breweries.

“The wastewater is ideal for our fungus to flourish in, so we are happy to take it,” said Huggins.

By cultivating their feedstock in wastewater, the researchers were able to better dictate the fungus’s chemical and physical processes from the start. They thereby created one of the most efficient naturally-derived lithium-ion battery electrodes known to date while cleaning the wastewater in the process.

The findings were published recently in the American Chemical Society journal Applied Materials & Interfaces.

If the process were applied on a large scale, breweries could potentially reduce their municipal wastewater costs significantly while manufacturers would gain access to a cost-effective incubating medium for advanced battery technology components.

“The novelty of our process is changing the manufacturing process from top-down to bottom-up,” said Zhiyong Jason Ren, an associate professor in CU Boulder’s Department of Civil, Environmental and Architectural Engineering and a co-author of the new study. “We’re biodesigning the materials right from the start.”

Huggins and study co-author Justin Whiteley, also of CU Boulder, have filed a patent on the process and created Emergy, a Boulder-based company aimed at commercializing the technology.

“We see large potential for scaling because there’s nothing required in this process that isn’t already available,” said Huggins.

The researchers have partnered with Avery Brewing in Boulder in order to explore a larger pilot program for the technology. Huggins and Whiteley recently competed in the finals of a U.S. Department of Energy-sponsored startup incubator competition at the Argonne National Laboratory in Chicago, Illinois.

“This research speaks to the spirit of entrepreneurship at CU Boulder,” said Ren, who plans to continue experimenting with the mechanisms and properties of the fungus growth within the wastewater. “It’s great to see students succeeding and creating what has the potential to be a transformative technology. Energy storage represents a big opportunity for the state of Colorado and beyond.”


More information: Tyler M. Huggins et al. Controlled Growth of Nanostructured Biotemplates with Cobalt and Nitrogen Codoping as a Binderless Lithium-Ion Battery Anode, ACS Applied Materials & Interfaces (2016). DOI: 10.1021/acsami.6b09300


Creating the Future of Batteries



We need better ways to store and use energy; that’s no secret. Cell phones need charging every day, electric cars have a range of only about a hundred miles, and our ability to use solar and wind energy to feed the power grid is still very limited. These are things we’ve taken for granted, but if you look historically at the rate at which our technology improves (just think about cell phones and computers over the last 20 years), it’s easy to see that this area of technological development has severely lagged.

While there are a number of political, philosophical and theoretical explanations for why energy storage development has fallen behind, experts agree that if the problem is going to be fixed in our lifetime, it needs to start now.

Energy storage is a limiting factor that researchers have been aware of for quite a while, but their work to improve storage devices has taken many disparate directions. In a recent edition of Nature Communications, Drexel materials science and engineering researchers Yury Gogotsi, PhD, and Maria Lukatskaya, PhD, who have been surveying the landscape of energy storage research for years, offer a unified route for bringing our energy storage and distribution capabilities level with our energy production and consumption.

You May Also Want To Read: Nanoporous Material Combines the Best of Batteries and Supercapacitors for ESS (Energy Storage Systems)


Read about the work of Dr. Jim Tour at Rice University: “Changing the Equation” for how we think about Batteries, Supercapacitors and Energy Storage.



Lukatskaya and Gogotsi unpacked the problem for the News Blog and offered up three ways in which energy storage research and development need to change right now to get things moving in the right direction:

So, the directions in which we want our energy storage devices — such as batteries — to go are pretty intuitive: we want them to store more energy per unit of volume (or mass), so that they provide longer autonomy times for portable electronics without making them bulkier. We also want to enable fast charging, so that five minutes of charging would provide a full day of device operation. And last, but not least, we want to increase the lifespan of batteries — meaning the number of charge/discharge cycles they can undergo without performance degradation.

To achieve that, we need to rethink conventional electrode architectures and materials that are currently used in energy storage devices, such as batteries and supercapacitors.

  1. Clean up all the wasted space

For example, in state-of-the-art batteries, too much volume is occupied by cell components that do not store charge. It is estimated that in smaller devices more than 80 percent of the volume is taken up by inert cell components: current collectors, separators and casings. New design concepts that minimize the use of current collectors would therefore lead to a substantial improvement in the energy that can be stored per unit of mass or volume of the device.
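Taking the 80 percent figure above at face value, a little arithmetic shows why reclaiming inert volume pays off so steeply. The “improved” inert fraction below is a hypothetical target, not a reported design:

```python
# If 80% of a small cell's volume is inert (collectors, separators,
# casing), only 20% actually stores charge. Halving the inert share
# (a hypothetical target) triples the active volume.
def active_fraction(inert_fraction):
    return 1.0 - inert_fraction

today = active_fraction(0.80)     # 20% of volume stores charge
improved = active_fraction(0.40)  # hypothetical: inert volume halved
print(f"active volume today:    {today:.0%}")
print(f"active volume improved: {improved:.0%}")
print(f"volumetric energy gain: {improved / today:.1f}x")
```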

  2. Come up with a better recipe

Secondly, new electrolyte and electrode chemistries should be explored. Currently, oxide materials dominate the “insides” of batteries. Oxides have many advantages: they are among the most studied materials, and they have provided a reliable energy storage solution for quite a while. But to address the growing need for high-energy batteries, other electrode materials should be explored: materials with high electrical conductivity that can enable multielectron redox reactions (storing more charge per atom than lithium).

  3. Get electrons and ions on the expressway

In order to make energy storage devices fast, it is again necessary to reconsider electrode architectures to ensure rapid accessibility of ions and electrons toward active sites. Basically, we need to create such architectures where, instead of a “maze,” ions can move on “highways” providing fast charging.
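As back-of-envelope arithmetic on the five-minute-charge goal mentioned above, assuming a phone battery of roughly 12 Wh (an illustrative figure, not one from the article):

```python
# The "five minutes of charging for a full day of use" target, in watts.
# Assumes a phone battery of about 12 Wh; real capacities vary.
battery_wh = 12.0
charge_time_h = 5 / 60  # five minutes, in hours
power_w = battery_wh / charge_time_h
print(f"average charging power required: {power_w:.0f} W")
```

The answer, roughly 144 W sustained, is far beyond ordinary phone chargers, which is one reason the authors point at electrode and ion-pathway architecture rather than charger hardware.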


Gogotsi is Distinguished University and Trustee Chair Professor in the College of Engineering and director of the A.J. Drexel Nanomaterials Institute. Lukatskaya was a doctoral candidate in the Department of Materials Science and Engineering when she worked with Gogotsi on this research; she is now a postdoctoral research fellow at Stanford University.

Their Nature Communications paper is “Multidimensional materials and device architectures for future hybrid energy storage.”


DOE to invest $16M in computational design of new materials for alt and renewable energy, electronics and other fields


17 August 2016

The US Department of Energy will invest $16 million over the next four years to accelerate the design of new materials through use of supercomputers.

Two four-year projects—one team led by DOE’s Lawrence Berkeley National Laboratory (Berkeley Lab), the other led by DOE’s Oak Ridge National Laboratory (ORNL)—will leverage the labs’ expertise in materials and take advantage of lab supercomputers to develop software for designing fundamentally new functional materials destined to revolutionize applications in alternative and renewable energy, electronics, and a wide range of other fields. The research teams include experts from universities and other national labs.

The new grants—part of DOE’s Computational Materials Sciences (CMS) program launched in 2015 as part of the US Materials Genome Initiative—reflect the enormous recent growth in computing power and the increasing capability of high-performance computers to model and simulate the behavior of matter at the atomic and molecular scales.

The teams are expected to develop sophisticated and user-friendly open-source software that captures the essential physics of relevant systems and can be used by the broader research community and by industry to accelerate the design of new functional materials.

The Berkeley Lab team will be led by Steven G. Louie, an internationally recognized expert in materials science and condensed matter physics. A longtime user of NERSC supercomputers, Louie has a dual appointment as Senior Faculty Scientist in the Materials Sciences Division at Berkeley Lab and Professor of Physics at the University of California, Berkeley. Other team members are Jack Deslippe, Jeffrey B. Neaton, Eran Rabani, Feng Wang, Lin-Wang Wang and Chao Yang, Lawrence Berkeley National Laboratory; and partners Daniel Neuhauser, University of California at Los Angeles, and James R. Chelikowsky, University of Texas, Austin.

This investment in the study of excited-state phenomena in energy materials will, in addition to pushing the frontiers of science, have wide-ranging applications in areas such as electronics, photovoltaics, light-emitting diodes, information storage and energy storage. We expect this work to spur major advances in how we produce cleaner energy, how we store it for use, and to improve the efficiency of devices that use energy.

—Steven Louie

ORNL researchers will partner with scientists from national labs and universities to develop software to accurately predict the properties of quantum materials with novel magnetism, optical properties and exotic quantum phases that make them well-suited to energy applications, said Paul Kent of ORNL, director of the Center for Predictive Simulation of Functional Materials, which includes partners from Argonne, Lawrence Livermore, Oak Ridge and Sandia National Laboratories, North Carolina State University and the University of California–Berkeley.

Our simulations will rely on current petascale and future exascale capabilities at DOE supercomputing centers. To validate the predictions about material behavior, we’ll conduct experiments and use the facilities of the Advanced Photon Source, Spallation Neutron Source and the Nanoscale Science Research Centers.

—Paul Kent

At Argonne, our expertise in combining state-of-the-art, oxide molecular beam epitaxy growth of new materials with characterization at the Advanced Photon Source and the Center for Nanoscale Materials will enable us to offer new and precise insight into the complex properties important to materials design. We are excited to bring our particular capabilities in materials, as well as expertise in software, to the center so that the labs can comprehensively tackle this challenge.

—Olle Heinonen, Argonne materials scientist

Researchers are expected to make use of the 30-petaflop/s Cori supercomputer now being installed at the National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab, the 27-petaflop/s Titan computer at the Oak Ridge Leadership Computing Facility (OLCF) and the 10-petaflop/s Mira computer at the Argonne Leadership Computing Facility (ALCF). OLCF, ALCF, and NERSC are all DOE Office of Science User Facilities. One petaflop/s is 10^15 (a million times a billion) floating-point operations per second.
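For scale, the quoted machine ratings convert to raw operation counts as follows; this is a trivial unit conversion, nothing more:

```python
# Unit conversion only: petaflop/s ratings of the machines named above.
PFLOPS = 1e15  # one petaflop/s = 10**15 floating-point operations per second
machines = {"Cori (NERSC)": 30, "Titan (OLCF)": 27, "Mira (ALCF)": 10}
for name, rating in machines.items():
    print(f"{name}: {rating} petaflop/s = {rating * PFLOPS:.1e} FLOP/s")
```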

In addition, a new generation of machines is scheduled for deployment between 2016 and 2019 that will take peak performance as high as 200 petaflops. Ultimately the software produced by these projects is expected to evolve to run on exascale machines, capable of 1000 petaflops and projected for deployment in the mid-2020s.

Research will combine theory and software development with experimental validation, drawing on the resources of multiple DOE Office of Science User Facilities, including the Molecular Foundry and Advanced Light Source at Berkeley Lab, the Center for Nanoscale Materials and the Advanced Photon Source at Argonne National Laboratory (ANL), and the Center for Nanophase Materials Sciences and the Spallation Neutron Source at ORNL, as well as the other Nanoscience Research Centers across the DOE national laboratory complex.

The new research projects will begin in Fiscal Year 2016. Subsequent annual funding will be contingent on available appropriations and project performance.

The two new projects expand the ongoing CMS research effort, which began in FY 2015 with three initial projects, led respectively by ANL, Brookhaven National Laboratory and the University of Southern California.

ORNL: Speedy ion conduction in solid electrolytes clears road for advanced energy devices

An ORNL-led research team found the key to fast ion conduction in a solid electrolyte. Tiny features maximize ion transport pathways, represented by red and green.
Credit: Oak Ridge National Laboratory, U.S. Dept. of Energy

In a rechargeable battery, the electrolyte transports lithium ions from the negative to the positive electrode during discharging. The path of ionic flow reverses during recharging. The organic liquid electrolytes in commercial lithium-ion batteries are flammable and subject to leakage, making their large-scale application potentially problematic. Solid electrolytes, in contrast, overcome these challenges, but their ionic conductivity is typically low.

Now, a team led by the Department of Energy’s Oak Ridge National Laboratory has used state-of-the-art microscopy to identify a previously undetected feature, about 5 billionths of a meter (nanometers) wide, in a solid electrolyte. The work experimentally verifies the importance of that feature to fast ion transport, and corroborates the observations with theory. The new mechanism the researchers report in Advanced Energy Materials points out a new strategy for the design of highly conductive solid electrolytes.

“The solid electrolyte is one of the most important factors in enabling safe, high-power, high-energy, solid-state batteries,” said first author Cheng Ma of ORNL, who conducted most of the study’s experiments. “But currently the low conductivity has limited its applications.”

ORNL’s Miaofang Chi, the senior author, said, “Our work is basic science focused on how we can facilitate ion transport in solids. It is important to the design of fast ion conductors, not only for batteries, but also for other energy devices.” These include supercapacitors and fuel cells.

To directly observe the atomic arrangement in the solid electrolyte, the researchers used aberration-corrected scanning transmission electron microscopy to send electrons through a sample. To observe an extremely small feature in a three-dimensional (3D) material with a method that essentially provides a two-dimensional (2D) projection, they needed a sample of extraordinary thinness. To prepare one, they relied on comprehensive materials processing and characterization capabilities of the Center for Nanophase Materials Sciences, a DOE Office of Science User Facility at ORNL.

“Usually the transmission electron microscopy specimen is 20 nanometers thick, but Ma developed a method to make the specimen ultra-thin (approximately 5 nanometers),” Chi said. “That was the key because such a thickness is comparable to the size of the hidden feature we finally resolved.”

The researchers examined a prototype system called LLTO, shorthand for its lithium, lanthanum, titanium and oxygen building blocks. LLTO possesses the highest bulk conductivity among oxide systems.

In this material, lithium ions move fastest in the planar 2D pathways resulting from alternating stacks of atomic layers rich in either lanthanum or lithium. The ORNL-led team was the first to see that, without hurting this superior 2D transport, tiny domains, or fine features approximately 5 to 10 nanometers wide, throughout the 3D material provided more directions in which lithium ions could move. The domains looked like sets of shelves stacked at right angles to others. The smaller the shelves, the easier it was for ions to flow in the direction of an applied current.

ORNL’s Yongqiang Cheng and Bobby Sumpter performed molecular dynamics simulations that corroborated the experimental findings.

Previously, scientists looked at the atomic structure of the simplest repeating unit of a crystal, called a unit cell, and rearranged its atoms or introduced different elements to see how they could facilitate ion transport. Unit cells are typically less than 1 nanometer wide; in the material the ORNL scientists studied for this paper, the unit cell is nearly half a nanometer. The team’s unexpected finding provides a new perspective: fine features of only a few nanometers, traversing a few unit cells, can maximize the number of ionic transport pathways.

“The finding adds a new criterion,” Chi said. “This largely overlooked length scale could be the key to fast ionic conduction.”

Researchers will need to consider phenomena on the order of several nanometers when designing materials for fast ion conduction.

Ma agreed. “The prototype material has high ionic conductivity because not only does it maintain unit-cell structure, but also it adds this fine feature, which underpins 3D pathways,” Ma said. “We’re not saying that we shouldn’t be looking at the unit-cell scale. We’re saying that in addition to the unit cell scale, we should also be looking at the scale of several unit cells. Sometimes that outweighs the importance of one unit cell.”

For several decades, when researchers had no explanation for certain material behaviors, they speculated that phenomena transcending one unit cell could be at play, but they never saw evidence. “This is the first time we proved it experimentally,” Ma said. “This is a direct observation, so it is the most solid evidence.”

Story Source:

The above post is reprinted from materials provided by DOE/Oak Ridge National Laboratory. Note: Materials may be edited for content and length.

Journal Reference:

  1. Cheng Ma, Yongqiang Cheng, Kai Chen, Juchuan Li, Bobby G. Sumpter, Ce-Wen Nan, Karren L. More, Nancy J. Dudney, Miaofang Chi. Mesoscopic Framework Enables Facile Ionic Transport in Solid Electrolytes for Li Batteries. Advanced Energy Materials, 2016; DOI: 10.1002/aenm.201600053

Rice University’s Laser-Induced Graphene makes Simple, Powerful Energy Storage Possible: Video

Published on Dec 3, 2015

Rice University researchers who pioneered the development of laser-induced graphene have configured their discovery into flexible, solid-state microsupercapacitors that rival the best available for energy storage and delivery.

The devices developed in the lab of Rice chemist James Tour are geared toward electronics and apparel. They are the subject of a new paper in the journal Advanced Materials.

Microsupercapacitors are not batteries, but inch closer to them as the technology improves. Traditional capacitors store energy and release it quickly (as in a camera flash), unlike common lithium-ion batteries that take a long time to charge and release their energy as needed.


Carbon doped with nitrogen dramatically improves storage capacity of supercapacitors



A team of researchers working in China has found a way to dramatically improve the energy storage capacity of supercapacitors—by doping carbon tubes with nitrogen. In their paper published in the journal Science, the team describes their process and how well the newly developed supercapacitors worked, and their goal of one day helping supercapacitors compete with batteries.

Like a battery, a capacitor is able to hold a charge; unlike a battery, however, it can be charged and discharged very quickly. The downside to capacitors is that they cannot hold nearly as much charge per kilogram as batteries. The work by the team in China is a step toward increasing the amount of charge that can be held by supercapacitors (capacitors with much higher capacitance than standard capacitors, generally built with carbon-based electrodes). In this case, they report a threefold increase using the new method, and note that their supercapacitor was capable of storing 41 watt-hours per kilogram and could deliver 26 kilowatts per kilogram to a device.
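Those two figures together imply how long the device could sustain its peak power. A quick sanity check (t = specific energy / specific power), using only the numbers quoted above:

```python
# Sanity check on the quoted figures: at its peak specific power, how long
# could the supercapacitor sustain a full discharge?  t = energy / power.
specific_energy_wh_per_kg = 41.0
specific_power_kw_per_kg = 26.0
t_seconds = specific_energy_wh_per_kg / (specific_power_kw_per_kg * 1000) * 3600
print(f"full discharge at peak power: {t_seconds:.1f} s")
```

A few seconds of full-power delivery is very much capacitor-like behavior, consistent with the charge-a-phone-in-seconds claim later in the article.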

The new supercapacitor was made by first forming a template of silica tubes. The team covered the inside of the tubes with carbon and then etched away the silica, leaving just the carbon tubes, each approximately 4 to 6 nanometers in length. The carbon tubes were then doped with nitrogen atoms. Electrodes were made from the resulting material by pressing it, in powder form, into a graphene foam. The researchers report that the doping aided chemical reactions within the supercapacitor without causing any changes to its electrical conductivity, which meant it was still able to charge and discharge as quickly as conventional supercapacitors. The only difference was the dramatically increased storage capacity.

Because of the huge increase in storage capacity, the team believes they are on the path to building a supercapacitor able to compete directly with batteries. They note that would mean being able to charge a phone in mere seconds. But before that can happen, the team is looking to industrialize their process to allow for its use in actual devices.


More information: T. Lin et al. Nitrogen-doped mesoporous carbon of extraordinary capacitance for electrochemical energy storage, Science (2015). DOI: 10.1126/science.aab3798

Carbon-based supercapacitors can provide high electrical power, but they do not have sufficient energy density to directly compete with batteries. We found that a nitrogen-doped ordered mesoporous few-layer carbon has a capacitance of 855 farads per gram in aqueous electrolytes and can be bipolarly charged or discharged at a fast, carbon-like speed. The improvement mostly stems from robust redox reactions at nitrogen-associated defects that transform inert graphene-like layered carbon into an electrochemically active substance without affecting its electric conductivity. These bipolar aqueous-electrolyte electrochemical cells offer power densities and lifetimes similar to those of carbon-based supercapacitors and can store a specific energy of 41 watt-hours per kilogram (19.5 watt-hours per liter).