The latest rechargeable battery technology could drastically improve the capabilities of mobile phones and electric vehicles.
It seems that nearly every household electronic item these days requires a lithium-ion rechargeable battery, from a vacuum cleaner to a pair of headphones.
This results in many of us having a multitude of different devices hooked up to various chargers at any given time, which isn’t exactly ideal.
Now, however, a team of scientists from the University of Michigan is heralding a major breakthrough that could drastically increase the power of rechargeable batteries, with the added bonus that they won’t catch fire.
Existing rechargeable batteries are based on lithium-ion technology, which enables quick charging but has the major drawback that exposure to open air can cause the battery to catch fire or explode. Lithium-ion batteries also require regular charging and can degrade quickly when overcharged.
But in a paper soon to be published in the Journal of Power Sources, the research team describes how, by using a ceramic solid-state electrolyte, it was able to harness the power of lithium-metal batteries without the traditional downsides of lithium-ion.
The approach could double the output of batteries, meaning a phone could run for days or weeks without charging, or an electric vehicle (EV) could rival fossil fuel-powered cars in range.
Jeff Sakamoto, leader of the research team, said: “This could be a game-changer, a paradigm shift in how a battery operates.”
In the 1980s, lithium-metal batteries were seen as the future, but their tendency to combust during charging led researchers to switch to lithium-ion.
10 times the charging speed
These batteries replaced lithium metal with graphite anodes, which absorb the lithium and prevent tree-like filaments called dendrites from forming, but also come with performance costs.
For example, graphite has a maximum capacity of 350 milliampere hours per gram (mAh/g), whereas lithium metal in a solid-state battery has a specific capacity of 3,800 mAh/g.
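Those two figures imply roughly an order-of-magnitude gap in charge stored per gram of anode material. A quick back-of-the-envelope check, using only the numbers quoted above:

```python
# Specific capacities quoted above (milliampere hours per gram)
graphite_mah_per_g = 350    # graphite anode in a conventional lithium-ion cell
li_metal_mah_per_g = 3800   # lithium metal in a solid-state battery

# Ratio of charge stored per gram of anode material
advantage = li_metal_mah_per_g / graphite_mah_per_g
print(f"Lithium metal holds about {advantage:.1f}x more charge per gram")
```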
To get around the combustion problem in lithium-metal batteries, the team created a ceramic layer that stabilises the surface, keeping dendrites from forming and preventing fires.
With some tweaking, chemical and mechanical treatments of the ceramic provided a pristine surface for lithium to plate evenly.
Whereas once it would take a lithium-metal EV up to 50 hours to charge, the team said it could now do it in three hours or less.
“We’re talking a factor of 10 increase in charging speed compared to previous reports for solid-state lithium-metal batteries,” Sakamoto said.
“We’re now on par with lithium-ion cells in terms of charging rates, but with additional benefits.”
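As a sanity check on those charging times, the quoted drop from roughly 50 hours to 3 hours works out to better than a factor of 16 for this particular comparison (a sketch using only the figures in the article; Sakamoto’s “factor of 10” refers to charging rates reported for earlier solid-state lithium-metal cells):

```python
# EV charging times quoted in the article (hours)
previous_hours = 50   # earlier lithium-metal EV charge time
new_hours = 3         # charge time reported by the Michigan team

speedup = previous_hours / new_hours
print(f"About a {speedup:.0f}x reduction in charging time")
```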
Net Power’s pilot natural gas plant with carbon capture, near Houston, Texas.
An Interview with Julio Friedmann
At current rates of greenhouse-gas emissions, the world could lock in 1.5 ˚C of warming as soon as 2021, an analysis by the website Carbon Brief has found. We’re on track to blow the carbon budget for 2 ˚C by 2036.
Amid this daunting climate math, many researchers argue that capturing carbon dioxide from power plants, factories, and the air will have to play a big part in any realistic efforts to limit the dangers of global warming.
If it can be done economically, carbon capture and storage (CCS) offers the world additional flexibility and time to make the leap to cleaner systems. It means we can retrofit, rather than replace, vast parts of the global energy infrastructure. And once we reach disastrous levels of warming, so-called direct air capture offers one of the only ways to dig our way out of trouble, since carbon dioxide otherwise stays in the atmosphere for thousands of years.
Julio Friedmann has emerged as one of the most ardent advocates of these technologies. He oversaw research and development efforts on clean coal and carbon capture at the US Department of Energy’s Office of Fossil Energy under the last administration. Among other roles, he’s now working with or advising the Global CCS Institute, the Energy Futures Initiative, and Climeworks, a Switzerland-based company already building pilot plants that pull carbon dioxide from the air.
In an interview with MIT Technology Review, Friedmann argues that the technology is approaching a tipping point: a growing number of projects demonstrate that it works in the real world, and that it is becoming more reliable and affordable. He adds that the boosted US tax credit for capturing and storing carbon, passed in the form of the Future Act as part of the federal budget earlier this year, will push forward many more projects and help create new markets for products derived from carbon dioxide (see “The carbon-capture era may finally be starting”).
But serious challenges remain. Even with the tax credit, companies will incur steep costs by adding carbon capture systems to existing power plants. And a widely cited 2011 study, coauthored by MIT researcher Howard Herzog, found that direct air capture will require vast amounts of energy and cost 10 times as much as scrubbing carbon from power plants.
(This interview has been edited for length and clarity.)
In late February, you wrote a Medium post saying that with the passage of the increased tax credit for carbon capture and storage, we’ve “launched the climate counter-strike.” Why is that a big deal?
As I’ve said many times before, the lack of progress in deploying CCS up until this point is not a question of cost. It’s really been a question of finance.
The Future Act creates that financing.
I identified an additional provision which said not only can you consider a power plant a source or an industrial site a source, you can consider the air a source.
Even if we zeroed out all our emissions today, we still have a legacy of harm of two trillion tons of CO2 in the air, and we need to do something about that.
And this law says, yeah, we should. It says we can take carbon dioxide out of the air and turn it into stuff.
At the Petra Nova plant in Texas, my understanding is the carbon capture costs are something like $60 to $70 a ton, which is still going to outstrip the tax credit today. How are we going to close that gap?
There are many different ways to go about it. For example, the state of New Jersey today passed a 90 percent clean energy portfolio standard. Changing the policy from a renewable portfolio standard [which would exclude CCS technologies] to a clean energy standard [which would allow them] allowed higher ambition.
In that context, somebody who would build a CCS project and would get a contract to deliver that power, or deliver that emissions abatement, can actually again get staked, get financed, and get built. That can happen without any technology advancement.
The technology today is already cost competitive. CCS today, as a retrofit, is cheaper than a whole bunch of stuff. It’s cheaper than new-build nuclear, it’s cheaper than offshore wind. It’s cheaper than a whole bunch of things we like, and it’s cheaper than rooftop solar, almost everywhere. It’s cheaper than utility-scale concentrating solar pretty much everywhere, and it is cheaper than what solar and wind were 10 years ago.
What do you make of the critique that this is all just going to perpetuate the fossil-fuel industry?
The enemy is not fossil fuels; the enemy is emissions.
In a place like California that has terrific renewable resources and a good infrastructure for renewable energy, maybe you can get to zero [fossil fuels] someday.
If you’re in Saskatchewan, you really can’t do that. It is too cold for too much of the year, and they don’t have solar resources, and their wind resources are problematic because they’re so strong they tear up the turbines. Which is why they did the CCS project in Saskatchewan. For them it was the right solution.
Shifting gears to direct air capture, the basic math says that you’re moving 2,500 molecules to capture one of CO2. How good are we getting at this, and how cheaply can we do this at this point?
If you want to optimize the way that you would reduce carbon dioxide economy-wide, direct air capture is the last thing you would tackle. Turns out, though, that we don’t live in that society. We are not optimizing anything in any way.
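The 2,500-to-1 figure in the question follows directly from CO2’s share of the atmosphere. A minimal sketch, assuming the commonly cited concentration of roughly 400 parts per million:

```python
# CO2 makes up roughly 400 of every million air molecules (~400 ppm)
co2_ppm = 400
co2_fraction = co2_ppm / 1_000_000

# Air molecules that must be processed, on average, to encounter one CO2
molecules_per_co2 = 1 / co2_fraction
print(f"About {molecules_per_co2:.0f} molecules moved per CO2 captured")
```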
The cost for all of these things now today, all-in costs, is somewhere between $300 and $600 a ton. I’ve looked inside all those companies and I believe all of them are on a glide path to get to below $200 a ton by somewhere between 2022 and 2025. And I believe that they’re going to get down to $100 a ton by 2030. At that point, these are real options.
At $200 a ton, we know today unambiguously that pulling CO2 out of the air is cheaper than trying to make a zero-carbon airplane, by a lot. So it becomes an option that you use to go after carbon in the hard-to-scrub parts of the economy.
Is it ever going to work as a business, or is it always going to be kind of a public-supported enterprise to buy ourselves out of climate catastrophes?
Direct air capture is not competitive today broadly, but there are places where the value proposition is real. So let me give you a couple of examples.
In many parts of the world there are no sources of CO2. If you’re running a Pepsi or a Coca-Cola plant in Sri Lanka, you literally burn diesel fuel and capture the CO2 from it to put into your cola, at a bonkers price. It can cost $300 to $800 a ton to get that CO2. So there are already going to be places in some people’s supply chain where direct air capture could be cheaper.
We talk to companies like Goodyear, Firestone, or Michelin. They make tires, and right now the way that they get their carbon black [a material used in tire production that’s derived from fossil fuel] is basically you pyrolize bunker fuel in the Gulf Coast, which is a horrible, environmentally destructive process. And then you ship it by rail cars to wherever they’re making the tires.
If they can decouple from that market by gathering CO2 wherever they are and turn that into carbon black, they can actually avoid market shocks. So even if it costs a little more, the value to that company might be high enough to bring it into the market. That’s where I see direct air actually gaining real traction in the next few years.
It’s not going to be enough for climate. We know that we will have to do carbon storage, for sure, if we want to really manage the atmospheric emissions. But there’s a lot of ground to chase this, and we never know quite where technology goes.
In one of your earlier Medium posts you said that we’re ultimately going to have to pull 10 billion tons of CO2 out of the atmosphere every year. Climeworks is doing about 50 [at their pilot plant in Iceland]. So what does that scale-up look like?
You don’t have to get all 10 billion tons with direct air capture. So let’s say you just want one billion.
Right now, Royal Dutch Shell as a company moves 300 million tons of refined product every year. This means that you need three to four companies the size of Royal Dutch Shell to pull that billion tons of CO2 out of the atmosphere.
The good news is we don’t need that billion tons today. We have 10 or 20 or 30 years to get to a billion tons of direct air capture. But in fact we’ve seen that kind of scaling in other kinds of clean-tech markets. There’s nothing in the laws of physics or chemistry that stops that.
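Friedmann’s “three to four companies” figure checks out against the numbers he gives. A quick sketch:

```python
# The scale-up arithmetic from the interview
target_tons_per_year = 1_000_000_000   # one billion tons of CO2 per year
shell_tons_per_year = 300_000_000      # refined product Shell moves annually

companies_needed = target_tons_per_year / shell_tons_per_year
print(f"Roughly {companies_needed:.1f} Shell-sized operations needed")
```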
A composite thin film made of two different inorganic oxide materials significantly improves the performance of solar cells, as recently demonstrated by a joint team of researchers led by Professor Federico Rosei at the Institut national de la recherche scientifique (INRS), and Dr. Riad Nechache from École de technologie supérieure (ÉTS), both in the Montreal Area (Canada).
Following an original device concept by Mr. Joyprokash Chakrabartty, the researchers have developed this new composite thin-film material, which combines two different crystal phases comprising the atomic elements bismuth, manganese, and oxygen.
The combination of phases with two different compositions optimizes this material’s ability to absorb solar radiation and transform it into electricity. The results are highly promising for the development of future solar technologies, and also potentially useful in other optoelectronic devices.
The key discovery is the observation that the composite thin film—barely 110 nanometres thick—absorbs a broader portion of the solar spectrum than thin films made of either of the two individual materials. The interfaces between the two different phases within the composite film play a crucial role in converting more sunlight into electricity. This is a surprising, novel phenomenon in the field of inorganic perovskite oxide-based solar cells.
The composite material leads to a power conversion efficiency of up to 4.2%, which is a record value for this class of materials.
A breakthrough at Uppsala University has solved the practical implementation issues of graphene, the world’s strongest material.
Until now, a major challenge has been agglomeration during upscaling, which has effectively prevented graphene’s remarkable properties from being exploited in real-life applications.
The novel hybrid ionic graphene material named Aros Graphene® solves this and is expected to revolutionize the way we design electronics, energy storage and mechanical systems. The highly anticipated revolution of graphene just came one step closer.
Graphene is a two-dimensional carbon material only one atom thick, and it is the strongest and thinnest material ever known. It is also extremely conductive of heat and electricity, as well as ultra-light and transparent. It was first isolated in 2004, work that was rewarded with the Nobel Prize in 2010.
Graphene is predicted to revolutionize the energy sector and electronics and we could even build lightweight aircraft of graphene composites in the future.
But until now there has been one major challenge with graphene. More than 10 years after its first isolation, we can still use it only in very limited applications: graphene’s properties degrade dramatically under upscaling.
Researchers all over the world have been struggling with this challenge and recently a breakthrough was made at the Ångström Laboratory, Uppsala University in Sweden.
“A major challenge of working with graphene was the agglomeration under upscaling. We had fantastic properties at the nano-scale and less encouraging properties at macro-scale. The challenges have driven me to intensively think about solutions to bring such a wonder-material to industrial products while keeping its amazing properties,” says Dr. Mamoun Taher.
Dr. Mamoun Taher is a material scientist of Syrian origin, who came to Sweden in 2010 for his masters and PhD studies. Since 2015 he has been doing research at the Ångström Laboratory at Uppsala University and has also been working on graphene related projects with ABB, one of the largest engineering companies in the world.
Aros Graphene® is a hybrid ionic graphene material that is easy and eco-friendly to manufacture and can be applied as an additive into a matrix, a coating or even by 3D printing.
With Aros Graphene® we can finally realize the full potential of graphene and we have already shown that in preliminary tests with potential customers. The first commercial applications will be available in 2019.
“The most remarkable discovery was, however, not that we had produced a new material but the striking properties we found this novel material to possess.
It turns out that Aros Graphene® has the electrical and thermal properties of graphene not only in two dimensions but in 3D, and furthermore the surface has extremely low friction and high wear resistance.
This novel material is expected to pave the way for new sustainable products in a number of industrial applications,” says Björn Lindh, entrepreneur, previously in Disruptive Materials with the famous material Upsalite®, and now co-founder of Graphmatech, which will commercialize Aros Graphene®.
Graphmatech has been accepted both to the EU-sponsored incubator program InnoEnergy and ABB’s Innovation and Growth hub SynerLeap and got initial funding from Vinnova. The next step is to prove Aros Graphene® in different customer applications.
Additional information, pictures, and data about Aros Graphene® and Graphmatech can be found at www.graphmatech.com.
In September 2015, world leaders gathered at a historic UN summit to adopt the Sustainable Development Goals (SDGs). These are 17 ambitious targets and indicators that help guide and coordinate governments and international organizations to alleviate global problems. For example, SDG 3 is to “ensure healthy lives and promote well-being for all at all ages.” Others include access to clean water, reducing the effects of climate change, and affordable healthcare.
If you think these goals might be difficult to meet, you’re right. Reports show progress is lacking in many of the 17 categories, implying they may not be met by the target date of 2030. However, paired with progress in social and political arenas, advances in science and technology could be a key accelerant to progress too.
Just one example? Graphene, a futuristic material with a growing set of potential applications.
Graphene is composed of tightly knit carbon atoms arranged into a sheet only one atom thick. This makes it the thinnest substance ever made, yet it is 200 times stronger than steel, flexible, stretchable, self-healing, transparent, more conductive than copper, and even superconductive. A square meter of graphene weighing a mere 0.0077 grams can support four kilograms. It is a truly remarkable material—but this isn’t news to science and tech geeks.
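The strength-to-weight claim is even more striking when made explicit. A quick calculation from the two figures just quoted (a sketch, taking “supports four kilograms” at face value):

```python
# Figures quoted above for a one-square-meter graphene sheet
sheet_mass_g = 0.0077   # mass of the sheet in grams
load_kg = 4             # load it can reportedly support

# How many times its own weight the sheet can hold up
ratio = (load_kg * 1000) / sheet_mass_g   # convert kg to g for a like-for-like ratio
print(f"The sheet supports about {ratio:,.0f} times its own weight")
```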
Headlines touting graphene as the next wonder material have been a regular occurrence in the last decade, and the trip from promise to practicality has felt a bit lengthy.
But that’s not unexpected; it can take time for new materials to go mainstream. Meanwhile, the years researching graphene have yielded a long list of reasons to keep at it.
Since first isolated in 2004 at the University of Manchester—work that led to a Nobel Prize in 2010— researchers all over the world have been developing radical ways to use and, importantly, make graphene. Indeed, one of the primary factors holding back widespread adoption has been how to produce graphene at scale on the cheap, limiting it to the lab and a handful of commercial applications. Fortunately, there have been advances toward mass production.
Last year, for example, a team from Kansas State University used explosions to synthesize large quantities of graphene. Their method is simple: Fill a chamber with acetylene or ethylene gas and oxygen. Use a vehicle spark plug to create a contained detonation. Collect the graphene that forms afterward. Acetylene and ethylene are composed of carbon and hydrogen, and when the hydrogen is consumed in the explosion, the carbon is free to bond with itself, forming graphene. This method is efficient because all it takes is a single spark.
Whether this technique will usher in the graphene revolution, as some have claimed, remains to be seen. What’s more certain is there will be no shortage of problems solved when said revolution arrives. Here’s a look at the ways today’s research suggests graphene may help the UN meet its ambitious development goals.
SDG 6 is to “ensure availability and sustainable management of water and sanitation for all.” As of now, the UN estimates that “water scarcity affects more than 40 percent of the global population and is projected to rise.”
Graphene-based filters could very well be the solution. Jiro Abraham from the University of Manchester helped develop scalable graphene oxide sieves to filter seawater. He claims, “The developed membranes are not only useful for desalination, but the atomic scale tunability of the pore size also opens new opportunity to fabricate membranes with on-demand filtration capable of filtering out ions according to their sizes.”
Furthermore, researchers from Monash University and the University of Kentucky have developed graphene filters that can filter out anything larger than one nanometer. They say their filters “could be used to filter chemicals, viruses, or bacteria from a range of liquids. It could be used to purify water, dairy products or wine, or in the production of pharmaceuticals.”
SDG 13 focuses on taking “urgent action to combat climate change and its impacts.”
Of course, one of the main culprits behind climate change is the excessive amount of carbon dioxide being emitted into the atmosphere. Graphene membranes have been developed that can capture these emissions.
Researchers at the University of South Carolina and Hanyang University in South Korea independently developed graphene-based filters that can be used to separate unwanted gases from industrial, commercial, and residential emissions. Henry Foley at the University of Missouri has claimed these discoveries are “something of a holy grail.”
With these, the world might be able to stem the rise of CO2 in the atmosphere, especially now that we have crossed the important 400 parts per million threshold.
Many around the world do not have access to adequate healthcare, but graphene may have an impact here as well.
First of all, graphene’s high mechanical strength makes it a perfect material for replacing body parts like bones, and because of its conductivity it can replace body parts that carry electrical current, like organs and nerves. In fact, researchers at Michigan Technological University are working on using 3D printers to print graphene-based nerves, and the team is developing biocompatible materials using graphene to conduct electricity.
Graphene can also be used to make biomedical sensors for detecting diseases, viruses, and other toxins. Because graphene is only one atom thick, every atom is exposed, so sensors can be far more sensitive. Graphene oxide sensors, for example, could detect toxins at concentrations 10 times lower than today’s sensors can. These sensors could be placed on or under the skin and provide doctors and researchers with vast amounts of information.
SDG 9 is to “build resilient infrastructure, promote inclusive and sustainable industrialization and foster innovation.” Graphene-enhanced composites and other building materials could bring us closer to meeting this goal.
Recent research shows that the more graphene is added, the better the composite becomes. This means graphene can be added to building materials like concrete and aluminum to make them stronger and lighter.
SDG 7 is to “ensure access to affordable, reliable, sustainable and modern energy for all.” Because of its light weight, conductivity, and tensile strength, graphene may make sustainable energy cheaper and more efficient.
For example, graphene composites can be used to create more versatile solar panels. Researchers at MIT say, “The ability to use graphene…is making possible truly flexible, low-cost, transparent solar cells that can turn virtually any surface into a source of electric power.”
We’ll also be able to build bigger and lighter wind turbines thanks to graphene composites.
Further, graphene is already being used to enhance traditional lithium-ion batteries, which are the batteries commonly found in consumer electronics. Research is also being done into graphene aerogels for energy storage and supercapacitors. All of these will be essential for large-scale storage of renewable energy.
Over the next decade, graphene is likely to find more and more uses out in the real world, not only helping the UN and member states meet the SDGs, but enhancing everything from touch screens to MRI machines, and from transistors to as-yet-unknown applications as a superconductor.
New research is being published and new patents being filed regularly, so keep an eye out for this amazing material.
The ability to quickly generate ultra-small, well-ordered nanopatterns over large areas on material surfaces is critical to the fabrication of next-generation technologies in many industries, from electronics and computing to energy and medicine. For example, patterned media, in which data are stored in periodic arrays of magnetic pillars or bars, could significantly improve the storage density of hard disk drives.
Scientists can coax thin films of self-assembling materials called block copolymers—chains of chemically distinct macromolecules (polymer “blocks”) linked together—into desired nanoscale patterns by heating (annealing) them on a substrate. However, defective structures that deviate from the regular pattern emerge early during self-assembly.
Materials scientist Gregory Doerk preparing a sample for electron microscopy at Brookhaven Lab’s Center for Functional Nanomaterials. The scanning electron microscope image on the computer screen shows a cross-sectional view of line patterns.
The presence of these defects inhibits the use of block copolymers in the nanopatterning of technologies that require a nearly perfect ordering—such as magnetic media, computer chips, antireflective surfaces, and medical diagnostic devices. With continued annealing, the block copolymer patterns can reconfigure to remove the imperfections, but this process is exceedingly slow. The polymer blocks do not readily mix with each other, so they must overcome an extremely large energy barrier to reconfigure.
Adding small things with a big impact
Now, scientists from the Center for Functional Nanomaterials (CFN)—a U.S. Department of Energy (DOE) Office of Science User Facility at Brookhaven National Laboratory—have come up with a way to massively speed up the ordering process. They blended a line-forming block copolymer with significantly smaller polymer chains made of only one type of molecule (homopolymers) from each of the two constituent blocks. The electron microscopy images they took after annealing the films for only a few minutes show that the addition of these two smaller homopolymers dramatically increases the size of well-ordered line-pattern areas, or “grains.”
“Without the homopolymers, the same block copolymer cannot produce grains with these sizes,” said CFN materials scientist Gregory Doerk, who led the work, which was published online in an ACS Nano paper on December 1. “Blending in homopolymers that are less than one-tenth of the size of the block copolymer greatly accelerates the ordering process. In the resulting line patterns, there is a constant spacing between each of the lines, and the same directions of line-pattern orientations—for example, vertical or horizontal—persist over longer distances.”
Doerk and coauthor Kevin Yager, leader of the Electronic Nanomaterials Group at CFN, used image analysis software to calculate the grain size and repeat spacing of the line patterns.
While blending different concentrations of homopolymer to determine how much was needed to achieve the accelerated ordering, they discovered that the ordering sped up as more homopolymer was added. But too much homopolymer actually resulted in disordered patterns.
“The homopolymers accelerate the self-assembly process because they are small enough to uniformly distribute throughout their respective polymer blocks,” said Doerk. “Their presence weakens the interface between the two blocks, lowering the energy barrier associated with the block copolymer reconfiguring to remove the defects. But if the interface is weakened too much through the addition of too much homopolymer, then the blocks will mix together, resulting in a completely disordered phase.”
Guiding the self-assembly of useful nanopatterns in minutes
To demonstrate how the rapid ordering in the blended system could accelerate the self-assembly of well-aligned nanopatterns over large areas, Doerk and Yager used line-pattern templates they had previously prepared through photolithography. Used to build almost all of today’s digital devices, photolithography involves projecting light through a mask (a plate containing the desired pattern) that is positioned over a wafer (usually made of silicon) coated with a light-sensitive material. This template can then be used to direct the self-assembly of block copolymers, which fill in the spaces between the template guides. In this case, after only two minutes of annealing, the polymer blend self-assembles into lines that are aligned across these gaps. However, after the same annealing time, the unblended block copolymer self-assembles into a mostly unaligned pattern with many defects between the gaps.
“The width of the gaps is more than 80 times the repeat spacing, so the fact that we got this degree of alignment with our polymer blend is really exciting because it means we can use templates with huge gaps, created with very low-resolution lithography,” said Doerk. “Typically, expensive high-resolution lithography equipment is needed to align block copolymer patterns over this large of an area.”
For these patterns to be useful for many nanopatterning applications, they often need to be transferred to other more robust materials that can withstand harsh manufacturing processes—for example, etching, which removes layers from silicon wafer surfaces to create integrated circuits or make the surfaces antireflective. In this study, the scientists converted the nanopatterns into a metal-oxide replica. Through chemical etching, they then transferred the replica pattern into a silicon dioxide layer on a silicon wafer, achieving clearly defined line patterns.
Doerk suspects that blending homopolymers with other block copolymers will similarly yield accelerated assembly, and he is interested in studying blended polymers that self-assemble into more complicated patterns. The x-ray scattering capabilities at the National Synchrotron Light Source II—another DOE Office of Science User Facility at Brookhaven—could provide the structural information needed to conduct such studies.
“We have introduced a very simple and easily controlled way of immensely accelerating self-assembly,” concluded Doerk. “Our approach should substantially reduce the number of defects, helping to meet the demands of the semiconductor industry. At CFN, it opens up possibilities for us to use block copolymer self-assembly to make some of the new functional materials that we envision.”
Top row (l-r): Tata Center spinoff Khethworks develops affordable irrigation for the developing world; students discuss utility research in Washington; thin, lightweight solar cell developed by Professor Vladimir Bulović and team. Bottom row (l-r): MIT’s record-setting Alcator tokamak fusion research reactor; a researcher in the MIT Energy Laboratory’s Combustion Research Facility; Professor Kripa Varanasi, whose research on slippery surfaces has led to a spinoff co-founded with Associate Provost Karen Gleason.
Photos: Tata Center for Technology and Design, MITEI, Joel Jean and Anna Osherov, Bob Mumgaard/PSFC, Energy Laboratory Archives, Bryce Vickmark
Research, education, and student activities help create a robust community focused on fueling the world’s future.
On any given day at MIT, undergraduates design hydro-powered desalination systems, graduate students test alternative fuels, and professors work to tap the huge energy-generating potential of nuclear fusion, biomaterials, and more. While some MIT researchers are modeling the impacts of policy on energy markets, others are experimenting with electrochemical forms of energy storage.
This is the robust energy community at MIT. Developed over the past 10 years with the guidance and support of the MIT Energy Initiative (MITEI) — and with roots extending back into the early days of the Institute — it has engaged more than 300 faculty members and spans more than 900 research projects across all five schools.
In addition, MIT offers a multidisciplinary energy minor and myriad energy-related events and activities throughout the year. Together, these efforts ensure that students who arrive on campus with an interest in energy have free rein to pursue their ambitions.
Opportunities for students
“The MIT energy ecosystem is an incredible system, and it’s built from the ground up,” says Robert C. Armstrong, a professor of chemical engineering and the director of MITEI, which is overseen at the Institute level by Vice President for Research Maria Zuber. “It begins with extensive student involvement in energy.”
Opportunities begin the moment undergraduates arrive on campus, with a freshman pre-orientation program offered through MITEI that includes such hands-on activities as building motors and visiting the Institute’s nuclear research reactor.
“I got accepted into the pre-orientation program and from there, I was just hooked. I learned about solar technology, wind technology, different types of alternative fuels, biofuels, even wave power,” says graduate student Priyanka Chatterjee ’15, who minored in energy studies and majored in mechanical and ocean engineering.
Those who choose the minor take a core set of subjects encompassing energy science, technology, and social science. Those interested in a deep dive into research can participate in the Energy Undergraduate Research Opportunities Program (UROP), which provides full-time summer positions. UROP students are mentored by graduate students and postdocs, many of them members of the Society of Energy Fellows, who are also conducting their own energy research at MIT.
For extracurricular activities, students can join the MIT Energy Club, which is among the largest student-run organizations at MIT with more than 5,000 members. They can also compete for the MIT Clean Energy Prize, a student competition that awards more than $200,000 each year for energy innovation. And there are many other opportunities.
The Tata Center for Technology and Design, now in its sixth year, extends MIT’s reach abroad. It supports 65 graduate students every year who conduct research central to improving life in developing countries — including lowering costs of rural electrification and using solar energy in novel ways.
Students have other opportunities to conduct and share energy research internationally as well.
“Over the years, MITEI has made it possible for several of the students I’ve advised to engage more directly in global energy and climate policy negotiations,” says Valerie Karplus, an assistant professor of global economics and management. “In 2015, I joined them at the Paris climate conference, which was a tremendous educational and outreach experience for all of us.”
“What is important is to provide our students a holistic understanding of the energy challenges,” says MIT Associate Dean for Innovation Vladimir Bulović.
Adds Karplus: “There’s been an evolution in thinking from ‘How do we build a better mousetrap?’ to ‘How do we bring about change in society at a system level?’”
This kind of thinking is at the root of MIT’s multidisciplinary approach to addressing the global energy challenge — and it has been since MITEI was conceived and launched by then-MIT President Susan Hockfield, a professor of neuroscience. While energy research has been part of the Institute since its founding (MIT’s first president, William Barton Rogers, famously collapsed and died after uttering the words “bituminous coal” at the 1882 commencement), the concerted effort to connect researchers across the five schools for collaborative projects is a more recent development.
“The objective of MITEI was really to solve the big energy problems, which we feel needs all of the schools’ and departments’ contributions,” says Ernest J. Moniz, a professor emeritus of physics and special advisor to MIT’s president. Moniz was the founding director of MITEI before serving as U.S. Secretary of Energy during President Obama’s administration.
Hockfield says great technology by itself “can’t go anywhere without great policy.”
“It’s the economics, it’s the sociology, it’s the science and the engineering, it’s the architecture — it’s all of the pieces of MIT that had to come together if we were going to develop really impactful sustainable energy solutions,” she says.
This multidisciplinary approach is evident in much of MIT’s energy research — notably the series of comprehensive studies MITEI has conducted on such topics as the future of solar energy, natural gas, the electric grid, and more.
“To make a better world, it’s essential that we figure out how to take what we’ve learned at MIT in energy and get that out into the world,” Armstrong says.
MITEI’s eight low-carbon energy research centers — focused on a range of topics from materials design to solar generation to carbon capture and storage — similarly address challenges on multiple technology and policy fronts. These centers are a core component of MIT’s five-year Plan for Action on Climate Change, announced by President L. Rafael Reif in October 2015. The centers employ a strategy that has been fundamental to MIT’s energy work since the founding of MITEI: broad, sustained collaboration with stakeholders from industry, government, and the philanthropic and non-governmental organization communities.
“It’s one thing to do research that’s interesting in a laboratory. It’s something very different to take that laboratory discovery into the world and deliver practical applications,” Hockfield says. “Our collaboration with industry allowed us to do that with a kind of alacrity that we could never have done on our own.”
For example, MITEI’s members have supported more than 160 energy-focused research projects, representing $21.4 million in funding over the past nine years, through the Seed Fund Program. Projects have led to follow-on federal and industry funding, startup companies, and pilot plants for solar desalination systems in India and Gaza, among other outcomes.
What has MIT’s energy community as a whole accomplished over the past decade? Hockfield says it’s raised the visibility of the world’s energy problems, contributed solutions — both technical and sociopolitical — and provided “an army of young people” to lead the way to a sustainable energy future.
“I couldn’t be prouder of what MIT has contributed,” she says. “We are in the midst of a reinvention of how we make energy and how we use energy. And we will develop sustainable energy practices for a larger population, a wealthier population, and a healthier planet.”
A 60-acre solar farm in Camp Ripley, a National Guard base in Minnesota.
A new report suggests the economics of large-scale batteries are reaching an important inflection point.
When it comes to renewable energy, Minnesota isn’t typically a headline-grabber: in 2016 it got about 18 percent of its energy from wind, good enough to rank in the top 10 states.
But it’s just 28th in terms of installed solar capacity, and its relatively small size means projects within its borders rarely garner the attention that giants like California and Texas routinely get.
A new report on the future of energy in the state should turn some heads. According to the University of Minnesota’s Energy Transition Lab, starting in 2019 and for the foreseeable future, the overall cost of building grid-scale storage there will be less than that of building natural-gas plants to meet future energy demand.
Minnesota currently gets about 21 percent of its energy from renewables. That’s not bad, but current plans also call for bringing an additional 1,800 megawatts of gas-fired “peaker” plants online by 2028 to meet growing demand. As the moniker suggests, these plants are meant to spin up quickly to meet daily peaks in energy demand—something renewables tend to be bad at because the wind doesn’t always blow and the sun doesn’t always shine.
Storing energy from renewables could solve that problem, but it’s traditionally been thought of as too expensive compared with other forms of energy.
The new report suggests otherwise. According to the analysis, bringing lithium-ion batteries online for grid storage would be a good way to stockpile energy for when it’s needed, and it would prove less costly than building and operating new natural-gas plants.
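A rough way to see the shape of this comparison is to annualize each option’s capital cost with a capital recovery factor and add fixed operating costs. The sketch below is illustrative only: the dollar figures, lifetimes, and discount rate are hypothetical placeholders, not numbers from the Energy Transition Lab report, and it covers capacity cost alone, whereas the report’s conclusion rests on a fuller system model that includes fuel and energy value.

```python
# Hypothetical back-of-the-envelope comparison of annualized capacity cost.
# All input figures are illustrative assumptions, not values from the report.

def annualized_cost(capex_per_kw, rate, years, fixed_om_per_kw_yr):
    """Annualize capital cost via a capital recovery factor, plus fixed O&M."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return capex_per_kw * crf + fixed_om_per_kw_yr

# Assumptions: $1,000/kW battery over 15 years, $800/kW gas peaker over
# 30 years, 7% discount rate, fixed O&M of $10 and $15 per kW-year.
battery = annualized_cost(1000, 0.07, 15, 10)
peaker = annualized_cost(800, 0.07, 30, 15)

print(f"battery: ${battery:.0f}/kW-yr, peaker: ${peaker:.0f}/kW-yr")
```

With these placeholder inputs the peaker looks cheaper on capacity cost alone, which is exactly why the report’s finding is notable: once falling battery prices and the full operating picture are modeled, the balance tips the other way.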
The finding comes at an interesting time. For one thing, the price of lithium-ion batteries continues to plummet, something that certainly has the auto industry’s attention. And grid-scale batteries, while still relatively rare, are popping up more and more these days. The Minnesota report, then, suggests that such projects may become increasingly common—and could be a powerful way to lower emissions without sending our power bills skyrocketing in the process.
The average American drives about 30 miles (48 kilometers) per day, according to AAA, yet many people are still reluctant to buy electric cars that can travel three times that distance on a single charge.
This so-called range anxiety is one reason gasoline-powered vehicles still rule the road, but a team of scientists is working to ease those fears.
Mareike Wolter, Project Manager of Mobile Energy Storage Systems at Fraunhofer-Gesellschaft in Dresden, Germany, is working with a team on a new battery that would give electric cars a range of about 620 miles (1,000 km) on a single charge.
Wolter said the project began about three years ago when researchers from Fraunhofer as well as ThyssenKrupp System Engineering and IAV Automotive Engineering started brainstorming about how they could improve the energy density of automotive lithium batteries.
They turned to the popular all-electric car, the Tesla, as a starting point. Tesla’s latest vehicle, the Model S 100D, has a 100-kilowatt-hour battery pack, which reportedly gives it a range of 335 miles (540 km).
The pack is large, about 16 feet long, 6 feet wide, and 4 inches thick. It contains more than 8,000 lithium-ion battery cells, each one individually packaged inside a cylindrical housing that measures about 2 to 3 inches (6 to 7 centimeters) high and about 0.8 inches (2 cm) across.
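The per-cell numbers implied by those figures are easy to check. A minimal sketch, assuming a mid-range cell height of 6.5 cm and rounding “more than 8,000” cells down to an even 8,000:

```python
import math

# Rough per-cell arithmetic from the pack figures quoted above.
pack_wh = 100_000                    # 100 kWh pack
n_cells = 8000                       # "more than 8,000" cells, rounded
wh_per_cell = pack_wh / n_cells      # energy per cell in Wh

# Approximate cell volume: cylinder 2 cm across, 6.5 cm tall (assumed
# mid-range of the 6-7 cm quoted above).
radius_cm, height_cm = 1.0, 6.5
volume_cm3 = math.pi * radius_cm**2 * height_cm

# Volumetric energy density in Wh per liter (1 L = 1,000 cm^3)
wh_per_liter = wh_per_cell / (volume_cm3 / 1000)
print(round(wh_per_cell, 1), "Wh/cell,", round(wh_per_liter), "Wh/L")
```

That works out to roughly 12.5 Wh per cell, or on the order of 600 Wh per liter at the cell level, which is the baseline the Fraunhofer team set out to beat.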
“We thought if we could use the same space as the battery in the Tesla, but improve the energy density and finally drive 1,000 km, this would be nice,” Wolter told Live Science.
One way of doing this would be to refine the materials inside the battery so that it could store more energy, Wolter said. Another would be to improve the design of the system as a whole.
Nearly 50 percent of each cell is devoted to components such as the housing, the anode (the battery’s negative terminal), the cathode (the battery’s positive terminal) and the electrolyte, the liquid that transports the charged particles.
Additional space is needed inside the car to wire the battery packs to the vehicle’s electrical system.
“It’s a lot of wasted space,” Wolter said. “You have a lot of inactive components in the system, and that’s a problem from our point of view.”
So the scientists decided to reimagine the entire design.
An illustration that shows how the new electric battery is stacked like a ream of paper. Credit: Fraunhofer IKTS
To do so, they got rid of the housings that encase individual batteries and turned to a thin, sheet-like design instead of a cylinder.
Their metallic sheet is coated with an energy-storage material made from powdered ceramic mixed with a polymer binder. One side serves as the cathode, and the other side serves as the anode.
The researchers stacked several of these so-called bipolar electrodes one on top of the other, like sheets of paper in a ream, separating the electrodes by thin layers of electrolyte and a material that prevents electrical charges from shorting out the whole system.
The “ream” is sealed within a package measuring about 10 square feet (1 square meter), and contacts on the top and bottom connect to the car’s electrical system.
The goal is to build a battery system that fits in the same space as the one used by Tesla’s vehicles or other electric vehicles, the researchers said.
“We can put more electrodes storing the energy in the same space,” Wolter said.
She added that the researchers aim to have such a system ready to test in cars by 2020.
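The payoff of dropping per-cell packaging can be sketched in one line of arithmetic: if roughly half of a conventional cell’s volume goes to inactive components, as described above, and the bipolar stack cut that overhead to, say, 20 percent (a purely hypothetical figure for illustration), the energy-storing material in a fixed volume grows by the ratio of the active fractions.

```python
# Illustrative sketch of the space saved by removing per-cell packaging.
# The 50% overhead figure comes from the article; the 20% figure for a
# bipolar stack is a hypothetical assumption, not a Fraunhofer number.

def active_material_gain(overhead_old, overhead_new):
    """Factor by which active (energy-storing) volume grows in a fixed space."""
    return (1 - overhead_new) / (1 - overhead_old)

gain = active_material_gain(0.50, 0.20)
print(f"{gain:.1f}x more active material in the same volume")
```

Under those assumptions the same pack envelope holds 1.6 times as much active material, before any improvement to the storage material itself.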
German chancellor Angela Merkel visits Accumotive’s plant in Kamenz, Germany.
Tesla gets the headlines, but big battery factories are being built all over the world, driving down prices.
Battery production is booming, and Tesla is far from the only game in town.
According to Bloomberg New Energy Finance, global battery production is forecast to more than double between now and 2021. The expansion is in turn driving prices down, good news both for the budding electric-car industry and for energy companies looking to build out grid-scale storage to back up renewable forms of energy.
While Tesla gets tons of attention for its “gigafactories”—one in Nevada that will produce batteries, and another in New York that will produce solar panels—the fact is, the company has a lot of battery-building competition.
Exhibit A is a new battery plant in Kamenz, Germany, run by Accumotive. The half-billion-euro facility broke ground on Monday with a visit from German chancellor Angela Merkel and will supply batteries to its parent company, Daimler, which is betting heavily on the burgeoning electric-vehicle market.
But the lion’s share of growth is expected to be in Asia. BYD, Samsung, LG, and Panasonic (which has partnered with Tesla) are all among the world’s top battery producers, and nine of the world’s largest new battery factories are under construction in China, according to Benchmark Minerals.
That competition means the steady downward trend in battery prices is going to continue. On a per-kilowatt-hour basis, costs have fallen from $542 in 2012 to around $139 today, according to analysis by Benchmark.
That makes for a huge difference in the cost of an electric car, of which 40 percent is usually down to the battery itself.
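The Benchmark figures imply a steep compound rate of decline. A quick sketch, assuming the $542 and $139 per-kilowatt-hour prices are five years apart (2012 to the article’s present day):

```python
# Implied average annual decline in battery prices, from the Benchmark
# figures quoted above. The five-year span is an assumption based on the
# article's timeframe, not a number stated in the text.

start_price, end_price = 542.0, 139.0   # $/kWh in 2012 vs. "today"
years = 5

# Compound annual rate of decline
annual_decline = 1 - (end_price / start_price) ** (1 / years)
print(f"~{annual_decline:.0%} average annual price decline")
```

Under that assumption, prices have been falling by roughly a quarter per year on average, which is why analysts keep pulling forward their estimates of when electric cars reach price parity.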
Bloomberg’s analysts have already said that the 2020s could be the decade in which electric cars take off—and one even went so far as to say that by 2030, electric cars could be cheaper than those powered by internal combustion.
Those watching the industry might worry that a flood of cheap batteries could end up hurting profitability for producers, as happened in the solar-panel business.
That could happen, but India and China, two huge rising automotive markets, are bullish about using electric cars to help solve problems like traffic congestion and air pollution. So even as supply ramps up, there is likely to be plenty of demand to go around.