From seawater to drinking water – With just a push of a button!


MIT Researchers build a portable desalination unit that generates clear, clean drinking water without the need for filters or high-pressure pumps.

MIT researchers have developed a portable desalination unit, weighing less than 10 kilograms, that can remove particles and salts to generate drinking water.

The suitcase-sized device, which requires less power to operate than a cell phone charger, can also be driven by a small, portable solar panel, which can be purchased online for around $50. It automatically generates drinking water that exceeds World Health Organization quality standards. The technology is packaged into a user-friendly device that runs with the push of one button.

Unlike other portable desalination units that require water to pass through filters, this device utilizes electrical power to remove particles from drinking water. Eliminating the need for replacement filters greatly reduces the long-term maintenance requirements.

This could enable the unit to be deployed in remote and severely resource-limited areas, such as communities on small islands or aboard seafaring cargo ships. It could also be used to aid refugees fleeing natural disasters or by soldiers carrying out long-term military operations.

“This is really the culmination of a 10-year journey that I and my group have been on. We worked for years on the physics behind individual desalination processes, but pushing all those advances into a box, building a system, and demonstrating it in the ocean, that was a really meaningful and rewarding experience for me,” says senior author Jongyoon Han, a professor of electrical engineering and computer science and of biological engineering, and a member of the Research Laboratory of Electronics (RLE).

Joining Han on the paper are first author Junghyo Yoon, a research scientist in RLE; Hyukjin J. Kwon, a former postdoc; SungKu Kang, a postdoc at Northeastern University; and Eric Brack of the U.S. Army Combat Capabilities Development Command (DEVCOM). The research has been published online in Environmental Science & Technology.

Watch: YouTube Video

Filter-free technology

Commercially available portable desalination units typically require high-pressure pumps to push water through filters, which are very difficult to miniaturize without compromising the energy-efficiency of the device, explains Yoon.

Instead, their unit relies on a technique called ion concentration polarization (ICP), which was pioneered by Han’s group more than 10 years ago. Rather than filtering water, the ICP process applies an electrical field to membranes placed above and below a channel of water. The membranes repel positively or negatively charged particles — including salt molecules, bacteria, and viruses — as they flow past. The charged particles are funneled into a second stream of water that is eventually discharged.

The process removes both dissolved and suspended solids, allowing clean water to pass through the channel. Since it only requires a low-pressure pump, ICP uses less energy than other techniques.

But ICP does not always remove all the salts floating in the middle of the channel. So the researchers incorporated a second process, known as electrodialysis, to remove remaining salt ions.

Yoon and Kang used machine learning to find the ideal combination of ICP and electrodialysis modules. The optimal setup includes a two-stage ICP process, with water flowing through six modules in the first stage then through three in the second stage, followed by a single electrodialysis process. This minimized energy usage while ensuring the process remains self-cleaning.

“While it is true that some charged particles could be captured on the ion exchange membrane, if they get trapped, we just reverse the polarity of the electric field and the charged particles can be easily removed,” Yoon explains.

They shrank and stacked the ICP and electrodialysis modules to improve their energy efficiency and enable them to fit inside a portable device. The researchers designed the device for nonexperts, with just one button to launch the automatic desalination and purification process. Once the salinity level and the number of particles decrease to specific thresholds, the device notifies the user that the water is drinkable.
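To make that configuration concrete, here is a toy sketch of the treatment train in Python. The per-module salt-removal fractions, the series arrangement, and the potability threshold are illustrative assumptions for the sketch, not figures from the paper:

```python
# Toy model of the unit's treatment train: six first-stage ICP modules,
# three second-stage ICP modules, then one electrodialysis stage (per the
# article). Removal fractions and the threshold below are placeholders.

STAGES = [
    ("ICP stage 1", 6, 0.35),   # (name, module count, assumed removal per module)
    ("ICP stage 2", 3, 0.35),
    ("electrodialysis", 1, 0.50),
]

TDS_LIMIT_MG_L = 600.0          # placeholder palatability threshold

def desalinate(feed_tds_mg_l):
    """Run the feed through each stage, assuming the modules act in series."""
    tds = feed_tds_mg_l
    for name, modules, removal in STAGES:
        for _ in range(modules):
            tds *= 1.0 - removal
        print(f"after {name}: {tds:,.0f} mg/L")
    return tds

out = desalinate(35_000)        # typical seawater: ~35,000 mg/L dissolved solids
print("drinkable" if out < TDS_LIMIT_MG_L else "not drinkable")
```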

The researchers also created a smartphone app that can control the unit wirelessly and report real-time data on power consumption and water salinity.

Beach tests

After running lab experiments using water with different salinity and turbidity (cloudiness) levels, they field-tested the device at Boston’s Carson Beach.

Yoon and Kwon set the box near the shore and tossed the feed tube into the water. In about half an hour, the device had filled a plastic drinking cup with clear, drinkable water.

“It was successful even in its first run, which was quite exciting and surprising. But I think the main reason we were successful is the accumulation of all these little advances that we made along the way,” Han says.

The resulting water exceeded World Health Organization quality guidelines, and the unit reduced the amount of suspended solids by at least a factor of 10. Their prototype generates drinking water at a rate of 0.3 liters per hour and requires only 20 watt-hours of energy per liter.
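For a sense of scale, those two figures imply a very small average power draw. A quick sanity check, using only the numbers reported in this article:

```python
# Back-of-the-envelope check of the prototype's average power draw,
# using the figures reported in this article.

flow_rate_l_per_h = 0.3     # drinking water output, liters per hour
energy_wh_per_l = 20.0      # energy required per liter, watt-hours

# Average power = energy per liter x liters per hour
power_w = energy_wh_per_l * flow_rate_l_per_h
print(f"Average power draw: {power_w:.1f} W")   # -> 6.0 W

# Typical phone chargers are rated around 5-20 W, so a ~6 W draw is
# consistent with "less power than a cell phone charger" and is well
# within reach of a small ~$50 portable solar panel.
```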

“Right now, we are pushing our research to scale up that production rate,” Yoon says.

One of the biggest challenges of designing the portable system was engineering an intuitive device that could be used by anyone, Han says.

Yoon hopes to make the device more user-friendly and improve its energy efficiency and production rate through a startup he plans to launch to commercialize the technology.

In the lab, Han wants to apply the lessons he’s learned over the past decade to water-quality issues that go beyond desalination, such as rapidly detecting contaminants in drinking water.

“This is definitely an exciting project, and I am proud of the progress we have made so far, but there is still a lot of work to do,” he says.

For example, while “development of portable systems using electro-membrane processes is an original and exciting direction in off-grid, small-scale desalination,” the effects of fouling, especially if the water has high turbidity, could significantly increase maintenance requirements and energy costs, notes Nidal Hilal, professor of engineering and director of the New York University Abu Dhabi Water Research Center, who was not involved with this research.

“Another limitation is the use of expensive materials,” he adds. “It would be interesting to see similar systems with low-cost materials in place.”

The research was funded, in part, by the DEVCOM Soldier Center, the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS), the Experimental AI Postdoc Fellowship Program of Northeastern University, and the Roux AI Institute.

MIT Creates Waterless Cleaning System to Remove Dust on Solar Panels: Maintains Peak Efficiency and Service Longevity


The accumulation of dust on solar panels or mirrors is already a significant issue – it can reduce the output of photovoltaic panels. So regular cleaning is essential for such installations to maintain their peak efficiency. However, cleaning solar panels is currently estimated to use billions of gallons of water per year, and attempts at waterless cleaning are labor-intensive and tend to cause irreversible scratching of the surfaces, which also reduces efficiency. Robots can be useful; recently, a Belgian startup developed HELIOS, an automated cleaning service for solar panels.

Now, a team of researchers at MIT has developed a waterless cleaning method to remove dust on solar installations in water-limited regions, improving overall efficiency.

The waterless, no-contact system uses electrostatic repulsion to cause dust particles to detach, without the need for water or brushes. To activate the system, a simple electrode passes just above the solar panel’s surface. The electrical charge it imparts repels dust particles from the panels. The system can be operated automatically using a simple electric motor and guide rails along the side of the panel.

The team designed and fabricated an electrostatic dust removal system for a lab-scale solar panel. The glass plate on top of the panel was coated, using atomic layer deposition (ALD), with a 5-nm-thick transparent and conductive layer of aluminum-doped zinc oxide (AZO), which formed the bottom electrode. The top electrode is mobile to avoid shading and moves along the panel during cleaning via a linear guide and stepper motor mechanism. The system can be operated at a voltage of around 12V and can recover 95% of the lost power after cleaning, for particle sizes greater than around 30 μm.

“We performed experiments at varying humidities from 5% to 95%,” says MIT graduate student Sreedath Panat. “As long as the ambient humidity is greater than 30%, you can remove almost all of the particles from the surface, but as humidity decreases, it becomes harder.”

By eliminating the dependency on trucked-in water, preventing the build-up of dust that can contain corrosive compounds, and lowering overall operational costs, such cleaning systems have the potential to significantly improve the overall efficiency and reliability of solar installations, says MIT professor Kripa Varanasi.

MIT’s Solar-Powered Desalination System More Efficient, Less Expensive


A team of researchers at MIT and in China has developed a new solar-powered desalination system that is both more efficient and less expensive than previous solar desalination methods. The process could be used to treat contaminated wastewater or to generate steam for sterilizing medical instruments, all without requiring any power source other than sunlight itself.

Many attempts at solar desalination systems rely on some kind of wick to draw the saline water through the device, but these wicks are vulnerable to salt accumulation and relatively difficult to clean. The MIT team focused on developing a wick-free system instead.

The system comprises several layers, with dark material at the top to absorb the sun’s heat, then a thin layer of water above a perforated layer of material, all sitting atop a deep reservoir of salty water such as a tank or a pond. The researchers determined the optimal size for the holes drilled through the perforated material, which in their tests was made of polyurethane. At 2.5 millimeters across, the holes can be easily made using commonly available waterjets.

In this schematic, a confined water layer above the floating thermal insulation enables the simultaneous thermal localization and salt rejection. Credit: MIT

Heated by the dark material, the thin layer of water evaporates; the vapor can then be condensed onto a sloped surface and collected as pure water. Meanwhile, the holes in the perforated material are large enough to allow for a natural convective circulation between the warmer upper layer of water and the colder reservoir below. That circulation naturally draws the salt from the thin layer above down into the much larger body of water below, where it becomes well-diluted and is no longer a problem.

During the experiments, the team says the new technique achieved over 80 percent efficiency in converting solar energy to water vapor, even at salt concentrations of up to 20 percent by weight. Their test apparatus operated for a week with no signs of any salt accumulation.

Researchers test two identical outdoor experimental setups placed next to each other. Credit: MIT

So far, the team has proven the concept using small benchtop devices, so the next step will be starting to scale up to devices that could have practical applications. According to the researchers, their system with just 1 square meter (about a square yard) of collecting area should be sufficient to provide a family’s daily needs for drinking water. They calculated that the necessary materials for a 1-square-meter device would cost only about $4.
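The claim that 1 square meter can cover a family’s daily drinking water is plausible on simple physics. Here is a rough estimate; the insolation, the effective sun hours, and full condensate recovery are assumptions for the sketch, while the 80 percent efficiency comes from the experiments above:

```python
# Rough daily-yield estimate for a 1 m^2 wick-free solar still.
# Assumed (not from the paper): 1 kW/m^2 peak sun, 5 effective full-sun
# hours per day, and all vapor recovered as condensate.

AREA_M2 = 1.0
PEAK_SUN_W_PER_M2 = 1000.0        # standard test-condition insolation
FULL_SUN_HOURS = 5.0              # assumed daily full-sun equivalent
EFFICIENCY = 0.80                 # reported solar-to-vapor conversion
LATENT_HEAT_J_PER_KG = 2.26e6     # heat needed to evaporate 1 kg of water

daily_energy_j = AREA_M2 * PEAK_SUN_W_PER_M2 * FULL_SUN_HOURS * 3600 * EFFICIENCY
water_liters = daily_energy_j / LATENT_HEAT_J_PER_KG   # 1 kg of water ~ 1 liter
print(f"Estimated yield: {water_liters:.1f} liters/day")   # ~6.4 L/day
```

Around 6 liters per day is indeed in the range of a family’s daily drinking water needs.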

Off Grid Solar Desal

The team says the first applications are likely to be providing safe water in remote off-grid locations or for disaster relief after hurricanes, earthquakes, or other disruptions of normal water supplies. MIT graduate student Lenan Zhang adds that “if we can concentrate the sunlight a little bit, we could use this passive device to generate high-temperature steam to do medical sterilization” for off-grid rural areas.

MIT – A New Language for Quantum Computing


While the nascent field of quantum computing can feel flashy and futuristic, quantum computers have the potential for computational breakthroughs in classically unsolvable tasks, like cryptographic and communication protocols, search, and computational physics and chemistry. Credit: Graham Carlow/IBM

Twist is an MIT-developed programming language that can describe and verify which pieces of data are entangled to prevent bugs in a quantum program.

Time crystals. Microwaves. Diamonds. What do these three disparate things have in common? 

Quantum computing. Unlike traditional computers that use bits, quantum computers use qubits to encode information as zeros or ones, or both at the same time. Coupled with a cocktail of forces from quantum physics, these refrigerator-sized machines can process a whole lot of information — but they’re far from flawless. Just like our regular computers, we need to have the right programming languages to properly compute on quantum computers. 

Programming quantum computers requires awareness of something called “entanglement,” a sort of computational multiplier for qubits that translates to a lot of power. When two qubits are entangled, actions on one qubit can change the value of the other, even when they are physically separated, giving rise to Einstein’s characterization of “spooky action at a distance.” But that potency is also a source of weakness. When programming, discarding one qubit without being mindful of its entanglement with another qubit can destroy the data stored in the other, jeopardizing the correctness of the program. 

Scientists from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) aimed to do some unraveling by creating their own programming language for quantum computing called Twist. Twist can describe and verify which pieces of data are entangled in a quantum program, through a language a classical programmer can understand. The language uses a concept called purity, which enforces the absence of entanglement and results in more intuitive programs, with ideally fewer bugs. For example, a programmer can use Twist to say that the temporary data generated as garbage by a program is not entangled with the program’s answer, making it safe to throw away.
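Twist itself is a research language, but the hazard it guards against is easy to reproduce with ordinary quantum simulation. Here is a minimal NumPy sketch (not Twist code) of why discarding matters: tracing out one half of an entangled pair scrambles the remaining qubit, while discarding a qubit from an unentangled, pure pair leaves the other untouched:

```python
import numpy as np

def discard_second_qubit(state):
    """Trace out qubit 1 of a 2-qubit state vector and return
    qubit 0's reduced density matrix."""
    m = state.reshape(2, 2)        # m[i, j] = amplitude of |i>_0 |j>_1
    return m @ m.conj().T          # partial trace over qubit 1

# Unentangled (product) state: qubit 0 in |+>, qubit 1 in |0>.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
zero = np.array([1.0, 0.0])
print(discard_second_qubit(np.kron(plus, zero)))
# -> [[0.5, 0.5], [0.5, 0.5]]: qubit 0 is still the pure |+> state;
#    the discard was safe.

# Entangled Bell state: (|00> + |11>) / sqrt(2).
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
print(discard_second_qubit(bell))
# -> [[0.5, 0.0], [0.0, 0.5]]: maximally mixed; whatever qubit 0 encoded
#    is gone. This is the bug class Twist's purity checks rule out.
```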

While the nascent field can feel a little flashy and futuristic, with images of mammoth wiry gold machines coming to mind, quantum computers have potential for computational breakthroughs in classically unsolvable tasks, like cryptographic and communication protocols, search, and computational physics and chemistry. One of the key challenges in computational sciences is dealing with the complexity of the problem and the amount of computation needed. Whereas a classical digital computer would need an exponentially large number of bits to run such simulations, a quantum computer could potentially do it using a very small number of qubits, if the right programs are there. 

“Our language Twist allows a developer to write safer quantum programs by explicitly stating when a qubit must not be entangled with another,” says Charles Yuan, an MIT PhD student in electrical engineering and computer science and the lead author on a new paper about Twist. “Because understanding quantum programs requires understanding entanglement, we hope that Twist paves the way to languages that make the unique challenges of quantum computing more accessible to programmers.” 

Yuan wrote the paper alongside Chris McNally, a PhD student in electrical engineering and computer science who is affiliated with the MIT Research Laboratory of Electronics, as well as MIT Assistant Professor Michael Carbin. They presented the research at last week’s 2022 Symposium on Principles of Programming Languages (POPL) conference in Philadelphia.

Untangling quantum entanglement 

Imagine a wooden box that has a thousand cables protruding out from one side. You can pull any cable all the way out of the box, or push it all the way in.

After you do this for a while, the cables form a pattern of bits — zeros and ones — depending on whether they’re in or out. This box represents the memory of a classical computer. A program for this computer is a sequence of instructions for when and how to pull on the cables.

Now imagine a second, identical-looking box. This time, you tug on a cable, and see that as it emerges, a couple of other cables are pulled back inside. Clearly, inside the box, these cables are somehow entangled with each other. 

The second box is an analogy for a quantum computer, and understanding the meaning of a quantum program requires understanding the entanglement present in its data. But detecting entanglement is not straightforward. You can’t see into the wooden box, so the best you can do is try pulling on cables and carefully reason about which are entangled. In the same way, quantum programmers today have to reason about entanglement by hand. This is where the design of Twist helps massage some of those interlaced pieces. 

The scientists designed Twist to be expressive enough to write out programs for well-known quantum algorithms and identify bugs in their implementations. To evaluate Twist’s design, they modified the programs to introduce some kind of bug that would be relatively subtle for a human programmer to detect, and showed that Twist could automatically identify the bugs and reject the programs.

They also measured how well the programs performed in practice in terms of runtime; the Twist programs ran with less than 4 percent overhead over existing quantum programming techniques.

For those wary of quantum’s “seedy” reputation in its potential to break encryption systems, Yuan says it’s still not very well known to what extent quantum computers will actually be able to reach their performance promises in practice. “There’s a lot of research that’s going on in post-quantum cryptography, which exists because even quantum computing is not all-powerful. So far, there’s a very specific set of applications in which people have developed algorithms and techniques where a quantum computer can outperform classical computers.” 

An important next step is using Twist to create higher-level quantum programming languages. Most quantum programming languages today still resemble assembly language, stringing together low-level operations without mindfulness toward things like data types and functions that are typical in classical software engineering.

“Quantum computers are error-prone and difficult to program. By introducing and reasoning about the ‘purity’ of program code, Twist takes a big step towards making quantum programming easier by guaranteeing that the quantum bits in a pure piece of code cannot be altered by bits not in that code,” says Fred Chong, the Seymour Goodman Professor of Computer Science at the University of Chicago and chief scientist at Super.tech. 

The work was supported, in part, by the MIT-IBM Watson AI Lab, the National Science Foundation, and the Office of Naval Research.

Samsung and IBM Could Break the Nanosheet Threshold in Chips With ‘Vertically Stacked Transistors’ – a Design They Indicate Could DOUBLE Processor Performance


This design can either double the performance of chips or reduce power use by 85%.

In May of 2021, we brought you a breakthrough in semiconductor materials that saw the creation of a chip that could push back the “end” of Moore’s Law and further widen the capability gap between China and U.S.-adjacent efforts in the field of 1-nanometer chips.

Now, IBM and Samsung claim they have also made a breakthrough in semiconductor design, revealing a new concept for stacking transistors vertically on a chip, according to a press release. The design is called the Vertical Transport Field Effect Transistor (VTFET), and it builds transistors perpendicular to the chip’s surface so that current flows vertically.

That earlier breakthrough was accomplished in a joint effort involving the Massachusetts Institute of Technology (MIT), National Taiwan University (NTU), and the Taiwan Semiconductor Manufacturing Co. (TSMC), the world’s largest contract manufacturer of advanced chips. At its core was a process that employs the semi-metal bismuth to allow for the manufacture of semiconductors below the 1-nanometer (nm) level.

VTFET is a drastic change from today’s designs, in which transistors lie flat on the surface of the silicon and electric current flows from side to side. By building vertically instead, IBM and Samsung hope to extend Moore’s Law beyond the nanosheet threshold and waste less energy.

What will that look like in terms of processors? IBM and Samsung state that VTFET chips could either double the performance of, or use 85 percent less power than, chips designed with today’s FinFET transistors. But these two firms are not the only ones testing this type of technology.

Intel is also experimenting with chips stacked above each other, as reported by Reuters. “By stacking the devices directly on top of each other, we’re clearly saving area,” Paul Fischer, director and senior principal engineer of Intel’s Components Research Group told Reuters in an interview. “We’re reducing interconnect lengths and really saving energy, making this not only more cost efficient, but also better performing.”

All these advances are great for our cell phones, which could one day go weeks without charging, and for energy-intensive activities such as crypto mining. But then, we might also find ourselves in a Jevons paradox, which occurs when technological progress increases the efficiency with which a resource is used, but the rate of consumption of that resource rises anyway because of increasing demand. Isn’t that what’s going on with cryptocurrencies in a way?

The Rapid Cost Decline of Lithium-Ion Batteries – Why?


Lithium-ion batteries, those marvels of lightweight power that have made possible today’s age of handheld electronics and electric vehicles, have plunged in cost since their introduction three decades ago at a rate similar to the drop in solar panel prices, as documented by a study published last March.

But what brought about such an astonishing cost decline, of about 97 percent?

Some of the researchers behind that earlier study have now analyzed what accounted for the extraordinary savings. They found that by far the biggest factor was work on research and development, particularly in chemistry and materials science. This outweighed the gains achieved through economies of scale, though that turned out to be the second-largest category of reductions.

The new findings are being published in the journal Energy and Environmental Science, in a paper by MIT postdoc Micah Ziegler, recent graduate student Juhyun Song PhD ’19, and Jessika Trancik, a professor in MIT’s Institute for Data, Systems, and Society.

The findings could be useful for policymakers and planners to help guide spending priorities in order to continue the pathway toward ever-lower costs for this and other crucial energy storage technologies, according to Trancik. Their work suggests that there is still considerable room for further improvement in electrochemical battery technologies, she says.

The analysis required digging through a variety of sources, since much of the relevant information consists of closely held proprietary business data. “The data collection effort was extensive,” Ziegler says. “We looked at academic articles, industry and government reports, press releases, and specification sheets. We even looked at some legal filings that came out. We had to piece together data from many different sources to get a sense of what was happening.” He says they collected “about 15,000 qualitative and quantitative data points, across 1,000 individual records from approximately 280 references.”

Data from the earliest times are hardest to access and can have the greatest uncertainties, Trancik says, but by comparing different data sources from the same period they have attempted to account for these uncertainties.

Overall, she says, “we estimate that the majority of the cost decline, more than 50 percent, came from research-and-development-related activities.” That included both private sector and government-funded research and development, and “the vast majority” of that cost decline within that R&D category came from chemistry and materials research.

That was an interesting finding, she says, because “there were so many variables that people were working on through very different kinds of efforts,” including the design of the battery cells themselves, their manufacturing systems, supply chains, and so on. “The cost improvement emerged from a diverse set of efforts and many people, and not from the work of only a few individuals.”

The findings about the importance of investment in R&D were especially significant, Ziegler says, because much of this investment happened after lithium-ion battery technology was commercialized, a stage at which some analysts thought the research contribution would become less significant. Over roughly a 20-year period starting five years after the batteries’ introduction in the early 1990s, he says, “most of the cost reduction still came from R&D. The R&D contribution didn’t end when commercialization began. In fact, it was still the biggest contributor to cost reduction.”

The study took advantage of an analytical approach that Trancik and her team initially developed to analyze the similarly precipitous drop in costs of silicon solar panels over the last few decades. They also applied the approach to understand the rising costs of nuclear energy. “This is really getting at the fundamental mechanisms of technological change,” she says. “And we can also develop these models looking forward in time, which allows us to uncover the levers that people could use to improve the technology in the future.”

One advantage of the methodology Trancik and her colleagues have developed, she says, is that it helps to sort out the relative importance of different factors when many variables are changing all at once, which typically happens as a technology improves. “It’s not simply adding up the cost effects of these variables,” she says, “because many of these variables affect many different cost components. There’s this kind of intricate web of dependencies.” But the team’s methodology makes it possible to “look at how that overall cost change can be attributed to those variables, by essentially mapping out that network of dependencies,” she says.
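To make that web of dependencies concrete, here is a toy attribution in the spirit of the approach. The cost model and every number in it are invented placeholders, not the paper’s model or data; the point is that with a multiplicative cost model, each variable’s contribution to the change in log cost can be read off by switching one variable at a time:

```python
import math

# Toy cost model, for illustration only:
#   cost ($/kWh) = materials_price / specific_energy * (ref_scale / scale)**b
# The multiplicative form means per-variable contributions add on a log scale.

def cost(materials_price, specific_energy, scale, ref_scale=1.0, b=0.3):
    return materials_price / specific_energy * (ref_scale / scale) ** b

# Hypothetical "then" and "now" values for three changing variables.
then = dict(materials_price=80.0, specific_energy=0.08, scale=1.0)
now = dict(materials_price=40.0, specific_energy=0.28, scale=50.0)

total = math.log(cost(**now) / cost(**then))
for var in then:
    swapped = dict(then, **{var: now[var]})     # change one variable at a time
    contrib = math.log(cost(**swapped) / cost(**then))
    print(f"{var:16s} {contrib / total:6.1%} of the log-cost decline")
```

In this toy example the two R&D-driven variables (materials price and specific energy) together account for most of the decline, echoing the structure, though not the numbers, of the study’s finding.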

This can help provide guidance on public spending, private investments, and other incentives. “What are all the things that different decision makers could do?” she asks. “What decisions do they have agency over so that they could improve the technology, which is important in the case of low-carbon technologies, where we’re looking for solutions to climate change and we have limited time and limited resources? The new approach allows us to potentially be a bit more intentional about where we make those investments of time and money.”

David Chandler, MIT News Office

More information: Determinants of lithium-ion battery technology cost decline, Energy and Environmental Science (2021). DOI: 10.1039/d1ee01313k

Journal information: Energy and Environmental Science

Provided by Massachusetts Institute of Technology

Making the case for hydrogen in a zero-carbon economy


Hydrogen Power
As the United States races to achieve its goal of zero-carbon electricity generation by 2035, energy providers are swiftly ramping up renewable resources such as solar and wind. But because these technologies churn out electrons only when the sun shines and the wind blows, they need backup from other energy sources, especially during seasons of high electric demand. Currently, plants burning fossil fuels, primarily natural gas, fill in the gaps.

“As we move to more and more renewable penetration, this intermittency will make a greater impact on the grid,” says Emre Gençer, a research scientist at the MIT Energy Initiative (MITEI). That’s because grid operators will increasingly resort to fossil-fuel-based “peaker” plants that compensate for the intermittency of the variable renewable energy (VRE) sources of sun and wind. “If we’re to achieve zero-carbon electricity, we must replace all greenhouse gas-emitting sources,” Gençer says.

Low- and zero-carbon alternatives to greenhouse-gas-emitting peaker plants are in development, such as arrays of lithium-ion batteries and hydrogen-fired power generation. But each of these evolving technologies comes with its own set of advantages and constraints, and it has proven difficult to frame the debate about these options in a way that’s useful for policymakers, investors, and utilities engaged in the clean energy transition.

Now, Gençer and Drake D. Hernandez SM ’21 have come up with a model that makes it possible to pin down the pros and cons of these peaker-plant alternatives with greater precision. Their hybrid technological and economic model, based on a detailed inventory of California’s power system, was published online last month in Applied Energy. While their work focuses on the most cost-effective solutions for replacing peaker power plants, it also contains insights intended to contribute to the larger conversation about transforming energy systems.

“Our study’s essential takeaway is that hydrogen-fired power generation can be the more economical option when compared to lithium-ion batteries—even today, when the costs of hydrogen production, transmission, and storage are very high,” says Hernandez, who worked on the study while a graduate research assistant for MITEI. Adds Gençer, “If there is a place for hydrogen in the cases we analyzed, that suggests there is a promising role for hydrogen to play in the energy transition.”

Adding up the costs

California serves as a stellar paradigm for a swiftly shifting power system. The state draws more than 20 percent of its electricity from solar and approximately 7 percent from wind, with more VRE coming online rapidly. This means its peaker plants already play a pivotal role, coming online each evening when the sun goes down or when events such as heat waves drive up electricity use for days at a time.

“We looked at all the peaker plants in California,” recounts Gençer. “We wanted to know the cost of electricity if we replaced them with hydrogen-fired turbines or with lithium-ion batteries.” The researchers used a core metric called the levelized cost of electricity (LCOE) as a way of comparing the costs of different technologies to each other. LCOE measures the average total cost of building and operating a particular energy-generating asset per unit of total electricity generated over the hypothetical lifetime of that asset.
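As a concrete illustration of the metric, a minimal LCOE calculation might look like the sketch below; every cost input is a placeholder, not a value from the study:

```python
# Minimal levelized-cost-of-electricity (LCOE) calculation.
# All inputs are illustrative placeholders, not values from the study.

def lcoe(capex, annual_opex, annual_mwh, lifetime_years, discount_rate):
    """LCOE = discounted lifetime costs / discounted lifetime generation."""
    costs = capex
    energy = 0.0
    for year in range(1, lifetime_years + 1):
        df = (1 + discount_rate) ** -year   # discount factor for this year
        costs += annual_opex * df
        energy += annual_mwh * df
    return costs / energy                   # $ per MWh

# Hypothetical 100 MW peaker running 15 percent of the hours in a year,
# matching the study's definition of peaker duty.
annual_mwh = 100 * 8760 * 0.15
print(f"LCOE: ${lcoe(80e6, 4e6, annual_mwh, 20, 0.07):.0f}/MWh")
```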

Selecting 2019 as their base study year, the team looked at the costs of running natural gas-fired peaker plants, which they defined as plants operating 15 percent of the year in response to gaps in intermittent renewable electricity. In addition, they determined the amount of carbon dioxide released by these plants and the expense of abating these emissions. Much of this information was publicly available.

Coming up with prices for replacing peaker plants with massive arrays of lithium-ion batteries was also relatively straightforward: “There are no technical limitations to lithium-ion, so you can build as many as you want; but they are super expensive in terms of their footprint for energy storage and the mining required to manufacture them,” says Gençer.

But then came the hard part: nailing down the costs of hydrogen-fired electricity generation. “The most difficult thing is finding cost assumptions for new technologies,” says Hernandez. “You can’t do this through a literature review, so we had many conversations with equipment manufacturers and plant operators.”

The team considered two different forms of hydrogen fuel to replace natural gas, one produced through electrolyzer facilities that convert water and electricity into hydrogen, and another that reforms natural gas, yielding hydrogen and carbon waste that can be captured to reduce emissions. They also ran the numbers on retrofitting natural gas plants to burn hydrogen as opposed to building entirely new facilities. Their model includes identification of likely locations throughout the state and expenses involved in constructing these facilities.

The researchers spent months compiling a giant dataset before setting out on the task of analysis. The results from their modeling were clear: “Hydrogen can be a more cost-effective alternative to lithium-ion batteries for peaking operations on a power grid,” says Hernandez. In addition, notes Gençer, “While certain technologies worked better in particular locations, we found that on average, reforming hydrogen rather than electrolytic hydrogen turned out to be the cheapest option for replacing peaker plants.”


Credit: DOI: 10.1016/j.apenergy.2021.117314

A tool for energy investors

When he began this project, Gençer admits he “wasn’t hopeful” about hydrogen replacing natural gas in peaker plants. “It was kind of shocking to see in our different scenarios that there was a place for hydrogen.” That’s because the overall price tag for converting a fossil-fuel based plant to one based on hydrogen is very high, and such conversions likely won’t take place until more sectors of the economy embrace hydrogen, whether as a fuel for transportation or for varied manufacturing and industrial purposes.

A nascent hydrogen production infrastructure does exist, mainly in the production of ammonia for fertilizer. But enormous investments will be necessary to expand this framework to meet grid-scale needs, driven by purposeful incentives. “With any of the climate solutions proposed today, we will need a carbon tax or carbon pricing; otherwise nobody will switch to new technologies,” says Gençer.

The researchers believe studies like theirs could help key energy stakeholders make better-informed decisions. To that end, they have integrated their analysis into SESAME, a life cycle and techno-economic assessment tool for a range of energy systems that was developed by MIT researchers. Users can leverage this sophisticated modeling environment to compare costs of energy storage and emissions from different technologies, for instance, or to determine whether it is cost-efficient to replace a natural-gas-powered plant with one powered by hydrogen.

“As utilities, industry, and investors look to decarbonize and achieve zero-emissions targets, they have to weigh the costs of investing in low-carbon technologies today against the potential impacts of climate change moving forward,” says Hernandez, who is currently a senior associate in the energy practice at Charles River Associates. Hydrogen, he believes, will become increasingly cost-competitive as its production costs decline and markets expand.

A study group member of MITEI’s soon-to-be published Future of Storage study, Gençer knows that hydrogen alone will not usher in a zero-carbon future. But, he says, “Our research shows we need to seriously consider hydrogen in the energy transition, start thinking about key areas where hydrogen should be used, and start making the massive investments necessary.”



An Alternative to Kevlar – MIT and Caltech Create Nanotech Carbon Materials – Can withstand supersonic microparticle impacts


So, nanotechnology. “Great Things from Small Things”. Really amazing stuff … really.

So amazing, in fact, that researchers and engineers at Caltech, MIT, and ETH Zurich have discovered how to make materials lighter than Kevlar that can withstand supersonic microparticle impacts.

What does all this mean for material science? A whole lot if you ask me. I mean, this is literally going to change the way we produce shielding of any kind, especially for law enforcement agencies. Hang on a second, I’m getting a little ahead of myself here. 

A new study by engineers at the above-mentioned institutes discovered that “nano-architected” materials are showing insane promise in use as armor. What are “nano-architected” materials? Simply put, they’re materials and structures that are designed from “precisely patterned nanoscale structures,” meaning that the entire thing is a pre-meditated and arranged structure; what you see is exactly what was desired. 

Not only this, but the material is built from nanoscale carbon struts. Arranged much like rings in chainmail, these carbon struts are combined, layer upon layer, to create the structure you see in the main photo. So yeah, medieval knights had it right all along; they just needed more layers of something that already weighed upwards of 40 lbs for a full body suit.

So now that the researchers had a structure, what to do with it? Why not shoot things at it? Well, like any scientists, pardon me, “researchers,” who have been cooped up in a lab for too long, that’s just what they did, documenting and recording all the results in the process.

To do this, researchers shot laser-induced microparticles up to 1,100 meters per second at the nanostructure. A quick calculation and you’re looking at a particle that’s traveling at 3,608 feet per second. Want to know more? That’s 2,460 miles per hour! 
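(For the skeptics, those conversions check out, within rounding; here’s the two-line sanity check:)

```python
speed_m_s = 1100.0                                # impact speed reported above
print(f"{speed_m_s * 3.28084:,.0f} ft/s")         # -> 3,609 ft/s
print(f"{speed_m_s * 3600 / 1609.344:,.0f} mph")  # -> 2,461 mph
```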

Two test structures were arranged, one with slightly looser struts, and the second with a tighter formation. The tighter formation kept the particle from tearing through and even embedded into the structure. 

If that’s not enough, and this is a big one, once the particle was removed and the underlying structure examined, researchers found that the surrounding structure remained intact. Yes, this means it can be reused.

The overall result? Shooting this structure with microparticles at supersonic speeds proved it offers higher impact resistance and absorption than Kevlar, steel, aluminum, and a range of other impact-absorbing materials. The images in the gallery show that particles didn’t even make it thirty percent of the way through the structure; I counted about six to seven deformed layers.

To get an idea of where this sort of tech will be taking things, co-author of the paper, Julia R. Greer of Caltech, whose lab led the material’s fabrication, says that “The knowledge from this work… could provide design principles for ultra-lightweight impact resistant materials [for use in] efficient armor materials, protective coatings, and blast-resistant shields desirable in defense and space applications.” 

Imagine for a second what this means once these structures are created on a larger scale. It will change the face of armor, be it destined for human or machine use, coatings, and even everyday clothing.

I’m not saying that we can suddenly stop bullets while walking down the street, but it won’t be long until funding for large-scale production begins, and what I just said may become a reality. Maybe not for all people at first, but the military will definitely have its eye on this tech.

Submitted By Cristian Curmei

Nanoparticles can turn off genes in bone marrow cells


Using these new particles, researchers could develop treatments for heart disease and other conditions.

Using specialized nanoparticles, MIT engineers have developed a way to turn off specific genes in cells of the bone marrow, which play an important role in producing blood cells. These particles could be tailored to help treat heart disease or to boost the yield of stem cells in patients who need stem cell transplants, the researchers say.

This type of genetic therapy, known as RNA interference, is usually difficult to target to organs other than the liver, where nanoparticles would tend to accumulate. The MIT researchers were able to modify their particles in such a way that they would accumulate in the cells found in the bone marrow.

“If we can get these particles to hit other organs of interest, there could be a broader range of disease applications to explore, and one that we were really interested in this paper was the bone marrow. The bone marrow is a site for hematopoiesis of blood cells, and these give rise to a whole lineage of cells that contribute to various types of diseases,” says Michael Mitchell, a former MIT postdoc and one of the lead authors of the study.

In a study of mice, the researchers showed that they could use this approach to improve recovery after a heart attack by inhibiting the release of bone marrow blood cells that promote inflammation and contribute to heart disease.

Marvin Krohn-Grimberghe, a cardiologist at the Freiburg University Heart Center in Germany, and Maximilian Schloss, a research fellow at Massachusetts General Hospital, are also lead authors of the paper, which appears today in Nature Biomedical Engineering. The paper’s senior authors are Daniel Anderson, a professor of chemical engineering at MIT and a member of MIT’s Koch Institute for Integrative Cancer Research and Institute for Medical Engineering and Science, and Matthias Nahrendorf, a professor of radiology at MGH.

Targeting the bone marrow

RNA interference is a strategy that could potentially be used to treat a variety of diseases by delivering short strands of RNA that block specific genes from being turned on in a cell. So far, the biggest obstacle to this kind of therapy has been the difficulty in delivering it to the right part of the body. When injected into the bloodstream, nanoparticles carrying RNA tend to accumulate in the liver, which some biotech companies have taken advantage of to develop new experimental treatments for liver disease.

Anderson’s lab, working with MIT Institute Professor Robert Langer, who is also an author of the new study, has previously developed a type of polymer nanoparticles that can deliver RNA to organs other than the liver. The particles are coated with lipids that help stabilize them, and they can target organs such as the lungs, heart, and spleen, depending on the particles’ composition and molecular weight.

“RNA nanoparticles are currently FDA-approved as a liver-targeted therapy but hold promise for many diseases, ranging from Covid-19 vaccines to drugs that can permanently repair disease genes,” Anderson says. “We believe that engineering nanoparticles to deliver RNA to different types of cells and organs in the body is key to reaching the broadest potential of genetic therapy.”

In the new study, the researchers set out to adapt the particles so that they could reach the bone marrow. The bone marrow contains stem cells that produce many different types of blood cells, through a process called hematopoiesis. Stimulating this process could enhance the yield of hematopoietic stem cells for stem cell transplantation, while repressing it could have beneficial effects on patients with heart disease or other diseases.

“If we could develop technologies that could control cellular activity in bone marrow and the hematopoietic stem cell niche, it could be transformative for disease applications,” says Mitchell, who is now an assistant professor of bioengineering at the University of Pennsylvania.

The researchers began with the particles they had previously used to target the lungs and created variants that had different arrangements of a surface coating called polyethylene glycol (PEG). They tested 15 of these particles and found one that was able to avoid being caught in the liver or the lungs, and that could effectively accumulate in endothelial cells of the bone marrow. They also showed that RNA carried by this particle could reduce the expression of a target gene by up to 80 percent.

The researchers tested this approach with two genes that they believed could be beneficial to knock down. The first, SDF1, is a molecule that normally prevents hematopoietic stem cells from leaving the bone marrow. Turning off this gene could achieve the same effect as the drugs that doctors often use to induce hematopoietic stem cell release in patients who need to undergo radiation treatments for blood cancers. These stem cells are later transplanted to repopulate the patient’s blood cells.

“If you have a way to knock down SDF1, you can cause the release of these hematopoietic stem cells, which could be very important for a transplantation so you can harvest more from the patient,” Mitchell says.

The researchers showed that when they used their nanoparticles to knock down SDF1, they could boost the release of hematopoietic stem cells fivefold, which is comparable to the levels achieved by the drugs that are now used to enhance stem cell release. They also showed that these cells could successfully differentiate into new blood cells when transplanted into another mouse.

“We are very excited about the latest results,” says Langer, who is also the David H. Koch Institute Professor at MIT. “Previously we have developed high-throughput synthesis and screening approaches to target the liver and blood vessel cells, and now in this study, the bone marrow. We hope this will lead to new treatments for diseases of the bone marrow like multiple myeloma and other illnesses.”

Combatting heart disease

The second gene that the researchers targeted for knockdown is called MCP1, a molecule that plays a key role in heart disease. When MCP1 is released by bone marrow cells after a heart attack, it stimulates a flood of immune cells to leave the bone marrow and travel to the heart, where they promote inflammation and can lead to further heart damage.

In a study of mice, the researchers found that delivering RNA that targets MCP1 reduced the number of immune cells that went to the heart after a heart attack. Mice that received this treatment also showed improved healing of heart tissue following a heart attack.

“We now know that immune cells play such a key role in the progression of heart attack and heart failure,” Mitchell says. “If we could develop therapeutic strategies to stop immune cells that originate from bone marrow from getting into the heart, it could be a new means of treating heart attack. This is one of the first demonstrations of a nucleic-acid-based approach of doing this.”

At his lab at the University of Pennsylvania, Mitchell is now working on new nanotechnologies that target bone marrow and immune cells for treating other diseases, especially blood cancers such as multiple myeloma.

The research was funded in part by the National Institutes of Health, the European Union’s Horizon 2020 research and innovation program, the MGH Research Scholar Program, a Burroughs Wellcome Fund Career Award at the Scientific Interface, a Koch-Prostate Cancer Foundation Award in Nanotherapeutics, the Koch Institute Marble Center for Cancer Nanomedicine, and the Koch Institute Support (core) Grant from the National Cancer Institute.

“Great Things from Small Things” Top 50 Nanotech Blog – Our Top Posts This Week


Happy Holiday (Labor Day) Weekend Everyone! Here are our Top Posts from this past week … Just in case you missed them! We hope all of you are well and safe and continuing to ‘get back to normal’ as the COVID-19 Pandemic of 2020 continues to restrain all of us in one way or another.

Thankfully however, COVID-19 has NOT restricted the Forward Advance of Innovation and Technology Solutions from the small worlds of Nanotechnology – “Great Things from Small Things” – Read and enjoy a wonderful Holiday Weekend! – Team GNT

Carbon Nanotube Second Skin Protects First Responders and Warfighters against Chem, Bio Agents – Lawrence Livermore National Laboratory

The same materials (adsorbents or barrier layers) that provide protection in current garments also detrimentally inhibit breathability.

Recent events such as the COVID-19 pandemic and the use of chemical weapons in the Syria conflict have provided a stark reminder of the plethora of chemical and biological threats that soldiers, medical personnel and first responders face during routine and emergency operations. Researchers have developed a smart, breathable fabric designed to protect the wearer against biological and chemical warfare agents. Material of this type could be used in clinical and medical settings as well.


Read More … https://genesisnanotech.wordpress.com/2020/05/11/carbon-nanotube-second-skin-protects-first-responders-and-warfighters-against-chem-bio-agents-lawrence-livermore-national-laboratory/

MIT: Lighting the Way to Better Battery Technology

Supratim Das is determined to demystify lithium-ion batteries, by first understanding their flaws.  Photo: Lillie Paquette/School of Engineering

Doctoral candidate Supratim Das wants the world to know how to make longer-lasting batteries that charge mobile phones and electric cars.

Supratim Das’s quest for the perfect battery began in the dark. Growing up in Kolkata, India, Das saw that a ready supply of electric power was a luxury his family didn’t have. “I wanted to do something about it,” Das says. Now a fourth-year PhD candidate in MIT chemical engineering who’s months away from defending his thesis, he’s been investigating what causes the batteries that power the world’s mobile phones and electric cars to deteriorate over time.

Lithium-ion batteries, so-named for the movement of lithium ions that make them work, power most rechargeable devices today. The element lithium has properties that allow lithium-ion batteries to be both portable and powerful; the 2019 Nobel Prize in Chemistry was awarded to scientists who helped develop them in the late 1970s. But despite their widespread use, lithium-ion batteries, essentially a black box during operation, harbor mysteries that prevent scientists from unlocking their full potential. Das is determined to demystify them, by first understanding their flaws.

Read More … https://genesisnanotech.wordpress.com/2020/07/06/mit-lighting-the-way-to-better-battery-technology/

Nuclear Diamond Batteries could disrupt Energy/ Energy Storage as we know it … “Imagine a World where you wouldn’t need to charge your battery for …. Decades!”

Illustration of the NDB Battery in a Most Recognizable ‘18650’ Format

“They will blow any energy density comparison out of the water, lasting anywhere from a decade to 28,000 years without ever needing a charge.”

“They will offer higher power density than lithium-ion. They will be nigh-on indestructible and totally safe in an electric car crash.”

And in some applications, like electric cars, they stand to be considerably cheaper than current lithium-ion packs despite their huge advantages.

In the words of Dr. John Shawe-Taylor, UNESCO Chair and University College London Professor: “NDB has the potential to solve the major global issue of carbon emissions in one stroke without the expensive infrastructure projects, energy transportation costs, or negative environmental impacts associated with alternate solutions such as carbon capture at fossil fuel power stations, hydroelectric plants, turbines, or nuclear power stations.”

Read More … https://genesisnanotech.wordpress.com/2020/08/25/nano-diamond-self-charging-batteries-could-disrupt-energy-as-we-know-it-imagine-a-world-where-you-wouldnt-need-to-charge-your-battery-for-decades/

“Practical and Viable” Hydrogen Production from Solar – Long Sought Goal of Renewable Energy – Is Close … Oh So Close

Technology developed at the Technion: the oxygen and hydrogen are produced and stored in completely separate cells.

Technion Israel Institute of Technology

Israeli and Italian scientists have developed a renewable energy technology that converts solar energy to hydrogen fuel, and it’s reportedly at the threshold of “practical” viability.

The new solar tech would offer a sustainable way to turn water and sunlight into storable energy for fuel cells, whether that stored power feeds into the electrical grid or goes to fuel-cell-powered trucks, trains, cars, ships, planes or industrial processes.

Think of this research as a sort of artificial photosynthesis, said Lilac Amirav, associate professor of chemistry at the Technion – Israel Institute of Technology in Haifa. (If it could be scaled up, the technology could eventually be the basis of “solar factories” in which arrays of solar collectors split water into stores of hydrogen fuel – as well as, for reasons discussed below, one or more other industrial chemicals.)

Read More … https://genesisnanotech.wordpress.com/2020/09/02/practical-and-viable-hydrogen-production-from-solar-long-sought-goal-of-renewable-energy-is-close-oh-so-close/

Watch More … The EV ‘Revolution and Evolution’ … Will the Era of the ICE be over in 2025? 2030?

Tony Seba, Silicon Valley entrepreneur, author, thought leader, and lecturer at Stanford University, delivers the keynote: the reinvention of, and connection between, infrastructure and mobility will fundamentally disrupt the clean transport model. It will change the way governments and consumers think about mobility, how power is delivered and consumed, and the payment models for usage.