Have fossil fuels been substituted (replaced) by renewables? (To quote the great John Wayne, “Not Hardly!”)



“That brings us right back around to all the bad news about renewable energy. It all comes down to policy and political will.”

The Renewable Energy Internet News is all abuzz over the bad news about renewable energy and energy storage. Among the concerns bubbling to the surface in the last week or so are:

(1) Renewables might not be doing such a great job of replacing fossil fuel capacity after all,

(2) Energy storage is fomenting yet another carbon emissions problem instead of solving one, and

(3) Renewables are making electricity rates go up, not down.

 

Ouch! Nevertheless, the fact is that renewable energy and energy storage are both here to stay. So, what are we going to do about it?

 

First, Admit You Have A Problem

The issue of fossil fuel replacement was tackled by a newly published study titled “Have fossil fuels been substituted by renewables? An empirical assessment for 10 European countries.”

To Read the Abstract and/or Purchase the Report:

Have fossil fuels been substituted by renewables? An empirical assessment for 10 European countries

The study looks at this problem: energy managers among the 10 nations in the study are coping with the intermittent nature of wind and solar by installing more fossil fuel capacity.

That’s more natural gas capacity, to be specific. The case for natural gas integration with renewables is pretty straightforward if your only goal is to ensure reliability when there’s a lot of wind and solar on the grid. In contrast to coal, natural gas power plants can hang out on standby mode when not needed, and rev up quickly when needed.

On the plus side, the study notes that an increase in natural gas capacity doesn’t necessarily correlate to an increase in fossils burned. Remember, you can build all the capacity you want, but in an integrated grid that new gas power plant is competing with wind, solar, hydropower and other renewables.

The problem will become more apparent as the global economy transitions to full electrification. Unless other measures are taken, that means natural gas capacity will also continue increasing. Eventually, the long-term result could be an increase in fossils burned for electricity.

As if on cue, last week the US company DTE Energy won approval to build a new $1 billion natural gas power plant that was vehemently opposed by clean power stakeholders. Here’s the rundown from Midwest Energy News reporter Andy Balaskovitz:

The Michigan Public Service Commission said DTE’s near-term forecasts showed a “significant near-term need for power” as it retires coal generation. The company retired 510 MW of coal over the past two years, and plans another 1,970 MW by 2023.

Groups also say the decision puts Michigan on a risky trend of overbuilding natural gas plants over clean energy alternatives.

Nuts!

Of course, the sparkling green power generation sector of the future doesn’t necessarily need to rely on building more natural gas capacity. Another pathway identified by the authors is the use of bioenergy instead of natural gas to smooth out supply spikes in wind and solar, so there’s that.

The authors also note that shaving peak demand would help. In that regard, DTE’s long-term outlook underscores the importance of demand-side solutions:

The MPSC noted that demand-side alternatives and renewables “could potentially displace — not just defer” a second gas plant DTE has forecasted to meet its needs in 2029.


Then there are energy storage solutions, like pumped hydro for example. The energy storage solution opens up a whole ‘nother can of worms, though…

Second, Admit That Energy Storage Has A Problem

For the bad news about energy storage we turn to Vox reporter David Roberts’s latest article, “Batteries have a dirty secret.”

Roberts reports that “emissions are higher today than they would have been if no storage had ever been deployed in the US.”

Wait, what?

Roberts takes an especially close look at a study of emissions related to bulk energy storage in the US from researchers with the Rochester Institute of Technology and Carnegie Mellon University, titled “Bulk Energy Storage Increases United States Electricity System Emissions.”

The study digs into the conventional practice of storing cheap energy (typically at night) and releasing it during peak consumption hours. The economic benefit is obvious — buy low, sell high — but according to the study, the result is an increase in carbon emissions as well as other pollutants. Here’s the rundown:

Although economically valuable, storage is not fundamentally a “green” technology that leads to reductions in emissions…We find that net system CO2 emissions resulting from storage operation are nontrivial when compared to the emissions from electricity generation, ranging from 104 to 407 kg/MWh of delivered energy depending on location, storage operation mode, and assumptions regarding carbon intensity…
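The study's buy-low, sell-high mechanism can be sketched with a toy calculation. The numbers below are illustrative round figures I've assumed, not inputs from the RIT/CMU study itself:

```python
# Toy estimate of the net CO2 effect of storage arbitrage
# (illustrative round numbers, not figures from the RIT/CMU study).
def net_storage_emissions(charge_intensity, displaced_intensity, efficiency):
    """Net kg CO2 per MWh delivered from storage.

    charge_intensity:    kg CO2/MWh of the off-peak power used to charge
    displaced_intensity: kg CO2/MWh of the peak power the discharge displaces
    efficiency:          round-trip efficiency (fraction of charged energy delivered)
    """
    # Delivering 1 MWh requires charging 1/efficiency MWh off-peak,
    # while displacing 1 MWh of peak generation.
    return charge_intensity / efficiency - displaced_intensity

# Charge overnight from a coal-heavy mix, displace a gas-heavy peak mix,
# at 75% round-trip efficiency:
delta = net_storage_emissions(750, 700, 0.75)
print(delta)  # 300.0 kg CO2 added per MWh delivered
```

With these assumed intensities the result lands inside the study's 104–407 kg/MWh range: round-trip losses mean every delivered MWh was charged with more than a MWh of (often dirtier) off-peak power.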

Double Nuts!

Roberts emphasizes, though, that energy storage itself is not the problem. It’s just being used in ways that prioritize costs and profits, not carbon emissions.

The main takeaway from the article is that a stronger renewable energy policy will counterbalance storage related emissions and eventually cut into carbon emissions overall.

Third, Why Are Electricity Rates Going Up?

That thing about costs and profits leads us to the third bit of bad news, in the form of an op-ed by energy and environment writer Michael Shellenberger that appears in Forbes under the title, “If Solar And Wind Are So Cheap, Why Are They Making Electricity So Expensive?” [A question no one asked the previous U.S. administration about its “fossil fuels bad, renewables good” mandates.]

Good question!

Shellenberger begins with the proposition that all the good news about low wind and solar prices may give the general public the idea that their electricity rates will necessarily go down.

For obvious reasons (labor, overhead, insurance, profit margins, and transmission and other infrastructure costs beyond power generation itself), that’s not gonna happen, but for now let’s just go with that.

Shellenberger notes that the cost of wind and solar fell precipitously between 2009 and 2017 (yes, they did — here’s a solar example from 2015 and a wind example from last year), but some areas with the most wind and solar deployment didn’t see their rates go down. In fact, they went up:

…Electricity prices increased by 51 percent in Germany during its expansion of solar and wind energy from 2006 to 2016; 24 percent in California during its solar energy build-out from 2011 to 2017; over 100 percent in Denmark since 1995 when it began deploying renewables (mostly wind) in earnest…

After ruling out other possibilities, Shellenberger zeroes in on wind and solar. He cites German economist Lion Hirth, who crystal-balled the situation back in 2013:

…Both solar and wind produce too much energy when societies don’t need it, and not enough when they do. Solar and wind thus require that natural gas plants, hydro-electric dams, batteries or some other form of reliable power be ready at a moment’s notice…
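Hirth's point can be seen in a toy hourly dispatch sketch. All the numbers below are invented for illustration, not drawn from his analysis:

```python
# Toy illustration of Hirth's point: backup needs are set by the hours when
# wind and solar underdeliver, not by their average output. All numbers here
# are invented for illustration.
demand = [60, 55, 50, 65, 80, 90, 85, 70]   # GW across eight sample hours
wind   = [30, 40, 10,  5, 15, 20, 35, 25]   # GW of wind output
solar  = [ 0,  0,  5, 20, 30, 10,  0,  0]   # GW of solar output

# Residual load: what dispatchable plants (gas, hydro, batteries) must cover.
residual = [max(d - w - s, 0) for d, w, s in zip(demand, wind, solar)]
print(residual)       # [30, 15, 35, 40, 35, 60, 50, 45]
print(max(residual))  # 60 -- backup capacity is sized by the worst hour
```

Even though wind and solar cover a big share of total energy here, the grid still needs 60 GW of dispatchable capacity standing by for the worst hour, which is the cost Hirth says the cheap-panel headlines leave out.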

Shellenberger also cites economist James Bushnell of the University of California:

The story of how California’s electric system got to its current state is a long and gory one, [but] the dominant policy driver in the electricity sector has unquestionably been a focus on developing renewable sources of electricity generation.

The picture is a bit more complicated when you look at the other 49 states, though. EnergySage, an online solar marketplace backed by the US Department of Energy, takes a deep dive into the issue of rising electricity costs and finds more nuance.

That point is illustrated by the situation in Montana and the Dakotas, which EnergySage includes in the rates-going-up group. The three states also have fairly healthy wind energy profiles.

A recent rate proposal by the Montana-Dakota Utilities Company takes wind into consideration, but the Bismarck Tribune takes note of additional detail:

The need for the increase mainly has to do with infrastructure, including investments at the Heskett Station in Mandan, the purchase of the Thunder Spirit wind farm, transmission and substation build-out and a new billing system. Hanson said, from 2010 to 2017, MDU will have doubled its infrastructure investment — $412 million in new energy production, $126 million in transmission, $142 million on substations for distribution and $36 million for other administrative expenses.

As far back as 2005, the Tribune took note that North Dakota’s aging fleet of power generation facilities was in sore need of an expensive makeover, so there’s that.

For another angle on the topic of costs, take a look at the Energy Department’s deep dive into the economic value of wind power.

What’s So Bad About High Electricity Rates?

Anyways, according to one school of thought, high electricity rates are actually a good thing because they encourage conservation, and when ratepayers invest in conservation their monthly bill will go down, not up.

That’s nice for electricity consumers with the wherewithal to invest in weatherization and other conservation measures, and who can change their habits to use less energy, but for various reasons a lot of ratepayers don’t have those options.

One solution is ratepayer assistance, which predates the advent of wind and solar. Renewable energy or not, energy costs have always been a big issue in policy making (see the 1973 OPEC oil crisis). Various forms of assistance have long been available to ratepayers, including rate subsidies and weatherization help.

That brings us right back around to all the bad news about renewable energy. It all comes down to policy and political will.

The bottom line is that if renewable energy is going to help save the Earth, then development has to accelerate at a pace that will push fossil fuels out of the power generation and energy storage markets, conservation and demand side issues must be addressed, and social policy must continue to play a role in ratemaking and assistance programs.

There has been a subtle policy shift in the US, but the transition to solar, wind and storage is still bubbling away through local policy decisions, state-level initiatives, and recent changes in state policy.

Nanorobots successfully target and kill cancerous tumors


Science fiction no more

In an article out today in Nature Biotechnology, scientists were able to show tiny autonomous bots have the potential to function as intelligent delivery vehicles to cure cancer in mice.

These DNA nanorobots do so by seeking out and injecting cancerous tumors with drugs that can cut off their blood supply, shriveling them up and killing them.

“Using tumor-bearing mouse models, we demonstrate that intravenously injected DNA nanorobots deliver thrombin specifically to tumor-associated blood vessels and induce intravascular thrombosis, resulting in tumor necrosis and inhibition of tumor growth,” the paper explains.

DNA nanorobots are a somewhat new concept for drug delivery. They work by getting programmed DNA to fold into itself like origami and then deploying it like a tiny machine, ready for action.

DNA nanorobots, Nature Biotechnology 2018

The scientists behind this study tested the delivery bots by injecting them into mice with human breast cancer tumors. Within 48 hours, the bots had successfully grabbed onto vascular cells at the tumor sites, causing blood clots in the tumor’s vessels and cutting off their blood supply, leading to their death.

Remarkably, the bots did not cause clotting in other parts of the body, just the cancerous cells they’d been programmed to target, according to the paper.

The scientists were also able to demonstrate the bots did not cause clotting in the healthy tissues of Bama miniature pigs, calming fears over what might happen in larger animals.

The goal, say the scientists behind the paper, is to eventually prove these bots can do the same thing in humans. Of course, more work will need to be done before human trials begin.

Regardless, this is a huge breakthrough in cancer research. Current methods, such as using chemotherapy to destroy nearly every cell just to get at the cancer cells, are barbaric in comparison. Using targeted drugs is also not as exact as simply cutting off the blood supply and killing the cancer on the spot. Should this new technique gain approval for use in humans in the near future, it could have impressive effects on those afflicted with the disease.

Quantum ‘entanglement’ (aka Einstein’s ‘spooky action’) goes massive



Perhaps the strangest prediction of quantum theory is entanglement, a phenomenon whereby two distant objects become intertwined in a manner that defies both classical physics and a “common-sense” understanding of reality. In 1935, Albert Einstein expressed his concern over this concept, referring to it as “spooky action at a distance”.

Nowadays, entanglement is considered a cornerstone of quantum mechanics, and it is the key resource for a host of potentially transformative quantum technologies. Entanglement is, however, extremely fragile, and it has previously been observed only in microscopic systems such as light or atoms, and recently in superconducting electric circuits.

In work recently published in Nature, a team led by Prof. Mika Sillanpää at Aalto University in Finland has shown that entanglement of massive objects can be generated and detected.

The researchers managed to bring the motions of two individual vibrating drumheads – fabricated from metallic aluminium on a silicon chip – into an entangled quantum state. The objects in the experiment are truly massive and macroscopic compared to the atomic scale: the circular drumheads have a diameter similar to the width of a thin human hair.

The team also included scientists from the University of New South Wales Canberra in Australia, the University of Chicago, and the University of Jyväskylä in Finland. The approach taken in the experiment was based on a theoretical innovation developed by Dr. Matt Woolley at UNSW and Prof. Aashish Clerk, now at the University of Chicago.

‘The vibrating bodies are made to interact via a superconducting microwave circuit. The electromagnetic fields in the circuit are used to absorb all thermal disturbances and to leave behind only the quantum mechanical vibrations,’ says Mika Sillanpää, describing the experimental setup.

Eliminating all forms of noise is crucial for the experiments, which is why they have to be conducted at extremely low temperatures near absolute zero, at -273 °C. Remarkably, the experimental approach allows the unusual state of entanglement to persist for long periods of time, in this case up to half an hour.


‘These measurements are challenging but extremely fascinating. In the future, we will attempt to teleport the mechanical vibrations. In quantum teleportation, properties of physical bodies can be transmitted across arbitrary distances using the channel of “spooky action at a distance”,’ explains Dr. Caspar Ockeloen-Korppi, the lead author on the work, who also performed the measurements.

The results demonstrate that it is now possible to have control over large mechanical objects in which exotic quantum states can be generated and stabilized. Not only does this achievement open doors for new kinds of quantum technologies and sensors, it can also enable studies of fundamental physics in, for example, the poorly understood interplay of gravity and quantum mechanics.

The experimental research was carried out at the OtaNano national research infrastructure for micro- and nanotechnologies in Finland, and was funded also by the European Research Council, the European Union’s Horizon 2020 research and innovation programme, and by the Academy of Finland.

C. F. Ockeloen-Korppi, E. Damskägg, J.-M. Pirkkalainen, A. A. Clerk, F. Massel, M. J. Woolley, M. A. Sillanpää: ‘Stabilized entanglement of massive mechanical oscillators’. Nature 556, 7702 (2018). https://doi.org/10.1038/s41586-018-0038-x

For More Information:

Mika Sillanpää, Professor
Aalto University, Department of Applied Physics

 Matt Woolley, Senior Lecturer
UNSW Canberra, School of Engineering and Information Technology

MIT Technology Review: Sustainable Energy: The daunting math of climate change means we’ll need carbon capture … eventually


 


 Net Power’s pilot natural gas plant with carbon capture, near Houston, Texas.

An Interview with Julio Friedmann

At current rates of greenhouse-gas emissions, the world could lock in 1.5 ˚C of warming as soon as 2021, an analysis by the website Carbon Brief has found. We’re on track to blow the carbon budget for 2 ˚C by 2036.
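A back-of-envelope version of that budget math, using assumed round numbers rather than Carbon Brief's exact inputs:

```python
# Back-of-envelope carbon-budget arithmetic (assumed round numbers,
# not Carbon Brief's exact inputs).
remaining_budget_gt = 750    # rough remaining CO2 budget for 2 degrees C, in Gt
annual_emissions_gt = 40     # rough current global CO2 emissions, Gt per year

years_left = remaining_budget_gt / annual_emissions_gt
print(years_left)  # 18.75 -- from 2018, the budget runs out in the mid-2030s
```

At a constant emissions rate, those rough figures land in the mid-2030s, consistent with the 2036 date above; the 1.5 ˚C budget is far smaller, which is why it can be locked in so much sooner.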

Amid this daunting climate math, many researchers argue that capturing carbon dioxide from power plants, factories, and the air will have to play a big part in any realistic efforts to limit the dangers of global warming.

If it can be done economically, carbon capture and storage (CCS) offers the world additional flexibility and time to make the leap to cleaner systems. It means we can retrofit, rather than replace, vast parts of the global energy infrastructure. And once we reach disastrous levels of warming, so-called direct air capture offers one of the only ways to dig our way out of trouble, since carbon dioxide otherwise stays in the atmosphere for thousands of years.

Julio Friedmann has emerged as one of the most ardent advocates of these technologies. He oversaw research and development efforts on clean coal and carbon capture at the US Department of Energy’s Office of Fossil Energy under the last administration. Among other roles, he’s now working with or advising the Global CCS Institute, the Energy Futures Initiative, and Climeworks, a Switzerland-based company already building pilot plants that pull carbon dioxide from the air.

In an interview with MIT Technology Review, Friedmann argues that the technology is approaching a tipping point: a growing number of projects demonstrate that it works in the real world, and that it is becoming more reliable and affordable. He adds that the boosted US tax credit for capturing and storing carbon, passed in the form of the Future Act as part of the federal budget earlier this year, will push forward many more projects and help create new markets for products derived from carbon dioxide (see “The carbon-capture era may finally be starting”).

But serious challenges remain. Even with the tax credit, companies will incur steep costs by adding carbon capture systems to existing power plants. And a widely cited 2011 study, coauthored by MIT researcher Howard Herzog, found that direct air capture will require vast amounts of energy and cost 10 times as much as scrubbing carbon from power plants.

(This interview has been edited for length and clarity.)

In late February, you wrote a Medium post saying that with the passage of the increased tax credit for carbon capture and storage, we’ve “launched the climate counter-strike.” Why is that a big deal?

It actually sets a price on carbon formally. It says you should get paid to not emit carbon dioxide, and you should get paid somewhere between $35 a ton and $50 a ton. So that is already a massive change. In addition to that, it says you can do one of three things: you can store CO2, you can use it for enhanced oil recovery, or you can turn it into stuff. Fundamentally, it says not emitting has value.

As I’ve said many times before, the lack of progress in deploying CCS up until this point is not a question of cost. It’s really been a question of finance.

The Future Act creates that financing.

I identified an additional provision which said not only can you consider a power plant a source or an industrial site a source, you can consider the air a source.

Even if we zeroed out all our emissions today, we still have a legacy of harm of two trillion tons of CO2 in the air, and we need to do something about that.

And this law says, yeah, we should. It says we can take carbon dioxide out of the air and turn it into stuff.

At the Petra Nova plant in Texas, my understanding is the carbon capture costs are something like $60 to $70 a ton, which is still going to outstrip the tax credit today. How are we going to close that gap?

There are many different ways to go about it. For example, the state of New Jersey today passed a 90 percent clean energy portfolio standard. Changing the policy from a renewable portfolio standard [which would exclude CCS technologies] to a clean energy standard [which would allow them] allowed higher ambition.

In that context, somebody who would build a CCS project and would get a contract to deliver that power, or deliver that emissions abatement, can actually again get staked, get financed, and get built. That can happen without any technology advancement.

The technology today is already cost competitive. CCS today, as a retrofit, is cheaper than a whole bunch of stuff. It’s cheaper than new-build nuclear, it’s cheaper than offshore wind. It’s cheaper than a whole bunch of things we like, and it’s cheaper than rooftop solar, almost everywhere. It’s cheaper than utility-scale concentrating solar pretty much everywhere, and it is cheaper than what solar and wind were 10 years ago.

What do you make of the critique that this is all just going to perpetuate the fossil-fuel industry?

The enemy is not fossil fuels; the enemy is emissions.

In a place like California that has terrific renewable resources and a good infrastructure for renewable energy, maybe you can get to zero [fossil fuels] someday.

If you’re in Saskatchewan, you really can’t do that. It is too cold for too much of the year, and they don’t have solar resources, and their wind resources are problematic because they’re so strong they tear up the turbines. Which is why they did the CCS project in Saskatchewan. For them it was the right solution.

Shifting gears to direct air capture, the basic math says that you’re moving 2,500 molecules to capture one of CO2. How good are we getting at this, and how cheaply can we do this at this point?

If you want to optimize the way that you would reduce carbon dioxide economy-wide, direct air capture is the last thing you would tackle. Turns out, though, that we don’t live in that society. We are not optimizing anything in any way.

So instead we realize we have this legacy of emissions in the atmosphere and we need tools to manage that. So there are companies like Climeworks, Carbon Engineering, and Global Thermostat. Those guys said: we know we’re going to need this technology, so we’re going to get to work now. They’ve got decent financing, and the costs are coming down and improving (see “Can sucking CO2 out of the atmosphere really work?”).

The cost for all of these things now today, all-in costs, is somewhere between $300 and $600 a ton. I’ve looked inside all those companies and I believe all of them are on a glide path to get to below $200 a ton by somewhere between 2022 and 2025. And I believe that they’re going to get down to $100 a ton by 2030. At that point, these are real options.

At $200 a ton, we know today unambiguously that pulling CO2 out of the air is cheaper than trying to make a zero-carbon airplane, by a lot. So it becomes an option that you use to go after carbon in the hard-to-scrub parts of the economy.
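The interviewer's 2,500-molecules figure follows directly from CO2's share of the atmosphere:

```python
# Where the "2,500 molecules" figure comes from: CO2 is roughly 400 parts
# per million of air, so a direct-air-capture system sifts through about
# 2,500 air molecules for every one CO2 molecule it grabs.
co2_ppm = 400
molecules_per_co2 = 1_000_000 / co2_ppm
print(molecules_per_co2)  # 2500.0
```

That dilution is the core engineering challenge of direct air capture, and a big reason its costs start so far above point-source capture at a power plant flue, where CO2 concentrations are hundreds of times higher.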

Is it ever going to work as a business, or is it always going to be kind of a public-supported enterprise to buy ourselves out of climate catastrophes?

Direct air capture is not competitive today broadly, but there are places where the value proposition is real. So let me give you a couple of examples.

In many parts of the world there are no sources of CO2. If you’re running a Pepsi or a Coca-Cola plant in Sri Lanka, you literally burn diesel fuel and capture the CO2 from it to put into your cola, at a bonkers price. It can cost $300 to $800 a ton to get that CO2. So there are already going to be places in some people’s supply chain where direct air capture could be cheaper.

We talk to companies like Goodyear, Firestone, or Michelin. They make tires, and right now the way that they get their carbon black [a material used in tire production that’s derived from fossil fuel] is basically you pyrolize bunker fuel in the Gulf Coast, which is a horrible, environmentally destructive process. And then you ship it by rail cars to wherever they’re making the tires.

If they can decouple from that market by gathering CO2 wherever they are and turn that into carbon black, they can actually avoid market shocks. So even if it costs a little more, the value to that company might be high enough to bring it into the market. That’s where I see direct air actually gaining real traction in the next few years.

It’s not going to be enough for climate. We know that we will have to do carbon storage, for sure, if we want to really manage the atmospheric emissions. But there’s a lot of ground to chase this, and we never know quite where technology goes.

In one of your earlier Medium posts you said that we’re ultimately going to have to pull 10 billion tons of CO2 out of the atmosphere every year. Climeworks is doing about 50 [at their pilot plant in Iceland]. So what does that scale-up look like?

You don’t have to get all 10 billion tons with direct air capture. So let’s say you just want one billion.

Right now, Royal Dutch Shell as a company moves 300 million tons of refined product every year. This means that you need three to four companies the size of Royal Dutch Shell to pull CO2 out of the atmosphere.
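The Shell comparison is simple division, using the figures as quoted in the interview:

```python
# The Shell comparison, made explicit (figures as quoted in the interview).
target_gt_per_year = 1.0    # direct-air-capture goal: 1 Gt of CO2 per year
shell_gt_per_year = 0.3     # Royal Dutch Shell moves ~300 Mt of product per year

companies_needed = target_gt_per_year / shell_gt_per_year
print(round(companies_needed, 1))  # 3.3 -- "three to four" Shell-sized firms
```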

The good news is we don’t need that billion tons today. We have 10 or 20 or 30 years to get to a billion tons of direct air capture. But in fact we’ve seen that kind of scaling in other kinds of clean-tech markets. There’s nothing in the laws of physics or chemistry that stops that.

Clean Disruption of Energy and Transportation – Conference on World Affairs – Boulder, Colorado: Conference Video



 

Published on Apr 25, 2018

‘Rethinking the Future – Clean Disruption of Energy and Transportation’ is Tony Seba’s opening keynote at the 70th annual Conference on World Affairs in Boulder, Colorado, April 9th, 2018. The Clean Disruption will be the fastest, deepest, most consequential disruption of energy and transportation in history. Based on Seba’s #1 Amazon bestselling book “Clean Disruption” and Rethinking Transportation 2020-2030, this presentation lays out what the key technologies and business model innovations are (batteries, electric vehicles, autonomous vehicles, ride-hailing and solar PV), how this technology disruption will unfold over the next decade, as well as key implications for society, finance, industry, cities, geopolitics, and infrastructure. The 2020s will be the most technologically disruptive decade in history. By analyzing and anticipating these disruptions, we can see that the benefits to humanity will be immense, but to seize the upside we will need to mitigate the negative consequences. As the opening keynote speaker at the prestigious Conference on World Affairs, Seba follows in the footsteps of luminaries such as Eleanor Roosevelt and Buckminster Fuller.

Watch the Video 
 

Supporting the EV Revolution: New battery technologies are getting a “charge” from venture investors


Venture capital investors once again are getting charged up over new battery technologies.

The quest to build a better battery has occupied venture investors for nearly a decade, since the initial clean technology investment bubble of the mid-2000s.

Read More: Mobility Disruption by Tony Seba – Silicon Valley Entrepreneur and Lecturer at Stanford University – The Coming EV Revolution by 2030?


Now, some of those same investors are returning to invest in battery businesses, drawn by the promise of novel chemistries and new materials that aim to make more powerful, smaller and safer batteries.

One of the latest to raise new money is Gridtential, a battery technology developer pitching a new take on a classic battery chemistry: the centuries-old lead-acid battery. Gridtential’s innovation, for which it has filed several patents, is to use silicon plating instead of non-reactive lead plating in the battery.

The company’s novel approach has won it the backing of four big battery manufacturers, in an earlier $6 million round of funding in January, and now the company has raised another $5 million to continue to build out the business from new investor 1955 Capital.

Gridtential’s funding is the latest in a series of new investments into battery companies coming from venture firms this year.

Battery companies raised $480 million in the first half of the year according to data from cleantech investment and advisory services firm Mercom Capital.

Much of that capital was actually committed to one big battery company, Microvast. The Texas-based battery manufacturer raised $400 million in funding led by CITIC Securities and CDH Investment — two of China’s biggest and best investment firms.


The presence of big Chinese investors in a Stafford, Texas-based company shouldn’t come as a surprise. Batteries are big business (just ask Tesla).

As more vehicles become electrified, the demand for new energy storage solutions will just continue to climb. Add a movement to put more renewable energy on the electricity grid, and that more than doubles the demand for good, big, high-performance storage solutions.

Indeed, major tech companies are swarming all over the battery business. In addition to Tesla’s push into power, Alphabet is also looking at developing new grid-scale storage technologies, according to a recent report from Bloomberg.

Go Ultra Low Nissan LEAF (L) and Kia Soul EV (R) on charge on a London street. Ultra-low emission vehicles such as this can cost as little as 2p per mile to run and some electric cars and vans have a range of up to 700 miles.

Battery industry players aren’t sitting on their hands, and that’s why companies like East Penn Manufacturing, the largest single-site lead-acid battery plant; Crown Battery Manufacturing, a developer of deep-cycle applications; Leoch International, one of the biggest lead-acid battery exporters in China; and Power-Sonic Inc., a specialty battery distributor, all committed capital.

“What’s unique about the battery is two things. One is the use of silicon. It’s built as a stack of cells in series rather than a group of cells in parallel. The silicon plates are used as current collectors — they are really very thin pieces of wire that connect one cell to the next,” explains chief executive Chris Beekhuis. “It creates a density of current and uniform temperature across the plate, both of which prevent sulfation.”

As the energy storage world focuses its attention on building better batteries based on lithium-ion technology (the batteries that are in cell phones and electric vehicles), traditional battery manufacturers could potentially be nervous about seeing their market share erode.

 

With its new design for lead acid batteries, Gridtential is making a smaller, more energy dense, lead acid battery that is perfect for use in hybrid vehicles, storing energy from the power grid and creating backup power supplies.

The other benefit of silicon, in addition to being less toxic, is that a massive supply chain already exists for the stuff. Solar panel and chip manufacturers have created a huge amount of manufacturing supply for the raw materials (something that’s becoming a problem for the lithium-ion business), and the material is relatively cheap, Beekhuis said.

It’s also 40% lighter than a traditional lead-acid battery and is expected to be cost competitive with existing batteries at roughly $300 per kilowatt-hour of storage in automotive applications.
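As a rough sanity check, those two claims are easy to work through. Only the $300/kWh price point and the 40% weight saving come from the article; the pack capacity and baseline weight below are hypothetical figures for illustration:

```python
# Back-of-envelope check of the quoted figures. The $300/kWh price and the
# 40% weight reduction are from the article; the 1 kWh capacity and 20 kg
# baseline lead-acid weight are illustrative assumptions.

def pack_cost(capacity_kwh, usd_per_kwh=300):
    """Storage cost at the quoted ~$300/kWh automotive price point."""
    return capacity_kwh * usd_per_kwh

def silicon_joule_weight(lead_acid_kg, reduction=0.40):
    """Weight after the claimed 40% reduction vs. a traditional lead-acid unit."""
    return lead_acid_kg * (1 - reduction)

print(pack_cost(1.0))            # 300.0 (dollars)
print(silicon_joule_weight(20))  # 12.0 (kg)
```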

Unlike other battery companies that intend to manufacture and sell their own batteries, Gridtential intends to license its process (like a more traditional software business would). Indeed, the company has brought in a former Dolby executive to run its licensing operations.

That means Gridtential’s trademarked “silicon joule” technology could become the “Intel Inside” for lead-acid battery makers.

“You’re combining the best of lithium-ion and lead acid in a product that is attractive to the market,” says Andrew Chung, the founder of 1955 Capital.

Chung, a longtime investor in sustainability technologies, sees Gridtential as a response to the capital-intensive missteps that investors have made in the past when backing battery companies.

“Can you commercialize it capital efficiently?” Chung asked. That’s the big question companies face, and in the case of Gridtential, the reliance on silicon is critical. “You’re able to move away from that huge upfront cost to invent manufacturing,” Chung told me.

While Gridtential is tackling the lead-acid battery market, Romeo Power, which raised a $30 million seed round in late August, is looking at novel technologies for lithium-ion battery packs. Rather than focusing on battery chemistry itself, Romeo is wooing investors with its pitch for power management.

As Romeo co-founder Mike Patterson explains:

“The [battery] cells are a commodity, it’s true. But of the hundreds of cells [available to buy], you have to know which is the best for a particular application. Then you have to get as many cells as you can into the smallest space possible, to create volumetric density. Then,” he says, “to keep the cells from getting too hot, you need to put them in the right container and connect them using the right materials and methods.”
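Patterson’s “volumetric density” goal comes down to simple arithmetic: the same cells packed into a smaller enclosure score higher. The cell and enclosure figures below are illustrative assumptions for the sketch, not Romeo’s numbers:

```python
# Illustrative pack-level volumetric energy density calculation.
# All cell and pack figures are assumptions, not Romeo Power data.

def cell_energy_wh(capacity_ah, nominal_v):
    """Energy of one cell in watt-hours."""
    return capacity_ah * nominal_v

def pack_volumetric_density(n_cells, cell_energy_wh, pack_volume_l):
    """Pack energy density in Wh per litre."""
    return n_cells * cell_energy_wh / pack_volume_l

# A hypothetical 3.4 Ah / 3.6 V cylindrical cell:
e = cell_energy_wh(3.4, 3.6)  # 12.24 Wh per cell

# The same 100 cells, in a looser vs. a tighter enclosure:
loose = pack_volumetric_density(100, e, pack_volume_l=3.0)
tight = pack_volumetric_density(100, e, pack_volume_l=2.4)
print(round(loose), round(tight))  # 408 510
```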

Some projects are even farther afield. Bill Joy, for instance, has doubled down on his investment in an entirely new material science that could radically remake the battery industry.

In answer to one of Joy’s “grand challenge” breakthroughs, Ionic Materials has created a low-cost new material that completely reimagines what makes a battery. “We had decided in the case of batteries that the thing that would make the difference would be to have them not have liquids in them,” Joy said of the initial challenge.

The solution was found in a material invented in 2011 by a Tufts professor and former Bell Labs researcher named Mike Zimmerman. The new technology is called a solid polymer lithium metal battery.

“Mike invented a specialty polymer that he can tweak and conduct ions at room temperature,” Joy told me. “It’s a new conduction mechanism.”

Ionic’s energy storage tech uses a solid, almost plastic-like polymer to allow lithium ions to flow from anode to cathode. The company claims that its new electrolyte can work the same as a cathode, is conductive at room temperature, is more stable and less flammable, and can be produced in high volumes.

Wired called it the Jesus Battery.

Indeed, if the company’s material can allow for greater flexibility, more power, and better safety standards than a traditional lithium-ion battery, it would be a miracle.

It’ll take something of a miracle to advance battery technologies. There haven’t been significant innovations in energy storage for a few decades, with most of the real improvements coming in how batteries are packed together to create more storage capacity. The inherent technology has remained fairly constant.

While Romeo is tackling the packing problem, both Gridtential and Ionic are proposing materials-science solutions to some of the battery industry’s problems, and as the financings indicate, they’re not the only ones.

Energy storage is a potential trillion-dollar business, and with a potential market of that size, it’s no wonder that investors are (albeit cautiously) coming back into a market that had jolted them in the past.

 

 

MIT Technology Review: This battery advance could make electric vehicles far cheaper


Sila Nanotechnologies has pulled off double-digit performance gains for lithium-ion batteries, promising to lower costs or add capabilities for cars and phones.

For the last seven years, a startup based in Alameda, California, has quietly worked on a novel anode material that promises to significantly boost the performance of lithium-ion batteries.

Sila Nanotechnologies emerged from stealth mode last month, partnering with BMW to put the company’s silicon-based anode materials in at least some of the German automaker’s electric vehicles by 2023.

A BMW spokesman told the Wall Street Journal the company expects that the deal will lead to a 10 to 15 percent increase in the amount of energy you can pack into a battery cell of a given volume. Sila’s CEO Gene Berdichevsky says the materials could eventually produce as much as a 40 percent improvement (see “35 Innovators Under 35: Gene Berdichevsky”).

For EVs, an increase in so-called energy density either significantly extends the mileage range possible on a single charge or decreases the cost of the batteries needed to reach standard ranges. For consumer gadgets, it could alleviate the frustration of cell phones that can’t make it through the day, or it might enable power-hungry next-generation features like bigger cameras or ultrafast 5G networks.
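The range arithmetic behind that claim is straightforward: holding pack volume constant, range scales roughly linearly with usable energy. The 10 to 15 percent gain is BMW’s estimate quoted above; the 300 km baseline range is an assumed figure for illustration:

```python
# How an energy-density gain translates into EV range at constant pack
# volume. The 10-15% gain is from the article; the 300 km baseline is an
# assumption for illustration only.

def new_range(base_range_km, density_gain):
    """Range scales roughly linearly with usable pack energy."""
    return base_range_km * (1 + density_gain)

print(round(new_range(300, 0.10), 1))  # 330.0 km
print(round(new_range(300, 0.15), 1))  # 345.0 km
```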

Researchers have spent decades working to advance the capabilities of lithium-ion batteries, but those gains usually only come a few percentage points at a time. So how did Sila Nanotechnologies make such a big leap?

Berdichevsky, who was employee number seven at Tesla, and CTO Gleb Yushin, a professor of materials science at the Georgia Institute of Technology, recently provided a deeper explanation of the battery technology in an interview with MIT Technology Review.

Sila co-founders (from left to right): Gleb Yushin, Gene Berdichevsky and Alex Jacobs.

An anode is the battery’s negative electrode, which in this case stores lithium ions when a battery is charged. Engineers have long believed that silicon holds great potential as an anode material for a simple reason: it can bond with 25 times more lithium ions than graphite, the main material used in lithium-ion batteries today.

But this comes with a big catch. When silicon accommodates that many lithium ions, its volume expands, stressing the material in a way that tends to make it crumble during charging. That swelling also triggers electrochemical side reactions that reduce battery performance.

In 2010, Yushin coauthored a scientific paper that identified a method for producing rigid silicon-based nanoparticles that are internally porous enough to accommodate significant volume changes. He teamed up with Berdichevsky and another former Tesla battery engineer, Alex Jacobs, to form Sila the following year.

The company has been working to commercialize that basic concept ever since, developing, producing, and testing tens of thousands of different varieties of increasingly sophisticated anode nanoparticles. It figured out ways to alter the internal structure to prevent the battery electrolyte from seeping into the particles, and it achieved dozens of incremental gains in energy density that ultimately added up to an improvement of about 20 percent over the best existing technology.

Ultimately, Sila created a robust, micrometer-size spherical particle with a porous core, which directs much of the swelling within the internal structure. The outside of the particle doesn’t change shape or size during charging, ensuring otherwise normal performance and cycle life.

The resulting composite anode powders work as a drop-in material for existing manufacturers of lithium-ion cells.

With any new battery technology, it takes at least five years to work through the automotive industry’s quality and safety assurance processes—hence the 2023 timeline with BMW. But Sila is on a faster track with consumer electronics, where it expects to see products carrying its battery materials on shelves early next year.

Venkat Viswanathan, a mechanical engineer at Carnegie Mellon, says Sila is “making great progress.” But he cautions that gains in one battery metric often come at the expense of others—like safety, charging time, or cycle life—and that what works in the lab doesn’t always translate perfectly into end products.

Companies including Enovix and Enevate are also developing silicon-dominant anode materials. Meanwhile, other businesses are pursuing entirely different routes to higher-capacity storage, notably including solid-state batteries. These use materials such as glass, ceramics, or polymers to replace liquid electrolytes, which help carry lithium ions between the cathode and anode.

BMW has also partnered with Solid Power, a spinout from the University of Colorado Boulder, which claims that its solid-state technology relying on lithium-metal anodes can store two to three times more energy than traditional lithium-ion batteries. Meanwhile, Ionic Materials, which recently raised $65 million from Dyson and others, has developed a solid polymer electrolyte that it claims will enable safer, cheaper batteries that can operate at room temperature and will also work with lithium metal.

Some battery experts believe that solid-state technology ultimately promises bigger gains in energy density, if researchers can surmount some large remaining technical obstacles.

But Berdichevsky stresses that Sila’s materials are ready for products now and, unlike solid-state lithium-metal batteries, don’t require any expensive equipment upgrades on the part of battery manufacturers.

As the company develops additional ways to limit volume change in the silicon-based particles, Berdichevsky and Yushin believe they’ll be able to extend energy density further, while also improving charging times and total cycle life.

This story was updated to clarify that Samsung didn’t invest in Ionic Materials’ most recent funding round.

Read and Watch More:

Tenka Energy, Inc. Building Ultra-Thin Energy Dense SuperCaps and NexGen Nano-Enabled Pouch & Cylindrical Batteries – Energy Storage Made Small and POWERFUL! YouTube Video:

NanoSphere Health Sciences Awarded Breakthrough Patent for Disruptive Nanoparticle Delivery Platform


Landmark patent marks most significant advancement in over 25 years for non-invasive medical delivery systems

NanoSphere Health Sciences beakers

Photo Credit: NanoSphere Health Sciences

DENVER, CO – APRIL 2018 – NanoSphere Health Sciences INC (CSE: NSHS) (OTC: NSHSF) is pleased to announce that its flagship subsidiary, NanoSphere Health Sciences, LLC, has been granted Patent No. 9,925,149—which covers the core technology behind the production of the NanoSphere Delivery System™—by the United States Patent and Trademark Office.

The research-proven NanoSphere Delivery System™, protected by this patent, is one of the most important advancements for the non-invasive delivery of biological agents in over 25 years. The patent broadly encompasses the formation and manufacturing of the NanoSphere Delivery System™ for the delivery of cannabinoids, pharmaceuticals, nutraceuticals, cosmeceuticals and other biological agents.

NanoSphere’s groundbreaking NanoSphere Delivery System™ nanoencapsulates a broad range of bioactive compounds in a protective membrane, transporting them rapidly and effectively to the bloodstream and cells for greater efficacy. This delivery platform is a breakthrough in pharmaceutical, cannabinoid, nutraceutical and cosmeceutical supplement delivery. It makes the nanoencapsulated agents safer and more bioavailable, reducing adverse effects by delivering precise doses of smart nanoparticles to target sites.

“The granting of the patent for the NanoSphere Delivery System™ secures our position as a leader in advanced nanoparticle delivery,” said Robert Sutton, CEO of NanoSphere Health Sciences. “Major industries have the potential to be reshaped and reimagined by our next-generation technology.”

“NanoSphere’s patent claims and protects our core technology for the formation and manufacturing of lipid, structural nanoparticles, which is the NanoSphere Delivery System™,” said Dr. Richard Clark Kaufman, Chief Science Officer and inventor of the NanoSphere Delivery System™. “This patent extends to our 16 forms of lipid nanoparticle structures, which can be applied across healthcare sectors for vastly improved medical delivery.”

With the issuance of this patent, NanoSphere now has long-term market exclusivity over this delivery platform. The company intends to license the patented NanoSphere Delivery System™ and proprietary manufacturing process to selected companies in its target industries to maximize commercialization. The patent allows NanoSphere to bring the NanoSphere Delivery System™ to the world through multiple product lines and platforms, such as the transdermal, intranasal and intraoral applications of the company’s cannabis brand Evolve Formulas, and beyond.

SOURCE NanoSphere Health Sciences INC

 

About NanoSphere

NanoSphere Health Sciences LLC, a subsidiary of NanoSphere Health Sciences INC (CSE: NSHS) (OTC: NSHSF), is a biotechnology company and leader in nanoparticle delivery, advancing the NanoSphere Delivery System™. NanoSphere’s patented core technology is changing the way biological agents deliver benefits.

NanoSphere’s disruptive platforms use smart nanoparticles to deliver cannabinoids, nutraceuticals, pharmaceuticals and over-the-counter medications in a patented process with greater bioavailability and efficacy for the cannabis, nutraceutical, pharmaceutical, cosmeceutical and animal health industries.

The Canadian Securities Exchange does not accept responsibility for the adequacy or accuracy of this release.

Remote-control shoots laser at nano-gold to turn on cancer-killing immune cells – could one day send immune cells on a rampage against a malignant tumor


A heat-sensitive gene switch implanted in a sample of T-cells works in an in vitro test. Gentle pulses from a near-infrared laser, directed at gold nanoparticles in the sample with the T-cells, are transformed into gentle heat and flip the switch on, activating the T-cells. The resulting signal appears as orange dots on a monitor in the background. CREDIT: Georgia Tech / Allison Carter

Abstract:
A remote command could one day send immune cells on a rampage against a malignant tumor. The ability to mobilize, from outside the body, targeted cancer immunotherapy inside the body has taken a step closer to becoming reality.

Remote-control shoots laser at nano-gold to turn on cancer-killing immune cells

Bioengineers at the Georgia Institute of Technology have installed a heat-sensitive switch into T-cells that can activate the T-cells when heat turns the switch on. The method, tested in mice and published in a new study, is locally targeted and could someday help turn immunotherapy into a precision instrument in the fight against cancer.

Immunotherapy has made headlines with startling high-profile successes like saving former U.S. President Jimmy Carter from brain cancer. But the treatment, which activates the body’s own immune system against cancer and other diseases, has also, unfortunately, proved to be hit-or-miss.

“In patients where radiation and traditional chemotherapies have failed, this is where T-cell therapies have shined, but the therapy is still new,” said principal investigator Gabe Kwong. “This study is a step toward making it even more effective.”

Laser, gold, and T-cells

In the study, Kwong’s team successfully put their remote-control method through initial tests in mice with implanted tumors (so-called tumor phantoms, specially designed for certain experiments). The remote works via three basic components.

First, the researchers modified T-cells, a type of white blood cell, to include a genetic switch that, when switched on, increased the cells’ expression of specific proteins by more than 200 times. That ability could be used to guide T-cells’ cancer-fighting activities.

The T-cells, with the switch off, were introduced into the tumor phantom, which was placed into the mice. The tumor phantom also included gold nanorods, just dozens of atoms in size. The researchers shone pulses of a gentle laser in the near-infrared (NIR) range from outside the mouse’s body onto the spot where the tumor was located.

The nanorods receiving the light waves turned them into useful, localized mild heat, allowing the researchers to precisely warm the tumor. The elevated heat turned on the T-cells’ engineered switch.

Hyper-activated T-cells

This study honed the method and confirmed that its components worked in living animals. The study was not intended to treat cancer yet, although that next step is already under way.

“In upcoming experiments, we are implementing this approach to treat aggressive tumors and establish cancer-fighting effectiveness,” said Kwong, who is an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University.

The researchers published their results in the current edition of the journal ACS Synthetic Biology. The study’s first author was graduate research assistant Ian Miller. The research was funded by the National Institutes of Health, the National Science Foundation, the Burroughs Wellcome Fund, and the Shurl and Kay Curci Foundation.

Better immunotherapy

Bioengineers have been able to do a lot with T-cells already when they’re outside of the body.

“Right now, we’re adept at harvesting a patient’s own T-cells, modifying [them] to target cancer, growing them outside the body until there are hundreds of millions of them,” Kwong said. “But as soon as we inject them back into a patient, we lose control over the T-cells’ activity inside the body.”

Cancer is notoriously wily, and when T-cells crawl into a tumor, the tumor tends to switch off the T-cells’ cancer-killing abilities. Researchers have been working to switch them back on.

Kwong’s remote control has done this in the lab, while also boosting T-cell activity.

T-cell toxicities

Having an off-switch is also important. If T-cells were engineered to be always-on and hyper-activated, as they moved through the body, they could damage healthy tissue.

“There would be off-target toxicities, so you really want to pinpoint their activation,” Kwong said. “Our long-term goal for them is to activate site-specifically, so T-cells can overcome immunosuppression by the tumor and become better killers there.”

When the heat remote is turned off, so are Kwong’s engineered T-cells, because customary body temperatures are not high enough to activate their switch.

Heat-shock switch

The switch is a natural safety mechanism in human cells that evolved to protect against heat shock; it turns on when tissue temperatures rise above the body’s normal operating range, which centers on 37 degrees Celsius (98.6 F). The researchers re-fitted T-cells with the switch to turn on other functions, and it could be used to hyper-activate the cells.

The Georgia Tech bioengineers found that the switch worked in a range of 40 to 42 degrees Celsius (104 to 107.6 F), high enough not to be tripped by most high fevers and low enough not to damage healthy tissue or the engineered T-cells.

“When the local temperature is raised to 45 degrees (113 F), some cells in our body don’t like it,” Kwong said. “But if heating is precisely controlled in a 40 to 42 degrees window with short pulses of the NIR light, then it turns on the T-cells’ switch, and body cells are still very comfortable.”
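The Celsius and Fahrenheit figures quoted throughout this section can be checked against the standard conversion:

```python
# Verifying the temperature figures in the article with the standard
# conversion F = C * 9/5 + 32.

def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

for c in (37, 40, 42, 45):
    print(c, "C ->", c_to_f(c), "F")
# 37 C -> 98.6 F, 40 C -> 104.0 F, 42 C -> 107.6 F, 45 C -> 113.0 F
```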

Immuno-goals and dreams

The researchers want to combine the switch with some additional cancer-fighting weapons they envision engineering into T-cells.

For example, secreted molecules called cytokines can boost immune cells’ ability to kill cancer, but cytokines, unfortunately, can also be toxic. “Our long-term goal is to engineer T-cells to make and release powerful immune system stimulants like cytokines on command locally and sparingly,” Kwong said.

In other studies, gently heated gold nanorods have been shown to kill tumors or hinder metastasis. But T-cell treatments could be even more thorough and, in addition, hopefully, one day give patients treated with them a long-lasting memory immune response to any recurrence of their cancer.

###

Citation: This experimental method is in laboratory stages in mice and is not yet available as a treatment of any type for human patients. The study was co-authored by Marielena Castro, Joe Maenza and Jason Weis of Coulter BME at Georgia Tech. The research was funded by the National Institutes of Health Director’s New Innovator Award (grant #DP2HD091793), the NIH National Center for Advancing Translational Sciences (grant #UL1TR000454), the NIH GT BioMAT Training Grant (#5T32EB006343), the National Science Foundation (grant # DGE-1451512), the Shurl and Kay Curci Foundation, and the Burroughs Wellcome Fund. Any findings or opinions are those of the authors and not necessarily of the funding agencies.

Nanotechnology Offers Next-Generation Marijuana Delivery


In only a brief period of time, the cannabis industry has achieved remarkable progress, bringing to market safer, more intelligent product offerings, backed by research and designed to not only enhance but transform the consumer experience. Science and biotechnology companies are increasingly breaking down barriers, fighting the stigma, and making important strides toward gaining widespread acceptance of marijuana as a legitimate medical tool and, more specifically, as a trusted alternative to harmful and addictive prescription opiates.

These significant advancements are a direct reflection of the industry’s unmistakable drive to continuously improve itself. As our knowledge of the plant’s capabilities expands, we have in turn discovered a great deal about healthier approaches and delivery systems. For instance, we have learned that microdosing is a safer, more reliable and desirable approach to enjoying the therapeutic qualities of the plant—allowing consumers to experience maximum benefits from a minimal amount of product. We have also learned a great deal about intraoral, intranasal and transdermal administration methods, which allow for more efficient delivery of cannabinoids and a higher bioavailability.

Today’s cannabis consumers—representing a broader demographic range than ever before—are demanding these alternatives. Yogis, professional athletes, parents and grandparents alike are seeking cannabis as a natural treatment for everything from anxiety and depression to inflammation and chronic pain. They want control over dosing precision, without worrying about taking too much or waiting too long—and they want discreet products for on-the-go use, anytime and anywhere.

This new age of marijuana delivery is marked by smoke-free administration mechanisms that utilize groundbreaking technology to provide necessary alternatives to inconsistent edibles and potentially lung-damaging inhalation delivery methods. As the cannabis wellness revolution gains momentum, NanoSphere Health Sciences is at the forefront of modern research and development activity, helping the industry evolve by creating products that are clean and trustworthy; products that really work—without negative side effects like paranoia or lethargy. Our secret? Nanotechnology.

The next generation of the industry is here. Cannabis 2.0 has arrived, and it’s being ushered in by products like Evolve Formulas’ Transdermal NanoSerum™—a novel, first-of-its-kind product paving the way for smarter cannabinoid delivery by using the unique, patented NanoSphere Delivery System™.

Utilizing the world’s first (and only) scientifically-supported nanoparticle delivery system for cannabis, Evolve’s NanoSerum™ is the sole cannabis transdermal on the market that can break the blood-brain barrier, penetrating five layers of skin to deliver beneficial cannabinoids into the bloodstream and systemic circulation within three minutes.

With record-breaking efficacy and bioavailability, the Evolve NanoSerum™ took home a well-deserved honor as the 2017 DOPE Cup Winner for Best Transdermal. It is, after all, the only transdermal to transport THC swiftly into the bloodstream and to the CB1 and CB2 receptors of the endocannabinoid system. Forget messy topicals and pesky patches—Evolve’s carefully crafted formulation is absorbed instantly, bringing targeted relief to specific problem sites, offering faster uptake without the hassle.

Despite undeniable growth and success, this product represents only the beginning of the industry’s cutting-edge breakthroughs still to come. We have merely scratched the surface of the many possibilities that lie ahead. If we as an industry are to further our progress, we must challenge ourselves to continue pushing the boundaries of innovation, harnessing science and technology to give rise to safe, effective and smoke-free options in medical and recreational cannabis consumption, helping make the benefits of the cannabis plant accessible on a broader scale.