Why Do Most Science Startups Fail?



“We need to get a lot better at bridging that gap between discovery and commercialization”

G. Satell – Inc. Magazine

It seems like every day we see or hear about a breakthrough new discovery that will change everything. Some, like perovskites in solar cells and CRISPR, are improvements on existing technologies. Others, like quantum computing and graphene, promise to open up new horizons encompassing many applications. Still others promise breakthroughs in battery technology, covered in stories like “Exciting Battery Technology Breakthrough News — Is Any of It Real?” and “Beyond Lithium — The Search for a Better Battery.”

Nevertheless, we are still waiting for a true market impact. Quantum computing and graphene have been around for decades and still haven’t hit on their “killer app.” Perovskite solar cells and CRISPR are newer, but haven’t really impacted their industries yet. And those are just the most prominent examples.

The problem isn’t necessarily with the discoveries themselves, many of which are truly path-breaking, but that there’s a fundamental difference between discovering an important new phenomenon in the lab and creating value in the marketplace.

“We need to get a lot better at bridging that gap. To do so, we need to create a new innovation ecosystem for commercializing science.”

The Valley Of Death And The Human Problem

The gap between discovery and commercialization is so notorious and fraught with danger that it’s been unaffectionately called the “Valley of Death.” Part of the problem is that you can’t really commercialize a discovery; you can only commercialize a product, and those are two very different things.

The truth is that innovation is never a single event, but a process of discovery, engineering and transformation. After something like graphene is discovered in the lab, it needs to be engineered into a useful product and then it has to gain adoption by winning customers in the marketplace. Those three things almost never happen in the same place.

So to bring an important discovery to market, you first need to identify a real-world problem it can solve and connect with engineers who can transform it into a viable product or service. Then you need to find customers who are willing to drop whatever else they’ve been doing and adopt it on a large scale. That takes time, usually about 30 years.

The reason it takes so long is that there is a long list of problems to solve. To create a successful business based on a scientific discovery, you need to get scientists to collaborate effectively with engineers and a host of specialists in other areas, such as manufacturing, distribution and marketing. Those aren’t just technology problems, those are human problems. Being able to collaborate effectively is often the most important competitive advantage.

Wrong Industry, Wrong Application

One of the most effective programs for helping to bring discoveries out of the lab is I-Corps. First established by the National Science Foundation (NSF) to help recipients of SBIR grants identify business models for scientific discoveries, it has been such an extraordinary success that the US Congress has mandated its expansion across the federal government.

Based on Steve Blank’s lean startup methodology, the program aims to transform scientists into entrepreneurs. It begins with a presentation session, in which each team explains the nature of their discovery and its commercial potential. It’s exciting stuff, pathbreaking science with real potential to truly change the world.

The thing is, they invariably get it wrong. Despite their years of work to discover something significant, and their further efforts to apply for and receive commercialization grants from the federal government, they fail to come up with a viable application in an industry that wants what they have to offer.

Ironically, much of the success of the I-Corps program is due to these early sessions. Once the teams realize that they are on the wrong track, they embark on a crash course of customer discovery, interviewing dozens — and sometimes hundreds — of customers in search of a business model that actually has a chance of succeeding.

What’s startling about the program is that, without it, scientists with important discoveries often wasted years trying to make a business work that never really had a chance in the first place.

The Silicon Valley Myth

Much of the success of Silicon Valley has been based on venture-funded entrepreneurship. Startups with an idea to change the world create an early stage version of the product they want to launch, show it to investors and get funding to bring it to market. Just about every significant tech company was started this way.

Yet most of the success of Silicon Valley has been based on companies that sell either software or consumer gadgets, which are relatively cheap and easy to rapidly prototype. Many scientific startups, however, do not fit into this category. Often, they need millions of dollars to build a prototype and then have to sell to industrial companies with long lead times.

The myth of Silicon Valley is that venture-funded entrepreneurship is a generalizable model that can be applied to every type of business. It is not. In fact, it is a specific model that was conceived in a specific place at a specific time to fund mature technologies for specific markets. It’s not a solution that fits every problem.

The truth is that venture funds are very adept at assessing market risk, but not so good at taking on technology risk, especially in the hard sciences. That simply isn’t what they were set up to do.

We Need A New Innovation Ecosystem For Science Entrepreneurship

In 1945, Vannevar Bush delivered a report, Science, The Endless Frontier, to President Truman, in which he made the persuasive argument that expanding the nation’s scientific capacity would expand its economic capacity and well-being. His call led, ultimately, to the building of America’s scientific infrastructure, including programs like the NSF and the National Institutes of Health (NIH).

It was Bush’s vision that made America a technological superpower. Grants from federal agencies enabled scientists to discover new knowledge. Established businesses and, later, venture-backed entrepreneurs would then take those discoveries and bring new products and services to market.

Look at any industry today and its most important technologies were largely shaped by investment from the federal government. Today, however, the challenges are evolving. We’re entering a new era of innovation in which technologies like genomics, nanotechnology and robotics are going to reshape traditional industries like energy, healthcare and manufacturing.

That’s exciting, but it also poses new challenges, because these technologies are ill-suited to the Silicon Valley model of venture-funded entrepreneurship and need help to get past the Valley of Death. So we need to build a new innovation ecosystem on top of the scientific architecture Bush created for the post-war world.

There have been encouraging signs. New programs like I-Corps, the Manufacturing Institutes, Cyclotron Road, and Chain Reaction are beginning to help fill the gap.

Still, much more needs to be done, especially at the state and local level, to help build regional hubs for specific industries if we are going to be nearly as successful in the 21st century as we were in the 20th.


A real “boost” in the design and development of graphene-based light detection technology – the photoexcited graphene puzzle solved


Schematic representation of the ultrafast optical pump–terahertz probe experiment, where the optical pump induces electron heating and the terahertz pulse is sensitive to the conductivity of graphene directly after this heating process.

Light detection and control lies at the heart of many modern device applications, such as the cameras in phones. Using graphene as a light-sensitive material for light detectors offers significant improvements over the materials in use today. For example, graphene can detect light of almost any colour, and it gives an extremely fast electronic response, within a millionth of a millionth of a second. Thus, in order to properly design graphene-based light detectors, it is crucial to understand the processes that take place inside the graphene after it absorbs light.

A team of European scientists has now succeeded in understanding these processes. Published recently in Science Advances, their work gives a thorough explanation of why, in some cases, conductivity increases after light absorption, and in other cases, it decreases. The researchers show that this behaviour correlates with the way in which energy from absorbed light flows to the graphene electrons: after light is absorbed by the graphene, the processes through which graphene electrons heat up happen extremely fast and with very high efficiency.

For highly doped graphene (where many free electrons are present), ultrafast electron heating leads to carriers with elevated energy — hot carriers — which, in turn, leads to a decrease in conductivity. Interestingly enough, for weakly doped graphene (where not so many free electrons are present), electron heating leads to the creation of additional free carriers, and therefore an increase in conductivity. These additional carriers are a direct result of the gapless nature of graphene — in gapped materials, electron heating does not lead to additional free carriers.

This simple scenario of light-induced electron heating in graphene can explain many observed effects. Aside from describing the conductive properties of the material after light absorption, it can explain carrier multiplication, where—under specific conditions—one absorbed light particle (photon) can indirectly generate more than one additional free electron, and thus create an efficient photoresponse within a device.

The results of the paper, in particular the accurate understanding of electron-heating processes, should give a real boost to the design and development of graphene-based light detection technology.


More information: “The ultrafast dynamics and conductivity of photoexcited graphene at different Fermi energies,” Science Advances (2018). advances.sciencemag.org/content/4/5/eaar5313

Source: https://phys.org/news/2018-05-photoexcited-graphene-puzzle.html

Graphene smart contact lenses could give you thermal infrared and UV vision


A breakthrough in graphene imaging technology means you might soon have a smart contact lens, or other ultra-thin device, with a built-in camera that also gives you infrared “heat vision.” By sandwiching two layers of graphene together, engineers at the University of Michigan have created an ultra-broadband graphene imaging sensor that can capture everything from visible light all the way up to mid-infrared — but more importantly, unlike other devices that can see far into the infrared spectrum, it operates well at room temperature.

As you probably know by now, graphene has some rather miraculous properties — including, as luck would have it, a very strong effect when it’s struck by photons (light energy). Basically, when graphene is struck by a photon, an electron absorbs that energy and becomes a hot carrier — an effect that can be measured, processed, and turned into an image. The problem, however, is that graphene is incredibly thin (just one atom thick) and transparent — and so it only absorbs around 2.3% of the light that hits it. With so little light striking it, there just aren’t enough hot carrier electrons to be reliably detected. (Yes, this is one of those rare cases where being transparent and super-thin is actually a bad thing.)

Zhaohui Zhong and colleagues at the University of Michigan, however, have devised a solution to this problem. They still use a single layer of graphene as the primary photodetector — but then they put an insulating dielectric beneath it, and another layer of graphene beneath that. When light strikes the top layer, the hot carrier tunnels through the dielectric to the other side, creating a charge build-up and a strong change in conductance. In effect, they have created a phototransistor that amplifies the small number of photons absorbed by the top layer (gate) into a large change in the bottom layer’s conductance (channel).

In numerical terms, raw graphene generally produces a few milliamps of photocurrent per watt of incident light (mA/W). The Michigan phototransistor, however, achieves around 1 A/W, roughly 100 times more sensitive. This is around the same sensitivity as the CMOS silicon imaging sensors in commercial digital cameras.
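Responsivity relates photocurrent to incident optical power via I = R × P. A minimal sketch of the comparison above, assuming 5 mA/W for the “few milliamps per watt” raw-graphene figure and an illustrative 1 µW of incident light (both assumed values, not from the paper):

```python
# Rough comparison of photocurrent from a raw graphene sheet vs. the
# Michigan double-layer phototransistor, using responsivities quoted
# in the article. Incident power is an illustrative assumption.

RAW_GRAPHENE_RESPONSIVITY = 5e-3    # A/W, assumed "few mA/W" baseline
PHOTOTRANSISTOR_RESPONSIVITY = 1.0  # A/W, reported for the device

incident_power_w = 1e-6  # 1 microwatt of light, an assumed test value

def photocurrent(responsivity_a_per_w: float, power_w: float) -> float:
    """Photocurrent I = R * P, the defining relation for responsivity."""
    return responsivity_a_per_w * power_w

i_raw = photocurrent(RAW_GRAPHENE_RESPONSIVITY, incident_power_w)
i_dev = photocurrent(PHOTOTRANSISTOR_RESPONSIVITY, incident_power_w)

print(f"raw graphene:    {i_raw:.2e} A")  # 5.00e-09 A
print(f"phototransistor: {i_dev:.2e} A")  # 1.00e-06 A
print(f"gain factor:     {i_dev / i_raw:.0f}x")
```

With these assumed numbers the gain comes out near 200×; the article’s “around 100 times” figure depends on the exact baseline responsivity taken for raw graphene.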

The prototype device created by Zhong and co. is already “smaller than a pinky nail” and can be easily scaled down. By far the most exciting aspect here is the ultra-broadband sensitivity — while the silicon sensor in your smartphone can only register visible light, graphene is sensitive to a much wider range of wavelengths, from ultraviolet at the bottom, all the way to far-infrared at the top.

In this case, the Michigan phototransistor is sensitive to visible light and up to mid-infrared — but it’s entirely possible that a future device would cover UV and far-IR as well.

There are imaging technologies that can see in the UV and IR ranges, but they generally require bulky cryogenic cooling equipment; the graphene phototransistor, on the other hand, is so sensitive that it works at room temperature. [Research paper: doi:10.1038/nnano.2014.31 – “Graphene photodetectors with ultra-broadband and high responsivity at room temperature”]

Now, I think we can all agree that a smartphone that can capture UV and IR would be pretty damn awesome — but because this is ultra-thin-and-light-and-efficient graphene we’re talking about, the potential, futuristic applications are far more exciting. For me, the most exciting possibility is building graphene imaging technology into smart contact lenses. At first, you might just use this data to take awesome photos of the environment, or to give you night/thermal vision through a display built into the contact lens. In the future, though, as bionic eyes and retinal implants improve, we might use this graphene imaging tech to wire UV and IR vision directly into our brains.

Imagine if you could look up at the sky, and instead of seeing the normal handful of stars, you saw this:

The Milky Way, as seen by NASA’s infrared Spitzer telescope

That’d be pretty sweet.

Forbes on Energy: We Don’t Need Solar And Wind To Save The Climate — And It’s A Good Thing, Too


France and Sweden show solar and wind are not needed to save the climate. (Special Contributor, M. Shellenberger)

For 30 years, experts have claimed that humankind needs to switch to solar and wind energy to address climate change. But do we really?

Consider the fact that, while no nation has created a near-zero carbon electricity supply out of solar and wind, the only successful efforts to create near-zero carbon electricity supplies didn’t require solar or wind whatsoever.

As such, solar and wind aren’t just insufficient; they are also unnecessary for solving climate change.

That turns out to be a good thing.

Sunlight and wind are inherently unreliable and energy-dilute. As such, adding solar panels and wind turbines to the grid in large quantities increases the cost of generating electricity, locks in fossil fuels, and increases the environmental footprint of energy production.

There is a better way. But to understand what it is, we first must understand the modern history of renewable energies.

Renewables Revolution: Always Just Around the Corner

Most people think of solar and wind as new energy sources. In fact, they are two of our oldest.

The predecessor to Stanford University professor Mark Jacobson, who advocates “100 percent renewables,” was a man named John Etzler.

In 1833, Etzler proposed to build massive solar power plants that used mirrors to concentrate sunlight on boilers, mile-long wind farms, and new dams to store power.

Even electricity-generating solar panels and wind turbines are old. Both date back to the late 1800s.

Throughout the 20th Century, scientists claimed — and the media credulously reported — that solar, wind, and batteries were close to a breakthrough that would allow them to power all of civilization.

Consider these headlines from The New York Times and other major newspapers:

• 1891: “Solar Energy: What the Sun’s Rays Can Do and May Yet Be Able to Do” — The author notes that while solar energy was not yet economical, “…the day is not unlikely to arrive before long…”

• 1923: “World Awaits Big Invention to Meet Needs of Masses” — “…solar energy may be developed… or tidal energy… or solar energy through the production of fuel.”

• 1931: “Use of Solar Energy Near a Solution” — “Improved Device Held to Rival Hydroelectric Production”

• 1934: “After Coal, The Sun” — “…surfaces of copper oxide already available”

• 1935: “New Solar Engine Gives Cheap Power”

• 1939: “M.I.T. Will ‘Store’ Heat of the Sun”

• 1948: “Changing Solar Energy into Fuel ‘Blocked Out’ in GM Laboratory” — “…the most difficult part of the problem is over…”

• 1949: “U.S. Seeks to Harness Sun, May Ask Big Fund, Krug Says”

Reporters were as enthusiastic about renewables in the 1930s as they are today.

“It is just possible the world is standing at a turning point,” a New York Times reporter gushed in 1931, “in the evolution of civilization similar to that which followed the invention by James Watt of the steam engine.”

Decade after decade, scientists and journalists re-discovered how much solar energy fell upon the earth.

“Even on such an area as small as Manhattan Island the noontime heat is enough, could it be utilized, to drive all the steam engines in the world,” The Washington Star reported in 1891.

Progress in chemistry and materials sciences was hyped. “Silver Selenide is Key Substance,” The New York Times assured readers.

In 1948, Interior Secretary Krug called for a clean energy moonshot consisting of “hundreds of millions” for solar energy, pointing to its “tremendous potential.”

R&D subsidies for solar began shortly after and solar and wind production subsidies began in earnest in the 1970s.

Solar and wind subsidies grew substantially over the decades that followed, and were increased again in 2005 and 2009 on the basis of a breakthrough being just around the corner.

By 2016, renewables were receiving 94 times more in U.S. subsidies than nuclear and 46 times more than fossil fuels per unit of energy generated.

According to Bloomberg New Energy Finance (BNEF), public and private actors spent $1.1 trillion on solar and over $900 billion on wind between 2007 and 2016.

Global investment in solar and wind hovered at around $300 billion per year between 2010 and 2016.

Did the solar and wind energy revolution arrive?

Judge for yourself: in 2016, solar and wind constituted 1.3 and 3.9 percent of the planet’s electricity, respectively.

Real World Renewables

Are there places in the world where wind and solar have become a significant share of electricity supplies?

The best real-world evidence for wind’s role in decarbonization comes from the nation of Denmark. By 2017, wind and solar had grown to become 48 and 3 percent of Denmark’s electricity.

Does that make Denmark a model?

Not exactly. Denmark has fewer people than Wisconsin, a land area smaller than West Virginia, and an economy smaller than the state of Washington.

Moreover, the reason Denmark was able to deploy so much wind was because it could easily export excess wind electricity to neighboring countries — albeit at a high cost: Denmark today has the most expensive electricity in Europe.

And as one of the world’s largest manufacturers of turbines, Denmark could justify expensive electricity as part of its export strategy.

As for solar, those U.S. states that have deployed the most of it have seen sharp rises in their electricity costs and prices compared to the national average.

As recently as two years ago, some renewable energy advocates held up Germany as a model for the world.

No more. While Germany has deployed some of the most solar and wind in the world, its emissions have been flat for a decade while its electricity has become the second most expensive in Europe.

More recently, Germany has permitted the demolition of old forests, churches, and villages in order to mine and burn coal.

Meanwhile, the two nations whose electricity sectors produce some of the least amount of carbon emissions per capita of any developed nation did so with very little solar and wind: France and Sweden.

Sweden last year generated a whopping 95 percent of its total electricity from zero-carbon sources, with 42 and 41 percent coming from nuclear and hydroelectric power, respectively.

France generated 88 percent of its total electricity from zero-carbon sources, with 72 and 10 percent coming from nuclear and hydroelectric power, respectively.
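The two sets of figures can be cross-checked with a few lines of arithmetic; the leftover share is everything zero-carbon other than nuclear and hydro (wind, solar, biomass, and so on):

```python
# Cross-checking the zero-carbon electricity shares quoted for Sweden
# and France. All figures are percentages of total generation as given
# in the article.

countries = {
    "Sweden": {"zero_carbon_total": 95, "nuclear": 42, "hydro": 41},
    "France": {"zero_carbon_total": 88, "nuclear": 72, "hydro": 10},
}

for name, p in countries.items():
    nuclear_plus_hydro = p["nuclear"] + p["hydro"]
    # Remainder: zero-carbon sources other than nuclear and hydro
    other = p["zero_carbon_total"] - nuclear_plus_hydro
    print(f"{name}: nuclear+hydro = {nuclear_plus_hydro}%, "
          f"other zero-carbon = {other}%")
# Sweden: nuclear+hydro = 83%, other zero-carbon = 12%
# France: nuclear+hydro = 82%, other zero-carbon = 6%
```

In both countries nuclear plus hydro accounts for the overwhelming bulk of the zero-carbon total, which is the article’s point.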

Other nations like Norway, Brazil, and Costa Rica have almost entirely decarbonized their electricity supplies with the use of hydroelectricity alone.

That being said, hydroelectricity is far less reliable and scalable than nuclear.

Brazil is a case in point. Hydro has fallen from over 90 percent of its electricity 20 years ago to about two-thirds in 2016. Because Brazil failed to grow its nuclear program in the 1990s, it made up for new electricity growth with fossil fuels.

And both Brazil and hydro-heavy California stand as warnings against relying on hydro-electricity in a period of climate change. Both had to use fossil fuels to make up for hydro during recent drought years.

That leaves us with nuclear power as the only truly scalable, reliable, low-carbon energy source proven capable of eliminating carbon emissions from the power sector.

Why This is Good News

The fact that we don’t need renewables to solve climate change is good news for humans and the natural environment.

The dilute nature of water, sunlight, and wind means that up to 5,000 times more land and 10 to 15 times more concrete, cement, steel, and glass are required than for nuclear plants.

All of that material throughput results in renewables creating large quantities of waste, much of it toxic.

For example, solar panels create 200 – 300 times more hazardous waste than nuclear, with none of it required to be recycled or safely contained outside of the European Union.

Meanwhile, the huge amounts of land required for solar and wind production have had a devastating impact on rare and threatened desert tortoises, bats, and eagles — even when solar and wind supply just a small percentage of electricity.

Does this mean renewables are never desirable?

Not necessarily. Hydroelectric dams remain the way many poor countries gain access to reliable electricity, and both solar and wind might be worthwhile in some circumstances.

But there is nothing in either their history or their physical attributes that suggests solar and wind in particular could or should be the centerpiece of efforts to deal with climate change.

In fact, France demonstrates the costs and consequences of adding solar and wind to an electricity system where decarbonization is nearly complete.

France is already seeing its electricity prices rise as a result of deploying more solar and wind.

Because France lacks Sweden’s hydroelectric potential, it would need to burn far more natural gas (and/or petroleum) in order to integrate significantly more solar and wind.

If France were to reduce the share of its electricity from nuclear from 75 percent to 50 percent — as had been planned — carbon emissions and the cost of electricity would rise.

It is partly for this reason that France’s president recently declared he would not reduce the amount of electricity from nuclear.

Some experts recently pointed out that nuclear plants, like hydroelectric dams, can ramp up and down. France currently does so to balance demand.

But ramping nuclear plants to accommodate intermittent electricity from solar and wind simply adds to the cost of making electricity without delivering lower emissions or much in the way of cost savings. That’s because only very small amounts of nuclear fuel, and no labor, are saved when nuclear plants are ramped down.

Do We Need Solar and Wind to Save Nuclear?

While solar and wind are largely unnecessary at best and counterproductive at worst when it comes to combating climate change, might we need them in support of a political compromise to prevent nuclear plants from closing?

At least in some circumstances, the answer is yes. Recently in New Jersey, for example, nuclear energy advocates had to accept a subsidy rate 18 to 28 times higher for solar than for nuclear.

The extremely disproportionate subsidy for solar was a compromise in exchange for saving the state’s nuclear plants.

While nuclear enjoys the support of just half of the American people, for example, solar and wind are supported by 70 to 80 percent of them. Thus, in some cases, it might make sense to package nuclear and renewables together.

But we should be honest that such subsidies for solar and wind are policy sweeteners needed to win over powerful financial interests and not good climate policy.

What matters most is that we accept that there are real world physical obstacles to scaling solar and wind.

Consider that the problem of the unreliability of solar has been discussed for as long as there have existed solar panels. During all of that time, solar advocates have waved their hands about potential future solutions.

“Serious problems will, of course, be raised by the fact that sun-power will not be continuous,” wrote a New York Times reporter in 1931. “Whether these will be solved by some sort of storage arrangement or by the operating of photogenerators in conjunction with some other generator cannot be said at present.”

We now know that, in the real world, electricity grid managers cope with the unreliability of solar by firing up petroleum and natural gas generators.

As such — while there might be good reasons to continue to subsidize the production of solar and wind — their role in locking in fossil fuel generators means that climate change should not be one of them.


How Lockheed Martin’s Partnership with Elcora Advanced Materials (Graphene) May Revolutionize Military “Driverless Vehicles” and Lithium-Ion Batteries



Maintaining a global supply chain is one of the most secretive and understated keys to the success of a military campaign. As described by the U.S. Army, the quick and efficient transport of goods like water, food, fuel, and ammunition has been essential in winning wars for thousands of years. Supply chain and logistics management has evolved to include “storage of goods, services, and related information between the point of origin and the point of consumption.” In essence, that means the movement of vehicles bringing precious cargo from the home base to the soldiers fighting on the front lines.

Security and strategic operations are critical elements in the fulfillment of this potentially hazardous supply chain. Enemy forces hiding in the bushes can open fire to try to slow the troops’ movement. With mines littered all over the war zone, all it would take is one wrong step, and the truck and the people in it would be blown to smithereens.

One ingenious solution is the deployment of an automated military convoy run by a military commander, which can reduce risks and their accompanying vulnerabilities. In line with this, advanced defense contractor Lockheed Martin Canada (NYSE:LMT) has successfully tested “driverless trucks” on two active U.S. military bases.

Call it the soldier’s equivalent of a smart fleet of cars that would take the currently popular concept of self-driving vehicles to a whole new, safer level. Human operators would still be needed to guide the vehicles towards their destinations. However, because this could be accomplished remotely, very little time would be lost to the exchange of hostilities, as these smart military vehicles would be impervious to the enemy’s usual attempts at distraction. And in case firepower does break out, the loss of life, as well as injury to the troops, would be minimal.

The memorandum of agreement signed between Elcora and Lockheed Martin is not the usual corporate alliance, but bears important long-term repercussions for sectors such as transport, security, and the military-industrial complex. Lockheed Martin is a leviathan in aerospace, defense, weaponry, and other technologies that have been instrumental in keeping many of the nations of the world safe.

The Lithium-ion (or Li-ion) batteries that it uses to store energy in many of its technologies and processes are critical to upholding the operations being conducted in many of its devices, plants, and facilities. The more energy that these batteries can store, the longer the systems and machines can function, without interruption, and in compliance with the highest standards of safety.

This is where Elcora comes in. The future of military supply chain and logistics management is accelerating thanks to Lockheed’s recently signed partnership with end-to-end graphene producer Elcora Advanced Materials (TSXV: ERA; OTC: ECORF).

One material that can ensure consistent and reliable power delivery in Li-ion batteries is graphene, a material derived from graphite. Elcora is one of the few companies that produce and distribute graphene in one dynamic end-to-end operation, from the time the first rocks are mined in Sri Lanka to the time they are refined, developed, and purified in the company’s facilities in Canada. The quality of the graphene that comes out of Elcora’s pipeline is higher than what is usually found on the market. This pristine quality can help Li-ion batteries increase their storage of power without adding further cost.

Li-ion batteries are already sought after for prolonging the lifespan of power stored in a wide range of devices, from ubiquitous smartphones to the electric cars that innovators like Elon Musk are pushing to make mainstream on our roads and highways. Lockheed Martin will also be using them in the military vehicles guided by its Autonomous Mobility Applique System (AMAS), or the ‘driverless military convoy’ described above. Tests have shown that these near-smart vehicles have already clocked 55,000 miles. Lockheed is looking forward to completing the tests and moving quickly to deploy them for actual use in military campaigns.


The importance of long-lasting Li-ion batteries in the kind of combat arena in which Lockheed Martin is expert cannot be overstated. With electric storage given a lengthier lifespan by the graphene anode in the batteries, the military commanders guiding the smart convoys do not have to fear an anticipated technical breakdown. They can also count on the batteries to sustain the vehicles’ power and carry them through to the completion of their mission if something unexpected happens. The juice in those Li-ion batteries will last longer, which is critical in crises such as the sudden appearance of combatants.

Sometimes, the winner in war turns out to be the more resilient, better-sustained force. As the ancient Chinese master of war Sun Tzu warned long ago, sometimes “the line between order and disorder” — or victory and defeat — “lies in logistics.” Through its graphene-constituted Li-ion batteries, the Lockheed Martin–Elcora alliance can certainly enhance any military force’s capacity in that area.

* Article from Technology.org

Also Read About:

Tenka Energy (GNT US): Super Capacitor Assisted Silicon Nanowire and Graphene Batteries for EV and Small Form Factor Markets. A new class of battery/energy-storage materials is being developed to support the energy-dense, high-capacity, high-performance, high-cycle battery and supercapacitor markets.

Genesis Nanotechnology: “Great Things from Small Things”

What happens when graphene is “twisted” into spirals? Researchers synthesize helical nanographene, which demonstrates outstanding charge and heat transport properties


This visualisation shows layers of graphene used for membranes. Credit: University of Manchester

It’s probably the smallest spring you’ve ever seen. Researchers from Kyoto University and Osaka University report for the first time in the Journal of the American Chemical Society the successful synthesis of hexa-peri-hexabenzo[7]helicene, or helical nanographene. These graphene constructs previously existed only in theory, so successful synthesis offers promising applications including nanoscale induction coils and molecular springs for use in nanomechanics.

Graphene, a hexagonal lattice of single-layer carbon atoms exhibiting outstanding charge and heat transport properties, has garnered extensive research and development interest. Helically twisted graphenes have a spiral shape, and successful synthesis of this type of graphene could have major applications, but its model compounds had never been reported. And while past research has come close, the resulting compounds never exhibited the expected properties.

“We processed some basic chemical compounds through step-by-step reactions, such as McMurry coupling, followed by stepwise photocyclodehydrogenation and aromatization,” explains first author Yusuke Nakakuki. “We then found that we had synthesized the foundational backbone of helical graphene.”

The team confirmed the helicoid nature of the structure through X-ray crystallography, also finding both clockwise and counter-clockwise nanographenes. Further tests showed that the electronic structure and photoabsorption properties of this compound are much different from previous ones. “This helical nanographene is the first of its kind,” concludes lead author Kenji Matsuda. “We will try to expand their surface area and make the helices longer. I expect to find many new physical properties as well.”

The paper, titled “Hexa-peri-hexabenzo[7]helicene: Homogeneously π-Extended Helicene as a Primary Substructure of Helically Twisted Chiral Graphenes,” appeared 19 March 2018 in the Journal of the American Chemical Society.

(From Phys.org)


More information: Yusuke Nakakuki et al, Hexa-peri-hexabenzo[7]helicene: Homogeneously π-Extended Helicene as a Primary Substructure of Helically Twisted Chiral Graphenes, Journal of the American Chemical Society (2018). DOI: 10.1021/jacs.7b13412

BREAKTHROUGH DISCOVERY – NEW GRAPHENE BIOMATERIAL REGENERATES HEART AND NERVE TISSUE


One of the biggest challenges to the recovery of someone who has experienced a major physical trauma such as a heart attack is the growth of scar tissue.

As scar tissue builds up in the heart, it can limit the organ’s functions, which is obviously a problem for recovery.

However, researchers from the Science Foundation Ireland-funded Advanced Materials and BioEngineering Research (AMBER) Centre have revealed a new biomaterial that actually ‘grows’ healthy tissue – not only for the heart, but also for people with extensive nerve damage.

In a paper published in Advanced Materials, the team said its tissue-regenerating biomaterial responds to electrical stimuli and also eliminates infection.

The new material developed by the multidisciplinary research team is composed of the protein collagen, abundant in the human body, and the atom-thick ‘wonder material’ graphene.

The resulting merger creates an electroconductive ‘biohybrid’ that combines the beneficial properties of both materials: it is mechanically stronger and has increased electrical conductivity.

This biohybrid material has been shown to enhance cell growth and, when electrical stimulation is applied, directs cardiac cells to respond and align in the direction of the electrical impulse.

Could repair spinal cord

It is able to prevent infection in the affected area because the surface roughness of the material – thanks to graphene – results in bacterial walls being burst, simultaneously allowing the heart cells to multiply and grow.

For those with extensive nerve damage, current repairs are limited to a region only 2cm across, but this new biomaterial could be used across an entire affected area as it may be possible to transmit electrical signals across damaged tissue.

Speaking of the breakthrough, Prof Fergal O’Brien, deputy director and lead investigator on the project, said: “We are very excited by the potential of this material for cardiac applications, but the capacity of the material to deliver physiological electrical stimuli while limiting infection suggests it might have potential in a number of other indications, such as repairing damaged peripheral nerves or perhaps even spinal cord.

“The technology also has potential applications where external devices such as biosensors and devices might interface with the body.”

The study was led by AMBER researchers at the Royal College of Surgeons in Ireland in partnership with Trinity College Dublin and Eberhard Karls University in Germany.

What if you discovered an Amazing new Material? (The story of) Graphene: The Superstrong, Superthin, and Superversatile Material That Will Revolutionize the World



 

From National Graphene Association: News

With permission from the authors, Les Johnson and Joseph E. Meany, the preface of their new book, Graphene: The Superstrong, Superthin, and Superversatile Material That Will Revolutionize the World, follows. The book was published by Prometheus Books this month, and is available via Amazon, Barnes & Noble, or at an independent bookstore near you.

“What if you discovered an infinitesimally thin material capable of con­ducting electricity, able to suspend millions of times its own weight, and yet porous enough to filter the murkiest water? And what if this substance was created from the same element as that filling the common pencil? This extraordinary material, graphene, is not a work of science fiction. A growing cadre of scientists aims to make graphene a mainstay technological material by the second half of the twenty-first century. Not satisfied with that timeline, some entrepreneurial types would like to see widespread adoption of graphene within the next decade. How could this be possible?

Graphene is elegant. It is created from a single element, carbon, formed by just one type of bond. Despite graphene’s apparent simplicity, isolating the material was an elusive “Holy Grail” for chemists and physicists alike. Even as the periodic table extended beyond the hundred-odd elements naturally found on Earth, galaxies were charted, and the human genome solved, this material, with the simple chemical formula of C, remained a distant goal at the frontiers of science. Why was this? Graphene excels at hiding in plain sight, and the techniques and instrumentation perfected in the last two decades have played a pivotal role in its discovery.

Carbon, the sole constituent of graphene, is all around us. The element is the fourth most common in the entire universe. Most people think of materials in terms of atoms and molecules, where molecules are made from defined types and numbers of atoms. With graphene, counting carbon atoms is inconsequential. Merely the way in which the constituent carbons are bound to one another is crucial, with this feature separating graphene from other wholly carbon materials like diamonds and graphite. At the atomic level, the exclusively carbon graphene resembles a hexagonal “chicken wire” fence, with each carbon atom making up the point of a hexagon. The hexagonal distribution makes graphene’s earth-shattering properties possible, as the distribution allows the individual carbon atoms of graphene to lay flat.

This property of graphene cannot be overlooked. Graphene is a perfect anomaly in the world of chemistry—a flat, two-dimensional molecule, with a single sheet of graphene measuring only one atom thick. You might imme­diately question the structural integrity of graphene due to its delightfully simplistic construction, but the weaving of the carbon hexagons throughout the structure makes the atomically thin material unexpectedly strong.

Proper application of graphene holds the key to revolutionizing mate­rials technology in the latter half of the twenty-first century, but at what cost? Thankfully, not a substantial environmental one. There is a critical difference between graphene and another linchpin of modern technology, rare-earth metals. These hard-won rare-earth metals, metals including tan­talum, neodymium, and lanthanum, are found everywhere, from the inside of our smartphones to pharmaceuticals. Unlike with rare-earth metals, we do not need armies of manual laborers assisted by heavy equipment and an endless parade of fifty-five gallon drums of polluting solvents to find and retrieve graphene, due to one simple fact: graphene’s elemental con­stituent, carbon, is all around us. The most common precursor of graphene today is the mined mineral graphite. Rare-earth metals are scarce, but the integration of graphene into our lives would not be driven by the acqui­sition of raw materials and disputes between superpowers, but would be guided by the possession of knowledge, with patents and technology sepa­rating the victors and the vanquished.


You have experienced synthesizing graphene, maybe even earlier today, on a very small scale. The pressure exerted by your hand and finger­tips likely created a few layers of graphene the last time you ran a pencil across a notepad, turning humble graphite into graphene as you wrote this week’s grocery list. But if graphene can be made by such simple means, and its sole constituent, carbon, leads oxygen, nitrogen, and hydrogen in the hierarchy of elements that construct our living world, why is graphene just now, in the twenty-first century, coming to the forefront of human understanding?

The answer to this question is where the story resides. The story of graphene is a story of accidental discovery. A story of corporations and gov­ernments racing to spend billions of dollars in hopes of funding research and development projects to discover a material still years away from store shelves. A story of new materials that will disrupt the way we create things, and, in doing so, what we can create. The previous technological revolu­tions taught us many things. Each new discovery allowed us to break into new experimental territories and further our understanding of what is pos­sible to accomplish. Chemical batteries allowed energy to be stored for future use (like light at night). Steam power allowed us to generate tremen­dous amounts of energy to accomplish tasks no living thing could. This new revolution may allow us to throw off the shackles of metallic wires.

Since at least the 1950s, people have been trying to take graphite out of the ground and turn it into a pile of black gold. This effort was met with fifty years of resistance from the graphite, which has not so easily been coaxed to divulge its secrets. When graphene was finally isolated and examined, physicists and chemists were astounded at what they found. The history beneath this discovery is not so straightforward, though, and it traces its roots all the way back to 1859 in Great Britain. How appropriate, then, that the country already well-known for its history involving carbon should be the country where single-layer graphite was finally witnessed.

After two researchers in Great Britain, Konstantin Novoselov and Andre Geim, were awarded the Nobel Prize in Physics in 2010, technology magazines everywhere heralded a new era of “wonder materials” based around this atomically thin tessellation of carbon atoms. With its incredibly high strength and almost impossibly low electrical resistance, graphene pulled back a hidden curtain, allowing scientists to catch a glimpse of the marvels that lay beyond. With the shrouds lifted, the groundwork was laid to revolutionize how we will go about designing and making everything from cars to vaccines and from food packaging to spaceships.

The economic potential of this material cannot be understated. Being atomically thin, graphene can be incorporated almost seamlessly into any modern product, with appreciable effect. Early investors were burned, however, by entrepreneurs who over-promised and under-delivered on performance aspects for products (especially composites like plastics) that had graphene in them but that did not use graphene in a way that made its incorporation worth the added expense. It was, in some cases, just an added bit of snake oil. As the overall volume from new production methods and the quality of the resulting graphene have both increased with time, we are starting to finally see graphene’s true benefits. Governmental support is higher than ever in many countries, as whomever discovers a high-throughput production method for pristine graphene will reap signifi­cant financial rewards on the world stage.”


Author, Les Johnson


Author, Joe Meany

PROTON TRANSPORT IN GRAPHENE SHOWS PROMISE FOR RENEWABLE ENERGY


RESEARCHERS AT THE UNIVERSITY OF MANCHESTER HAVE DISCOVERED ANOTHER NEW AND UNEXPECTED PHYSICAL EFFECT IN GRAPHENE – MEMBRANES THAT COULD BE USED IN DEVICES TO ARTIFICIALLY MIMIC PHOTOSYNTHESIS.

National Graphene Association

The new findings demonstrated an increase in the rate at which the material conducts protons when it is simply illuminated with sunlight. The ‘photo-proton’ effect, as it has been dubbed, could be exploited to design devices able to directly harvest solar energy to produce hydrogen gas, a promising green fuel. It might also be of interest for other applications, such as light-induced water splitting, photo-catalysis and for making new types of highly efficient photodetectors.

Graphene is a sheet of carbon atoms just one atom thick and has numerous unique physical and mechanical properties. It is an excellent conductor of electrons and can absorb light of all wavelengths.

Researchers recently found that it is also permeable to thermal protons (the nuclei of hydrogen atoms), which means that it might be employed as a proton-conducting membrane in various technology applications.

To find out how light affects the behaviour of protons permeating through the carbon sheet, a team led by Dr Marcelo Lozada-Hidalgo and Professor Sir Andre Geim fabricated pristine graphene membranes and decorated them on one side with platinum nanoparticles. The Manchester scientists were surprised to find that the proton conductivity of these membranes was enhanced 10 times when they were illuminated with sunlight.

Dr Lozada-Hidalgo said: “By far the most interesting application is producing hydrogen in an artificial photosynthetic system based on these membranes.”

Prof Geim is also optimistic: “This is essentially a new experimental system in which protons, electrons and photons are all packed together in an atomically thin volume. I am sure that there is a lot of new physics to be unearthed, and new applications will follow.”

Scientists around the world are busy looking into how to directly use solar energy to produce renewable fuels (such as hydrogen) by mimicking photosynthesis in plants. These man-made ‘leaves’ will require membranes with very sophisticated properties – including mixed proton-electron conductivity, permeability to gases, mechanical robustness and optical transparency.

Currently, researchers use a mixture of proton and electron-conducting polymers to make such structures, but these require some important trade-offs that could be avoided by using graphene.

Using electrical measurements and mass spectrometry, the researchers measured a photoresponsivity of around 10⁴ A/W, which translates into around 5,000 hydrogen molecules formed for every solar photon (light particle) incident on the membrane. This is a huge number compared with existing photovoltaic devices, where many thousands of photons are needed to produce just a single hydrogen molecule.
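As a rough sanity check on how a responsivity in amps per watt becomes “molecules per photon”, the arithmetic can be sketched in a few lines. The ~10⁴ A/W figure comes from the article; the illustrative wavelength of 1240 nm (a ~1 eV photon) is an assumption chosen here to make the conversion concrete, not a value from the study.

```python
# Back-of-the-envelope conversion (illustrative only): a responsivity R in A/W
# means R coulombs of proton charge cross the membrane per joule of light.
# Per photon of energy E_ph, the charge is R * E_ph; dividing by the elementary
# charge gives protons per photon, and two protons make one H2 molecule.

E_CHARGE = 1.602e-19   # elementary charge, C
H_PLANCK = 6.626e-34   # Planck constant, J*s
C_LIGHT = 2.998e8      # speed of light, m/s

def h2_per_photon(responsivity_a_per_w, wavelength_nm):
    """H2 molecules per absorbed photon, assuming every transported
    proton ends up in hydrogen gas."""
    photon_energy = H_PLANCK * C_LIGHT / (wavelength_nm * 1e-9)  # joules
    protons_per_photon = responsivity_a_per_w * photon_energy / E_CHARGE
    return protons_per_photon / 2  # two protons per H2

# ~10^4 A/W at an assumed ~1 eV photon (1240 nm):
print(f"{h2_per_photon(1e4, 1240):.0f} H2 molecules per photon")  # → 5000
```

With these assumed inputs the sketch reproduces the scale of the reported figure, which is the point: responsivities this large imply thousands of protons moved per photon rather than the sub-unity yields of conventional photovoltaic schemes.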

“We knew that graphene absorbs light of all frequencies and that it is also permeable to protons, but there was no reason for us to expect that the photons absorbed by the material could enhance the permeation rate of protons through it.” says Lozada-Hidalgo.

“The result is even more surprising when we realised that the membrane was many orders of magnitude more sensitive to light than devices that are specifically designed to be light-sensitive. Examples of such devices include commercial photodiodes or those made from novel 2D materials.”

Photodetectors typically harvest light to produce only electricity, but graphene membranes produce both electricity and, as a by-product, hydrogen. Their response time, in the microsecond range, is faster than that of most commercial photodiodes.

The authors acknowledge support from the Lloyd’s Register Foundation, EPSRC (EP/ N010345/1), the European Research Council ARTIMATTER project (ERC-2012-ADG) and from Graphene Flagship. M.L.-H. acknowledges a Leverhulme Early Career Fellowship.

Source: The University of Manchester

Designing a Graphene Filter to make Seawater Drinkable and … Cheaper


As drinking water grows scarce, desalination might be one way to bridge the gap.

 

A new study released earlier this week in the journal Nature Nanotechnology may be a major step towards making desalinated water—water in which salt is removed to make it safe for drinking—a viable option for more of the world. Researchers from the University of Manchester modified graphene oxide membranes, a type of selectively permeable membrane that allows some molecules to pass while keeping others behind, to let water through while trapping salt ions. It’s essentially a molecular sieve.

Finding new sources of fresh water is important, because roughly 20 percent of the world’s population—1.2 billion people—lack access to clean drinking water, according to the United Nations. That number is expected to grow as populations increase and existing water supplies dwindle, in part due to climate change. This reality has led some to suggest that the world’s next “gold rush” will be for water. Others take a less sanguine view, worrying that the wars of the future will be fought over water. And this concern is not without merit: the war currently raging in Yemen is linked, at least in part, to water conflicts.

 

But while fresh water is scarce (a scant three percent of the world’s water is fresh), water itself is not. The Earth’s surface is more than 70 percent water, but 97 percent of it is undrinkable because it’s either salt or brackish (a mix of salt and fresh water). The occasional gulp of seawater while swimming aside, drinking saltwater is dangerous for humans—it leads to dehydration and eventually death. Hence the famous line from The Rime of the Ancient Mariner: “Water, water, every where, nor any drop to drink.”

Desalination could be a solution. After all, the technique is already employed in parts of the Middle East and the Cayman Islands. However, the two techniques currently employed—multi-stage flash distillation, which flash heats a portion of the water into steam through a series of heat exchanges, and reverse osmosis, which uses a high-pressure pump to push sea water through reverse osmosis membranes to remove ions and particles from drinking water—have several key drawbacks.
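One way to see why any desalination scheme has a real energy bill is to estimate the thermodynamic floor: the minimum work to extract fresh water from seawater equals seawater’s osmotic pressure. A minimal sketch, using the van ’t Hoff approximation and assumed textbook values (seawater modelled as ~0.6 mol/L NaCl at 25 °C) rather than figures from the study:

```python
# Thermodynamic floor for desalination (illustrative): osmotic pressure
# pi = i * M * R * T, where i is the van 't Hoff factor (NaCl -> 2 ions),
# M the molar concentration, R the gas constant, T the temperature.

R_GAS = 8.314        # gas constant, J/(mol*K)
T = 298.15           # temperature, K (25 C)
MOLARITY = 0.6       # mol/L NaCl, a common seawater approximation
VAN_T_HOFF_I = 2     # Na+ and Cl- per formula unit

# Convert mol/L to mol/m^3, then compute osmotic pressure in pascals.
pi_pa = VAN_T_HOFF_I * (MOLARITY * 1000) * R_GAS * T

# A pascal is a joule per cubic metre, so in the vanishing-recovery limit
# the minimum work per m^3 of fresh water is just pi.
kwh_per_m3 = pi_pa / 3.6e6  # 1 kWh = 3.6e6 J

print(f"osmotic pressure ~{pi_pa/1e5:.0f} bar, "
      f"thermodynamic minimum ~{kwh_per_m3:.2f} kWh/m^3")
```

The estimate lands near 30 bar and under 1 kWh per cubic metre. Real plants need several times that (finite recovery, pump and membrane losses), which is the gap a cheaper, non-swelling membrane aims to narrow.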

“Current desalination methods are energy intensive and produce adverse environmental impact,” wrote Ram Devanathan, a researcher at the Energy and Environment Directorate at Pacific Northwest National Laboratory, in an op-ed that accompanied the study. “Furthermore, energy production consumes large quantities of water and creates wastewater that needs to be treated with further energy input.”

Graphene oxide membranes show promise as a relatively inexpensive alternative, because they can be cheaply produced in a lab—and though water easily passes through them, salts do not. However, when immersed in water on a large scale, graphene oxide membranes tend to swell quickly. Once swollen, the membranes allow not only water to pass through, but also sodium and magnesium ions, i.e. salt, defeating the purpose of the filtration.

Study author Rahul Nair and his colleagues discovered that by placing walls made of epoxy resin on either side of the graphene oxide, they could stop the expansion. And by restricting the membranes with resin, they were able to fine tune their capillary size to prevent any errant salts from hitching a ride on water molecules.
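The size-exclusion picture behind this tuning can be sketched simply: a species permeates only if its hydrated diameter fits the effective channel. The radii below are textbook approximations chosen for illustration, and the 0.65 nm channel width is an assumed value, not one taken from the paper:

```python
# Illustrative size-exclusion model of a tuned graphene oxide capillary.
# Hydrated radii (nm) are textbook approximations; in water, ions drag a
# shell of water molecules that makes them effectively larger than water.

HYDRATED_RADIUS_NM = {
    "water": 0.14,   # effective radius of a single water molecule
    "K+":    0.33,
    "Na+":   0.36,
    "Mg2+":  0.43,
}

def permeates(species: str, channel_width_nm: float) -> bool:
    """True if the hydrated species is small enough to fit the channel."""
    return 2 * HYDRATED_RADIUS_NM[species] <= channel_width_nm

# An assumed ~0.65 nm channel passes water but blocks hydrated salt ions:
for species in HYDRATED_RADIUS_NM:
    print(species, permeates(species, 0.65))
```

The design insight the resin walls enable is exactly this: hold the channel width in the narrow band where bare water fits but hydrated ions do not, instead of letting the membrane swell past it.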

The next step will be testing it on an industrial scale to see if the method holds up. If it works, many people might just be drinking (a glass of water) to it.