Why Do Most Science Startups Fail?



“We need to get a lot better at bridging that gap between discovery and commercialization”

G. Satell – Inc. Magazine

It seems like every day we see or hear about a breakthrough new discovery that will change everything. Some, like perovskites in solar cells and CRISPR, are improvements on existing technologies. Others, like quantum computing and graphene, promise to open up new horizons encompassing many applications. Still others promise breakthroughs in battery technology, as covered in "Exciting Battery Technology Breakthrough News — Is Any Of It Real?" and "Beyond lithium — the search for a better battery."

Nevertheless, we are still waiting for a true market impact. Quantum computing and graphene have been around for decades and still haven’t hit on their “killer app.” Perovskite solar cells and CRISPR are newer, but haven’t really impacted their industries yet. And those are just the most prominent examples.

The problem isn't necessarily with the discoveries themselves, many of which are truly path-breaking, but that there's a fundamental difference between discovering an important new phenomenon in the lab and creating value in the marketplace.

“We need to get a lot better at bridging that gap. To do so, we need to create a new innovation ecosystem for commercializing science.”

The Valley Of Death And The Human Problem

The gap between discovery and commercialization is so notorious and fraught with danger that it's been unaffectionately called the "Valley of Death." Part of the problem is that you can't really commercialize a discovery; you can only commercialize a product, and those are two very different things.

The truth is that innovation is never a single event, but a process of discovery, engineering and transformation. After something like graphene is discovered in the lab, it needs to be engineered into a useful product and then it has to gain adoption by winning customers in the marketplace. Those three things almost never happen in the same place.

So to bring an important discovery to market, you first need to identify a real-world problem it can solve and connect with engineers who can transform it into a viable product or service. Then you need to find customers who are willing to drop whatever else they've been doing and adopt it on a large scale. That takes time, usually about 30 years.

The reason it takes so long is that there is a long list of problems to solve. To create a successful business based on a scientific discovery, you need to get scientists to collaborate effectively with engineers and a host of specialists in other areas, such as manufacturing, distribution and marketing. Those aren’t just technology problems, those are human problems. Being able to collaborate effectively is often the most important competitive advantage.

Wrong Industry, Wrong Application

One of the most effective programs for helping to bring discoveries out of the lab is I-Corps. First established by the National Science Foundation (NSF) to help recipients of SBIR grants identify business models for scientific discoveries, it has been such an extraordinary success that the US Congress has mandated its expansion across the federal government.

Based on Steve Blank’s lean startup methodology, the program aims to transform scientists into entrepreneurs. It begins with a presentation session, in which each team explains the nature of their discovery and its commercial potential. It’s exciting stuff, pathbreaking science with real potential to truly change the world.

The thing is, they invariably get it wrong. Despite their years of work to discover something of significance and their further efforts to apply for and receive commercialization grants from the federal government, they fail to come up with a viable application in an industry that wants what they have to offer.

Ironically, much of the success of the I-Corps program is due to these early sessions. Once the teams realize that they are on the wrong track, they embark on a crash course of customer discovery, interviewing dozens — and sometimes hundreds — of customers in search of a business model that actually has a chance of succeeding.

What's startling about the program is that, without it, scientists with important discoveries would often waste years trying to make a business work that never really had a chance in the first place.

The Silicon Valley Myth

Much of the success of Silicon Valley has been based on venture-funded entrepreneurship. Startups with an idea to change the world create an early stage version of the product they want to launch, show it to investors and get funding to bring it to market. Just about every significant tech company was started this way.

Yet most of the success of Silicon Valley has been based on companies that sell either software or consumer gadgets, which are relatively cheap and easy to rapidly prototype. Many scientific startups, however, do not fit into this category. Often, they need millions of dollars to build a prototype and then have to sell to industrial companies with long lead times.

The myth of Silicon Valley is that venture-funded entrepreneurship is a generalizable model that can be applied to every type of business. It is not. In fact, it is a specific model that was conceived in a specific place at a specific time to fund mature technologies for specific markets. It's not a solution that fits every problem.

The truth is that venture funds are very adept at assessing market risk, but not so good at taking on technology risk, especially in the hard sciences. That simply isn't what they were set up to do.

We Need A New Innovation Ecosystem For Science Entrepreneurship

In 1945, Vannevar Bush delivered a report, Science, The Endless Frontier, to President Truman, in which he made the persuasive argument that expanding the nation's scientific capacity would expand its economic capacity and well-being. His call led, ultimately, to building America's scientific infrastructure, including agencies like the NSF and the National Institutes of Health (NIH).

It was Bush's vision that made America a technological superpower. Grants from federal agencies enabled scientists to discover new knowledge. Established businesses and, later, venture-backed entrepreneurs would then take those discoveries and bring new products and services to market.

Look at any industry today and you will find that its most important technologies were largely shaped by investment from the federal government. Today, however, the challenges are evolving. We're entering a new era of innovation in which technologies like genomics, nanotechnology and robotics are going to reshape traditional industries like energy, healthcare and manufacturing.

That's exciting, but it also poses new challenges, because these technologies are ill-suited to the Silicon Valley model of venture-funded entrepreneurship and need help to get past the Valley of Death. So we need to build a new innovation ecosystem on top of the scientific architecture Bush created for the post-war world.

There have been encouraging signs. New programs like I-Corps, the Manufacturing Institutes, Cyclotron Road and Chain Reaction are beginning to help fill the gap.

Still, much more needs to be done, especially at the state and local level to help build regional hubs for specific industries, if we are going to be nearly as successful in the 21st century as we were in the 20th.



Genesis Nanotech Headlines Are Out!


Genesis Nanotech Headlines Are Out! Read All About It!

https://paper.li/GenesisNanoTech/1354215819#!headlines

Visit Our Website: www.genesisnanotech.com

Visit/ Post on Our Blog: https://genesisnanotech.wordpress.com


SUBCOMMITTEE EXAMINES BREAKTHROUGH NANOTECHNOLOGY OPPORTUNITIES FOR AMERICA

Chairman Terry: “Nanotech is a true science race between the nations, and we should be encouraging the transition from research breakthroughs to commercial development.”

WASHINGTON, DC – The Subcommittee on Commerce, Manufacturing, and Trade, chaired by Rep. Lee Terry (R-NE), today held a hearing on:

“Nanotechnology: Understanding How Small Solutions Drive Big Innovation.”

“Great Things from Small Things!” … We Couldn’t Agree More!


Hydrogen Fueling Could be Easier to Integrate Than You Think


A new report could inspire hydrogen integration into more California gas stations.

July 25, 2014
Summary

Honda's Next Generation Solar Hydrogen Station Prototype

WASHINGTON – According to a report on the Greener Ideal website, a recent research study conducted by Sandia National Laboratories (SNL) may help speed up the process of installing hydrogen fueling stations throughout the state of California.

The SNL study, which examined 70 gas stations in California, found that 14 of them could integrate hydrogen fuel right away, while another 17 only need some property expansions before they could be ready to cater to fuel cell vehicles. Integrating hydrogen storage into gas stations is a far cheaper option than building new hydrogen fueling stations from the ground up, considering that the construction of an entirely new station can cost up to $1.5 million.
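
To put that cost figure in rough perspective: if all 31 of those candidate sites (the 14 ready now plus the 17 needing property expansions) instead had to be served by newly built hydrogen stations at up to $1.5 million each, the new construction alone could run as high as 31 x $1.5 million, or roughly $46.5 million. That is an illustrative upper bound worked out from the numbers above, not a figure from the Sandia report, but it shows the scale of savings that retrofitting existing stations could offer.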

In their article, Greener Ideal explains that the SNL report found the stations ready to immediately integrate hydrogen fueling meet the requirements of the National Fire Protection Association (NFPA) hydrogen technologies code from 2011, which includes guidelines for the storage, use, piping and generation of hydrogen. The code is an essential tool for making sure that hydrogen fueling stations are operated safely, given hydrogen's serious flammability hazards.


“Greener Ideal Article”

Hydrogen Fuel Can Be Easily Integrated into More Gas Stations in California

July 24, 2014

The lack of fueling stations, along with high production costs, is clearly one of the biggest hurdles for hydrogen cars, so until these issues are resolved, vehicles powered by hydrogen won't become commonplace. As far as costs are concerned, car makers are trying to develop more affordable fuel cells, which would definitely bring the price of hydrogen cars down, but when it comes to fueling stations, a much broader effort from both the auto industry and government is needed to put the proper infrastructure in place.

Of all states, California has done the most to promote hydrogen-powered cars and encourage wider adoption of these alternative fuel vehicles. California has been investing heavily in the construction and installation of fueling stations across the state, and offering various incentives to those who decide to purchase one of these vehicles. Currently, there are over 20 stations in California, and the state has announced plans to install a total of 100 stations within the next 10 years. However, the pace of installing fueling stations could be much faster, and a recent research study conducted by Sandia National Laboratories may help speed things up.


Honda's Next Generation Solar Hydrogen Station Prototype

Sandia National Laboratories completed a study that found many existing gas stations in California could accept hydrogen and cater to fuel cell vehicles. Integrating hydrogen storage into gas stations is a far cheaper option than building new hydrogen fueling stations from the ground up, considering that the construction of an entirely new station can cost up to $1.5 million. Researchers at Sandia examined 70 gas stations in California, and found that 14 of them could integrate hydrogen fuel right away, while another 17 only need some property expansions before they could be ready for it.

According to the report released by Sandia National Laboratories, the 14 stations that could readily accept hydrogen meet the requirements of the National Fire Protection Association (NFPA) hydrogen technologies code from 2011, which includes guidelines for the storage, generation, use and piping of hydrogen. The NFPA hydrogen technologies code is an important tool for making sure that hydrogen fueling stations are operated in a safe manner, since there are serious flammability hazards involved in handling hydrogen.

Researchers were particularly focused on the distance between the different elements of the fueling infrastructure and public streets as one of the key factors in ensuring the safe operation of fueling facilities. "Whether you are filling your car with gasoline, compressed natural gas or hydrogen fuel, the fueling facility first of all must be designed and operated with safety in mind," said Daniel Dedrick, hydrogen program manager at Sandia.

Chris San Marchi, manager of Sandia’s hydrogen and metallurgy science group, explained that scientists need to examine the potential safety hazards if there is a hydrogen leak at an existing gas station:

“If you have a hydrogen leak at a fueling station, for example, and in the event that the hydrogen ignites, we need to understand how that flame is going to behave in order to maintain and control it within a typical fueling station.”

At the moment, there are about 120,000 gas stations in the U.S., and the study Sandia National Laboratories conducted shows that many of them could cater to hydrogen fuel cell vehicles, which would definitely help expand the hydrogen fueling station network, without having to invest hundreds of millions of dollars in an entirely new infrastructure.

For the latest information on consumer perceptions about hydrogen vehicles, read this week’s NACS Daily article, “Hydrogen Cars Are Here — What Now?”

NANOTECHNOLOGY – On the Horizon and in the Far Future: Video

What is Nanotechnology?

A basic definition: Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced.
 
In its original sense, ‘nanotechnology’ refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

Nanotechnology (sometimes shortened to “nanotech”) is the manipulation of matter on an atomic and molecular scale. The earliest, widespread description of nanotechnology referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology.

A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter that occur below the given size threshold.

It is therefore common to see the plural form “nanotechnologies” as well as “nanoscale technologies” to refer to the broad range of research and applications whose common trait is size. Because of the variety of potential applications (including industrial and military), governments have invested billions of dollars in nanotechnology research.

Through its National Nanotechnology Initiative, the USA has invested 3.7 billion dollars, the European Union has invested 1.2 billion dollars, and Japan has invested 750 million dollars.

New Nanoparticle Production Method at Sandia National Laboratories: Better Lights, Lenses, Solar Cells

Sandia National Laboratories has come up with an inexpensive way to synthesize titanium-dioxide nanoparticles and is seeking partners who can demonstrate the process at industrial scale for everything from solar cells to light-emitting diodes (LEDs).

Titanium-dioxide (TiO2) nanoparticles show great promise as fillers to tune the refractive index of anti-reflective coatings on signs and optical encapsulants for LEDs, solar cells and other optical devices. Optical encapsulants are coverings or coatings, usually made of silicone, that protect a device.

Industry has largely shunned TiO2 nanoparticles because they’ve been difficult and expensive to make, and current methods produce particles that are too large. Sandia became interested in TiO2 for optical encapsulants because of its work on LED materials for solid-state lighting.

Current production methods for TiO2 often require high-temperature processing or costly surfactants — molecules that bind to something to make it soluble in another material, like dish soap does with fat.
Those methods produce less-than-ideal nanoparticles that are very expensive, can vary widely in size and show significant particle clumping, called agglomeration.

Sandia’s technique, on the other hand, uses readily available, low-cost materials and results in nanoparticles that are small, roughly uniform in size and don’t clump.

“We wanted something that was low cost and scalable, and that made particles that were very small,” said researcher Todd Monson, who along with principal investigator Dale Huber patented the process in mid-2011 as “High-yield synthesis of brookite TiO2 nanoparticles”.

(Low-cost technique produces uniform nanoparticles that don't clump.)

Their method produces nanoparticles roughly 5 nanometers in diameter, approximately 100 times smaller than the wavelength of visible light, so there’s little light scattering, Monson said. “That’s the advantage of nanoparticles — not just nanoparticles, but small nanoparticles,” he said.

Scattering decreases the amount of light transmission. Less scattering also can help extract more light, in the case of an LED, or capture more light, in the case of a solar cell.
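
As a rough illustration of why the small size matters (a back-of-the-envelope sketch, not part of the Sandia work): in the Rayleigh regime, where particles are much smaller than the wavelength, the scattering cross-section of a sphere grows with the sixth power of its diameter, so at the same filler loading, smaller particles scatter dramatically less light. The refractive indices below (roughly 2.5 for TiO2, roughly 1.4 for silicone) are typical literature values, not numbers from the article.

import math

def rayleigh_cross_section(d_nm, wavelength_nm, n_particle, n_medium):
    # Rayleigh scattering cross-section (nm^2) of a sphere of diameter d_nm;
    # valid only when the particle is much smaller than the wavelength.
    m = n_particle / n_medium                 # relative refractive index
    lam = wavelength_nm / n_medium            # wavelength inside the medium
    contrast = ((m**2 - 1) / (m**2 + 2)) ** 2
    return (2 * math.pi**5 / 3) * d_nm**6 / lam**4 * contrast

# Illustrative comparison: 5 nm particles (as in the Sandia work) vs. 50 nm
# particles, green light (~550 nm), TiO2 (~2.5) dispersed in silicone (~1.4).
sigma_small = rayleigh_cross_section(5, 550, 2.5, 1.4)
sigma_large = rayleigh_cross_section(50, 550, 2.5, 1.4)

per_particle = sigma_large / sigma_small       # ~1e6, since the ratio scales as d^6
per_volume = per_particle / (50 / 5) ** 3      # ~1e3 at equal TiO2 volume fraction
print(f"scattering ratio per particle: {per_particle:.0e}, per filler volume: {per_volume:.0e}")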

TiO2 can increase the refractive index of materials, such as silicone in lenses or optical encapsulants. Refractive index is the ability of material to bend light. Eyeglass lenses, for example, have a high refractive index.
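
A crude way to see how TiO2 loading raises a composite's refractive index (again a sketch with assumed values, not the researchers' own model) is a simple volume-weighted mixing estimate; real nanocomposites are usually treated with effective-medium theories such as Maxwell Garnett, but the trend is the same.

def mixed_index(n_matrix, n_filler, filler_fraction):
    # Rough linear volume-fraction estimate of a composite refractive index.
    return (1 - filler_fraction) * n_matrix + filler_fraction * n_filler

# Assumed values: silicone matrix ~1.41, TiO2 filler ~2.5.
for f in (0.0, 0.05, 0.10, 0.20):
    print(f"{f:.0%} TiO2 -> n ~ {mixed_index(1.41, 2.5, f):.2f}")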

Practical nanoparticles must be able to handle different surfactants so they’re soluble in a wide range of solvents. Different applications require different solvents for processing.

Technique can be used with different solvents.

“If someone wants to use TiO2 nanoparticles in a range of different polymers and applications, it’s convenient to have your particles be suspension-stable in a wide range of solvents as well,” Monson said. “Some biological applications may require stability in aqueous-based solvents, so it could be very useful to have surfactants available that can make the particles stable in water.”

The researchers came up with their synthesis technique by pooling their backgrounds — Huber’s expertise in nanoparticle synthesis and polymer chemistry and Monson’s knowledge of materials physics. The work was done under a Laboratory Directed Research and Development project Huber began in 2005.

“The original project goals were to investigate the basic science of nanoparticle dispersions, but when this synthesis was developed near the end of the project, the commercial applications were obvious,” Huber said. The researchers subsequently refined the process to make particles easier to manufacture.

Existing synthesis methods for TiO2 nanoparticles were too costly and too difficult to scale up to production volumes. In addition, chemical suppliers ship titanium-dioxide nanoparticles dried and without surfactants, so the particles clump together and are impossible to break up. "Then you no longer have the properties you want," Monson said.

The researchers tried various types of alcohol as an inexpensive solvent to see if they could get a common titanium source, titanium isopropoxide, to react with water and alcohol.

The biggest challenge, Monson said, was figuring out how to control the reaction, since adding water to titanium isopropoxide most often results in a fast reaction that produces large chunks of TiO2, rather than nanoparticles. “So the trick was to control the reaction by controlling the addition of water to that reaction,” he said.
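
For background, the overall sol-gel hydrolysis at work here is standard textbook chemistry (the net reaction is not spelled out in the Sandia release) and is commonly summarized as:

Ti(OC3H7)4 + 2 H2O -> TiO2 + 4 C3H7OH

Diluting the water in alcohol and adding it slowly limits how fast this hydrolysis and the subsequent condensation can proceed, which helps explain why controlled water addition yields nanoparticles rather than large chunks of TiO2.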

Textbooks said making the nanoparticles couldn't be done; Sandia persisted

Some textbooks dismissed the titanium isopropoxide-water-alcohol method as a way of making TiO2 nanoparticles. Huber and Monson, however, persisted until they discovered how to add water very slowly by putting it into a dilute solution of alcohol. "As we tweaked the synthesis conditions, we were able to synthesize nanoparticles," Monson said.

The next step is to demonstrate synthesis at an industrial scale, which will require a commercial partner. Monson, who presented the work at Sandia’s fall Science and Technology Showcase, said Sandia has received inquiries from companies interested in commercializing the technology.

“Here at Sandia we’re not set up to produce the particles on a commercial scale,” he said. “We want them to pick it up and run with it and start producing these on a wide enough scale to sell to the end user.”

Sandia would synthesize a small quantity of particles, then work with a partner company to form composites and evaluate them to see if they can be used as better encapsulants for LEDs, flexible high-refractive-index composites for lenses, or solar concentrators. "I think it can meet quite a few needs," Monson said.


Source: Sandia National Laboratories