WHICH is the world’s most innovative country? Answering this question is the aim of the annual Global Innovation Index and a related report, which were published this morning by Cornell University, INSEAD, a business school, and the World Intellectual Property Organisation.
The ranking of 140 countries and economies around the world, which are scored using 79 indicators, is not surprising: Switzerland, Britain, Sweden, the Netherlands and America lead the pack.
But the authors also look at their data from other angles, for instance how countries perform relative to their level of economic development, and the quality of their innovation (measured by indicators such as university rankings). In both cases the results are more remarkable. The chart above shows that in innovation many countries in Africa punch above their economic weight. And the chart below indicates that, even though China is now churning out a lot of patents, it is still way behind America and other rich countries when it comes to innovation quality.
Since the First Industrial Revolution, oil and gas have played a pivotal role in economic transformation and mobility. But now, with the prospect that major economies such as the United States, China and European nations will try to shift away from oil, producers are coming to realize that their oil reserves under the ground – sometimes referred to as “black gold” – could become less valuable in the future than they are today.
Of the four scenarios for the future of the industry outlined in a new set of whitepapers from the Global Agenda Council on the Future of Oil and Gas, three envisage this type of world. Factors such as technological advancements, the falling price of the batteries that power electric vehicles, and a post-COP21 push for cleaner energy could even drive oil use below 80 million barrels a day by 2040 – 15% lower than today.
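Those two figures together imply a rough estimate of current consumption; a minimal back-of-envelope check, assuming the 15% reduction is measured against today's demand:

```python
# Back-of-envelope check of the demand figures quoted above.
# Assumption: "15% lower than today" is measured against current consumption.
future_demand = 80.0  # million barrels/day, low-demand scenario for 2040

# If 80 mb/d is 15% below today's level, today's level works out to:
implied_today = future_demand / (1 - 0.15)
print(f"Implied current demand: {implied_today:.1f} mb/d")  # ~94.1 mb/d
```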
We’re already feeling the effect
So what would a future of falling demand mean for the oil and gas industry?
Uncertainty about whether oil demand will continue to grow is already shaping the strategies of oil and gas firms. Through the 2000s and up until last year, the Organization of Petroleum Exporting Countries (OPEC), whose policies influence global oil supply and prices, pursued a revenue-oriented strategy, believing that scarce oil would be more valuable under the ground than out in the market as global demand rose exponentially over time. Oil companies, too, responded to this world view with a business model that maximized the reserves added to balance sheets, warehousing expensive assets.
Now, in light of the trends discussed in a new whitepaper, producers are coming to realize that oil under the ground might soon be less valuable than oil produced and sold in the coming years. This dramatic shift in expectations is changing the operating environment for the future of oil and gas.
A post-oil world: not all doom and gloom
Countries with large, low-cost reserves, such as Saudi Arabia, are rethinking strategies and will have to think twice about delaying production or development of reserves, in case they are unable to monetize those reserves over the long run. Saudi Arabia, for example, has recently announced that it is creating a $2 trillion mega-sovereign wealth fund, funded by sales of current petroleum industry assets, to prepare itself for an age when oil no longer dominates the global economy.
Declining revenues that could be reaped from exploitation of remaining oil reserves would adversely affect national revenues in many countries that have relied on oil as a major economic mainstay. Those countries will face pressing requirements for economic reform, with the risk of sovereign financial defaults rising.
But for the majority of the world’s population, structural transformations related to the future outlook for oil and gas offer an opportunity. If the global economy becomes less oil intensive, the vulnerability to supply dislocations and price shocks that has plagued financial markets for decades will fade, with possible positive geopolitical implications. Moreover, many countries have reeled under the pressure of fuel subsidies to growing populations. According to the IMF, fuel subsidies cost $5.3 trillion in 2015 – around 6.5% of global GDP. Lower oil prices and a larger range of alternative fuel choices would lift this burden and lay the groundwork for shallower swings in prices for any one commodity.
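As a quick consistency check, the two IMF numbers together imply a figure for global GDP:

```python
# Consistency check of the IMF subsidy figures quoted above.
subsidies = 5.3e12    # US dollars, 2015
share_of_gdp = 0.065  # 6.5% of global GDP

implied_global_gdp = subsidies / share_of_gdp
print(f"Implied global GDP: ${implied_global_gdp / 1e12:.1f} trillion")  # ~$81.5 trillion
```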
Staying competitive in an industry under change
Eventually, players that remain competitive in the oil and gas industry will have to consider whether it would be more profitable for shareholders to develop low-carbon sources of energy as supplements to, and ultimately replacements for, oil and gas revenues, especially to maintain market share in the electricity sector.
This will require a change of mindset among oil and gas industry investors. To develop this flexible, supplemental leg alongside traditional oil and gas activities, the industry may find new opportunities by addressing the technological challenges in the different parts of the renewable energy space, including how to develop efficient combinations of large-scale energy storage and transportation solutions in a world with a great deal of variable renewable electricity.
Industry players can benefit from partnerships for flex-fuel technologies to ease infrastructure transitions and improve their resiliency to carbon pricing by achieving carbon efficiency for end-use energy through collaborations with vehicle manufacturers and mobility firms. Such responses will enhance the industry’s attractiveness with customers and investors, and most importantly, will promote a smoother long-term energy transition.
Before the dawn of the new millennium, the then President of the USA, Bill Clinton, was invited by Science magazine to write an editorial. In the one-page piece, Science in the 21st century, he wrote: “Imagine a new century, full of promise, molded by science, shaped by technology, powered by knowledge. We are now embarking on our most daring explorations, unraveling the mysteries of our inner world and charting new routes to the conquest of disease”. In 2000, the US government firmly kicked off its significant and influential National Nanotechnology Initiative (NNI) program after integrating resources from Federal agencies, including the National Science Foundation, Department of Defense, Department of Energy, Department of Health and Human Services (NIH), National Institute of Standards and Technology (NIST), National Aeronautics and Space Administration (NASA), Environmental Protection Agency (EPA), Department of Homeland Security, United States Department of Agriculture (USDA), and Department of Justice.
The NNI established four goals:
(1) to advance a world-class nanotechnology research and development program;
(2) to foster the transfer of new technologies into products for commercial and public benefit;
(3) to develop and sustain educational resources, a skilled workforce, and supporting infrastructure and tools to advance nanotechnology; and
(4) to support responsible development of nanotechnology. The NNI has significantly pushed nanotechnology research forward. In 2006, nanotechnology research began to exceed medical research in publication rate. That trend appears to be continuing as a result of the growth of commercial products using nanotechnology and, for example, a five-fold growth in the number of countries with nanomaterials research centers.
The nanoscience and nanotechnology subject category of the Journal Citation Report (JCR) published by Thomson Reuters has grown rapidly. Correspondingly, both the impact factors (published by Thomson Reuters) and the SCImago Journal Rank values (SJR is published by Elsevier’s Scopus and powered by Google’s PageRank algorithms) of journals in the nanotechnology subject category have risen quickly. The aggregate impact factor of nanoscience and nanotechnology has been climbing at a breathtaking rate compared with other subject categories, reaching the top 10 after 2011. The hype and hope that nanotechnology can achieve many previously unimaginable goals are especially high now, and many believe in forthcoming breakthroughs in the areas of nanomaterial-based diagnostic imaging, diagnostic tools combined with therapeutic modalities (i.e., theranostics), and nanoencapsulation and nano-carriers for biotechnology products.
Today, it is estimated that total NNI funding, through fiscal year 2014, is about $170 billion. Currently, more than 60 countries have launched national nanotechnology programs. Governments and industry have invested billions of dollars of research funding in this rapidly growing field. By 2015, approximately a quarter of a trillion dollars will have been invested in nanotechnology by the American government and private sector collectively. The continuous strategic investment in nanotechnology has made the United States a global leader in the field.
Ten years ago, when AAAS celebrated the 125th anniversary of the journal Science, it invited the President of the Chinese Academy of Sciences (CAS), Chunli Bai, to write an essay for the special section Global Voice of Science. His essay, Ascent of Nanoscience in China, described the state of nanotechnology and nanoscience in the country at the time and openly announced the government’s ambition to compete with other countries in the field. In 2006, the Chinese government announced its Medium and Long-term Plan for the Development of Science and Technology (2006–2020), which identified nanotechnology as “a very promising area that could give China a chance of great-leap-forward development”. The plan introduced the new Chinese Science & Technology policy guidelines, later implemented by the Ministry of Science and Technology (MOST), which operates Nanoscience Research as part of the State Key Science Research Plans. So far the Nanoscience Research program has invested about 1.0 billion RMB to support 28 nanotechnology projects. All of these endeavors led to the recent rapid rise of nanotechnology in China, as evidenced by its publications, industrial R&D and applications in the field.
The rapid development of nanotechnology-based science and technology in China attracted worldwide attention, including from Demos, one of the UK’s most influential think tanks. Led by Wilsdon and Keeley, Demos completed an 18-month study, interviewing leading scientists and policy makers from 71 Asian organizations, including two well-known Chinese nanotechnology academics, Dr. Chen Wang (the then Director of the National Center for Nanoscience and Technology) and Academician Zihe Rao (Director of the CAS Institute of Biophysics).
After completing the project, Wilsdon and Keeley published their findings in the book China: The next science superpower?. The authors wrote, “China in 2007 is the world’s largest technocracy: a country ruled by scientists and engineers who believe in the power of technology to deliver social and economic progress. Right now, the country is at an early stage in the most ambitious program of research investment since John F Kennedy embarked on the race to the moon. But statistics fail to capture the raw power of the changes that are under way, and the potential for Chinese science and innovation to head in new and surprising directions. Is China on track to become the world’s next science superpower?” Indeed, in recent years, China has emerged not only as a mass manufacturer, but also as one of the world’s leading nanotechnology nations. Many nanomaterial-based semiconductor products come from China, and the country dominates the most-cited academic articles in nanotechnology: eighteen of the top twenty scholars are of Chinese origin.
Changes in nanotechnology-related geopolitical landscape
With strong governmental and private sector support, nanotechnology and nanoscience R&D has developed rapidly in both the USA and China. As shown in Fig. 1A, from 2003 to 2013 the USA led in global nanotechnology publications, both in the number of papers and in their quality as determined by the number of citations and the H-index. China followed the USA in the field. For instance, total nanotechnology publications from the USA numbered 160,870, with 4,056,278 total citations, whereas China published 154,946 papers with 2,049,072 total citations. The quality of an article is usually judged by the number of citations it receives, although other measures, such as the number of downloads, are becoming more accepted and used.
Based on the total number of publications and related citations, we have used weighted statistics to calculate the top countries actively involved in nanotechnology research (see original publication for full details). The statistics show that the USA ranks number one, followed by China, Germany, Japan, Korea, France, UK, India, Italy, Spain, Taiwan (China), and others (Fig. 1A). EU countries are not far behind in the field. Further analysis indicates that the number of nanotechnology-related publications world-wide increased from 23,957 in 2003 to 107,371 in 2013 (a 4.48-fold increase). Of these, China contributed 3,592 papers in 2003 and 30,479 in 2013, an 8.49-fold increase – roughly twice the global growth rate.
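The ratios behind these comparisons are easy to reproduce; a minimal sketch using the publication and citation counts quoted above:

```python
# Citations per paper and growth rates from the counts quoted above.
usa_papers, usa_citations = 160_870, 4_056_278
chn_papers, chn_citations = 154_946, 2_049_072

print(f"USA citations/paper:   {usa_citations / usa_papers:.1f}")  # ~25.2
print(f"China citations/paper: {chn_citations / chn_papers:.1f}")  # ~13.2

world_2003, world_2013 = 23_957, 107_371
china_2003, china_2013 = 3_592, 30_479
print(f"World growth 2003-2013: {world_2013 / world_2003:.2f}-fold")  # 4.48-fold
print(f"China growth 2003-2013: {china_2013 / china_2003:.2f}-fold")  # 8.49-fold
```

The citations-per-paper gap, not the raw paper count, is what separates the two countries here.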
Bibliometric data from twenty leading nanotechnology journals show that the USA leads nanotechnology research by far (see original publication for full details). The USA contributed 22,067 papers to these twenty journals from 2003 to 2013, whereas China published only 3,421 papers in them. If the analysis is limited to journals with an impact factor >20, the USA originated 1,068 papers, followed by the EU countries Germany (221), the UK (193) and France (149), and finally Japan (121). China produced only 76 such papers, demonstrating that it has significant hurdles to overcome to join the world’s top countries in nanotechnology development.
Interestingly, China is not lagging behind the world leaders in all areas; for example, the gap between the USA and China is narrower in nanomaterial research. Publications from China in Advanced Materials, Advanced Functional Materials, and Angew. Chem. Int. Edit. are only slightly fewer than those from the USA. In fact, China leads in nanocomposites, chemical synthesis, and photocatalysis research (Fig. 1B and C). Chinese scientists published 1,712 and 1,580 papers in chemical synthesis and photocatalysis, respectively, from 2003 to 2013. These numbers exceed those from India, South Korea, Japan, the USA, France, Germany, the UK and Italy combined, suggesting that Chinese researchers have developed their own research focuses and strengths over the years. On the other hand, this may also indicate an over-investment of resources in the area.
A list of the top ten universities and institutes world-wide for nanotechnology and materials science (the US has five of the top ten; see original paper for full details), together with those within the USA or China that contribute the most nanotechnology publications, reveals that the authorship of China’s nanotechnology publications is concentrated in a small group of prestigious institutes and universities, reflecting the more centralized governance of Chinese science, while authorship in the USA is more widely distributed. Indeed, the CAS possesses more resources than its competitors within China.
The geopolitical differences between the USA and China are also reflected in nanotechnology-related patent applications and industrialization. The number of nanotechnology-related patent applications to the US Patent and Trademark Office (USPTO) increased from 405 in 2000 to 3,729 in 2008, while applications to the State Intellectual Property Office of China (SIPO) rose from 105 in 2000 to 5,030 in 2008.
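Those counts translate into very different average growth rates; a small sketch computing the compound annual growth implied over 2000–2008:

```python
# Compound annual growth implied by the patent-application counts above.
def cagr(start, end, years):
    """Compound annual growth rate over the given number of years."""
    return (end / start) ** (1 / years) - 1

uspto = cagr(405, 3729, 8)  # USPTO applications, 2000-2008
sipo = cagr(105, 5030, 8)   # SIPO applications, 2000-2008
print(f"USPTO: {uspto:.0%} per year")  # ~32% per year
print(f"SIPO:  {sipo:.0%} per year")   # ~62% per year
```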
According to the China Patent Abstract Database managed by the SIPO, there were 30,863 nanotechnology patent applications from 1985 to 2009, most of them published after 2003. The central government has already built several state-level nanotechnology R&D incubators and bases, including the National Center for Nanoscience and Technology of China in Beijing, the State Engineering Research Center for Nanotechnology and Applications in Shanghai, the National Institute of Nanotechnology and Engineering in Tianjin, the Zhejiang–California International NanoSystems Institute, and the International Innovation Incubator of Nanotechnology in Suzhou. In general, Beijing and Shanghai remain the two dominant nanotechnology centers, followed by Jiangsu and Zhejiang, reflecting the regional divergence of Chinese nanotechnology development.
The China–USA relationship is as compelling as it is complex. Approximately one out of ten professionals in Silicon Valley’s high-tech workforce is from mainland China. In today’s global economy, the two countries compete with each other in nanotechnology in a parallel and compatible manner. Historically, the United States has led the global high-tech and nanotechnology fields. However, the gap between the USA and China in nanotechnology has narrowed significantly in recent years, and American nanotechnology leadership faces challenges from all over the world.
With improved investment in research infrastructure and funding, China is sustaining the fastest economic growth in the world. Citizens’ participation in nanoscience and nanotechnology-related consensus conferences and stakeholder dialogues has become normal. This has not only had a significant impact on nanotechnology development in China, but is also democratically legitimate. Interest-based civil society interventions play an important role in the polycentric governance of nanoscience and nanotechnology, ensuring that the related policies and regulations are made prudently after open argument and discussion. It will be interesting to watch, debate and decide which type of governmental system, the centralized one-party system or the almost equally-divided two-party system, can more efficiently and effectively use public resources to produce nanotechnology products that better serve its own taxpayers, and the worldwide community as well.
*** Departing from GNT™’s ‘normal’ nanotechnology beat, in this series of three articles we turn to the subject “matter” (pardon the pun) of nuclear fission, fusion and hybrid fission-fusion. The interesting cross-pollination, of course, is the future of “clean, abundant, cheap energy” for our planet of 7 billion+ people now, heading toward 9 billion by 2042. Please share your thoughts and comments with us and our readership.
“Great Things from Small Things!” ~ Team GNT™
“Discover ~ Develop ~ Position ~ Commercialize ~ Exit”
August 3, 2015
China is going to build its first hybrid fusion-fission reactor by 2030, according to local media reports. The reactor is expected to recycle nuclear waste, making energy production more environmentally friendly.
The ambitious plan is in the works at the top secret Chinese Academy of Engineering Physics in Sichuan, where China develops its nuclear weapons, China Daily Mail reports. The plans were announced in a study published in the Science and Technology Daily, an official newspaper of the Ministry of Science and Technology.
The experimental research platform will be built by 2020 while the whole system could be launched by 2030, said Huang Hongwen, the deputy project manager, China Daily Mail reported Saturday.
Researchers believe that hybrid reactors will generate twice as much electricity as modern reactors. These reactors are also believed to be safer as they can be immediately stopped by cutting the external power supply.
Today’s reactors use fission alone, splitting heavy atoms apart, while future fusion-fission technology will merge two light atoms into one. The core of the new hybrid reactor will be a fusion reactor, which will be powered by a 60-trillion-ampere fission reactor.
The basic principle of the hybrid reactor is recycling uranium-238, the main component of nuclear waste, into new fuel. Such a reactor would be a breakthrough in environmentally friendly technology and, in particular, a solution to the nuclear waste problem for China, which lacks recycling facilities and has to store waste inside its nuclear power plants.
Hybrid fusion-fission reactors could also solve another vital problem for China – uranium shortages. According to the study, China can meet its uranium demands for only about a century, whereas fusion-fission technologies would supply it with fuel for several thousand years.
Some scientists have doubts over whether Chinese plans are realistic. “A viable fusion reactor is nowhere in sight, not to mention a hybrid,” an unnamed physicist from Tsinghua University told the SCMP.
“It’s like talking about hybrid cars before the internal combustion engine was even invented. We will be lucky to have the first fusion reactor in 50 years. I don’t think a hybrid can be built way before that”, he added.
China is not the only country that has tried to create a hybrid fusion-fission reactor. Similar projects are being developed in Russia, Japan, the EU and the USA. China, however, is the first country to set concrete dates.
Russia develops hybrid fusion-fission reactor, offers China role
October 15, 2014
Russia is developing a hybrid nuclear reactor that uses both nuclear fusion and fission, according to the head of a leading nuclear research facility. The project is open for international collaboration, particularly from Chinese scientists.
A hybrid nuclear reactor is a sort of stepping stone to building a true nuclear fusion reactor. It uses a fusion reaction as a source of neutrons to initiate a fission reaction in a ‘blanket’ of traditional nuclear fuel.
The approach has a number of potential benefits in terms of safety, non-proliferation and cost of generated energy, and Russia is developing such a hybrid reactor, according to Mikhail Kovalchuk, director of the Kurchatov Research Center.
“Today we have started the realization of a distinctively new project. We are trying to combine a schematically operational nuclear plant reactor with a ‘tokamak’ to create a hybrid reactor,” he told RIA Novosti, referring to a type of fusion reactor design.
“This project is open for our colleagues, the Chinese in the first place. It’s being discussed,” he added.
As a leading producer in the civilian nuclear energy industry, Russia would benefit from improving its plant designs. A hybrid fusion-fission reactor may be several times more efficient than a traditional fission reactor. And building one is “a goal for tomorrow” rather than the distant future, as is the case for a pure fusion reactor like the famous France-based International Thermonuclear Experimental Reactor (ITER) that Russia collaborates on, Kovalchuk said.
Harnessing nuclear fusion for energy generation has been elusive for years. So far no industrial-scale design has managed to produce more energy than it consumes to start the reaction, though the California-based National Ignition Facility (NIF) was reported to have achieved this goal at laboratory scale by bombarding a fuel pellet with 192 powerful lasers.
But nuclear fusion produces neutrons, and those can initiate fission in traditional nuclear fuel such as uranium or plutonium. In a hybrid reactor, the core fusion zone consumes energy to heat up an outer fissile blanket, which in turn generates energy.
A hybrid reactor plant would likely be even more costly than a regular nuclear power plant, given the complexities of the design. But it is inherently safer, since the reaction in the fissile blanket would be sub-critical – that is, it would not sustain itself. In an emergency it could be stopped in a matter of seconds simply by turning off the fusion core, as opposed to inserting control rods in a traditional reactor.
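The safety point can be made quantitative. In a source-driven subcritical assembly, neutrons from the external (fusion) source are amplified by roughly M ≈ 1/(1 − k_eff), where k_eff < 1 is the blanket's multiplication factor; remove the source and the chain dies out on its own. A minimal illustration (the k_eff values are illustrative, not taken from the article):

```python
# Source multiplication in a subcritical fission blanket: M ~ 1/(1 - k_eff).
# With k_eff < 1, the chain reaction cannot sustain itself without the
# external fusion neutron source; switching the source off shuts it down.
# The k_eff values below are illustrative, not taken from the article.
def source_multiplication(k_eff):
    if not 0.0 <= k_eff < 1.0:
        raise ValueError("blanket must be subcritical: 0 <= k_eff < 1")
    return 1.0 / (1.0 - k_eff)

for k in (0.90, 0.95, 0.98):
    print(f"k_eff = {k:.2f} -> source neutrons amplified ~{source_multiplication(k):.0f}x")
```

The closer k_eff sits to 1, the more "free" fission power the blanket adds, but it never crosses into a self-sustaining chain.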
Another benefit of the hybrid design is that it ‘burns down’ fissile materials, leaving few by-products. So it would produce little radioactive waste and could even treat spent nuclear fuel from regular reactors.
Rather than taking NIF’s pellet-and-lasers design for the fusion reactor, Russia wants to use a tokamak, a reactor that suspends superheated plasma with powerful magnetic fields, as the core of a hybrid reactor. ITER uses the design too.
A similar tokamak-based project of a hybrid fusion-fission nuclear reactor is being developed at the University of Texas at Austin, although researchers there eye nuclear waste disposal rather than electricity generation as the goal.
February 13, 2014
Nuclear fusion breakthrough: US scientists make crucial step to limitless power
A metallic case called a hohlraum holds the fuel capsule for NIF experiments (Image from llnl.gov)
A team of scientists in California announced Wednesday they are one step closer to developing the almost mythical pollution-free, controlled fusion-energy reaction, though the goal of full “ignition” is still far off.
Researchers at the federally-funded Lawrence Livermore National Laboratory revealed in a study released Wednesday in the peer-reviewed journal Nature that, for the first time, one of their experiments has yielded more energy out of fusion than was used in the fuel that created the reaction.
In a 10-story building the size of three football fields, the Livermore scientists “used 192 lasers to compress a pellet of fuel and generate a reaction in which more energy came out of the fuel core than went into it,” wrote the Washington Post. “Ignition” would mean more energy was produced than was used in the entire process.
“We’re closer than anyone’s gotten before,” said Omar Hurricane, a physicist at Livermore and lead author of the study. “It does show there’s promise.”
The process mimics, inside the laboratory’s hardware, what happens in the core of a star. Nuclear fusion, which powers the sun, creates energy when atomic nuclei fuse and form a larger atom.
“This isn’t like building a bridge,” Hurricane told USA Today in an interview. “This is an exceedingly hard problem. You’re basically trying to produce a star, on a small scale, here on Earth.”
A fusion reactor would operate on a common form of hydrogen found in sea water and create minimal nuclear waste while not being nearly as volatile as a traditional nuclear-fission reactor. Fission, used in nuclear power plants, works by splitting atoms.
Hurricane said he does not know how long it will take to reach that point, where fusion is a viable energy source.
“Picture yourself halfway up a mountain, but the mountain is covered in clouds,” he told reporters on a conference call Wednesday. “And then someone calls you on your satellite phone and asks you, ‘How long is it going to take you to climb to the top of the mountain?’ You just don’t know.”
The beams of the 192 lasers Livermore used can focus extreme amounts of energy, in billionth-of-a-second pulses, on any target. Hurricane said the energy produced by the process was about twice the amount that was in the fuel of the plastic-capsule target, though the yield equaled only around 1 percent of the energy delivered by the lasers to the capsule to ignite the process.
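Taken together, the two ratios in this paragraph pin down how much of the laser energy actually reached the fuel; a small sketch, assuming the figures are as quoted:

```python
# Rough energy bookkeeping from the ratios quoted above.
# Normalise the laser energy delivered to the capsule to 1 unit.
laser_energy = 1.0
fusion_yield = 0.01 * laser_energy  # yield was ~1% of the laser energy
fuel_energy = fusion_yield / 2.0    # yield was ~2x the energy in the fuel

print(f"Fraction of laser energy coupled into the fuel: {fuel_energy:.3f}")  # 0.005
print(f"Shortfall to 'ignition' (yield > laser energy): {laser_energy / fusion_yield:.0f}x")  # 100x
```

The ~100x shortfall is the same "factor of about 100 to go" that outside experts cite below.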
“When briefly compressed by the laser pulses, the isotopes fused, generating new particles and heating up the fuel further and generating still more nuclear reactions, particles and heat,” wrote the Washington Post, adding that the feedback mechanism is known as “alpha heating.”
Debbie Callahan, co-author of the study, said the capsule had to be compressed by a factor of 35 to start the reaction, “akin to compressing a basketball to the size of a pea,” according to USA Today.
While applauding the Livermore team’s findings, fusion experts added researchers have “a factor of about 100 to go.”
“These results are still a long way from ignition, but they represent a significant step forward in fusion research,” said Mark Herrmann of the Sandia National Laboratories’ Pulsed Power Sciences Center. “Achieving pressures this large, even for vanishingly short times, is no easy task.”
Livermore is the site of the multi-billion-dollar National Ignition Facility, funded by the National Nuclear Security Administration. Fusion experiments aren’t the only function of the lab; for example, it also studies the processes of nuclear weapon explosions.
Long-pursued by scientists dating back to Albert Einstein, fusion energy does not emit greenhouse gases or leave behind radioactive waste. Since the 1940s, researchers have employed magnetic fields to contain high-temperature hydrogen fuel. Laser use began in the 1970s.
“We have waited 60 years to get close to controlled fusion,” said Steve Cowley of the United Kingdom’s Culham Center for Fusion Energy. He added scientists are “now close” with both magnets and lasers. “We must keep at it.”
Stewart Prager – director of the Princeton Plasma Physics Laboratory, which studies fusion using magnets – told the Post he was optimistic about fusion energy’s future.
“In 30 years, we’ll have electricity on the grid produced by fusion energy – absolutely,” Prager said. “I think the open questions now are how complicated a system will it be, how expensive it will be, how economically attractive it will be.”
A new discovery will make it possible to create pixels just a few hundred nanometres across that could pave the way for extremely high-resolution and low-energy thin, flexible displays for applications such as ‘smart’ glasses, synthetic retinas, and foldable screens.
A team led by Oxford University scientists explored the link between the electrical and optical properties of phase change materials (materials that can change from an amorphous to a crystalline state). They found that by sandwiching a seven nanometre thick layer of a phase change material (GST) between two layers of a transparent electrode they could use a tiny current to ‘draw’ images within the sandwich ‘stack’.
Initially still images were created using an atomic force microscope but the team went on to demonstrate that such tiny ‘stacks’ can be turned into prototype pixel-like devices. These ‘nano-pixels’ — just 300 by 300 nanometres in size — can be electrically switched ‘on and off’ at will, creating the coloured dots that would form the building blocks of an extremely high-resolution display technology.
A report of the research is published in this week’s Nature.
‘We didn’t set out to invent a new kind of display,’ said Professor Harish Bhaskaran of Oxford University’s Department of Materials, who led the research. ‘We were exploring the relationship between the electrical and optical properties of phase change materials and then had the idea of creating this GST ‘sandwich’ made up of layers just a few nanometres thick. We found that not only were we able to create images in the stack but, to our surprise, thinner layers of GST actually gave us better contrast. We also discovered that altering the size of the bottom electrode layer enabled us to change the colour of the image.’
Using the technique, the Oxford researchers can draw images 70 micrometres across – each smaller than the width of a human hair – built from ‘nano-pixels’ just a few hundred nanometres in size, which could pave the way for extremely high-resolution, low-energy thin, flexible displays for applications such as ‘smart’ glasses, synthetic retinas, and foldable screens.
Whilst the work is still in its early stages, the Oxford team, recognising its potential, has filed a patent on the discovery with the help of Isis Innovation, Oxford University’s technology commercialisation company. Isis is now discussing the displays with companies interested in assessing the technology, and with investors.
The layers of the GST sandwich are created using a sputtering technique where a target is bombarded with high energy particles so that atoms from the target are deposited onto another material as a thin film.
‘Because the layers that make up our devices can be deposited as thin films they can be incorporated into very thin flexible materials — we have already demonstrated that the technique works on flexible Mylar sheets around 200 nanometres thick,’ said Professor Bhaskaran. ‘This makes them potentially useful for ‘smart’ glasses, foldable screens, windshield displays, and even synthetic retinas that mimic the abilities of photoreceptor cells in the human eye.’
Peiman Hosseini of Oxford University’s Department of Materials, first author of the paper, said: ‘Our models are so good at predicting the experiment that we can tune our prototype ‘pixels’ to create any colour we want — including the primary colours needed for a display. One of the advantages of our design is that, unlike most conventional LCD screens, there would be no need to constantly refresh all pixels, you would only have to refresh those pixels that actually change (static pixels remain as they were). This means that any display based on this technology would have extremely low energy consumption.’
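The partial-refresh scheme Hosseini describes can be sketched in a few lines of code. This is a hypothetical illustration of the idea (the frame representation and function are invented for this example, not taken from the actual device):

```python
# Hypothetical sketch of partial refresh: only pixels that differ between
# consecutive frames are rewritten, so a mostly static image costs almost
# no refresh energy.
def pixels_to_refresh(previous_frame, next_frame):
    """Return (row, col) coordinates of pixels whose colour changed."""
    changed = []
    for r, (old_row, new_row) in enumerate(zip(previous_frame, next_frame)):
        for c, (old, new) in enumerate(zip(old_row, new_row)):
            if old != new:
                changed.append((r, c))
    return changed

prev = [["red", "red"], ["blue", "blue"]]
nxt  = [["red", "red"], ["blue", "green"]]
print(pixels_to_refresh(prev, nxt))  # only one pixel needs rewriting
```

On a largely static page, the list of changed pixels is short, which is why an e-reader-style display built this way could draw so little power.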
The research suggests that flexible paper-thin displays based on the technology could have the capacity to switch between a power-saving 'colour e-reader mode' and a backlit display capable of showing video. Such displays could be created using cheap materials and, because they would be solid-state, promise to be reliable and easy to manufacture. The tiny 'nano-pixels' make the technology ideal for applications such as smart glasses, where an image would be projected at a larger size: even enlarged, the displays would offer very high resolution.
Professor David Wright of the Department of Engineering at the University of Exeter, co-author of the paper, said: ‘Along with many other researchers around the world we have been looking into the use of these GST materials for memory applications for many years, but no one before thought of combining their electrical and optical functionality to provide entirely new kinds of non-volatile, high-resolution, electronic colour displays — so our work is a real breakthrough.’
The phase change material used was the alloy Ge2Sb2Te5 (Germanium-Antimony-Tellurium or GST) sandwiched between electrode layers made of indium tin oxide (ITO).
June 27 — The Food and Drug Administration on June 24 announced new guidance to provide greater regulatory clarity for industry on the use of nanotechnology in FDA-regulated products, including drugs, devices, cosmetics and food.
The agency issued a final guidance, “Considering Whether an FDA-Regulated Product Involves the Application of Nanotechnology,” which makes final a 2011 draft. In a press release, the agency said the final document “outlines overarching considerations for all FDA-regulated products, identifying points to consider when determining whether a product involves the use of nanotechnology. It is intended to help industry and others identify when they should consider potential implications for regulatory status, safety, effectiveness or public health impact that may arise with the application of nanotechnology in FDA-regulated products.”
‘Different Modes of Action.’
In this final guidance, the agency said that nanotechnology “can be used in a broad array of FDA-regulated products, including medical products (e.g., to increase bioavailability of a drug), foods (e.g., to improve food packaging) and cosmetics (e.g., to affect the look and feel of cosmetics).”
A footnote said that the application of nanotechnology may also affect the classification of a product: “For example, nanomaterials used in medical products may function through different modes of action than larger-scale materials with the same chemical composition, which may affect the classification of the product, for example as a drug or device.” The agency also published on June 27 a Federal Register notice about the guidance (79 Fed. Reg. 36,534).
On June 24 the agency also issued the final versions of two other draft guidances, issued in April 2012, addressing the use of nanotechnology in foods and cosmetics. In addition, the FDA issued a draft guidance for comment on the use of nanomaterials in food for animals; comments on the draft are due Sept. 10.
According to the FDA’s website, nanotechnology allows scientists to create, explore and manipulate materials measured in nanometers (billionths of a meter). “Such materials can have chemical, physical, and biological properties that differ from those of their larger counterparts,” the agency said.
Also on the agency website, the FDA said it “does not make a categorical judgment that nanotechnology is inherently safe or harmful. We intend our regulatory approach to be adaptive and flexible and to take into consideration the specific characteristics and the effects of nanomaterials in the particular biological context of each product and its intended use.”
According to a new market research report titled “Flexible Electronics Market by Components (Display, Battery, Sensor, Photovoltaic, Memory), Circuit Structure (Single-Sided, Double-Sided, Rigid), Application (Consumer Electronics, Healthcare, Automotive, Energy and Power), & by Geography – Analysis & Forecast to 2014 – 2020“, published by MarketsandMarkets, the Flexible Electronics Market is expected to reach $13.23 Billion by 2020.
The development of flexible electronics has spanned the past few years, ranging from flexible solar cell arrays to flexible OLED electronics on plastic substrates. The rapid development of this field has been spurred by consistent technological progress in large-area electronics, which in turn has advanced areas such as flat-panel electronics, medical image sensors, and electronic paper. Many factors contribute to the rise of flexible electronics: compared with rigid-substrate electronics, flexible devices are more rugged, lightweight, portable, and cheaper to produce. A basic electronic structure is composed of a substrate, backplane electronics, a front plane, and encapsulation. To make the structure flexible, all of these components must bend to some degree without losing their function. Two basic approaches have been adopted to make flexible electronics: transferring and bonding completed circuits to a flexible substrate, and fabricating the circuits directly on the flexible substrate.
The report segments the Flexible Electronics Market by component type, circuit structure, application and geography, and contains revenue forecasts and an analysis of market trends. The geographical analysis covers the Americas, Europe, and APAC in depth, with the Middle East and Africa classified under the RoW region; each geography is further split by the major countries in this market. The sections and sub-segments of the report cover the drivers, restraints, opportunities, and current market trends, as well as the technologies expected to revolutionize the flexible electronics domain.
The Global Flexible Electronics Market is expected to reach $13.23 Billion by 2020, at an estimated CAGR of 21.73%. The emerging consumer electronics market is expected to grow at a CAGR of 44.30%. North America is the biggest flexible electronics market, followed by Europe and APAC.
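The quoted figures can be sanity-checked with the standard compound-growth formula. Note that the 2014 base value below is inferred from the report's CAGR and 2020 target, not stated in the source:

```python
# Back-of-envelope check of the report's CAGR figure. The implied 2014
# market size is derived here; only the CAGR and 2020 value come from
# the report itself.
def future_value(present, cagr, years):
    """Compound a present value at a constant annual growth rate."""
    return present * (1 + cagr) ** years

cagr = 0.2173           # 21.73% CAGR quoted by the report
target_2020 = 13.23     # $ billion
years = 6               # 2014 -> 2020

implied_2014 = target_2020 / (1 + cagr) ** years
print(f"Implied 2014 market size: ${implied_2014:.2f} billion")
print(f"Check: ${future_value(implied_2014, cagr, years):.2f} billion by 2020")
```

This puts the implied 2014 base at roughly $4 billion, consistent with a market more than tripling over the forecast period.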
Dielectric Material Market by Technology (OLED, LED, TFT-LCD, LED-LCD, Plasma, LCOS, DLP), Application (Conventional, 3D, Transparent, Flexible), Material (Metal Oxide, a-Silicon, LTPS, PET, PEN, Photonic Crystals) & by Geography – Global Forecast to 2013 – 2020
MarketsandMarkets is a global market research and consulting company based in the U.S. We publish strategically analyzed market research reports and serve as a business intelligence partner to Fortune 500 companies across the world.
MarketsandMarkets also provides multi-client reports, company profiles, databases, and custom research services. M&M covers thirteen industry verticals, including advanced materials, automotives and transportation, banking and financial services, biotechnology, chemicals, consumer goods, energy and power, food and beverages, industrial automation, medical devices, pharmaceuticals, semiconductor and electronics, and telecommunications and IT.
New technology yields potential for faster-charging, longer-lasting batteries to power future electronic devices. By Nicole Basaraba on July 3, 2014
(Edmonton) A research team from the University of Alberta has used carbon nanomaterials to develop next-generation batteries capable of charging faster and lasting longer than today’s standard lithium-ion batteries. “What we’ve done is develop a new electrochemistry technology that can provide high energy density and high power density for the next generation,” said lead researcher Xinwei Cui, who completed his PhD in materials engineering at the U of A in 2010 and is now chief technology officer at AdvEn Solutions, a technology development company that is working on the battery so it can be commercially manufactured for use in electronic devices.
The research team developed the new technology for energy storage using a process called induced fluorination. “We tried lots of different materials. Normally carbon is used as the anode in lithium-ion batteries, but we used carbon as the cathode, and this is used to build a battery with induced fluorination,” Cui explained. The advantages of using carbon are that it is cost-effective and safe to use, and the energy output is five to eight times higher than lithium-ion batteries currently on the market.
Xinwei Cui holds one of the nano-engineered carbon components of the new battery technology. (Photo: David Dodge)
The new battery also performs better than two other future technologies: lithium-sulfur batteries, currently in the prototype stage, and lithium-air batteries, now under development. For example, the induced-fluorination technology could be used to produce cellphone batteries that would charge faster and last longer. “Nobody knew that carbon could be used as a cathode with such a high performance. That is what’s unique with our technology and what is detailed in our paper,” Cui said.
The team published their findings in the journal Nature Scientific Reports. The paper was written by Cui; Jian Chen, a researcher in the National Institute for Nanotechnology; Tianfei Wang, a PhD candidate in materials engineering; and Weixing Chen, professor of chemical and materials engineering at the U of A.
“It wasn’t a quick process. Once we found carbon is different, we persisted for three years until we got results,” Cui said. AdvEn Solutions hopes to have a prototype by the end of 2014 and aims to develop three versions of the battery to serve different goals.
One battery would have a high power output and a long life cycle, the second would have high energy for quick charging, and the third a super-high energy storage. “We have a long way to go, but we’re on the right track. It’s exciting work and we want everyone to know about it and that it’s very young but promising,” said Cui. AdvEn is a growing company housed within the Department of Chemical and Materials Engineering at the U of A. It aims to expand by taking on new researchers and gaining more funding. The company recently secured a partnership with the U.S.-based aerospace company Lockheed Martin to develop an advanced anode for AdvEn’s high-performance carbon cathode.
Swedish pharmaceutical company Oasmia Pharmaceutical AB (publ) (Oasmia) announced that a research agreement has been signed between Oasmia and a global pharmaceutical company. The agreement relates to the use of Oasmia’s patented nanoparticle formulation technology XR-17. The full terms of the agreement cannot be disclosed at this point in time.
Under the terms of the agreement, Oasmia will initially perform tests to investigate the possibility of making a solid formulation with the partner’s specified compound in combination with XR-17. XR-17 has been proven in several pre-clinical and clinical studies to make single or multiple APIs (Active Pharmaceutical Ingredients) water-soluble. If the results of these tests are considered promising according to the terms of the agreement, further development work will continue in collaboration with the partner at their research facilities.
The XR-17 technology is the basis for Oasmia’s own project portfolio within human and veterinary oncology which today consists of one conditionally approved product, (Paccal Vet®-CA1), one in registration (Paclical), and a further four products in various phases of clinical studies and pre-clinical development.
Solubility is one of the most important parameters for achieving the desired concentration of a drug in systemic circulation and the anticipated pharmacological response. Low aqueous solubility is the major problem encountered in formulation development of new chemical entities, as well as in generic development. There are numerous approaches available to enhance the solubility of poorly water-soluble drugs. Micellar solubilisation is a well-established method to make a drug water-soluble, but the compounds used so far to create micelles have induced undesirable side effects, e.g. hypersensitivity reactions that require prophylactic treatment.
“The agreement we have signed today is indeed a validation of the XR-17 technology, which has always been, and continues to be, the heart and soul of Oasmia’s research and development. Since the inception of this company 15 years ago, we have steadily built a portfolio of potential oncology treatments using the XR-17 technology, with the aim to produce better, safer drugs. With this agreement, we have the opportunity not only to develop additional products within our portfolio but also to add a third leg to our business model apart from our existing human and veterinarian oncology products”, commented Julian Aleksov, CEO of Oasmia.
“We understand that other pharmaceutical companies are looking for excipients that can potentially be used to make water-insoluble compounds, soluble. We are very proud to be recognized as a provider of a technology that has the potential to play an important part in future drug development”, he continued.
About Oasmia Pharmaceutical AB
Oasmia Pharmaceutical AB develops new generations of drugs in the field of human and veterinary oncology. The company’s product development aims to create and manufacture novel nanoparticle formulations and drug-delivery systems based on well-established cytostatics which, in comparison with current alternatives, show improved properties, reduced side-effects, and expanded applications. The company’s product development is based on its proprietary in-house research and company patents.
XR-17 is a patented nanoparticle formulation technology which makes single or multiple APIs water-soluble. XR-17, which consists of Vitamin A derivatives, forms structures called micelles with the encapsulated active substance. A micelle containing a water-insoluble substance consists of the active ingredient surrounded by XR-17, with the hydrophobic, non-polar chain pointing towards the active ingredient and the hydrophilic, polar head pointing outwards. If the active compound is water-soluble, the hydrophilic, polar head points inwards. The micelles are extremely small, 20 to 60 nm depending on the API, and qualify as nanoparticles. The toxicity of XR-17 is low: animal data indicate that a dose eight times the expected clinical formulation dose is non-toxic.
Pick up a pencil. Make a mark on a piece of paper. Congratulations: you are doing cutting-edge condensed matter physics. You might even be making the first mark on the road to quantum computers, according to new Perimeter research.
Graphene had an unlikely start: it began with researchers messing around with pencil marks on paper. Pencil “lead” is actually made of graphite, which is a soft crystal lattice made of nothing but carbon atoms. When pencils deposit that graphite on paper, the lattice is laid down in thin sheets. By pulling that lattice apart into thinner sheets – originally using Scotch tape – researchers discovered that they could make flakes of crystal just one atom thick.
The name for this atom-scale chicken wire is graphene. Those folks with the Scotch tape, Andre Geim and Konstantin Novoselov, won the 2010 Nobel Prize for discovering it. “As a material, it is completely new – not only the thinnest ever but also the strongest,” wrote the Nobel committee. “As a conductor of electricity, it performs as well as copper. As a conductor of heat, it outperforms all other known materials. It is almost completely transparent, yet so dense that not even helium, the smallest gas atom, can pass through it.”
Developing a theoretical model of graphene
Graphene is not just a practical wonder – it’s also a wonderland for theorists. Confined to the two-dimensional surface of the graphene, the electrons behave strangely. All kinds of new phenomena can be seen, and new ideas can be tested. Testing new ideas in graphene is exactly what Perimeter researchers Zlatko Papić and Dmitry (Dima) Abanin set out to do.
Scientific illustration of graphene. Credit: Zlatko Papić
“Dima and I started working on graphene a very long time ago,” says Papić. “We first met in 2009 at a conference in Sweden. I was a grad student and Dima was in the first year of his postdoc, I think.”
The two young scientists got to talking about what new physics they might be able to observe in the strange new material when it is exposed to a strong magnetic field.
“We decided we wanted to model the material,” says Papić. They’ve been working on their theoretical model of graphene, on and off, ever since. The two are now both at Perimeter Institute, where Papić is a postdoctoral researcher and Abanin is a faculty member. They are both cross-appointed with the Institute for Quantum Computing (IQC) at the University of Waterloo.
In January 2014, they published a paper in Physical Review Letters (PRL) presenting new ideas about how to induce a strange but interesting state in graphene – one where it appears as if particles inside it have a fraction of an electron’s charge.
It’s called the fractional quantum Hall effect (FQHE), and it’s head-turning. Like the speed of light or Planck’s constant, the charge of the electron is a fixed point in the disorienting quantum universe.
Every system in the universe carries whole multiples of a single electron’s charge. When the FQHE was first discovered in the 1980s, condensed matter physicists quickly worked out that the fractionally charged “particles” inside their semiconductors were actually quasiparticles – that is, emergent collective behaviours of the system that imitate particles.
Graphene is an ideal material in which to study the FQHE. “Because it’s just one atom thick, you have direct access to the surface,” says Papić. “In semiconductors, where FQHE was first observed, the gas of electrons that create this effect are buried deep inside the material. They’re hard to access and manipulate. But with graphene you can imagine manipulating these states much more easily.”
In the January paper, Abanin and Papić reported novel types of FQHE states that could arise in bilayer graphene – that is, in two sheets of graphene laid one on top of another – when it is placed in a strong perpendicular magnetic field. In an earlier work from 2012, they argued that applying an electric field across the surface of bilayer graphene could offer a unique experimental knob to induce transitions between FQHE states. Combining the two effects, they argued, would be an ideal way to look at special FQHE states and the transitions between them.
Two experimental groups – one in Geneva, involving Abanin, and one at Columbia, involving both Abanin and Papić – have since put the electric field + magnetic field method to good use. The paper by the Columbia group appears in the July 4 issue of Science. A third group, led by Amir Yacoby of Harvard, is doing closely related work.
“We often work hand in hand with experimentalists,” says Papić. “One of the reasons I like condensed matter is that often even the most sophisticated, cutting-edge theory stands a good chance of being quickly checked with experiment.”
Inside both the magnetic and electric field, the electrical resistance of the graphene demonstrates the strange behaviour characteristic of the FQHE. Instead of resistance that varies in a smooth curve with voltage, resistance jumps suddenly from one level to another, and then plateaus – a kind of staircase of resistance. Each stair step is a different state of matter, defined by the complex quantum tangle of charges, spins, and other properties inside the graphene.
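The "stair steps" in this picture sit at well-defined resistance values: on each plateau the Hall resistance locks to R = h / (ν e²), where ν is the filling factor and takes fractional values in the FQHE. A minimal numerical illustration, using the standard SI values of Planck's constant and the elementary charge (the specific fractions below are generic FQHE examples, not values reported by this group):

```python
# Plateau values of the quantum Hall "staircase": R = h / (nu * e**2).
# nu = 1 gives the von Klitzing constant, about 25,813 ohms; fractional
# nu (e.g. 1/3, or the even-denominator 5/2) gives the FQHE plateaus.
from fractions import Fraction

H = 6.62607015e-34       # Planck constant, J*s
E = 1.602176634e-19      # elementary charge, C

def hall_resistance(nu):
    """Hall resistance plateau (ohms) at filling factor nu."""
    return H / (float(nu) * E**2)

for nu in [Fraction(1, 1), Fraction(1, 3), Fraction(5, 2)]:
    print(f"nu = {nu}: R = {hall_resistance(nu):,.0f} ohms")
```

Each plateau's height thus directly encodes the fraction ν, which is how experimenters read off which FQHE state the sample is in.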
“The number of states is quite rich,” says Papić. “We’re very interested in bilayer graphene because of the number of states we are detecting and because we have these mechanisms – like tuning the electric field – to study how these states are interrelated, and what happens when the material changes from one state to another.”
For the moment, researchers are particularly interested in the stair steps whose “height” is described by a fraction with an even denominator. That’s because the quasiparticles in that state are expected to have an unusual property.
There are two kinds of particles in our three-dimensional world: fermions (such as electrons), where two identical particles can’t occupy one state, and bosons (such as photons), where two identical particles actually want to occupy one state. In three dimensions, fermions are fermions and bosons are bosons, and never the twain shall meet.
But a sheet of graphene doesn’t have three dimensions – it has two. It’s effectively a tiny two-dimensional universe, and in that universe, new phenomena can occur. For one thing, fermions and bosons can meet halfway – becoming anyons, which can be anywhere in between fermions and bosons. The quasiparticles in these special stair-step states are expected to be anyons.
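The boson/fermion/anyon distinction can be stated compactly: swapping two identical particles multiplies the wavefunction by a phase factor exp(iθ), where θ = 0 for bosons, θ = π for fermions, and, in two dimensions, θ can take any value in between. A toy calculation of that phase factor:

```python
# Exchange statistics: swapping two identical particles multiplies the
# wavefunction by exp(i*theta). Bosons: theta = 0 (factor +1).
# Fermions: theta = pi (factor -1). In 2D, anyons allow any theta between.
import cmath
import math

def exchange_phase(theta):
    """Phase factor picked up when two identical particles are swapped."""
    return cmath.exp(1j * theta)

print(exchange_phase(0))            # boson: +1
print(exchange_phase(math.pi))      # fermion: -1 (up to rounding)
print(exchange_phase(math.pi / 2))  # an anyon: +i (up to rounding)
```

Non-Abelian anyons go further still: their exchanges are described by matrices rather than a single phase, which is precisely what makes them candidates for storing quantum information.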
In particular, the researchers are hoping these quasiparticles will be non-Abelian anyons, as their theory indicates they should be. That would be exciting because non-Abelian anyons can be used in the making of qubits.
Qubits are to quantum computers what bits are to ordinary computers: both a basic unit of information and the basic piece of equipment that stores that information. Because of their quantum complexity, qubits are more powerful than ordinary bits and their power grows exponentially as more of them are added. A quantum computer of only a hundred qubits can tackle certain problems beyond the reach of even the best non-quantum supercomputers. Or, it could, if someone could find a way to build stable qubits.
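The exponential growth mentioned above is easy to quantify: the joint state of n qubits is described by 2ⁿ complex amplitudes, so every added qubit doubles the state space a classical computer would have to track. A quick illustration:

```python
# Why qubit power "grows exponentially": n qubits require 2**n complex
# amplitudes to describe, so each added qubit doubles the state space.
def amplitudes(n_qubits):
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

for n in [1, 10, 100]:
    print(f"{n} qubits -> {amplitudes(n):.3e} amplitudes")
```

At 100 qubits the count is about 10³⁰, far beyond what any classical supercomputer can store, which is the sense in which a hundred-qubit machine could tackle otherwise intractable problems.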
The drive to make qubits is part of the reason why graphene is a hot research area in general, and why even-denominator FQHE states – with their special anyons – are sought after in particular. “A state with some number of these anyons can be used to represent a qubit,” says Papić. “Our theory says they should be there and the experiments seem to bear that out – certainly the even-denominator FQHE states seem to be there, at least according to the Geneva experiments.”
That’s still a step away from experimental proof that those even-denominator stair-step states actually contain non-Abelian anyons. More work remains, but Papić is optimistic: “It might be easier to prove in graphene than it would be in semiconductors. Everything is happening right at the surface.”
It’s still early, but it looks as if bilayer graphene may be the magic material that allows this kind of qubit to be built. That would be a major mark on the unlikely line between pencil lead and quantum computers.