MIT: Fish-eye lens may entangle pairs of atoms – a promising vehicle for the building blocks of quantum computers


Image: James Maxwell was the first to realize that light can travel in perfect circles within the fish-eye lens because the density of the lens changes, with material being thickest at the middle and gradually thinning out toward the edges.

Nearly 150 years ago, the physicist James Maxwell proposed that a circular lens that is thickest at its center, and that gradually thins out at its edges, should exhibit some fascinating optical behavior. Namely, when light is shone through such a lens, it should travel around in perfect circles, creating highly unusual, curved paths of light.

He also noted that such a lens, at least broadly speaking, resembles the eye of a fish. The lens configuration he devised has since been known in physics as Maxwell’s fish-eye lens — a theoretical construct that is only slightly similar to commercially available fish-eye lenses for cameras and telescopes.

Now scientists at MIT and Harvard University have for the first time studied this unique, theoretical lens from a quantum mechanical perspective, to see how individual atoms and photons may behave within the lens. In a study published Wednesday in Physical Review A, they report that the unique configuration of the fish-eye lens enables it to guide single photons through the lens, in such a way as to entangle pairs of atoms, even over relatively long distances.

Entanglement is a quantum phenomenon in which the properties of one particle are linked, or correlated, with those of another particle, even over vast distances. The team’s findings suggest that fish-eye lenses may be a promising vehicle for entangling atoms and other quantum bits, which are the necessary building blocks for designing quantum computers.

“We found that the fish-eye lens has something that no other two-dimensional device has, which is maintaining this entangling ability over large distances, not just for two atoms, but for multiple pairs of distant atoms,” says first author Janos Perczel, a graduate student in MIT’s Department of Physics. “Entanglement and connecting these various quantum bits can be really the name of the game in making a push forward and trying to find applications of quantum mechanics.”

The team also found that the fish-eye lens, contrary to recent claims, does not produce a perfect image. Scientists have thought that Maxwell’s fish-eye may be a candidate for a “perfect lens” — a lens that can go beyond the diffraction limit, meaning that it can focus light to a point that is smaller than the light’s own wavelength. This perfect imaging, scientists predict, should produce an image with essentially unlimited resolution and extreme clarity.

However, by modeling the behavior of photons through a simulated fish-eye lens, at the quantum level, Perczel and his colleagues concluded that it cannot produce a perfect image, as originally predicted.

“This tells you that there are these limits in physics that are really difficult to break,” Perczel says. “Even in this system, which seemed to be a perfect candidate, this limit seems to be obeyed. Perhaps perfect imaging may still be possible with the fish eye in some other, more complicated way, but not as originally proposed.”

Perczel’s co-authors on the paper are Peter Komar and Mikhail Lukin from Harvard University.

A circular path

Maxwell was the first to realize that light is able to travel in perfect circles within the fish-eye lens because the density of the lens changes, with material being thickest at the middle and gradually thinning out toward the edges. The denser a material, the slower light moves through it. This explains the optical effect when a straw is placed in a glass half full of water. Because the water is so much denser than the air above it, light suddenly moves more slowly, bending as it travels through water and creating an image that looks as if the straw is disjointed.

In the theoretical fish-eye lens, the differences in density are much more gradual and are distributed in a circular pattern, in such a way that the lens curves rather than bends light, guiding it in perfect circles within the lens.

In 2009, Ulf Leonhardt, a physicist at the Weizmann Institute of Science in Israel, was studying the optical properties of Maxwell’s fish-eye lens and observed that, when photons are released through the lens from a single point source, the light travels in perfect circles through the lens and collects at a single point at the opposite end, with very little loss of light.

“None of the light rays wander off in unwanted directions,” Perczel says. “Everything follows a perfect trajectory, and all the light will meet at the same time at the same spot.”

Leonhardt, in reporting his results, briefly raised the question of whether the fish-eye lens’ single-point focus might be useful for precisely entangling pairs of atoms at opposite ends of the lens.

“Mikhail [Lukin] asked him whether he had worked out the answer, and he said he hadn’t,” Perczel says. “That’s how we started this project and started digging deeper into how well this entangling operation works within the fish-eye lens.”

Playing photon ping-pong

To investigate the quantum potential of the fish-eye lens, the researchers modeled the lens as the simplest possible system, consisting of two atoms, one at either end of a two-dimensional fish-eye lens, and a single photon, aimed at the first atom. Using established equations of quantum mechanics, the team tracked the photon at any given point in time as it traveled through the lens, and calculated the state of both atoms and their energy levels through time.

They found that when a single photon is shone through the lens, it is temporarily absorbed by an atom at one end of the lens. It then circles through the lens, to the second atom at the precise opposite end of the lens. This second atom momentarily absorbs the photon before sending it back through the lens, where the light collects precisely back on the first atom.

“The photon is bounced back and forth, and the atoms are basically playing ping pong,” Perczel says. “Initially only one of the atoms has the photon, and then the other one. But between these two extremes, there’s a point where both of them kind of have it. It’s this mind-blowing quantum mechanics idea of entanglement, where the photon is completely shared equally between the two atoms.”
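To make the ping-pong picture concrete, here is a minimal single-excitation toy model in Python: two atoms coupled to one lossless lens mode, with an assumed coupling strength g. It is only an illustrative sketch, not the full two-dimensional fish-eye calculation reported in Physical Review A, but it reproduces the back-and-forth exchange and the halfway point at which the two atoms share the excitation.

```python
# Toy single-excitation model of the photon "ping-pong" between two atoms
# coupled through one lossless lens mode. This is an illustrative sketch,
# NOT the authors' actual model, which treats the full two-dimensional
# fish-eye mode structure.
# Basis states: |A> = atom A excited, |P> = photon in the lens mode,
#               |B> = atom B excited.  hbar = 1; the coupling g is assumed.
import numpy as np
from scipy.linalg import expm

g = 1.0                          # atom-mode coupling strength (arbitrary units)
H = g * np.array([[0, 1, 0],     # Hamiltonian in the {|A>, |P>, |B>} basis
                  [1, 0, 1],
                  [0, 1, 0]], dtype=complex)

psi0 = np.array([1, 0, 0], dtype=complex)        # excitation starts on atom A
times = np.linspace(0, 2 * np.pi / (np.sqrt(2) * g), 9)

for t in times:
    psi = expm(-1j * H * t) @ psi0               # Schrodinger evolution
    pA, pP, pB = np.abs(psi) ** 2
    print(f"t = {t:5.2f}   P(atom A) = {pA:.2f}   P(photon) = {pP:.2f}   P(atom B) = {pB:.2f}")

# The printout shows the excitation transferring A -> B and back; halfway
# through the exchange the two atoms hold equal excitation probability,
# i.e. the excitation is shared between them and the pair is entangled.
```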

Perczel says that the photon is able to entangle the atoms because of the unique geometry of the fish-eye lens. The lens’ density is distributed in such a way that it guides light in a perfectly circular pattern and can cause even a single photon to bounce back and forth between two precise points along a circular path.

“If the photon just flew away in all directions, there wouldn’t be any entanglement,” Perczel says. “But the fish-eye gives this total control over the light rays, so you have an entangled system over long distances, which is a precious quantum system that you can use.”

As they increased the size of the fish-eye lens in their model, the atoms remained entangled, even over relatively large distances of tens of microns. They also observed that, even if some light escaped the lens, the atoms were able to share enough of a photon’s energy to remain entangled. Finally, as they placed more pairs of atoms in the lens, opposite to one another, along with corresponding photons, these atoms also became simultaneously entangled.

“You can use the fish eye to entangle multiple pairs of atoms at a time, which is what makes it useful and promising,” Perczel says.

Fishy secrets

In modeling the behavior of photons and atoms in the fish-eye lens, the researchers also found that, as light collected on the opposite end of the lens, it did so within an area that was larger than the wavelength of the photon’s light, meaning that the lens likely cannot produce a perfect image.

“We can precisely ask the question during this photon exchange, what’s the size of the spot to which the photon gets recollected? And we found that it’s comparable to the wavelength of the photon, and not smaller,” Perczel says. “Perfect imaging would imply it would focus on an infinitely sharp spot. However, that is not what our quantum mechanical calculations showed us.”

Going forward, the team hopes to work with experimentalists to test the quantum behaviors they observed in their modeling. In fact, in their paper, the team also briefly proposes a way to design a fish-eye lens for quantum entanglement experiments.

“The fish-eye lens still has its secrets, and remarkable physics buried in it,” Perczel says. “But now it’s making an appearance in quantum technologies where it turns out this lens could be really useful for entangling distant quantum bits, which is the basic building block for building any useful quantum computer or quantum information processing device.”

New battery technology that could run for more than a decade may revolutionize renewable energy – Harvard University



The race is on to build the next-generation battery that could help the world switch over to clean energy. But as Bill Gates explained in his blog last year: “storing energy turns out to be surprisingly hard and expensive”.

 

Now Harvard researchers have developed a cheap, non-toxic battery that lasts more than 10 years, which they say could be a game changer for renewable energy storage.

Solar installers from Baker Electric place solar panels on the roof of a residential home in Scripps Ranch, San Diego, California, October 14, 2016.

Image: REUTERS/Mike Blake

The researchers from the John A. Paulson School of Engineering and Applied Sciences published a paper in the journal ACS Energy Letters saying that they have developed a breakthrough technology.

 

Their new type of battery stores energy in organic molecules dissolved in neutral pH water. This makes the battery non-toxic and cheaper. It’s suitable for home storage and lasts for more than a decade. “This is a long-lasting battery you could put in your basement,” Roy Gordon, a lead researcher and the Thomas Dudley Cabot Professor of Chemistry and Professor of Materials Science, said in a Harvard news article.

“If it spilled on the floor, it wouldn’t eat the concrete and since the medium is non-corrosive, you can use cheaper materials to build the components of the batteries, like the tanks and pumps.”

 

The energy storage problem

There’s a big problem with renewable energy sources: Intermittency. In other words, how to keep the lights on when the sun isn’t shining or the wind isn’t blowing.


 Image: International Energy Agency

In recent years, universities and the tech sector have been working on better batteries that they hope could help solve the energy storage problem. Despite significant improvements though, batteries are riddled with issues such as high cost, toxicity and short lifespan.

 

Solar power customers usually have two options to store power: lithium-ion batteries such as the ones found in electronics, which are still very expensive; or lead-acid batteries. These cost half as much, but need a lot of maintenance and contain toxic materials.


Image: Bloomberg New Energy Finance

One emerging and promising technology is the “v-flow” battery, which uses vanadium in large external tanks of corrosive acids.

The bigger the tanks, the more energy they store. But there’s a catch: vanadium is an expensive metal and like all other battery technologies, v-flow batteries lose capacity after a few years.
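As a rough illustration of why tank size sets capacity in any flow battery, the back-of-envelope sketch below estimates the ideal energy stored per tank; the concentration, electron count and cell voltage are assumed round numbers, not figures from the Harvard or v-flow systems.

```python
# Rough illustration of "the bigger the tanks, the more energy they store":
# a flow battery's stored energy scales with electrolyte volume.
# Concentration, electron count and cell voltage are assumed round numbers.
F = 96485.0          # Faraday constant, C/mol
conc = 1.0           # active-species concentration, mol/L (assumed)
n_electrons = 1      # electrons transferred per molecule (assumed)
cell_voltage = 1.2   # cell voltage in volts (assumed)

def tank_energy_kwh(volume_litres: float) -> float:
    """Ideal energy stored in one electrolyte tank of the given volume."""
    joules = conc * n_electrons * F * cell_voltage * volume_litres
    return joules / 3.6e6        # convert J to kWh

for litres in (100, 1000, 10000):
    print(f"{litres:6d} L tank  ->  about {tank_energy_kwh(litres):6.1f} kWh")
```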

The quest for the next-generation battery

The US Department of Energy has set a goal of building a battery that can store energy for less than $100 per kilowatt-hour, which would make stored wind and solar energy competitive with energy produced from traditional power plants.

 

The Harvard researchers say their breakthrough puts them within sight of this goal.

“If you can get anywhere near this cost target then you change the world,” said Michael Aziz, lead researcher and professor of Materials and Energy Technologies at Harvard. “It becomes cost effective to put batteries in so many places. This research puts us one step closer to reaching that target.”

 


Video: Next Generation Silicon-Nanowire Batteries

 

A new company has been formed to commercialize next-generation supercapacitors and batteries. The opportunity is based on technology and exclusive IP licensing rights from Rice University, developed and curated by Dr. James M. Tour, who has been named one of the fifty most influential scientists in the world today.

The porous silicon nanowire and lithium cobalt oxide technology has been advanced to provide a new-generation battery that is:

 Energy Dense
 High Specific Power
 Affordable Cost
 Low Manufacturing Cost
 Rapid Charge/ Re-Charge
 Flexible Form Factor
 Long Warranty Life
 Non-Toxic
 Highly Scalable

Key Markets & Commercial Applications

 Motorcycle/EV Batteries
 Marine and Drone Batteries
 Medical Devices
 Power Banks
 Estimated $112 Billion Market for Rechargeable Batteries by 2025

 

 

‘Neural Lace’ or BCI (brain-computer interface) ~ Elon Musk and new start-up Neuralink hope to ‘inject’ the possibilities: Video


Elon Musk is funding research towards “neural lace,” a brain-computer interface technology that could allow our brains to compete with AI.

“The Journal reported that the new startup will focus on “neural lace” technology which involves implanting tiny brain electrodes capable of uploading and downloading thoughts. The report said Musk has taken an active role setting up the company and may play a ‘significant leadership role’ even as he runs two other large companies. Musk has previously spoken about the idea of neural lacing, claiming it can magnify people’s brain power by linking them directly to computing capabilities.”

“Billionaire futurist space explorer Elon Musk has a new project: a ”medical research company’ called Neuralink that will make brain-computer interfaces. Musk’s projects are frequently inspired by science fiction, and this one is a direct reference to a device called a ‘neural lace,’ invented by the late British novelist Iain M. Banks for his Culture series. In those books, characters grow a semi-organic mesh on their cerebral cortexes, which allows them to interface wirelessly with AIs and create backups of their minds.”

“Smarter artificial intelligence is certainly being developed, but how far along are we on producing a neural lace? At the conference, Musk said he didn’t know of any company that was working on one. But last year, a team of researchers led by Charles Lieber, the Mark Hyman Professor of Chemistry at Harvard University, described in Nature Nanotechnology a lace-like electronic mesh that ‘you could literally inject’ into three-dimensional synthetic and biological structures like the brain.”

Read the Harvard/Nature Nanotechnology article: Will This “Neural Lace” Brain Implant Help Us Compete with AI?

Dr. Charles Lieber, Harvard University

Watch the YouTube Video:

So what do YOU think? Science or science fiction? Please leave us your comments. ~ Team GNT

Scientists at Harvard and the University of Maryland create a new form of matter: “Time Crystals”


Time crystals may sound like something from science fiction, having more to do with time travel or Doctor Who. These strange materials — in which atoms and molecules are arranged across space and time — are in fact quite real, and are opening up entirely new ways to think about the nature of matter. They may also eventually help protect information in futuristic devices known as quantum computers.

 

Two groups of researchers based at Harvard University and the University of Maryland are reporting in the journal Nature March 9 that they have successfully created time crystals using theories developed at Princeton University. The Harvard-based team included scientists from Princeton who played fundamental roles in working out the theoretical understanding that led to the creation of these exotic crystals.

 

“Our work discovered the essential physics of how time crystals function,” said Shivaji Sondhi, a Princeton professor of physics. “What is more, this discovery builds on a set of developments at Princeton that gets at the issue of how we understand complex systems in and out of equilibrium, which is centrally important to how physicists explain the nature of the everyday world.”

 

Princeton University researchers provided the theoretical understanding that led to the creation of time crystals being reported in the journal Nature March 9 by two groups of researchers based at Harvard University and the University of Maryland. Time crystals feature atoms and molecules arranged across space and time and are opening up entirely new ways to think about the nature of matter. 


The illustration above explains the difference between ordinary crystals (left) such as diamonds, quartz or ice and time crystals (right). (Image by Emily Edwards, University of Maryland)

 

In 2015, Sondhi and colleagues including then-graduate student Vedika Khemani, who earned her Ph.D. at Princeton in 2016 and is now a junior fellow at Harvard, as well as collaborators Achilleas Lazarides and Roderich Moessner at the Max Planck Institute for the Physics of Complex Systems in Germany, published the theoretical basis for how time crystals — at first considered impossible — could actually exist. The paper, published in the journal Physical Review Letters in June 2016, spurred conversations about how to build such crystals.

 

Ordinary crystals such as diamonds, quartz or ice are made up of molecules that spontaneously arrange into orderly three-dimensional patterns. The sodium and chlorine atoms in a crystal of salt, for example, are spaced at regular intervals, forming a cubic lattice.

 

In time crystals, however, atoms are arranged in patterns not only in space, but also in time. In addition to containing a pattern that repeats in space, time crystals contain a pattern that repeats over time. One way this could happen is that the atoms in the crystal move at a certain rate. 

Were a time crystal of ice to exist, all of the water molecules would vibrate at an identical frequency. What is more, the molecules would do this without any input from the outside world.

 

The concept of time crystals originated with physicist Frank Wilczek at the Massachusetts Institute of Technology. In 2012, the Nobel laureate and former Princeton faculty member was thinking about the similarities between space and time. In physics parlance, crystals are said to “break translational symmetry in space” because the atoms assemble into rigid patterns rather than being evenly spread out, as they are in a liquid or gas. 

Shouldn’t there also be crystals that break translational symmetry in time?

 

“The atoms move in time, but instead of moving in a fluid or continuous way, they move in a periodic way,” Sondhi said. “It was an interesting idea.” It also was an idea that led to hot debates in the physics journals about whether such crystals could exist. The initial conclusion appeared to be that they could not, at least not in the settings Wilczek visualized.

 

Sondhi and Khemani were thinking about a completely different problem in 2015 when they worked out the theory of how time crystals could exist. They were exploring questions about how atoms and molecules settle down, or come to equilibrium, to form phases of matter such as solids, liquids and gases.


While it was common wisdom among physicists that all systems eventually settle down, work during the last decade or so had challenged that notion, specifically among atoms at very low temperatures where the rules of quantum physics apply. It was realized that there are systems that never go to equilibrium because of a phenomenon called “many-body localization,” which occurs due to the behavior of many atoms in a disordered quantum system that are influencing each other.

 

Work in this area is a long Princeton tradition. The first and seminal concept of how quantum systems can be localized when they are disordered, called Anderson localization, stemmed from work by Philip Anderson, a Princeton professor and Nobel laureate, in 1958. This work was extended in 2006 to systems of many atoms by then Princeton professor Boris Altshuler, postdoctoral fellow Denis Basko, and Igor Aleiner of Columbia University.

While on sabbatical at the Max Planck Institute for the Physics of Complex Systems in Germany, Sondhi and Khemani realized that these ideas about how to prevent systems from reaching equilibrium would enable the creation of time crystals. A system in equilibrium cannot be a time crystal, but non-equilibrium systems can be created by periodically poking, or “driving,” a crystal by shining a laser on its atoms. 

To the researchers’ surprise, their calculations revealed that periodically prodding atoms in non-equilibrium many-body localized phases would cause the atoms to oscillate at half the rate at which they were prodded — that is, with twice the period of the drive.

 

To explain, Sondhi compared the driving of the quantum system to squeezing periodically on a sponge. “When you release the sponge, you expect it to resume its shape. Imagine now that it only resumes its shape after every second squeeze even though you are applying the same force each time. That is what our system does,” he said.
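A cartoon of this “every second squeeze” response can be written in a few lines of Python: spins flipped by a pi-pulse once per drive period only return to their starting configuration every two periods. This sketch is purely illustrative; a genuine discrete time crystal additionally needs interactions and disorder (many-body localization) to keep the doubled period rigid against pulse imperfections.

```python
# Minimal cartoon of the period-doubled response described above: a chain of
# spins is flipped by a pi-pulse once per drive period, so any spin observable
# returns to its starting value only after TWO periods. Illustrative only --
# a real discrete time crystal also needs interactions and disorder to make
# the 2T response robust when the pulses are imperfect.
import numpy as np

n_spins, n_periods = 8, 6
spins = np.ones(n_spins, dtype=int)        # all spins start "up" (+1)
pulse_error = 0.0                          # set > 0 to model an imperfect flip

rng = np.random.default_rng(0)
for period in range(1, n_periods + 1):
    flip = rng.random(n_spins) > pulse_error    # which spins actually flip
    spins[flip] *= -1                           # apply the (near-)pi pulse
    print(f"after drive period {period}: magnetization = {spins.mean():+.2f}")

# With perfect pulses the magnetization alternates +1, -1, +1, ... and only
# repeats every two drive periods -- a subharmonic response at twice the
# drive period, which is the signature of time-crystalline order.
```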

 

Princeton postdoctoral researcher Curt von Keyserlingk, who contributed additional theoretical work with Khemani and Sondhi, said, “We explained how the time crystal systems lock into the persistent oscillations that signify a spontaneous breaking of time translation symmetry.” Additional work by researchers at Microsoft’s Station Q and the University of California-Berkeley led to further understanding of time crystals. 

As a result of these theoretical studies, two groups of experimenters began attempting to build time crystals in the laboratory. The Harvard-based team, which included Khemani at Harvard and von Keyserlingk at Princeton, used an experimental setup that involved creating an artificial lattice in a synthetic diamond. A different approach at the University of Maryland used a chain of charged particles called ytterbium ions. Both teams have now published the work this week in Nature.

 

Both systems show the emergence of time crystalline behavior, said Christopher Monroe, a physicist who led the effort at the University of Maryland. “Although any applications for this work are far in the future, these experiments help us learn something about the inner workings of this very complex quantum state,” he said.

 

The research may eventually lead to ideas about how to protect information in quantum computers, which can be disrupted by interference by the outside world. Many-body localization can protect quantum information, according to research published in 2013 by the Princeton team of David Huse, the Cyrus Fogg Brackett Professor of Physics, as well as Sondhi and colleagues Rahul Nandkishore, Vadim Oganesyan and Arijeet Pal. 
The research also sheds light on ways to protect topological phases of matter, research for which Princeton’s F. Duncan Haldane, the Eugene Higgins Professor of Physics, shared the 2016 Nobel Prize in Physics.

 

Sondhi said that the work addresses some of the most fundamental questions about the nature of matter. “It was thought that if a system doesn’t settle down and come to equilibrium, you couldn’t really say that it is in a phase. It is a big deal when you can give a definition of a phase of matter when the matter is not in equilibrium,” he said.

 

This out-of-equilibrium setting has enabled the realization of new and exciting phases of matter, according to Khemani. “The creation of time crystals has allowed us to add an entry into the catalog of possible orders in space-time, previously thought impossible,” Khemani said.

 

The research at Princeton and the Max Planck Institute was supported by the National Science Foundation (grant no. 1311781), the John Templeton Foundation, the Alexander von Humboldt Foundation and the German Science Foundation’s Gottfried Wilhelm Leibniz Prize Programme.

 

The papers “Observation of discrete time-crystalline order in a disordered dipolar many-body system” and “Observation of a discrete time crystal” will be published March 9 by Nature.

Princeton University

Harvard University: “Holy Grail” Metallic Hydrogen Is Going to Change Everything … from Space Travel to Energy


Image: Rendering of metallic hydrogen pressurized between diamonds.

The substance has the potential to revolutionize everything from space travel to the energy grid.

Two Harvard scientists have succeeded in creating an entirely new substance long believed to be the “holy grail” of physics — metallic hydrogen, a material of unparalleled power that could one day propel humans into deep space. The research was published Thursday in the journal Science.

Scientists created the metallic hydrogen by pressurizing a hydrogen sample to a pressure greater than that found at the center of the Earth. This broke the molecular bonds of the solid hydrogen and allowed it to dissociate into atomic hydrogen.

The best rocket fuel we currently have is liquid hydrogen and liquid oxygen, burned as propellant. The efficacy of such propellants is characterized by “specific impulse,” a measure of how much impulse a given amount of fuel can deliver to propel a rocket forward.

“People at NASA or the Air Force have told me that if they could get an increase from 450 seconds [of specific impulse] to 500 seconds, that would have a huge impact on rocketry,” Isaac Silvera, the Thomas D. Cabot Professor of the Natural Sciences at Harvard University, told Inverse by phone. “If you can trigger metallic hydrogen to recover to the molecular phase, [the energy release] calculated for that is 1700 seconds.”
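To see why those specific-impulse numbers matter, a back-of-envelope comparison using the Tsiolkovsky rocket equation is sketched below; the mass ratio is an assumed round number for illustration, not a figure from the article.

```python
# Back-of-envelope illustration of why the jump from ~450 s to ~1700 s of
# specific impulse matters, using the Tsiolkovsky rocket equation:
#     delta_v = Isp * g0 * ln(m0 / mf)
# The mass ratio below is an assumed round number for illustration.
import math

g0 = 9.81            # standard gravity, m/s^2
mass_ratio = 10.0    # assumed initial/final mass ratio (m0 / mf)

for label, isp in [("liquid H2/O2 (today)", 450.0),
                   ("metallic hydrogen (predicted)", 1700.0)]:
    delta_v = isp * g0 * math.log(mass_ratio)   # achievable velocity change
    print(f"{label:32s} Isp = {isp:6.0f} s   delta-v = {delta_v/1000:5.1f} km/s")

# With the same mass ratio, the predicted metallic-hydrogen propellant gives
# roughly 3-4 times the delta-v, which is what makes single-stage-to-orbit
# scenarios plausible.
```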

Metallic hydrogen could potentially enable rockets to get into orbit in a single stage, even allowing humans to explore the outer planets. Metallic hydrogen is predicted to be “metastable” — meaning that if you make it at very high pressure and then release the pressure, it will remain in that state. A diamond, for example, is a metastable form of graphite. If you take graphite, pressurize it, then heat it, it becomes a diamond; if you take the pressure off, it’s still a diamond. But if you heat it again, it will revert back to graphite.

Scientists first theorized atomic metallic hydrogen a century ago. Silvera, who created the substance along with post-doctoral fellow Ranga Dias, has been chasing it since 1982, when he was working as a professor of physics at the University of Amsterdam.

Metallic hydrogen has also been predicted to be a high- or possibly room-temperature superconductor. There are no other known room-temperature superconductors in existence, meaning the applications are immense — particularly for the electric grid, which suffers from energy lost through heat dissipation. It could also facilitate magnetic levitation for futuristic high-speed trains; substantially improve the performance of electric cars; and revolutionize the way energy is produced and stored.

But that’s all still likely a couple of decades off. The next step in terms of practical application is to determine whether metallic hydrogen is indeed metastable. Right now Silvera has a very small quantity. If the substance does turn out to be metastable, it might be used as a room-temperature crystal seed: by spraying atomic hydrogen onto its surface, more could be grown, the way synthetic diamonds are made.

Photos via Nature

Harvard University and MIT Team Up: New blue-light-emitting molecules for better displays



Harvard University researchers have designed more than 1,000 new blue-light emitting molecules for organic light-emitting diodes (OLEDs) that could dramatically improve displays for televisions, phones, tablets and more.

OLED screens use organic molecules that emit light when an electric current is applied. Unlike ubiquitous liquid crystal displays (LCDs), OLED screens don’t require a backlight, meaning the display can be as thin and flexible as a sheet of plastic. Individual pixels can be switched on or entirely off, dramatically improving the screen’s color contrast and energy consumption.

OLEDs are already replacing LCDs in high-end consumer devices but a lack of stable and efficient blue materials has made them less competitive in large displays such as televisions.

The interdisciplinary team of Harvard researchers, in collaboration with MIT and Samsung, developed a large-scale, computer-driven screening process, called the Molecular Space Shuttle, that incorporates theoretical and experimental chemistry, machine learning and cheminformatics to quickly identify new OLED molecules that perform as well as, or better than, industry standards.

“People once believed that this family of organic light-emitting molecules was restricted to a small region of molecular space,” said Alán Aspuru-Guzik, Professor of Chemistry and Chemical Biology, who led the research. “But by developing a sophisticated molecular builder, using state-of-the art machine learning, and drawing on the expertise of experimentalists, we discovered a large set of high-performing blue OLED materials.”

The research is described in the current issue of Nature Materials. The biggest challenge in manufacturing affordable OLEDs is emission of the color blue. Like LCDs, OLEDs rely on green, red and blue subpixels to produce every color on screen.  

But it has been difficult to find organic molecules that efficiently emit blue light. To improve efficiency, OLED producers have created organometallic molecules with expensive transition metals like iridium to enhance the molecule through phosphorescence. This solution is expensive and it has yet to achieve a stable blue color.

 Aspuru-Guzik and his team sought to replace these organometallic systems with entirely organic molecules.

 The team began by building libraries of more than 1.6 million candidate molecules. Then, to narrow the field, a team of researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), led by Ryan Adams, Assistant Professor of Computer Science, developed new machine learning algorithms to predict which molecules were likely to have good outcomes, and prioritize those to be virtually tested. This effectively reduced the computational cost of the search by at least a factor of ten.

“This was a natural collaboration between chemistry and machine learning,” said David Duvenaud, a postdoctoral fellow in the Adams lab and coauthor of the paper. “Since the early stages of our chemical design process starts with millions of possible candidates, there’s no way for a human to evaluate and prioritize all of them. So, we used neural networks to quickly prioritize the candidates based on all the molecules already evaluated.”
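The sketch below illustrates the general surrogate-screening idea in Python with synthetic data: fit a model on candidates that have already been scored by expensive simulation, then rank the untested pool by predicted quality. The features, scores and the choice of a random-forest regressor are placeholders, not the team’s actual neural-network pipeline.

```python
# Illustrative sketch of surrogate-model screening in the spirit described
# above: fit a model on candidates already scored by (expensive) quantum
# chemistry, then use its predictions to decide which untested candidates to
# simulate next. Features, data and regressor are synthetic placeholders --
# the actual study used neural networks on molecular fingerprints.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n_scored, n_candidates, n_features = 500, 10000, 32

X_scored = rng.normal(size=(n_scored, n_features))       # already-simulated molecules
y_scored = X_scored[:, :3].sum(axis=1) + 0.1 * rng.normal(size=n_scored)  # toy quality score
X_pool = rng.normal(size=(n_candidates, n_features))      # untested candidate molecules

surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_scored, y_scored)

predicted = surrogate.predict(X_pool)
top = np.argsort(predicted)[::-1][:100]   # send the best-looking 100 to full simulation
print("candidate indices to simulate next:", top[:10], "...")

# Only the highest-ranked candidates get the expensive (~12-hour) quantum
# chemistry treatment, which is how this kind of loop cuts the computational
# cost of the search by an order of magnitude.
```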

“Machine learning tools are really coming of age and starting to see applications in a lot of scientific domains,” said Adams.  “This collaboration was a wonderful opportunity to push the state of the art in computer science, while also developing completely new materials with many practical applications. It was incredibly rewarding to see these designs go from machine learning predictions to devices that you can hold in your hand.”

“We were able to model these molecules in a way that was really predictive,” said Rafael Gómez-Bombarelli, a postdoctoral fellow in the Aspuru-Guzik lab and first author of the paper.  “We could predict the color and the brightness of the molecules from a simple quantum chemical calculation and about 12 hours of computing per molecule. We were charting chemical space and finding the frontier of what a molecule can do by running virtual experiments.”

“Molecules are like athletes,” Aspuru-Guzik said. “It’s easy to find a runner, it’s easy to find a swimmer, it’s easy to find a cyclist but it’s hard to find all three. Our molecules have to be triathletes. They have to be blue, stable and bright.”

But finding these super molecules takes more than computing power — it takes human intuition, said Tim Hirzel, a senior software engineer in the Department of Chemistry and Chemical Biology and coauthor of the paper.

To help bridge the gap between theoretical modeling and experimental practice, Hirzel and the team built a web application for collaborators to explore the results of more than half a million quantum chemistry simulations.

Every month, Gómez-Bombarelli and coauthor Jorge Aguilera-Iparraguirre, also a postdoctoral fellow in the Aspuru-Guzik lab, selected the most promising molecules and used their software to create “baseball cards,” profiles containing important information about each molecule. This process identified 2500 molecules worth a closer look.  The team’s experimental collaborators at Samsung and MIT then voted on which molecules were most promising for application. The team nicknamed the voting tool “molecular Tinder” after the popular online dating app.

“We facilitated the social aspect of the science in a very deliberate way,” said Hirzel.

“The computer models do a lot but the spark of genius is still coming from people,” said Gómez-Bombarelli.

“The success of this effort stems from its multidisciplinary nature,” said Aspuru-Guzik. “Our collaborators at MIT and Samsung provided critical feedback regarding the requirements for the molecular structures.”

“The high throughput screening technique pioneered by the Harvard team significantly reduced the need for synthesis, experimental characterization, and optimization,” said Marc Baldo, Professor of Electrical Engineering and Computer Science at MIT and coauthor of the paper. “It shows the industry how to advance OLED technology faster and more efficiently.”

After this accelerated design cycle, the team was left with hundreds of molecules that perform as well as, if not better than, state-of-the-art metal-free OLEDs. Applications of this type of molecular screening also extend far beyond OLEDs.

“This research is an intermediate stop in a trajectory towards more and more advanced organic molecules that could be used in flow batteries, solar cells, organic lasers, and more,” said Aspuru-Guzik. “The future of accelerated molecular design is really, really exciting.”

In addition to the authors mentioned, the manuscript was coauthored by Dougal Maclaurin, Martin A. Blood-Forsythe, Hyun Sik Chae, Markus Einzinger, Dong-Gwang Ha, Tony Wu, Georgios Markopoulos, Soonok Jeon, Hosuk Kang, Hiroshi Miyazaki, Masaki Numata, Sunghan Kim, Wenliang Huang and Seong Ik Hong.

Reference:

Design of efficient molecular organic light-emitting diodes by a high-throughput virtual screening and experimental approach

Nature Materials, published online 8 August 2016, DOI: 10.1038/NMAT4717

Update: MIT, UC San Diego and Harvard University: Energy-carrying particles called ‘topological plexcitons’ could make possible the design of ‘next generation’ solar cells and miniaturized optical circuitry



Plexcitons travel for 20,000 nanometers, a length on the order of the width of a human hair. Credit: Joel Yuen-Zhou

Scientists at UC San Diego, MIT and Harvard University have engineered “topological plexcitons,” energy-carrying particles that could help make possible the design of new kinds of solar cells and miniaturized optical circuitry.

The researchers report their advance in an article published in the current issue of Nature Communications.

Within the Lilliputian world of physics, light and matter interact in strange ways, exchanging energy back and forth between them.


“When light and matter interact, they exchange energy,” explained Joel Yuen-Zhou, an assistant professor of chemistry and biochemistry at UC San Diego and the first author of the paper. “Energy can flow back and forth between light in a metal (so called plasmon) and light in a molecule (so called exciton). When this exchange is much faster than their respective decay rates, their individual identities are lost, and it is more accurate to think about them as hybrid particles; excitons and plasmons marry to form plexcitons.”

Materials scientists have been looking for ways to enhance a process known as exciton energy transfer, or EET, to create better as well as miniaturized photonic circuits which are dozens of times smaller than their silicon counterparts.

“Understanding the fundamental mechanisms of EET enhancement would alter the way we think about designing solar cells or the ways in which energy can be transported in nanoscale materials,” said Yuen-Zhou.

The drawback with EET, however, is that this form of energy transfer is extremely short-ranged, on the scale of only 10 nanometers, and quickly dissipates as the excitons interact with different molecules.

One solution to avoid those shortcomings is to hybridize excitons in a molecular crystal with the collective excitations within metals to produce plexcitons, which travel for 20,000 nanometers, a length on the order of the width of a human hair.

Plexcitons are expected to become an integral part of the next generation of nanophotonic circuitry, light-harvesting solar energy architectures and chemical catalysis devices. But the main problem with plexcitons, said Yuen-Zhou, is that they move along all directions, which makes them hard to harness properly in a material or device.

He and a team of physicists and engineers at MIT and Harvard found a solution to that problem by engineering particles called “topological plexcitons,” based on the concepts in which solid state physicists have been able to develop materials called “topological insulators.”

“Topological insulators are materials that are perfect electrical insulators in the bulk but at their edges behave as perfect one-dimensional metallic cables,” Yuen-Zhou said. “The exciting feature of topological insulators is that even when the material is imperfect and has impurities, there is a large threshold of operation where electrons that start travelling along one direction cannot bounce back, making electron transport robust. In other words, one may think about the electrons being blind to impurities.”

Plexcitons, as opposed to electrons, do not have an electrical charge. Yet, as Yuen-Zhou and his colleagues discovered, they still inherit these robust directional properties. Adding this “topological” feature to plexcitons gives rise to directionality of EET, a feature researchers had not previously conceived. This should eventually enable engineers to create plexcitonic switches that distribute energy selectively across different components of a new kind of solar cell or light-harvesting device.


More information: Nature Communications, DOI: 10.1038/NCOMMS11783

 


 

Harvard University and Wyss Institute: Bionic (Artificial) Leaf: Harvesting Solar Energy and Storing it as a Liquid Fuel


Harvesting sunlight is a trick plants mastered more than a billion years ago, using solar energy to feed themselves from the air and water around them in the process we know as photosynthesis.

Scientists have also figured out how to harness solar energy, using electricity from photovoltaic cells to yield hydrogen that can be later used in fuel cells. But hydrogen has failed to catch on as a practical fuel for cars or for power generation in a world designed around liquid fuels.

Now scientists from a team spanning Harvard University’s Faculty of Arts and Sciences, Harvard Medical School and the Wyss Institute for Biologically Inspired Engineering have created a system that uses bacteria to convert solar energy into a liquid fuel. Their work integrates an “artificial leaf,” which uses a catalyst to make sunlight split water into hydrogen and oxygen, with a bacterium engineered to convert carbon dioxide plus hydrogen into the liquid fuel isopropanol.

The findings are published Feb. 9 in PNAS. The co-first authors are Joseph Torella, a recent PhD graduate from the HMS Department of Systems Biology, and Christopher Gagliardi, a postdoctoral fellow in the Harvard Department of Chemistry and Chemical Biology.

Pamela Silver, the Elliott T. and Onie H. Adams Professor of Biochemistry and Systems Biology at HMS and an author of the paper, calls the system a bionic leaf, a nod to the artificial leaf invented by the paper’s senior author, Daniel Nocera, the Patterson Rockwood Professor of Energy at Harvard University.


“This is a proof of concept that you can have a way of harvesting solar energy and storing it in the form of a liquid fuel,” said Silver, who is a founding core faculty member of the Wyss. “Dan’s formidable discovery of the catalyst really set this off, and we had a mission of wanting to interface some kinds of organisms with the harvesting of solar energy. It was a perfect match.”


Silver and Nocera began collaborating two years ago, shortly after Nocera came to Harvard from MIT. They shared an interest in “personalized energy,” or the concept of making energy locally, as opposed to the current system, which in the example of oil means production is centralized and then sent to gas stations. Local energy would be attractive in the developing world.

“It’s not like we’re trying to make some super-convoluted system,” Silver said. “Instead, we are looking for simplicity and ease of use.”

In a similar vein, Nocera’s artificial leaf depends on catalysts made from materials that are inexpensive and readily accessible.

“The catalysts I made are extremely well adapted and compatible with the growth conditions you need for living organisms like a bacterium,” Nocera said.

In their new system, once the artificial leaf produces oxygen and hydrogen, the hydrogen is fed to a bacterium called Ralstonia eutropha. An enzyme takes the hydrogen back to protons and electrons, then combines them with carbon dioxide to replicate—making more cells.

Next, based on discoveries made earlier by Anthony Sinskey, professor of microbiology and of health sciences and technology at MIT, new pathways in the bacterium are metabolically engineered to make isopropanol.

“The advantage of interfacing the inorganic catalyst with biology is you have an unprecedented platform for chemical synthesis that you don’t have with inorganic catalysts alone,” said Brendan Colón, a graduate student in systems biology in the Silver lab and a co-author of the paper. “Solar-to-chemical production is the heart of this paper, and so far we’ve been using plants for that, but we are using the unprecedented ability of biology to make lots of compounds.”

The same principles could be employed to produce drugs such as vitamins in small amounts, Silver said.

The team’s immediate challenge is to increase the bionic leaf’s ability to translate solar energy to biomass by optimizing the catalyst and the bacteria. Their goal is 5 percent efficiency, compared to nature’s rate of 1 percent efficiency for photosynthesis to turn sunlight into biomass.

“We’re almost at a 1 percent efficiency rate of converting sunlight into isopropanol,” Nocera said. “There have been 2.6 billion years of evolution, and Pam and I working together a year and a half have already achieved the efficiency of photosynthesis.”

Source: Harvard Medical School

Genesis Nanotech Headlines Are Out!


Genesis Nanotech Headlines Are Out! Read All About It!

https://paper.li/GenesisNanoTech/1354215819#!headlines

Visit Our Website: www.genesisnanotech.com

Visit/ Post on Our Blog: https://genesisnanotech.wordpress.com

 


Subcommittee Examines Breakthrough Nanotechnology Opportunities for America


July 29, 2014

WASHINGTON, DC – The Subcommittee on Commerce, Manufacturing, and Trade, chaired by Rep. Lee Terry (R-NE), today held a hearing on “Nanotechnology: Understanding How Small Solutions Drive Big Innovation.” Nanotechnology is science, engineering, and technology conducted at the nanoscale, which is approximately 1 to 100 nanometers (one nanometer is a billionth of a meter). This technology brings great opportunities to advance a broad range of industries, bolster our U.S. economy, and create new manufacturing jobs. Members heard from several nanotech industry leaders about the current state of nanotechnology and the direction that it is headed.

“Just as electricity, telecommunications, and the combustion engine fundamentally altered American economics in the ‘second industrial revolution,’ nanotechnology is poised to drive the next surge of economic growth across all sectors,” said Chairman Terry.

 

 


Dr. Christian Binek, Associate Professor at the University of Nebraska-Lincoln, explained the potential of nanotechnology to transform a range of industries, stating, “Virtually all of the national and global challenges can at least in part be addressed by advances in nanotechnology. Although the boundary between science and fiction is blurry, it appears reasonable to predict that the transformative power of nanotechnology can rival the industrial revolution. Nanotechnology is expected to make major contributions in fields such as; information technology, medical applications, energy, water supply with strong correlation to the energy problem, smart materials, and manufacturing. It is perhaps one of the major transformative powers of nanotechnology that many of these traditionally separated fields will merge.”

Dr. James M. Tour at the Smalley Institute for Nanoscale Science and Technology at Rice University encouraged steps to help the U.S better compete with markets abroad. “The situation has become untenable. Not only are our best and brightest international students returning to their home countries upon graduation, taking our advanced technology expertise with them, but our top professors also are moving abroad in order to keep their programs funded,” said Tour. “This is an issue for Congress to explore further, working with industry, tax experts, and universities to design an effective incentive structure that will increase industry support for research and development – especially as it relates to nanotechnology. This is a win-win for all parties.”


Professor Milan Mrksich of Northwestern University discussed the economic opportunities of nanotechnology, and obstacles to realizing these benefits. He explained, “Nanotechnology is a broad-based field that, unlike traditional disciplines, engages the entire scientific and engineering enterprise and that promises new technologies across these fields. … Current challenges to realizing the broader economic promise of the nanotechnology industry include the development of strategies to ensure the continued investment in fundamental research, to increase the fraction of these discoveries that are translated to technology companies, to have effective regulations on nanomaterials, to efficiently process and protect intellectual property to ensure that within the global landscape, the United States remains the leader in realizing the economic benefits of the nanotechnology industry.”

James Phillips, Chairman & CEO at NanoMech, Inc., added, “It’s time for America to lead. … We must capitalize immediately on our great University system, our National Labs, and tremendous agencies like the National Science Foundation, to be sure this unique and best in class innovation ecosystem, is organized in a way that promotes nanotechnology, tech transfer and commercialization in dramatic and laser focused ways so that we capture the best ideas into patents quickly, that are easily transferred into our capitalistic economy so that our nation’s best ideas and inventions are never left stranded, but instead accelerated to market at the speed of innovation so that we build good jobs and improve the quality of life and security for our citizens faster and better than any other country on our planet.”

Chairman Terry concluded, “Nanotech is a true science race between the nations, and we should be encouraging the transition from research breakthroughs to commercial development. I believe the U.S. should excel in this area.”

– See more at: http://energycommerce.house.gov/press-release/subcommittee-examines-breakthrough-nanotechnology-opportunities-america