A protective layer of epoxy resin helps prevent the leakage of pollutants from perovskite solar cells (PSCs), according to scientists from the Okinawa Institute of Science and Technology Graduate University (OIST). Adding a “self-healing” polymer to the top of a PSC can radically reduce how much lead it discharges into the environment. This gives a strong boost to prospects for commercializing the technology.
With atmospheric carbon dioxide levels at their highest on record, and extreme weather events continuing to rise in number, the world is moving away from legacy energy systems that rely on fossil fuels towards renewables such as solar. Perovskite solar technology is promising, but one key challenge to commercialization is that it may release pollutants such as lead into the environment, especially under extreme weather conditions.
“Although PSCs are efficient at converting sunlight into electricity at an affordable cost, the fact that they contain lead raises considerable environmental concern,” explains Professor Yabing Qi, head of the Energy Materials and Surface Sciences Unit, who led the study, published in Nature Energy.
“While so-called ‘lead-free’ technology is worth exploring, it has not yet achieved efficiency and stability comparable to lead-based approaches. Finding ways of using lead in PSCs while keeping it from leaking into the environment, therefore, is a crucial step for commercialization.”
Testing to destruction
Qi’s team, supported by the OIST Technology Development and Innovation Center’s Proof-of-Concept Program, first explored encapsulation methods for adding protective layers to PSCs to understand which materials might best prevent the leakage of lead. They exposed cells encapsulated with different materials to many conditions designed to simulate the sorts of weather to which the cells would be exposed in reality.
They wanted to test the solar cells in a worst-case weather scenario, to understand the maximum lead leakage that could occur. First, they smashed the solar cells using a large ball, mimicking extreme hail that could break down their structure and allow lead to be leaked. Next, they doused the cells with acidic water, to simulate the rainwater that would transport leaked lead into the environment.
Using mass spectrometry, the team analyzed the acidic water to determine how much lead had leaked from the cells. They found that an epoxy resin layer allowed only minimal lead leakage, orders of magnitude lower than that from the other materials.
Enabling commercial viability
Epoxy resin also performed best under a number of weather conditions in which sunlight, rainwater and temperature were altered to simulate the environments in which PSCs must operate. In all scenarios, including extreme rain, epoxy resin outperformed rival encapsulation materials.
Epoxy resin works so well due to its “self-healing” properties. After its structure is damaged by hail, for example, the polymer partially reforms its original shape when heated by sunlight. This limits the amount of lead that leaks from inside the cell. This self-healing property could make epoxy resin the encapsulation layer of choice for future photovoltaic products.
“Epoxy resin is certainly a strong candidate, yet other self-healing polymers may be even better,” explains Qi. “At this stage, we are pleased to be promoting photovoltaic industry standards, and bringing the safety of this technology into the discussion. Next, we can build on these data to confirm which is truly the best polymer.”
Beyond lead leakage, another challenge will be to scale up perovskite solar cells into perovskite solar panels. While cells are just a few centimeters long, panels can span a few meters, and will be more relevant to potential consumers. The team will also direct their attention to the long-standing challenge of renewable energy storage.
The World Health Organization (WHO) estimates that by 2025, about half of the world’s population will live in areas where there is a shortage of clean drinking water. Is it possible that the solution to the global water crisis is, literally, right under our noses? Technion researchers have developed a model for a system that separates the moisture naturally present in the air around us and converts it into drinking water. The patented system, and how it can help prevent the water crisis that awaits the world, was recently presented by Associate Professor David Broday of Technion’s Faculty of Civil and Environmental Engineering at a seminar on water, energy, treatment, and recycling.
“Water is available and free to everyone”
Associate Professor Broday, who developed the system together with his colleague Associate Professor Eran Friedler, explains that the idea is to take advantage of a resource that is constantly and abundantly present around us.
“The atmosphere is everywhere, and there is humidity everywhere,” says Broday. “No atmosphere is completely dry; there is humidity even in the air of the arid Sahara Desert. In fact, the amount of moisture in the atmosphere is equal to the amount of fresh liquid water in the world (i.e. not accounting for glaciers). This is a huge amount of water freely available to everyone with no restrictions.”
Harvesting moisture from the air is not new, and there are several companies around the world that have already developed technologies around this concept. This existing technology, says Broday, is similar to a domestic air conditioner that cools air that comes from the outdoors, uses the cold air, and discards the water condensed during the cooling process. In the case of moisture harvesting technology, it is the air that is discarded and the water that is used. “The existing technology actually takes air and cools it to extract the moisture from it,” explains Broday. “It is brought to a state where the moisture condenses on a cold surface and drips from it, then it is collected and used for drinking.”
But, says Prof. Broday, there is a problem with this technology. “Air is composed not only of moisture but also of other gases like oxygen and nitrogen, and this technology invests in cooling them along with the humidity,” he explains. “Air volume contains only 4 to 5 percent humidity, at best, which is a very small part. A lot of energy is invested in cooling something of which more than 90 percent doesn’t get used at all. This is an ineffective use of energy. This process is expensive to begin with, and in effect most of the energy goes towards cooling material that we are not at all interested in.”
With their system, the researchers propose to optimize this process by separating moisture from the air before cooling it. Doing so will make it possible to invest energy in cooling only the moisture itself and converting it into available water.
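The energy argument can be made concrete with a rough back-of-the-envelope comparison. All of the numbers below (humidity ratio, temperature drop, material constants) are illustrative assumptions, not figures from the Technion study:

```python
# Back-of-the-envelope comparison of cooling humid air directly vs.
# cooling pre-separated moisture. Illustrative assumptions only.

CP_AIR = 1.005e3    # J/(kg*K), specific heat of dry air
CP_VAPOR = 1.86e3   # J/(kg*K), specific heat of water vapor
L_COND = 2.26e6     # J/kg, latent heat released on condensation

def energy_per_kg_water(humidity_ratio, delta_t):
    """Energy to cool air plus vapor by delta_t kelvin and condense
    the vapor, per kg of water recovered.

    humidity_ratio -- kg of vapor per kg of dry air
    Returns (total energy in J, fraction spent on the dry air).
    """
    e_dry_air = CP_AIR * delta_t / humidity_ratio  # cooling gas we discard
    e_water = CP_VAPOR * delta_t + L_COND          # cooling + condensing vapor
    total = e_dry_air + e_water
    return total, e_dry_air / total

# Fairly dry air (~0.5% vapor by mass), cooled by 20 K to condense:
total, wasted = energy_per_kg_water(0.005, 20.0)
print(f"{total / 1e6:.2f} MJ per kg of water, {wasted:.0%} spent on dry air")
```

Under these assumptions roughly two-thirds of the energy goes into chilling gas that is simply thrown away, and the drier the air, the worse that fraction gets. Cooling only the moisture released by the desiccant removes that wasted term entirely.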
The Technion system is also a radical departure from attempts by others who are trying to develop membranes that will separate the moisture out of the air (like the desalination process in which membranes separate salt from seawater).
“The alternative we are proposing is based on the use of an absorbent substance called a desiccant, which is a highly concentrated saline solution that naturally absorbs the moisture from the air when it comes into contact with it,” Broday explains. “The idea is to use this material to absorb a large amount of moisture from the air, and to cool the moisture only after this has been accomplished.”
“Our system is composed of several stages,” he explains. “In the first stage, it will circulate air to transfer moisture from the air to the desiccant, which is in a liquid state. This cycle is repeated over and over again, as the desiccant collects more and more moisture from the air. In the second stage, we transfer a small portion of the desiccant to another part of the system, where we produce conditions that cause the desiccant to release the moisture. This moisture is then condensed and turned into water. For this to happen, we need to cool it down; this is the third stage, which is actually similar to what happens in existing systems. But unlike them, at this stage we cool 100 percent humidity rather than air, only a fraction of which is relevant to our needs.”
Drinkable water in the middle of the desert
According to the researchers, their proposed system isn’t just more energy-efficient. It also provides cleaner water. After cooling, the water collected in the system should be suitable for immediate drinking, as opposed to existing technologies in which air is cooled in its entirety. “If the air spinning in a system – in addition to moisture – contains disease-causing bacteria, when it is cooled and the water condenses, the bacteria in the air may also find their way into the water,” Broday explains. “This means this water may require purification to make it fit for consumption.
“In our system, the air does not meet the cooling coils at all – only the moisture that is separated from it. As a result, even if the air contains substances we do not want to reach the water, they are absorbed into the desiccant but not released in the next stage,” he says. “Even if bacteria, dust, and the like have accumulated, because it is a very concentrated salt solution, it simply dries up. So the resulting moisture would be clean, and the water, pure. Of course they would be tested, but the need for water treatment processes would probably be much smaller, which is expected to lower the price of using it.”
Using the system would not be without cost, and the researchers emphasize that their method of producing water is more expensive than desalination.
“Where water can be desalinated – that is, in proximity to a source of water such as a sea or brackish lakes – desalination is the preferred option,” explains Broday. “Economically speaking, it makes sense to desalinate and produce a system for transporting water to places that are up to about 62 miles away. Any further than this, and the cost of transportation becomes more expensive than the cost of desalination. There are also towns located close to rivers where water is suitable for use. But when we take all of these out of the equation, there are quite a few places in the world where desalination and direct use are not economically viable.”
The Technion researchers’ system has not yet been built, but they have already performed simulations with a model to see how the system would function in different climatic and humidity conditions. “We wanted to see whether the system can be used in areas where the air is arid,” says Broday, “for example in the Sahara Desert, and in Yemen, which is currently experiencing a severe hunger crisis and lack of drinking water.” He says the system is relatively small and allows for distributed production of water that does not depend on a single source from which water must be piped to all other localities.
“We strongly believe in the idea and the preliminary results,” says Broday. “But we still have to put the theory into practice. That’s the next stage.”
The Technion-Israel Institute of Technology is a major source of the innovation and brainpower that drives the Israeli economy, and a key to Israel’s renown as the world’s “Start-Up Nation.” Its three Nobel Prize winners exemplify academic excellence. Technion people, ideas and inventions make immeasurable contributions to the world including life-saving medicine, sustainable energy, computer science, water conservation and nanotechnology.
American Technion Society (ATS) donors provide critical support for the Technion—nearly $2.5 billion since its inception in 1940. Based in New York City, the ATS and its network of supporters across the U.S. provide funds for scholarships, fellowships, faculty recruitment and chairs, research, buildings, laboratories, classrooms and dormitories, and more.
Rice University’s solar-powered approach for purifying salt water with sunlight and nanoparticles is even more efficient than its creators first believed.
Researchers in Rice’s Laboratory for Nanophotonics (LANP) this week showed they could boost the efficiency of their solar-powered desalination system by more than 50% simply by adding inexpensive plastic lenses to concentrate sunlight into “hot spots.” The results are available online in the Proceedings of the National Academy of Sciences.
“The typical way to boost performance in solar-driven systems is to add solar concentrators and bring in more light,” said Pratiksha Dongare, a graduate student in applied physics at Rice’s Brown School of Engineering and co-lead author of the paper. “The big difference here is that we’re using the same amount of light. We’ve shown it’s possible to inexpensively redistribute that power and dramatically increase the rate of purified water production.”
In conventional membrane distillation, hot, salty water is flowed across one side of a sheetlike membrane while cool, filtered water flows across the other. The temperature difference creates a difference in vapor pressure that drives water vapor from the heated side through the membrane toward the cooler, lower-pressure side. Scaling up the technology is difficult because the temperature difference across the membrane—and the resulting output of clean water—decreases as the size of the membrane increases. Rice’s “nanophotonics-enabled solar membrane distillation” (NESMD) technology addresses this by using light-absorbing nanoparticles to turn the membrane itself into a solar-driven heating element.
Dongare and colleagues, including study co-lead author Alessandro Alabastri, coat the top layer of their membranes with low-cost, commercially available nanoparticles that are designed to convert more than 80% of sunlight energy into heat. The solar-driven nanoparticle heating reduces production costs, and Rice engineers are working to scale up the technology for applications in remote areas that have no access to electricity.
The concept and particles used in NESMD were first demonstrated in 2012 by LANP director Naomi Halas and research scientist Oara Neumann, who are both co-authors on the new study. In this week’s study, Halas, Dongare, Alabastri, Neumann and LANP physicist Peter Nordlander found they could exploit an inherent and previously unrecognized nonlinear relationship between incident light intensity and vapor pressure.
Alabastri, a physicist and Texas Instruments Research Assistant Professor in Rice’s Department of Electrical and Computer Engineering, used a simple mathematical example to describe the difference between a linear and nonlinear relationship. “If you take any two numbers that add up to 10—seven and three, five and five, six and four—you will always get 10 if you add them together. But if the process is nonlinear, you might square them or even cube them before adding. So if we have nine and one, that would be nine squared, or 81, plus one squared, which equals 82. That is far better than 10, which is the best you can do with a linear relationship.”
In the case of NESMD, the nonlinear improvement comes from concentrating sunlight into tiny spots, much like a child might with a magnifying glass on a sunny day. Concentrating the light on a tiny spot on the membrane results in a linear increase in heat, but the heating, in turn, produces a nonlinear increase in vapor pressure. And the increased pressure forces more purified steam through the membrane in less time.
“We showed that it’s always better to have more photons in a smaller area than to have a homogeneous distribution of photons across the entire membrane,” Alabastri said.
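A toy calculation makes that convexity argument concrete. The sketch below uses the Tetens formula, a standard approximation for water’s saturation vapor pressure, as an assumed stand-in for the membrane physics; the temperatures are illustrative values, not parameters from the Rice experiments:

```python
import math

# Why "hot spots" beat uniform illumination when the response is convex:
# saturation vapor pressure grows roughly exponentially with temperature.
# Tetens formula and temperatures are illustrative assumptions only.

def p_sat(t_celsius):
    """Approximate saturation vapor pressure of water in kPa (Tetens)."""
    return 0.6108 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))

base = 40.0     # membrane temperature with no extra light, deg C
heating = 20.0  # temperature rise the available sunlight can deliver

# Spread the light evenly: the whole membrane sits at base + heating.
uniform = p_sat(base + heating)

# Focus the same light on half the area: that half gets twice the rise,
# while the other half stays at the base temperature.
hotspot = 0.5 * p_sat(base + 2 * heating) + 0.5 * p_sat(base)

print(f"uniform: {uniform:.1f} kPa, hot-spot average: {hotspot:.1f} kPa")
```

Because the exponential is convex, the averaged hot-spot pressure always beats the uniform case with the same total light, mirroring Alabastri’s nine-squared-plus-one-squared arithmetic.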
Halas, a chemist and engineer who’s spent more than 25 years pioneering the use of light-activated nanomaterials, said, “The efficiencies provided by this nonlinear optical process are important because water scarcity is a daily reality for about half of the world’s people, and efficient solar distillation could change that.
“Beyond water purification, this nonlinear optical effect also could improve technologies that use solar heating to drive chemical processes like photocatalysis,” Halas said.
For example, LANP is developing a copper-based nanoparticle for converting ammonia into hydrogen fuel at ambient pressure.
Halas is the Stanley C. Moore Professor of Electrical and Computer Engineering, director of Rice’s Smalley-Curl Institute and a professor of chemistry, bioengineering, physics and astronomy, and materials science and nanoengineering.
NESMD is in development at the Rice-based Center for Nanotechnology Enabled Water Treatment (NEWT) and won research and development funding from the Department of Energy’s Solar Desalination program in 2018.
Rivian’s appearance at Amazon’s tech-focused re:MARS event this week drew some more attention from the celebrity crowd. Former NBA player Shaquille O’Neal paid a visit to the all-electric startup’s display and surprised onlookers by fitting his 7’1″, 300+ lb frame into the R1T pickup truck. “Shaq fits,” CEO RJ Scaringe tweeted after with a photo of the superstar in the driver’s seat.
Shaq himself posted a video of the event on his Instagram account. Rivian’s team is seen in the background, seemingly proud to have yet another celebrity take notice of the amazing work they’ve done with their vehicles. Singer Rihanna attended the company’s local invite-only preview before its appearance at the New York International Auto Show in April.
Another major figure to visit Rivian at re:MARS, albeit not a surprising one given he’s the CEO of Amazon, was Jeff Bezos. In a video posted to Twitter, he’s seen walking around the event and having a look at all the tech on display.
“Got a chance to scout some of the cool tech at the first #reMARS event,” he posted, tagging Rivian and the other companies featured in the clip. During Amazon’s all-hands meeting in March, Bezos stated that he is fascinated by the emerging trends in the auto industry and referred to Scaringe as “one of the most missionary entrepreneurs [he’s] ever met.” Amazon invested $700 million into Rivian during a funding round earlier this year.
Amazon’s re:MARS 2019 event is an information and networking conference sponsored by the online retail giant, focused on artificial intelligence (AI), robotics, and other related Earth and space technologies, including self-driving. The latest research, scientific advancements, and industry innovations are shared during four days of networking, keynotes, and information sessions. This was the first year for the event, and it took place from June 4-7 at the Aria Resort & Casino in Las Vegas, Nevada.
Rivian’s appearance at re:MARS was announced via a tweet that tagged the company and included the hashtag #alexaauto. “What happens when you combine a thirst for adventure with automotive tech and AI? Meet the world’s first Electric Adventure Vehicle at #reMARS to find out,” it said. RJ Scaringe has teased “Jurassic Park”-style tours using vehicles that would implement the self-driving technology the company also has in the works. If speculating about connections between the two companies, perhaps Amazon’s Alexa digital assistant could narrate.
Rivian’s CEO also shared some new insights recently about the R1T pickup truck and the R1S SUV in an interview with The Drive. In a discussion about the electric adventure company’s battery technology, Scaringe noted that Rivian is preparing solutions that will enable drivers to recharge their vehicles off the grid, such as auxiliary battery packs. He also added that the cars would be capable of vehicle-to-vehicle charging, allowing two Rivians to charge each other. “We’ve designed the vehicle so you can have auxiliary battery packs. You can also charge Rivian-to-Rivian, which is a neat thing. You connect the two vehicles, and then I could hand you some electrons,” Scaringe said.
Overall, Rivian seems to be very well focused on developing its technology and appealing to a wide audience of future buyers.
Solar panels are fantastic pieces of technology, but we need to work out how to make them even more efficient, and scientists just solved a 40-year-old mystery around one of the key obstacles to increased efficiency.
A new study outlines a previously undetected material defect in the silicon used to produce solar cells. It could be responsible for Light Induced Degradation (LID): the roughly 2 percent efficiency drop that solar cells can see in their first hours of use.
Multiplied by the increasing number of panels installed at solar farms around the world, that drop equals a significant cost in gigawatts that non-renewable energy sources have to make up for.
In fact, the worldwide efficiency loss from LID is estimated to equate to more energy than can be generated by the UK’s 15 nuclear power plants. The new discovery could help scientists make up some of that shortfall.
“Because of the environmental and financial impact, solar panel ‘efficiency degradation’ has been the topic of much scientific and engineering interest in the last four decades,” says one of the researchers, Tony Peaker from the University of Manchester in the UK.
“However, despite some of the best minds in the business working on it, the problem has steadfastly resisted resolution until now.”
To find what 270 research papers across four decades had previously been unable to determine, the latest study used an electrical and optical technique called deep-level transient spectroscopy (DLTS) to find weaknesses in the silicon.
Here’s what the DLTS analysis found: As sunlight is transformed into electrical charge in the solar cells, the flow of electrons gets trapped; in turn, that reduces the level of electrical power that can be produced.
This defect lies dormant until the solar panel gets heated, the team found.
“We’ve proved the defect exists; it’s now an engineering fix that is needed,” says one of the researchers, Iain Crowe from the University of Manchester.
The researchers also found that higher quality silicon had charge carriers (electrons which carry the photon energy) with a longer ‘lifetime’, which backs up the idea that these traps are linked to the efficiency degradation.
What’s more, heating the material in the dark, a process often used to remove traps from silicon, seems to reverse the degradation.
The work to push solar panel efficiency rates higher continues, with breakthroughs continuing to happen in the lab, and nature offering up plenty of efficiency tips as well. Now that the Light Induced Degradation mystery has been solved, solar farms across the globe should benefit.
“An absolute drop of 2 percent in efficiency may not seem like a big deal, but when you consider that these solar panels are now responsible for delivering a large and exponentially growing fraction of the world’s total energy needs, it’s a significant loss of electricity generating capacity,” says Peaker.
Since the birth of the Space Age, the dream of catching a ride to another solar system has been hobbled by the “tyranny of the rocket equation,” which sets hard limits on the speed and size of the spacecraft we sling into the cosmos.
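That “tyranny” is the Tsiolkovsky rocket equation, delta-v = v_e ln(m0/mf): achievable speed grows only logarithmically with the propellant carried. A few lines of Python, with an assumed exhaust velocity typical of a good chemical engine, show how punishing the logarithm is:

```python
import math

# Tsiolkovsky rocket equation: delta_v = v_e * ln(m0 / mf).

def delta_v(exhaust_velocity, mass_initial, mass_final):
    """Velocity change from burning (mass_initial - mass_final) of propellant."""
    return exhaust_velocity * math.log(mass_initial / mass_final)

V_E = 4500.0  # m/s, an assumed figure for a high-performance chemical engine

# A rocket that is 90% propellant (mass ratio 10) manages only ~10.4 km/s.
print(delta_v(V_E, 100.0, 10.0))

# Doubling that delta-v requires squaring the mass ratio to 100,
# i.e. a rocket that is 99% propellant.
print(delta_v(V_E, 1000.0, 10.0))
```

Interstellar distances demand speeds far beyond what any realistic mass ratio allows, which is why concepts that carry no propellant at all attract so much attention.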
Of the advanced propulsion concepts that could theoretically pull that off, few have generated as much excitement—and controversy—as the EmDrive.
First described nearly two decades ago, the EmDrive works by converting electricity into microwaves and channeling this electromagnetic radiation through a conical chamber. In theory, the microwaves can exert force against the walls of the chamber to produce enough thrust to propel a spacecraft once it’s in space.
At this point, however, the EmDrive exists only as a laboratory prototype, and it’s still unclear whether it’s able to produce thrust at all. If it does, the forces it generates aren’t strong enough to be registered by the naked eye, much less propel a spacecraft.
Over the past few years, however, a handful of research teams, including one from NASA, claim to have successfully produced thrust with an EmDrive. If true, it would amount to one of the biggest breakthroughs in the history of space exploration. The problem is that the thrust observed in these experiments is so small that it’s hard to tell if it’s real.
The resolution lies in designing a tool that can measure these minuscule amounts of thrust. So a team of physicists at Germany’s Technische Universität Dresden set out to create a device that would fill this need. Led by physicist Martin Tajmar, the SpaceDrive project aims to create an instrument so sensitive and immune to interference that it would put an end to the debate once and for all.
In October, Tajmar and his team presented their second set of experimental EmDrive measurements at the International Astronautical Congress, and their results will be published in Acta Astronautica this August. Based on the results of these experiments, Tajmar says a resolution to the EmDrive saga may only be a few months away.
Many scientists and engineers dismiss the EmDrive because it appears to violate the laws of physics. Microwaves pushing on the walls of an EmDrive chamber seem to generate thrust ex nihilo, which runs afoul of the conservation of momentum—it’s all action and no reaction. Proponents of the EmDrive, in turn, have appealed to fringe interpretations of quantum mechanics to explain how the EmDrive might work without violating Newtonian physics.
“From the theory point of view, no one takes this seriously,” Tajmar says. If the EmDrive is able to produce thrust, as some groups have claimed, he says they have “no clue where this thrust is coming from.” When there’s a theoretical rift of this magnitude in science, Tajmar sees only one way to close it: experimentation.
In late 2016, Tajmar and 25 other physicists gathered in Estes Park, Colorado, for the first conference dedicated to the EmDrive and related exotic propulsion systems. One of the most exciting presentations was given by Paul March, a physicist at NASA’s Eagleworks lab, where he and his colleague Harold White had been testing various EmDrive prototypes. According to March’s presentation and a subsequent paper published in the Journal of Propulsion and Power, he and White observed several dozen micro-newtons of thrust in their EmDrive prototype. (For the sake of comparison, a single SpaceX Merlin engine produces around 845,000 newtons of thrust at sea level.)
The problem for March and White, however, was that their experimental setup allowed for several sources of interference, so they couldn’t say for sure whether what they observed was thrust.
Tajmar and the Dresden group used a close replica of the EmDrive prototype used by March and White in their tests at NASA. It consists of a copper frustum (a cone with its top lopped off) that is just under a foot long. This design can be traced back to the engineer Roger Shawyer, who first described the EmDrive in 2001. During tests, the EmDrive cone is placed in a vacuum chamber. Outside the chamber, a device generates a microwave signal that gets relayed, using coaxial cables, to antennas inside the cone.
This isn’t the first time the Dresden team has sought to measure nearly imperceptible amounts of force. They built similar contraptions for their work on ion thrusters, which are used to precisely position satellites in space. These micro-newton thrusters are the kind that were used by the LISA Pathfinder mission, which needs extremely precise positioning ability to detect faint phenomena like gravitational waves. But to study the EmDrive and similar propellantless propulsion systems, Tajmar says, required nano-newton resolution.
Their approach was to use a torsion balance, a pendulum-type balance that measures the amount of torque applied to the axis of the pendulum. A less sensitive version of this balance was also used by the NASA team when they thought their EmDrive produced thrust.
To accurately gauge the small amount of force, the Dresden team used a laser interferometer to measure the physical displacement of the balance scales produced by the EmDrive. According to Tajmar, their torsion scale has a nano-newton resolution and supports thrusters weighing several pounds, making it the most sensitive thrust balance in existence.
But a really sensitive thrust balance isn’t much use unless you can also determine whether the detected force is in fact thrust and not an artifact of outside interference. And there are plenty of alternate explanations for March and White’s observations.
To determine whether an EmDrive actually produces thrust, researchers must be able to shield the device from interference caused by the Earth’s magnetic poles, seismic vibrations from the environment, and the thermal expansion of the EmDrive due to heating from the microwaves.
Tweaks to the design of the torsion balance—to better control the EmDrive’s power supply and shield it from magnetic fields—took care of some of the interference issues, Tajmar says. A more difficult problem was how to address “thermal drift.” When power flows to the EmDrive, the copper cone heats up and expands, which shifts its center of gravity just enough to cause the torsion balance to register force that can be mistaken as thrust. Tajmar and his team hoped that changing the orientation of the thruster would help address that issue.
Over the course of 55 experiments, Tajmar and his colleagues registered an average of 3.4 micro-newtons of force from the EmDrive, which was very similar to what the NASA team found. Alas, these forces did not appear to pass the thermal drift test. The forces seen in the data were more indicative of thermal expansion than thrust.
All hope is not lost for the EmDrive, however. Tajmar and his colleagues are also developing two additional types of thrust balances, including a superconducting balance that will, among other things, help to eliminate false positives produced by thermal drift.
If they detect force from an EmDrive on these balances, there’s a high probability that it is actually thrust. But if no force is registered on these balances, it likely means that all the previous EmDrive thrust observations were false positives. Tajmar says he hopes to have a final verdict by the end of the year.
But even a negative result from that work might not kill the EmDrive for good. There are many other propellantless propulsion designs to pursue. And if scientists ever do develop new forms of weak propulsion, the hyper-sensitive thrust balances developed by Tajmar and the Dresden team will almost certainly play a role in sorting science fact from science fiction.
FRANKFURT (Reuters) – BMW and Jaguar Land Rover on Wednesday said they will jointly develop electric motors, transmissions and power electronics, unveiling yet another industry alliance designed to lower the costs of developing electric cars.
Both carmakers are under pressure to roll out zero-emission vehicles to meet stringent anti-pollution rules, but have struggled to maintain profit margins faced with the rising costs of making electric, connected and autonomous cars.
“Together, we have the opportunity to cater more effectively for customer needs by shortening development time and bringing vehicles and state-of-the-art technologies more rapidly to market,” said BMW board member Klaus Froehlich.
BMW and Jaguar Land Rover said they will save costs through shared development, production planning and joint purchasing of electric car components. Both companies will produce electric drivetrains in their own manufacturing facilities, BMW said.
The BMW Jaguar Land Rover pact comes as rivals Fiat Chrysler and Renault explore a $35 billion tie-up of the Italian-American and French car making groups.
Nick Rogers, Jaguar Land Rover’s engineering director, said: “We’ve proven we can build world beating electric cars, but now we need to scale the technology to support the next generation of Jaguar and Land Rover products.”
BMW was in talks with rival Daimler about developing electric car components, and was also in discussions with Jaguar Land Rover, whose Land Rover brand it once owned, to explore an alliance on engines.
BMW already has a deal to supply an eight-cylinder engine to Jaguar Land Rover.
Carmakers are increasingly open to sharing electric car parts because the technology is expensive and because customers no longer buy a car based on what engine a vehicle has.
“Carmakers are much less precious about sharing electric car technology because it is much harder to create product differentiation with electric car tech. They all accelerate fast, and everybody can do quality and ride and handling,” according to Carl-Peter Forster, a former chief executive of Tata Motors and a former BMW executive.
Jaguar Land Rover is still run by former BMW managers, including chief executive Ralf Speth, who spent 20 years at BMW before joining JLR, and Wolfgang Ziebart, the engineer who oversaw Jaguar’s I-Pace electric car program and a former head of research and development at BMW.
Jaguar Land Rover said it would redouble efforts to cut costs after it posted a $4 billion loss earlier this year, hit by a downturn in demand for sport utility vehicles in China and a regulatory clampdown on diesel emissions.
BMW bought Britain’s Rover Group, which included the Land Rover brand, for 800 million pounds in 1994, only to sell Land Rover to Ford in March 2000 for $2.7 billion. Ford had acquired Jaguar separately in 1989, and in 2008 India’s Tata Group bought Jaguar and Land Rover from Ford for $2.3 billion.
Researchers at Rice University, Durham (U.K.) University and North Carolina State University reported their success at activating the motors with precise two-photon excitation via near-infrared light. Unlike the ultraviolet light they first used to drive the motors, the new technique does not damage adjacent, healthy cells.
The team’s results appear in the American Chemical Society journal ACS Nano.
The research led by chemists James Tour of Rice, Robert Pal of Durham and Gufeng Wang of North Carolina may be best applied to skin, oral and gastrointestinal cancer cells that can be reached for treatment with a laser.
In a 2017 Nature paper, the same team reported the development of molecular motors enhanced with small proteins that target specific cancer cells.
Once in place and activated with light, the paddlelike motors spin up to 3 million times a second, allowing the molecules to drill through the cells’ protective membranes and kill them in minutes.
Since then, researchers have worked on a way to eliminate the use of damaging ultraviolet light. In two-photon absorption, a phenomenon predicted in 1931 and confirmed 30 years later with the advent of lasers, the motors absorb two lower-energy photons simultaneously and move to a higher energy state, triggering the paddles.
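The energy bookkeeping behind two-photon activation follows from the Planck relation E = hc/λ: two low-energy photons absorbed together deliver the same energy as one photon of half the wavelength. A minimal sketch in Python, using an illustrative 800 nm near-infrared wavelength (the specific wavelengths here are assumptions for illustration, not figures from the paper):

```python
# Planck relation: photon energy E = h * c / wavelength.
H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light, m/s

def photon_energy(wavelength_nm):
    """Energy in joules of a single photon of the given wavelength."""
    return H * C / (wavelength_nm * 1e-9)

# Two 800 nm near-infrared photons absorbed simultaneously supply the
# same total energy as one 400 nm ultraviolet photon:
two_nir_photons = 2 * photon_energy(800)
one_uv_photon = photon_energy(400)
print(abs(two_nir_photons - one_uv_photon) / one_uv_photon < 1e-12)  # True
```

This is why near-infrared light can drive motors that previously needed ultraviolet light: the energy arrives in two gentler installments that healthy tissue tolerates far better.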
A video produced in 2017 explains the basic concept of cell death via molecular motors. Video produced by Brandon Martin/Rice University.
“Multiphoton activation is not only more biocompatible but also allows deeper tissue penetration and eliminates any unwanted side effects that may arise with the previously used UV light,” Pal said.
The researchers tested their updated motors on skin, breast, cervical and prostate cancer cells in the lab. Once the motors found their targets, lasers activated them with a precision of about 200 nanometers.
In most cases, the cells were dead within three minutes, they reported. They believe the motors also drill through chromatin and other components of the diseased cells, which could help slow metastasis.
Because the motors target specific cells, Tour said work is underway to adapt them to kill antibiotic-resistant bacteria as well.
“We continue to perfect the molecular motors, aiming toward ones that will work with visible light and provide even higher efficacies of kill toward the cellular targets,” he said.
Rice postdoctoral researcher Dongdong Liu is lead author of the paper. Co-authors are Rice alumni Victor Garcia-López, Lizanne Nilewski and Amir Aliyan; visiting research scientist Richard Gunasekera; senior research scientist Lawrence Alemany; and graduate student Tao Jin of North Carolina State.
Wang is an assistant professor of chemistry at North Carolina State. Pal is an assistant professor of chemistry at Durham. Tour is the T.T. and W.F. Chao Chair in Chemistry as well as a professor of computer science and of materials science and nanoengineering at Rice.
The Royal Society, the United Kingdom’s Engineering and Physical Sciences Research Council, the Discovery Institute, the Pensmore Foundation and North Carolina State supported the research.
About Rice University
Located on a 300-acre forested campus in Houston, Rice University is consistently ranked among the nation’s top 20 universities by U.S. News & World Report. Rice has highly respected schools of Architecture, Business, Continuing Studies, Engineering, Humanities, Music, Natural Sciences and Social Sciences and is home to the Baker Institute for Public Policy. With 3,962 undergraduates and 3,027 graduate students, Rice’s undergraduate student-to-faculty ratio is just under 6-to-1. Its residential college system builds close-knit communities and lifelong friendships, just one reason why Rice is ranked No. 1 for lots of race/class interaction and No. 2 for quality of life by the Princeton Review. Rice is also rated as a best value among private universities by Kiplinger’s Personal Finance.
Follow Rice News and Media Relations via Twitter @RiceUNews.
For all the promise and peril of artificial intelligence, there’s one big obstacle to its seemingly relentless march:
The algorithms for running AI applications have been so big and complex that they’ve required processing on powerful machines in the cloud and data centers, making a wide swath of applications less useful on smartphones and other “edge” devices.
Now, that concern is quickly melting away, thanks to a series of breakthroughs in recent months in software, hardware and energy technologies. That’s likely to drive AI-driven products and services even further away from a dependence on powerful cloud-computing services and enable them to move into every part of our lives — even inside our bodies.
By 2022, 80% of smartphones shipped will have AI capabilities on the device itself, up from 10% in 2017, according to market researcher Gartner Inc. And by 2023, that will add up to some 1.2 billion shipments of devices with on-device AI computing capabilities, up from 79 million in 2017, according to ABI Research.
A lot of startups and their backers smell a big opportunity. According to Jeff Bier, founder of the Embedded Vision Alliance, which held a conference this past week in Silicon Valley, investors have plowed some $1.5 billion into new AI chip startups in the past three years — more than was invested in all chip startups in the previous three years.
Market researcher Yole Développement forecasts that AI application processors will enjoy a 46% compound annual growth rate through 2023, when nearly all smartphones will have them, up from fewer than 20% today.
And it’s not just startups. Just today, Intel Corp. previewed its coming Ice Lake chips, which among other things include “Deep Learning Boost” software and other new AI instructions on the graphics processing unit.
“Within the next two years, virtually every processor vendor will be offering some kind of competitive platform for AI,” Tom Hackett, principal analyst at IHS Markit, said at the alliance’s Embedded Vision Summit. “We are now seeing a next-generation opportunity.”
Those chips are finding their way into many more devices beyond smartphones. They’re also being used in millions of “internet of things” devices such as robots, drones, cars, cameras and wearables.
Among the 75 or so companies developing machine learning chips, for instance, is Israel’s Hailo, which raised a $21 million funding round in January. In mid-May it released a processor that’s tuned for deep learning, a branch of machine learning responsible for recent breakthroughs in voice and image recognition.
More compact and capable software is paving the way for AI at the edge as well. Google LLC, for instance, debuted its TensorFlow Lite machine learning library for mobile devices in late 2017, enabling smart cameras to identify wildlife or imaging devices to make medical diagnoses even where there’s no internet connection.
Some 2 billion mobiles now have TensorFlow Lite deployed on them, Google staff research engineer Pete Warden said at a keynote presentation at the Embedded Vision Summit.
And in March, Google rolled out an on-device speech recognizer to power speech input in Gboard, Google’s virtual keyboard app. The automatic speech recognition transcription algorithm is now down to 80 megabytes, so it can run on the Arm Ltd. A-series chip inside a typical Pixel phone, and that means it works offline with no network latency or spottiness.
Not least, rapidly rising privacy concerns about data traversing the cloud means there’s also a regulatory reason to avoid moving data off the devices.
“Virtually all the machine learning processing will be done on the device,” said Bier, who’s also co-founder and president of Berkeley Design Technology Inc., which provides analysis and engineering services for embedded digital signal processing technology. And there will be a whole lot of devices: Warden cited an estimate of 250 billion active embedded devices in the world today, and that number is growing 20% a year.
Google’s Pete Warden (Photo: Robert Hof/SiliconANGLE)
But doing AI on such devices is no easy task. The challenge is not just the size of the machine learning algorithms but the power it takes to execute them, especially since smartphones and IoT devices such as cameras and various sensors can’t depend on power from a wall socket or even batteries. “The devices will not scale if we become bound to changing or recharging batteries,” said Warden.
The radio connections needed to send data to and from the cloud also are energy hogs, so communicating via cellular or other connections is a deal breaker for many small, cheap devices. The result, said Yohann Tschudi, technology and market analyst at Yole Développement: “We need a dedicated architecture for what we want to do.”
There’s also a need for devices that realistically draw less than a milliwatt, about a thousandth of what a smartphone uses. The good news is that an increasing array of sensors and even microprocessors promises to do just that.
The U.S. Department of Energy, for instance, has helped develop low-cost wireless peel-and-stick sensors for building energy management in partnership with Molex Inc. and building automation firm SkyCentrics Inc. And experimental new image sensors can power themselves with ambient light.
And even microprocessors, the workhorses for computing, can be very low-power, such as those from startups such as Ambiq Micro, Eta Compute, Syntiant Corp., Applied Brain Research, Silicon Laboratories Inc. and GreenWaves Technologies.
“There’s no theoretical reason we can’t compute in microwatts,” or a thousand times smaller than milliwatts, Warden said. That’s partly because they can be programmed, for instance, to wake up a radio to talk to the cloud only when something actionable happens, like liquid spilling on a floor.
Embedded Vision Summit (Photo: Robert Hof/SiliconANGLE)
All this suggests a vast new array of applications of machine learning on everything from smartphones to smart cameras and factory monitoring sensors. Indeed, said Warden, “We’re getting so many product requests to run machine learning on embedded devices.”
Among those applications:
Predictive maintenance using accelerometers to determine if a machine is shaking too much or making a funny noise.
Presence detection for street lights so they turn on only when someone’s nearby.
Warden even anticipates that sensors could talk to each other, such as in a smart home where the smoke alarm detects a potential fire and the toaster replies that no, it’s just burned toast. That’s speculative for now, but Google’s already working on “federated learning” to train machine learning models without using centralized training data.
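The core idea behind federated learning can be sketched in a few lines: each device trains on its own data and ships only its model weights, never the raw data, to a server that averages them. This is a minimal, hypothetical illustration of the federated-averaging step, not Google's actual implementation:

```python
def federated_average(client_weights):
    """Aggregate locally trained model weights from several clients.

    Each client trains on-device and uploads only its weight vector;
    the server averages the vectors element-wise into a new global
    model. Raw training data never leaves the devices.
    """
    n_clients = len(client_weights)
    n_params = len(client_weights[0])
    return [sum(w[i] for w in client_weights) / n_clients
            for i in range(n_params)]

# Three hypothetical devices' locally trained weight vectors:
local_models = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
print(federated_average(local_models))  # [3.0, 4.0]
```

Real systems layer secure aggregation and compression on top of this averaging step, but the privacy property is visible even in the sketch: the server only ever sees weights, not user data.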
None of this means the cloud won’t continue to have a huge role in machine learning. All those examples involve running the models on devices, a process known as inference.
For some time, the difference between a biotechnology company and a pharmaceutical company was straightforward.
A biotechnology company focused on developing drugs with a biological basis. Pharmaceutical companies focused on drugs with a chemical basis.
It was sort of an artificial distinction, and is even more so now because pharmaceutical companies haven’t excluded biologics from their portfolios.
At one time there were even distinctions in the definitions related to small molecules versus large molecules, but those are largely in the dustbin of biopharma vocabulary. It’s one reason why “biopharma” itself is a useful word to bridge the two, and really, biotech and pharma are largely interchangeable.
Nanotechnology Versus Biotechnology
But what about nanotechnology? Is that biotechnology?
The answer to that seems to be … yes and no.
Nanotechnology typically refers to technology that is less than 100 nanometers in size. Although the inch is not terribly useful for differentiating things on the microscopic—or smaller—scale, for reference there are 25,400,000 nanometers in an inch. So … small. Really small.
Wouldn’t that refer to many drugs? Yes, probably.
But nanotechnology typically refers to tech made of man-made and inorganic materials in that size range. Again, the key word is “typically.”
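To put the 100-nanometer threshold in perspective, the inch-to-nanometer arithmetic quoted above is easy to check; a quick sketch (the function name is ours, purely illustrative):

```python
NM_PER_INCH = 25_400_000  # 1 inch = 2.54 cm = 25,400,000 nm

def nm_to_inches(nm):
    """Convert nanometers to inches."""
    return nm / NM_PER_INCH

# The 100 nm upper bound usually quoted for nanotechnology,
# expressed in inches:
print(nm_to_inches(100))  # roughly 3.9e-06 inches
```

A 100-nanometer particle is about four millionths of an inch across, which is why the inch never comes up in this field.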
There is overlap. Liji Thomas, writing for Azo Nano, says, “Nanobiotechnology deals with technology which incorporates nanomolecules into biological systems, or which miniaturizes biotechnology solutions to nanometer size to achieve greater reach and efficacy….
Bionanotechnology, on the other hand, deals with new nanostructures that are created for synthetic applications, the difference being that these are based upon biomolecules.”
Clear? Probably not. Here are some examples of biotechnology companies utilizing nanotechnology, along with whatever tools they need to develop their compounds.
PEEL Therapeutics. PEEL Therapeutics is a small biotech company, largely in stealth mode, founded by Joshua Schiffman, an associate professor of pediatrics at the University of Utah, and Avi Schroeder, an assistant professor of chemical engineering at the Technion-Israel Institute of Technology.
Schiffman was doing work on a tumor suppressor gene, p53, which shows up in very high copy numbers in elephants. Elephants have significantly lower rates of cancer than humans, who normally have two normal copies of p53. Humans with a disease called Li-Fraumeni syndrome have only one, and they have a 100 percent chance of getting cancer, or very close to it.
What PEEL is attempting to do is build a synthetic version of p53 and insert it into a novel drug delivery system using nanotechnology. “Peel,” by the way, is the phonetic spelling of the Hebrew word for elephants. eP53 has been successfully encapsulated in nanoparticles and, at least in petri dishes, has demonstrated proof of concept. Elephants are not being experimented upon.
Exicure. Based in Skokie, Illinois, Exicure (formerly known as AuraSense) is a clinical-stage biotechnology company that’s working on a new class of immunomodulatory and gene-regulating drugs that uses a proprietary three-dimensional, spherical nucleic acid (SNA) architecture.
The SNA technology came out of the laboratory of Chad Mirkin at the Northwestern University International Institute for Nanotechnology.
The company has received financing from the likes of Microsoft’s Bill Gates; Aon founder Pat Ryan; David Walt, co-founder of Illumina; and Boon Hwee Koh, director of Agilent Technologies.
The technology platform is complex, but it is essentially various single and double-stranded nucleic acids stuck on the outside of a nanosphere.
They are able to easily penetrate cells, where they then trigger immune responses.
SpyBiotech. Headquartered in Oxford, UK, SpyBiotech focuses on the so-called “super glue” formed by two parts of a protein from the bacteria that cause strep throat. It was spun out of Oxford University and is based on research performed by its Department of Biochemistry and the Jenner Institute. When the protein is split in two, the parts are attracted to each other and attempt to reattach.
The company is working to use this principle to develop vaccines, using the molecular glue to attach viral antigens to vaccine particles.
One of the bacteria that can cause strep throat, impetigo and other infections, Streptococcus pyogenes, is often shortened to Spy, hence the name of the company. When a Spy protein is split into a peptide (SpyTag) and its protein partner (SpyCatcher), the two parts are attracted to each other. The researchers isolated the “glue” that creates the attraction and believe it can be used to bond vaccines together.
The company has backing from GV, formerly Google Ventures, the venture fund backed by Alphabet/Google.
One of the company’s founders is Mark Howarth, professor of protein nanotechnology at the University of Oxford. The fact that he’s working on protein nanotechnology undercuts a traditional definition of nanotechnology as not using biological materials. On his website, Howarth notes that the SpyTag-SpyCatcher interaction “is the strongest protein interaction yet measured and is being applied around the world for diverse areas of basic research and biotechnology. We are extending this new class of protein interaction, to create novel possibilities for synthetic biology.”
Ultimately, when researchers are developing drugs, they use whatever tools are necessary to find effective treatments for diseases. Biotechnology may be more accurately thought of as a set of tools and a philosophical approach to solving biological problems than as a category cleanly separated from pharmaceuticals, and nanotechnology is yet another of those tools.
In the wider world of drug discovery and development, there is also increasing use of artificial intelligence, data science and computational algorithms. And who knows what will be used tomorrow.