Israeli scientists ‘print’ world’s first 3D heart with human tissue | The Jerusalem Post


A team of Tel Aviv University researchers revealed the heart, which was made using a patient’s own cells and biological materials.
— Read on m.jpost.com/HEALTH-SCIENCE/Israeli-scientists-print-first-3D-heart-586902/amp

South Korea and Sweden are the most innovative countries in the world – Israel Becoming ‘Tech Titan’


“… These are the most innovative countries in the world: South Korea, Sweden and Singapore top the list …”

Image: REUTERS/Carlo Allegri

South Korea and Sweden are the most innovative countries in the world, according to a league table covering everything from the concentration of tech companies to the number of science and engineering graduates.

The index on innovative countries highlights South Korea’s position as the economy whose companies filed the most patents in 2017. 

Bloomberg, which compiles the index based on data from sources including the World Bank, IMF and OECD, credits South Korea’s top ranking to Samsung. 

The electronics giant is South Korea’s most valuable company and has received more US patents than any company other than IBM since the start of the millennium. This innovation trickles down the supply chain and throughout South Korea’s economy.

Sweden, in second place, is fast gaining a reputation as Europe’s tech start-up capital.

The Scandinavian country is home to Europe’s largest tech companies and its capital is second only to Silicon Valley when it comes to the number of “unicorns” – billion-dollar tech companies – that it produces per capita.

Education hinders the US

The US dropped out of the top 10 in the 2018 Bloomberg Innovation Index, for the first time in the six years the gauge has been compiled. 

Bloomberg attributed its fall to 11th place from ninth last year largely to an eight-spot slump in the rating of its tertiary education, which includes an assessment of the share of new science and engineering graduates in the labour force.

The US is now ranked 43rd out of 50 nations for “tertiary efficiency”. Singapore and Iran take the top two spots.

The US’ ranking marks another setback for its higher education sector’s global standing in recent months: in September it was revealed that neither of the world’s top two universities was American. Those honours went to the UK’s Oxford and Cambridge universities respectively.

In addition to the US’ education slump in the innovation index, Bloomberg claims the country also lost ground when it came to value-added manufacturing. The country is now ranked in 23rd place, while Ireland and South Korea take the top two spots.

Despite these setbacks, the Bloomberg Innovation Index still ranks the US as number 1 when it comes to its density of tech companies.

The US is also second only to South Korea for patent activity.

These rankings may explain the disparity between Bloomberg’s list of innovative countries and the World Economic Forum’s own list of the 10 most innovative economies.

Image: WEF

Under this ranking, compiled as part of The Global Competitiveness Report 2017-2018, the US is listed as the second most innovative country in the world after Switzerland.

The US’ inclusion in this league table, and South Korea’s exclusion from it, are the two most notable differences between the rankings.

Other than these nations, the majority of countries included in the top 10s are the same in both lists.

Tech titan Israel

One nation to feature prominently in both innovation rankings is Israel.

Taking third spot in the Global Competitiveness Report’s innovation league table, Israel is ranked 10th best country in the world for innovation overall by Bloomberg.

However, Bloomberg’s index also ranks Israel as number 1 in two categories of innovation: R&D intensity and concentration of researchers.

Israel’s talent for research and development is illustrated by some of the major tech innovations to come out of the country.

These include the USB flash drive, the first Intel PC processor and Google’s Suggest function, to name just three.

Despite being smaller than the US state of New Jersey with fewer people, Israel punches well above its weight on the global tech stage.

It has about 4,000 startups, and raises venture capital per capita at two-and-a-half times the rate of the US and 30 times that of Europe.

When it comes to being a world leader at innovation, it may simply be the case that you get out what you put in: according to OECD figures, Israel spends more money on research and development as a proportion of its economy than any other country – 4.3% of GDP against second-placed Korea’s 4.2%. 

Switzerland is in third place spending 3.4% of its GDP on R&D, while Sweden spends 3.3%. The US spends just 2.8%.
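
Taken together, those OECD figures form a small league table of R&D intensity. A minimal Python sketch that ranks the numbers quoted above:

```python
# R&D spending as a share of GDP, using the OECD figures quoted above.
rd_intensity = {
    "Israel": 4.3,
    "South Korea": 4.2,
    "Switzerland": 3.4,
    "Sweden": 3.3,
    "United States": 2.8,
}

# Rank countries by R&D intensity, highest first.
ranked = sorted(rd_intensity.items(), key=lambda kv: kv[1], reverse=True)
for rank, (country, pct) in enumerate(ranked, start=1):
    print(f"{rank}. {country}: {pct}% of GDP")
```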

Synthetic organisms are about to challenge what ‘being’ and ‘alive’ really mean


We need to begin a serious debate about whether artificially evolved humans are our future, and if we should put an end to these experiments before it is too late.

In 2016, Craig Venter and his team at Synthetic Genomics announced that they had created a lifeform called JCVI-syn3.0, whose genome consisted of only 473 genes.

This stripped-down organism was a significant breakthrough in the development of artificial life as it enabled us to understand more fully what individual genes do. (In the case of JCVI-syn3.0, most of them were used to create RNA and proteins, preserve genetic fidelity during reproduction and create the cell membrane. The functions of about a third remain a mystery.)

Venter’s achievement followed an earlier breakthrough in 2014, when Floyd Romesberg at Romesberg Lab in California succeeded in creating xeno nucleic acid (XNA), a synthetic alternative to DNA, using bases not found among the four naturally occurring nucleotides: adenine, cytosine, guanine and thymine.

And, most recently we have seen huge advances in the use of CRISPR, a gene-editing tool that allows substitution or injection of DNA sequences at chosen locations in a genome.

Read More: Why Bill Gates is Betting on this Synthetic Biology Start-Up

Together, these developments mean that in 2019 we will have to take seriously the possibility of our developing multicellular artificial life, and we will need to start thinking about the ethical and philosophical challenges such a possibility brings up.

In the near future we can reasonably anticipate that a large number of unnatural single-cell life forms will be created using artificially edited genomes to correct for genetic defects or to add new features to an organism’s phenotype.

It is already possible to design bacterial forms, for example, that can metabolise pollutants or produce particular substances.

We can also anticipate that new life forms may be created that have never existed in nature through the use of conventional and perhaps artificially arranged codons (the nucleotide triplets that specify amino acids during protein synthesis).

These are likely to make use of the conventional machinery of mitotic cell reproduction and of conventional ribosomes, creating proteins through RNA or XNA interpretation.
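
For readers unfamiliar with how codons drive protein synthesis, here is a minimal Python sketch of the translation step, using a deliberately tiny subset of the 64-codon genetic code:

```python
# A tiny subset of the standard genetic code: RNA codons -> amino acids.
# (The full table has 64 codons; this handful is just for illustration.)
CODON_TABLE = {
    "AUG": "Met",   # also the start codon
    "UUU": "Phe",
    "GGC": "Gly",
    "UAA": "STOP",  # one of three stop codons
}

def translate(rna: str) -> list[str]:
    """Read an RNA sequence three nucleotides (one codon) at a time."""
    peptide = []
    for i in range(0, len(rna) - 2, 3):
        amino_acid = CODON_TABLE.get(rna[i:i + 3], "???")
        if amino_acid == "STOP":
            break
        peptide.append(amino_acid)
    return peptide

print(translate("AUGUUUGGCUAA"))  # ['Met', 'Phe', 'Gly']
```

Artificially arranged codons, as described above, would amount to rewriting entries in that lookup table.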

And there will be increasing pressures to continue this research. We may need to accelerate the evolution of terrestrial life forms, for example, including Homo sapiens, so that they carry traits and capabilities needed for life in space or even on our own changing planet.

All of this will bring up serious issues as to how we see ourselves – and behave – as a species.

While the creation of multicellular organisms that are capable of sexual reproduction is still a long way off, in 2019 we will need to begin a serious debate about whether artificially evolved humans are our future, and if we should put an end to these experiments before it is too late.

— Vint Cerf, ‘Wired’

How brand new science will manage the fourth industrial revolution – “Managing the Machines”


It’s about artificial intelligence, data, and things like quantum computing and nanotechnology. Australian National University’s 3A Institute is creating a new discipline to manage this revolution and its impact on humanity.

Image: Diagram by Christoph Roser at AllAboutLean.com (CC BY-SA 4.0)

Diagrams explaining the fourth industrial revolution, like this one by Christoph Roser, are OK as far as they go. Apart from the term “cyber physical systems”. Ugh. What they mean is that physical systems are becoming digital. Think of the Internet of Things (IoT) supercharged by artificial intelligence (AI).

But according to Distinguished Professor Genevieve Bell, these diagrams are missing something rather important: Humans and their social structures.

“Now for those of us who’ve come out of the social sciences and humanities, this is an excellent chart because of the work it does in tidying up history,” Bell said in her lecture at the Trinity Long Room Hub at Trinity College Dublin in July.

“It doesn’t help if what you want to think about was what else was going on. Each one of those technological transformations was also about profound shifts in cultural practice, social structure, social organisations, profoundly different ideas about citizenship, governance, regulation, ideas of civil and civic society.”

Another problem with this simplistic view is the way the Industry 4.0 folks attach dates to this chart. Steam power and mechanisation in 1760-1820 or so. Mass production from maybe 1870, but the most famous chapter being Henry Ford’s work in 1913. Then computers and automation started being used to manage manufacturing from 1950.

“That time scheme works really well if you’re in the West. It doesn’t hold if you’re in China or India or Latin America or Africa, where most of those things happened in the 20th century, many of them since 1945,” Bell said.

Bell wants to know what we can learn from those first three revolutions. She heads the 3A Institute at the Australian National University, which was launched in September 2017 and is working out how we should respond to, and perhaps even direct, the fourth revolution.

Take the steam engines of the first industrial revolution. They were built by blacksmiths and ironmongers, who knew what they needed to build the engines. But they didn’t know how to shape the industries the engines could power, or how to house them, or about the safety systems they’d need. These and other problems generated the new applied science of engineering. The first school of engineering, the École Polytechnique, was established in Paris in 1794.

The large-scale factories and railway systems of the second industrial revolution needed massive amounts of money. Raising and managing that money literally led to capitalism, and concepts like common stock companies and futures trading. And the first business school with funding from industry.

Early in the computer revolution, the US government had a problem. Nearly all of its computers relied on proprietary software from companies like IBM and Honeywell. So it asked Stanford University mathematician George Forsythe to create an abstract language for all computers. Two years later, his team developed a thing called computer science, and issued a standard 10-page curriculum. An updated version is still used globally today.

“So, engineering, business, and computer science: Three completely different applied sciences, emerging from three completely different technical regimes, with different impulses,” Bell said.

“Each starts out incredibly broad in terms of the ideas it draws on, rapidly narrows to a very clear set of theoretical tools and an idea about practice, then is scaled very quickly.”

With this in mind, Bell said that the fourth industrial revolution needs its own applied science, so that’s exactly what the 3A Institute is going to build — as the website puts it, “a new applied science around the management of artificial intelligence, data, and technology and of their impact on humanity”.

And the 3A Institute plans to do it by 2022.

Nine months into this grand project, it’s identified five sets of questions that this new science needs to answer.

First is Autonomy

If autonomous systems are operating without prewritten rules, how do we stop them turning evil, as so many fictional robots do? How do different autonomous systems interact? How do we regulate those interactions? How do you secure those systems and make them safe? How do the rules change when the systems cross national boundaries?

Or, as Bell asked, “What will it mean to live in a world where objects act without reference to us? And how do we know what they’re doing? And do we need to care?”

Second is Agency, which is really about the limits to an object’s autonomy. With an autonomous vehicle, for example, does it have to stop at the border? If so, which border? Determined by whom? Under what circumstances?

“Does your car then have to be updated because of Brexit, and if so how would you do that?” Bell asked.

If autonomous vehicles are following rules, how are those rules litigated? Do the rules sit on the object, or somewhere else? If there’s some network rule that gets vehicles off the road to let emergency vehicles through, who decides that and how? If you have multiple objects with different rule sets, how do they engage each other?

Third is Assurance, and as Bell explained, “sitting under it [is] a whole series of other words. Safety, security, risk, trust, liability, explicability, manageability.”

Fourth is Metrics

“The industrial revolution thus far has proceeded on the notion that the appropriate metric was an increase in productivity or efficiency. So machines did what humans couldn’t, faster, without lunch breaks, relentlessly,” Bell said.

Doing it over again, we might have done things differently, she said. We might have included environmental sustainability as a metric.

“What you measure is what you make, and so imagining that we put our metrics up at the front would be a really interesting way of thinking about this.”

Metrics for fourth revolution systems might include safety, quality of decision-making, and quality of data collection.

Some AI techniques, including deep learning, are energy intensive. Around 10 percent of the world’s energy already goes into running server farms. Maybe an energy efficiency metric would mean that some tasks would be done more efficiently by a human.

Fifth and finally are Interfaces. Our current systems for human-computer interaction (HCI) might not work well with autonomous systems.

“These are objects that you will live in, be moved around by, that may live in you, that may live around you and not care about you at all … the way we choose to engage with those objects feels profoundly different to the way HCI has gotten us up until this moment in time,” Bell said.

“What would it mean to [have] systems that were, I don’t know, nurturing? Caring? The robots that didn’t want to kill us, but wanted to look after us.”

As with computer science before it, the 3A Institute is developing a curriculum for this as-yet-unnamed new science. The first draft will be tested on 10 graduate students in 2019.

Bell’s speech in Dublin, titled “Managing the Machines”, included much more detail than reported here. Versions are being presented around the planet, and videos are starting to appear. This writer highly recommends them.

Four Emerging Technology Areas That Will Help Define Our World In 2019


Welcome to 2019....

2018 was surely a transformative year for technological innovation. We saw early development of ambient computing, quantum teleportation, cloaks of invisibility, genomics advancements and even robocops.

Granted, we’re not yet flying around in our own cars like the Jetsons, but we’re closer. In 2019 we will continue on the transformation path and expand further into adopting cutting-edge immersive technologies.

What’s ahead for the coming year? I envision four emerging technology areas that will significantly impact our lives in 2019.

1.  The Internet of Things and Smart Cities

The Internet of Things (IoT) refers to the general idea of devices and equipment that are readable, recognizable, locatable, addressable, and/or controllable via the internet. 

This includes everything from home appliances and wearable technology to cars. These days, if a device can be turned on, it most likely can be connected to the internet. Because of this, data can be shared quickly across a multitude of objects and devices, increasing the rate of communications.
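
To make the idea concrete, here is a minimal sketch of a connected device publishing a sensor reading over MQTT, a lightweight messaging protocol widely used in IoT systems. The broker address, topic and device name are placeholders, and the sketch assumes the paho-mqtt package (1.x API) is installed:

```python
# Minimal sketch: an IoT device publishing a sensor reading over MQTT.
# Assumes `pip install "paho-mqtt<2"`; broker and topic are hypothetical.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()                        # paho-mqtt 1.x constructor
client.connect("broker.example.com", 1883)    # placeholder broker address

reading = {"device_id": "thermostat-42", "temperature_c": 21.5}
client.publish("home/livingroom/temperature", json.dumps(reading))
client.disconnect()
```

Any other device or service subscribed to the same topic receives the reading almost immediately, which is what makes the rapid device-to-device data sharing described above possible.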

Cisco, which prefers the term “The Internet of Everything” for the “Internet of Things,” predicts that 50 billion devices (including our smartphones, appliances and office equipment) will be wirelessly connected via a network of sensors to the internet by 2020.

The term “Smart City” connotes creating a public/private infrastructure to conduct activities that protect and secure citizens. The concept of Smart Cities integrates communications (5G), transportation, energy, water resources, waste collections, smart-building technologies, and security technologies and services. They are the cities of the future.

IoT is the cog of Smart Cities that integrates these resources, technologies, services and infrastructure.

The research firm Frost & Sullivan estimates the combined global market potential of Smart City segments (transportation, healthcare, building, infrastructure, energy and governance) to be $1.5 trillion (with $20 billion on sensors alone by 2050, according to Navigant Technology).

The combined growth of IoT and Smart Cities will be a force to reckon with in 2019!

2.  Artificial Intelligence (AI)

Emergent artificial intelligence (AI), machine learning, human-computer interface, and augmented reality technologies are no longer science fiction. Head-spinning technological advances allow us to gain greater data-driven insights than ever before.

The ethical debate about AI is fervent, given the threatening implications of future technologies that can think like a human (or better) and make their own decisions. The creation of a “HAL”-type entity, as depicted in Stanley Kubrick’s film 2001: A Space Odyssey, is not far-fetched.

To truly leverage our ability to use data-driven insights, we need to make sure our thinking about how best to use this data keeps pace with its availability.

The vast majority of digital data is unstructured: a complex mesh of images, texts, videos and other data formats. Estimates suggest 80-90 percent of the world’s data is unstructured and growing at an increasingly rapid rate each day.

To even begin to make sense of this much data, advanced technologies are required. Artificial intelligence is the means by which this data is processed today, and it’s already a part of your everyday life.

In 2019, companies and governments will continue to develop technology that distributes artificial intelligence and machine learning software to millions of graphics and computer processors around the world. The question is how far away are we from a “Hal” with the ability for human analysis and techno emotions? 

3.  Quantum Computing

The world of computing has witnessed seismic advancements since the invention of the electronic calculator in the 1960s. The past few years in information processing have been especially transformational.

What were once thought of as science fiction fantasies are now technological realities. Classical computing has become exponentially faster and more capable, and our enabling devices smaller and more adaptable.

We are starting to evolve beyond classical computing into a new data era called quantum computing. It is envisioned that quantum computing will accelerate us into the future by impacting the landscape of artificial intelligence and data analytics.

Quantum computing’s power and speed will help us solve some of the biggest and most complex challenges we face as humans.

Gartner describes quantum computing as: “[T]he use of atomic quantum states to effect computation. Data is held in qubits (quantum bits), which have the ability to hold all possible states simultaneously. Data held in qubits is affected by data held in other qubits, even when physically separated. This effect is known as entanglement.” In simplified terms, quantum computers use quantum bits, or qubits, instead of the traditional binary bits of ones and zeros used in digital communications.
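
To ground those terms, here is a minimal numpy sketch, assuming nothing beyond standard linear algebra, of the two ideas in Gartner’s description: a single qubit holding a superposition of states, and a pair of entangled qubits (a Bell state):

```python
import numpy as np

# A single qubit in equal superposition of |0> and |1>:
# apply a Hadamard gate to the basis state |0>.
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
superposition = H @ ket0
print(np.abs(superposition) ** 2)  # [0.5 0.5]: both outcomes equally likely

# Two entangled qubits: the Bell state (|00> + |11>) / sqrt(2).
# Measuring one qubit immediately fixes the other, however far apart.
bell = np.zeros(4, dtype=complex)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)
print(np.abs(bell) ** 2)  # [0.5 0. 0. 0.5]: only 00 or 11, never 01 or 10
```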

Futurist Ray Kurzweil said that mankind will be able to “expand the scope of our intelligence a billion-fold” and that “the power of computing doubles, on average, every two years.” Recent breakthroughs in physics, nanotechnology and materials science have brought us into a computing reality that we could not have imagined a decade ago.
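
Taking the two-year doubling claim at face value, a quick back-of-the-envelope calculation shows what a billion-fold expansion would require:

```python
import math

doublings = math.log2(1e9)   # ~29.9 doublings for a billion-fold gain
years = doublings * 2        # at one doubling every two years
print(f"{doublings:.1f} doublings = ~{years:.0f} years")  # ~60 years
```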

As we get closer to a fully operational quantum computer, a new world of supercomputing beckons that will impact almost every aspect of our lives. In 2019 we are inching closer.

4.  Cybersecurity (and Risk Management)

Many corporations, organizations and agencies continued to be breached throughout 2018 despite their cybersecurity investments in information assurance. Cyber threats grow more sophisticated and deadly with each passing year. The firm Gemalto estimated that data breaches compromised 4.5 billion records in the first half of 2018, and a University of Maryland study found that hackers now attack computers every 39 seconds.

In 2019 we will be facing a new and more sophisticated array of physical security and cybersecurity challenges (including automated hacker tools) that pose significant risk to people, places and commercial networks.

The nefarious global threat actors are terrorists, criminals, hackers, organized crime, malicious individuals, and in some cases, adversarial nation states.

The physical has merged with the digital in the cybersecurity ecosystem. The more digitally interconnected we become in our work and personal lives, the more vulnerable we will become. Now everyone and anything connected is a target.

Cybersecurity is the digital glue that keeps IoT, Smart Cities, and our world of converged machines, sensors, applications and algorithms operational.

Addressing the 2019 cyber threat also requires a better, more calculated risk-awareness and management strategy from both the public and private sectors. A 2019 cybersecurity risk management strategy will need to be comprehensive, adaptive and elevated to the C-suite.

I have just touched on a few of the implications of four emerging technology areas that will have a significant impact on our lives in 2019.

These areas are just the tip of the iceberg as we really are in the midst of a paradigm shift in applied scientific knowledge.  We have entered a new renaissance of accelerated technological development that is exponentially transforming our civilization.

Yet with these benefits come risks. With such catalyzing innovation, we cannot afford to lose control. The real imperative for this new year is planning and systematic integration.

Hopefully that will provide us with a guiding technological framework that will keep us prosperous and safe.

Article by Chuck Brooks, special to Forbes Magazine
Chuck Brooks is an Advisor and Contributor to Cognitive World. In his full-time role he is the Principal Market Growth Strategist for General Dynamics Mission Systems.

AI and Nanotechnology Team Up to bring Humans to the brink of IMMORTALITY, top scientist claims


IMMORTAL: Human beings could soon live forever 

HUMAN beings becoming immortal is a step closer following the launch of a new start-up.

Dr Ian Pearson has previously said people will have the ability to “not die” by 2050 – just over 30 years from now.

Two of the methods he said humans might use were “body part renewal” and linking bodies with machines so that people are living their lives through an android.

But after Dr Pearson’s predictions, immortality may now be a step nearer following the launch of a new start-up.

Humai is hoping to make the immortality dream a reality with an ambitious plan.

Josh Bocanegra, the CEO of the company, said it hopes to use artificial intelligence technology to create its own human being within the next three decades.

He said: “We’re using artificial intelligence and nanotechnology to store data of conversational styles, behavioural patterns, thought processes and information about how your body functions from the inside-out.


“This data will be coded into multiple sensor technologies, which will be built into an artificial body with the brain of a deceased human.

“Using cloning technology, we will restore the brain as it matures.” 

Last year, UK-based stem cell bank StemProtect said it could eventually develop treatments that allow humans to live until 200.

Mark Hall, from StemProtect, said at the time: “In just the same way as we might replace a joint such as a hip with a specially made synthetic device, we can now replace cells in the body with new cells which are healthy and younger versions of the ones they’re replacing.

“That means we can replace diseased or ageing cells – and parts of the body – with entirely new ones which are completely natural and healthy.”

Watch Dr. Ian Pearson Talk About the Possibility of Immortality by 2050

University of Delaware: Programming DNA to deliver cancer drugs


Image credit: University of Delaware

DNA has an important job—it tells your cells which proteins to make. Now, a research team at the University of Delaware has developed technology to program strands of DNA into switches that turn proteins on and off.

UD’s Wilfred Chen Group describes their results in a paper published Monday, March 12 in the journal Nature Chemistry. This technology could lead to the development of new cancer therapies and other drugs.

Computing with DNA

This project taps into an emerging field known as DNA computing. Data we commonly send and receive in everyday life, such as text messages and photos, utilize binary code, which has two components—ones and zeroes. DNA is essentially a code with four components, the nucleotides guanine, adenine, cytosine, and thymine. In cells, the arrangement of these four nucleotides determines the output—the proteins made by the DNA. Here, scientists have repurposed the DNA code to design logic-gated DNA circuits.
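
The analogy between the two codes is direct: with four symbols instead of two, each nucleotide can carry two bits. Here is a minimal Python sketch of that mapping, offered purely as an illustration of the analogy rather than of the team’s actual circuits:

```python
# Each of the four nucleotides can stand in for two bits of binary data.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(bits: str) -> str:
    """Pack a binary string (even length) into a DNA sequence."""
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> str:
    """Unpack a DNA sequence back into the original bits."""
    return "".join(BASE_TO_BITS[base] for base in dna)

seq = encode("0100101101")   # 10 bits -> 5 nucleotides
print(seq, decode(seq))      # CAGTC 0100101101
```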

“Once we had designed the system, we had to first go into the lab and attach these DNA strands to various proteins we wanted to be able to control,” said study author Rebecca P. Chen, a doctoral student in chemical and biomolecular engineering (no relation to Wilfred Chen).

The custom-designed DNA strands were ordered from a manufacturer, while the proteins were made and purified in the lab. Next, the protein was attached to the DNA to make protein-DNA conjugates.

The group then tested the DNA circuits on E. coli bacteria and human cells. The target proteins organized, assembled, and disassembled in accordance with their design.

“Previous work has shown how powerful DNA nanotechnology might possibly be, and we know how powerful proteins are within cells,” said Rebecca P. Chen. “We managed to link those two together.”

Applications to drug delivery

The team also demonstrated that their DNA-logic devices could activate a non-toxic cancer prodrug, 5-fluorocytosine, into its toxic chemotherapeutic form, 5-fluorouracil. Cancer prodrugs are inactive until they are metabolized into their therapeutic form.

In this case, the scientists designed DNA circuits that controlled the activity of a protein responsible for converting the prodrug into its active form. The DNA circuit and protein activity were turned “on” by specific RNA/DNA sequence inputs, while in the absence of those inputs the system stayed “off.”

To do this, the scientists based their sequence inputs on microRNA, small RNA molecules that regulate cellular gene expression. MicroRNA in cancer cells contains anomalies that would not be found in healthy cells. For example, certain microRNA are present in cancer cells but absent in healthy cells. The group calculated how nucleotides should be arranged to activate the cancer prodrug in the presence of cancer microRNA, but stay inactive and non-toxic in a non-cancerous environment where the microRNA are missing.
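
The decision logic described above behaves like a Boolean AND gate over microRNA inputs. Here is a toy Python sketch of that behavior; the microRNA names are hypothetical placeholders, and the real circuits are implemented as DNA strand-displacement reactions rather than software:

```python
# Toy model of the logic described above: the prodrug-converting protein
# switches on only when the cancer-specific microRNA inputs are present.
REQUIRED_INPUTS = {"miR-A", "miR-B"}  # hypothetical cancer-associated microRNAs

def circuit_on(detected: set[str]) -> bool:
    """AND gate: every required microRNA input must be detected."""
    return REQUIRED_INPUTS <= detected

def drug_state(detected: set[str]) -> str:
    # 5-fluorocytosine stays non-toxic unless the circuit converts it.
    if circuit_on(detected):
        return "5-fluorouracil (toxic to the cell)"
    return "5-fluorocytosine (inactive)"

print(drug_state({"miR-A", "miR-B"}))  # cancer-like cell -> toxic form
print(drug_state({"miR-A"}))           # healthy-like cell -> stays inactive
```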

When the cancer microRNAs were present and able to turn the DNA circuit on, cells were unable to grow. When the circuit was turned off, cells grew normally.

Wilfred Chen (left) and Rebecca P. Chen are developing new biomolecular tools to address key global health problems. Credit: University of Delaware/ Evan Krape

This technology could have wide applications not only to other diseases besides cancer, but also beyond the biomedical field. For example, the research team demonstrated that their technology could be applied to the production of biofuels, by utilizing their technology to guide an enzymatic cascade, a series of chemical reactions, to break down a plant fiber.

Using the newly developed technology, researchers could target any DNA sequence of their choosing and attach and control any protein they want. Someday, researchers could “plug and play” programmed DNA into a variety of cells to address a variety of diseases, said study author Wilfred Chen, Gore Professor of Chemical Engineering.

“This is based on a very simple concept, a logical combination, but we are the first to make it work,” he said. “It can address a wide scope of problems, and that makes it very intriguing.”

More information: Rebecca P. Chen et al, Dynamic protein assembly by programmable DNA strand displacement, Nature Chemistry (2018). DOI: 10.1038/s41557-018-0016-9

Provided by: University of Delaware

“And Now for Something Completely Different” – Australian Physicists Have Proved That Time Travel is Possible


Scientists from the University of Queensland have used photons (single particles of light) to simulate quantum particles travelling through time. The research is cutting edge and the results could be dramatic!

Their research, entitled “Experimental simulation of closed timelike curves”, is published in the latest issue of Nature Communications.

The grandfather paradox states that if a time traveler were to go back in time, he could accidentally prevent his grandparents from meeting, and thus prevent his own birth. However, if he had never been born, he could never have traveled back in time in the first place. Such paradoxes arise largely from Einstein’s theory of relativity and one solution to its equations, the Gödel metric, which permits closed timelike curves.

How relativity works

Einstein’s theory of relativity is made up of two parts – general relativity and special relativity. Special relativity posits that space and time are aspects of the same thing, known as the space-time continuum, and that time can slow down or speed up, depending on how fast you are moving, relative to something else.

Gravity can also bend time, and Einstein’s theory of general relativity suggests that it would be possible to travel backwards in time by following a special space-time path, i.e. a closed timelike curve that returns to the starting point in space, but arrives at an earlier time.
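
Special relativity’s claim that time can slow down has a precise quantitative form: a clock moving at speed v runs slow by the Lorentz factor γ = 1/√(1 − v²/c²). A minimal sketch:

```python
import math

C = 299_792_458.0  # speed of light in a vacuum, m/s

def lorentz_factor(v: float) -> float:
    """Time dilation factor for a clock moving at speed v (v < c)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# At 99% of light speed, one year on board corresponds to ~7 years outside.
print(round(lorentz_factor(0.99 * C), 2))  # 7.09
```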

It was predicted in 1991 that quantum mechanics could avoid some of the paradoxes caused by Einstein’s theory of relativity, because quantum particles do not follow the rules of classical physics.

Read More: Parallel Worlds Exist And Interact With Our World, Say Physicists

“The question of time travel features at the interface between two of our most successful yet incompatible physical theories – Einstein’s general relativity and quantum mechanics. Einstein’s theory describes the world at the very large scale of stars and galaxies, while quantum mechanics is an excellent description of the world at the very small scale of atoms and molecules.” said Martin Ringbauer, a PhD student at UQ’s School of Mathematics and Physics and a lead author of the paper.

Simulating time travel

The scientists simulated the behavior of two photons interacting with each other in two different cases. In the first case, one photon passed through a wormhole and then interacted with its older self. In the second case, a photon traveled through normal space-time and interacted with another photon trapped inside a closed timelike curve forever.

“The properties of quantum particles are ‘fuzzy’ or uncertain to start with, so this gives them enough wiggle room to avoid inconsistent time travel situations,” said co-author Professor Timothy Ralph.

“Our study provides insights into where and how nature might behave differently from what our theories predict.”

Although it has been possible to simulate time travel with tiny quantum particles, the same might not be possible for larger particles or atoms, which are groups of particles.

Rice University: Designing Materials with ‘Stiffness and Flexibility’



Materials scientists at Rice University are looking to nature — at the discs in human spines and the skin in ocean-diving fish, for example — for clues about designing materials with seemingly contradictory properties — flexibility and stiffness.

In the research, graduate student Peter Owuor, research scientist Chandra Sekhar Tiwary and colleagues from the laboratories of Rice professors Pulickel Ajayan and Jun Lou found they could increase the stiffness, or “elastic modulus,” of a soft silicon-based polymer by infusing it with tiny pockets of liquid gallium.

Such composites could find use in high-energy absorption materials and shock absorbers and in biomimetic structures like artificial intervertebral discs, they said.

Owuor said conventional wisdom in composite design for the past 60 years has been that adding a harder substance increases modulus and adding a softer one decreases modulus. In most instances, that’s correct.

“People had not really looked at it from the other way around,” he said. “Is it possible to add something soft inside something else that is also soft and get something that has a higher modulus? If you look at the natural world, there are plenty of examples where you find exactly that. As materials scientists, we wanted to study this, not from a biological perspective but rather from a mechanical one.”
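
That conventional wisdom is usually quantified by the Voigt and Reuss rule-of-mixtures bounds, which bracket a composite’s modulus between weighted combinations of its constituents’ moduli. A minimal sketch with illustrative numbers, not the paper’s measurements:

```python
def voigt_modulus(e_matrix, e_filler, f):
    """Upper bound: phases strain equally (loaded in parallel)."""
    return (1 - f) * e_matrix + f * e_filler

def reuss_modulus(e_matrix, e_filler, f):
    """Lower bound: phases carry equal stress (loaded in series)."""
    return 1.0 / ((1 - f) / e_matrix + f / e_filler)

# Illustrative values only: a soft matrix with an even softer filler.
E_MATRIX, E_FILLER, FRACTION = 2.0, 0.5, 0.2  # MPa, MPa, volume fraction

print(round(voigt_modulus(E_MATRIX, E_FILLER, FRACTION), 2))  # 1.7 MPa
print(round(reuss_modulus(E_MATRIX, E_FILLER, FRACTION), 2))  # 1.25 MPa
```

Both bounds sit below the 2.0 MPa matrix modulus, which is exactly the expectation that the gallium-in-PDMS composites defy.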

For example, the discs between the vertebrae in human spines, which act like both shock absorbers and ligaments, are made of a tough outer layer of cartilage and a soft, jelly-like interior. And the outer skin of deep-diving ocean fish and mammals contains myriad tiny oil-filled chambers — some no larger than a virus and others larger than entire cells — that allow the animals to withstand the intense pressures that exist thousands of feet below the ocean’s surface.

Choosing the basic materials to model these living systems was relatively easy, but finding a way to bring them together to mimic nature proved difficult, said Tiwary, a postdoctoral research associate in Rice’s Department of Materials Science and NanoEngineering.

Polydimethylsiloxane, or PDMS, was chosen as the soft encapsulating layer for a number of reasons: It’s cheap, inert, nontoxic and widely used in everything from caulk and aquarium sealants to cosmetics and food additives. It also dries clear, which made it easy to see the bubbles of liquid the team wanted to encapsulate. For that, the researchers chose gallium, which like mercury is liquid at room temperature, but unlike mercury is nontoxic and relatively easy to work with.

Owuor said it took nearly four months to find a recipe for encapsulating bubbles of gallium inside PDMS. His test samples are about the diameter of a small coin and as much as a quarter-inch thick. By curing the PDMS slowly, Owuor developed a process by which he could add gallium droplets of various sizes. Some samples contained one large inner chamber, and others contained up to a dozen discrete droplets.

Each sample was subjected to dozens of tests. A dynamic mechanical analysis instrument was used to measure how much the material deformed under load, and various measures like stiffness, toughness and elasticity were measured under a variety of conditions. For example, with a relatively small amount of cooling, gallium can be turned into a solid. So the team was able to compare some measurements taken when the gallium spheres were liquid with measures taken when the spheres were solid.

Collaborators Roy Mahapatra and Shashishekarayya Hiremath of the Indian Institute of Science at Bangalore used finite element modeling and hydrodynamic simulations to help the team analyze how the materials behaved under mechanical stress. Based on this, the researchers determined that pockets of liquid gallium gave the composite higher energy absorption and dissipation characteristics than plain PDMS or PDMS with air-filled pockets.

“What we’ve shown is that putting liquid inside a solid is not always going to make it softer, and thanks to our collaborators we are able to explain why this is the case,” Tiwary said. “Next we hope to use this understanding to try to engineer materials to take advantage of these properties.”

Owuor and Tiwary said using nanoengineering alone may not provide the maximum effect. Instead, nature employs hierarchical structures with features of varying sizes that repeat at larger scales, like those found in the oil-filled chambers in fish skin.

“If you look at (the fish’s) membrane and you section it, there is a layer where you have spheres with big diameters, and as you move, the diameters keep decreasing,” Owuor said. “The chambers are seen across the whole scale, from the nano- all the way out to the microscale.”

Tiwary said, “There are important nanoscale features in nature, but it’s not all nano. We may find that engineering at the nanoscale alone isn’t enough. We want to see if we can start designing in a hierarchical way.”

Ajayan is chair of Rice’s Department of Materials Science and NanoEngineering, the Benjamin M. and Mary Greenwood Anderson Professor in Engineering and a professor of chemistry.

The research was supported by the Air Force Office of Scientific Research. Additional Rice co-authors include Lou, Alin Chipara and Robert Vajtai.

MIT: Powering up graphene implants without frying cells ~ For the Next Generation of Implants



This computational illustration shows a graphene network structure below a layer of water.

Image: Zhao Qin

New analysis finds way to safely conduct heat from graphene to biological tissues.

In the future, our health may be monitored and maintained by tiny sensors and drug dispensers, deployed within the body and made from graphene — one of the strongest, lightest materials in the world. Graphene is composed of a single sheet of carbon atoms, linked together like razor-thin chicken wire, and its properties may be tuned in countless ways, making it a versatile material for tiny, next-generation implants.

But graphene is incredibly stiff, whereas biological tissue is soft. Because of this, any power applied to operate a graphene implant could precipitously heat up and fry surrounding cells.

Now, engineers from MIT and Tsinghua University in Beijing have precisely simulated how electrical power may generate heat between a single layer of graphene and a simple cell membrane. While direct contact between the two layers inevitably overheats and kills the cell, the researchers found they could prevent this effect with a very thin, in-between layer of water.

By tuning the thickness of this intermediate water layer, the researchers could carefully control the amount of heat transferred between graphene and biological tissue. They also identified the critical power to apply to the graphene layer, without frying the cell membrane. The results are published today in the journal Nature Communications.

Co-author Zhao Qin, a research scientist in MIT’s Department of Civil and Environmental Engineering (CEE), says the team’s simulations may help guide the development of graphene implants and their optimal power requirements.

“We’ve provided a lot of insight, like what’s the critical power we can accept that will not fry the cell,” Qin says. “But sometimes we might want to intentionally increase the temperature, because for some biomedical applications, we want to kill cells like cancer cells. This work can also be used as guidance [for those efforts.]”

Qin’s co-authors include Markus Buehler, head of CEE and the McAfee Professor of Engineering, along with Yanlei Wang and Zhiping Xu of Tsinghua University.

Sandwich model

Typically, heat travels between two materials via vibrations in each material’s atoms. These atoms are always vibrating, at frequencies that depend on the properties of their materials. As a surface heats up, its atoms vibrate even more, causing collisions with other atoms and transferring heat in the process.

The researchers sought to accurately characterize the way heat travels, at the level of individual atoms, between graphene and biological tissue. To do this, they considered the simplest interface, comprising a small, 500-nanometer-square sheet of graphene and a simple cell membrane, separated by a thin layer of water.

“In the body, water is everywhere, and the outer surface of membranes will always like to interact with water, so you cannot totally remove it,” Qin says. “So we came up with a sandwich model for graphene, water, and membrane, that is a crystal clear system for seeing the thermal conductance between these two materials.”

Qin’s colleagues at Tsinghua University had previously developed a model to precisely simulate the interactions between atoms in graphene and water, using density functional theory — a computational modeling technique that considers the structure of an atom’s electrons in determining how that atom will interact with other atoms.

However, to apply this modeling technique to the group’s sandwich model, which comprised about half a million atoms, would have required an incredible amount of computational power. Instead, Qin and his colleagues used classical molecular dynamics — a mathematical technique based on a “force field” potential function, or a simplified version of the interactions between atoms — that enabled them to efficiently calculate interactions within larger atomic systems.

The researchers then built an atom-level sandwich model of graphene, water, and a cell membrane, based on the group’s simplified force field. They carried out molecular dynamics simulations in which they changed the amount of power applied to the graphene, as well as the thickness of the intermediate water layer, and observed the amount of heat that carried over from the graphene to the cell membrane.
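
For readers unfamiliar with classical molecular dynamics, here is a heavily simplified sketch of its core loop: a pairwise potential (Lennard-Jones, a common textbook force field) and velocity-Verlet integration. It illustrates only the shape of the method; the force fields used for graphene-water-membrane systems are far richer:

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces on each particle (reduced units)."""
    forces = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r_vec = pos[i] - pos[j]
            r = np.linalg.norm(r_vec)
            # F = -dU/dr for U = 4*eps*((sigma/r)**12 - (sigma/r)**6)
            f_mag = 24 * eps * (2 * (sigma / r) ** 12 - (sigma / r) ** 6) / r
            forces[i] += f_mag * r_vec / r
            forces[j] -= f_mag * r_vec / r
    return forces

def velocity_verlet(pos, vel, dt=1e-3, steps=1000, mass=1.0):
    """Integrate Newton's equations of motion step by step."""
    f = lj_forces(pos)
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * (f / mass) * dt ** 2
        f_new = lj_forces(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt
        f = f_new
    return pos, vel

# Three particles in a line, initially at rest (reduced units).
positions = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [3.0, 0.0, 0.0]])
velocities = np.zeros_like(positions)
positions, velocities = velocity_verlet(positions, velocities)
print(positions)
```

Tracking roughly half a million atoms this way, step after tiny step, is what makes even the “simplified” classical approach computationally demanding.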

Watery crystals

Because the stiffness of graphene and biological tissue is so different, Qin and his colleagues expected that heat would conduct rather poorly between the two materials, building up steeply in the graphene before flooding and overheating the cell membrane. However, the intermediate water layer helped dissipate this heat, easing its conduction and preventing a temperature spike in the cell membrane.

Looking more closely at the interactions within this interface, the researchers made a surprising discovery: Within the sandwich model, the water, pressed against graphene’s chicken-wire pattern, morphed into a similar crystal-like structure.

“Graphene’s lattice acts like a template to guide the water to form network structures,” Qin explains. “The water acts more like a solid material and makes the stiffness transition from graphene and membrane less abrupt. We think this helps heat to conduct from graphene to the membrane side.”

The group varied the thickness of the intermediate water layer in simulations, and found that a 1-nanometer-wide layer of water helped to dissipate heat very effectively. In terms of the power applied to the system, they calculated that about a megawatt of power per meter squared, applied in tiny, microsecond bursts, was the most power that could be applied to the interface without overheating the cell membrane.
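
Scaled to the simulated sheet, that power threshold corresponds to a vanishingly small energy per burst, as a quick calculation using the figures quoted above shows:

```python
power_density = 1e6   # W/m^2: the critical power density quoted above
side = 500e-9         # m: the simulated graphene sheet is 500 nm square
burst = 1e-6          # s: power is applied in microsecond bursts

area = side ** 2                                 # 2.5e-13 m^2
energy_per_burst = power_density * area * burst  # joules per burst
print(energy_per_burst)                          # ~2.5e-13 J
```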

Qin says going forward, implant designers can use the group’s model and simulations to determine the critical power requirements for graphene devices of different dimensions. As for how they might practically control the thickness of the intermediate water layer, he says graphene’s surface may be modified to attract a particular number of water molecules.

“I think graphene provides a very promising candidate for implantable devices,” Qin says. “Our calculations can provide knowledge for designing these devices in the future, for specific applications, like sensors, monitors, and other biomedical applications.”

This research was supported in part by the MIT International Science and Technology Initiative (MISTI): MIT-China Seed Fund, the National Natural Science Foundation of China, DARPA, the Department of Defense (DoD) Office of Naval Research, the DoD Multidisciplinary Research Initiatives program, the MIT Energy Initiative, and the National Science Foundation.
