New Form of Carbon is Stronger Than Graphene and Diamond

Chemists have calculated that chains of double- or triple-bonded carbon atoms, known as carbyne, should be stronger and stiffer than any known material.

The sixth element, carbon, has given us an amazing abundance of extraordinary materials. Once there were simply graphite and diamond. But in recent years chemists have added buckyballs, nanotubes and any number of exotic shapes created out of graphene, the molecular equivalent of chicken wire.

So it’s hard to believe that carbon has any more surprises up its sleeve. And yet today, Mingjie Liu and pals at Rice University in Houston calculate the properties of another form of carbon that is stronger, stiffer and more exotic than anything chemists have seen before.

The new material is called carbyne. It is a chain of carbon atoms linked either by alternating triple and single bonds or by consecutive double bonds.

Carbyne is something of a mystery. Astronomers believe they have detected its signature in interstellar space but chemists have been bickering for decades over whether they had ever created this stuff on Earth. A couple of years ago, however, they synthesised carbyne chains up to 44 atoms long in solution.

The thinking until now has been that carbyne must be extremely unstable. In fact, some chemists have calculated that two strands of carbyne coming into contact would react explosively.

Nevertheless, nanotechnologists have been fascinated by the potential of this material because it ought to be both strong and stiff, and therefore useful. But exactly how strong and how stiff, no one has been quite sure.

This is where Liu and co step in. These guys have calculated from first principles the bulk properties of carbyne and the results make for interesting reading.

For a start, they say that carbyne is about twice as stiff as the stiffest materials known today. Carbon nanotubes and graphene, for example, have a specific stiffness of 4.5 × 10^8 N∙m/kg, but carbyne tops them with a stiffness of around 10^9 N∙m/kg.

Just as impressive is the new material’s strength. Liu and co calculate that it takes around 10 nanonewtons to break a single strand of carbyne. “This force translates into a specific strength of 6.0–7.5×10^7 N∙m/kg, again significantly outperforming every known material including graphene (4.7–5.5×10^7 N∙m/kg), carbon nanotubes (4.3–5.0×10^7 N∙m/kg), and diamond (2.5–6.5×10^7 N∙m/kg),” they say.
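As a rough sanity check on those numbers (and not a calculation from the paper), converting a breaking force into a specific strength requires only the chain’s mass per unit length. The sketch below is a back-of-the-envelope estimate that assumes a carbon atom mass of 12 atomic mass units and an average spacing of about 1.28 ångströms between atoms along the chain; both figures are assumptions chosen for illustration.

```python
# Rough back-of-the-envelope check: breaking force -> specific strength.
# Assumed values (not taken from the paper): 12 u per carbon atom and
# ~1.28 angstrom average spacing between atoms along the chain.
ATOM_MASS_KG = 12 * 1.6605e-27      # mass of one carbon atom
ATOM_SPACING_M = 1.28e-10           # assumed average C-C spacing along the chain
BREAKING_FORCE_N = 10e-9            # ~10 nanonewtons, as quoted above

linear_density = ATOM_MASS_KG / ATOM_SPACING_M        # kg per metre of chain
specific_strength = BREAKING_FORCE_N / linear_density  # N.m/kg

print(f"linear density    ~ {linear_density:.2e} kg/m")
print(f"specific strength ~ {specific_strength:.1e} N.m/kg")
# Comes out near 6e7 N.m/kg, consistent with the 6.0-7.5e7 range quoted above.
```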

Carbyne has other interesting properties too. Its flexibility is somewhere between that of a typical polymer and double-stranded DNA. And when twisted, it can either rotate freely or become torsionally stiff depending on the chemical group attached to its end.

Perhaps most interesting is the Rice team’s calculation of carbyne’s stability. They agree that two chains in contact can react but there is an activation barrier that prevents this happening readily. “This barrier suggests the viability of carbyne in condensed phase at room temperature on the order of days,” they conclude.

All this should whet the appetite of nanotechnologists hoping to design ever more exotic nanomachines, such as nanoelectronic and spintronic devices. Given the advances being made in manufacturing this stuff, we may not have long to wait before somebody begins exploiting the extraordinary mechanical properties of carbyne chains for real.

Nanosensors could aid drug manufacturing

Chemical engineers find that arrays of carbon nanotubes can detect flaws in drugs and help improve production.
CAMBRIDGE, Mass. — MIT chemical engineers have discovered that arrays of billions of nanoscale sensors have unique properties that could help pharmaceutical companies produce drugs — especially those based on antibodies — more safely and efficiently.

Using these sensors, the researchers were able to characterize variations in the binding strength of antibody drugs, which hold promise for treating cancer and other diseases. They also used the sensors to monitor the structure of antibody molecules, including whether they contain a chain of sugars that interferes with proper function.

“This could help pharmaceutical companies figure out why certain drug formulations work better than others, and may help improve their effectiveness,” says Michael Strano, an MIT professor of chemical engineering and senior author of a recent paper describing the sensors in the journal ACS Nano.

The team also demonstrated how nanosensor arrays could be used to determine which cells in a population of genetically engineered, drug-producing cells are the most productive or desirable, Strano says.

Lead author of the paper is Nigel Reuel, a graduate student in Strano’s lab. The labs of MIT faculty members Krystyn Van Vliet, Christopher Love and Dane Wittrup also contributed, along with scientists from Novartis.

Testing drug strength

Strano and other scientists have previously shown that tiny, nanometer-sized sensors, such as carbon nanotubes, offer a powerful way to detect minute quantities of a substance. Carbon nanotubes are 50,000 times thinner than a human hair, and they can bind to proteins that recognize a specific target molecule. When the target is present, it alters the fluorescent signal produced by the nanotube in a way that scientists can detect.

Some researchers are trying to exploit large arrays of nanosensors, such as carbon nanotubes or semiconducting nanowires, each customized for a different target molecule, to detect many different targets at once. In the new study, Strano and his colleagues wanted to explore unique properties that emerge from large arrays of sensors that all detect the same thing.

The first feature they discovered, through mathematical modeling and experimentation, is that uniform arrays can measure the distribution in binding strength of complex proteins such as antibodies. Antibodies are naturally occurring molecules that play a key role in the body’s ability to recognize and defend against foreign invaders. In recent years, scientists have been developing antibodies to treat disease, particularly cancer. When those antibodies bind to proteins found on cancer cells, they stimulate the body’s own immune system to attack the tumor.

For antibody drugs to be effective, they must strongly bind their target. However, the manufacturing process, which relies on nonhuman, engineered cells, does not always generate consistent, uniformly binding batches of antibodies.

Currently, drug companies use time-consuming and expensive analytical processes to test each batch and make sure it meets the regulatory standards for effectiveness. However, the new MIT sensor could make this process much faster, allowing researchers to not only better monitor and control production, but also to fine-tune the manufacturing process to generate a more consistent product.

“You could use the technology to reject batches, but ideally you’d want to use it in your upstream process development to better define culture conditions, so then you wouldn’t produce spurious lots,” Reuel says.

Measuring weak interactions

Another useful trait of such sensors is their ability to measure very weak binding interactions, which could also help with antibody drug manufacturing.

Antibodies are usually coated with long sugar chains through a process called glycosylation. These sugar chains are necessary for the drugs to be effective, but they are extremely hard to detect because they interact so weakly with other molecules. Drug-manufacturing organisms that synthesize antibodies are also programmed to add sugar chains, but the process is difficult to control and is strongly influenced by the cells’ environmental conditions, including temperature and acidity.

Without the appropriate glycosylation, antibodies delivered to a patient may elicit an unwanted immune response or be destroyed by the body’s cells, making them useless.

“This has been a problem for pharmaceutical companies and researchers alike, trying to measure glycosylated proteins by recognizing the carbohydrate chain,” Strano says. “What a nanosensor array can do is greatly expand the number of opportunities to detect rare binding events. You can measure what you would otherwise not be able to quantify with a single, larger sensor with the same sensitivity.”

This tool could help researchers determine the optimal conditions for the correct degree of glycosylation to occur, making it easier to consistently produce effective drugs.

Mapping production

The third property the researchers discovered is the ability to map the production of a molecule of interest. “One of the things you would like to do is find strains of particular organisms that produce the therapeutic that you want,” Strano says. “There are lots of ways of doing this, but none of them are easy.”

The MIT team found that by growing the cells on a surface coated with an array of nanometer-sized sensors, they could detect the location of the most productive cells. In this study, they looked for an antibody produced by engineered human embryonic kidney cells, but the system could also be tailored to other proteins and organisms.

Once the most productive cells are identified, scientists look for genes that distinguish those cells from the less productive ones and engineer a new strain that is highly productive, Strano says.

The researchers have built a briefcase-sized prototype of their sensor that they plan to test with Novartis, which funded the research along with the National Science Foundation.

“Carbon nanotubes coupled to protein-binding entities are interesting for several areas of bio-manufacturing as they offer great potential for online monitoring of product levels and quality. Our collaboration has shown that carbon nanotube-based fluorescent sensors are applicable for such purposes, and I am eager to follow the maturation of this technology,” says Ramon Wahl, an author of the paper and a principal scientist at Novartis.

Growth of disorder of electrons measured in a dual-temperature system

Researchers at Aalto University and the University of Tokyo have succeeded for the first time in experimentally measuring a probability distribution for entropy production of electrons.

Entropy production means an increase in disorder when electrons are moved individually between two microscopic conductors of differing temperatures.

The researchers also showed that a connection exists between the two definitions of entropy production that have been used. The result is significant for the design of future nanoelectronic devices. The study was published recently in the scientific journal Nature Physics.

Similar experiments have been conducted before, but this is the first time that researchers have used conductors at different temperatures to measure the entropy production of electrons.

‘Entropy production is defined either by the time when the shift takes place or by the heat that moves from one conductor to another. In the study we measured electronic entropy production according to both definitions. The change in entropy in an individual measurement is random: the distribution for production is acquired by repeating the process about 100,000 times, for instance. Both distributions follow the so-called fluctuation relation’, says doctoral student Jonne Koski.

Fluctuation relations are relatively recent discoveries in thermodynamics and statistical physics. When the probability of producing a certain amount of electronic disorder, or entropy, is precisely known, the fluctuation relation gives the probability of the corresponding decrease in entropy. The degree of disorder of the electrons can therefore occasionally decline when the nanostructures are examined over short periods of time.
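For readers who want to see the relation itself: in its most common form it states that the probability of producing an amount of entropy ΔS and the probability of the reverse, a decrease by ΔS, are related by P(+ΔS)/P(−ΔS) = exp(ΔS/k_B). The short sketch below is only a numerical illustration of that identity, not a model of the Aalto–Tokyo experiment; it uses the textbook case of a Gaussian entropy-production distribution whose variance (with entropy measured in units of k_B) equals twice its mean, for which the relation holds exactly.

```python
# Toy illustration of the detailed fluctuation relation
#   P(+s) / P(-s) = exp(s),  with entropy s measured in units of k_B.
# A Gaussian with variance = 2 * mean satisfies it exactly; this is NOT a
# model of the electron-transfer experiment, just a numerical sanity check.
import numpy as np

mean = 2.0                   # assumed mean entropy production (units of k_B)
sigma = np.sqrt(2.0 * mean)  # standard deviation fixed by variance = 2 * mean

def p(s):
    """Gaussian probability density of producing entropy s."""
    return np.exp(-(s - mean) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

for s in (0.5, 1.0, 2.0, 3.0):
    ratio = p(s) / p(-s)
    print(f"s = {s:3.1f} k_B:  P(+s)/P(-s) = {ratio:8.3f},  exp(s) = {np.exp(s):8.3f}")
# The two columns agree, and P(-s) is nonzero: entropy can transiently decrease.
```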

‘Entropy production leads to overheating in the nanostructures, which is why it is important to get more information on their heat transmission properties’, observes Professor Jukka Pekola.

Quantum teleportation: Transfer of flying quantum bits at the touch of a button

Hybrid technology makes possible highly reliable transmission of photonic qubits

By means of the quantum-mechanical entanglement of spatially separated light fields, researchers in Tokyo and Mainz have managed to teleport photonic qubits with extreme reliability. This means that a decisive breakthrough has been achieved some 15 years after the first experiments in the field of optical teleportation. The success of the experiment conducted in Tokyo is attributable to the use of a hybrid technique in which two conceptually different and previously incompatible approaches were combined. “Discrete digital optical quantum information can now be transmitted continuously – at the touch of a button, if you will,” explained Professor Peter van Loock of Johannes Gutenberg University Mainz (JGU). As a theoretical physicist, van Loock advised the experimental physicists in the research team headed by Professor Akira Furusawa of the University of Tokyo on how they could most efficiently perform the teleportation experiment to ultimately verify the success of quantum teleportation. Their findings have now been published in the prestigious specialist journal Nature.
Quantum teleportation involves the transfer of arbitrary quantum states from a sender, dubbed Alice, to a spatially distant receiver, named Bob. This requires that Alice and Bob initially share an entangled quantum state across the space in question, e.g., in the form of entangled photons. Quantum teleportation is of fundamental importance to the processing of quantum information (quantum computing) and quantum communication. Photons are especially valued as ideal information carriers for quantum communication since they can be used to transmit signals at the speed of light. A photon can represent a quantum bit or qubit analogous to a binary digit (bit) in standard classical information processing. Such photons are known as ‘flying quantum bits’.

The first attempts to teleport single photons or light particles were made by the Austrian physicist Anton Zeilinger. Various other related experiments have been performed in the meantime. However, teleportation of photonic quantum bits using conventional methods proved to have its limitations because of experimental deficiencies and difficulties with fundamental principles.

What makes the experiment in Tokyo so different is the use of a hybrid technique. With its help, a completely deterministic and highly reliable quantum teleportation of photonic qubits has been achieved. The accuracy of the transfer was 79 to 82 percent for four different qubits. In addition, the qubits were teleported much more efficiently than in previous experiments, even at a low degree of entanglement.

The concept of entanglement was first formulated by Erwin Schrödinger and involves a situation in which two quantum systems, such as two light particles for example, are in a joint state, so that their behavior is mutually dependent to a greater extent than is normally (classically) possible. In the Tokyo experiment, continuous entanglement was achieved by means of entangling many photons with many other photons. This meant that the complete amplitudes and phases of two light fields were quantum correlated. Previous experiments only had a single photon entangled with another single photon – a less efficient solution.

“The entanglement of photons functioned very well in the Tokyo experiment – practically at the press of a button, as soon as the laser was switched on,” said van Loock, Professor for Theory of Quantum Optics and Quantum Information at Mainz University. This continuous entanglement was accomplished with the aid of so-called ‘squeezed light’, which takes the form of an ellipse in the phase space of the light field. Once entanglement has been achieved, a third light field can be attached to the transmitter. From there, in principle, any state and any number of states can be transmitted to the receiver. “In our experiment, there were precisely four sufficiently representative test states that were transferred from Alice to Bob using entanglement. Thanks to continuous entanglement, it was possible to transmit the photonic qubits in a deterministic fashion to Bob, in other words, in each run,” added van Loock.

Earlier attempts to achieve optical teleportation were performed differently and, before now, the concepts used have proved to be incompatible. Although in theory it had already been assumed that the two different strategies, from the discrete and the continuous world, needed to be combined, it represents a technological breakthrough that this has actually now been experimentally demonstrated with the help of the hybrid technique. “The two separate worlds, the discrete and the continuous, are starting to converge,” concluded van Loock.

Graphene nanoscrolls are formed by decoration with magnetic nanoparticles

Researchers at Umeå University, together with researchers at Uppsala University and Stockholm University, show in a new study how nitrogen-doped graphene can be rolled into perfect Archimedean nanoscrolls by adhering magnetic iron oxide nanoparticles to the surface of the graphene sheets. The new material may have very good properties for use as electrodes in, for example, Li-ion batteries.

Graphene is one of the most interesting materials for future applications in everything from high-performance electronics and optical components to flexible, strong materials. Ordinary graphene consists of carbon sheets that are a single atomic layer, or a few atomic layers, thick.

In the study, the researchers modified the graphene by replacing some of the carbon atoms with nitrogen atoms. This provides anchoring sites for the iron oxide nanoparticles, which are decorated onto the graphene sheets in a solution process. In the decoration process, one can control the type of iron oxide nanoparticles formed on the graphene surface, so that they form either so-called hematite (the reddish form of iron oxide often found in nature) or maghemite, a less stable and more magnetic form of iron oxide.

“Interestingly, we observed that when the graphene is decorated with maghemite, the graphene sheets spontaneously start to roll into perfect Archimedean nanoscrolls, while when decorated with the less magnetic hematite nanoparticles the graphene remains as open sheets,” says Thomas Wågberg, senior lecturer at the Department of Physics at Umeå University.

The nanoscrolls can be visualized as traditional “Swiss rolls” where the sponge-cake represents the graphene, and the creamy filling is the iron oxide nanoparticles. The graphene nanoscrolls are however around one million times thinner.

The results, now published in Nature Communications, are conceptually interesting for several reasons. They show that the magnetic interaction between the iron oxide nanoparticles is one of the main effects behind the scroll formation. They also show that the nitrogen defects in the graphene lattice are necessary both for stabilizing a sufficiently high number of maghemite nanoparticles and for “buckling” the graphene sheets, thereby lowering the formation energy of the nanoscrolls.

The process is extraordinarily efficient: almost 100 percent of the graphene sheets are scrolled. After the decoration with maghemite particles, the research team could not find any open graphene sheets.
Moreover, they showed that removing the iron oxide nanoparticles by acid treatment makes the nanoscrolls open up again and return to single graphene sheets.

“Besides adding valuable fundamental understanding of the physics and chemistry of graphene, nitrogen doping and nanoparticles, we have reason to believe that the iron oxide decorated, nitrogen-doped graphene nanoscrolls have very good properties for use as electrodes in, for example, Li-ion batteries, one of the most important battery types in everyday electronics,” says Thomas Wågberg.
The study was conducted within “The artificial leaf” project, which is funded by the Knut and Alice Wallenberg Foundation and awarded to physicists, chemists, and plant science researchers at Umeå University.

Image: Snapshot of a partially re-opened nanoscroll. The atomically thin graphene resembles a thin foil with a few wrinkles.

NJIT Researchers Examine Dynamics of Liquid Metal Particles at Nanoscale

Two NJIT researchers have demonstrated that, using a continuum-based approach, they can explain the dynamics of liquid metal particles on a substrate at the nanoscale. Their paper, “Numerical simulation of ejected molten metal nanoparticles liquified by laser irradiation: Interplay of geometry and dewetting,” appeared in Physical Review Letters (July 16, 2013).

The evolution of fluid drops deposited on solid substrates has been the focus of a large research effort for decades, said co-author Shahriar Afkhami, an assistant professor in the NJIT Department of Mathematical Sciences. This effort has become particularly extensive on the nanoscale, due to the relevance of nanostructures in a variety of fields, ranging from DNA sequencing to plasmonics and nanomagnetism. The research also applies to liquid crystal displays and solar panel designs.
In this work, Afkhami, together with NJIT Professor Lou Kondic, also of the Department of Mathematical Sciences, studied liquid metal nanostructures placed on solid substrates. The study is of direct relevance to the self- and directed assembly of metal nanoparticles on surfaces. For example, the size and distribution of metallic particles strongly affect the yield of solar cell devices, Afkhami said.

The researchers demonstrate, however, that a continuum-based approach remains appropriate on the nanoscale, where the basic assumptions of continuum fluid mechanics are pushed to their limits. The pair’s research is the first attempt to use state-of-the-art simulations based on continuum fluid mechanics to explain the dynamics of liquid metal particles on a substrate at the nanoscale.

“We demonstrated that continuum simulations provide good qualitative agreement with atomistic simulations on length scales in the range of 1–10 nm, and with physical experiments at length scales of around 100 nanometers,” added Kondic.

Kondic is involved in the mathematical modeling and simulation of granular materials, as well as in the development of numerical methods for highly nonlinear partial differential equations related to the flows of thin liquid films. In 2005, Kondic received a Fulbright Foundation grant and traveled to Argentina to study the dynamics of non-Newtonian liquid films involving contact lines. He currently leads four federally funded projects totaling more than $800,000.

Afkhami uses computational and mathematical modeling to help researchers better understand a range of real-life engineering phenomena. His work includes examining biomedical systems, polymers and plastics, microfluidics and nanomaterials. His research addresses the existence of solutions and other issues involving fluid flows, from stability to asymptotic behavior.

Afkhami’s current research project uses numerical methods to better understand the dynamics of fluid mixtures. The effort ties into his new three-year, $252,000 NSF grant (2013–16) to develop a state-of-the-art computational framework for polymeric liquids. The fruits of this labor should eventually have a broad effect on complex applications, from how blood and other bodily fluids flow in microfluidic devices to better ways of improving the flow of emulsions when blending or processing polymers.

Pass the salt: Common condiment could enable new high-tech industry

Chemists at Oregon State University have identified a compound that could significantly reduce the cost and potentially enable the mass commercial production of silicon nanostructures – materials that have huge potential in everything from electronics to biomedicine and energy storage.

This extraordinary compound is called table salt.

Simple sodium chloride, most frequently found in a salt shaker, has the ability to solve a key problem in the production of silicon nanostructures, researchers just announced in Scientific Reports, a professional journal.

By melting and absorbing heat at a critical moment during a “magnesiothermic reaction,” the salt prevents the collapse of the valuable nanostructures that researchers are trying to create. The molten salt can then be washed away by dissolving it in water, and it can be recycled and used again.

The concept, surprising in its simplicity, should open the door to wider use of these remarkable materials that have stimulated scientific research all over the world.

“This could be what it takes to open up an important new industry,” said David Xiulei Ji, an assistant professor of chemistry in the OSU College of Science. “There are methods now to create silicon nanostructures, but they are very costly and can only produce tiny amounts.

“The use of salt as a heat scavenger in this process should allow the production of high-quality silicon nanostructures in large quantities at low cost,” he said. “If we can get the cost low enough many new applications may emerge.”

Silicon, the second most abundant element in the Earth’s crust, has already created a revolution in electronics. But silicon nanostructures, which are complex structures much smaller than a speck of dust, have potential that goes far beyond the element itself.

Uses are envisioned in photonics, biological imaging, sensors, drug delivery, thermoelectric materials that can convert heat into electricity, and energy storage.

Batteries are one of the most obvious and possibly first applications that may emerge from this field, Ji said. It should be possible with silicon nanostructures to create batteries – for anything from a cell phone to an electric car – that last nearly twice as long before they need recharging.

Existing technologies to make silicon nanostructures are costly, and simpler approaches in the past have not worked because they required such high temperatures. Ji developed a methodology that mixed sodium chloride and magnesium with diatomaceous earth, a cheap and abundant form of silica.

When the temperature reached 801 degrees Celsius, the melting point of sodium chloride, the salt melted and absorbed heat in the process. This basic chemical concept – a solid melting into a liquid absorbs heat – kept the nanostructures from collapsing.
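A rough heat balance shows why a melting salt can plausibly tame this reaction. The sketch below compares the heat released by the magnesiothermic reduction of silica (SiO2 + 2Mg → Si + 2MgO) with the heat absorbed when sodium chloride melts, using approximate textbook enthalpy values; neither the numbers nor the resulting salt-to-silica ratio come from the OSU paper.

```python
# Back-of-the-envelope heat balance for the magnesiothermic reaction
#   SiO2 + 2 Mg -> Si + 2 MgO
# using approximate standard enthalpies (kJ/mol); these values and the
# resulting salt-to-silica ratio are illustrative, not from the OSU study.
H_F_SIO2 = -911.0   # enthalpy of formation of SiO2 (quartz)
H_F_MGO = -601.6    # enthalpy of formation of MgO
H_FUS_NACL = 28.2   # heat absorbed when 1 mol NaCl melts at ~801 degrees C

heat_released = abs(2 * H_F_MGO - H_F_SIO2)   # kJ per mole of SiO2 reduced
moles_nacl_needed = heat_released / H_FUS_NACL

print(f"heat released: ~{heat_released:.0f} kJ per mol SiO2")
print(f"NaCl needed to absorb it as latent heat alone: ~{moles_nacl_needed:.0f} mol")
# In practice the salt also soaks up sensible heat while warming toward its
# melting point, so somewhat less salt is needed than this estimate suggests.
```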

The sodium chloride did not contaminate or otherwise affect the reaction, researchers said. Scaling reactions such as this up to larger commercial levels should be feasible, they said.

The study also created, for the first time with this process, nanoporous composite materials of silicon and germanium. These could have wide applications in semiconductors, thermoelectric materials and electrochemical energy devices.

Funding for the research was provided by OSU. Six other researchers from the Department of Chemistry and the OSU Department of Chemical Engineering also collaborated on the work.