
Friday 30 March 2012

New understanding of how materials change when rapidly heated

Engineerblogger
March 30, 2012


Behrad Gholipour led the team that provided the phase change materials. Credit: University of Southampton.

A collaboration between the University of Southampton and the University of Cambridge has produced ground-breaking advances in our understanding of the changes that materials undergo when rapidly heated.

Using cutting-edge equipment and specially designed MEMS sensors on loan from Mettler-Toledo, scientists from the University of Southampton’s Optoelectronics Research Centre and the University of Cambridge’s Department of Materials Science were able to probe the behaviour of phase change memory materials, the semiconductors that store information in the next generation of electronics, as they were heated at rates of up to 10,000 degrees C per second.

Insight and a detailed understanding of the measurement results were provided by Professor Lindsay Greer of the University of Cambridge’s Department of Materials Science, whose analysis showed that crystal growth rates differed considerably from those of other materials such as glass and silicon, and that the behaviour of these materials on such rapid heating was not as expected.

The results, which are published this week in Nature Materials, show that crystal growth rates in these materials are much faster than previously believed and that the growth behaviour is independent of the surroundings. While it is not surprising that the properties of materials change significantly when they are shrunk to nanoscale dimensions, we now have a method of directly screening materials for improved memory performance; this means faster, smaller and less power-hungry smartphones, iPods and computers are one step closer.

Professor Dan Hewak from the University of Southampton, whose team, led by Behrad Gholipour, provided the phase change materials and deposited them as very thin films, comments:

“We have been studying novel glasses and phase change materials for two decades here at the Optoelectronics Research Centre. However, our understanding of what happens when these materials are heated, that is, their crystallization and melting behaviours, has been limited to heating rates of about 10 degrees C per minute using conventional thermal analysis. In reality, in the memory devices we fabricate, heating rates are millions of times faster, and it is reasonable to expect that in order to improve these devices, an understanding of their properties at the same heating rates at which they will be used is needed.”
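
The rate gap Hewak describes converts directly; here is a back-of-the-envelope sketch in Python, using only the two rates quoted above:

    # Heating-rate comparison from the figures quoted above.
    conventional = 10 / 60.0   # conventional thermal analysis: 10 deg C/min, in deg C/s
    flash = 10_000.0           # the new measurements: 10,000 deg C/s

    print(f"{flash / conventional:,.0f}x faster")   # ~60,000x
    # Real memory devices heat "millions of times faster" than conventional
    # analysis, i.e. faster still than even these measurements.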

Writing in the same issue of Nature Materials, Professors Matthias Wuttig and Martin Salinga at RWTH Aachen University in Germany explain why this breakthrough is so important: “Jiri Orava (Cambridge University) and colleagues now provide a completely new insight into our understanding of the fast transformations that occur in the materials that make up today’s memory devices. Reading and writing of data in optical memory such as rewriteable compact discs (CD-RWs and DVDs) and emerging new electronic memory can take place at speeds of tens of nanoseconds, but our understanding of what happens when these materials are heated is based on experiments where heating rates are much slower.”

“Unravelling the mysteries of chocolate making, comprehending the formation of amethyst geodes, or producing advanced steels requires an understanding of the relevant crystallization phenomena.”

Source: University of Southampton 


Two-in-one device uses sewage as fuel to make electricity and clean the sewage

Engineerblogger
March 30, 2012


New “microbial fuel cell” cleans municipal sewage and generates electricity at the same time.

Scientists today described a new and more efficient version of an innovative device the size of a home washing machine that uses bacteria growing in municipal sewage to make electricity and clean up the sewage at the same time. Their report here at the 243rd National Meeting & Exposition of the American Chemical Society (ACS), the world’s largest scientific society, suggested that commercial versions of the two-in-one device could be a boon for the developing world and water-short parts of the U.S.

“Our prototype incorporates innovations so that it can process five times more sewage, six times more efficiently, at half the cost of its predecessors,” said Orianna Bretschger, Ph.D., who presented a report on the improved technology at the ACS meeting.

“We’ve improved its energy recovery capacity from about 2 percent to as much as 13 percent, which is a great step in the right direction. That actually puts us in a realm where we could produce a meaningful amount of electricity if this technology is implemented commercially. Eventually, we could have wastewater treatment for free. That could mean availability for cleaner water in the developing world, or in southern California and other water-short areas of the United States through the use of more wastewater recycling technologies,” she said.

Current wastewater treatment technology involves a number of steps designed to separate the solid and liquid components of sewage and clean the wastewater before it is released into a waterway. This often involves settling tanks, macerators that break down larger objects, membranes to filter particles, biological digestion steps and chemicals that kill harmful microbes. One estimate puts their energy use at 2 percent of overall consumption in the U.S.

Bretschger’s team at the J. Craig Venter Institute is developing one version of a so-called microbial fuel cell (MFC). Traditional fuel cells, like those used on the Space Shuttles and envisioned for cars in the future “hydrogen economy,” convert fuel directly into electricity without igniting the fuel. They react or combine hydrogen and oxygen, for instance, and produce electricity and drinkable water. MFCs are biological fuel cells. They use organic matter, such as the material in sewage, as fuel, and microbes break down the organic matter. In the process of doing so, the bacteria produce electrons, which have a negative charge and are the basic units of electricity. Electricity consists of a flow of electrons or other charges through a circuit.

The new MFC uses ordinary sewage obtained from a conventional sewage treatment plant. Microbes that exist naturally in the sewage produce electrons as they metabolize, or digest, organic material in the sludge. Bretschger found that the MFC’s microbial community includes organisms that might even break down potentially harmful pollutants, such as benzene and toluene, that may be present in the sludge.

An MFC consists of a sealed chamber in which the microbes grow in a film on an electrode, which receives their electrons. Meanwhile, positively-charged units termed protons pass through a membrane to a second, unsealed container. In that container, microbes growing on another electrode combine oxygen with those protons and the electrons flowing as electricity from the electrode in the sealed chamber, producing water or other products like hydrogen peroxide.

Bretschger said the MFC also is quite effective in treating sewage to remove organic material, and data suggest a decrease in disease-causing microbes.

“We remove about 97 percent of the organic matter,” she said. “That sounds clean, but it is not quite clean enough to drink. In order to get to potable, you need 99.99 percent removal and more complete disinfection of the water.” Still, she suggested their MFC might one day replace some of the existing steps in municipal wastewater treatment.

The group presented their first MFC last year. Since then, they increased the amount of waste their device could handle each week from 20 gallons to 100 gallons, trucked in from a local treatment plant near San Diego. They also replaced the titanium components with a polyvinyl chloride (PVC) frame and graphite electrodes. Because of that, the new fuel cell costs about $150 per gallon, half the cost of their previous prototype. The group hopes eventually to bring the cost to $20 per gallon or less, making it cost-competitive with existing water treatment technologies.

Bretschger reported that the new device is also more than six times as efficient as its predecessor, turning 13 percent of the usable energy in the sludge into electricity. While this only generates a small current, Bretschger explained that a large device running at 20-25 percent efficiency could produce enough power to operate a conventional wastewater treatment plant. A typical sewage treatment plant may consume enough electricity to power 10,000 or more homes, according to some estimates.
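
Those figures hang together arithmetically; a quick consistency check using only the numbers quoted in this article:

    # Cross-checking the MFC prototype figures quoted above.
    old_eff, new_eff = 0.02, 0.13          # energy recovery: 2% -> 13%
    print(f"{new_eff / old_eff:.1f}x")     # 6.5x -> "more than six times as efficient"

    old_cap, new_cap = 20, 100             # gallons of sewage handled per week
    print(f"{new_cap / old_cap:.0f}x")     # 5x  -> "five times more sewage"

    cost_now, cost_target = 150, 20        # dollars per gallon (article's cost metric)
    print(f"{cost_now / cost_target:.1f}x")  # 7.5x further cost reduction still needed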

The scientists acknowledged funding from the California State PIER EISG program and the San Diego Foundation Blasker Science and Technology award.


Source: American Chemical Society

Moving microfluidics from the lab bench to the factory floor

Engineerblogger
March 30, 2012


The Center for Polymer Microfabrication is designing processes for manufacturing microfluidic chips. Pictured here is a chip fabricated by the center's tailor-made production machines. Photo: Melinda Hale

In the not-too-distant future, plastic chips the size of flash cards may quickly and accurately diagnose diseases such as AIDS and cancer, as well as detect toxins and pathogens in the environment. Such lab-on-a-chip technology — known as microfluidics — works by flowing fluid such as blood through microscopic channels etched into a polymer’s surface. Scientists have devised ways to manipulate the flow at micro- and nanoscales to detect certain molecules or markers that signal disease.

Microfluidic devices have the potential to be fast, cheap and portable diagnostic tools. But for the most part, the technology hasn’t yet made it to the marketplace. While scientists have made successful prototypes in the laboratory, microfluidic devices — particularly for clinical use — have yet to be manufactured on a wider scale.

MIT's David Hardt is working to move microfluidics from the lab to the factory. Hardt heads the Center for Polymer Microfabrication — a multidisciplinary research group funded by the Singapore-MIT Alliance — which is designing manufacturing processes for microfluidics from the ground up. The group is analyzing the behavior of polymers under factory conditions, building new tools and machines to make polymer-based chips at production levels, and designing quality-control processes to check a chip’s integrity at submicron scales — all while minimizing the cost of manufacturing.

“These are devices that people want to make by the millions, for a few pennies each,” says Hardt, the Ralph E. and Eloise F. Cross Professor of Mechanical Engineering at MIT. “The material cost is close to zero, there’s not enough plastic here to send a bill for. So you have to get the manufacturing cost down.”

Micromachines

Hardt and his colleagues found that in making microfluidic chips, many research groups and startups have adopted equipment mainly from the semiconductor industry. Hardt says this equipment — such as nano-indenting and bonding machines — is incredibly expensive, and was never designed to work on polymer-based materials. Instead, Hardt's team looked for ways to design cheaper equipment that’s better suited to work with polymers.

The group focused on an imprinting technique called microembossing, in which a polymer is heated, then stamped with a pattern of tiny channels. In experiments with existing machines, the researchers discovered a flaw in the embossing process: When they tried to disengage the stamping tool from the cooled chip, much of the plastic ripped out with it.

To prevent embossing failures in a manufacturing setting, the team studied the interactions between the cooling polymer and the embossing tool, measuring the mechanical forces between the two. The researchers then used the measurements to build embossing machines specifically designed to minimize polymer “stickiness.” In experiments, the group found that the machines fabricated chips quickly and accurately, “at very low cost,” Hardt says. “In many cases it makes sense to build your own equipment for the task at hand,” he adds.

In addition to building microfluidic equipment, Hardt and his team are coming up with innovative quality-control techniques. Unlike automobile parts on an assembly line that can be quickly inspected with the naked eye, microfluidic chips carry tiny features, some of which can only be seen with a high-resolution microscope. Checking every feature on even one chip is a time-intensive exercise.

Hardt and his colleagues came up with a fast and reliable way to gauge the “health” of a chip’s production process. Instead of checking whether every channel on a chip has been embossed, the group added an extra feature — a tiny X — to the chip pattern. They designed the feature to be more difficult to emboss than the rest of the chip. Hardt says how sharply the X is stamped is a good indication of whether the rest of the chip has been rendered accurately.
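
In software terms the check reduces to a single threshold test on the sentinel feature. The sketch below is hypothetical (the scoring function and threshold are invented for illustration), but it captures the logic of letting one hard-to-emboss feature stand in for the whole chip:

    # Hypothetical sketch of the sentinel-"X" quality check described above.
    # measure_sharpness stands in for whatever metrology scores how crisply
    # the X was embossed, returning a value between 0 and 1.

    SHARPNESS_THRESHOLD = 0.8   # invented value; calibrate against known-good chips

    def chip_passes(measure_sharpness, chip_id):
        """If the hardest-to-emboss feature came out well, assume the rest did too."""
        return measure_sharpness(chip_id, feature="sentinel_X") >= SHARPNESS_THRESHOLD

    def fake_metrology(chip_id, feature):
        return 0.92   # stand-in measurement for this demo

    print(chip_passes(fake_metrology, "chip-0001"))   # True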

Jumpstarting an industry

The group’s ultimate goal is to change how manufacturing is done. Typically, an industry builds up its production processes gradually, making adjustments and improvements over time. Hardt says the semiconductor industry is a prime example of manufacturing’s iterative process.

“Now what they do in manufacturing is impossibly difficult, but it’s been a series of small incremental improvements over years,” Hardt says. “We’re trying to jumpstart that and not wait until industry identifies all these problems when they’re trying to make a product.”

The group is now investigating ways to design a “self-correcting factory” in which products are automatically tested. If the product doesn’t work, Hardt envisions the manufacturing process changing in response, adjusting settings on machines to correct the process. For example, the team is looking for ways to evaluate how fluid flows through a manufactured chip. The point at which two fluids mix within a chip should be exactly the same in every chip produced. If that mixing point drifts from chip to chip, Hardt and his colleagues have developed algorithms that adjust equipment to correct the drift.
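
A minimal version of that correction loop might look like the following sketch (hypothetical: the setpoint, gain and machine interface are invented here, and the group's actual algorithms are not described in this article):

    # Hypothetical proportional correction of mixing-point drift.
    SETPOINT_UM = 500.0   # target mixing-point position in the channel (invented)
    GAIN = 0.1            # fraction of the measured error corrected per chip

    def corrected_setting(setting, measured_mixing_point_um):
        """Nudge an embossing setting to pull the mixing point back on target."""
        drift = measured_mixing_point_um - SETPOINT_UM
        return setting - GAIN * drift

    setting = 100.0
    for measured in (510.0, 508.0, 503.0):   # successive chips off the line
        setting = corrected_setting(setting, measured)
        print(f"adjusted setting: {setting:.2f}")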

Holger Becker, co-founder of Microfluidic ChipShop, a lab-on-a-chip production company in Jena, Germany, says the center's research plays an important role in understanding the different processes involved in large-scale production of microfluidics.

"Most of the academic work in microfluidics concentrates on applications, and unfortunately only very few concentrate on the actual manufacturing technologies suited for industrialization," Becker says. "David Hardt's team takes a very holistic approach looking into all different process steps and the complete manufacturing process instead of individual technologies."

“We’re at the stage where we’d like industry to know what we’re doing,” Hardt says. “We’ve been sort of laboring in the vineyard for years, and now we have this base, and it could get to the point where we’re ahead of the group.”

Source: MIT

Technology anticipates, meets our needs for health, efficiency

Engineerblogger
March 30, 2012


Credit: WSU

We have all heard of the smartphone and, any day now, most of us will have one. Not far behind: the smart home.

Writing in the latest issue of the journal Science, Washington State University’s Diane Cook says it won’t be long before our homes act as "intelligent agents" that use sensors and software to anticipate our needs and tend to tasks that improve our health, energy efficiency, even social media.

Many homes are already halfway there, with computer chips helping microwave popcorn, record TV shows and turn on coffee makers and thermostats.

"If you have a programmable thermostat, you have the beginnings of a smart home,” says Cook, a WSU professor of electrical engineering and computer science. "What we’re trying to do is get the home to take over the job of programming it.

"We want your home as a whole to think about what you need and use the components in it to do the right thing,” she says.

Cook has been applying artificial intelligence in test homes since coming to WSU in 2006. Sites around the Northwest, including 18 apartments in Seattle, already show that the technology can help monitor elderly residents aging in place and alert caregivers if they are not completing ordinary activities like rising, eating, bathing and taking medications.

Similarly, homes can be designed to automatically regulate energy use, the source of nearly half a consumer’s energy diet. Smart home technologies can run washers at off-peak times, turn off unneeded appliances and put out lights in empty rooms without residents having to make conscious choices. Many communities, including Pullman, are already testing such concepts through the use of smart meters.
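
Much of this behaviour reduces to event-driven rules; here is a toy sketch (all device names and the off-peak hour are invented for illustration):

    # Toy rule engine for the smart-home behaviours described above.
    OFF_PEAK_HOUR = 23   # invented; a real system would learn utility tariffs

    def on_event(event, home_state):
        actions = []
        if event["type"] == "room_empty":
            actions.append("lights_off:" + event["room"])
        if event["type"] == "hour_changed" and event["hour"] == OFF_PEAK_HOUR:
            if home_state.get("washer_loaded"):
                actions.append("washer_start")
        return actions

    print(on_event({"type": "room_empty", "room": "kitchen"}, {}))
    print(on_event({"type": "hour_changed", "hour": 23}, {"washer_loaded": True}))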

While the smartphone lets people take their social media with them, the home could in effect act like a car’s Bluetooth, facilitating hands-free conversation from any room. For that matter, says Cook, cameras would let residents "Skype from anywhere."

But while the technology is available, smart meters and in-home cameras raise privacy concerns for many Americans. These technologies, like so many others, face the classic challenge of winning acceptance and adoption, says Cook.

She has seen that in particular with the elderly participants in her studies.

"Ultimately,” she says, "when people get a better understanding of what these technologies do and see a usefulness that counterbalances their skittishness, adoption will start. I’m guessing some technologies will gain momentum once they’re starting to be used.”

Cook’s work is funded by the National Institutes of Health, the National Science Foundation and Washington State’s Life Sciences Discovery Fund.



Source: Washington State University


Researchers use Electricity to Generate Alternative Fuel

Engineerblogger
March 30, 2012



UCLA researchers have generated isobutanol from CO2 using a genetically engineered microorganism, with solar electricity as the sole energy input. Credit: UCLA.

Imagine being able to use electricity to power your car — even if it's not an electric vehicle. Researchers at the UCLA Henry Samueli School of Engineering and Applied Science have for the first time demonstrated a method for converting carbon dioxide into the liquid fuel isobutanol using electricity.

Today, electrical energy generated by various methods is still difficult to store efficiently. Chemical batteries, hydraulic pumping and water splitting suffer from low energy-density storage or incompatibility with current transportation infrastructure.

In a study published March 30 in the journal Science, James Liao, UCLA's Ralph M. Parsons Foundation Chair in Chemical Engineering, and his team report a method for storing electrical energy as chemical energy in higher alcohols, which can be used as liquid transportation fuels.

"The current way to store electricity is with lithium ion batteries, in which the density is low, but when you store it in liquid fuel, the density could actually be very high," Liao said. "In addition, we have the potential to use electricity as transportation fuel without needing to change current infrastructure."

Liao and his team genetically engineered a lithoautotrophic microorganism known as Ralstonia eutropha H16 to produce isobutanol and 3-methyl-1-butanol in an electro-bioreactor using carbon dioxide as the sole carbon source and electricity as the sole energy input.

Photosynthesis is the process of converting light energy to chemical energy and storing it in the bonds of sugar. There are two parts to photosynthesis — a light reaction and a dark reaction. The light reaction converts light energy to chemical energy and must take place in the light. The dark reaction, which converts CO2 to sugar, doesn't directly need light to occur.

"We've been able to separate the light reaction from the dark reaction and instead of using biological photosynthesis, we are using solar panels to convert the sunlight to electrical energy, then to a chemical intermediate, and using that to power carbon dioxide fixation to produce the fuel," Liao said. "This method could be more efficient than the biological system."

Liao explained that with biological systems, the plants used require large areas of agricultural land. However, because Liao's method does not require the light and dark reactions to take place together, solar panels, for example, can be built in the desert or on rooftops.

Theoretically, the hydrogen generated by solar electricity can drive CO2 conversion in lithoautotrophic microorganisms engineered to synthesize high-energy-density liquid fuels. But hydrogen's low solubility, low mass-transfer rate and safety issues limit the efficiency and scalability of such processes. Instead, Liao's team found formic acid to be a favorable substitute and an efficient energy carrier.

"Instead of using hydrogen, we use formic acid as the intermediary," Liao said. "We use electricity to generate formic acid and then use the formic acid to power the CO2 fixation in bacteria in the dark to produce isobutanol and higher alcohols."

The electrochemical formate production and the biological CO2 fixation and higher alcohol synthesis now open up the possibility of electricity-driven bioconversion of CO2 to a variety of chemicals. In addition, the transformation of formate into liquid fuel will also play an important role in the biomass refinery process, according to Liao.

"We've demonstrated the principle, and now we think we can scale up," he said. "That's our next step." The study was funded by a grant from the U.S. Department of Energy's Advanced Research Projects Agency–Energy (ARPA–E).

Source:  UCLA


Thursday 29 March 2012

Physicists Propose Yet Another Form of Superhard Carbon

Engineerblogger
March 29, 2012




Last month, we looked at an emerging debate amongst materials scientists over the nature of a new form of carbon that was recently discovered by compressing graphite at room temperature to pressures in excess of 10 gigapascals.

Under these conditions, various properties of graphite change, such as its resistivity, optical transparency, reflectance and so on. That's a good indication that something different is forming.

But the question is what? Various groups have proposed new carbon allotropes that could explain the results. These are variously called monoclinic M-carbon, body-centred tetragonal C4 carbon, orthorhombic W-carbon and, most recently, M10-carbon.

Today, Chaoyu He and buddies at the Institute for Quantum Engineering and Micro-Nano Energy Technology in Xiangtan, China, propose two new superhard structures, which they call S-carbon and H-carbon.

One of the problems with the previous structures is the uncertainty over their stability, which may mean they cannot form in reality. He and co say the new structures are more stable than M-carbon or W-carbon, and even more stable than graphite at high pressures. What's more, their bulk properties more or less exactly match those measured in the experimental samples.
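
Stability claims of this kind are usually made by comparing enthalpies, H = E + PV, across candidate phases: whichever phase has the lowest enthalpy at a given pressure is the stable one. A schematic sketch (the energies and volumes below are placeholders, not the paper's values):

    # Schematic phase-stability comparison via enthalpy H = E + P*V.
    # The (energy, volume) pairs are PLACEHOLDERS, not first-principles results.
    phases = {
        "graphite": (0.00, 8.8),   # eV/atom, cubic angstroms/atom (invented)
        "S-carbon": (0.15, 6.0),
    }
    EV_PER_GPA_A3 = 1 / 160.2177   # converts GPa * A^3 to eV

    def enthalpy(energy, volume, pressure_gpa):
        return energy + pressure_gpa * volume * EV_PER_GPA_A3

    for p in (0.0, 10.0, 20.0):
        best = min(phases, key=lambda name: enthalpy(*phases[name], p))
        print(f"{p:5.1f} GPa -> lowest enthalpy: {best}")
    # With these placeholder numbers the denser phase overtakes graphite near
    # 9 GPa, mirroring the claim that the new phases win at high pressure.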

Of course, this work, like the other structural predictions, is entirely theoretical, relying on computer simulations based on first-principles calculations. And until somebody actually measures the structure of this new form of carbon, we won't know which proposal is correct.

Nevertheless, the process of predicting new carbon allotropes and calculating their properties is itself providing a clear impetus for new research in this field.

A snapshot of an interesting piece of science in action.


Source: Technology Review



Additional Information:
  • In the Cornell University Library: "New Superhard Carbon Phases Between Graphite and Diamond" Ref: arxiv.org/abs/1203.5509

An aircraft revolution on the horizon

Engineerblogger
March 29, 2012


Artist's rendering of D-Dalus. Image: Graham Murdoch

You've got planes, helicopters, blimps, and balloons. Throw in the gyrocopter, a powered paraglider, and a few other odd-ball inventions, and you've pretty much covered the ways we can fly. Not since Sikorsky started mass producing helicopters in 1942 has there been a real revolution in aviation.

However, if the makers of the airfoil- and rotor-free D-Dalus manage to tame their prototype, the latest upheaval may be just around the corner. As conceived, it will be able to take off vertically, fly in any direction, hover in the worst weather imaginable, and turn on an airborne dime, making it the ideal drone. "We don't really want to see this used as a weapons platform," says project leader David Wills. "But we're businessmen and we want money."

The aircraft is the product of the mind of Meinhard Schwaiger, "A charismatic little guy—five-foot-four, shock of white hair, and a bow tie," as Wills describes him. Schwaiger holds some 150 patents and made his first killing in plastics extrusion. The concept for D-Dalus first came to him while he was in Moscow watching "a tedious documentary about helicopter crashes," says Wills. In trying to imagine an aircraft capable of hovering with greater ease and safety, he turned his mind to the Voith Schneider propellers that give tugboats the strength and near-instantaneous maneuverability necessary to haul water vessels many times their size.

Designing the D-Dalus

Schwaiger sketched the idea on a pad and assumed it would wind up in the rubbish bin in the cold light of day. But when, the next morning, he ran a CAD model of the engine, he realized that the only thing stopping such a contraption from getting in the air was time, money, wherewithal, perseverance, frustration, marketing, tinkering, mistakes, recoveries, and a whole lot of thought.

D-Dalus is made of four axles, each with a series of carbon fiber disks that spin at 2,200 rpm. Surrounding the disks are blades that redirect the thrust. Working in concert they will give the craft 360 degrees of mobility. To do so they need to be able to handle about 1,000 Gs, a force the needle bearings in Schwaiger's initial prototype could not withstand. "Most would have said 'Oh, that's why no one else has done this before,'" says Wills. "Instead, he sat for a year and a half designing a friction-free bearing." The result was akin to a beer keg topped by a glass table topped by a beer keg. Stand on the table and you'll feel an easy rock between the barrels.
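
The 1,000 G figure is consistent with simple circular-motion arithmetic, a = omega^2 * r (the radius below is an assumed value; the article gives only the rpm):

    # Centripetal load on the blade bearings at 2,200 rpm.
    import math

    rpm = 2200.0
    radius_m = 0.18     # assumed blade-orbit radius; not stated in the article

    omega = rpm * 2 * math.pi / 60     # angular speed, ~230 rad/s
    a = omega ** 2 * radius_m          # centripetal acceleration, m/s^2
    print(f"{a / 9.81:,.0f} g")        # ~970 g, the order of the quoted 1,000 Gs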


The current prototype has only made its way through a few of those 360 available degrees. It's managed to get off the ground and take a sharp horizontal turn. But just before its public debut at the Paris Airshow, it did a bit more than that. "We were hanging on to the beast with a safety rope, but we didn't hang on strong enough," says Wills. "It lurched and started fighting the rope aggressively. It really was like Frankenstein on the end of the rope, bits were flying at 2,000 rpm, and they're pretty expensive bits too." The breakdown was the result of a number of factors: An engine less powerful than expected; bolts that could not withstand high vibration; engine placement that made the craft act as a pendulum; a racecar driver at the controls who was not used to maneuvering in more than two dimensions.

The damage meant that untold dollars and hours were down the drain. Not enough to ground the project, though. "You ought to see some black and white footage of Sikorsky sitting on his helicopter almost having his head lopped off," notes Wills. "I said 'These things happen.'"

D-Dalus at the Paris Air Show.

The New Look
The latest assemblage has a new engine, its weight is down to less than 400 pounds, and it is merely awaiting the careful hand of renowned stunt helicopter pilot Rudiger Feil to operate its joystick.

If the aircraft proves itself worthy, it will soon be performing feats impossible for its airborne brethren. "Imagine them looking for Somali pirates, or Iranian missile launchers," says Wills. "They could work together in a swarm. If you need to use the same port you were bombing 10 minutes ago—the same thing that was throwing lifeboats on the back of the ship early on can act as a crane."

Voith Schneider Propellers have been working in water for the better part of a century, so one might wonder why a D-Dalus-like vehicle hasn't already popped up to take care of Somali pirates and the like. "There are lots of people who have gotten close," says Wills. "It's just that my little Einstein has solved all the things at once."

Source: ASME

Concerted efforts needed to secure key resources for low-carbon future

Engineerblogger
March 29, 2012




New Stockholm Environment Institute (SEI) studies on biomass, scarce metals and water, produced as part of a partnership with the business initiative 3C (Combat Climate Change), show supply constraints could slow deployment of green energy technologies by 2035 – but business and policy choices can reduce these risks.

Low-carbon technologies – solar, wind, hydroelectric and geothermal power, biofuels, electric and hybrid vehicles – are likely to play key roles in efforts to avoid dangerous levels of climate change. But to make a significant impact, they need to be deployed on a large scale. And even if economic and political obstacles can be removed, there is growing concern that key resources needed for these technologies are just too limited.

Aiming to gauge the extent of the problem, SEI researchers in the United States and UK estimated demand for these resources under different scenarios. In three new reports and accompanying policy briefs, they outline their findings and identify possible solutions involving technology, public policy and business strategies.

Researchers from the Stockholm Environment Institute at the University of York played a major role in the metals study.

The metals study estimated future supply and demand for five metals – cobalt, lithium, neodymium, indium and tellurium – under three energy scenarios and three global minerals market scenarios. It found a severe risk of medium- and long-term cumulative deficits of indium and tellurium, a moderate risk of medium-term and severe risk of long-term deficits of neodymium, and a limited risk of long-term deficits of cobalt and lithium.

Policy initiatives can help address the challenges identified in our analysis and provide incentives for recycling, technology and materials substitution, and other effective responses, the authors note. But governments may not be paying enough attention to scarcity, focusing only on technological and environmental challenges, and some policies that do address scarcity, such as trade barriers and hoarding, could actually exacerbate problems, the study warns.

The biomass study explored four scenarios differentiated by their relative focus on climate or agriculture and food security: a “Single Bottom Line” in which relatively unconstrained markets determine what technologies are developed and deployed; “Meeting the Climate Challenge”, focused entirely on curbing emissions; “Feeding the Planet”, focused on increasing food production; and a “Sustainability Transition” that uses biomass not only for food and energy, but also, increasingly, as industrial feedstock.

The analysis shows that no path is perfect, and all but “Feeding the Planet” would increase total agricultural land use, but a “Sustainability Transition” would yield the greatest benefits for both climate and agricultural productivity. It could spur innovation, the authors note: “If we encourage the use of bio-materials, then entrepreneurs and businesses can take this on as a challenge, and thrive on it.”

The water study, meanwhile, notes that some low-carbon electricity sources, such as solar thermal and geothermal, use so much water that their large-scale implementation might not be viable in the growing share of the world where water supplies are constrained or uncertain.

SEI researchers developed a case study for water use for electricity generation in California, focusing on the water and emissions implications of the state’s renewable energy portfolio standard (RPS). They found that under the RPS, which would boost the share of renewable electricity from 25 to 34 per cent by 2020, emissions and water withdrawals would be lower than under business as usual, but water consumption (the water that is not reused or returned to the source) would increase.

But adjustments to California's RPS could significantly reduce water demand. A scenario the authors called RPS+Technology, using more photovoltaics and less solar thermal power, and incrementally switching from once-through cooling to wet-recirculating and dry-cooling systems, reduced both water withdrawals and consumption. Yet there are trade-offs: Some of the adjustments would reduce plant efficiency, offsetting some of the emission reductions; adding carbon capture and sequestration (CCS) to some natural-gas plants could more than make up the difference, but that technology, in turn, would require more water.

“It is clear that there are no easy solutions to any of these problems,” says Annika Varnäs, the Stockholm-based SEI research fellow who coordinates SEI’s research programme with the 3C initiative. “Competing demands for resources, political and economic factors, and sustainability concerns all pose daunting challenges. However, these studies also show that both businesses and governments can take significant steps to address these challenges and increase the likelihood that low-carbon technologies can be successfully deployed around the world.”

“Companies are becoming increasingly aware of the many challenges inherent in the transition to a low-carbon economy, and the 3C companies hope that resource scarcity can remain on the agenda,” adds 3C Coordinator Jesse Fahnestock. “These studies show that being proactive and coordinated on resource management is going to be an important part of overhauling the energy system.”



Source: University of York 



Additional Information:

  • Policy briefs summarising the studies’ findings are available on the SEI website, at the links below, or in print at SEI’s booth at the Planet Under Pressure conference in London. The full reports are being finalised; to obtain copies, send a request to one of the contacts listed below.
  1. Biomass: www.sei-international.org/publications?pid=2076
  2. Metals: www.sei-international.org/publications?pid=2077
  3. Water: www.sei-international.org/publications?pid=2075
  •  About 3C (Combat Climate Change): 3C was founded by Lars Josefsson, former president of the energy company Vattenfall, in 2007. The 3C group consists of around 70 large companies, including multinational companies such as Siemens and Unilever as well as major American companies such as Duke Energy, and companies from China, India and Russia. To download SEI reports produced in partnership with 3C, visit the project page on the SEI website.
About the Stockholm Environment Institute
  • The Stockholm Environment Institute (SEI) is an international nonprofit research organization that has been engaged in environment and development issues at the local, national, regional and global policy levels for more than 20 years. Its goal is to bring about change for sustainable development by bridging science and policy. SEI has seven centres worldwide, in Stockholm; Oxford and York, U.K.; the United States; Bangkok, Thailand; Dar es Salaam, Tanzania; and Tallinn, Estonia. This report was produced in Stockholm.

Physicists Mix Two Lasers to Create Light at Many Frequencies

Engineerblogger
March 29, 2012


Artist's rendition of electron-hole recollision. Near infrared (amber rods) and terahertz (yellow cones) radiation interact with a semiconductor quantum well (tiles). The near-IR radiation creates excitons (green tiles) consisting of a negative electron and a positive hole (dark blue tile at center of green tiles) bound in an atom-like state. Intense terahertz fields pull the electrons (white tiles) first away from the hole and then back towards it (electron paths represented by blue ellipses). Electrons periodically recollide with holes, creating periodic flashes of light (white disks between amber rods) that are emitted and detected as sidebands.
Credit: Peter Allen, UCSB

A team of physicists at UC Santa Barbara has seen the light, and it comes in many different colors. By aiming high- and low-frequency laser beams at a semiconductor, the researchers caused electrons to be ripped from their cores, accelerated, and then smashed back into the cores they left behind. This recollision produced multiple frequencies of light simultaneously. Their findings appear in the current issue of the science journal Nature.

"This is a very remarkable phenomenon. I have never seen anything like this before," said Mark Sherwin, whose research group made the groundbreaking discovery. Sherwin is a professor of physics at UCSB and a co-author of the paper. He is also director of the campus's Institute for Terahertz Science and Technology.

When the high-frequency optical laser beam hits the semiconductor material –– in this case, gallium arsenide nanostructures –– it creates an electron-hole pair called an exciton. The electron is negatively charged, and the hole is positively charged, and the two are bound together by their mutual attraction. "The high-frequency laser creates electrons and holes," Sherwin explained. "The very strong, low-frequency free electron laser beam rips the electron away from the hole and accelerates it. As the low-frequency field oscillates, it causes the electron to come careening back to the hole." The electron has excess energy because it has been accelerated, and when it slams back into the hole, the recombined electron-hole pair emits photons at new frequencies.

"It's fairly routine to mix the lasers and get one or two new frequencies, Sherwin continued. "But to see all these different new frequencies, up to 11 in our experiment, is the exciting phenomenon. Each frequency corresponds to a different color."

In terms of real-world applications, the electron-hole recollision phenomenon has the potential to significantly increase the speed of data transfer and communication processes. One possible application involves multiplexing –– the ability to send data down multiple channels –– and another is high-speed modulation.

"Think of your cable Internet," explained Ben Zaks, a UCSB doctoral student in physics and the paper's lead author. "The cable is a bundle of fiber optics, and you're sending a beam with a wavelength that's approximately 1.5 microns down the line. But within that beam there are a lot of frequencies separated by small gaps, like a fine-toothed comb. Information going one way moves on one frequency, and information going another way uses another frequency. You want to have a lot of frequencies available, but not too far from one another."

The electron-hole recollision phenomenon does just that –– it creates light at new frequencies, with optimal separation between them.

The researchers utilize a free electron laser –– a building-size machine in UCSB's Broida Hall –– to produce the electron-hole recollisions, which they note is not practical for real-world applications. Theoretically, however, a transistor could be used in place of the free electron laser to produce the strong terahertz fields. "The transistor would then modulate the near infrared beam," Zaks continued. "Our data indicates that we are modulating the near infrared laser at twice the terahertz frequency. This is where we could really see this working to increase the speed of optical modulation, which is how you get information down a cable line."
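
Modulation at twice the terahertz frequency means the new lines sit at f_NIR + 2n·f_THz on either side of the carrier. A sketch of the comb with assumed illustrative inputs (the experiment's exact frequencies are not given in this article):

    # Even-order sideband comb from electron-hole recollision:
    # f = f_nir +/- 2*n*f_thz. Both input frequencies below are illustrative.
    C = 299_792_458.0            # speed of light, m/s
    f_nir = C / 1.5e-6           # ~200 THz carrier (a 1.5-micron beam, as in telecoms)
    f_thz = 0.5e12               # assumed 0.5 THz driving field

    for n in range(1, 6):
        up = (f_nir + 2 * n * f_thz) / 1e12
        down = (f_nir - 2 * n * f_thz) / 1e12
        print(f"n={n}: {down:.1f} THz and {up:.1f} THz")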

The electron-hole recollision phenomenon creates many new avenues for research and exploration, Sherwin noted. "It is an interesting time because there are a lot of people who can participate in doing this kind of research," he said. "We have a unique tool –– a free electron laser –– which gives us a big advantage for exploring the properties of fundamental materials. We just put it in front of our laser beams and measure the colors of light going out. Now that we've seen this phenomenon, we can start doing the hard work of putting the pieces together on a chip."

In discussing the research team's discovery, Sherwin cited Michael Polanyi, the Hungarian scientist and science philosopher. "He talked about growing points in science, and I'm hoping this is going to be one of those, where a lot of people can use it as a foundation for going off in a lot of different directions," he said. "I want to continue working on it, but I'd like to see a lot of other people join in."

Also contributing to the research is the paper's second author, R.B. Liu of The Chinese University of Hong Kong. "This is an excellent example of the value of communicating with scientists from all over the globe," said Sherwin. "If we had never met, this research would not have happened."


Source: University of California - Santa Barbara


ORNL process converts polyethylene into carbon fiber

Engineerblogger
March 29, 2012

Carbon fibers having unique surface geometries, from circular to hollow gear-shaped, are produced from polyethylene using a versatile fabrication method. The resulting carbon fiber exhibits properties that are dependent on processing conditions, rendering them highly amenable to myriad applications.

A common material such as the polyethylene used in plastic bags could be turned into something far more valuable through a process being developed at the Department of Energy's Oak Ridge National Laboratory.

In a paper published in Advanced Materials, a team led by Amit Naskar of the Materials Science and Technology Division outlined a method that allows not only for production of carbon fiber but also the ability to tailor the final product to specific applications.

"Our results represent what we believe will one day provide industry with a flexible technique for producing technologically innovative fibers in myriad configurations such as fiber bundle or non-woven mat assemblies," Naskar said.

Using a combination of multi-component fiber spinning and their sulfonation technique, Naskar and colleagues demonstrated that they can make polyethylene-based fibers with a customized surface contour and manipulate filament diameter down to the submicron scale. The patent-pending process also allows them to tune the porosity, making the material potentially useful for filtration, catalysis and electrochemical energy harvesting.

Naskar noted that the sulfonation process allows for great flexibility as the carbon fibers exhibit properties that are dictated by processing conditions. For this project, the researchers produced carbon fibers with unique cross-sectional geometry, from hollow circular to gear-shaped by using a multi-component melt extrusion-based fiber spinning method.

The possibilities are virtually endless, according to Naskar, who described the process.

"We dip the fiber bundle into an acid containing a chemical bath where it reacts and forms a black fiber that no longer will melt," Naskar said. "It is this sulfonation reaction that transforms the plastic fiber into an infusible form.

"At this stage, the plastic molecules bond, and with further heating cannot melt or flow. At very high temperatures, this fiber retains mostly carbon and all other elements volatize off in different gas or compound forms."

The researchers also noted that their discovery represents a success for DOE, which seeks advances in lightweight materials that can, among other things, help the U.S. auto industry design cars able to achieve more miles per gallon with no compromise in safety or comfort. And the raw material, which could come from grocery store plastic bags, carpet backing scraps and salvage, is abundant and inexpensive.

Source: Oak Ridge National Laboratory

Additional Information:
  • Co-authors of the paper, titled "Patterned functional carbon fibers from polyethylene," are Marcus Hunt, Tomonori Saito and Rebecca Brown of ORNL and Amar Kumbhar of the University of North Carolina's Chapel Hill Analytical and Nanofabrication Laboratory. The paper is published online here: http://onlinelibrary.wiley.com/doi/10.1002/adma.201104551/pdf

Wood Research Symposium: The discovery of a versatile, multifunctional material

Engineerblogger
March 29, 2012



Source: iStockphoto

For 75 years Empa has been conducting research on all aspects relating to wood. Beginning with an investigation into the properties of various types of indigenous woods, today this has grown into a research area with many branches – from fundamental research on the structure of wood to chemically or biologically modified woods, novel wood fiber products and surface technologies. Empa scientists examine each of the material's properties in minute detail and use these results to develop next-generation techniques of utilizing wood such as in the areas of acoustic insulation and structural engineering.

In the beginning the focus was on research aimed at encouraging the exploitation of indigenous types of wood, according to Klaus Richter, who headed Empa's Wood Laboratory for many years and who now teaches wood science at the Technical University of Munich. In the early days, for example, new processes for the pressure-impregnation of telegraph poles were developed which significantly increased their useful life. Over time the field of research broadened, but the aim has remained the same, explains Tanja Zimmermann, Richter's successor: developing innovative wood products which, in collaboration with industrial partners, can be economically exploited. This aspect is reflected in the new name of the laboratory, "Applied Wood Research".


Basic research and applied development at the same time
Exactly how this works in detail was explained by Ingo Burgert, professor of wood-based materials at the ETH Zürich and simultaneously head of the "Bio-inspired Wood Materials" working group. Burgert is investigating the phenomenon of heartwood, which develops in trees after a certain age and frequently gives the material enhanced durability. The professor is investigating questions such as whether it is possible to chemically modify wood after it has been harvested to make it harder and longer-lived. Can one retrospectively fill wood with functional nanoparticles? Is it possible to construct composite materials of wood and carbon fiber? Francis Schwarze and Mark Schubert are also studying ways of altering the characteristics of wood in specific ways for particular applications, for example harnessing fungi which decompose wood to improve the tonal qualities for use in musical instruments, or utilizing enzymes to lend wooden surfaces completely new characteristics. This could perhaps make a wood surface resistant to fungus and bacteria. Another possibility is "self-adhesive" wooden chips for making fiberboard.

A range of interesting substances and materials can also be isolated from wood, one such being nanofibrillated cellulose (NFC), an extremely versatile material. Zimmermann's team played (and continues to play) a central role in the further development of this material, which can be used in fiber-reinforced adhesives and lacquers, and to make intervertebral disc implants. Airtight packaging for the food industry is another application for NFCs – they are compostable and when burnt emit practically no pollutants.


From small to large
A more classical application is the use of wood as a construction material. René Steiger and Robert Widmann, two wood specialists from Empa's Engineering Structures Laboratory, have been investigating load-bearing structures made of wood such as glue-laminated timber beams, which they destroy by applying extreme loads and then repair using carbon fiber mats affixed with adhesive, for example. The building industry profits from these optimized repair techniques. Wood is of course flammable, but how quickly does it burn and how long does it maintain its load-bearing abilities? What happens when wood burns from the inside? These are questions which Empa investigates in its Fire Laboratory; it has found, for instance, that just in front of the flame front a large quantity of water vapor is forced through the wood.


As a result of further advances in the fire safety regulations in Switzerland, multistoried wooden buildings have been permitted in the country for several years now. This development raises new questions, though: how good is the acoustic insulation in a wooden-shelled building? A team led by Kurt Eggenschwiler, head of Empa's Acoustics/Noise Control Laboratory, has been studying this topic since 2011 with the help of a special lightweight testing structure which allows the scientists to investigate and evaluate the acoustic qualities of lightweight structures made of wood and composite materials.

Wood research – Empa's role in Switzerland
The symposium ended with a series of guest lectures, underscoring the important role which Empa plays in the Swiss wood industry. Martin Riediker of the Innovation Promotion Agency (CTI) explained the National Research Program "Resource Wood" (NFP 66), initiated in 2010. The aim of this initiative is to better exploit ageing Swiss timber resources and to find new applications for wood. Christoph Starck of Lignum, the umbrella organization for the Swiss forestry and lumber industry, described how the results of wood research are applied in practice in the building industry. From Empa researcher to carpenter – everyone involved in dealing with wood is working towards the same goal, namely utilizing a sustainable, renewable raw material in an optimal manner.

Source: Swiss Federal Laboratories for Materials Science and Technology (EMPA)

Wednesday 28 March 2012

Virtu: The solar panel that heats water while generating electricity

Engineerblogger
March 28, 2012



Virtu.  Credit: Naked Energy

UK start-up seeks up to $15m investment to drive accelerated roll-out of ultra-efficient solar panel technology

Did you know that when solar photovoltaic cells get too hot, their output plummets? Now a UK start-up is seeking to resolve what it regards as "a fundamental design flaw" of conventional solar panels through the development of a hybrid solar PV and thermal panel that promises to boost efficiency at high temperatures by around 50 per cent.

Guildford-based Naked Energy has developed a new prototype system based on a patented substrate design that encases solar PV cells in a vacuum contained in glass tubes. "Solar PV panels lose half a per cent of their efficiency with every degree above 25 degrees Centigrade, and in a warm climate they can reach temperatures of 70 to 80 degrees," explained Christophe Williams, managing director at the company. "Our design transfers that heat away from the cells, increasing the electricity output from the solar cells and providing heat for hot water."
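
Williams' figure translates into a simple derating formula; a sketch using only the half-a-per-cent-per-degree coefficient quoted above:

    # Conventional PV output vs cell temperature, per the figure quoted above.
    COEFF = 0.005   # fraction of rated output lost per degree C above 25 C

    def relative_output(cell_temp_c):
        return max(0.0, 1 - COEFF * (cell_temp_c - 25))

    for t in (25, 45, 65, 80):
        print(f"{t} C -> {relative_output(t):.0%} of rated output")
    # 80 C -> ~72%: the heat a hybrid panel carries away is worth roughly
    # a quarter of the panel's electrical output on a hot day.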

The company has worked with Imperial College London to independently verify the performance of its prototype, concluding that it can deliver 70 per cent solar conversion efficiency when the electrical and heat output is combined. Moreover, tests showed that the system increases electrical output by 65 per cent compared to conventional PV panels when the cells are heated to 65 degrees Centigrade.

With the technology still at the prototype stage the company is reluctant to estimate the cost of the solar energy the new design could produce. But director Nicholas Simmons argued the significant increases in output mean the system should deliver a "vast improvement" in payback periods compared to conventional solar PV and solar thermal systems.

"The log-term goal is to produce systems at the same price as standard panels, while hugely increasing the energy output," he added.

The company raised £500,000 of funding last year, including a £40,000 award from the Shell Springboard competition, and is currently in the process of raising further funds to support the roll-out of a number of pilot projects.

Williams is cagey about revealing precise details, but confirmed the company is in talks with a number of large corporates about undertaking onsite pilot projects, while two large scale deployments are also being planned in Chile and China. He also revealed the company is looking to identify potential manufacturing partners who can produce the first wave of panels.

Once the pilot projects are completed the company is aiming to raise between $10m and $15m later this year to help fund rapid global expansion, and as such executives are in San Francisco this week meeting with potential investors as part of the UK's Clean and Cool trade mission.

"We need to move fast to make the most of our first mover advantage," said Simmons, acknowledging that while the company has patents in place to protect its technology rapid expansion represents the best way to protect its intellectual property.

The new funding round would be used to support either production facilities or licensing deals with third party manufacturers, depending on the strategy the company opts for.

In addition, the capital injection would allow Naked Energy to deliver a second generation system that further improves efficiencies.

"By controlling the amount of heat extracted from the cells we can control the amount of electricity and hot water you produce based on the level of demand," Williams explained. "That raises the prospect of us making the system really intelligent. For example, you could draw on weather data to work out that you might want to maximise output on a sunny day and store the energy for use over the next few days when the weather is going to bad."

"We are also looking at using the system to track the sun, which should be possible given that the cylindrical design makes it easier to tilt the cells to follow the sun."

Source: BusinessGreen.com



Green Nanotechnology Investment: Researchers Help Assess Economic Impact of Nanotech on Green & Sustainable Growth

Engineerblogger
March 28, 2012


Image shows nanogenerators developed in the laboratory of Zhong Lin Wang at the Georgia Institute of Technology. Credit: Gary Meek.


In the United States alone, government and private industry together invest more than $3 billion per year in nanotechnology research and development, and globally the total is much higher. What will be the long-run economic returns from these investments, not only in new jobs and product sales, but also from improvements in sustainability?

Georgia Institute of Technology researchers Philip Shapira and Jan Youtie helped answer that question through research presented March 27th at the International Symposium on Assessing the Economic Impact of Nanotechnology held in Washington, D.C. The researchers highlighted the importance of full lifecycle assessments to understand the impacts of nanotechnologies on green economic development in such areas as energy, the environment and safe drinking water.

“Nanotechnology promises to foster green and sustainable growth in many product and process areas,” said Shapira, a professor with Georgia Tech’s School of Public Policy and the Manchester Institute of Innovation Research at the Manchester Business School in the United Kingdom. “Although nanotechnology commercialization is still in its early phases, we need now to get a better sense of what markets will grow and how new nanotechnology products will impact sustainability. This includes balancing gains in efficiency and performance against the net energy, environmental, carbon and other costs associated with the production, use and end-of-life disposal or recycling of nanotechnology products.”

But because nanotechnology underlies many different industries, assessing and forecasting its impact won’t be easy. “Compared to information technology and biotechnology, for example, nanotechnology has more of the characteristics of a general technology such as the development of electric power,” said Youtie, director of policy research services at Georgia Tech’s Enterprise Innovation Institute. “That makes it difficult to analyze the value of products and processes that are enabled by the technology. We hope that our paper will provide background information and help frame the discussion about making those assessments.”

The symposium is sponsored by the Organization for Economic Cooperation and Development and by the U.S. National Nanotechnology Initiative. Support for Georgia Tech research into the societal impacts of nanotechnology has come from the National Science Foundation through the Center for Nanotechnology in Society based at Arizona State University.

For their paper, co-authors Shapira and Youtie examined a subset of green nanotechnologies that aim to enable sustainable energy, improve environmental quality, and provide healthy drinking water for areas of the world that now lack it. They argue that the lifecycle of nanotechnology products must be included in the assessment.

“In examining the economic impact of these green nanotechnologies, we have to consider the lifecycle, which includes such issues as environmental health and safety, as well as the amount of energy required to produce materials such as carbon nanotubes,” said Shapira.

Environmental concerns have been raised about what happens to nanomaterials when they get into water supplies, he noted. In addition, some nanostructures use toxic elements such as cadmium. Energy required for producing nano-enabled products is also an important consideration, though it may be balanced against the energy saved – and pollution reduced – through the use of such products, Shapira said.
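Shapira's point reduces to a lifecycle energy balance: production and disposal costs on one side, use-phase savings on the other. The toy calculation below illustrates the bookkeeping; every figure in it is a made-up placeholder, not data from the paper.

    # Toy lifecycle energy balance for a hypothetical nano-enabled product.
    # All figures are illustrative placeholders, not measured data.
    embodied_energy_mj = 500.0        # energy to produce the nanomaterial
    disposal_energy_mj = 50.0         # end-of-life recycling or disposal
    use_saving_mj_per_year = 120.0    # energy saved each year in use
    lifetime_years = 8

    net_mj = lifetime_years * use_saving_mj_per_year \
             - (embodied_energy_mj + disposal_energy_mj)
    print(f"Net lifecycle energy benefit: {net_mj:.0f} MJ")
    # Positive: the product saves more energy in use than it costs to
    # make and dispose of; negative: it is a net energy drain.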

Research into these societal issues, which is being conducted in parallel with the research and development of nanotechnology, may allow the resulting nano-enabled products to avoid the kinds of controversies that have hindered earlier technologies.

“Scientists, policy-makers and other observers have found that some of the promise of prior rounds of technology was limited by not anticipating and considering societal concerns prior to the introduction of new products,” Youtie said. “For nanotechnology, it is vital that these issues are being considered even during the research and development stage, before products hit the market in significant quantities.”

The nanotechnology industry began with large companies that had the resources to invest in research and development. But that is now changing, Youtie said.

“A lot of small companies are involved in novel nanomaterials development,” she said. “Large companies often focus on integrating those nanomaterials into existing products or processes.”

Among the goals of the OECD symposium are development of methodologies and approaches for estimating the impacts of green nanotechnology on jobs and new product sales. Existing forecasts have come largely from proprietary models used by private-sector firms.

“While these private forecasts have high visibility, their information and methods are often proprietary,” Shapira noted. “We also need to develop open and peer-reviewed models in which approaches are transparent and everyone can see the methods and assumptions used.”

In their paper, Youtie and Shapira cite several examples of green nanotechnology, discuss the potential impacts of the technology, and review forecasts that have been made. Examples of green nanotechnology they cite include:
  • Nano-enabled solar cells that use lower-cost organic materials, as opposed to current photovoltaic technologies that require rare materials such as platinum;
  • Nanogenerators that use piezoelectric materials such as zinc oxide nanowires to convert human movement into energy;
  • Energy storage applications in which nanotechnology materials improve existing batteries and nano-enabled fuel cells;
  • Thermal energy applications, such as nano-enabled insulation;
  • Fuel catalysis in which nanoparticles improve the production and refining of fuels and reduce emissions from automobiles;
  • Technologies used to provide safe drinking water through improved water treatment, desalination and reuse.

Source: Georgia Institute of Technology

Transparent, flexible “3D” memory chips may be the next big thing in small memory devices

Engineerblogger 
March 28, 2012

Transparent, flexible memory chips could replace flash drives, like this one, for personal memory storage. Credit: iStock


New memory chips that are transparent and flexible enough to be folded like a sheet of paper, that shrug off temperatures of 1,000 degrees Fahrenheit — twice as hot as the maximum in a kitchen oven — and that survive other hostile conditions could usher in the development of next-generation, flash-competitive memory for tomorrow's keychain drives, cell phones and computers, a scientist reported today.

Speaking at the 243rd National Meeting & Exposition of the American Chemical Society, the world's largest scientific society, he said devices with these chips could retain data despite an accidental trip through the dryer — or even a voyage to Mars. And with a unique 3-D internal architecture, the new chips could pack extra gigabytes of data while taking up less space.

“These new chips are really big for the electronics industry because they are now looking for replacements for flash memory,” said James M. Tour, Ph.D., who led the research team. “These new memory chips have numerous advantages over the chips today that are workhorses for data storage in hundreds of millions of flash, or thumb drives, smart phones, computers and other products. Flash has about another six or seven years in which it can be built smaller, but then developers hit fundamental barriers.”

Because of the way that the new memory chips are configured, namely with two terminals per bit of information rather than the standard three terminals per bit, they are much better suited for the next revolution in electronics — 3-D memory — than flash drives.
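Two-terminal cells matter because each bit can sit at the crossing of a row wire and a column wire, and whole crossbar layers can then be stacked vertically. The sketch below is a conceptual model of that addressing scheme, not the Rice team's actual circuit design.

    # Conceptual model of a stacked two-terminal crossbar memory: each bit
    # lives at the crossing of one row wire and one column wire, and whole
    # layers stack in the third dimension.
    import numpy as np

    layers, rows, cols = 4, 1024, 1024
    memory = np.zeros((layers, rows, cols), dtype=np.uint8)

    def write_bit(layer: int, row: int, col: int, value: int) -> None:
        # Selecting a bit needs only two wires within a layer.
        memory[layer, row, col] = value

    write_bit(2, 17, 900, 1)
    print(f"{memory.sum()} bit set out of {memory.size} in {layers} stacked layers")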

“In order to put more memory into a smaller area, you have to stack components beyond two dimensions, which is what is currently available,” he said. “You have to go to 3-D.” And the chips have a high on-off ratio, which is a measure of how much electrical current can flow in the chip when it stores information versus when it is empty. The higher the ratio, the more attractive the chips are to manufacturers.

The chips were originally composed of a layer of graphene or other carbon material on top of silicon oxide, which has long been considered an insulator, a passive component in electronic devices. Graphene, a one-atom-thick layer of carbon touted as a "miracle material" because it is the thinnest and strongest material known, was the subject of the 2010 Nobel Prize in Physics. The Rice University researchers originally attributed the chips' remarkable memory capability to the graphene, but they recently discovered they were wrong: the silicon oxide surface was actually doing the memory switching, and the chips can now be made graphene-free. The work was done by Tour's group in collaboration with Professor Douglas Natelson (Department of Physics) and Lin Zhong (Department of Electrical and Computer Engineering). The main students on the project were Jun Yao and Javen Lin.

The transparency and small size of the new chips enable them to be used in a wide range of potential applications. Manufacturers could embed them in glass for see-through windshield displays for everyday driving, military and space uses, so that not only the display but also the memory sits in the windshield, freeing up space elsewhere in the vehicle for other devices and functions. In fact, the chips were aboard the Russian Progress 44 cargo spacecraft launched in August 2011, bound for further experimentation on the International Space Station. The vehicle never made it into orbit, however. "The spacecraft crashed over Siberia, so our chips are in Siberia!" said Tour. He hopes to send the chips on a future mission in July 2012 to see how the memory holds up in the high-radiation environment of space.

Current touch screens are made of indium tin oxide and glass, both of which are brittle and break easily. Plastic containing the memory chips could replace those screens, with the added bonus of being flexible while also storing large amounts of memory, freeing up space elsewhere in a phone for components that provide other services and functions. Alternatively, storing memory in small chips in the screen instead of in large components inside the body of a phone could allow manufacturers to make these devices much thinner.

The easy-to-fabricate memory chips are patented, and Tour is talking to manufacturers about embedding the chips into products.

The scientists acknowledged funding from the Texas Instruments Leadership University Fund, the National Science Foundation (Award No. 0720825) and the Army Research Office through the SBIR program administered by PrivaTran, LLC.

Source: American Chemical Society (ACS)

Researchers Discover a New Path for Light Through Metal: Novel Plasmonic Material May Merge Photonic and Electronic Technologies

Engineerblogger
March 28, 2012


a) Excitation by light of a surface plasmon-polariton on a thin film of titanium nitride. b) Atomic force microscope image of the surface of titanium nitride film. The mean roughness of the film is 0.5 nm. c) Scanning electron microscopy image of TiN thin film on sapphire. The texture shows multivariant epitaxial (crystalline) growth. Credit: Alexandra Boltasseva, Purdue University/Optical Materials Express.

Helping bridge the gap between photonics and electronics, researchers from Purdue University have coaxed a thin film of titanium nitride into transporting plasmons, tiny electron excitations coupled to light that can direct and manipulate optical signals on the nanoscale. Titanium nitride’s addition to the short list of surface-plasmon-supporting materials, formerly comprised only of metals, could point the way to a new class of optoelectronic devices with unprecedented speed and efficiency.

“We have found that titanium nitride is a promising candidate for an entirely new class of technologies based on plasmonics and metamaterials,” said Alexandra Boltasseva, a researcher at Purdue and an author on a paper published today in the Optical Society’s (OSA) open-access journal Optical Materials Express. “This is particularly compelling because surface plasmons resolve a basic mismatch between wavelength-scale optical devices and the much smaller components of integrated electronic circuits.”

Value of Plasmons

Metals carry electricity with ease, but normally do nothing to transmit light waves. Surface plasmons, unusual light-coupled oscillations that form on the surface of metallic materials, are the exception to that rule. When excited on the surface of a metal by light of a specific frequency, plasmons retain that same frequency but with wavelengths that are orders of magnitude smaller, cramming visible and near-infrared light into the realm of the nanoscale.
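The contraction follows from the textbook dispersion relation for a surface plasmon-polariton on a flat metal-dielectric interface, k_spp = k0 * sqrt(em*ed/(em+ed)). The sketch below evaluates it for illustrative permittivities; it is a back-of-envelope check, not the analysis from the Purdue paper.

    import cmath

    def spp_wavelength_nm(lam0_nm: float, eps_metal: complex, eps_diel: float) -> float:
        """SPP wavelength from k_spp = k0 * sqrt(em*ed/(em+ed))."""
        n_eff = cmath.sqrt(eps_metal * eps_diel / (eps_metal + eps_diel))
        return lam0_nm / n_eff.real

    # Far below the surface-plasmon resonance the contraction is mild...
    print(spp_wavelength_nm(633, -18 + 0.5j, 1.0))      # ~615 nm
    # ...but as eps_metal approaches -eps_diel the wavevector diverges
    # and the plasmon wavelength collapses far below the free-space value:
    print(spp_wavelength_nm(633, -1.0001 + 0j, 1.0))    # ~6 nm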

In the world of electronics and optics, that 100-fold contraction is a boon. Circuits that direct the paths of electrons operate on a much smaller scale than optical light waves, so engineers must either rely on small but relatively sluggish electrons for information processing or bulk up to accommodate the zippy photons. Plasmons represent the best of both worlds and are already at the heart of a number of optoelectronic devices. They have not had widespread use, however, due to the dearth of materials that readily generate them and the fact that metals, in most cases, cannot be integrated with semiconductor devices.

Plasmonic Materials

Until now, the best candidates for plasmonic materials were gold and silver. These noble metals, however, are not compatible with standard silicon manufacturing technologies, limiting their use in commercial products. Silver is the metal with the best optical and surface plasmon properties, but it forms grainy, or semi-continuous, thin films. Silver also easily degrades in air, which causes loss of optical signal, making it a less-attractive material in plasmon technologies.

In an effort to overcome these drawbacks, Boltasseva and her team chose to study titanium nitride, a ceramic material commonly used as a barrier metal in microelectronics and to coat metal surfaces such as medical implants or machine tooling parts, because they could manipulate its properties during manufacturing. It can also be easily integrated into silicon products and grown crystal-by-crystal, forming highly uniform, ultrathin films, properties that metals do not share.

To test its plasmonic capabilities, the researchers deposited a very thin, very even film of titanium nitride on a sapphire surface. They confirmed that titanium nitride supported the propagation of surface plasmons almost as efficiently as gold. Silver, under perfect conditions, is still more efficient for plasmonic applications, but its signal loss as it degrades in air limits its practical use.

To further improve the performance of titanium nitride, the researchers are now looking into a manufacturing method known as molecular beam epitaxy, which would enable them to grow the films and layered structures known as superlattices crystal-by-crystal.

Technologies and Potential Applications

In addition to plasmonics, the researchers also speculate that titanium nitride may have applications in metamaterials, which are engineered materials that can be tailored for almost any application because of their extraordinary response to electromagnetic, acoustic, and thermal waves. Recently proposed applications of metamaterials include invisibility cloaks, optical black holes, nanoscale optics, data storage, and quantum information processing.

The search for alternatives to noble metals that offer improved optical properties and easier fabrication and integration could ultimately lead to real-life applications for plasmonics and metamaterials.

“Plasmonics is an important technology for nanoscale optical circuits, sensing, and data storage because it can focus light down to nanoscale,” notes Boltasseva. “Titanium nitride is a promising candidate in the near-infrared and visible wavelength ranges. Unlike gold and silver, titanium nitride is compatible with standard semiconductor manufacturing technology and provides many advantages in its nanofabrication and integration.”

According to the researchers, titanium nitride-based devices could provide nearly the same performance as gold-based ones for some plasmonic applications. While noble metals like silver would still be the best choice for specific applications such as negative-index metamaterials, titanium nitride could outperform noble metals in other metamaterial and transformation optics devices, such as those based on hyperbolic metamaterials.


Source: Optical Society of America (OSA)


Additional Information:

Engineers find elusive plasmons in tiny metal particles, a boost to nanotechnology

Engineerblogger
March 28, 2012


Artist Kate Nichols creates structurally colored artwork using Surface Plasmon Resonances, the same phenomenon described by Scholl and Dionne.

After five decades of debate, Stanford engineers determine how collective electron oscillations, called plasmons, behave in individual metal particles as small as just a few nanometers in diameter. This knowledge may open up new avenues in nanotechnology ranging from solar catalysis to biomedical therapeutics.

Stanford scientists have shown that a phenomenon known as plasmon resonance occurs at very small scales, offering a new understanding of quantum physics that could lead to improved solar catalysis and targeted cancer treatments.

The new discovery by Stanford engineers was reported recently in the cover story of the journal Nature.

When light hits a metal, electrons on the surface collectively oscillate in waves, called plasmons, that travel out like ripples on a pond. The new research shows that plasmons exist in smaller particles than had been shown before. The research reveals the presence and clear quantum-influenced nature of plasmons in individual metal particles as small as one nanometer in diameter, about 100 atoms in total.

"Particles of this size are valuable in engineering. They are more sensitive and more reactive than bulk materials and could prove very useful in nanotechnology," said Jennifer Dionne, an assistant professor of materials science and engineering at Stanford and the study's senior author.

Plasmons are an area of intense research focus and a key driver of engineering at the nanoscale. However, as metal particles become smaller, obtaining experimental data about the nature of plasmons becomes extremely challenging. For over five decades, scientists have debated the nature of plasmons at these smallest of scales.

"Until now, however, we hadn't been able to take full advantage of the optical and electronic properties of these tiny particles because we didn't have a complete picture of the science," said Jonathan Scholl, a doctoral candidate in Dionne's lab and first author of the paper. "This paper provides the foundation for nanoengineering a new class of metal particles made up of between 100 and 10,000 atoms."

Plasmon resonances in relatively small metal particles are not new. They are visible in the vibrant hues of the great stained-glass windows of the world. More recently, engineers have used them to develop new, light-activated cancer treatments and to enhance light absorption in photovoltaics and photocatalysis.

Stained-glass windows

"The windows of Notre Dame Cathedral and Stanford Chapel derive their color from metal nanoparticles embedded in the glass. When the windows are illuminated, the nanoparticles scatter specific colors of light. The color depends on the size and geometry of the metal particles," said Dionne.

"While scientists have found a number of applications for larger nanoparticles, quantum-sized metal particles have remained largely under-utilized," said Scholl.

Science has a solid understanding of plasmons in larger metal particles, based mostly on classical physics. Below a threshold of about 10 nanometers in diameter, however, at what is described as the quantum scale, classical physics breaks down and quantum mechanics takes over.

At this scale, the particles begin to demonstrate unique physical and chemical properties that larger counterparts of the very same materials do not. Additional and important physical properties can occur when plasmons are constrained in extremely small spaces, at the scale of the nanoparticles Dionne and Scholl studied.

A nanoparticle of silver measuring just a few atoms across, for instance, will respond to photons and electrons in ways profoundly different from a larger particle or slab of silver. By clearly illustrating the details of this classical-to-quantum transition, Scholl and Dionne have pushed the study of plasmons, a field known as plasmonics, into a new realm.

"Our study allows researchers, for the first time, to directly correlate a quantum-sized particle's geometry – its shape and size – with its plasmon resonances," said Dionne.

Interesting applications

Exploring the size-dependent nature of plasmons at the extreme nanoscale could open up some interesting applications.

"We might discover novel electronic or photonic devices based on excitation and detection of plasmons. Or, there could be opportunities in quantum optics, bio-imaging and therapeutics," said Dionne.

Medical science, for instance, has devised a way to use nanoparticles excited by light to burn away cancer cells, a process known as photothermal ablation. Metal nanoparticles are affixed with molecular appendages that attach exclusively to cancerous cells in the body. When irradiated with infrared light, the plasmons in the metal begin to vibrate and the nanoparticles heat up, burning away the cancer while leaving the surrounding healthy tissue unaffected.

The metal particles used in these applications today, however, are relatively large. Smaller particles like those described in this research could be integrated into cells more easily, and might therefore improve the accuracy and effectiveness of these technologies.

In a similar vein, the greater surface-area-to-volume ratios offered by atomic-scale nanoparticles could improve rates and efficiencies in catalytic processes like water-splitting and artificial photosynthesis, yielding clean and renewable sources of energy from artificial fuels.
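For a sphere, the surface-area-to-volume ratio scales as 3/r, so shrinking a particle directly multiplies the surface available for catalysis. A quick check:

    # Surface-area-to-volume ratio of a sphere: (4*pi*r^2)/((4/3)*pi*r^3) = 3/r.
    # Shrinking the radius 100-fold exposes 100x more surface per unit volume.
    def sa_to_volume(radius_nm: float) -> float:
        return 3.0 / radius_nm   # units: 1/nm

    print(sa_to_volume(100.0))  # 0.03 nm^-1, a 200 nm diameter particle
    print(sa_to_volume(1.0))    # 3.0  nm^-1, a 2 nm diameter particle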

Elegant and versatile

The researchers concluded by explaining the physics of their discovery through an elegant and versatile analytical model based on well-known quantum mechanical principles.

"Technically speaking, we've created a relatively simple, computationally light model that describes plasmonic systems where classical theories have failed," said Scholl.

The researchers' ability to observe plasmons in particles of such small size was aided by the powerful, multimillion-dollar environmental scanning transmission electron microscope (E-STEM) recently installed at Stanford's Center for Nanoscale Science and Engineering, one of just a few such microscopes in the world.

E-STEM imaging was used in conjunction with electron energy-loss spectroscopy (EELS) – a research technique that measures the change of an electron's energy as it passes through a material – to determine the shape and behavior of individual nanoparticles. Combined, STEM and EELS allowed the team to address many of the ambiguities of previous investigations.

Ai Leen Koh, a research scientist at the Stanford Nanocharacterization Laboratory, contributed to this study. The work was funded by the National Science Foundation Graduate Research Fellowship Program, the Stanford Terman Fellowship and the Robert N. Noyce Family Faculty Fellowship.

Source: Stanford University

Nanotechnology: Feel the pressure

Engineerblogger
March 28, 2012


Miniaturized pressure sensors are widely used in mechanical and biomedical applications, for example, in gauging fuel pressure in cars or in monitoring blood pressure in patients. Woo-Tae Park and co-workers at the A*STAR Institute of Microelectronics have now developed a nanowire-based sensor that is so sensitive it can detect even very low pressure changes (see paper in Journal of Micromechanics and Microengineering: "Gate-bias-controlled sensitivity and SNR enhancement in a nanowire FET pressure sensor").

Most miniaturized pressure sensors harness the intrinsic properties of piezoresistive materials. A structural change in such a material, induced for example by an external force, results in a corresponding change in its electrical resistance. However, piezoresistive materials have two major limitations. Firstly, they are not particularly sensitive, which means that low pressures produce weak electronic signals. Secondly, they can generate a lot of electrical noise, which can mask the true measurement signal. An ideal transducer should have a high signal-to-noise ratio (SNR). Park and his co-workers have now used nanowires to create a pressure sensor with enhanced SNR properties.
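The piezoresistive effect is conventionally quantified by a gauge factor GF relating fractional resistance change to strain: dR/R = GF * strain. The sketch below shows the arithmetic with illustrative numbers only; nanowires can exhibit gauge factors well above those of bulk silicon.

    # Piezoresistive response: dR/R = GF * strain, where GF is the gauge
    # factor. All values below are illustrative, not the paper's data.
    def resistance_shift_ohm(r0_ohm: float, gauge_factor: float, strain: float) -> float:
        return r0_ohm * gauge_factor * strain

    # A 10 kOhm nanowire with GF = 100 under 0.01% diaphragm strain:
    print(resistance_shift_ohm(10_000, 100, 1e-4))  # 100 Ohm shift to detect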

Previous research has shown that nanowires can exhibit strong piezoresistive effects because of their small size. To take advantage of this, Park and his co-workers used state-of-the-art material processing techniques to suspend two silicon nanowires between two electrodes on a silicon-on-insulator substrate. Each wire was a few hundred nanometers long and approximately 10 nanometers wide. The wires were covered in amorphous silicon, which both protected them and acted as an electrical connection, referred to as the gate. To this the researchers attached a circular diaphragm: a two-layer membrane of silicon nitride and silicon dioxide. Any stress in the diaphragm was therefore transferred to the nanowire structure.

The team characterized their sensor by passing a controlled stream of air across it. Ammeters measured the current flowing through the device as a known electrical potential was applied across the two electrodes. An additional voltage, the gate bias, was also applied between one of the electrodes and the gate. Park and his co-workers demonstrated that they could achieve a four-fold increase in pressure sensitivity by reversing the direction of this gate bias. This, they believe, is a result of the bias voltage controlling the confinement of the electrons within the nanowire channels — a concept commonly employed in so-called field-effect transistors. An assessment of the device noise characteristics also showed significant improvements with the right choice of operating parameters.
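One common way to express such a sensitivity figure is as the relative current change per unit of applied pressure, S = (dI/I0)/dP. The comparison below uses made-up currents purely to illustrate the four-fold claim, not the measured data from the paper.

    # Pressure sensitivity as relative current change per unit pressure,
    # S = (dI / I0) / dP. Currents below are made up for illustration.
    def sensitivity_per_kpa(i0_a: float, di_a: float, dp_kpa: float) -> float:
        return (di_a / i0_a) / dp_kpa

    forward = sensitivity_per_kpa(1e-6, 2e-9, 1.0)  # one gate-bias polarity
    reverse = sensitivity_per_kpa(1e-6, 8e-9, 1.0)  # polarity reversed
    print(f"sensitivity improvement: {reverse / forward:.1f}x")  # 4.0x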

Park and his co-workers believe that the device provides a promising route for applications requiring miniaturized pressure sensors that use little power.

Source: A*STAR

Additional Information: