Tuesday, September 10, 2013

Posted by lelyholida
No comments | 9:45 PM
Well, this is becoming an unfortunate trend. Another train carrying oil has derailed and exploded, this time in Alabama. From the Reuters news report:

A 90-car train carrying crude oil derailed and exploded in western Alabama in the early hours of Friday morning, spilling oil and leaving eleven cars burning in the rural area.

No injuries have been reported, but 20 of the train’s cars derailed and 11 were still on fire, the train owner, Genesee & Wyoming, said in a statement on Friday. Those cars, which threw flames 300 feet into the night sky, are being left to burn down, which could take up to 24 hours, the company said.

A local official said the crude oil had originated in North Dakota, home of the booming Bakken shale patch. If so, it may have been carrying the same type of light crude oil that was on a Canadian train that derailed in the Quebec town of Lac-Megantic this summer, killing 47 people.

I can’t think of a clearer manifestation of how stressed and ill-equipped the U.S. energy infrastructure is to handle the domestic oil boom than exploding trains. Pipeline capacity simply doesn’t exist to move oil from the central parts of the country to refineries and markets (especially on the East Coast). Trains are the quick and easy solution. As I mentioned the last time an oil train derailed (“Another oil train derails in Canada”), putting oil on train cars is cheaper than building new pipelines and doesn’t require environmental review.

Clearly, oil-by-train standards will be tightened up. While that is long overdue, it doesn’t address the core issue: the lack of adequate petroleum infrastructure. To remedy this, the U.S. government could streamline permitting rules. The Federal Energy Regulatory Commission (FERC) approves interstate natural gas pipelines in a relatively speedy 18 months. Petroleum pipelines, on the other hand, are permitted at the state level rather than the federal level, which draws out the permitting process and drives up costs. So instead of wading through state regulatory commissions and agencies, someone looking to move oil will simply hire a train, which faces little safety oversight.

Streamlined permitting would benefit petroleum producers, who would gain an easier way to move their product (as the National Petroleum Council recommended – PDF), while environmentalists could chalk it up as a win: a safer transportation network that avoids the pollution and loss of life that come with each derailment and explosion.

Monday, September 9, 2013

Posted by lelyholida
No comments | 9:40 PM
Over the last few years, we’ve heard a lot about how “Big Data”—which as far as I can tell is just data mining in a glossy new wrapper—are going to revolutionize science and help us create a better world.* These claims strike me as all too familiar. They remind me of the hype generated in the 1980s by chaos and in the 1990s by complexity (which was just chaos in a glossy new wrapper). Chaos and complexity enthusiasts promised (and are still promising) that ever-more-powerful computers plus jazzy new software and math were going to crack riddles that resisted more traditional scientific methods.

Advances in data-collection, computation and search programs have led to impressive gains in certain realms, notably speech recognition, language-translation and other traditional problems of artificial intelligence. So some of the enthusiasm for Big Data may turn out to be warranted. But in keeping with my crabby, glass-half-empty persona, in this post I’ll suggest that Big Data might be harming science, by luring smart young people away from the pursuit of scientific truth and toward the pursuit of profits.

My attention was drawn to this issue by a postdoc in neuroscience, whose research involves lots of data crunching. He prefers to remain anonymous, so I’ll call him Fred. After reading my recent remarks on the shakiness of the scientific literature, he wrote me to suggest that I look into a trend that could be exacerbating science’s woes.

“I think the big science journalism story of 2014 will be the brain drain from science to industry ‘data science,’” Fred writes. “Up until a few years ago, at least in my field, the best grad students got jobs as professors, and the less successful grad students took jobs in industry. It is now the reverse. It’s a real trend, and it’s a big deal. One reason is that science tends not to reward the graduate students who are best at developing good software, which is exactly what science needs right now…

“Another reason, especially important for me, is the quality of research in academia and in industry. In academia, the journals tend to want the most interesting results and are not so concerned about whether the results are true. In industry data science, [your] boss just wants the truth. That’s a much more inspiring environment to work in. I like writing code and analyzing data. In industry, I can do that for most of the day. In academia, it seems like faculty have to spend most of their time writing grants and responding to emails.”

Fred sent me a link to a blog post, “The Big Data Brain Drain: Why Science is in Trouble,” that expands on his concerns. The blogger, Jake VanderPlas, a postdoc in astrophysics at the University of Washington, claims that Big Data is, or should be, the future of science. He writes that “in a wide array of academic fields, the ability to effectively process data is superseding other more classical modes of research… From particle physics to genomics to biochemistry to neuroscience to oceanography to atmospheric physics and everywhere in-between, research is increasingly data-driven, and the pace of data collection shows no sign of abating.”

VanderPlas suggests that the growing unreliability of peer-reviewed scientific results, to which I alluded in my last post, may stem in part from the dependence of many research results on poorly written and documented software. The “crisis of irreproducibility” could be ameliorated, VanderPlas contends, by researchers who are adept at data analysis and can share their methods with others.

The problem, VanderPlas says, is that academia is way behind Big Business in recognizing the value of data-analysis talent. “The skills required to be a successful scientific researcher are increasingly indistinguishable from the skills required to be successful in industry. While academia, with typical inertia, gradually shifts to accommodate this, the rest of the world has already begun to embrace and reward these skills to a much greater degree. The unfortunate result is that some of the most promising upcoming researchers are finding no place for themselves in the academic community, while the for-profit world of industry stands by with deep pockets and open arms.”

VanderPlas and Fred, who are apparently software whizzes themselves, perhaps overstate the scientific potential of data crunching just a tad. And Fred’s aforementioned claim that industry “just wants the truth” strikes me as almost comically naïve. [**See Fred's clarification below.] For businesses, peddling products trumps truth—which makes the brain drain described by Fred and VanderPlas even more disturbing.

Fred is a case in point. Increasingly despondent about his prospects in brain research, he signed up for training from the Insight Data Science program, which trains science Ph.D.s in data-manipulation skills that are desirable to industry (and claims to have a 100 percent job placement record). The investment paid off for Fred, who just got a job at Facebook.

*Should “Big Data” be treated as plural or singular? I polled my students, and they said plural, so I went with plural.

**Re his comment about industry bosses wanting “truth,” “Fred” just emailed me this clarification: “I think there is a distinction, which I perhaps should have made clearer, between ‘marketing’ and ‘analytics.’ When it comes to marketing a product to consumers, I agree it’s pretty obvious that business incentives are not aligned with truth telling. No one disputes that. But when it comes to the business’s internal ‘analytics’ team, the incentives are very aligned with truth telling. Analytics teams do stuff like: determining how users are interacting with the product, measuring trends in user engagement or sales, analyzing failure points in the product. This is the type of work that most data scientists do.”

***A couple of afterthoughts on this topic: First, Lee Vinsel, my Stevens colleague and former friend, points out in a comment below that industry has long lured scientists away from academia with promises of filthy lucre and freedom from the grind of tenure-and-grant-chasing. Yup. Wall Street “quants” are just one manifestation of this age-old phenomenon. So what’s new about the Big Data Brain Drain? Does it differ in degree or kind from previous academia-to-business brain drains? Good questions, Lee. I have no idea, but I bet Big Data can provide the answer! (Unless of course it’s subject to some sort of Gödelian limit on self-analysis.)

Second, a fascinating implication of the rise of Big Data is that science may increasingly deliver power—that is, solutions to problems—without understanding. Big Data can, for example, help artificial intelligence researchers build programs that play chess, recognize faces and converse without knowing how human brains accomplish these tasks. The same could be true of problems in biology, physics and other fields. If science doesn’t yield insight, is it really science? (For a smart rebuttal of the notion that Big Data could bring about “the end of theory,” see the blog post by Sabine Hossenfelder mentioned below.)

Sunday, September 8, 2013

Posted by lelyholida
No comments | 9:38 PM
A mathematician and a chef have produced objects that mimic the function and beauty of biological organisms

Finding a bug in your drink is an unpleasant surprise, but researchers at the Massachusetts Institute of Technology have created a fanciful cocktail accessory based on the mechanics of water bugs—and another, less ironically, modeled after the workings of a delicate water lily.

In partnership with José Andrés, a renowned chef who lectures on the science of cooking at Harvard University, applied mathematics professor John Bush designed two cocktail accessories—a pipette modeled after a water lily, which serves to pick up drops of cocktails meant to cleanse the palate and drop them on the diner’s tongue, and an edible “boat” that circles around the surface of alcoholic drinks. Both were produced on 3D printers, which allowed the researchers to modify their prototypes rapidly, and both were inspired by Bush’s desire to combine mathematics and culinary art.

After attending one of Andrés’ Harvard lectures, Bush approached him to suggest that they collaborate on edible designs that relied on mathematical properties. “Much of my research concerns surface tension,” Bush says, “which is responsible for a number of interesting effects that arise in the kitchen—or the bar.”

The cocktail boat is filled with an alcohol of a higher proof than the drink it floats in, which it then releases steadily through a notch at one end. This creates a difference in surface tension, propelling the boat forward in a phenomenon called the Marangoni effect. The design, described in a paper published in the October issue of Bioinspiration & Biomimetics, is inspired by a mechanism found in nature: Many aquatic insects rely on Marangoni propulsion, which they create by releasing chemicals that produce a gradient in surface tension. When dropped onto a watery surface, the bugs use this mechanism to skitter safely back to shore. Bush’s team optimized the design of the boats for speed and fuel efficiency—in other words, the amount of time they could move before running out of alcohol, or about two minutes. You can see them in action here.
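
To get a feel for the scale of the forces involved, here is a rough back-of-envelope sketch of the Marangoni thrust on such a boat. It is not from Bush’s paper; the surface-tension values and notch width are illustrative assumptions.

```python
# Back-of-envelope estimate of the Marangoni thrust on a cocktail boat.
# The surface tensions and notch width below are illustrative assumptions,
# not values from Bush's paper.

sigma_drink = 0.045   # N/m, surface tension of a diluted cocktail (assumed)
sigma_fuel = 0.025    # N/m, surface tension of the high-proof alcohol "fuel" (assumed)
notch_width = 0.003   # m, width of the notch at the stern (assumed)

# The thrust scales roughly as the surface-tension difference across the notch
# times the length of the contact line there: F ~ (sigma_drink - sigma_fuel) * w.
thrust_newtons = (sigma_drink - sigma_fuel) * notch_width
print(f"Estimated thrust: {thrust_newtons * 1e6:.0f} micronewtons")  # ~60 uN
```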

The floral pipettes, which Bush says are intended to deliver dainty drops of liquid to the tongue—concoctions, according to the paper, that Andrés will develop specially to refresh diners’ palates during multi-course meals—look like upside-down flowers. When the flowers are dipped into liquid and pulled back out, their petals fold shut and hold some of the liquid inside. The design, Bush says, is “an inversion of the design of floating flowers that, when exposed to floods, wrap up in order to protect genetic material.” Biomimicry is nothing new—Velcro, Scotch tape and the airplane are all examples of designs borrowed from nature—but creating the cocktail accessories was a unique challenge. “Typically in the lab,” Bush says, “function is everything, but given our ultimate goal, aesthetic appeal was also a consideration.” As the researchers stated in their paper, they strove for “the mimicry not only of nature’s function, but of her elegance.”

Now the designs are in the hands of Andrés’ management company, ThinkFoodGroup. “The chefs are taking it a step further,” Bush says. “The designs are to be not only functional and aesthetically pleasing, but edible.” The hope is that they’ll soon make their debut at Minibar, Andrés’ restaurant in Washington, DC. And in the meantime, Bush says he’s always on the lookout for more natural mechanisms he can replicate.


Saturday, September 7, 2013

Posted by lelyholida
No comments | 9:37 PM
A couple of weeks ago, I was writing up a description of Einstein’s general theory of relativity, and I thought I’d compare the warping of spacetime to the motion of Earth’s tectonic plates. Nothing on Earth’s surface has fixed coordinates, because the surface is ever-shifting. Same goes for spacetime. But then it struck me: if nothing has fixed coordinates, then how do Google Maps, car nav systems, and all the other mapping services get you where you’re going? Presumably they must keep updating the coordinates of places, but how?

I figured I’d Google the answer quickly and get back to Einstein, yet a search turned up remarkably little on the subject. So, as happens distressingly often in my life, what I thought would take 30 seconds ended up consuming a couple of days. I discovered a sizable infrastructure of geographers, geologists, and geodesists dedicated to ensuring that maps are accurate. But they are always a step behind the restless landscape. Geologic activity can create significant errors in the maps on your screens.

One of the people I talked to is Ken Hudnut of the U.S. Geological Survey, an earthquake researcher (and blogger) who set up one of the first GPS networks to track plate motions. “Say that you’re standing right in the middle of a road intersection with your GPS receiver and you get the coordinates for your position,” he says. “You look at Google Earth, and instead of being located right at the middle of the road intersection, you’re off by some amount.” Several factors produce these errors. Consumer GPS units have a position uncertainty of several meters or more (represented by a circle in Google Maps). Less well known is that maps and satellite images are typically misaligned by a comparable amount. “It’s partly the GPS hardware that limits the accuracy, and part of it may also be the quality of the georeferencing,” Hudnut says.

An interesting, if dated, study from 2008 looked at Google Earth images in 31 cities in the developed world and found position errors ranging from 1 to 50 meters. It’s not hard to do your own experiments. The image at left shows my position in Google Maps while I was standing on my back deck—a discrepancy of about 10 meters, much larger than the stated error circle. When I go to Google Earth and compare images taken on different dates, I find that my house jumps around by as much as 20 meters.
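
If you want to put a number on a discrepancy like that, the haversine formula turns two latitude/longitude pairs into a ground distance. Here is a minimal sketch, with made-up coordinates standing in for “where I stood” and “where the map pin landed”:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (mean Earth radius)."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Hypothetical example: where you actually stood vs. where the map pin landed.
print(haversine_m(40.7450, -74.0250, 40.7451, -74.0251))  # roughly 14 meters
```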

In the grand scheme of things, this isn’t much, but it does make you wary of high zoom levels. Hudnut says he sees map bloopers in his field work all the time. And as the technology improves, so will our expectations. “We’re fast approaching the day when people will expect accuracies of centimeters in real time out of their handheld devices and then we’ll see a lot of head scratching as things no longer line up,” says Dru Smith of the National Geodetic Survey in Silver Spring, Md., the nation’s civilian chief geodesist—the go-to guy for the precise shape and size of our planet.

For the most part, misalignments don’t represent real geologic changes, but occur because it’s tricky to plop an aerial or orbital image onto the latitude and longitude grid. The image has to be aligned with reference points established on the ground. For this purpose, NGS maintains a network of fixed GPS stations and, over the past two centuries, has sprinkled the land with survey marks—typically, metallic disks mounted on exposed bedrock, concrete piers, and other fixed structures. The photo at left shows one near my house. But the process of ground-truthing a map is never perfect. Moreover, the survey-mark coordinates can be imprecise or downright wrong.

NGS and other agencies recheck survey marks only very infrequently, so what a stroke of luck that a whole new community of hobbyists—geocachers—does so for fun. “One of the many things we no longer have money to do is send out people to make sure those marks are still there,” Smith says. “Geocachers, through this creation of a new recreation of going out and finding these marks, are sending in tons of reports.… It’s been helpful to us to keep the mark recoveries up to date.”

Errors also sneak in because the latitude and longitude grid (or “datum”) is not god-given, but has to be pegged to a model of the planet’s shape. This is where plate tectonics can make itself felt. Confusingly, the U.S. uses two separate datums. Most maps are based on NAD 83, developed by NGS. Google Maps and GPS rely instead on WGS 84, maintained by a parallel military agency, which, thanks to Edward Snowden, we now know has a considerably larger budget. The civilian one is optimized for surveying within North America; the military one sacrifices domestic precision for global coverage.
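
For anyone who wants to see the two datums side by side in code, the standard EPSG identifiers are 4269 for NAD 83 and 4326 for WGS 84, and a library such as pyproj can transform between them. This is just an illustrative sketch, not something from the agencies involved:

```python
# A minimal sketch (not from the article) showing the same point expressed in
# both datums. EPSG:4269 is NAD 83; EPSG:4326 is WGS 84. Requires pyproj.
from pyproj import Transformer

nad83_to_wgs84 = Transformer.from_crs("EPSG:4269", "EPSG:4326", always_xy=True)

lon, lat = -98.0, 39.0  # an illustrative point in Kansas, in NAD 83
wgs84_lon, wgs84_lat = nad83_to_wgs84.transform(lon, lat)
print(wgs84_lon, wgs84_lat)

# Note: if no datum-shift grids are installed, PROJ may fall back on a null
# transformation and return the coordinates unchanged; with a proper
# transformation the difference is on the order of a meter or two in the
# continental United States.
```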

When NGS introduced NAD 83, replacing an older datum that dated to 1927, it was the geographic version of the shift from the Julian to the Gregorian calendar. If you’d been paying attention, you would have woken up on December 6, 1988, to find that your house wasn’t at the same latitude and longitude anymore. The shift, as large as 100 meters, reflected a more accurate model of Earth’s shape. Vestiges of the old datum linger. You still see maps based on NAD 27. Also, when the U.S. Navy developed the first satellite navigation system in the 1960s, engineers set the location of 0 degrees longitude by extrapolating the old North American datum. Only later did they discover they had drawn the meridian about 100 meters east of the historic Prime Meridian marker at the Royal Observatory in Greenwich. (Graham Dolan tells the whole, convoluted story on his website, the definitive reference on the meridian.)

NGS and its military opposite number worked together to align their respective datums, but the two systems have drifted apart since then, creating a mismatch between maps and GPS coordinates. Plate tectonics is one reason. WGS 84 is a global standard tied to no one plate. In essence, it is fixed to Earth’s deep interior. Geodesists seeking to disentangle latitude and longitude from the movements of any one particular plate assume that tectonic plates are like interlocking gears—when one moves, all do—and that, if you add up all their rotational rates, they should sum to zero. The effect of not tying coordinates to one plate is that surveyed positions, and the maps built upon them, change over time.

In contrast, NAD 83 sits atop the North American plate like a fishnet laid out on the deck of a boat. As the plate moves, so does the datum. Other regions of the world likewise have their own local datums. That way, drivers can find their way and surveyors can draw their property lines in blissful ignorance of large-scale tectonic and polar motion. “Most surveyors and mapmakers would be happy to live in a world where the plates don’t move,” Smith explains. “We can’t fix that, but we can fix the datum so that the effect is not felt by the predominant number of users.… Generally speaking, a point in Kansas with a certain latitude and longitude this year had that exact same latitude and longitude 10 years ago or 10 years from now.… We try to make the planet non-dynamic.”

To deepen the datum discrepancy, NAD 83 has not been revamped to account for improved knowledge of Earth’s shape and size. “We are currently working with a system that is very self-consistent and very internally precise, but we know, for example, that the (0,0,0) coordinate of NAD 83, which should be the center of the Earth, is off by about two meters,” Smith says. NGS plans an update in 2022, which will shift points on the continent by a meter or more (as shown in the figure at top of this post).

The tradeoff for keeping surveyors happy is that the North American latitude and longitude grid is increasingly out of sync with the rest of the world (as shown in the diagram at left, in which you can see how the North American plate is rotating about a point in the Yucatán). The “rest of the world” includes Southern California, which straddles the North American and Pacific plates. The Pacific plate creeps a couple of inches toward the northwest every year relative to the rest of North America. The plate boundary is not sharp, so the actual amount of movement varies in a complicated way. The California Spatial Reference Center in La Jolla has a network of tracking stations and periodically updates the coordinates of reference points in the state. “That’s what the surveyors then use to tie themselves into NAD 83,” says the center’s director, Yehuda Bock. The last update was in 2011 and another is planned for next year.

Like Smith, Bock says that more frequent updating would actually complicate matters: “Surveyors do not like it if coordinates change, so this is kind of a compromise.” For localized line-drawing, it doesn’t much matter, but large-scale projects such as the California high-speed rail system have to keep up with tectonic motion.
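
As a back-of-envelope illustration of why the update schedule matters, take the article’s “couple of inches per year” of relative plate motion at face value and see how much offset accumulates between updates. The rate and dates below are assumptions for illustration only:

```python
# How much mismatch accumulates between coordinate updates, using the article's
# "couple of inches toward the northwest every year" for the Pacific plate
# relative to North America.

inches_per_year = 2.0                       # assumed relative plate motion
meters_per_year = inches_per_year * 0.0254  # convert to meters

years_since_update = 2014 - 2011            # last California update was 2011 (per the article)
drift_m = meters_per_year * years_since_update
print(f"~{drift_m * 100:.0f} cm of accumulated offset")  # roughly 15 cm
```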

Things obviously get more interesting during earthquakes. “What the earthquake would do is the equivalent of what you do with a pair of scissors, if you cut diagonally across a map along a fault line and then slid one side of the map with respect to the other,” Hudnut says. For instance, in Google Earth, go to the following coordinates north of Palm Springs, near the epicenter of the 1992 Landers quake: 34.189838 degrees, –116.433842 degrees. Bring up the historical imagery, compare the July 1989 and May 1994 images, and you’ll see a lateral shift along the fault that runs from the top left to the bottom right of the frame. The alignment of Aberdeen Road, which crosses the fault, shifts noticeably. The quake displaced the land near the fault by several meters.

GPS networks can even see earthquakes in real time. Here’s a dramatic video of the 2011 Tohoku quake, made by Ronni Grapenthin at U.C. Berkeley based on data from the Japanese Geospatial Information Authority. The coastline near the quake site moved horizontally by as much as 4 meters. The video also shows the waves that rippled outward over Japan (and indeed the world).

Adjustments for tectonic activity take time to filter down to maps. I spoke with Kari Craun, who, as director of the USGS National Geospatial Technical Operations Center near St. Louis, is in charge of producing the USGS topographic maps beloved of outdoors enthusiasts. She says the maps are updated every three years (and even that pace has been hard to maintain with budget cuts). In between, mapmakers figure, the error is swamped by the imprecision of mapping and GPS equipment. Future maps may be updated at a rate closer to real time. “We have the technology now with GPS to be able to make those slight adjustments on a more frequent basis,” Craun says. As someone who relies on Google Maps to get around, I look forward to that. But the romantic in me prefers seeing out-of-date maps. They never let us forget the dynamism of our planet.

Friday, September 6, 2013

Posted by lelyholida
No comments | 9:32 PM
Rob Carlson was driving his Tesla Model S down Route 167 outside Seattle when he accidentally ran over a piece of road debris, likely a fender or other curved piece of metal from a truck. That metal somehow punctured the quarter-inch-thick armored undercarriage of the vehicle and penetrated its battery pack. Within 30 minutes, the car was in flames—the first fire in a fully electric vehicle on a U.S. road and a viral video sensation.

Fortunately, the Model S comes equipped with a warning system. "The car warned the driver to get off the highway as soon as the incident happened. That's awesome," notes chemist Jeff Chamberlain, deputy director of the Joint Center for Energy Storage Research at Argonne National Laboratory, otherwise known as the U.S. government's battery research hub. "Of course, any $80,000 car should be able to do that for you."

A few weeks later, a driver in Merida, Mexico, lost control of his Model S at high speed, crashing through a concrete wall and into a tree. The driver and his passengers were able to walk away, apparently uninjured, before the Tesla burst into flames. And now a third Model S has caught fire after an accident initiated by yet more road debris—this time a renegade tow hitch near Smyrna, Tenn. (which is coincidentally where Nissan builds its all-electric LEAF, which uses similar lithium-ion batteries)—that again appears to have pierced the car’s battery pack and set it ablaze on November 6.

Battery fires are not an issue confined to the Tesla Model S, which is by some measures the world's safest car. A fleet of 16 Fisker Karmas burned last October after being doused with seawater during Superstorm Sandy. And a Chevy Volt sitting in a garage weeks after safety testing by the National Highway Traffic Safety Administration (NHTSA) suddenly began to smolder and burn in 2011, prompting a full safety investigation that later cleared the model for sale. That’s 20 or so incidents in the past few years. For comparison, note that there is a fire in the predominant type of vehicle on the road—a car powered by an internal combustion engine—every four minutes or so.

Nonetheless, battery cars can burst into flames. This is not a problem confined to cars—think Boeing's Dreamliner or any number of Sony products. Rather, it is a problem confined to batteries. Pack a lot of chemical energy into a small space and if something goes wrong, fire or explosions are the inevitable result. So what should be done in the event of such novel fires?

Novel fires
Lithium-ion is the world's most popular battery technology, employed by the hundreds of millions in everything from cell phones to electric cars. Yet such mishaps have proved extremely rare.

Here's how a lithium-ion battery works. A thin plastic film separates a positive and a negative electrode, and the whole assembly is bathed in electrolyte, in this case a clear carbonate solvent. The electrolyte lets lithium travel as positively charged ions: each lithium atom gives up an electron, and during discharge the resulting ions shuttle from the anode to the cathode, where they are absorbed, while the electrons they gave up flow through the external circuit as current. To recharge the cell, simply add electricity, which drives the lithium back out of the cathode and into the anode, and it’s ready to do it all over again.

Now tightly roll sheets of anode and cathode material and cram them into a cylinder. That's one lithium-ion battery. All kinds of things can go wrong in this setup, from a buildup of gas that bursts the exterior cylinder to an actual metallic lithium link forming between the anode and cathode that then sets off what engineers call "thermal runaway." It's more commonly known as fire, helped along by the fact that other components of the cell, such as the plastic separator and the organic solvent, burn nicely, much like gasoline. "It's a chemical fire at its heart," Chamberlain explains.

Put enough cells next to each other and a defect in one can quickly become a defect in all, thermal runaway on the scale of a car-sized battery pack. These breakdowns of the battery generate their own heat, or, in the words of chemists, the reaction is exothermic—enough so that the heat from one cell can set off another. That's why the software to manage the cooling and recharging of electric vehicle batteries is as important as the lithium ion battery pack itself.

And that's where Tesla has distinguished itself. (Tesla declined to provide someone to comment for this story but referred this reporter to the National Fire Protection Association and the company's online safety video.) The new car company divides each Model S's more than 6,500 lithium-ion cells from Panasonic among 16 individual modules—separate but equal and together comprising the vehicle’s overall battery pack. By separating the modules in this way, a mishap in one module is unlikely to spread to another. In addition, Tesla's battery pack is cooled with a glycol-based chemical cocktail, blue in appearance, that can quickly whisk away any excess heat. There is also a "firewall" between each module, according to Tesla CEO Elon Musk, suggesting that some kind of heat-resistant material segregates the modules. It seems that just one module burst into flames in the October 1 incident in Washington involving the pierced battery pack. The battery management system worked well enough that the car’s navigation system warned the driver to pull over and get away from the vehicle. He walked away from the accident unharmed.

That is often not true of crashes involving gasoline.

And keep in mind that part of the reason an electric vehicle cannot go as far as a gasoline-burning car is that even the best lithium ion battery only holds roughly 200 watt-hours of energy per kilogram. Gasoline holds 1700 watt-hours per kilogram. Less energy stored means a reduced risk of that energy unleashed. "We are already carrying around really energy dense materials in our vehicles," Chamberlain notes. "We should be comfortable."

Or as Tesla's Musk put it: "For consumers concerned about fire risk, there should be absolutely zero doubt that it is safer to power a car with a battery than a large tank of a highly flammable liquid."
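
As a rough sanity check on those figures, here is the arithmetic, taking the article’s energy-density numbers at face value. The per-cell values are assumptions for a typical 18650-format cell, not specifications from Tesla:

```python
# Quick arithmetic on the figures above. The per-cell numbers are assumptions
# for a typical 18650-format lithium-ion cell, not specifications from Tesla.

li_ion_wh_per_kg = 200      # article's figure for the best lithium-ion batteries
gasoline_wh_per_kg = 1700   # article's figure for gasoline
ratio = gasoline_wh_per_kg / li_ion_wh_per_kg
print(f"Gasoline stores ~{ratio:.1f}x more energy per kilogram")  # ~8.5x

cells_in_pack = 6500   # "more than 6,500" cells per Model S pack (from the article)
wh_per_cell = 12       # assumed: ~3.3 Ah at 3.6 V nominal for one 18650 cell
pack_kwh = cells_in_pack * wh_per_cell / 1000
print(f"Estimated pack energy: ~{pack_kwh:.0f} kWh")  # close to the 85 kWh pack Tesla sold
```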

Put out the fire
Once a battery fire gets started, however, that fire can be harder to put out than a gasoline fire. In Washington, firefighters did not help matters by cutting holes into the metal frame and thus allowing more oxygen to reach the battery fire in progress. The best thing to do may be nothing.

"In our controlled laboratory setting, we prefer to let a lithium-ion battery fire burn itself out," says Chris Orendorff, principal investigator for the Battery Abuse Testing Laboratory at Sandia National Laboratories, or the guy whose job includes finding out what it takes to blow up any given battery. That is also the guidance that Tesla gives first responders in its emergency response guide: "Battery fires can take up to 24 hours to fully extinguish. Consider allowing the vehicle to burn while protecting exposures."

If a more aggressive course of action is taken, as also happened in the Washington State wreck, beware of putting too little water on a lithium-ion battery fire. If the amount of water is insufficient, the fire will simply appear to go out—before bursting out anew. And if small amounts of metallic lithium have formed as a result of the lithium-ion failure (lithium-ion batteries, despite the name, typically do not contain metallic lithium), the reactive metal can burst into flames because of something as simple as humidity in the air. In attempting to douse a lithium-ion fire, either a lot of water is required or alternative fire suppressants, like CO2 or other chemicals such as Halon. The National Fire Protection Association notes this as an area requiring more research to determine the best approach.

More worryingly, if left to itself after being damaged in an accident, a lithium-ion battery can slowly degrade to the point where it spontaneously bursts into flames weeks later. In essence, if damaged a lithium-ion battery can produce hydrofluoric acid (one of the most powerful acids on Earth) from fluorinated compounds in the battery that then further damages the cell itself and potentially allows the conditions to become right for a fire. Something similar is what happened in the case of the Chevy Volt that burst into flames weeks after safety testing.

Then there are the toxic vapors: the aforementioned hydrofluoric acid, plus bits of various metals that can be liberated by the fire, including aluminum, cobalt, copper, lithium, and nickel. Anyone anywhere near such fires, particularly in an enclosed space, should wear full protective gear and a self-contained breathing apparatus that allows no outside air into the system. Battery acids are no picnic in general: sulfuric acid, which serves as the electrolyte in lead-acid batteries such as the ones in conventional cars, is part of the reason that more than 2,000 people suffer chemical burns from using those batteries each year.

Already, more than 100,000 electric cars ply U.S. roadways. Such novel fires will become more common as more and more electric vehicles hit the road, whether the luxury Tesla Model S or cheaper alternatives such as the Ford Fusion Energi. That will in turn mean that the NHTSA and other regulators will need to devise new e-car safety tests, a process that an interagency task force, including the U.S. Department of Energy, is currently working to complete.

One reason that Tesla burst into flames in Washington is inherent to the design itself. The battery pack comprises the very bottom of the Model S. This "skateboard" design is what makes the Model S so stable to drive.

But the skateboard design also exposes the battery pack to the hazards of the road. A piece of metal run over by a gasoline-powered car would have torn up the muffler and exhaust system, or possibly punctured a fuel line. In the case of the Tesla Model S, it punctured the battery and set off a fire. The NHTSA will investigate. "It is important to evaluate and understand a cell or battery response to all types of abusive scenarios," Sandia's Orendorff argues, "so that proper design, chemistry or engineering improvements can be made to mitigate these risks."

Or, as Carlson put it in an email to Tesla customer service: "I was thinking this was bound to happen, just not to me. But now it is out there and probably gets a sigh of relief as a test and risk issue—this 'doomsday' event has now been tested, and the design and engineering works." In other words, battery fires are no reason to kill the electric car.

Thursday, September 5, 2013

Posted by lelyholida
No comments | 9:35 PM
The costs of reducing emissions may be flash points along the path toward a 2015 Paris treaty

At a major United Nations climate summit in Warsaw this week, a plan is being hammered out for negotiations on a new climate treaty to be finalized in Paris in two years’ time. Delegates from 195 nations are also seeking to obtain commitments from countries to limit their greenhouse-gas emissions between now and 2020. But the path forward is rife with disputes between rich and poor countries over funding, and how to allocate and enforce emissions reductions.

The conference aims to outline the schedule and to set parameters for negotiations ahead of the next major climate summit in Paris in 2015, when countries hope to forge a treaty to follow the 2009 agreement settled on in Copenhagen.

At that meeting, negotiations over a formal treaty broke down, but eventually resulted in a set of non-binding pledges — the Copenhagen Accord — for emissions reductions until 2020. The accord also blurred the distinction between developed countries, which were bound by the 1997 Kyoto Protocol to reduce emissions, and developing countries, which had no such obligations. Since then, negotiators have worked on how to structure a new framework that would involve climate commitments from all countries — including China, now the world’s largest emitter, and the United States, which never ratified the Kyoto Protocol (E. Diringer Nature 501, 307–309; 2013).

The Warsaw talks are split into two main tracks. One focuses on the architecture of a new global climate treaty that would take effect after 2020, when the current Copenhagen commitments expire. The second examines what can be done to strengthen commitments between now and 2020 to increase the chance of limiting global warming to a target of 2 °C above pre-industrial temperatures (see ‘Emissions up in the air?’).

The European Union (EU), for example, has proposed a multi-stage process, whereby commitments for climate action post-2020 would be registered next year and then subjected to an international assessment to determine how well the commitments measure up against each other and against scientific assessments. The final commitments would then be registered in Paris in 2015. By getting countries to volunteer their climate commitments and comparing them in this way, the hope is that nations with unambitious targets might be shamed into strengthening them. The EU has also called for a review of pre-2020 commitments.

Tasneem Essop, who is tracking negotiations for the environmental group WWF in Cape Town, South Africa, says that these short-term commitments are crucial for pointing the world in the right direction. “The biggest challenge will be to ensure that emissions do peak within this decade,” she says.

The cost of reducing emissions could be the first flashpoint in Warsaw. In Copenhagen, developed countries agreed to provide US$30 billion in climate aid from 2010 to 2012, and to increase climate support to developing countries to $100 billion annually by 2020. Although the short-term commitments were largely met, there is no clear plan for attaining the goal of $100 billion a year. From emerging giants such as Brazil and China to poor countries in Africa, developing nations are demanding that wealthy countries ramp up funding and create a viable path to this goal.

With public coffers strapped, many developed nations are looking for other funding sources. One possibility is to place some type of levy on international aviation, which is being considered by the International Civil Aviation Organization in Quebec, Canada. The body has committed to craft an agreement by 2016 that could take effect by 2020.

Negotiators in Warsaw will haggle over how to finance and ultimately deploy climate aid through organizations such as the newly launched Green Climate Fund, based in Incheon, South Korea. Another flashpoint is the developing countries’ demand for a ‘loss and damage’ mechanism to compensate poor countries irreparably harmed by climate change.

But the biggest questions will center on the framework for the treaty in 2015. Before Copenhagen, the emphasis was on a treaty similar to the Kyoto Protocol that would lock in legally binding emissions reductions. In Copenhagen, the United States and other developed countries pushed for an alternative that would allow individual countries to register commitments, which would then be reviewed at an international level. Delia Villagrasa, a senior adviser for the European Climate Foundation in Brussels, says that the talks are moving towards this bottom-up approach, which would be combined with a formal review to assess commitments and identify ways to scale them up. The world could get its first hint of what such a system might look like as the talks wrap up next week.


“Warsaw will bring some clarification on the structure of the new agreement,” Villagrasa says. “That’s not sexy for the media, but it’s important.”

Wednesday, September 4, 2013

Posted by lelyholida
No comments | 9:28 PM
The proposed project's accelerator ring would be 100 kilometers around and run at seven times the energy of the LHC

When Europe’s Large Hadron Collider (LHC) started up in 2008, particle physicists would not have dreamt of asking for something bigger until they got their US$5-billion machine to work. But with the 2012 discovery of the Higgs boson, the LHC has fulfilled its original promise — and physicists are beginning to get excited about designing a machine that might one day succeed it: the Very Large Hadron Collider (VLHC).

“It’s only prudent to try to sketch a vision decades into the future,” says Michael Peskin, a theoretical physicist at SLAC National Accelerator Laboratory in Menlo Park, California, who presented the VLHC concept to a US government advisory panel on 2 November.

The giant machine would dwarf all of its predecessors (see ‘Lord of the rings’). It would collide protons at energies around 100 teraelectronvolts (TeV), compared with the planned 14 TeV of the LHC at CERN, Europe’s particle-physics lab near Geneva in Switzerland. And it would require a tunnel 80–100 kilometers around, compared with the LHC’s 27-km circumference. For the past decade or so, there has been little research money available worldwide to develop the concept. But this summer, at the Snowmass meeting in Minneapolis, Minnesota — where hundreds of particle physicists assembled to dream up machines for their field’s long-term future — the VLHC concept stood out as a favorite.

Some physicists caution that the VLHC would be only a small part of the global particle-physics agenda. Other priorities include: upgrading the LHC, which shut down in February for two years to boost its energies from 7 TeV to 14 TeV; plans to build an International Linear Collider in Japan, to collide beams of electrons and positrons as a complement to the LHC’s proton findings; and a major US project to exploit high-intensity neutrino beams generated at the Fermi National Accelerator Laboratory in Batavia, Illinois. Jonathan Rosner, a particle physicist at the University of Chicago, Illinois, who convened Snowmass, says that these forthcoming projects should be the focus. “It’s premature to highlight the VLHC,” he says.

In some ways, the interest in the VLHC is a sign that particle physicists are returning to their roots, pushing to ever higher energies to find the fundamental building blocks of nature.

They will have to justify it, however. The discovery of the Higgs particle lends support to the idea that some particles have mass because they interact with a pervasive, treacle-like Higgs field. Yet many aspects of the discovery are still not understood, including why the mass of the Higgs particle is so large. One way of explaining its heaviness is through supersymmetry theory, in which known particles are coupled with heavier ones that might be observed in bigger particle colliders. Although the LHC has not detected any signs of supersymmetry, Peskin hopes that a hint may come before the end of the decade, which would help to inform the design of a larger machine.

One advocate of a bigger machine is Nima Arkani-Hamed, a theoretical physicist at the Institute for Advanced Study in Princeton, New Jersey. In December, he will help to launch an institute in Beijing called the Center for Future High Energy Physics. Part of its explicit mission, he says, is to explore the physics that a future proton collider might investigate. William Barletta, an accelerator physicist at the Massachusetts Institute of Technology in Cambridge, says that this work is crucial to identify a machine size that will maximize the science per dollar. “We won’t just give hand-waving arguments,” he says.

To build a 100-TeV machine, Barletta adds, physicists will need to develop superconducting magnets that can operate at higher fields than the current generation, perhaps 20 tesla instead of 14 tesla. One leading candidate material for such magnets is niobium tin, which can withstand higher fields but is expensive and must be cooled below 18 kelvin.
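
A rough rule of thumb shows why a ring of that size with magnets of that strength lands near 100 TeV: a proton’s momentum in TeV/c is roughly 0.3 times the dipole field in tesla times the bending radius in kilometers. The sketch below assumes a dipole “fill factor” of about two-thirds, chosen so the LHC’s published parameters come out right; it is an illustration, not a design calculation.

```python
import math

# Rule of thumb for a circular proton collider: beam momentum in TeV/c is about
# 0.3 * B[tesla] * rho[km], where rho is the bending radius. The dipole fill
# factor (fraction of the ring occupied by bending magnets) is an assumption.

def collision_energy_tev(circumference_km, dipole_field_t, fill_factor=0.66):
    rho_km = (circumference_km / (2 * math.pi)) * fill_factor  # effective bending radius
    beam_energy_tev = 0.3 * dipole_field_t * rho_km            # energy per beam
    return 2 * beam_energy_tev                                 # head-on collision energy

print(collision_energy_tev(27, 8.33))  # LHC: ~14 TeV
print(collision_energy_tev(100, 20))   # 100-km ring with 20 T dipoles: ~125 TeV
```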


CERN is developing its own plans for a collider that is similar to the VLHC. CERN accelerator physicist Michael Benedikt is leading a study of a ‘very high energy large hadron collider’ that would pass under Lake Geneva. It would have the same key parameters as the suggested VLHC: a circumference of 80–100 km and a collision energy of 100 TeV. Benedikt suggests that construction might begin in the 2020s so that the machine could be completed soon after the LHC shuts down for good around 2035. “One would not want to end up with a huge gap for high-energy physics,” he says. He adds that it is too early to offer a price tag. But other physicists speculate that a next-generation collider would have to cost less than $10 billion for the project to be politically plausible.

Tuesday, September 3, 2013

Posted by lelyholida
No comments | 9:27 PM
Dead bodies clog the basement of the Tacloban City Convention Centre. The dazed evacuees in its sports hall are mostly women and children. The men are missing.
That so few men made it to this refuge shows how dimly aware they were of the threat posed by Typhoon Haiyan, which crashed into the central Philippines on Friday with some of the strongest winds ever recorded.
Many men stayed at their homes to guard against looters. Poorly enforced evacuations compounded the problem. And the bodies illustrate another, more troubling truth: the evacuation center itself became a death trap, as many of those huddling in the basement perished in a tsunami-like swirl of water.
Those with the foresight to evacuate flimsy homes along the coast gathered in concrete structures not strong enough to withstand the six-meter (20-ft) storm surges that swept through Tacloban, capital of the worst-hit Leyte province.
The aid, when it came, was slow. Foreign aid agencies said relief resources were stretched thin after a big earthquake in central Bohol province last month and displacement caused by fighting with rebels in the country's south, complicating efforts to get supplies in place before the storm struck.
The Philippines, no stranger to natural disasters, was unprepared for Haiyan's fury.
"We're all waiting for our husbands," said Melody Mendoza, 27, camped out with her two young sons at the convention center, which towers over the devastated coastal landscape.
Local officials say 10,000 people were killed in Tacloban alone. President Benigno Aquino told CNN the death toll from the typhoon was 2,000 to 2,500, saying "emotional drama" was behind the higher estimate.
Aquino defended the government's preparations, saying the toll might have been higher had it not been for the evacuation of people and the readying of relief supplies.
"But, of course, nobody imagined the magnitude that this super typhoon brought on us," he said.
WARNINGS UNHEEDED
Two days before the storm hit, the International Federation of Red Cross and Red Crescent Societies predicted a "dangerous" typhoon with winds of 240 kph (150 mph) heading straight for Leyte and Samar - the two most devastated provinces.
Warnings were broadcast regularly on television and over social media. More than 750,000 people across the central Philippines were evacuated.
"As bad as the loss of life was, it could have in fact been much, much worse," said Clare Nullis, spokeswoman for the U.N.'s World Meteorological Organization, praising the government's work in issuing warnings.
"Certainly on Thursday and Friday, PAGASA, which is the Philippines' meteorological service, they were sending out regular warnings of a seven-meter (22 ft) storm surge. That was going out on an hourly basis."
But as the storm approached Tacloban and authorities crisscrossed the city, their warnings often fell on deaf ears.
"Some people didn't believe us because it was so sunny," said Jerry Yaokasin, vice mayor of Tacloban. "Some people were even laughing."
Getting relief supplies to survivors has also been chaotic.
Foreign aid workers said they had struggled to get equipment and personnel on to Philippine military cargo planes, with the government prioritizing the deployment of soldiers due to widespread looting at the weekend.
Mark Fernando, 33, a volunteer for the Philippine National Red Cross, arrived in Tacloban on Tuesday afternoon after a two-day wait at nearby Cebu city for a military plane.
"They said, 'Our priority is to bring in soldiers and policemen,'" said Fernando, whose 10-strong team plans to clear debris and set up a water filtration system.
One survivor at the Tacloban convention center said he would have evacuated if he had been told a tsunami-like wall of water might hit.
"On Thursday night we could see the stars in the sky," said Moises Rosillo, 41, a pedicab driver sheltering beneath the centre's distinctive domed roof with his family. "We thought it would just be wind and rain."
Rosillo evacuated his wife and son, but stayed behind with his father and thousands of other men in a neighborhood near the airport. The authorities warned of a storm surge - a term Rosillo said he didn't understand - but didn't try to forcibly evacuate them.
Winds of 314 kph (195 mph) were followed by a surge of water, which rose to the height of a coconut tree within five minutes, he said. Rosillo was swept into a bay, which he likened to a giant whirlpool, and clung for hours to a piece of wood before struggling ashore. His father died in the water.
Medical workers are treating evacuees at the convention center for lacerations and other wounds.
But many, like Mendoza, complained of a lack of food and poor hygiene. "People won't come here because they are scared their children will get sick."
"THE PREPARATIONS WERE NOT ENOUGH"
With so little help arriving, people are still streaming towards Tacloban's airport, where hundreds of people are waiting for a chance to board a flight to Cebu or Manila.
"It appears local government units failed to mobilize officials for forced evacuations to higher and safer ground, out of the way of strong winds, storm surges and widespread flooding," said Doracie Zoleta-Nantes, an expert on disasters at the Australian National University in Canberra.
Typhoons are a frequent phenomenon in the Philippines and the flimsy nature of rural housing means fatalities are hard to avoid. Haiyan was the second category 5 typhoon to hit this year after Typhoon Usagi in September. An average of 20 typhoons strike every year, and Haiyan was the 24th this year.
Last year, Typhoon Bopha flattened three towns in southern Mindanao, killing 1,100 people and causing damage of more than $1 billion.
Zoleta Nantes, a Philippines native, said despite those disasters and efforts to strengthen disaster management since 2010, "the Philippine government continues a reactive approach to disasters".
Survivors complained of shortages of food and water, piling pressure on Aquino whose once soaring popularity has been eroded in recent weeks by a corruption scandal roiling his political allies.
Some officials said they could have done more.
"Now, looking back, the preparations were not enough, especially in Tacloban. What we did not prepare for was the breakdown in local functions," said Lucille Sering, secretary of the government's Climate Change Commission.
More than 30 countries have pledged aid, but distribution of relief goods has been hampered by impassable roads and rudderless towns that have lost leaders and emergency workers.
Hardest-hit Leyte province has only one working airstrip, which is overrun with relief supplies and crowds jostling to evacuate. It can handle only lighter aircraft.
Philippine Army Major Ruben Guinolbay said help from the United States, other countries and aid agencies was slowed by the lack of clear information. Tacloban's government was wiped out by the storm. Many officials are dead, missing or too overcome with grief to work.
"The usual contact people could not be reached because communications were cut and there was no way of getting information," he told Reuters. A U.S. Marine commander came to Tacloban to personally assess the situation, he added. After his trip, help started to flow.
Budget Secretary Florencio Abad said the government's response this time was faster than previous disasters.
"We saw something that is really unprecedented," Abad said. "I don't think we could have prepared for this."

Monday, September 2, 2013

Posted by lelyholida
No comments | 9:26 PM
Building nuclear power plants is the cheapest way for Poland to curb carbon emissions in the coming decades, a government report said on Wednesday.

Poland, which generates nearly all its electricity from ageing, coal-fired power plants, is formulating a new national strategy aimed at modernizing its energy sector to make it more efficient and to cut carbon emissions.

The report said Eastern Europe's biggest economy would need to increase spending on power infrastructure to 26 billion-37 billion zlotys ($8.3-11.8 billion) annually from a current 18 billion in order to boost the efficiency of the sector.

The size of the investment will depend on the structure of the energy mix and the future costs of carbon emissions, according to the study issued by the government's analysts.

The report, which presents various scenarios for Poland's energy mix through to 2060, said having atomic energy in the mix represented the most cost-efficient future path.

"The cheapest way to reduce emission growth is the construction of nuclear power plants," it said.

Additional savings could be made by linking Poland's power network with neighboring countries, so that it could buy or sell electricity on a larger scale in future, it said.

A part of the government's energy strategy is to build a 3 gigawatt nuclear power plant initially expected to cost 50 billion zlotys and be completed by 2023 - but the project has already run into delays due in part to financing problems.

The plant would help Poland meet EU requirements to reduce carbon emissions by diversifying to less polluting forms of electricity generation.


Sunday, September 1, 2013

Posted by lelyholida
No comments | 9:24 PM
Governments want to launch a platform at United Nations climate talks to help set common standards and accounting rules and tie together national and regional emissions trading schemes, but developing countries and green groups warned that talk of a global carbon market is premature.

Almost 200 nations are in Poland for a November 11-22 meeting to plan a 2015 U.N. deal in Paris that would start to tackle climate change in 2021.

Most developed countries see carbon markets as crucial under any new agreement because they seek out the cheapest emissions reductions, making climate change targets more achievable.

More than 40 mainly developed countries, including New Zealand and members of the European Union, have, or are in the process of developing, markets to help cut their output of climate-warming emissions by putting a price on carbon dioxide.

As those schemes are disconnected from each other, governments have proposed launching a framework to unite them under a single voluntary platform to share ideas, with a view to eventually launching a global market to battle climate change.

"Markets are vital ... and it's not premature (to be discussing them). It's more than timely to be thinking now, in advance of 2015, about how to manage their intersection," said New Zealand's climate change ambassador Jo Tyndall, adding there was a clear link between carbon markets and channeling climate finance to poor countries.

Tyndall said the platform, a concept floated by Poland earlier this year, would be a "toolbox" that provides a variety of tools to help develop technical standards and best practice approaches to build trading rules that could underpin a global market.

It would also codify transparency and accounting standards that are "fundamental to ensure all mitigation tools and options, including carbon markets, have environmental integrity and will avoid double-counting (emissions cuts)," she added.

"We have a whole series of different mechanisms all over the world ... (so) should we wait until Paris to start thinking what would be useful to have in 2021, or should we look at what we are doing already?", said Poland's Tomasz Chruszczow, chair of one U.N.'s negotiating streams.

MORE PRESSING ISSUES

But poor nations argue that more pressing issues need to be ironed out, for example the overarching dispute between rich and poor countries over how to share efforts to cut emissions, before more market-based mechanisms are developed or the groundwork for a global trading scheme is laid.

"A market is important but it's premature to deliver it now ... with so many other issues that have to be resolved before 2015," said Khalid Abuleif, an advisor to Saudi Arabia's ministry of petroleum and mineral resources and the country's lead negotiator at the U.N. talks.

"Let's not bring markets in to influence that process."

Emerging economies like Saudi Arabia and China, the world's top emitter, want rich countries to commit to doing more to cut greenhouse gas output while allowing poorer nations to burn more fossil fuels to build their economies and end poverty.

"We need to know the nature of those commitments before we can design a credible market. We cannot afford to have failure," Abuleif said, citing existing U.N.-backed carbon trading mechanisms such as the Clean Development Mechanism (CDM).

The CDM has since 2005 helped channel more than $315 billion to poor nations to help them cut their CO2 emissions or adapt to the effects of climate change.

But the failure of nations to craft a new global pact has caused demand for the CO2 offsets generated under the U.N.'s carbon markets to dry up, sending prices crashing and nearly bankrupting many of the companies that invested in the schemes.

"(The toolbox) is not about establishing yet another mechanism to produce units that no one will buy, it's about understanding what we are doing collectively, which is very helpful in understanding how a new global agreement might work," Poland's Chruszczow said.

Green groups including the Third World Network have called the idea a "recipe for disaster", saying governments are pushing forward with building new markets before studying and learning from the failings of existing ones.
