Thursday, September 27, 2007

DNA Barcoding

DNA barcoding was invented by Paul Hebert of the University of Guelph, in Ontario, Canada, in 2003. His idea was to generate a unique identification tag for each species based on a short stretch of DNA. Separating species would then be a simple task of sequencing this tiny bit of DNA. Dr Hebert proposed part of a gene called cytochrome c oxidase I (COI) as suitable to the task. All animals have it. It seems to vary enough, but not too much, to act as a reliable marker. And it is easily extracted, because it is one of a handful of genes found outside the cell nucleus, in structures called mitochondria.

The idea worked, and it has dramatically reduced the time (to less than an hour) and expense (to less than $2) of using DNA to identify species.
That is pretty cool that they were able to find a section of DNA that uniquely identifies a species and that can be determined so cheaply.

More information on barcoding can be found at the Consortium for the Barcode of Life (and this .pdf is particularly informative).

It turns out that the barcode isn't exactly the same for every member of a species, but the variation within a species is much smaller than the differences between species.
Barcodes affirm the unity of the species Homo sapiens. Comparison of COI barcode sequences shows we typically differ from one another by only one or two base pairs out of 648, while we differ from chimpanzees at about 60 locations and gorillas at about 70 locations. Large intraspecific differences may signal the presence of hidden species, as for example in the recent recognition of two species of orangutan.

A comparison of mitochondrial sequences from 2238 species in 11 animal phyla showed 98% of closely related species pairs had more than 2% sequence difference, which is enough for successful identification of most species.
I was curious how many species would have the same (or very similar) barcodes. If it is only 2% then this methodology ought to work pretty well.
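The 2% rule of thumb is easy to picture in code. This is just an illustrative sketch: the sequences below are made up (real COI barcodes are 648 bp), and real barcoding pipelines align the sequences before comparing them.

```python
# Toy illustration of the ~2% COI divergence heuristic used in DNA barcoding.
# The sequences here are invented; real barcodes are 648 bp and must be
# aligned before comparison.

def percent_difference(seq_a: str, seq_b: str) -> float:
    """Percentage of positions where two equal-length aligned sequences differ."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    mismatches = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    return 100.0 * mismatches / len(seq_a)

def same_species(seq_a: str, seq_b: str, threshold: float = 2.0) -> bool:
    """Rough barcoding heuristic: under ~2% divergence, call it the same species."""
    return percent_difference(seq_a, seq_b) < threshold

sample_1 = "ACGTACGTACGTACGTACGT"
sample_2 = "ACGTACGTACGTACGTACGA"  # 1 mismatch in 20 positions

print(percent_difference(sample_1, sample_2))  # 5.0
print(same_species(sample_1, sample_2))        # False -> likely different species
```

Within a species, differences of one or two bases out of 648 (well under 2%) would come back as a match.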

Where does this barcoding project go from here?
Dr Hebert hopes to have half a million barcodes available online within the next five years. Both his laboratory at Guelph and the Smithsonian's Laboratories of Analytical Biology can sequence the COI gene rapidly, and have thus been dubbed “barcode factories”, so this looks feasible.
This dovetails nicely with the Encyclopedia Of Life project. I hope they are working together.
In the long term, barcoding enthusiasts envisage something called a “barcoder”. It will be a hand-held device that reads barcodes on the spot.
Sounds all Star Trek, but I definitely want me one of these hand-held devices.

And my favorite use of barcoding?
America's Federal Aviation Administration and its air force are working on bird barcoding. They want to scrape bits of tissue from planes, discover which birds are most often being struck, and thus work out which bird-migration routes to avoid.

via The Economist


Link Alert

Google added some new links to the top of the Google Reader, and as I clicked on one I was surprised to see it launch in a new window. It then struck me that it is really stupid that the browser gives you absolutely no way to know whether a link will launch in a new window.

I was all ready to rant about how stupid this is and how someone should come up with a fix, when, lo and behold, I found out that someone already had.

The Firefox add-on Link Alert changes the cursor to tell you when the link will launch in a new window. But it is actually much cooler than that, as it will also tell you when you are clicking on all sorts of different links, such as (South Africa, the Iraq, everywhere like such as) email links, .pdf links, and RSS feeds. I have been using it for a while now and I really like it.

Worth the install.


Interesting Articles of the Week

Salmon sperm makes for better LEDs.

A new device detects heartbeats and brain activity at a distance, doing away with uncomfortable electrodes.

If you're going to die, do it differently!

Soldier of the future gets his gear on.

In flood-prone Bangladesh, a future that floats.


Wednesday, September 26, 2007

Plan Uses Taxes to Fight Climate Change

Dingell will offer a "discussion draft" outlining his tax proposals on Thursday, the same day that President Bush holds a two-day conference to discuss voluntary efforts to combat climate change.

-A 50-cent-a-gallon tax on gasoline and jet fuel, phased in over five years, on top of existing taxes.

-A tax on carbon, at $50 a ton, released from burning coal, petroleum or natural gas.

-Phaseout of the interest tax deduction on home mortgages for homes over 3,000 square feet. Owners would keep most of the deduction for homes at the lower end of the scale, but it would be eliminated entirely for homes of 4,200 square feet or more.

He estimates that would affect 10 percent of homeowners. He says "it's only fair" to tax those who buy large suburban houses and create urban sprawl. Historic and farm houses would be exempted.

Some of the revenue would be used to reduce payroll taxes, but most would go elsewhere including for highway construction, mass transit, paying for Social Security and health programs and to help the poor pay energy bills.
Sounds good to me. Something tells me it has no shot though.

via The Washington Post


Diploid Genome Shows That People Might Differ By 63% of Their Genes

Craig Venter has just sequenced his diploid genome, the results of which were published at PLoS Biology (or you can take a look at the actual map of his diploid genome sequence) and its significance explained in his new book A Life Decoded.

And although they were referred to as complete, they were in fact half-genomes -- or "haploid" -- containing a mom-and-pop mosaic of the 3 billion DNA letters found on just one set of the 23 chromosomes paired in every cell.

Not emphasized in 2001 was the fact that people have in their cells two versions of each of those 23 chromosomes, one from each parent -- a "diploid" genome.

Dr. Venter has spent the last five years and an extra $10 million of his institute’s money in improving the draft genome he prepared at Celera. That genome was based mostly on his own DNA, and the new diploid version is entirely so. It was decoded with an old method, known as Sanger sequencing, that is expensive but analyzes stretches of DNA up to 800 units in length. The cheaper new technologies at present analyze pieces of DNA only 200 units or so long, and the shorter lengths are much harder to assemble into a complete genome.

And unlike the Human Genome Project, whose focus on individual letters made it blind to many larger mutations or variations involving hundreds or thousands of letters, the newer methods that Venter used capture all sizes.
And what did they find? From the report:
Comparison of this genome and the National Center for Biotechnology Information human reference assembly revealed more than 4.1 million DNA variants, encompassing 12.3 Mb. These variants (of which 1,288,319 were novel) included 3,213,401 single nucleotide polymorphisms (SNPs), 53,823 block substitutions (2–206 bp), 292,102 heterozygous insertion/deletion events (indels)(1–571 bp), 559,473 homozygous indels (1–82,711 bp), 90 inversions, as well as numerous segmental duplications and copy number variation regions. Non-SNP DNA variation accounts for 22% of all events identified in the donor, however they involve 74% of all variant bases. This suggests an important role for non-SNP genetic alterations in defining the diploid genome structure. Moreover, 44% of genes were heterozygous for one or more variants.
Or in slightly easier to read prose:
All told, 44 percent of the genes Venter received from one parent were at least a little different from those he inherited from his other parent, and a third of those variations had never been seen in studies of those genes in other people.

One type is called indels, where a single DNA unit has either been inserted or deleted from the genome. Another is copy number variation, in which the same gene can exist in multiple copies. There are also inversions, in which a stretch of DNA has been knocked out of its chromosome and reinserted the wrong way around. Dr. Venter’s genome has four million variations compared with the consortium’s, including three million snips, nearly a million indels and 90 inversions.

Specifically, older analyses suggested that humans' genetic codes are, on average, 99.9 percent identical (or 0.1 percent different), while the new estimate comes in at 99.5 percent (or 0.5 percent different). The true number may be as low as 99 percent, Venter said.
It is amazing to me that 44% of the genes he inherited from his parents differ from each other.

If I understand that correctly, it means that 22% (44%/2) of the genes in each haploid genome have a variation in them. So, for two people to have an identically coded gene, each would need two copies (one from each parent) with no variations — four unvaried copies in all. The odds of any one copy having no variation are .78 (1 - .22), so the odds that all four are unvaried are .78^4 = 37%. The other 63% (1 - .37) of the time, the coding for any given gene will differ between the two people.

If you were to compare your diploid genome with another random individual, of your approximately 25,000 genes, you would share only (25,000*.37=) 9,250 identically coded genes (this assumes that variations are randomly distributed throughout the genome which might be a very bad assumption, so take this number with a grain of salt until real professionals run the numbers).
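The back-of-envelope reasoning above can be reproduced in a few lines, under the same assumption flagged in the text (variations independent and uniformly distributed across genes):

```python
# Reproduce the back-of-envelope estimate above. Assumes variations are
# independent and uniformly distributed across genes (a big assumption).

heterozygous_fraction = 0.44              # genes with a variant in at least one copy
per_haploid = heterozygous_fraction / 2   # ~0.22: chance any one haploid copy varies

p_copy_unvaried = 1 - per_haploid         # 0.78
# Two people match on a gene only if all four copies (two each) are unvaried:
p_gene_identical = p_copy_unvaried ** 4   # ~0.37

genes = 25_000
shared = genes * p_gene_identical         # identically coded genes between two people

print(round(p_gene_identical, 2))  # 0.37
print(round(shared))               # 9254
```

The exact product comes out a hair above the 9,250 in the text because .78^4 is 37.02%, not a round 37%.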

While at the level of base pairs we are 99.5% similar (14 million base pairs might differ in a haploid genome of 2.8 billion), at the level of genes the differences are much greater, as only 37% of them are likely to be the same between two individuals. Whether you choose to say we are 99.5% or 37% the same genetically depends on how you look at it. I think the latter makes more sense personally.

And what do they still not know?
Although Venter's method produces a 6 billion-letter diploid genome, it does not produce complete paternal and maternal genomes of 3 billion letters each.

There are 4,500 gaps where the sequence of DNA units is uncertain, and no technology yet exists for decoding the large amounts of DNA at the center and tips of the chromosomes.
While they have made great strides, this diploid genome is not really "complete" yet, just much closer. There are likely to be more variations found.

And where do we go from here? Venter sees the following:
I think next year we'll probably see 30 to 50 individual genomes done, and hopefully a major escalation from there. Our goal is to maybe, over the next five years, get as many as 10,000 different complete human genomes.
And how much will it cost?
Cost trends are encouraging. The first 3 billion-letter genome sequences took more than a decade to complete and cost billions of dollars. During Venter's latest project, costs dropped precipitously, and today, several scientists said, an entire diploid genome could probably be done for about $100,000. Some predict that a $1,000 genome will be available within five years.
$1,000 in 5 years! I hope that will happen, but I think it will take more like 10 or 15 years to hit that mark.

Another interesting question is how much space it would take to store your genome. If you were to record every base pair of your diploid genome, there would be around 5.6 billion base pairs, which could be stored in 2.8 GB. Put another way, you could store it on your 4 GB iPod Nano and still have 1.2 GB left for music.

But instead of storing all the base pairs, you really only need to store the deltas from the standard genome. If you record just what makes you different, your genome can be reconstructed from the standard one. Each haploid genome differs by maybe 3.2 million SNPs and another 640,000 non-SNP variations. Multiply that by two for your diploid genome, add in the space to record where these variations occur (let's say that increases the size 5 times), and that gets you to about 20 MB, or roughly the space a 20-minute MP3 file takes up. Amazing that it takes more space to save a Dave Matthews Band extended jam than it does to record what makes you genetically unique.
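The delta estimate can be checked the same way. The half-byte-per-base figure and the 5x position overhead are the post's rough assumptions, not measured values:

```python
# Reproduce the delta-storage estimate: store only differences from a
# reference genome. The 0.5 bytes/variant and 5x location overhead are
# the rough assumptions from the text.

snps = 3.2e6
non_snp_variants = 640e3
per_haploid = snps + non_snp_variants   # 3.84 million variants
diploid_variants = per_haploid * 2      # 7.68 million

bytes_per_variant = 0.5                 # same half-byte-per-base assumption
raw = diploid_variants * bytes_per_variant   # ~3.84 MB of variant bases
with_positions = raw * 5                # 5x overhead to record locations

print(with_positions / 1e6)  # 19.2 (MB), i.e. about 20 MB
```

In practice this is roughly what variant-call formats like VCF do: record only the differences against a shared reference.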

I can't wait for the day when I can get my genome sequenced.

via NY Times and Washington Post and News Hour


Tuesday, September 25, 2007

New York Times Now Free

The New York Times is now free:

Effective Sept. 19, we are ending TimesSelect. All of our online readers will now be able to read Times columnists, access our archives back to 1987 and enjoy many other TimesSelect features that have been added over the last two years – free.
About time.

As a blogger, I am so glad not to have to worry about converting NY Times links to permalinks or ever having to use this site again.

And I never understood why the content in the archive had no images or graphs. I thought at first it was because they got rid of them after a while to save space, but that was obviously not the case as they were still accessible via the permalink. I have no idea why they did that or why they would charge for that inferior version.

I never thought the idea of charging for columnists and 2 week old content made any sense. I blame Lexis Nexis for giving the NY Times the crazy idea that you should be charging for your fishwrap.

While I am glad that the content is free, I am not convinced that a completely advertising-driven model is the way to go for online newspapers. Currently, one print reader is worth about ten online readers, and advertising rates online are much lower than those in print. This might change in time, but I find that I am more likely to glance at an ad in a paper than I am to look at one online. And the big moneymaker for print newspapers is the classifieds, which, thanks to Craigslist, have absolutely no revenue-generating value online for newspapers.

To support their current reporting staff, the New York Times might still need some sort of subscription system. If you look at news on television, the network and local news are both completely advertising supported, but the cable news channels get about 1/2 of their revenue from cable subscriptions.

One idea that I have floated before is to sell a subscription for access to all online newspapers and then split the revenues between them based on page views. I am willing to pay a flat fee to access any article in the US that I want to view (and help to pay for the cost of additional reporters and quality reporting), but I have a hard time justifying one for the NY Times, one for the Wall Street Journal (which looks to be going free as well sometime soon) and one for The Economist (which has also released all its articles for free recently) and one for the Financial Times.

Another option for them would be to charge for early access to content. Charge to allow people to view articles at 9 PM rather than 6 AM the next day, or Wednesday rather than Sunday for the NY Times Magazine content. This scheme could also be extended to charge for the current day's content. If you want to view it for free you have to wait until it is one day old. Those who need access to the news as it breaks will have to pay, while those who can wait will get it for free. This is a similar scheme to books, which are first sold in an expensive hardcover edition and then later released in a cheaper paperback edition.

But for now I am a happy NY Times reader and blogger.


This is Socialism?

When I think of school vouchers, I always associate it with Milton Friedman and free markets. And yet where has it taken hold? In "socialist" Europe:

These proposals may seem radical, yet parents in the Netherlands have had the right to demand new schools since 1917, and those in Sweden have been free since 1992 to take their government money to any school that satisfies basic government rules. Such freedoms are wildly popular: in the Netherlands 70% of children are educated in private schools at the taxpayers' expense; in Sweden 10% already are. In both countries state spending on education is lower per head than in Britain, and results are better.
via The Economist


Wal-Mart Asks: How Much Energy Does it Take to Make Toothpaste?

Wal-Mart Stores Inc. (WMT) on Monday said it formed a partnership with the Carbon Disclosure Project to measure the amount of energy used to make products throughout its supply chain.

Wal-Mart said it will use this measurement tool to initiate a pilot plan with a group of suppliers to look for new and innovative ways to make the entire process more energy efficient.

The pilot will focus on seven commonly used product categories - DVD's, toothpaste, soap, milk, beer, vacuum cleaners and soda - and seek to determine the overall environmental impact of products and look for innovative ways to drive energy efficiency.
Interesting. I like it. Can't wait until the day that the labels at the store include the environmental impact of each item.

The project plan of calculating one item in many categories makes a lot of sense to me, but I don't get why they are doing both beer and soda. They are quite similar from an environmental standpoint. Would have liked to see something wood based, maybe books or furniture. And while a vacuum cleaner hits the electronic goods category, I would rather see a TV or a computer in this category.

via CNN Money via Earth2Tech


Monday, September 24, 2007

Killing Two Birds With One Stone

According to the All-China Lawyers Association, the country has only 122,000 lawyers. That is 70,000 fewer than California where the population is only 37m (against China's 1.3 billion). Many business people might argue that California is overlawyered, but there are parts of China without any lawyers at all.
This seems like a problem that could easily be solved with 500 one way Los Angeles to Shanghai flights.

via The Economist


New Low Cost Solar Panels Ready for Mass Production

Colorado State University's method for manufacturing low-cost, high-efficiency solar panels is nearing mass production. AVA Solar Inc. will start production by the end of next year on the technology developed by mechanical engineering Professor W.S. Sampath at Colorado State.

Sampath has developed a continuous, automated manufacturing process for solar panels using glass coating with a cadmium telluride thin film instead of the standard high-cost crystalline silicon. Because the process produces high efficiency devices (ranging from 11% to 13%) at a very high rate and yield, it can be done much more cheaply than with existing technologies. The cost to the consumer could be as low as $2 per watt, about half the current cost of solar panels.

The process is a low waste process with less than 2% of the materials used in production needing to be recycled. Cadmium telluride solar panels require 100 times less semiconductor material than high-cost crystalline silicon panels.
via Industry Week via Engadget


Saturday, September 22, 2007

Telepresence and a Telecommuting Robot

Business travel sucks for a variety of reasons. On a personal level there is jet lag, wasted time, and hassles such as having to take your shoes off at security checkpoints. From an environmental standpoint, travel requires lots of fuel, leading to higher greenhouse gas emissions. From a business standpoint it is expensive. For all these reasons, I am hopeful that telepresence can greatly reduce business travel.

The result is something called “telepresence”, which HP and other technology firms are just beginning to sell. It is basically a spruced-up version of videoconferencing, but its creators insist that the technology is so improved as to be unrecognisable. Users still communicate via live audio and video feeds, but the speed and quality of transmission have increased, and the screens have grown and multiplied, in order to create the illusion that the two parties to a conversation are not continents apart but at opposite ends of the same table (as in the picture above).

Designers want people in telepresence meetings to appear life-sized, and the tables and rooms at the two ends to blend together seamlessly. (Rooms, furniture and even wallpaper are often identical, to aid the illusion.) People must also feel that they are making eye contact, which involves multiple cameras and enormous computing power. The delays in sight and sound must be negligible (ie, below 250 milliseconds, the threshold at which the human brain starts to notice), so that people can interrupt each other naturally. Sound must be perceived to come from the direction of the person speaking. And getting things started must be simple—ideally involving a single button or none at all.

HP charges $350,000 for every room it kits out for telepresence and, in America, a further $18,000 a month for service. Cisco charges up to $299,000 per room. Dominic Dodd, of Frost & Sullivan, a research firm, says that buyers of such systems find that despite their high cost they quickly pay for themselves by keeping travel bills down. Cisco claims that it has cut its own spending on travel by a fifth this year, and that the 100-odd telepresence rooms at its own offices around the world are almost constantly in use.

Frost & Sullivan forecasts that the global market for telepresence, although still tiny, will grow by 56% a year to reach $1.24 billion by 2013.
Sounds pretty cool. Hopefully I will get a chance to try it out someday.

And even if you don't have to travel for work, commuting has its own issues: wasted time, traffic (both the headache of being in it and the fact that you are adding to it), gasoline costs and CO2 emissions. A recent study estimates that 2.9 billion gallons of fuel are wasted in traffic congestion in the US every year. To get around it you can telecommute (which saves 840 million gallons of gas a year), but doing so can make you feel 'out of the loop' at work. Here is one creative solution to that problem:
Programmer Ivan Bowman works from home, but still maintains his presence in the office through the use of a bot he calls IvanAnywhere -- a clever play on his name and the name of his employer, iAnywhere. Basically a webcam-on-wheels, IvanAnywhere motors around the office, takes meetings, and even gives presentations, all while the real Ivan remains safely pantless in his home office.
While that robot will get the job done, I have a feeling that somewhere in Japan scientists are working on making a replacement robot that is a little more lifelike.

via The Economist and Engadget


Name Your Own Species

Searching for new ways to raise money for environmental causes, scientists and conservationists are increasingly opting to sell naming rights to the highest bidder.
I think this is a good idea. With possibly 10 million undiscovered species out there, why not help fund their discovery and preservation by selling their names?

And with just 93 shopping days left until Christmas, what better gift can you give someone than the gift of immortality?
The elegant, invitation-only "Blue Auction," hosted by the Monaco-Asia Society and Conservation International under the patronage of Monaco's Prince Albert II, is the boldest sign yet of a novel twist in the centuries-old system for naming new species.
Check out the species that were up for auction. The auction was a success, netting $2 million. The shark that walks on its fins went for $500,000, which is good, but not quite monkey good ($650,000).

If you missed the auction, don't worry you still have other options.
A German nonprofit called Biopat has tried a more systematic approach. For the past eight years, Biopat has maintained a database of plants and animals that individuals can name for a price that depends on the species. The group divides the proceeds between the institution of the scientist who found it and support for field projects in the area where it resides. It has raised nearly $514,000 so far for naming rights to about 120 species.
They have a nice selection of species names for sale, including frogs, plants, and one cool looking water mite, for 2,600€ to 3,000€. I personally like the Chilean soft coral for 4,000€ and the Brazilian Nudibranch at 5,000€. Really though, I am holding out for one of these deep sea species to become available.

Aside: I love how this Biopat website is kickin' it vintage 90's style with frames, a web counter and (my all time favorite unnecessary website feature:) a guestbook.

If you already have gifts for all your friends, what about one of your enemies? What better way to have them live in infamy than to name a malaria causing mosquito, a parasitic tapeworm or a slime mold eating beetle (well actually someone thought that last one was an honor) after them?

For those of you with more time than money, hopefully the Encyclopedia of Life will make it easy for amateurs to go discover and name their own species. A handheld species barcoder would certainly make the job easier and might be available some day soon.

And if you were wondering what the Latin for Fat Knowledge is, that would be pinguis scientia. A fine name for any species if you ask me. :)

via The Washington Post


Can't Tase This

via Digg


Thursday, September 20, 2007

ELF Magnetic Fields Boost Ethanol Production

Researchers in Brazil have successfully used extremely low frequency (ELF) magnetic fields to significantly boost the amount of ethanol produced through the fermentation of sugar. Their study is scheduled for the 5 October issue of Biotechnology Progress, a bi-monthly journal published by the American Chemical Society.

Victor Perez of the State University of Campinas, Brazil, and his colleagues showed that yeast-based fermentation of sugar cane in the presence of ELF magnetic fields boosted ethanol production by 17%. The scientists also showed that ethanol production was faster, taking two hours less than standard fermentation methods.
Hmm, maybe living under the electric lines isn't so bad. Well, at least not if you are a yeast cell.

via Green Car Congress


Wednesday, September 19, 2007

New Greenspan Book Ghostwritten by Krugman

In the 500-page book, “The Age of Turbulence: Adventures in a New World,” Mr. Greenspan describes the Bush administration as so captive to its own political operation that it paid little attention to fiscal discipline, and he described Mr. Bush’s first two Treasury secretaries, Paul H. O’Neill and John W. Snow, as essentially powerless.

Of the presidents he worked with, Mr. Greenspan reserves his highest praise for Bill Clinton, whom he described in his book as a sponge for economic data who maintained “a consistent, disciplined focus on long-term economic growth.”

By contrast, Mr. Greenspan paints a picture of Mr. Bush as a man driven more by ideology and the desire to fulfill campaign promises made in 2000, incurious about the effects of his economic policy, and an administration incapable of executing policy.

Though Mr. Greenspan does not admit he made a mistake, he shows remorse about how Republicans jumped on his endorsement of the 2001 tax cuts to push through unconditional cuts without any safeguards against surprises. Today, Mr. Greenspan is indignant and chagrined about his role in the Bush tax cuts.
Wow, not that I disagree with anything he said, but I am surprised that he wrote in such stark terms that the Bush administration puts politics over policy, that Clinton was his favorite president, and that the tax cuts were a bad idea. If I didn't know any better I would think the book was written by Paul Krugman.

And then to top it all off, Greenspan goes into Michael Moore territory:
“I am saddened that it is politically inconvenient to acknowledge what everyone knows: the Iraq war is largely about oil,” he says.
As much as the book sounds interesting, the prospect of reading 500 pages, written by a man whose speech is so indecipherable that he had to propose to his wife (NBC News reporter Andrea Mitchell) three times before she understood what he was doing, is a bit too daunting for me.

via NY Times and Times Online


The Truth About Santa

via Spelling Mistakes Cost Lives via Digg


Interesting Articles of the Week

Taste, nutrients decline as size of crops grows.

Underwater Lost Cities: 7 Submerged Urban Wonders of the World

Taddy Blecher, a self proclaimed 'white Jewish guy from Johannesburg, South Africa' quits his job as a management consultant to teach transcendental meditation for 4 years and then starts an almost-free business university for students who cannot afford mainstream higher education.

Bottle makes dirty water drinkable.

International team of scientists to test South Atlantic carbon sink in 2009.


Tuesday, September 18, 2007

'Bringing the Ocean to the World,' in High-Def

Under a $331 million program long dreamed of by oceanographers and being financed by the National Science Foundation, Professor Delaney and a team of scientists from several other institutions are leading the new Ocean Observatories Initiative, a multifaceted effort to study the ocean — in the ocean — through a combination of Internet-linked cables, buoys atop submerged data collection devices, robots and high-definition cameras. The first equipment is expected to be in place by 2009.

Researchers will be able, for example, to assemble a year’s worth of daily data on deep ocean temperatures in the Atlantic or track changes in currents as a hurricane nears the Gulf of Mexico. And schoolchildren accustomed to dated graphics and grainy shark videos will only have to boot up to dive deep in high definition. “It’ll all go on the Internet and in as real time as possible,” said Alexandra Isern, the program director for ocean technology at the National Science Foundation. “This is really going to transform not only the way scientists do science but also the way the public can be involved.”

In the Northwest, about $130 million of the initiative’s cost is being dedicated to build a regional observatory, a series of underwater cables that will crisscross the tectonic plate named for the explorer Juan de Fuca. Rather than provide an information superhighway that bypasses the ocean, this new network is being put in place to take its pulse. Professor Delaney, whose specialty is underwater volcanoes that form at the seams between tectonic plates and the surprising life those seams support, is among those who have been pursuing the cable network for more than a decade, overcoming hurdles of money, technology and skepticism.
We know less about what goes on in the deep ocean than we do about Mars. As long as this is the case, I think we should fund underwater research at the same level we do NASA. This is a great step in that direction.

I like the idea of replacing humans in the field with robots and fiber optic cables. I am also excited about the prospect of an HD hydrothermal vent TV channel (as that may be where life on Earth originated and there might be more biomass there than on the rest of the Earth combined). If CSPAN gets two channels, this deserves at least one.

via NY Times


Hudson River Environmental Monitoring Goes High-Tech

The Beacon Institute for Rivers and Estuaries, a non-profit scientific research organization based in Beacon, NY, has teamed up with IBM and several other research groups to develop a high-tech environmental-monitoring system for the state's Hudson River that would transform its 315 miles into an interconnected network of sensors. These would collect data on the river's biology and chemistry and transmit them to a central location for further analysis by IBM's new system — which will take the information and create a virtual model of the river to simulate its ecosystem in real time.

Some of the network's sensors will be mounted on a solar-powered robotic underwater vehicle (seen above) built by RPI and the Woods Hole Oceanographic Institute (WHOI) while others will be fixed in place along the river bed or suspended from buoys. The IBM system will use the gathered data to monitor the river's temperature, pressure, salinity, dissolved oxygen content and pH levels — which will help indicate whether pollutants have entered the Hudson River — and its sea life.
Love those autonomous robots out there collecting data for us.
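The article doesn't say what the sensor data actually looks like, but here is a minimal sketch of the kind of record each station might send to the central system, with a crude pollution check on top. All field names and thresholds here are my own assumptions, not anything from the Beacon Institute or IBM:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    # Hypothetical record for one measurement from a Hudson River station.
    station_id: str         # riverbed sensor, buoy, or the robotic vehicle
    river_mile: float       # position along the river's 315 miles
    temperature_c: float
    salinity_psu: float
    dissolved_o2_mgl: float
    ph: float

def flag_anomaly(reading: SensorReading) -> bool:
    """Crude pollution flag: a healthy river stays near neutral pH
    and keeps dissolved oxygen reasonably high."""
    return (reading.ph < 6.0
            or reading.ph > 9.0
            or reading.dissolved_o2_mgl < 4.0)

r = SensorReading("buoy-042", 58.5, 18.2, 0.5, 8.1, 7.4)
print(flag_anomaly(r))  # False for this healthy-looking reading
```

The real system presumably does far more than threshold checks, since it feeds a real-time simulation of the whole ecosystem, but even this simple shape shows how fixed and mobile stations could share one data format.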

via TreeHugger


Friday, September 14, 2007

China and Green GDP

President Hu Jintao’s most ambitious attempt to change the culture of fast-growth collapsed this year. The project, known as “Green G.D.P.,” was an effort to create an environmental yardstick for evaluating the performance of every official in China. It recalculated gross domestic product, or G.D.P., to reflect the cost of pollution.

But the early results were so sobering — in some provinces the pollution-adjusted growth rates were reduced almost to zero — that the project was banished to China’s ivory tower this spring and stripped of official influence.

The Green G.D.P. team sought to calculate the yearly damage to the environment and human health in each province. Their first report, released last year, estimated that pollution in 2004 cost just over 3 percent of the gross domestic product, meaning that the pollution-adjusted growth rate that year would drop to about 7 percent from 10 percent. Officials said at the time that their formula used low estimates of environmental damage to health and did not assess the impact on China’s ecology. They would produce a more decisive formula, they said, the next year.

That did not happen. Mr. Hu’s plan died amid intense squabbling, people involved in the effort said. The Green G.D.P. group’s second report, originally scheduled for release in March, never materialized.

The official explanation was that the science behind the green index was immature. Wang Jinnan, the leading academic researcher on the Green G.D.P. team, said provincial leaders killed the project. “Officials do not like to be lined up and told how they are not meeting the leadership’s goals,” he said. “They found it difficult to accept this.”
I like this idea of a Green G.D.P. It's disappointing that it has been shelved in China for the moment; I think it would be in their best interest to bring it back.

I am also curious how exactly they calculated it, but this being China, it's hard to say whether that information would ever become public.
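We don't know their actual formula, but the numbers in the article are enough for a back-of-the-envelope version: subtract the estimated pollution damage (just over 3% of output) from the year's G.D.P. before computing the growth rate. A sketch, assuming that simple subtraction is how the adjustment works:

```python
# Back-of-the-envelope "Green G.D.P." using the article's 2004 figures.
nominal_growth = 0.10        # reported G.D.P. growth, about 10%
pollution_cost_share = 0.03  # pollution damage, just over 3% of G.D.P.

# Normalize last year's G.D.P. to 1.0, then deduct pollution damage
# from this year's output before measuring growth.
gdp_last = 1.0
gdp_now = gdp_last * (1 + nominal_growth)
gdp_green = gdp_now * (1 - pollution_cost_share)

green_growth = gdp_green / gdp_last - 1
print(f"{green_growth:.1%}")  # about 6.7%, i.e. "about 7 percent from 10 percent"
```

That lines up with the drop the article describes, though the officials themselves said their estimates of health damage were low and left out ecological impact entirely, so the true adjusted number would be lower still.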

via NY Times


Saturday, September 08, 2007

Interesting Articles of the Week

Scientists induce out of body sensation.

First all-electric police cruiser.

Green Valley In Wal-Mart's back yard.

Poll finds one in four didn't crack a book last year.

Japanese government offers half-price Big Macs in exchange for pledging to fight global warming. (No word on whether eating less beef was something you could pledge.)


New Viruses to Treat Bacterial Diseases

“By using a virus that only attacks bacteria, called a phage – and some phages only attack specific types of bacteria – we can treat infections by targeting the exact strain of bacteria causing the disease”, says Ana Toribio from the Wellcome Trust Sanger Institute in Hinxton, Cambridgeshire, UK. “This is much more targeted than conventional antibiotic therapy”.

The scientists used a close relative of Escherichia coli, the bacterium that commonly causes food poisoning and gastrointestinal infections in humans, called Citrobacter rodentium, which has exactly the same gastrointestinal effects in mice. They were able to treat the infected mice with a cocktail of phages obtained from the River Cam that target C. rodentium. At present they are optimizing the selection of the viruses by DNA analysis to utilise phage with different profiles.

“Using phages rather than traditional broad-spectrum antibiotics, which essentially try to kill all bacteria they come across, is much better because they do not upset the normal microbial balance in the body”, says Dr Derek Pickard from the Wellcome Trust Sanger Institute. “We all need good bacteria to help us fight off infections, to digest our food and provide us with essential nutrients, and conventional antibiotics can kill these too, while they are fighting the disease-causing bacteria”.

“The more we can develop the treatment and understand the obstacles encountered in using this method to treat gut infections, the more likely we are to maximise its chance of success in the long term”, says Ana Toribio. “We have found that using a variety of phages to treat one disease has many benefits over just using one phage type to attack a dangerous strain of bacteria, overcoming any potential resistance to the phage from bacterial mutations”.
This is an interesting idea, using viruses rather than antibiotics to kill bacteria. I like it because it doesn't harm our better halves. Also, the viruses ought to be able to evolve right along with the bacteria, so that there won't be any more "super bugs".