Tuesday, September 30, 2008

Interesting Articles of the Week

Historian predicts the end of 'science superpowers'.

The perils of petrocracy.

Self-driving Ford Hybrid SUV for sale, only $89k.

Forest fire sensor system runs on electricity generated by trees.

Patent system 'stifling science'.

Read More...

Monday, September 29, 2008

8-Year-Olds Learn Little From Negative Feedback

Eight-year-old children have a radically different learning strategy from twelve-year-olds and adults. Eight-year-olds learn primarily from positive feedback ('Well done!'), whereas negative feedback ('Got it wrong this time') scarcely causes any alarm bells to ring. Twelve-year-olds are better able to process negative feedback, and use it to learn from their mistakes. Adults do the same, but more efficiently.

The switch in learning strategy has been demonstrated in behavioural research, which shows that eight-year-olds respond disproportionately inaccurately to negative feedback. But the switch can also be seen in the brain, as developmental psychologist Dr Eveline Crone and her colleagues from the Leiden Brain and Cognition Lab discovered using fMRI research. The difference can be observed particularly in the areas of the brain responsible for cognitive control. These areas are located in the cerebral cortex.

In children of eight and nine, these areas of the brain react strongly to positive feedback and scarcely respond at all to negative feedback. But in children of 12 and 13, and also in adults, the opposite is the case. Their 'control centres' in the brain are more strongly activated by negative feedback and much less by positive feedback.
via FuturePundit

Read More...

Sunday, September 28, 2008

Why This Mass Extinction is Different

Scientists believe we are in the midst of the planet's sixth major mass extinction. As Wired reports:

Earth may be in the midst of the greatest extinction ever, according to a new mass extinction scoring system.

Their system, published today in the Proceedings of the National Academy of Sciences, attempts to quantify those periods when more than half of all species disappeared. In addition to the current mass extinction, this has happened at least five times: the End Ordovician, Late Devonian, End Permian, End Triassic and End Cretaceous. The latter -- marking the end of the Age of Dinosaurs -- receives the most attention, but scientists have been unable to decide which extinction was most significant.

By multiplying the number of organismal groups that went extinct with the time it took, they arrived at a metric called "greatness." According to this, the dinosaur-ending End Cretaceous event, possibly caused when asteroid strikes or volcanic explosions sheathed the Earth in ash, was twice as great as any previous extinction.

The Permian extinction event, caused 250 million years ago by the formation of the Pangea supercontinent and volcano-induced oceanic poisoning, placed third on the researchers' rankings -- and it still encompassed the loss of 96 percent of Earthly life.

The International Union for the Conservation of Nature estimates that 800 plant and animal species have gone extinct in the last 500 years, with more than 16,000 currently threatened with extinction -- and those lost or threatened organisms come from the mere 41,000 species so far assessed by science. More than a million have been described but remain unstudied.

The most troubling figures, however, come not from the total species lost but the rate at which they're vanishing: 1,000 times faster than usual. But even that alarming rate may be too conservative.
There is a major difference between the previous mass extinctions and this one.

The previous ones were caused by traumatic events such as asteroids colliding with the Earth or major volcanic eruptions. This greatly reduced the amount of life that the Earth could support. Not only did most species go extinct, but also the population size of all species fell dramatically.

In the current mass extinction, the total amount of life that the Earth can support is not declining. One way to measure this is net primary productivity (NPP), which tells how much biomass is being created by plants on land and phytoplankton in the ocean. More NPP means more biomass at the bottom of the food chain. This in turn allows the Earth to support larger populations of animals all the way up the food chain. NASA satellites have recorded a 6% increase in NPP from 1982 to 1999. One study suggests that NPP is likely to increase by 40% over the next 100 years due to global warming.
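Those two growth figures are easier to compare once annualized. Here is a quick back-of-the-envelope sketch (the compounding arithmetic is mine, not the studies'):

```python
# Convert the two NPP increases into implied annual growth rates,
# assuming smooth compound growth (the studies report only the totals).

def annual_rate(total_growth, years):
    """Annualized rate implied by a total fractional increase over `years`."""
    return (1 + total_growth) ** (1 / years) - 1

observed = annual_rate(0.06, 17)    # 6% increase measured from 1982 to 1999
projected = annual_rate(0.40, 100)  # 40% increase projected over a century

print(f"1982-1999 satellite record: {observed:.2%} per year")   # ~0.34%/yr
print(f"Next-century projection:    {projected:.2%} per year")  # ~0.34%/yr
```

Interestingly, the observed trend and the century-long projection both work out to roughly a third of a percent per year.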

In business terms, this current mass extinction could be thought of as a consolidation. There may be fewer species, but there will be larger populations of the ones that remain.

For example, most primate species populations have been dwindling fast and 48% of primate species are threatened with extinction. But, one specific primate species has done so well (better than any large mammal species has ever done, with a population of over 6 billion) that the total population of all primates has never been higher.

Another example is Hawaii, which is a verdant tropical paradise. In terms of the amount of life it supports, it is doing better than any other state in the US. And yet it is also home to nearly half of the 114 species extinctions in the US in the last 20 years and currently has 344 endangered species, more than any other state in the nation. As one scientist writes:
"Not a single plant, none of the lowland birds in Hawaii are native." We are turning the world into "a McDonald's ecosystem," with the same species living roughly the same way everywhere.
I think the phrase "McDonald's ecosystem" is a good way to look at this current mass extinction. While loss of biodiversity is not a good thing, it is a much smaller problem than the Earth losing its ability to support life. While the previous mass extinctions had both species extinctions and smaller populations of life, the current one just has species extinctions.

Read More...

Saturday, September 27, 2008

Efficient, Cheap Solar Cells

A cheap new way to attach mirrors to silicon yields very efficient solar cells that don't cost much to manufacture. The technique could lead to solar panels that produce electricity for the average price of electricity in the United States.

Suniva, a startup based in Atlanta, has made solar cells that convert about 20 percent of the energy in the sunlight that falls on them into electricity. That's up from 17 percent for its previous solar cells and close to the efficiency of the best solar cells on the market. But unlike other high-efficiency silicon solar cells, says Ajeet Rohatgi, the company's founder and chief technology officer, Suniva's are made using low-cost methods. One such method is screen printing, a relatively cheap process much like the silk-screen process used to print T-shirts.

So far, the high cost of solar cells has limited them to a marginal role in power production, accounting for less than 1 percent of electricity worldwide. Rohatgi calculates that the company's low-cost manufacturing techniques will make solar power competitive with conventional sources, producing electricity for about 8 to 10 cents per kilowatt-hour--the average cost of electricity in the United States and far less than prices in many markets.

To be sure, significant work remains before the goal of 8 to 10 cents per kilowatt-hour can be achieved. Suniva has demonstrated the crucial first step, which is to show that it can make solar cells that are more than 20 percent efficient using screen printing. The results have been confirmed by the National Renewable Energy Laboratory, in Golden, CO. But for those tests, Suniva used cells with 200-micrometer-thick silicon wafers, and reaching 8 cents a kilowatt-hour will require 100-micrometer wafers. That this is technically possible has been established.
via Technology Review via FuturePundit
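The wafer-thickness point is worth unpacking: silicon is the dominant material cost of a cell, so halving the thickness roughly halves the silicon needed per watt. A rough sketch of that arithmetic (the standard 156 mm cell size, test-condition insolation and silicon density are generic textbook values I am assuming, not Suniva's figures):

```python
# Rough silicon-per-watt arithmetic for a 20%-efficient cell.
# All input figures are generic textbook values, not Suniva's.

cell_side_m = 0.156                 # standard 156 mm square wafer
area_m2 = cell_side_m ** 2          # ~0.0243 m^2
insolation = 1000                   # W/m^2 under standard test conditions
efficiency = 0.20

watts_per_cell = area_m2 * insolation * efficiency  # ~4.9 W

for thickness_um in (200, 100):
    thickness_cm = thickness_um * 1e-4              # micrometers to centimeters
    volume_cm3 = (area_m2 * 1e4) * thickness_cm     # area in cm^2 times thickness
    grams_si = volume_cm3 * 2.33                    # silicon density, g/cm^3
    print(f"{thickness_um} um wafer: {grams_si / watts_per_cell:.1f} g silicon per watt")
```

Halving the wafer thickness cuts the silicon per watt from roughly 2.3 g to 1.2 g, which is why the thinner wafers matter so much for the cost target.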

Read More...

Green Ghostbusters

Wired reports:

San Francisco-based startup Sustainable Spaces estimates that homeowners could reduce their energy costs by up to 50 percent by taking simple steps like fixing bad duct work and installing the right insulation. And since 21 percent of U.S. energy consumption goes into American homes, greater efficiency could help reduce the country's overall energy footprint, too.

The trick is finding the hidden leaks. The company's founders call themselves "green ghostbusters" because they're tracking down ghost power drains using a panoply of high-tech gadgets that would make the Stay Puft Marshmallow Man jiggle with fear.
Neat company. They have lots of cool gizmos for diagnosing problems. I like how they make money by reducing their customers' energy costs.

While their founders call themselves "green ghostbusters", I think they could really play up the Ghostbusters side even more and give the company more personality, à la Geek Squad. Imagine if they had uniforms like the old Ghostbusters. Instead of a hearse, what about a PT Cruiser, possibly retrofitted as an electric vehicle? They could give all of their tools crazy sounding names. The report given to customers on the energy efficiency of the house could use Dan Aykroyd-esque names for all the problems like a "class 5 full roaming vapor".

And then they could run ads Vern Fonk style, mimicking the classic Ghostbusters one:
Are you troubled by strange rustling noises in the middle of the night?

Do you experience feelings of dread when you must go to your basement or attic?

Have you or anyone in your family been impacted by duct leakage, flue blockage, or mold spores?

If the answer is yes then don't wait another minute, pick up your phone and call the professionals.

Green Ghostbusters!

Our courteous and efficient staff is available at regular business hours to serve all your ghost power elimination needs.

We are ready to green you!
Now that would be some real Green Ghostbusters.

Read More...

Friday, September 26, 2008

Macaque Monkeys

via National Geographic

Read More...

Interesting Articles of the Week

Nathan Myhrvold’s patent extortion fund is reaping hundreds of millions of dollars.

A new bank to save our infrastructure.

Could the termite’s stomach help us solve global warming?

4 ways the Fed could oversee the bailout.

Green idealists most likely to take long-haul flights.

Read More...

Research on Batteries for Transportation

The main impediment to electric vehicle adoption is battery technology. Here is a look at some research currently going on.

GM:

GM, said Dr. Abbas, is looking for about a three-fold increase in anode and cathode capacity over that provided by the conventional combination of carbonaceous anodes and layered oxide cathodes.

Two promising approaches GM Research is exploring on the anode side to reach this target are the use of silicon-coated carbon nanofiber; and the use of metal hydrides. On the cathode side, Abbas said that GM had developed a material with capacity close to the 3X target, but that he could not discuss that yet.
Toyota:
Watanabe was referring to what’s known as a metal-air battery, according to Toyota Executive Vice-President Masatami Takimoto. In this type of battery, electricity is generated by a reaction between oxygen in the air and a metal like zinc at the negative electrode. The battery does not require the use of a combustible liquid electrolyte, so there is no danger of ignition as is the case with lithium-ion batteries. Moreover, an air battery has over five times the energy-storage capacity of a similarly-sized lithium-ion battery...It may take some time before air batteries reach the practical stage, but Toyota believes that they will ultimately become the next-generation battery technology of choice.
I don't know much about metal-air batteries, but this source says that they are cheap and store a lot of energy, though they are difficult to recharge. The idea might then be that you don't recharge your battery at home, but rather swap it for a new one at your local gasoline station.
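To get a feel for what "over five times the energy-storage capacity" could mean in practice, here is a toy range comparison (the pack size and vehicle efficiency are my illustrative assumptions, not figures from Toyota):

```python
# Toy range comparison for a same-sized pack with 5x the energy storage.
# Pack size and efficiency are illustrative assumptions, not Toyota figures.

pack_kwh = 24              # assumed lithium-ion pack size
miles_per_kwh = 4          # assumed EV efficiency

li_ion_range = pack_kwh * miles_per_kwh
metal_air_range = (pack_kwh * 5) * miles_per_kwh

print(f"Lithium-ion pack: ~{li_ion_range} miles")     # ~96 miles
print(f"Metal-air pack:   ~{metal_air_range} miles")  # ~480 miles
```

A swap station handing out batteries in the several-hundred-mile range starts to look a lot like a gasoline station.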

China:
Dr. Jiqiang Wang of the Tianjin Institute of Power Sources (TIPS) provided an overview of the government-supported R&D projects for lithium-ion batteries for transportation, which are now focusing on two primary cathode materials: manganese spinel (LiMn2O4) and iron phosphate (LiFePO4).

Looking ahead for the next two or three years, said Dr. Wang, the government-supported 863 project will continue to support R&D on LiMn2O4, but will also start to support R&D on LiFePO4.
I have no idea which technique will end up being the best, but the more research corporations and countries do on batteries, the sooner electric vehicles will overtake internal combustion engine vehicles.

Read More...

Carbon Footprint Size Varies

A recent University of Washington study found that when the same values were used with 10 different online calculators, the results varied greatly. In one category, the bottom line for a typical American homeowner varied by more than 32,800 pounds of carbon produced per year.

One reason for the wide range is that emissions from air travel, the couple's largest carbon producer, are often calculated very differently, said Clark Williams-Derry, research director at Sightline Institute, a nonprofit research center that studies carbon calculators.

Another part of the variation stems from the fact that different calculators include different behaviors in their calculations. For example, some ask for the amount of garbage a household produces each year, while others use the national average (1,606 pounds), and others don't include garbage at all.

Another reason for the differences is that most calculators use different internal numbers as conversion factors and standards to come up with their results, Steinemann said.

"There are so many different ways to calculate, using different variables, different standards and different assumptions," she said. "There's no one absolute right best number, so each calculator seems to use something different."

Web sites offering the calculators also often don't let the user see what those numbers are.

"The other problem is that individual calculators don't tell you the assumptions behind their calculations," Steinemann said. "Even if there isn't one standard calculator, they should at least be able to be transparent, so people know what's being included, what isn't being included, and what's not being calculated."
Given how complex it is to calculate a footprint and all the assumptions involved, I am not surprised that there is so much difference. I think the calculators do need to be more transparent and let you know exactly how they calculate their value and what assumptions were made.
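To make the problem concrete, here is a toy pair of "calculators" fed identical inputs. The emission factors and the decision to include or skip garbage are made-up illustrative choices, not the study's numbers, but they show how the bottom lines diverge:

```python
# Two toy carbon calculators given the same household data disagree because
# they use different conversion factors and include different behaviors.
# All factors below are made-up for illustration.

household = {"electricity_kwh": 11000, "air_miles": 8000, "garbage_lbs": 1606}

def calculator_a(h):
    # Aggressive air-travel factor, and garbage is included.
    return h["electricity_kwh"] * 1.3 + h["air_miles"] * 0.9 + h["garbage_lbs"] * 0.8

def calculator_b(h):
    # Gentler factors, and garbage is ignored entirely.
    return h["electricity_kwh"] * 1.1 + h["air_miles"] * 0.5

a, b = calculator_a(household), calculator_b(household)
print(f"Calculator A: {a:,.0f} lbs CO2 per year")
print(f"Calculator B: {b:,.0f} lbs CO2 per year")
print(f"Same household, {a - b:,.0f} lbs apart")
```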

Links to 10 carbon calculators in the article.

via Seattle PI

Read More...

Thursday, September 25, 2008

Markets and Fishing

There is a notion held by some on the right that if government would just get out of the way, then markets would function properly. That well-functioning markets are the natural order, and all government can do is mess them up with regulations. I disagree with this notion.

I agree with Tom Friedman who writes in his new book:

No, markets are like gardens. You have to intelligently design and fertilize them—with the right taxes, regulations, incentives, and disincentives—so they yield the good, healthy crops necessary for you to thrive.
I define a well-functioning market as one where the invisible hand is in effect and an individual pursuing his own self-interest also promotes the good of his community as a whole. While government over-regulation can distort markets, under-regulation can also cause markets to malfunction. The debate should not be about over-regulation versus under-regulation, but about whether the regulations in place lead to a well-functioning market.

Case in point: how to regulate the fishing market so that fishermen maximize their own profits when they are also fishing safely and sustainably.

One way to regulate fishing is for government to take a hands-off approach and allow fishermen to take as much as they can. This leads to a tragedy of the commons scenario where fishermen invest in larger boats that can catch as many fish as possible without regard for the future. The result is overfishing and collapse, as seen with the cod industry on the Grand Banks.

A second way is for the government to set catch limits and then close the fishing season as soon as those limits are hit. While this is better for sustainability, it makes fishing more dangerous as fishermen try to catch as many fish as they can in as few days as possible. There is also a big incentive for fishermen to try to catch more than they are allowed.

A third way is for the government to set catch limits and privatize these rights (known as Individual Transferable Quotas (ITQs) or catch shares). Fishermen then own the right to a certain amount of the catch. A TED grant to the Environmental Defense Fund's Oceans Program allowed them to explore the impact of catch shares on the fishing market. The results of their study were recently published in Science. The Economist looked at this study in two articles.

What was the impact of ITQs on sustainability and overfishing?
The overall finding was that fisheries that were managed with ITQs were half as likely to collapse as those that were not.

By giving fishermen a long-term interest in the health of the fishery, ITQs have transformed fishermen from rapacious predators into stewards and policemen of the resource. The tragedy of the commons is resolved when individuals own a defined (and guaranteed) share of a resource, a share that they can trade. This means that they can increase the amount of fish they catch not by using brute strength and fishing effort, but by buying additional shares or improving the fishery’s health and hence increasing its overall size.

In a report on this fishery, Dan Flavey, a fisherman himself, says some of his colleagues have even pushed for the quota to be reduced by 40%. “Most fishermen will now support cuts in quota because they feel guaranteed that in the future, when the stocks recover, they would be the ones to benefit,” he says.

Where mariners’ only thought was once to catch fish before the next man, they now want to catch fewer fish than they are allowed to—because conservation increases the value of the fishery and their share in it. The combined value of their quota has increased by 67%, to $492m.
By owning the right to the catch share, the fishermen have a financial incentive to see fish stocks increase, which aligns their interest with that of society as a whole. There is a notion held by some on the left that privatization of natural resources always leads to over exploitation. This study shows that privatization can actually lead to better sustainability.

What is the impact of ITQs on the fishing season, safety, the number of boats in the water and the profitability of the boats?
After a decade of using ITQs in the halibut fishery, the average fishing season now lasts for eight months. The number of search-and-rescue missions that are launched is down by more than 70% and deaths by 15%. And fish can be sold at the most lucrative time of year—and fresh, so that they fetch a better price.

With less gear in the water and less competition at specific times, individual boat yields rose by 75%.
ITQs allow for a longer season with fewer boats, more full-time work and better profitability.

How are ITQs originally allocated?
In theory, for instance, you should allocate shares through auctions. But if fishermen do not agree to a new system, it will not work. So fishermen are typically just given their shares—which can lead to bitter, politicised arguments. In Australia, a pioneer in ITQs, a breakthrough came when independent allocation panels were set up to advise the fishing agencies, chaired by retired judges advised by fishing experts.
If fishermen are just given their shares, this is a government handout to the fishermen. But I would rather a few fishermen got unfairly rich than have a fishery collapse due to overfishing. Likewise, some might see it as unfair that those who own the right can get rich as the value of the ITQ increases due to better management of the fishery (or that they can profit without even fishing, as an Alaskan crab ITQ owner can earn $243,600), but better that than the alternative.

As this example in fishing shows, it is the details of the regulations that determine how well the market functions. In this case, the fishing market functions best when the government takes the role of creating a catch share property right, allocating that right, determining the catch limit, and enforcing that limit (although other fishermen will now have an incentive to try to catch cheaters). If the government would "just get out of the way", the result would not be a better functioning market but rather a worse one. While over-regulation can kill what makes markets valuable, so too can under-regulation and wrong regulation.
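To see how differently the first and third regimes play out, here is a toy simulation of a fishery (the logistic growth model and every parameter are my illustrative assumptions, not numbers from the Science study):

```python
# Toy fishery: logistic stock growth under open access vs. an ITQ-style cap.
# Model form and all parameter values are illustrative assumptions only.

def simulate(years, harvest_rule, stock=1000.0, r=0.3, K=1000.0):
    """Return the stock after `years` of growth and harvesting."""
    for _ in range(years):
        stock += r * stock * (1 - stock / K)  # natural logistic growth
        stock -= harvest_rule(stock)          # that year's catch
        stock = max(stock, 0.0)
    return stock

# Open access: ever-bigger boats take 40% of whatever is left each year.
open_access = simulate(30, lambda s: 0.40 * s)

# ITQ: total catch capped near maximum sustainable yield (r*K/4 = 75 here).
itq = simulate(30, lambda s: min(75.0, 0.40 * s))

print(f"Stock after 30 years, open access: {open_access:.0f}")  # collapses toward 0
print(f"Stock after 30 years, ITQ cap:     {itq:.0f}")          # stays healthy
```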

Read More...

The Economist in Kindle Format

As loyal Fat Knowledge readers are aware from my posts, I love me my The Economist. A few weeks back, for reasons unbeknownst to me, the post office decided not to deliver my weekly edition. Then, also unbeknownst to me, that copy was returned to The Economist and they decided to put my account on hold. After missing a 3rd week, I contacted The Economist and all the unbeknownsts became knownst to me. But, I was told that it would take another 2 weeks before I would get my next one in the mail.

Unable to live without my The Economist for multiple weeks, I created a way to download the contents of the weekly issue off of their website and then put it into a file that can be read on a Kindle. First, I created this .php file which downloads the entire content of the weekly edition from The Economist's website to a directory of files on my machine. Next, I used Mobipocket Creator to convert that directory into a Kindle friendly file. One advantage of this digital version is that it is available on Thursday, a day or two sooner than the physical version arrives in the mail. I also prefer the way I setup the table of contents to what I have seen on other Kindle newspapers and magazines.

Thinking that some others might also like to read The Economist on their Kindles as well, I share the results of this experiment with you: today's new 8/27/08 edition of The Economist in Kindle format. Just download the file and either email to your Kindle, or copy it over using the USB cable.

Truth be told though, as big of a Kindle and e-book fan as I am, I still prefer the dead tree edition of the magazine. Magazines have color and are easy to skim, neither of which can be said for e-books. Magazines are great for reading on the go, easy to dog-ear pages to read, and worry free if you spill your coffee on them. Unlike books, which are heavy and don't stay open on their own, magazines are lightweight and stay put. Unlike newspapers, which are unwieldy to read on a bus, require a ton of paper and energy to print, and leak ink on your hands, magazines come in a convenient size that doesn't waste nearly as much paper and leaves your hands ink free. While I am a full convert of e-books over paper books and newspapers, the magazine is still my one dead tree holdout.

Update: Commenter Jon gives instructions on how to set this up on a Windows machine to get the latest Economist each week. If you are fairly computer literate and are willing to spend a little time, I think you will be able to get it to work (see also Update 3 for another solution).

1. Download PHP. Visit http://us3.php.net/get/php-5.2.8-win32-installer.msi/from/a/mirror, click the yellow highlighted link near the top, and run the file whose download should then start.

2. When asked what Web Server you would like to set up choose "Do not setup a web server" (the last option). Aside from this, just do the usual hitting next and install without changing anything. You'll get what you need.

3. Click on this link and save the file.

4. Navigate to where you saved the file, right click on it, and hit "Open with > Choose Program" then in that hit browse and navigate to C:\Program Files\PHP\php.exe and select this. Then hit the checkbox next to "Always use the selected program..." and hit OK.

5. Double click the file. A command prompt window should come up. Wait until it closes. Do not do anything to help it along, it will finish, but it might take 5 minutes or longer to download all the pages from the web.

6. Download Mobipocket Creator and install. Don't change anything during install. After installation, open Mobipocket Creator. Then hit open at the top and navigate to the folder C:\Economist_[date], open this folder and open the file with a .opf extension. Then hit build at the top. Whew, that's it, there should now be a .prc file in that folder that you can transfer to and read off your Kindle. Ignore build errors in making the .prc, they shouldn't matter.

7. Email the Economist_[date].prc file to your Kindle, or transfer it via the USB cable.
Customization: This requires touching the code, which could be a bit scary for the less computer literate. Open the economist.php file that you saved in step 3 with Wordpad. At the top there is a section called "Variables". Four customizations are possible:

A) To save files to a different directory, change $base_dir to another location like "C:\\Kindle Files\\".
B) Enter your username and password to access the site at $GLOBALS['loginEmail'] and $GLOBALS['loginPass'].
C) To create a version without images (much smaller file) set $with_images = false;
D) Get a back issue by setting $get_back_issue = true; and then setting the $back_date and $back_date2 variables to the date of the issue you want.

Update 2: The Economist website changed their formatting of URLs on 2/19/09. I have updated the .php file to fix this. Just redownload and overwrite the old version (step 3 above).

Update 3: This MobileRead thread alerts me to the existence of Calibre. This open source software will also allow you to download the latest Economist (as well as many other magazines and newspapers) in Kindle format. It has a graphical user interface, so it is much easier to use. Just make sure to set the output type to MOBI. Personally I like my format and Table of Contents better, but then again I should as I designed it for an audience of one (though this guy gives my formatting props as well). :)

Update 4: The Economist has stopped free access to the weekly edition. You now need a username/password to access it. Fortunately Nic has updated his script to allow you to enter that information. I have updated my script as well with this new code. Thanks Nic!

Read More...

A Neural Link Between Intelligence and Self-Control

If you had a choice between receiving $1,000 right now or $4,000 ten years from now, which would you pick? Psychologists use the term “delay discounting” to describe our inability to resist the temptation of a smaller immediate reward in lieu of receiving a larger reward at a later date. Discounting future rewards too much is a form of impulsivity, and an important way in which we can neglect to exert self-control.
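For a sense of the arithmetic behind that opening question, here is the break-even discount rate (my calculation, not part of the study):

```python
# At what annual discount rate is $4,000 in ten years worth exactly $1,000 today?
# Solve 4000 / (1 + r)**10 == 1000 for r.

r = 4 ** (1 / 10) - 1
print(f"Break-even discount rate: {r:.1%} per year")  # ~14.9%

# Taking the $1,000 now implies discounting the future more steeply than ~15%/yr.
```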

"It has been known for some time that intelligence and self-control are related, but we didn't know why. Our study implicates the function of a specific brain structure, the anterior prefrontal cortex, which is one of the last brain structures to fully mature,” said Dr. Shamosh.

In this study, 103 healthy adults were presented with a delay discounting task to assess self-control: a series of hypothetical choices where they had to choose between two financial rewards, a smaller one which they would receive immediately or another, larger reward which would be received at a later time. The participants then underwent a variety of tests of intelligence and short-term memory. On another day, subjects’ brain activity was measured using fMRI, while they performed additional short-term memory tasks.

The results show that participants with the greatest activation in the brain region known as the anterior prefrontal cortex also scored the highest on intelligence tests and exhibited the best self-control during the financial reward test. This was the only brain region to show this relation. The results appear in the September issue of Psychological Science, a journal of the Association for Psychological Science.

Previous studies have shown that the anterior prefrontal cortex plays a role in integrating a variety of information. The authors suggest that greater activity in the anterior prefrontal cortex helps people not only to manage complex problems, resulting in higher intelligence, but also aids in dealing with simultaneous goals, leading to better self-control.
via Psychological Science

Read More...

Wednesday, September 24, 2008

Project 10^100

Google launches Project 10^100.

Never in history have so many people had so much information, so many tools at their disposal, so many ways of making good ideas come to life. Yet at the same time, so many people, of all walks of life, could use so much help, in both little ways and big.

In the midst of this, new studies are reinforcing the simple wisdom that beyond a certain very basic level of material wealth, the only thing that increases individual happiness over time is helping other people.

In other words, helping helps everybody, helper and helped alike.

The question is: what would help? And help most?

At Google, we don't believe we have the answers, but we do believe the answers are out there.

We're committing $10 million to implement these projects, and our goal is to help as many people as possible. So remember, money may provide a jumpstart, but the idea is the thing.
Deliciously Googlesque. I am curious to see how it turns out. Heck, I might have to dig into the Fat Knowledge treasure trove of good ideas and submit one myself.

Read More...

Open-Science

The Boston Globe writes about the open-science movement.

Openness has always been an integral part of science, with scientists presenting findings in journals or at conferences. But the open-science movement, with many of its leaders in the Boston area, encourages scientists to share techniques and even their work long before they are ready to present results, when they are devising research questions, running experiments, and analyzing data. In such open forums, the wisdom of the crowd could offer the ultimate form of peer review. And scientific information, they say, should be available without the hefty subscription fees charged by most journals.

It is an attempt to bring the kind of revolutionary and disruptive change to the laboratory that the Internet has already wrought on the music and print media industries. The idea is that opening up science could speed discoveries, increase collaboration, and transform the field in unforeseen ways.
It is kind of sad that scientific research isn't very accessible on the internet right now, and that the scientific community hasn't embraced internet tools and techniques to make science go faster. But, there are a few examples of where this is occurring now.
For example, OpenWetWare.org started out in 2005 as Endipedia, a website that scientists in Drew Endy and Tom Knight's labs at MIT used to share information. But today the website is backed by a National Science Foundation grant, and more than 4,000 biologists and bioengineers from across the world have signed up to share techniques, get practical tips, and even detail their day-to-day work if they choose.
The Economist mentions Research Blogging and Nature Network as two sites where scientists are blogging and discussing peer-reviewed science.

Besides just allowing for new ways of distributing scientific information, open-science allows for new ways for scientific research and analysis to be conducted. Bruce G. Charlton writes about amateur internet researchers who participate by publishing on blogs and posting their data online:
At the other extreme ‘quant bloggers’ are publishing real science with their personal identity shielded by pseudonyms and writing from internet addresses that give no indication of their location or professional affiliation. Yet the paradox is that while named high status scientists from famous institutions are operating with suspect integrity (e.g. covertly acting as figureheads) and minimal accountability (i.e. failing to respond to substantive criticism); pseudonymous bloggers – of mostly unknown identity, unknown education or training, and unknown address – are publishing interesting work and interacting with their critics on the internet. And at the same time as ‘official’ and professional science is increasingly timid careerist and dull; the self-organized, amateur realm of science blogs displays curiosity, scientific motivation, accountability, responsibility – and often considerable flair and skill. Quant bloggers and other internet scientists are, however, usually dependent on professional scientists to generate databases.
Beyond having articles available on the internet, open-science is also about making data available online so other researchers can easily access it. Having this data openly available is what makes these amateur researchers possible.

Journals currently hold the role of determining what the best scientific research is, but this could be replaced by a scientific Digg site, where researchers could digg the research articles they find most interesting. Those with the most diggs would show up on the "front page", which would be equivalent to being published in the best journals. Instead of a select few on the journal selection committee determining which are the best papers, the scientific community as a whole would get to decide. A scientific Digg site could also incorporate Digg's comment section, allowing scientists to give feedback on articles immediately and making the back and forth between researchers open to all.
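The core mechanics of such a site would be simple. As a sketch, here is a toy front-page ranking that rewards recent, heavily dugg papers; the time-decay scoring rule is borrowed from link-aggregator sites purely as an illustration, not Digg's actual algorithm:

```python
# Toy "scientific Digg" front page: rank papers by diggs with time decay,
# so older papers need proportionally more votes to stay visible.

papers = [
    {"title": "Paper A", "diggs": 120, "hours_old": 40},
    {"title": "Paper B", "diggs": 45,  "hours_old": 5},
    {"title": "Paper C", "diggs": 300, "hours_old": 200},
]

def score(paper, gravity=1.8):
    # Higher gravity pushes old papers off the front page faster.
    return paper["diggs"] / (paper["hours_old"] + 2) ** gravity

for paper in sorted(papers, key=score, reverse=True):
    print(f"{paper['title']}: score {score(paper):.3f}")
```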

Given the advantages to open-science, what is the holdup?
More broadly, the entire system of credit in science is based on being the first to publish a finding in a reputable journal; there's no incentive to post on blogs or community websites. Scientists try to get their findings published in the top journals in their fields, and major scientific prizes are awarded to those who make breakthroughs.
The way scientists are rewarded is going to need to change for open-science to take hold. But there is no reason for that not to happen. Rewards, such as tenure, should be based on who is making the biggest contribution to science, and there is no reason not to add blog posts and other online contributions into the calculation. You should get credit not just for publishing in journals but also for making information and data publicly available fast.

Another way to make the change is for those giving the grant money to require researchers to share.
The Gates Foundation, for example, is now offering millions for malaria research, and it's contingent on the researchers making it available to share. Sharing maximizes the return on investment in early-stage research.
The open-science movement will make science more accessible and more integrated with the internet, allow new ways for science to be conducted, and speed up the rate of discovery. Hopefully the scientific community will embrace it quickly.

Read More...

Ninja Cat Comes Closer While Not Moving

via YouTube via Digg

Read More...

Green Windows

Two new technologies for making windows more energy efficient.

First, electrochromic glass:

Electrochromic glass, with changeable opacity, is one new avenue of exploration. Some of the most promising is produced by a firm called Sage Electrochromics, based in Minnesota. Its product, which consists of sheets of glass with a metal-oxide coating, was first used in skylights in 2003. When a voltage is applied across the coating, the window darkens, allowing less light to enter but still permitting a view. America’s Department of Energy (which developed low-e coatings in the 1970s) has used this glass, along with an insulated sash, to develop a “zero-energy” window that saves more energy in reduced heating and lighting than it needs to operate.

John Van Dine, Sage’s founder, says his company is about to invest in a new production facility that should reduce costs enough to make electrochromic windows competitive for homes by 2010. The company is also working on continuously variable darkening (current models have only two settings) and windows that are powered entirely by self-contained solar cells.
It would be cool to be able to get rid of the drapes and replace them with glass that can block out the sun.

Second, vacuum-insulated panes:
An even bigger leap may come from refining an older idea. Sheets of glass separated by a vacuum could bring windows’ insulating properties up to par with insulated walls, yet allow them to be nearly as thin as single panes of glass. The idea of vacuum-insulated panes has been around for nearly a century, and NSG/Pilkington, a Japanese firm that is one of the biggest glassmakers in the world, has sold such panes since 1996. But they remain a technical challenge: the difference in temperature causes the inner and outer sheets of glass to expand by different amounts, so that NSG/Pilkington’s windows can be used only where the temperature difference is less than 35°C, which rules out many homes in need of insulation.

David Stark thinks he has a solution. His company, EverSealed Windows, based in Colorado, has patented a metal baffle bonded to both sheets of glass that allows them to expand and contract separately, while maintaining a vacuum that he says will last for decades. Mr Stark says vacuum-insulated windows at competitive prices could be on the market in three years. This could precipitate a big shift in the window industry, and its focus on spacers, adhesives and sealants: “All of that goes out the window,” he says.
According to their website, 12% of all energy in the US is lost out of windows. If they could create a window that insulates as well as a wall at a competitive price, that would be a huge energy savings.
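A rough heat-loss comparison shows what is at stake. Heat flow through a surface is Q = U × A × ΔT, and (using typical published U-values that I am assuming here, not numbers from the article) the gap between a single pane and a wall is enormous:

```python
# Heat loss per square meter at a 20°C indoor/outdoor difference: Q = U * A * dT.
# U-values are typical published figures (W/m^2*K), assumed for illustration.

u_values = {
    "single pane":        5.8,
    "double pane, low-e": 1.8,
    "vacuum-insulated":   0.5,   # roughly "on par with insulated walls"
    "insulated wall":     0.3,
}

area_m2, delta_t = 1.0, 20.0
for name, u in u_values.items():
    print(f"{name:20s} {u * area_m2 * delta_t:5.0f} W lost per m^2")
```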

via The Economist

Read More...

Carsharing 2.0

What would a commuter carpooling service that actually tapped into the real-time transparency and flexibility of the Internet look like? Well, a lot like high-tech hitch-hiking, and possibly a lot more popular and effective at getting single occupancy vehicles out of morning traffic. At least that’s the idea behind Avego (Update: formerly called Sharelift according to the DEMO info), a service officially being launched at the DEMO convention in San Diego on Monday.

The service, developed by Cork, Ireland-based company Mapflow, uses a cell phone or connected-gadget placed in commuter vehicles to pull satellite navigation info and car info via a wireless connection to develop a next-generation public transportation system.

Update: Mapflow executive chairman Sean O’Sullivan tells us that the company will be focusing on its iPhone app for the DEMO launch. Mapflow calls it “shared transport,” but to us it looks a lot like carpooling brought into the always-on Internet age. The service will largely target commuters, but also could be used for taxi and other public transportation systems.

The passengers will pay on a per mile basis, a fraction of the IRS rate per mile (IRS rate is 58.5 cents per mile, the service would cost about 30 cents per mile, or 5 to 6 times cheaper than a taxi). Mapflow directs the vast bulk of the revenue directly to the driver (85 percent of the fare)… the remainder goes to cover OTA data communication charges, SMS charges, IT infrastructure, finance charges (Visa/MC/Paypal), and our operating, marketing & R&D expenses.
Video description of the service here.

Cool concept to use an iPhone to turn your car into an on-demand taxi or to be able to get a ride from another person for much less than the cost of a taxi ride.
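Running the quoted pricing for a hypothetical 10-mile shared ride (my example, not the company's):

```python
# Fare math from the quoted figures for a hypothetical 10-mile ride.
miles = 10
fare_per_mile = 0.30   # quoted rider price
driver_share = 0.85    # quoted driver cut
irs_rate = 0.585       # quoted 2008 IRS standard mileage rate

fare = miles * fare_per_mile
print(f"Rider pays:          ${fare:.2f}")                 # $3.00
print(f"Driver keeps:        ${fare * driver_share:.2f}")  # $2.55
print(f"IRS cost of driving: ${miles * irs_rate:.2f}")     # $5.85
```

A couple of dollars for a 10-mile detour helps explain my skepticism below about the payments motivating drivers.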

Unfortunately, I think they have a chicken and egg problem here. As a rider, I would only use it if I had a very good chance of being able to get a ride within 15 minutes of when I wanted to leave and would be dropped off right where I wanted to go. Likewise, as a driver, I wouldn't want to go out of my way that much to pick someone up or drop them off. Unless a huge percentage of the population is using this service all the time, I just don't see how those two requirements would match very often.

I also wonder if this service would work better as a free service, where drivers give rides to others solely because it makes them feel good to help others out, and that they would expect someone would give them a free ride when they needed one. I am doubtful that the payments will be large enough to convince people to go out of their way to pick others up if they value their time at more than $5/hr.

I am not sure if this particular service will succeed, but the concept of combining GPS cellphones with car sharing and other forms of transportation seems like a winner.

via Earth2Tech

Update: Looks like this is actually a crowded field, as I just found two other carsharing applications out there.

First, from TreeHugger:
Carticipate is the first location-based mobile social network application for ride sharing, ride combining, and car pooling on a mobile platform -- specifically, the iPhone. The download is free and allows users to indicate where they are going and when, broadcast this information, and allow others in the same area (with iPhones) going in the same direction to find each other.
Second, from Earth2Tech:
Ecorio also provides alternatives to driving, using Google Transit to recommend public transportation and even integrates with Zimride, the large Facebook carpooling program, to locate carpools or allow a user to offer up a new carpool.

Read More...

Tuesday, September 23, 2008

Don’t Buy That Textbook, Download It Free

For my dream of one e-book per student to come true, cheap (or free) digital textbooks must become widely available.

The NY Times profiles one author who has bought into distributing a digital version of his textbook for free.

In protest of what he says are textbooks’ intolerably high prices — and the dumbing down of their content to appeal to the widest possible market — Professor McAfee has put his introductory economics textbook online free. He says he most likely could have earned a $100,000 advance on the book had he gone the traditional publishing route, and it would have had a list price approaching $200.

While still on the periphery of the academic world, his volume, “Introduction to Economic Analysis,” is being used at some colleges, including Harvard and Claremont-McKenna, a private liberal arts college in Claremont, Calif.

For the textbook makers, however, it is a different story. Professor McAfee allows anyone to download a Word file or PDF of his book, while also taking advantage of the growing marketplace for print on demand.

In true economist fashion, he has allowed two companies, Lulu and Flat World Knowledge, to sell print versions of his textbook, with Lulu charging $11 and Flat World anywhere from $19.95 to $59.95. As he said on his Web site, he is keeping the multiple options to “further constrain their ability to engage in monopoly pricing.”
They also mention a couple of other sites for finding free textbooks:
A broader effort to publish free textbooks is called Connexions, which was the brainchild of Richard G. Baraniuk, an engineering professor at Rice University, which has received $6 million from the William and Flora Hewlett Foundation. In addition to being a repository for textbooks covering a wide range of subjects and educational levels, its ethic is taken from the digital music world, he said — rip, burn and mash.

Unlike other projects that share course materials, notably OpenCourseWare at M.I.T., Connexions uses broader Creative Commons license allowing students and teachers to rewrite and edit material as long as the originator is credited. Teachers put up material, called “modules,” and then mix and match their work with others’ to create a collection of material for students. “We are changing textbook publishing from a pipeline to an ecosystem,” he said.
The textbook publishers are also making some of their books available digitally, but they are making them quite expensive.
While these open-source projects slowly grow, the textbook publishers have entered the online publishing field with CourseSmart, a service owned by five publishers. In service for only a year, CourseSmart allows students to subscribe to a textbook and read it online, with the option of highlighting and printing out portions of it at a time.

The price is generally half of what a print book costs, a sum that can still appear staggering — an introductory economics textbook costs around $90 online. (This semester, a student has the option of downloading a book as well — but it is an either-or choice: read online or download to a computer.)
Wired finds a business that gives the book away for free but sells "accessories":
Flat World's business plan aims to exploit the inefficiencies: Its books are online and free. Instead of charging for content it aims to make money by wrapping content up in "convenient" downloadable and print wrappers and selling those, along with study aides and related items.

Enhancing the value of the online versions is the open source component. Students can annotate and comment in the digital margins of Flat World's texts to share their insights, analysis and conclusions with other students.
And if companies don't supply digital textbooks, the NY Times finds that students (shockingly!) just pirate them.
After scanning his textbooks and making them available to anyone to download free, a contributor at the file-sharing site PirateBay.org composed a colorful message for “all publishers” of college textbooks, warning them that “myself and all other students are tired of getting” ripped off.

Consider the cost of a legitimate copy of one of the textbooks listed at the Pirate Bay, John E. McMurry’s “Organic Chemistry.” A new copy has a list price of $209.95; discounted, it’s about $150; used copies run $110 and up. To many students, those prices are outrageous, set by profit-engorged corporations (and assisted by callous professors, who choose which texts are required). Helping themselves to gratis pirated copies may seem natural, especially when hard drives are loaded with lots of other products picked up free.
Update: Is Pirate Bay co-founder plotting e-book plunder?

Read More...

Home, Green Home

A green-home boom is getting under way, thanks to rising energy prices, new standards (the European Union’s Energy Performance of Buildings Directive, Britain’s Code for Sustainable Homes and California’s Green Building Standards Code, to name three recent examples), and improved technologies. Many of these technologies have been around for a while, but they are now ready for the mainstream. In 2007 McGraw Hill Construction, a research firm, reported that 40% of all renovation in America included some green features, mostly windows and heating/cooling systems. The company predicts green homes will account for 10% of all building starts in America by 2010, with new green homes worth $20 billion in that year. When housing arises from its torpor, it could find itself transformed.

Greenery can be hard to define, so the emergence of credible certifiers is clarifying things. The most stringent form of certification is the German Passivhaus standard, which applies to buildings that reduce their energy requirements so dramatically—by 90% compared with standard construction—that they can forgo heating and cooling systems. In 2007 the United States Green Building Council released a version of its LEED green-building standard that applies to homes. (Mr Rogers’s home is certified as LEED Platinum, the highest standard available.) And for more than a decade, America’s Environmental Protection Agency has offered Energy Star for Homes, a label indicating specific features meant to reduce energy use. Sam Rashkin, the programme’s director, expects nearly 1m American homes to carry the label by the end of this year.

Market forces are at play, too. Mr Rogers says his 2,100-square-foot house—just under the average for new homes in America—cost $75,000, or 23%, more to build impeccably green than it would have otherwise. He received $35,000 in rebates and incentives from state and federal governments, and he expects to recoup the rest in avoided energy costs within five years. Given the recent spike in energy prices, it may not take that long.
If the additional cost of going green is repaid in 5 years, I don't know why anyone wouldn't do it. If you get a 30 year mortgage, any green features that have a payback period of under 30 years should make economic sense (you save more in energy costs each month than the additional amount you pay on your mortgage).
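A quick sketch of that mortgage argument using Mr Rogers's numbers; the article gives the premium, the rebates and the five-year payback, while the 6% mortgage rate is my assumption:

```python
# Does rolling the green premium into a mortgage pay for itself?
# $75,000 premium minus $35,000 in rebates leaves $40,000 financed.
# The 6% rate is an assumption; the other figures come from the article.

principal, annual_rate, years = 40_000, 0.06, 30
r, n = annual_rate / 12, years * 12
extra_payment = principal * r / (1 - (1 + r) ** -n)  # standard annuity formula

# A five-year payback on $40,000 implies roughly $8,000/year in energy savings.
monthly_savings = 40_000 / (5 * 12)

print(f"Extra mortgage payment: ${extra_payment:.0f}/month")   # ~$240
print(f"Energy savings:         ${monthly_savings:.0f}/month") # ~$667
```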

I wonder whether, if the government mandated that home sellers specify the expected energy costs to heat and cool a home, just as car sellers must specify the MPG of their cars, people would become aware of the cost difference between homes and see the financial benefit of going green.

Lots of interesting new green home technologies discussed in the article.

via The Economist

Read More...

Work Walking

Terri Krivosha, a partner at a Minneapolis law firm, logs three miles each workday on a treadmill without leaving her desk. She finds it easier to exercise while she types than to attend aerobics classes at the crack of dawn.

Without breaking a sweat, the so-called work-walker can burn an estimated 100 to 130 calories an hour at speeds slower than two miles an hour, Mayo research shows.
I liked this idea when I first heard about it 2 years ago, more as a way to keep my mind focused than as a way to lose weight. Glad to hear that it is starting to catch on.

If you want to learn more about work walking, there is lots of information in the blogosphere.
Mr. Stirt’s site, www.bookofjoe.com/2007/10/treadmill-works.html, is one of some dozen work-walking blogs, including www.treadmill-desk.com and treadmill-workstation.com.

There is even a burgeoning social network (officewalkers.ning.com), with around 30 members, that Mr. Rhoads started in March.
via NY Times via Life Hacker

Read More...

Science Commons

Enter John Wilbanks, executive director of the Science Commons initiative, and the six-year-old innovation of its parent organization, Creative Commons—an intelligent, understandable copyright that's revolutionizing how everything from photos to publications are shared. Wilbanks and his team (which includes Nobel Prize winners Joshua Lederberg and John Sulston) are focused on three areas where roadblocks to scientific discovery are most common: in accessing literature, obtaining materials, and sharing data.

In June, Science Commons introduced a set of tools to allow authors greater control over papers published in scientific journals. And this week, Science Commons is expanding its Neurocommons project with the launch of an open-source research platform for brain studies. By using text-mining tools and analysis software to annotate millions of neurology papers, researchers worldwide can find relevant information in a matter of minutes.
I am a fan of the Creative Commons initiative, and I am glad they are now entering the scientific space.

How does John Wilbanks want to change journals to make scholarly articles more accessible?
Scholarly literature would be available for free because the peer-review charges would be paid as part of the cost of research instead of through subscription models, and the annotations or comments that had been made on any given paper would be readily available. The idea is that we would do things in science that we already do everyday in other fields with ease. Science ought to be like this, but it isn't.

Currently, journal articles, data, research, materials and so on are stopped by contracts and copyrights at such a rate that it's become nearly impossible to pull them together. The estimated utility half-life of a scientific paper is 15 years, but the copyright lasts until 70 years after the author's death. It's hard to get data sets shared, and the basic elements of the commercial Web (like eBay, Amazon and Google) function poorly, if at all, inside the sciences. The knowledge simply isn't moving as easily as it should, and transactions are slow on a good day, non-existent on a bad one.

Admittedly, right now the traditional for-profit publishing companies don't have a strong incentive to change. These publishers are making as much as a 35 percent profit, and in the absence of prodding from the scientific and research communities they're not going to change. But over the long term, people will get frustrated that we can easily find everything we need recreationally online but we can't do it for science. Google doesn't work as well for finding science as it does for finding pizza, and that's a shame. Open access isn't just about getting a scientist access to a file. It's the best thing for science because it allows all the smart people in the world to start hacking on the scientific literature and applying tools like text mining, collaborative filtering and more. Right now, all that content is basically dark to most of the smart people on Earth.
It is crazy to me that some journals have a 35% profit margin. The open source software movement was created to mimic the way scientific research was conducted, where information is freely available to all. How ironic is it, then, that academic journals have become some of the most profitable businesses in the world and open source concepts need to be brought back to the scientific community?

I always find it strange that when I am researching a topic, Google ends up taking me to a lot of blogs and very few academic studies. Not that there is anything wrong with blogs :), just that I am sure there are academic studies that have the exact answer to what I am looking for, and Google rarely takes me to them. Instead of getting well thought out, heavily researched and peer-reviewed information, I end up with some blogger's back of the envelope calculation.

And in the few instances when Google (or Google Scholar) does find an academic study, I am directed to a website that requires payment for access. And not something reasonable like $1-5 for a 3-page article. No, they want something crazy like $25-100 for a single article. If I were sure the article had the information I was looking for, it might be worth that, but I am typically too scared that it won't be there and I pass. Instead, I spend 10-15 minutes Googling the authors' names to see if they happened to put a free version of the report up on their personal websites. Sometimes I get lucky, sometimes not. But, it is always a huge waste of time.

Making academic studies freely available and searchable via Google would be immensely valuable and so I hope Science Commons is successful in their endeavor.

via Popular Science

Read More...

Monday, September 22, 2008

Where the Book Readers Live

Libraries are especially thriving in the conservative rural heartlands. The average Wyoming resident checked out nine books in 2005-06, compared with an average of five in California and two in Washington, DC.
Who knew?

via The Economist

Read More...

iRex Digital Reader

iRex released a new 10.2" e-book today. Engadget has the recap:

The new iRex Digital Reader is pretty much what we expected: a 10.2-inch, Lithium Ion battery-powered, black and white e-ink device that still leaves us hungry for that snazzy reader that Plastic Logic has coming down the pipe. Geared towards business users, prices start at a hefty $649 for the Digital Reader 1000, and if you want a stylus thrown in -- you know, something else to lose -- be prepared to spend $749 on the 1000S. Still, the big daddy 1000SW -- with WiFi, Bluetooth and that 3G data connectivity -- adds some new functionality that will be welcome, though it's hard to say who's breaking off $849 for those aforementioned features.
This device is the first publicly available e-book that I know of that is large enough to display a page of a .pdf file in its entirety. This Mobile Read thread has more details on the device including that it has a 1024 x 1280 pixel, 160dpi screen and 1 GB of storage. They also say it has a 24-hour battery life, but it is not clear what exactly that means. If you use it 1 hour a day, will it last for 24 days? I thought that e-ink devices were limited by the number of page turns rather than an amount of time.
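As a sanity check, the quoted resolution and pixel density line up nicely with the 10.2-inch diagonal (my arithmetic):

```python
# Does 1024 x 1280 at 160 dpi really give a 10.2-inch diagonal?
width_in = 1024 / 160    # 6.4 inches
height_in = 1280 / 160   # 8.0 inches
diagonal_in = (width_in ** 2 + height_in ** 2) ** 0.5

print(f"{width_in:.1f} x {height_in:.1f} inches, {diagonal_in:.2f} inch diagonal")  # 10.24
```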

While the price is high, it follows my adaptation curve for a large e-book reader by marketing to professionals who can justify the expenditure with a small increase in productivity. From iRex's press release:
“Tax specialists, accountants and lawyers that previously had thick piles of documents can carry them in their digital reader; students and academics can easily save their textbooks in the device,” says Brons.

“Government and public sector organisations can make minutes and reports available electronically whilst medical specialists can have all their patient information and key texts at their fingertips. Plus, in addition to their professional documents they can also have their e‐books and newspapers available.”
As the price comes down in the future, expect students and book lovers to purchase them as well.

I would like to know how long it takes to refresh the screen on a page turn and whether Kindle books can be hacked to be displayed on it.

More images of the device here and more information in the Mobile Read iRex Digital Reader forum. Available for purchase in the US on October 1st at eReader Outfitters.

Read More...

The Danish Model of Energy Independence

Unlike America, Denmark, which was so badly hammered by the 1973 Arab oil embargo that it banned all Sunday driving for a while, responded to that crisis in such a sustained, focused and systematic way that today it is energy independent.

What was the trick? To be sure, Denmark is much smaller than us and was lucky to discover some oil in the North Sea. But despite that, Danes imposed on themselves a set of gasoline taxes, CO2 taxes and building-and-appliance efficiency standards that allowed them to grow their economy — while barely growing their energy consumption — and gave birth to a Danish clean-power industry that is one of the most competitive in the world today. Denmark today gets nearly 20 percent of its electricity from wind. America? About 1 percent.

And did Danes suffer from their government shaping the market with energy taxes to stimulate innovations in clean power? In one word, said Connie Hedegaard, Denmark’s minister of climate and energy: “No.” It just forced them to innovate more — like the way Danes recycle waste heat from their coal-fired power plants and use it for home heating and hot water, or the way they incinerate their trash in central stations to provide home heating.

There is little whining here about Denmark having $10-a-gallon gasoline because of high energy taxes. The shaping of the market with high energy standards and taxes on fossil fuels by the Danish government has actually had “a positive impact on job creation,” added Hedegaard. “For example, the wind industry — it was nothing in the 1970s. Today, one-third of all terrestrial wind turbines in the world come from Denmark.” In the last 10 years, Denmark’s exports of energy efficiency products have tripled. Energy technology exports rose 8 percent in 2007 to more than $10.5 billion, compared with a 2 percent rise for Danish exports as a whole.

“It is one of our fastest-growing export areas,” said Hedegaard. It is one reason that unemployment in Denmark today is 1.6 percent. In 1973, said Hedegaard, “we got 99 percent of our energy from the Middle East. Today it is zero.”

After appointments here in Copenhagen, I was riding in a car back to my hotel at the 6 p.m. rush hour. And boy, you knew it was rush hour because 50 percent of the traffic in every intersection was bicycles. That is roughly the percentage of Danes who use two-wheelers to go to and from work or school every day here. If I lived in a city that had dedicated bike lanes everywhere, including one to the airport, I’d go to work that way, too. It means less traffic, less pollution and less obesity.
Some wonder if it is possible to have high energy taxes and government-imposed efficiency standards, and to produce a substantial amount of energy from renewable sources, while at the same time maintaining a strong economy with low unemployment and a high standard of living. Denmark shows it is possible. It is the best model of where America needs to go and how to get there.

via NY Times

Read More...

The Holes in Our Genomes

Over the past two years, scientists have made a surprising discovery about our DNA. Like a book with torn pages, duplicate chapters, or upside-down paragraphs, everyone's genome is riddled with large mistakes. These "copy number variations" can include deletions, duplications, and rearrangements of stretches of DNA ranging in size from one thousand to one million base pairs. New tools to screen for such mistakes, described this month in Nature Genetics, should generate a more complete picture of the genetic root of common diseases.

Over the past few years, advances in gene microarray technologies, which can quickly survey large volumes of DNA, have allowed scientists to screen more human genomes than ever before, resulting in a flood of information linking specific genes to disease. Most of these studies begin by looking for single-letter changes in the DNA code, called single-nucleotide polymorphisms, or SNPs (pronounced "snips").

Scientists have now adapted these microarrays to identify both small SNP changes and copy number variations, which they hope will help them identify a larger fraction of disease-causing genes. In one of the papers in Nature Genetics, David Altshuler, a physician and scientist at the Broad Institute, in Cambridge, MA, and his collaborators described the design of such a chip, in collaboration with genomics instrument maker Affymetrix, which they then used to map this kind of variation.
I just wrote about how the lack of testing for copy number variations was a serious problem for personal genomics company 23andMe. Hopefully they will be able to integrate these new tests into their next release.

via Technology Review

Read More...

Sunday, September 21, 2008

US High School Graduation Rate Peaked 40 Years Ago

While the National Center for Educational Statistics reports that the US high school graduation rate has been steadily increasing, a new report by James J. Heckman and Paul A. LaFontaine finds that it has actually declined over the last 40 years. It peaked at 80% for those born in 1950, and the most current data shows a rate of just 75% (click on the graph above for a larger version).

The report details all the reasons for the differences in the numbers, but the key one is that the NCES counts GED recipients as graduates while Heckman and LaFontaine don't. They exclude GEDs because:

Although GED recipients have the same measured academic ability as high school graduates who do not attend college, they have the economic and social outcomes of otherwise similar dropouts without certification.
Education is getting almost no attention in this election, but if the US wants to remain competitive in the 21st century, it is the most important topic. If the graduation rate is going down, it deserves much more emphasis.

The report also looks at some other issues as described in the abstract:
Correcting for important biases that plague previous calculations, we establish that (a) the true high school graduation rate is substantially lower than the official rate issued by the National Center for Educational Statistics; (b) it has been declining over the past 40 years; (c) majority/minority graduation rate differentials are substantial and have not converged over the past 35 years; (d) the decline in high school graduation rates occurs among native populations and is not solely a consequence of increasing proportions of immigrants and minorities in American society; (e) the decline in high school graduation explains part of the recent slowdown in college attendance; and (f) the pattern of the decline of high school graduation rates by gender helps to explain the recent increase in male-female college attendance gaps.
The other statistic I found interesting is that the male college graduation rate peaked at 25% for those born in 1950 and has held steady at 20-25% ever since, while the rate for females born in 1950 was 15% and has steadily climbed to 35% today.

Read More...

SoCal Edison’s $1.63B Smart Meter Plan Gets OK

California utility Southern California Edison has been slowly laying down the final details for one of the largest smart meter deployments in the U.S.: 5.3 million smart meters to be installed between 2009 and 2012. California’s regulatory body for all things power-related, the California Public Utilities Commission (CPUC), said on Thursday that it had approved $1.63 billion in funding from rate payers for the third deployment phase of the program.

While SCE had sought approval of that funding over a year ago, the CPUC had to determine that paying for the program with ratepayer funds — which could mean an initial increase of 1.5 percent on customer energy bills — would also benefit the ratepayers themselves. CPUC says in its release that it has “determined that the project offers between $9 million and $304 million in net benefits to consumers.” And that doesn’t begin to cover what consumers can ultimately save on their bills with real time pricing, what the utility can save from energy conservation, and the overall carbon emissions reductions.

The CPUC’s approval actually adopted the decision of a settlement between the SCE and the CPUC’s rate payer advocacy group Division of Ratepayer Advocates (which acts as an independent body). That settlement said that SCE’s smart meter program could reasonably generate $1.17 billion in operational benefits and $816 million in energy conservation benefits, and determined the $9 million baseline net benefit to consumers.

SCE’s smart meter rollout will enable the utility to offer close to real-time pricing as well as thermostats and appliances that can respond to the needs of the power grid. By implementing demand-response technology, SCE can not only save money from not having to add more power generation, but can make the grid more stable by shifting loads during peak times.

I can't wait until my power company installs smart meter technology. I hope it allows a socket-by-socket breakdown of electricity usage so you can determine exactly how much power each device in your house uses.

The previous decade was about rolling out broadband internet; the next decade will be about rolling out smart grid and smart meter technology.

via Earth2Tech

Read More...

Saturday, September 20, 2008

Electoral Projections at FiveThirtyEight.com

While there are a ton of polls and election projections on the presidential race, my favorite spot to take a look at who is winning and by how much is FiveThirtyEight.com.

I like the way they take all the polls, massage them and then run 10,000 Monte Carlo simulations to come up with their prediction of who is likely to win (more on the methodology here).
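
To make the idea concrete, here is a toy version of that kind of simulation. The state win probabilities and electoral vote splits below are invented for illustration; this is not FiveThirtyEight's actual model or data:

    import random

    # Toy Monte Carlo election simulation in the FiveThirtyEight spirit.
    # Each battleground state is a weighted coin flip; the probabilities
    # and electoral vote counts are made up for illustration.
    battlegrounds = {               # state: (electoral votes, P(Obama win))
        "Ohio":         (20, 0.45),
        "Pennsylvania": (21, 0.60),
        "Virginia":     (13, 0.48),
        "Colorado":     (9,  0.55),
    }
    safe_obama = 238                # electoral votes assumed locked up
    safe_mccain = 237               # (238 + 237 + 63 battleground = 538)

    def simulate_once():
        ev = safe_obama
        for votes, p_obama in battlegrounds.values():
            if random.random() < p_obama:
                ev += votes
        return ev

    trials = 10_000
    obama_wins = sum(simulate_once() >= 270 for _ in range(trials))
    print(f"Obama wins {100 * obama_wins / trials:.1f}% of {trials} runs")

The real model is fancier, weighting pollsters by past accuracy and allowing polling errors to be correlated across states, but the basic mechanic of counting how often each candidate clears 270 electoral votes is the same.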

The image to the left shows the results for 9/19. If I am interpreting it correctly, Obama is 71.5% likely to win this election based on the polls and how well each pollster predicted previous elections. Or put another way, previous candidates who had the same lead in the polls that Obama has at this point in the election cycle won 71.5% of the time.

What is interesting to me is that while the popular vote is so close (a 1.8% difference), the electoral vote is not nearly as close (56.3% to 43.7%), and McCain is only 28.6% likely to win. Part of the reason for Obama's high win percentage is that he is much more likely than McCain to win the electoral college while losing the popular vote.

If you look at their breakdown of states by region, there are really only two regions that are competitive: the Rust Belt and the Southwest (I am excluding the South Coast region with Florida and Virginia, because I am skeptical that Obama really has a shot there). I hate the fact that the Rust Belt states are battlegrounds yet again, as they appear to vote solely on which candidate is more likely to keep shitty jobs from going overseas. I am glad that the Southwest also has battleground states now, as those states are forward-looking and want to discuss how the US needs to change to be successful in 2050, unlike the Rust Belt states that want to change the US back to 1950.

The site is worth checking out if you are curious how the election is likely to turn out.

Read More...

Interesting Articles of the Week

10 everyday technologies that can change the world.

City uses DNA to fight dog poop.

Google and GE join up to tackle energy policy & tech.

Diamond and Kashyap on the recent financial upheavals.

At least six Brazilian politicians have officially renamed themselves "Barack Obama".

Read More...

Hypermiling iPermiling

Thanks to Hunter Research and Technology, budding hypermilers can become iPermilers with an iPhone app that costs less than three gallons of gas. GreenMeter uses the iPhone's built-in accelerometer to tell drivers, in real time, how much their lead-footed ways are costing them in carbon emissions and fuel economy. His new greenMeter app builds upon the gMeter app that works like a poor man's dynamometer.

Don't expect to hop in your car and turn on the greenMeter. You've gotta get a little data together beforehand, including vehicle weight, engine efficiency, current weather conditions, rolling resistance and your car's drag coefficient. Once that's all entered, you've got to balance your iPhone (or iPod Touch) on a sturdy, level part of your car's dashboard or console to recalibrate the accelerometer. We probably shouldn't have to say this, but don't do this while driving. In fact, once it's set up, don't look at any of the numbers while driving. As Hunter says on his website, the colors on the screen will be all the information you need to know.
I wrote recently about eco-driving, and this iPhone application would be a great way to show drivers the fuel economy and carbon emissions of their driving. Seeing in real time how you are doing is the best way to learn to improve your driving technique, and $6 for that information (assuming you own an iPhone) is a bargain.
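
For the curious, my guess is that under the hood such an app works through the textbook road-load equation: engine power has to cover aerodynamic drag, rolling resistance and acceleration, and dividing by engine efficiency gives a fuel-burn rate. A minimal sketch of that model, with parameters I made up rather than anything from greenMeter itself:

    # Road-load sketch: estimate fuel burn from speed and acceleration.
    # The app presumably reads acceleration off the accelerometer and
    # integrates it for speed; every parameter below is an illustrative
    # guess, not a greenMeter value.
    def fuel_rate_liters_per_hour(speed_mps, accel_mps2,
                                  mass_kg=1500.0,       # vehicle weight
                                  cd=0.30,              # drag coefficient
                                  frontal_area_m2=2.2,
                                  crr=0.010,            # rolling resistance
                                  air_density=1.2,      # kg/m^3, varies with weather
                                  engine_efficiency=0.25,
                                  fuel_energy_j_per_liter=34.2e6):  # gasoline
        g = 9.81
        drag = 0.5 * air_density * cd * frontal_area_m2 * speed_mps ** 2
        rolling = crr * mass_kg * g
        inertia = mass_kg * accel_mps2
        power_w = max(drag + rolling + inertia, 0.0) * speed_mps
        fuel_w = power_w / engine_efficiency  # chemical energy per second
        return fuel_w / fuel_energy_j_per_liter * 3600.0

    # Cruising at 65 mph (about 29 m/s) with no acceleration:
    print(f"{fuel_rate_liters_per_hour(29.0, 0.0):.1f} liters per hour")

Every one of those parameters matters to the answer, which explains the tedious setup the review describes.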

While the concept is good, I think it could easily be improved to make it much more valuable. First, it could display just the pertinent information while you are driving, so you can more easily see the impact of your driving style. I like the idea of using colors, but the screen at left looks cluttered with other information.

Second, instead of making users determine their car's vehicle weight, engine efficiency, current weather conditions, rolling resistance and drag coefficient, there should be a database of that information for all cars so the end user can just select their model.

Third, if it could track your progress and tell you how much your eco-driving skills are improving, it would be much more enjoyable for users. If it could be turned into a game where the more efficiently you drive, the more points you get, that would be even better. Add a way for people to upload their scores and compete to out-eco-drive each other, and I think you have a winner.

via Wired

Read More...

Berkeley OKs City-Backed Solar Loans

The Berkeley City Council unanimously approved a solar loan program late Tuesday night that will see the city loan homeowners the upfront cost of solar panels and recoup the cost across a 20-year property tax assessment.

The program solves a number of problems that have been holding residential solar back. First and foremost, it removes the high upfront cost of going solar with a large, lump-sum loan, around $20,000. And it goes even further, by attaching the loan to the property title — which means that if a homeowner moves before paying off the loan, they can easily pass the payments along to the new owner. Residential solar installers, on the other hand, often require customers to pay an extra fee to break their contract should they move.

The next step for the city is to secure a financier to front the money on the loans, and given the state of institutional mortgage houses, that could prove difficult. But city reps are optimistic: “I would argue that this is very, very secure debt,” City Deputy Christin Daniel told the New York Times. The city will initially seek $1.5 million for a pilot program of about 50 homes before scaling up to a program financing hundreds of solar systems.

The Mayor’s Sustainability Advisor Nils Moe says the pilot program should start by the end of October or beginning of November and if things go well the second, much larger phase could start at the new year. While the focus of the pilot program will be on photovoltaics, Moe says the second phase could include solar thermal as well as energy efficiency technologies.
I think this model makes a lot of sense. While many alternative energy projects have a high enough return on investment to make economic sense, customers aren't willing to do them because they expect a very short payback period. Berkeley's system solves that problem by using a 20-year loan to pay for the solar panels. I am curious what interest rate they will charge, but with California's high electricity prices I bet that the solar panels will save more in electricity costs than the yearly loan payment.
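
A quick sanity check of that bet with assumed numbers. The loan size and term are from the article; the interest rate, system output and electricity price are my guesses, not Berkeley's actual terms:

    # Annual payment on a fixed-rate amortizing loan vs. estimated
    # electricity savings.
    principal = 20_000.0   # upfront cost of the solar system
    rate = 0.05            # assumed annual interest rate
    years = 20             # loan term

    # Standard amortization formula for a level annual payment.
    payment = principal * rate / (1 - (1 + rate) ** -years)
    print(f"Annual loan payment: ${payment:,.0f}")      # about $1,605

    # Assume a ~3 kW system producing ~4,500 kWh a year and offsetting
    # California's top pricing tiers at ~$0.35/kWh:
    savings = 4500 * 0.35
    print(f"Annual electricity savings: ${savings:,.0f}")  # about $1,575

Under those guesses it is roughly a wash, so whether the bet pays off hinges on the interest rate the city secures and on whether the panels offset the most expensive pricing tiers.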

I would also like to better understand the pros and cons of the loan being given by a private solar installer vs. the local government. The ability to attach the loan to the property title seems like a big advantage for local governments. I also wonder what happens if the home goes into foreclosure. I guess even if no one is living in the home, the solar panels could still be generating electricity, and the property tax assessment would still pay the government back for the loan.

I like this better than San Francisco's plan to offer tax rebates of up to $6,000 for installing a solar system. Instead of taxpayers subsidizing the purchases, the Berkeley plan just solves the financing problem at no additional cost to taxpayers.

via Earth2Tech

Read More...

Friday, September 19, 2008

Obama Promises To Stop America's Shitty Jobs From Going Overseas


via The Onion via Greg Mankiw

Read More...

US Happiness Inequality Decreasing

Despite the fact that income inequality — the chasm between rich and poor — has grown to levels rarely seen outside the third world, happiness inequality in the United States seems to have declined sharply over the past 35 years. And that is not because everyone is just that much more cheerful.


According to new research by Betsey Stevenson and Justin Wolfers of the Wharton School at the University of Pennsylvania, the happiness gap between blacks and whites has fallen by two-thirds since the early 1970s. The gender gap (women used to be happier than men) has disappeared. Most significant, the disparity in happiness within demographic groups has also shrunk: the unhappiest 25 percent of the population has gotten a lot happier. The happiest quarter is less cheerful.

It seems odd that happiness would become more egalitarian over a period in which the share of the nation’s income sucked in by the richest 1 percent of Americans rose from 7 percent to 17 percent. In fact, the report does find a growing happiness gap between Americans with higher levels of education and those with less, which is roughly in line with the widening pay gap between the skilled and unskilled.
Is decreasing happiness inequality a good thing?

I would say that it is not really important. Happiness inequality would go down if the "very happy" people became merely "pretty happy", but I don't think that would be a good thing at all. Likewise, if all the pretty happy people became very happy, inequality would increase, but that would be a really good thing. Instead, I would focus on raising the average happiness level and driving the number of "not too happy" people as low as it will go.
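
A toy example of the point, scoring the survey's "not too happy"/"pretty happy"/"very happy" answers as 1/2/3 (the scoring and the tiny populations are my own invention):

    from statistics import mean, pstdev

    # Score the happiness categories 1/2/3 and compare two shifts that
    # move "inequality" (the standard deviation) in opposite directions.
    before       = [1, 2, 2, 3, 3]  # a mix from unhappy to very happy
    leveled_down = [1, 2, 2, 2, 2]  # "very happy" become "pretty happy"
    leveled_up   = [1, 3, 3, 3, 3]  # "pretty happy" become "very happy"

    for name, pop in [("before", before),
                      ("leveled down", leveled_down),
                      ("leveled up", leveled_up)]:
        print(f"{name:12s}  mean={mean(pop):.2f}  inequality={pstdev(pop):.2f}")

Inequality falls in the first shift even though no one is better off, and rises in the second even though no one is worse off, which is why the average (and the size of the "not too happy" group) is the number to watch.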

With income, many believe that the rich get rich at the expense of the poor, and therefore that decreasing inequality is a good thing. But I wonder if they believe the same about happiness. Do they think that happy people are happy because unhappy people are unhappy?

More on the report here, here and here.

via NY Times

Read More...

Nanoflowers Improve Ultracapacitors

Imagine a cell-phone battery that recharges in a few seconds and that you would never have to replace. That's the promise of energy-storage devices known as ultracapacitors, but at present, they can store only about 5 percent as much energy as lithium-ion batteries. An advance by researchers at the Research Institute of Chemical Defense, in China, could boost ultracapacitors' ability to store energy.

The researchers have developed an electrode that can store twice as much charge as the activated-carbon electrodes used in current ultracapacitors. The new electrode contains flower-shaped manganese oxide nanoparticles deposited on vertically grown carbon nanotubes.

The electrodes deliver five times as much power as activated-carbon electrodes, says Hao Zhang, lead author of the Nano Letters paper describing the new work. The electrode's longevity also compares with that of activated-carbon electrodes, Zhang says: discharging and recharging the electrodes 20,000 times reduced the capacitor's energy-storage capacity by only 3 percent.

So far, ultracapacitors have been limited to niche applications that require high power and quick, repetitive recharging. For example, the devices provide quick bursts of power to buses, trucks, and light-rail trains over short stretches, and braking replenishes them. If they could store more energy, however, they could be a powerful, long-lasting replacement for batteries in hybrid-electric vehicles and portable electronics.
How great would it be if we could replace batteries with ultracapacitors that recharge in an instant and never need to be replaced? With each breakthrough like this one, that possibility gets a little closer.
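
Some rough arithmetic on the article's figures, with an assumed lithium-ion baseline (and noting that doubling electrode charge will not map one-to-one onto device-level energy density):

    # Rough arithmetic on the article's numbers; the lithium-ion
    # baseline and the charging pattern are assumptions.
    liion_wh_per_kg = 150                     # assumed Li-ion energy density
    ultracap_today = 0.05 * liion_wh_per_kg   # "about 5 percent"
    ultracap_new = 2 * ultracap_today         # electrode stores twice the charge
    print(f"today ~{ultracap_today:.1f} Wh/kg, new electrode ~{ultracap_new:.1f} Wh/kg")

    # Longevity: 20,000 cycles at two full charges a day lasts
    years = 20_000 / (2 * 365)
    print(f"20,000 cycles is about {years:.0f} years of twice-daily charging")

Even doubled, that is roughly a tenth of a lithium-ion battery's energy density, which is why longevity and recharge speed, not capacity, are the selling points for now.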

via Technology Review

Read More...