Reverse Zone
Fri, 21 Dec 2007
I'm not on a witch hunt; I'm just curious. There was a striking juxtaposition of news stories a week ago:
Then from one day to the next Canadian Environment Minister John Baird, who was in Bali at the time, decides that having Quebec tie itself to the California regulations is a fine idea. Why the turnaround? Well a couple of days ago Washington announced that it was denying California permission to have these emissions standards for cars. Legally, California is allowed to enact auto emissions standards that are stronger than federal standards, but it needs a waiver from Washington. Over the years these waivers have been a formality. 50 waivers were requested, 50 waivers were granted, usually within a few weeks. In 2004 California asked for a waiver to regulate CO2 emissions; it already had the authority to regulate all other greenhouse gases and the courts said that CO2 was also fair game. Washington delayed and said they had to wait for other court cases that challenged the right to regulate CO2 emissions. California won all the court cases, including at the Supreme Court earlier this year. Legally it's pretty clear that Washington has no choice but to allow the California standards to be enacted. But this week it gave its decision: the waiver will not be granted.
Without being a political expert, it seems pretty clear that they had decided this a while ago, but didn't want to announce it until after the Bali conference. Bad PR. Better to wait until everyone is off for Christmas. So the question is: did Baird, while meeting with US administration officials in Bali, get wind of the fact that the California standards were about to be killed, and that there was therefore zero risk of Quebec actually being able to follow them? Just curious.
Wed, 12 Dec 2007
The Daily Mail ran a story, repeated by several newspapers, saying that the Pope criticized what it called "climate change prophets of doom", and that "fears over man-made emissions melting the ice caps and causing a wave of unprecedented disasters were nothing more than scare-mongering." The story says the Pope said it was vital "that the international community based its policies on science rather than the dogma of the environmentalist movement."
A bit surprising, since the Vatican's Pontifical Council for Justice and Peace hosted a conference on climate change in April, to which the Pope sent a message saying he wished to foster the "research and promotion of lifestyles and models of production and consumption that respect creation and the real demands of sustainable progress of peoples." But looking at the actual words of the Pope, it is clear that the press is reading between the lines things that are not actually there. Here is what he actually says:
We need to care for the environment: it has been entrusted to men and women to be protected and cultivated with responsible freedom, with the good of all as a constant guiding criterion. Human beings, obviously, are of supreme worth vis-à-vis creation as a whole. Respecting the environment does not mean considering material or animal nature more important than man. Rather, it means not selfishly considering nature to be at the complete disposal of our own interests, for future generations also have the right to reap its benefits and to exhibit towards nature the same responsible freedom that we claim for ourselves. Nor must we overlook the poor, who are excluded in many cases from the goods of creation destined for all. Humanity today is rightly concerned about the ecological balance of tomorrow. It is important for assessments in this regard to be carried out prudently, in dialogue with experts and people of wisdom, uninhibited by ideological pressure to draw hasty conclusions, and above all with the aim of reaching agreement on a model of sustainable development capable of ensuring the well-being of all while respecting environmental balances. If the protection of the environment involves costs, they should be justly distributed, taking due account of the different levels of development of various countries and the need for solidarity with future generations. Prudence does not mean failing to accept responsibilities and postponing decisions; it means being committed to making joint decisions after pondering responsibly the road to be taken, decisions aimed at strengthening that covenant between human beings and the environment, which should mirror the creative love of God, from whom we come and towards whom we are journeying.
(Translation adapted from The Vatican)
Now, this call for prudence and for avoiding ideological pressure to draw hasty conclusions: is it aimed at environmentalists or at climate change deniers? Perhaps both, when their environmental conclusions magically align with their political beliefs? And the call to prudence, when each side of the debate, or of what remains of the debate in the light of mounting evidence, finds the positions of the other imprudent, who is it aimed at? "Prudence does not mean failing to accept responsibilities and postponing decisions," says the Pope.
When he refers to unilateral decisions rather than dialogue, is he referring to the majority who decided to act in concert to voluntarily contain their emissions, or to the very few who decided not to? When he talks about setting up international agencies to address the stewardship of the earth, is it with a mandate to let each country do whatever it wants, in the absence of an international treaty or protocol? His message is very clear to me. Despite apparently not taking a side in the debate, the Pope states as fact that there is an environmental crisis and that action is urgent. "The problems looming on the horizon are complex and time is short". He states that rich countries have a pressing need to reduce their level of energy consumption, and to invest in alternative energy and in energy efficiency.
Maybe the Pope who wrote that second paragraph is one of the climate change fear-mongers that the Pope who wrote the first paragraph is criticizing for being too hasty in declaring an urgency for action and for being dogmatic in the opinion that energy efficiency and alternative energy are required. Or maybe the conservative journalists who make him out to be a climate change denier are grasping at straws, wishing to make him seem to support their point of view when he clearly does not. The Holy See has always supported the Kyoto Protocol and is well on its way to its objective of becoming the world's first carbon-neutral sovereign state.
Wed, 28 Nov 2007
It's a story that gets repeated in a lot of major cities. Retailers in a struggling downtown pin their hopes on a high-end condominium complex to bring them fresh new customers. Condo dwellers move in, but the customers never appear.
If the retailers were unable to attract the thousands of people already within walking distance, what makes them think that a few hundred more will be any different? Is it like Goldilocks: the potential customers nearby are too old or too young, but the new ones will be just right?
From speaking to retailers in areas that have both highrise condos and other forms of housing in the vicinity, I gather that the people in the condos are not their best customers. It's mostly the ones in the ground-oriented housing, the ones with families and/or roots in the area, who frequent local stores. Is the condo lifestyle, with its underground garage, not conducive to picking up some groceries from the local butcher and fruit store, or is there something about ground-oriented housing that is more likely to make people grounded, loyal to the small retailers within a few blocks of them?
It's not just the fact of plopping a certain number of persons in one place that animates the street, and arranging them vertically does not guarantee that a large proportion of them will walk the street and animate it in search of who knows what destination. A prerequisite is to provide these destinations: parks, schools, community centres, skating rinks, and so forth. Half of these are destinations mostly for households with children, so having family housing as part of a diversity of housing types helps ensure that people are walking down the street and interacting. The area of course has to be made walkable, with a scale and feel that lends itself to that mode of transportation.
Condos do very little of any of that. Downtown condos interact with the street through video cameras, a sign of mutual hostility and suspicion. They can be part of the mix, and once you get a mix of ages giving life to the street, and giving subsistence and permanence to the retailers who serve them, condos can be additional, a way to get a shot at the big-ticket items that the condo crowd may notice while walking from the newsstand to the coffee shop on those days where they don't just go from the underground garage to their apartment without any interaction with the street. But they can't be the mainstay of the retailers who are holding out for a better class of customers.
This is one of the flaws behind "new" mixed use: the thought that having buildings with both retail and residential ingredients on the same lot will magically ignite the flame of commerce. Retail has to stand on its own two feet with the customers who are already there, in single-use buildings within walking distance. Having more customers an elevator ride away doesn't contribute to retail success. The key to successful mixed use? Put retail where retail wants to go, near where people already live and can walk. Let them put a few affordable apartments above the stores, walk-ups if at all possible.
Wed, 21 Nov 2007
There is an interesting note in the Law Times about a new carbon offset program in Ontario. This program was announced in September, during the election campaign, and was ignored by the media. I'm with Howard Hampton on this one: the media really ignored a lot of important issues during the campaign. It had nothing to do with ignoring his party; all parties were ignored when they spoke about the environment.
The government announcement and the subsequent article in the Law Times are puzzling by the parts that they leave out. Offsets as a mechanism are part and parcel of a cap-and-trade system, except for those that are for entertainment purposes only. You buy a credit to offset the amount by which you exceed the cap that has been imposed on you. You sell an offset or credit when you are well below your cap.
So starting with agriculture and forestry implies that Ontario is imposing GHG emission caps on the agricultural and forestry industries. That could be good. Where are those caps? Are they actual caps or are they just intensity-based? Why are the farmers not out there debating this? This could be a good way for the hard-pressed beef cattle industry to ease out of that production, which emits so much GHG.
It could be a way to promote low-till and no-till agriculture, to improve manure management, and to reduce the amount of fertilizer used. Calculated correctly, it will kill off the silly 5% biofuel target unless the farming and transformation is done in a way that at least breaks even in terms of GHG emission.
With cap-and-trade, farmers could keep on using old high-till and manure handling methods, or stay in the beef business, but they would then have to buy offsets from the manufacturing sector or from Ontario Power Generation, which have so far led the way on GHG reduction. Or you could reward those farmers who can prove their entire supply chain is 100% Syncrude-free.
All right, I admit I don't really believe it. The program sounds voluntary. A farmer does something good and gets a credit, while his neighbour steps in to adopt the bad practice to fill the void and gets no disincentive. To me, starting with farming and land use is a sure sign that the credits are for show only. Canada has tried for a long time to fool itself into thinking that replanting a forest reduces GHGs. Most forests are really relatively carbon neutral. Environment Canada tried that and found out that land use, despite all the work of Mother Nature, doesn't lower our emissions; on paper it increases them. Not counting them, Canada has the seventh worst record on emission levels, but counting the forests it's the third worst. The only country with cooperative trees is Latvia, whose emissions are negative. Bad luck: reducing our GHGs is not in Mother Nature's mission statement, nature is designed to keep them about constant. For every tree you plant you get a million bugs eager to release whatever carbon the tree fixes.
But planting a tree is such a lovely symbolic gesture, surely that is worth money. And composting is so righteous! Sorry, nature has been capturing and releasing carbon for many millennia without a significant effect on GHG levels. You actually have to extract and burn less fossil fuel and limestone in total, or stop generating so much methane and NOx. This seems like just a feel-good PR program. Please prove me wrong.
Thu, 15 Nov 2007
I have been giving Mark Jaccard and other carbon sequestration enthusiasts a hard time, but does that mean that carbon sequestration is a complete waste of time? Not necessarily, but you have to be aware of the costs and of the niches where the technology is a good fit. At the very least it is a good way to hoist the coal industry on its own petard. They say sequestration will make them as green as other fuels? Fine, you can still sell coal as a fuel as long as you reduce emissions to the level of natural gas. We're not putting you out of business, we just believe you're telling the truth about sequestration, wink, wink.
As I have mentioned before, the concentration of CO2 in flue gas is so low, and the cost of separating it out is so high both in terms of money and in terms of GHG emissions that it is not worth tackling that problem and probably will never be. It's much better value for money to just stay away from high-carbon fuels as much as possible. It's like the problem with extracting fuel from the tar sands. Right now extracting uses large quantities of natural gas as feedstock and other energy sources to move the stuff and heat it up. You could then add more energy and use it to sequester some of the carbon. But when you do all the math a much simpler solution is staring you in the face: rather than using natural gas to process the tar sand into a fuel, use the natural gas as a fuel directly and leave the tar in the ground. You get to deliver a cleaner fuel to markets, at lower cost and with much lower GHG emissions. And you avoid destroying the entire Athabaska basin. Everybody wins.
Back to carbon sequestration. There are plenty of processes where CO2 is produced in higher concentration, where the separation cost is much lower. Right now that CO2 is usually just being released into the air. There are also plenty of processes that consume CO2 and where customers are willing to pay good money to get a source of it. So much so that there is a market for the drilling of underground CO2 wells, taking naturally sequestered CO2 out of the ground to satisfy a market demand. The low-hanging fruit is to bring the two together, to make sure that CO2 in the ground stays in the ground, and then to make sure that everyone captures the easily captured CO2 and that any excess that can not be used gets buried.
Some of the easily captured sources of CO2 include ammonia production for fertilizer, fermentation, lime calcination, detergents, and natural gas wells. Oddly enough, when producing "clean energy" like fuel ethanol or natural gas, a lot of CO2 gets dumped into the atmosphere. Most gas wells contain a lot of CO2. The industrial process of preparing the gas for market does the separation of practically pure CO2 at virtually no additional cost. Don't release it; capture it and make gas even greener. If possible, sell it. If not, back in the ground it goes.
Fermentation, particularly for alcohol, produces a lot of CO2. That's why beer has bubbles. Actually, that used to be the reason. Often the CO2 produced is released into the air during fermentation, and other CO2 is pumped into the beer at the end. Remember that for every molecule of ethanol that you drink or put in your car, a molecule of CO2 escapes into the atmosphere. Catch it and use it.
Various chemical processes generate CO2. In some cases, petrochemical plants are already capturing it. There is the famous example of the ethylene glycol plant of Shell Chemicals in Scotford selling CO2 to Air Liquide, which processes it for the soft drink industry. But larger-scale processes could also capture their CO2, including the production of ammonia and the calcination of carbonates in lime kilns to make cement. Again, the gas is produced in high concentrations and is easily captured.
Various processes use CO2. Some use it and sequester it, and some use it and release it, so the same argument applies to them: catch it and recycle it. It's used in the beverage industry. Huge waste - it necessarily gets released into the air. It's used in refrigeration as dry ice or to replace freon. Well, it's better than freon anyway, but it is still released into the air. Is it better in terms of CO2 to use dry ice in refrigeration rather than portable refrigeration units? Not sure. It's used in some chemical processes: urea and ethanol. It's used in enhanced oil recovery. It's pumped into the ground as a solvent. That has the potential to be sequestered, but in actual fact it gets pumped right back out once it's done its job underground and tends to be released into the air. Close, but not quite there. It can also be used in greenhouses, for two purposes - as a pesticide, since it is after all a poison in high concentration, and to enrich the air. Plants convert it into sugars and such things. Give them a little more in their atmosphere and it replaces some fertilizer, and allows cold countries to reduce their food imports a bit. The processes that use CO2 already get it in part from the processes that produce it, and in part from CO2 wells. Let's at least put the wells out of business. In relative terms it's not a major part of our total GHG emissions that is affected, but it's a start and it's cheap to do compared to the alternatives.
All of this is to say that carbon capture has a role to play in reducing greenhouse gas emissions, particularly by co-locating producers and consumers of CO2. And you can also often use the synergy to make use of waste heat. Carbon storage also has a role, when all the CO2 that is easily captured has saturated the market for industrial uses and can be buried relatively cheaply. But extracting it from flue gas to bury it? Only when through conservation and efficiency we have eliminated all the high-carbon fuel use (coal, tar sand, heavy oil, biofuels) that we can. Then when carbon taxes reach $150 a ton it becomes worthwhile to consider extracting CO2 from flue gas. But that's really a last resort. There is no good reason to ever choose carbon capture and storage over other alternatives, say the experts. Wind, solar, and conservation are a lot cheaper for the same effect.
Wed, 14 Nov 2007
As soon as it came out, I read "Hot Air - Meeting Canada's Climate Change Challenge", by Jeffrey Simpson, Mark Jaccard, and Nic Rivers. It's an interesting writing style. The first part is historical fiction, without the constraints of chronological presentation, and the second part is fantasy.
Presumably there are three authors because that is the list of people Jaccard agrees with. This makes for an odd selection of material. The purpose of the book is to let Jaccard, the archetypal one-handed economist, expound on how everyone else is wrong, even those who claim to agree with him, how they have all been wrong for decades, and how they will all fail miserably in the future unless they do precisely what he says. On the other hand, Simpson's journalistic instincts probably made him slip in the facts that prove the opposite point of view, unredacted. I don't know whether Simpson is responsible for the various gratuitous negative comments about Quebeckers, or whether he was the one who toned them down. The book judges harshly any politicians and public servants who don't do what Jaccard says, which is to say all of them, except for anonymous ones who they claim privately agree with him but never say so. It dismisses most studies and consultations on the subject of climate change as wastes of time, presumably with the exception of when the government consults him or pays him for a study.
It would take a while to go through the list of flaws and contradictions in the book, so I will just focus on the two principal ones. First, Jaccard says that there are four possible policy tools available to governments: command-and-control regulations, market-oriented regulations, subsidies, and information, and that of those the last two are guaranteed to have no effect. Now, if we start counting the policy tools that have no effect, there are a lot more than two. But seriously, Jaccard is quick to dismiss the non-economic policy tools without a shred of evidence to support his claims (as he admits at one point). He even puts his assumption that subsidies don't work into his CIMS model, which, reduced to the status of a ventriloquist's dummy, dutifully spits out this assumption as an output.
But the major flaw in this work is the comfortable fantasy that underlies everything he has worked on recently: that we can and should continue increasing our reliance on fossil fuels, and trust in some magic technology to capture the carbon and bury it underground. I wish I had time to examine his figures and sources and compare them to what most people understand to be reality: this idea of capturing and sequestering carbon dioxide is simply not feasible now, and probably never will be. Not only does economics rule it out, but so do the inexorable laws of thermodynamics.
The concentration of CO2 in flue gas is really quite low, about 10%. Before you can consider pumping it underground and hoping that it doesn't some day come back up and wipe out most forms of life, you have to expend considerable energy in separating it out from the other gases. As a result, the CO2 that you will be burying will have cost you $100 a tonne, not counting the fact that you have consumed nearly as much energy in processing the CO2 as you generated when you burned the coal in the first place, meaning the efficiency of production using capture-and-storage approaches zero.
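To see why the numbers turn ugly, here is a toy back-of-envelope sketch. Every figure in it is hypothetical, chosen only to illustrate the shape of the argument, not a measured value:

```python
# Back-of-envelope sketch of the capture energy penalty. All numbers
# here are hypothetical, chosen only to show the shape of the argument.

gross_efficiency = 0.35        # assumed thermal efficiency of a coal plant
capture_energy_penalty = 0.30  # assumed fraction of output consumed by
                               # CO2 separation and compression

net_efficiency = gross_efficiency * (1 - capture_energy_penalty)
extra_fuel = gross_efficiency / net_efficiency - 1  # extra coal per useful kWh

print(f"net efficiency with capture: {net_efficiency:.1%}")
print(f"extra fuel burned per useful kWh: {extra_fuel:.0%}")
```

Even with these generous assumptions, the plant burns over 40% more coal for each useful kilowatt-hour, and the extra coal produces extra CO2 that also has to be captured, compounding the penalty.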
Now this is the secret to Jaccard's results: the inefficiencies that he introduces into the energy production cycle, by adding his imagined carbon sequestration scheme and the less efficient biofuels, are subtracted from the efficiencies that are gained through various advances in energy efficiency and conservation. In his model, he is essentially sabotaging the energy efficiency figures by adding inefficiencies that nobody asked for, and then saying "See? Energy efficiency doesn't help!"
But even if we accept that people are never going to consume less energy willingly (reality begs to differ - in 2005 in Canada people consumed less energy, and the UK has made huge strides), and even accepting that the laws of thermodynamics do not apply to determined economists, there are two options before us: either we consume more energy and invest in sequestration technology, literally pouring our money into a hole in the ground, or we decide to use less energy for the same or equivalent economic results. The first gives us greater environmental impacts as we struggle to increase our energy production ever more, and increases the amount of money that we spend on energy. The second reduces our input costs, eliminates or greatly alleviates several environmental problems, and makes us more competitive on the world market. Now, Mr. Jaccard is quite right when he talks about the effect of carbon taxes and of cap-and-trade systems. They are extremely effective and relatively cheap. His figures show significant improvements with a carbon tax of merely $15 a ton of CO2. Imagine, all those people who wouldn't take the one-tonne challenge doing so if it means saving fifteen bucks.
There is a lot of willingness to change in Canada, if people had the means at their disposal. What do they need? Information. Maybe subsidies. Contrary to what the book says, those have been proven to work. And yes, carbon taxes and mandatory regulations also work well, particularly for those companies that are motivated only by money. Are you listening, oil industry, and companies that make ammonia, fermentation products, adipic acid, and so forth? Controlling your emissions is cheap and now you'll have a reason to do it. As to electricity production, the idea of the utilities and the governments that own them taxing themselves into compliance was a bit silly. Here, politics is the key: people will vote for those who will shut down the fossil fuel plants. It worked in Ontario and Quebec, and it's catching on.
Sun, 28 Oct 2007
One of the top innovators in wireless networking is also one of the top innovators in sustainable transportation: Zipcar founder Robin Chase. Now she brings the two fields together to address the two biggest problems in the implementation of congestion pricing: cost and privacy.
Transportation is definitely a big producer of economically unnecessary GHGs, by which I mean the type that takes away our money without increasing our standard of living. Our use of vehicles could be cut down significantly without making us any poorer, but the right economic signals are required so that people not only don't lose money but actually make money by cutting down their emissions. One way to do this is through road pricing and congestion pricing. These are ways to encourage environmentally sensible behaviours and investments by making the use of roads, particularly at peak times, pay for the transportation infrastructure in general, but particularly to make transit investments pay off.
Congestion pricing has worked for years in Singapore, and London's experience has been positive. Those that drive finally get the benefit of less congestion and those that don't will get improvements in transit infrastructure. The downsides are that two-thirds of the money collected goes to paying the cost of the fee collection itself, and that it requires that the government keep track of who is where.
One of the reasons for the cost is that congestion pricing and toll collection tends to be based on proprietary technology. Someone has to set up separate frequencies, closed networks, proprietary protocols, and then distribute the equipment on hundreds of thousands if not millions of vehicles. In addition to that, the man must have cameras recording your whereabouts so that those without the equipment or not cooperating can be nabbed. All of this equipment is single-purpose, unless you count the benefits to the people that have other reasons for wanting to know who is where.
Instead, Chase proposes a system where the communication equipment is low-cost, standard-part hardware with open software. The users themselves would finance the major part of the hardware investment in exchange for a break on tolls. Most of the communication would be over mesh networks, which is to say ad-hoc peer-to-peer networks. Since these wireless devices have relatively low power and work over short distances, like the wireless network in your house, your wireless device relays its information to my wireless device, and so on until we reach a device close enough to a "base station" to transmit it to the fixed network, and vice versa. The bandwidth is free; I don't charge you and you don't charge me for the use of our tiny bit of the network. Oh, and everyone gets free internet access as a bonus.
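As a rough illustration of the relaying idea, here is a toy Python sketch. The positions, radio range, and greedy next-hop rule are my own simplifications for the sake of the example, not Chase's actual design or any real mesh protocol:

```python
# Toy sketch of multi-hop relaying in an ad-hoc mesh network.
# Positions are 1-D coordinates in metres; all names and numbers are
# hypothetical simplifications, not part of any real protocol.

RADIO_RANGE = 100  # assumed reach of a low-power device, in metres

def route_to_base(devices, base, start):
    """Greedily hop between devices until one is within range of the
    base station at position `base`. Returns the hop path, or None."""
    path = [start]
    current = start
    while abs(devices[current] - base) > RADIO_RANGE:
        # every other device within radio range of the current one
        reachable = [d for d in devices
                     if d != current
                     and abs(devices[d] - devices[current]) <= RADIO_RANGE]
        # next hop: an unvisited neighbour closest to the base station
        candidates = [d for d in reachable if d not in path]
        if not candidates:
            return None  # no route in this toy version
        current = min(candidates, key=lambda d: abs(devices[d] - base))
        path.append(current)
    return path

# four cars strung out along a road, base station at position 0
cars = {"a": 350, "b": 260, "c": 180, "d": 90}
print(route_to_base(cars, base=0, start="a"))  # ['a', 'b', 'c', 'd']
```

The point of the sketch is that no single device needs to reach the base station: each hop only spans one short radio link, and the message still arrives.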
The location of cars would be based on GPS and triangulation from fixed nodes. The pricing system could have more flexibility than other systems, allowing changes to the location of cordons, pricing based on actual congestion, or even complex cordons with buffer zones.
Chase also proposes a locational privacy method that prevents unauthorized snoopers from tracking where you are. Essentially it works like paying cash rather than using a credit card. People can pre-pay the tolls and deposit nearly untraceable electronic tokens at the toll booth. What will be known is how much you paid in tolls; where and when is much harder to determine.
The details have been revealed in individual blog posts on her Network Musings blog over the past month. It's definitely worthwhile reading.
Fri, 12 Oct 2007
An interesting story in Gristmill. You are 3 times more likely to be killed on a bike than in a car. However, on a per-mile basis walking from the building to your car is even more dangerous and using public transit is 10 times safer than a car. If only the safety-conscious drivers of reinforced SUVs knew, the auto salesman in the showroom would be upselling them on a bus pass instead.
But, the article continues, the risk of death by violent collision is only one way of dying. Noncyclists are 40 percent more likely to die from a heart attack, from lack of exercise. So cycling significantly reduces the overall risk of death. On average, for every year of life lost in accidents, 20 years are gained in extra longevity.
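The arithmetic is simple enough to sketch. Only the 20-to-1 ratio comes from the article; the absolute figure below is invented purely for illustration:

```python
# The article's ratio: for every year of life lost to cycling accidents,
# 20 years are gained in longevity. The absolute figure below is a
# hypothetical input chosen only to illustrate the arithmetic.

years_lost_to_accidents = 0.1  # hypothetical expected years lost per cyclist
longevity_ratio = 20           # years gained per year lost (from the article)

years_gained = years_lost_to_accidents * longevity_ratio
net_change = years_gained - years_lost_to_accidents

print(f"net life expectancy change: +{net_change:.1f} years")  # +1.9 years
```

Whatever the absolute risk, as long as the ratio holds, the net effect of cycling on life expectancy is strongly positive.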
Not even counting the others you take with you. If you die in a car crash, the odds are you are taking other people with you. If you die in a bicycle crash, you are likely the only casualty. The car or truck that hit you (or more likely some other inanimate object) may need a new coat of paint, but its occupant is likely unhurt.
Fri, 05 Oct 2007
I've been staring at this graph for days, trying to figure out what it's telling me. The graph is from the paper "Intake fraction of nonreactive vehicle emissions in US urban areas", Atmospheric Environment 39(7), by J.D. Marshall, S.K. Teoh, and W.W. Nazaroff.
What drew me to it is a more recent paper by University of Minnesota civil engineering assistant professor Julian Marshall that further develops some strange and beautiful mathematical properties of how cities develop. A straight line as lovely as the one on the left is unusual in urban studies, given that each city develops in its own unique way. Each point on the graph is a city. The x axis is the population of the city as measured by the US Census. The y axis is something completely new. The paper calls it "linear population density", or the number of people living along an imaginary straight line. It is actually the population of the entire city divided by the square root of the area of the city. Since the area is measured in square metres, its square root is measured in metres, hence the unit of people per metre.
This paper stumbles upon the fact that there is a log-log linear relationship between the population of a town and its "linear population density". The slope of the graph means that the linear population density is proportional to p^0.59, where p is the population. Forgive my rusty math, but since the linear population density is p · a^(-0.5), where a is the area, you can solve for the area of the city:

p^0.59 = k · p · a^(-0.5)

which works out to a being proportional to p^0.82.
That should mean that the area of cities goes up a little bit more slowly than population, that is to say that cities get relatively more compact as they get bigger.
But in a later article, in the September 2007 issue of Urban Studies, Julian Marshall shows that in fact the historical growth path of most cities over the past 50 years is not along that line, but rather as they grow the "linear population density" remains roughly constant. Redoing my calculation above with an exponent of 0 rather than 0.59, that would mean that the area of a city grows with the square of its population. Big difference. Double the population and quadruple its size. But extrapolating a bit, and I don't know whether the model allows this, if a city starts out with a uniform population density, then as it grows it will maintain that uniform density, but if it starts out with a large gradient from high to low density, then as it expands it will maintain that gradient and sprawl more and more.
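The two regimes can be checked in a few lines of algebra: with linear density p/√a proportional to p^b, the area must scale as p^(2−2b). A tiny sketch (the function name is mine, not the paper's) makes both cases explicit:

```python
# With linear population density lambda = p / sqrt(a), a power law
# lambda = k * p**b implies a = (p**(1 - b) / k)**2, i.e. a ∝ p**(2 - 2*b).

def area_exponent(b):
    """Exponent e such that area ∝ population**e, given lambda ∝ p**b."""
    return 2 - 2 * b

print(area_exponent(0.59))  # ≈ 0.82: area grows slower than population
print(area_exponent(0.0))   # 2.0: double the population, quadruple the area
```

The cross-sectional slope of 0.59 gives the compact-growth story, while the constant-density historical path (b = 0) gives the quadratic sprawl story.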
There is a limit to how the data can be used that way. This analysis is based on US census data that considers a census tract to be urban if it is over a given density threshold. So if a city sprawls quite a lot, the edges will have such low density that they won't even be considered urban at all according to the data, making the city seem smaller, and denser, than it actually is. But the Brookings Institution study "Who Sprawls Most" corrected for that and came up with similar conclusions: cities with dense cores sprawl most, and cities with more uniform densities sprawl least.
Sun, 30 Sep 2007
All on the same day: BC announced it was reviving the controversial "Site C" proposed hydroelectric dam on the Peace River. Energy Alberta announced that it was applying to build a nuclear power plant in Peace River, Alberta (downstream, not upstream from the dam - this isn't going to be an underwater nuclear reactor). And several oil companies announced that they were acquiring land for oil and gas wells in between the two. Why does Peace River need so much electricity? Well, tar sand extraction takes a lot of energy and water, and besides the Athabasca tar sands there are the lesser-known Peace River tar sands deposits, right there near where the nuclear plant is planned. Are tar sands companies the ones buying that power? No, they say, the province lets them burn as much fossil fuels as they want (I'm paraphrasing but that's the essence) so why should they try to reduce their emissions?
So much for the peaceful farming communities of the Peace River. I wonder if changes to the water level, like those that resulted from previous dams on the Peace River, will remove so much water from the river that it affects the nuclear power plant.
More and more architects are coming out and saying out loud what a lot of people have thought all along. Getting LEED certification is good PR but not necessarily a good way to preserve the environment. An article in Fast Company explains that developers reach for easy solutions but not effective ones when it comes to being seen as one of the good guys. Like the people who drive a hybrid Hummer to the corner store.
There are all sorts of simple and inexpensive ways to get LEED points and some of them are very visible to potential tenants. The fact that the building is of a form that wastes huge amounts of energy is not a big deal as long as you have a long enough list of things that save energy, like bicycle racks. Having solar panels is a huge publicity boost and LEED point getter, even though it has less impact on energy use than, say, the design of the windows. Building two ordinary four-storey buildings is a lot more environmentally friendly than one eight-storey one, but it doesn't get you LEED certification. You get that by building the inefficient building then adding gimmicks until you collect enough points.
Sun, 16 Sep 2007
In a study published this week in The Lancet, epidemiologists from the London School of Hygiene & Tropical Medicine have calculated that in order to reach the Mayor of London's ambitious Climate Change Action Plan, London should go beyond its current restrictions on cars in the inner city and ban cars from the inner and outer boroughs, and promote walking and cycling.
By doing so, along with public transit and with occasional taxi trips, London's residents would lose an average of 4.5 kg of fat per year, women would reduce their risk of breast cancer by 25% and increase life expectancy by between 1 and 2 years, while men would enjoy a 20-40% reduction in the risk of premature mortality and around a 30% reduction in the risk of type 2 diabetes.
Wed, 12 Sep 2007
A debate is on in Britain, which invented the Merton Rule, as to whether a feed-in tariff is better than the Merton Rule at improving the supply of renewables.
A few definitions. The Merton rule is a planning law, first introduced in the London suburban borough of Merton, that says that all new larger-scale development must provide at least 10% of its energy using onsite renewable sources. This typically means solar, but sometimes wind or geothermal. Because this technology can get expensive, developers also do their best to reduce energy use, so that the renewables portion is 10% of a smaller number. A large number of jurisdictions in Britain have gotten on this bandwagon. Since 2005, 90 have adopted the Merton Rule in draft or final Local Development Framework documents.
But more recently, some believe that a better instrument to promote renewables is a feed-in tariff, which promotes larger-scale economically viable projects. A feed-in tariff is essentially an agreement for a utility to purchase renewable electricity at a price that is a lot higher and conditions that are a lot looser than when they buy other sorts of electricity. Essentially, someone with a small generation capacity can sell their surplus power to the public utility when they have too much and buy some back from the utility when they have too little.
Ontario has feed-in tariffs and their system for connecting micro power generating sources and for net metering is quite sophisticated and successful. It has the advantage that the powers-that-be need not worry about financing and approving anything beyond the interconnection to the grid. Of course, everyone realizes that the utility is buying power at a high price when it least needs it and that they can not count on it being a reliable source, but that is not the point of it. The point is that these small producers are using their own power and so the utility does not need to produce as much, even at peak. The purchase of surplus power is just a way to help with the financing, but without putting in any money up front.
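To make the money flows concrete, here is a toy sketch of a net-metered bill under a feed-in tariff. The simplification to monthly totals, the 10-cent retail price, and the 42-cent feed-in rate are my own assumptions for illustration, not Ontario's actual tariff schedule:

```python
def monthly_bill(consumed_kwh, generated_kwh, retail_price, feedin_price):
    """Net-metering bill: surplus generation is sold to the utility at
    the (higher) feed-in price; any shortfall is bought at the retail
    price. Real metering is hourly; this simplifies to monthly totals."""
    used_own = min(consumed_kwh, generated_kwh)
    bought = consumed_kwh - used_own   # shortfall bought from the grid
    sold = generated_kwh - used_own    # surplus sold to the utility
    return bought * retail_price - sold * feedin_price

# A household consuming 800 kWh while generating 300 kWh,
# at a hypothetical 10-cent retail / 42-cent feed-in rate:
print(monthly_bill(800, 300, 0.10, 0.42))  # 50.0 -> pays $50
```

A negative result would mean the utility owes the producer, which is exactly the financing help described above.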
This is different from the Merton Rule. The Merton Rule does not require any interconnection. For a feed-in tariff to be effective, a project should provide at least 100% of its own electricity, give or take, so that it has a surplus. Then and only then is there an incentive. At the 10% the Merton Rule requires, there is no surplus to sell; the project is financed by its own energy savings alone. It is more of a stick than a carrot, but it is a cheap way to get architects thinking and for developers to have to care. In the end, it's not that difficult. Canadian Tire sells solar or solar-wind kits that can easily supply 10% of your power for a few thousand dollars, and 100% for $20,000, and there are tax incentives to make it even more attractive. They even have a cool ROI calculator for grid-connected systems.
The two types of policy instruments are completely different. One can not replace the other. Is 10% a little arbitrary? Yes, but it's a practical way to force development into an appropriate mindset, at little cost to anyone. As for feed-in tariffs, they are essential to a burgeoning renewable strategy. Once renewable micro-generators become a larger percentage of the mix, other tools must be added to preserve the stability of the grid, like vehicle-to-grid systems that take advantage of the big batteries in hybrid and electric cars.
Wed, 05 Sep 2007
There was a discussion earlier on this blog as to which countries are doing better than others at reducing GHG emissions, and why. There are various possible ways to do the analysis when trying to determine whether differences are because of this or that policy or this or that economic structure. But when you look at the changes in GHG emissions from the base year for different countries (see chart from the unfccc.int web site, click for a closer view) it is clear that one factor predominates: geography.
Starting from the bottom of the chart, it is pretty obvious that a country's stay behind the iron curtain is a good predictor of major reductions. Several countries have even negotiated for base years earlier than 1990, which makes the change even bigger. At the top of the chart, being in southern Europe seems to be a good predictor of major increases. Various English-speaking countries are also well represented up there. Being Scandinavian also places a country above average; that one is a surprise to me, as I thought the Scandinavians were doing well. Some countries are more difficult to classify. Germany is partly northern and partly eastern. France is partly southern. How do you judge whether a country has developed a national will to change its ways or whether it is being dragged up or down by the regional economy and the "maturity" of its own economic system?
One way I find useful is to see whether a country is outperforming the countries it borders with. With that unrigorous analytical tool in mind, some interesting insight comes out, and some countries stand out. Germany, for instance, inherited the highly polluting East German economy in 1990, and its emissions went down, but much less than its eastern fellow Soviet bloc neighbours, while its western neighbours kept their emissions relatively steady. So, good for Germany but it's not any better than the average of its neighbours.
The UK definitely stands out as one that does much better than all its neighbours. It cut its emissions significantly, while Ireland is through the roof, the Benelux countries are almost holding the line and France is down but not as much. France is actually outperforming all its neighbours except for the UK and Germany. France is caught between extremes. It has an eastern hybrid to the right, southern economies below (Spain and Portugal are the worst performers and Italy's emissions are also growing), the British GHG-cutting champion across the channel and small economies holding the line to the east.
Italy is a southern country, but all of its neighbours including Slovenia and Croatia are doing significantly better than it, except for Austria. What's with Austria? It also stands out as a country doing much worse than all its neighbours. Australia and New Zealand are difficult to compare to anything. On the surface, they are both doing badly, but to be fair Australia is not a signatory and its Kyoto target is 10% above 1990 level, while New Zealand's is 0%, so New Zealand is doing much worse.
Canada. Sigh. Worse than the US. Worse than just about every comparable country. It ratified the Protocol. It's a little better than the graph says, there was an error in that year's data, but it's still pretty bad.
The winners, relative to their neighbours: the UK, Sweden, France. The losers: Austria, Canada, Ireland, New Zealand, Spain, and Italy.
Tue, 04 Sep 2007
Britain's Liberal Democrats are the current leaders in the one-upmanship between political parties in the UK for who will most reduce GHG emissions, criticizing Labour's planned 60% cut as too timid. But their plan to bring Britain's net emissions to zero, by 2050, and non-nuclear to boot, was barely noticed. It's almost expected. You can view their plan here
Their plan for post-Kyoto includes allocation of per-capita targets for all countries. Interesting. They plan to use enforcement mechanisms so that economies that do not have or reach their target can not undercut those that do. Also they talk about auctioning emission allowances rather than distributing caps, and taxing fuels as they enter the economy rather than when they are consumed. They also want nothing but zero-carbon cars by 2040, and all new houses to use no fossil fuels for space heating by 2011. They plan to tax road freight and use it to subsidize rail. They want to speed up planning permissions for wind farms but they would like them to be mostly in the sea. Here they walk a tightrope, casting the planning process as the villain without telling local communities that listening to their concerns is a waste of time. They want microgenerators to receive a return four times greater than the cost of electricity imported from the grid. Oddly enough they complain about how many trees in UK forests go unharvested. Individuals would be given personal carbon caps and trade in allowances and offsets. There is a little paragraph about the fact that the poorest people have the least efficient housing stock and can't afford to upgrade it, but little is said about what, besides making it even more expensive for them, they would do about it.
Sat, 25 Aug 2007
House of Representatives member John Dingell, chairman of the House Energy and Commerce Committee, plans to introduce comprehensive climate change reform legislation. This new law would cut off mortgage-interest tax deductions for houses over 3000 square feet, presumably just on the portion of the house exceeding that area. In the US, interest on your mortgage is tax deductible. By changing the deduction, most homeowners will still get the popular tax deduction, but it also gives the housing market a little nudge away from the enormous energy-guzzling houses. Real estate groups instantly complained that the measure would bring down house prices by 4%, which is bad for lenders about to foreclose. Apparently, about 15 percent of all houses are over 3000 square feet. Unbelievable. In 1950 the average new house was 1,000 square feet. Now the average is about 2500, but there are signs the average may have peaked.
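Assuming the deduction really is prorated on the floor area above 3,000 square feet (the bill's exact mechanism isn't spelled out here), the arithmetic would look something like this; the interest amounts, house sizes, and pro-rata rule are all invented for illustration:

```python
def deductible_interest(interest_paid, house_sqft, cap_sqft=3000):
    """Allow the mortgage-interest deduction only on the fraction of
    the house at or below the cap (assumed pro-rata by floor area)."""
    if house_sqft <= cap_sqft:
        return interest_paid
    return interest_paid * cap_sqft / house_sqft

# $20,000 of annual interest on a 4,000 sq ft house:
print(deductible_interest(20000, 4000))  # 15000.0: a quarter disallowed
# The same interest on a 2,500 sq ft house:
print(deductible_interest(20000, 2500))  # 20000: unaffected
```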
Dingell first floated the idea at a town hall meeting in Ann Arbor where he also talked about a cap-and-trade system on carbon emissions, a $100 per ton carbon tax, and a 50-cents-a-gallon tax on gasoline, with money for research on renewable energy.
Chicago Climate Exchange (CCX) launched its futures contracts on Certified Emission Reductions yesterday, with a contract of 1,000 metric tons of carbon dioxide equivalent changing hands. The Certified Emission Reductions (CER) futures contract is the first time that hedging tools for CERs, a Kyoto Protocol compliant emissions instrument, was traded in an exchange in North America.
These trades, already popular in Europe, are becoming the new "carbon standard" currency among civilized countries that support the UN Clean Development Mechanism for approved and verified greenhouse gas (GHG) emission reduction and sequestration projects undertaken in developing countries, and link domestic GHG markets with international ones.
Fri, 24 Aug 2007
I'm just tying a few recent stories together under the theme of the "green dividend".
The Canadian government, forced by law to present a plan that respects the Kyoto Protocol, has instead tabled essentially the same old plan, with a section arguing that respecting the GHG reductions in the protocol would damage the economy and drive up energy costs. They predict that by 2012, GHG emissions will be only slightly less than those of 2005. That is a neat trick, because the trend in GHG emissions 2003-2005 was already down, not up. This means that they believe that their more recent programs will actually reverse that downward trend and bring emissions back up. Sad part is, they may well be right.
"The Government's analysis, broadly endorsed by some of Canada's leading economists, indicates that Canadian Gross Domestic Product (GDP) would decline by more than 6.5% relative to current projections in 2008 as a result of strict adherence to the Kyoto Protocol's emission reduction target for Canada."This "broad endorsement" of the analysis saying that their straw man scenario of an instantaneous 30% cut in emissions by economist also included an opinion that the new government plan to reduce emissions gradually would not work, and proposed some economically painless ways of achieving the objectives.
However, I think that even the modest 6.5% GDP reduction in the catastrophe scenario is probably off. Economics has trouble predicting the future; it can only use models that account for changes in the past. And in the past, there was a strong correlation between GDP and energy use. But which is the cause and which is the effect? If you have no reduction targets, then an extra unit of production costs an extra unit of energy. And domestic GDP is based on consumption. As Schumacher tells us, GDP is not a measure of how well off we are in the sense of how happy we are with our lives, it is a measure of how much money changes hands. If you consume more, the GDP goes up. Put more gas in your car and GDP rises. But will GDP fall if we consume less energy? Not necessarily. Although rises in GDP usually accompany increases in energy use, many economies including our own have experienced reductions of GHG emissions and of energy use without a decline in GDP. The models may very well presume that the two go hand in hand simply because most data points in the past had the two linked, back when reducing production was a major reason for reducing energy use.
A white paper from CEOs for Cities, "Portland's Green Dividend," argues (unfortunately from misleading data, which does not change the value of the qualitative argument) that in cities where investments in urban planning have resulted in less driving, people can simply take the time and money they save by driving less and spend it on things of more benefit to the local economy than foreign oil and foreign cars. An interesting argument. I think of it as a metaphor more than a claim that people suddenly find their gas money accumulating in their pocket and decide to spend it on local organic cuisine or other things whose price remains constant.
But these simplistic arguments may well hold when using a more defensible model of the economy. For every economically and environmentally expensive Alberta job lost as a result of this economic shift away from fossil fuels, new jobs will appear in B.C. and Ontario and Quebec, which see a benefit in being at the forefront of a greener economy. The cost of energy will go up, yes, but not only will we consume less of it, the shift will make all sorts of other labour-intensive and knowledge-intensive parts of our economy perform better, and it will insulate them from the inevitable rise in energy prices that will fell economies that did not shift gears soon enough.
So which is right? I suspect that there would indeed be short-term economic disruption. A carbon tax, imposed even on exports, would move money around in the economy and displace some oil workers and truck drivers. However this adjustment would not be as painful as some of the other economic transformations from de-regulation and globalization, since those oil workers and truck drivers were not doing that job a few years ago: they are mobile, enterprising, and had other skills before going there. They won't all be making windmills, but our economy has a range of sectors, some energy intensive and some not. If they were to switch to green construction or transit driving, the carbon tax could probably make their transition easier than, say, that of an employee of a suddenly defunct dot-com in 2000.
Wed, 22 Aug 2007
Last week's Skype outage shows the fragility of peer-to-peer networks once they reach a certain size. To be fair, not all peer-to-peer networks. Skype has based its network's invulnerability on a set of horcruxes (what, don't you read?) known as supernodes, which it has hidden among its customers' computers.
Skype does not provide its own servers to perform most of its critical services. Instead, whenever it notices that one of its customers has lots of bandwidth and no pesky firewall, that computer is pressed into service as a critical server on the Skype network. It sets up a little sign on the internet saying it works for Skype and other computers wishing to make Skype calls should connect through it.
As it happens Windows machines with automatic Windows Update are prominent among the machines that can be secretly promoted to supernode because of their large bandwidth budget and permissive security. So when Microsoft made a lot of those machines reboot at the same time on August 16-17, Skype eventually went down, unable to find new "volunteer" servers quickly enough. I think that the secrecy and unwitting outsourcing of the supernode role is a throwback to the Skype developers' Kazaa past, where they thought this was a good defense against being shut down by the authorities.
Now wouldn't it be better if the supernodes were conscious of the fact that they were being used that way? They could take some steps to make sure that Skype wasn't caught short every time they turn off their computer, perhaps agreeing to give Skype a 60-second heads-up so it can start moving customers to a different supernode temporarily.
More telling than the network failure, though, was the fact that it took several days for the network to recover, long after the root cause had disappeared. That shows a singular lack of foresight in the design. I've had that happen to my servers before. When running a particularly popular internet service, I had to restart some server components at one point. Users of the service, seeing a delay in response time, of course just hit reload over and over again, queueing a huge number of requests. But while a service is being restored, not all components come back instantly. The database checks itself out, new database connections are set up, errors are logged, and all sorts of components are yelling at each other: "if you could just hang on a few seconds, I'm busy here." The backlog of requests gets bigger and bigger until the whole thing comes crashing down again, not neatly at all, each component with its own timeouts and each filling its error logs with cryptic and misleading messages.
Thus with the Skype network. It had been brought up to full volume over a period of years. It doesn't have a good mechanism for restarting from zero with everyone waiting. This isn't Windows, where reboots are frequent and the system knows in what order things have to be restored, and users know when to quietly drum their fingers. Since it's distributed, you can't go over the web's intercom and say would everyone please stop trying to connect for 5 minutes, OK?
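The usual way to let a distributed system refill from zero without stampeding itself is jittered exponential backoff on each client; here is a minimal sketch of the idea, not Skype's actual mechanism, with arbitrary delay parameters:

```python
import random
import time

def reconnect(attempt_connection, max_tries=8, base=1.0, cap=60.0):
    """Retry a failing connection with jittered exponential backoff,
    so millions of clients don't all hammer the service at once.
    `attempt_connection` returns True on success."""
    for attempt in range(max_tries):
        if attempt_connection():
            return True
        # full jitter: sleep a random time up to base * 2^attempt,
        # capped so the wait never grows unboundedly
        delay = random.uniform(0, min(cap, base * 2 ** attempt))
        time.sleep(delay)
    return False
```

Because each client picks a random fraction of its window, the reconnection load spreads out in time instead of arriving in synchronized waves every few seconds.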
If Skype is going to become respectable, with its billions from eBay, it's going to have to be more open about relying on its users for its service. Maybe give them a freebie in exchange for a service level agreement. Maybe even provide some of its own superdupernodes on its own hardware. It can't assume that people won't have firewall rules to keep them out. Skype also has to learn to compartmentalize damage to its network.
Sun, 19 Aug 2007
The annual National Inventory Report on Greenhouse Gas Sources and Sinks in Canada 1990–2005, required under the Kyoto Protocol, will be out soon. I asked for and received a copy last week, in advance of the full report being made available on the Environment Canada web site.
The good news:
The bad news:
The big picture here is that some of the existing initiatives in Canada and Ontario in 2003-2005 have borne fruit and will continue to do so if they continue. The next big targets: reduce trucking, reduce emissions from agriculture, particularly cattle, manure, and tilling. That may turn into slightly higher food prices or a partial reversing of recent economic trends in agricultural trade. What do you say? Are you up for internal and international restrictions of trade based on GHG intensity of food? Are you at least up for labeling of the weight of CO2 equivalents associated with different foods?
I don't usually blog about politics, but since I went on the radio on the MMP or no MMP debate, I might as well go public here.
There is a referendum coming up in Ontario on October 10, about whether to keep the current "First Past the Post" (FPTP) election system, where elections are decided by who wins the most votes in each riding, or to adopt a particular type of Mixed-Member Proportional (MMP) system. MMP is a hybrid form of proportional representation: some seats are FPTP, and "top-up seats" are attributed from lists prepared by the parties so that the total of FPTP and top-up seats occupied by each party is roughly proportional to that party's electoral support.
I realize that the Green Party is very much behind a change to this MMP formula. The Green Party, never yet having won a seat under the FPTP, believes that some form of proportional representation may be as effective here in winning seats as it has been in Europe. And although the values and principles of the Green Party are usually in concordance with mine, I think in this case the interests of the Green Party may diverge from the interests of environmentalism, and are at odds with my values.
Given a choice, I prefer local decisions rather than provincial or national; I trust individuals and their personal values more than parties, which are more of a brand than a reliable set of values; I like simple solutions, with the possibility of elegant emergent behaviours, more than complex ones; and I prefer having a mainstream consensus resulting in bold action, rather than gambling on being in a position to force a set of diluted terms on the majority. I've decided to go with FTPT, not MMP in the Ontario referendum.
As it happens, when I run the numbers I see very little possibility of the Green Party winning any seats under the proposed MMP formula, but even if it did I don't see that the end justifies the means. As I am wont to do, I tried to put together a spreadsheet to figure out precisely how the proposed system takes votes and translates them into seats. It's actually a lot more complex than I thought: the algorithm has a couple of dozen steps, what with the Thresholds, the Largest Remainders, the Hare Quotas and the fact that "Overhangs" make you have to throw away the whole calculation and start over after discarding a large part of the votes. I've also been playing with the paradoxes. These are situations where getting more votes means fewer seats, or where losing a race makes you win the election. It's entertaining, but is it democracy?
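For the curious, the core of the Hare-quota and largest-remainder arithmetic fits in a few lines; this sketch deliberately leaves out the thresholds, overhangs, and FPTP layer that make the full proposed algorithm run to a couple of dozen steps (the party names and vote counts are made up):

```python
def largest_remainder(votes, seats):
    """Allocate `seats` among parties by Hare quota plus largest
    remainders. `votes` maps party name -> vote count."""
    total = sum(votes.values())
    quota = total / seats                # Hare quota: votes per seat
    alloc = {p: int(v // quota) for p, v in votes.items()}
    remainders = {p: v - alloc[p] * quota for p, v in votes.items()}
    # hand leftover seats to the parties with the largest remainders
    for p in sorted(remainders, key=remainders.get, reverse=True):
        if sum(alloc.values()) == seats:
            break
        alloc[p] += 1
    return alloc

print(largest_remainder({"A": 47000, "B": 38000, "C": 15000}, 10))
# {'A': 5, 'B': 4, 'C': 1}
```

The paradoxes come from the remainder step: enlarging the assembly or gaining votes can shuffle which remainders are "largest" in counter-intuitive ways (the Alabama paradox is the classic example).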
I am also worried by the Scottish elections of May 2007. That is when the Scots, who have had some experience with Mixed-Member Proportional voting, were first exposed to a "one ballot, two votes" system virtually identical to what is being proposed for Ontario. The UK Government calls the election a fiasco. The Opposition says no, it's a debacle. The Scottish Electoral Commission has called an inquiry. It seems that the rate of spoiled ballots went up by a factor of 10, with much higher rates in poor areas. Was it because this ballot was more confusing, or did previous designs simply not give as much opportunity for misunderstandings to be translated into spoiled ballots? The evidence, from independent studies, seems to point to the fact that voters never understood the MMP system.
Both sides of the issue have legitimate points, and this blog is not the place to debate them (please). If you feel a need to debate the pros and cons, I can point you to other sites where there is a lively debate going on.
Sun, 05 Aug 2007
I've only read the summary so far and glanced at some of the Statistics Canada tables used to make it, but Canada's latest National Greenhouse Gas Inventory seems to confirm my prediction a year ago: Canada's GHG emissions from energy use (see yellow line) went down in 2005, according to the latest data.
Looking beyond various increases and decreases that follow the earlier trends, the major factor seems to be Ontario's 17% reduction in total coal-fired electricity generation, from 36.2 down to 30.1 terawatt-hours, including the elimination of the coal-fired Lakeview Generating Station in Mississauga.
Ontario's industries have had continued reductions in emissions, in general, but the appetite for cement and increasing use of trucks and planes, and certain agricultural practices, undid some of the gains. Acts of God did a lot of the work. Katrina reduced refinery capacity, pushing up prices and reducing demand for Alberta and Saskatchewan oil; a fire shut down a major tar sands facility; and the weather was comparatively mild.
The good news is that the factors that worked for Ontario are continuing: high prices, conservation, and less coal use. Unfortunately, what slowed down Alberta's emissions in 2005 was temporary. Alberta is also implementing a few good measures in its electricity and some of its agricultural and industrial methods, but until they reduce their fossil fuel production from tar sands, Alberta will remain the millstone around Canada's neck.
For a few years, the California legislature has been telling Californians and California cities that they should do everything in their power to reduce their driving. And unlike residents of other states, Californians have done exactly that. In the land of the freeway and of road rage, Californians are burning less fuel. Supply and demand has now kicked in and their gas prices have been sliding, contrary to other states. That gives Californians a double savings, one from using less fuel and another from reduced fuel costs.
California legislators are about to go beyond relatively mild exhortations and insist that land use planning become more than lip service. Locally produced regional land use plans must choose scenarios that reduce VMT (vehicle miles travelled) or risk losing transportation funding. The new SB 35 transportation planning bill links land use, VMT targets, vehicle ownership, and induced travel demand, and requires the use of travel demand models and public dissemination of the methodology, results, and key assumptions of the model. It also provides some constraints on the methodology. It allows some leeway on what is a reasonable transportation service level requiring extra capacity, but not much on meeting the federal and state VMT and GHG targets.
It also requires that significant resource areas or significant farmlands can be identified as development areas only if they are adjacent to urbanized areas, already served by utilities, and there is no feasible alternative. These areas must be developed at a minimum of 10 units per acre, with traffic mitigation measures. If the plans fail to meet the VMT reduction goals, new plans are required, and this time the region risks losing transportation funding.
Californians will be among the first to reap the benefits of decreased energy use, giving their economy a significant economic advantage. The new planning laws will make the structural changes to the state's infrastructure that make it more productive while other economies flounder.
Tue, 31 Jul 2007
An interesting article on cities where transit costs nothing. With the cost of collecting fares and policing payment, and fares being a relatively small proportion of transit costs, making it free is not as expensive as it might seem.
And free transit greatly increases transit use. But as the head of a free transit agency warns, there is more to the success of the plan than the price:
"To be successful," says Jean Vandeputte, the chief engineer-director for the City of Hasselt, "I think that the public transport system must not be crowded at the start. Our project was originally organized to attract more passengers and to have less cars in the city center. The buses also need separate lanes, because traveling by bus has to be faster than by car, so the infrastructure of intersections and streets has to be adapted. The buses have to be modern, clean ... you need to have more bus stops. And the shelters must be attractive."The transportation infrastructure has to be changed to accommodate the extra riders. It has to be changed to cheaper infrastructure, but nonetheless there is a transition cost.
That is something that the somewhat enthusiastic article does not go into. What are the overall economics of free transit? What is the cost avoidance by not having to provide capacity for so many cars? What happens to all the parking spots? What happens to the way people build? What happens to trip planning? What happens to shopping? It seems like quite a paradigm shift.
Mon, 23 Jul 2007
As I've discussed here before, bigger, faster computers are no more intelligent; only progress in algorithms can make AI progress.
This is apparently the message of Oren Etzioni at IAAI-07. He refutes the idea that applying simple computation techniques to large mountains of data will yield intelligence, and he points to his latest achievements, Farecast.com and KnowItAll.
I am a big fan of Etzioni, and I would like to point to his even greater achievement, Jango, and the economic and management environment also required to bring AI to fruition. Jango was a gem, but still a bit in the rough. Around 96-97, if I recall, Etzioni brought out Jango, a shopping assistant in the tradition of the expert system wine assistants of the 80's. It used the search engines to help ferret out the best deals on the products you wanted. But it also had intrinsic knowledge of the domain. If you say you want a red Burgundy, it knows that a Beaujolais qualifies, but that a Beaujolais Nouveau is probably not what you meant. Jango tried to find the best price by searching out the world's web pages, long before there was any sort of standardization imposed by the major search engines' shopping search.
Jango was commercialized by Etzioni's firm Netbot, staffed with all sorts of high-flying high-tech executives. The Jango engine was client-server and nearly worked most of the time, crashing only now and then. They brought out all sorts of other domains: roses, tea, music, etc. I thought it was great and I played with it all the time, but of course hardly ever bought anything. I was just admiring the handiwork. Then Jango started trying to coax us into buying more so that it could start making money. Revenue, in 1997? Come on! It made deals with the merchants, it had specials just for us, it started recommending not the best deal but whoever cut them in.
Apparently making money was more important to the company and its investors than the quality of the AI. An odd turn of events. Then the company was sold to Excite and became the Excite Shopping Channel. The client end of the client-server was removed and put on the server. That could be good. But then all the AI was stripped out of it and the only thing that remained was listing the "deals" that they got a cut of. The searching was dumbed down to straight keyword string search.
Thus died a promising improvement to the quality of search, dumbed down because its users weren't willing to pay to provide a quality product. We'd all rather complain that good search is impossible, despite the fact that it was right there and was killed in front of our eyes. All other search engines then came up with unwieldy XML feed standards telling internet merchants: "you tell me what you have for sale and what's comparable. I don't feel like figuring it out."
Etzioni's new products are no slouches either. Farecast.com looks inside the mind of the airline managers and predicts where and when the cheapest fare will magically, briefly appear. That one could be killed by the market too, but its AI is more central to its economic success, so there's no need to dumb it down.
His KnowItAll research project is to some degree a son of Jango. First, it does unsupervised extraction of product features from text. For example, the sentence "Our room's temperature was just right" mentions the explicit feature "RoomTemperature" whereas the sentence "The hotel is ridiculously expensive" refers to the implicit feature "HotelPrice". Second, the system identifies opinions regarding product features and establishes their polarity. For example, "fantastic" is a positive opinion, whereas "disappointing" is a negative opinion. Finally, the system ranks opinions corresponding to the same feature based on their strength. For example, "great" is stronger than "almost great" which in turn is stronger than "mostly ok".
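To make the polarity and ranking steps concrete, here is a minimal sketch of that last stage. This is not the actual KnowItAll system; the lexicon, the strength scores, and the function names are invented for illustration only.

```python
# Toy sketch of the opinion-ranking step: a hypothetical lexicon maps
# opinion phrases to (polarity, strength), and opinions on the same
# feature are sorted strongest-first. All entries are made up.
OPINION_LEXICON = {
    "fantastic":     ("positive", 3),
    "great":         ("positive", 3),
    "almost great":  ("positive", 2),
    "mostly ok":     ("positive", 1),
    "disappointing": ("negative", 2),
}

def rank_opinions(phrases):
    """Return known opinion phrases sorted strongest-first, with polarity."""
    found = [(p,) + OPINION_LEXICON[p] for p in phrases if p in OPINION_LEXICON]
    return sorted(found, key=lambda t: t[2], reverse=True)

# "great" outranks "almost great", which outranks "mostly ok".
print(rank_opinions(["mostly ok", "great", "almost great"]))
```

The real system, of course, has to discover features and opinion words from raw text rather than look them up in a hand-built table; that discovery is where the intelligence lies.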
No amount of computing power and keyword search will find the pros and cons of a hotel without this sort of intelligence behind it. But the problem is how to quantify the intelligence of the algorithm so that it can compete with the statistics by which people normally rate computers and software: gigahertz, number of documents, milliseconds to query. What we need is an intelligence index for applications, so that people can want and demand smarter, more useful applications without knowing what the heck they are talking about.
Thu, 19 Jul 2007
Last Saturday, the Vancouver Sun's story on Vancouver's new highrise apartment buildings was actually about highrise apartment buildings. The Sun so often gets EcoDensity wrong that they imagine this form everywhere, but this time they reported a planning story correctly.
For the first time, certain suburban townships are allowing buildings higher than 4 storeys, which may lead to a pattern of development with clusters of towers throughout the region. Spreading the density more uniformly is one way to reduce sprawl, although consistent 4-6 storey height limits over a large area seems to be the most effective way.
Langley politicians who oppose highrises correctly point to a major factor that produces sprawl: some people move out consciously to get away from the city, the rest want Starbucks and highrises. For whatever reasons, economic, cultural, social, aesthetic, who knows, proximity to highrises makes a chunk of the population want to move away. So when you plunk them down in the suburbs are you reducing the segregation and therefore improving all your statistics, or are you chasing half your population even further afield?
I don't have an answer. The mountains change the land economics, and also the social parameters. In most cities, local density differences make a large difference to demographic mix, and the greater the variance the worse the sprawl and the GHG emissions. And in most cities high-rises are unnecessary, playing no positive role in reducing total land use or total car use. But in cities like Hong Kong and Honolulu highrises are a real, not illusory, contributor to compactness. I am curious to know if Vancouver is also like that. Some day, I'd like to get all the data and run simulations on the city.
Notwithstanding, there is a fatal flaw in the transition, which may cause economics to dictate terms to planners. Vancouver has already allowed several applications that exceed the city's current height limits. Brent Toderian, head of city planning, knows that any sign that the city may not follow its rules will cause land prices to shoot up, based not on what the zoning allows but on what they think some good lobbying at city hall might produce. This makes it difficult to build anything at all, and smart developers will put their money on lawyers and lobbying, not on architects and urban designers.
However, Toderian gets into the Achilles' heel of the entire EcoDensity initiative: there's no money for a planning review. Instead the city will wheel and deal to get parks and housing without paying. In other words, unable to get parks and housing the open and equitable way, Vancouver may have to discard urban planning and instead auction its zoning laws to the highest bidder.
Wed, 18 Jul 2007
New York City, on the brink of going ahead with congestion charges for part of the island of Manhattan, missed a key deadline yesterday, when the state of New York failed to endorse its bid for $537 million in U.S. Department of Transportation funding for the project.
With the congestion charge, or "cordon toll" as DoT sometimes calls it, the $8 toll for each car coming downtown on work days would be used to improve transit. The plan got caught up in a tight deadline and politics, with mayor Bloomberg unable to convince state legislators to let him go ahead with it. The plan is, predictably, not very popular in New York suburbs, and only in Manhattan does it get support.
Tue, 17 Jul 2007
I've received a bunch of e-mail and seen a bunch of blogs about the evil Yahoo Web Beacons recently. I don't know why now. According to the apocryphal stories, these new web beacons are similar to cookies, but allow Yahoo to record every website you visit -ANYWHERE- on the Internet, even when you're not connected to Yahoo. And this is considered unusually invasive.
In fact, Yahoo has been using "web beacons" for a decade or so. Everyone has. A web beacon is just a picture, sometimes a transparent pixel, that carries a cookie. Like any web page or picture, it tells its web server the URL of the referring page. Like any cookie, and there are cookies associated with virtually every element of every web page out there on the internet, it lets the domain that is serving it assign you an identifier and track what other pages or pictures it has served you before. Every web advertiser has been using these tracking cookies for a decade, ever since the cookie was invented.
So besides the fancy "beacon" name, what is different about Yahoo's tracking cookie? Actually the difference makes it more benign than most since it lets you turn off 3rd party cookies. Yahoo has let you opt out since 2001, according to the web archive from August 2001.
3rd party cookies are what advertisers use. A web server can only read a cookie that has been written by a server within the same domain and even then it depends on the file path. So yahoo.com can only read a cookie written by a yahoo.com server. However, if my blog were to have an ad served by Yahoo (it doesn't) then Yahoo could read the unique id that it assigned to the user at the time that the Yahoo ad was being retrieved. My blog server would have no access to this information, but I the author of the page am aware of the possibility because I would have been the one who put the Yahoo ad there. So in theory I should have something in my privacy statement saying "you realize that whenever you see an ad or picture or script on this site that is served by a third party, that third party server may be using cookies and I have no way of finding out whether they are or not and if they are what use they make of the information."
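The domain rule described above can be sketched in a few lines. This is a deliberately simplified model, not how any real browser is implemented; actual cookie scoping also involves paths, the `Secure` attribute, the Public Suffix List, and other details.

```python
# Simplified model of the cookie domain rule: a server can only read
# cookies set for its own domain (or a parent domain of it). The host
# names below are hypothetical examples.
def can_read_cookie(server_host: str, cookie_domain: str) -> bool:
    """True if server_host falls within cookie_domain."""
    return (server_host == cookie_domain
            or server_host.endswith("." + cookie_domain))

# An ad server inside yahoo.com can read a yahoo.com cookie...
print(can_read_cookie("ads.yahoo.com", "yahoo.com"))        # True
# ...but a blog on another domain cannot.
print(can_read_cookie("myblog.example.org", "yahoo.com"))   # False
```

This is exactly why the third-party arrangement matters: the blog's own server never sees the Yahoo cookie, but the Yahoo ad server does, on every page where its ad appears.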
By opting out of beacons on the Yahoo site, you are asking the Yahoo ad server to please avert its eyes whenever the cookies that Yahoo has set show up on an ad that Yahoo serves on my blog. Ironically, it does that using a 3rd party cookie about you, so that it can look up your tracking preference. Yahoo does not have complete power to stop the cookies from being transmitted. It is the browsers that volunteer the cookie information whenever they request a file from a server. If you want the cookie to not be transmitted at all, you can go to your own browser options and change that.
It's unfortunate that Yahoo is being singled out, not because it is doing the tracking but because it offers the ability to opt out (albeit 6 years after the fact). It's a valid issue to be concerned about, because the major web companies now own so much of the online advertising market that they are in a good position to collect pretty good profiles of just about anyone. Since they also own the sites where you log in with personal information and where your e-mail goes, they could, if they chose to, put together some pretty intrusive profiles of what you do and what you read on a minute-by-minute basis. We just have to trust them not to. It would be interesting to find out how many web pages have tracking cookies from Google, because of Adsense, search, Google Analytics, and now Doubleclick and Adscape. If you said "all of them" you'd probably be close. But then again, Google promises not to be evil, so we're OK.
Wed, 11 Jul 2007
I have just been looking at the LEED for Neighborhood Development Pilot Rating System, part of a project that I heartily endorse. LEED, Leadership in Energy and Environmental Design, is a certification for green buildings. But the environmental impact of buildings is just as much about the context in which the building is located as about the building itself.
The LEED for Neighborhood Development Rating System integrates the principles of smart growth, urbanism, and green building into a standard for neighborhood design, setting high standards for environmentally responsible development. It is a collaboration between the U.S. Green Building Council, the Congress for the New Urbanism, and the Natural Resources Defense Council. The project is in a pilot stage and the list of registered pilot projects will be posted soon.
Although most of it is quite reasonable, with points for various aspects of environmental protection, reduced driving, access to services and open space, walkability as well as energy efficiency of buildings and infrastructure, there is a significant problem in some of the indicators, specifically between Neighborhood Pattern and Design (NPD) credits #1 and #3.
NPD credit 1 gives 0-7 points depending on the density (see table). It implies that the environmental benefits of density, and specifically residential density, are linear. That is to say it implies that the greater the density of a neighborhood the greater the environmental benefit. This is probably not the case in most cities. There is actually an optimal residential density above which environmental benefits decrease. I would gladly compare the evidence supporting each of these positions.
One of the factors responsible for this decrease in benefits at higher density is partially, but not sufficiently, compensated by the diversity index: higher densities tend to be dominated by multifamily forms. On a regional scale, too large a proportion of multifamily forms in one area, and in particular highrise apartment buildings, is as bad as too large a proportion of single-family forms. In fact they are two sides of the same coin - when many areas have an above-average proportion of one form, mathematically other areas will be below average, and economics and sociology will also exacerbate the pattern.
NPD credit 3, diversity in housing sizes and types, is on a 3-point scale, using a Simpson Diversity Index, typically used to measure biodiversity. It's pretty well the same as the Gibbs-Martin index normally used for this type of measure, which suggests that the people who set up this LEED rating system majored in biology.
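For readers unfamiliar with the index: it is computed as 1 minus the sum of squared shares of each housing type, so a monoculture scores 0 and an even mix scores close to 1. Here is a minimal sketch; the housing counts are made-up examples, not from the LEED document.

```python
# Simpson / Gibbs-Martin diversity index: D = 1 - sum(p_i^2),
# where p_i is the share of housing type i. Counts are illustrative.
def diversity_index(counts):
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

# A neighborhood made up of a single housing type has no diversity...
print(diversity_index([100]))             # 0.0
# ...while an even mix of four types scores 0.75.
print(diversity_index([25, 25, 25, 25]))  # 0.75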
The density credit is only partially compensated by the diversity index. Having very high density made up of apartments only can earn 7 points at the risk of giving up the 3 diversity points.
The solution to this problem is to reduce the maximum score for residential density and increase the one for diversity. Higher diversity tends to imply higher density in any case, so the points for good compact mixed neighborhoods where everyone can live will be maximized and only the more sterile socio-demographic ghettoes will score a bit lower.
By the way, employment density does not have the same problem, although that is a matter of some debate and is probably related to transit infrastructure. Having some good employment destinations on transit lines is a necessary condition to letting the rest of the city increase its transit modal share.
Sun, 08 Jul 2007
A lot of people from across the political spectrum support a carbon tax. There is of course the Green Party, George Soros, Al Gore, and the Toronto Star. On the right, the British Conservative Party supports a carbon tax, and U.S. conservatives are warming up to it, and they don't even need to stop denying climate science to support it.
But now the Cato Institute, an organization so right wing that they had to go back over 2,000 years to find a politician to emulate (*), supports it? Actually what it proposes is just a kneejerk reaction against Live Earth, calling emission reductions "Expensive and Infeasible". But it continues by stating as fact its opinion that "There is no politically and technologically acceptable suite of technologies that could reduce US emissions 90% in the next 42 years. But the cost of any attempt, absent the proper technology, will be enormous. The legacy of Live Earth will be one of increasing atmospheric carbon dioxide and decreasing wealth -- wealth that could be saved and used to invest in the technologies of the future."
Hmmm... Tap into the wealth created by current CO2-emitting technology, save it and invest it in new technologies. They're proposing a carbon tax! Either that or a program of mandatory investments by the wealthy into designated green energy technology companies. No, I think the first one is right.
Actually, the Cato Institute does not disagree with a carbon tax. They even thought a decade ago that a modest carbon tax was quite reasonable. They quote economist William Nordhaus who at the time had calculated that $5.30 a ton rising slowly to $10 a ton was optimal (average cost = average benefit). He re-calculated just recently and came up with an optimal carbon tax of $35 per ton of carbon, rising over time to $85 in 2050 and to $206 in 2100. Using different assumptions that environmental costs paid for by our children in the future are nearly as important as financial costs paid by us today, he gets $360 per ton, rising to $1000 but then coming down again. The paper will be published in a forthcoming issue of the Journal of Economic Literature.
(* The Cato Institute is named after Cato the Younger. Yes, the dislikeable guy in the HBO series Rome, who found Julius Caesar too left wing. Actually, to be fair, the Institute is named after Cato's Letters, a British work of the 1720's.)
Tue, 03 Jul 2007
According to Bruegmann (author of Sprawl: A Compact History) in Forbes, attacks on sprawl in the US are built on "an extremely shaky foundation of class-based aesthetic assumptions and misinformation". So now the suburbs are built by the poor and it is the upper-class inner-city elite that is criticizing it.
That must sound ridiculous to someone from Winnipeg, where incomes get higher the further you get from downtown. That is no longer the case in all cities, but certainly we can't blame the underclass for sprawl. I won't even critique the Bruegmann article. Its use of mix-and-match statistics speaks for itself. For instance, he seems to call Los Angeles both the lowest and the highest density city depending on the argument.
The reactions to the article are also entertaining. Something about King George and Transformers.
Mon, 02 Jul 2007
I went to Amazon to get more details on the book Zoned Out: Regulation, Markets, and Choices in Transportation and Metropolitan Land Use (RFF Press) , which explores the theories that sprawl is caused by market forces; that it exists because this is what people want. I used the "search inside the book" feature of Amazon, and asked it for a random page. It gave me page 70.
There is a well-known theory of land economics, known as the Tiebout-Hamilton model of local public choice, which says in essence that with a multitude of different local governments, consumers will choose the one that specializes in the particular combination of taxes, services and zoning that matches their preferences. That way there is a free market in governments as well as houses, and everyone gets what they want.
Here is page 70 of Zoned Out, which sold me on the book:
Where travel behavior research focuses solely on the external costs of transportation - such as the pollution and congestion imposed on others - it erroneously assumes that households are able to choose their optimal location, within budget constraints. But such optimization would be impossible if a community that would otherwise be desirable to a household were closed because local regulations excluded the kind of housing the household desired or could afford. Thus this research systematically neglects the internal costs of exclusion - the gap in benefits from a household's actual location and the location it might have chosen in the absence of exclusionary regulations.
To get more details about the book, hover your mouse on the picture of the book's cover.
Thu, 28 Jun 2007
Downtown Columbus, Ohio, is being transformed into an intentionally eclectic residential neighbourhood that covers 9 city blocks. Built principally of townhouses and lowrise apartment buildings, this new neighbourhood with tree-lined streets is going in amidst the office highrises and entire blocks of surface parking.
Eric Fredericks of Walkable Neighborhoods calls it disappointing. He thinks it is too low-rise for downtown. I call it inspired and necessary.
If you build larger-scale condos within walking distance of downtown, with easy access to transit, the area will become a socially stunted community, with only a small range of demographic groups. And that range will not include those who would consume a great deal more land and transportation if they were elsewhere. Building bigger near downtown creates more sprawl.
Typically, the high-density areas that work well at supporting all generations and at reducing the city's total ecological footprint were built a hundred years ago or more. What I like about this development is that they are building it with some variety, as though it were built gradually over the course of decades. It's got some porte-cocheres and inner courtyards, something that was done in the 19th century to let the coaches through and to bring in the horses. It's got some buildings that look like converted industrial and institutional buildings. Who knows, they may actually be occupied by industrial or institutional tenants, but more likely they will be residential. The apartments look like walkups; I see no obvious signs of elevators in the mock-up. And it achieves all that without looking too much like make-believe heritage. I guess mixing Victorian, Edwardian (do Americans name architectural styles after the reigning British monarchs?) and Art Nouveau style buildings side by side makes it look like it grew naturally.
There is a wonderful virtual walking tour that shows one streetscape. As you go through it, you will be reminded of several cities. This bit looks like Plateau Mont-Royal in Montreal, that bit like a bit of old Toronto, here's a snippet of Philadelphia, there's that really nice old building from Chicago. Columbus? You're sure it's Columbus, Ohio, not another one? That's not how I pictured it.
Wed, 27 Jun 2007
In a bold move that may change the face of North American cities, General Growth Properties, a major shopping mall owner, has hired Duany Plater-Zyberk to turn a suburban mall into a New Urbanist town centre.
The company intends to add offices, housing, and other uses to its mall properties, further enhancing the profitability of their property holdings. "the big idea is to integrate the mall into a larger urban fabric, kind of like the 19th-century urban arcaded streets were in Europe," says Thomas D’Alesandro IV, the company's senior vice president.
European cities tended to be centered on a cathedral, not a JC Penney's, and the arcaded street predates the nineteenth century by a thousand years, but it was only in 19th century Europe that the famous enclosed "shopping arcades" of Milan and Piccadilly, with their luxury products and single landlord, inspired the enclosed shopping malls of America, with the fiction of a walkable street to tie it all together.
However, with shopping having become the new religion, building a new town around a shopping mall is not a bad idea. If the owners of the enclosed shopping mall are weaned away from the economic necessity of capturing far-flung customers in a gilded prison separated from the rest of the world by a moat of surface parking, and instead see it as a "company store" where the company also controls the houses and employment of an entire captive town, there just might be an economic incentive to reducing driving.
Tue, 26 Jun 2007
No, this is not a plug-in from your browser. Google is promoting plug-in hybrid electric vehicles. Through its RechargeIT program, the Google.org philanthropic arm of Google has created a small fleet of plug-in hybrids and plans to have over 100 plug-in hybrids in its corporate fleet.
In cooperation with Toronto-based plug-in hybrid maker Hymotion, and battery maker A123 Systems, Google is attempting to demonstrate the environmental advantage of plug-in hybrids. Although not quite as misleading as the calculations of certain PHEV advocates, Google's calculations do yield to the temptation to exaggerate just a little. For instance they compare their small cars not to cars of the same size and age, but to the U.S. fleet average. They also go straight from the state-level emission factor (pounds CO2e per kilowatt-hour generated) to the number of kilowatt-hours consumed by the car, without considering the losses along the way. But it is neat that they are doing it state by state. So for instance if you live in North Dakota, it tells you that the gasoline engine produces less CO2 than plugging in your hybrid.
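To see why skipping the losses flatters the result, here is a back-of-the-envelope sketch. All the numbers (emission factor, kWh per mile, loss rates) are assumptions picked for illustration, not Google's or any state's actual figures.

```python
# Illustrative grid-to-wheels CO2 calculation. Generated electricity
# loses some energy in transmission and again in battery charging, so
# the kWh consumed by the car must be grossed up before multiplying by
# the generation emission factor. All parameter values are assumptions.
def grid_co2_per_mile(factor_lb_per_kwh, kwh_per_mile,
                      grid_loss=0.07, charging_loss=0.12):
    """lb CO2e per mile, accounting for transmission and charging losses."""
    generated = kwh_per_mile / ((1 - grid_loss) * (1 - charging_loss))
    return factor_lb_per_kwh * generated

naive  = 1.0 * 0.3                    # the shortcut: factor x kWh at the plug
honest = grid_co2_per_mile(1.0, 0.3)  # grossed up for the losses
print(round(naive, 3), round(honest, 3))
```

With these assumed loss rates the shortcut understates emissions by roughly 20 percent, which is the kind of exaggeration the post is complaining about.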
Luckily, Google is in California, which does a pretty good job of keeping the CO2 emissions of electric generation relatively low. But just to be sure, Google is installing solar panels, to recharge the vehicles off-grid. More importantly for the economics of plug-in hybrids, Google will be trying out vehicle-to-grid technology, letting the power grid take advantage of the vehicle's power storage potential to reduce the problems of peak loads. In addition, Google is experimenting with car-sharing. This allows Google employees who come to work without a car to borrow one for business reasons or personal errands.
Thu, 21 Jun 2007
There is a nice set of Dilbert cartoons about green consultants. In this one, Dogbert suggests sabotaging someone's career so he no longer commutes.
According to the latest figures, 70% of workers drive to work alone. The daily commute is getting longer. Cars have been getting less efficient. Unfortunately, the commute to work is becoming an increasingly small percentage of total driving. Non-work travel now accounts for the vast majority of driving, and commuting accounts for, I think, less than a quarter of the total. In fact, studies have shown that telecommuting doesn't help much, because telecommuters actually increase their non-work driving.
It's sad that even the most obvious simple solutions barely even work any more. The commute is important only because it can most easily be substituted by a transit trip. But really, eliminating car trips to stores and to schools will have a lot more impact than thinking back to the 50's when the breadwinner's commute downtown accounted for most of the household's miles travelled.
Wed, 20 Jun 2007
I occasionally write to government ministers. A very few of them I know personally, but mostly I fool myself into thinking that I can somehow provide information that will enlighten government policy.
I'm not a complete lunatic. I work for a policy research firm. Work that I do professionally does influence government policy. I've testified at House of Commons and Senate committees. Government white papers have plagiarized my reports. And given that a couple of my friends have been elected and gotten important responsibilities, I know it is possible to have an intelligent conversation with them and maybe even see that reflected in legislation later.
But every now and then I write as a private citizen, or a letter I write to someone I know gets forwarded to a minister that I don't know, without a covering letter saying I am not a raving loony.
I just got a reply back from one. I had sent a thoughtful piece, with scholarly references, showing what aspects of New York City's "Housing First" programs could be adapted here and the expected effects and policy context. I'm not with any pressure group here, I just happen to have done a few projects in related areas and I saw an opportunity to improve programs over here.
Well, as usual, the letter I got back bore no relation to the one I sent. It grabbed on to a keyword in my first sentence and listed the department's programs related to that keyword. I wish that I could be sure that this is a form letter spat out by a machine that looks for key words. No, someone who writes this sort of correspondence assures me that a thousand dollars or two of taxpayers' money went to composing and approving that studiously vacuous response, and that anyone who had anything to do with developing programs was carefully kept far away from any such external input. If I wanted to be a responsible citizen, I would avoid addressing the government directly.
Mon, 18 Jun 2007
Mansfield, Texas, in the Dallas-Fort Worth area, is trying a new approach to downtown revitalization. Rather than putting money into bricks and mortar and hoping that a nicer looking main street will attract customers and businesses, they are putting their money into events that draw people into the town centre and hoping that merchants and landowners will then see an economic benefit to sprucing up their storefronts.
This is pretty well how a shopping mall works. They provide the retail traffic, people walking in front of your store on their way to the anchor tenant. It is then the responsibility of the smaller tenant to tempt them to come in, and the storefront spares no expense.
This is not necessarily the best way to preserve historic buildings, but if the preservation rules are in place first, a nice looking pedestrian-oriented frontage will draw more customers than its dingy neighbours. As long as the dingy neighbours don't drive people away in the first place.
Thu, 14 Jun 2007
I'll admit it, I own shares in the oil industry. Long story. I also have some interests in enhanced oil recovery, where you pump water, detergent or a solvent into the ground to extract reluctant oil. Pumping high-pressure CO2 is even better. Not only does pressurized liquid CO2 dissolve the oil, but it is a good way to take greenhouse gases out of the atmosphere and into a sink. Not a magic bullet, but from a business point of view, the impending market in GHG credits means that this business is attractive even if relatively little oil is recovered. From an environmental point of view it is better than tar sands or coal.
I was a bit surprised when reading a press release from one of these exploration companies that they had found significant valuable CO2 reserves. Yes, the CO2 being pumped into the oilfields for enhanced oil recovery, after all the research subsidized by environmental research funding, will not come from the atmosphere, but be pumped out of the ground from CO2 wells that would not have been drilled a few years ago. It is a steady supply, already pressurized and close to the oilfields, unlike the stuff from industrial processes that could be diverted from going into the atmosphere and might make a difference in our climate. How much will leak into the atmosphere along the way? Has all this research merely found a way to release even more CO2 into our long-suffering atmosphere?
It's an old story elsewhere but it's relatively new in Ottawa. Two different developers, having had setbacks in their attempts to increase the zoning, seem to be taking it out on neighbours.
In one case, the owner of a brownfield zoned for a set of large office buildings wanted to convert it to residential and build even larger apartment highrises, but ended up building significantly smaller buildings and mostly townhouses. The developer had promised to remove the contaminated soil in winter, when the health effect of moving contaminated soil would not be as great.
Well, when winter came some neighbours, not realizing that this was as good a win as anyone ever gets, decided to initiate legal action against the decision, hoping to get rid of the highrise towers altogether. It was all eventually settled, but the developer is now digging up the contaminated site in summer, saying the neighbours should have thought of that when they messed with the project schedule through their legal action. Retribution?
In a different case, City Council unexpectedly overturned a decision of the Planning Committee, and turned down a zoning change that was inconsistent with a community plan that they had passed two years before. This is an exceedingly rare event, and the quiet lobbying by neighbours seems to have worked. This time, the developer appealed City Council's decision. A few days ago the neighbours decided (I think) to hire a lawyer and planner to represent them at the appeal. A few days later, all trees on or near the property line, some reputedly a century old, were cut down, including some across the property line. Their noise and privacy barrier is gone, as is their shade. Punishment?
The reason these stories are rare in this city is that neighbours never win against large developers, particularly not those in modest central communities. I remember once trying to organize neighbours in a legal fight against one developer. I went door to door asking for donations in a really down and out part of town. The residents of the rooming houses all around donated $5 and $10 bills in great numbers, and apologized for not being able to give more. Of course that wasn't nearly enough. The developer won as usual.
Thu, 07 Jun 2007
Two studies have come out roughly simultaneously regarding long-term transportation planning in the Ottawa-Gatineau area. Two very different studies with very different visions.
One is the terms of reference for the Environmental Assessment Study of Future Interprovincial Crossings, a study that looks at bridges over the river separating the two cities and the two provinces.
The second is the report of the Mayor of Ottawa’s Task Force on Transportation, formed in the aftermath of the light rail fiasco, in which the new mayor and council voted to cancel the contract for construction of a rail system that had been approved by the previous mayor and council. Those who lobbied against it were a coalition of people who thought it was too ambitious, not ambitious enough, or about right but using the wrong technology (electric vs. diesel).
The new proposed Ottawa plan actually shows some vision, while the interprovincial EA seems like old-style business as usual. The Ottawa plan focuses on rail, that's its job, but also presents it in the context of larger-scale transportation planning. So the idea of using rail to provide key interprovincial passenger and freight capacity, and thereby reduce vehicle transportation demand, is proposed, along with integrated multi-modal transportation strategies. Their objective is to make automobile and truck transportation the least attractive of many transportation options. Not to the extent of the Vancouver "congestion is our friend" plan to let roads degrade, but by not making the tempting compromises that make transit a little cheaper but also make it a poor alternative to cars. So their plan makes transit seem positively luxurious. The glorious old downtown train station, currently converted into a conference centre, is turned back into a convenient train station. Downtown transit stops are underground, with climate-controlled passages to main office buildings. Sounds a lot better than what we have now, where passengers, pedestrians and cyclists use umbrellas to protect their legs from the constant spray of slush that buses mete out.
The interprovincial plan, however, says "There is a need for additional capacity across the Ottawa River even if aggressive Official Plan transit modal split, Transportation Demand Management (TDM) and Transportation Systems/Supply Management (TSM) targets are achieved." By "capacity" they of course mean auto capacity (what else matters?). Their count of lanes and capacity ignores pedestrian and cycling lanes, and rail bridges. It just focuses on cars.
Both plans talk about the necessity of eliminating the heavy truck route through ordinary downtown and residential streets, currently featuring 4 different 90 degree turns through signalled intersections in mostly residential heritage areas. The Ottawa plan talks about dangers to the health of residents because of fine particulates and the transportation of dangerous goods, and the damage to traditional neighbourhoods. The interprovincial plan talks about congestion and the economic impact on tourism. All right, that's not all they talk about, but it looks like when they see an idling vehicle, the interprovincial planners think of the cost of gas and the frustration of the driver, while those breathing in the fumes are a secondary concern.
Thu, 31 May 2007
Here is an interesting panorama of a busy dreary Toronto intersection as it might look if it were a woonerf.
A woonerf is a Dutch invention based on having very little formal separation between automobiles and pedestrians. The effect is like making cars drive on the sidewalk - the ultimate traffic calming measure. Pedestrians don't just have priority, this is an alternate universe where they are the evil overlords and their children make you bend to their whim. Once the drivers realize that the pedestrians own this realm, very few visual signals are required to control traffic flow. The speed gets low enough and drivers are alert enough that a minimalist approach works. Woonerf intersections and traffic circles have some visual cues but no signs.
The panorama in the link lets you click and drag to look around. What the photographer has done is to paint out all the traffic regulating elements. The effect is surreal. Of course, woonerf is unlikely to work on this type of high-volume intersection of major arterials, so I assume the panorama is tongue in cheek.
Woonerfs have been slow to propagate outside the Netherlands and Flanders, but there are some pilot projects in the UK, where they are called Home Zones. Despite a great deal of talk in North America, and high-profile articles about traffic designer Hans Monderman, nothing has come of it yet. Would the concept work here? Will cars submit to this humiliation? Will fire departments and snow removal officials veto it? They're really not fond of new ideas.
There is an urban design charrette coming up for a large brownfield area near where I live. I will propose using my neighbours as guinea pigs.
Tue, 29 May 2007
This is my answer to a Canadian Press story yesterday where an unspecified group (what's with these journalists?) discussed some well-known ideas for creating a more sustainable transportation system. To quote from the story:
One of the challenges facing those who aspire to limit the number of vehicles on the road is the consumer, as any hike in gas prices is usually accompanied by calls for government intervention - not for better urban planning and public transit.
Well, here it is: a high-end public transit ad.
I nearly cried when I saw the Vancouver Sun's story on Vancouver's new Ecodensity Draft Charter with "high-density housing next to its major parks and along every one of its major streets", according to the Sun. What a way to kill a perfectly good city, lining the edges of all its parks and commercial streets with high-rises, so no one ever sees the sun or anything human-scale again unless they flee to the suburbs. Since there are misguided people in Vancouver who honestly believe that living in a million-dollar unit in a 35-storey condo is ecologically righteous, maybe Vancouver is really going down that path.
However, I remembered that the press badly mangled the Ecodensity story last year when it came out.
Looking at the original documents on the vancouver-ecodensity.ca web site, they say nothing of the sort. First of all there is nothing at all about these cockeyed ideas in the charter itself. However, in a discussion document entitled "Suggested Tools and Actions - DRAFT" it gives the following two suggestions:
For those who read this on a daily basis, the blog was offline for a few days. This is because the machine that hosts it lost power and then couldn't restart. I learned a great deal more than I wanted to about SCSI & RAID controllers, hard drives, master boot records, volume boot records, and NTFS. It's better now.
Tags: computers
Thu, 24 May 2007
I'm tired of having to figure out why my software does things other than what I want. Really it should know and be willing to volunteer the information.
For instance, take the width of a cell or div on an HTML page. I go to the trouble of specifying the width of an element in the HTML or the style sheet. But the browser reserves the right to change this width unilaterally if it thinks the content won't fit. OK, it could be that it's for the best and that there would be dire consequences if it were to do as it's told, or maybe there's a contradiction between two conflicting width instructions. What I would like to know is WHY? Why is this table wider than I said? Back when I was doing expert system shells, they had an explanation facility. You asked why it came to a decision, and it could tell you the input information and rules that it used to come to that conclusion. Why can't a rendering engine have that? Countless millions of hapless web page authors do this laborious detective work themselves every day, making the web a pretty inefficient place to do any serious work.
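The kind of explanation facility those old expert system shells offered is not hard to sketch. Here is a minimal toy version in Python; the rule names and facts are entirely hypothetical stand-ins for whatever a rendering engine actually decides, not real browser internals.

```python
# A toy rule engine with a "WHY?" facility: every derived conclusion
# remembers the rule and the facts that produced it, so it can answer
# "why did you decide that?". All names below are illustrative only.

rules = [
    # (conclusion, premises, rule name)
    ("shrink_table", {"content_overflows", "width_fixed"}, "R1: overflow forces reflow"),
    ("content_overflows", {"long_unbreakable_word"}, "R2: unbreakable content overflows"),
]

def infer(facts):
    """Forward-chain over the rules until nothing new can be derived,
    recording the justification for each derived fact."""
    why = {}
    changed = True
    while changed:
        changed = False
        for conclusion, premises, name in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)
                why[conclusion] = (name, sorted(premises))
                changed = True
    return why

def explain(fact, why):
    """Answer the WHY? button: input fact, or rule plus supporting facts."""
    if fact not in why:
        return f"{fact}: given as input"
    name, premises = why[fact]
    return f"{fact}: by {name}, from {premises}"

trace = infer({"long_unbreakable_word", "width_fixed"})
print(explain("shrink_table", trace))
```

Asking `explain` for a derived fact walks back exactly one step; a real facility would recurse through the premises to produce the full chain, but the bookkeeping is the same.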
Another one that needs a WHY button is the failure to connect to a web server, or even worse a slow connection. I don't know how often I have to pull out the ipconfig, traceroute and nslookup type of tools to figure these things out myself. Or the Windows task manager to sort out whether the holdup is on my end or on the server. Most people just stare at the "loading" animation or the "server not found" page and then decide to blame their innocent firewall, or if they're using Windows, reboot and hope it goes away. Would it be so hard to have a "WHY?" button and let the machine go through the relatively routine analysis, rather than sitting there smugly staring back at me, saying you figure it out, genius?
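The first couple of steps of that routine checklist are easy to automate, as this rough Python sketch shows: resolve the name first, then attempt the TCP connection, and report which step failed and why. The hosts and ports in the demo are arbitrary examples.

```python
import socket

def diagnose(host, port, timeout=3.0):
    """Walk the checklist a 'WHY?' button could automate:
    DNS resolution first, then a TCP connection attempt,
    returning a plain-language explanation of the first failure."""
    try:
        addr = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)
    except socket.gaierror as e:
        return f"DNS lookup failed for {host}: {e}"
    family, kind, proto, _, sockaddr = addr[0]
    s = socket.socket(family, kind, proto)
    s.settimeout(timeout)
    try:
        s.connect(sockaddr)
        return f"TCP connection to {sockaddr} succeeded"
    except socket.timeout:
        return f"{sockaddr} did not answer within {timeout}s (firewall or dead host?)"
    except OSError as e:
        return f"{sockaddr} refused or unreachable: {e}"
    finally:
        s.close()

# Port 9 (the old 'discard' service) is closed on most machines,
# so this usually reports a refused connection rather than a timeout.
print(diagnose("localhost", 9))
```

A fuller version would go on to try an HTTP request and time each stage, but even this much distinguishes "your DNS is broken" from "the server is down", which is most of what the staring-at-the-spinner crowd needs.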
Fri, 18 May 2007
A follow-up to the previous post. Apparently, the fact that high-rise apartment buildings consume more energy than houses has been known for a long time. For instance, here is a table (my bolding) from "Multi-Unit Residential Building, Energy & Peak Demand Study" By Paul Myors, EnergyAustralia, with Rachel O'Leary and Rob Helstroom, NSW Department of Planning in ENERGY NEWS Vol 23 No. 4, December 2005.
TOTAL ANNUAL GREENHOUSE GAS EMISSIONS BY DWELLING TYPE.
A person in a high-rise building consumes nearly 3 times as much energy as one living in a townhouse. This particular survey was based on utility billing data, plus building and unit audits, plus some whole-of-building load profiles with on-site logging equipment.
Several CMHC reports going back for decades also wonder at this finding. CMHC is surprised and blames air leaks. Australians blame a penchant for long showers. But none of these can account for such a large difference. Some say that apartment dwellers lack concern for the environment, either because they are convinced that apartment living is already inherently more efficient, or because they are less invested in their community, while others still find something inherently wrong with sealed buildings that can not take advantage of natural air or light and where so much of the energy use seems outside the control of the individual.
Tue, 15 May 2007
Which one uses up more energy per square foot, a single detached house, or an apartment in a large building?
That should be easy: most of the walls of your apartment are interior walls; only one surface is exposed to exterior temperatures, as opposed to six surfaces in a single detached house. Plus you have economies of scale: you can insulate the heck out of that one surface and two windows, and the entire building can share the cost of the most efficient heating and cooling technology available, making the investment worthwhile.
You would think that apartments consume a lot less energy per square foot than houses. You would be wrong. According to the U.S. Energy Information Administration, Office of Energy Markets and End Use, apartments consume more energy per square foot than even single detached houses. An average single-family detached house consumes 44,700 BTU per square foot, while an apartment in a building with 5 or more units consumes 48,500 BTU per square foot.
Air conditioning a 3-bedroom unit consumes 4,300 BTU/ft2 in a house and 7,600 BTU/ft2 in an apartment. Heating does take less energy in apartments, but the total energy use is higher. Is it the elevators? Is it the hall and lobby lights left on all night? I don't know. And as I found in another article I wrote last year, at most densities apartment dwellers drive more than house dwellers.
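Just to put the quoted EIA figures in perspective, the gaps work out like this (a trivial calculation, using only the numbers cited above):

```python
# EIA figures quoted above, in BTU per square foot per year.
house = 44_700
apartment = 48_500  # buildings with 5 or more units

extra = (apartment / house - 1) * 100
print(f"Apartments use {extra:.0f}% more energy per square foot")  # ~9% more

# Air conditioning alone, for a 3-bedroom unit:
ac_house, ac_apartment = 4_300, 7_600
print(f"A/C uses {ac_apartment / ac_house:.2f}x the energy in an apartment")
```

So the overall difference is on the order of 9 per cent per square foot, with air conditioning alone nearly 1.8 times higher, and that's before counting the extra driving.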
So, seriously, why are companies like Tridel building nothing but apartment buildings in their "Green Communities" repertoire?
Sun, 13 May 2007
The theme of "urban splatter" is common in both blogs.
Fri, 11 May 2007
I just downloaded and used this nifty Windows tar-gzip tool. In Windows, most compression programs can easily decompress gzipped tars. But to create these things you have to resort to command-line tools. The ones I use require two steps, one to tar a directory and the other to gzip the tar. And for a single file, the gzip utility will erase the original file unless I remember to tell it not to. And of course Windows directory names with spaces can cause all sorts of problems.
This little tool is drag and drop. Drag a gzipped file or tar on it and it extracts it. Drop any other type of file and it gzips it. Drop a directory or group of files and it offers to tar it first. With default compression level it is pretty fast.
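For the command-line-inclined, the same one-step pack-and-unpack behaviour can be sketched with Python's standard tarfile module; this is just an illustration of the idea, not the tool described above, and the function names are my own.

```python
import os
import tarfile
import tempfile

def pack(path, out=None):
    """Create a gzipped tar of a file or directory in one step --
    no separate tar-then-gzip dance, and the original is untouched."""
    out = out or os.path.basename(path.rstrip(os.sep)) + ".tar.gz"
    with tarfile.open(out, "w:gz") as tf:
        tf.add(path, arcname=os.path.basename(path.rstrip(os.sep)))
    return out

def unpack(archive, dest="."):
    """Extract a gzipped tar into dest, leaving the archive in place."""
    with tarfile.open(archive, "r:gz") as tf:
        tf.extractall(dest)

# Round-trip demo in a temporary directory.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "notes.txt")
with open(src, "w") as f:
    f.write("hello")
archive = pack(src, os.path.join(tmp, "notes.tar.gz"))
unpack(archive, os.path.join(tmp, "out"))
print(open(os.path.join(tmp, "out", "notes.txt")).read())  # hello
```

Because tarfile streams through the gzip layer directly, there is no intermediate .tar eating temporary disk space, and paths with spaces are handled like any other string.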
I could complain about how it could be even easier to use, and make even more efficient use of temporary disk space, but I won't. Some tools give you expectations that they will do everything, and then disappoint. This one only claims to do one thing and it does it with a minimum of fuss.
Thu, 10 May 2007
After a city crew had erased an 8-year-old's hopscotch course on a residential sidewalk, acting under the new anti-graffiti by-law, a city councillor in Ottawa has decided to try to legalize both hopscotch and street hockey.
Notice of Motion
Moved by Councillor C. Doucet
WHEREAS obesity is an increasing health problem for children;
AND WHEREAS it is increasingly difficult for children to have access to unstructured play;
AND WHEREAS streets are the city's largest hard asset;
AND WHEREAS children deserve a fair access to that public asset;
AND WHEREAS activities like hopscotch and ball hockey are traditional play activities for children and have been widely accepted as an appropriate use of residential streets and sidewalks;
BE IT THEREFORE RESOLVED that ball hockey on residential streets will be permissible as long as the free flow of traffic is maintained once an adjustment in the game has been made to allow the passage of the car;
AND BE IT FURTHER RESOLVED that hopscotch will be permitted on city sidewalks and washable chalk hopscotch patterns will not be regarded as graffiti;
AND BE IT FURTHER RESOLVED that staff will consult with the public concerning what other child-friendly, community-friendly activities could be considered by Council as acceptable public uses of city sidewalks and streets.
It's nice to have a new municipal regulation with negative enforcement cost. It is curious that none of those involved, from the neighbour who phoned in the complaint, to the call centre employee, to the water truck driver instructed to drop his other duties and go erase the chalk hopscotch course quickly before it rains, stopped to think that this was not an appropriate use of public funds. The 8-year-old figured it out.
Tue, 08 May 2007
The theory of zoning is to make new development compatible with existing users of land, while treating all landowners equitably. If one landowner has the right to build one thing and the one next door has very different rights, then something is wrong. There are legitimate reasons why they may have different rights. To preserve the equitable treatment of landowners, there are two ways to make exceptions. One is to have a comprehensive plan that explains, in terms of officially accepted land use objectives, why people on one side of a dividing line should have rights different from those on the other side. The other is the variance route, a way for an individual property owner to argue why one lot in particular should be exempt from a general rule because his lot differs sufficiently that the spirit of the zoning rule is better applied with a variant on the letter of the rule.
When exceptions to the general rule are granted for other reasons, one can reasonably suspect that influence is at work to promote the interests of one owner over another. Planners in some cities (Toronto, and sometimes others) have argued that these exceptions have become necessary because the official plans have not kept up with changing conditions. Individual landowners can justify the cost of re-zoning, or if required of making a small change to the comprehensive plan that applies only to them, but not of redoing the entire planning exercise.
Essentially, owners and planners ask for rezoning arguing that the rezoning is in accordance with the "Shadow Plan", as I call the unpublished plan in the heads of senior planners. This plan is the set of reasonings determining what zoning changes they will consider reasonable even though it is not in accordance with the official comprehensive plan. You can well imagine what knowledge of and influence over this shadow plan is worth to a developer. They can buy land valued by others according to the official comprehensive plan, and re-value it according to this shadow plan.
The "Shadow Plan" comes into effect when a few conditions arise. One is a divergence in vision between those who create the plans and those who deal with applications for changes. In some cases, the official plans may be for public consumption only, given that they are developed with a degree of public consultation and democracy that creates a momentum that no one dares challenge, but knowing that the philistines can later be kept out of the important work of development decisions. Another condition is a land market that depends on zoning changes. Once the expectation has been established that serious development proposals can get zoning changes, land values go up to unrealistic levels. The traditional mantra of developers, that they cannot afford to develop anything within existing zoning and that their children will go hungry if they have to follow the regulations, may have a tiny speck of truth to it. Breaking out of that vicious circle requires both the elimination of the "shadow plan", either by adopting one that is enforceable or by showing some backbone in enforcing the existing one, and a change in the political culture in which following the rules is for suckers.
The practice of changing the zoning for a single parcel only is sometimes known as spot zoning. It is illegal in some jurisdictions. It is definitely frowned upon as a failure in the planning process elsewhere. It creates a patchwork of small irregular zoning areas that are tantamount to not having zoning at all - everything is changeable if you ask the right way.
How changeable is zoning? What is the "right" size for a zone? Is spot zoning a symptom of a malaise in planning and if so is the problem influence or lack of comprehensive plans?
In a nearly-scientific study, I assembled a few zoning maps from different cities to see how many zones and exceptions were typical. The methodology is not extremely rigorous, but here it is. First, I searched for online zoning maps in cities that are reputed for patchwork zoning; "reputed" depends on what I've read or who I've spoken to lately. Second, I looked for generally comparable areas, somewhat hampered by my lack of in-depth knowledge of many of these cities. Starting from downtown, I went generally west, or if it seemed that west would encounter some geographical obstacle I chose another likely-looking direction. I was looking for a 100-block area, reasonably close to downtown, with mostly residential but partially commercial zoning, and if possible including two distinct types of street layout. My guess was that these were the types of areas where there would be some development pressure and some local resistance. It happens that this often landed in poorer areas of town. Anecdotally, poorer areas of town can be subject to more development pressure, since residents are often renters and may not be well equipped to influence political decisions.
So here are the results (click pictures to zoom) :
So what works best? Some cities have little or no zoning at all. Houston is the most famous example. Is this the cause of sprawl, or are zoning codes the cause of sprawl? Similarly, zoning can be very restrictive, forcing complete separation of uses, or the regulations can be comprehensive enough for a single zone to allow the fabled "mixed use" while constraining it so that the market does not re-segregate the uses. But what does not work is sucker zoning, where your property rights are whatever you can cajole and bully the city into giving you.
Wed, 02 May 2007
An interesting non-competitive procurement procedure by the Government of Canada says that it has failed to find anyone in North America able to produce a CO2 capture and compression system. The contract comes from the government department with all the expertise in CO2 capture and storage.
They failed to get any bids last year when they looked for a company with directly related capabilities. They are now proposing to award the million dollar contract non-competitively to a firm that has experience in other types of gas compression. It will award it unless some other North American firm can demonstrate "evidence that it has successfully fabricated at least one gas liquefaction system OR at least one CO2 capture and compression system OR at least one CO2 Recovery System in the past seven years".
The pilot will need to work on a small combustion facility in Ottawa. The flue gas also contains nitrogen and sulfur oxides and particulates, which can apparently be vented.
If this is being pitched as the technology that will allow us to achieve greenhouse gas reductions without burning less fossil fuels, it would be more believable if someone somewhere knew how to do it at all, on even a small scale, even if it barely works.
Anyone can be an urban designer, thanks to the virtual world online video game Second Life. A sort of Sim City on steroids, Second Life charges you real money to lease some virtual land, but after that everything is virtual money and land development, and it lets you interact with other people in this world.
A community group in Paris is leasing an island in this world to build a model of Les Halles, the area near the Louvre, in order to evaluate development proposals and to hold a virtual design contest.
Why pay planners for expensive visualization software seen by only a few people, when you can have a very cheap and transparent process like this massive virtual charrette? The city planners and selected architect are not involved in this innovative initiative, which was launched by the community association "Accomplir".
It was an interesting and sobering realization by planners in the area of Washington DC. Of all the areas built over the years and designed to maximize building intensity and density, the one that best delivers the goods is downtown Washington DC, despite (but really because of) the grid layout of streets and the severe height restrictions, which lock building height to the width of the street and preserve heritage buildings and character. Not to mention the huge parks, museums, and monuments just taking up space.
This article on urban development intensities, by the Director of Arlington Economic Development, expresses surprise that even Tysons Corner, a no-holds-barred high-rise area in Fairfax County across the Potomac, intended to maximize intensity, has only one quarter of the job and population density of old downtown DC.
The article also says that the 18th century street grid has no trouble handling 21st century traffic levels. This is where DC cheats. Another thing that DC does right is to have good public transit and very little parking.
This should not be a surprise. 18th/19th century street grids and height restrictions, plus transit, are features of most high-density areas, including Paris, Berlin, Barcelona, and Plateau Mont-Royal in Montreal. They also look good. No matter the theory, high rises and hierarchical curvilinear streets don't come close.
Tue, 27 Mar 2007
This article looks at the cul-de-sac, the archetypal symbol of suburbia, which minimizes or maximizes paved area, depending on which paved area you count and how wide you make the streets.
City planners don't believe in them any more. They give the illusion of safety, but in fact just reduce accidents in one place and increase them elsewhere, by forcing a hierarchy of unwalkable streets. The book "Suburban Nation" coined the term "cul-de-sac kid" to describe children held captive by this way of laying out streets. Children can't get to where they want to be without getting a parent to chauffeur them, and this causes social isolation.
This is similar to the social isolation of children in high-rise buildings. Little wonder, then, that anti-social behaviours concentrate in areas of high-rises and in areas of suburban cul-de-sacs, practically skipping over the compact, grid-streeted neighbourhoods that were built around 100 years ago. Christopher Alexander said it best in the article "A City is Not a Tree", where he argues against thinking of a city hierarchically, but rather as a lattice or semilattice.
After half a century of pretending that we were changing things because we were thinking of the children, it turns out that the old way was better for the children all along.
Mon, 26 Mar 2007
The Oxford English Dictionary defines McJob as "an unstimulating, low-paid job with few prospects, esp. one created by the expansion of the service sector". McDonald's wants it changed to "a job that is stimulating, rewarding and offers genuine opportunities for career progression and skills that last a lifetime." This is according to a letter written by David Fairhurst, chief people officer in northern Europe for McDonald's, and obtained by the Financial Times.
What the company fails to appreciate is that dictionaries are not opinion leaders, they are timid followers. They add or change a definition only after people have been using a term to mean something consistently for a few years. The only way that major dictionaries will change their definition is if McDonald's and everyone else start using the term to mean "stimulating, rewarding job which offers excellent career prospects". I look forward to seeing those want ads offering "McJobs" without having to specify that it is meant in a positive sense.
I tend to ignore requests to change definitions for the purpose of PR. There is one exception. I got a letter some time back complaining about the term "warlike" in the definition of Blackfoot (Indian nations of the western prairies). There is plenty of historical evidence that in the 19th century, the Blackfoot confederacy was considered warlike by neighbours of all ethnic backgrounds. Still, they were at peace a lot of the time, and the adjective certainly does not apply today. But then there are plenty of nations that could objectively be called warlike, but where the word is not used in the definition. So I removed the adjective from all Indian nations. Except for the Iroquois nations, after consulting several of their members, who consider the warrior past and even present as a matter of pride and inherent to their ethnic identity. OK...
Thu, 15 Mar 2007
This article in E Magazine, Ten Things Wrong With Sprawl, is a relatively fair analysis of pro and anti sprawl thinking, focusing on what the author believes are "undeniably adverse effects". These are:
Do Americans move to the suburbs to avoid seeing poor black people, or to not have to pay for their schooling? Well in places where the school district covers both cities and suburbs and there is no legal way to avoid paying for them, the poor black people still end up in different schools with fewer resources. There is some other economics at work in creating segregation, and there are other political reasons why schools with richer parents end up with more resources. I remember one instance when I got $300,000 of extra goodies for an inner-city school through corporate donations and a government grant. Two years later, all the goodies had been moved to the board's "alternative school" in a richer neighbourhood. The poor children are welcome to go there, if their nannies can go pick them up at 3:00 like all the other kids. One way or another Amartya Sen's Capability Deprivation mechanisms will equate economic, social, and political inequalities, and auto-oriented urban design is just one way to do this with a clear conscience.
The environmental and community consequences are more undeniable and universal. To his credit, the author concedes that modern day exurbs are not the places of alienation described by some new urbanist writers, many of whom draw upon affection for the older urban neighborhoods of the early and mid-20th century. It's the new high-density downtown condos that have a lock on alienation. But these exurban communities are the "glocalized" communities described by Barry Wellman. Each person participates in several distinct partial communities of place, with a drive in between the places. That means that the political power of each place is relatively weaker, and the weakest communities lose. That is why things like homeless shelters end up in poor neighbourhoods, where they do the most social damage. This is also probably why the association between density and various noisy and disruptive uses still prevails in zoning codes. It's a vicious circle, where the uses that nobody wants end up in the areas with the least political clout, by associating them with density. Then when someone wants to tear things down and build something big, profitable, and alienating, it's much easier to dispossess the poor. Letting a designated area sink into misery is practically a necessity for eventually rebuilding the infrastructure, if economics alone is to do the job.
Sun, 04 Mar 2007
The Sierra Club of Florida is supporting the latest version of an initiative to give voters the ultimate decision over changes to comprehensive land use plans. The Sierra Club seems to believe that voters, given a choice, are more likely to oppose environmentally destructive sprawl than elected officials.
So does the Florida Chamber of Commerce, who opposes the initiative. Here is the text of its brochure:
Right now, a special interest group is gathering signatures across the state in an attempt to put its latest pet project into the Florida Constitution. They call it "Hometown Democracy," but it is really a scam that will turn every local planning decision into a negative political campaign. They want to take the power to manage growth out of the hands of the people and put it into the hands of a few special interest extremists.
Who are these special interests taking over "like pregnant pigs"? They don't say, but the brochure shows their pictures: a florida panther, two manatees, and some mangroves.
London is looking at what it can do next to reduce its greenhouse gas emissions. Most measures are simple changes in behaviour, including higher congestion charges for more polluting cars, and measures to make transit and cycling easier to use.
At the same time, New York City is being encouraged to remove incentives to drive, including free or very inexpensive parking when people have good transit alternatives. In both cases, both the city and the resident save money.
Wed, 14 Feb 2007
The announcement of Intel's prototype TeraFLOPS-on-a-chip 80-processor system is hailed as making artificial intelligence an everyday reality. Yet another one that thinks that computing power is what is holding AI back.
For decades, journalists and amateurs blamed the lack of computing power for the inability to make significant progress in AI. Well now the average new desktop computer is 100 times faster than the original Cray supercomputers, and a game console is 100 times faster than that. High-end video cards can reach a peak performance of hundreds of gigaflops.
And yet AI is not that much further ahead than it was 15 years ago. That's because algorithms, not computing power, are what AI needs to progress. Arthur Samuel had a champion-beating checkers program in the late 50's. AI software was solving algebra and calculus problems, and analogy problems from IQ tests, in the early 60's. MYCIN was doing medical diagnosis in the early 70's. In the late 80's, ALVINN drove coast to coast autonomously.
Then, caught up in a competitive atmosphere with each faction trying to prove that the other guy's algorithm could not possibly work and was diverting funding away from the real solution, funding for algorithms stopped and funders put their faith in computing power. Sorry, but NP-complete problems don't get solved much faster with faster hardware, planning and disambiguation were never a matter of retrieving data faster, and numerical methods like neural networks and fuzzy logic don't solve their fundamental mathematical limitations with more hardware. At least when hardware was constraining, researchers tried to compensate by having smarter algorithms.
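The point about NP-complete problems can be made concrete with a quick calculation (my illustration, not from the original post; the 2^n cost model and machine speeds are just illustrative assumptions). For a brute-force algorithm whose cost doubles with each added item, making the hardware 100 times faster only extends the largest solvable problem by about log2(100), roughly 7 items:

```python
def max_solvable_n(ops_per_second, seconds, cost_fn):
    """Largest n such that cost_fn(n) operations fit in the time budget."""
    n = 0
    while cost_fn(n + 1) <= ops_per_second * seconds:
        n += 1
    return n

budget = 3600                       # one hour of compute time
exponential = lambda n: 2.0 ** n    # brute-force cost that doubles per item

slow = max_solvable_n(1e6, budget, exponential)   # a 1 MHz-class machine
fast = max_solvable_n(1e8, budget, exponential)   # 100 times faster

print(slow, fast, fast - slow)  # 31 38 7
```

A century of Moore's law applied to that algorithm buys you seven more cities on the travelling-salesman tour; a better algorithm buys you thousands.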
By 2029, the computational power needed to simulate the human brain will cost a dollar, says Ray Kurzweil in his review of James Gardner's The Intelligent Universe. But it's going to run hot - roughly the temperature of a hydrogen bomb. I guess when a human brain does it, with slow analog synapses and at room temperature, it's doing it wrong, not realizing how many transistors it really needs to be intelligent.
Tue, 13 Feb 2007
So there is an attempt to put a high levy on iPods and MP3 players. I hesitate to say this because some of the people involved are my friends, but I think that a lot of those pushing this levy have it completely wrong, if they are willing to listen to a non-lawyer's opinion.
The private copying exception under the Berne Convention was intended to compensate music authors for the revenue they might lose when a country gives its citizens the right to make copies of media for private use, or more precisely removes the right of authors to prevent it.
Canada, among other countries, has chosen to remove this right from authors (it is not clear that they have given a right to consumers) and can levy a fee on the sales of recording media whose proceeds are to be redistributed to authors to compensate them for losing this right. This exception is not the same as fair use, and is also not intended to give anyone the right to pirate copyrighted content, and consequently it is not intended to compensate authors for pirating or other uses.
There are arguments that since private copying from one medium to the other, for instance ripping a CD onto an MP3 player, is legal it should not be subject to a levy. That is exactly backward. The levy is meant to compensate for that very act in exchange for making it "legal".
When calculating the appropriate levy, the question that must be asked is: if the authors had kept their right to prevent private copying, how many of those customers who use a particular recording medium would have purchased extra copies of the content?
Two things are important here: when DRM is used, it can very well interfere with the ability of consumers to make perfectly legal private copies. Authors do not have the right to use DRM to prevent this, but neither is it illegal for them to implement it. Consumers also do not have the right to DRM-free content for private use, but it is probably not illegal for them to break the DRM code for the purpose of private copying. But since DRM does make it more difficult to make private copies, the fact that it is in use should significantly reduce the compensation from the CMCC fund to the publishers that use it, since they are preventing, through electronic means, the very loss for which they would have been compensated.
As to putting a levy on iPods, this raises a very important question. Consider a professional musician who owns an iPod and religiously pays iTunes for every track. The track gets copied from her Mac to her iPod. Does this constitute private copying for which the author requires compensation? No, because there is no other way to get the content on to this player; the intention of the author when allowing iTunes to sell the content is to have it copied onto the iPod. Until music distributors find some way to get the content into these players without having it stored, the iPod must be considered the primary medium that was intended when the sale was consummated.
Put another way, if you were unable to copy the music from your computer to your iPod, would you be more likely to buy two copies? Of course not, you wouldn't even buy the first copy. So what is the proper compensation for the use of iPods and MP3 players? Zero; it is a fundamental property of the player technology that all that it plays must be copied on to it, so it differs fundamentally from something like an audio tape, where recordable media displace the published media. The private copying levy is not meant to be like crop insurance, passing on a cost to consumers to compensate producers for losses that they may or may not have been able to prevent. It is a very specific compensation for a very specific loss, with consumers equitably paying fair market value for what they use.
See Michael Geist's blog (left menu) for more analysis on the private copying levy.
Tags:Fri, 09 Feb 2007
Google is about 10 years old, depending on when you start counting. This is Google's predecessor, BackRub, as it looked in December 1996. In 1997, Larry Page and Sergey Brin started working on a successor, called Google, with the help of Scott Hassan and Alan Steremberg.
Their home pages from when they were students are all still there. Stanford never
deletes any of that stuff!
Tags: Google
Wed, 07 Feb 2007
The Congress for the New Urbanism has released its new rating system for green urban neighbourhoods. To qualify, the neighbourhood must satisfy certain mandatory criteria like avoiding wetlands and flood plains. After that, there are points for things like walkable streets and diversity of uses.
The huge, gaping hole in the evaluation system is that it does not try to prevent self-selection of specific segments of the population, something that has turned so many well-meaning "green" strategies into unreproducible islands of meaningless statistics that make no dent in the problem or even make it worse.
What do I mean? Suppose, for example, that you take prime locations close to transit and develop them entirely with apartments. Those small households that already take transit will happily live there, and keep on taking transit. No change overall. But the larger households, who would only take transit if it is convenient, can no longer live near transit. You've actually increased their car use, but you will have fine statistics saying how everyone in your new development now uses transit.
To their credit, the New Urbanists address this indirectly by making "affordable housing", "diversity of housing types", and "close to schools" worth a couple of points. Those should really be mandatory. To have any effect on sustainability, a project's population must have the same demographic characteristics as the city as a whole; in particular, a project that does not attract and change the behaviour of high energy users, such as suburban families with two cars, is not a green one. Families will flee neighbourhoods that are too overtly compact or affordable, and then vow to keep that stuff out of their suburbs.
Tue, 06 Feb 2007
Here is an interesting new site that just launched. Green Options is a savvy venture that is doing everything right commercially, from attracting the best bloggers and writers, to getting the word out and the links in. They have a fine SEO and advertising strategy. Who says economics and ecology are incompatible?
The site, which has been in "beta" for a month, includes a set of blogs, some discussion forums, a "Green Life Guide" wiki, and a "Green Report" summary of recent news. The last two or three are really too young to comment on. They need a couple of redesigns and maybe more useful and compelling content, but I'm sure they'll get around to it when they get some time.
The blogs are first-rate. A lot of good bloggers seem to have suspended work on their own personal sites in order to devote more time to this one. They range from the quirky "Grow Your Own Grass Arm Chair" to serious analyses and important interviews, including the ubiquitous preachy, impractical campaigns from the zero-waste and fear-what-you-can't-pronounce crowds, each competing for superlatives on how toxic all common household items are. Enough ranting. It's a good site, worth a visit.
Tags: Environment
Mon, 05 Feb 2007
In a recent Globe and Mail article, financial journalist Eric Reguly proposes a simple eco-list to make Canada a greener nation.
He does not mention killing tar sands subsidies, or the tar sands altogether (maybe he ran out of room). For him, it's easy and fair to set up economic incentives to achieve environmental goals. For instance, he says developers have no incentive to implement energy efficiency. He also believes energy taxes are not the answer, since they would have to be too high to be effective.
Compare that to the interesting debate on regulation vs. carbon taxes that ran in the Alternatives Journal between energy economists Jack Mintz and Mark Jaccard. One argues that specific regulation aimed at emitters is better than general costs passed on to consumers, whereas the other believes that both emission regulations and gradually increasing carbon taxes are required, and points to examples like tobacco, where it is the combination of regulations restricting specific uses and taxes as a general deterrent and economic signal that works where one or the other alone might fail.
Tue, 30 Jan 2007
Here is an interesting interview with Ed Mazria of Architecture 2030, who says that architecture is responsible for 48% of greenhouse gas emissions. "Every time we design a building, we set up its energy consumption pattern and its greenhouse gas emissions pattern for the next 50-100 years. That's why the building sector and the architecture sector is so critical."
He wants to see an immediate 50% reduction in fossil fuel energy consumption for developments and major renovations, and have it go to 60% in 2010, 70% in 2015, 80% in 2020, 90% in 2025, and carbon-neutral by 2030.
This has been bothering me for years and I finally found a simple solution. Say you have an HTML page encoded with one particular character set, in this case ISO-8859-1 (Latin 1, very widely used), and you need to call something that requires that parameters be encoded with Unicode UTF-8. How do you convert, particularly if you are using a POST query?
I won't go through all the solutions that nearly, but not quite, work. Telling either of the two servers (the one that generates the page or the one that receives the query) to do the conversion is tricky: both have databases and programming languages that don't easily support translation of character encodings. And mixing encodings on a single page, with various "charset" and "enctype" attributes, hardly ever works across browsers.
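Whatever hook ends up doing the work, the underlying transformation is always the same: decode the Latin-1 bytes back into abstract characters, then re-encode them as UTF-8. Here is a minimal sketch in Python (illustrative only; not necessarily the solution alluded to above, and real form data would also be percent-encoded):

```python
# What a Latin-1 page submits: é is the single byte 0xE9.
latin1_bytes = "café".encode("iso-8859-1")
assert latin1_bytes == b"caf\xe9"

# Decode to abstract characters, then re-encode for the UTF-8 service:
text = latin1_bytes.decode("iso-8859-1")
utf8_bytes = text.encode("utf-8")

# The same é is now the two-byte sequence 0xC3 0xA9.
assert utf8_bytes == b"caf\xc3\xa9"
```

The difficulty in practice is not the transformation itself but deciding where it happens, since neither server may be able to run such a step conveniently.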
Here is a neat new tool. Actually it's been there since May 2006 or so. It lets you visualize data in several dimensions, using animation, colours and size of bubbles.
Here is a chart I put together showing progress in GHG emissions vs GNP in several major countries. In general, the higher the GNP, the higher the per capita GHG emissions. However, GNP growth can be achieved with or without GHG growth. Look at France and Germany: their GNPs increased while their GHG emissions decreased. Less developed countries (no, I'm not including the U.S. in this) get GHG growth as they increase their wealth. But the rate of growth differs. The worst one here is India. In 3 decades, its per capita GNP doubled, but its per capita GHG emissions tripled. In China, on the other hand, personal wealth increased eight-fold, with a "mere" doubling of GHG emissions.
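Those growth ratios can be restated as changes in carbon intensity of wealth, i.e. emissions per unit of GNP. A quick sketch using the round numbers above (my framing, not from the chart itself):

```python
def intensity_change(gnp_growth, ghg_growth):
    """Factor by which GHG emissions per unit of GNP changed over the period."""
    return ghg_growth / gnp_growth

india = intensity_change(2, 3)   # GNP doubled, emissions tripled
china = intensity_change(8, 2)   # wealth x8, emissions doubled

print(india, china)  # 1.5 0.25
```

India got 50% dirtier per unit of wealth created, while China got four times cleaner, even though both countries' absolute emissions grew.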
The British government has launched a debate online about giving everyone from the Queen on down an equal target for carbon emissions, say 5 tonnes. This is an interesting concept, where everyone is given a ration card for how much fuel they can burn, directly and indirectly. If you run out, you can buy someone else's remaining allocation. There are some pilot projects underway.
Now this is a good way to get people to think twice about whether to fly or drive if they can take the train. But as to doling out responsibility for indirect effects... How about swiping your card if you rent in a high-rise, because of the carbon emissions from making the concrete and mining the aggregate? How about one when you eat meat, because of the enteric fermentation (cow flatulence) and the feed grain farming practices that deplete carbon from the soil? How about when you vote conservative, because you know they'll try to block measures that reduce GHG emissions?
The consultation itself is interesting. First of all it is being discussed on the minister's blog, but more importantly, it is on 10 Downing Street's new e-petition system. This system allows anyone to create a petition asking for a change in government policy; after it is checked against the basic terms and conditions, it goes on the site and anyone can sign it.
Tue, 23 Jan 2007
Dear Councillor Vaughan,
Congratulations on your insistence that development near downtown must include family housing.
In your recent analysis of the effects of the condo boom on the demographic makeup of Toronto, you hit the nail right on the head. If you put all of the households without children near downtown, then the households with children are going to be further away and in lower density.
The mathematics of reality are inexorable in this case: the city would then require more land to house its population, and many more kilometres would be driven by car.
Although I am not the only one to document these patterns, I can point you to some articles that I have written about how to crunch the numbers to calculate the size of these effects.
M. Laplante, "Optimizing TOD Housing Mix and Density", Ontario Planning Journal, Volume 21, Number 4, July/August 2006
I would also point you to a recent Ontario Planning Journal article "Visualizing Urban Form that Works: Moving Toward Healthy and Sustainable Communities", which shows how the Region of Waterloo can achieve the Places to Grow's destructive density targets for downtowns, through the use of stacked townhouses. It also shows the benefits of clustering the high-density at the edges of town rather than at the centre.
It's going to be difficult going against preconceived notions of what a city should look like, but if you look around the world at the cities that have achieved the objectives that Toronto would like to achieve, you will not see central neighbourhoods with exclusively childless households.
Fri, 19 Jan 2007
Urban Design International is available for free until Feb 14. So are all the Palgrave Macmillan journals.
Here is a great article in the current issue, entitled Close Encounters with Buildings. It details the interactions between the scale of buildings, which get bigger all the time, and the scale of people, who don't.
A lot of what is recommended is the same as Christopher Alexander's classic "A Pattern Language", which I am re-reading right now. But the methodology here is quite striking: a building and streetscape must interact with people from different distances, and it is possible to make the interaction work at each of them, which can be complex but very pleasing.
The good news is that most urban facades in North America are so dysfunctional that they are broken at every single scale, so improving them is easy. Just about anything you do is better.
The only thing missing is the child-scale interaction. Children have a different perspective, both trigonometrically and psychologically. Don't forget the child-sized features that they, and only they, can safely explore.
In a recent decision of the Ontario Municipal Board, the body set up to appeal certain municipal decisions, the City of Toronto was essentially told that once it determines that an area is mixed use, it loses the right to determine what the mix should be or what the density or the distribution of uses can be.
This is a major setback to the ability to achieve smart growth: past decisions have always allowed this in both residential-only and employment-only land.
The decision comes after a high-rise project on Queen Street West, which had been changed by city planners to redistribute the density and put minimum and maximum proportions on business and residential uses, was appealed, with the developer facing off against the city and the residents.
The Ontario Municipal Board approved essentially the developer's original plan for several towers (between 10 and 26 storeys) in a Victorian-era neighbourhood characterized by art galleries, low-rent studios and retail stores. The official plan and planning decisions were intended to protect the character of the neighbourhood, and in particular the ability to live and work in the same area.
According to the Board's decision, "If the City is intending to rely on the securing of a specific amount of employment land use on individual sites based on a quantifiable goal for each planning area, district or neighbourhood, [...] designation of exclusive employment districts offers a more objective approach in that regard." In other words, if you want the right to do planning, you must stick to separation of uses.
The board ordered the city not only to approve the development, but to change its official plan to remove limits on density and on mix of uses. Needless to say, I think this decision is appalling and makes it impossible to implement smart growth strategies in Ontario.
http://www.omb.gov.on.ca/e%2Ddecisions/pl060443%5F%230053.pdf
Thu, 11 Jan 2007
Canada is increasing the amount of ethanol in gasoline. This is supposed to reduce greenhouse gas emissions without reducing energy use. But at the same time we are increasing the amount of synthetic crude from tar sands, which increases greenhouse gas emissions without changing energy use. How much alcohol do we have to use to compensate for the tar sands?
The U.S. DOE claims that the "net energy balance" of making fuel ethanol from corn grain is 1.34. That means that one barrel of oil's worth of input energy yields ethanol with the energy content of 1.34 barrels of oil.
They should really say the balance "can be" 1.34 since to achieve this you need the latest high efficiency processes, and you have to assume that all of the by-products are recovered and used as animal feed, thereby reducing the necessity to grow that feed. The fact that we have to replace the missing corn crop with other food without using any more land is not included in the calculation. We could become vegetarians, but who would eat all that animal feed from the industrial by-products? So, pretty big assumptions, but 1.34 is a number we can use.
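With that 1.34 figure and the per-barrel CO2 numbers used in the calculation that follows (about 450 kg from burning a barrel, 28.6 kg of processing for conventional crude, 80-90 kg for tar sands), the blend arithmetic can be checked in a few lines. This is my sketch of the post's reasoning: the 85 kg midpoint for tar sands processing, and treating ethanol's own combustion CO2 as biogenic, are assumptions on my part:

```python
# Fossil CO2 per barrel (or barrel-equivalent), in kg.
conventional = 450 + 28.6          # combustion plus conventional processing
tar_sands    = 450 + 85            # midpoint of the 80-90 kg processing range
ethanol      = conventional / 1.34 # fossil input CO2 spread over 1.34 barrels'
                                   # worth of ethanol; its own CO2 is biogenic

# Find x such that x barrels of synthetic crude plus (1-x) barrels of
# ethanol emit the same fossil CO2 as one barrel of conventional oil:
#   tar_sands*x + ethanol*(1-x) = conventional
x = (conventional - ethanol) / (tar_sands - ethanol)
print(round(x, 2))  # 0.68
```

The mix comes out at roughly two parts synthetic crude to one part ethanol, which matches the 2/3-to-1/3 figure below.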
This is just a back-of-the-envelope calculation. A barrel of conventional oil releases about 450 kg of CO2 when burned, plus another 28.6 kg from processing. For tar sands the processing figure is 80-90 kg, about 13% more in total. I'm not counting the opportunity cost of using up all our cleaner natural gas production (3 000 cu ft per barrel) in the process. Do the math and you have to mix 2/3 of a barrel of synthetic crude with 1/3 of a barrel of ethanol to do the same amount of damage as we did in the 90's.
Mon, 08 Jan 2007
A recent Globe and Mail column by John Barber, entitled Never mind the NIMBYs -- this boom is a rare success, brings up some interesting issues about how Toronto wants to reduce sprawl. The only flaw is that every single one of its many arguments demonstrates the exact opposite of what it claims to prove. Such a woolly-minded rehash of the type of superficial arguments that developers make to avoid having rules apply to them!
Where to start? Barber actually comes close to the actual state of affairs when he says some of the most attractive U.S. cities -- Boston, Chicago, San Francisco -- are losing population as high housing prices squeeze out middle-class workers, and that they are in danger of becoming what U.S. urbanist Joel Kotkin called "amusement parks for the rich, the nomadic young and tourists." He tries to prove that things are different in Toronto, but it is clear that this is precisely what is happening.
Here is a gem: "Decrying the anti-development politics that restricts the supply of housing in New York City, Harvard economist Edward Glaeser noted recently that that city permitted only 21,000 new units in the entire decade of the 1990s -- compared to 13,000 units in 1960 alone." What is wrong with that statement? 1960 was the last year of significant population growth that New York has ever had. Population growth stopped dead in its tracks that year and remained at 0% for over 4 decades. What happened in 1961? That was the year of New York City's last major change to its zoning laws, a huge upzoning everywhere in preparation for a population boom that never happened. That has happened in city after city - increase the zoning to allow highrises everywhere and all the families that can afford it will move away en masse. Bye-bye population growth, hello Kotkin amusement park. So what is this anti-development politics that he is talking about? New York City's recent "downzoning uprising" is what he is talking about. Spearheaded in part by Paul Graziano, a planning consultant and Green Party leader, and encouraged by mayor Bloomberg, this movement tries to preserve existing urban communities at roughly their current scale or a little higher, with more growth targeted along certain main streets. This, incidentally, is the pattern that has given New York an enviable record for land use and transit ridership. Destroy the pattern and you risk losing its environmental and cost benefits.
Here is the most telling section:
Downtown politicians have traditionally been pro-development: Deputy Mayor Joe Pantalone automatically approves of absolutely anything that requires the employment of construction workers, while gay Councillor Kyle Rae thinks everybody should live in a high-rise south of Bloor. Former Mayor Barbara Hall's most successful initiative was to spark downtown development by repealing obsolete regulations.
What better recipe have you ever seen to ensure that the central city has a socio-demographic shift that polarizes it to house the rich and nomadic young, as Kotkin warns, and to fuel a new exodus to sprawl? Laissez-faire highest-and-worst-use in the downtown, by removing any requirement to build or preserve those communities that already achieve the planning objectives and that house the full range of age groups and incomes. This gives density a bad name and causes an entrenchment of sprawl mentality in the suburbs. Good going Toronto, you are practically guaranteeing that the opposite of what you want is what will actually happen.
"Compared to New York's ultra-influential Landmarks Preservation Commission, the Ontario Municipal Board is a model of enlightened governance." Apparently, heritage is the enemy, and good planning requires that heritage experts be ignored and that lawyers make planning decisions.
Here is another one:
It goes without saying that the Miller regime would prefer to spend more money on subsidized housing for the poorest citizens. But with such money in short supply, the relatively unfettered free market is creating what Prof. Glaeser calls "true" affordability in Toronto. "True affordability doesn't mean a small number of artificially cheap units," he wrote, "but a large number of units that reduces prices for everyone."
This is the old trickle-down theory: subsidize luxury condos and somehow housing prices will go down. Toronto tried that and now has the highest housing prices in Canada. Ottawa did the same and nearly caught up. What they did get is cheaper high-end housing, but the low end went up to compensate. Trickle-down works for developers, but in the end any argument that allows zoning to be increased for free works for developers, whether they dress it up as social or as environmental policy. In the long run, ad hoc decisions that allow a concentration of high-rises rather than a real range of housing types aren't good for a city.