Category Archives: Climate

Air Quality Index

The Air Quality Index (AQI) is a numeric indicator of local air quality. Since last Thursday night, our AQI has been in the 500 to 526 range. The official scale only goes up to 500, and anything above 300 is considered hazardous to everyone's health.
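
For a concrete picture of those ranges, here is a minimal Python sketch of the standard EPA AQI category breakpoints – readings above 500, like ours, literally run off the scale:

```python
# Standard EPA AQI categories, keyed by the upper bound of each range.
AQI_CATEGORIES = [
    (50, "Good"),
    (100, "Moderate"),
    (150, "Unhealthy for Sensitive Groups"),
    (200, "Unhealthy"),
    (300, "Very Unhealthy"),
    (500, "Hazardous"),
]

def aqi_category(aqi: float) -> str:
    """Map an AQI reading to its EPA category name."""
    for upper_bound, name in AQI_CATEGORIES:
        if aqi <= upper_bound:
            return name
    return "Beyond the index"  # readings like our 526 are off the scale

print(aqi_category(169))  # Unhealthy
print(aqi_category(526))  # Beyond the index
```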

We live in the area of the fires in Oregon.

We’ve been sealed up inside our home since Thursday. Fortunately, there is little smoke smell inside, as our house is sealed very tight – but outside it was extremely difficult to breathe. The smoke led to headaches, fatigue, a sore throat and slight shortness of breath – classic symptoms of smoke inhalation, I am told.

As of right now, the AQI has fallen to 169 – a noticeable improvement, though still far above the normal range.

The fire storm was caused by an early Arctic air mass (high pressure) descending south from Canada. The winds blew from the colder high pressure toward the warmer low pressure to the south and west of the Cascades. This drove very strong winds for 1–2 days, crossing mountain ridges at 55 to 75 mph and pouring down the west-side mountain canyons.

Routine summer thunderstorms had started fires on August 16th, fires which had been fought since then. The Lionshead fire had grown to about 22,000 acres – but when the winds hit, it exploded by over 100,000 acres in about 24 hours. Similarly, the Beachie Creek Fire had also started on August 16th and was at only 500 acres when the winds hit – it too was soon over 100,000 acres (now 190,000 acres) in size.

The cold air mass poured into the valleys and plains – creating an inversion layer of colder air that would not mix with the warmer air above. This trapped the smoke near the ground.

Yesterday – and likely again today – we will see the effects of the weakening inversion. Yesterday, the AQI fell to around 250 in the afternoon, but as the air mass cooled in the evening, it went back to 468, where it remained overnight. I suspect today’s lower reading will do the same this evening, and this pattern will repeat until late in the week, when a fall storm system enters the area, delivering rain and breaking the inversion layer.

It’s worse than we thought: “Second Analysis of Ferguson’s Model”

In the past I have commented on Neil Ferguson’s disease model and repeatedly noted its poor quality. This model was used, last spring, as the basis for setting government policies in response to Covid-19. Like many disease models, its output was garbage, unfit for any purpose.

The following item notes that the revision history since last spring is available, and that it shows ICL has not been truthful about the changes made to the original model code.

Source: Second Analysis of Ferguson’s Model – Lockdown Sceptics

THIS! Many academic models, including disease models and climate models, average the outputs from multiple runs, somehow imagining that this produces a reliable projection – uh, no, it does not work that way.

An average of wrong is wrong. There appears to be a seriously concerning issue with how British universities are teaching programming to scientists. Some of them seem to think hardware-triggered variations don’t matter if you average the outputs (they apparently call this an “ensemble model”).

Averaging samples to eliminate random noise works only if the noise is actually random. The mishmash of iteratively accumulated floating point uncertainty, uninitialised reads, broken shuffles, broken random number generators and other issues in this model may yield unexpected output changes but they are not truly random deviations, so they can’t just be averaged out.
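
A toy demonstration of that point (the numbers below are purely illustrative, not taken from the model): averaging repeated runs shrinks zero-mean random noise, but a systematic error shared by every run survives the average untouched.

```python
import random

random.seed(42)
TRUE_VALUE = 100.0
RUNS = 10_000

# Case 1: runs differ only by zero-mean random noise -> averaging converges
# on the true value. This is the only case where averaging helps.
noisy = [TRUE_VALUE + random.gauss(0, 10) for _ in range(RUNS)]
print(sum(noisy) / RUNS)   # ~100

# Case 2: every run shares a systematic error (a bug, a broken shuffle, a
# bad RNG that skews results the same way each time). The average
# faithfully reproduces the error no matter how many runs you take.
BIAS = 25.0
biased = [TRUE_VALUE + BIAS + random.gauss(0, 10) for _ in range(RUNS)]
print(sum(biased) / RUNS)  # ~125: an average of wrong is wrong
```

That is all an “ensemble average” of a buggy model buys you: a very precise estimate of the wrong answer.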

Software quality assurance is often missing in academic projects that are used for public policy:

For standards to improve academics must lose the mentality that the rules don’t apply to them. In a formal petition to ICL to retract papers based on the model you can see comments “explaining” that scientists don’t need to unit test their code, that criticising them will just cause them to avoid peer review in future, and other entirely unacceptable positions. Eventually a modeller from the private sector gives them a reality check. In particular academics shouldn’t have to be convinced to open their code to scrutiny; it should be a mandatory part of grant funding.

The deeper question here is whether Imperial College administrators have any institutional awareness of how out of control this department has become, and whether they care. If not, why not? Does the title “Professor at Imperial” mean anything at all, or is the respect it currently garners just groupthink?

When a software model – such as a disease model – is used to set public policies that impact people’s lives – literally life or death – it should adhere to the standards for life-safety-critical software systems. Such standards exist for, say, medical equipment, nuclear power plant monitoring systems, and avionics – because those systems may put people’s lives at risk. A disease model has similar effects – and hacked-together models that adhere to no standards have no business being used to set life-safety-critical policies!

Another software engineer and I had an interaction with Gavin Schmidt of NASA regarding software quality assurance of their climate model and paleoclimate histories[1]. He noted they had funding for only one-quarter of a full-time-equivalent person to work on SQA – in other words, they had no SQA. Instead, their position was that the model’s output should be compared to that of other models. This would be like Microsoft, instead of testing, judging the quality of MS Word by comparing its output to the output of another word processor – a sort of quality-via-proxy approach. Needless to say, this is not how SQA works.

Similarly, the climate modeling community always averages multiple runs from multiple models to create projections – even when some of the model projections are clearly off the rails. Averaging many wrongs does not make a right.

[1] Note that NASA does open-source its software, which enables more eyes to see the code, and I do not mean to pick on NASA or Schmidt here. They are doing what they can within their funding limitations. The point, however, is that SQA is frequently given short shrift in academic settings.

Nature Conservancy: “California: Let’s Stop Making Wildfire History”

Some of the factors that shape the frequency and severity of wildfire in California, like drought, record high temperatures and strong winds are beyond our control and in many cases, exacerbated by a changing climate. Other factors, such as how we manage our fire-adapted conifer forests, where we build homes and how we prepare and protect our communities are within our control.

Source: The Nature Conservancy – California: Let’s Stop Making Wildfire History

Media and social media have been quick to blame California’s fires (in recent years and at present) on climate change. Social media instapundits proclaim that “if only we had done X on climate change,” this would not have happened. Or that if “Politician X was not in office,” we would have solved climate change and prevented the fires.

But that makes no sense – what could have been done on climate change last year, or five, ten, or even twenty years ago, that would have affected forest fires this year? If we had magically ended all fossil fuel usage 20 years ago, the forest fire risk this year would have been exactly the same.

While climate is a real issue, addressing it would have done nothing vis-à-vis the current fires. Nor will spending trillions on climate change over the next 10 or 20 years resolve California’s fire problems – diverting enormous sums to climate change takes money away from measures that would reduce California’s fire risk now.

We need to control what we can control – now. And that is what this Nature Conservancy report says.

Update: More here on how building codes evolved to create safer structures in earthquake-prone areas, whereas we have not evolved building codes to create fire-resistant structures in fire-prone areas. Fire is a natural part of the California ecosystem – and now millions of people are living within areas that depend on fire.


Hypothesis, not conclusion: “In the US, switching to EVs would save lives and be worth billions”

With a confidence interval between zero and infinity:

A team led by Northwestern’s Daniel Peters decided to have a particularly detailed look at this issue, examining several scenarios of grid generation and EV adoption in the US. The results show that even with today’s grid, switching to EVs produces significant benefits.

The researchers used simulated hourly air pollution data from vehicles around the country, along with emissions data for power plants. This went into a model of weather over the course of a year (2014, as it happens), which also simulated important chemical reactions and natural emissions of compounds that interact with pollutants. The resulting air quality simulations were applied to an EPA population health model to show the expected impact on human health.

Source: In the US, switching to EVs would save lives and be worth billions | Ars Technica

And this was pushed through climate models afterwards.

No matter how you slice it, when your model is built on assumptions, simulated values, and multiple models all stacked on top of one another, you have created an interesting video game simulation.
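
To see why stacking simulations on simulations balloons the uncertainty, here is a hedged Monte Carlo sketch – the ±20% per-stage error is an arbitrary assumption for illustration, not a figure from the study:

```python
import random

random.seed(1)
STAGES = 4          # e.g. emissions -> weather -> chemistry -> health model
STAGE_ERROR = 0.20  # assume each stage is accurate to +/-20% (illustrative)
TRIALS = 100_000

results = []
for _ in range(TRIALS):
    value = 1.0
    for _ in range(STAGES):
        # each stage multiplies in its own independent error
        value *= 1.0 + random.uniform(-STAGE_ERROR, STAGE_ERROR)
    results.append(value)

results.sort()
low, high = results[int(0.025 * TRIALS)], results[int(0.975 * TRIALS)]
print(f"95% of outcomes land between {low:.2f}x and {high:.2f}x truth")
# roughly 0.6x to 1.5x -- and that's with every stage assumed unbiased
```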

Perhaps you can use it to produce multiple hypotheses. But one thing you cannot do – in any way, shape or form – is produce a useful forecast of anything. Claiming this pile of models produces definitive conclusions is scientific fraud.


Climate: Eating vegetarian has little impact on your household carbon emissions

In spite of popular claims that eating vegetarian offers a dramatic reduction in your personal carbon emissions, the actual reduction is about 2%. That makes it one of the last places you should focus your attention if you want to have a meaningful impact.

I run a separate blog called Social Panic, focusing on how propaganda messaging influences our thinking. One trick to combat propaganda messaging is to practice what the late Dr. Hans Rosling called “Factfulness” – basically, verifying what you are told. As he found, almost everything the general public thinks it knows is actually wrong! (Read his book Factfulness to understand the issue.)

We practice “Factfulness” when we try to verify claims. Let’s do it.

From the Center for Sustainable Systems at the University of Michigan comes this chart showing an allocation for an “average” person:

From this chart, it sure looks like eating “Meat” has a large impact on your carbon footprint. Some in the climate advocacy community take this one step way too far, claiming that eliminating meat from your diet dramatically cuts your carbon emissions – which is a straight-out lie!

The food pie chart is itself a small slice of a much larger pie. Many people, such as the linked climate advocate, see charts like the one above and promptly misinterpret them: they fail to understand that food is only one small slice of a household’s emissions, and falsely believe that eliminating meat from their diet has a large impact on their carbon footprint.
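
The arithmetic behind that ~2% figure is simple. The shares below are illustrative assumptions for the sketch, not numbers read off the Michigan chart:

```python
# Illustrative shares -- assumptions for this sketch, not the chart's numbers.
food_share_of_household = 0.15   # food as a slice of total household emissions
meat_share_of_food = 0.25        # meat as a slice of the food slice
net_cut_going_vegetarian = 0.50  # meat calories get replaced by other food,
                                 # so the meat slice is not eliminated outright

reduction = (food_share_of_household
             * meat_share_of_food
             * net_cut_going_vegetarian)
print(f"Household emissions reduction: {reduction:.1%}")  # ~1.9%
```

Tweak the assumptions however you like – the product of a small slice of a small slice stays small.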


Climate: Tried using the EPA Household Carbon Emissions Calculator and it’s a Fail on the First Question

The first question on the EPA Household Carbon Emissions calculator is to select my type of household heating:

  1. Natural Gas
  2. Electricity
  3. Fuel Oil
  4. Propane

We heat our home with locally produced wood pellets. Since I cannot answer the first question correctly, the entire calculator fails.

I used this one at the University of California, Berkeley, and it estimates our household’s annual carbon emissions at about 20 tons of CO2e per year – 62% less than the average American home.

I used a different calculator in the past (link not saved); it estimated our emissions at around 16–17 tons and put the average home at about 40 tons/year, which seems in line with the UCB calculator.
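
As a quick sanity check that the two calculators tell roughly the same relative story, here is the percent-below-average arithmetic – the 52.6-ton figure is back-calculated from the UCB calculator’s own 62% claim, an inference on my part:

```python
def percent_below_average(household_tons: float, average_tons: float) -> float:
    """How far below the average household a given footprint sits, in %."""
    return 100.0 * (1.0 - household_tons / average_tons)

# UCB: ~20 t CO2e at 62% below average implies an average near 20/0.38 ≈ 52.6 t
print(percent_below_average(20, 52.6))   # ~62%
# The older calculator: ~16.5 t against its stated ~40 t/year average
print(percent_below_average(16.5, 40))   # ~59% -- roughly the same picture
```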