Posts posted by Chris Knight

  1. Professor Singer said of the little ice age 'the Thames used to freeze over every winter'. Oh no it didn't. He also said that for the last decade or so the planet has been cooling. Not correct.

    I agree with the first objection, Dev, but Singer's second statement is strictly correct: the planet has been cooling, just as it has also been warming from solar irradiation, as it has for a very long time. The net global change in temperature is still contentious.

  2. For a lot of people the Chilcot investigation is to confirm what everyone already knows.

    For a lot of people Climategate confirms what was not popularly known.

    So RE: Chilcot there is little to talk about that is new - that is why the BBC is giving it blanket coverage.

    The BBC is not the only fruit!

    I just checked Google. Fewer than 90,000 entries for "Chilcot enquiry". Over 11,000,000 for "Climategate".

    My hypothesis gains credence.

    :blush:

    Suggestions for titles please:

    Harry Plotter and the Dendrochronologist of Doom

    Harry Plotter and the Hidden Decline

    :lol:

  3. The BBC online news and BBC 24 are running multiple stories on the Chilcot report, but both media are completely silent when it comes to the leak and its significance.

    Nice theory, back to the drawing board.

    That's funny, I entered "Chilcot" into the Netweather TV forum search box and only came up with a single hit - your above post. I need to get out more... :crazy:

  4. I think the leak was done by the government to focus attention away from the Iraq war enquiry. They probably thought we needed some harmless diversion whilst they salted away another enquiry under the governmental cloak of inadmissible evidence and evasive Blairite posturing.

    If you want to tell a big lie, hide it among a load of little lies.

  5. Here in waterlogged Worthing, we have so far received about one quarter of our annual rainfall for 2009 in November alone, with only a single dry day - the 19th. So, with it chucking down again, and expected to continue for half the night, about 150mm have fallen this month, and now over 600mm for the year to date. That is over twice as wet as November last year, and wetter than June 2007, which is saying something. What's the forecast for the rest of the month? Cool and wet. I wonder if we will top 180mm?

  6. http://www.metoffice.gov.uk/corporate/pressoffice/2009/pr20091124a.html

    Might as well have the meto's latest press release, to highlight how they view the occurrence of climate change:

    A statement from the Met Office, Natural Environment Research Council and the Royal Society.

    The UK is at the forefront of tackling dangerous climate change, underpinned by world-class scientific expertise and advice. Crucial decisions will be taken soon in Copenhagen about limiting and reducing the impacts of climate change, now and in the future. Climate scientists from the UK and across the world are in overwhelming agreement about the evidence of climate change, driven by the human input of greenhouse gases into the atmosphere.

    As three of the UK’s leading scientific organisations, involving most of the UK scientists working on climate change, we cannot emphasise enough the body of scientific evidence that underpins the call for action now, and we reinforce our commitment to ensuring that world leaders continue to have access to the best possible science. We believe this will be essential to inform sound decision-making on policies to mitigate and adapt to climate change up to Copenhagen and beyond.

    Hmm, not a mention of UEA CRU there - perhaps I detect a certain distancing by Julia Slingo. Damage limitation? :unsure:

  7. :unsure: :angry: :lol: :rofl::shok::shok::shok::shok:

    I am almost rendered speechless. Am I dreaming? Is it just me? Did everyone else know and not tell me?

    JOHN PRESCOTT IS THE NEW RAPPORTEUR FOR CLIMATE CHANGE FOR THE COUNCIL OF EUROPE?

    http://www.dailymail.co.uk/news/worldnews/article-1230620/I-wont-let-b---s-theyve-deal-Climate-change-envoy-John-Prescott-reveals-unusual-approach-solving-global-warming.html

    :lol: :lol: :wacko: :wacko: :cray::cray::cray::cray:

    This is unashamedly ad-hom, but why could they not have chosen a thin person?

  8. This is not a complaint, VP...Chris...Shoreham - lord knows, I am glad there are people out there who understand it - but I have to confess that you could all be talking Assyro-Babylonian underwater for all the sense I can make of those last few posts :) .

    What worries me is that there are probably plenty of people on here who find my verbose, vocabulary-vomiting, literary-referenced, subordinate-clause-ridden essays just as impenetrable....:)

    I try not to technobabble, Osmposm, honest I do, but there are times when you have to use the jargon for precision and brevity. I will try to provide a glossary in future, but there is always Google.

    "Legacy systems" means old software running on old computers and accessories - software that is likely not to run properly on current systems - many modern computers have replaced serial and parallel ports with USB (Universal Serial Bus) ports, so old accessories like backup tapes may not be readily attached to new computers. The drivers and backup installation software may be on floppy disks - when can you find floppy disk drives on new computers nowadays? Unix-type operating systems (like Linux distributions) may be more immune to these situations than Windows Operating Systems(OSs), which seem to exclude some legacy software and hardware with each version change.

    VP, last night, admitted to running on OS H*, which by name implies that it is a legacy system and is only fully understood by others running on the same OS! :drinks: I could make out a few phrases he wrote, that is all. :)

    *Old Speckled Hen

    edited for missed parenthesis

  9. As I pointed out in an earlier post, this is the crux of the whole matter

    There is nothing intrinsically wrong with legacy architecture or writing programs in the old programming languages such as FORTRAN, so long as proper attention is paid to system design, code reviews, testing, documentation etc. Throughout the cold war much of the nuclear defence of the Western world depended on early warning systems that largely ran on mainframes running COBOL and old CODASYL IDMS databases, all of which are extremely reliable if used properly.

    The real issue here is how the raw data was vetted, evaluated, stored and modified over time. It is clear that this has been done in a haphazard and uncontrolled manner with almost no configuration management or audit trails. This means that unless the original source readings or data files have been retained, there must be doubts over its reliability, particularly as it seems from 'Harry's' comments that the basic structure of the original database was not underpinned by the proper use of referential integrity constraints. As a consequence it is quite easy for data in different tables or files to become corrupted or out of sync. If the data is shot then it does not matter how brilliant the rest of the coding algorithms are, since once GIGO takes effect all results must be regarded as suspect.

    Gordon Manley must be spinning in his grave.

    There is nothing wrong with properly designed and managed legacy systems, as you say Shoreham (just down the road from me?), but they do need to be upgraded to run seamlessly on current hardware and modern OSs, etc. It is possible to get to the state where one Monday morning, all you have is a set of backup tapes, and nothing to read them with.

    It seems that the UEA CRU CRUTEM so-called "database" has grown organically with less than ideal input from a variety of idiosyncratic academic coding sources over time, and it is now a nigh impossible task to salvage a definitive clean dataset.
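
    To make Shoreham's point about referential integrity concrete, here is a minimal sketch in Python with SQLite (the table layout, station and values are invented for illustration - nothing here comes from any real CRUTEM schema) of how a foreign-key constraint rejects the orphaned, out-of-sync records he describes:

    import sqlite3

    # Minimal sketch of referential integrity; tables and values are invented.
    con = sqlite3.connect(":memory:")
    con.execute("PRAGMA foreign_keys = ON")  # SQLite only enforces FKs when asked

    con.execute("CREATE TABLE station (wmo_id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
    con.execute("""CREATE TABLE reading (
        wmo_id INTEGER NOT NULL REFERENCES station(wmo_id),
        month  TEXT NOT NULL,
        temp_c REAL NOT NULL)""")

    con.execute("INSERT INTO station VALUES (3772, 'Heathrow')")
    con.execute("INSERT INTO reading VALUES (3772, '2009-11', 9.1)")  # accepted

    try:
        # No station 9999 exists, so the constraint rejects this orphan row
        con.execute("INSERT INTO reading VALUES (9999, '2009-11', 9.1)")
    except sqlite3.IntegrityError as e:
        print("rejected orphan reading:", e)

    Without the constraint (the default in old flat-file setups), the orphan row goes in silently - and that is exactly how tables drift out of sync.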

  10. Spend more? Haven't you read all the goings-on about gravy trains and research grants? :wallbash: Spend too little and one lot of sceptics say they need to spend more on computer power; spend more and another lot start blathering on about gravy trains, grants, 'our taxes'...

    You're speaking as a?

    I didn't see that, but nice to know you're there. Under another guise or just lurking?

    Don't believe all that you read, Dev. If Prof Jones can get £14,492.00 for "CRU involvement in the development of an improved global historic surface temperature dataset" for a 9-month project in 2004, I am sure he had the clout to get, say, £120,000.00 for "Essential upgrade to core surface station temperature database" for 12 months - covering the salaries of 3 data analysts. Especially after reading his programmer's reports. Has the Prof no heart at all? :mellow:

    So I think they should spend more on cleaning up the existing core data, speaking as one who has spent most of my working life supporting academic research, either doing the practical lab work or as an analyst of research data (but not at UEA).

    It's nice to see you there too! I don't post much there - it is somehow like when once walking into a backstreet pub in Mortlake - the conversation stops and all eyes follow you, and you get the feeling that they all suspect you of being a policeman in plain clothes. :unsure:

  11. The "leak" or "hack" has nothing to do with AGW, the mechanism of greenhouse gas forcing, or of the role of forcing feedbacks. Thus the disclosure does not affect AGW theory per se.

    Both the emails and the documents focus on the temperature record, particularly the instrumental station records and tree-ring proxies. Exactly the material that ClimateAudit has been after.

    The station temperature record was what the FOI requests were about, and someone has gone to a lot of trouble to copy both the relevant documents and the emails, and to filter out much irrelevant stuff*.

    It is almost as though someone was covering their back in case they were ordered to fall on their sword if the FOI request was granted (the request was rejected the day after the last email in the file was sent). Someone had been putting this datafile together for some time, someone who does not fear the consequences of illegal disclosure as much as the consequences of withholding or destroying information requested under FOI. In fact they were probably rather disappointed that the information was not going to see the light of day, and released it anyway. I wonder how they feel now - exhilarated, or sick?

    The emails are nothing unusual for an academic department - backbiting, snark and jealousy are daily fare in all the academic workplaces I have been employed in. It is still normal to treat emails like letters or faxes, but they can come back to bite you, since they are inevitably archived somewhere. If they had just archived their emails in their CRUTEM database, they probably would have been safe forever. :mellow:

    The state of the databases is possibly why there has been a certain reluctance to publish raw station data - it is just not available in anything like a publishable form, since the code that has accrued over the years is not up to reconstructing the data, and the appalling state of documentation means that to do so will require major forensic reconstruction.

    Gavin Schmidt states on RealClimate that he mostly does his own coding (in FORTRAN). It is the way within academia that, without formal training, academics have taken up computers as tools to produce their own data. Sometimes that goes beyond Excel, or FileMaker Pro on a Mac (or god forbid, HyperCard), and the old mainframe languages like FORTRAN or COBOL (in science or economics; in medicine, M (MUMPS) code is an old favourite, found throughout the NHS) are the only way to access earlier databases, the data often hard-coded and therefore not portable.

    That is the problem facing "Harry", the programmer who has been trying to port the legacy CRUTEM 2.1 data over to version 3.0, so it will run on Linux and Sun Alpha systems, with some code segments in FORTRAN77 and others in FORTRAN90. Here's an example of his frustration from HARRY_READ_ME.txt:

    I am seriously close to giving up, again. The history of this is so complex that I can't get far enough into it before my head hurts and I have to stop. Each parameter has a tortuous history of manual and semi-automated interventions, so I simply cannot just go back to early versions and run the update prog.

    I could be throwing away all kinds of corrections - to lat/lons, to WMOs (yes!), and more.

    So what the hell can I do about all these duplicate stations? Well, how about fixdupes.for? That would be perfect - except that I never finished it, I was diverted off to fight some other fire. Aarrgghhh.

    I - need - a - database - cleaner.

    What about the ones I used for the CRUTEM3 work with Phil Brohan? Can't find the bugger!! Looked everywhere, Matlab scripts aplenty but not the one that produced the plots I used in my CRU presentation in 2005. Oh, FXXK IT. Sorry. I will have to WRITE a program to find potential duplicates. It can show me pairs of headers, and correlations between the data, and I can say 'yay' or 'nay'. There is the finddupes.for program, though I think the comment for *this* program sums it up nicely:

    ' program postprocdupes2
    c Further post-processing of the duplicates file - just to show how crap the
    c program that produced it was! Well - not so much that but that once it was
    c running, it took 2 days to finish so I couldn't really reset it to improve
    c things. Anyway, *this* version does the following useful stuff:
    c (1) Removes and squirrels away all segments where dates don't match;
    c (2) Marks segments >5 where dates don't match;
    c (3) Groups segments from the same pair of stations;
    c (4) Sorts based on total segment length for each station pair'

    You see how messy it gets when you actually examine the problem?

    This time around, (dedupedb.for), I took as simple an approach as possible - and almost immediately hit a problem that's generic but which doesn't seem to get much attention: what's the minimum n for a reliable standard deviation?

    I wrote a quick Matlab proglet, stdevtest2.m, which takes a 12-column matrix of values and, for each month, calculates standard deviations using sliding windows of increasing size - finishing with the whole vector and what's taken to be *the* standard deviation.

    The results are depressing. For Paris, with 237 years, +/- 20% of the real value was possible with even 40 values. Winter months were more variable than Summer ones of course. What we really need, and I don't think it'll happen of course, is a set of metrics (by latitude band perhaps) so that we have a broad measure of the acceptable minimum value count for a given month and location. Even better, a confidence figure that allowed the actual standard deviation comparison to be made with a looseness proportional to the sample size.

    All that's beyond me - statistically and in terms of time. I'm going to have to say '30'.. it's pretty good apart from DJF. For the one station I've looked at.

    Back to the actual database issues - I need a day or two to think about the duplicate finder.

    That is one unhappy tale, and the 700 KB text file describes three years of grief for "Harry". It also shows what a mess the data is really in; however much the CRU have spent on computer systems and model-making, they need to spend more on teams of data analysts who can clean up the mess they have allowed to build up, instead of leaving it to one individual.

    They also need to honestly reassess the confidence they have in the long term record, including the instrumental station record, before making proclamations on how much we may expect to warm in the future. If the data that feeds the climate models is only slightly flawed, the output is absolutely worthless.
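
    Harry's minimum-n question is easy to play with, by the way. Here is a quick sketch (a Python stand-in for his stdevtest2.m, run on synthetic data rather than the real Paris record) comparing the standard deviation from windows of increasing size with the full-series value:

    import random
    import statistics

    # Sketch of the minimum-n experiment: how many values before the sample
    # standard deviation settles? Synthetic data, 237 "years" as for Paris.
    random.seed(42)
    series = [random.gauss(10.0, 3.0) for _ in range(237)]
    true_sd = statistics.stdev(series)  # taken to be *the* standard deviation

    for n in (10, 20, 30, 40, 80, 160, 237):
        sd_n = statistics.stdev(series[:n])  # window of the first n values
        err = 100.0 * (sd_n - true_sd) / true_sd
        print(f"n={n:3d}  sd={sd_n:5.2f}  error={err:+6.1f}%")

    Even on well-behaved synthetic data the estimate wobbles noticeably below an n of about 30-40, which is just what he found; real, gappy station data will be worse.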

    *For example, on the TWO forum Prof. Tom Choularton seemed almost disappointed that he had no emails represented, since he has been an external examiner for UEA CRU for many years and had certainly exchanged many emails with people there.

  12. This forum seems weary - as LG says, WUWT is doing a roaring trade in this "Aunt Sally", as now is RC (Gavin has got his head together with his mates at CRU, by whatever means they now use to communicate, and they are doing a splendid and "decent" (in the English sense) job of defending the fort), yet we struggle past 3 pages here! Where's the "food-fight!!" mentality? :lol:

    I would not second the "Shame on the..." comment though; like climate change, it is a reality of the world we live in - nothing is sacred any more, not even "private" conversations. As a scientist, why can't I tell a mate that a competitor is a "plonker" in a private email, and plot to reveal his flaws in a future rebuttal?

    I was up late last night and linked into this story early - I even had time to explore the FTP site where FOI2009.zip first appeared. I don't know why that Russian guy's FTP site ended up hosting the file; it was pretty unremarkable - a certain amount of stuff for gamers, a bit of mainstream porn, some downloaded music, images and videos, some Linux and other OS stuff, and what I took to be personal images from the host's latest vacation. Certainly nothing to show that this was some committed hacker intent on destroying the AGW political momentum - I guess his webspace was just a convenient vehicle for the file.

    The actual file content (which will be published elsewhere in fragments ad nauseam...) was also unremarkable, except, possibly, for what was *not* contained within it. I doubt that any of the published content was faked or embellished, but there were possibly chunks removed - missing, and possibly embarrassing to several individuals, who may be currently considering their futures...

    If I were close to the focus of this attack at CRU, I'd look to a slighted lover, and, short of committing suicide (if I were that vulnerable), I would make sure the police nailed the mole before any real damage was done.

  13. Just to get a 'feel' for the differing conditions as we move north.

    At the pole the sun sets on Sept 24th and does not rise again for 175 days.

    At 78 degrees north the sun sets on Oct 27th and doesn't appear again for 111 days.

    On the Arctic Circle the sun is only up for 2hrs 11mins at the winter solstice.

    Here's the latest from NSIDC on this year's melt season.....

    http://nsidc.org/news/press/20091005_minimumpr.html

    Interesting figures, GW. I have often wondered if there is much mileage in the albedo theory of amplification. The ocean really only opens up in the few weeks before the September equinox, when the sun is below the horizon for much of the time, and very low in the sky in polar regions otherwise. The biggest albedo effect would be around the solstice, when the ice cover is only slightly diminished.
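
    As a back-of-envelope check, here is a sketch (my own figures from the standard daily-insolation formula, not from any dataset) of daily-mean top-of-atmosphere insolation at 78N, comparing the solstice with the weeks around the September minimum when the ocean is most open:

    import math

    S0 = 1361.0  # solar constant, W/m^2

    def daily_insolation(lat_deg, day_of_year):
        """Daily-mean TOA insolation, circular-orbit approximation."""
        phi = math.radians(lat_deg)
        # Solar declination, simple sinusoidal approximation (day 80 ~ 21 March)
        delta = math.radians(23.44) * math.sin(2 * math.pi * (day_of_year - 80) / 365.25)
        cos_h0 = -math.tan(phi) * math.tan(delta)
        if cos_h0 <= -1:      # polar day: sun never sets
            h0 = math.pi
        elif cos_h0 >= 1:     # polar night: no insolation at all
            return 0.0
        else:
            h0 = math.acos(cos_h0)
        return (S0 / math.pi) * (h0 * math.sin(phi) * math.sin(delta)
                                 + math.cos(phi) * math.cos(delta) * math.sin(h0))

    for label, day in (("summer solstice", 172), ("1 September", 244), ("equinox", 266)):
        print(f"{label:16s} {daily_insolation(78, day):6.1f} W/m^2")

    Roughly 530 W/m^2 at the solstice, about 190 by 1 September, and under 80 by the equinox - so the water opens widest just as there is hardly any sunlight left to absorb.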

  14. We haven't got our thinking caps on, have we?

    If the min ice extent is getting smaller (since sat. records began) then the 'circle of ice' left at ice min is also becoming smaller. Would this not mean that the ice edge at 'min.' is also budging ever further northwards?

    Though the resumption of re-freeze remains static in mid Sept, the location of that freeze moves ever northwards as the size of the pack reduces?

    To me that would suggest that we'd need to look at when, say, 78 degrees north (for the sake of argument) started to freeze again, and not when the summer loss ended and the autumn re-freeze began.

    Am I making sense?

    If, in '72, there was 8 million km2 at min and this year we are at 5-and-a-bit million km2, then that must be quite a jump north, and yet it still started its re-freeze around the same time as it did when it was much further south???

    That sounds like re-freeze (in a given geographic location) is starting ever later, does it not???

    The NASA source I quoted breaks down the 1978-2007 data into specific sea areas:

    Sea of Okhotsk, Bering Sea, Hudson Bay, Baffin Bay, Greenland Sea, Kara-Barents Seas, Arctic Ocean, Canadian Archipelago, Gulf of St Lawrence, as well as the Total Arctic.

    It would therefore be possible to do the analysis you suggest. Some will be easy to do - Bering and Gulf of St Lawrence for instance - the date of the last consecutive day of zero ice should do, and also the number of consecutive days without ice for these areas. There may be other odd days without ice due to break-up of the pack, but I'd suggest these are ignored.

    For the other sea areas, a simple minimum will suffice.

    The way I'd do this is to dump the data into a spreadsheet, and mark a character (I used "X" for the total Arctic ice) in a separate column against each row which contains the minimum for that year. I'd then sort on the column with the "X" in descending order, and then by date in ascending order, and just create a scatter chart from the selected dates and extents. Bar charts would be good to show the consecutive ice-free periods in the Bering and Gulf of St Lawrence data.

    I've started, but have no time to finish: any takers?

    [attached spreadsheet: Arctic seaice1978-2007.xls]
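
    For anyone taking this up, here is the same recipe sketched in code rather than a spreadsheet (a sketch only - pandas assumed, and the file and column names are invented, so adapt them to however the NASA dump is actually laid out):

    import pandas as pd

    # Sketch of the spreadsheet recipe: mark each year's minimum-extent row
    # (the "X" column), then look at the dates of those minima.
    df = pd.read_csv("total_arctic.csv", parse_dates=["date"])  # hypothetical file
    df["year"] = df["date"].dt.year

    min_idx = df.groupby("year")["extent_km2"].idxmin()  # row of each year's minimum
    df["is_min"] = False
    df.loc[min_idx, "is_min"] = True

    minima = df[df["is_min"]].sort_values("date")
    minima["day_of_year"] = minima["date"].dt.dayofyear
    print(minima[["date", "day_of_year", "extent_km2"]])

    # Scatter of minimum day-of-year against extent (uncomment with matplotlib):
    # minima.plot.scatter(x="day_of_year", y="extent_km2")

    For the Bering and Gulf of St Lawrence runs of zero ice, a similar groupby over consecutive zero-extent days would do the job.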

  15. Jethro, the Arctic datasets from 1972-2007 can be found here:

    http://polynya.gsfc.nasa.gov/seaice_datasets.html

    Jackone's spreadsheet supplies the data for the last two minima.

    Date         Day of year   Total Arctic ice extent (km2)
    08/09/1972       252            7292045
    14/09/1973       257            7172536
    17/09/1974       260            6904485
    03/09/1975       246            7121414
    13/09/1976       257            7039986
    11/09/1977       254            6867947
    17/09/1978       260            6798272
    21/09/1979       264            6904193
    24/08/1980       237            7517696
    12/09/1981       255            6905228
    07/09/1982       250            7161541
    08/09/1983       251            7198425
    16/09/1984       260            6370661
    09/09/1985       252            6478039
    06/09/1986       249            7147201
    02/09/1987       245            6894980
    11/09/1988       255            7014784
    22/09/1989       265            6846555
    21/09/1990       264            6012693
    16/09/1991       259            6267875
    07/09/1992       251            7140304
    06/09/1993       249            6148266
    05/09/1994       248            6908397
    07/09/1995       250            5993640
    10/09/1996       254            7119571
    20/09/1997       263            6592881
    18/09/1998       261            6296278
    12/09/1999       255            5696905
    11/09/2000       255            5934431
    19/09/2001       262            6522513
    17/09/2002       260            5592786
    17/09/2003       260            5946320
    19/09/2004       263            5728849
    19/09/2005       262            5295806
    15/09/2006       258            5737976
    14/09/2007       257            4150112
    09/09/2008       253            4707813
    13/09/2009       256            5249844

    [attached chart: post-7302-12547850755483_thumb.png]

    The day of year of the minimum shows little correlation with the minimum ice extent; the year of the minimum does correlate.
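
    That claim is easy to check against the table above (a quick sketch - Pearson r, values exactly as posted):

    import statistics

    # Minima from the table above: day of year and extent (km2), 1972-2009
    years = list(range(1972, 2010))
    day_of_year = [252, 257, 260, 246, 257, 254, 260, 264, 237, 255, 250, 251,
                   260, 252, 249, 245, 255, 265, 264, 259, 251, 249, 248, 250,
                   254, 263, 261, 255, 255, 262, 260, 260, 263, 262, 258, 257,
                   253, 256]
    extent = [7292045, 7172536, 6904485, 7121414, 7039986, 6867947, 6798272,
              6904193, 7517696, 6905228, 7161541, 7198425, 6370661, 6478039,
              7147201, 6894980, 7014784, 6846555, 6012693, 6267875, 7140304,
              6148266, 6908397, 5993640, 7119571, 6592881, 6296278, 5696905,
              5934431, 6522513, 5592786, 5946320, 5728849, 5295806, 5737976,
              4150112, 4707813, 5249844]

    # statistics.correlation needs Python 3.10+
    print("r(day of year, extent) =", round(statistics.correlation(day_of_year, extent), 2))
    print("r(year, extent)        =", round(statistics.correlation(years, extent), 2))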

  16. I have recently been involved in a discussion elsewhere about the asymmetrical way the earth behaves over time, because of the differences between what happens at the two poles. Here is an example, Global Sea Ice (source: http://polynya.gsfc.nasa.gov/seaice_datasets.html, Arctic and Antarctic combined):

    [attached chart: post-7302-1253799663441_thumb.png]

    Note the shape of the curve - it is not a sine wave - there is typically a sharp trough in early March, a small peak in late July, a small trough in late September and another small peak in early November, although the dates vary from year to year. A little investigation shows that the deep trough corresponds to the minimum Antarctic ice extent, and the little trough to the Arctic Ocean minimum.

    The curve is the sum of the two polar quasi-sinusoidal ice-extent curves, running roughly 180 degrees out of phase.
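
    Incidentally, two *pure* sinusoids exactly out of phase would simply sum to another sinusoid; the multi-peaked shape appears because each polar cycle is only quasi-sinusoidal, so the fundamentals largely cancel in the sum while the harmonics survive. A toy sketch (made-up amplitudes and phases, nothing fitted to the NASA data):

    import math

    # Toy version of the global curve: each pole gets a seasonal cosine plus a
    # small second harmonic, since each real ice cycle is only quasi-sinusoidal.
    # t runs 0..2*pi over the year, t = 0 at 1 January; units are notional Mkm2.
    def antarctic(t):  # big seasonal swing, minimum in late February
        return 11.0 - 8.0 * math.cos(t - 0.95) - 1.5 * math.sin(2 * (t - 0.95))

    def arctic(t):     # smaller swing, minimum in mid September
        return 10.8 - 4.8 * math.cos(t - 4.44) - 1.0 * math.sin(2 * (t - 4.44))

    for week in range(52):
        t = 2 * math.pi * week / 52
        total = antarctic(t) + arctic(t)
        print(f"week {week:2d}  {total:5.2f}  " + "#" * int((total - 17) * 2))

    The crude text plot shows four extrema over the year - one deep trough, one shallower trough and two unequal peaks - the same family of shapes as the plot above.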

    This curve, or variations of it, crops up all over the place once you become aware of it - even superimposed on decadal trends when looking at global datasets. It is the major part of the data that is filtered out of anomaly datasets. It occurs in global SST measurements:

    [attached chart: 2zgi8n7.png]

    (from Bob Tisdale) - the cool southern-summer Antarctic meltwater provides the deep trough in SSTs, before the southern ocean starts to warm up.

    It also occurs in global ozone measurements:

    [attached chart: total_ozone_M.jpg]

    (from ESA) - can anyone suggest why?

    The pattern shows up in some even more surprising places.

    Here is the variation in the Length of Day (i.e. the rotation rate of the earth) since 1962:

    [attached chart: post-7302-12538021853295_thumb.png] (source: IERS)

    A major component in the cause of LOD changes is atmospheric angular momentum (AAM), which again has a similar annual pattern.

    Note the similar pattern over the year, but also notice that it is not in phase with the sea ice. However, is there a connection between the redistribution of sea ice and AAM?

    It is not only the earth that shows an asymmetric north-south pattern - so does the sun!

    [attached chart: CosmicRayFlux.png]

    (From Leif Svalgaard).

    Cosmic ray counts are a proxy for the strength of the solar magnetic field; the measurements on earth rectify (like a diode) the changing north-south polarity of the sun's magnetic field over the course of the Hale cycle. If you visualise the sharp peaks as troughs (southwards) and the flat peaks as peaks (northwards), I think you will see what I mean.

    Finally, the curve is similar to the light curve that is supposed to arise from eclipsing binary stars, with a big trough, a small peak, a small trough and another small peak. Differential polar radiation (i.e. a dark south pole, bright north pole) from a star with a precessing axis could create this waveform.

    If anyone can suggest other datasets which show this pattern, I will do some further analysis and see if there are any links between the seasonal variations.

    If this is out of place, I would be happy for it to go somewhere else.

  17. Richard A. Muller also has an alternative theory, here.

    In a nutshell, the inclination of the earth's orbit to the "invariable plane of the solar system" reaches an extreme tilt every 100,000 years. Whereas the invariable plane is swept free of dust by the planets as they orbit, the remaining dust from the accretion disk that formed the solar system is still spread on either side.

    If you think of a fried egg sandwich with the sun as the yolk, and the planetary orbits as the white, and the bread as the remaining dust, you get the picture, I hope.

    When the earth's orbital inclination is at an extreme, then twice a year the earth will pass through the dust, and in fact drag dust around the orbit in its wake. There should be spectacular meteor showers, and the amount of solar radiation reaching the surface of the earth would diminish, because the finest dust would enter the upper atmosphere, dimming the sun. The dust acts as nuclei for condensation of the water in the cooling atmosphere, and the earth enters a glaciation. Eventually the earth's orbital tilt clears the dust fields and the dust clears from the atmosphere, but the earth is now in a glacial state: the high albedo, low water vapour levels due to low temperature, and low CO2 levels mean there is little greenhouse effect, and the earth sits in a glacial low-temperature equilibrium.

    The assumption is then that vulcanism, perhaps exacerbated by continents sinking under the additional weight of ice, increases the greenhouse gases by melting ice and releasing carbon dioxide, until the ice caps retreat once more.

  18. Symposium website here.

    A series of video lectures accompanied by PowerPoint and PDF presentations.

    PUBLIC SYMPOSIUM

    NATURAL CLIMATE CHANGE

    held at

    MONASH UNIVERSITY, Clayton,

    SUNDAY, 24 May, 2009,

    LECTURE THEATRE, South 1

    (near Alexander Theatre, parking nearby)

    This was a scientific symposium, open to the public, directed to explaining the natural driving forces that cause natural climate change. It was convened to show that the earth has recently completed a natural warming phase, and has now entered a natural cooling phase. This new understanding of the natural driving forces of climate change enables a reasonable prediction of future climate trends. It is emphasised that the speakers regard the understanding of natural climate change as a strictly scientific matter, and not a matter for resolution at political levels.

    The speakers are all independent scholars who share an understanding that climate change on earth is determined by natural causes rather than by the influence of carbon emissions of mankind. The speakers are respected nationally and internationally for expertise in their field.

    Emeritus Professor Lance Endersbee, AO, FTSE. ( Convenor)

    William Kininmonth, former Head, National Climate Centre, Bureau of Meteorology,

    Richard Mackey

    Ian Wilson, PhD.

    David Archibald.

    Professor Robert Carter, Marine Geophysical Laboratory, James Cook University.

    Topics:

    The Recent History of Man, and the Influence of Climate.

    Carbon Dioxide and Computer Models: the Fatal Flaw.

    Carbon Dioxide and the Oceans.

    The Electric Universe.

    The Role of the Sun in Regulating the Climate Dynamics of the Earth.

    External Forcing of the Rotation Rate of the Earth, and Climate Effects.

    Solar Cycles, Cosmic Radiation, Clouds, and Climate.

    The Moon - Extreme Tides, QBO, SOI and ENSO

    Scientific Summary: The Dynamic Drivers of Natural Climate Change.

    Towards a New Public Understanding of the Natural Climate.

    Link to Robert Carter's website also has further links to YouTube lectures.

  19. No, nowt to do with carbon capture - Pete asked what happened to waste products from power stations, suggesting they could be put to good use; the link was to demonstrate that they mostly already are.

    Plastic dumps aka carbon capture, I like that; have you ever considered a career in marketing/advertising? It's very lucrative.

    A career in marketing/advertising? No, I see myself more as a WALL-E, picking up the plastic waste and squeezing it into little cubic bricks. I even have a border collie to play M-O.

  20. A little more research on the GLOSEA product, from which the seasonal forecasts (for the UK, and anywhere else they want to sell their products) are taken, has come up with the following from the Met Office:

    PREDICTION OF THE GLOBAL TEMPERATURE ANOMALY FOR 2009 USING DYNAMICAL AND STATISTICAL METHODS

    Chris Folland & Andrew Colman, Met Office Hadley Centre

    FORECAST ISSUED DEC 30 2008. A PRESS RELEASE is on http://www.metoffice.gov.uk/corporate/pressoffice/2008/pr20081230.html.

    I found it HERE. I had to log on to the Science section for seasonal forecasts, so this link may not work. PM me if you have difficulty.

    It clearly states that volcanic, solar and greenhouse gas forcings are being incorporated in the model:

    1.1 PREDICTORS USED IN THE EMPIRICAL METHODS

    The six predictors listed below have been identified by more than one author to be related to large-scale temperature:

    a) ENSO HF1: The High Frequency El Nino Southern Oscillation sea surface temperature index 1 (ENSO SST HF 1). This is the time series of the first covariance eigenvector of high frequency (<13 years) global SSTA for 1891-2005 in Parker et al (2007). This eigenvector pattern is related strongly to ENSO.

    b) IPO: The Inter-decadal Pacific Oscillation (IPO), the quasi-global manifestation of the PDO. This is used in the form of the time series of the second covariance eigenvector of low frequency global SSTA for 1891-2005 as described in Parker et al (2007). This is a small factor and replaces previous use of the Atlantic Multidecadal Oscillation which is not statistically significant in current models.

    c) VOLCANO: An index of global volcanic dust cover (VOLCANO) produced by Sato et al (1993). Dust veils from major volcanic eruptions, particularly in the tropics, lead to a significant drop in global temperature for a year or two after the eruption.

    d) SOLAR: An index of solar irradiance (SOLAR) as supplied by Lean (Frohlich & Lean, 1998) and extrapolated to the present. A small downward adjustment has been made to the figures for 2006, 2007 and 2008 to place 2008 at the bottom of the solar cycle. Latest observations on http://www.dxlc.com/solar/solcycle.html show the latest solar minimum trough to be 18-24 months later than expected from the Lean predictions.

    e) GSO: An estimate of the global mean anthropogenic net radiative forcing at the tropopause. This comes from changing concentrations of well-mixed anthropogenic greenhouse gases, the direct and indirect effects of sulphate aerosol emissions and from tropospheric ozone concentration changes (GSO).

    f) GLOSEA NINO 3.4: Predictions of the Nino3.4 area (170-120°W, 5°N-5°S) SST anomaly made by the Met Office GLOSEA coupled ocean-atmosphere global circulation model are used to replace the current observed ENSO state as measured by predictor a), observed ENSO HF1, to make a second forecast.

    GSO forcing is now the average of the forcings appropriate to the IPCC Fourth Assessment Report (2007) B1, B2, A2 and A1FI scenarios. It is expressed as the annual mean forcing at the top of the troposphere in W m-2. This represents only a small change from the previous forcings used to calibrate the models since 1947. Experiments show that the best fit to the observed surface temperature data since 1947 comes from using the value for the previous full year. Predictor data for the following periods are used:

    ENSO HF1 October-December 2008

    VOLCANO December 2008 (extrapolated from data ending in 1997, assuming no significant recent activity)

    SOLAR January-December 2008

    GSO January-December 2008

    GLOSEA NINO3.4 index. January-April 2009 forecast (used in place of ENSO HF1 )

    The predictor periods were chosen to extract maximum skill from data available at the time of the forecast.

    However, at variance with the WMO report I quoted in an earlier post, these forcings were being used as long ago as 2000 in the GLOSEA model, as this 2000 report states:

    1.1 PREDICTORS

    The six predictors selected and listed below have been identified by more than one author to be related to large-scale temperature:

    a) The Inter-Hemispheric Contrast (IHC) index which is the time series of the second covariance eigenvector of low frequency global SST for 1911-1995 in Folland et al (1999). This index is also highly correlated to rainfall in the Sahel on decadal time scales.

    b) The High Frequency El Niño Southern Oscillation index 1 (ENSO HF 1) is the time series of the first covariance eigenvector of high frequency (<13 years) global SST for 1911-95. This eigenvector pattern is clearly strongly ENSO related.

    c) The High Frequency El Niño Southern Oscillation index 2 (ENSO HF 2) is the time series of the second covariance eigenvector of high frequency (<13 years) global SST. This eigenvector pattern is also ENSO-related, but the time series is 6-9 months out of phase with HF ENSO 1. These patterns are also in Folland et al (1999).

    d) An index of global volcanic dust cover (VOLCANO) produced by Sato et al (1993). Dust veils from major volcanic eruptions, particularly in the tropics, lead to a significant drop in global temperature for a year or two after the eruption.

    e) An index of solar irradiance (SOLAR) as supplied by Lean (Frohlich & Lean, 1998).

    f) An estimate of the global mean net radiative forcing at the tropopause from well-mixed anthropogenic greenhouse gases, the direct and indirect effects of sulphate aerosol emissions and from stratospheric and tropospheric ozone concentration changes (GSO). This index was calculated using the Hadley Centre's current coupled ocean-atmosphere general circulation model, HadCM3. It is expressed as the annual mean forcing at the top of the troposphere in W m-2 (Johns, personal communication).

    g) In one of the forecasts, predictions of the Niño3.4 area (170-120°W, 5°N-5°S) SST anomaly made by the NCEP coupled ocean-atmosphere global circulation model (NCEP NINO3.4).

    Over the jackknife testing period, 1949-1998, our knowledge of the state of the North Atlantic Oscillation in the current year did not add skill. Tests also showed that the current state of the Interdecadal Pacific Oscillation added no interannual skill over and above that of ENSO. We chose this period because the predictor and predictand data are best then, though in the near future current advances in data set analysis might allow this period to be substantially extended.

    Predictor data for the following periods are used. The examples given are for the prediction for 2000:

    IHC September-November 1999

    ENSO HF1 September-November 1999

    ENSO HF2 September-November 1999

    VOLCANO November 1999

    SOLAR January-December 1999

    GSO December 1998 - November 1999

    NCEP NINO3.4 January-June 2000 (used in place of ENSO HF1 )

    The predictor periods were chosen to extract the maximum available skill from data available at the time of the forecast. Note that for the observed predictors, use of predictor values simultaneous with the forecast in the training equations did not produce more than a marginal increase of skill. Use of such predictors in real time would also involve estimating them in the year ahead. So only observed predictor values are used, except for the small annual SOLAR radiation index, which can be estimated quite accurately for the current year. So far, no lags have been introduced into the radiative forcing data predictors, other than an effective lag of about one year due to the choice of predictor values that centre on the middle of the year prior to that being predicted. This will be investigated in future.

    I guess the estimate of "skill" refers to hindcasts made by the model on various runs with and without the predictors, with various adjustments made to each to achieve the best result.
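
    For what it's worth, here is a sketch of what that jackknife testing looks like in practice (synthetic data and a plain one-predictor regression; the Met Office's actual scheme is multi-predictor and is not spelled out in these extracts): each year in 1949-1998 is hindcast from a model trained on all the other years, and "skill" is how much those out-of-sample errors beat a climatology forecast.

    import math
    import random
    import statistics

    # Sketch of jackknife (leave-one-out) hindcast testing over 1949-1998.
    # Synthetic predictor/predictand data stand in for the real scheme.
    random.seed(1)
    years = list(range(1949, 1999))
    x = [random.gauss(0.0, 1.0) for _ in years]            # synthetic predictor
    y = [0.6 * xi + random.gauss(0.0, 0.5) for xi in x]    # synthetic predictand

    errors = []
    for i in range(len(years)):
        # Train on every year except year i, then hindcast year i
        xs = [v for j, v in enumerate(x) if j != i]
        ys = [v for j, v in enumerate(y) if j != i]
        mx, my = statistics.fmean(xs), statistics.fmean(ys)
        slope = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
                 / sum((a - mx) ** 2 for a in xs))
        errors.append(y[i] - (slope * x[i] + (my - slope * mx)))

    rmse = math.sqrt(statistics.fmean(e * e for e in errors))
    clim = statistics.pstdev(y)        # error of always forecasting the mean
    skill = 1.0 - (rmse / clim) ** 2   # positive = beats climatology
    print(f"jackknife RMSE {rmse:.3f} vs climatology {clim:.3f}; skill {skill:.2f}")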
