
Scepticism Of Man Made Climate Change


Paul


Posted
  • Location: Rochester, Kent

How do we know the climate is warming?

 

Normally, one gets hold of a climate series [1], and draws a trend line on it; something like this,

 

[Chart: annual anomaly series with a fitted linear trend line and its equation shown top left]

 

The trend is clear: it's going up. Furthermore, we can quantify by how much. If you look at the equation at the top left, you'll note the first term (0.00463021) is how much the climate is trending upwards each year. This gives us a linear warming of some 0.75°C over the series. This is quite compelling, and is pretty much in line with the IPCC analysis [2].
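This really is easy to replicate. A minimal Python sketch, assuming a hypothetical stand-in series (the trend of 0.00463 °C/yr plus some noise is generated here for illustration rather than downloading the real HadCRUT4 file from [1]):

```python
import numpy as np

# Hypothetical stand-in for an annual anomaly series such as HadCRUT4 [1];
# the trend (0.00463 degC/yr) and noise level are assumptions for illustration.
years = np.arange(1850, 2013)
rng = np.random.default_rng(0)
anomalies = 0.00463 * (years - 1850) - 0.4 + rng.normal(0.0, 0.1, years.size)

# Least-squares straight line: the first coefficient is the warming per year,
# the same term read off the chart's fitted equation.
slope, intercept = np.polyfit(years, anomalies, 1)
total_warming = slope * (years[-1] - years[0])
print(f"trend = {slope:.5f} degC/yr, linear warming over series = {total_warming:.2f} degC")
```

The slope multiplied by the length of the series gives the headline "linear warming" figure.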

 

This sort of analysis is exceptionally common on all sorts of climate blogs, and is routinely used to 'prove' that the climate, regardless of the recent hiatus, is warming. Look at the upward slope, look at the equation terms: it's still warming; it's indisputable. It's easy to replicate, with near-100% accuracy, the findings of the IPCC with an internet connection, a spreadsheet, and some source data; those who can't, won't, or fail to must, clearly, be in denial. The presence of terms like equations, linear regression, outliers, and residuals is normally enough to frighten off the less mathematically inclined, and if that's not enough, keep dropping in terms like 'denial' (designed to connect the dots to holocaust denial), misspell the term 'sceptic', and, more recently, spread the notion that those who question a consensus must, necessarily, be misleading.

 

Which got me thinking: what would it take for this linear trend to point downwards, i.e. to suggest that the climate might be cooling? Let's have a look at what linear trends tell us if we set the global climate anomaly to -0.5°C for thirty years. I pick thirty years because, we are told, ten years, or even seventeen years, is simply not long enough to be statistically significant; but given that the anomaly is measured against a thirty-year baseline, that must be the length of time required,

 

[Chart: the same series with the anomaly held at -0.5°C for thirty years; the fitted linear trend remains positive]

 

So, with 30 years of extremely low anomalies, we can be sure that the climate is still warming. OK, so the linear warming over the entire series is now just 0.22°C, but warming we are. Whilst the population dies of starvation due to poor crops and fuel poverty is rife, we can be safe in the knowledge that it's a blip; the climate will continue to warm. In fact, you'd need to go out to 2066 to post a negative linear trend. That's some 53 years,
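The thought experiment above is a one-liner to check. A sketch, assuming an idealised (noise-free) warming series at the same 0.00463 °C/yr followed by thirty years pinned at -0.5 °C; the exact numbers differ from the charts, but the sign of the result is the point:

```python
import numpy as np

# Idealised warming at the assumed rate, then 30 years of very low anomalies.
warm_years = np.arange(1850, 2014)
warm = 0.00463 * (warm_years - 1850) - 0.4
cold_years = np.arange(2014, 2044)
cold = np.full(cold_years.size, -0.5)

years = np.concatenate([warm_years, cold_years])
anoms = np.concatenate([warm, cold])

# The full-series linear trend stays positive despite 30 years at -0.5 degC.
slope = np.polyfit(years, anoms, 1)[0]
print(f"full-series trend: {slope:+.6f} degC/yr")
```

The earlier warming dominates the least-squares fit, so the full-series slope stays positive even with three decades of deep cold at the end.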

 

[Chart: the series extended to 2066 with the anomaly held at -0.5°C; only then does the fitted linear trend turn negative]

 

That's one long blip. The likelihood is that I'll be dead before I can post a linear regression that shows the climate might be cooling down, even in the presence of catastrophic cooling for over half a century. My advice: if you see a linear trend line and a commentary about the direction of the trend, ignore it and move on; it's meaningless twaddle.

 

All seems a little convenient to me. The thing is, using linear trends on time-series charts is almost always a bad idea, and that's not just with reference to climate: it applies to nearly everything. It has its uses, such as detrending for further analysis, but to prove a point it is, at best, a plain stupid use of statistics and, at worst, it really is telling lies. And that applies universally, whether you think hell on earth is just around the corner or you're expecting the next ice age: statistics is good because, when used properly, it just doesn't care.

 

[1] http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/time_series/HadCRUT.4.2.0.0.annual_ns_avg.txt

[2] http://www.ipcc.ch/publications_and_data/ar4/syr/en/mains1.html

Edited by Sparkicle

Posted
  • Location: York
  • Weather Preferences: Long warm summer evenings. Cold frosty sunny winter days.

We know that some of the reconstructed temperature data sets come from tree-ring analysis. The attached raises questions about the inherent bias in these data sets and highlights the pitfalls of trying to use them to predict future climate.


Posted
  • Location: Derbyshire Peak District South Pennines Middleton & Smerrill Tops 305m (1001ft) asl.

Can anyone more knowledgeable than me clear this one up... 

 

NASA Massively Tampering With The US Temperature Record

Posted
  • Location: Rochester, Kent

 

Can anyone more knowledgeable than me clear this one up... 

 

NASA Massively Tampering With The US Temperature Record

 

 

Goddard is not really very reliable, and almost certainly he has never read this. And I'd fall off my chair if he even knew about the issue of homogenisation. For all we know, a temperature record that was sited on black concrete was removed ... but it's much more likely that the chaps at NASA's climate office are pretty poor computer programmers.

Edited by Sparkicle

Posted
  • Location: Derbyshire Peak District South Pennines Middleton & Smerrill Tops 305m (1001ft) asl.

Goddard is not really very reliable, and almost certainly he has never read this. And I'd fall off my chair if he even knew about the issue of homogenisation. For all we know, a temperature record that was sited on black concrete was removed ...

 

Thanks Sparky


Posted
  • Location: Napton on the Hill Warwickshire 500ft
  • Weather Preferences: Snow and heatwave

Goddard is not really very reliable, and almost certainly he has never read this. And I'd fall off my chair if he even knew about the issue of homogenisation. For all we know, a temperature record that was sited on black concrete was removed ... but it's much more likely that the chaps at NASA's climate office are pretty poor computer programmers.

 

Maybe he read this part and asked: why remove the 1930s?

 

--------------------------------------

The reason for the larger number of cold step changes is not completely clear, but they may be due in part to systematic changes in station locations from city centers to cooler airport locations that occurred in many parts of the world from the 1930s through the 1960s.

--------------------------------------

 

Not sure how 'poor programmers' would go down with 'the media', who jump on data manipulation.

Edited by stewfox

Posted
  • Location: Rochester, Kent

Not sure how 'poor programmers' would go down with 'the media', who jump on data manipulation.

 

Take a look at this. Sure, for non-programmers, this is pretty much gobbledegook; essentially, it is titled the Good Programming Guide, but it's little more than ways to get your code to execute faster. Most modern programmers will fall off their chair in abject horror. These days, code is considered infinitely inefficient if it isn't correct, and, as far as I can tell, not one bit of the advice mentions correctness. Not one bit.

 

NASA isn't always like this; I suspect it's limited to their climate science groups. Take a look at this - which is really the very, very high end of the field.

Edited by Sparkicle

Posted
  • Location: Rochester, Kent

Take a look at this. Sure, for non-programmers, this is pretty much gobbledegook; essentially, it is titled the Good Programming Guide, but it's little more than ways to get your code to execute faster. Most modern programmers will fall off their chair in abject horror. These days, code is considered infinitely inefficient if it isn't correct, and, as far as I can tell, not one bit of the advice mentions correctness. Not one bit.

 

NASA hasn't always been like this; I suspect it's limited to their climate science groups. Take a look at this - which is really the very, very high end of the field.

 

 

:)

Edited by Sparkicle

Posted
  • Location: Derbyshire Peak District South Pennines Middleton & Smerrill Tops 305m (1001ft) asl.

If you look at the first two graphs in his post, would you agree that they show that the 30s have been 'eliminated'?

If yes, look again and see if you can spot a fundamental difference between the two plots.

 

D.

 

Hi diessoli, I'm aware of the differences and the fact that the 30s have vanished; my question was more: why? 


Posted
  • Location: Derbyshire Peak District South Pennines Middleton & Smerrill Tops 305m (1001ft) asl.

That's the thing. Why do you think they have vanished? From looking at the graph?

 

D.

 

But the answer to that would not come from looking at the graph, but maybe from asking why the data is missing?


Posted
  • Location: Beccles, Suffolk.
  • Weather Preferences: Thunder, snow, heat, sunshine...

That's the thing. Why do you think they have vanished? From looking at the graph?

 

Edit:

I don't want to beat about the bush. The second plot contains a decade more of data. That decade happens to be pretty warm, and so it appears in the second graph that the 30s are suddenly cooler. If you look at the actual values, the 30s are indeed cooler, but only very slightly, and talking of 'elimination' is pure hyperbole. The trends I calculate in both data sets, the 1999 one and the updated one, are positive (only just, though).

 

D.

That's why I tend to use global data... Anyone can produce a graph with inconvenient data extracted, for a multitude of reasons.


Posted
  • Location: Solihull, West Midlands. - 131 m asl
  • Weather Preferences: Sun, Snow and Storms

Diessoli

 

You too are missing the point - 

 

1933 (I think) has been reduced from 1.5°C to 1.2°C.

1999 (I think) has been increased from 0.9°C to 1.45°C.

The extra data in the 2000 era is not involved in that comparison.

What has caused the above two massive changes? (Bear in mind that global warming is currently estimated at 0.1°C per decade.)

Also, is the figure for (I think) 2011 correct? It is the value which distorts the whole graph.

What has happened to these figures since 2011?

 

MIA


Posted
  • Location: Rochester, Kent

What I meant is that it is obvious that the code should be correct, and the fact that it's not mentioned in the programming guide does not say anything about it - whereas you made it sound as if it were some big deal.

 

D.

 

I think assurances that the model source code is correct are pretty valuable, if not essential; but then that's me.

That's why I tend to use global data... Anyone can produce a graph with inconvenient data extracted, for a multitude of reasons.

 

Yep, and changes (read: errors and corrections) tend to have less impact on the overall result.

Edited by Sparkicle

Posted
  • Location: Rochester, Kent

Yes of course. But that is not what the 'programming guide' is about. You make it sound as if that omission means that they don't care about it being correct.

 

The guide is called the Good Programming Guide. It isn't one; it is a way to make Fortran programs run faster. Some naive PhD might turn up and believe that fast code is good code - and I've seen that happen before! It isn't. Correct code is good code. Whilst it might seem like an exercise in pedantry, it isn't; these issues vex the computer programming world today. Premature optimisation is considered evil for a whole variety of reasons, and, in that respect, it should be renamed the 'Poor Programming Guide'.

 

Incidentally, there are good climate programming standards out there: the Met Office standards, without reading them all, look pretty good to me. If you look at the code behind NASA's model, it comes nowhere near close.

Edited by Sparkicle

Posted
  • Location: Rochester, Kent

For instance, consider the following excerpt - purported to be 'good code' - from NASA,

 

 

 


Array arithmetic. In FORTRAN 90, arrays can be used directly in an expression (as long as it is conformal) without having to loop over the indexes. The example discussed above can be compactly written as below with the compiler deciding the most efficient way to loop over the variables.

      COMMON U,V,T,P,Q
      COMMON/WORK6/UT(IM,JM,LM),VT(IM,JM,LM),TT(IM,JM,LM),PT(IM,JM),
     *             QT(IM,JM,LM)
      COMMON/WORK2/UX(IM,JM,LM),VX(IM,JM,LM),TX(IM,JM,LM),PX(IM,JM)
      DIMENSION UALL(IM,JM,LM*3+1),UTALL(IM,JM,LM*3+1),UXALL(IM,JM,LM*3+1)
      EQUIVALENCE (UALL,U),(UTALL,UT),(UXALL,UX)
      ....
      UXALL=UALL
      UTALL=UALL
      QT=Q

 

Then read the relevant part of the MetO standards,

 

 


Some of the following sections detail features deprecated in or made redundant by Fortran 90. Others ban features whose use is deemed to be bad programming practice as they can degrade the maintainability of code.

  • COMMON blocks - use Modules instead

 

You don't even need to know Fortran to realise something is really, really wrong there (and the MetO standards predate the NASA page: 1995 vs 1999). Look at the existence of the COMMON blocks. Incidentally, the use of EQUIVALENCE (use pointer types instead) is also banned under the MetO quality assurance regime, but actively encouraged by NASA's climate office.

 

The reason COMMON is banned? Global variables are evil.
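A tiny, hypothetical Python sketch of why (none of this is NASA's actual code): one routine hard-wires an assumption about a shared global array, and a remote routine silently violates it, which is exactly the kind of coupling COMMON blocks invite.

```python
# A shared global list plays the role of a COMMON block array.
shared_grid = [1.0] * 10

def mean_anomaly():
    # Hard-wired assumption: the grid always has 10 cells. Nothing enforces it.
    return sum(shared_grid) / 10

def remote_routine():
    # A far-away piece of code resizes the shared array for its own purposes...
    del shared_grid[5:]

assert mean_anomaly() == 1.0   # correct before the interference
remote_routine()
print(mean_anomaly())          # ...and mean_anomaly is now silently wrong
```

With the array encapsulated behind a module or object interface, the resize would have to go through code that maintains the invariant, which is the point of the MetO rule.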

Edited by Sparkicle

Posted
  • Location: Rochester, Kent

I also still claim that Goddard misleads people by including data past 1999 in the second graph, which visually reduces the 1930s, thus incorrectly bolstering his claim.

 

Agreed. Looks to me like Goddard is just one of those 'sceptics' it's best to ignore, rather than spending a day or two figuring out what he (or his source) has done wrong. It would be much better to provide data in tables, as well as charts, if you are making such a bold claim.

Edited by Sparkicle

Posted
  • Location: Rochester, Kent

The guide does not mention that 1+1=2 either.

 

Except x + x = y, therefore y / 2 = x, sometimes doesn't hold up. The maths might be right, but coding it correctly is a whole different kettle of fish. Consider this simple code,

with Ada.Text_IO;

procedure Main is
   package My_Float is new Ada.Text_IO.Float_IO (Float);
   x : Float := 0.1;
   y : Float := x + x;
begin
   My_Float.Put (ITEM => y, FORE => 1, AFT => 20, EXP => 0);
   x := y / 2.0;
   My_Float.Put (ITEM => x, FORE => 1, AFT => 20, EXP => 0);
end Main;

Which gives the following results

 

x = 0.1

y = x + x = 0.20000000298023223900

x = y / 2 = 0.10000000149011611900
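The same single-precision effect can be reproduced in Python with numpy's float32 type (Python's built-in floats are double precision, so the 32-bit type has to be asked for explicitly):

```python
import numpy as np

# 0.1 has no exact binary representation; in 32-bit floats the rounding
# error becomes visible around the ninth significant digit, as in the
# Ada output above.
x = np.float32(0.1)
y = x + x
print(f"y = x + x = {y:.20f}")           # not exactly 0.2
print(f"y / 2     = {y / np.float32(2.0):.20f}")
```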

 

And then one starts to wonder, seriously, about sensitivity to initial conditions.

Edited by Sparkicle

Posted
  • Location: Rochester, Kent

Or maybe I have a different understanding of correctness?

 

Indeed, you do.

 

Correctness is that the software will function as the author intended; that is to say, it is logically sound, secure, possesses integrity, and has bounded space and time requirements. I suggest you believe it is the first point, logical soundness, that counts.

 

Consider a global variable. I could write a module that contains a global variable - an array, perhaps; my module might possess all of those properties above, but other authors come along and use that global variable in some hitherto unexpected way, in some remote module (or COMMON block!). It may well be the case that logical soundness is still enshrined, but there are no guarantees - indeed, it is likely impossible to know - that it still possesses any of the other attributes necessary to ensure a quality system.

 

Crucially, it might not be wrong, but it fails the correctness test. This is precisely the reason why object-oriented code has become so popular; when well written and designed, it makes building a correct system easier, particularly because encapsulation is at the heart of the technique, i.e. no, or very few, global variables.

Except climate models are not prone to sensitivity to initial conditions; weather models are.

 

That's absurd. Why on earth would anyone care about running climate ensembles if they aren't testing for initial-condition sensitivity? And they both tend to use, at least, the same primitive-equation solutions (per institution), such as the Reynolds equations. Why would you write it twice? Why would you write it badly twice?
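To illustrate the initial-condition point with a toy (the logistic map, a standard chaotic demonstration system, emphatically not a climate model): a few ensemble members that start within 1e-10 of each other end up completely decorrelated.

```python
# The logistic map at r = 4 is chaotic: nearby trajectories separate
# exponentially. This is a toy illustration, not a climate model.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

# Ensemble of perturbed initial conditions, 1e-10 apart.
members = [0.2 + i * 1e-10 for i in range(5)]
for _ in range(60):
    members = [logistic(x) for x in members]

spread = max(members) - min(members)
print(f"ensemble spread after 60 steps: {spread:.3f}")
```

A parameter-perturbed ensemble works the same way, except it is r (standing in for an uncertain model parameter) that is varied across members rather than the starting state.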

Edited by Sparkicle

Posted
  • Location: Dulwich Hill, Sydney, Australia
  • Weather Preferences: Hot and dry or cold and snowy, but please not mild and rainy!

 

That's absurd. Why on earth would anyone care about running climate ensembles if they aren't testing for initial condition sensitivity?

 

To perturb parameters. There is considerable uncertainty in the parameter inputs not just the climate initial conditions, and you can run ensembles for both reasons.

 

I believe that they do both. Initial conditions, and then parameter perturbations on top of this.

Edited by SomeLikeItHot

Posted
  • Location: Rochester, Kent

To perturb parameters. There is considerable uncertainty in the parameter inputs not just the climate initial conditions, and you can run ensembles for both reasons.

 

I believe that they do both. Initial conditions, and then parameter perturbations on top of this.

 

Why perturb the parameters? The answer is in the question ....


This topic is now closed to further replies.