rightleft22, I think Crunch's problem is not trusting how the data is adjusted. As a non-scientist, suppose you had a weather station that reported the following directly measured average temperatures over 10 years: 72, 72, 73, 72, 72, 71, 74, 72, 72, 72. It seems pretty clear it's flat, but when that gets added to the model "after adjustment" as the following: 68, 69, 71, 70, 71, 72, 74, 72, 73, 75, and now shows a clear warming trend, what do you end up believing?
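For what it's worth, you can check the two hypothetical series above with a quick least-squares fit (a plain-Python sketch; the numbers are just the made-up ones from this example, not real station data):

```python
def ols_slope(ys):
    """Ordinary least-squares trend (degrees per year) for annual values."""
    n = len(ys)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

# Hypothetical series from the example above
raw      = [72, 72, 73, 72, 72, 71, 74, 72, 72, 72]
adjusted = [68, 69, 71, 70, 71, 72, 74, 72, 73, 75]

print(round(ols_slope(raw), 2))       # 0.0  -> flat, as eyeballed
print(round(ols_slope(adjusted), 2))  # 0.66 -> a clear per-year warming trend
```

Same station, same decade, and the fitted trend goes from exactly zero to about two-thirds of a degree per year purely through the adjustment, which is the believability problem in a nutshell.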
The scientific case for adjusting data is often plausible, but the adjustments seem to support the warming case more than the raw data does, and that circumstance clearly muddles believability in the short term (in the long term it shouldn't be possible to "adjust" enough to maintain a reported non-existent trend, which is partly why repeatedly failing to hit projections is so damaging).
Personally, I think the measurement data is still in the "early caveman" period of determining an accurate global temperature. Take a walk through the history of temperature measurements and how much human error is built in (even today, let alone just 50 years ago), and then add in how much extrapolation is required for any historical temperature estimate (all of which are measured indirectly, based on our best theories, which may turn out not to be remotely accurate).