Solar irradiance has been trending downwards for more than 70 years, and steeply downwards for 30 years, the exact opposite of the temperature trend. Are you suggesting the resulting signal response for solar irradiance has a delay of more than 60 years?
The concept in play here is a little abstract but not that difficult to grasp. Anybody who has spent sufficient time in the kitchen or with a chemistry set should generally be able to appreciate it.
You have a starting temperature. In this case the "gold standard" is 19th-century temperature norms (which part of the 19th century may be another matter, given that the last gasp of the Little Ice Age, the Dalton Minimum, and "the Year Without a Summer" all sit at the start of it).
You take note of the temperature of the item you're using in this "experiment" and use it as your reference for future comparisons. But the item in question is rather large and made up of many different substances, some of which are at different temperatures, so you use the temperature of one material as a proxy for the temperature of the whole item.
You then place the item on a burner/stove/heating element (proxy for solar irradiance), and you start to slowly increase the amount of heat energy being introduced into the object.
But parts of the object were much colder (and able to absorb far more energy) than the material you used as your reference, so it takes a long while before you really start to notice a change. (Oceans start to warm by fractions of a degree, glaciers start to melt and recede in response to that extra fraction of a watt per square meter, which then lowers albedo, which then accelerates the process further, etc.)
You then periodically turn the burner up, then down, then up again, before turning it down once more. But you never turn the burner down below your initial reference point. As such, your experiment shows a clear and conclusive trend -- your item keeps getting hotter! The temperature keeps rising regardless of whether you have the burner set on high or on low, because your item hasn't yet reached an equilibrium point in response to that new energy input, even on the lowest setting (which wasn't off).
Thing is, for the past 100 years, that Bunsen burner (the sun) was never turned "off"; it never went below the lowest setting. The item (Earth) hasn't yet reached equilibrium with the amount of energy the sun has been imparting to it, although it probably was getting close, which is why the difference between 1998 and every hot year since has been so incrementally small, even as solar irradiance started to decline (from 20th-century norms, not the 19th-century ones) about 15 years ago.
It has remained hot, and even managed to get a bit hotter, because the proverbial burner was still on.
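To put a rough sketch behind that analogy, here is a toy first-order "lagged burner" model. This is purely my own illustration: the function name, heat capacity, feedback strength, and forcing wiggles are all arbitrary placeholders, not real solar or climate data. All it shows is the shape of the argument -- as long as the burner stays above whatever level the item has already warmed to, the temperature keeps creeping upward even while the burner setting bounces around, and the yearly increments shrink as it nears equilibrium.

```python
# Toy first-order energy-balance model of the "burner" analogy.
# Everything here is an illustrative placeholder, not real climate data:
#   heat_capacity : thermal mass of the "item" (sets how long the lag is)
#   feedback      : how strongly the item sheds heat as it warms
#   forcing       : the burner setting, relative to the reference state

def simulate(forcings, heat_capacity=80.0, feedback=1.0, t0=0.0):
    """Step dT/dt = (F - feedback*T) / heat_capacity one year at a time."""
    temps, t = [], t0
    for f in forcings:
        t += (f - feedback * t) / heat_capacity  # one-year Euler step
        temps.append(t)
    return temps

# Burner turned up and down, but never below the reference point (forcing > 0).
forcing = [1.0] * 30 + [0.6] * 20 + [1.2] * 20 + [0.8] * 30  # arbitrary wiggles
temps = simulate(forcing)

# The anomaly keeps climbing through every up and down, because at each point
# the forcing still exceeds the temperature the item has reached so far, and
# the yearly gains shrink as it closes in on equilibrium (forcing / feedback).
for year in (10, 30, 50, 70, 99):
    print(f"year {year:3d}: forcing={forcing[year]:.1f}  anomaly={temps[year]:.2f}")
```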
With a Grand Solar Minimum now appearing to be rushing towards us, we have the very real prospect that for the next 2 to 3 solar cycles, at a minimum, we may not even see typical 19th-century levels of irradiance. Luckily, if the above oversimplification is right, there should be a lot of "thermal inertia" in the system that will likely take a decade, if not more, to work its way out. The only immediate change you should see, if solar irradiance has been driving things, is that we're probably going to stop setting temperature records after this year, although it could be a close-run thing because of that same thermal inertia (which the "AGW effect" may be helping amplify in some respects -- after all, that's what the greenhouse effect is all about, trapping heat).
As previously posted, if solar irradiance remains effectively unchanged, or starts to increase again, we'll continue setting records -- i.e. "all bets are off."
The hard and solid "real world" test would be a continued decline in irradiance coupled with temperature stagnation, followed by temperature decline within "a few years' time" (up to a 10-year lag may be possible before it's detectable -- if it took decades to really get started, taking decades to reverse course is within reason).
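Running the same sort of toy lag model with the burner cut below the reference point gives a feel for why a lag of that size is plausible. Again, the starting anomaly, the forcing value, and the timescale are arbitrary illustrations, not a forecast.

```python
# Same toy lag model as above, now with the "burner" cut below the reference.
# Numbers are arbitrary illustrations of the lag, not a prediction.

def simulate(forcings, heat_capacity=80.0, feedback=1.0, t0=0.6):
    temps, t = [], t0
    for f in forcings:
        t += (f - feedback * t) / heat_capacity  # one-year Euler step
        temps.append(t)
    return temps

# Start from a warm anomaly of +0.6, then hold forcing at -0.2
# (below the 19th-century-style reference) for 40 "years".
temps = simulate([-0.2] * 40)

# The drop is only about 0.01 per year in this toy setup, so it takes roughly
# a decade before the cumulative decline (~0.1) would stand out from
# year-to-year noise -- stagnation first, then an unmistakable fall.
for year in (0, 5, 10, 20, 39):
    print(f"year {year:2d}: anomaly={temps[year]:+.2f}")
```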