To me it is a science-philosophy thing. One collects data, develops models, and tests them. Or one develops a model and collects data to support it. The conclusion-driven reasoner is fertile ground for excuses and rationalizations: delay of a particular NIST standard cripples a must-have technology, lazy dependent-class workers derail supply-side utopia, every new standards incarnation is the ONE that will finally precipitate inevitability, and only one last regulation stands in the way of laissez-faire supply-side nirvana.
The people are not dumb per se, but they have a broken process. They have decided ahead of time what is, and collect data to support it, ignoring the pieces that don't fit as explainable exceptions.
It is a consequence of nearness. Were one to replace the objects with things less near, the broken process would be more obvious. Suppose GG or SKS were presented information about a company with a new tool to core and make edible a widely distributed, awesome new tropical fruit: that there were hundreds of millions of these fruits, that the fruits were well publicized and endorsed by government, and that the coring tool was comparatively cheap in culinary terms. A presenter claiming people just didn't get it and clung foolishly to their less nutritious bananas would likely be told that it looks like few desire the new miracle fruit, that this truth would prove crippling, and that he needed to rethink things and collect better data before dropping another dime on the corer. Same goes for the trickle-down idiots; they are likely adequately clever on things further away.