Part of the problem is that we're more focused on statistics and measured outcomes than on explanations. We need to go back over what was observed reliably in a statistical sense and then understand the underlying mechanics well enough that we can predict the same result without measuring it.
That's a lot harder than it sounds, but working through that difficulty will force us to understand things we don't currently understand, and the resulting understanding can extrapolate to further predictions that will outperform statistics.
If I understand the circuits of a computer because I designed it, I can predict what it will do when a given segment of code is run on it more reliably than I could by probing it from the outside. You have to understand the mechanics, not just the statistical outcomes of those mechanics.
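As a toy illustration of that contrast (a minimal sketch; the rule and names here are hypothetical, not anything from the post), probing a black box from the outside only yields outcome frequencies, while knowing the rule that was designed into it gives an exact prediction with no measurement at all:

```python
import random

# A toy "mechanism": a tiny deterministic rule, standing in for the designed circuit.
def mechanism(x: int) -> int:
    # Doubles the input, then wraps at 16 -- the kind of rule the designer knows exactly.
    return (2 * x) % 16

# Statistical probing from the outside: sample inputs, record outcomes,
# and summarize them as frequencies. The result is approximate and depends
# on the input distribution used to probe.
samples = [mechanism(random.randrange(16)) for _ in range(1000)]
freq = {v: samples.count(v) / len(samples) for v in sorted(set(samples))}
print("probed outcome frequencies:", freq)

# Mechanistic prediction: knowing the rule, the output for any input is exact,
# with no measurement at all.
print("predicted output for input 7:", (2 * 7) % 16)  # 14, known without probing
```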