Back when I was doing research, one of my advisors joked that, if you wait long enough, you can reproduce an old result using new methods, manage to get it published, and everyone will be impressed. I think his time limit was 15 years. Apparently, when it comes to big ideas about science (rather than scientific results), the schedule is a bit accelerated.
Just shy of 10 years ago, Chris Anderson, then Editor-in-Chief at Wired, published a piece in which he claimed that cloud computing was making the scientific method irrelevant. All those models and theories didn't matter, so long as an algorithm could identify patterns in your data. The piece was wrong then, as I explained at the time (see below). It hasn't gotten any more right in the meantime.
Yet a quote from Anderson's article led off a new column last month that essentially argues Anderson was right, he just had the wrong reason. It's not cloud computing that will make theory irrelevant, the piece claims, but AI. Once trained, an AI can recognize patterns using rules that we don't comprehend. Set it loose on scientific data, and it can pull out findings without needing anything like a model or a theory.