I’m privileged to attend and present at an immunology conference this week on B cells and HIV vaccines, a cause that is surely one of the scientific world’s most important challenges. But as I’ve listened through a few days of incredibly impressive talks, I’ve begun wondering whether the structure of scientific research in the 21st century is its own worst enemy.
The problem is that there is an auditorium full of brilliant, ambitious minds, and the goal, or even the groupthink, of the field seems to be to put together the most impressive, rigorous set of experiments. That sounds like a reasonable goal, but many of the projects I've seen prioritize it over the development of new knowledge, either basic or applied.
Too often, there is a new, flashy, and usually very expensive technique, and that technique is used to rehash experiments done in the past with inferior technologies. It's rigorous, and it fits nicely into our high-profile journals. But have we learned anything new from these experiments?
I'm not alone in this view. In his Nobel acceptance speech, Sydney Brenner famously remarked, "We are drowning in a sea of data and starving for knowledge," and that was in 2002, before the current explosion of "big data."
I'm not immune to this critique myself; I'll be presenting a proteomic view of autoantibodies in a set of diseases, something that could fit into the small end of the "big data" spectrum. But I think we all need to step back and think a bit more creatively about how we can advance science in more directions than we currently do.