I've been thinking about this for a little while, so I thought it'd be nice to have some discussion and see what other people who happen not to be me think. So, I've been thinking that as disciplines advance, they require the people studying them to become more and more specialized. This makes sense. Even in already specialized disciplines, sub-specialties arise; for instance, in English Lit there's Medieval, Renaissance, Restoration and 18th Century, Romanticism, Victorianism, Modernism, Postcolonial, and Literary Theory & Criticism. This all makes perfect sense: as disciplines grow, they naturally require greater focus to continue advancing.
The problem I've encountered with this is simply that, as disciplines become more and more specialized, they naturally exclude people, to the point that there is no possible way for a layperson to completely understand the advancements coming out of those specialties. The problem with this is that those specialists have the power to make decisions that change society. As academic disciplines become more and more specialized, this essentially means that society is changing for reasons that the vast majority of the populace, with only a general education, cannot possibly understand. We see this already with evolution, certainly. Any time you encounter a debate about evolution on the internet, there will be maybe three or four people participating on either side who actually know or understand what evolution entails, or the Big Bang Theory (stupidest name for a theory ever) for that matter. Certainly most of the creationists don't understand, but even those arguing for evolution often display a lack of knowledge about it, and their arguments can often be boiled down to "it's science, lol. Believe it!" It has been this way for a long time with religious theologies; without some major study of theology, religious decisions are barely understandable.
So, I suppose what I'm uncomfortable with is the fact that, as the world becomes more and more specialized, human beings lose a majority of their agency in social changes that, for the most part, they have to take on faith. Biologists say this is likely, therefore it is. This certainly occurs with nutrition already: most people know whether a food is good or bad for them, but a lot of them, including myself, certainly don't know why that food is good or bad. To some extent, society is taking on the worst parts of religion, in that people are caught up in social change that is mostly misunderstood or not understood at all, and agency is being limited in that we have to accept most things in our lives on faith alone.
I don't know, maybe I'm overthinking this, but I wanted to see what you guys thought about this. I don't really know what I'm expecting. Just discussion, I guess.