Absolutely fair. Apologies for my lack of clarity and coherence.
Here, I'm speaking about how little fruit arguments about epistemology have produced. In my opinion, the best work done in this realm has been critique rather than the creation of any convincing model. Theory-of-knowledge arguments are unaffected by our thinking or by the instruments we create to detect phenomena, because no new discovery diminishes the question. Kant's "a priori" (innate) theory of knowledge is not only boring but lazy. Believing knowledge to be innate was a road to logicists later considering mathematics to be reducible to, or synonymous with, logic, given logic's obviously close relationship with knowledge. It's already seductive enough to believe that mathematics and logic are one and the same. Bertrand Russell and others spent a lot of time trying to prove something that might not have been so alluring a century after Kant's death if Kant and his laziness weren't so revered.
Nietzsche and Popper both attacked epistemological claims, albeit from different directions, but they didn't replace those claims with anything compelling. What I meant by utility solving epistemological problems is that we can (if we choose) view humanity's increased capacity to manipulate what we perceive as reality as a proving ground for the salience of the knowledge we claim to have. This is essentially William James' argument. If you have to know something new in order to generate some sort of outcome, use suppositions gained from empirical observation and test them. The beauty of this is that, first, there is no requirement for some all-encompassing, free-floating "truth" to underpin knowledge, because context is respected as a valid element of reality. And second, if we consider some human inventions as extensions of, and perhaps additions to, our ability to generate empirical data, we will only continue to make progress in our ability to manipulate reality.
Arguments about what we can possibly know, or whether we can prove that we ever know anything, seem to produce more edgy nonsense among the public than anything else, and they waste the time of philosophy as a field, which keeps falling behind its cousins in the sciences that need its scrutiny. What I am proposing is that the major problems that affect mankind are ontological ones: our attitudes about what we call an object or phenomenon, and the qualities and processes we consider to be part of it. These are problems that can end us as a species. How? The same progress we make in building instruments to detect and measure reality also holds for our productive capacity. Our ability to affect reality in fundamental ways increases much more quickly than our philosophical sophistication can accurately contextualize what is produced. And in this case, I mean accuracy in terms of utility for survival. Understanding as much as possible about the nature of being, both in the contexts we are aware of and in expanding the contexts possible to us, is of greater importance. I absolutely understand that the great epistemological questions are valid. They're just less actionable.
These would take longer to cover than I am prepared to do here. The Wikipedia article for Speculative Realism gives a reasonably good overview. Speculative Realism has its roots in Process Philosophy. The point of these disciplines is to get out of the game of pretending that our current way of assessing objects is sufficient or free from bias. It is a philosophy more grounded in considering things in terms of ALL the factors across time, phenomena, and thinking brought to bear on them. Our handling of objects would look less like nouns and more like descriptions that explain them more fully, with all factors intact. This is a grotesque oversimplification.
Gilles Deleuze started to formalize a framework we could use to recontextualize our approach to judging reality. Notice the section on his epistemological stance. Now imagine that his level of sobriety about the religious nonsense of the Kantians remains a small minority position, even after centuries of contact with non-linear systems and practically all of quantum physics.
Manuel DeLanda picked up where Deleuze left off and greatly improved the model. One way to think about this: we should understand our inherited contexts and how impoverished much of them are. Imagine holding fast to the religious fanaticism of requiring things to be simpler than they are, even as we increase our capacity to know vastly more that shows the exact opposite is true. Every field that pretends all of its processes are somehow eternally reducible or expandable is doomed to failure. This is key to some complexity scientists' critique of all of neoclassical economics, as well as some of the heterodox ideas. Physics at quantum scales does not behave in Newtonian ways. Unfortunately, most laypersons, and people firmly entrenched in professional economic norms, think that macroeconomics is little more than microeconomics at scale. This kind of thinking isn't thinking at all. It's religious fanaticism. It is addiction to simplicity of perspective even as we produce wider-reaching and inherently more complex outcomes. The OP's question is not something that lacks validity, but it is a dichotomy wherein both sides are unified in their preference for linearity.
Person: Y U no vote?
Me: It's complicated.
I wrote this while working a shift after a sleepless day. Apologies again.