An Epistemic Predicament and a Method of Handling It
A responsible thinker seeks to acquire justified true beliefs and to avoid falsehoods, but combining these two goals poses a significant challenge, which I elaborate below.
Take any proposition p which is epistemically probable and yet only fallibly justified for person S. In other words, S’s reasons for believing that p are adequate but do not guarantee that p is true. Suppose that p has an epistemic probability of .7 for S. On one hand, if S believes that p, then, since a probability of .7 falls short of epistemic certainty, for all S knows p might be false; thus, although p might well be true, S risks holding a false belief. On the other hand, if S denies that p, or remains agnostic about p, then S risks missing a promising opportunity to obtain a true belief, since p might well be true, given its likelihood of .7.[1] In short, the predicament is this: if one believes that p, one might well gain a true belief (if p is true), but one might instead acquire a false one (if p is false); whereas if one does not believe that p, one avoids a false belief (since one commits to nothing on the matter), yet one might miss a decent opportunity to gain a true one. Neither option is ideal for someone seeking to acquire true beliefs and avoid false ones.
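The trade-off can be made vivid with a toy scoring model (my illustration, not part of the essay's argument): award +1 for acquiring a true belief, -1 for acquiring a false one, and 0 for suspending judgment. A minimal sketch:

```python
def expected_payoff_of_believing(prob, true_gain=1.0, false_cost=1.0):
    """Expected epistemic payoff of believing p, given p's probability.

    The +1/-1 weights are illustrative assumptions; different weights
    model different tolerances for error.
    """
    return prob * true_gain - (1 - prob) * false_cost

# At a probability of .7, believing scores 0.7 - 0.3 = 0.4 on average,
# while suspending judgment scores 0 -- belief looks attractive, yet it
# still carries a 30% chance of being false.
```

On this toy model, raising `false_cost` relative to `true_gain` (a Clifford-style weighting) makes suspension the better option at ever higher probabilities, which is one way to picture why neither believing nor suspending is costless.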
What is one to do about this predicament?
Here is one plausible approach to handling the quandary. Let us grant, for now, that propositional knowledge entails epistemic certainty: if S knows that p, then p is epistemically certain for S given S’s evidence for p; S cannot be wrong that p given S’s reasons for p. Suppose a thinker, Smith, resolves to avoid claiming to know any proposition which is not epistemically certain for him and instead to claim knowledge only of propositions of which he is epistemically certain, such as propositions of basic mathematics and logic, fundamental analytic propositions, indubitable self-knowledge, matters regarding how terms are defined, and perhaps basic propositions of morality.[2]
Further, Smith adopts the policy of remaining doxastically agnostic about any proposition that (a) falls short of epistemic certainty (or something close to it) and (b) has no existentially significant pragmatic, moral, or spiritual encroachment on his life or on the lives of others to whom he has a relevant moral obligation. At the same time, he remains reasonably open-minded about acquiring evidence that would move such propositions into the category of epistemic certainty. Examples of such propositions are “There is an even number of stars in the universe,” “There is intelligent life in the Alpha Centauri system,” and “Electrons exist and are not merely useful fictions constructed for the sake of their explanatory and predictive value in physics.”
For those propositions which fall short of epistemic certainty (or something close to it) but which encroach on his life (or on human welfare generally) in an existentially weighty manner, Smith will believe (or be inclined to believe) them if their evidence makes them epistemically probable to some degree between 50 and 100% (i.e., if their epistemic probability is above .5 but below 1). Yet Smith will remain reasonably open-minded about acquiring evidence that would move such propositions into the category of epistemic certainty, of epistemic improbability (less than .5 likelihood), or of epistemic counterbalance (i.e., sitting at exactly .5 and thus being equally likely to be true and to be false). For those propositions that are counterbalanced, Smith will be open to believing them, and likely will believe them, if there is some non-epistemic (pragmatic) reason weighty enough to tip the balance in favor of belief, such as some grave matter of human welfare.[3]
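Smith's policy can be summarized as a small decision procedure. The sketch below is my paraphrase under stated assumptions: the function name and the boolean flags are mine, not the essay's, and the sketch simplifies by testing for exact certainty rather than "certainty or something close to it."

```python
def smiths_attitude(prob, existentially_weighty=False,
                    grave_pragmatic_reason=False):
    """Return Smith's doxastic attitude toward a proposition.

    prob                   -- its epistemic probability on one's evidence
    existentially_weighty  -- does its truth seriously encroach on one's
                              life or on human welfare?
    grave_pragmatic_reason -- in the counterbalanced case, is there a
                              weighty non-epistemic reason to believe?
    """
    if prob == 1.0:                    # epistemically certain
        return "believe (may claim knowledge)"
    if not existentially_weighty:      # no weighty encroachment
        return "suspend judgment"
    if prob > 0.5:                     # probable and weighty
        return "believe, held loosely"
    if prob == 0.5 and grave_pragmatic_reason:  # counterbalanced
        return "believe on pragmatic grounds"
    return "suspend judgment"          # improbable, or no tipping reason
```

For example, on this sketch an even number of stars (probability .5, no encroachment) earns suspension, while a weighty counterbalanced proposition backed by a grave pragmatic reason earns belief.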
I have articulated a practical approach to addressing the predicament noted at the outset. One should believe that which is epistemically certain (e.g., that 2+2=4, that all bachelors are unmarried men, or that there are no square circles) and remain agnostic about claims which are to some degree probable yet not epistemically certain, unless a claim is of such existential import that one has a serious practical reason to believe it. Beliefs that are probably true and have some degree of practical importance (e.g., that Julius Caesar crossed the Rubicon on January 10, 49 BC, that it will rain this afternoon, or that there is a tree in the yard) yet fall short of epistemic certainty should be held loosely, and one should be prepared to modify them in light of new evidence, just as one would change a shirt to suit the weather. In this respect, we might heed St. Francis, who is supposed to have advised that one “wear the world like a loose garment.” (Notwithstanding many claims on the internet that St. Francis authored this nice aphorism, I have not been able to confirm its source.) Philosophers and other responsible thinkers wear their non-certain positions loosely, ready to exchange them if they find a better fit.
My proposed method of managing the predicament should help jointly optimize the goals of (i) obtaining true beliefs and (ii) avoiding false beliefs. One who follows this method might not obtain as many true beliefs as the gullible person who believes everything and thereby also believes many falsehoods; clearly, the wise person will avoid such credulousness. And one who follows this method might adopt more false beliefs than the Pyrrhonian skeptic who (supposedly) refused to believe any proposition whatsoever, which is arguably not the wisest course of life either. My method seeks a rational midpoint between gullibility and total doxastic agnosticism (which arguably is impossible to achieve in practice, since it seems we must have at least some basic beliefs).
But my proposed method is merely a stop-gap. In the end, there might be a better way. I hope so.
[1] By “agnostic,” I mean what I have elsewhere called “doxastic agnosticism” rather than “epistemic agnosticism.” The former is a matter of neither believing nor disbelieving a proposition. The latter is the claim that one does not know whether a proposition is true.
[2] Descartes referred to such propositions as “morally certain” and thus “sufficient to regulate our behavior, or which measures up to the certainty we have on matters relating to the conduct of life which we never normally doubt, though we know that it is possible, absolutely speaking, that they may be false.” (See The Philosophical Writings of Descartes, Volume 1, J. Cottingham, R. Stoothoff, and D. Murdoch (eds.), Cambridge: Cambridge University Press, 1985, p. 289.)
[3] In this respect, it is wise to reject W. K. Clifford’s maxim. In “The Ethics of Belief,” he writes: “To sum up: it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence” (p. 5). Let us set aside, for now, the objection that there does not seem to be sufficient evidence for believing that maxim itself, in which case the maxim is self-undermining. Consider another objection: if a proposition is counterbalanced with respect to the relevant evidence, and yet it is sufficiently grave that its truth value encroaches seriously on human welfare, one has a non-epistemic reason to believe it, and therefore (arguably) one is permitted to believe it.
Say, for example, that you are stranded alone in the desert, near death from thirst. You discover a bottle of clear liquid which appears to be water, yet you have considerable reason to believe it might be a toxic liquid that merely looks like water. In this case, let us suppose, your reasons for and against believing that the bottle contains water are equally weighty; that is, the proposition that the bottle contains water is epistemically counterbalanced for you. Now, if you do not drink water in the next five minutes, you will almost certainly die in that period. If you drink the liquid, then, given the 50% epistemic probability that it is water, you have a 50% chance of drinking water and therefore surviving. Hence, if you do not drink the liquid, you have (say) a 95% probability of dying, whereas if you drink it, you have a 50% probability of surviving and a 50% probability of dying. It is prudent, for practical reasons, to believe that the bottle contains water, a belief which will support your decision to drink its contents. And yet your evidence alone is insufficient to support that belief, since you are counterbalanced at 50/50.
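The arithmetic of the desert case can be restated with the figures given above (the .95 and .5 probabilities come from the footnote; the variable names are mine):

```python
# Chance of surviving on each option, per the footnote's figures.
p_survive_if_drink = 0.5     # 50% the liquid is water; water -> you live
p_survive_if_abstain = 0.05  # ~95% chance of dying without any water

# Drinking is prudentially superior even though the evidence for
# "the bottle contains water" is exactly counterbalanced at .5.
drinking_is_prudent = p_survive_if_drink > p_survive_if_abstain
```

The prudential comparison (.5 versus .05) comes out the same however one fills in the "almost certainly die" figure, so long as it stays above .5: the pragmatic reason, not the evidence, is what tips the balance.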