In epistemology articles and textbooks (e.g., in the Stanford Encyclopedia), you'll often see claims like the following. S (some person) knows that P (some proposition) only if:
(1.) P is true.
(2.) S believes that P.
(3.) S is justified in believing P.
Although many philosophers (following Gettier) dispute whether someone's meeting these three conditions is sufficient for knowing P, and a few (like Dretske) also dispute the necessity of condition 3, pretty much everyone accepts that the first two conditions are necessary for knowledge -- or necessary at least for "propositional" knowledge, i.e., knowing that [some proposition is true], as opposed to, for example, knowing how [to do something].
But it's not clear to me that knowing a fact requires believing it. Consider the following case:
Ben the Forgetful Driver: Ben reads an email and learns that a bridge he normally drives across to get to work will be closed for repairs. He immediately realizes that he will have to drive a different route to work. The next day, however, he finds himself on the old route, headed toward the closed bridge. He still knows, I submit, in that forgetful moment, that the bridge is closed. He has just momentarily failed to deploy that knowledge. As soon as he sees the bridge, he'll smack himself on the forehead and say, "The bridge is closed, of course, I know that!" However, contra the necessity of (2) above, it's not clear that, in that forgetful moment as he's driving toward the bridge, he believes (or more colloquially, thinks) the bridge is closed. He is, I think, actually in an in-between state of believing, such that it's not quite right to say that he believes that the bridge is closed but also not quite right to deny that he believes the bridge is closed. It's a borderline case in the application of a vague predicate. (Compare: is a man tall if he is 5 foot 11 inches?) So: We have a clear case of knowledge, but only an in-betweenish, borderline case of belief.
Although I find that a fairly intuitive thing to say, I reckon that that intuition will not be widely shared by trained epistemologists. But I'm willing to wager that a majority of ordinary English-speaking non-philosophers will say "yes" if asked whether Ben knows the bridge is closed and "no" if asked whether he believes or thinks that the bridge is closed. (Actual survey results on related cases are pending, thanks to Blake Myers-Schulz.)
One way of warming up to the idea is to think of it this way: Knowledge is a capacity, while belief is a tendency. Consider knowing how to do something: I know how to juggle five balls if I can sometimes succeed, other than by pure luck, even if most of the time I fail. As long as I have the capacity for appropriate responding, I have the knowledge, even if that capacity is not successfully deployed on most relevant occasions. Ben has the capacity to respond knowledgeably to the closure of the bridge; he just doesn't successfully deploy that capacity. He doesn't call up the knowledge that he has.
Believing that P, on the other hand, involves generally responding to the world in a P-ish way. (If the belief is often irrelevant to actual behavior, this generality might be mostly in counterfactual possible situations.) Believing is about one's overall way of steering cognitively through the world. (For a detailed defense of this view, see here and here.) If one acts and reacts more or less as though P is true -- for example by saying "P is true", by inferring Q if P implies Q, by depending on the truth of P in one's plans -- then one believes. Otherwise, one does not believe. And if someone is mixed up, sometimes steering P-ishly and sometimes not at all P-ishly, then one's belief state is in between.
Consider another case:
Juliet the Implicit Racist: Juliet is a Caucasian-American philosophy professor. Like most such professors, she will sincerely assert that all the races are intellectually equal. In fact, she has better grounds for saying this than most: She has extensively examined the academic literature on racial differences in intelligence, and she finds the case for intellectual equality compelling. She will argue coherently, authentically, and vehemently for that conclusion. Yet she is systematically racist in most of her day-to-day interactions. She (falsely) assumes that her black students will not be as bright as her white and Asian students. She shows this bias, problematically, in the way she grades her students' papers and leads class discussion. When she's on a hiring committee for an office manager, she will require much more evidence to become convinced of the intelligence of a black applicant than of a white applicant. And so on.
Does Juliet believe that all the races are intellectually equal? I'd say that the best answer to that question is an in-betweenish "kind of" -- and in some attributional contexts (for example, two black students talking about whether to enroll in one of her classes) a simple "no, she doesn't think black people are as smart as white people" seems a fair assessment. At the same time, let me suggest that Juliet does know that all the races are intellectually equal: She has the information and the capacity to respond knowledgeably even if she often fails to deploy that capacity. She is like the person who knows how to juggle five balls but can only pull it off sporadically or when conditions are just right.
(Thanks to David Hunter, in conversation, for the slogan "knowledge is a capacity, belief a tendency".)