We develop a framework for assessing when somebody will eventually notice that she has a misspecified model of the world, premised on the idea that she neglects information that she deems, through the lens of her misconceptions, to be irrelevant. In doing so, we assess the attentional stability of both general psychological biases, such as naivete about present bias, and empirical misconceptions, such as false beliefs about consumer demand. We explore which combinations of errors and environments allow an error to persist, and which instead lead people to incidentally learn that they have things wrong because even the data they deem relevant tells them that something is amiss. We use the framework to shed light on why fresh eyes are valuable in organizational problems, why people persistently use overly coarse rather than overly fine categorizations, why people sometimes recognize their errors in complex environments when they don’t in simple environments, and why people recognize errors in others that they don’t recognize in themselves.