A key goal of survey interviews is to collect the highest-quality data possible from respondents. In practice, however, this goal can be difficult to achieve because respondents do not always understand survey questions as their designers intended. Researchers have used a variety of indicators to identify and predict respondent confusion and difficulty in answering questions across different modes. In web surveys, response difficulty can be detected automatically and in real time. The research to date has focused on response latencies, mostly long response times, as evidence of difficulty. Beyond response latencies, however, web surveys offer rich behavioral data that may signal respondent confusion and difficulty more directly than response times. This article focuses on one such behavior: mouse movements. We examine the mouse movements participants make when answering questions about experimental scenarios whose difficulty was manipulated (and confirmed by respondent ratings). This approach makes it possible to distinguish general movements, which simply reflect how a person interacts with a computer, from movements that are related to response difficulty. We find not only that certain mouse movements are highly predictive of difficulty but also that they add considerable value when used in conjunction with response times. The approach developed in this article may be useful for delivering help to confused respondents in real time and as a diagnostic tool for identifying confusing questions.