Predictive Social Perception

This project conceptualizes the human ability to make sense of other people's behavior in terms of predictive processing. These views turn the conventional model of information processing in the brain on its head. Incoming stimuli – a cup, a flower, or, indeed, the action of another person – are not assumed to passively activate knowledge of their meaning. Instead, perception is seen as a process of hypothesis testing, in which the brain actively tries to fit interpretations to the input, and in which conscious experience always reflects the current “best guess”. Such frameworks have been widely influential, explaining longstanding puzzles in perception (why pink surfaces appear white under red illumination), action (people’s striking ability to catch fast-moving objects), and psychological disorders (the “realness” of schizophrenic hallucinations).

Do such hypothesis-testing processes also underlie people’s remarkable ability to understand other people’s behavior? If so, then how people perceive others' behavior should be similarly affected by prior expectations, as is the case for lower-level perception. In other words, just as an object might appear either convex or concave depending on whether one believes the light comes from above or below, actions might appear slightly different depending on which goals one attributes to them.

Results have provided direct support for these ideas. With simple experimental manipulations (see sample paradigm below), we can make actions appear different simply by manipulating the observer's expectations about them, even when they are visually identical. For example, people perceive reaches as further towards an object than they really were when they believe that the actor wants to pick it up, and as further away from it when they assume a withdrawal (Hudson et al., 2016a, 2016b, 2017). Similarly, the same actions appear slightly lower or higher if a matching goal object is located in these directions, or if there is an obstacle in the way.

Such distortions happen not only in the visual domain but also in the tactile domain, where people sometimes "feel" stimulation that happens to others on their own bodies, for example when watching someone in pain or seeing a spider crawl up another person's arm. Again, our work suggests that these vicarious sensations are predictive: they do not only capture information provided by the stimulus, but are enriched by prior knowledge. As an example, people "feel" others' pain on their own body, to the extent that they report illusory stimulation, even when they do not see any skin damage but know that the object the other person has just touched is painful (Morrison et al., 2013; Bach et al., 2014).

Together, these findings provide a first indication that perception of others' behavior is not veridical, but biased by one's prior expectations of what others will do and how these actions will feel. As assumed by recent predictive processing models, perceptual systems may therefore play a key role in understanding others' actions, not only passively representing the perceptual input but actively predicting what will be perceived next, drawing on knowledge about the observed individuals and the objects available in the current situation.


This project was funded by the ESRC grant “One step ahead: Prediction of other people's behavior in healthy and autistic individuals” (ES/J019178/1).

 

Sample experimental paradigm

This paradigm induces simple visual illusions in which goals attributed to others change the perception of their actions. Participants see an actor reach towards or withdraw from an object. At some point along its course, the hand disappears, and participants judge the disappearance point by indicating whether it is identical to or different from a probe stimulus presented shortly afterwards, which appears either in the same position or slightly ahead of or behind it. Prior to action onset, we induce different action goals, either by having the actor verbally state the goal (e.g., "I'll take it!" vs. "I'll leave it!") or by having the participant give the actor a goal (e.g., "Take it!" vs. "Leave it!").
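The sketch below lays out the logic of such a trial sequence in plain Python, purely for illustration. The condition labels, probe offsets, and trial counts are assumptions made for the example, not the parameters used in the published studies.

```python
# Minimal sketch of the trial structure described above (illustrative values only).
import random
from dataclasses import dataclass

@dataclass
class Trial:
    goal_cue: str       # e.g. "I'll take it!" vs. "I'll leave it!"
    action: str         # "reach" or "withdraw"
    probe_offset: int   # probe displacement relative to the true disappearance point

def build_trials(n_per_cell: int = 10) -> list:
    """Cross goal cue, action, and probe offset into a randomized trial list."""
    cues = ["I'll take it!", "I'll leave it!"]
    actions = ["reach", "withdraw"]
    offsets = [-2, -1, 0, 1, 2]   # negative = behind, positive = ahead (arbitrary units)
    trials = [Trial(cue, act, off)
              for cue in cues for act in actions for off in offsets
              for _ in range(n_per_cell)]
    random.shuffle(trials)
    return trials

def run_trial(trial: Trial) -> bool:
    """One trial: present the goal cue, play the action until the hand vanishes,
    show the probe at (true position + probe_offset), and collect a same/different
    judgement. The response is simulated here as a stand-in for the participant."""
    return random.random() < 0.5

if __name__ == "__main__":
    data = [(t.goal_cue, t.probe_offset, run_trial(t)) for t in build_trials(2)]
    print(f"Collected {len(data)} simulated trials")
```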

Across three series of experiments, this paradigm has shown that such goal attributions directly affect the actions' perceptual representation: hands are perceived to have disappeared closer to the object when the actor has the (inferred) goal of picking it up, and further away when participants believe the goal is to withdraw, even when the visual stimulation is identical. This happens irrespective of whether the actor stated the goal himself (Hudson et al., 2016a) or the participants instructed him with a goal (Hudson et al., 2016b), but the effect is modulated by the reliability with which goals predict future actions: it is enhanced for actors who typically do as they say, and reduced for actors who typically do the opposite (Hudson et al., 2017).
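Assuming responses are stored as (goal cue, probe offset, judged-same) triples as in the sketch above, one simple way to illustrate this displacement bias is to average the offsets of probes accepted as being in the same position, separately for each goal cue. This is a toy summary of the logic only, not the analysis reported in the papers.

```python
# Illustrative bias summary from simulated same/different judgements.
from collections import defaultdict
from statistics import mean

def bias_by_goal(responses):
    """Mean offset of probes judged to be in the 'same' position, per goal cue.
    More positive values indicate perception displaced further ahead, i.e.
    towards the object."""
    accepted = defaultdict(list)
    for goal_cue, probe_offset, judged_same in responses:
        if judged_same:
            accepted[goal_cue].append(probe_offset)
    return {cue: mean(offsets) for cue, offsets in accepted.items()}

# Toy data: a "take" goal should yield a more positive bias than a "leave" goal
# if goal attribution pulls the perceived disappearance point towards the object.
demo = [("I'll take it!", 1, True), ("I'll take it!", 0, True),
        ("I'll leave it!", -1, True), ("I'll leave it!", 0, True)]
print(bias_by_goal(demo))   # {"I'll take it!": 0.5, "I'll leave it!": -0.5}
```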

 

Project Publications

Theoretical articles

Bach, P., & Schenke, K. (2017). Predictive social perception: towards a unifying framework from action observation to person knowledge. Social and Personality Psychology Compass. Publisher – PDF

Bach, P., Nicholson, T., & Hudson, M. (2014). The affordance-matching hypothesis: how objects guide action understanding and prediction. Frontiers in Human Neuroscience, 8:254. Publisher – PDF

 

Empirical articles

Hudson, M., Nicholson, T., Kharko, A., McKenzie, R., & Bach, P. (2021). Predictive Action Perception from Explicit Intention Information in Autism. Psychonomic Bulletin & Review. Publisher – PDF – Data

Schenke, K. C., Wyer, N., Tipper, S., & Bach, P. (2020). Predictive person models elicit motor biases: the face-inhibition effect revisited. Quarterly Journal of Experimental Psychology, 74(1), 54-67. Publisher – PDF – Data

McDonough, K.L., Costantini, M., Hudson, M., Ward, E., & Bach, P. (2020). Affordance matching predictively shapes the perceptual representation of others’ ongoing actions. Journal of Experimental Psychology: Human Perception and Performance. Publisher – PDF – Data

McDonough, K.L., Hudson, M., & Bach, P. (2019). Cues to intention bias action perception toward the most efficient trajectory. Scientific Reports, 9. PDF – Data

Hudson, M., McDonough, K. L., Edwards, R., & Bach, P. (2018). Perceptual teleology: expectations of action efficiency bias social perception. Proceedings of the Royal Society B, 285(1884), 20180638. Publisher – PDF – Data
Press Coverage: EurekaAlert – Business Standard – Deccan Chronicle – TUN – Reddit (front page)

Hudson, M., Nicholson, T., & Bach, P. (2017). You Said You Would! The Predictability of Other's Behavior from their Intentions Determines Predictive Biases in Action Perception. Journal of Experimental Psychology: Human Perception and Performance. Publisher – PDF – Data

Hudson, M., Nicholson, T., Simpson, W. A., Ellis, R., & Bach, P. (2016). One step ahead: The perceived kinematics of others’ actions are biased toward expected goals. Journal of Experimental Psychology: General, 145(1), 1-7. Publisher – PDF – Data
Featured in APA's Particularly Exciting Experiments in Psychology (PeePs) newsletter.

Hudson, M., Nicholson, T., Ellis, R., & Bach, P. (2016). I see what you say: Prior knowledge of other's goals automatically biases the perception of their actions. Cognition, 146, 245-250. Publisher – PDF – Data

Joyce, K., Schenke, K., Bayliss, A., & Bach, P. (2015). Looking ahead: Anticipatory cuing of attention to objects others will look at. Cognitive Neuroscience, 1-8. Publisher – PDF – Data

Bach, P., Fenton-Adams, W., & Tipper, S.P. (2014). Can't touch this: the first-person perspective provides privileged access to predictions of sensory action outcomes. Journal of Experimental Psychology: Human Perception and Performance. Publisher – PDF – Data

Morrison, I., Tipper, S. P., Fenton-Adams, W. L., & Bach, P. (2013). “Feeling” others' painful actions: The sensorimotor integration of pain and action information. Human Brain Mapping, 34(8), 1982–1998. PDF

Hudson, M., Burnett, H.G., & Jellema, T. (2012). Anticipation of intentional actions in high-functioning autism. Journal of Autism & Developmental Disorders, 42(8). PDF

Hudson, M., Nijboer, T. & Jellema, T. (2012). Implicit learning of social information and its relation to autistic traits. Journal of Autism & Developmental Disorders, 42(12), 2534-2545. PDF

Hudson, M., & Jellema, T. (2011). Resolving ambiguous behavioural intentions by means of involuntary prioritisation of gaze processing. Emotion, 11(3), 681-686. PDF

Hudson, M., Liu, C. H., & Jellema, T. (2009). Anticipating intentional actions: The effect of eye gaze direction on the judgment of head rotation. Cognition, 112, 423-434. PDF