Two more preprints

Parrotta, E., Bach, P., Pezzulo, G., Costantini, M., & Ferri, F. (2023). Exposure to false cardiac feedback alters pain perception and anticipatory cardiac frequency. eLife, 12:RP90013. Preprint

Parrotta, E., McDonough, K., Enwereuzor, C., & Bach, P. (2023). Observers translate information about other agents’ higher-order goals into expectations about their forthcoming action kinematics. OSF. Preprint

Four new lab preprints

Currie, J., Giannaccini, E.M., & Bach, P. (2023). Consequential sound induces illusory distortions in the perception and prediction of robot motion. OSF Preprints. Preprint.

Parrotta, E., McDonough, K., & Bach, P. (2023). Imagery as predicted perception: imagery predictively biases perceptual judgments of action kinematics. PsyArXiv. Preprint.

Parrotta, E., Bach, P., Perrucci, M. G., Costantini, M., & Ferri, F. (2022). Heart Is Deceitful Above All Things: illusory perception of heartbeat is induced by pain expectation. bioRxiv, 2022-09. Preprint.

Pulling, V., Phillips, L.H., Bach, P., Newlands, A., & Jackson, M. (2023). Using Predictions to Resolve Emotional Ambiguity: Facial Expression Intensity Influences the Reliance on Prior Expectation. PsyArXiv. https://doi.org/10.31234/osf.io/f75cv. Preprint.

Our pre-registered replication of our prior paper is out

Expectations of efficient actions bias social perception: a pre-registered online replication

Humans take a teleological stance when observing others' actions, interpreting them as intentional and goal-directed. In predictive processing accounts of social perception, this teleological stance would be mediated by a perceptual prediction of an ideal energy-efficient reference trajectory with which a rational actor would achieve their goals within the current environmental constraints. Hudson and colleagues (2018, Proc. R. Soc. B, 285, 20180638; doi:10.1098/rspb.2018.0638) tested this hypothesis in a series of experiments in which participants reported the perceived disappearance points of hands reaching for objects. They found that these judgements were biased towards the expected efficient reference trajectories. Observed straight reaches were reported as higher when an obstacle needed to be overcome than when the path was clear. By contrast, unnecessarily high reaches over empty space were perceptually flattened. Moreover, these perceptual biases increased the more the environmental constraints and expected action trajectories were explicitly processed. These findings provide an important advance in our understanding of the mechanisms underlying social perception. The current replication tests the robustness of these findings and whether they hold up in an online setting.

McDonough, K., & Bach, P. (2023). Expectations of efficient actions bias social perception: a preregistered online replication. Royal Society Open Science, 10(2). Publisher PDF Data

A Twitter thread on the paper can be found here.

Our theoretical paper on the mechanisms of motor imagery is out

Why motor imagery isn’t really motoric: Towards a reconceptualization in terms of effect-based action control.

Overt and imagined action seem inextricably linked. Both have similar timing, activate shared brain circuits, and motor imagery influences overt action and vice versa. Motor imagery is, therefore, often assumed to recruit the same motor processes that govern action execution, and which allow one to play through or simulate actions offline. Here, we advance a very different conceptualization. On this view, the links between imagery and overt action do not arise because action imagery is intrinsically motoric, but because action planning is intrinsically imaginistic and occurs in terms of the perceptual effects one wants to achieve. Seen like this, the term ‘motor imagery’ is a misnomer of what is more appropriately portrayed as ‘effect imagery’. In this article, we review the long-standing arguments for effect-based accounts of action, which are often ignored in motor imagery research. We show that such views provide a straightforward account of motor imagery. We review the evidence for imagery-execution overlaps through this new lens and argue that they indeed emerge because every action we execute is planned, initiated and controlled through an imagery-like process. We highlight findings that this new view can now explain and point out open questions.

Bach, P., Frank, C., & Kunde, W. (2022). Why motor imagery is not really motoric: Towards a reconceptualization in terms of effect-based action control. Psychological Research. https://doi.org/10.1007/s00426-022-01773-w. Publisher PDF

Special issue on predictive processing

Predictive Mechanisms in Action, Perception, Cognition, and Clinical Disorders

Patric, Anila D’Mello, Phil Corlett and Liron Rozenkrantz organized a special issue on predictive processing. It is now published. Check out the eight articles here — see below for our editorial.

D'Mello, A. M., Bach, P., Corlett, P. R., & Rozenkrantz, L. (2022). Predictive mechanisms in action, perception, cognition, and clinical disorders. Frontiers in Human Neuroscience, 598. Publisher

Preprint on the links between motor imagery and action planning

Why motor imagery isn’t really motoric: Towards a reconceptualization in terms of effect-based action control.

Overt and imagined action seem inextricably linked. Both follow similar timings, activate shared brain circuits, and motor imagery influences overt action and vice versa. Motor imagery is therefore often assumed to rely on the motor processes governing action execution itself, which allow one to play through or simulate actions offline. Here, we advance a very different conceptualization. In this view, the links between imagery and overt action do not arise because action imagery is intrinsically motoric, but because action planning is intrinsically imaginistic and occurs in terms of the perceptual effects we want to achieve. Viewed like this, the term 'motor imagery' is a misnomer of what is more appropriately portrayed as 'effect imagery'. In this article, we review the evidence for imagery-execution overlaps through this new lens and argue that they indeed emerge because every action we execute is planned, initiated and controlled through an imagery-like process. We highlight findings that this new view can now explain and point out open questions.

Bach, P., Frank, C., & Kunde, W. (2021, October 23). Why motor imagery isn’t really motoric: Towards a reconceptualization in terms of effect-based action control. PsyArXiv.

Special issue on action affordances

Behavioral and Neural Bases of Object Affordance Processing and Its Clinical Implications.

Patric, Sanjay Kumar and Dimitrios Kourtis organized a special issue on affordance processing. It is now published. Check out the eight articles here — see below for our editorial.

Kumar, S., Bach, P., & Kourtis, D. (2021). Editorial to the Special Issue: Behavioral and Neural Bases of Object Affordance Processing and its Clinical Implications. Frontiers in Human Neuroscience. Publisher

We're recruiting a post doc


Come work with us in Aberdeen!

We are currently recruiting for a 42-month postdoc position, to work on the Leverhulme Trust-funded project “Social perception as Bayesian Hypothesis Testing and Revision”. The deadline for applications is the 27th of April.

The project investigates how predictions help people make sense of the behavior of others, and which neuro-cognitive mechanisms underlie these abilities. The work will be led by Patric Bach in Aberdeen, in collaboration with Elsa Fouragnan and Giorgio Ganis in Plymouth and Paul Downing in Bangor. The project will run for 42 months.

Dr. Katrina McDonough already works as a postdoctoral researcher on the grant and leads the behavioral research stream. We are looking to recruit a second full-time postdoctoral researcher with expertise in neuroimaging methods (EEG and/or fMRI) and good programming skills. If interested, please email Patric Bach and have a look at the role description and project description.

Our paper on predictive social perception in autism is out


Predictive action perception from explicit intention information in autism


Social difficulties in autism spectrum disorder (ASD) may originate from a reduced top-down modulation of sensory information that prevents the spontaneous attribution of intentions to observed behaviour. However, although people with autism are able to explicitly reason about others’ mental states, the effect of abstract intention information on perceptual processes has remained untested. ASD participants (n = 23) and a neurotypical (NT) control group (n = 23) observed a hand either reaching for an object or withdrawing from it. Prior to action onset, the participant either instructed the actor to “Take it” or “Leave it”, or heard the actor state “I’ll take it” or “I’ll leave it”, which provided an explicit intention that was equally likely to be congruent or incongruent with the subsequent action. The hand disappeared before completion of the action, and participants reported the last seen position of the tip of the index finger by touching the screen. NT participants exhibited a predictive bias in response to action direction (reaches perceived nearer the object, withdrawals perceived farther away), and in response to prior knowledge of the actor’s intentions (nearer the object after “Take it”, farther away after “Leave it”). However, ASD participants exhibited a predictive perceptual bias only in response to the explicit intentions, but not in response to the motion of the action itself. Perception in ASD is not immune from top-down modulation. However, the information must be explicitly presented independently from the stimulus itself, and not inferred from cues inherent in the stimulus.

Hudson, M., Nicholson, T., Kharko, A., McKenzie, R., & Bach, P. (2021). Predictive Action Perception from Explicit Intention Information in Autism. Psychonomic Bulletin and Review. Publisher PDF Data

New paper on perspective taking


Is implicit Level-2 visual perspective taking embodied? Perceptual simulation of others’ perspectives is not impaired by motor restriction.

Embodied accounts of visual perspective taking suggest that judgements from another person’s perspective are less effortful if one’s own body position aligns with that of the other person, indicating a causal role of posture in visual perspective taking. Using our adapted mental rotation paradigm, here we tested whether movement has a causal role in perspective taking, by restricting participants’ movement in half of the experimental trials. We show, using our previously validated task, that the perceptual representation of another’s visual perspective is not influenced by participants’ ability to move. These data therefore rule out active physical movement as a causal explanation of visual perspective taking and instead argue that postural readjustments when making judgements from another’s perspective are a bodily consequence of the mental transformation from a person’s actual to their imagined position in space.

Ward, E., Bach, P., McDonough, K., & Ganis, G. (2022). Is implicit Level-2 visual perspective taking embodied? Perceptual simulation of others’ perspectives is not impaired by motor restriction. Quarterly Journal of Experimental Psychology. PDF Preprint Data

New paper out, with Kim Schenke, Natalie Wyer, and Steve Tipper


Predictive person models elicit motor biases: The face-inhibition effect revisited

Using an established paradigm, we tested whether people derive motoric predictions about an actor’s forthcoming actions from prior knowledge about them and the context in which they are seen. In two experiments, participants identified famous tennis and soccer players using either hand or foot responses. Athletes were shown either carrying out or not carrying out their associated actions (swinging, kicking), either in the context where these actions are typically seen (tennis court, soccer pitch) or outside these contexts (beach, awards ceremony). Replicating prior work, identifying non-acting athletes revealed negative compatibility effects: viewing tennis players led to faster responses with a foot than a hand, and vice versa for viewing soccer players. Consistent with the idea that negative compatibility effects result from the absence of a predicted action, these effects were eliminated (or reversed) when the athletes were seen carrying out actions typically associated with them. Strikingly, however, these motoric biases were not limited to In-Context trials but were, if anything, more robust in the Out-of-Context trials. This pattern held even when attention was drawn specifically to the context (Experiment 2). These results confirm that people hold motoric knowledge about the actions that others typically carry out and that these actions are part of perceptual representations that are accessed when those others are re-encountered, possibly in order to resolve uncertainty in person perception.


Schenke, K. C., Wyer, N., Tipper, S., & Bach, P. (2020). Predictive person models elicit motor biases: the face-inhibition effect revisited. Quarterly Journal of Experimental Psychology, 74(1), 54-67. Publisher PDF Data

The 3rd paper from Katrina's PhD is out -- published in JEP:HPP


Affordance Matching Predictively Shapes the Perceptual Representation of Others’ Ongoing Actions

Predictive processing accounts of social perception argue that action observation is a predictive process, in which inferences about others’ goals are tested against the perceptual input, inducing a subtle perceptual confirmation bias that distorts observed action kinematics toward the inferred goals. Here we test whether such biases are induced even when goals are not explicitly given but have to be derived from the unfolding action kinematics. In 2 experiments, participants briefly saw an actor reach ambiguously toward a large object and a small object, with either a whole-hand power grip or an index-finger and thumb precision grip. During its course, the hand suddenly disappeared, and participants reported its last seen position on a touch-screen. As predicted, judgments were consistently biased toward apparent action targets, such that power grips were perceived closer to large objects and precision grips closer to small objects, even if the reach kinematics were identical. Strikingly, these biases were independent of participants’ explicit goal judgments. They were of equal size when action goals had to be explicitly derived in each trial (Experiment 1) or not (Experiment 2) and, across trials and across participants, explicit judgments and perceptual biases were uncorrelated. This provides evidence, for the first time, that people make online adjustments of observed actions based on the match between hand grip and object goals, distorting their perceptual representation toward implied goals. These distortions may not reflect high-level goal assumptions, but emerge from relatively low-level processing of kinematic features within the perceptual system.


McDonough, K.L., Costantini, M., Hudson, M., Ward, E., & Bach, P. (2020). Affordance matching predictively shapes the perceptual representation of others’ ongoing actions. Journal of Experimental Psychology: Human Perception and Performance. Publisher PDF Data

Grant from the Leverhulme Trust!


We are grateful to the Leverhulme Trust for awarding us £462,995 to investigate how predictions help people make sense of the behavior of others, and which neuro-cognitive mechanisms underlie these abilities. The work will be led by Patric Bach in Aberdeen, in collaboration with Elsa Fouragnan and Giorgio Ganis in Plymouth and Paul Downing in Bangor.

The project will run from May 2020 to December 2023. Please see here for a project overview.

Dr. Katrina McDonough will work as a postdoctoral researcher and lead the behavioral research stream. We are looking to recruit a second full-time postdoctoral researcher with expertise in neuroimaging methods (EEG and/or fMRI) and good programming skills.

While recruitment is currently on hold until the COVID-19 situation is clearer, please email Patric Bach (patric.bach@abdn.ac.uk) if you are interested in the position.

Ellie's 2nd paper is out -- published this week in Cognition


Perspective taking as virtual navigation? Perceptual simulation of what others see reflects their location in space but not their gaze.

Other people's (imagined) visual perspectives are represented perceptually in a similar way to our own, and can drive bottom-up processes in the same way as our own perceptual input (Ward, Ganis, & Bach, 2019). Here we test directly whether visual perspective taking is driven by where another person is looking, or whether these perceptual simulations represent their position in space more generally. Across two experiments, we asked participants to identify whether alphanumeric characters, presented at one of eight possible orientations away from upright, were presented normally, or in their mirror-inverted form (e.g. “R” vs. “Я”). In some scenes, a person would appear sitting to the left or the right of the participant. We manipulated, either between trials (Experiment 1) or between subjects (Experiment 2), the gaze direction of the inserted person, such that they either (1) looked towards the to-be-judged item, (2) averted their gaze away from the participant, or (3) gazed out towards the participant (Exp. 2 only). In the absence of another person, we replicated the well-established mental rotation effect, where recognition of items becomes slower the more items are oriented away from upright (e.g. Shepard and Metzler, 1971). Crucially, in both experiments and in all conditions, this response pattern changed when another person was inserted into the scene. People spontaneously took the perspective of the other person and made faster judgements about the presented items in their presence if the characters were oriented towards upright from that person's viewpoint. The gaze direction of this other person did not influence these effects. We propose that visual perspective taking is therefore a general spatial-navigational ability, allowing us to calculate more easily how a scene would (in principle) look from another position in space, and that such calculations reflect the spatial location of another person, but not their gaze.

Ward, E., Ganis, G., McDonough, K., & Bach, P. (2020). Perspective taking as virtual navigation? Perceptual simulation of what others see reflects their location in space but not their gaze. Cognition, 199, 104241. Publisher Preprint Data

Ellie's paper on visual perspective taking made a big splash!

The study has been extensively covered in the Irish Times, Science News, BT Online, Yahoo News, and many others...

Ian Apperly wrote a lovely spotlight article in Trends in Cognitive Sciences on our study… and it received an F1000Prime recommendation by Stephen M. Fleming!

Reference to the paper: Ward, E., Ganis, G., & Bach, P. (2019). Spontaneous Vicarious Perception of the Content of Another’s Visual Perspective. Current Biology. Publisher PDF Data