What Are the Chances? fMRI Correlates of Observing High and Low-Probability Actions

Abstract

Cognitive scientists often use probabilistic equations to model human behavior in ambiguous situations. How, where, and even whether such probabilities are represented in the human brain remains largely unknown. Here, we manipulated the probability of a simple bottle-pouring action based on two considerations: the relative fullness of two glasses and the relative distance between the two glasses and the bottle. Whole-brain functional magnetic resonance imaging was used to measure brain activity while participants viewed probable and improbable pouring actions. Improbable actions elicited increased activity in the theory of mind (ToM) network, commonly found active when observers try to grasp the intentions of others, whereas probable actions elicited increased activity in the human mirror neuron system (hMNS) and in areas associated with mental imagery and memory. These data provide novel insight into the brain mechanisms humans use to distinguish between high- and low-probability actions.

Share and Cite:

R. Newman-Norlund, K. Bruggink, R. Cuijpers and H. Bekkering, "What Are the Chances? fMRI Correlates of Observing High and Low-Probability Actions," Journal of Behavioral and Brain Science, Vol. 3, No. 1, 2013, pp. 49-56. doi: 10.4236/jbbs.2013.31005.

Conflicts of Interest

The authors declare no conflicts of interest.

