Article
Neurosciences
Justyna O. Ekert, Andrea Gajardo-Vidal, Diego L. Lorca-Puls, Thomas M. H. Hope, Fred Dick, Jennifer T. Crinion, David W. Green, Cathy J. Price
Summary: Previous studies have shown that the left posterior superior temporal sulcus (pSTS) and left temporo-parietal junction (TPJ) both contribute to phonological short-term memory, speech perception, and speech production. This study further dissociates the response profiles of these regions and highlights their distinct roles in speech processing.
Article
Neurosciences
Johannes Rennig, Michael S. Beauchamp
Summary: Regions of the human posterior superior temporal gyrus and sulcus are responsible for processing visual and auditory speech, and successful integration of visual and auditory speech produces a characteristic neural signature that benefits the comprehension of noisy auditory speech.
Article
Multidisciplinary Sciences
Lea Roumazeilles, Matthias Schurz, Mathilde Lojkiewiez, Lennart Verhagen, Urs Schuffelgen, Kevin Marche, Ali Mahmoodi, Andrew Emberton, Kelly Simpson, Olivier Joly, Mehdi Khamassi, Matthew F. S. Rushworth, Rogier B. Mars, Jerome Sallet
Summary: The study found that activity in a specific region of the macaque middle superior temporal cortex is modulated by the predictability of social situations, similar to the human TPJ, suggesting a precursor of theory-of-mind ability in the last common ancestor of humans and Old World monkeys.
Article
Neurosciences
Julia Landsiedel, Katie Daughters, Paul E. Downing, Kami Koldewyn
Summary: This study confirms that bilateral SI-pSTS plays a central role in dynamic social interaction perception, but is less involved when 'interactiveness' is conveyed solely with static cues. Regions in the social brain, including SI-pSTS and EBA, are sensitive to both motion and interactive content. SI-pSTS is somewhat more tuned to video interactions than is EBA, but both regions show a greater response to social interactions compared to non-interactions and both respond more strongly to videos than static images.
Article
Neurosciences
Yang Zhou, Krithika Mohan, David J. Freedman
Summary: Categorization is a crucial cognitive process for decision-making and recognition, and the posterior parietal cortex, specifically the lateral intraparietal (LIP) area, is believed to transform visual feature encoding into abstract categorical representations. In contrast, areas closer to sensory input, such as the middle temporal (MT) area, encode only stimulus features, not abstract categorical information. By recording neuronal activity in the medial superior temporal (MST) and LIP areas during a visual motion categorization task, the authors found that MST plays a significant role in cognitive computation beyond its recognized role in visual motion processing.
JOURNAL OF NEUROSCIENCE
(2022)
Article
Behavioral Sciences
Dorita H. F. Chang, Nikolaus F. Troje, Yuji Ikegaya, Ichiro Fujita, Hiroshi Ban
Summary: The study aimed to characterize the spatiotemporal dynamics of biological motion perception. Univariate response components were observed at around 200 ms and 650 ms post-stimulus onset, while form-specific multivariate patterns could be discriminated as early as 100 ms. Early responses to biological motion likely originate from occipital cortex, while later responses likely arise from extrastriate body areas.
Article
Clinical Neurology
Gaurav H. Patel, Sophie C. Arkin, Daniel R. Ruiz-Betancourt, Fabiola Plaza, Safia A. Mirza, Daniel J. Vieira, Nicole E. Strauss, Casimir C. Klim, Juan P. Sanchez-Pena, Laura P. Bartel, Jack Grinband, Antigona Martinez, Rebecca A. Berman, Kevin N. Ochsner, David A. Leopold, Daniel C. Javitt
Summary: This study examines the neural correlates of social cognition in schizophrenia using naturalistic stimuli and functional MRI, focusing on the TPJ region. The findings suggest that deficits in TPJ function may contribute to impaired social inference in participants with schizophrenia.
Article
Biochemistry & Molecular Biology
Tanya Orlov, Maayan Raveh, Ayelet McKyton, Itay Ben-Zion, Ehud Zohary
Summary: This study examined the anorthoscopic vision of Ethiopian children with bilateral early-onset cataracts who were surgically treated years later. While their visual acuity improved substantially post-surgery, many initially struggled to recover shape under slit-viewing conditions. On retesting, however, almost all patients could judge shape under such conditions, and this acquired ability often transferred to novel stimuli.
Article
Neurosciences
C. Bodin, A. Pron, M. Le Mao, J. Regis, P. Belin, O. Coulon
Summary: The study reveals that plis de passage (PPs) can be identified as landmarks from the geometry of the STS walls, with their depth related to white-matter connectivity. These findings propose new hypotheses regarding the spatial organization of PPs, the relationship between cortical anatomy and structural connectivity, and the potential role of PPs in regional functional organization.
Article
Multidisciplinary Sciences
Mert Ozkan, Stuart Anstis, Bernard M. 't Hart, Mark Wexler, Patrick Cavanagh
Summary: The study reports a powerful example of paradoxical stabilization produced by a moving frame, where the stabilization is related to the distance of the frame's travel rather than its speed. The results suggest a link to visual stability across eye movements, indicating that the displacement of the entire visual scene may act as a frame to stabilize the perception of relative locations.
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA
(2021)
Article
Neurosciences
Anna Gaglianese, Alessio Fracasso, Francisco G. G. Fernandes, Ben Harvey, Serge O. O. Dumoulin, Natalia Petridou
Summary: Perception of object speed involves distinct and independent selectivity for the spatial and temporal frequencies of visual motion stimuli as well as pure tuning for the speed of motion in the human middle temporal cortex (hMT+). We found that these encoding mechanisms occur in different neuronal groups within hMT+, with the largest subregion showing separable tuning for the spatial and temporal frequency of the visual stimuli. Our findings suggest that hMT+ plays a primary role in evaluating the spatial features of moving visual input and highlight the complexity of the speed perception process.
HUMAN BRAIN MAPPING
(2023)
Article
Neurosciences
William D. Hopkins, Oliver Coulon, Adrien Meguerditchian, Nicky Staes, Chet C. Sherwood, Steven J. Schapiro, Jean-Francois Mangin, Brenda Bradley
Summary: This study examined the role of genetic factors in individual variation in superior temporal sulcus (STS) morphology in chimpanzees. The results showed significant correlations between single nucleotide polymorphisms in the KIAA0319 and AVPR1A genes and STS depth and lateralization, but no significant effects on surface area and depth measures for the central sulcus. The overall findings suggest that genetic factors contribute to a small to moderate amount of variation in STS morphology in chimpanzees.
Article
Biochemistry & Molecular Biology
Kyle Rupp, Jasmine L. Hect, Madison Remick, Avniel Ghuman, Bharath Chandrasekaran, Lori L. Holt, Taylor J. Abel
Summary: The ability to recognize abstract features of voice during auditory perception is a complex task. Voice selectivity increases along the auditory hierarchy from supratemporal plane (STP) to the superior temporal gyrus (STG) and superior temporal sulcus (STS). Voice can be accurately decoded from human auditory cortical activity, even without linguistic content. Encoding models demonstrate that voice encoding in STG/STS is best explained by voice category and acoustics, while STP responses are primarily accounted for by acoustic features.
Article
Neurosciences
Joao Valente Duarte, Rodolfo Abreu, Miguel Castelo-Branco
Summary: This study investigates the functional separation between local biological motion perception and global dynamic body perception, revealing two independent processing stages and highlighting the importance of two early dorsal and two ventral visual regions in encoding biological motion.
Article
Computer Science, Artificial Intelligence
Yi Zhou, Guillermo Gallego, Xiuyuan Lu, Siqi Liu, Shaojie Shen
Summary: Identifying independently moving objects in dynamic scenes is a crucial task for scene understanding. Traditional cameras face limitations such as motion blur and exposure artifacts, whereas event-based cameras overcome these limitations by reporting pixel-wise intensity changes asynchronously. This research solves the event-based motion segmentation problem by casting it as an energy minimization problem involving the fitting of multiple motion models. The experiments demonstrate the versatility of the method in scenes with different motion patterns and numbers of moving objects, achieving state-of-the-art results without requiring the expected number of moving objects to be specified beforehand. The software and dataset are released under an open-source license to encourage research in event-based motion segmentation.
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
(2023)
Article
Behavioral Sciences
Elissa M. Aminoff, Yuanning Li, John A. Pyles, Michael J. Ward, R. Mark Richardson, Avniel S. Ghuman
Article
Psychology
Nicole L. Jardine, Cathleen M. Moore
JOURNAL OF EXPERIMENTAL PSYCHOLOGY-HUMAN PERCEPTION AND PERFORMANCE
(2016)
Article
Ophthalmology
Ying Yang, Yang Xu, Carol A. Jew, John A. Pyles, Robert E. Kass, Michael J. Tarr
Article
Computer Science, Software Engineering
Brian Ondov, Nicole Jardine, Niklas Elmqvist, Steven Franconeri
IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS
(2019)
Article
Ophthalmology
Daniel D. Leeds, Darren A. Seibert, John A. Pyles, Michael J. Tarr
Article
Mathematical & Computational Biology
Daniel D. Leeds, John A. Pyles, Michael J. Tarr
FRONTIERS IN COMPUTATIONAL NEUROSCIENCE
(2014)
Article
Multidisciplinary Sciences
Avniel Singh Ghuman, Nicolas M. Brunet, Yuanning Li, Roma O. Konecky, John A. Pyles, Shawn A. Walls, Vincent Destefino, Wei Wang, R. Mark Richardson
NATURE COMMUNICATIONS
(2014)
Article
Cell Biology
Tina T. Liu, Adrian Nestor, Mark D. Vida, John A. Pyles, Christina Patterson, Ying Yang, Fan Nils Yang, Erez Freud, Marlene Behrmann
Article
Multidisciplinary Sciences
Nadine Chang, John A. Pyles, Austin Marcus, Abhinav Gupta, Michael J. Tarr, Elissa M. Aminoff
Article
Behavioral Sciences
Lindsay W. Victoria, John A. Pyles, Michael J. Tarr
BRAIN AND BEHAVIOR
(2019)
Article
Psychology
Cristina R. Ceja, Nicole L. Jardine, Steven L. Franconeri
ATTENTION PERCEPTION & PSYCHOPHYSICS
(2020)
Article
Computer Science, Software Engineering
Nicole Jardine, Brian D. Ondov, Niklas Elmqvist, Steven Franconeri
IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS
(2020)
Article
Psychology
Hauke S. Meyerhoff, Nicole Jardine, Mike Stieff, Mary Hegarty, Steve Franconeri
Summary: Research shows that in visual working memory tasks, performance improves when repeated objects share redundant features in color, shape, and paired color + shape. This indicates that feature-based repetition supports visual compression, enhancing memory performance.
JOURNAL OF EXPERIMENTAL PSYCHOLOGY-HUMAN PERCEPTION AND PERFORMANCE
(2021)
Article
Neurosciences
Daniel A. Stehr, Xiaojue Zhou, Mariel Tisby, Patrick T. Hwu, John A. Pyles, Emily D. Grossman
Summary: The study found that decoding accuracy in the right pSTS and left IFC varied depending on the dimension of action vignettes observers attended to. The right pSTS classified actions most accurately when observers focused on action kinematics, while the left IFC classified most accurately when observers attended to the actor's goal. Functional connectivity between the right pSTS and right IFC increased when observers focused on actions portrayed in the vignettes, suggesting that the attentive state of the viewer modulates sensory representations in the pSTS during action observation.
Article
Neurosciences
Guillaume Lemaitre, John A. Pyles, Andrea R. Halpern, Nicole Navolio, Matthew Lehet, Laurie M. Heller