Understanding memorability through artificial and artist intelligence
Trends Cogn Sci. 2023 Sep 9:S1364-6613(23)00213-9. doi: 10.1016/j.tics.2023.08.017. Online ahead of print. ABSTRACT: Davis and Bainbridge reveal a consistent memorability signal for artworks, both online and in a museum setting, which is predicted by the intrinsic visual attributes of the paintings. The fusion of artificial intelligence (AI) with artistic intuition emerges as a promising avenue to deepen our understanding of what makes images memorable. PMID: 37696691 | DOI: 10.1016/j.tics.2023.08.017
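The claim that memorability is predicted by intrinsic visual attributes can be made concrete with a minimal regression sketch. This is a hypothetical illustration, not the authors' pipeline: the feature names (brightness, colorfulness, edge density), the simulated memorability scores, and the model choice are all assumptions.

```python
# Hypothetical sketch: predicting per-image memorability from intrinsic
# visual attributes with a simple linear model (illustrative data only).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_paintings = 200
# Columns: brightness, colorfulness, edge density (invented features)
X = rng.uniform(0.0, 1.0, size=(n_paintings, 3))
# Simulated memorability scores (e.g., hit rates in a repeat-detection task)
y = 0.4 + 0.3 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.05, n_paintings)

model = Ridge(alpha=1.0)
r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {r2.mean():.2f}")
```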
Source: Trends Cogn Sci - September 11, 2023 Category: Neuroscience Authors: Lore Goetschalckx, Claudia Damiano Source Type: research

Dynamic reading in a digital age: new insights on cognition
Trends Cogn Sci. 2023 Sep 9:S1364-6613(23)00198-5. doi: 10.1016/j.tics.2023.08.002. Online ahead of print. ABSTRACT: People increasingly read text displayed on digital devices, including computers, handheld e-readers, and smartphones. Given this, there is rapidly growing interest in understanding how the cognitive processes that support the reading of static text (e.g., books, magazines, or newspapers) might be adapted to reading digital texts. Evidence from recent experiments suggests a complex interplay of visual and cognitive influences on how people engage with digital reading. Although readers can strategically adjust th...
Source: Trends Cogn Sci - September 11, 2023 Category: Neuroscience Authors: Sixin Liao, Lili Yu, Jan-Louis Kruger, Erik D Reichle Source Type: research

A collective neuroscience lens on intergroup conflict
Trends Cogn Sci. 2023 Sep 9:S1364-6613(23)00229-2. doi: 10.1016/j.tics.2023.08.021. Online ahead of print. ABSTRACT: How do team leaders and followers synchronize their behaviors and brains to effectively manage intergroup conflicts? Zhang and colleagues offered a collective neurobehavioral narrative that delves into the intricacies of intergroup conflict. Their results underscore the importance of leaders' group-oriented actions, along with leader-follower synchronization, in intergroup conflict resolution. PMID: 37696689 | DOI: 10.1016/j.tics.2023.08.021
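As a loose illustration of what "leader-follower synchronization" can mean operationally, hyperscanning studies often quantify inter-brain synchrony as the statistical dependence between two participants' neural time series. The sketch below uses a plain Pearson correlation on simulated signals; it is an assumed simplification, not Zhang and colleagues' actual analysis.

```python
# Hypothetical sketch: inter-brain synchrony as the correlation between
# a leader's and a follower's neural time series (simulated data).
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 60.0, 0.1)                         # 60 s at 10 Hz (illustrative)
leader = np.sin(2 * np.pi * 0.2 * t) + rng.normal(0, 0.3, t.size)
# Follower tracks the leader with a small lag plus noise
follower = np.roll(leader, 5) + rng.normal(0, 0.3, t.size)

sync = np.corrcoef(leader, follower)[0, 1]
print(f"leader-follower synchrony (Pearson r): {sync:.2f}")
```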
Source: Trends Cogn Sci - September 11, 2023 Category: Neuroscience Authors: Kelong Lu, Yafeng Pan Source Type: research

A goal-centric outlook on learning
Trends Cogn Sci. 2023 Sep 9:S1364-6613(23)00207-3. doi: 10.1016/j.tics.2023.08.011. Online ahead of print. ABSTRACT: Goals play a central role in human cognition. However, computational theories of learning and decision-making often take goals as given. Here, we review key empirical findings showing that goals shape the representations of inputs, responses, and outcomes, such that setting a goal crucially influences the central aspects of any learning process: states, actions, and rewards. We thus argue that studying goal selection is essential to advance our understanding of learning. By following existing literature in fram...
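The point that goals reshape states, actions, and rewards can be made concrete with a goal-conditioned reward function, the standard device in goal-conditioned reinforcement learning. The toy grid environment and reward below are assumptions for illustration, not the authors' model.

```python
# Hypothetical sketch: the same transition yields different rewards (and hence
# different learned values) depending on the agent's current goal.
from typing import Tuple

State = Tuple[int, int]   # grid position
Goal = Tuple[int, int]

def goal_conditioned_reward(state: State, goal: Goal) -> float:
    """Reward is defined relative to the goal, not to the environment alone."""
    return 1.0 if state == goal else 0.0

step = (2, 3)                                       # the agent lands on cell (2, 3)
print(goal_conditioned_reward(step, goal=(2, 3)))   # 1.0 under goal A
print(goal_conditioned_reward(step, goal=(0, 0)))   # 0.0 under goal B
```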
Source: Trends Cogn Sci - September 11, 2023 Category: Neuroscience Authors: Gaia Molinaro, Anne G E Collins Source Type: research

Attention with or without working memory: mnemonic reselection of attended information
This reporting failure is thought to stem from a failure to consolidate the attended information into working memory, indicating a dissociation between attention and working memory. Building on these findings, a new concept called memory reselection is proposed to describe a secondary round of selection among the attended information. These discoveries challenge the conventional view of how attention and working memory are related and shed new light on modeling attention and memory as dissociable processes. PMID: 37689583 | DOI: 10.1016/j.tics.2023.08.010
Source: Trends Cogn Sci - September 9, 2023 Category: Neuroscience Authors: Yingtao Fu, Chenxiao Guan, Joyce Tam, Ryan E O'Donnell, Mowei Shen, Brad Wyble, Hui Chen Source Type: research

Action observation network: domain-specific or domain-general?
Trends Cogn Sci. 2023 Sep 2:S1364-6613(23)00208-5. doi: 10.1016/j.tics.2023.08.012. Online ahead of print. ABSTRACT: The action observation network (AON) has traditionally been thought to be dedicated to recognizing animate actions. A recent study by Karakose-Akbiyik et al. invites rethinking this assumption by demonstrating that the AON contains a shared neural code for general events, regardless of whether those events involve animate or inanimate entities. PMID: 37666724 | DOI: 10.1016/j.tics.2023.08.012
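A "shared neural code" claim of this kind typically rests on cross-decoding logic: a classifier trained to distinguish event types from responses to animate stimuli should transfer to responses to inanimate stimuli if the code is shared. The sketch below illustrates that logic on simulated response patterns; it is not the study's actual fMRI pipeline, and the event types and data are invented.

```python
# Hypothetical sketch of cross-decoding: train on animate-event patterns,
# test on inanimate-event patterns; above-chance transfer suggests a shared code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_trials, n_voxels = 100, 50
event_labels = rng.integers(0, 2, n_trials)          # e.g., "hit" vs. "push" events
shared_code = rng.normal(0, 1, (2, n_voxels))        # one pattern per event type

animate = shared_code[event_labels] + rng.normal(0, 1.0, (n_trials, n_voxels))
inanimate = shared_code[event_labels] + rng.normal(0, 1.0, (n_trials, n_voxels))

clf = LogisticRegression(max_iter=1000).fit(animate, event_labels)
print(f"cross-decoding accuracy: {clf.score(inanimate, event_labels):.2f}")
```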
Source: Trends Cogn Sci - September 4, 2023 Category: Neuroscience Authors: Li Wang, Yi Jiang Source Type: research

Bridging the data gap between children and large language models
Trends Cogn Sci. 2023 Aug 31:S1364-6613(23)00203-6. doi: 10.1016/j.tics.2023.08.007. Online ahead of print. ABSTRACT: Large language models (LLMs) show intriguing emergent behaviors, yet they receive around four or five orders of magnitude more language data than human children. What accounts for this vast difference in sample efficiency? Candidate explanations include children's pre-existing conceptual knowledge, their use of multimodal grounding, and the interactive, social nature of their input. PMID: 37659919 | DOI: 10.1016/j.tics.2023.08.007
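The "four or five orders of magnitude" figure is easy to reconstruct with back-of-the-envelope numbers. The specific estimates below (roughly 10 million words per year of heard language for a child, roughly a trillion training tokens for a modern LLM) are common ballpark assumptions, not values taken from the article.

```python
# Back-of-the-envelope comparison of language exposure (illustrative numbers).
import math

child_words_per_year = 1e7               # ~10M words/year heard by a child (assumption)
years = 10
child_total = child_words_per_year * years   # ~1e8 words by age 10

llm_training_tokens = 1e12               # ~1T tokens for a modern LLM (assumption)

gap = math.log10(llm_training_tokens / child_total)
print(f"difference: ~{gap:.0f} orders of magnitude")   # ~4
```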
Source: Trends Cogn Sci - September 2, 2023 Category: Neuroscience Authors: Michael C Frank Source Type: research

What are large language models supposed to model?
Trends Cogn Sci. 2023 Aug 31:S1364-6613(23)00202-4. doi: 10.1016/j.tics.2023.08.006. Online ahead of print. ABSTRACT: Do large language models (LLMs) constitute a computational account of how humans process language? And if so, what is the role of (psycho)linguistic theory in understanding the relationship between artificial and biological minds? The answer depends on choosing among several, fundamentally distinct ways of interpreting these models as hypotheses about humans. PMID: 37659920 | DOI: 10.1016/j.tics.2023.08.006
Source: Trends Cogn Sci - September 2, 2023 Category: Neuroscience Authors: Idan A Blank Source Type: research