Special Session: Artificial Perception: Machines with Lifelike Failings
Proceedings Papers
A Monocular Depth Estimator to Perceive Crater Illusions in Several Characteristics
isal2021, ALIFE 2021: The 2021 Conference on Artificial Life, 22, (July 18–22, 2021), 10.1162/isal_a_00379
Abstract
How machines perceive visual illusions is essential to clarifying the differences between artificial and human perception. Crater illusions induce depth perception in a texture through the intensity gradient of its disks. In this study, we investigated whether a monocular depth estimator perceives crater illusions. We adopted MonoDepth as the estimator and conducted two experiments using the illusions, both as planar images and attached to a cube. The estimator perceived convexity on the top-lit disk but only slight concavity on the bottom-lit disk. We also found vertical heterogeneity in its depth perception. Additionally, the machine perceived the illusion attached to the cube, although indeterminacy with respect to object scale was revealed. Our findings may bridge research in computer vision and cognitive science in terms of depth perception.
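The stimulus setup can be illustrated with a minimal sketch: the snippet below renders top-lit and bottom-lit crater-illusion disks with NumPy and hands them to a generic depth estimator. The estimate_depth() call is a hypothetical stand-in for whatever MonoDepth wrapper is used; none of this is code from the paper.

import numpy as np

def crater_disk(size=128, radius=40, top_lit=True):
    # Render a disk whose intensity varies linearly from top to bottom on a
    # uniform grey background. Top-lit shading is typically read as convex by
    # human observers, bottom-lit shading as concave (the crater illusion).
    yy, xx = np.mgrid[:size, :size]
    mask = (yy - size // 2) ** 2 + (xx - size // 2) ** 2 <= radius ** 2
    gradient = np.linspace(1.0, 0.0, size) if top_lit else np.linspace(0.0, 1.0, size)
    img = np.full((size, size), 0.5)
    img[mask] = gradient[yy[mask]]
    return img

# estimate_depth() is a hypothetical wrapper around a monocular depth model
# (e.g. a MonoDepth checkpoint); it is not part of the paper's released code.
# depth_top = estimate_depth(crater_disk(top_lit=True))
# depth_bottom = estimate_depth(crater_disk(top_lit=False))
# Comparing mean predicted depth inside the disk against the background shows
# whether the estimator reads the shading as convexity or concavity.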
Proceedings Papers
Artificial Cognitive Map System based on Generative Deep Neural Networks
isal2021, ALIFE 2021: The 2021 Conference on Artificial Life, 24, (July 18–22, 2021), 10.1162/isal_a_00462
Abstract
We present a novel artificial cognitive map system built on a generative deep neural network, the Variational Autoencoder / Generative Adversarial Network (VAE/GAN), which encodes input images into a latent space whose structure is self-organized through learning. Our results show that, after training, the distance between predicted images is reflected in the distance between the corresponding latent vectors, indicating that the latent space is organized to reflect the proximity structure of the dataset. The system can also internally generate temporal sequences analogous to hippocampal replay/pre-play, and we found that these sequences are not merely exact replays of past experience, which could be the origin of novel sequences created from past experiences. Such a generative nature of cognition is thought to be a prerequisite for artificial cognitive systems.
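As a rough illustration of the proximity-structure claim, the sketch below correlates pairwise pixel-space distances with pairwise latent distances over a small batch of images. The encode callable is a hypothetical stand-in for the trained VAE/GAN encoder, not an interface defined by the paper.

import numpy as np

def distance_alignment(images, encode):
    # `encode` is a hypothetical callable mapping one image array to a latent
    # vector (a stand-in for the trained VAE/GAN encoder). Returns the Pearson
    # correlation between pairwise image distances and latent distances; a
    # high value means the latent space preserves the dataset's proximity
    # structure.
    latents = np.stack([encode(img) for img in images])
    flat = np.asarray(images).reshape(len(images), -1)
    img_d, lat_d = [], []
    for i in range(len(images)):
        for j in range(i + 1, len(images)):
            img_d.append(np.linalg.norm(flat[i] - flat[j]))
            lat_d.append(np.linalg.norm(latents[i] - latents[j]))
    return np.corrcoef(img_d, lat_d)[0, 1]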
Proceedings Papers
Psychophysical Tests Reveal that Evolved Artificial Brains Perceive Time like Humans
isal2021, ALIFE 2021: The 2021 Conference on Artificial Life, 23, (July 18–22, 2021), 10.1162/isal_a_00442
Abstract
Computational neuroscience attempts to build models of the brain that break cognition into basic elements. Here we study time perception in artificial brains evolved over thousands of generations to judge the duration of tones, and compare the evolved brains’ behavioral characteristics to those of human subjects performing the same task. We observe substantial similarities in psychometric properties between human subjects and digital brains, with very similar perception artifacts, but also differences due to the different selective pressures applied during training or evolution. Our findings suggest that digital experimentation using brains evolved within a computer can advance computational cognitive neuroscience by discovering new cognitive mechanisms and heuristics.
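A psychometric comparison of this kind can be sketched as follows: a logistic psychometric function is fit to proportion-"long" responses across tone durations, yielding a point of subjective equality (PSE) and a spread parameter that can be compared between humans and evolved brains. The durations, response proportions, and parameter names below are illustrative assumptions, not data or code from the paper.

import numpy as np
from scipy.optimize import curve_fit

def psychometric(duration, pse, spread):
    # Logistic psychometric function: probability of judging a tone as "long"
    # as a function of its duration. pse = point of subjective equality.
    return 1.0 / (1.0 + np.exp(-(duration - pse) / spread))

# Illustrative durations (ms) and proportions of "long" responses; not data
# from the paper.
durations = np.array([200.0, 300.0, 400.0, 500.0, 600.0, 700.0, 800.0])
p_long = np.array([0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.97])

(pse, spread), _ = curve_fit(psychometric, durations, p_long, p0=[500.0, 100.0])
# Fitted PSE and spread (related to the just-noticeable difference) can be
# compared between human subjects and evolved digital brains.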