ERC CoG
2023-2028
Passive value markers: Identification and change of passive value markers in the laboratory and naturalistic environments
When you look at a given object, are you calculating its value even if not prompted to do so? Most theories of valuation claim that choice, prompting, or elicitation in general is needed to induce valuation. However, the well-known mere exposure effect, dating back to 1968, suggests that merely viewing an item can induce valuation and even enhance value. In this proposal I aim to demonstrate that valuation is an early and automatic process relying on visual, attentional, and motor systems. PassiveValueMarkers offers a novel framework to identify biomarkers of value of individual items passively, without elicitation. The aims of PassiveValueMarkers are: 1) Identifying behavioural and neural passive markers of value for individual items using computational modelling, and influencing these markers; 2) Detecting passive markers of value through gaze-pattern analysis behaviourally and in the brain using fMRI; 3) Identifying passive value markers in naturalistic virtual reality environments, addressing the gap between laboratory studies and the real world. The overarching aim of this proposal is to develop a new theoretical framework of individual passive value construction and change. To do so, I will use a unique combination of neuroimaging, computational modelling, gaze-tracking analysis, and virtual reality. This research will directly address the understudied question of how value for individual items is formed by the brain without active prompting or elicitation.
Uncovering the mechanisms of passive value representation at the single-item level, individualized per participant, will allow the design of closed-loop manipulations at the item level. This approach will serve as the basis for developing novel evidence-based methods for enhanced preference modification in healthy participants and in disorders with abnormal valuation such as addictions, mental illness, and eating disorders.
Attention, memory, and preference changes in Cue Approach Training (CAT) | Adi Cantor
The research focuses on investigating the underlying mechanisms of Cue Approach Training (CAT), particularly how attention and memory influence preference changes.
VR Museum | Yana Sklyar
This study explores the potential of virtual reality (VR) to revolutionize museum experiences by offering personalized, interactive tours that cater to individual visitor preferences. By employing a novel VR application allowing participants to freely navigate a virtual museum space, the research examines the impact of active decision-making on visitor satisfaction. Participants are divided into three groups to compare satisfaction levels based on varying degrees of choice during the tour. Utilizing the Mizne-Blumenthal collection, participants select art features, with the chosen artworks and their sequence remaining consistent across groups to ensure structural validity. Physiological measures, such as facial expressions and eye movements, along with behavioral and satisfaction questionnaires, provide comprehensive data. This approach combines data science with cultural heritage to enhance engagement, optimize museum visits, and address accessibility challenges, ultimately contributing to museums' missions of interpretation and preservation in the digital age.
Predicting Subjective Preferences in Humans by Their Physiological Movements | Eden Ishakov
In this project, we aim to study whether physiological movements in humans can predict their subjective preferences. The goal of this study is to determine whether active or passive movements can indicate an individual's subjective preference for pictures shown in a VR environment. Using three tracking modalities available in the VR headset (eye tracking, face tracking, and hand tracking), we want to see whether movements made while swiping right (if participants like a picture) or left (if they don't), or while simply observing the pictures passively, can predict what an individual will prefer.
ResXR - An Open-Source Toolkit for Standardized XR Behavioral Research
An end-to-end open-source software project for conducting behavioral XR experiments. ResXR provides a Unity-based experiment template for multimodal data capture (head, hand, eye, and face tracking from Meta's Quest Pro head-mounted display), alongside a Python processing pipeline that automates the creation of a standardized data structure, validation, preprocessing, and quality reporting, inspired by established neuroimaging data formats and tools such as fMRIPrep.
Sudden Insight and the "Aha!" Moment in Spatial Problem-Solving
This project investigates the relationship between the cognitive phenomenon of "sudden insight" and the subjective "Aha!" experience within the domain of spatial navigation, an area less explored compared to traditional verbal tasks. Building on a successful pilot study utilizing a novel Virtual Reality (VR) maze paradigm, the research aims to elicit and objectively measure how these sudden realizations manifest during spatial problem-solving. Our initial findings indicated a strong link between the self-reported feeling of an "Aha!" moment and immediate, objective improvements in navigation efficiency. We are currently conducting a rigorous, pre-registered (OSF) replication of this work to validate these patterns with expanded analysis. By integrating human introspection with objective behavioral metrics, this work establishes a framework for future cross-species comparisons, aiming to uncover the fundamental mechanisms of how insight drives learning.
Multisensory value-based decision making
In this study, we examine how learned values of unisensory stimuli are integrated during multisensory decision making. Participants learn the values of visual, auditory, and somatosensory stimuli and then evaluate multisensory combinations of these cues. We aim to explore the relative contribution of each modality, the effect of value conflict across modalities on decision difficulty, and whether participants evaluate multisensory combinations as the sum of their individual components.
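The summation hypothesis in the last sentence can be contrasted with a weighted-combination alternative in a few lines. This is only a sketch of the two candidate models; the function names, weights, and stimulus values are hypothetical and do not come from the study's data or analysis code.

```python
def additive_prediction(values: dict) -> float:
    """Additive model: multisensory value = sum of unisensory values."""
    return sum(values.values())

def weighted_prediction(values: dict, weights: dict) -> float:
    """Alternative: weighted combination, allowing one modality to dominate."""
    return sum(weights[m] * v for m, v in values.items())

# Hypothetical learned values for one audio-visual-somatosensory combination
learned = {"visual": 3.0, "auditory": 1.5, "somatosensory": -0.5}

print(additive_prediction(learned))  # 4.0
print(weighted_prediction(learned,
      {"visual": 0.6, "auditory": 0.3, "somatosensory": 0.1}))  # 2.2
```

Comparing which of the two predictions better fits participants' evaluations of the combined stimuli is one way to formalize the question posed above.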
Studying automatic behavior in humans
This research focuses on automatic behavior in different environments. I developed experimental paradigms to test how action tendencies are learned and how they respond to changes in context, rules, or outcomes. My work examines both adaptive forms of automaticity, where efficient responding supports performance, and maladaptive forms, where previously learned responses persist despite new goals or altered contingencies. Using virtual reality and laboratory-based behavioral methods, I aim to refine behavioral measures of automaticity and to identify the task and environmental factors that promote flexibility versus rigidity in learned behavior.
VR Modality, Locomotion, and Spatial Learning
This project examines how VR modality (level of immersion) and locomotion interfaces influence spatial learning and gaze behavior. By translating a classic spatial learning task across different VR conditions, the work aims to characterize strategy selection and quantify behavioral and eye-tracking markers that predict individual differences in navigation performance.
Relevant Papers:

Gabay, M., & Schonberg, T. (2026). The effect of virtual reality modality level of immersion and locomotion on spatial learning and gaze measures. Virtual Reality, 30, 36. https://doi.org/10.1007/s10055-025-01297-9











