Are event-related potentials (ERPs), well investigated under laboratory
conditions, also a signature of cortical processing during natural behavior?
We explore this question with a fully mobile recording setup that
integrates and synchronizes an EEG system, a mobile eye tracker with
pupil and world cameras, and a step sensor. These data are compared to
recordings with more restricted behavior and to classic, fully
controlled laboratory conditions. Focusing on the N170 ERP, we
streamline the data analysis with deep neural networks that categorize
elements such as faces in the participant's surroundings.
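As a rough sketch of what such a categorization step could look like (not the pipeline presented in the talk; the detector, package, and crop size are assumptions), one might crop the synchronized world-camera frame around each fixation and let a pretrained deep face detector decide whether a face was fixated:

```python
# Illustrative sketch only: label the fixated image content as
# "face" vs. "non-face" using a pretrained deep face detector
# (MTCNN from the facenet-pytorch package, an assumed choice).
import numpy as np
from PIL import Image
from facenet_pytorch import MTCNN

detector = MTCNN(keep_all=True)  # pretrained DNN face detector

def label_fixation(frame: np.ndarray, gaze_xy: tuple, half_size: int = 112) -> str:
    """Classify the patch around one fixation.

    frame    -- world-camera frame as an RGB uint8 array of shape (H, W, 3)
    gaze_xy  -- fixation position in frame pixel coordinates (x, y)
    """
    x, y = int(gaze_xy[0]), int(gaze_xy[1])
    h, w, _ = frame.shape
    # Gaze-centered crop, clipped to the frame borders.
    patch = frame[max(0, y - half_size):min(h, y + half_size),
                  max(0, x - half_size):min(w, x + half_size)]
    boxes, probs = detector.detect(Image.fromarray(patch))
    return "face" if boxes is not None else "non-face"
```

Such per-fixation labels could then serve as regressors for the fixation-locked ERP analysis.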
We find that widely reported effects are not as robust as they
seemed. Furthermore, under ecological conditions the image content at
not only the current but also the previous fixation influences the
cortical potentials. Finally, we present data that step-wise bridge the
gap between laboratory and real-world recordings.
Invited by Ralf Engbert