Postdoctoral Position (f/m/d) | Computational Auditory Perception

Job Offer from April 17, 2021

The Max Planck Institute for Empirical Aesthetics in Frankfurt/Main, Germany, investigates the attentional, cognitive, and affective mechanisms of aesthetic perception and evaluation.

The Research Group Computational Auditory Perception invites applications for a Postdoctoral Position: Characterizing Human Perception and Culture Using Massive Online Experiments

The Max Planck Institute for Empirical Aesthetics seeks a postdoctoral researcher to develop behavioral experiments in the domain of auditory perception, cognition, and culture. Under the direction of Dr. Nori Jacoby, the Research Group Computational Auditory Perception uses experimental paradigms—including iterated transmission chains, simulated cultural evolution, and online marketplaces—to explore cognitive processes, cultural transmission, and internal representations (e.g., Jacoby et al., 2019; Harrison et al., 2020; Langlois et al., 2021).

Many of these paradigms are inspired by machine-learning algorithms (e.g., Markov Chain Monte Carlo, stochastic gradient descent) and often leverage state-of-the-art deep-learning models for stimulus generation (e.g., GANs and VAEs). We are invested in building new virtual lab technologies to run these paradigms as massive online experiments, and in extending these technologies to work with rich modalities such as user-generated audio and video (e.g., Anglada-Tort et al., 2021).

We are seeking a researcher with experience in both independent research and scientific coding. The position entails both the development of new virtual lab technologies and their application to novel research in the cognitive sciences. The balance between these two activities can be adjusted depending on the applicant's skills and interests.

The ideal candidate for this position will have:

  • excellent coding skills, experience with both front-end web development (HTML/CSS/JavaScript) and back-end development (esp. Python), and knowledge of open-source and collaborative coding practices;
  • a publication record that includes peer-reviewed journals;
  • comprehensive experience with data analysis, statistics, and research methods.

The following are also desirable:

  • experience with online experiments and virtual lab technologies;
  • experience with data science;
  • knowledge of modern orchestration technologies such as Kubernetes and Docker;
  • experience with deep learning, especially generative models such as VAEs and GANs;
  • knowledge of signal processing, specifically in the auditory domain.

This is a one-year fixed-term position with the possibility of renewal for at least one more year. The anticipated start date is flexible. Some remote working is possible (details can be discussed).

The Computational Auditory Perception research group offers an exciting interdisciplinary field of activity in an international scientific environment; the working languages are English and German. The Max Planck Institute for Empirical Aesthetics is attractively located, with excellent infrastructure, in the Westend neighborhood of Frankfurt am Main, Germany. You can expect a modern, well-equipped workplace with flexible working hours and a pleasant working atmosphere.

Salaries are determined according to the German TVöD/Bund salary scale at level E13 (ca. €51,000–€60,000, depending on experience) and also entail substantial benefits.

The Max Planck Society strives for gender equality and diversity, and aims to employ more people with disabilities. We therefore welcome applications from people of all backgrounds.

Your application

Your application should include: a detailed CV, a summary of research interests and experience, two writing samples, and the names and contact information of two references. Please send these materials, compiled in a single PDF file, by e-mail to .

If you have any questions, please contact Diana Gleiss at: .
