Assistant Professor, Newark Campus Regional Faculty

From learning a language to riding a bike, most of our experiences are multisensory in nature. The brain's ability to integrate information from different sensory modalities into a coherent, unitary experience is remarkable, given that each modality simultaneously receives qualitatively different types of input (e.g., photons, molecules, pressure) and that this information is processed, at least in the early stages, by dedicated sensory systems. My program of research examines how infants, children, and adults process and integrate multisensory information, and how this ability subserves various cognitive tasks such as statistical learning, categorization, word learning, and individuation. Some of the questions guiding my research are: (a) How do people allocate attention to multisensory stimuli? (b) Do sensory modalities have dedicated attentional resources that allow multisensory stimuli to be processed in parallel, or do modalities compete for the same pool of resources? (c) Do the attentional weights assigned to sensory modalities change across development and over the course of processing?

My research consists of two interrelated parts: one examines the mechanisms underlying cross-modal processing, and the other attempts to ground many sophisticated behaviors in the dynamics of cross-modal processing. I take a lifespan approach, so my research includes infants and toddlers (8 to 24 months), children (3 to 5 years), and adults, and I have recently proposed a set of studies examining cross-modal processing in elderly populations (see the Future Directions section). While most of my research is behavioral, I also incorporate psychophysiological measures such as heart rate and event-related potentials (ERPs).