Journal: Biological Psychology (CC BY-NC-ND)

Keywords: ADHD; ODD; sleep; memory; faces; pupillometry

Accepted Manuscript

Title: Memory consolidation of socially relevant stimuli during sleep in healthy children and children with attention-deficit/hyperactivity disorder and oppositional defiant disorder: What you can see in their eyes

Author: Alexander Prehn-Kristensen, Ina Molzow, Alexandra Förster, Nadine Siebenhühner, Maxime Gesch, Christian D. Wiesner, Lioba Baving

PII: S0301-0511(16)30400-8
DOI: http://dx.doi.org/10.1016/j.biopsycho.2016.12.017
Reference: BIOPSY 7316

To appear in: Biological Psychology

Received date: 13-4-2016
Revised date: 8-11-2016
Accepted date: 29-12-2016

Please cite this article as: Prehn-Kristensen, Alexander, Molzow, Ina, Förster, Alexandra, Siebenhühner, Nadine, Gesch, Maxime, Wiesner, Christian D., Baving, Lioba, Memory consolidation of socially relevant stimuli during sleep in healthy children and children with attention-deficit/hyperactivity disorder and oppositional defiant disorder: What you can see in their eyes. Biological Psychology http://dx.doi.org/10.1016/j.biopsycho.2016.12.017

This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

Memory consolidation of socially relevant stimuli during sleep in healthy children and children with attention-deficit/hyperactivity disorder and oppositional defiant disorder: What you can see in their eyes.

Alexander Prehn-Kristensen1*, Ina Molzow1, Alexandra Förster1, Nadine Siebenhühner1, Maxime Gesch1, Christian D. Wiesner1, Lioba Baving1

1 Department of Child and Adolescent Psychiatry and Psychotherapy, Center for Integrative Psychiatry, School of Medicine, Christian Albrechts University, Kiel, Germany

* corresponding author, E-mail: a.prehn@zip-kiel.de

Key words: ADHD; ODD; sleep; memory; faces; pupillometry

Highlights

- Sleep in healthy children fostered the memory recognition of face pictures

- In children with ADHD+ODD the face recognition was not improved after sleep

- Pupillometry is an appropriate measurement of sleep-dependent memory consolidation

- Sleep had no impact on emotional picture ratings

Abstract: Children with attention-deficit/hyperactivity disorder (ADHD) display deficits in sleep-dependent memory consolidation, and comorbid oppositional defiant disorder (ODD) additionally results in deficits in face processing. The aim of the present study was to investigate the role of sleep in recognizing faces in children with ADHD+ODD. Sixteen healthy children and 16 children diagnosed with ADHD+ODD participated in a sleep and a wake condition. During encoding (sleep condition at 8 p.m.; wake condition at 8 a.m.), pictures of faces were rated according to their emotional content; the retrieval session (12h after the encoding session) contained a recognition task including pupillometry. Pupillometry and behavioral data revealed that healthy children benefited from sleep compared to wake with respect to face picture recognition; in contrast, recognition performance in patients with ADHD+ODD was not improved after sleep compared to wake. It is discussed whether in patients with ADHD+ODD social stimuli are preferentially consolidated during daytime.

Introduction

Recognizing individuals and their emotional state by their face is essential for appropriate social interaction (Adolphs & Birmingham, 2011; Gobbini & Haxby, 2007). Humans are inherently social, and the ability to detect faces or to recognize face identity can already be seen in newborn infants (Hoehl & Peykarjou, 2012; Simion & Giorgio, 2015; Wilmer et al., 2010). However, children's ability to recognize faces is also clearly influenced by learning and experience (da Silva Ferreira, Crippa, & de Lima Osorio, 2014; Goodman et al., 2007; Macchi Cassia, Luo, Pisacane, Li, & Lee, 2014; Proietti, Macchi Cassia, dell'Amore, Conte, & Bricolo, 2015). On the other hand, deficits in identifying faces can lead to serious social consequences, as seen in prosopagnosia (Susilo & Duchaine, 2013; Yardley, McDermott, Pisarski, Duchaine, & Nakayama, 2008). Also, several childhood and adult psychiatric disorders are accompanied by deficits in recognizing facial identities or emotional expressions, which may amplify aberrant social behavior (Bortolon, Capdevielle, & Raffard, 2015; Collin, Bindra, Raju, Gillberg, & Minnis, 2013; Ventura, Wood, Jimenez, & Hellemann, 2013). With a prevalence of 5.7%, disruptive behavior disorders are among the most prevalent psychiatric diseases in childhood and adolescence (Polanczyk, Salum, Sugaya, Caye, & Rohde, 2015) and are characterized by profound deficits in socio-emotional development (American Psychiatric Association, 2013). Disruptive behavior disorders include oppositional defiant disorder (ODD) and conduct disorder (CD), which are highly comorbid with attention-deficit/hyperactivity disorder (ADHD) (Waschbusch, 2002). In particular, childhood ADHD in combination with ODD is a high risk factor for later antisocial behavior, often resulting in young adult delinquency.
Patients with disruptive behavior disorders display deficient performance when it comes to recognizing faces and their emotional states (Aspan, Bozsik, Gadoros, & Nagy, 2014; Cadesky, Mota, & Schachar, 2000; Downs & Smith, 2004) and show altered brain activity in emotion-related brain regions (amygdala, prefrontal cortex) while processing social stimuli such as faces (Jones, Laurens, Herba, Barker, & Viding, 2009; Lozier, Cardinale, VanMeter, & Marsh, 2014; Marsh et al., 2008; White et al., 2012).

Sleep supports the consolidation of various memory systems: post-learning sleep benefits the integration of newly encoded, fragile memory information into more stable, long-term memory traces (Diekelmann & Born, 2010; Rasch & Born, 2013; Wilhelm, Prehn-Kristensen, & Born, 2012). Recently, we observed that children with ADHD display altered sleep-dependent memory consolidation; i.e. sleep in children with ADHD benefits procedural memory more than in healthy children (Prehn-Kristensen et al., 2011), and sleep in ADHD supports declarative memory less than in healthy children (Prehn-Kristensen et al., 2011; Prehn-Kristensen et al., 2014; Prehn-Kristensen et al., 2013). We assumed that in ADHD the sleep-dependent consolidation of declarative memory is disturbed due to dysfunctional prefrontal brain activity during sleep (Wilhelm et al., 2012). There is a whole body of evidence that sleep supports the memory for emotional events (Wagner, Gais, & Born, 2001; Wagner, Hallschmid, Rasch, & Born, 2006) or scenes (Hu, Stylos-Allan, & Walker, 2006; Payne, Stickgold, Swanberg, & Kensinger, 2008; Prehn-Kristensen et al., 2009). The question of whether or not sleep modulates the affective tone of learned emotional stimuli, however, is still under debate: Some authors emphasize that the affective tone of learned emotional stimuli is attenuated after sleep (Walker & van der Helm, 2009), while others have not found support for the hypothesis that sleep tunes down the affective tone of consolidated stimulus material (Baran, Pace-Schott, Ericson, & Spencer, 2012; Deliens, Neu, & Peigneux, 2013; Groch, Wilhelm, Diekelmann, & Born, 2013; Wiesner et al., 2015).

Studies in healthy adults have reported that sleep also benefits the memory recognition of social stimuli such as pictures of faces with neutral or emotional expressions (Mograss, Guillem, & Godbout, 2008; Mograss, Godbout, & Guillem, 2006; Wagner, Hallschmid, Verleger, & Born, 2003; Wagner, Kashyap, Diekelmann, & Born, 2007). A recent study of young patients with autism spectrum disorder (ASD) reported that sleep in ASD patients amplified altered recognition of neutral faces (Tessier, Lambert, Scherzer, Jemel, & Godbout, 2015). Autism is characterized by profound difficulties in communication and social interactions (American Psychiatric Association, 2013) and is associated with deficits in face processing (Harms, Martin, & Wallace, 2010; Nomi & Uddin, 2015). To date, no information - neither in healthy nor in diseased children - is available about whether or not sleep modulates the affective tone of learned social stimuli such as faces.

The aim of the present study is to investigate the influence of sleep on the picture recognition of emotional faces and their affective regulation in children with ADHD and comorbid ODD (ADHD+ODD) in comparison to healthy children. Based on our previous studies, we hypothesize that healthy children benefit more from sleep with respect to the recognition of faces than children with ADHD+ODD. Besides behavioral responses, we use pupillometry as an objective measurement to assess recognition performance on a psychophysiological level (Goldinger, He, & Papesh, 2009; Laeng, Sirois, & Gredeback, 2012; Papesh, Goldinger, & Hout, 2012). Since emotional face expression might influence sleep-related face recognition performance, different emotional conditions (happiness, fear, anger, and neutral) are introduced on an explorative level. The question of whether or not sleep regulates affective tone is still under debate. Therefore, we do not have any specific hypothesis about the impact of sleep on the emotional assessment of consolidated pictures.

Methods

Participants

Sixteen male children diagnosed with ADHD and ODD aged between 8-11 years (M=11.4, SD=1.5) and 16 healthy male children aged between 9-11 years (M=11.1, SD=1.1) participated in this study. Patients and controls did not differ in age, pubertal stage, or IQ (see Table 1). All children and their parents were interviewed using a German translation of the Revised Schedule for Affective Disorders and Schizophrenia for School-Age Children: Present and Lifetime Version (Kiddie-SADS-PL; Delmo et al., 2000; Kaufman et al., 1997). A standardized questionnaire, the Child Behavior Checklist (CBCL; Achenbach, 1991), was filled out by parents to assess any psychiatric symptoms of their children. According to the DSM-IV-TR, all patients met the criteria for ADHD (13x combined type, 3x inattentive type) and comorbid oppositional defiant disorder (ODD). In addition, one patient suffered from nocturnal enuresis. Controls were excluded if they displayed any psychiatric abnormalities. Exclusion criteria for all participants were: below average intelligence quotient (IQ < 85), as measured by the Culture Fair Intelligence Test 20-Revised Version (CFT 20-R; Weiß & Osterland, 2013), or profound memory impairment, as measured by a figural learning test to assess cerebral dysfunctions (Diagnosticum für Cerebralschädigung, DCS; Lamberti & Weidlich, 1999; cut-off score: 16th percentile of the reference sample).

According to the Sleep-Self-Report questionnaire (SSR; Owens, Spirito, McGuinn, & Nobile, 2000; critical score: >24), patients rated themselves as having more sleep problems than healthy controls [ADHD: Median: 23.5, Range: 21-40, M=25.4, SD=4.8; healthy controls: Median: 20.5, Range: 18-28, M=20.9, SD=2.5; ADHD vs. controls: t(30)=3.4, p=.002]; the same was true for parental ratings on the Children's Sleep Habits Questionnaire [CSHQ; Owens, Spirito, & McGuinn, 2000; critical score: >41; ADHD: Median: 42.5, Range: 36-60, M=43.9, SD=5.9; healthy controls: Median: 38.5, Range: 33-45, M=38.6, SD=4.0; ADHD vs. controls: t(30)=2.9, p=.006]. The ratings of seven patients and two controls were above the critical SSR score; parents of ten patients and three controls rated the sleep habits of their children above the critical CSHQ score. All participants had normal or corrected-to-normal vision. To control for possible group differences in emotional face processing, patients and controls completed the computer-based program for the training and testing of facial affect recognition (Frankfurter Test und Training fazialen Affekts, FEFA; Bölte et al., 2002). Although patients performed worse than controls in general [FEFA sum score patients: M=69.7%, SD=2.9; controls: M=79.2%, SD=7.5; patients vs. controls: t(30)=2.75, p=.01] and made more errors regarding neutral pictures (patients: M=90.2%, SD=16.3; controls: M=99.0%, SD=3.7; patients vs. controls: t(29)=2.06, p=.048), the affect recognition performances concerning angry, fearful, or happy faces did not differ between groups (p>.147). According to self-reports, all participants were free of any neurological, immunological, or endocrinological diseases. Parental reports revealed that no participant took any medication except for methylphenidate in eleven ADHD patients; however, these patients discontinued medication 48h (approximately twelve half-lives) prior to each experimental condition.

All participating children and their parents gave written, informed consent and were reimbursed with a voucher for their participation. The study was approved by the ethics committee of the medical faculty of the University of Kiel and followed the ethical standards of the Helsinki Declaration.

Procedure

Each participant took part in a sleep and a wake condition. The sleep condition consisted of an encoding phase in combination with a baseline measurement in the evening at 8 p.m. and a retrieval session in the morning after a 12-h interval which included nocturnal sleep. In the wake condition, the encoding session and baseline measurement were conducted in the morning at 8 a.m.; the retrieval session took place after a 12-h wake interval. The order of conditions (each being conducted at least two weeks apart) was counterbalanced across both groups. All experimental sessions were carried out under laboratory conditions: Children were seated in a comfortable EEG chair; the distance between the children's eyes and the monitor was constantly kept at 60cm; and the room illumination was the same for all participants.

At the beginning of each session, participants rated their current emotional state using the Self-Assessment Manikin (SAM; Bradley & Lang, 1994) scales for valence and arousal, and their tiredness using a 10-cm analog scale (0 = very tired; 10 = very alert). In addition, current alertness was measured by a computer-based alertness test (KITAP subtest "Alertness"). In the evening sessions, participants were asked to report whether or not they had slept during the daytime or taken any medication.

Picture recognition paradigm

A total of 320 black and white pictures of faces showing different kinds of emotional expressions were used (80 angry, 80 fearful, 80 happy, 80 neutral faces). Pictures of faces were taken from the following databases: the database of facial expressions in young, middle-aged, and older women and men FACES (Ebner, Riediger, & Lindenberger, 2010), the NimStim Set of Facial Expressions (Tottenham et al., 2009), 3D Facial Emotional Stimuli (Gur et al., 2002), the Karolinska Directed Emotional Faces (KDEF; Lundqvist, Flykt, & Ohman, 1998), and the Productive Aging Laboratory Face Database (Minear & Park, 2004). Prior to the experiment, a preselection of 180 pictures of females and 177 pictures of males was presented to a group of 12 healthy children (aged 9-12 yrs., 7 girls) and 12 healthy adults (aged 25-48 yrs., 6 women) in order to remove pictures depicting ambiguous emotional face expressions. 160 female and 160 male pictures were finally selected and divided into two sets of 160 pictures each (50% female pictures) with comparable emotional intensity ratings. The pictures' greyscales and backgrounds were manually homogenized using Adobe Photoshop CS3 (Adobe Systems). Each picture set consisted of 80 target pictures (20 of each emotional category) for the encoding session, 20 foil pictures (5 of each category) as distractors for the baseline measurement, and 60 foil pictures (15 of each category) as distractors for the retrieval session.
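The partitioning of each 160-picture set into targets, baseline foils, and retrieval foils, balanced within emotion categories, can be sketched as follows (an illustrative sketch only; the picture identifiers and the `split_picture_set` helper are hypothetical, not from the study):

```python
import random

def split_picture_set(pictures, seed=0):
    """Split one 160-picture set into 80 encoding targets, 20 baseline
    foils, and 60 retrieval foils, balanced within each emotion category.

    `pictures` maps each emotion ("angry", "fearful", "happy", "neutral")
    to a list of 40 picture IDs (hypothetical identifiers).
    """
    rng = random.Random(seed)
    targets, baseline_foils, retrieval_foils = [], [], []
    for emotion, ids in pictures.items():
        assert len(ids) == 40, "a 160-picture set has 40 pictures per emotion"
        shuffled = ids[:]
        rng.shuffle(shuffled)
        targets += shuffled[:20]            # 20 targets per emotion (80 total)
        baseline_foils += shuffled[20:25]   # 5 baseline foils per emotion (20 total)
        retrieval_foils += shuffled[25:40]  # 15 retrieval foils per emotion (60 total)
    return targets, baseline_foils, retrieval_foils
```

Fixing the random seed keeps the assignment reproducible across the sleep and wake conditions.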

During encoding, the 80 target pictures were presented in a pseudo-randomized order (see also Figure 1). Face pictures (530 x 450 pixels, 72 x 72 dpi) were presented centered on a 23" screen (resolution: 1280 x 768). The distance between the participant and the screen was kept at 60 cm, resulting in a visual angle of 17.7° per picture. Each trial started with a white fixation cross centered in the middle of the screen on a black background. Participants had to fixate the cross for 0.5s continuously. By using eye-tracking (for methods see below) we ensured that all participants fixated the middle of the screen in order to pay attention to the next visual stimulus. After fixation of the cross for 0.5s, a face picture was presented for 2.5s. Face pictures were centered in the middle of the screen, and participants were asked to evaluate the emotional state of the face. Hereafter, the emotional face expression had to be rated on the three visual analogue scales "anger", "fear", and "happiness" (ranging from "not at all" to "maximal") using a computer mouse (no time limit). Then, the same target picture was presented again for another 2.5s, and the participants were to answer a control question of whether the picture depicted a woman or a man (no time limit) by pressing one of two response buttons. On average, encoding took 25min. After a short break, a baseline control measurement was conducted: 20 targets (5 of every emotional class) along with 20 foils were presented in pseudo-randomized order. Here, too, each trial started with a white fixation cross, and, after 0.5 seconds of continuous fixation (controlled by eye-tracking), a target or foil picture was presented without a time limit and did not disappear until the participant's response was detected. Children were asked to rate whether the currently displayed picture was an "old" (target) or a "new" (foil) picture by pressing one of two computer mouse buttons (duration ca. 3min).

During retrieval, the remaining 60 targets were presented intermixed with 60 foils. Here, two different measurements were obtained. In a first block of trials, the 120 pictures were presented for the purpose of recognition, which was similar to the baseline recognition task (see above) with respect to the procedure. In a second block, the same 120 pictures were presented again in order to assess the emotional face expression: After 0.5s of fixation, a face picture was presented for 2.5s, and the facial expression was to be rated on the three scales "anger", "fear", and "happiness" (see above). Duration of the retrieval session was 30min.

Eye-tracking and pupillometry

Binocular eye movements and pupil diameter were measured by a Tobii TX300 eye-tracker (Tobii Technology, Sweden). Cameras using the dark-pupil technique were affixed under the experimental computer screen. The sampling rate was 300 Hz. During encoding, eye-tracking was analyzed online with the software Tobii Studio 2.3.2.0 in order to ensure that participants aligned their visual focus on the fixation crosses (see above). In the retrieval session, pupil diameter was recorded during the whole recognition task but analyzed only within the time window from 500ms before to 2500ms after face picture onset. Offline analysis was performed by in-house software programmed in Matlab R2013a (MathWorks, U.S.A.). Pupil data were cut into segments ranging from 500ms before to 2500ms after picture onset, each representing one event-related epoch. A first visual inspection of the segments revealed frequent eye blinks from 1500ms after onset, which resulted in missing data. The segments were therefore re-cut to -500ms to 1500ms. The preprocessing was carried out for each segment individually and visually controlled. Missing data (due to eye blinks) were linearly interpolated; however, segments that were significantly affected by artifacts were discarded. Furthermore, the raw data were smoothed using a two-way floating average filter (averaging ± 8 data points, i.e. ± 26.7ms) and then baseline-corrected using the time window of -100 to 0ms as a baseline. Afterwards, each segment was visually inspected again and, if still corrupted by artifacts (e.g. eye blinks), discarded. Artifact rejection was done conservatively and led to a drop-out rate of about 24%. The artifact-free and baseline-corrected data in the range of 0-1500ms were averaged over consecutive sections of 75ms, resulting in 20 bins. Finally, the preprocessed segments of each participant and each event class (old/new by emotion) were averaged over the left and right eye and entered into the statistical analysis.
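The per-segment preprocessing pipeline described above (interpolation of blinks, ±8-sample two-way averaging, baseline correction, 75ms binning) can be sketched in Python. This is a simplified illustration under the assumption that blinks are coded as NaN and the sampling rate is the stated 300 Hz; the function name and the corruption threshold are our own, not the authors' Matlab implementation:

```python
import numpy as np

PRE, POST = 150, 450  # samples at 300 Hz: -500..0 ms and 0..1500 ms

def preprocess_segment(seg):
    """Clean one pupil-diameter segment spanning -500..1500 ms around
    picture onset (length PRE+POST; blinks coded as NaN).
    Returns 20 bin means over 0..1500 ms, or None if too corrupted."""
    seg = np.array(seg, dtype=float)
    bad = np.isnan(seg)
    if bad.mean() > 0.5:            # discard heavily corrupted segments
        return None
    # linear interpolation across blink gaps
    idx = np.arange(seg.size)
    seg[bad] = np.interp(idx[bad], idx[~bad], seg[~bad])
    # two-way moving average over +-8 samples (+-26.7 ms at 300 Hz)
    kernel = np.ones(17) / 17.0
    padded = np.pad(seg, 8, mode="edge")        # avoid zero-padding artifacts
    smooth = np.convolve(padded, kernel, mode="valid")
    # baseline correction: mean of the -100..0 ms window (30 samples)
    smooth = smooth - smooth[PRE - 30:PRE].mean()
    # average the post-onset samples into 20 consecutive ~75 ms bins
    post = smooth[PRE:PRE + POST]
    return np.array([chunk.mean() for chunk in np.array_split(post, 20)])
```

Note that 75ms at 300 Hz corresponds to 22.5 samples, so the 20 bins cannot all contain the same integer number of samples; `np.array_split` alternates between 23- and 22-sample bins, one plausible reading of "consecutive sections of 75ms".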

Sleep recordings

Sleep was assessed under laboratory conditions. All participants spent two nights in the sleep laboratory. The first night (including polysomnography, PSG) was used for adaptation to the conditions of the sleep laboratory; the experimental condition took place in the second night. Adaptation and experimental nights were separated by at least one night of sleeping at home in order to avoid any recovery-carryover effects from the adaptation to the experimental night. For the experimental night, participants arrived at the sleep lab at 7 p.m. After affixing the PSG electrodes, the encoding session started at 8 p.m., followed by the baseline measurement. Thereafter, participants went to bed, and lights were switched off at 9.30 p.m. The sleep EEG was recorded at a sampling rate of 128 Hz with a band-pass filter of 0.2-35 Hz using multi-use Ag/AgCl electrodes affixed to the positions C3 and C4 according to the 10-20 system, referenced to an electrode on the bridge of the nose, with the ground placed at Fpz. A diagonal electrooculogram (EOG) was recorded at a sampling rate of 128 Hz with a band-pass filter of 0.2-75 Hz using single-use Ag/AgCl electrodes attached to the lower right and upper left canthi. A bipolar EMG was recorded at a sampling rate of 256 Hz with a band-pass filter of 0.2-128 Hz using single-use Ag/AgCl electrodes attached to the chin. Recordings were visually scored according to standard criteria [35] by a trained rater. For each night the following was measured: time in bed (TIB), sleep onset latency (time in minutes from lights off to the first epoch of stage 2 sleep), total sleep time (in minutes), sleep efficiency (ratio of total sleep time to time in bed), sleep stages 1-4 and REM sleep (in minutes), and REM latency (time in minutes from sleep onset to the first epoch of REM sleep). Children were awakened at 7 a.m., electrodes were removed, and breakfast was served. The retrieval session began at 8 a.m.
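Given the definitions above, the reported sleep parameters can be derived mechanically from a scored hypnogram. The sketch below assumes hypothetical stage labels ("W", "S1".."S4", "REM") and 30-s epochs starting at lights-off; it is not the authors' scoring software:

```python
def sleep_parameters(hypnogram, epoch_len_min=0.5):
    """Derive sleep parameters from a scored hypnogram.

    hypnogram: per-epoch stage labels from lights-off onward, e.g.
    "W", "S1".."S4", "REM" (30-s epochs; labels are hypothetical).
    Assumes at least one S2 and one post-onset REM epoch exist.
    """
    tib = len(hypnogram) * epoch_len_min                      # time in bed
    onset = next(i for i, s in enumerate(hypnogram) if s == "S2")
    sol = onset * epoch_len_min                               # sleep onset latency
    tst = sum(epoch_len_min for s in hypnogram if s != "W")   # total sleep time
    efficiency = tst / tib                                    # TST / TIB
    first_rem = next(i for i, s in enumerate(hypnogram[onset:]) if s == "REM")
    rem_latency = first_rem * epoch_len_min                   # onset to first REM
    stage_minutes = {st: sum(epoch_len_min for s in hypnogram if s == st)
                     for st in ("S1", "S2", "S3", "S4", "REM")}
    return {"TIB": tib, "SOL": sol, "TST": tst,
            "efficiency": efficiency, "REM_latency": rem_latency,
            "stages": stage_minutes}
```

Whether stage 1 epochs count toward total sleep time is a scoring convention; the sketch counts every non-wake epoch, which matches the usual reading of "total sleep time".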

Statistical analyses

Recognition performance was calculated as the accuracy rate, defined as the difference between the standardized hit rate (percent of targets correctly identified as targets) and the standardized false-alarm rate (percent of foils incorrectly identified as targets) according to signal detection theory (Verde, Macmillan, & Rotello, 2006). In order to control for any baseline effects, data obtained during baseline measurements were analyzed by a 2x2x4 ANOVA, using the between-subject factor GROUP (ADHD vs. controls) and the two within-subject factors SLEEP (sleep vs. wake condition) and EMOTION (angry vs. fearful vs. happy vs. neutral faces). Baseline-corrected memory performance was calculated as the difference between the baseline recognition accuracy and the retrieval recognition accuracy, as an index of forgetting. This recognition index was analyzed employing a 2x2x4 ANOVA, again using the between-subject factor GROUP and the two within-subject factors SLEEP and EMOTION.
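The accuracy measure (standardized hit rate minus standardized false-alarm rate, i.e. d' in signal detection terms) and the forgetting index can be computed as follows. This is a sketch: the clipping rule for extreme rates of 0 or 1 is a common convention that the paper does not specify:

```python
from statistics import NormalDist

def accuracy(hits, n_targets, false_alarms, n_foils):
    """Standardized hit rate minus standardized false-alarm rate (d').

    Rates are clipped away from 0 and 1 by 1/(2N) so the z-transform
    stays finite (an assumed convention, not stated in the paper)."""
    z = NormalDist().inv_cdf
    def clip(p, n):
        return min(max(p, 1.0 / (2 * n)), 1.0 - 1.0 / (2 * n))
    hit_rate = clip(hits / n_targets, n_targets)
    fa_rate = clip(false_alarms / n_foils, n_foils)
    return z(hit_rate) - z(fa_rate)

def forgetting_index(baseline_acc, retrieval_acc):
    """Baseline-corrected memory performance: baseline minus retrieval
    accuracy, so larger values indicate more forgetting over the interval."""
    return baseline_acc - retrieval_acc
```

For example, 15 hits out of 20 targets and 5 false alarms out of 20 foils give hit and false-alarm rates of .75 and .25, hence an accuracy of z(.75) - z(.25).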

For statistical analysis, pupil data were first depicted separately for target and foil pictures (see also Figure 3). In both the sleep and the wake condition, robust brightness adaptations were observed throughout the entire picture presentation interval of 1.5s. In both groups, however, at bin 6 (375ms after picture onset) targets and foils started to evoke different pupil diameters. Based on these data, pupil diameters were averaged post hoc over bins 6-20 (0.375-1.5s after picture onset) for each condition and emotion. The analysis of the averaged pupil diameters was performed by a 2x2x2x4 ANOVA using the between-subject factor GROUP and the three within-subject factors SLEEP, TARGET (targets vs. foils), and EMOTION.

In order to compare the single means in sleep parameters with respect to groups, t-tests were used. A detailed description of the analyses of picture ratings can be found in the electronic supplemental materials S1 and S2. The analyses of the control variables "mood", "tiredness", and "alertness" are described in detail in S3.

Results

Picture recognition data

Analyses of baseline performance revealed main effects of EMOTION [F(3,90)=6.8, p<.001] and GROUP [F(1,30)=5.3, p=.029], as well as interactions of EMOTION x GROUP [F(3,90)=3.6, p=.016], SLEEP x EMOTION [F(3,90)=4.6, p=.005], and SLEEP x GROUP [F(1,30)=8.5, p=.007]. Here, we refrain from going into all details (for descriptive statistics see Table 2). However, a decomposition of the interaction SLEEP x GROUP showed that children with and without ADHD+ODD displayed comparable recognition performance in the sleep condition [t(30)=0.1, p=.9]. On the other hand, patients displayed a lower baseline performance in the wake condition [t(30)=3.6, p=.001] than did healthy children. The interaction SLEEP x EMOTION x GROUP did not reach significance [F(3,90)=2.4, p=.070]. These significant group differences in baseline recognition performance clearly indicate group differences already at the stage of encoding. Therefore, controlling retrieval data with respect to baseline performance is required.

Regarding uncorrected retrieval data, there were main effects of SLEEP [F(1,30)=23.3, p<.001] and EMOTION [F(3,90)=53.4, p<.001] and interactions of EMOTION x GROUP [F(3,90)=8.6, p<.001], SLEEP x EMOTION [F(3,90)=4.2, p=.008], and SLEEP x GROUP [F(1,30)=5.4, p=.027]. Here again, we refrain from going into all details (see Table 2). However, a decomposition of the latter interaction revealed that patients still displayed comparable retrieval performance in the sleep condition compared to healthy children [t(30)=.3, p=.751] but worse performance in the wake condition [t(30)=2.3, p=.027]. The interaction SLEEP x EMOTION x GROUP did not reach significance [F(3,90)=2.2, p=.096].

With respect to baseline-corrected recognition performance, there were main effects of SLEEP [F(1,30)=4.6, p=.040: performance was higher in the sleep than in the wake condition] and EMOTION [F(3,90)=10.1, p<.001: happy faces were better recognized than all other picture conditions]. There was an interaction SLEEP x EMOTION [F(3,90)=6.1, p=.001]. Comparisons of single means revealed that the benefit of sleep on recognition performance was most pronounced in the angry face condition [t(30)=5.8, p<.001; fearful faces: t(30)=0.3, p=.801; happy faces: t(30)=1.4, p=.167; neutral faces: t(30)=1.4, p=.168]. In addition, there was an interaction EMOTION x GROUP [F(3,90)=4.1, p=.009] showing that patients compared to controls performed better on fearful faces [t(30)=2.1, p=.042] and happy faces [t(30)=2.4, p=.023]; on a descriptive level the opposite was true for angry faces [t(30)=1.4, p=.160]. The interaction SLEEP x GROUP was only of marginal significance [F(1,30)=3.5, p=.070]. However, as assumed, planned comparisons of single means revealed that sleep fostered the baseline-corrected recognition performance only in healthy children [t(15)=3.4, p=.004]; in children with ADHD+ODD, there was no benefit of sleep on baseline-corrected picture recognition [t(15)=0.2, p=.868; see also Figure 2]. Neither the main effect of GROUP [F(1,30)=2.7, p=.111] nor the interaction SLEEP x EMOTION x GROUP [F(3,90)=1.5, p=.226] was significant.

Pupil data

The analysis of averaged pupil data (0.375-1.5s, see also Figure 3) showed a main effect of SLEEP [F(1,30)=10.3, p=.003: in general, pupil diameters were increased after sleep compared to wake], a main effect of TARGET [F(1,30)=30.9, p<.001: target pictures evoked a stronger pupil dilatation than foils], a main effect of GROUP [F(1,30)=4.4, p=.045: healthy children displayed an overall larger pupil diameter than children with ADHD], and a main effect of EMOTION [F(3,90)=38.0, p<.001: the highest pupil dilatations were observed while processing angry and fearful faces, the lowest while watching happy faces]. In addition, the interaction EMOTION x TARGET x GROUP reached significance [F(3,90)=3.3, p=.025]. Most importantly, the interaction SLEEP x TARGET x GROUP was significant [F(1,30)=7.3, p=.001]. A decomposition of the latter interaction revealed that healthy children showed a pronounced pupil dilatation in response to target pictures compared to foils only in the sleep condition [t(15)=5.3, p<.001] but not in the wake condition [t(15)=1.6, p=.230]. In contrast, in children with ADHD an increased pupil dilatation in response to target pictures compared to foils was visible only in the wake condition [t(15)=4.9, p<.001] but not in the sleep condition [t(15)=1.5, p=.144]. The four-way interaction SLEEP x EMOTION x TARGET x GROUP was not significant [F(3,90)=0.7, p=.522]. Figure 3 depicts pupil reactions in response to target and foil pictures, separated for the sleep and wake condition, for children with and without ADHD+ODD; for every bin, comparisons of single means (targets vs. foils) were calculated using t-tests for dependent samples.

Picture rating data and control variables

Children with and without ADHD+ODD did not differ on any polysomnographic measure (see Table 3). The analyses of gender ratings during encoding showed that patients performed less accurately than controls; the same was true for basic affective ratings during the retrieval session. However, there was no evidence that sleep had an impact on the observed group differences (for further details see electronic supplemental material S1a/b). Regarding intensity ratings during encoding and retrieval, emotional faces were rated according to their a priori defined emotional classes (angry faces as angry, happy faces as happy, and so on), but there were no significant differences in intensity ratings, neither with respect to sleep condition nor with respect to group. Further details are presented in S2 and Figure S1. In addition, no significant effects of the factor GROUP were found with respect to alertness, tiredness ratings, or emotional self-ratings (see electronic supplemental material S3).

Discussion

In this study we investigated the impact of sleep on the consolidation of pictures of emotional faces in children diagnosed with ADHD+ODD and healthy controls. As expected, sleep benefited recognition performance in healthy children but not in children with ADHD+ODD. Pupil diameter measurement during the retrieval session, used as a psychophysiological marker of picture recognition, confirmed a sleep-associated improvement in memory performance in healthy children but not in young patients with ADHD+ODD. There were no signs of sleep-related emotion regulation in children with or without ADHD+ODD. Before discussing the results at the group level in detail, we first consider them more generally.

When data were combined across children with and without ADHD+ODD, there was an overall effect of sleep on recognition performance, particularly in the angry face condition. Independent of the emotional condition, the expected interaction between sleep/wake retention interval and group was marginally significant. The explorative interaction between group, emotional condition, and sleep/wake retention interval, however, failed to reach significance. Since we had no hypotheses regarding the different emotional conditions, we refrained from further analysis of the recognition data with respect to their emotional content. Of note, however, is the fact that the results rely on the use of baseline-corrected retrieval data. A baseline correction was mandatory since the groups already differed significantly during the immediate retrieval session in the wake condition. Alertness, tiredness, mood, or simple effects of initial gaze direction (controlled by eye-tracking) cannot explain the different levels of face encoding. The pupil data clearly showed that target pictures evoked stronger pupil dilatations than foils in both groups during the retrieval session. These data are in line with former studies reporting that the human pupil diameter is sensitive to the recognition of familiar stimulus material (Goldinger et al., 2009; Laeng et al., 2012; Papesh et al., 2012). Again, there were no indications in the pupil data that the emotional condition influenced the group differences in the sleep-dependent pupil recognition measure.
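The baseline-corrected recognition scores referred to above are differences of d' values (Table 2; Verde et al., 2006). A minimal sketch of that computation, assuming plain hit and false-alarm rates (extreme rates of 0 or 1 would need a correction, e.g. log-linear, which is omitted here):

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

def baseline_corrected(retrieval, baseline):
    """Retrieval-minus-baseline d'; each argument is (hit_rate, fa_rate)."""
    return d_prime(*retrieval) - d_prime(*baseline)

# A child who discriminates targets from foils better at retrieval
# than at baseline yields a positive corrected score.
score = baseline_corrected((0.8, 0.2), (0.7, 0.3))
```

Comparing these corrected scores between the sleep and wake conditions removes group differences already present at the immediate baseline test.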

As indicated by planned comparisons of single means, only healthy children benefited from sleep with respect to the recognition of emotional faces. Likewise, in healthy children, (familiar) target pictures evoked clearly stronger pupil dilatation during retrieval than (unfamiliar) foils only in the sleep but not in the wake condition. These data confirm that healthy children benefit more from sleep than from wakefulness with regard to the memory consolidation of faces, as has been observed previously in adults (Mograss et al., 2008; Mograss et al., 2006; Wagner et al., 2003; Wagner et al., 2007).

In contrast to healthy children, patients with ADHD+ODD did not show better overall performance after sleep compared to wakefulness. In accordance with the behavioral data, children with ADHD+ODD did not display stronger pupil dilatation in response to target pictures compared to foils in the sleep condition. Interestingly, patients displayed a higher pupil recognition response in the wake than in the sleep condition, suggesting a wake-dependent benefit in face recognition. In fact, on a descriptive level the behavioral data suggest that patients, compared to healthy controls, did not perform worse in the sleep condition but proportionally better in the wake condition. These signs of a reversed sleep/wake benefit of picture recognition in children with ADHD+ODD cannot be ascribed solely to daytime effects of encoding or retrieval: neither the alertness task nor subjective ratings (mood, tiredness) revealed that children with or without ADHD+ODD were differentially affected by daytime. Moreover, daytime had no influence on control variables regarding the memory task itself (i.e., identifying the subject's gender and rating the emotional facial expression during encoding and retrieval); furthermore, only subjective but not objective sleep parameters differed between the two groups (Cortese, Faraone, Konofal, & Lecendreux, 2009; J. Owens et al., 2009). Therefore, the behavioral data together with the pupil data suggest that wakefulness supported memory for socially relevant stimuli in ADHD+ODD at least to a degree comparable to sleep. In our former studies we observed that children with ADHD displayed worse recognition performance than healthy controls, especially in the sleep condition, indicating a deficit in sleep-dependent consolidation of declarative memory (Prehn-Kristensen et al., 2011; Prehn-Kristensen et al., 2014; Prehn-Kristensen et al., 2013). The data of the present study, however, paint a more complex picture: in contrast to the former studies, in which almost no social stimuli were used, the present study presented exclusively social stimuli.
Although patients with ADHD+ODD display alterations in processing social signals (Aspan et al., 2014; Cadesky et al., 2000; Downs & Smith, 2004; Jones et al., 2009; Lozier et al., 2014; Marsh et al., 2008; White et al., 2012), social stimuli such as faces might still be highly relevant for these patients: ADHD patients initiate social interactions more frequently than control children (Nijmeijer et al., 2008) but often have difficulties in attuning their behavior to other people (Ronk, Hund, & Landau, 2011; Stroes, Alberts, & Van Der Meere, 2003). It remains highly speculative, however, whether a) a pronounced focus on social stimuli such as faces increased the probability of better long-term memory performance in patients with ADHD+ODD during daytime, and b) this memory advantage diminished due to deficits in sleep-dependent memory consolidation. On the other hand, we cannot exclude that certain emotional face categories benefited from sleep while others counteracted this effect in children with ADHD+ODD. Given the lack of significance, however, these assumptions remain tentative and need to be investigated in further studies. Therefore, speculation about the neural mechanisms underlying a disturbed long-term consolidation of social stimuli is premature at this time.

We did not find any signs of sleep-associated changes in the emotional picture intensity ratings. In line with others (Baran et al., 2012; Deliens et al., 2013; Groch et al., 2013; Wiesner et al., 2015), we conclude that neither in healthy children nor in children with ADHD+ODD does sleep play an important role in emotion regulation at the behavioral level. Since we have no information about pupil diameter during encoding, we can only compare the pupil reactions to target and foil pictures at the retrieval sessions. We therefore have no information about possible changes in pupil diameter in response to repeated picture presentation after the retention interval, meaning that we cannot exclude possible sleep-dependent regulatory processes at a pre-conscious level.

As the interpretation of the picture recognition results relies mostly on a marginally significant interaction, our results need replication. Further studies should also take into account that presenting pictures of adults to children might have a different impact than pictures of peers (Macchi Cassia et al., 2014; Proietti et al., 2015). In addition, nonsocial stimuli should be used as a control, since faces per se can act as emotional stimuli (Ohman, Lundqvist, & Esteves, 2001; Schrammel, Pannasch, Graupner, Mojzisch, & Velichkovsky, 2009; Senju & Johnson, 2009). Moreover, a balanced number of trials across the immediate and delayed recognition measurements should be considered. Finally, psychophysiological parameters should be included during encoding [e.g., measures of brain function (see Sterpenich et al., 2007) or autonomic responses such as electrodermal activity] in order to predict face recognition performance after sleep on an objective level.

Taken together, we observed that sleep in healthy children supported the recognition of socially relevant stimuli such as faces. Pupillometry was successfully employed as an objective measure of recognition in the field of sleep-dependent memory consolidation in healthy children. As revealed by behavioral data and pupillometry, sleep in children with ADHD+ODD conferred no additional benefit on the recognition of adult face pictures.

Acknowledgement

We thank Petra Schneckenburger and Susanne Kell for their technical assistance. This study was supported by a grant from the German Research Foundation (DFG) SFB 654 "Plasticity and Sleep".

References

Achenbach, T. M. (1991). Manual for the child behavior checklist/4-18 and 1991 Profile. Burlington,Vermont: Department of Psychiatry, University of Vermont.

Adolphs, R., & Birmingham, E. (2011). Neural Substrates of Social Perception. In A. J. Calder, G. Rhodes, M. H. Johnson & J. V. Haxby (Eds.), The Oxford Handbook of Face Perception (pp. 571-589). New York: Oxford University Press.

American Psychiatric Association. (2013). The Diagnostic and Statistical Manual of Mental Disorders, 5th ed. Washington, DC: American Psychiatric Association

Aspan, N., Bozsik, C., Gadoros, J., & Nagy, P. (2014). Emotion recognition pattern in adolescent boys with attention-deficit/hyperactivity disorder. 2014, 761340. doi: 10.1155/2014/761340

Baran, B., Pace-Schott, E. F., Ericson, C., & Spencer, R. M. (2012). Processing of emotional reactivity and emotional memory over sleep. Journal of Neuroscience, 32(3), 1035-1042.

Bolte, S., Feineis-Matthews, S., Leber, S., Dierks, T., Hubl, D., & Poustka, F. (2002). The development and evaluation of a computer-based program to test and to teach the recognition of facial affect. Int J Circumpolar Health, 61 Suppl 2, 61-68.

Bortolon, C., Capdevielle, D., & Raffard, S. (2015). Face recognition in schizophrenia disorder: A comprehensive review of behavioral, neuroimaging and neurophysiological studies. Neurosci Biobehav Rev, 53, 79-107.

Bradley, M. M., & Lang, P. J. (1994). Measuring emotion: the Self-Assessment Manikin and the Semantic Differential. Journal of Behavior Therapy and Experimental Psychiatry, 25(1), 49-59.

Cadesky, E. B., Mota, V. L., & Schachar, R. J. (2000). Beyond words: how do children with ADHD and/or conduct problems process nonverbal information about affect? J Am Acad Child Adolesc Psychiatry, 39(9), 1160-1167

Collin, L., Bindra, J., Raju, M., Gillberg, C., & Minnis, H. (2013). Facial emotion recognition in child psychiatry: a systematic review. Res Dev Disabil, 34(5), 1505-1520.

Cortese, S., Faraone, S. V., Konofal, E., & Lecendreux, M. (2009). Sleep in children with attention-deficit/hyperactivity disorder: meta-analysis of subjective and objective studies. Journal of the American Academy of Child and Adolescent Psychiatry, 48(9), 894-908.

da Silva Ferreira, G. C., Crippa, J. A., & de Lima Osorio, F. (2014). Facial emotion processing and recognition among maltreated children: a systematic literature review. Front Psychol, 5, 1460.

Deliens, G., Neu, D., & Peigneux, P. (2013). Rapid eye movement sleep does not seem to unbind memories from their emotional context. J Sleep Res, 22(6), 656-662.

Delmo, C., Weiffenbach, O., Gabriel, M., Bolte, S., Marchio, E., & Poustka, F. (2000). Kiddie-SADS-present and lifetime version (K-SADS-PL), 3rd edn. (3rd ed.). Frankfurt, Germany: Clinic of Child and Adolescent Psychiatry.

Diekelmann, S., & Born, J. (2010). The memory function of sleep. Nature reviews. Neuroscience, 11(2), 114-126.

Downs, A., & Smith, T. (2004). Emotional understanding, cooperation, and social behavior in high-functioning children with autism. J Autism Dev Disord, 34(6), 625-635.

Ebner, N. C., Riediger, M., & Lindenberger, U. (2010). FACES--a database of facial expressions in young, middle-aged, and older women and men: development and validation. Behav Res Methods, 42(1), 351-362.

Gobbini, M. I., & Haxby, J. V. (2007). Neural systems for recognition of familiar faces. Neuropsychologia, 45(1), 32-41. doi: 10.1016/j.neuropsychologia.2006.04.015

Goldinger, S. D., He, Y., & Papesh, M. H. (2009). Deficits in cross-race face learning: insights from eye movements and pupillometry. J Exp Psychol Learn Mem Cogn, 35(5), 1105-1122.

Goodman, G. S., Sayfan, L., Lee, J. S., Sandhei, M., Walle-Olsen, A., Magnussen, S., . . . Arredondo, P. (2007). The development of memory for own- and other-race faces. J Exp Child Psychol, 98(4), 233-242.

Groch, S., Wilhelm, I., Diekelmann, S., & Born, J. (2013). The role of REM sleep in the processing of emotional memories: evidence from behavior and event-related potentials. Neurobiol Learn Mem, 99, 1-9.

Gur, R. C., Sara, R., Hagendoorn, M., Marom, O., Hughett, P., Macy, L., . . . Gur, R. E. (2002). A method for obtaining 3-dimensional facial expressions and its standardization for use in neurocognitive studies. J Neurosci Methods, 115(2), 137-143.

Harms, M. B., Martin, A., & Wallace, G. L. (2010). Facial emotion recognition in autism spectrum disorders: a review of behavioral and neuroimaging studies. Neuropsychol Rev, 20(3), 290-322.

Hoehl, S., & Peykarjou, S. (2012). The early development of face processing--what makes faces special? Neurosci Bull, 28(6), 765-788.

Hu, P., Stylos-Allan, M., & Walker, M. P. . (2006). Sleep Facilitates Consolidation of Emotional Declarative Memory. Psychological Science, 17(10), 891-898.

Jones, A. P., Laurens, K. R., Herba, C. M., Barker, G. J., & Viding, E. (2009). Amygdala hypoactivity to fearful faces in boys with conduct problems and callous-unemotional traits. Am J Psychiatry, 166(1), 95-102.

Kaufman, J., Birmaher, B., Brent, D., Rao, U., Flynn, C., Moreci, P., . . . Ryan, N. (1997). Schedule for affective disorders and schizophrenia for school-age children - Present and lifetime version (K-SADS-PL): Initial reliability and validity data. Journal of the American Academy of Child and Adolescent Psychiatry, 36, 980-988.

Laeng, B., Sirois, S., & Gredeback, G. (2012). Pupillometry: A Window to the Preconscious? Perspect Psychol Sci, 7(1), 18-27

Lamberti, G., & Weidlich, S. (1999). DCS - A visual Learning and Memory Test for Neuropsychological Assessment (3rd ed.). Goettingen: Hogrefe.

Lozier, L. M., Cardinale, E. M., VanMeter, J. W., & Marsh, A. A. (2014). Mediation of the relationship between callous-unemotional traits and proactive aggression by amygdala response to fear among children with conduct problems. JAMA Psychiatry, 71(6), 627-636.

Lundqvist, D., Flykt, A., & Ohman, A. (1998). The Karolinska directed emotional faces (KDEF). CD ROM from Department of Clinical Neuroscience, Psychology section, Karolinska Institutet. ISBN 91-630-7164-9.

Macchi Cassia, V., Luo, L., Pisacane, A., Li, H., & Lee, K. (2014). How race and age experiences shape young children's face processing abilities. J Exp Child Psychol, 120, 87-101.

Marsh, A. A., Finger, E. C., Mitchell, D. G., Reid, M. E., Sims, C., Kosson, D. S., . . . Blair, R. J. (2008). Reduced amygdala response to fearful expressions in children and adolescents with callous-unemotional traits and disruptive behavior disorders. Am J Psychiatry, 165(6), 712-720.

Minear, M., & Park, D. C. (2004). A lifespan database of adult facial stimuli. Behav Res Methods Instrum Comput, 36(4), 630-633.

Mograss, M. A., Guillem, F., & Godbout, R. (2008). Event-related potentials differentiates the processes involved in the effects of sleep on recognition memory. Psychophysiology, 45(3), 420-434.

Mograss, M., Godbout, R., & Guillem, F. (2006). The ERP old-new effect: A useful indicator in studying the effects of sleep on memory retrieval processes. Sleep, 29(11), 1491-1500.

Nijmeijer, J. S., Minderaa, R. B., Buitelaar, J. K., Mulligan, A., Hartman, C. A., & Hoekstra, P. J. (2008). Attention-deficit/hyperactivity disorder and social dysfunctioning. Clinical Psychology Review, 28(4), 692-708.

Nomi, J. S., & Uddin, L. Q. (2015). Face processing in autism spectrum disorders: From brain regions to brain networks. Neuropsychologia, 71, 201-216.

Ohman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: a threat advantage with schematic stimuli. J Pers Soc Psychol, 80(3), 381-396.

Owens, J. A., Spirito, A., & McGuinn, M. (2000). The Children's Sleep Habits Questionnaire (CSHQ): psychometric properties of a survey instrument for school-aged children. Sleep, 23(8), 1043-1051.

Owens, J. A., Spirito, A., McGuinn, M., & Nobile, C. (2000). Sleep habits and sleep disturbance in elementary school-aged children. J Dev Behav Pediatr, 21(1), 27-36.

Owens, J., Sangal, R. B., Sutton, V. K., Bakken, R., Allen, A. J., & Kelsey, D. (2009). Subjective and objective measures of sleep in children with attention-deficit/hyperactivity disorder. Sleep Med, 10(4), 446-456.

Papesh, M. H., Goldinger, S. D., & Hout, M. C. (2012). Memory strength and specificity revealed by pupillometry. Int J Psychophysiol, 83(1), 56-64.

Payne, J. D., Stickgold, R., Swanberg, K., & Kensinger, E. A. (2008). Sleep preferentially enhances memory for emotional components of scenes. Psychological Science, 19(8), 781-788

Polanczyk, G. V., Salum, G. A., Sugaya, L. S., Caye, A., & Rohde, L. A. (2015). Annual research review: A meta-analysis of the worldwide prevalence of mental disorders in children and adolescents. J Child Psychol Psychiatry, 56(3), 345-365.

Prehn-Kristensen, A., Goder, R., Chirobeja, S., Bressmann, I., Ferstl, R., & Baving, L. (2009). Sleep in children enhances preferentially emotional declarative but not procedural memories. Journal of Experimental Child Psychology, 104(1), 132-139.

Prehn-Kristensen, A., Goder, R., Fischer, J., Wilhelm, I., Seeck-Hirschner, M., Aldenhoff, J., & Baving, L. (2011). Reduced sleep-associated consolidation of declarative memory in attention-deficit/hyperactivity disorder. Sleep Medicine, 12(7), 672-679.

Prehn-Kristensen, A., Molzow, I., Munz, M., Wilhelm, I., Müller, K., Freytag, D., . . . Baving, L. (2011). Sleep restores daytime deficits in procedural memory in children with attention-deficit/hyperactivity disorder. Research in Developmental Disabilities, 32(6), 2480-2488.

Prehn-Kristensen, A., Munz, M., Goder, R., Wilhelm, I., Korr, K., Vahl, W., . . . Baving, L. (2014). Transcranial Oscillatory Direct Current Stimulation During Sleep Improves Declarative Memory Consolidation in Children With Attention-deficit/hyperactivity Disorder to a Level Comparable to Healthy Controls. Brain Stimul, 7(6), 793-799.

Prehn-Kristensen, A., Munz, M., Molzow, I., Wilhelm, I., Wiesner, C. D., & Baving, L. (2013). Sleep promotes consolidation of emotional memory in healthy children but not in children with attention-deficit hyperactivity disorder. PLoS One, 8(5), e65098.

Proietti, V., Macchi Cassia, V., dell'Amore, F., Conte, S., & Bricolo, E. (2015). Visual scanning behavior is related to recognition performance for own- and other-age faces. Front Psychol, 6, 1684.

Rasch, B., & Born, J. (2013). About Sleep's Role in Memory. Physiol Rev, 93(2), 681-766.

Ronk, M. J., Hund, A. M., & Landau, S. (2011). Assessment of social competence of boys with attention-deficit/hyperactivity disorder: Problematic peer entry, host responses, and evaluations. Journal of Abnormal Child Psychology, 39(6), 829-840.

Schrammel, F., Pannasch, S., Graupner, S. T., Mojzisch, A., & Velichkovsky, B. M. (2009). Virtual friend or threat? The effects of facial expression and gaze interaction on psychophysiological responses and emotional experience. Psychophysiology, 46(5), 922-931.

Senju, A., & Johnson, M. H. (2009). The eye contact effect: mechanisms and development. Trends Cogn Sci, 13(3), 127-134.

Simion, F., & Giorgio, E. D. (2015). Face perception and processing in early infancy: inborn predispositions and developmental changes. Front Psychol, 6, 969.

Sterpenich, V., Albouy, G., Boly, M., Vandewalle, G., Darsaud, A., Balteau, E., . . . Maquet, P. (2007). Sleep-related hippocampo-cortical interplay during emotional memory recollection. PLoS Biology, 5(11), 2709-2722.

Stroes, A., Alberts, E., & Van Der Meere, J. J. (2003). Boys with ADHD in social interaction with a nonfamiliar adult: an observational study. J Am Acad Child Adolesc Psychiatry, 42(3), 295-302.

Susilo, T., & Duchaine, B. (2013). Advances in developmental prosopagnosia research. Curr Opin Neurobiol, 23(3), 423-429.

Tessier, S., Lambert, A., Scherzer, P., Jemel, B., & Godbout, R. (2015). REM sleep and emotional face memory in typically-developing children and children with autism. Biol Psychol, 110, 107-114.

Tottenham, N., Tanaka, J. W., Leon, A. C., McCarry, T., Nurse, M., Hare, T. A., . . . Nelson, C. (2009). The NimStim set of facial expressions: judgments from untrained research participants. Psychiatry Res, 168(3), 242-249.

Ventura, J., Wood, R. C., Jimenez, A. M., & Hellemann, G. S. (2013). Neurocognition and symptoms identify links between facial recognition and emotion processing in schizophrenia: meta-analytic findings. Schizophr Res, 151(1-3), 78-84

Verde, M. E., MacMillan, N. A., & Rotello, C. M. (2006). Measures of sensitivity based on a single hit rate and false alarm rate: the accuracy, precision, and robustness of d', Az, and A'. Percept Psychophys, 68(4), 643-654.

Wagner, U., Gais, S., & Born, J. (2001). Emotional Memory Formation Is Enhanced across Sleep Intervals with High Amounts of Rapid Eye Movement Sleep. Learning & Memory, 8, 112-119.

Wagner, U., Hallschmid, M., Rasch, B., & Born, J. (2006). Brief Sleep After Learning Keeps Emotional Memories Alive for Years. Biological Psychiatry, 60, 788-790.

Wagner, U., Hallschmid, M., Verleger, R., & Born, J. (2003). Signs of REM sleep dependent enhancement of implicit face memory: a repetition priming study. Biol Psychol, 62, 197-210.

Wagner, U., Kashyap, N., Diekelmann, S., & Born, J. (2007). The impact of post-learning sleep vs. wakefulness on recognition memory for faces with different facial expressions. Neurobiol Learn Mem, 87, 679-687.

Walker, M. P., & van der Helm, E. (2009). Overnight therapy? The role of sleep in emotional brain processing. Psychological Bulletin, 135(5), 731-748.

Waschbusch, D. A. (2002). A meta-analytic examination of comorbid hyperactive-impulsive-attention problems and conduct problems. Psychol Bull, 128(1), 118-150.

Weiß, R. H., & Osterland, J. (2013). Grundintelligenztest Skala 1 - Revision, CFT 1-R. Göttingen: Hogrefe.

White, S. F., Marsh, A. A., Fowler, K. A., Schechter, J. C., Adalio, C., Pope, K., . . . Blair, R. J. (2012). Reduced amygdala response in youths with disruptive behavior disorders and psychopathic traits: decreased emotional response versus increased top-down attention to nonemotional features. Am J Psychiatry, 169(7), 750-758.

Wiesner, C. D., Pulst, J., Krause, F., Elsner, M., Baving, L., Pedersen, A., . . . Göder, R. (2015). The effect of selective REM-sleep deprivation on the consolidation and affective evaluation of emotional memories. Neurobiol Learn Mem, S1074-7427(1015)00032-00035.

Wilhelm, I., Prehn-Kristensen, A. , & Born, J. (2012). Sleep-dependent memory consolidation - What can be learnt from children? Neuroscience and Biobehavioral Reviews, 36(7), 1718-1728.

Wilmer, J. B., Germine, L., Chabris, C. F., Chatterjee, G., Williams, M., Loken, E., . . . Duchaine, B. (2010). Human face recognition ability is specific and highly heritable. Proc Natl Acad Sci U S A, 107(11), 5238-5241.

Yardley, L., McDermott, L., Pisarski, S., Duchaine, B., & Nakayama, K. (2008). Psychosocial consequences of developmental prosopagnosia: a problem of recognition. J Psychosom Res, 65(5), 445-451.

Figure Legends

Figure 1. Design and paradigm. During encoding, participants were instructed to evaluate 80 target pictures according to their emotional expression and to determine the subject's gender; each picture was presented twice. In a subsequent baseline recognition session (old/new response), participants were shown 20 target pictures along with 20 new pictures. During the retrieval session (after a 12-h retention interval either including sleep or not), the remaining 60 targets and 60 foils were presented in a first block to measure recognition performance (old/new response) and in a second block to assess their emotional evaluation.

Figure 2. Results of recognition performance; left for all participants (N=32), right separated according to groups (n=16 each)

Figure 3. Pupil reactions during the retrieval session; upper panel: patients with ADHD+ODD, lower panel: healthy children; lines refer to the course of pupil diameter over the first 1.5s after picture onset (black dots: target "old" pictures, white dots: foil "new" pictures); middle bar charts (black bars: target "old" pictures, white bars: foil "new" pictures) refer to the means of pupil reaction averaged in the time window 0.375-1.5s after stimulus onset; lower bar charts refer to p-values of the single-mean comparisons between the pupil reactions evoked by old and new pictures in 0.075s intervals (bins).

Table 1. Characteristics of participants

ADHD Mean ± SEM Controls Mean ± SEM ADHD vs. Controls t/U p

Age 11.4 ± 0.38 11.1 ± 0.28 0.6 .547

IQ 104.7 ± 3.8 110.4 ± 2.8 1.2 .236

Pubertal state Parental ratings 3.5 ± 0.26 3.3 ± 0.11 0.5# .724

Children's ratings 3.6 ± 0.22 3.4 ± 0.16 0.5# .696

Figural memory DCS score 50.4 ± 6.5 72.3 ± 5.1 2.5# .014

Sleep problems SSR score 25.4 ± 1.2 20.9 ± 0.6 3.4 .002

CSHQ score 34.9 ± 1.5 38.6 ± 1.0 2.9 .006

CBCL (T scores) Anxious/Depressed 62.3 ± 2.3 52.7 ± 1.2 3.7 .001

Withdrawn/Depressed 61.1 ± 1.8 53.6 ± 1.2 3.5 .001

Somatic complaints 59.0 ± 2.1 54.9 ± 2.3 1.4 .187

Social problems 67.4 ± 2.3 52.2 ± 1.1 6.1 <.001

Thought Problems 56.6 ± 1.8 52.8 ± 1.6 1.6 .130

Attention problems 71.7 ± 1.5 53.6 ± 1.0 10.5 <.001

Rule-breaking behavior 68.1 ± 1.9 52.6 ± 1.0 6.8 <.001

Aggressive behavior 73.2 ± 1.6 52.1 ± 1.3 10.6 <.001

Internalizing 62.5 ± 2.1 50.4 ± 2.3 3.8 .001

Externalizing 71.6 ± 1.4 47.2 ± 2.4 8.9 <.001

Global 71.4 ± 1.2 48.1 ± 2.5 8.5 <.001

FEFA Anger 5.6 ± 0.40 5.7 ± 0.32 0.1# .922

Fear 2.1 ± 0.39 2.9 ± 0.46 1.3# .202

Happiness 8.3 ± 0.18 7.9 ± 0.38 0.6# .626

Neutral 6.3 ± 0.23 6.9 ± 0.07 8 .216

Sadness* 5.1 ± 0.35 6.7 ± 0.32 2.9# .003

Surprise* 3.8 ± 0.44 5.2 ± 0.33 2.5# .015

Disgust* 3.8 ± 0.36 4.2 ± 0.42 0.8# .446

Global 69.8 ± 2.91 79.3 ± 1.87 2.2# .026

Note: Bold values indicate a significant comparison; ADHD, attention-deficit hyperactivity disorder; CBCL, Child Behavior Checklist; FEFA, "Frankfurter Test und Training fazialen Affekts", a computer-based program for the

training and testing of facial affect recognition; *, emotional categories not used in the experiment; #, U-value according to the Mann-Whitney U test, otherwise t-values are reported.

Table 2: Picture recognition performance (d') during baseline and retrieval session

ADHD Controls

Emotional Condition Baseline M ± SEM Retrieval M ± SEM Retrieval-Baseline M ± SEM Baseline M ± SEM Retrieval M ± SEM Retrieval-Baseline M ± SEM

SLEEP Combined 1.05 ± 0.06 0.61 ± 0.03 -0.44 ± 0.04 1.06 ± 0.06 0.62 ± 0.03 -0.44 ± 0.06

Anger 1.08 ± 0.08 0.59 ± 0.04 -0.49 ± 0.09 0.92 ± 0.07 0.73 ± 0.04 -0.19 ± 0.07

Fear 0.87 ± 0.07 0.54 ± 0.03 -0.33 ± 0.08 1.05 ± 0.12 0.43 ± 0.04 -0.62 ± 0.14

Happiness 0.99 ± 0.08 0.78 ± 0.04 -0.22 ± 0.07 1.07 ± 0.11 0.79 ± 0.06 -0.28 ± 0.11

Neutral 1.25 ± 0.10 0.53 ± 0.03 -0.71 ± 0.09 1.20 ± 0.08 0.55 ± 0.03 -0.65 ± 0.10

WAKE Combined 0.90 ± 0.07 0.45 ± 0.02 -0.45 ± 0.06 1.21 ± 0.06 0.57 ± 0.05 -0.64 ± 0.05

Anger 1.21 ± 0.09 0.43 ± 0.03 -0.78 ± 0.08 1.38 ± 0.09 0.56 ± 0.05 -0.82 ± 0.10

Fear 0.84 ± 0.14 0.38 ± 0.02 -0.46 ± 0.15 0.91 ± 0.09 0.34 ± 0.04 -0.57 ± 0.11

Happiness 0.65 ± 0.05 0.52 ± 0.03 -0.13 ± 0.05 1.34 ± 0.13 0.77 ± 0.09 -0.57 ± 0.11

Neutral 0.91 ± 0.11 0.48 ± 0.04 -0.44 ± 0.10 1.20 ± 0.11 0.60 ± 0.05 -0.60 ± 0.11

Table 3. Results of sleep parameters

ADHD Mean ± SEM Controls Mean ± SEM ADHD vs. Controls t P

Time in bed 532.9 ± 8.3 545.5 ± 6.7 1.2 .246

Sleep onset latency (min) 21.6 ± 4.3 20.9 ± 2.0 0.3 .804

Total sleep time (min) 480.4 ± 10.6 488.2 ± 8.6 0.6 .571

Sleep efficiency (%) 90.1 ± 1.2 89.4 ± 0.7 0.5 .650

Wake after sleep onset 45.5 ± 6.8 50.4 ± 3.8 0.6 .531

REM latency 109.8 ± 9.5 106.8 ± 8.8 0.2 .816

Sleep stages, time in min

stage 1 sleep 38.3 ± 4.0 46.9 ± 3.2 1.7 .104

stage 2 sleep 241.3 ± 6.9 231.3 ± 10.1 0.8 .420

stage 3 sleep 39.5 ± 2.7 39.6 ± 1.8 0.1 .962

stage 4 sleep 71.7 ± 5.2 72.3 ± 5.0 0.8 .935

non-REM sleep 352.6 ± 8.9 343.3 ± 9.3 0.7 .479

REM sleep 89.9 ± 4.8 97.9 ± 4.2 1.3 .199

Total REM density (%) 15.0 ± 1.4 17.3 ± 1.4 1.2 .257

Note: SWS, slow-wave sleep; REM, rapid-eye movement sleep.