

The Effects of Integrating Mobile Devices with Teaching and Learning on Students' Learning Performance: A Meta-Analysis and Research Synthesis

Yao-Ting Sunga, Kuo-En Changb and Tzu-Chien Liuc*

a Department of Educational Psychology and Counseling, National Taiwan Normal University, sungtc@ntnu.edu.tw

b Grad. Institute of Information and Computer Education, National Taiwan Normal University, kchang@ntnu.edu.tw

c* Department of Educational Psychology and Counseling, National Taiwan Normal University, tzuchien@ntnu.edu.tw

Correspondence concerning this article should be addressed to Tzu-Chien Liu, Department of Educational Psychology and Counseling, National Taiwan Normal University

Address: 162, HePing East Road, Section 1, Taipei, Taiwan E-mail: tzuchien@ntnu.edu.tw Tel: +886-2-7734-3630

The Effects of Integrating Mobile Devices with Teaching and Learning on Students' Learning Performance: A Meta-Analysis and Research Synthesis

Abstract

Mobile devices such as laptops, personal digital assistants, and mobile phones have become a learning tool with great potential in both classrooms and outdoor learning. Although there have been qualitative analyses of the use of mobile devices in education, systematic quantitative analyses of the effects of mobile-integrated education are lacking. This study performed a meta-analysis and research synthesis of the effects of integrated mobile devices in teaching and learning, in which 110 experimental and quasiexperimental journal articles published during the period 1993-2013 were coded and analyzed. Overall, there was a moderate mean effect size of 0.523 for the application of mobile devices to education. The effect sizes of moderator variables were analyzed and the advantages and disadvantages of mobile learning in different levels of moderator variables were synthesized based on content analyses of individual studies. The results of this study and their implications for both research and practice are discussed.

Keywords: evaluation methodologies, pedagogical issues, teaching/learning strategies


1. Introduction

1.1. Integrating mobile devices with learning and instruction

Mobile computers have gradually been introduced into educational contexts over the past 2 decades. Mobile technology has led most people to carry their own small computers with exceptional computing power, such as laptops, personal digital assistants (PDAs), tablet personal computers (PCs), cell phones, and e-book readers. This computing power and portability, combined with wireless communication and context-sensitivity tools, makes one-to-one computing a learning tool of great potential in both traditional classrooms and outdoor informal learning.

With regard to access to computers, large-scale one-to-one computing programs have been implemented in many countries globally (Bebell & O'Dwyer, 2010; Fleischer, 2012; Zucker & Light, 2009), such that elementary- and middle-school students and their teachers have their own mobile devices. In addition, in terms of promoting innovation in education via information technology, not only does mobile computing support traditional lecture-style teaching, but through convenient information gathering and sharing it can also promote innovative teaching methods such as cooperative learning (Lan, Sung, & Chang, 2007; Roschelle et al., 2010), exploratory learning outside the classroom (Liu, Lin, Tsai, & Paas, 2012), and game-based learning (Klopfer, Sheldon, Perry, & Chen, 2012). Therefore, mobile technologies have great potential for facilitating more innovative educational methods. Simultaneously, these patterns in educational methods will likely not only help subject content learning, but may also

facilitate the development of communication, problem-solving, creativity, and other high-level skills among students (Warschauer, 2007).

However, despite the proposed advantages of using mobile computing devices for increasing computer accessibility, diversifying teaching styles, and improving academic performance, researchers have so far found mixed results regarding the effects of mobile devices (e.g., Warschauer, Zheng, Niiya, Cotten, & Farkas, 2014), and very few studies have addressed how best to use mobile devices or the effectiveness of doing so.

1.2. Review of the research into integrating mobile devices with teaching and learning

Seven studies have reviewed the research into integrating mobile devices with teaching and learning; they can be divided into two types according to the devices on which they focused: (1) those focused on how laptops are used in schools and (2) those focused on the applications of various types of mobile device in education (see Appendix A).

Regarding the review of laptop-based programs, Zucker and Light (2009) believed that programs integrating laptops into schools have a positive impact on student learning. However, they also believed that laptop use did not achieve the goals of increasing higher-level thinking and transforming classroom teaching methods. Penuel (2006) reviewed 30 studies that examined the usage of laptops with wireless connectivity in one-to-one computer programs. Those studies found that students most often used the laptops to do homework, take notes, and finish assignments. General-purpose software such as word processors, web browsers, and presentation tools was relatively common. Bebell and O'Dwyer (2010) examined four different empirical studies of laptop programs in schools. They discovered that in most schools participating in one-to-one programs there were significant increases in grade-point averages or scores on standardized achievement tests, relative to schools that did not provide such programs.

In addition, they found that most students used their laptops to write, browse the Internet, make presentations, do homework, or take tests. Furthermore, teachers made more changes to their teaching methods when they had increased opportunities to use laptops. Students participating in one-to-one programs also had a deeper engagement with what they were learning when compared to control groups.

Fleischer (2012) conducted a narrative research review of 18 different empirical studies on the usage of laptops. These studies found a large range in the number of hours that students used laptops, from a few days to as little as 1 hour per week. The most frequently used computer functions were searches, followed by expression and communication. In most studies it was found that students had a positive attitude toward laptops, and felt that they were more motivated and engaged in their learning, and it was further believed that teachers conducted more student-centered learning activities. Moreover, considerable differences in classroom educational practices arose from the diversity of teachers' beliefs about the usefulness of laptops. Fleischer (2012) also found several challenges regarding the use of laptops in classrooms, such as encouraging teachers to change their previous beliefs and teaching methods (e.g., teacher-centered lectures) in response to their students' greater flexibility and autonomy; how to reconcile the conflict between the students' desire for independent study and the need for teachers' guidance; and how to facilitate teachers' competence by designing an appropriate curriculum and teaching models for laptop usage programs.

With respect to the research on the use of mobile technology in education, Hwang and Tsai (2011) provided a broad discussion of studies on mobile and ubiquitous learning published in six journals between 2001 and 2010. In their review of 154 articles, they discovered that the use of mobile and ubiquitous learning accelerated markedly during 2008; researchers mostly

studied students of higher education, and the fields most often researched were language arts, engineering, and computer technology. Frohberg, Goth, and Schwabe (2009) categorized 102 mobile-learning projects, and discovered that most mobile-learning activities occurred across different settings, and took place within a physical context and an official environment, such as a classroom or workplace. Regarding the pedagogical roles that mobile devices play in education, most research has used mobile devices primarily as a sort of reinforcement tool to stimulate motivation and strengthen engagement, and secondarily as a content-delivery tool. Few projects have used mobile devices to assist with constructive thinking or reflection. Furthermore, most learning activities using mobile devices have been controlled by the teacher, with there being only a handful of learner-centered projects in existence. Concerning the communication functions, very few projects have made any use of cooperative or team communication. Moreover, the vast majority of studies have made use of novice participants; little research has involved experienced participants. When sorted according to educational goals, it was found that the vast majority of research has focused on lower-level knowledge and skills, and ignored higher-level tasks such as analysis and evaluation. Wong and Looi (2011) investigated the influence of mobile devices on seamless learning. Seamless learning refers to a learning model in which students can learn whenever they want to learn, in a variety of scenarios, and can switch from one scenario or context to another easily and quickly (Chan et al., 2006; Wong & Looi, 2011). Wong and Looi (2011) selected and analyzed a sample of 54 articles on the use of mobile devices to facilitate seamless learning, and found that all 54 articles contained 10 features, including formal and informal learning, personalized and social learning, and learning across multiple durations and locations.

1.3. Purposes of this study

With regard to analyzing the overall effectiveness of using mobile devices in education, the review research described above has two major limitations. First, all of the reviews adopted a qualitative approach, which may be able to describe and summarize how related studies were conducted and the problems encountered during their execution, but which makes it difficult to evaluate the effects actually produced by mobile devices in general or by specific moderator variables. Second, much of the previous review research has focused on the usage of laptop computers (e.g., Penuel, 2006), and most of the research participants in those reviewed articles were in primary and secondary schools. However, the many new developments in mobile hardware have meant that diverse age groups now use different devices. Therefore, many different moderators need to be accounted for when attempting to determine whether or not intervening variables have an effect.

In the context of this background, the primary goal of this study was to perform a meta-analysis and research synthesis of the research on the usage of mobile devices in education published in the last 2 decades. Specifically, the purposes of this study were as follows:

1. To provide an overview of the status of the use of mobile devices in educational experimental studies, including who is using them, which domain subjects are being taught, what kinds of mobile device and software are being used, where such programs take place, how the devices are used in teaching, and the duration of the interventions.

2. To quantify the overall effectiveness of integrating mobile technologies into education on student learning achievement.

3. To determine how the moderator variables influence the effects of mobile devices on learning achievement.

4. To synthesize the advantages and disadvantages of mobile learning at different levels of the moderator variables, based on content analyses of the articles related to those variables.

2. Method

2.1. Data sources and search strategy

Journal articles published during the period 1993-2013 were searched electronically and manually, and via reference-list checking, to retrieve the relevant literature. For electronic searches, the main databases were the Education Resources Information Center (ERIC) and the Social Sciences Citation Index database of the Institute for Scientific Information (ISI). Two sets of keywords were searched: (1) mobile-device-related keywords, including mobile, wireless, ubiquitous, wearable, portable, handheld, cell phone, personal digital assistant, PDA, palmtop, pad, web pad, tablet PC, tablet computer, laptop, e-book, digital pen, pocket dictionary, and classroom response system; and (2) learning-related keywords, including teaching, learning, training, and lectures. The two sets of keywords were combined when searching the electronic databases. Manual searches included the major journals in educational technology and e-learning, such as the Australian Journal of Educational Technology, British Journal of Educational Technology, Computers & Education, Computer Assisted Language Learning, Educational Technology Research and Development, Journal of Computer Assisted Learning, Language Learning & Technology, and ReCALL.
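To make the combination of the two keyword sets concrete, the following is a minimal, illustrative sketch (not the authors' actual search tooling; the exact query syntax accepted by ERIC and ISI differs in detail) of how device-related and learning-related terms can be joined into a single boolean query:

    # Illustrative only: building a boolean query from the two keyword sets above.
    device_terms = [
        "mobile", "wireless", "ubiquitous", "wearable", "portable", "handheld",
        "cell phone", "personal digital assistant", "PDA", "palmtop", "pad",
        "web pad", "tablet PC", "tablet computer", "laptop", "e-book",
        "digital pen", "pocket dictionary", "classroom response system",
    ]
    learning_terms = ["teaching", "learning", "training", "lectures"]

    def build_query(set_a, set_b):
        # Quote multi-word terms, then OR within each set and AND across the two sets.
        quote = lambda t: f'"{t}"' if " " in t else t
        return f"({' OR '.join(map(quote, set_a))}) AND ({' OR '.join(map(quote, set_b))})"

    print(build_query(device_terms, learning_terms))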

After collating all of the related literature, another round of searches was conducted using the reference lists found in the literature yielded by the electronic search to find any omitted but relevant works.

2.2. Search Results

Initial screening. The initial search yielded 4121 abstracts published between 1993 and 2013 (1718 in ERIC and 2403 in ISI) that were related to mobile learning. Two of the authors read each abstract and judged whether or not the article was related to teaching and learning with a mobile device; this resulted in the selection of 925 articles.

Screening for experimental and quasiexperimental research. In the second stage, the studies were screened according to the research method. Experimental studies (including the pretest-posttest equivalent-group, posttest-only equivalent-groups, and randomized matched subjects and posttest-only control-group designs) and quasi-experimental studies (including the pretest-posttest nonequivalent-groups and counterbalanced designs; see Ary, Jacobs, & Razavieh, 2002; Best & Kahn, 1998, for a reference) were included. Conceptual analysis or research reviews, case studies and qualitative research, survey research, and pre-experimental studies were all excluded at this stage. At the completion of this stage there remained 182 articles.

Application of inclusion and exclusion criteria for the meta-analysis. Studies were eligible for inclusion in the meta-analysis if they conformed with the following three criteria:

1. The application of mobile devices was the key variable of the study. The experimental group had an intervention that used mobile devices, and was compared with a control group that used traditional learning. If both the experimental and control groups used mobile-device interventions, and only the teaching methods were compared, then the study was excluded (e.g., Hsu, Hwang, Chang, & Chang, 2013; Jeong & Hong, 2013; Li, Chen, & Yang, 2013; Ryu & Parsons, 2012).

2. Sufficient information was presented to calculate effect sizes, such as means, standard deviations, t, F, or χ² values, and the number of participants in each group. Articles in which the sample sizes of the groups were not reported, which lacked inferential statistical results, or whose inferential statistics were inadequate for calculating an effect size according to Lipsey and Wilson (2000) were excluded (e.g., Gleaves, Walker, & Grey, 2007; Langman & Fies, 2010; Purrazzella & Mechling, 2013; Yang et al., 2013).

3. Experimental results were presented with learning achievement as a major dependent variable measured by standardized or researcher-constructed tests. Studies in which the results were related to affective variables (e.g., learning attitude or learning motivation) or to interaction between peers, but which did not include learning achievement, were excluded (e.g., Jian, Sandnes, Law, Huang, & Huang, 2009; Lan et al., 2007; Mouza, 2008; Siozos, Palaigeorgiou, Triantafyllakos, & Despotakis, 2009).

Application of these criteria yielded 110 articles that were acceptable for inclusion in the meta-analysis. For a complete list of these references, please see our online supplemental archive.

2.3. Selection and coding of the outcome variables

One of the most widely used frameworks for representing research content and dimensions is activity theory (AT), which uses the activity as the unit for analyzing human practices (Bakhurst, 2009). Recently, several researchers have used AT as a theoretical basis for analyzing mobile learning studies (e.g., Frohberg, Goth, & Schwabe, 2009; Sharples, Taylor, & Vavoula, 2007) or for designing mobile learning scenarios (e.g., Zurita & Nussbaum, 2007). This study used six major components of AT to select moderator variables and analyze mobile learning: (a) Subjects, which involve all the people who may be involved in learning curricula through mobile devices, such as students at different age levels or teachers with different levels of teaching expertise. (b) Objects (or objectives) of the mobile learning, which focus on goals such as acquiring cognitive skills or enhancing learning motivation through mobile devices. (c) Tools/instruments in the mobile learning, which may be artifacts (e.g., hardware and software) or learning resources (e.g., tutors). (d) Rules/control for the activity, which are the norms or regulations that circumscribe the mobile activities, such as teaching procedures designed for a designated learning pace or style. (e) Context of the activity, which refers to the physical (e.g., classroom or museum) or social (e.g., the ambience of learning in a group) environments in which mobile learning is conducted. (f) Communication/interaction, which refers to the method of interaction between users and mobile technologies (such as the process of teachers' adaptation to mobile devices) or the communication styles among learners.

Research name. This refers to the first author's name, the year of publication, and the article title.

Research participants. In this review, the research participants in the reviewed articles corresponded to the "subjects" component of the AT framework and were coded by their learning stage, including kindergarten, elementary school, middle school, (senior) high school, university, graduate school, teachers, adults, and mixed.

Treatments. The treatments of the reviewed articles corresponded to the "tools" component (e.g., the hardware and software), the "rules/control" component (e.g., the teaching methods and domain subjects), and the "context" component (e.g., intervention settings and intervention duration). The descriptions of these treatment variables are as follows:

1. Hardware: Different types of mobile hardware, which comprised PDAs, laptops, tablet PCs, cell phones, iPods, MP3 players, e-book readers, pads, digital pens, pocket dictionaries, and classroom response systems (CRSs), or any mixture thereof.

2. Software: Different types of software, which encompassed general-purpose software and learning-oriented software (Sung & Lesgold, 2007), the former referring to

commercial software currently in circulation that was not designed especially for teaching and learning (e.g., word processors or spreadsheets), and the latter having been designed specifically for educational programs or goals.

3. Teaching method: Different teaching methods, including lectures, cooperative learning (students were divided into groups and completed learning tasks collaboratively, e.g., Chang, Lan, Chang, & Sung, 2010; Huang, Liang, Su, & Chen, 2012), inquiry-oriented learning (using problem-, project-, or inquiry-based methods with mobile devices for learning, e.g., Chen, 2010; Lowther, Ross, & Marrison, 2003), self-directed study (teachers/researchers did not designate or implement specific teaching scenarios for students to follow; students used mobile devices for self-paced learning, e.g., Chen & Li, 2010; Chen, Tan, & Lo, 2013), computer-assisted testing/assessment (using mobile devices for formative assessment or quizzes in classrooms or outdoors, e.g., Agbatogun, 2012), and mixed methods thereof.

4. Domain subject: Domain subjects were analyzed to establish the relative effectiveness of mobile devices for teaching different subjects, including language arts, social studies, science, mathematics, multidisciplinary (if the mobile devices were used in several subjects, but measurement of the achievement was presented as a whole instead of separately, this was coded as multidisciplinary), specific abilities (e.g., spatial ability or creativity), health-care programs, education, psychology, and computer and information technology.

5. Implementation setting: Implementation settings were included to establish whether the impact of mobile devices on learning differed according to the environment in which they were used, which included classrooms, outdoors (e.g., zoo or campus

gardens), museum, laboratory, workplaces, and unrestricted settings (devices may be used anywhere).

6. Intervention duration: Different periods of time for the intervention, including periods of no more than four hours (< 4 hours), between four and 24 hours (> 4 and < 24 hours), between one and seven days (> 1 day and < 7 days), between one and four weeks (> 1 week and < 4 weeks), between one and six months (> 1 month and < 6 months), and more than six months (> 6 months).

Dependent variables. The dependent variables corresponded to the "objects (or objectives)" component of the AT model and comprised two categories: learning achievement variables, which refer to measurements of cognitive outcomes such as knowledge application, retention, and problem solving; and affective variables, which refer to measurements of motivation, interest, participation, and so on.

2.4. Data Analysis

Calculating the effect size. The following meta-analysis steps recommended by Borenstein, Hedges, Higgins, and Rothstein (2009) were employed in this study: (a) determine the effect sizes of each article, (b) determine the weighted mean effect size across articles, (c) calculate the confidence interval for the average effect size, and (d) determine whether the effect size of any particular group was influenced by a moderator variable based on a heterogeneity analysis (QB).
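As an illustration of steps (b)-(d), the following is a minimal sketch (not the authors' code, which used Comprehensive Meta Analysis) of inverse-variance pooling under a random-effects model, assuming per-study effect sizes g and their variances v have already been computed; the between-study variance is estimated with the common DerSimonian-Laird (method-of-moments) estimator:

    import numpy as np

    def random_effects_pool(g, v):
        """Pool per-study effect sizes g (with variances v) under a random-effects model."""
        g, v = np.asarray(g, dtype=float), np.asarray(v, dtype=float)
        w = 1.0 / v                                   # fixed-effect (inverse-variance) weights
        g_fixed = np.sum(w * g) / np.sum(w)
        Q = np.sum(w * (g - g_fixed) ** 2)            # heterogeneity statistic
        df = len(g) - 1
        C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - df) / C)                 # between-study variance estimate
        w_star = 1.0 / (v + tau2)                     # random-effects weights
        g_pooled = np.sum(w_star * g) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        ci95 = (g_pooled - 1.96 * se, g_pooled + 1.96 * se)
        return g_pooled, ci95, Q, tau2

    # Hypothetical example with three studies (effect sizes and variances are illustrative).
    print(random_effects_pool([0.40, 0.65, 0.30], [0.04, 0.06, 0.05]))

The Q value returned here corresponds to the heterogeneity statistic reported in the Results section.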

Two formulae were used to calculate the effect sizes of the studies. Cohen's d formula was used to determine the effect size for the experimental research with random assignment and without a pretest:

$$d = \frac{\bar{X}_1 - \bar{X}_2}{\sqrt{\dfrac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}} \qquad (1)$$

where $\bar{X}_1$ and $\bar{X}_2$ represent the mean scores, $n_1$ and $n_2$ represent the sample sizes, and $s_1^2$ and $s_2^2$ represent the variances of the experimental and control groups, respectively.
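For concreteness, a minimal sketch of Equation (1) with hypothetical group statistics (the numbers below are illustrative only, not drawn from any reviewed study):

    import math

    def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
        """Cohen's d with a pooled standard deviation (posttest-only two-group designs)."""
        pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
        return (mean1 - mean2) / math.sqrt(pooled_var)

    # Hypothetical example: experimental vs. control group posttest scores.
    print(round(cohens_d(78.4, 10.2, 30, 72.1, 11.0, 32), 3))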

For experimental or quasiexperimental research with pretests, it was proposed that the pretest should be taken into consideration instead of using the posttest in order to mitigate possible selection bias (Furtak, Seidel, Iverson, & Briggs, 2012; Morris, 2008). Hence, the formula developed in Comprehensive Meta Analysis (version 2.0) was used to obtain effect sizes for research with pre- and posttests:

$$ES_{\text{pre/post, two groups}} = \frac{(\bar{X}_{1\_Post} - \bar{X}_{1\_Pre}) - (\bar{X}_{2\_Post} - \bar{X}_{2\_Pre})}{SD_{Post}} \qquad (2)$$

where $\bar{X}_{1\_Pre}$ and $\bar{X}_{1\_Post}$ represent the mean scores of the experimental group for the pretest and posttest, respectively, and $\bar{X}_{2\_Pre}$ and $\bar{X}_{2\_Post}$ represent the mean scores of the control group for the pretest and posttest, respectively. $SD_{Post}$ can be calculated as follows:

$$SD_{Post} = \sqrt{\frac{(n_{1\_Post} - 1)s_{1\_Post}^2 + (n_{2\_Post} - 1)s_{2\_Post}^2}{n_{1\_Post} + n_{2\_Post} - 2}} \qquad (3)$$

where $n_{1\_Post}$ and $n_{2\_Post}$ represent the sample sizes of the experimental and control groups, respectively, for the posttest, while $s_{1\_Post}^2$ and $s_{2\_Post}^2$ represent the variances of the experimental and control groups, respectively.

The two types of effect sizes were calibrated using the sample weights to calculate a Hedges' g according to

$$g = \left(1 - \frac{3}{4\,df - 1}\right) d \qquad (4)$$

where $df$ denotes the degrees of freedom used to estimate the pooled standard deviation.
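A minimal sketch of Equations (2)-(4) with hypothetical group statistics (this is not the Comprehensive Meta Analysis implementation, and the numbers are illustrative only):

    import math

    def pre_post_effect_size(m1_pre, m1_post, m2_pre, m2_post,
                             sd1_post, n1, sd2_post, n2):
        """Effect size for two-group pre/post designs, standardized by SD_Post (Eqs. 2-3)."""
        sd_post = math.sqrt(((n1 - 1) * sd1_post**2 + (n2 - 1) * sd2_post**2)
                            / (n1 + n2 - 2))
        return ((m1_post - m1_pre) - (m2_post - m2_pre)) / sd_post

    def hedges_g(d, df):
        """Small-sample correction converting d to Hedges' g (Eq. 4)."""
        return (1 - 3 / (4 * df - 1)) * d

    # Hypothetical example: n1 = 28 experimental and n2 = 30 control participants.
    d = pre_post_effect_size(55.0, 71.5, 54.2, 63.0, 12.0, 28, 12.5, 30)
    print(round(hedges_g(d, 28 + 30 - 2), 3))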

Evaluating publication bias. The fail-safe N (i.e., the classic fail-safe N) of Rosenthal (1979) was used to estimate how many nonsignificant effect sizes (unpublished data) would be necessary to reduce the overall effect size to a nonsignificant level. The comparison criterion was 5n + 10, where n is the number of studies included in the meta-analysis. If the fail-safe N is larger than 5n + 10, the estimated effect size of unpublished research is unlikely to influence the effect size of the meta-analysis. Moreover, the present study also adopted Orwin's fail-safe N (Orwin, 1983) to estimate the number of missing null studies that would be required to bring the mean effect size to a trivial level.

3. Results and Discussion

3.1. Descriptive statistics information

Table 1 presents the distribution of moderator variables and their corresponding effect sizes (g). In total there were 110 articles, 419 effect sizes, and 18749 participants. The largest proportion of studies involved the college-student-level learning stage (38.4%); the next largest group was elementary-school students (33.9%). More studies used learning-oriented software (62.7%) than general-purpose software (34.5%). Handheld devices (including PDA, cell phone, iPod, MP3 player, digital pen, pocket dictionary, and classroom response system) were the most widely studied of the hardware (72.7%), followed by laptops (21.8%, including laptop, pad, tablet PC, and e-book reader). The largest proportion of studies were set in the classroom (50.0%), followed by outdoors (15.5%) and unrestricted settings (16.4%). For teaching methods, self-directed study (30.9%) was the most frequently researched, and the most frequently studied intervention duration was > 1 month and < 6 months (32.7%), followed by > 1 week and < 4 weeks (25.5%) and < 4 hours (20.9%). Finally, language arts was the most often studied domain subject (34.7%), followed by science (22.9%).

In addition, among the moderating variables, the hardware used, the implementation setting, and the domain subject may have seen the greatest changes during 1993-2013. The trends in these moderating variables over the two decades are shown in Figures 1 to 3. Figure 1 shows the evolution of the use of different mobile devices: compared with the laptop and mixed categories, handheld devices (e.g., cell phones) were used more in the 2009-2013 period, showing a clearly rising trend. Figure 2 shows the evolution of the implementation settings: compared with informal settings (e.g., museums and outdoors) and unrestricted settings, formal settings (e.g., classrooms and laboratories) were used more from the 2004-2008 period onward, showing a clearly rising trend. Finally, Figure 3 shows the evolution of the domain subjects: compared with the other domain subjects, language arts was studied more in the 2009-2013 period, showing a clearly rising trend.

3.2. Overall effect size for learning achievement

The distribution of the effect sizes of the 110 articles is shown in Figure 4. The forest plot of effect sizes and the 95% confidence intervals of the 110 articles are shown in Appendix B. There were two unusually large effect sizes, g = 4.045 (Hsu & Lee, 2011) and g = 3.050 (Wu, Sung, Huang, Yang, & Yang, 2011), which were more than three standard deviations above the average effect size for the entire collection of 110 articles (g = 0.628), and so these were not included in further analyses (Lipsey & Wilson, 2000). Using the procedure of Lipsey and Wilson (2000) with a random-effects model to integrate the effect sizes of the remaining 108 articles, there was an overall moderate mean effect size of 0.523, with a 95% confidence interval of 0.432-0.613. Researchers (e.g., McMillan, Venable, & Varier, 2013; van der Kleij, Feskens, & Eggen, 2015) have proposed that Hattie's (2009) criterion is appropriate for evaluating effect sizes in educational contexts. Therefore, we adopted Hattie's (2009) criterion to interpret the effect size

of our research, in which an effect size of > 0.60 is high, around 0.40 is medium, around 0.20 is low, and < 0.20 has little practical meaning. In this study it was found that using mobile devices in education had a medium effect size for learning achievement; in other words, the average learner using a mobile device performed better on measures of cognitive achievement than 69.95% of learners not using one.
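The 69.95% figure corresponds to evaluating the standard normal cumulative distribution function at the mean effect size (the Cohen's U3 interpretation), assuming approximately normal score distributions; a minimal check:

    import math

    def u3(effect_size):
        """Proportion of the comparison distribution lying below the treatment mean."""
        return 0.5 * (1 + math.erf(effect_size / math.sqrt(2)))

    print(round(u3(0.523) * 100, 2))  # prints 69.95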

The Q statistics show that the effect sizes in the meta-analysis were heterogeneous (Qtotal = 626.302, z = 11.315, p < .001), which indicates that there are differences among the effect sizes resulting from factors other than subject-level sampling error, such as the diversity of the learning stage, the hardware used, and the teaching methods.

Furthermore, we also conducted an analysis for the studies related to the affective variables (such as motivation, engagement, attitude, satisfaction, preference). The overall mean effect size of the 22 articles was 0.433 (z = 6.148, p = 0.001), with a 95% confidence interval of 0.295-0.570. According to Hattie's criterion, there is a medium effect size for affective variables when using mobile devices in educational context.

The overall mean effect size for learning achievement in this meta-analysis was 0.523, meaning that learning with mobile devices is significantly more effective than traditional teaching methods that use only pen and paper or desktop computers. Compared with past comparisons between using and not using computers in education, the effect size for mobile devices reported herein seems larger than those found in meta-analyses of desktop-computer-based instruction, such as the studies of Kulik and Kulik (1991) and Tamim, Bernard, Borokhovski, Abrami, and Schmid (2011), who found mean effect sizes for computer-based instruction of 0.30 and 0.35, respectively. One of the reasons for the different effect sizes may be differences in the features of desktops and mobile devices; however, there are alternative

explanations, including differences in the meta-analysis methodology, the dependent-variable measurements, or the software employed. Whether computer-based instruction can enhance students' learning motivation has remained equivocal (e.g., Jabbar & Felicia, 2015; Wouters, van Nimwegen, van der Spek, & van Oostendorp, 2013). Our study found that mobile learning was able to facilitate students' affective learning outcomes, which provides more convergent evidence for the effects of using computers in learning and teaching. Possible reasons include that mobile learning integrated more diverse types of teaching/learning strategies and involved a greater variety of learning scenarios and situations (see the next section for more details). However, because many of the articles included in our study used teaching programs that lasted for only short durations (see the next section), the novelty effect of the technology should be taken into consideration.

3.3. Effect sizes of learning achievement for moderator variables

To learn more about how moderating variables influence the effects of integrating mobile devices with teaching and learning, this study analyzed the effects on learning achievement for each moderator variable. Because there were only 22 studies involving affective dependent variables for which effect sizes could be calculated, which is not comprehensive enough to cover the different levels of the moderating variables, the moderator analyses did not include the affective effects.

As indicated in Table 1, some levels of the moderator variables included small samples, and so a few of the levels were merged within some moderator variables. For the learning stage, kindergarten and elementary school were combined into a "young-children" category; middle schools and high schools were combined into "secondary-schoolers;" and college and graduate students, teachers, and working adults were combined into "adult users." With respect to the hardware, laptops, tablet PCs, and e-book readers were combined into a "laptops" category, while PDAs, iPods, MP3 players, cell phones, digital pens, dictionaries, and classroom response

systems were bundled together to form one "handheld" category. In terms of function, the digital pen differs from other handheld devices such as the iPod, PDA, and smartphone, and there was only one study on digital pens, so it was excluded from our moderator analysis. In terms of the settings, classrooms, laboratories, and workplaces were combined into "formal learning environments," while museums and outdoors were combined into "informal learning environments" (Dierking, Falk, Rennie, Anderson, & Ellenbogen, 2003). Intervention durations were also combined, with < 4 hours, > 4 and < 24 hours, and > 1 day and < 7 days becoming "< 1 week." For domain subjects, specific abilities and multidisciplinary were combined into "domain-general subjects." In addition, health-care programs, education, psychology, and computer and information technology were combined into "professional subjects." For teaching methods, discovery and exploration, problem-solving, and project-based learning were combined into "inquiry-oriented learning." Moreover, the learning methods of self-directed study and podcasting were combined into "self-directed study." Table 2 lists the effect sizes for the moderator variables.
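A minimal, illustrative sketch of this kind of level merging (the labels below are simplified stand-ins for the coding scheme described above, not the authors' actual coding sheet):

    # Illustrative recoding of fine-grained moderator levels into merged categories.
    LEARNING_STAGE_MERGE = {
        "kindergarten": "young children", "elementary school": "young children",
        "middle school": "secondary schoolers", "high school": "secondary schoolers",
        "college": "adult users", "graduate school": "adult users",
        "teachers": "adult users", "working adults": "adult users",
        "mixed": "mixed",
    }

    def merge_level(stage: str) -> str:
        return LEARNING_STAGE_MERGE.get(stage.lower(), "other")

    print(merge_level("Elementary school"))  # young children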

Learning stage. Table 2 indicates that young children had a high effect size for learning achievement (g = 0.636, z = 8.000, p < .001), while adults (g = 0.552, z = 7.360, p < .001) and secondary schoolers (g = 0.451, z = 4.274, p < .001) had medium effect sizes. However, the mixed category (g = 0.086, z = 0.503, p = .615) did not show a significant effect size. The QB achieved significance (QB = 9.226, p = .026), meaning that the mean effect size differed significantly among the categories.

The results indicated that mobile-assisted learning/instruction was not effective for groups with mixed-age students. A possible reason is that it is difficult to design appropriate teaching methods or materials for students with different needs and competences in the same group.

Hardware used. Table 2 gives the effect sizes for the usage of different types of hardware in mobile learning. While ignoring the "not mentioned" category, handheld devices (g = 0.591, z = 10.992, p < .001) were associated with a medium effect size, while laptops (g = 0.309, z = 3.350, p = .001) were associated with a low effect size. The QB was significant (QB = 18.426, p < .001), indicating that the effect sizes differed significantly among the various categories. The R2 was 7%, meaning that 7% of total between-study variance in effects can be explained by hardware used.

The positive learning outcomes of implementing handhelds could be attributed to their features. For example, to make use of the portability and communication functionality of cell phones, the short message service was used to help teach foreign-language vocabulary (e.g., Başoğlu & Akdemir, 2010; Lu, 2008; Saran, Seferoglu, & Cagiltay, 2012), and because the messages were short, students could efficiently use their short periods of spare time to take small "bites" out of the material. Another example is the use of cell phones to communicate, make records, and give and receive feedback. These functions can remind students about their learning schedule, and promote self-awareness (Liu, Tao, & Nee, 2008; Runyan et al., 2013) and self-regulation (Kondo et al., 2012). The aforementioned advantages of handhelds created an environment for seamless learning, which should promote better learning outcomes.

According to the analysis results, the implementation of handhelds produced better learning outcomes than the implementation of laptops. This is perhaps because studies with handhelds tended to integrate innovative teaching methods (Lu, 2012): among the handheld studies, 31.6% employed teaching methods such as inquiry-oriented and cooperative learning (Table C1 of Appendix C). In contrast, in a large portion of the laptop-related studies (50.0%; Table C1 of Appendix C), the computers were simply placed in the classroom and used for lectures, self-directed study, or with no specific teaching methods.

It is important to note here that most of the research on handhelds in education has involved only short-term interventions, with 29.1% (Table C2 of Appendix C) testing their effectiveness within 1 week. These users of handhelds also probably experienced a transient effect because of their novelty (Kulik & Kulik, 1991). In contrast, most of the research on laptops involved long-term use, with 25.0% (Table C2 of Appendix C) being used for > 6 months. Long-term laptop use without appropriate supporting logistics may reduce both the students' level of commitment and the teachers' willingness to use computers to integrate their teaching with the students' learning (Drayton, Falk, Stroud, Hobbs, & Hammerman, 2010; Inan & Lowther, 2010; Penuel, 2006).

Software used. The data given in Table 2 indicate that the effect size for learning-oriented software (g = 0.590, z = 9.699, p < .001) approached the high range, while general-purpose software (g = 0.429, z = 5.407, p < .001) had a medium effect size. The QB did not achieve significance at the p < .05 level (QB = 3.025, p = .220), which means that the average effect size did not differ significantly between the two categories.

According to the survey results, after 1990 most of the software that the teachers used was actually made for general purposes (e.g., word processors, spreadsheets, and web browsers) (Becker, 1991, 2001; Drayton et al., 2010), instead of learning-oriented software tailored for teaching and learning tasks. This made it difficult for most teachers to achieve the goal of greater efficiency and effectiveness in education using the technology-adapted instruction that they applied (Sung & Lesgold, 2007; Weston & Bain, 2010). The present study indicates that the aforementioned shortage of learning-oriented software has improved, with software specifically

designed for teaching and learning goals or activities being used in 62.7% of the research, and only 34.5% of the studies using general-purpose software.

Even though there was no significant difference between learning-oriented software and general-purpose software in our research, learning-oriented software showed interesting features in mobile-based learning. First, the software and the curriculum were closely integrated. As an example, Looi and colleagues (Looi et al., 2011; Zhang et al., 2010) combined educational software with cell phones to make a mobilized curriculum for elementary-level natural science, which was able to implement seamless learning in classrooms, outdoors, and in the home. Their designs were not only based on the pedagogy of inquiry learning, but also promoted formative assessment, cooperative learning, and social interaction in teaching tasks. The second feature of learning-oriented software is that it provides diverse educational activities. Within the studies included in this research, those in which learning-oriented software was used implemented various educational methods, most of which were related to inquiry, cooperation, game-based learning, problem-solving, and formative assessment. On the other hand, for those studies using general-purpose software, lectures and self-directed study were implemented. Moreover, among the 37 studies with general-purpose software, 6 did not mention the teaching methods (Table C1 of Appendix C). The third feature of learning-oriented software is its ability to enable elaborate and efficient designs for teaching strategies and learning scenarios. The steps and procedures of the aforementioned teaching strategies, such as inquiry, cooperation, game-based learning, and problem-solving, were all fairly complex. Learning-oriented software allowed teachers with no programming skills to flexibly and efficiently implement mobile-assisted education. For example, Lan et al. (2007; Lan, Sung, & Chang, 2009) designed an English foreign-language learning model based on cooperative learning and reciprocal teaching.

Procedures related to reciprocal teaching, such as reading text, questioning and probing, answering and feedback, were all designed for specific modules that could be further arranged according to the needs of different teaching situations. Teachers could substitute their own material, or even completely customize their program. In addition, the research of Roschelle et al. (2010) on cooperative learning set out three stages of design and implementation for modules, modules for experiments and classroom tryouts, and modules for classroom implementation. After 2 years of designs, tryouts, and revisions, their PDA-based cooperative learning modules were able to integrate the mathematics content, cooperative learning procedures, and teacher-training programs for efficient use in the classroom.

Implementation settings. As indicated in Table 2, when the "not-mentioned" category is ignored, informal settings had a high effect size (g = 0.768, z = 7.096, p < .001), while unrestricted settings (g = 0.550, z = 5.887, p < .001) and formal settings (g = 0.430, z = 7.328, p < .001) had medium effect sizes. The effect size for informal settings was larger than that for formal settings, as the 95% confidence intervals of the two effect sizes did not overlap. The QB was significant (QB = 7.993, p = .046), showing that the average effect size differed significantly among the categories. The R2 was 8%, meaning that 8% of the total between-study variance in effects can be explained by the implementation setting.

As found in the present study, the effect size was larger for using mobile devices in the outdoors and informal locations than for using them in more formal places. Some observations on the use of mobile devices in informal places may be helpful for explaining this phenomenon. First, this could be due to the motivation induced by the novelty of the technology and activities. Students are keen to go outside or to museums to learn, and combining this with the use of novel learning tools can facilitate learners' motivation (e.g., Zhang, Sung, Hou, Chang, 2014). The

second is that most of the informal educational models, software functionality, and hardware characteristics were closely integrated in the included research, and this probably improved the learning effects. In the present study, 77.9% of informal learning-oriented software was specially designed for specific learning scenarios in specific settings (Table C3 of Appendix C). These more elaborately designed teaching procedures allow educational effects to become more apparent. For instance, when learning in museums, one of the important issues is how to guide learners' attention to exhibitions through an appropriate learning process, and informative and interesting activities to promote interaction among visitors, computers, and the historical contexts (e.g., Hsi, 2003; Sung, Hou, Liu, & Chang, 2010). Several of the studies included in our research combined the models of role-playing games and problem-solving to immerse learners in the historical events, engaging them to observe and learn target exhibits more deeply (e.g., Huizenga, Admiraal, Akkerman, & ten Dam, 2009; Sung, Chang, Hou, & Chen, 2010). Similarly, researchers are also concerned with how to make the fieldwork involved in the natural and social sciences structuralized, focused, and efficient, rather than loose, absent-minded, and ineffective. In several studies (e.g., Hwang, Chu, Lin, & Tsai, 2011; Liu, Tan, & Chu, 2009), the researchers tried to make observations, note-taking, problem-solving, information exchanges, and discussion more structured, and to sharply focus the students' learning process by integrating mobile devices with other peripheral devices such as camcorders, positioning functions, and measuring facilities.

Teaching methods. The data regarding the effect sizes for different teaching methods are given in Table 2. Three high effect sizes were found for inquiry-oriented learning (g = 0.844, z = 8.400, p < .001), mixed methods (g = 0.839, z = 5.702, p < .001), and computer-assisted testing (g = 0.656, z = 3.661, p < .001). Lectures (g = 0.394, z = 3.120, p = .002) and self-directed study (g = 0.440, z = 5.492, p < .001) had approximately medium effect sizes. However, cooperative learning (g = 0.261, z = 1.673, p = .094) and game-based learning (g = 0.407, z = 1.922, p = .055) did not show significant effect sizes. The QB achieved statistical significance (QB = 26.744, p < .001), indicating that the average effect sizes differed significantly among the various categories. The R2 was 12%, meaning that 12% of the total between-study variance in effects can be explained by the teaching method.

The unique features of mobile devices can enhance the essential functionalities of certain specific teaching methods, and thus promote educational outcomes. Because each student has his own mobile device, this "individuality" combined with wireless communication enabled more accessible self-paced and self-directed study. Combining the features of individuality and instant message delivery resolves the past difficulties of putting instant formative assessment into the classroom (e.g., Chen & Chen, 2009), such that these assessments can even be performed outdoors with equal ease (e.g., Shih, Kuo, & Liu, 2012). Another feature that empowers the teaching and learning process is the portability and context awareness of mobile devices. These two features allow learners to exploit the information in the environments in which they are situated, and to retrieve, record, and react to the data needed to resolve their learning issues by traversing multiple learning environments, such as fieldwork and museums (e.g., Tan, Liu, & Chang, 2007).

It is noteworthy that although researchers (Kukulska-Hulme & Shield, 2008; Roschelle & Pea, 2002) have proposed that conveying information and giving feedback via mobile devices can help to keep learners in touch with their peers, promote discussions, and facilitate the effects of cooperative learning, our study found that in general these features did not help enhance cooperative learning outcomes. The researchers of cooperative learning used mobile

devices' features of individuality and sharing coupled with mechanisms for enhancing social interaction, such as co-constructing concept maps (Lai & Wu, 2006), peer evaluation (Lan, Sung, & Chang, 2007; Roschelle et al., 2010), and building consensus (Zurita & Nussbaum, 2004). Interestingly, although these methods may have facilitated positive interactive relationships among team members (e.g., Lan et al., 2007; Zurita & Nussbaum, 2004), they did not enhance learning outcomes compared with the cooperative scenarios that did not use mobile devices. There are at least two possible reasons for these results. First, the cooperative learning tasks in those studies, when coupled with mobile devices, may be helpful for increasing interactive behaviors and social cohesion among team members. However, the increased social cohesion may not be powerful enough to enhance learning achievement. As Slavin (2012) proposed, whether higher social cohesion is related to higher learning achievement is not conclusive. The methods used in the above-noted research may be insufficient to empower the cognitive elaboration processes imperative for enhancing students' learning. In those studies, students in both the control and treatment groups received cooperative treatments: the only difference was mobile-device usage. Thus, the inherent effects of mobile devices may not go much beyond sharing, communicating, and consensus building. Therefore, elaborate designs of learning scenarios, such as mechanisms for prompting questioning and explanatory strategies (Byun, Lee, & Cerreto, 2014; Gillies & Haynes, 2011) specifically related to the learning content, may need to be incorporated into mobile-device-based activities in order to enhance students' cognitive elaboration processes and outcomes. The second possible reason is that the intervention durations of the mobile-based cooperative learning programs were not long enough to produce positive effects. Researchers have proposed that several weeks of duration is helpful for producing positive learning outcomes in cooperative

learning (Slavin, 1993), as sufficient time is important for learners to become familiar with team members, tasks, and the required procedures (Slavin, 1977). Time for familiarization may be even more important for mobile-device-based cooperative learning because learners need time to become familiar not only with members, tasks, and procedures, but also with the hardware and software. Most of the research included in our study lasted for less than one month, which may be too short for the programs to produce solid effects.

Another noteworthy finding is that game-based learning did not achieve a significant overall effect in mobile learning, either. The major reason may be that most of the studies (e.g., Ketamo, 2003; Kim et al., 2011; Riconscente, 2013) focused on using the mobile devices to provide learners with a handy and individualized game-based environment to enhance their motivation and engagement. However, the concepts to be learned and the content of the game may not have been closely integrated, and therefore the effects on learning might not have materialized.

Researchers have pointed out that computer interventions in education have not yet led to practical implementations of innovative educational methods (Ertmer & Ottenbreit-Leftwich, 2010; Gerard, Varma, Corliss, & Linn, 2011). By contrast, it was found in the present study that mobile devices seemed to elicit much more diverse and innovative educational methods from researchers.

Intervention duration. When the "not-mentioned" category is ignored, interventions of > 1 month and < 6 months duration (g = 0.566, z = 6.870, p < .001), those of > 1 week and < 4 weeks duration (g = 0.552, z = 5.644, p < .001), and those of < 1 week (g = 0.479, z = 5.175, p < .001) had medium effect sizes. Interestingly, interventions conducted for durations of > 6 months had a non-significant effect size (g = 0.287, z = 1.942, p = .052). The QB was not significant (QB = 4.924, p = .295), which suggests that the effect size did not differ significantly between these categories.

The non-significance of the effect size for long-term durations (> 6 months) is counterintuitive, but consistent with the findings of Kulik and Kulik (1991), who found that computer-based instruction had a greater effect when the duration was shorter. Kulik and Kulik (1991) and Cheung and Slavin (2013) proposed three reasons why short-term treatments have better effects: high novelty value, stronger interventional supports, and different measurement tools for the dependent variables. These explanations are also applicable to the present findings. In most studies with intervention durations of less than 6 months, the use of mobile devices and the applied teaching methods were both novel, so the students were more easily engaged in the activity. Cross-analysis of intervention duration with other moderator variables provides data that support these arguments. For example, most of the research lasting more than 6 months used general-purpose software (66.7%; Table C2 of Appendix C), which did not necessarily match the needs of the learning scenarios for specific learning topics. Furthermore, around half of the studies (44.4%; Table C2 of Appendix C) with durations of > 6 months placed the computers directly in the classroom and did not specify the teaching methods to be used to achieve specific educational goals. Conversely, 57.1% of the studies lasting for > 1 month and < 6 months used learning-oriented software for specific teaching and learning goals, and 94.3% specified a specific teaching strategy instead of simply using computers for some unspecified purpose in the classroom (Table C2 of Appendix C).

In terms of interventional supports, in most short-term studies the researchers could concentrate all of their resources on a single effort, so they chose the most appropriate hardware and software with more diverse functionality, prepared more elaborate learning activities, and made every effort to control confounding factors. However, in studies lasting more than 6 months, the longer duration made it more difficult to support the use of diverse resources, to find logistical assistance for technological problems, and to maintain the enthusiasm associated with using new technologies. For example, Shapley, Sheehan, Maloney, and Caranikas-Walker (2010) found that in laptop-immersion schools, after four years of implementation, only 6 of 21 schools reached a substantial level of immersion, and the level of student access to and use of laptops in classrooms declined during the implementation period because of insufficient support.

Research in the field of education mostly advocates long-term teaching interventions as important for obtaining reliable results (Hsieh et al., 2005; Pressley & Harris, 1994), but the present study found that long-term interventions with mobile devices in classrooms did not necessarily lead to better effects. Such findings echo comments made by many researchers about the use of laptops in the classroom: if computers are simply given to teachers and students to use for a long time without any positive guidance, satisfactory educational outcomes will not necessarily follow (Holcomb, 2009; Zucker & Light, 2009), especially for higher-level learning skills such as reasoning and problem solving (Drayton et al., 2010). To produce substantial effects, long-term interventions need logistical support for integrating advanced technologies with innovative and elaborate educational methods. Information technology applications in the classroom must first go through adoption and adaptation before they can proceed to innovation. These processes are likely to take longer than 1 year (Gerard et al., 2011), or even up to 3 years (Bebell & O'Dwyer, 2010). During such a long-term process, if the main support provided to teachers and students is enthusiasm rather than appropriate resources such as hardware, software, and instructional designs, computer use in the classroom will ultimately remain superficial.

Domain subjects. The data in Table 2 indicate the effect sizes for different domain subjects. Social studies (g = 0.768, z = 3.682, p < .001) had a high effect size, while professional subjects (g = 0.592, z = 6.808, p < .001), science (g = 0.565, z = 6.397, p < .001), language arts (g = 0.473, z = 6.352, p < .001), and mathematics (g = 0.337, z = 2.628, p = .009) had medium effect sizes. No significant effect size was obtained for using mobile devices for domain-general abilities (g = 0.151, z = 0.868, p = .386). The QB did not achieve statistical significance (QB = 9.108, p = .105), which shows that the average effect size did not differ significantly among these categories.
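For readers less familiar with the notation, each category's mean effect and its z statistic follow standard inverse-variance weighting (the article's Method section specifies the exact model); under a fixed-effect weighting, with a random-effects analysis simply adding the between-study variance \(\tau^2\) to each \(v_i\), the general form is

\[
\bar{g}=\frac{\sum_i w_i g_i}{\sum_i w_i},\qquad
SE(\bar{g})=\frac{1}{\sqrt{\sum_i w_i}},\qquad
z=\frac{\bar{g}}{SE(\bar{g})},\qquad
w_i=\frac{1}{v_i},
\]

where \(g_i\) and \(v_i\) denote the effect size and variance of study \(i\) within the category.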

3.4. Evaluation of publication bias

The classic fail-safe N and Orwin's fail-safe N were adopted to assess publication bias for the 108 selected studies. As suggested by the data in Table 3, the classic fail-safe N test determined that a total of 4144 studies with null results would be needed to nullify the effect size. Moreover, the results of Orwin's fail-safe N test (see Table 4) show that the number of missing null studies required to bring the existing overall mean effect size down to a trivial level (g = 0.01) was 3423. Both tests suggest that publication bias cannot explain the significant positive effects observed across all studies.
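The two indices take the following conventional forms (Rosenthal, 1979; Orwin, 1983). The classic fail-safe N counts how many unpublished null studies would reduce the combined significance test to non-significance; for k studies with standard normal deviates \(z_i\) and criterion \(z_\alpha\),

\[
N_{fs}=\frac{\left(\sum_{i=1}^{k} z_i\right)^{2}}{z_{\alpha}^{2}}-k .
\]

Orwin's fail-safe N instead counts how many missing studies with mean effect \(g_{fs}\) (here assumed to be zero) would pull the overall mean effect \(\bar{g}\) down to a trivial criterion \(g_c\) (here 0.01):

\[
N_{fs}=k\,\frac{\bar{g}-g_{c}}{g_{c}-g_{fs}} .
\]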

4. Conclusions and Implications

Analysis of the empirical research on the use of mobile devices as tools in educational interventions published in peer-reviewed journals has revealed that the overall effect of using mobile devices in education is better than using desktop computers or not using mobile devices at all, with a moderate effect size of 0.523. Through the analysis of moderator variables, we found that many different combinations of hardware, software, and intervention durations for mobile devices have been applied to various ages of users, implementation settings, teaching methods, and domain subjects. The effect of such usage was greater for handhelds than for laptops; usage in inquiry-oriented learning was more effective than usage along with lectures, self-directed study, cooperative learning, and game-based learning; informal educational environments were more effective than their formal counterparts; and medium- and short-duration interventions were superior to long-term interventions. These findings will contribute to a better understanding of where, for whom, and in which ways the use of mobile devices in the learning environment will best highlight the effects of particular educational methods, and they reveal the limitations of mobile devices in education.

Based on the findings of this study, we propose that more elaborate instructional designs are needed to more thoroughly exploit the educational benefits made possible by mobile devices. We believe that the three implications proposed below will be helpful for facilitating and achieving these goals.

4.1. Leveraging the pedagogical effects of mobile devices through elaborate designs of learning/teaching scenarios

Mobile devices have various distinctive features, such as individualized interfaces, real-time access to information, context sensitivity, instant communication, and feedback. These features may enhance the effects of certain pedagogies, such as self-directed learning, inquiry learning, or formative assessment. However, it is noteworthy that the features of mobile devices are not sufficient conditions for positive learning effects; the minor effects of mobile-device-based cooperative and game-based learning in our study illustrate this fact. Instead, researchers must find the "key" to integrating mobile devices with instructional strategies and ingeniously match the unique features of mobile devices to the resolution of specific pedagogic challenges. Doing so will maximize the impact of those features on learning outcomes.

Some examples include using instant-feedback functions to overcome the difficulty of efficiently executing and managing formative assessment in a class with many students (Penuel, Roschelle, & Shechtman, 2007) and, for cooperative groups, using wireless communication to facilitate between-group scaffolding and to avoid idling (Lan et al., 2007). As one of the most frequently used strategies in mobile learning and teaching, self-directed study is an example of a method that deserves more attention to pairing specific device features with specific challenges in order to yield improved results. Most of the studies in our research utilized the individuality and wireless communication capacities of mobile devices for self-directed learning, such as learning vocabulary through messaging services or using word processors for writing. However, few studies provided mechanisms for using instant feedback to facilitate the interaction between mobile devices and users (e.g., Oberg & Daniels, 2013; Ozcelik & Acarturk, 2011), which is an important element of effective self-directed learning with computers. Therefore, more elaborate methods of implementation, such as a monitoring mechanism for learning EFL vocabulary through the message services of cell phones, an annotation system for reading e-books (e.g., Hwang, Shadiev, & Huang, 2011), or speech recognition for providing feedback on students' oral practice (e.g., Tanner & Landon, 2009), should be considered to enhance the interaction between learners and computers and thereby the effects of self-directed learning.

4.2. Enhancing the quality of the experimental design for mobile interventions

While this study found that mobile devices can enhance educational effects, the evidence for the actual impact of mobile learning programs needs to be strengthened through longer intervention durations, closer integration of technology with the curriculum, and further assessment of higher-level skills.

The intervention duration affects the reliability and ecological validity of mobile learning programs. Of all the interventions included in this study, those with durations of > 6 months constituted only about 8.3% of the research, whereas more than 27.2% took place within 1 week. With short programs, and especially those that last for only hours, it is difficult to prove that any effects are produced by the features of mobile-integrated instruction rather than by the novelty of the technology. Moreover, short-term projects may adapt poorly to regular classroom practices that last for several months. Another issue related to teaching duration is the closeness of the integration between mobile devices and the curriculum. Most of the short programs included in our study involved only one or two units of teaching material within the curriculum of a whole semester. Although it is not necessary for a teacher to use mobile devices in every class, different units or topics may require different instructional designs when such devices are used, and hence an iterative trial process is likely to be needed to determine the optimal procedure. A larger set of mobile learning units would therefore help to provide exemplar models for teachers and enhance the possibility of transferring practices to different lessons; in terms of research, it would also improve the reliability and ecological validity of mobile programs in education. Based on the above considerations, researchers may choose appropriate intervention durations according to the skills or teaching methods to be developed with mobile devices. For example, for vocabulary learning, bite-sized materials and short durations may be appropriate, but for more complex skills or methods, such as inquiry or cooperative learning, longer interventions may be needed to warrant the effects of mobile programs.

Another aspect of mobile learning research that could be strengthened is the range of dependent variables measured. Most of the studies in our research still focused on achievement in content knowledge (e.g., Wang & Wu, 2011; Liu, 2009), and methods for measuring higher-level skills were scarce. Mobile devices were expected to encourage innovation in education and to increase high-level abilities (Frohberg et al., 2009; Zucker & Light, 2009). Yet most of the research collected for this study focused on increasing content learning, and even when the designed educational activities involved exploratory, communicative, and cooperative skills, the dependent variables had almost no connection with these skills. For example, in the database of our research, only 5 of the 9 experimental/quasi-experimental studies explored the interactive behaviors of students during their mobile learning; furthermore, none of the 24 inquiry-oriented learning studies recorded and investigated process-related skills such as hypothesis formation and hypothesis testing. Therefore, including dependent variables beyond content knowledge, such as problem-solving, critical thinking, interactive communication, or creative innovation skills, will make the evidence for the educational effects of mobile devices much more convincing.

4.3. Empowering educational practitioners through the orchestration of mobile devices, software, and pedagogical design

Scholars have gradually reached a consensus that exerting the maximum effect of information technology in education requires reconciling the connections among technology (hardware and software), educational contexts and missions (e.g., learning and teaching processes in different settings), and users (teachers and students) in order to overcome many of the limitations present in the field. Scholars (Dillenbourg, Nussbaum, Dimitriadis, & Roschelle, 2013; Dimitriadis, Prieto, & Asensio-Pérez, 2013) have come to agree that the effort of building harmonious relationships among those components, so as to enable compatible, efficient, and effective technology-enhanced teaching and learning environments, may be called orchestration. Achieving orchestration in mobile-integrated education requires pursuing at least two directions in research and practice. The first is strengthening the functions and expanding the applicability and breadth of learning-oriented software. For example, the research analyzed in this study paired many different learning-oriented software programs with educational activities (e.g., reciprocal teaching, inquiry learning, and formative assessment) that have already proven effective. Such software may be modified to provide the functionality of authoring tools that allow teachers to flexibly arrange their own teaching and learning flows in the classroom.

The second direction is strengthening professional teacher-development programs for mobile-enhanced instruction. Most review research into the use of mobile devices for education has emphasized that one of the largest obstacles to implementing effective mobile learning programs is insufficient preparation of teachers (Frohberg et al., 2009; Penuel, 2006). The essence of effective professional development for technology-enhanced inquiry proposed by Gerard et al. (2011) is also applicable to mobile learning programs. Teachers should be encouraged to modify already-developed mobile-integrated education programs and to gradually customize them into their own personalized programs, rather than simply designing their own programs around the use of technology. The latter approach implicitly leads teachers to technology-adapted instruction, which means that their educational practices may be restricted by the functions of the technology, and it may make it difficult for teachers to change their existing beliefs and habits. In contrast, customizing existing research-based mobile learning programs not only transfers researchers' visions of and experiences with the use of technology to teachers, but also minimizes the time teachers spend formulating new ideas and performing trial-and-error iterations (Gerard et al., 2011; Penuel et al., 2007). To facilitate the transfer of researchers' vision, experience, and skills to school teachers, it is also helpful to involve university-level researchers as mentors or collaborators. Diverse functions and types of hardware and software are available for mobile devices, but the corresponding complexity is also high, so designing and using them can readily impose additional overhead on teachers. Because researchers in educational technology typically have extensive technological knowledge and resources, their participation in a program can greatly assist teachers' autonomy in implementation.

Another noteworthy fact is that, despite the importance of teachers' professional development during their adoption of and adaptation to mobile-device-based teaching (Newhouse, Williams, & Pearson, 2006; Penuel & Yarnall, 2005), investigations into improving teacher education regarding the use of mobile devices have been extremely limited. Therefore, more in-depth experimental research is needed into how teachers reconcile mobile hardware and software, lesson content, teaching methods, and educational goals.

References

(Please see the Further readings section for the studies included in the meta-analysis.)

Ary, D., Jacobs, L. C., & Razavieh, A. (2002). Introduction to research in education (6th ed.).

Belmont, CA: Wadsworth. Başoglu, E. B., & Akdemir, O. (2010). A comparison of undergraduate students' English

vocabulary learning: Using mobile phones and flash cards. The Turkish Online Journal of Educational Technology, 9, 1-7. Retrieved from http://eric.ed.gov/?id=EJ898010 Bebell, D. & O'Dwyer, L. M. (2010). Educational Outcomes and Research from 1:1 Computing Settings. Journal of Technology, Learning, and Assessment, 9, 5-15. Retrieved from http://ejournals.bc.edu/ojs/index.php/jtla/article/view/1606 Becker, H. J. (1991). How computers are used in United States schools: Basic data from the 1989 I.E.A. Computers in Education survey. Journal of Educational Computing Research, 7, 385-405. doi:10.2190/P2UT-R3U3-FK1L-B89L Becker, H. J. (2001, April). How are teachers using computers in instruction? Paper presented at

the 2001 Annual Meeting of American Educational Research Association, Seattle, WA. Best, J. W., & Kahn, J. V. (1998). Research in education (8th ed.). Boston, MA: Allyn and Bacon. Borenstein, M., Hedges, L.V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to metaanalysis. Chichester, UK: Wiley. Byun, H., Lee, J., & Cerreto, F. A. (2014). Relative effects of three questioning strategies in ill structured, small group problem solving. Instructional Science, 42, 229-250. doi:10.1007/s11251-013-9278-1 Chan, T. W., Roschelle, J., Hsi, S., Kinshuk, Sharples, M., Brown, T., ... & Hoppe, U. (2006). One-to-one technology-enhanced learning: An opportunity for global research

collaboration. Research and Practice in Technology Enhanced Learning, 1(1), 3-29. doi:10.1142/S1793206806000032 Chen, C. M., & Chen, M. C. (2009). Mobile formative assessment tool based on data mining techniques for supporting web-based learning. Computers & Education, 52, 256-273. doi:10.1016/j.compedu.2008.08.005 Cheung, A. C. K., & Slavin, R. E. (2013). The effectiveness of educational technology

applications for enhancing mathematics achievement in K-12 classrooms: A meta-analysis. Educational Research Review, 9, 88-113. doi:10.1016/j.edurev.2013.01.001 Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Erlbaum.

Dierking, L. D., Falk, J. H., Rennie, L., Anderson, D., & Ellenbogen, K. (2003). Policy statement of the "informal science education" ad hoc committee. Journal of Research in Science Teaching, 40, 108-111. doi:10.1002/tea.10066 Dillenbourg, P., Nussbaum, M., Dimitriadis, Y, Roschelle, J. (2013). Design for classroom

orchestration. Computers & Education, 69, 485-492. doi:10.1016/j.compedu.2012.10.026 Dimitriadis, Y., Prieto, L. P., & Asensio-Pérez, J. I. (2013). The role of design and enactment patterns in orchestration: Helping to integrate technology in blended classroom ecosystems. Computers & Education, 69, 496-499. doi:10.1016/j.compedu.2013.04.004 Drayton, B., Falk, J. K., Stroud, R., Hobbs, K., & Hammerman, J. (2010). After installation: Ubiquitous computing and high school science in three experienced, high-technology schools. Journal of Technology, Learning, and Assessment, 9, 4-56. Retrieved from http://ejournals.bc.edu/ojs/index.php/jtla/article/view/1608 Ertmer, P., & Ottenbreit-Leftwich, A. (2010). Teacher technology change: How knowledge, beliefs,

and culture intersect. Journal of Research on Technology in Education, 42, 255-284. doi:10.1080/15391523.2010.10782551 Fleischer, H. (2012). What is our current understanding of one-to-one computer projects: A systematic narrative research review. Educational Research Review, 7, 107-122. doi:10.1016/j.edurev.2011.11.004 Frohberg, D., Goth, C., & Schwabe, G (2009) Mobile learning projects - a critical analysis of the state of the art. Journal of Computer Assisted Learning, 25, 307-331. doi: 10.1111/j.1365-2729.2009.00315.x Furtak, E. M., Seidel, T., Iverson, H., & Briggs, D. C. (2012). Experimental and quasi-

experimental studies of inquiry-based science teaching: A meta-analysis. Review of Educational Research, 82, 300-329. doi:10.3102/0034654312457206 Gerard, L. F., Varma, K., Corliss, S. B., & Linn, M. C. (2011). Professional development for technology-enhanced inquiry science. Review of Educational Research, 81, 408-448. doi:10.3102/0034654311415121 Gleaves, A., Walker, C., & Grey, J. (2007). Using digital and paper diaries for learning and

assessment purposes in higher education: A comparative study of feasibility and reliability. Assessment & Evaluation in Higher Education, 32, 631-643. doi:10.1080/02602930601117035 Gillies, R. M., & Haynes, M. (2011). Increasing explanatory behavior, problem solving, and

reasoning within classes using cooperative group work. Instructional Science, 39, 349-366. doi:10.1007/s11251-010-9130-9 Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. London, England: Routledge.

Holcomb, L. B. (2009). Results & lessons learned from 1:1 laptop initiatives: A collective

review. TechTrends, 53, 49-55. doi:10.1007/s11528-009-0343-1 Hsi, S. (2003). A study of user experiences mediated by nomadic web content in a museum. Journal of Computer Assisted Learning, 19, 308-319. doi:10.1046/j.0266-4909.2003.jca_023.x

Hsieh, P., Acee, T., Chung, W., Hsieh, Y. P., Kim, H., Thomas, G. D., & Robinson, D. H. (2005). Is educational intervention research on the decline? Journal of Educational Psychology, 97, 523-529. doi:10.1037/0022-0663.97.4.523 Hsu, C. K., Hwang, G. J., Chang, Y. T., & Chang, C. K. (2013). Effects of video caption modes on English listening comprehension and vocabulary acquisition using handheld devices. Educational Technology & Society, 16(1), 403-414. Retrieved from http://www.ifets.info/journals/16_1/35.pdf Hsu, L., & Lee, S. N. (2011). Learning tourism English on mobile phones: How does it work? Journal of Hospitality, Leisure, Sport and Tourism Education, 10, 85-94. doi:10.3794/johlste.102.348 Huizenga, J., Admiraal, W., Akkerman, S., & ten Dam, G. (2009). Mobile game-based learning in secondary education: Engagement, motivation and learning in a mobile city game. Journal of Computer Assisted Learning, 25, 332-344. doi:10.1111/j.1365-2729.2009.00316.x

Hwang, G J., Chu, H. C., Lin, Y. S., & Tsai, C. C. (2011). A knowledge acquisition approach to developing Mindtools for organizing and sharing differentiating knowledge in a ubiquitous learning environment. Computers & Education, 57, 1368-1377. doi:10.1016/j.compedu.2010.12.013

Hwang, G J. & Tsai, C. C. (2011). Research trends in mobile and ubiquitous learning: A review of publications in selected journals from 2001 to 2010. British Journal of Educational Technology, 42, 65-70. doi:10.1111/j.1467-8535.2011.01183.x Hwang, W. Y., Shadiev, R., & Huang, S. M. (2011). A study of a multimedia web annotation

system and its effect on the EFL writing and speaking performance of junior high school students. ReCALL, 23(2), 160-180. doi:10.1017/S0958344011000061 Hwang, G. J., Tsai, C. C., & Yang, J. H. (2008). Criteria, strategies and research issues of context-aware ubiquitous learning. Educational Technology & Society, 11, 81-91. Retrieved from http://www.ifets.info/journals/11_2/8.pdf Jabbar, A. I. A., & Felicia, P. (2015). Gameplay engagement and learning in game-based learning: A systematic review. Review of Educational Research, 85(4), 740-779. doi:10.3102/0034654315577210 Jeong, H. Y., & Hong, B. H. (2013). A practical use of learning system using user preference in ubiquitous computing environment. Multimedia Tools and Applications, 64, 491-504. doi:10.1007/s11042-012-1026-z Inan, F. A., & Lowther, D. L. (2010). Laptops in the K-12 classrooms: Exploring factors impacting instructional use. Computers & Education, 55, 937-944. doi:10.1016/j.compedu.2010.04.004 Jian, H. L., Sandnes, F. E., Law, K. M. Y., Huang, Y. P., & Huang, Y. M. (2009). The role of electronic pocket dictionaries as an English learning tool among Chinese students. Journal of Computer Assisted Learning, 25, 503-514. doi:10.1111/j.1365-2729.2009.00325.x

Ketamo, H. (2003). An adaptive geometry game for handheld devices. Educational Technology

& Society, 6(1), 83-95. Retrieved from http://www.ifets.info/journals/6_1/ketamo.html Kim, P., Buckner, E., Kim, H., Makanya, T., Taleja, N., & Parikh, V. (2011). A comparative

analysis of a game-based mobile learning model in low-socioeconomic communities of India. International Journal of Educational Development, 32, 329-340. doi:10.1016/j.ijedudev.2011.05.008 Klopfer, E., Sheldon, J., Perry, J., & Chen, V. H.-H. (2012). Ubiquitous games for learning

(UbiqGames): Weatherlings, a worked example. Journal of Computer Assisted Learning, 28, 465-476. doi:10.1111/j.1365-2729.2011.00456.x Kondo, M., Ishikawa, Y., Smith, C., Sakamoto, K., Shimomura, H., & Wada, N. (2012). Mobile assisted language learning in university EFL courses in Japan: Developing attitudes and skills for self-regulated learning. ReCALL, 24, 169-187. doi:10.1017/S0958344012000055 Kulik, C. L. C., & Kulik, J. A. (1991). Effectiveness of computer-based instruction: An updated analysis. Computers in Human Behavior, 7, 75-94. doi:10.1016/0747-5632(91)90030-5 Kukulska-Hulme, A., & Shield, L. (2008). An overview of mobile assisted language learning: From content delivery to supported collaboration and interaction. ReCALL, 20, 271-289. doi:10.1017/S0958344008000335 Lai, C. Y., & Wu, C. C. (2006). Using handhelds in a jigsaw cooperative learning environment. Journal of Computer Assisted Learning, 22, 284-297. doi:10.1111/j.1365-2729.2006.00176.x

Lan, Y. J., Sung, Y. T., & Chang, K. E. (2007). A mobile-device-supported peer-assisted learning system for collaborative early EFL reading. Language Learning & Technology, 11, 130-151. Retrieved from http://llt.msu.edu/vol11num3/pdf/lansungchang.pdf

Lan, Y J., Sung, Y T., & Chang, K. E. (2009). Let us read together: Development and evaluation of a computer-assisted reciprocal early English reading system. Computers & Education, 53, 1188-1198. doi: 10.1016/j. compedu.2009.06.002 Langman, J. & Fies, C. (2010). Classroom response system-mediated science learning with English language learners. Language and Education, 24, 81-99. doi: 10.1080/09500780903096553 Li, L. Y., Chen, G D., & Yang, S. J. (2013). Construction of cognitive maps to improve e-book reading and navigation. Computers & Education, 60, 32-39. doi:10.1016/j. compedu. 2012.07.010 Lipsey, M. W., & Wilson, D. B. (2000). Practical meta-analysis. Thousand Oaks, CA: Sage. Liu, C. C., Tao, S. Y., & Nee, J. N. (2008). Bridging the gap between students and computers: Supporting activity awareness for network collaborative learning with GSM network. Behaviour & Information Technology, 27, 127-137. doi:10.1080/01449290601054772 Liu, T. C., Lin, Y C., Tsai, M. J., & Paas, F. (2012). Split-attention and redundancy effects in mobile learning in physical environments. Computers & Education, 58, 172-180. doi: 10.1016/j. compedu.2011.08.007 Liu, T. Y. (2009). A context-aware ubiquitous learning environment for language listening and speaking. Journal of Computer Assisted Learning, 25(6), 515-527. doi: 10.1111/j.1365-2729.2009.00329.x

Liu, T. Y., Tan, T. H., & Chu, Y. L. (2009). Outdoor natural science learning with an RFID-supported immersive ubiquitous learning environment. Educational Technology & Society, 12, 161-175. Retrieved from http://www.ifets.info/journals/12_4/15.pdf Lu, M. (2008). Effectiveness of vocabulary learning via mobile phone. Journal of Computer

Assisted Learning, 24, 515-525. doi:10.1111/j.1365-2729.2008.00289.x Looi, C. K., Zhang, B., Chen, W., Seow, P., Chia, G, Norris, C., Soloway, E. (2011). 1:1 mobile inquiry learning experience for primary science students: A study of learning effectiveness. Journal of Computer Assisted Learning, 27, 269-287. doi: 10.1111/j. 1365-2729.2010.00390.x

Lu, Z. J. (2012). Learning with Mobile Technologies, Handheld Devices, and Smart Phones:

Innovative Methods. IGI-Global. 1-272. McMillan, J. H., Venable, J. J., & Varier, D. (2013). Studies of the effect of formative assessment on student achievement: So much more is needed. Practical Assessment, Research & Evaluation, 18(2). Retrieved from http://pareonline.net/pdf/v18n2.pdf Morris, S. B. (2008). Estimating effect sizes from pretest-posttest-control group designs.

Organizational Research Methods, 11, 364-386. doi:10.1177/1094428106291059 Mouza, C. (2008). Learning with laptops: Implementation and outcomes in an urban, underprivileged school. Journal of Research on Technology in Education, 40, 447-472. doi: 10.1080/15391523.2008.10782516 Naismith, L., Lonsdale, P., Vavoula, G., & Sharples, M. (2004). Literature review in mobile technologies and learning. (Report No. 11). Retrieved from FutureLab website: http://archive.futurelab.org.uk/resources/documents/lit_reviews/Mobile_Review.pdf Newhouse, C. P., Williams, P. J., & Pearson, J. (2006). Supporting mobile education for pre-service teachers. Australasian Journal of Educational Technology, 22, 289-311. Retrieved from http://www.ascilite.org.au/ajet/ajet22/newhouse.html

Oberg, A., & Daniels, P. (2013). Analysis of the effect a student-centred mobile learning

instructional method has on language acquisition. Computer Assisted Language Learning, 26(2), 177-196. doi: 10.1080/09588221.2011.649484 Orwin, R. G. (1983). A fail-safe N for effect size in meta-analysis. Journal of Educational and

Behavioral Statistics, 8, 157-159. doi:10.3102/10769986008002157 Ozcelik, E., & Acarturk, C. (2011). Reducing the spatial distance between printed and online information sources by means of mobile technology enhances learning: Using 2D barcodes. Computers & Education, 57(3), 2077-2085. doi: 10.1016/j.compedu.2011.05.019 Penuel, W. R. (2006). Implementation and effects of 1:1 computing initiatives: A research synthesis. Journal of Research on Technology in Education, 38, 329-348. doi: 10.1080/15391523.2006.10782463 Penuel, W. R., Roschelle, J., & Shechtman, N. (2007). Designing formative assessment software with teachers: An analysis of the co-design process. Research and Practice in Technology Enhanced Learning, 2, 51-74. doi:10.1142/S1793206807000300 Penuel, W. R., & Yarnall, L. (2005). Designing handheld software to support classroom

assessment: Analysis of conditions for teacher adoption. Journal of Technology, Learning, and Assessment, 3, 1-46. Retrieved from http://ejournals.bc.edu/ojs/index.php/jtla/article/view/1658 Pressley, M., & Harris, K. R. (1994). Increasing the quality of educational intervention research.

Educational Psychology Review, 6, 191-208. doi:10.1007/BF02213181 Purrazzella, K., & Mechling, L. C. (2013). Evaluation of manual spelling, observational and incidental learning using computer-based instruction with a tablet PC, large screen

projection, and a forward chaining procedure. Education and Training in Autism and Developmental Disabilities, 48, 218-235. Retrieved from

http://daddcec.org/Portals/0/CEC/Autism_Disabilities/Research/Publications/Education_ Training_Development_Disabilities/Full_Journals/ETADD_48_2_218-235.pdf Riconscente, M. M. (2013). Results from a controlled study of the iPad fractions game motion

math. Games and Culture, 8, 186-214. doi:10.1177/1555412013496894 Roschelle, J., & Pea, R. D. (2002). A walk on the WILD side: How wireless handhelds may change computer-supported collaborative learning. International Journal of Cognition and Technology, 1, 145-168. Retrieved from https://hal.archives-ouvertes.fr/file/index/docid/190615/filename/A110_Roschelle_Pea_02_WILD.pdf Roschelle, J., Rafanan, K., Bhanot, R., Estrella, G, Penuel, B., Nussbaum, M., & Claro, S.

(2010). Scaffolding group explanation and feedback with handheld technology: Impact on students' mathematics learning. Educational Technology Research and Development, 58, 399-419. doi:10.1007/s11423-009-9142-9 Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological

Bulletin, 86, 638-641. doi:10.1037/0033-2909.86.3.638 Runyan, J. D., Steenbergh, T. A., Bainbridge, C., Daugherty, D. A., Oke, L., & Fry, B. N. (2013). A smartphone ecological momentary assessment/intervention "App" for collecting real-time data and promoting self-awareness. PLoS ONE, 8, e71325. doi:10.1371/journal.pone.0071325 Ryu, H., & Parsons, D. (2012). Risky business or sharing the load? Social flow in collaborative mobile learning. Computers & Education, 58, 707-720. doi:10.1016/j.compedu.2011.09.019

Saran, M., Seferoglu, G, & Cagiltay, K. (2012). Mobile language learning: Contribution of

multimedia messages via mobile phones in consolidating vocabulary. The Asia-Pacific Education Researcher, 21, 181-190. Retrieved from

http://www.ejournals.ph/index.php?journal=TAPER&page=article&op=viewArticle&path%5B%5D=4826

Shapley, K. S., Sheehan, D., Maloney, C., & Caranikas-Walker, F. (2010). Evaluating the implementation fidelity of technology immersion and its relationship with student achievement. Journal of Technology, Learning, and Assessment, 9, 5-68. Retrieved from http://ejournals.bc.edu/ojs/index.php/jtla/article/view/1609 Shih, S. C., Kuo, B. C., & Liu, Y. L. (2012). Adaptively ubiquitous learning in campus Math path. Educational Technology & Society, 15, 298-308. Retrieved from http://www.ifets.info/journals/15_2/25.pdf Siozos, P., Palaigeorgiou, G., Triantafyllakos, G., & Despotakis, T. (2009). Computer based testing using "Digital Ink": Participatory design of a tablet PC based assessment application for secondary education. Computers & Education, 52, 811-819. doi:10.1016/j.compedu.2008.12.006 Slavin, R. E. (1977). Classroom reward structure: An analytical and practical review. Review of Educational Research, 47, 633-650. Retrieved from

http://www.jstor.org/stable/pdfplus/1170003.pdf?acceptTC=true&jpdConfirm=true Slavin, R. E. (1993). When and why does cooperative learning increase achievement? Theoretical and empirical perspectives. In H. Daniels & A. Edwards, (Eds), The RoutledgeFalmer reader in psychology of education (pp. 271-293). New York, NY: RoutledgeFalmer.

Slavin, R. E. (2012). Instruction based on cooperative learning. In R. E. Mayer & P. A.

Alexander (Eds.), Handbook of research on learning and instruction (pp. 344-360). New York, NY: Routledge.

Sung, Y. T., Chang, K. E., Hou, H. T., & Chen, P. F. (2010). Designing an electronic guidebook for learning engagement in a museum of history. Computers in Human Behavior, 26, 74-83. doi:10.1016/j.chb.2009.08.004 Sung, Y. T., Hou, H. T., Liu, C. K., & Chang, K. E. (2010). Mobile guide system using problem-solving strategy for museum learning: A sequential learning behavioural pattern analysis. Journal of Computer Assisted Learning, 26, 106-115. doi:10.1111/j.1365-2729.2010.00345.x

Sung, Y. T. & Lesgold, A. (2007). Software infrastructure for teachers: A missing link in

integrating technology with instruction. Teachers College Record, 109, 2541-2575. Retrieved from http://www.tcrecord.org/Content.asp?contentid=14536 Tamim, R. M., Bernard, R. M., Borokhovski, E., Abrami, P. C., & Schmid, R. F. (2011). What forty years of research says about the impact of technology on learning: A second-order meta-analysis and validation study. Review of Educational Research, 81, 4-28. doi:10.3102/0034654310393361 Tan, T. H., Liu, T. Y., & Chang, C. C. (2007). Development and evaluation of an RFID-based

ubiquitous learning environment for outdoor learning. Interactive Learning Environments, 15, 253-269. doi:10.1080/10494820701281431 Tanner, M. W., & Landon, M. M. (2009). The effects of computer-assisted pronunciation readings on ESL learners' use of pausing, stress, intonation, and overall comprehensibility. Language Learning and Technology, 13, 51-65. Retrieved from:

http://llt.msu.edu/vol13num3/tannerlandon.pdf Van der Kleij, F. M., Feskens, R. C., & Eggen, T. J. (2015). Effects of feedback in a computer-based learning environment on students' learning outcomes: A meta-analysis. Review of Educational Research, 85(4), 1-37. doi: 10.3102/0034654314564881 Wang, S. L., & Wu, C. Y. (2011). Application of context-aware and personalized

recommendation to implement an adaptive ubiquitous learning system. Expert Systems with Applications, 38(9), 10831-10838. doi: 10.1016/j.eswa.2011.02.083 Warschauer, M. (2007). A teacher's place in the digital divide. Yearbook of the National Society

for the Study of Education, 106, 147-166. doi:10.1111/j.1744-7984.2007.00118.x Warschauer, M., Zheng, B., Niiya, M., Cotten, S., & Farkas, G (2014). Balancing the one-to-one equation: Equity and access in three laptop programs. Equity & Excellence in Education, 47, 46-62. DOI: 10.1080/10665684.2014.866871 Weston, M., & Bain, A. (2010). The end of techno-critique: The naked truth about 1:1 laptop

initiatives and educational change. Journal of Technology, Learning, and Assessment, 9, 126. Retrieved from http://ejournals.bc.edu/ojs/index.php/jtla/article/view/1611 Wong, L. H., & Looi, C. K. (2011). What seams do we remove in mobile assisted seamless learning? A critical review of the literature. Computers & Education, 57, 2364-2381. doi:10.1016/j.compedu.2011.06.007 Wouters, P., Van Nimwegen, C., Van Oostendorp, H., & Van Der Spek, E. D. (2013). A meta-analysis of the cognitive and motivational effects of serious games. Journal of Educational Psychology, 105(2), 249-265. doi:10.1037/a0031311 Wu, T. T., Sung, T. W., Huang, Y. M., Yang, C. S., & Yang, J. T. (2011). Ubiquitous English

learning system with dynamic personalized guidance of learning portfolio. Educational

Technology & Society, 14, 164-180. Retrieved from http://www.ifets.info/journals/14_4/15.pdf Yang, Y., Zhang, L., Zeng, J., Pang, X., Lai, F., & Rozelle, S. (2013). Computers and the

academic performance of elementary school-aged girls in China's poor communities. Computers & Education, 60, 335-346. doi:10.1016/j.compedu.2012.08.011 Zhang, B. H., Looi, C. K., Seow, P., Chia, G., Wong, L. H., Chen, W., & Norris, C. (2010). Deconstructing and reconstructing: Transforming primary science learning via a mobilized curriculum. Computers & Education, 55, 1504-1523. doi:10.1016/j.compedu.2010.06.016 Zhang, J., Sung, Y. T., Hou, H. T., & Chang, K. E. (2014). The development and evaluation of an augmented reality-based armillary sphere for astronomical observation instruction. Computers & Education, 73, 178-188. doi:10.1016/j.compedu.2014.01.003 Zucker, A. A., & Light, D. (2009). Laptop programs for students. Science, 323, 82-85.

doi:10.1126/science.1167705 Zurita, G., & Nussbaum, M. (2004). Computer supported collaborative learning using wirelessly interconnected handheld computers. Computers & Education, 42, 289-314. doi:10.1016/j.compedu.2003.08.005

Further readings (The studies included in the meta-analysis)

Ahmed, S., & Parsons, D. (2013). Abductive science inquiry using mobile devices in the

classroom. Computers & Education, 63, 62-72. doi:10.1016/j.compedu.2012.11.017 Agbatogun, A. O. (2012). Exploring the efficacy of student response system in a sub-Saharan African country: A sociocultural perspective. Journal of Information Technology Education, 11, 249-267. Retrieved from http://eric.ed.gov/?id=EJ990469 Başoglu, E. B., & Akdemir, O. (2010). A comparison of undergraduate students' English

vocabulary learning: Using mobile phones and flash cards. The Turkish Online Journal of Educational Technology, 9, 1-7. Retrieved from http://files.eric.ed.gov/fulltext/EJ898010.pdf Bebell, D., & Kay, R. (2010). One to one computing: A summary of the quantitative results from the Berkshire wireless learning initiative. The Journal of Technology, Learning, and Assessment, 9, 1-60. Retrieved from http://napoleon.bc.edu/ojs/index.php/jtla/article/viewFile/1607/1462 Billings, E. S., & Mathison, C. (2012). I get to use an iPod in school? Using technology-based advance organizers to support the academic success of English learners. Journal of Science Education and Technology, 21, 494-503. doi:10.1007/s10956-011-9341-0 Brooks, G, Miles, J. N. V., Torgerson, C. J., & Torgerson, D. J. (2006). Is an intervention using computer software effective in literacy learning? A randomised controlled trial. Educational Studies, 32, 133-143. Retrieved from http://www. j eremymiles. co. uk/mestuff/publications/p41. pdf Bruce-Low, S. S., Burnet, S., Arber, K., Price, D., Webster, L., Stopforth, M. (2013). Interactive mobile learning: A pilot study of a new approach for sport science and medical

undergraduate students. Advances in Physiology Education, 37, 292-297. doi:10.1152/advan.00004.2013 Chaiprasurt, C., & Esichaikul, V. (2013). Enhancing motivation in online courses with mobile

communication tool support: A comparative study. The International Review of Research in Open and Distance Learning, 14, 377-401. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/1416/2610 Chang, K. E., Lan, Y. J., Chang, C. M., & Sung, Y. T. (2010). Mobile-device-supported strategy

for Chinese reading comprehension. Innovations in Education and Teaching International, 47, 69-84. doi:10.1080/14703290903525853 Chao, P. Y., & Chen, G. D. (2009). Augmenting paper-based learning with mobile phones.

Interacting With Computers, 21, 173-185. doi:10.1016/j.intcom.2009.01.001 Chen, C. M. & Chen, M. C. (2009). Mobile formative assessment tool based on data mining techniques for supporting web-based learning. Computers & Education, 52, 256-273. doi:10.1016/j.compedu.2008.08.005 Chen, C. M., & Li, Y. L. (2010). Personalised context-aware ubiquitous learning system for

supporting effective English vocabulary learning. Interactive Learning Environments, 18, 341-364. doi: 10.1080/10494820802602329 Chen, C. M., Tan, C. C., & Lo, B. J. (2013). Facilitating English-language learners' oral reading fluency with digital pen technology. Interactive Learning Environments, 1-23. doi: 10.1080/10494820.2013.817442 Chen, N. S., Teng, D. C. E., Lee, C. H., & Kinshuk. (2011). Augmenting paper-based reading activity with direct access to digital materials and scaffolded questioning. Computers & Education, 57, 1705-1715. doi:10.1016/j.compedu.2011.03.013

Chen, Y. (2010). Dictionary use and EFL learning. A contrastive study of pocket electronic

dictionaries and paper dictionaries. International Journal of Lexicography, 23, 275-306. doi: 10.1093/ijl/ecq013

Chen, Y. S., Kao, T. C., & Sheu, J. P. (2005). Realizing outdoor independent learning with a

butterfly-watching mobile learning system. Journal of Educational Computing Research, 33, 395-417. Retrieved from http://www.csie.ntpu.edu.tw/~yschen/mypapers/JECR-2005.pdf

Chiu, L. L., & Liu, G. Z. (2013). Effects of printed, pocket electronic, and online dictionaries on high school students' English vocabulary retention. The Asia-Pacific Education Researcher, 22, 619-634. doi:10.1007/s40299-013-0065-1 Chu, H. C., Hwang, G. J., & Tsai, C. C. (2010). A knowledge engineering approach to developing mindtools for context-aware ubiquitous learning. Computers & Education, 54, 289-297. doi:10.1016/j.compedu.2009.08.023 De Joode, E. A., Van Heugten, C. M., Verhey, F. R. J., & Van Boxtel, M. P. J. (2013). Effectiveness of an electronic cognitive aid in patients with acquired brain injury: A multicentre randomised parallel-group study. Neuropsychological Rehabilitation: An International Journal, 23, 133-156. doi:10.1080/09602011.2012.726632 de La Fuente, M. J. (2012). Learners' attention to input during focus on form listening tasks: The role of mobile technology in the second language classroom. Computer Assisted Language Learning, 27, 261-276. doi:10.1080/09588221.2012.733710 de-Marcos, L., Hilera, J. R., Barchino, R., Jiménez, L., Martínez, J. J., Gutiérrez, J. A., ..., Otón, S. (2010). An experiment for improving students' performance in secondary and tertiary

education by means of m-learning auto-assessment. Computers & Education, 55, 1069-1079. doi:10.1016/j.compedu.2010.05.003 Du, H., Hao, J. X., Kwok, R., & Wagner, C. (2010). Can a lean medium enhance large-group communication? Examining the impact of interactive mobile learning. Journal of the American Society for Information Science and Technology, 61, 2122-2137. doi:10.1002/asi.21376

Dunleavy, M. & Heinecke, W. F. (2007).The impact of 1:1 laptop use on middle school math and science standardized test scores. Computers in the Schools: Interdisciplinary Journal of Practice, Theory, and Applied Research, 24, 7-22. doi:10.1300/J025v24n03_02 Eden, S., Shamir, A., & Fershtman, M. (2011). The effect of using laptops on the spelling skills of students with learning disabilities. Educational Media International, 48, 249-259. doi: 10.1080/09523987.2011.632274 Edwards, C. M., Rule, A. C., & Boody, R. M. (2013). Comparison of face-to-face and online

mathematics learning of sixth graders. Computers in Mathematics and Science Teaching, 32, 25-47. Retrieved from http://www.editlib.org/noaccess/39231/ Gardner, J., Morrison, H., & Jarman, R. (1993). The impact of high access to computers on learning. Journal of Computer Assisted Learning, 9, 2-16. doi: 10.1111/j. 1365-2729.1993.tb00259.x

Gebru, M. T., Phelps, A. J., & Wulfsberg, G (2012). Effect of clickers versus online homework on students' long-term retention of general chemistry course material. Chemistry Education Research and Practice, 13, 325-329. doi:10.1039/C2RP20033C

Gulek, J. C., & Demirtas, H. (2005). Learning with technology: The impact of laptop use on student achievement. The Journal of Technology, Learning, and Assessment, 3, 1-38. Retrieved from http://napoleon.bc.edu/ojs/index.php/jtla/article/view/1655/1501 Hayati, A., Jalilifar, A., & Mashhadi, A. (2013). Using short message service (SMS) to teach English idioms to EFL students. British Journal of Educational Technology, 44, 66-81. doi:10.1111/j.1467-8535.2011.01260.x Ho, K., Lauscher, H. N., Broudo, M., Jarvis-Selinger, S., Fraser, J., Hewes, D., & Scott, I. (2009). The impact of a personal digital assistant (PDA) case log in a medical student clerkship. Teaching and Learning in Medicine: An International Journal, 21, 318-326. doi: 10.1080/10401330903228554 Hsu, L., & Lee, S. N. (2011). Learning tourism English on mobile phones: How does it work? Journal of Hospitality, Leisure, Sport and Tourism Education, 10, 85-94. doi: 10.3794/johlste.102.348 Huang, Y M., Liang, T. H., Su, Y. N., & Chen, N. S. (2012). Empowering personalized learning with an interactive e-book learning system for elementary school students. Educational Technology Research and Development, 60, 703-722. doi:10.1007/s11423-012-9237-6 Hung, P. H., Hwang, G J., Lin,Y. F., Wu, T. H., & Su, I. H. (2013). Seamless connection between learning and assessment applying progressive learning tasks in mobile ecology inquiry. Educational Technology & Society, 16, 194-205. Retrieved from http: // www. ifets. info/ j ournals/16 1/17.pdf Hwang, G J., Chang, H. F. (2011). A formative assessment-based mobile learning approach to improving the learning attitudes and achievements of students. Computers & Education, 56, 1023-1031. doi:10.1016/j.compedu.2010.12.002

Hwang, G J., Chu, H. C., Lin, Y. S., & Tsai, C. C. (2011). A knowledge acquisition approach to developing Mindtools for organizing and sharing differentiating knowledge in a ubiquitous learning environment. Computers & Education, 57, 1368-1377. doi: 10.1016/j. compedu.2010.12.013 Hwang, G J., Kuo, F. R., Yin, P. Y., & Chuang, K. H. (2010). A Heuristic Algorithm for planning personalized learning paths for context-aware ubiquitous learning. Computers & Education, 54, 404-415. doi:10.1016/j.compedu.2009.08.024 Hwang, G J., Shi, Y. R., & Chu, H. C. (2011). A concept map approach to developing

collaborative Mindtools for context-aware ubiquitous learning. British Journal of Educational Technology, 42, 778-789. doi:10.1111/j.1467-8535.2010.01102.x Hwang, G J., Wu, C. H., & Tseng, J. C. R., Huang, I. (2011). Development of a ubiquitous learning platform based on a real-time help-seeking mechanism. British Journal of Educational Technology, 42, 992-1002. doi:10.1111/j.1467-8535.2010.01123.x Hwang, G J., Wu, P. H., Zhuang, Y. Y., & Huang, Y M. (2013). Effects of the inquiry-based mobile learning model on the cognitive load and learning achievement of students. Interactive Learning Environments, 21, 338-354. doi:10.1080/10494820.2011.575789 Hwang, W. Y., & Chen, H. S. L. (2013). Users' familiar situational contexts facilitate the practice of EFL in elementary schools with mobile devices. Computer Assisted Language Learning, 26, 101-125. doi:10.1080/09588221.2011.639783 Hwang, W. Y., Chen, H. S. L., Shadiev, R., Huang, R. Y. M., & Chen, C. Y. (2012). Improving English as a foreign language writing in elementary schools using mobile devices in familiar situational contexts. Computer Assisted Language Learning, 27, 1-20. doi: 10.1080/09588221.2012.733711

Huang, Y M., Lin, Y. T., & Cheng, S. C. (2010). Effectiveness of a mobile plant learning system in a science curriculum in Taiwanese elementary education. Computers & Education, 54, 47-58. doi :10.1016/j. compedu.2009.07.006 Huizenga, J., Admiraal, W., Akkerman, S., & ten Dam, G (2009). Mobile game-based learning in secondary education: Engagement, motivation and learning in a mobile city game. Journal of Computer Assisted Learning, 25, 332-344. doi: 10.1111/j. 1365-2729.2009.00316.x

Jones, S. J., Crandall, J., Vogler, J. S., & Robinson, D. H. (2013). Classroom response systems facilitate student accountability, readiness, and learning. Journal of Educational Computing Research, 49, 155-171. doi:10.2190/EC.49.2.b Kang, H., Lundeberga, M., Wolter, B., delMas, R., & Herreid, C. F. (2012). Gender differences in student performance in large lecture classrooms using personal response systems ('clickers') with narrative case studies. Learning, Media and Technology, 37, 53-76. doi: 10.1080/17439884.2011.556123 Karaman, S. (2011). Effects of audience response systems on student achievement and long-term retention. Social Behavior and Personality: an International Journal, 39, 1431-1439. doi: 10.2224/sbp.2011.39.10.1431 Kert, S. B. (2011). The use of SMS support in programming education. Turkish Online Journal of Educational Technology, 10, 268-273. Retrieved from http://files.eric. ed. gov/fulltext/EJ932245.pdf Kert, S. B. (2013). Using J-Query mobile technology to support a pedagogical proficiency course.

Journal of Educational Computing Research, 48, 431-445. doi:10.2190/EC.48.4.b Ketamo, H. (2003). An adaptive geometry game for handheld devices. Educational Technology

& Society, 6, 83-95. Retrieved from http://www.ifets.info/journals/6_1/ketamo.html Kim, P., Buckner, E., Kim, H., Makany, T., Taleja, N., & Parikh, V. (2012). A comparative

analysis of a game-based mobile learning model in low-socioeconomic communities of India. International Journal of Educational Development, 32, 329-340. doi:10.1016/j.ijedudev.2011.05.008 Kim, P., Hagashi, T., Carillo, L., Gonzales, I., Makany, T., Lee, B., & Garate, A. (2011). Socioeconomic strata, mobile technology, and education: A comparative analysis. Educational Technology Research and Development, 59, 465-486. doi:10.1007/s11423-010-9172-3

Kondo, M., Ishikawa, Y., Smith, C., Sakamoto, K., Shimomura, H., & Wada, N. (2012). Mobile assisted language learning in university EFL courses in Japan: Developing attitudes and skills for self-regulated learning. ReCALL, 24, 169-187. doi:10.1017/S0958344012000055 Lai, C. H., Yang, J. C., Chen, F. C., Ho, C. W., & Chan, T. W. (2007). Affordances of mobile technologies for experiential learning: The interplay of technology and pedagogical practices. Journal of Computer Assisted Learning, 23, 326-337. doi: 10.1111/j. 1365-2729.2007.00237.x

Lai, C. Y., & Wu, C. C. (2006). Using handhelds in a jigsaw cooperative learning environment. Journal of Computer Assisted Learning, 22, 284-297. Retrieved from http://140.115.126.240/mediawiki/images/6/6d/Jigsaw.pdf Lan, Y. F., Tsai, P. W., Yang, S. H., & Hung, C. L. (2012). Comparing the social knowledge construction behavioral patterns of problem-based online asynchronous discussion in e/m-learning environments. Computers & Education, 59, 1122-1135.

doi: 10.1016/j. compedu.2012.05.004 Lan, Y J., Sung, Y T., Tan, N. C., Lin, C. P., & Chang, K. E. (2010). Mobile-device-supported problem-based computational estimation instruction for elementary school students. Educational Technology & Society, 13, 55-69. Retrieved from http://www.thai-library.org/Resource/0000005765.pdf#page=60 Lan, Y J., Sung, Y T., & Chang, K. E. (2009). Let us read together: Development and evaluation of a computer-assisted reciprocal early English reading system. Computers & Education, 53, 1188-1198. doi: 10.1016/j. compedu. 2009.06.002 Lawrence, L. E., & Victorina, W. (2004). Laptops, technology, and algebra 1: A case study of an experiment. Mathematics Teacher, 97, 136-142. Retrieved from http://personal.stevens.edu/~llevine/mt_laptop_study.pdf Liu, C. C., Tao, S. Y., & Nee, J. N. (2008). Bridging the gap between students and computers: Supporting activity awareness for network collaborative learning with GSM network. Behaviour & Information Technology, 27, 127-137. doi:10.1080/01449290601054772 Lin, Y. C., Liu, T. C., & Chu, C. C. (2011). Implementing clickers to assist learning in science lectures: The clicker-assisted conceptual change model. Australasian Journal of Educational Technology, 27, 979-996. Retrieved from http://ascilite.org.au/ajet/ajet27/lin.html Liu, T. Y. (2009). A context-aware ubiquitous learning environment for language listening and speaking. Journal of Computer Assisted Learning, 25, 515-527. doi: 10.1111/j. 1365-2729.2009.00329.x

Liu, T. Y., & Chu, Y. L. (2010). Using ubiquitous games in an English listening and speaking

course: Impact on learning outcomes and motivation. Computers & Education, 55, 630-643. doi:10.1016/j.compedu.2010.02.023

supported immersive ubiquitous learning environment. Educational Technology & Society, 12, 161-175. Retrieved from http://www.ifets.info/journals/12_4/15.pdf Looi, C. K., Zhang, B., Chen, W., Seow, P., Chia, G., Norris, C., & Soloway, E. (2011). 1:1 mobile inquiry learning experience for primary science students: A study of learning effectiveness. Journal of Computer Assisted Learning, 27, 269-287. doi:10.1111/j.1365-2729.2010.00390.x

Lowther, D. L., Ross, S. M., & Morrison, G. M. (2003). When each one has one: The influences on teaching strategies and student achievement of using laptops in the classroom. Educational Technology Research and Development, 51, 23-44. doi:10.1007/BF02504551 Lu, M. (2008). Effectiveness of vocabulary learning via mobile phone. Journal of Computer

Assisted Learning, 24, 515-525. doi:10.1111/j.1365-2729.2008.00289.x Martin-Dorta, N., Saorin, J. L., & Contero, M. (2011). Web-based spatial training using handheld touch screen devices. Educational Technology & Society, 14, 163-177. Retrieved from http://www.ifets.info/journals/14_3/14.pdf Martin, F., & Ertzberger, J. (2013). Here and now mobile learning: An experimental study on the use of mobile technology. Computers & Education, 68, 76-85. doi:10.1016/j.compedu.2013.04.021

Mayer, R. E., Stull, A., DeLeeuw, K., Almeroth, K., Bimber, B., Chun, D., ..., Zhang, H. (2009). Clickers in college classrooms: Fostering learning with questioning methods in large lecture classes. Contemporary Educational Psychology, 34, 51-57. doi: 10.1016/j. cedpsych. 2008.04.002 McNaughton, D., Hughes, C., & Clark, K. (1997). The effect of five proofreading conditions on the spelling performance of college students with learning disabilities. Journal of Learning Disabilities, 30, 643-651. doi:10.1177/002221949703000608 Morris, J. P., Mahajan, N., Pasek, K. H., Golinkoff, R. M., & Collins, M. F. (2013). Once upon a time: Parent-child dialogue and storybook reading in the electronic era. Mind, Brain, and Education, 7, 200-211. doi:10.1111/mbe.12028 Noguera, J. M., Jimeneza, J. J., & Osuna-Perezb, M. C. (2013). Development and evaluation of a 3D mobile application for learning manual therapy in the physiotherapy laboratory. Computers & Education, 69, 96-108. doi:10.1016/j.compedu.2013.07.007 Oberg, A., & Daniels, P. (2013). Analysis of the effect a student-centred mobile learning

instructional method has on language acquisition. Computer Assisted Language Learning, 26, 177-196. doi:10.1080/09588221.2011.649484 Ortega, L. D. M., Plata, R. B., Rodriguez, M. L. J., Gonzalez, J. R. H., Herraiz, J. J. M., Mesa, J. A. G D., Tortosa, S. O. (2011). Using M-learning on nursing courses to improve learning. Computers, Informatics, Nursing, 29, TC98-TC104. doi: 10.1097/NCN. 0b013e3182285d2c Ozcelik, E., & Acarturk, C. (2011). Reducing the spatial distance between printed and online information sources by means of mobile technology enhances learning: Using 2D barcodes. Computers & Education, 57, 2077-2085. doi:10.1016/j.compedu.2011.05.019

Powell, C. B., & Mason, D. S. (2013). Effectiveness of podcasts delivered on mobile devices as a support for student learning during general chemistry laboratories. Journal of Science Education and Technology, 22, 148-170. doi:10.1007/s10956-012-9383-y Radosevich, D. J., Salomon, R. Radosevich, D. M., & Kahn, P. (2008). Using student response systems to increase motivation, learning, and knowledge retention. Innovate, 5, 7. Retrieved from http://eric.ed.gov/?id=EJ840506 Rau, P. L. P., Gao, Q., & Wu, L. M. (2008). Using mobile communication technology in high school education: Motivation, pressure, and learning performance. Computers & Education, 50, 1-22. doi:10.1016/j.compedu.2006.03.008 Riconscente, M. M. (2013). Results from a controlled study of the iPad fractions game motion

math. Games and Culture July, 8, 186-214. doi: 10.1177/1555412013496894 Rockinson- Szapkiw, A. J., Courduff, J., Carter, K., & Bennett, D. (2013). Electronic versus traditional print textbooks: A comparison study on the influence of university students' learning. Computers & Education, 63, 259-266. doi:10.1016/j.compedu.2012.11.022 Rodriguez, P., Nussbaum, M., Lopez, X., & Sepulveda, M. (2010). A monitoring and evaluation scheme for an ICT-supported education program in schools. Educational Technology & Society, 13, 166-179. Retrieved from

http://www.ceppe.cl/images/stories/articulos/tic/2.2-a-monitoring.-rodriguez1.pdf Roschelle, J., Rafanan, K., Bhanot, R., Estrella, G, Penuel, B., Nussbaum, M., & Claro, S.

(2010). Scaffolding group explanation and feedback with handheld technology: Impact on students' mathematics learning. Educational Technology Research and Development, 58(4), 399-419. doi:10.1007/s11423-009-9142-9

Sandberg, J., Maris, M., & De Geus, K. (2011). Mobile English learning: An evidence-based study with fifth graders. Computers & Education, 57, 1334-1347. doi:10.1016/j.compedu.2011.01.015 Saran, M., Seferoglu, G, & Cagiltay, K. (2009). Mobile assisted language learning: English

pronunciation at learners' fingertips. Eurasian Journal of Educational Research, 34, 97114. Retrieved from http://94.73.145.182/0DOWNLOAD/pdfler/tr/902903948.pdf Saran, M., Seferoglu, G, & Cagiltay, K. (2012). Mobile language learning: Contribution of

multimedia messages via mobile phones in consolidating vocabulary. The Asia-Pacific Education Researcher, 21, 181-190. Retrieved from

http://www.ejournals.ph/index.php?journal=TAPER&page=article&op=view&path%5B %5D=4826&path%5B%5D=5016 Hassan, I. S., Ismail, M. A., & Mustapha, R. (2010). The effects of integrating mobile and CAD technology in teaching design process for Malaysian polytechnic architecture student in producing creative product. Turkish Online Journal of Educational Technology, 9, 162172. Retrieved from http://files.eric.ed.gov/fulltext/EJ908082.pdf Shih, W. C., & Tseng, S. S. (2009). A knowledge-based approach to retrieving teaching materials for context-aware learning. Educational Technology & Society, 12, 82-106. Retrieved from http://www. ifets.info/j ournals/12_1 /8. pdf Siegle, D., & Foster, T. (2001). Laptop computers and multimedia and presentation software:

Their effects on student achievement in Anatomy and Physiology. Journal of Research on Technology in Education, 34, 29-37. doi:10.1080/15391523.2001.10782331 Skeate, R. C., Wahi, M. M., Jessurun, J., & Connelly, D. P. (2007). Personal digital assistant-enabled report content knowledgebase results in more complete pathology reports and

enhances resident learning. Human Pathology, 38, 1727-1735. doi:10.1016/j.humpath.2007.05.019 Solhaug, T. (2009). Two configurations for accessing classroom computers: Differential impact on students' critical reflections and their empowerment. Journal of Computer Assisted Learning, 25, 411-422. doi:10.1111/j.1365-2729.2009.00318.x Suhr, K. A., Hernandez, D. A., Grimes, D., & Warschauer, M. (2010). Laptops and fourth grade literacy: Assisting the jump over the fourth-grade slump. The Journal of Technology, Learning, and Assessment, 9, 1-45. Retrieved from http://136.167.2.46/ojs/index.php/jtla/article/view/1610/1459 Sung, E., & Mayer, R. E. (2013).0nline multimedia learning with mobile devices and desktop computers: An experimental test of Clark's methods-not-media hypothesis. Computers in Human Behavior, 29, 639-647. doi:10.1016/j.chb.2012.10.022 Sung, Y. T., Chang, K. E., Lee, Y H., & Yu, W. C. (2008). Effects of a mobile electronic guidebook on visitors' attention and visiting behaviors. Educational Technology & Society, 11, 67-80. Retrieved from http://ifets.info/journals/11_2/7.pdf Sung, Y. T., Chang, K. E., Hou, H. T., & Chen, P. F. (2010). Designing an electronic guidebook for learning engagement in a museum of history. Computers in Human Behavior, 26, 7483. doi:10.1016/j.chb.2009.08.004 Tan, T. H., Liu, T. Y., & Chang, C. C. (2007). Development and evaluation of an RFID-based

ubiquitous learning environment for outdoor learning. Interactive Learning Environments, 15, 253-269. doi:10.1080/10494820701281431 Thornton, P., & Houser, C. (2005). Using mobile phones in English education in Japan. Journal of Computer Assisted Learning, 21, 217-228. doi:10.1111/j.1365-2729.2005.00129.x

Uluyol, C., & Agca, R. K. (2012). Integrating mobile multimedia into textbooks: 2D barcodes.

Computers & Education, 59, 1192-1198. doi:10.1016/j.compedu.2012.05.018 Wang, S. L., & Wu, C. Y. (2011). Application of context-aware and personalized

recommendation to implement an adaptive ubiquitous learning system. Expert Systems with Applications, 38, 10831-10838. doi:10.1016/j.eswa.2011.02.083 Wei, C. W., Hung, I. C., Lee, L., & Chen, N. S. (2011). A joyful classroom learning system with robot learning companion for children to learn mathematics multiplication. Turkish Online Journal of Educational Technology, 10, 11-23. Retrieved from http://files.eric.ed.gov/fulltext/EJ932221.pdf Wessels, A., Fries, S., Horz, H., Scheele, N., & Effelsberg, W. (2007). Interactive lectures:

Effective teaching and learning in lectures using wireless networks. Computers in Human Behavior, 23, 2524-2537. doi:10.1016/j.chb.2006.05.001 Wood, C., Jackson, E., Hart, L., Plester, B., & Wilde, L. (2011). The effect of text messaging on 9- and 10-year-old children's reading, spelling and phonological processing skills. Journal of Computer Assisted Learning, 27, 28-36. doi:10.1111/j.1365-2729.2010.00398.x

Wu, P. H., Hwang, G J., Su, L. H., & Huang, Y. M. (2012). A context-aware mobile learning system for supporting cognitive apprenticeships in nursing skills training. Educational Technology & Society, 15, 223-236. Retrieved from http://www. ifets.info/j ournals/15_1 /20.pdf Wu, T. T., Sung, T. W., Huang, Y M., Yang, C. S., & Yang, J. T. (2011). Ubiquitous English

learning system with dynamic personalized guidance of learning portfolio. Educational Technology & Society, 14, 164-180. Retrieved from

http://www.ifets.info/journals/14 4/ets 14 4.pdf#page=169 Yang, C. C., Hwang, G J., Hung, C. M., & Tseng, S. S. (2013). An evaluation of the learning effectiveness of concept map-based science book reading via mobile devices. Educational Technology & Society, 16, 167-178. Retrieved from http: //www. ifets. info/j ournals/16_3 /13.pdf Yang, C. C., Tseng, S. S., Liao, A. Y. H., & Liang, T. (2013). Situated poetry learning using

multimedia resource sharing approach. Educational Technology & Society, 16, 282-295. Retrieved from http://www.ifets.info/journals/16_2/23.pdf Zhang, B., Looi, C. K., Seow, P., Chia, G, Wong, L. H., Chen, W., ..., Norris, C. (2010). Deconstructing and reconstructing: Transforming primary science learning via a mobilized curriculum. Computers & Education, 55, 1504-1523. doi: 10.1016/j. compedu.2010.06.016 Zhang, H., Song, W., & Burston, J. (2011). Reexamining the effectiveness of vocabulary learning via mobile phones. The Turkish Online Journal of Educational Technology, 10, 203-214. Retrieved from http://files.eric.ed.gov/fulltext/EJ944968.pdf Zheng, B., Warschauer, M., & Farkas, G (2013). Digital writing and diversity: The effects of school laptop programs on literacy processes and outcomes. Journal of Educational Computing Research, 48, 267-299. doi:10.2190/EC.48.3.a Zurita, G, & Nussbaum, M. (2004a). Computer supported collaborative learning using wirelessly interconnected handheld computers. Computers & Education, 42, 289-314. doi: 10.1016/j. compedu.2003.08.005 Zurita, G, & Nussbaum, M. (2004b). A constructivist mobile learning environment supported by a wireless handheld network. Journal of Computer Assisted Learning, 20, 235-243.

doi: 10.1111/j. 1365-2729.2004.00089.x


Appendix A

Related reviews of research into integrating mobile devices with teaching and learning

Penuel (2006). Devices: laptops. Method: narrative review. Number of studies: 30 (publication list not provided). Result: Penuel (2006) synthesized findings from research and evaluation studies that analyzed the implementation and effects of one-to-one initiatives in a range of countries. Factors related to successful implementation included extensive teacher professional development, access to technical support, and positive teacher attitudes toward student technology use. Penuel (2006) found that outcome studies with rigorous designs are few, but those that did measure outcomes consistently reported positive effects on technology use, technology literacy, and writing skills.

Frohberg, Goth, & Schwabe (2009). Devices: laptops. Method: narrative review. Number of studies: 102 (mobile learning projects). Result: Frohberg, Goth, & Schwabe (2009) used a mobile learning framework to evaluate and categorize 102 mobile learning projects and to briefly introduce exemplary projects for each category. Although mobile phones started out as communication devices, communication and collaboration play a surprisingly small role in mobile learning projects.

Zucker & Light (2009). Devices: laptops. Method: narrative review. Number of studies: 31 (publication list not provided). Result: Zucker & Light (2009) found that research in many nations suggests laptop programs will be most successful as part of balanced, comprehensive initiatives that address changes in education goals, curricula, teacher training, and assessment.

Bebell & O'Dwyer (2010). Devices: laptops. Method: narrative review. Number of studies: 5. Result: Bebell & O'Dwyer (2010) summarized evidence that participation in 1:1 computing programs was associated with increased student and teacher technology use, increased student engagement and interest, and modest increases in student achievement.

Hwang & Tsai (2011). Devices: various types of mobile device. Method: content analysis. Number of studies: 154 (publication list not provided). Result: Hwang & Tsai (2011) examined the mobile and ubiquitous learning papers published in the Social Science Citation Index (SSCI) database from 2001 to 2010. They found that the number of articles increased significantly over those 10 years and that researchers from different countries have contributed to the field in recent years.

Wong & Looi (2011). Devices: laptops. Method: narrative review. Number of studies: 54. Result: Wong & Looi (2011) aimed to further investigate the meaning of seamless learning and potential ways to put it into practice. Through a thorough review of recent academic papers on mobile-assisted seamless learning (MSL), Wong & Looi (2011) identified ten dimensions that characterize MSL.

Fleischer (2012). Method: narrative review. Number of studies: 18. Result: Fleischer (2012) reviewed the cross-disciplinary empirical research on one-to-one computer projects in school settings published in peer-reviewed journals between 2005 and 2010, particularly teacher- and pupil-oriented studies. The results show that the research field has not developed substantially since the previously published reviews. On the other hand, Fleischer (2012) discussed the reasons for this lack of development, as well as the need for political, scholarly, and epistemological awareness when researching questions about one-to-one computer projects.

Appendix B

Forest plot of the effect sizes and 95% CI of the 110 articles

Note. The research papers were numbered from 1 to 110. Please see the further reading section.

Appendix C
Cross-analyses of moderator variables

Table C1

Cross-analysis of teaching methods, domain subjects, and hardware used

Teaching method: Not mentioned | Lectures | Inquiry-oriented learning | Cooperative learning | Game-based learning | Self-directed study | Mixed | Computer-assisted testing | Total

Domain subjects
Language arts: 6 (15.4%) | 1 (2.6%) | 2 (5.1%) | 5 (12.8%) | 0 (0.0%) | 20 (51.2%) | 4 (10.3%) | 1 (2.6%) | 39 (100%)
Social studies: 0 (0.0%) | 0 (0.0%) | 3 (60.0%) | 0 (0.0%) | 1 (20.0%) | 0 (0.0%) | 0 (0.0%) | 1 (20.0%) | 5 (100%)
Science: 2 (7.4%) | 5 (18.5%) | 11 (40.7%) | 1 (3.7%) | 0 (0.0%) | 2 (7.4%) | 5 (18.5%) | 1 (3.7%) | 27 (100%)
Mathematics: 2 (16.7%) | 0 (0.0%) | 2 (16.7%) | 2 (16.7%) | 3 (25.0%) | 1 (8.3%) | 1 (8.3%) | 1 (8.3%) | 12 (100%)
General: 3 (50.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 3 (50.0%) | 0 (0.0%) | 0 (0.0%) | 6 (100%)
Professional subjects: 1 (3.7%) | 6 (22.2%) | 6 (22.2%) | 2 (7.4%) | 0 (0.0%) | 8 (29.6%) | 0 (0.0%) | 4 (14.8%) | 27 (100%)
Total: 14 (12.1%) | 12 (10.3%) | 24 (20.7%) | 10 (8.6%) | 4 (3.4%) | 34 (29.3%) | 10 (8.6%) | 8 (6.9%) | 116 (100%)

Hardware used
Not mentioned: 0 (0.0%) | 0 (0.0%) | 1 (50.0%) | 0 (0.0%) | 0 (0.0%) | 1 (50.0%) | 0 (0.0%) | 0 (0.0%) | 2 (100%)
Handhelds: 2 (2.5%) | 10 (12.7%) | 18 (22.8%) | 7 (8.9%) | 3 (3.8%) | 27 (34.2%) | 6 (7.6%) | 6 (7.6%) | 79 (100%)
Laptops: 6 (25.0%) | 2 (8.3%) | 5 (20.8%) | 2 (8.3%) | 1 (4.2%) | 4 (16.7%) | 4 (16.7%) | 0 (0.0%) | 24 (100%)
Mixed: 1 (33.3%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 2 (66.7%) | 0 (0.0%) | 0 (0.0%) | 3 (100%)
Total: 9 (8.3%) | 12 (11.1%) | 24 (22.2%) | 9 (8.3%) | 4 (3.7%) | 34 (31.5%) | 10 (9.3%) | 6 (5.5%) | 108 (100%)

Software used
Not mentioned: 1 (33.3%) | 0 (0.0%) | 0 (0.0%) | 1 (33.3%) | 0 (0.0%) | 1 (33.3%) | 0 (0.0%) | 0 (0.0%) | 3 (100%)
General purpose: 6 (16.2%) | 9 (24.3%) | 5 (13.5%) | 1 (2.7%) | 0 (0.0%) | 13 (35.1%) | 2 (5.4%) | 1 (2.7%) | 37 (100%)
Learning-oriented: 2 (2.9%) | 3 (4.4%) | 19 (27.9%) | 7 (10.2%) | 4 (5.9%) | 20 (29.4%) | 8 (11.8%) | 5 (7.4%) | 68 (100%)
Total: 9 (8.3%) | 12 (11.1%) | 24 (22.2%) | 9 (8.3%) | 4 (3.7%) | 34 (31.5%) | 10 (9.3%) | 6 (5.6%) | 108 (100%)

Cross-analysis for intervention durations, hardware used, software used, and teaching methods

Intervention duration: Not mentioned | < 1 week | > 1, < 4 weeks | > 1 month, < 6 months | > 6 months | Total

Hardware used
Not mentioned: 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 2 (100.0%) | 0 (0.0%) | 2 (100.0%)
Handhelds: 4 (5.1%) | 23 (29.1%) | 20 (25.3%) | 29 (36.7%) | 3 (3.8%) | 79 (100.0%)
Laptops: 3 (12.5%) | 5 (20.8%) | 6 (25.0%) | 4 (16.7%) | 6 (25.0%) | 24 (100.0%)
Mixed: 0 (0.0%) | 2 (66.7%) | 1 (33.3%) | 0 (0.0%) | 0 (0.0%) | 3 (100.0%)
Total: 7 (6.5%) | 30 (27.8%) | 27 (25.0%) | 35 (32.4%) | 9 (8.3%) | 108 (100.0%)

Software used
Not mentioned: 1 (33.3%) | 0 (0.0%) | 2 (66.7%) | 0 (0.0%) | 0 (0.0%) | 3 (100.0%)
General purpose: 2 (5.4%) | 7 (18.9%) | 7 (18.9%) | 15 (40.5%) | 6 (16.2%) | 37 (100.0%)
Learning-oriented: 4 (5.9%) | 23 (33.8%) | 18 (26.5%) | 20 (29.4%) | 3 (4.4%) | 68 (100.0%)
Total: 7 (6.5%) | 30 (27.8%) | 27 (25.0%) | 35 (32.4%) | 9 (8.3%) | 108 (100.0%)

Teaching method
Not mentioned: 1 (11.1%) | 1 (11.1%) | 1 (11.1%) | 2 (22.2%) | 4 (44.4%) | 9 (100.0%)
Lectures: 1 (8.3%) | 5 (41.7%) | 0 (0.0%) | 5 (41.7%) | 1 (8.3%) | 12 (100.0%)
Inquiry-oriented learning: 3 (12.5%) | 9 (37.5%) | 5 (20.8%) | 7 (29.2%) | 0 (0.0%) | 24 (100.0%)
Cooperative learning: 0 (0.0%) | 2 (22.2%) | 4 (44.4%) | 2 (22.2%) | 1 (11.1%) | 9 (100.0%)
Game-based learning: 0 (0.0%) | 2 (50.0%) | 2 (50.0%) | 0 (0.0%) | 0 (0.0%) | 4 (100.0%)
Self-directed study: 1 (2.9%) | 9 (26.5%) | 13 (38.2%) | 11 (32.4%) | 0 (0.0%) | 34 (100.0%)
Mixed: 0 (0.0%) | 0 (0.0%) | 1 (10.0%) | 6 (60.0%) | 3 (30.0%) | 10 (100.0%)
Computer-assisted testing: 1 (16.7%) | 2 (33.3%) | 1 (16.7%) | 2 (33.3%) | 0 (0.0%) | 6 (100.0%)
Total: 7 (6.5%) | 30 (27.8%) | 27 (25.0%) | 35 (32.4%) | 9 (8.3%) | 108 (100.0%)

Cross-analysis for implementation settings and software used

Software used: Not mentioned | General purpose | Learning-oriented | Total

Implementation setting
Not mentioned: 0 (0.0%) | 1 (50.0%) | 1 (50.0%) | 2 (100.0%)
Formal settings: 2 (3.3%) | 22 (36.7%) | 36 (60.0%) | 60 (100.0%)
Informal settings: 0 (0.0%) | 4 (19.0%) | 17 (81.0%) | 21 (100.0%)
Unrestricted: 1 (4.0%) | 10 (40.0%) | 14 (56.0%) | 25 (100.0%)
Total: 3 (2.8%) | 37 (34.3%) | 68 (63.0%) | 108 (100.0%)

Table 1
Categories and learning achievement effect sizes for 110 articles

For each category: Number of studies | Number of effect sizes (k) | Proportion of studies | Proportion of effect sizes | Effect size (g)

Learning stage
1. Kindergarten: 1 | 2 | 0.009 | 0.005 | 0.103
2. Elementary school: 38 | 97 | 0.339 | 0.232 | 0.654
3. Middle school: 10 | 47 | 0.089 | 0.112 | 0.512
4. High school: 10 | 47 | 0.089 | 0.112 | 0.390
5. College: 43 | 128 | 0.384 | 0.305 | 0.599
6. Adults: 2 | 4 | 0.018 | 0.010 | 2.474
7. Mixed: 8 | 94 | 0.071 | 0.224 | 0.084

Intervention duration
1. Not mentioned: 7 | 23 | 0.064 | 0.055 | 0.782
2. < 4 hours: 23 | 86 | 0.209 | 0.205 | 0.521
3. > 4, < 24 hours: 2 | 18 | 0.018 | 0.043 | 0.385
4. > 1, < 7 days: 5 | 9 | 0.045 | 0.021 | 0.369
5. > 1 week, < 4 weeks: 28 | 95 | 0.255 | 0.227 | 0.643
6. > 1 month, < 6 months: 36 | 100 | 0.327 | 0.239 | 0.630
7. > 6 months: 9 | 88 | 0.082 | 0.210 | 0.290

Hardware used
1. Not mentioned: 2 | 8 | 0.018 | 0.019 | 1.421
2. Handhelds: 40 | 87 | 0.364 | 0.208 | 0.743
3. Laptops: 14 | 109 | 0.127 | 0.260 | 0.276
4. Tablet PC: 8 | 19 | 0.073 | 0.045 | 0.615
5. Cell phone: 24 | 84 | 0.218 | 0.200 | 0.676
6. iPod or MP3 player: 5 | 16 | 0.045 | 0.038 | 0.524
7. E-book reader: 2 | 41 | 0.018 | 0.098 | -0.693
8. Digital pen: 1 | 1 | 0.009 | 0.002 | 0.217
9. Pocket dictionary: 2 | 11 | 0.018 | 0.026 | -0.160
10. Classroom response systems: 8 | 31 | 0.073 | 0.074 | 0.369
11. Mixed: 4 | 12 | 0.036 | 0.029 | 0.273

Software used
1. Not mentioned: 3 | 29 | 0.027 | 0.069 | 0.355
2. General purpose: 38 | 223 | 0.345 | 0.532 | 0.494
3. Learning-oriented: 69 | 167 | 0.627 | 0.399 | 0.626

Implementation setting
0. Not mentioned: 2 | 3 | 0.018 | 0.007 | 0.700
1. Classroom: 55 | 242 | 0.500 | 0.578 | 0.487
2. Museum: 4 | 13 | 0.036 | 0.031 | 0.833
3. Laboratory: 3 | 12 | 0.027 | 0.029 | 0.329
4. Outdoors: 17 | 27 | 0.155 | 0.064 | 0.760
5. Unrestricted: 18 | 94 | 0.164 | 0.224 | 0.480
6. Workplaces: 3 | 14 | 0.027 | 0.033 | 0.247
7. Mixed: 8 | 14 | 0.073 | 0.033 | 1.032

Teaching method
1. Not mentioned: 9 | 84 | 0.082 | 0.200 | 0.186
2. Lectures: 13 | 45 | 0.118 | 0.107 | 0.556
3. Discovery and exploration: 13 | 25 | 0.118 | 0.060 | 0.920
4. Cooperative learning: 9 | 60 | 0.082 | 0.143 | 0.261
5. Problem-solving: 10 | 32 | 0.091 | 0.076 | 0.572
6. Game-based learning: 4 |  | 0.036 | 0.017 | 0.404
7. Self-directed study: 34 | 122 | 0.309 | 0.291 | 0.521
8. Podcasting: 1 | 6 | 0.009 | 0.014 | 0.153
9. Computer-assisted testing: 6 | 8 | 0.055 | 0.019 | 0.660
10. Project-based learning: 1 | 7 | 0.009 | 0.017 | 2.551
11. Mixed: 10 | 23 | 0.091 | 0.055 | 0.847

Domain subject
1. Language arts: 41 | 169 | 0.347 | 0.403 | 0.593
2. Social studies: 5 | 10 | 0.042 | 0.024 | 0.776
3. Science: 27 | 78 | 0.229 | 0.186 | 0.578
4. Mathematics: 12 | 41 | 0.102 | 0.098 | 0.338
5. Multidisciplinary: 1 | 6 | 0.008 | 0.014 | 0.333
6. Specific abilities: 5 | 24 | 0.042 | 0.057 | 0.103
7. Health-care programs: 7 | 18 | 0.059 | 0.043 | 0.535
8. Education: 3 | 6 | 0.025 | 0.014 | 0.381
9. Psychology: 3 | 7 | 0.025 | 0.017 | 0.467
10. Computer and information technology: 14 | 60 | 0.119 | 0.143 | 0.716

Table 2
The learning-achievement effect sizes of categories and their related moderator variables

For each category: k | g | z | 95% CI. Qb and R2 are reported for each moderator variable.

Learning stage (Qb = 9.226*, R2 = 0%)
1. Young children: 39 | 0.636 | 8.000*** | [0.480, 0.791]
2. Secondary-schoolers: 20 | 0.451 | 4.274*** | [0.244, 0.658]
3. Adults: 43 | 0.552 | 7.360*** | [0.405, 0.700]
4. Mixed: 8 | 0.086 | 0.503 | [-0.248, 0.419]

Hardware used (Qb = 18.426***, R2 = 7%)
1. Not mentioned: 2 | 1.416 | 4.491*** | [0.798, 2.033]
2. Handhelds: 78 | 0.591 | 10.992*** | [0.485, 0.696]
3. Laptops: 24 | 0.309 | 3.350** | [0.128, 0.490]
4. Mixed: 3 | 0.044 | 0.173 | [-0.460, 0.548]

Software used (Qb = 3.025, R2 = 0%)
1. Not mentioned: 3 | 0.347 | 1.262 | [-0.192, 0.886]
2. General purpose: 37 | 0.429 | 5.407*** | [0.273, 0.584]
3. Learning-oriented: 68 | 0.590 | 9.699*** | [0.471, 0.709]

Implementation setting (Qb = 7.993*, R2 = 8%)
1. Not mentioned: 2 | 0.701 | 2.069* | [0.037, 1.365]
2. Formal settings (classroom, laboratory, hospital): 60 | 0.430 | 7.328*** | [0.315, 0.545]
3. Informal settings (museum, outdoors): 21 | 0.768 | 7.096*** | [0.556, 0.980]
4. Unrestricted: 25 | 0.550 | 5.887*** | [0.367, 0.734]

Teaching method (Qb = 26.744***, R2 = 12%)
1. Not mentioned: 9 | 0.186 | 1.369 | [-0.080, 0.452]
2. Lectures: 12 | 0.394 | 3.120** | [0.146, 0.641]
3. Inquiry-oriented learning: 24 | 0.844 | 8.400*** | [0.647, 1.041]
4. Cooperative learning: 9 | 0.261 | 1.673 | [-0.045, 0.566]
5. Game-based learning: 4 | 0.407 | 1.922 | [-0.008, 0.822]
6. Self-directed learning: 34 | 0.440 | 5.492*** | [0.283, 0.597]
7. Computer-assisted testing: 6 | 0.656 | 3.661*** | [0.305, 1.006]
8. Mixed: 10 | 0.839 | 5.702*** | [0.550, 1.127]

Intervention duration (Qb = 4.924, R2 = 0%)
1. Not mentioned: 7 | 0.770 | 4.181*** | [0.409, 1.130]
2. < 1 week: 30 | 0.479 | 5.175*** | [0.298, 0.661]
3. > 1, < 4 weeks: 27 | 0.552 | 5.644*** | [0.360, 0.743]
4. > 1 month, < 6 months: 35 | 0.566 | 6.870*** | [0.405, 0.728]
5. > 6 months: 9 | 0.287 | 1.942 | [-0.003, 0.577]

Domain subjects (Qb = 9.108, R2 = 0%)
1. Language arts: 39 | 0.473 | 6.352*** | [0.327, 0.619]
2. Social studies: 5 | 0.768 | 3.682*** | [0.359, 1.177]
3. Science: 27 | 0.565 | 6.397*** | [0.392, 0.738]
4. Mathematics: 12 | 0.337 | 2.628** | [0.086, 0.588]
5. General: 6 | 0.151 | 0.868 | [-0.190, 0.491]
6. Professional subjects: 27 | 0.592 | 6.808*** | [0.422, 0.763]

Note. CI = confidence interval. *p < 0.05; **p < 0.01; ***p < 0.001.
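For readers connecting the columns of Tables 1 and 2, the display below sketches the standard meta-analytic quantities they report (Hedges' g, its 95% confidence interval, and the between-groups heterogeneity statistic Qb). These are the conventional textbook formulas, given here only as a reading aid; the article's own estimators and weighting scheme may differ in detail.

% Standard formulas assumed here, not excerpted from the article's computations.
g = J \cdot \frac{\bar{X}_{E} - \bar{X}_{C}}{S_{pooled}}, \qquad J = 1 - \frac{3}{4(n_{E} + n_{C} - 2) - 1}
% 95% confidence interval as reported in Table 2:
\mathrm{CI}_{95\%} = g \pm 1.96 \cdot SE(g)
% Between-groups heterogeneity for a moderator with m levels (W_j = sum of study weights in level j):
Q_{b} = \sum_{j=1}^{m} W_{j}\,(\bar{g}_{j} - \bar{g})^{2}

Under the null hypothesis that the moderator has no effect, Qb follows a chi-square distribution with m - 1 degrees of freedom; this is the test behind the asterisks attached to the Qb values in Table 2.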

Table 3

Results of the classic fail-safe N

Z value for observed studies: 22.51
p value for observed studies: 0.00
Alpha: 0.05
Tails: 2
Z for alpha: 1.96
Number of observed studies: 108
Number of missing studies that would bring the p value to > alpha: 4144

Table 4

Results of Orwin's fail-safe N

Hedges' g in observed studies (fixed effect) 0.33

Criterion for a 'trivial' Hedges' g 0.01

Mean Hedges' g in missing studies 0.00

Number of missing studies needed to bring Hedges' g to under 0.01 3423.00
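As a rough arithmetic cross-check on Table 4, Orwin's fail-safe N is conventionally computed as below. This is a sketch using the standard formula and the rounded values reported above, not the article's own calculation; the unrounded observed g would shift the result slightly.

% Orwin's fail-safe N with k = 108 observed studies (standard formula, assumed here):
N_{fs} = k \cdot \frac{\bar{g}_{obs} - g_{crit}}{g_{crit} - \bar{g}_{missing}}
       = 108 \cdot \frac{0.33 - 0.01}{0.01 - 0} \approx 3456

This is in line with the reported value of 3423; the small discrepancy reflects rounding of the observed mean g to 0.33.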

Figure 1. Histogram of the hardware used in mobile-device-assisted learning across time (periods 1993-1997, 1998-2003, 2004-2008, and 2009-2013; categories: not mentioned, handhelds, laptops, mixed).

Figure 2. Histogram of the implementation settings in mobile-device-assisted learning across time (periods 1993-1997, 1998-2003, 2004-2008, and 2009-2013; categories: not mentioned, formal settings, informal settings, unrestricted).

Figure 3. Histogram of the domain subjects in mobile-device-assisted learning across time (categories: language arts, social studies, science, mathematics, general, professional subjects).

Figure 4. Histogram of the effect sizes of the 110 articles.

Highlights

This is a meta-analysis and research synthesis study of mobile-integrated education.

110 journal articles published over a 20-year period were coded and analyzed.

The application of mobile devices to education has a moderate mean effect size.

The effect sizes of moderator variables were analyzed.

The benefits and drawbacks of mobile learning were synthesized.

Acknowledgement

This research was supported by grants from the Ministry of Science and Technology, Taiwan (102-2911-I-003-301; 102-2511-S-003-001-MY3; 101-2511-S-003-047-MY3; 103-2911-I-003-301), and the Aim for the Top University Project of National Taiwan Normal University.