A. Bojko (2009). Informative or misleading? Heatmaps deconstructed. In Human-Computer Interaction. New Trends. Springer Berlin Heidelberg.
P. Chandler, & J. Sweller (1991). Cognitive load theory and the format of instruction. Cognition and Instruction, 8(4), 293–332.
C. M. Chen, & C. H. Wu (2015). Effects of different video lecture types on sustained attention, emotion, cognitive load, and learning performance. Computers & Education, 80, 108–121.
R. C. Clark, & R. E. Mayer (2011). E-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning. Pfeiffer.
N. Garrett (2015). Eye-tracking analytics in instructional videos. Proceedings of ISECON.
P. J. Guo, J. Kim, & R. Rubin (2014). How video production affects student engagement: An empirical study of MOOC videos. Proceedings of the ACM Conference on Learning @ Scale (pp. 41–50). ACM.
F. M. Hollands, & D. Tirthali (2014). MOOCs: Expectations and reality. Full report. Center for Benefit-Cost Studies of Education, Teachers College, Columbia University, NY.
M. A. Just, & P. A. Carpenter (1980). A theory of reading: From eye fixations to comprehension. Psychological Review, 87(4), 329–354.
M. Kipp (2014). ANVIL: A universal video research tool. In J. Durand, U. Gut, & G. Kristoffersen (Eds.), Handbook of Corpus Phonology (pp. 420–436). Oxford University Press.
R. F. Kizilcec, K. Papadopoulos, & L. Sritanyaratana (2014). Showing face in video instruction: Effects on information retention, visual attention, and affect. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI) (pp. 2095–2102). ACM.
M. L. Lai, M. J. Tsai, F. Y. Yang, C. Y. Hsu, T. C. Liu, W. Y. Lee, et al. (2013). A review of using eye-tracking technology in exploring learning from 2000 to 2012. Educational Research Review, 10, 90–115.
J. Li, R. Kizilcec, J. Bailenson, & W. Ju (2015). Social robots and virtual agents as lecturers for video instruction. Computers in Human Behavior, 55, 1222–1230.
R. E. Mayer (2001). Multimedia Learning. Cambridge University Press.
K. Ouwehand, T. Van Gog, & F. Paas (2015). Designing effective video-based modeling examples using gaze and gesture cues. Educational Technology & Society, 18.
Z. Pi, J. Hong, & J. Yang (2017). Effects of the instructor's pointing gestures on learning performance in video lectures. British Journal of Educational Technology, 48(4), 1020–1029.
K. Sharma, P. Jermann, & P. Dillenbourg (2014). How students learn using MOOCs: An eye-tracking insight. Proceedings of EMOOCs 2014, the Second MOOC European Stakeholders Summit.
K. Sharma, H. S. Alavi, P. Jermann, & P. Dillenbourg (2016). A gaze-based learning analytics model: In-video visual feedback to improve learner's attention in MOOCs. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK) (pp. 417–421). ACM.
SoftBank Robotics (2017). Find out more about Pepper. [Online] Available from: http://www.ald.softbankrobotics.com/en/robots/pepper [accessed 28 March 2018].
Y. Tian, & M. L. Bourguet (2016). Lecturers' hand gestures as clues to detect pedagogical significance in video lectures. Proceedings of the European Conference on Cognitive Ergonomics (p. 2). ACM.
J. Wang, & P. D. Antonenko (2017). Instructor presence in instructional video: Effects on visual attention, recall, and perceived learning. Computers in Human Behavior, 71, 79–89.
A. M. F. Yousef, M. A. Chatti, & U. Schroeder (2014). Video-based learning: A critical analysis of the research published in 2003–2013 and future visions. eLmL 2014: The Sixth International Conference on Mobile, Hybrid, and On-line Learning (pp. 112–119).
J. R. Zhang (2012). Upper body gestures in lecture videos: Indexing and correlating to pedagogical significance. Proceedings of the ACM International Conference on Multimedia (pp. 1389–1392). ACM.