EdTech Insights

Touchscreens and Handwriting: A Conversation

March 25, 2024

John: You have done extensive research into how children use touch devices, and have observed them using those devices in ways distinct from adults. What do you think accounts for this behavior? I’m thinking many adults only started using touchscreens as adults, whereas children using touchscreens are “digital natives”. In your expert opinion, is the way children use touchscreens something they grow out of, or will it become the norm moving into the future? (I am aware that children do “miss” the element they should be touching, which is likely something they grow out of, but I wonder whether you are seeing other distinguishing features.)

Dr. Anthony: Honestly, what we see is that the term “digital native” is a bit of a myth. Yes, it’s true that children grow up exposed to technology, including touchscreens, from a much earlier age, but the concept of the digital “native” implies that these kids somehow have a natural ability or affinity for technology. A more accurate picture includes the fact that even children have to learn how to interact with the device; the devices are not yet intelligently adapting TO the way people, or even children, naturally try to interact. I think most of our results showing that children’s and adults’ interactions with touchscreen technology differ stem from the ongoing development of cognitive and motor skills in children--children’s gestures don’t look like adults’ until they have acquired the cognitive skills to understand and recognize what gesture they need to make, and the motor skills to execute the gesture with precision and accuracy. As children grow up, these differences disappear (our early work showed that input by children ages 10 to 13 had fewer differences from adults’ input, and almost no differences were evident for teens).


My biggest takeaway here is that children must also learn how to use a device. My incorrect assumption was that earlier exposure to technology means someone will take to it more naturally. As a former educator, I should have known better; I taught English grammar to 4th graders for years, and I know that verb tenses and plurals still trip up 10-year-olds. Dr. Anthony’s point stands: the “digital native”, a phrase coined by Marc Prensky, is in reality a myth, and one no longer held by most people in the field (source).

John: Educators are heavily focused on evidence of comprehension and retention, and studies suggest that handwriting is superior to typing for both. Here at Magma, we are focused on these same goals and hope our touchscreen handwriting-to-digital-text feature has clear benefits for math retention and comprehension. Have you researched touchscreen handwriting and its cognitive benefits specifically? What are the biggest takeaways, in your opinion?


Dr. Anthony: My work has primarily proceeded from a foundational assumption that direct handwriting will be more beneficial to learning and development than interacting with buttons and widgets. This assumption is not something my lab has tested empirically, but it is based on two pieces of prior work:

1. That direct manipulation in general has been found to be beneficial from a human-computer interaction perspective.

2. That handwritten note-taking has been found to be beneficial for learning.

This is an assumption that we also hold at Magma: handwriting helps with comprehension and retention. I know from personal experience that if I take notes, or even doodle, while someone is lecturing, I remember more. My favorite article title on this subject belongs to the second piece Dr. Anthony mentions: “The pen is mightier than the keyboard”.

John: As we move toward an increasingly digital and remote world, what does your research suggest are the key components to keep? Your contribution to “Designing Touchscreen Interfaces that Don’t Interfere with Learning” is the basis for this question. In other words, if you had an unlimited budget, the best R&D team in the world, and complete control, what would the perfect student hardware look like, and what would it include?


Dr. Anthony: I have been inspired throughout my career by the vision of an adaptive, personalized learning assistant as envisioned by Neal Stephenson in “The Diamond Age: Or, A Young Lady's Illustrated Primer”. In that book, an impoverished young girl finds a digital tablet-like device that, in her world, only the privileged elite can afford to give to their children. The device imprints on her and presents learning content she can interact with directly at her own pace. The device can present anything of interest to her and grows with her as she grows. This book is still science fiction because a truly adaptive, personalized assistant requires a level of AI (artificial intelligence) that we do not yet know how to build. My lab is working on the “natural” interaction piece, at least. We are trying to build hardware and software that can accept any type of input (voice, touch, gesture) from the child at any point, detect the child’s intention, and respond intelligently. We are still far away from realizing even this level of adaptation!

Educational technology is an exciting and ever-changing landscape. The use of AI in educational technology has long been championed by some of the largest companies in the field, including Pearson. As commercially available educational technology merges with AI, we can start to dream of a device like the one Dr. Anthony describes.

John: I read the abstract of your most recent paper, “Affording Embodied Cognition through Touchscreen and Above-the-Surface Gestures During Collaborative Tabletop Science Learning”, on your website. I did my best to understand it, but I was hoping you might be able to explain it to me like I’m five, haha. I’d love to learn more specifically about “above-the-surface” gestures: what they are, and how they impact learning.


Dr. Anthony: In this study, we examined how families interacted with each other and with touchscreen tabletop software in order to learn something about Earth’s ocean temperature patterns over time. The idea of ‘embodied cognition’ says that we don’t use only our brains to think; we also use our bodies. Think of a softball player running to catch a fly ball: her head is not filled with equations to calculate the trajectory of the ball and where she should run, but she uses her experience of past catches and a ‘feel’ for the ball and where it is going in order to be in the right place at the right time to catch it. This kind of embodied cognition is going on all around us while we interact with the world, and also with technology, and we think it is especially present in touchscreen interactions that enable direct manipulation. In our study, we saw families using above-the-surface gestures to communicate with one another--that is, they would point to the screen to draw their companions’ attention, or make a swirling gesture without actually touching the tabletop to illustrate their conception of the rotating currents in the ocean. These gestures that don’t make contact with the screen usually cannot be detected by the touchscreen, but we argued in our paper that if they COULD be, we could better support learning through the natural ways people communicate and collaborate with each other.

Body language carries a large part of what is being communicated, maybe even MOST of it. The idea that nonverbal communication could be captured and understood by technology would have profound impacts on the classroom, especially for students with learning disabilities and challenges.

John: If you could change or champion one educational initiative, what would it be? For example, “I would make tablets available to every child under the age of 10” or “I would make Art a mandatory bi-weekly course”.

Dr. Anthony: I would include web and information literacy courses at every grade level for all children--helping them to understand the difference between, for example, advertising content in their games and game-related content, and helping them weigh the trustworthiness of the information they are bombarded with from all angles online and in social media. The availability of touchscreen technology has transformed our society, but we have not necessarily kept up with training the next generation to think critically about what they will encounter while using that technology.

Dr. Anthony is a true educator; she prizes critical thinking above all. Curiosity and the ability to explain one’s thinking are the goals of most dedicated educators. We thank Dr. Anthony for her time and thoughtful responses.
