The digital native is a myth
Some people put the cut-off at 1984, but for most it is 1980. People born after that date are the digital natives; those born before are digital immigrants, doomed to be forever strangers in a computer-based strange land.
The generational difference between the groups goes beyond their numbers of Facebook friends and Twitter followers: it can also help to explain differences in how they buy insurance. At least, that’s according to a report released this week for the insurance industry. Targeting Millennials with Insurance explains that young people aren’t like those who came before and queued passively for cover. They “prioritize holidays”, for one, which might surprise some of them. Because they are digital natives, they “will favor technologically innovative insurance policies”.
But a paper published last month in Teaching and Teacher Education reaches the opposite conclusion. The digital native is a myth, it claims: a yeti with a smartphone (P. A. Kirschner and P. De Bruyckere Teach. Teach. Educ. 67, 135–142; 2017). The implications go beyond insurance. Many schools and universities are retooling to cope with kids and young adults who are supposedly different. From collaborative learning in the classroom to the provision of e-learning modules in undergraduate courses, the rise of the digital native is being used as a reason — some say a justification — for significant policy changes.
Education policy is particularly vulnerable to political whims, fads and untested assumptions. From swapping evolution for creationism to the idea that multiple types of intelligence demand multiple approaches, generations of children are schooled according to dogma, not evidence. Surveys show, for example, that teachers and education experts subscribe to dozens of different and opposing ‘learning styles’. Under these, children can be categorized as activists or theorists, organizers or innovators, non-committers or plungers, globalists or analysts, deep or surface learners, and so on. Could the latest example be altering access to, and the provision of, technology in the classroom, simply because a new cohort is believed to be more familiar with it?
It is beyond dispute that people brought up in the most recent decades have been exposed to a lot of digital technology — at least in developed countries. And paper co-author Paul Kirschner, an education researcher at the Open University of the Netherlands in Heerlen, happily describes himself in his academic work as a “windmill-fighter”. But whereas Don Quixote tilted at solid structures, the digital-native assumption, on closer inspection, does seem illusory. It is certainly no giant.
A 2011 review for the Higher Education Academy in York, UK, put it bluntly, as the first of its executive-summary conclusions: “There is no evidence that there is a single new generation of young students entering Higher Education and the terms Net Generation and Digital Native do not capture the processes of change that are taking place”. Many members of the supposedly digital-savvy generation use technology in the same way as many of their elders: to passively soak up information. Children may say they prefer IT in their lessons and courses, but do schools listen when kids say they would prefer chips for lunch every day?
The Teaching and Teacher Education paper raises another concern. Digital natives are assumed to be able to multitask, it warns. But the evidence for this is also scant. Reading text messages during university lectures almost certainly comes at a cognitive cost. So too, employers might assume, does fiddling with smartphones and laptops in meetings. Buy that technologically innovative insurance policy another time.