In the UK coding for children is touted as “the new Latin”, a phrase coined in the Livingstone-Hope review on skills for the video games and visual effects industry. In the US plenty of people are jumping on the coding bandwagon, no doubt excited by what they see in the tech startup boom.
Yet one naysayer is Jeff Atwood, who says “Please don’t learn to code”. Jeff’s central arguments are that coding is just a means to an end (i.e. solving problems), that it’s much better to focus people on communication and problem-solving, and that most people aren’t suited to be software developers anyway.
My friend and former colleague Martin Belam disagrees, saying learning to code has helped him significantly to do his (non-coding) job effectively. Zed A. Shaw reckons Jeff is being elitist. I don’t see that in Jeff’s article; I see someone who is trying to warn about the damage caused by too many people with inflated hopes jumping on a bandwagon.
Undoubtedly our overall level of technological literacy really does need to be improved.
Last week I heard Victoria Davison, COO of Marsh, talk about the kind of decisions her board has to make. They are regularly business decisions with a large technology component, and she commented that her colleagues lack the tools, and an understanding of the trade-offs, to make those decisions easily. Then there’s the CEO who acquired a technology company intending to absorb it, even though its technology was entirely different to his own (apparently no-one thought that through); the CFO who thought moving his systems into the cloud would mean he no longer needed an operations team; and the headmaster who thought he was providing his school’s events calendar in machine-readable format because he scanned the flyer and published it as a PDF.
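To make the headmaster’s mistake concrete: a calendar is machine-readable when software can parse the event data directly, which a scan of a flyer inside a PDF cannot offer. As a sketch, a single event in the standard iCalendar format (RFC 5545) takes only a few lines of Python to produce; the event details here are invented for illustration:

```python
from datetime import datetime

def ics_event(summary: str, start: datetime, end: datetime) -> str:
    """Render one calendar event as an iCalendar (RFC 5545) VEVENT block."""
    fmt = "%Y%m%dT%H%M%S"
    return "\n".join([
        "BEGIN:VEVENT",
        f"SUMMARY:{summary}",
        f"DTSTART:{start.strftime(fmt)}",
        f"DTEND:{end.strftime(fmt)}",
        "END:VEVENT",
    ])

# A hypothetical school event: unlike a scanned flyer,
# any calendar application can import and parse this text.
print(ics_event("Sports Day",
                datetime(2012, 7, 6, 9, 0),
                datetime(2012, 7, 6, 15, 0)))
```

The point is not the particular format but the distinction: structured text like this can be consumed by other software, while a scanned image of the same information cannot.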
Meanwhile the UK ICT curriculum teaches children clerical skills and how to choose between a CRT monitor and a TFT display [pdf].
Clearly, by the time we reach adulthood we need, overall, a much better understanding of technology-related issues than we have today. Will a generation of children acquire that understanding by learning to code?
I think a parallel is easily drawn with other subjects. We all study mathematics for years and we end up being broadly numerate — some more than others, of course. We all study a foreign language for a while and that helps us appreciate other cultures and maybe get by when we go abroad. We study history and we end up having a pretty good appreciation of how the world around us has come to be and how our nations might sensibly make decisions today. And so on.
Similarly, I think learning to code will probably help our children become technologically literate, although coding itself is pretty narrow; the case becomes much stronger when you augment it with the disciplines that surround it: software and systems architecture, user experience design, data management, and much else besides. Thus learning to code, plus those surrounding subjects, is probably sufficient to address our problems.
But then there’s the waste…
Coding is probably not necessary
Unfortunately “the new Latin” is an effective phrase for its negative connotations, too. Learning Latin is wonderful for enabling you to learn many (Western) languages, but pretty useless in itself. If you just want to learn German, say, then it’s better to go directly for that rather than learning Latin first.
There is plenty of waste in other subjects, too, at the level of specific topics. Most people who study simultaneous equations in maths never use them after the exams; most people who study a foreign language quickly forget it afterwards; and most of us would be hard-pressed to explain the precise usefulness of knowing the events of the Battle of Culloden, if that was on our syllabus.
Similarly most of the specific struggles children might have with getting something to compile or resolving a null pointer exception will be forgotten by the time they leave education.
It seems there should be a more direct route to technological literacy, without all that waste. That’s why I think coding is probably not necessary to achieve that end: there is likely a more efficient path.
And at the same time we have to ask what this is all for.
Even though we studied those subjects, hardly any of us have become professional mathematicians, linguists or historians. Similarly — Jeff Atwood will be relieved to know — the vast majority of those young coders will not become professional software developers.
All that education, though, will better place children to take up those professions. The Livingstone-Hope review and the follow-up Next Gen. report are about “how the UK can be transformed into the world’s leading talent hub for video games and visual effects”. It’s about putting people in jobs.
If that really is the goal then this is no radical transformation of the curriculum. We were training children for clerical jobs; now we’re training them for programming jobs. We’ve just shifted the syllabus to follow a shift in the job market. Success in education is still measured by your job prospects, rather than a broader goal of understanding and appreciating the world, and contributing to it in ways that are not bound by an industry’s current hiring requirements.
If we did measure the success of a syllabus by children’s readiness for the job market then we would abolish art and music lessons immediately; the job prospects there are simply too slim. And yet there is a consensus that there is value in those subjects. So what do we want to get out of technology education? There is creativity in coding, certainly. There’s also creativity in user interface design, hardware building, mathematical visualisation, and more.
So what is our children’s education for? And what skills should adults learn to help them become more technologically literate? Teaching people to code is, I’m sure, a valuable thing. Augmenting that with some related topics will help even more. But we shouldn’t be thinking just about putting people in jobs. We would be better off thinking much more broadly: beyond just coding and beyond just jobs.