Should everyone learn to code?

The media seems to think that everyone should learn to code, but I strongly disagree and believe we should look at the bigger picture.

tl;dr: There’s much more to IT than programming, and we should be educating ourselves and our children in a variety of fields, not just the one that’s currently popular.

Firstly, I think it’s important not to be elitist and say “It’s not for you, Jen”, or to make fun of people’s impression that programming knowledge is a binary thing: either you don’t know it at all, or you know every language, framework and problem-solution (or, as is more common, people don’t make the distinction at all and assume JavaScript is simply the language computers “think in”). No, you won’t find any mocking or berating in this post, just a benevolent attempt at questioning a seemingly common misconception.

What is code? Baby don’t hurt me

Government, schools and various institutions (even Zuckerberg, apparently) seem to think that everyone should learn to code. Children are being given access to applications that simulate programming in a sort of logic-gate-esque GUI, and even handed cheap Raspberry Pis so they can write scripts to make LEDs twinkle as if Captain Pike were trying to tell us something (the real Pike, not the new father-figure Dave “Pikey” Pikeman who dies before ever reaching Talos IV).

While I think it’s amazing that people are recognising the importance of technology and making an effort to educate young people to give them a head start, I wonder whether it’s misguided. It’s as if programming is the height of Information Technology and the sum of all our technological progression — all past efforts (cracking the Enigma machine, the internet protocols and the space shuttle) are alright, but programming is what the rockstars of IT do. The nerdy kids do that hard stuff like astrophysics, but it’s the programmers who fly the ship, right?

Sure, sometimes I feel a bit like a rockstar when that elegantly-written, perfectly-indented, shouldn’t-work-but-it-does recursion goes into production, but there’s so much more to explore. There’s nothing wrong with learning to write complex logic by breaking it down into smaller problems — it’s hard to argue that it’s a bad thing — but what’s the desired outcome? Millions of people can enhance your home security system but no-one knows how to maintain the cameras?

Think of it like this: being able to program a Pi is certainly cool, but what about being able to construct one? Surely we should be encouraging, nay, rewarding those skills too?

Brave New World

IT is a huge subject. Like, massive. It’s ridiculously humongous. It’s so gargantuan that there isn’t enough legible slanting I can apply to do it justice.

It includes networking, graphic design, UX/UI, usability, scalability, hosting, hardware, wearables … to say that everyone should learn to code is to ignore the many fields in IT that aren’t programming. Even now there is a very real need in industry for a wide range of skills, but programmers are ten-a-penny these days — just look at the race-to-the-bottom created by Indian dev-shops on sites like Elance.

I think it’s a fallback perspective on an industry that many don’t understand. Not in a “they’re not in the club” sort of way, just that most people don’t use or need much technical knowledge to get on perfectly well in their fields with tools that come with usable GUIs. The logic is something like “Mobile apps are integral to modern life. How do you make apps? Programming!” and it totally makes sense unless you’ve worked in/around the industry and have seen all the other stuff going on in the background.

“Like what?” you say? Well, someone has to work out what the app should do (Product Management) and how it should behave (Interaction Design) and what it should look like (Visual Design) and, if it stores data centrally, someone needs to make the API (Back-end Developer) and ensure it can handle the expected capacity and scale as it grows (Infrastructure) and make sure people know it exists (Digital Marketing), and many more unseen roles.

Suddenly your app requires many more skillsets than programming — and that’s just for Candy Crush!

So what is “programming”?

Stepping back a bit, let’s look at what programming is in itself, in a Marcus Aurelius “What is its nature?” sort of way.

Individual acts of programming are 99% logic and maths. By that, I mean that writing individual lines of code and simple functions requires thinking a problem through step by step, and probably an element of mathematical theory too. I’m not saying you need to be good at arithmetic (I’m certainly not!), but knowing that to calculate a mean you divide the sum of the values by how many there are is the kind of thing I mean. Being able to calculate it in your head is a bonus, but being able to program a computer to follow the logic of the maths is more important.
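To make that concrete, here’s a minimal sketch in Python of exactly that kind of logic: translating “divide the sum by the quantity” into steps a computer can follow.

```python
def mean(values):
    """The arithmetic mean: the sum of the values divided by how many there are."""
    if not values:
        raise ValueError("cannot take the mean of an empty list")
    total = sum(values)   # add every value together
    count = len(values)   # how many values there are
    return total / count  # divide the sum by the quantity

print(mean([3, 5, 10]))  # 6.0
```

None of it needs mental arithmetic; the work is in spelling the logic out precisely enough for the machine.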

However, that description gets a little hazier when you look at frameworks, libraries and application architecture (more high-level stuff), but before you can do any of that you have to learn to apply logic in your chosen language anyway. Even then, you’ll need to direct some brain-power to getting other people’s code to work how you want.

Programming is hard. Not hard like a tricky crossword puzzle, but hard like building a house one brick, cable and water pipe at a time. Even a seemingly innocuous requirement like “allow photo uploads” is riddled with questions: what filetypes to accept, where the files will be stored, what the minimum and maximum resolution and file size should be, whether infrastructure changes are needed to allow such uploads (such as server limitations), and how regularly the photos will be accessed (e.g. CDN vs. cheap cold storage). I’m not saying you have to have a freakishly high IQ for it, just that very few things in programming are simple, and often it’s the seemingly simple things that turn out to be the hardest (such as cache invalidation or naming things!).
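To give a flavour of how those questions become code, here’s a rough sketch of just the filetype and size checks for a hypothetical “allow photo uploads” feature; the limits and names are invented for illustration, and the storage, resolution and CDN decisions would all still be waiting afterwards.

```python
import os

# Hypothetical limits: each of these is a product decision, not a given.
ALLOWED_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif"}
MAX_FILE_SIZE_BYTES = 10 * 1024 * 1024  # say, 10 MB

def validate_photo_upload(filename, size_in_bytes):
    """Answer only the first two questions: which filetypes and what size to accept."""
    _, extension = os.path.splitext(filename.lower())
    if extension not in ALLOWED_EXTENSIONS:
        return False, f"unsupported filetype: {extension or 'none'}"
    if size_in_bytes > MAX_FILE_SIZE_BYTES:
        return False, "file is too large"
    return True, "ok"

print(validate_photo_upload("holiday.png", 2_000_000))   # (True, 'ok')
print(validate_photo_upload("holiday.tiff", 2_000_000))  # (False, 'unsupported filetype: .tiff')
```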

For a more real-world example, ever noticed that Twitter doesn’t let you edit your tweets? Imagine the requirement “allow users to edit their tweets” and think about how those changes propagate (e.g. to mobile users who may even be offline) and how a thread would work if someone completely changed the meaning of a tweet that had already been replied to. And how far back in time do you allow such edits? Should users really be given the power to rewrite history? I’m not being facetious: Twitter seriously considered these questions and decided it was better not to have the feature at all.

“We’re supposed to start with these operation programs first, but that’s major boring sh!t”

Like I said earlier, knowledge of programming isn’t a binary do-or-don’t thing, and you don’t throw a day/week/month/year/decade at learning programming and suddenly “I know Kung Fu”. Sites like Codecademy are great for an introduction but, by their nature, it’s difficult for them to teach analytical thinking: the hand-holding needed to keep someone’s attention tends to leave learners feeling thrown into the deep end the moment, god forbid, they have to work out for themselves why variable foo is undefined. Again, I’m certainly not criticising sites or code camps offering these kinds of services (I actually think they’re great resources and am going through a course on the R language myself right now); I’m just trying to put them into perspective.

You don’t “learn programming” as a thing in itself; rather, you learn a variety of concepts and operations to make things happen. Things like if / while / for / foreach / do / try and such have a place in almost every programming language and, like real-world languages, once you know one it’s easier to pick up another because you’re no longer learning concepts, just things like syntactical differences, scoping quirks and the age-old question of whether statements need terminating with a semicolon.
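As a tiny illustration of how few concepts are actually involved, here’s a throwaway Python snippet using most of those keywords (Python, incidentally, is one of the languages that settled the semicolon question by simply not requiring them):

```python
numbers = [4, 8, 15, 16, 23, 42]

# "for" (in Python, really a foreach: it visits each item in turn)
total = 0
for n in numbers:
    total += n

# "if" makes a decision
if total > 100:
    print("that's a big total:", total)

# "while" repeats until its condition stops being true
countdown = 3
while countdown > 0:
    print("counting down:", countdown)
    countdown -= 1

# "try" handles the things that can go wrong
try:
    print(total / 0)
except ZeroDivisionError:
    print("you can't divide by zero")
```

Swap the syntax around and the same handful of ideas reappear in JavaScript, PHP, Java and nearly everywhere else.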

What will surprise you is just how low-level these operations are — you may think “add tweet” but you actually have to “get text in editor, check content is valid, escape any nastiness, check user is authorised to add a tweet, store in database, update list of tweets in current user’s view, ensure process doesn’t in some way break database replication” and probably a lot more that I can’t name off the top of my head.
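Spelled out as code, that decomposition might look something like the sketch below. It’s a deliberately toy, in-memory version; the function and variable names are hypothetical, and each one-line step would be real (and fiddly) work in a real system.

```python
import html

TIMELINES = {}  # user id -> list of tweets, standing in for a real database

def add_tweet(user_id, text, is_authorised=True):
    if not text or len(text) > 280:                     # check content is valid
        raise ValueError("tweet is empty or too long")
    safe_text = html.escape(text)                       # escape any nastiness
    if not is_authorised:                               # check user is authorised to add a tweet
        raise PermissionError("user may not tweet")
    tweet = {"user": user_id, "text": safe_text}        # "store in database" (a dict, here)
    TIMELINES.setdefault(user_id, []).insert(0, tweet)  # update the user's list of tweets
    return tweet                                        # replication worries left as an exercise

print(add_tweet("alice", "Hello <world>!"))
```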

Another surprise will come when you realise, once you reach a certain level of competency, that a big proportion of your time will be spent making other people’s code play nicely rather than writing code yourself!

What’s my point then?!

Am I trying to put you off programming, perhaps an attempt to make my own skills less available and more sought-after? Absolutely not!

Some of the most fulfilling in-the-flow moments I’ve ever experienced have been whilst writing code, and if it can bring you the same joy then by all means get typing! What I’m really trying to do is encourage the people who have the inclination and motivation, while telling everyone else not to worry so much. You probably don’t need to learn programming anyway!

Want to get into medicine, manufacturing, architecture, fashion, import/export or basically 99% of all career paths? You almost certainly don’t need to know how to code. At all. General computer literacy will be valuable, but accountants don’t program their own accounting software when they can buy a QuickBooks licence. Seriously — don’t worry if people keep telling you that you should learn to code. Instead, take a deep breath and focus on…

What we should be teaching instead

Teaching everyone to code is an admirable goal, albeit a misguided one. One valuable skill it may sharpen is the ability to break a large problem into manageable chunks and approach each smaller problem logically. In short, it will make you smarter (it’s certainly made my mind feel sharper), but so will many other things!

General computer literacy is the absolute minimum: things like typing, basic keyboard shortcuts and knowing how to accomplish common tasks like moving files around and keeping your anti-virus updated. And, of course, lots of patience and strong Google-fu! After that, there are countless skills to learn, from network architecture to application usability.

One last example: no single person can make a computer mouse. Someone has to design the tools to extract crude oil, someone has to manufacture those tools, someone has to operate them, someone has to run a logistics network to transport the oil, someone has to turn it into a usable material, someone has to mould it into the required shape, someone has to create the electronic circuitry and cables, someone has to make it work with the USB standard, someone has to market the mouse, someone has to package it, someone has to advise customers on which one to buy, and so on. There must be a dozen other things involved, but no single person knows how to do it all.

Do some research, find out what’s out there, experiment with programs and processes in different fields and see what you enjoy and are good at. Should we be giving soldering irons to toddlers? Probably not, but the field of electronics could start with motorised Lego components and move on to modelling circuitry!

Embiggen your mind

To say that everyone should learn programming because technology is so important is like saying everyone should learn astrophysics because science is important. What about chemistry, biomechanics, quantum theory, anthropology…? Science is a huge field, and so is technology.

We seem to have a laser-sharp focus, and I’m just saying we need a wider field of vision if we’re going to teach people the skills they really need for the future.
