Is it do-or-die for non-programmers?
In an ever-changing digital world, coding is a prevalent skill that provides us with our favourite and most useful digital programs. But should coding be necessary for everyone to know in order to ensure survival in a modern technology-based society?
By Emily Wilson
When it comes to coding, I’m no genius. In fact, if you asked me what an algorithm is, I’d probably tell you that it derives from a Harry Potter spell — I’m that clueless. But in this digital age, programming may well be the next step forward; code could become the new alphabet. If that’s the case, what happens to the older generations who’ve never really grasped the concept of typing symbols into computers? What happens to those who didn’t get the opportunity to learn the knack of coding? Do they get left behind? Paul Ford, a writer and programmer living in Brooklyn, seems to think so. As he recently wrote in Businessweek: “the world belongs to people who code. Those who don’t understand will be left behind.”
To a child, committing their ABCs to memory is a difficult feat, let alone using these letters to form words and sounds. When numbers come along, they seem impossible to understand until the child notices the pattern — then comes the realisation, the plant that blossoms in young people’s minds: almost everything comes down to order. After that, the world becomes simpler. Especially with computers, tablets and smartphones to assist even the youngest of consumers.
Now punctuation is introduced. Punctuation is different. It doesn’t have a specific order, nor is there a pattern involved in the strategy used to teach it, but there are rules. Commas can’t be used at random; exclamation points shouldn’t be employed without purpose, especially in a sentence that has no place making a significant impact. Punctuation is taught, retaught and constantly reintroduced throughout academic careers, pushing its way into sectors of education reserved for the maturest of material. Eventually, you get the hang of it.
But the moment a young person like me is introduced to code, everything comes crashing down. Where is the pattern? Where are the complete words, the sentences that make sense? Why are the rules of punctuation discarded? What happened to the separation of letters and numbers?
And the biggest question of all: are we expected to learn this? This is what it all comes down to: the possibility of having to learn more in order to fit into the digital age. Otherwise, will the train carrying the progression of the world forget about us, leaving us gazing cluelessly at symbols that make no sense?
It seems as though learning how to code should be a ‘must’ for those uneducated in this area, a concept that a wide range of people agree on, and there are aspects of this that I definitely understand. With the growing reliance on technology comes a growing reliance on programming. Put simply, more programmers means more working software, resulting in more happy people.
Then again, is coding really an essential skill that everyone needs to harness? Sure, an increase in the number of programmers would be a definite advantage to the digital world and its evolution, but is it truly necessary for everyone to know how to code?
As a teenager, the thought of committing myself to learning the ways of the coding culture strikes fear into my very soul. Don’t get me wrong, I think that the prospect of being able to boast a new skill is pretty great, but the time it would take for me to wrap my head around it and the effort I’d have to put into practising it don’t fit into my attention span. After years of staring desperately at teacher after teacher, scrabbling to process the information they throw at me, I’m not about to subject myself to more scrambled thought processes.
On the other hand, that’s just my opinion. I know plenty of people my age who are searching for a way to open the door to programming opportunities, unhappy with the fact that most schools aren’t fans of teaching IT. This is mostly because the majority of schools are controlled by older generations, by people who haven’t fully come to terms with the fast-paced digital world we have come to live in. Despite the evolution of technology, these older generations refuse to accept the need for IT skills, preferring to stick to the traditional Maths, English and Science. Really, it should be Maths, English, Science and ICT; schools and colleges should be required to provide courses on the main areas of computing. The importance of this has been recognised by many, and last year a ‘computing revolution’ took place in some schools, with the curriculum becoming more computer-friendly. But this is not the case for all schools; the selection of subjects taught comes down, in the end, to the preferences of the school staff.
So no, I don’t think that coding will become so large that everyone will need to know how to do it, but I do think it should be an integral part of mainstream education. No, I don’t think that coding will become world-dominating and that those who can’t program will be left behind, because I think there will always be a need for other aspects of the digital world, and a need for people with the abilities to fulfil these other roles.
Emily is a student and budding writer on a work experience placement at Wilson Fletcher.