Teach Children to Code

I read a fair few tweets last night on the subject of teaching children to program in school. A lot of the discussion appears to have been prompted by Ben Goldacre’s link to a post by programmer/author John Graham-Cumming supporting a petition entitled “Teach Our Kids to Code”. The petition argues that we should teach kids to program from Year 5 (9-10 years old). Definitely! Just as I was about to sign the petition this morning, I saw a tweet by Mark Henderson, The Times Science Editor, saying that David Willetts MP had just announced a pilot programme to teach programming in schools! Great stuff.

David Willetts has just announced pilot programme to teach schoolchildren coding & to develop a programming GCSE.

11:23 AM Thu Sep 15, 2011

I was about 10 years old when my parents bought our first computer. They had saved up for a long time and I was so excited about it. I remember the day we got it very clearly. It was a Compaq Presario with a 2.2 GB hard drive, 64 MB of RAM and a 600 MHz Celeron CPU. The power! If I wasn’t out on my bike with friends, you could find me endlessly fiddling with, breaking and then fixing the computer (all whilst trying to hide from Dad the fact that I had broken it; I had to fix it, otherwise my parents were going to be pretty angry/worried that I had broken their expensive new PC).

At a young age I was a logical thinker and quickly became computer literate, teaching myself HTML and then JavaScript and PHP. If I, by no means a “child genius”, could work it out on my own from Internet resources, then there is no reason why other children couldn’t learn to program in school with some good teaching.[^1] I think a 10-year-old could easily cope with logical statements such as “if this then that” or “while this do that”; they naturally think like that in everyday life, they just need help to translate it into the formal instructions a computer understands. Programming is fun and intellectually satisfying, much more so than the ridiculous ICT “lessons” that I used to have: open a Word document, copy some text, print, make a change, print again. Completely pointless.
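To make that concrete, here is a toy sketch of my own (in Python; nothing to do with the petition or the pilot programme) of the kind of everyday “if” and “while” reasoning a ten-year-old already uses, written out as code a beginner could read:

```python
# "While I haven't saved enough, keep saving my pocket money."
pocket_money = 0
while pocket_money < 10:
    pocket_money += 2  # £2 a week
    print(f"Saved so far: £{pocket_money}")

# "If it's raining, stay in and write some code; otherwise go out on the bike."
raining = True
if raining:
    print("Stay in and write some code.")
else:
    print("Go out on the bike.")
```

The syntax is new, but the logic is exactly the bargaining and planning children do every day.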

It’s a useful skill too. Being able to program has been really handy for me at university. Last year I recorded probably around a hundred absorption and emission spectra (and maybe thousands using an automated system), which would have been impossible to analyse using Excel, the standard tool of choice amongst undergrads in my department. A bit of code in MATLAB and you can analyse as much data as your computer can cope with. For some reason my department didn’t teach a programming language, unlike departments such as Physics, which taught C++. Instead we had “maths lab”, where we used Excel for numerical methods. Not very useful (and very dull). Solving Project Euler-style problems with something like MATLAB, or, even better, a proper, open-source, high-level language like Python (with SciPy and matplotlib), would be much more useful. I’ve been learning Python myself over the summer. It’s a fun language that I’d love to teach to a class of undergraduate chemists.
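As a rough illustration of what I mean, here is a minimal Python sketch (not the script I actually used; the “spectrum_*.csv” file pattern and two-column layout are assumptions for the example) that batch-processes a folder of spectra with NumPy, SciPy and matplotlib, the sort of thing that is painful to do by hand in Excel:

```python
import glob

import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import find_peaks

# Hypothetical files "spectrum_*.csv", each with two columns:
# wavelength (nm) and absorbance.
for path in sorted(glob.glob("spectrum_*.csv")):
    wavelength, absorbance = np.loadtxt(path, delimiter=",", unpack=True)

    # Pick out the absorption maxima instead of reading them off by eye.
    peaks, _ = find_peaks(absorbance, prominence=0.05)
    print(path, "peak wavelengths (nm):", wavelength[peaks])

    plt.plot(wavelength, absorbance, label=path)

plt.xlabel("Wavelength / nm")
plt.ylabel("Absorbance")
plt.legend()
plt.show()
```

Point something like this at a directory of exported spectra and it plots every one and lists the peak positions; adding a fit or another hundred files is one more loop, not an afternoon of copy-and-paste.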

I think it’s clear that school children would benefit from being able to program. Even if they never used code again, they would gain an understanding of how a computer functions and could then use this knowledge to work out how new software works. Rather than teach specific software applications, teach computing. It’d benefit industry too. Fingers crossed that the government doesn’t force schools to teach a horrible proprietary language, outsourced as a “solution” on a ludicrously expensive contract, and instead chooses something open source and useful. A recent report about open source in Whitehall doesn’t bode well…

[^1]: Interestingly, my teacher told my parents in Year 4 that I “had reached my plateau”. I’d quite like to let him know where I am now!

“I’ll just memorise it for the exam”

Back in May, I read a blog post by Nick Morris titled “Do students need to know facts or do they just need to know how to interpret them?”, in which he wrote that if students don’t need to know the facts, but only need to know how to interpret them, then there needs to be a major change in teaching and subsequent assessment at university. I intended to write a response but never really got round to it. A recent post by The Curious Wavefunction, “On Chemistry’s Multiple Cultures”, got me thinking about it all again.

At the time of Nick’s post I had just finished my last ever set of written exams, but morale was not exactly high as our research project reports were due in a couple of weeks, and then we had viva voce exams (covering years 1-3 of our degree…). It occurred to me that throughout the whole of my degree I have had to endlessly memorise facts in order to be successful in exams. Why?

Some memorisation is necessary because every scientist needs to know the foundations of their discipline; all chemists, for example, need to know the functional groups. But memorisation should be restricted to those foundations and, in my experience, it isn’t. I’ve been expected to memorise trivial details of advanced courses that are forgotten as soon as the exam is over. For example, in one course I was required to memorise the half-lives, precursors and corresponding nuclear reactions of the radioactive isotopes used in positron emission tomography, and to write them down when proposing synthetic routes to molecules. Yes, the half-life is important, but why are marks wasted on these details when they could be provided in the exam? The marks should be used to assess understanding, not the ability to memorise numbers.

Another course wholly consisted of memorising reactions in the presence and absence of ultrasound or microwaves, and then writing them down in the exam. Third and fourth years, who have mastered the foundations, should study advanced material by looking at the patterns and trends. They should be thinking, understanding and reasoning, not memorising.

In my final year I spent the majority of my time working on my research project. Memorisation was of no use to me then—what good is the ability to memorise when “the facts” are not yet known? I’ve seen friends who ace exams because they can memorise derivations and reaction conditions fail miserably in the lab because they can’t work out what to do when things don’t go quite as the textbooks would suggest. One friend used to memorise whole derivations for exams, even though he didn’t understand them. Far too much emphasis is placed on an undergraduate’s ability to memorise rather than think, considering the former is, in my limited experience, of little use in research and to employers.

I think there are three reasons why some courses require the memorisation of an extraordinary amount of information.

Firstly, I think a minority of lecturers simply don’t realise how much they are asking their students to learn. They are experts in their field who have spent years working on a specific area and know it inside out. Without realising it, they expect their students to know the same facts that they do. I think the majority of “bad” lecturers fall into this category.

Secondly, it makes assessment straightforward. It’s easy to assess a student if you ask them to write down a reaction or to fill in the product of a given set of reagents and conditions. When students complain about, or do badly on, an exam question, examiners can say, and have said in exam feedback, that “it was in the notes”.

Thirdly, I think an even smaller minority of lecturers are bitter and don’t want to change things: they had to memorise everything, so why shouldn’t their students? Once a lecturer asked us how we would improve our course and commented that there are some members of staff who, if they had their own way, would keep the course exactly the same as it was 30 years ago. This is not good; we do not live in the 1980s.

I want to emphasise that the vast majority of my lectures have been good, and a few have been truly awesome, but I think assessment methods need to change. Students need to be assessed on what they understand and how they think rather than what they have memorised for the exam. “I’ll just memorise it for the exam” was a phrase heard far too often in my department.

One final year lecture course, “Green Solvents”, was outstanding and is definitely up there in my top five courses of all time. I feel like I learnt more in those eight lectures than in any other course throughout my whole degree. Before each lecture we were given one or two fairly lengthy reviews to read, which we then discussed in the following lecture. Not only did we learn about green solvents, we learnt about science as a process and how to read papers more critically. Even if I were to never look at green solvents ever again, the course would still have been worthwhile. For assessment, rather than a traditional written exam, we had to write an essay assessing a paper of our choice that claimed to have “greened” an industrial process. It was much more enjoyable and stimulating than the brain-numbing and soul-destroying revision for all the other exams I took. We need more courses like this that require thinking rather than regurgitation.

The Curious Wavefunction’s post “On Chemistry’s Multiple Cultures” made me think that the segregation between chemists starts at undergraduate level. I didn’t take many organic courses because I’m rubbish at learning all the reaction conditions, reagents and solvents that score you a lot of marks in the exam. Friends who struggle with equations and maths hate physical chemistry. They’ll then go on to be an “x” chemist who hates “y” and won’t have anything to do with it. Surely this is bad for chemistry as a whole? Perhaps if assessment methods changed so that they tested understanding rather than trivial details, students wouldn’t specialise so early and neglect whole swathes of their discipline.