
You (probably) don’t need to learn C

Wednesday 24 January 2024

On Mastodon I wrote that I was tired of people saying, “you should learn C so you can understand how a computer really works.” I got a lot of replies which did not change my mind, but helped me understand more how abstractions are inescapable in computers.

People made a number of claims. C was important because syscalls are defined in terms of C semantics (they are not). They said it was good for exploring limited-resource computers like Arduinos, but most people don’t program for those. They said it was important because C is more performant, but Python programs often offload the compute-intensive work to libraries other people have written, and these days that work is often on a GPU. Someone said you need it to debug with strace, then someone said they use strace all the time and don’t know C. Someone even said C was good because it explains why NUL isn’t allowed in filenames, but who tries to do that, and why learn a language just for that trivia?

I’m all for learning C if it will be useful for the job at hand, but you can write lots of great software without knowing C.

A few people repeated the idea that C teaches you how code “really” executes. But C is an abstract model of a computer, and modern CPUs do all kinds of things that C doesn’t show you or explain. Pipelining, cache misses, branch prediction, speculative execution, multiple cores, even virtual memory are all completely invisible to C programs.

C is an abstraction of how a computer works, and chip makers work hard to implement that abstraction, but they do it on top of much more complicated machinery.

C is far removed from modern computer architectures: there have been 50 years of innovation since it was created in the 1970s. The gap between C’s model and modern hardware is the root cause of famous vulnerabilities like Meltdown and Spectre, as explained in C is Not a Low-level Language.

C can teach you useful things, like how memory is a huge array of bytes, but you can also learn that without writing C programs. People say, C teaches you about memory allocation. Yes it does, but you can learn what that means as a concept without learning a programming language. And besides, what will Python or Ruby developers do with that knowledge other than appreciate that their languages do that work for them and they no longer have to think about it?
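That “memory is bytes” lesson is available in Python itself. As a minimal sketch, the standard-library struct module exposes an integer’s byte-level representation without a single line of C:

```python
import struct

# Pack a 32-bit integer and inspect its bytes: memory is just an
# array of bytes, visible here without writing any C.
data = struct.pack("<i", 0x12345678)  # little-endian 32-bit int
print(list(data))  # [120, 86, 52, 18], i.e. 0x78, 0x56, 0x34, 0x12
```

Swapping `"<i"` for `">i"` shows the big-endian layout, which makes byte order a five-minute experiment rather than a C project.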

Pointers came up a lot in the Mastodon replies. Pointers underpin concepts in higher-level languages, but you can explain those concepts as references instead, and skip pointer arithmetic, aliasing, and null pointers completely.
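In Python, for example, the aliasing that pointers make explicit shows up as plain reference semantics, with no pointer arithmetic in sight. A minimal sketch:

```python
# Assignment copies a reference, not the data, much as copying a
# pointer in C copies the address rather than the pointed-to object.
a = [1, 2, 3]
b = a            # b now refers to the same list object as a
b.append(4)      # a mutation through one name ...
print(a)         # ... is visible through the other: [1, 2, 3, 4]
print(a is b)    # True: two names, one object
```

Everything a working Python developer needs to know about aliasing is in those few lines, and none of it required null pointers or address arithmetic.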

A question I asked a number of people: what mistakes are JavaScript/Ruby/Python developers making if they don’t know these things (C, syscalls, pointers)? I didn’t get strong answers.

We work in an enormous tower of abstractions. I write programs in Python, which provides me abstractions that C (its underlying implementation language) does not. C provides an abstract model of memory and CPU execution which the computer implements on top of other mechanisms (microcode and virtual memory). When I made a wire-wrapped computer, I could pretend the signal travelled through wires instantaneously. For other hardware designers, that abstraction breaks down and they need to consider the speed electricity travels. Sometimes you need to go one level deeper in the abstraction stack to understand what’s going on. Everyone has to find the right layer to work at.

Andy Gocke said it well:

When you no longer have problems at that layer, that’s when you can stop caring about that layer. I don’t think there’s a universal level of knowledge that people need or is sufficient.

“like jam or bootlaces” made another excellent point:

There’s a big difference between “everyone should know this” and “someone should know this” that seems to get glossed over in these kinds of discussions.

C can teach you many useful and interesting things. It will make you a better programmer, just as learning any new-to-you language will because it broadens your perspective. Some kinds of programming need C, though other languages like Rust are ably filling that role now too. C doesn’t teach you how a computer really works. It teaches you a common abstraction of how computers work.

Find a level of abstraction that works for what you need to do. When you have trouble there, look beneath that abstraction. You won’t be seeing how things really work, you’ll be seeing a lower-level abstraction that could be helpful. Sometimes what you need will be an abstraction one level up. Is your Python loop too slow? Perhaps you need a C loop. Or perhaps you need numpy array operations.
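As a sketch of that last trade-off, assuming numpy is installed: the interpreted loop and the vectorized expression compute the same result, but numpy runs the loop in compiled code, so the abstraction one level up often removes the need for a C loop at all.

```python
import numpy as np

n = 1_000_000

# Pure-Python loop: every iteration goes through the interpreter.
total_loop = 0
for v in range(n):
    total_loop += v * v

# numpy: one array expression, with the loop executed in compiled code.
arr = np.arange(n, dtype=np.int64)
total_np = int((arr * arr).sum())

assert total_loop == total_np  # same answer, very different speed
```

The point isn’t that numpy is always the answer; it’s that “drop down to C” and “move up to array operations” are both valid responses to a slow loop, and knowing which layer to move to matters more than knowing C.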

You (probably) don’t need to learn C.

Comments

[gravatar]

I’ll start by saying I never learned C or any of its dialects. After almost 30 years in the IT industry ranks, I never ever needed it. I began with IBM’s RPG, from 2 up to 5, some Cobol, some Fortran, even some Assembler. Around ’91 I bought my first Visual Basic 3 implementation … and fell in love with it. Even today, now just as a hobby, I’m still using Visual Basic 2022. Oh yes, I also learned some Python but never got very fluent. I’ve written millions (possibly billions) of lines of code, in around 9 different countries … and never ever needed to learn C. Maybe if in the future I begin with Arduinos …

[gravatar]

Hey Ned, great article. I found it via the CodeProject newsletter daily link. I love this line: “We work in an enormous tower of abstractions.” This is so true!

I’ve been thinking lately about the nature of ego in our line of work – engineering, tech, dev. There’s a disproportionate amount of male presence in our world and, for good or ill, that makes for ego challenges. So many of these conversations about what language you should learn or know seem, to me, to come back to personal ego and assumed authority.

I’m also noticing parallels to the current conversation about the ethics of ChatGPT usage in code authoring. And yet, as you point out, the prolific use of libraries is such that very few of us can fully define how all of our code works now. Perhaps AI is just another form of abstraction.

aside: I loved your article before I saw your links in the upper right hand corner. I love it even more now.

[gravatar]

Agreed that almost no one needs to learn C, as it is a really crappy language. They should learn C++ instead.

[gravatar]

Frankly, imho learning C won’t provide as much insight as learning Assembler. Admittedly, there isn’t much going on for Assembler today; however, I do believe it would be a better choice than C.

[gravatar]

I’m a mathematician and came to work in industry 27 years ago. I learned Fortran in college, at first with punch cards. At my second job I had to learn C since that was the predominant language for our mathematical knowledge transfer tool at the time. It really helped me understand much about programming, and the hardware underlying it. Maybe hardware is not the right term. In any case, at that point in my career, with my limited knowledge of programming, it was extremely useful. I’m not suggesting that’s the situation now. As an aside, I could never get my arms around C++ and get a headache whenever I hear or read that term. I like Java and Python very much.

[gravatar]

I am a programming teacher, and I instruct students in the C and C++ languages, primarily those studying in scientific fields such as physics and mathematics. While I don’t entirely agree with you, I believe that not all academic profiles necessarily need to learn the C language. For example, electronic engineers benefit from learning it, but not all other physicists do. For mathematicians, learning Python might be more beneficial. However, for computer scientists, I think it is advantageous to start with the C language. I agree with your point that ‘C is an abstraction of how a computer works,’ but don’t forget that many programming languages are inspired by C. Learning C can facilitate an easier understanding of other languages. On the other hand, implementing certain cases such as data structures (linked lists, graphs, trees, forests, …) is often better done with C, in my opinion. Ultimately, everyone has their preferences in programming languages. For me, I prefer JavaScript and Java :)

[gravatar]
Leonardo Herrera 9:07 AM on 30 Jan 2024

Pointers came up a lot in the Mastodon replies. Pointers underpin concepts in higher-level languages, but you can explain those concepts as references instead, and skip pointer arithmetic, aliasing, and null pointers completely.

One observation I made during trade school (I learnt programming instead of going to “normal” high school), and later at university and even at work, is that there are two kinds of people: those who understand pointers and those who don’t, and it is a really good predictor of programming skill.

Of course, I learned later in life that to be a good, successful software developer, you need other skills that are rarely present in those who naturally understand pointers :D

[gravatar]

Over the years I have been faced with these issues over and over again. If anything, I have learned that when it is presented this way, it is more of a religious belief amongst script kiddies. I have used and tried many of these tools over the years and I see them today as such – that is, tools. And tools are something you select to use because they purely fit a certain requirement for your task. So do you need to learn C? It certainly depends on what you need to do. Do you need to learn Python? Rust? Java? Go? SQL? Or some other esoteric language? Absolutely the same answer. Every language is made to cover a specific domain, which it does very well … but if you try to force it to solve a problem outside its domain … well, that is often bloody hard and sometimes downright impossible. So know your tools, know what they are good for, and apply them accordingly.

[gravatar]

Well I program in C on custom embedded systems. Could I use something like Python or Rust? Yes as a language, but the C infrastructure is there (a very good GCC compiler and vendor supplied low level C libraries) where it isn’t there for Python or Rust (yet). I like C, it has never let me down in 40 years. Do I use C on the PC where the needs/constraints are very, very different? No, that is left to C# or Python. I haven’t written a C program for a PC in 30 years, although I do occasionally dust off some old C open source utility and compile it for Win32 so I can use it. For me it is “Use the right tool for the job at hand”. I think it is better to learn what interests you: learn some procedural language and some functional language and that will set you up for useful and hopefully enjoyable employment for a long time.
