DEV Community

Erik

Posted on • Originally published at erikscode.space

Why You Shouldn't Learn C

#c


Knowledge of the C programming language is often touted as the mark of a “true” programmer. You don’t really know programming unless you know this language, or so the wisdom goes. Many aspiring programmers have been advised by senior developers (or gatekeepers) to learn C to up their skills and bring them to the next level. That’s what this blog is all about, leveling up, so let’s discuss why learning C might be a waste of your time.

What’s The Point of This Article?

This blog is dedicated to helping junior programmers level themselves up, bridging the gap between “Programming 101” and advanced levels of software engineering. There are a ton of steps in between; is learning this esoteric language one of them?

Conventional wisdom says yes, and if you want to convince yourself to learn the C programming language, you don’t need to look very far. Lots of blog posts are dedicated to telling you to learn C, and plenty of online forum threads ask “should I learn C?”, with most answers being a resounding yes.

To me, this is information that has been parroted so many times that no one really thinks about it anymore. The typical boilerplate reasons for learning C are given in every one of these posts or Q&As. At this point, it’s one of those “well, I learned it, so everyone else should too” things.

Personally, I think the advice out there regarding this is wrong and I’ll tell you why. I don’t want you to needlessly waste your time on something that will not pay off in the long run. So here are the myths of learning C, why it might be a waste of your time, and what you can do instead.

Myth #1 – It’s the lingua franca of programming languages

This is the idea that C is the inspiration for every (or most) programming language that came after it. This is actually true for the most part; the myth is that this fact is at all meaningful.

This is the same reasoning behind learning Latin to become a better English speaker. Sure, in the process of learning Latin, you will gain a perspective on your native language (if it happens to be English) that you wouldn’t otherwise get. However, this is true with learning any new language.

Likewise, learning C isn’t going to magically make you a better JavaScript programmer any more than learning Perl will make you a better PHP programmer. There is value in learning the ins and outs of a new language and seeing how that language community addresses common problems. The value isn’t in the language itself, though; it’s in the problem-solving skills you exercise.

Will learning C make you a better programmer in your day-to-day language? If you’re an absolute beginner, yes, but so will learning any language. If you’ve been writing code for about two years? Maybe marginally, but again, so will learning any other language.

The better way to level up

Like I’ve said repeatedly now, learning a new language will make you a better programmer and is a fine approach to leveling up your skill set. But let’s put some thought into which language to learn to optimize our time instead of reaching for that old copy of K&R C like everyone on the internet told you to.

If you’re a beginner and you’ve learned one programming language, your second one should be either:

  1. A language of a different paradigm
  2. A language relevant to your preferred domain

A language of a different paradigm means that if you’ve learned an object-oriented programming language, you should learn a functional programming language now. Or better yet, learn functional programming concepts in the context of your original language.

Many programming languages have both OOP and FP features and you can learn the basics of both in one language. Both Python and JavaScript are great examples of this. Even modern C# and Java support many FP features. If you can grasp both paradigms in your main language, you can really get some cool stuff done.
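To make that concrete, here’s a minimal sketch in Python (one of the languages mentioned above) of the same small task written once in an object-oriented style and once in a functional style. The class and function names are made up purely for illustration.

```python
# Hypothetical example: the same task in OOP and FP styles within one language.
from dataclasses import dataclass
from functools import reduce

# OOP style: state and behavior bundled together in a class
@dataclass
class Cart:
    prices: list

    def total(self) -> float:
        return sum(self.prices)

# FP style: a pure function built from higher-order functions, no mutable state
def cart_total(prices) -> float:
    return reduce(lambda acc, price: acc + price, prices, 0.0)

print(Cart([1.50, 2.25, 3.00]).total())   # 6.75
print(cart_total([1.50, 2.25, 3.00]))     # 6.75
```

Same result, two paradigms, zero new languages required.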

Learning a language relevant to your preferred domain means thinking about what kind of programming you want to do, and learning a prevalent language in that domain. If you’re into data science and you already know Python, learn R. If you like web development and you learned JavaScript, give a back-end language a go (PHP, Ruby, Python). If you want to get into embedded programming, learn C.

With that last sentence, please take note that I am not saying there is no reason to learn C. There are times when it’s relevant and knowing it is a prerequisite for some forms of programming.

Myth #2 – You’ll learn how computers work

This is perhaps the most cited reason the gatekeepers give for learning the C programming language. “C gets you as close to the metal as you can get without learning assembler,” as one comment I read put it, or “you really learn about computer architecture when you learn C.”

These things just aren’t true. First of all, if getting “as close as you can get to the metal” is important, then why aren’t there as many people advocating for learning assembler as there are for learning C? Wouldn’t that be better by definition? In fact, you will certainly learn a lot about how computers work if you learn assembler, so why is C the magic language for learning about a “computer’s architecture?”

To put it bluntly, it’s not. And even if C did bestow some kind of arcane computer architecture knowledge, what good does that do you? If knowing how computers work is that important to programming, why do we stop at C? Why not the basics of electrical engineering? Theory of computation? Physics?

When people say “it gets you close to the metal” they actually mean “you learn about pointers,” which aren’t that complicated (though certainly not simple). Sure, you need to know C if you want to write device drivers, embedded software, or operating systems, and these are “close to the metal” types of things. But unless those are the kinds of things you want to work on, working on them won’t make you a better programmer in general.

If you want to be a web developer, writing a printer driver isn’t going to help you as much as building websites will. If you want to be a data engineer, writing embedded software isn’t going to help you as much as learning about building data pipelines will. If you want to make mobile apps, writing an operating system isn’t going to help you as much as writing more mobile apps will.

The vast majority of gatekeepers who advocate C as a must-know language will advise you to learn enough of the language to work with pointers. I think the inherent value in learning about pointers is actually in learning about memory management. For most of us, you can learn all you’ll ever really need to know about memory management in an afternoon or two of Wikipedia articles, YouTube videos, and blog posts on the topic. The rest you can learn as problems arise.

The better way to level up

If you want to know how computers work, writing code is really just a start, if it’s relevant at all. You’ll also have to learn about circuits, electronics, physics, and so on. And of course, there is nothing wrong with this if that’s what makes you tick, but these things are not prerequisites for being a good programmer or software engineer.

Memory management, and to some extent garbage collection, actually is important to know, especially when chasing tricky performance bugs. Do a Google search for how memory works in computers, then read up on how your target language handles memory management and garbage collection. This is really the extent of what you need to know as a beginner.
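As a rough sketch of where that reading leads, here’s how you could poke at memory management in Python (assuming that’s your target language): reference counting plus a cycle-collecting garbage collector. The variable names are arbitrary.

```python
# Sketch: observing reference counting and the cycle-collecting GC in Python.
import gc
import sys

a = []
b = [a]
a.append(b)                 # a and b now reference each other, forming a cycle

print(sys.getrefcount(a))   # the count is one higher than you might expect because
                            # the function argument temporarily holds a reference

del a, b                    # the names are gone, but the cycle keeps both lists alive
print(gc.collect())         # the cycle collector finds and frees them;
                            # prints how many unreachable objects were collected
```

An afternoon playing with something like this teaches you most of what day-to-day work will ever ask of you.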

Instead of learning “how computers work,” I think it’s more beneficial to learn different kinds of systems administration. For example, an aspiring web developer should have at least basic skills in administering a Linux server, a data engineer will find value in learning how to work with databases, and just about everyone can benefit from some basic cloud administration skills.

Unless your programming goals are to work in environments that require knowing C, learning the aforementioned skills will pay far greater dividends.

Myth #3 – You’ll be better at debugging

This is another reason the gurus give for why you should learn C. The reasoning goes that, because many compilers and interpreters are written in C, the trickier bugs require knowledge of the language to troubleshoot.

For example, the Python interpreter is written in C. Say you’re writing your company’s CRM in Django, and as more and more users pile on, the application gets more and more complicated. Eventually, the complexity and usage will cause some weird behavior that will have you tearing your hair out trying to understand it.

The logic here is that knowing C will somehow magically make you able to interpret this behavior better, but this just isn’t true. Performance bugs are inherently tricky, and knowing C will only make you marginally more capable of figuring out the cause.

The better way to level up

Learning C to get better at debugging is like taking up jogging to get better at swimming. The best way to get better at debugging is by practicing debugging.

Instead of learning C, learn more about your language’s error messages, or how certain exceptions are triggered. Practice reading logs and error messages. Learn the ins and outs of your language’s profiling capabilities. Better yet, learn how to use a debugging tool. If you’re into web development, getting intimately familiar with Chrome’s developer tools will be 100x more valuable than learning C.
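For instance, here’s a minimal sketch of leaning on a language’s built-in profiler instead of C knowledge, using Python’s cProfile. The slow function is made up just to give the profiler an obvious hotspot to report.

```python
# Sketch: profiling a deliberately slow function with Python's built-in cProfile.
import cProfile
import pstats

def slow_join(n):
    s = ""
    for i in range(n):
        s += str(i)          # repeated string concatenation, a classic performance bug
    return s

profiler = cProfile.Profile()
profiler.enable()
slow_join(100_000)
profiler.disable()

# Show the five most expensive calls by cumulative time
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```

Ten minutes with output like this tells you more about your performance bug than a semester of pointer arithmetic would.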

I try not to use anecdotal evidence for anything but in this case, I can’t resist. Recently, I wrote some Ruby code that was SO BAD, I was triggering C compiler errors. I was able to figure out the problem though, not because I knew C, but because I am not an idiot. It’s not like you become illiterate to error messages if you don’t know C, you just have to read them a little closer.

Myth #4 – It just makes you better

The worst gatekeepers love giving this advice, “learning C just makes you a better programmer.” As if learning C was the puberty of programming and the pathway to true 10x adulthood.

Again, this is patently absurd. If you’re a PHP developer and your web apps are insecure, knowing C isn’t going to change that. If your favorite code to write is data crunching in Jupyter notebooks, learning C isn’t going to make you any better at it.

Every now and then, I’ll see someone mention that the C language is the best way to learn data structures and algorithms. I think this belief stems from the fact that most of these people probably learned DS&A in C because that’s how their university taught it.

If these things are so important and language-agnostic, you can learn them in any language.
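To illustrate the language-agnostic point, here’s a bubble sort sketched in Python; it would look nearly identical in C, Java, or Ruby, which is exactly why there’s nothing magical about learning it in C.

```python
# Sketch: a bubble sort that reads much the same in just about any language.
def bubble_sort(items):
    items = list(items)                      # sort a copy, leave the input alone
    for end in range(len(items) - 1, 0, -1):
        for j in range(end):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items

print(bubble_sort([5, 3, 8, 1, 2]))          # [1, 2, 3, 5, 8]
```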

The better way to level up

The best way to get better at programming is by programming. In fact, knowing just the syntax of a language doesn’t actually mean you know that language and certainly doesn’t mean it’s time to learn a new one.

Instead, learn design patterns. Learn basic software architecture. Learn how to do TDD in your target language. Learn a popular library or framework in your language. Work on a personal project in your language. All these things will have exponentially more benefits than learning C.
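As one small example of the TDD suggestion, here’s a sketch using Python’s built-in unittest: the tests come first and drive a tiny slugify function. The function’s name and behavior are invented for illustration.

```python
# Sketch: test-first development with Python's built-in unittest module.
import unittest

def slugify(title: str) -> str:
    # Written just to make the tests below pass
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_spaces_become_dashes(self):
        self.assertEqual(slugify("Level Up Your Skills"), "level-up-your-skills")

    def test_single_word_is_lowercased(self):
        self.assertEqual(slugify("Django"), "django")

if __name__ == "__main__":
    unittest.main()
```

A few cycles of red-green-refactor in your own language will sharpen you faster than a detour through someone else’s favorite one.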

When you should learn C

This article borders on sacrilege, and as such, people are likely to read things into it that I am not saying. Let me be clear: I am not trashing the C language. C is useful and powerful; it has an important role in the history of computing.

If this article is trashing anything, it’s the gatekeepers that tell impressionable programmers to waste their time learning something just because that’s what the old gatekeepers told them when they were impressionable programmers.

There are times you need to know C. There are times when it’s imperative in fact, and let there be no record showing I ever said anything to the contrary. You should learn C if:

  • You’re writing embedded software and the best way to do that is in C
  • You’re writing a device driver and the best way to do that is in C
  • You’re taking a class in school that requires you to learn C
  • You’re contributing to a project that is written in C
  • You just want to learn C

If you want to learn C for no other reason than you just want to, then do it. Keeping yourself engaged and interested is what will keep you coming back and trying over and over. This is what leads to skill.

Top comments (2)

Akim (mousticke.eth) @Colossos • Edited

I am one of those who think that you should learn C. Why?
First, algorithms for searching, sorting, and basic data structures are more approachable in pseudocode and C.
C is a statically typed language, and that helps you build good habits from the beginning. Moreover, you get a systems-level view of the algorithm and are more aware of memory concerns.
Sure, you can learn data structures in Java, PHP, Python, or JavaScript. But they are not as strict as C (well, Java is strongly typed). Java or C# requires, in my view, some basics of OOP and knowledge of a “List”, “Map”, “Stack”…
Indeed, if you already have some knowledge of programming, then you can say you don’t need C anymore because C++ can be the next choice, maybe. Or maybe you are now in back-end or front-end development. C has its own purpose, but it is a good start for your general programming culture. 90% of engineers/developers have C as their foundation.

Erik • Edited

knowing how algorithms are implemented doesn't make you a better developer, and they can be studied more easily in other languages (i can write a bubble sort in JS, Python, Java, or Ruby; why would adding C to that list make things any better?). and by the time you need to troubleshoot a performance issue, that's when you can learn about algos; studying them beforehand in C won't make a difference.

also, why does static typing make for good practice? what if you just want to build something cool? it's more important to learn interesting things and make coding fun at the beginning than it is to learn why 'my name' can't be an int.

why does being aware of memory concerns matter? my first language was C and I still overflow buffers.

finally, 90% of engineers and developers have a C foundation? i would love to see your source on that, because it sounds like a statistic you made up to support a point (like the gatekeepers i'm writing about).