I have a confession to make. At one point in my career, I was a mediocre programmer. I might say that I still am, but only as a harsh grader: I developed a scale for software engineering on which, in intellectual honesty, I can only assign myself 1.8 points out of a possible 3.0. One of the signs of my mediocrity is that I haven’t a clue about many low-level programming details that, thirty years ago, people dealt with on a regular basis. I know what L1 and L2 cache are, but I haven’t yet built the skill set to make use of that knowledge.
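To make that concrete, here’s a toy sketch of the kind of detail I mean (my own illustrative example, not production code): two functions that perform identical arithmetic, where loop order alone determines whether the cache works for you or against you.

```c
#include <stdio.h>
#include <stdlib.h>

#define N 2048  /* 2048 x 2048 doubles: 32 MB, far larger than L2 cache */

/* Row-major traversal: consecutive accesses land on the same cache
   line, so most reads are served out of L1/L2 cache. */
double sum_row_major(const double *m) {
    double total = 0.0;
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            total += m[i * N + j];   /* stride-1 access */
    return total;
}

/* Column-major traversal of the same data: each access jumps
   N * sizeof(double) bytes, so nearly every read misses the cache. */
double sum_col_major(const double *m) {
    double total = 0.0;
    for (size_t j = 0; j < N; j++)
        for (size_t i = 0; i < N; i++)
            total += m[i * N + j];   /* stride-N access */
    return total;
}

int main(void) {
    double *m = calloc((size_t)N * N, sizeof *m);
    if (!m) return 1;
    printf("%f %f\n", sum_row_major(m), sum_col_major(m));
    free(m);
    return 0;
}
```

On typical hardware, the row-major version can run several times faster, purely because stride-1 access keeps each fetched cache line hot while it’s resident. That’s the sort of knowledge I mean: knowing the fact is easy; designing code around it is the skill.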
I love high-level languages like Scala, Clojure, and Haskell. The abstractions they provide make programming more productive and fun than it is in languages like Java and C++, and they have a beauty that I appreciate as a designer and mathematician. Yet there is still quite a place for C in this world. Last July, I wrote an essay, “Six Languages to Master”, in which I advised young programmers to learn the following languages:
- Python, because one can get started quickly and Python is a good all-purpose language.
- C, because there are large sections of computer science that are inaccessible if you don’t understand low-level details like memory management.
- ML, to develop taste in a simple language often described as a “functional C”, one that also teaches how to use type systems to make powerful guarantees about programs.
- Clojure, because learning about language (which is important if one wants to design good interfaces) is best done with a Lisp, and because, for better or for worse, the Java libraries are a part of our world.
- Scala, because it’s badass if used by people with a deep understanding of type systems, functional programming, and the few (very rare) occasions where object-oriented programming is appropriate. (It can be, however, horrid if wielded by “Java-in-Scala” programmers.)
- English (or the natural language of one’s environment) because if you can’t teach other people how to use the assets you create, you’re not doing a very good job.
Of these, C was my weakest at the time, and it still is. Now I’m taking some time to learn it. Why? Two reasons.
- Transferability. Scala’s great, but I have no idea if it will be around in 10 years. If the Java-in-Scala crowd adopts the language without upgrading its skills, and the language becomes associated with Maven, XMHell, IDE culture, and commodity programmers in the way that Java has, the result will be piles of terrible Scala code that will brand the language as “write-only” and damage its reputation for reasons that are not Scala’s fault. These sociological variables I cannot predict. I do, however, know that C will be in use in 10 years. I don’t mind learning new languages (it’s fun, and I can do it quickly), but the upshot of C is that, if I know it, I will be able to make immediate technical contributions in almost any programming environment. I’m already fluent in about ten languages; I might as well add C.
- Confidence. High-level languages are great, but if you develop the attitude that low-level languages are “unsafe”, ugly, and generally terrifying, then you’re hobbling yourself for no reason. C has its warts, and there are many applications where it’s not appropriate. It requires attention to details (array bounds, memory management) that high-level languages handle automatically. The issue is that, in engineering, anything can break down, and you may be required to solve problems in the depths of detail. Your beautiful Clojure program might have a performance problem in production because of an issue with the JVM, and you might need to dig deep and figure it out. That doesn’t mean you shouldn’t use Clojure. However, if you’re scared of C, you can’t study JVM internals or performance considerations, because a lot of the core concepts (e.g. memory allocation) become a “black box”. Nor will you be able to understand your operating system.
For me, personally, the confidence issue is the important one. In the functional programming community, we often develop an attitude that the imperative way of doing things is ugly, unsafe, wrong, and best left to “experts only” (which is ironic, because most of us are well into the top 5% of programmers and better equipped to handle complexity than most; it’s this adeptness that makes us aware of our own limitations and prefer functional safeguards when possible). Or rather, I shouldn’t call this a prevailing attitude so much as an artifact of communication. Fifty-year-old, brilliant functional programmers talk about how great it is to be liberated from evils like malloc and free. They’re right, for applications where high-level programming is appropriate. The context being missed is that they have already learned memory management quite thoroughly, and now it’s an annoyance to them to keep having to do it. That’s why they love languages like OCaml and Python. It’s not that low-level languages are dirty or unsafe or even “un-fun”, but that high-level languages are much better suited to certain classes of problems.
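For readers who haven’t written C, here’s a minimal sketch (my own toy code, with hypothetical names) of the bookkeeping under discussion: a growable integer array in which allocation, growth, bounds, and cleanup are all the programmer’s explicit responsibility.

```c
#include <stdio.h>
#include <stdlib.h>

/* A growable array of ints: the kind of structure a high-level
   language provides for free, spelled out by hand in C. */
typedef struct {
    int    *data;
    size_t  len;
    size_t  cap;
} IntVec;

/* Append x, growing the buffer when full. Returns 0 on success. */
int intvec_push(IntVec *v, int x) {
    if (v->len == v->cap) {                      /* capacity check: ours to do */
        size_t new_cap = v->cap ? v->cap * 2 : 8;
        int *p = realloc(v->data, new_cap * sizeof *p);
        if (!p) return -1;                       /* allocation can fail */
        v->data = p;
        v->cap  = new_cap;
    }
    v->data[v->len++] = x;
    return 0;
}

void intvec_free(IntVec *v) {
    free(v->data);                               /* every malloc/realloc... */
    v->data = NULL;                              /* ...needs a matching free */
    v->len = v->cap = 0;
}

int main(void) {
    IntVec v = {0};
    for (int i = 0; i < 100; i++) {
        if (intvec_push(&v, i * i) != 0) {
            fprintf(stderr, "out of memory\n");
            intvec_free(&v);
            return 1;
        }
    }
    printf("v.data[99] = %d\n", v.data[99]);     /* staying in bounds is on us */
    intvec_free(&v);                             /* forget this and you leak */
    return 0;
}
```

A garbage-collected language does all of this invisibly. Here, forgetting the final free is a memory leak, and indexing past len is undefined behavior. Once the pattern is internalized it becomes mechanical, which is exactly why programmers who learned it long ago are happy to have it automated.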
Becoming the Mentor
I’m going to make an aside that has nothing to do with C. What is the best predictor of whether someone will remain at a company for more than 3 years? Mentorship. Everyone wants “a Mentor” who will take care of his career by providing interesting work, freedom from politics, necessary introductions, and well-designed learning exercises instead of just-get-it-done grunt work. That’s what we see in the movies: the plucky 25-year-old is picked up by the “star” trader, journalist, or executive and, over 97 long minutes, his or her career is made. Often this relationship goes horribly wrong in film, as in Wall Street, wherein the mentor and protégé end up in a nasty conflict. I won’t go so far as to call this entirely fictional, but it’s very rare. You can find mentors (plural) who will help you along as much as they can, and you should always be looking for people interested in sharing knowledge and helping others, but you shouldn’t look for “The Mentor”. He doesn’t exist. People want to help those who are already self-mentoring. This is even more true in a world where few people stay at a job for more than 4 years.
I’ll turn 30 this year, and in Silicon Valley that would entitle me to a lawn and the right to tell people to get off of it, but I live in Manhattan, so I’ll have to keep using the Internet as my virtual lawn. (Well, people just keep fucking being wrong. There are too many for one man to handle!) One of the most important lessons to learn is the importance of self-mentoring. Once you get out of school, where people are paid to teach you stuff, people won’t help you unless you’re already helping yourself. To a large degree, this means becoming the “Mentor” figure that one seeks. I think that’s what adulthood is. It’s when you realize that the age in which there were superior people at your beck and call to sort out your messes and tell you what to do is over. Children can be nasty to each other, but there are always adults to make things right: to discipline those who break others’ toys, and to replace what is broken. The terrifying thing about adulthood is the realization that there are no adults. This is a deep-seated need that the physical world won’t fill. There are at least 10,000 years of recorded history showing people gaining immense power by making “adults-over-adults” up, and using the purported existence of such creatures to arrogate political power, because most people are frankly terrified of the fact that, at least in the observable physical world and in this life, there is no such creature.
What could this have to do with C? Well, now I dive back into confessional mode. My longest job tenure (30 months!) was at a startup that seems to have disappeared after I left. I was working in Clojure, doing some beautiful technical work. This was in Clojure’s infancy, but the great thing about Lisps is that it’s easy to adapt the language to your needs. I wrote a multi-threaded debugger that used dynamic binding (dangerous in production, but fine for debugging) and required getting into the guts of Clojure, as well as a test harness, an RPC client-server infrastructure, and a custom NoSQL graph-based database. The startup itself wasn’t well managed, but the technical work was a lot of fun. Still, I remember a lot of conversations to the effect of, “When we get a real <X>”, where X might be “database guy” or “security expert” or “support team”. The attitude I allowed myself to fall into, when we were four people strong, was that a lot of the hard work would have to be done by someone more senior, someone better. We wrongly believed that the scaling challenges would mandate this, when in fact we didn’t scale at all, because the startup never launched.
Business idiots love real X’s. This is why startups frequently develop the social-climbing mentality (in the name of “scaling”) that makes internal promotion rare. The problem is that this “realness” is total fiction. People don’t graduate from Expert School and become experts. They build a knowledge base over time, often by going far outside of their comfort zones and trying things at which they might fail, and the only things that change are that the challenges get harder, or the failure rate goes down. As with the Mentor that many people wait for in vain, one doesn’t wait to “find a Real X” but becomes one. That’s the difference between a corporate developer and a real hacker. The former plays Minesweeper (or whatever Windows users do these days) and waits for an Expert to come from on high to fix his IDE when it breaks. The latter shows an actual interest in how computers really work, which requires diving into the netherworld of the command line interface.
That’s why I’m learning C. I’d prefer to spend much of my programming existence in high-level languages, not micromanaging details (although, thus far, C has proven surprisingly fun), but I realize that these low-level concerns are extremely important, and that if I want to understand things truly, I need a basic fluency in them. If you fear details, you don’t understand “the big picture”; the big picture is made up of details, after all. This is a way to keep the senescence of business FUD at bay: to not become That Executive who mandates hideous “best practices” Java because Python and Scala are “too risky”.
Fear of databases? Of operating systems? Of “weird” languages like C and Assembly? Y’all fears get Zero Fucks from me.