A lot of the software that you use on your Mac, PC or Linux computer, not to mention your cell phone, is coded in some variant of the C language he invented. Every OS X Mac, and almost every major website, runs on a descendant of the Unix operating system, which he co-developed at Bell Labs in the early '70s. C was key to the success of Unix and its modern variant, Linux, both of which are written in C and can be easily ported, or converted, to new processors as they are invented. Ritchie was 71.
Reader Interactions
63 Comments
Trackbacks
-
[…] with UNIX (which they also co-created) and its successors (e.g., Linux) for almost that long. As mistermix puts […]
Winston Smith
Dennis Ritchie invented C (what, is Brian Kernighan now the Steve Wozniak of C?).
Bjarne Stroustrup fucked it up with C++.
Steve Jobs fixed it with Objective-C.
See, everything comes back to Steve Jobs. :-P
Keith
I haven’t used straight C in about 12 years, but I have been using one of its descendants (C++, then Java, then C#) almost every day of my life since then, weekends included. He is, was, and always will be a legend in the business.
cleek
“C is not a big language, and it is not well served by a big book.”
from the preface to the very skinny: “The C Programming Language”
henqiguai
Remember reading the Kernighan and Ritchie book back in my early hardware days, cover to cover, then re-reading it. Pre-PC era and I had no idea what I was trying to learn, just knew it was interesting stuff.
Why would anyone want to use C#?
PIGL
I still use straight C.
My proudest moment as a hacker was fixing a portability bug in eqn or tbl… I forget which, but Mr. Ritchie had written it.
I feel as people of the previous generation must have felt at the death of Turing or von Neumann.
amk
Obama raises more than $70 million during summer
PIGL
@cleek: I’ve modelled my scientific writing after these guys. Did you ever read the Plan 9 docs?
gene108
Something to do with companies that choose Microsoft technologies to develop their applications.
Bill E Pilgrim
It was great when he invented C because before that you had to play everything in Db or B.
RIP. I don’t do coding at all, at least beyond XML, but I always appreciated the genius behind this stuff. These really are new languages, and languages used to have to develop by themselves over thousands of years.
Ivan Ivanovich Renko
My K&R is in a box someplace in my basement… I could never get myself to let it go even long after I became obsolete and went over to the dark side.
RIP.
cleek
@henqiguai:
the .Net framework is pretty awesome.
coming from C++, C# is like walking into a fully-stocked wood-shop, when all you had previously was a sharp knife and a hammer.
arguingwithsignposts
@Bill E Pilgrim:
And where would the 12-bar blues be without C? Or folk music?
Bill E Pilgrim
@arguingwithsignposts: Capos would be built right into guitars.
arguingwithsignposts
@Bill E Pilgrim: speaking of guitars, have you seen Rocksmith?
Looks pretty cool.
Bill E Pilgrim
@arguingwithsignposts: Wow. You know, I’ve never had the slightest idea how “guitar video games” work, and studiously avoided ever finding out. The whole idea of using an instrument like that seems so bizarre to me, like turning a grand piano or an opera singer into a game controller.
Otherwise I’m really not very ludditish, though my gaming addiction period came to an end after Total Annihilation and Pharaoh, which in modern time scales is about the equivalent of 4,000 years ago.
One of my favorite games ever BTW since we’re on the early computer years here was “Sentry”, on my Commodore 64. That and Up Periscope, I think it was called, that I’d load on big floppy disks– never had so much fun on a computer.
RobertB
Still have my copy of K&R at work.
Maxwel
@Winston Smith
Wiki: “Objective-C was created primarily by Brad Cox and Tom Love in the early 1980s at their company Stepstone.”
C++ is great and you are a moron.
Grumpy Code Monkey
@Bill E Pilgrim:
I’m ashamed to say it took me a while to get that. I read Db as DBase, and there was a B language (IIRC, it goes CPL -> BCPL -> B -> C). When Stroustrup came up with C++, there was some question as to whether it would be named D (since that was the next letter of the alphabet) or P (since that was the next in the sequence BCPL).
C is a surprisingly powerful language, but it’s a product of its time – it just doesn’t have built-in support for a lot of technology that we’ve come to rely on, which is why it’s been eclipsed somewhat by newer tools like C# and Java.
C’s also a remarkably efficient language for shooting yourself in the foot.
Davis X. Machina
@Ivan Ivanovich Renko: Kernighan and Ritchie are the Rodgers and Hammerstein of CS. Yeah, they didn’t write Rent and probably couldn’t. But without Oklahoma!…
Mike
main(argc, argv)
int argc;
char **argv;
{
printf("Goodbye World");
}
C is the best programming language ever. UNIX is the best operating system ever. Great stuff… very sad to see this pioneer go…
Keith
@cleek: Right you are. It’s mainly about the framework, which is – compared to other frameworks – extremely consistent and therefore easy to use. Also, there are various decisions made with C# that make it perform better than Java (defaulting to non-virtual methods, for instance). Additionally, it evolves considerably faster than Java (especially since Oracle got its hands on the thing).
And FWIW, I find Objective-C to be nearly unreadable.
cleek
#include "stdio.h"
main()
{
int b=0,c=0,q=60,_=q;for(float i=-20,o,O=0,l=0,j,p;j=O*O,p=l*l,
(!_--|(j+p>4)?fputc(b?q+(_/3):10,(i+=!b,p=j=O=l=0,c++,stdout)),
_=q:l=2*O*l+i/20,O=j-p+o),b=c%q,c<2400;o=-2+b*.05);
}
yay, C!
Bill E Pilgrim
@Grumpy Code Monkey: Makes sense. Music profession all my life so I’m the other way around. I was just reading about C and saw that they named it that way following from “B”. I think they sent B off on an Ark with a bunch of people, right?
Maxwel
@Grumpy Code Monkey
C++ (and C) are languages for writing support libraries.
catclub
@Mike: I know that technically it is unnecessary in your program, but is it
exit(0) or exit(1)?
ed_finnerty
Thank God for C. The day we got our PDP 11 was a big day.
Before that I was writing in FORTRAN using 360 JCL.
Winston Smith
@Maxwel:
No, Steve Jobs invented Objective-C when he created the cell phone. Don’t you know ANYTHING?
…and you don’t have a tiny wiener. Sure. Whatever you say.
Mike
@ed_finnerty: JCL? Oh man, I feel sorry for you on that one… yuck! I always hated JCL… worst language ever!
@catclub: exit(0) would be the most appropriate. The number is used to return an error code that can be picked up by the shell process. exit(1) would be typically used when the program needs to abort due to some sort of error.
Walker
@Maxwel:
Even Stroustrup admits he got it wrong.
Lots of bad design decisions, from multiple inheritance to the way templates are compiled.
LarryB
@cleek: Not to speak ill of the dead, but it’s appropriate to remind everyone that Unix/C is a hoax! Also, too:
Belafon (formerly anonevent)
@Winston Smith: I will not learn Objective-C unless you pay me a buttload of money. As for Kernighan, he didn’t create C, he just helped write the book on it.
Belafon (formerly anonevent)
@Maxwel: Which is why I like C++. See the Boost Proto library for the ultimate metalibrary.
My goal today is to go around the programmers at my job and find that dividing line between those who know who Ritchie is and those who don’t. I’ll also have to tell a few stories, I suspect.
MikeJ
@Belafon (formerly anonevent): What’s wrong with objective c? I always thought it was pretty nice.
Ever do any smalltalk? Everybody should, just to get your brain working a new way. Same reason you should know Haskell too.
bago
@cleek: And a roomba.
Winston Smith
@Belafon (formerly anonevent):
Me neither. I haven’t programmed in anything but Java since 1996. Objective-C programmers tend to like it way better than C++. YMMV.
Oh. I always assumed otherwise. Thanks for setting me straight.
Dougerhead
Luckily, the free market destroyed the inefficiency of Bell Labs a few years ago.
prufrock
@Mike: Ew, non-ANSI-standard C. I’ve lost count of the number of hours I’ve spent changing our code to ANSI standard, just so the compiler would catch our mistakes.
Now I’m having flashbacks. Thanks, man!
Grumpy Code Monkey
@cleek:
Heh. How many other programming languages have a contest to see just how unintelligible you can make your code?
eldorado
objective-c is interesting to me for the smalltalk features that it is based around, but i found it very wordy using it to program mac os stuff. also, it’s nearly impossible to use a standard unix programming environment (vim etc), apple really loves the xcode ide.
i mostly work in ruby now, but all the interesting work seems to be going on in javascript these days.
Belafon (formerly anonevent)
@Grumpy Code Monkey:
According to wikipedia, Perl and Ruby.
thrashbluegrass
Point of order, Linux is a direct descendant of Minix, which is a cleanroom reimplementation of a POSIXish OS. Only the BSDs and their descendants (Solaris, Mac OS X, and IIRC, Cisco’s IOS) are actually descendants of the original Bell UNIX.
ETA: OS X is derived from NeXTSTEP, which does include some BSD odds and ends
And for the record, C is friggin’ awesome.
PopeRatzy
In the early days of USENET you could ask questions about C & UNIX in the tech newsgroups. Quite often the respondent was one of those gawds from Bell Labs: Ritchie, Aho, Kernighan or Thompson. The answer was definitive, and the snarky follow-ups from the other gawds made it worth the time to come up with a question that would interest them.
I still remember Ritchie et al on the Plan 9 bulletin board the day that Bell Labs was sold to Lucent (?memory fades, it was Lucent?). The descriptions of the mass panic as the *junta* took over were hysterically funny.
I truly miss the days when snark & UPA were an art form.
Amanda in the South Bay
@cleek:
Isn’t there a second volume covering data structures? I’ve always wanted to get it, but have never found it in a local used bookstore.
Winston Smith
@prufrock:
They don’t call it “Berkeley Assembler” for nothing.
@Grumpy Code Monkey:
There was never any point to doing so in C++ because ordinary code obfuscates itself. Thanks to operator overloading, the following line in C++ could literally do anything.
a = b + c;
Roger Moore
@Grumpy Code Monkey:
To steal Neal Stephenson’s analogy, C continues the Hole Hawg family of programming languages. It assumes you know what you’re doing and will blithely go ahead and do it without question. That makes it very powerful - you can do what you want - but also very dangerous, because you need to think through all the consequences of what you’re doing. I don’t think that’s appropriate for every task - there are lots of cases where you don’t need that power and a language that protects you from your mistakes is helpful - but when you need it, you really need it.
MTiffany
Now there’s your job creator. And that man’s labor directly led to the creation of more wealth for the whole goddamned planet than anything those motherfucking Koch brothers or Wall Street econoterrorists ever did or will. If capitalism here in the US really did reward hard work and innovation, Ritchie would have died the world’s only trillionaire. Herman Cain can kiss my ass.
RSA
@MikeJ:
Common Lisp and Scheme are also good, for the same reason. (My students, who mostly work with C++ and Java, seem to understand recursion a bit better after writing programs in CL, and they get something out of seeing a different model of object-oriented programming.)
MTiffany
That’s the best argument I’ve ever heard for requiring people to learn C before they’re allowed to vote.
Roger Moore
@thrashbluegrass:
No, it isn’t. Linux was re-coded from scratch. It originally used the Minix file system because Linus used Minix as a programming vehicle while writing it, but it didn’t re-use any Minix code.
kMc
Archlinux is amazing.
That’s all.
Belafon (formerly anonevent)
@Winston Smith: Yes, and because of that, I can write things like:
find_if( v.begin(), v.end(), _1 * _1 <= 50 )
thrashbluegrass
@Roger Moore:
Meh; sue me :p
Cain
@Roger Moore:
which is why it makes a great learning language. You’ll understand memory management really, really well, unlike with Java or .NET.
sri
henqiguai
@cleek (#11): And @gene108 (#8); yeah, that I understand, since I work in an all-Microsoft .NET and Office shop (enterprise productivity tools). As a QAer it’s always fun to start proselytizing Java and/or C++ just to aggravate the developers (yeah, evil, but I’m the biggest and craziest one in the company, with the exception of the boss).
RSA
@Cain:
I think it might be more accurate to say that if you learn to program in C well, then you’ll understand memory management. But that doesn’t necessarily make C a good learning language. To abuse a popular metaphor, it would be like saying that Formula One race cars are great for learning how to drive, because you’ll really gain an understanding of high-speed driving maneuvers.
C was one of my first languages (K&R is a great book), and I still like it, but it turns out that C isn’t for everyone, at least at the start.
Djur
@Maxwel: No, C++ is terrible. It is an awful language. I’d rather code in RPM on punchcards than touch another line of C++ as long as I live.
Djur
Ugh, I mean RPG of course. My mind’s already on work.
Rheinhard
C++ has a lot of weird, non-common-sensical compromises to both make it object-oriented yet also keep it backward compatible with non-object-oriented, few-steps-above-assembler C.
@MTiffany: I’m pretty sure somebody as key as Ritchie was pretty well compensated in his lifetime (as computer scientists go), but I’d also be surprised if he died much more than a single-m millionaire. (Or, to put it another way, that his whole net worth at the end was probably less than a single annual bonus for a VP of a major Wall Street trading firm.)
patrick II
How many people get assignments like that today? And look at the payoff.
greylocks
C is a great language for what it was intended for - to replace assembly language in systems programming. Back then (and yes, I’m that old), most computers didn’t even have true operating systems (PC-DOS, for example, was not an OS) and even if they did, device drivers were typically primitive to nonexistent, and the laundry list of services now routinely provided by modern OSes didn’t exist. For filling those gaps, C was an excellent choice, because it was much easier to code and read than AL but was still low-level enough that you could directly access the hardware with it. If well-written, it generated pretty fast code even before there were optimizers.
It was never really intended as a replacement for FORTRAN or any other application language. That it was adopted widely as an application programming language was mainly because…wait for it, libertarians!…the GOVERNMENT FORCED BELL LABS TO GIVE AWAY UNIX AND C FOR FREE! Yes folks, more proof that government intervention in the markets is evil. Given that a decent FORTRAN or COBOL compiler could cost several thousands of dollars – in 1970 dollars – C was an instant hit if for no other reason than the cost. Unix and C were the greatest bargains ever in the history of computing.
henqiguai
@Grumpy Code Monkey (#39):
How about APL? Thought I saw, somewhere and somewhen, something about such a contest. Too old???