flewellyn ([personal profile] flewellyn) wrote 2005-01-31 05:58 pm

ASCII silly question, getty silly ANSI.

A couple of geek friends of mine have asked me why it is, exactly, that I hate the C++ language so much, as well as its descendants. Since I would end up repeating myself a lot, and since this is a favorite rant of mine, I thought I would answer the question once and for all here. For the non-geeks in the audience, I shall provide a cut-tag, so that you don't need to scroll past a lengthy rant that you may not be interested in.



C++ was the first programming language I learned in earnest. I had tried various flavors of Basic (including Visual, ick) before, but C++ was the language I learned while I was learning what programming is, in general. So understand, C++ shaped a lot of my earlier understanding of how computers and programs work.

I've spent years undoing the damage.

You see, C++ is based on C. Now, I like C. C is a good language for doing low-level work, such as...writing an operating system. Surprise! That was its purpose! C was invented to write Unix, after all. And, for the most part, it's stood the test of time well; aside from anachronisms like the preprocessor (these days, the job it does would just be handled by the compiler's parser), and some issues I have with the irregularity of the syntax, I think it's a reasonably good language.

Like any programming language, C has a certain model of how computing works: statically-typed variables whose types correspond closely to machine registers and portions thereof, functions which are closely modeled on the runtime stack of the CPU, structures which are basically just collections of variables, arrays that are just sequences of values laid out according to the machine words (or portions thereof, again), and so on. It's very low-level (though not as low-level as assembly), but it works for that kind of job: OS-kernel work, system libraries, primitives for graphics operations, and so on.

This model implies a certain understanding of how computation works. It's very much a "state-machine" based idea: you have your data in specific slots, which you manipulate with specific statements. A particular datum is not an object in its own right; it has no real identity beyond its type (and that can be overridden easily enough), and it can "mean" different things in different places. C types are basically defined by their size: the only REAL difference between a char, an int, and a long is how many bits their representations use (double and float break this paradigm a little, but floating point has always been problematic). Memory, of course, has to be managed by hand in this model. This is very close to how the machine thinks, conceptually. C gives you the power and flexibility of assembly language, combined with the clarity and maintainability of assembly language.
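
To make that concrete, here's a minimal sketch of what "types are basically sizes" looks like in practice. (An illustrative example only; the exact output is implementation-dependent, since char signedness and type widths vary from platform to platform.)

    #include <stdio.h>

    int main(void) {
        /* A char is just a small integer: the same bits can be
           read as a character or as a number, no questions asked. */
        char c = 'A';
        printf("'%c' is also %d\n", c, (int)c);  /* 65 on an ASCII machine */

        /* The sign extension issue: whether this prints -1 or 255
           depends on whether plain char is signed on your platform. */
        char byte = (char)0xFF;
        printf("0xFF widened to int: %d\n", (int)byte);

        /* And the only REAL difference between the integer types is
           how many bits they get: */
        printf("char=%u int=%u long=%u bytes\n",
               (unsigned)sizeof(char), (unsigned)sizeof(int),
               (unsigned)sizeof(long));
        return 0;
    }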

Now, take C++. C++ is an attempt (a poor one, in my estimation) to add "object oriented" extensions to C. Object Oriented programming, though, comes from a completely different mental universe than the C model: to OO languages like Smalltalk or Lisp, data has its own essential identity. There is a real difference between a character of text and a number; whereas in C you could take a char value and easily interpret it as an int value, or vice versa (if you don't mind some issues with sign extension), in these languages "character" and "integer" are not the same at all. It's not just a difference of size, it's a difference of kind. Data is not merely clumps of bits anymore, it is objects that have identities, and purposes, and behaviors.

Extend this mode of thinking throughout the language, and you end up with a mental world in which, among other things, it no longer makes sense to have types adhere to variables: they adhere to values. Moreover, things which in C it would make no sense to give types acquire them: "function" is a data type in Lisp, for instance. Indeed, variables themselves are pieces of data ("symbols"), and can be manipulated. The need for direct pointer addressing disappears, since A) objects have identities which can be named, and B) the names themselves, as symbols, can be changed at will. This is what REAL "object oriented programming" is about. The things it's come to represent (classes, inheritance, polymorphism, etc.) are just emergent properties, epiphenomena if you will, of such a system. (It should come as no surprise that Lisp, which was not originally OO, was one of the first languages to become so; Smalltalk, by contrast, was designed as OO from the get-go.)

So, what does C++ do wrong? Why, it tries to mash these two completely inconsistent models together! It takes the epiphenomena of a truly OO language and grafts them clumsily onto the low-level, machine-oriented world of C. So: static typing, functions as procedures only, types determined at the primitive level by storage size. Not to mention the godawful syntax! What was merely irritating in places in C becomes intolerable in C++: the mess of periods, double-colons, semicolons, braces, brackets, wackets (my name for these things: <>), not to mention excessive parentheses due to "operator precedence" (a bad idea in math, and a bad idea in programming) causing all kinds of lossage in logical and arithmetic expressions...oi.

The particularly troublesome parts here are the type system and the memory model. Static typing makes sense at the low level C operates on, but when you try to mesh it with object oriented programming...ick. It does not work well. The only benefit that the C++ model of OO provides to C programmers is the ability to define functions which can work on data of different types: not such a bad deal, but you can do this in plain C using void pointers, anyway. Indeed, if you look at how OO is used in most C++ programs, it's not in the traditional "modelling object relationships" way, but as an end-run around the type system. Other than that, C++ classes are basically structs with functions inside them. You can do that in C. (Oh, okay, and they have "public" and "private" members, to enforce "information hiding"; being able to hide the implementation details of a library is not a bad thing, but the answer is not OO, the answer is a good packaging system.)
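
For the curious, that void-pointer end-run looks something like this. (A sketch of the standard qsort idiom, nothing more: the library shuffles opaque blobs of bytes, and your comparator imposes meaning on them.)

    #include <stdio.h>
    #include <stdlib.h>

    /* qsort neither knows nor cares what it's sorting; it just moves
       size-sized blobs around and asks us what "less than" means. */
    static int compare_ints(const void *a, const void *b) {
        int x = *(const int *)a;
        int y = *(const int *)b;
        return (x > y) - (x < y);
    }

    int main(void) {
        int nums[] = { 42, 7, 19, 3 };
        qsort(nums, sizeof(nums) / sizeof(nums[0]), sizeof(nums[0]),
              compare_ints);
        for (unsigned i = 0; i < sizeof(nums) / sizeof(nums[0]); i++)
            printf("%d ", nums[i]);
        printf("\n");
        return 0;
    }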

And don't even get me STARTED on the horror that is templates...a syntax that could only be spawned by Great Cthulhu, and semantics which could only be understood by one driven mad by the Great Old One.
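
If you've never had the pleasure, here's a small taste. (An illustrative sketch with arbitrary names, not anything from a real codebase. And yes, on a compiler of this era the space in "> >" is mandatory: leave it out, and the parser reads ">>" as the right-shift operator.)

    #include <cstddef>
    #include <map>
    #include <string>
    #include <vector>

    // Wackets all the way down. Note the required space in "> >".
    std::map<std::string, std::vector<std::vector<int> > > lookup;

    // "typename" exists solely to reassure the compiler that
    // Container::const_iterator names a type rather than a value.
    template <typename Container>
    std::size_t count_elements(const Container &c) {
        std::size_t n = 0;
        for (typename Container::const_iterator it = c.begin();
             it != c.end(); ++it)
            ++n;
        return n;
    }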

Now, fine, you might say, that takes care of C++, but what about Java? Or C#?

Java, I consider to be slightly better than C++. That is, however, damning with faint praise. Java did one thing right that C++ did not: garbage collection. (It is possible to use GC with C++, but it doesn't come standard.) And even then, they screwed up by putting the "new" operator in the language. Hello? What is the point? In C++, "new" made some sense, as you were doing manual memory allocation, and needed such an operator; in Java, this is just silly. (Not to mention the fact that Java managed to make I/O even more of a circus than C++, or the fact that you don't even have multiple inheritance, something that C++ implemented incompetently, but at least tried to do.) I feel similarly about C#, though I add that C# is even more like rounded-edged scissors than Java, in its attempts to prevent the programmers from shooting themselves in their feet. A programming language should not get in your way!

In summary, I think the problem with C++, Java, C#, and other languages based on them is twofold: 1) they tried to reinvent the wheel, and 2) they insisted on it being square.

[identity profile] limpingpigeon.livejournal.com 2005-01-31 04:31 pm (UTC)
C++ was the first and only programming language I attempted to learn via a college course.

I gave up on my short-lived idea to pursue a Computer Science minor in college because of C++. I thought I just hated and sucked at programming. But every programmer I've talked to has basically said that C++ is the worst possible programming language to learn first, and especially to use to learn how programming works.

And that's why I am not more of a computer geek than I am :)

[identity profile] flewellyn.livejournal.com 2005-01-31 04:36 pm (UTC)
See, this is why I want to try teaching you something else sometime. Maybe Lisp, it's nice and friendly. Yet weird. Like me!

And you admitted to being a geek! AHAHAHA!

[identity profile] xuincherguixe.livejournal.com 2005-01-31 04:53 pm (UTC)
*chuckles*

Of course you knew I was going to comment on the Java stuff :P


As far as the whole deal with the new operator goes, I suspect there is some reason they decided that whenever you create an object, you type out new. It might not have been a good reason, though. Maybe they wanted to make it really damn obvious when you're creating an object? If you just typed something like Whatever what = Whatever(stuff), it might look like it was just an ordinary method rather than a constructor.

Or they could have been on crack.


I've heard arguments for why Multiple Inheritance isn't a good thing, but I've forgotten what they were.


Java is good at eliminating a lot of bugs. Mostly by eliminating your ability to do stuff. It's strongly typed, and arrays are bounds-checked. The thing about these limitations, though, is that they're mostly just stopping you from doing stuff you probably wouldn't want to do anyways. If you did want to do something like access the 30th element in an array with a size of 2, Java obviously is not the language to use.

It's a good language to use for developing software. Especially complicated software. Which is why a lot of stuff is done like it is.

Which is not to say that I don't think that they should seriously overhaul a couple things. Quite frankly I'd like to be able to tell how much memory an object is using.

[identity profile] flewellyn.livejournal.com 2005-01-31 05:00 pm (UTC)
Yes, but Lisp is also strongly typed. Just not statically. The type checking is done at runtime, unless you declare otherwise. And bounds-checking is a good thing that many languages have done.

My issues with Java have to do with the static typing, the exception handling (unwinding before passing control to a handler? GEEZ!), the absolute convoluted nonsense that is standard I/O, the lack of multiple inheritance (yes, it's VERY useful, and the dangers only arise from the static typing model), the use of the older "message passing" OO model instead of the "generic function" model, and various other things.

That said, it is better than C++. But not much.

[identity profile] dcayn.livejournal.com 2005-01-31 05:25 pm (UTC)
I originally declared computer programming as my major when I started college. After my first programming class, I decided I was crazy. I didn't even make it through half the class b4 giving up. I'm a visual learner. Put me in front of a box, show me what to do, and I'll get it (eventually). Give me a book and tell me to memorize everything and I crack. Ever since that class I've been anti-programming. I'd like to learn java one day, or something else useful and easily processed, but c++ left a horrible taste in my mouth. Don't even get me started on flow charting. What the hell were they smoking?? It makes my brain hurt just thinking about it.

[identity profile] flewellyn.livejournal.com 2005-01-31 05:29 pm (UTC)
Being who I am, I suggest Lisp. Its syntax is simple enough to learn in five minutes, it's very regular, easy to work with, and so powerful that, even today, nearly 50 years after its invention, other languages have yet to catch up.

[identity profile] xuincherguixe.livejournal.com 2005-01-31 09:30 pm (UTC)
Java would likely trigger some repressed memories then.

I also recommend Lisp, and I haven't even programmed much in it. (I was looking at the tutorial; it's super easy to use.)

[identity profile] flewellyn.livejournal.com 2005-02-02 05:58 pm (UTC)
I'm waiting until you discover lexical closures, macros, read macros, and CLOS. :-)

[identity profile] buzz.livejournal.com 2005-01-31 05:58 pm (UTC)
Hence I'm happy as a clam dealing with C/C++ in my happy little embedded real-time world. ;)

[identity profile] flewellyn.livejournal.com 2005-02-01 02:22 am (UTC)
Yes, but you embedded realtime folks are insane. :-)

[identity profile] bear-foot.livejournal.com 2005-02-01 10:53 am (UTC)
Flew, you have it wrong... Best Languages: 1) Cobol, 2) Snobol, 3) Chef (http://www.dangermouse.net/esoteric/chef.html)

I really shouldn't do this but...

[identity profile] mrz80.livejournal.com 2005-02-01 08:03 pm (UTC)
...I just can't resist: (From the hoary old bit, "Real Programmers Don't Use Pascal")... If you can't do it in FORTRAN, do it in assembly language. If you can't do it in assembly language, it isn't worth doing.

Anyhow, I started with BASIC, then moved to a mishmash of FORTRAN, Pascal, and various assemblers, with side orders of C, LISP and dBASE III. Somewhere in there came a fair helping of COBOL, with emphasis on how to write in IBM's CICS environment. What can I say, it came in handy for work.

I like what you wrote about C++; it solidified some things that really rubbed me raw back when my degree compelled me to wallow in it for a semester or two.

Re: I really shouldn't do this but...

[identity profile] flewellyn.livejournal.com 2005-02-01 08:06 pm (UTC)
Actually, REAL programmers write in machine code. :-)

See The Story of Mel (http://www.catb.org/~esr/jargon/html/story-of-mel.html) for a rather amusing and amazing example.

Re: I really shouldn't do this but...

[identity profile] mrz80.livejournal.com 2005-02-02 07:55 am (UTC)

Re: I really shouldn't do this but...

[identity profile] flewellyn.livejournal.com 2005-02-02 04:36 pm (UTC)
I remember when UF was funny...

Re: I really shouldn't do this but...

[identity profile] xuincherguixe.livejournal.com 2005-02-06 09:34 pm (UTC)
Woah, I've never met anyone who lived during the period of the dinosaurs!