And it's so much better to go "Oh, this is so much easier than C!" rather than "Oh, this is so much worse than Python!" later on. It also tends to be easier to learn a new, easier way to do something in a new language than to figure out the harder way with the easy way looming in the back of your mind.
Assembly's too high-level :) In my school's sophomore year, CS majors are required to implement a CPU on an FPGA; the instruction set, CPU architecture, everything are up to the student groups to design and implement. At the end of the class, all the student processors are compared w.r.t. size and speed of execution of various simple programs. Our group had a guy that wrote an assembler for our ISA, but most of the groups just wrote their programs in raw machine code, using a hex editor.
Really, nothing teaches low-level programming better than starting with gates and building your way up to programs. The sequel to that class teaches how to deal with pipelining, CPU caches, and multiple-execution chips like the TI DSPs, so after you design your processor, you get to see how real CPUs work. It's really fun, actually.
The intro computer architecture course used _Computer Organization and Design_ by Patterson and Hennessy. The practical component described here was driven by handouts; I'm not sure the text covers building something like this.
Perhaps C offers a sweet spot of being able to accomplish something with minimal effort, but still gaining a deep understanding of what the computer is actually doing.
Those are actually good ideas, and if you received a computer science degree from Carnegie Mellon University, you may have witnessed first hand their actualization in CS 213, Introduction to Computer Systems:
1. Ints are not integers, floats are not reals. Our finite representations of numbers have significant limitations, and because of these limitations we sometimes have to think in terms of bit-level representations.

2. You've got to know assembly language. Even if you never write programs in assembly, the behavior of a program sometimes cannot be understood purely on the basis of the abstraction of a high-level language. Further, understanding the effects of bugs requires familiarity with the machine-level model.

3. Memory matters. Computer memory is not unbounded. It must be allocated and managed. Memory referencing errors are especially pernicious: an erroneous update of one object can cause a change in some logically unrelated object. Also, the combination of caching and virtual memory provides the functionality of a uniform unbounded address space, but not the performance.

4. There is more to performance than asymptotic complexity. Constant factors also matter, and there are systematic ways to evaluate and improve program performance.

5. Computers do more than execute instructions. They also need to get data in and out, and they interact with other systems over networks.
That's a 200-level class. As far as I know, there is always a CS1 100-level class that usually starts with some EXTREMELY high-level language, sometimes a metalanguage made just for the class itself. I hear Python is in style for this. Here is one such course: http://www.academicearth.org/courses/introduction-to-compute...
*EDIT - I made a stupid comment... they have C assignments in that class. I'll leave it for others' amusement. I have seen that style of starting with a high-level language, though...
I never liked this argument. It has no inherent limits and no proper evidence behind it. You could just as easily say "you should not use a washing machine until you understand the pain of washing clothes with cold water." Or that you should drive stick before going automatic. Or start programming in assembly, so everything will seem easier afterwards.
Yes, modern languages are better than C. But they are not perfect. Programmers using them still have a lot of problems to solve and things to learn; they're just different things than with C, or assembly, or wiring transistors by hand. Face it: pointer arithmetic, while tremendously important in C, isn't really very useful in other languages. Knowing when the GC likes to start collecting, on the other hand, may be a good piece of information.
Wouldn't it be silly if a grown man got a bit of mud on his shoes and yelled "OH NO" and ran out of the room to the nearest washing machine to get his entire suit cleaned? In my experience, these are the actionscript-only coders, the people who only ever learned higher-level languages. They'll spawn a whole new object just to store a temporary number, or other silly things. They don't think it's silly, though. They don't know what's going on behind the scenes.
There's also just an assload of code out there in C. Regardless of what you think of getting close to the hardware, since most of the stuff running our desktops / servers is C / C++ / Obj-C, it's quite useful if you want to be able to dip into that world.
I certainly agree that learning C illustrates some extremely useful compsci principles. I also believe you can be a wizard at solving computational problems with very little practical programming experience - I met many such algorithms PhDs on my failed journey to become an algorithms PhD. That being said, I made this comment because the title of the post is "The 75% answer to all newbie questions." C is not a language that a newbie who needs a 75% answer is going to do well with! There's a reason they don't use C for CS1.