Those are actually good ideas, and if you received a computer science degree from Carnegie Mellon University, you may have witnessed their actualization firsthand in CS 213, Introduction to Computer Systems:
1. Ints are not integers, floats are not reals. Our finite representations of numbers have significant limitations, and because of these limitations we sometimes have to think in terms of bit-level representations.
2. You've got to know assembly language. Even if you never write programs in assembly, the behavior of a program sometimes cannot be understood purely in terms of the abstraction of a high-level language. Further, understanding the effects of bugs requires familiarity with the machine-level model.
3. Memory matters. Computer memory is not unbounded. It must be allocated and managed. Memory referencing errors are especially pernicious. An erroneous updating of one object can cause a change in some logically unrelated object. Also, the combination of caching and virtual memory provides the functionality of a uniform unbounded address space, but not the performance.
4. There is more to performance than asymptotic complexity. Constant factors also matter. There are systematic ways to evaluate and improve program performance.
5. Computers do more than execute instructions. They also need to get data in and out, and they interact with other systems over networks.
That's a 200-level class. As far as I know there is always a CS1 100-level class that usually starts with some EXTREMELY high-level language, sometimes a metalanguage built just for the class itself. I've heard Python is in style for this. Here is one such course: http://www.academicearth.org/courses/introduction-to-compute...
*EDIT - I made a stupid comment... they have C assignments in that class. I'll leave it up for others' amusement. I have seen that style of starting with a high-level language, though...
http://www.cs.cmu.edu/afs/cs/academic/class/15213-f01/www/
From the syllabus: