
I think this is an exaggeration/strawman. Can you give an example of a mainstream introductory math book that defines some esoteric notation like "Q(x)" and then continues using it hundreds of pages later? Note that I don't mean defining it locally within the context of one proof -- that's something different.

The closest thing I can think of is notations like U(f, P) and L(f, P) for the upper and lower Riemann sums of a function f with respect to a partition P. These do show up in introductory real analysis textbooks, but they're pretty basic/fundamental notions.
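To make the notation concrete, here's a minimal sketch of those two quantities (names and the sampling-based sup/inf approximation are illustrative, not from any particular textbook):

```python
def riemann_sums(f, partition):
    """Approximate the upper and lower Riemann sums U(f, P) and L(f, P)
    of f over the partition P (a sorted list of breakpoints).

    The sup/inf of f on each subinterval is approximated by sampling,
    which is exact for monotone f and illustrative otherwise.
    """
    upper = lower = 0.0
    for a, b in zip(partition, partition[1:]):
        # Sample f on [a, b] to estimate its sup and inf there.
        samples = [f(a + (b - a) * k / 100) for k in range(101)]
        upper += max(samples) * (b - a)
        lower += min(samples) * (b - a)
    return upper, lower

# Example: f(x) = x^2 on [0, 1] with a uniform 10-piece partition.
U, L = riemann_sums(lambda x: x * x, [i / 10 for i in range(11)])
# L <= 1/3 <= U, and both tighten toward 1/3 as the partition refines.
```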

By the way, as far as I know, ∂ in introductory textbooks can only mean either "partial derivative" or "boundary". Do you have examples of it meaning other things?



Most of what I've read lately is queueing theory, Markov processes, or statistics, and yes, I have seen notation introduced once and then used for hundreds of pages, and I've seen the same symbols used to mean different things (not counting the partial-derivative sense).

One of the books I read recently[0] has a table of where symbols and formulae are first defined. It's a godsend.

But I would prefer self-describing names to a lookup table. I don't put comments at the top of each file of code with such a table.

[0] https://highered.mheducation.com/sites/0073401323/informatio...



