Good advice and worth reading especially for younger devs. With respect to...
>> Prefix systems like hungarian notation were initially meant to add meaning, but with time they ended up being used in less contextual ways, such as just to add type information.
Hungarian notation was pretty cumbersome to read, actually, and I think the main reason it fell out of use is that editors and IDEs began to make type and declaration information available for symbols in a consistent way, so it was no longer much of an advantage (and perhaps a disadvantage) to use a manual convention that was usually applied inconsistently.
One of the major functions of Hungarian notation was to communicate information which was not contained in the types of the actual variables, for example an int could be a count of bytes 'cb', or perhaps a handle 'h', etc. But it ended up being mostly misused to communicate redundant type information, such as a char* being 'sz' (zero-terminated string), which tells us nothing we didn't already know. As you say, better IDEs made the latter kind of naming no longer advantageous (if it ever was), but that was true for some time before Hungarian notation fell out of favour - the real reason being a rejection of its redundancy within MS during the transition to .NET. Joel Spolsky details the good and bad of Hungarian notation here:
From the perspective of a younger dev, this seems like excellent reading for older devs, who tend to enforce their personal style without deferring to the de facto language standards (or PEP 8 for those working in Python). In fact, perhaps everyone could do with downing a half dozen humble pills and optimizing their code for harmonious human interoperability rather than anything to do with machines :)