No, the field grew tremendously and you can see a clear generational bias -- by years of experience, not age -- where the cohort from the last 10-15 years has a completely different understanding of what the craft is [software engineering vs business development] and how to approach it [optimal solution vs soonest deliverable].
You can also trace personal backgrounds and you'll see a much higher representation in the newer cohort coming from upper middle class backgrounds with families in careers like finance, consulting, medicine/dentistry whereas more in the older cohort came from more modest middle class backgrounds in engineering, academia, or even working class trades.
Of course, there were always some of all of these people in the industry, but the balance shifted dramatically during the last couple of booms, tracking the atypically high compensation standards set by the FAANGs since 2010 or so.
This is what happens anytime a field gets large in terms of job applications. Replace software engineer with anything else at that scale and you see the same thing: wealthy families are overrepresented in the cohort, because they always have an edge in getting the best credentials. They never have to work part-time jobs, and they have mom and dad (or even a paid advisor) actively working on their behalf to vet potential internships and other opportunities. You are essentially outnumbered 3:1, 4:1, or more, and you can't compete at full capacity anyhow due to the aforementioned obligations life has saddled on you.
I doubt that has anything to do with getting "large in terms of job applications" alone. It's a correlation, all right: wealthy families have an easier time getting high-status, high-paying jobs for their kids, and when such a field grows, wealthy people flock to it like everyone else. But I sincerely doubt you'll find the wealthy over-represented in physical labor / blue collar jobs, regardless of the ups and downs in the labor market for those occupations.
The way I see it, it's like 'swatcoder and 'lovich said upthread: the field became a money printer, and attracted - not revealed, attracted - a different kind of people, with a different mindset. I too saw this change happening. Applicant pool size? That's a spurious correlation - it's just driven by the same factors that make the software industry a money printer.
> It's a correlation, alright, because wealthy families have it easier to get high-status and high-paying jobs for their kids
You have just written down a partial solution to this dilemma: make these high-paying jobs less attractive in terms of status for these wealthy families. :-)
I completely agree. Software engineering is just the most recent field I can think of (unless you consider data science a distinct enough portion of software to carve off as a separate field) that has had this pattern occur.
Well, that, and this is a forum for a lot of tech people, which means there are a good number of software engineers here.
Speaking as an old school basement nerd (coding since middle school, 90’s): If I can do cool things with code _and_ get paid, I’m gonna go do that. Business constraints make it feel much more interesting than writing code in a vacuum.
That's quite nice! You may wish to look into implementing a PID controller, so as to avoid overshoot (your carrots thaw too far initially) and unnecessary oscillation around the setpoint (meaning you waste energy on cooling and heating cycles that cancel each other out, when you could have held the temperature nearly constant). I loved juicing carrots so much my face turned orange from the beta-carotene.
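For anyone curious what that looks like in practice, here's a minimal PID sketch in Python. Everything here is illustrative: the gains, the toy thermal model, and the 4 °C setpoint are made-up numbers, not tuned values for an actual fridge.

```python
# Minimal PID controller sketch: compute a control output from the
# error between a setpoint and a measured value.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        # Derivative term damps overshoot; zero on the first sample.
        if self.prev_error is None:
            derivative = 0.0
        else:
            derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: drive a crude first-order thermal model from -18 °C to 4 °C.
pid = PID(kp=2.0, ki=0.1, kd=0.5)
temp = -18.0
for _ in range(200):
    power = pid.update(setpoint=4.0, measured=temp, dt=1.0)
    temp += 0.05 * power  # toy plant: temperature responds to applied power
```

The proportional term does most of the work, the derivative term suppresses the overshoot, and the integral term removes any steady-state offset; tuning the three gains against a real plant is the hard part.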
> No, the field grew tremendously and you can see a clear generational bias -- by years of experience, not age -- where the cohort from the last 10-15 years has a completely different understanding of what the craft is
I bet a lot of people 10-15 years older than you would say the same thing - except they'd say it about you and your generation.
I'm not that old, but I've been around long enough to hear people of every age over about 30 claim that everything was better back in their day until the new generation came along and ruined it.
> I bet a lot of people 10-15 years older than you would say the same thing - except they'd say it about you and your generation.
And they’d probably be right!
I remember the grognards giving me shit about memory management and me giving it right back by explaining that what they considered a large chunk of memory would be worth pennies next year because of Moore's law, and I wasn't going to waste time considering something that I literally couldn't learn faster than it became obsolete knowledge.
Quantitative differences can create qualitative differences, and I don't think it's surprising that we're in a different age of software engineering than we were 10-15 years ago, whichever baseline year you pick.
> I remember the grognards giving me shit about memory management and me giving it right back by explaining that what they considered a large chunk of memory would be worth pennies next year because of Moore's law, and I wasn't going to waste time considering something that I literally couldn't learn faster than it became obsolete knowledge.
And that's why all applications are laggy as shit these days.
But ignore memory at your peril. I have one project that runs on a 256 GB instance, for a fairly boring CRUD app. I am asking a lot of questions, since we are apparently having the yearly 'we need more memory' conversation. Those questions are leading to speedups, just by using less memory. At the bottom of that stack is an L1 cache of less than a hundred KB. It doesn't matter right up until it does. I have seen huge classes with 300+ string fields that needed maybe 10 of them. They threw them in 'just because there is enough'. Yet something has to fill in those fields. Something has to generate all the code for those fields. The memory managers have to keep track of all of that junk. And all of that sits in a pipeline, a cascade of applications, so that 300+ field class gets copied 10 times, plus the cost to keep it on disk and shove it through the network.
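The cost of hauling around a 300-field record when you only use 10 is easy to make visible. Here's a small sketch using Python's `tracemalloc`; the field names and counts are hypothetical, and the exact byte numbers will vary by interpreter version, but the ratio is the point.

```python
# Compare the allocated memory of 1,000 "wide" records (300 fields,
# most unused) against 1,000 "slim" records (just the ~10 needed fields).
import tracemalloc

def make_records(n_fields, count):
    # Each record is a dict with n_fields string-valued fields.
    return [{f"field_{i}": "" for i in range(n_fields)} for _ in range(count)]

tracemalloc.start()
wide = make_records(300, 1_000)
wide_bytes, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

tracemalloc.start()
slim = make_records(10, 1_000)
slim_bytes, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"wide: {wide_bytes:,} bytes, slim: {slim_bytes:,} bytes")
```

And remember that in the pipeline scenario above, whatever the wide record costs gets multiplied by every copy, every serialization to disk, and every hop over the network.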
On the other hand, I've seen developers who don't know about things like that start up a project on a small instance and wonder why everything is running at turtle speed.
People who stopped running tests because they were configured to make 10,000 API calls in one minute and crippled the app until everything was restarted.
"Add some more memory to your database instance....poof"
I definitely agree with the greybeards, and I think we see the results of not listening to them. We have processors, buses, networks, and all sorts of hardware that are orders of magnitude faster and more powerful than what they began on, yet many things are quite slow today. Worse, it seems to be getting slower. There is a lot of value in learning about things like caching and memory management, a lot of monetary value. It's amazing to me that these days your average undergraduate isn't coming out of a computer science degree well versed and comfortable writing parallelized code, given that is how the hardware has moved. It's amazing to me that we don't normalize caching, considering a big change driven from the mobile computing side and adopted into our desktop and laptop environments is to fill RAM because you might as well. And it's crazy to me that we have games that cost hundreds of millions of dollars to develop yet are buggy as shit, hog all the resources of your machine, and can barely run at 4K60, where you can hit a bug and go "yep, I know what's causing that memory error."
Honestly, I think so much of this comes from the belief that we need to move fast. Because why? That would require direction. The money is motivating speed, but we've lost a lot of vision. Moving fast is great for learning, but when you break things you have to clean them up. The problem is that once these tech giants formed, they continued to act like scrappy developers: never going back to fix the mess, because we gotta go fast, we gotta go forward, but with no real vision of forward. And you can't have that vision unless you understand the failures.

We have so many low-hanging fruits that I can't figure out why they aren't being solved: deduplicating calendar entries, automatically turning off captioning when a video has embedded captions so you don't overlay text on top of text, searching email, or setting defaults on entry fields from browser data (e.g. if you ask for the user's country, put the one the browser is telling you at the top of the fucking list!). These are all things you would think about if you worked in a space where you had to consider optimization, if you were resource constrained. But we don't, so we let it slide.

The issue is death by a thousand cuts. It isn't so bad in a few cases, but these things add up. The great irony is that scale is what made tech so powerful and wealthy in the first place, yet no one stops to ask if we're also scaling up shit. If you're printing gold but 1% of your gold is shit, you're still making a ton of shit. The little things matter because the little things add up. You're forced to deal with that when you think about memory management, but now we just don't.
As the base reality of computers and the inflated reality of software have diverged more and more, education and culture have tracked the software story and led to runaway irresponsibility. Forget not optimizing for performance; I think a lot of software today straight up fails to serve its users in some way or another. And those are paying users at that!
I agree. There are just too many obvious low hanging fruits. So I'm just trying to inspire people to take action and fix stuff. Ask not for permission, just fix it. Ask for forgiveness later.
As a fun anecdote, I think this same rationale - "next year's hardware is so much better" - is why so much desktop software in the '90s and '00s became slow: "meh, you don't have to care about performance, next year's CPU is going to be so much faster anyway".
Then suddenly single-threaded speedups stopped happening (and people realized that even though CPU clock speeds had grown, that growth was not what Moore's law actually promised).
Of course, your rationale used Moore's law correctly, while the "CPU speed will grow forever, rah rah rah" people didn't.
You're not wrong but this is overly reductive. In other words this is more of a sliding scale and not a step function centered on 10-15 years ago.
For instance I was a CS undergrad in the mid/late-90s. There was an enormous difference demographically between my incoming freshman class and the incoming freshman class by the time I graduated. And the talking points were exactly the same ones we see in this thread.
This post is so weird on so many levels. I'll focus on this part:
> You can also trace personal backgrounds and you'll see a much higher representation in the newer cohort coming from upper middle class backgrounds with families in careers like finance, consulting, medicine/dentistry whereas more in the older cohort came from more modest middle class backgrounds in engineering, academia, or even working class trades.
So, you reach for class warfare? Sheesh. Is it anyone's fault that they were born into an upper middle class family? Are people from lower economic circumstances somehow superior, as you imply? This is just bizarre.
As a reminder: Bill Gates, who is certainly old school tech, was born and raised in an objectively wealthy, well-connected family, then went to Harvard. This is nearly made-for-TV silver spoon stuff.
It is telling that you considered their post to be about class warfare rather than different values.
The original focus of this thread was on technical precision vs. market efficiency, and how quality was sacrificed for faster conversion to sales.
That shift compromises products for everyone by creating a race to the bottom toward minimum viable products and minimum safety standards. When the consequences eventually hit, the aggregate responsibility and emergent effects lose direct attribution... but they exist all the same.
As the sibling comment noted, I think you might be projecting value judgment onto value distinction.
The most salient values of the later cohort are different from those of the prior ones, and those values do track with the values we associate with those different class backgrounds.
But there's no ranking being made there. They're just different values.
The values of the new cohort have earned the industry a great deal of political, economic, and cultural influence on an international level.
The values of the old cohort didn't do that, except insofar as they built a stage for the new one. They made software differently. They designed products differently. They operated businesses on different scales. They hired differently.
Indeed, some of us from the old cohort don't personally savor all the spotlight and influence and cultural drama that Silicon Valley collectively bears now, and miss the way things were. And others love it. But that's just personal preference, not class warfare.
> So, you reach for class warfare? Sheesh. It is anyone's fault that they are born into an upper middle class family? Are people from lower economic circumstances somehow superior, as you imply?
To be fair, I don't see any value judgements in the post you're replying to. He doesn't say whether it's a good or bad thing; it's just a thing. What I think it means is that the field became more popular, entry filters became more competitive, and families with fewer resources to invest in their offspring got filtered out.
A slight addition to this topic: a lot of jobs also became software jobs, even if your intention in signing up was different to begin with. PCs were revolutionizing the world.
For about a decade I worked as an engineer in a field where the expectation (at least starting) was that metal gets cut, stuff gets built, and there's physical hardware.
Those jobs existed, and may have actually involved more hardware interaction than most in engineering. Yet much of the day-to-day rapidly became computer simulations of the metal that might get cut someday.
In many fields, the organizational friction around anything involving capital expenditure or purchasing was so severe that the obvious choice was usually to run a computer model and simulate what might occur. What else would you do?
Frankly, a shame, since there's been a lot of development in mining technologies over the years.
Even for the folks with an ecological focus, there are quite a few methods developed with limited degradation of the landscape and reclamation of mining sites for alternative uses (parks, forestry, entertainment, tourism). The Wieliczka salt mine in Poland is an especially impressive example [1].
And these days, there's also a huge number of resources for mineral identification and site mapping. The EMIT imaging spectrometer from NASA is a cool example that does remote mineral identification from orbit [2].