
This is a "just so" story.


So is your parent post on how monitoring will take over if devs don't unionize.


Perhaps, but do you really think developers are different from every other kind of employee? While we are well paid, the overarching story has been to systematize the work and increase the number of workers in an effort to drive down costs and increase management control. I think it's naïve to expect this trend to lead to companies making wise decisions that cede additional control to workers. After all, this isn't speculation; we're discussing an article where 1/3 of workers are already subject to it!

The main thing that has made our profession different is that the work is intellectually difficult and requires a lot of training. If you think similar knowledge-work jobs aren't subject to such pressures, I know a doctor who is freaking out about how nurse practitioners are being trained in online schools to take over hospitalist jobs.


No, but I think everyone being forced to work remotely due to a worldwide pandemic -is- unique. The switch was not a carefully considered, bought-into decision, and companies are reacting to it with what they think they need, not what they actually find they need.

You call out that 1/3 of workers are already subject to it; I will call out that a full 2/3rds of companies in the UK (a country that already has a culture of passive surveillance, with cameras on most urban street corners and the like) -haven't- instituted such measures. And that's despite almost -every- company going from "I can see who is in the office, at their computer, and I can walk by and get an idea if they're just doomscrolling Facebook, or at least staring at an IDE" to "now I can't tell at all what they're doing."

As time goes on, will that 2/3rds of companies also buy in? Maybe, if they find they can't actually determine who is and isn't producing. But that has always been a challenge with knowledge workers, and the move to remote hasn't actually changed that calculus at all. What -is- certain is that increased monitoring has a cost, both monetary and in morale, especially in countries that have historically objected to it (such as the US, which is also a leader in the industry when it comes to establishing trends). If companies don't see a return on that investment, the push for monitoring will likely subside, and it becomes something to cut for the sake of budgets, if not also competitiveness in attracting talent.

I will also point out that, pre-COVID, most companies didn't actually track time spent in the office for knowledge workers. Blue-collar employees would clock in and out to ensure they were putting in the hours, and service-industry employees would, necessarily, have specific hours they worked. But knowledge workers such as devs generally had no such measures, despite it being no more difficult to institute them. Why? Because companies realized that attendance is not a proxy for performance.

(Devs are a very different category of beast than doctors or nurses, who, though knowledgeable and highly trained, are not actually knowledge workers. A nurse is not nursing if they're staring into space thinking about their patient. They are more like the service industry in that regard, working well-defined shifts that require manpower. Same with doctors: there are appointments and scheduled events they have to attend and perform. You can't replace a surgery, unlike many meetings, with an email.)

The fact that a bunch of bad 'leaders', faced with the most singular workplace disruption of our lifetime (hopefully), reacted poorly by trying to pretend attendance is a proxy for performance doesn't tell us anything about long-term trends.



