atomicnumber3's comments | Hacker News

The vast majority of "AI is changing everything!" takes I read say more about people's fundamental misunderstandings of the software development lifecycle (the real one that companies actually do, not the one that people think they do or what companies say they do) than about anything AI is going to change about software eng.

If anything, the fact that they're solving completely the wrong problems, and are blind to the actual ones, is probably a reason AI won't result in any real, top-level appreciable gains in shipping speed.


Lego is branding, curation and quality bar, though. They're the Apple of bricks (weird sentence).

There are tons of Lego knockoffs, and not of such lesser quality that the difference can be perceived by casual inspection. The set-to-set quality bar is really where Lego stands apart, especially among the set lines not targeted at children or the low end of the market.

But none of those sets have any kind of staying power. There's Expert/Creator/Modular sets from 20 years ago that sell for $500-1000 _opened and pre-built/re-disassembled_. That's all brand power.

So they're less about $/brick (though i know people scrutinize it) and more about price point and brand. Phrased differently, having your brick company race to the bottom sounds like a losing strategy.


Yeah I don't know what this person is on about. Lego is obviously premium and ... charges premium prices because ... they're a business. People (consumers) who want premium products ... pay the premium.

I would be much more frustrated if they became cheaper and reduced the quality of the product.


There is a prevalent view of economy that insists businesses sell their products at the minimum price they can still make a profit at (but not lower or you are dumping.) A Marxist view of economy, if I must.

Whenever I meet one of these people, I ask if they are willing to negotiate a wage reduction with their HR. My logic is simple: if you think it is wrong for a business to sell a product at the maximum price they can demonstrably get away with, like Lego does, then why is it right for you, a professional worker selling your labor, to sell it at a price higher than what is necessary for subsistence?


> There is a prevalent view of economy that insists businesses sell their products at the minimum price they can still make a profit at [...] A Marxist view of economy, if I must.

That's actually how competition is supposed to work in capitalism. If you sell your products at much higher than the minimum price, someone else can make a profit by selling slightly cheaper and taking over your market share.


Interestingly, that's one of the Marxist critiques: the market simply is not efficient enough for this to work fully. The effort to get a "Lego 2" off the ground is simply too high, so Lego gets an effective monopoly in their market segment (premium blocks).

If Lego was nationalized then the excess earnings that go into the owners' pockets as dividends or asset value would be realized by the people. But that of course leads to different inefficiencies (investors don't invest, etc...).


That's contingent on the competition offering a similar experience and quality, at a smaller price point. As a parent comment pointed out, no LEGO knockoff has been able to provide the same experience as LEGO.

I think what the commenter is getting at is that it's not even about competition. People get mad when companies charge more than is necessary to make what they deem a reasonable profit.

Of course, as you mention, in capitalism, a competitor is free to go in and undercut the leading brand, but they have to be able to sell why they're better AND cheaper.


IMO it's not just that other brands are of similar quality; they're often of way higher quality than Lego. You get so much more for less money from other brands, while Lego sets are becoming kind of a joke: stickers everywhere and randomly colored bricks on the internal sides of the set.

It's actually not legally fine, or at least it's extremely dangerous. Projects that re-implement APIs presented by extremely litigious companies specifically do not allow people who, for instance, have seen the proprietary source code to then work on the project.

I don't think fear of legal action makes it illegal.

Say I know it is legal to make a turn at a red light, and I know a court would uphold that I was in the right, but a police officer will fine me regardless and I would need to actually pursue some legal remedy. I'm unlikely to make the turn regardless of whether it is legal, because fighting it is expensive, if not in money then in time.

In the case of copyright lawsuits they are notoriously expensive and long so even if a court would eventually deem it fine, why take the chance.


That's my point. It's dangerous and there are sharks in the water. It sounds like you're not going to have a good time if you take the described approach with someone who might assert you're infringing.

My understanding is that that is a maximalist position for the avoidance of risk, and is sufficient but probably not necessary.

Honestly, not enough of a joke.

I was thinking something similar - this isn't AI, and none of "those people" care if it is or isn't. They don't care philosophically, or even pragmatically.

They're selling a product. That product is the IDEA of replacement of the majority of human labor with what's basically slave labor but with substantially disregardable ethical quandaries.

It's honestly a genius product. I'm not surprised it's selling so well. I'm vaguely surprised so many people who don't stand to benefit in any way shape or form, or who will even potentially starve if it works out, are so keen on it. But there are always bootlickers.

The most unfortunate part is that when the party ends, it's none of "those people" who will suffer even in the slightest. I'm not even optimistic their egos will suffer, as Musk seems to show they are utterly immune even as their companies collapse under them.


AI is already "an employee who can't say no to questionable assignments." We should all be reflective about the real value and inevitable consequences of this work.

There are also direct and very negative consequences we are having from AI right now. AI is the largest source of fake video propaganda and has largely destroyed the confidence people have in video evidence. I can imagine someone being held liable for these negative consequences, perhaps even extra-judicially.

> I'm vaguely surprised so many people who don't stand to benefit in any way shape or form, or who will even potentially starve if it works out, are so keen on it. But there are always bootlickers.

I've been getting more and more disappointed by software engineers (in aggregate) as the years go by. They don't even have to be bootlickers to do what you describe, I think a lot of it is pride in their "intelligence," which they express by believing and regurgitating the propaganda they've consumed. They prove their smarts by (among other things) having opinions that align with a zeitgeist of some group of powerful elites. They're too-easily manipulated.

And it's not just AI, it's also things like libertarianism. You've got workers identifying as capitalist tycoons, because they read a book and have some shares in a 401k.


> I've been getting more and more disappointed by software engineers (in aggregate) as the years go by

Sometimes I am dismayed by the lack of political and social consciousness in this group. A decade or two of digital boom, coupled with handsome paychecks, was enough to convince them that their position is different from what it really is.


Why is docker used by far the most, then?

Laziness

Docker got popular because it had better DX (better tooling); it was like a super lightweight VM (and initially people really wanted to put init and SSH into containers).

Easy but powerful: it's not just packaging, it's also a very basic deployment system (docker ps), and that better DX enabled a relatively foolproof cross-platform develop-deploy loop.


And yet still, while you remain mildly puzzled as you surmise the above, someone more familiar with the area and history standing right beside you will perceive the same sentence with a completely different emotional reaction.

That gap - knowing this is probably fundamentally deeper when perceived by someone else - is what the author is getting at. He can feel that's missing for him. Even if he's not completely left devoid of all comprehension of the passage.


"Harvest Moon never moved to PC/Mac"

That's not the opening that Stardew took. It was only on PC because that's the one platform that doesn't make you fill out an application to get dev hardware.

The opening they took was Harvest Moon putting out increasingly uninnovative game after increasingly uninnovative game for years. Rune Factory innovated exactly once and then swore never to do so again (though they managed to produce RF4 so I do give them props for that. even if RF5 walked it so far back.)

Something about these games is very hard to pin down. See also: the millions of SDV clones, some even by huge players with lots of money, that don't make any kind of cultural dent.


Oh, a ton of other things went right for lightning to strike - but find a Harvest Moon or even Animal Crossing clone on PC before Stardew - there just wasn't much there.

Likely part of it was realizing that you could eschew the 3D realism craze that was high at the time, and still succeed.


Crypto doesn't have nuclear weapons. (It may sound like I'm joking, but I'm not.)

In a slightly less bomby way, monopoly on violence (https://en.wikipedia.org/wiki/Monopoly_on_violence)

It's getting pretty hard to maintain a monopoly on violence without nuclear weapons. Ask Ukraine, as one example.

The value in SaaS was never the code, it was the focus on the problem space, the execution, and the ops-included side.

AI makes code "free" as in "free puppy".


Right, there are dozens of open source versions of wikis/task trackers/CRMs/ERPs/whatevers. Just because you can vibecode your way to a bad version of a bunch of SaaS products shouldn't fundamentally change anything. Companies buy SaaS products to make running the thing someone else's problem. It's times like these where I wish we had a functional SEC; I really wonder how much market manipulation is going on.

> The value in SaaS was never the code

I feel like a lot of people are about to learn this lesson for the first time. Except in some very niche areas, the majority of the value was never the code. The SaaSs that everyone thinks will be replaced had much more than code if they were successful: integrations, contracts, expertise, reputation, etc.


Yeah, agreed, but it was at least part of the moat. Competitors can see the model, the approach to market, etc. They still had to code up a better product.

And part of the problem that the SaaS solves is that "I have this thing that I need to do. I can probably do it in software, but I don't know how. Can I buy that software?". Which is now becoming "Can I get an LLM to do it?" instead.


That’s where the “free as in puppy” comes in. It’s still a classic case of build vs buy, except building is now quicker than it used to be. You still have to ask, “suppose I did build it myself. Then what?”

Yeah. So then you get your own product, tailor-made to your organisation, that you own (well, it's public domain because LLM-generated, but same same), and that you can change whenever you want without having to deal with a SaaS company's backlog. If you don't like something in it, you fire up Claude Code and get it changed.

There's also no danger of it being enshittified. Or of some twat of a product manager deciding to completely change the UI because they need to change something to prove their importance. Or of the product getting cancelled because it's not making enough money. Or of it getting sold to an evil corp who then sells your data to your competition. Or any of the other stupid shit we've seen SaaS companies pull over the past 20 years.


Respectfully, I think you’re only considering upsides and not considering downsides, opportunity costs, and ongoing maintenance costs. This is not what smart managers do. Plus, just because you can build something cheaper with an LLM doesn’t mean you can operate it more cheaply than a specialist can. Economies of scale haven’t been obviated by AI.

It’s useful to take an argument and take it to its logical extreme: I just don’t see every company in the world, large and small alike, building everything they depend on in-house, as though they were a prepper stocking up for Armageddon. That seems pretty fanciful on its face.


I hear that, and yeah, it's a possible future but I'm not sure it's the certain one.

The maintenance costs are kinda overrated: you fire up the LLM, point it at the code base and say "this needs fixing". I'd say that the maintenance costs of dealing with endless patches and fixes from a SaaS for features that you don't use is probably more onerous.

And generally we're talking about the situation where the SaaS customer is a domain expert in their area of expertise, but that isn't software development. They can use a system incredibly well, they just can't develop it. They have in-house IT folks to maintain their computers, networks, etc, and they're really just adding a couple of people to develop and maintain some applications via LLM to that team.

We're already seeing some of this, so it'll be interesting to see how far it goes.


Why is it public domain because it's LLM-generated?

As an attorney (and this is not legal advice), I would argue--and the U.S. Copyright Office has already stated--that machine-generated content is not copyrightable, because it's not a form of human creative expression. https://www.copyright.gov/ai/Copyright-and-Artificial-Intell... ("Copyright does not extend to purely AI-generated material, or material where there is insufficient human control over the expressive elements.")

That said, the inquiry doesn't end there. What happens next, after the content is generated, matters. If human creativity is then applied to the output such that it transforms it into something the machine didn't generate itself, then the resulting product might be copyrightable. See Section F on page 24 of the Report.

Consider that a dictionary contains words that aren't copyrightable, but the selection of words an author selects to write a novel constitutes a copyrightable work. It's just that in this case, the author is creatively constructing from much larger components than words.

Lots of questions then obviously follow, like how much and what kind of transformation needs to be applied. But I think this is probably where the law is headed.


Can the output of the service be licensed? A bit like the AGPL, you're licensed to use/reuse/derive new works.

So if it's distributed outside of the license, that's subject to contractual penalties? I guess that's what all the "wrapper" SaaS businesses will do.

Read that report, it defined the issues and the boundaries well, for the current generation of AI tools. As they develop and expand, it's going to get interesting, especially if robotics/3d printing etc get involved.

If I use an Optimus Prime to help create art, similar to Andy Warhol's "factory", do I own the copyright on the completed work?

If a person uses AI to generate work that ends up being patentable, are patents also not available?


There's already been a case where an artist generated art using an LLM, and could not claim copyright on it [0]. Though the guy was torpedoing his own case by claiming the machine created the entire thing. The courts are supporting copyright claims when using an LLM as a tool to support human effort.

All software licensing depends on copyright. If no one owns the copyright then it can't be licensed; it's in the public domain immediately and irrevocably.

Of course, if you rip off someone else's work, and they DMCA you, then you might need to prove that they generated the entire thing using an LLM with zero human input. Though there's plenty of folks posting blog posts claiming that that's their process, so it might not be that hard.

Patents are different. For a start they cost money and effort to get. And there are lots of rules around how they're applied and how you can defend them. Very different from copyright.

[0] https://www.cnbc.com/2025/03/19/ai-art-cannot-be-copyrighted...


Sometimes, but I think there are some SaaS products whose business model is really under threat. Look at PagerDuty. Their PE ratio is like 4.4. They have a lot of existing customers but virtually no pricing power now and I imagine getting new business for them is extremely difficult.

Canva is my go-to example - you can just get NanoBanana/whatever to generate and iterate on the image. Same for all those stock photo services. I used to use them a lot, now I just generate blog images

> AI makes code "free" as in "free puppy".

Exactly right


Very cool. I would've used this a lot in a past life. It's interesting how many different ways there are to manage cron jobs, too. I've seen:

Place 1: Cron jobs edited on the box via vim, and every night they're scraped, and if there are diffs, they're committed. Kind of like version control in reverse. It was actually a pretty elegant solution for the environment it was in, where devs just could not be convinced not to edit the crontab directly.

Place 2: no cron jobs, it's all either airflow tasks or a bespoke managed clustering cron-like thing.

Place 3: all k8s scheduled job pods, conceptually the same but somehow less enjoyable.

Honestly, I liked the Airflow version the best: Python is just better than bash, and cron jobs often end up developing dependencies on each other anyway. Plus Python lets you wire in many niceties.
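The dependency point is the core of why something like Airflow wins over cron: cron can only express times, not ordering. A minimal stdlib-only sketch of the idea, with made-up task names for illustration (real Airflow builds a DAG of operators, but the underlying notion is the same topological sort):

```python
import graphlib  # stdlib topological sorter (Python 3.9+)

# Each task maps to the tasks it depends on. Under cron you'd fake this
# ordering with staggered start times and hope nothing runs long.
tasks = {
    "extract": [],
    "transform": ["extract"],
    "load": ["transform"],
    "report": ["load"],
}

# static_order() yields tasks so that dependencies always come first.
order = list(graphlib.TopologicalSorter(tasks).static_order())
```

A scheduler built on this can also retry just the failed task and resume downstream work, instead of re-running the whole nightly pipeline.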

