Hacker News | parminya's comments

So why not just reinterpret the original comment so it survives, and take it to highlight the difference between a rule-of-law legal system, which punishes rich companies that break the law until law-following behavior is motivated, and a corrupt legal system, which does not punish rich companies enough to motivate that behavior? I don't think the balance between Europe and the US has always been in Europe's favor, so it's not as if there's no capacity to improve.


The UK has one or two state religions, bishops ex officio have seats in (the less powerful house of) parliament, and the upcoming coronation of their king will be a church service (with communion and all). Germany collects taxes on behalf of religious communities. Australia and Canada regularly send public money to religious schools.

I don't think the separation of Church and State in the US is particularly less than normal. The separation of Church and Politics in the US probably is less than normal, but that's an entirely separate matter, which - to the extent that it is antidemocratic - can probably only be addressed by democratic reform.


That doesn't make Facebook like a government though. If twenty years ago the government chose to advertise a job in the three largest newspapers in a city, and not in the Dah00n Times, that doesn't make the newspapers like a government or incorporate them into the government. It might mean that people should question whether the government is too closely tied to certain private corporations. But that's a different question.


This is a case for regulation* and for promoting competition and diversity in the marketplace. It's not a case for treating them as governments. If they're governments, who chooses their leadership? Do they have one board, selected by the US president/according to the rules of the US Congress, that is applicable in the US, and another board, selected by the Canadian PM/according to the rules of Canadian law, that is applicable in Canada? How much are we going to pay to their shareholders for confiscating their property?

* Banning moderation is a form of regulation. It falls far short of treating them as governments. It is possible to say "I want to regulate Twitter" without having to make weird statements like "We should treat Twitter as a government".


Nobody said that they are national governments. They are organizations with governance policies.

Even if corporations are not governments, they are far more similar to governments than they are to private individuals.

Moderation is a form of regulation: it is a way of regulating speech. Banning moderation is deregulation.

"More platforms" is not the answer. In a democracy, when people are publicly discussing matters of public or political importance, other citizens have the right to participate in that conversation.


> Nobody said that they are national governments. They are organizations with governance policies.

Under this definition, practically any organization "is a government": my college debate team, Tesla, your local Starbucks. All "governments", because they are organizations with a governance policy (which really just means "leadership").

> Banning moderation is deregulation.

No. The government controlling how corporations can act is, definitionally, regulation. You can argue that is good regulation, but it is regulation. If I am allowed to censor you, that is less regulation than if I am legally prevented from censoring you.


If they're de facto governments, then they should be nationalised. If a government is a government, that extends to elected leaders or to appointments by elected or responsible members of the executive. This of course entails the balkanisation of the internet, but it's the logical conclusion of your position.


If you skip representation, you don't know what you're optimising for. If you include representation, then the process will not exclude groups you didn't know existed (or even groups generated by the AI's policy development process). Skipping representation would be an awful, authoritarian dystopia.


I guess that's two different things. One person says "You can give GHC a heart transplant" and the other says "GHC needs a heart transplant: Here is our proposal". In fact, the very text you quote as saying GHC was famously a nightmare says:

> On the bright side, GHC is written in Haskell, and this language is particularly well suited to performing massive refactorings with confidence that nothing breaks. Thus, it should be possible to refactor the GHC library towards a more robust and versatile design

and in fact Simon Peyton Jones continued by saying exactly that:

> You can do a heart or lung transplant on GHC and make truly major changes and the type checker just guides you to do all the right things. The difficult stuff is envisioning and being very clear about what you’re trying to do. Actually doing it is often not that hard.

So the two texts and the two opinions are completely in alignment. GHC is famously bad insofar as it has a poor design. But once a better design is worked out, you can give it the heart transplant it needs without undue stress.
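To make the "type checker guides you" point concrete, here's a toy sketch (invented types, nothing like GHC's actual Core IR): widen a central data type, and the compiler enumerates exactly the pattern matches a refactoring must update.

```haskell
{-# OPTIONS_GHC -Wincomplete-patterns #-}
-- Hypothetical illustration: after adding the 'Mul' constructor to a
-- central type, every function that pattern-matches on 'Expr' is flagged
-- until it handles the new case. Following the errors is the "transplant".
data Expr
  = Lit Int
  | Add Expr Expr
  | Mul Expr Expr  -- newly added constructor

eval :: Expr -> Int
eval (Lit n)   = n
eval (Add a b) = eval a + eval b
eval (Mul a b) = eval a * eval b  -- the case the warning forced us to add
```

The same mechanism scales up to a codebase the size of GHC: change the type, then let the errors walk you through every affected site.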

I have no idea how the heart transplant proposed by Sylvain Henry, John Ericson and Jeffrey M. Young is going. I suppose at some point there should be something checked in and a report about how painful or painless it was (and, potentially, if it's really completely wrong, perhaps a series of bug reports in the next seven releases of GHC).


Hi, coauthor here!

Work has been steady; the master project plan is tracked in this ticket: https://gitlab.haskell.org/ghc/ghc/-/issues/17957

Almost all the recent work has been performed by Dominik Peteler (@mmhat) during Google Summer of Code, which John supervised. We've primarily been focused on landing !8341[0], which makes huge strides in Core w.r.t. modularity (see https://gitlab.haskell.org/ghc/ghc/-/issues/21872), but we are on hold until schedules agree and some vacations end.

So the modularity project is far from dead. In fact, Sylvain and I are planning on returning to it after we upstream the new JavaScript backend (slated for 9.6; see MR !9133[1], tracking ticket[2]), hopefully this week.

PS: That patch is actually a good example of a gnarly lung transplant for GHC's Core IR (and done by a new contributor!).

[0]: https://gitlab.haskell.org/ghc/ghc/-/merge_requests/8341

[1]: https://gitlab.haskell.org/ghc/ghc/-/merge_requests/9133

[2]: https://gitlab.haskell.org/ghc/ghc/-/issues/21078


Thanks! This work is truly fantastic.


I guess I wonder if it's easy to refactor then why haven't people done it when they added things in the past?


I don't know. But - and this is my interpretation, not some summary of another talk - it sounds like Simon Peyton Jones, who is one of the people responsible for the poor state of the code, already answered that: "The difficult stuff is envisioning and being very clear about what you’re trying to do." Presumably, wherever his skills lie, they don't lie in envisioning a better state and refactoring the code towards that goal (to him, it was "difficult stuff", and it was left aside as he implemented changes that had a greater impact on users). AFAIK, for a long time GHC was maintained by the same two or three people, so it probably shows the imprint of individual developers much more than your average compiler of similar size. Now development is much more open and this kind of technical debt is getting paid off.


> Presumably, wherever his skills lie, they don't lie in envisioning a better state

This is possibly the single best example ever of how Computer Scientist and Software Engineer are different jobs with different skill sets.


I guess part of the problem here is that everything changes so frequently. If technology moved so slowly that generational change in craftsmen was the major source of a change in quality, then you could actually use word of mouth and other external sources of information. Nowadays, the experience of a year or two ago really doesn't necessarily tell you much about the quality of the products that are available now.


No matter how common it is, I never know what "2.5x less than some reference number" means. Is it "divide the reference number by 2.5"?


Correct, i.e. 40W instead of 100W. It sounded more impressive than "40% of the original value", so I went with "2.5x less". Not the best measure to choose one's words by, admittedly.
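For anyone double-checking the arithmetic, "Nx less" here just means dividing the reference value by N (a sketch; the function name is mine):

```haskell
-- "2.5x less" as division: 100 W becomes 100 / 2.5 = 40 W,
-- i.e. 40% of the original value.
timesLess :: Double -> Double -> Double
timesLess reference n = reference / n
-- timesLess 100 2.5 == 40.0
```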


40W LED? Wow that's big! I think the biggest LED bulb I have is 11W


2x20W (note the plural 'bulbs' in the original message a few steps up the thread), and this is actually measured, whereas IIRC the box said a bit less.

And yes, in my opinion we erred on the high side, but it's not far off from what the original incandescents drew (which apparently was 2x50W, measured).


You managed to say that immediately. Were there other serious explanations that came to mind? If not, then you need to have more self-confidence, because you do know!


Even better would be just committing to what's there - it's so widely deployed that it's not seriously problematic. It's there to be used.


Agreed, they should just bite the bullet and commit to flakes as enabled rather than "experimental". Plenty of other things break regularly in my Nix deployment, and it's never flakes.
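For context, opting in today is a one-line setting (shown for the system-wide nix.conf; a per-user ~/.config/nix/nix.conf works too):

```ini
# /etc/nix/nix.conf
experimental-features = nix-command flakes
```

"Committing" would essentially mean making this the default instead of an opt-in.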

