Hacker News | dgan's comments

Can't wait to look like an idiot after sending a heart to my boss while trying to select the hostname

Maybe he will like it! Who knows?!

I randomly bought NAC just to try it. I don't know about the chemical interactions, but going out with colleagues at that time taught me that it's basically impossible to get drunk. Usually a pint of beer is enough to make me feel at least a little dizzy, but when taking NAC, it was all like drinking water

If you all think NAC is great, wait till you try liposomal glutathione (glutathione is one of the things NAC is a precursor for, one of the general take-out-the-trash compounds for your cells). Of all the supplements I’ve tried, it has probably the most immediately noticeable positive effect (maybe because you take it by leaving it under your tongue to be absorbed sublingually for a bit before swallowing). Generally leaves me feeling great, even if I was kind of dragging and tired beforehand.

NAC taken before consuming alcohol has a positive effect apparently, but taken afterwards it's detrimental as mentioned here: https://en.wikipedia.org/wiki/Acetylcysteine


Took me a while, because I pronounce "Pfizer" as "pfee-tseh-r" in my head

That's the original pronunciation

Not sure why this is voted down, it's true.

On mice.

Just a note: “research about the safety of taking NAC every day for the long term is limited.” cf. a concerning 2019 animal study regarding higher risks of cancer https://doi.org/10.1172/jci.insight.127647 also discussed at https://www.science.org/content/blog-post/n-acetyl-cysteine-...

Same! I thought I was going crazy but the effect is clear and reproducible. My hangovers are also less bad.

When I go out drinking with my pharmacist buddy, we take NAC before going out. He swears it makes hangovers less likely. I can't say I've noticed that particular effect, but I do seem to sleep a bit better on those nights.

I don't have an opinion on the efficacy of such poisoning, but your comment is about as useful as "when being violently attacked, do not resist, as you only make yourself suffer for longer"

I think it is pretty obvious that the challenge with abstract mathematics in general, and category theory in particular, isn't that people don't understand what a "linear order" is, but that it is so distant from the daily routine that it seems completely pointless. It's like pouring water over perfectly smooth glass

You're more right than you'd think. The whole point of mathematics is precise thinking, yet the article is very inaccurate.

Nobody seems to care or notice. I'm watching in disbelief as nobody points out that the article is full of inaccuracies. See my sibling thread for a (very) incomplete list, which should disqualify this as serious reading: https://news.ycombinator.com/item?id=47814213

I can only conclude that this ought to be useless for the general practitioner, since even wrong mathematics is appreciated the same as correct mathematics.


> Nobody seems to care or notice. I'm watching in disbelief how nobody is pointing out the article is full of inaccuracies.

I don't know. I finished my graduate studies in math a few years ago, and pretty much every textbook by well-known mathematicians was packed with errors. I just stopped caring so much about inaccuracies. Every math book is going to have them. Human beings are imperfect, and great mathematicians are no exception. I'd just download the errata from the uni website and keep it open while reading.


Is there a "mind-blowing fact" about category theory? The first time I heard that one can prove with group theory that there is no general solution in radicals for polynomial equations of degree ≥ 5, it was mind-blowing. What's the counterpart for category theory?

A thing is its relationships. (Yoneda lemma.) Keep track of how an object connects to everything else, and you’ve recovered the object itself, up to isomorphism. It’s why mathematicians study things by probing them: a group by its actions, a space by the maps into it, a scheme in algebraic geometry defined as the rule for what maps into it look like. (You do need the full pattern of connections, not just a list — two different rings can have the same modules, for instance.) [0]
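The slogan above has a precise statement. For a locally small category $\mathcal{C}$, a functor $F : \mathcal{C} \to \mathbf{Set}$, and an object $A$:

```latex
\mathrm{Nat}\big(\mathrm{Hom}_{\mathcal{C}}(A,-),\, F\big) \;\cong\; F(A),
```

naturally in both $A$ and $F$. Taking $F = \mathrm{Hom}_{\mathcal{C}}(B,-)$ gives the corollary used informally above: if $\mathrm{Hom}(A,-) \cong \mathrm{Hom}(B,-)$ as functors, then $A \cong B$.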

Writing a program and proving a theorem are the same act. (Curry–Howard–Lambek.) For well-behaved programs, every program is a proof of something and every proof is a program. The match is exact for simple typed languages and leaks a bit once you add general recursion (an infinite loop “proves” anything in Haskell), but the underlying identity is real. Lambek added the third leg: these are also morphisms in a category. [1]
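To make the programs-as-proofs slogan concrete, here is a minimal Python sketch (the names are made up for illustration; a proof assistant would enforce totality, which Python does not): a total function of type `A -> B` is a proof of the implication `A ⇒ B`, and composition is exactly transitivity of implication.

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

# From proofs of A => B and B => C, build a proof of A => C.
# In Lambek's reading, this is also just composition of morphisms.
def compose(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
    return lambda a: g(f(a))

# Two "proofs": int => str, and str => bool
int_to_str: Callable[[int], str] = str
nonempty: Callable[[str], bool] = bool

# A proof of int => bool, obtained purely by composition
int_implies_bool = compose(int_to_str, nonempty)
print(int_implies_bool(42))  # True: "42" is a nonempty string
```

The caveat from the comment applies here too: in a language with general recursion, a function that loops forever "proves" anything, so the correspondence is exact only for total, well-behaved programs.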

Algebra and geometry are one thing wearing different costumes. (Stone duality and cousins.) A system of equations and the shape it cuts out aren’t related, they’re the same object seen from opposite sides. Grothendieck rebuilt algebraic geometry on this idea, with schemes (so you can do geometry on the integers themselves) and étale cohomology (topological invariants for shapes with no actual topology). His student Deligne used that machinery to settle the Weil conjectures in 1974. Wiles’s Fermat proof lives in the same world, though it leans on much more than the categorical foundations. [2]

[0] https://en.wikipedia.org/wiki/Yoneda_lemma

[1] https://en.wikipedia.org/wiki/Curry%E2%80%93Howard_correspon...

[2] https://en.wikipedia.org/wiki/Stone_duality


We should call it the "relationship lemma". That way its function is contained within its name, and it would not require the definition step every time.

We should strive to name all things by their function, not by their inventor or discoverer, IMO. But people like their ribbons.


In my experience, it's basically never the case that the person names the thing after themselves. My theory goes: often a discovery is presented in a paper by someone who gives it a usually only barely passable name. For a time, only a handful of experts in the field know about it, and none of them care to write general explainers for the layman. So they call it what's easy: "[Name] [concept]", because they're used to talking in names all the time. Academic experts carry a large library of people's names tied to the concepts in their papers; I know my PI certainly did, and every query was met with the name of whoever had solved it, to go look up.

Anyway, the discussion begins with these people, who all use the name to reference the paper which contains the result. As the discussion expands, it remains centered on this group, and you have to talk _with_ them and not at them, so you use the name they do. This usage slowly spreads, until eventually it gets written into a textbook, taught to grad students, then to undergrads, and it becomes hopeless to change the name.

I share the frustration with naming; we could come up with much better names for things now. But until we give stipend bonuses for good naming, the experts will never care to do so. I wholeheartedly disagree, though, that the problem as a whole can be reduced to "people like their ribbons". Naming something after yourself is so gauche it would not be tolerated in my field, at least. The other professors would create a better name simply out of spite for your greed.


Well, this is more applied and less straightforwardly categorical, but thinking along the lines of looking solely at compositional structure, rather than at all the properties of functions we usually take as semantic bedrock in functional programming (namely referential transparency), is how you start doing neat arrowized tricks like tracking state in the middle of a big hitherto-functional pipeline. For instance, automata (functions which return a new state/function alongside a value) can be neatly woven into pipelines composed via arrow composition, in a way they can't be in a pipeline composed via function composition.
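Here is a rough Python sketch of that idea (a toy mini-API loosely mirroring Haskell's `Automaton` arrow; all names are invented for illustration): each machine maps an input to an output plus the machine to use next, and composition threads that hidden state through a pipeline.

```python
from typing import Callable, Tuple

# A Mealy-style automaton: input -> (output, next automaton)
Auto = Callable[[object], Tuple[object, "Auto"]]

def arr(f: Callable) -> Auto:
    """Lift a plain function into a stateless automaton."""
    def step(x):
        return f(x), arr(f)
    return step

def compose(a: Auto, b: Auto) -> Auto:
    """Arrow-style composition: run a, feed its output to b,
    and keep both machines' successor states for the next step."""
    def step(x):
        y, a2 = a(x)
        z, b2 = b(y)
        return z, compose(a2, b2)
    return step

def running_sum(total: int = 0) -> Auto:
    """A stateful stage, impossible with bare function composition."""
    def step(x):
        return total + x, running_sum(total + x)
    return step

# A pipeline with a stateful stage woven between pure ones.
pipeline = compose(arr(lambda x: x * 2), compose(running_sum(), arr(str)))

out = []
m = pipeline
for x in [1, 2, 3]:
    y, m = m(x)
    out.append(y)
print(out)  # ['2', '6', '12']
```

The point is that `compose` only uses the compositional shape of the stages, so pure functions and stateful automata slot into the same pipeline.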

https://en.wikipedia.org/wiki/Abstract_nonsense

https://math.stackexchange.com/questions/823289/abstract-non...

Sometimes the proof in category theory is trivial, but we have no lower-dimensional or concrete intuition as to why it is true. This whole state of affairs is called abstract nonsense.


I think that CT is more akin to just a different language for mathematics than a solid set of axioms from which you can prove things. The most fact-y proof I've personally seen was that you can't extend the usual definition of functions in set theory to work with parametric polymorphism (not that just some constructions won't work, but that there isn't one at all).

Well, group theory is a special case of category theory. A group is a one object category where all morphisms are invertible. You do group theory long enough and it leads you to start thinking about groupoids and monoids and categories more generally as well.

Sure, category theory can't prove the unsolvability of the quintic. But did you know that a monad is really just a monoid object in the monoidal category of endofunctors on the category of types of your favorite language?
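For what it's worth, the joke can be cashed out concretely. For the list functor, the monad structure is the "monoid object" structure: `unit` plays the role of the identity element and `join` (flattening) the role of the multiplication, and the monad laws are exactly the monoid laws. A small Python sketch (specialized to lists, so the laws can actually be checked):

```python
def unit(x):
    """return / eta: wrap a value in the functor."""
    return [x]

def join(xss):
    """mu: the multiplication, i.e. flatten one level of nesting."""
    return [x for xs in xss for x in xs]

def fmap(f, xs):
    """Functor action on a function."""
    return [f(x) for x in xs]

xs = [1, 2, 3]

# Left identity:  join . fmap unit == id
assert join(fmap(unit, xs)) == xs
# Right identity: join . unit == id
assert join(unit(xs)) == xs
# Associativity:  join . join == join . fmap join
xsss = [[[1], [2]], [[3]]]
assert join(join(xsss)) == join(fmap(join, xsss))
print("monad laws hold for lists")
```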

Isn't that just the definition?

I think they're making a joke

Phil?

One of the most striking things is that categorical products of objects do not correspond to set-theoretic Cartesian products. This was mind-blowing to me when studying schemes.
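A standard concrete instance of this: the underlying set of a fiber product of schemes can even be bigger than the set product. For two one-point schemes over the reals,

```latex
\operatorname{Spec}\mathbb{C} \times_{\operatorname{Spec}\mathbb{R}} \operatorname{Spec}\mathbb{C}
\;=\; \operatorname{Spec}\!\big(\mathbb{C}\otimes_{\mathbb{R}}\mathbb{C}\big)
\;\cong\; \operatorname{Spec}\!\big(\mathbb{C}\times\mathbb{C}\big),
```

which has two points, while the set product of two one-point spaces has only one.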

Just Yoneda Lemma. In fact it feels like the theory just restates Yoneda Lemma over and over in different ways.

And the number of things you can prove using Yoneda lemma just proves how powerful category theory is.

How is this useful?

>so distant from daily routine that it seems completely pointless

imo, this is a problem with how it's taught! Order theory is super useful in programming. The main challenge, beyond breaking past that barrier of perceived "pointlessness," is getting away from the totally ordered / "Comparator" view of the world. Preorders are powerful.

It gives us a different way to think about what "correct" means when we test. For example, state machine transitions can sometimes be viewed as a preorder. And if you can squeeze it into that shape, complicated tests can reduce down to asserting that <= holds. It usually takes a lot of thinking, because it IS far from the daily routine, but by the same rationale, forcing it into your daily routine makes it familiar. It lets you look at tests and go "oh, I bet that condition expression can be modeled as a preorder on [blah]"
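A small Python sketch of that testing idea (the state machine here is a made-up order-lifecycle example, not from any real codebase): the transition relation induces a "reachable-from" preorder, and lifecycle tests collapse into assertions that <= holds or fails.

```python
# Hypothetical order lifecycle: state -> set of successor states.
TRANSITIONS = {
    "created":   {"paid", "cancelled"},
    "paid":      {"shipped", "refunded"},
    "shipped":   {"delivered"},
    "delivered": set(),
    "cancelled": set(),
    "refunded":  set(),
}

def leq(a: str, b: str) -> bool:
    """a <= b iff b is reachable from a.
    Reflexive and transitive by construction: a preorder."""
    seen, stack = set(), [a]
    while stack:
        s = stack.pop()
        if s == b:
            return True
        if s not in seen:
            seen.add(s)
            stack.extend(TRANSITIONS.get(s, ()))
    return False

# Complicated "can this ever happen?" tests become one-liners:
assert leq("created", "delivered")      # a valid lifecycle exists
assert not leq("cancelled", "shipped")  # cancellation is terminal
assert leq("paid", "paid")              # reflexivity
```

Note it is only a preorder, not a partial order: two distinct states on a cycle would each be <= the other without being equal.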


You say pretty obvious, but it took me 2 years during my PhD to become consciously aware of this. And once I did, I immediately knew I wanted to leave my field as soon as I finished.

I'm just curious. Do you play computer games?

I have played quite a lot of video games in the past yes. But not much anymore.

If I had a euro for every time I started writing a compiler and got lost in the parser weeds, I'd have... at least a couple of euros

Motivational comment to remind OP that his life matters, even and especially, in difficult times

You're talking like a bite must happen. It doesn't. Source: myself; we've had a dozen dogs, among them a Rottweiler, Newfoundlands, a Pyrenean Mountain Dog, a terrier, and dozens of Chihuahuas and Spitzes


I am on the latest Fedora GNOME, and tab switching between windows randomly gets stuck. It's so annoying that I had to go back to X11, even though it handles my high-DPI laptop badly; the alternative being to reboot randomly in the middle of work


Is there any way to attract bumblebees to one's plot of land?


My grandfather had these hedge-like bushes with giant red flowers lining the front windows that always had bumblebees. I'm not great at identifying flowers; they looked like hibiscus, maybe, but in a somewhat dense bush or hedge structure. Anyway, the bumblebees loved it. I didn't notice them anywhere else on the property, and the first time I saw them (at 4-5 years old) I was quite terrified and would have remembered. They were huge and furry with bold colors and not afraid of me, but not so scary after I learned about paper wasps from playing around in the wood-shed.


It really depends on where you live. I've found this site useful:

https://xerces.org/pollinator-conservation/pollinator-friend...

Make a few beds and allow them to be "wild" based on your region. All sorts of pollinating insects will show up, eventually.


They're after low-lying flowering plants, on my property. Basically weeds in the grass.


I don't know what world you/I are living in. I do ask Claude to enumerate/explain concepts I am not familiar with. I have never approached the free-tier limit (is there one?). At work, we have a webpage which is basically a chat to different models; sometimes I use it

Would I be paying 20€ to ask those questions? I dunno, I don't feel any particular need. Would I be paying 200€?! Are you insane? Hell no


People aren't paying $200 to chat; they're paying to have Claude Code, Cowork, or Claude for Chrome do the work instead.


The question should really be what is the reservation price of existing buyers.

At some point the price will rise. But the value has to have risen for existing buyers to be OK with that. Can they perceive the value, though? Hmm, difficult to tell. Benchmarks are not an objective way to measure it.

In the long run, Google is most likely to acquire a serious cost advantage, given their level of vertical integration.


We've been experimenting with Claude Code handling Jira tickets and opening PRs; we're starting with Opus. It costs about $1 per PR that gets merged. How much does it cost to have a software engineer do that PR? That's your price sensitivity. It will only get cheaper as models get more efficient and people get better at using them, though.


You're operating from a micro perspective.

Managers of firms care about the impact on financials. They don't care about the metrics you are measuring / gaming. Ultimately all 'progress' has to show up in the cash flows.

Are you taking on more cost-reduction projects and more revenue-generating projects? Are you actually delivering? Are customers perceiving you to be as trusted as before? Etc. These are the only things that matter. 'Show me the money.'

To me this is akin to the discussion re. Scrum, agile etc. Who cares? Show me the money.


Totally agree. We cut the scrum this week; it is so impractical in the modern world.


This week my spending is > $2000, and my CEO is very happy; another top user burned $1600.

You personally will not pay, but those who will, will replace people who are not productive.

That's actually the limiting factor. There will be a top 5% of developers who will throw the remaining 95% out onto the street. No EU socialism will protect the rest. It is very unclear what the new world market looks like, with no web developers, no custom dashboard teams, no analytics teams, no devops whose role is to babysit human devs. On one side it's a market of opportunities to automate, but human devs are generators of unlimited inefficiencies. Once the market settles down, the demand for lots of tools will shrink.

On my spending with Anthropic: I'm a manager and a top user. I burn tokens on ticket creation, planning, even sprint filtering, though for sure the most went to coding. Bugs are getting fixed by just sharing a screenshot and a sentence or two of description in 80% of the cases, all the way through to PR creation and making sure the release is live and good, with tests.

There are ways to cut the cost a bit, maybe 30-40%, but it's not practical.


Sure, can't wait to see traders with a ChatGPT license trying to convince it to explain the P&L, lol. But what do I know

