
Lucky for America, in the case of civilizational collapse there will be a lot of spare semiconductors thanks to almost everyone being dead!

Well, we have cool projects like CollapseOS, but the problem is that there is so much undocumented silicon out there that can't be used without massive effort. I know several "gold scrappers" and it's such a shame that they trash great classic chips just to get back a bit of metal. So much effort went into making those chips, and it's a shame that many can't be reused. While a lack of cheap electricity prevents open designs from being reused, there is an even bigger world of undocumented chips being trashed as well.

He's producing semiconductors with a 1000nm (one micron) feature size. This kind of tech was cutting-edge in the mid-80s. You might be able to produce a 32KB memory chip with it.

It would be difficult to break into the RAM business with that sort of product as most of the demand these days is for higher capacities.


I don't see the OP implying that anyone should trust the government. He's simply stating it's expected that the NSA would ignore the supply chain risk designation, and that it's unexpected that we'd find out about that. If anything the comment seems to imply a lack of trust in government.

SpaceX has bailed out Grok, Twitter and Tesla, so now it's our turn to bail out SpaceX in the IPO.

The American taxpayer has been bailing out Tesla and SpaceX for many years. Elon is the biggest welfare queen in history.

The US gave him a trillion dollars?

Try setting up one laundry which charges by the hour and washes clothes really really slowly, and another which washes clothes at normal speed at cost plus some margin similar to your competitors.

The one which maximizes ROI will not be the one you rigged to cost more and take longer.


I don't think the analogy is correct here.

Directionally, tokens are not equivalent to "time spent processing your query", but rather a measure of effort/resource expended to process your query.

So a more germane analogy would be:

What if you set up a laundry which charges you based on the amount of laundry detergent used to clean your clothes?

Sounds fair.

But then, what if the top engineers at the laundry offered an "auto-dispenser" that uses extremely advanced algorithms to apply just the right optimal amount of detergent for each wash?

Sounds like value-added for the customer.

... but now you end up with a system where the laundry management team has strong incentives to influence how liberally the auto-dispenser will "spend" to give you the "best results".


Shades of “repeat” in lather, rinse, repeat.

Wow, that is terrible. In my memory GPT-2 was more interesting than that. I remember thinking it could pass a Turing test, but that output is barely better than a Markov chain.

I guess I was using the large model?


Here is the XL model. 20x the size of the medium model. Still just 2B parameters, but on the bright side it was trained pre-wordslop.

https://huggingface.co/openai-community/gpt2-xl


There’s an art to GPT sampling. You have to use temperature 0.7. People never believe it makes such a massive difference, but it does.
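For what it's worth, "temperature" just rescales the logits before the softmax, so lowering it sharpens the distribution toward the most likely tokens. A minimal, library-free sketch (the logit values are made up for illustration, not taken from any real model):

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Divide logits by the temperature before normalizing.
    # Lower temperature sharpens the distribution; higher flattens it.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample(logits, temperature=1.0):
    # Draw one token index from the temperature-adjusted distribution.
    probs = softmax(logits, temperature)
    return random.choices(range(len(logits)), weights=probs)[0]

logits = [2.0, 1.0, 0.1]
print(softmax(logits, temperature=1.0))
print(softmax(logits, temperature=0.7))  # top token gets more probability mass
```

At 0.7 the top token's probability rises (here from about 0.66 to about 0.77), which is why the same model reads noticeably less random at lower temperatures.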

Probably a much better prompt, too. I just literally pasted in the top part of my comment and let fly to see what would happen.

The article is about two models which have either 2B or 4B parameters. Both are dense models. The 2B version will certainly use less power than qwen3-coder-next.

The models are quite good. They aren't just a tech demo.


"Epic refresh pull" is my personal pet hate right now. Although "like if you are watching this in <year>" on older videos is close behind.

The comments pretending Marvel actors (e.g. Benedict Cumberbatch) are their Marvel characters in other movies (e.g. Sam Mendes' 1917) kill me.

It was apparently very successful:

https://en.wikipedia.org/wiki/EPB


Unfortunately too successful!


What happens now is not Clason's problem anymore, I guess.

The point they seem to be making is that it never was their problem; they were just solving it for everyone for free anyway, and in return they were told they were doing it wrong and that they should stop interacting with people.

Honestly even when people are being paid to work for you and their job is to do what you ask them to, speaking to them like that is never going to work out.

