Hacker News | Tuna-Fish's comments

The story is BS, btw.

Firstly, pencils in space pose serious risks. Pencils shed graphite dust, which is conductive and doesn't settle in microgravity. They were used early on, but both space agencies phased them out once they realized the risks, moving first to grease pencils, which kind of suck for normal writing.

NASA didn't research how to make pens that work in space; an American private company did it on its own initiative and with its own money. It then sold the pens to NASA cheaply, marketed the same pens to people not in space at a premium, and made a nice profit.

Today, both Roscosmos and NASA use the same pens, bought from Fisher.


The Space Pen is also an incredibly reliable pen here on Earth. Highly suggest people grab one.

Overpriced for what it is - there are pressurized-cartridge pens starting at $4 that are a lot more ergonomic, too.

Oh neat, had no clue.

No, branch predictors are really important, and even small improvements in them are extremely valuable on real workloads. Improving branch prediction is both a power and a performance optimization.

It does not and cannot use the input value. The branch predictor runs a dozen or so cycles ahead of execution, so it generally cannot see values in registers. It also runs 3-4 cycles ahead of decode, so it cannot even use any bits of the instruction being executed.⁰ Branch predictors use only branch history, but this is more complex than tracking the taken probability of a single branch: there are tables that maintain all branches over the past tens of thousands of cycles and try to capture common patterns.

0: This is why the first prediction is always "don't branch": the first time code executes, the predictor has literally no information at all. Every now and then people ask for hint bits on branches, but, er, how would that work when the instruction containing the branch hasn't even arrived from L1 by the time the prediction is due?
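
A rough illustration of history-based prediction (a toy gshare-style sketch of my own, not any real CPU's design): the table is indexed purely by the branch address and recent outcome history, so the predictor never needs operand values or even the instruction bytes.

```python
class GsharePredictor:
    """Toy gshare-style predictor: 2-bit saturating counters indexed by
    (branch address XOR global outcome history). It only ever sees past
    taken/not-taken outcomes -- never register values."""

    def __init__(self, history_bits=8):
        self.mask = (1 << history_bits) - 1
        self.history = 0                    # global branch history register
        # Counters start at 0 ("strongly not taken"): the first prediction
        # for never-seen code is always "don't branch".
        self.table = [0] * (1 << history_bits)

    def predict(self, pc):
        # Counter value >= 2 means "predict taken".
        return self.table[(pc ^ self.history) & self.mask] >= 2

    def update(self, pc, taken):
        i = (pc ^ self.history) & self.mask
        self.table[i] = min(3, self.table[i] + 1) if taken else max(0, self.table[i] - 1)
        # Shift the real outcome into the history register.
        self.history = ((self.history << 1) | int(taken)) & self.mask
```

Real predictors (TAGE and friends) keep several tables over different history lengths, but the principle is the same: predict from history alone, long before decode.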


On some workloads, swapping is a bad idea.

The fundamental problem here is that the workload of LLMs is (vastly simplified) a repeated linear read of all the weights, in order. That is, there is no memory locality in time. There is literally anti-locality: when you read a set of weights, you know you will not need them again until you have processed everything else.

This means that many of the old approaches don't work, because temporal locality is such a core assumption underlying all of them. The best you can do is really a very large pool of very fast RAM.
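
A tiny illustration of that anti-locality (hypothetical block counts, plain LRU simulation of my own): on a repeated linear scan even slightly larger than the cache, LRU hits nothing at all.

```python
from collections import OrderedDict

def lru_hits(trace, capacity):
    """Count cache hits for an access trace under plain LRU eviction."""
    cache, hits = OrderedDict(), 0
    for block in trace:
        if block in cache:
            hits += 1
            cache.move_to_end(block)        # mark most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)   # evict least recently used
            cache[block] = None
    return hits

# 10 "weight blocks" read in order, five times over, with room for 9:
trace = list(range(10)) * 5
lru_hits(trace, 9)   # 0 hits -- the evicted block is always the next one due
lru_hits(trace, 10)  # 40 hits -- everything fits, only the first pass misses
```

For this access pattern the optimal policy is closer to MRU eviction (keep a fixed prefix resident), which is exactly backwards from what general-purpose swap code assumes.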

In the long term, compute is probably going to move towards the memory.


The main blocker with swapping is not even the limited bandwidth; it's the extreme write workload on data such as the per-layer model activations and, to a much lesser extent, the KV cache. In contrast, there are elements, such as the inactive experts of highly sparse MoE models, where swapping makes sense, since any given expert will probably go unused and you're better off using that VRAM/RAM for something else. So the logic of "reserve VRAM for the highest-value uses, use system RAM as a second tier, and fall back to storage as a last resort or for read-only data" is still quite valid.

How do you get the weights for the right set of experts for a given batch of tokens into fast memory at the right time?

The set of activated experts is only known after routing, at which point you need the weights immediately and will get very poor performance if they have to come across PCIe.


Once your model is large enough, you'll have to eat the offload cost for something, and it might as well be something where most of the VRAM footprint isn't even being used. For current models, inactive experts arguably fit that description best. Of course, it may be the case that shifting that part of the graph to CPU compute is a better deal than paying the CPU-to-GPU transfer cost for the active weights and computing on GPU; that's how llama.cpp does it.
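
The ordering problem can be sketched with a toy router (made-up sizes and random scores, no real MoE library involved): the set of experts to fetch only materializes after the routing step, mid-forward-pass.

```python
import random

random.seed(0)
NUM_EXPERTS, TOP_K = 8, 2

def router_scores(token):
    # Stand-in for the learned router (token @ W_router): any per-expert score.
    return [random.random() for _ in range(NUM_EXPERTS)]

def route(tokens, k=TOP_K):
    # Pick the top-k experts per token, as sparse MoE routing does.
    return [sorted(range(NUM_EXPERTS), key=router_scores(t).__getitem__)[-k:]
            for t in tokens]

chosen = route(["tok0", "tok1", "tok2", "tok3"])
# Only at this point do we know which expert weights are needed; if they sit
# across PCIe, that fetch latency lands squarely on the critical path.
needed = sorted({e for per_token in chosen for e in per_token})
```

Prefetching is hard precisely because `needed` is data-dependent; at best you can speculate from the previous layer's or previous token's routing.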

Voltage glitching: an outside attacker with direct, extremely fine-grained control over the chip's power supply can make it brown out for a single instruction cycle, preventing the result of one instruction from being written.
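
As a toy illustration (an entirely invented scenario, not modeled on any real device): if the glitch lands on the one instruction that diverts execution to the fail path, the check simply evaporates.

```python
def boot(sig_valid, glitch_branch=False):
    """Toy secure-boot check. glitch_branch=True models a one-cycle
    brownout suppressing the conditional branch to the locked path."""
    if not sig_valid and not glitch_branch:
        return "locked"      # the single instruction the attacker targets
    return "unlocked"

boot(False)                      # "locked" under normal operation
boot(False, glitch_branch=True)  # "unlocked": one suppressed instruction
```

This is why glitch-hardened firmware duplicates its checks and uses complementary flag encodings, so that no single skipped instruction can flip the decision.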

With enough sophistication, physical access is more powerful than root access, no exceptions.


The site makes it very clear that the purpose is explicitly not to "get away with it"; it's to try to get fined, presumably in order to then challenge the legality of the laws in a higher court.

It's more complicated than that. For an asset to be a derived work of an original, it is not necessary for it to contain anything from the original. If you start from copyrighted assets and meticulously replace them all with your own art, piece by piece, while following the style and constraints of the originals, and while looking at the originals, I'd bet that a court would find your work to be derived from the originals and therefore covered by their copyright.

A lot of the fan-driven reimplementations of classic games are trivially derived works, because people seem to think that copyright only covers the pixels in the originals, and that if you replace those you're fine.


FreeDoom does that with Doom: it has compatible assets that are not in the same style, although they are done in such a smart way that most PWADs and TCs are totally playable without clashes, from Requiem to Back to Saturn X.

On game engines: reimplementations are not derivations at all but tools for interoperability, totally legal to create - from Wine to most of the stuff on https://osgameclones.com, to GNUstep against the NeXT/OpenStep API (and Cocoa from early OS X), and so on.

If you could sell Cedega back in the day, you can totally sell OpenTTD with free assets, period.

The entire PC industry exists today because of cheap IBM BIOS clones from Taiwan.


Yes, the engines are fine. And if the assets are free, then it is fine to sell the engine with them.

What I'm contesting is whether the assets are actually free. Just because they were all created by volunteers and contain no data from the originals doesn't mean that they are actually free. The rules around derived works are complicated, and too-close homages have been found to be derived works even when there was no actual copying.

If this were to go to court, the things that would matter include "how visually similar do they look" (the answer is "very") and "was the artist aware of, and did they refer to, the originals while doing their work" (given it was done by volunteers who are enthusiasts of the original game, the answers are almost certainly "yes" and "they can't prove they didn't").

And on those facts, the new art is a derived work of the original and falls under its copyright.


Ahem, no. Not the case there. The artwork in OpenTTD falls under fair reimplementation, for cohesiveness with the current extensions and modules. Ditto with FreeDoom and Doom: it is not inspired by Doom but art-compatible with it, so your Strain, Requiem, Back to Saturn X and similar PWADs run the same, without texture or styling clashes.

Artistically speaking, FreeDoom is closer to Half-Life and the like than to Doom, but here's the catch: playing Strain, for instance, won't look like a mess, just different - a bit like a demaked Half-Life (or a game from its era on the Unreal engine), but not a copy.


> The artwork in OpenTTD falls under fair reimplementation, for cohesiveness with the current extensions and modules.

I sincerely doubt that. Unlike the FreeDoom assets, it is too visually similar to the originals, and visual cohesiveness with existing materials (which were created to fit the style of the originals) is a point in favor of it being a derived work, not against.


> The entire PC industry exists today because of cheap IBM BIOS clones from Taiwan.

Forgot to reply to this part: and the reason those clones exist is that multiple companies reimplemented the BIOS in a clean-room way. One team produced a clean spec of all the interfaces, and then a sequestered second team - who could attest in court that they had never worked with, seen, or in any other way come into contact with materials related to the IBM PC - produced a replacement BIOS from only that spec. The clone makers that didn't go to all this effort were sued out of existence.

Do you believe that the free assets produced for games generally meet this standard?


The lender generally has a positive EV, but the variance is high. The interest rates on leveraged buyouts are high, and the lender has priority over everything but taxes. If the company can stay afloat for a while, the lender probably gets made whole and then some, even if the full loan is never paid back.
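
Roughly, with purely illustrative numbers (none of these figures come from any actual deal): a high coupon plus senior recovery can keep the lender EV-positive even with a substantial default rate.

```python
# Hypothetical LBO loan economics, illustrative only.
principal = 100.0
rate = 0.11        # high LBO-style coupon
term = 5           # years to maturity if the company survives
p_default = 0.30
years_paid = 3     # years of interest collected before a hypothetical default
recovery = 0.70    # fraction of principal the senior claim recovers

ev = ((1 - p_default) * principal * (1 + rate) ** term
      + p_default * (principal * rate * years_paid + principal * recovery))
# ev comes out well above the 100 lent, despite a 30% chance of default
```

The variance shows up in the two branches: a surviving borrower returns ~169, a defaulting one only ~103, but seniority keeps even the bad branch close to whole.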

No, it is not. It is in fact no arithmetic at all, if you understand how SI works.

Is it 1mm/sec?

No, what? µ is the dimensionless number 10^-6, just like k is the dimensionless number 10^3.

And you are doing what with that dimensionless number? Multiplying?
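
To make the point concrete (a trivial sketch of my own, not quoting anyone in the thread): a prefix is just a constant scale factor attached to the number, so "conversion" is a single multiplication, not unit algebra.

```python
# SI prefixes as plain dimensionless scale factors.
PREFIX = {"µ": 1e-6, "m": 1e-3, "": 1.0, "k": 1e3, "M": 1e6}

def to_base(value, prefix):
    """Express a prefixed value in the base unit: one multiply, nothing more."""
    return value * PREFIX[prefix]

to_base(1, "µ")   # 1e-06
to_base(5, "k")   # 5000.0
```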

DVD-RAM drives and media were always premium products, with the drives at least ~4x more expensive than the DVD-R drives of the time, and the media markup was much worse than that.

When DVD-R discs bought in bulk cost ~20c each, $10 discs are a hard sell.

