First of all, you don't need it. Secondly, the regulation even states that the right is granted automatically anyway. Technically, the rule has been in place for the past 45+ years - even when there was mandatory military service! - so it doesn't make any practical difference.
> Apparently it is bureaucracy without purpose after all?
No, it's not without purpose at all. The purpose is to know, in a timely manner, who could be drafted should the need arise. There are currently two major wars - sorry, "special military operations" - happening, one of which is in Europe.
A certain government involved in one of these calls on allies to assist while simultaneously and openly questioning half a century of military alliances. So maybe this helps explain why regulations like this make sense - even for people who never lived through a time of mandatory military service and take their own security for granted.
At the moment, the law has no teeth, since they cannot stop anyone from simply leaving without a return ticket, and nothing happens when you return. Of course, it would be very easy to change that - and that's exactly why it exists.
This is an interesting observation. So maybe it has nothing to do with the model itself, but everything to do with external configuration. Token-limit exceeded -> empty output. Just a guess, though.
> Token-limit exceeded -> empty output. Just a guess, though.
That'd be really non-obvious behavior. I'm not aware of any inference engine that works like that by default; usually you get everything up to the limit. Otherwise that kind of breaks the whole expectation behind setting a token limit in the first place...
I just fixed this bug in a summarizer. Reasoning tokens were consuming the budget I gave it (1k), so there was only a blank response. (Qwen3.5-35B-A3B)
Most inference engines would return the reasoning tokens though, wouldn't you see that the reasoning_content (or whatever your engine calls it) was filled while content wasn't?
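The check described above can be sketched as a small helper. This is a hypothetical example, not any engine's official API: field names like `reasoning_content` and the exact response shape vary between inference engines, so treat them as assumptions.

```python
# Hypothetical sketch: classify why a chat completion's visible output is
# empty, distinguishing the "reasoning consumed the whole token budget" case.
# The "reasoning_content" key is an assumption -- different engines use
# different field names for reasoning tokens.

def diagnose_empty_output(choice: dict) -> str:
    """Classify why a completion's visible content might be empty."""
    message = choice.get("message", {})
    content = (message.get("content") or "").strip()
    reasoning = (message.get("reasoning_content") or "").strip()
    hit_limit = choice.get("finish_reason") == "length"

    if content:
        return "ok"
    if reasoning and hit_limit:
        # The model "thought" until max_tokens ran out: raise the budget
        # or cap the reasoning for this call.
        return "reasoning consumed the token budget"
    if hit_limit:
        return "token limit hit before any output"
    return "empty for some other reason"


# A response shaped like the summarizer bug described above:
truncated = {
    "finish_reason": "length",
    "message": {"content": "", "reasoning_content": "Let me think step by step..."},
}
print(diagnose_empty_output(truncated))  # -> reasoning consumed the token budget
```

If the engine doesn't return reasoning tokens at all, the best you can do is check for `finish_reason == "length"` together with an empty `content`.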
This doesn't necessarily relate to the inference itself. No model is exposed to input directly when using web-based APIs; there are pre-processing layers involved that do undocumented things in opaque ways.
Look at Python - similar story. Once a reasonably usable global package registry exists, this is exactly what happens. Languages and standard libraries evolve, shipped code more often than not doesn't.
> you don't see C programmers creating shared libraries to determine if a number is odd, or to add whitespace to a string.
Believe me, if C had a way to seamlessly share libraries across architectures, OSes, and compiler versions, something similar would have happened.
Instead you get a situation where every reasonably big modern C project starts by implementing their own version of string libraries, dynamic arrays, maps (aka dictionaries), etc. Not much different really.
Absolute nonsense. Apart from the fact that password length is necessarily finite due to memory and time constraints, passwords aren't stored as clear text. You will get hash collisions, because the number of unique hashes is very much finite.
Your argument therefore doesn't apply in this context.
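The pigeonhole argument above can be made concrete: any fixed-size hash has finitely many possible outputs, so collisions must exist. With a full SHA-256 you'd never find one by brute force, so this toy sketch truncates the digest to 16 bits (65,536 possible values) to force a collision within a few hundred tries.

```python
# Toy demonstration of the pigeonhole principle for password hashes:
# truncating SHA-256 to 16 bits guarantees a collision long before we
# run out of candidate passwords, because only 65536 outputs exist.
import hashlib

def tiny_hash(password: str) -> int:
    """Truncate SHA-256 to its first 2 bytes -> only 65536 possible hashes."""
    return int.from_bytes(hashlib.sha256(password.encode()).digest()[:2], "big")

seen = {}
for i in range(100_000):
    pw = f"password{i}"
    h = tiny_hash(pw)
    if h in seen:
        print(f"collision: {seen[h]!r} and {pw!r} both hash to {h}")
        break
    seen[h] = pw
```

The same argument holds for real hashes; the collision space is just astronomically larger, which is why finding one is impractical rather than impossible.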
> Most things can't be taught to do arithmetic, making this "transformer" thing slightly magical.
Yep, for people who don't know the fundamentals (i.e. maths). To people who don't know the universal approximation theorem, this may seem like "magic", but it's just as much magic as making a dark room bright by flipping a light switch.
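To demystify the "magic" a bit: a single hidden layer of ReLU units can already represent simple nonlinear functions exactly, no training involved. A minimal sketch, using the identity |x| = relu(x) + relu(-x):

```python
# A two-neuron, one-hidden-layer ReLU "network" that computes |x| exactly:
# hidden layer weights (+1, -1), output layer weights (+1, +1).
# This is the flavor of construction behind the universal approximation
# theorem -- stacking enough such pieces approximates any continuous function.

def relu(x: float) -> float:
    return max(0.0, x)

def abs_net(x: float) -> float:
    return 1.0 * relu(1.0 * x) + 1.0 * relu(-1.0 * x)

for x in (-3.0, -0.5, 0.0, 2.0):
    assert abs_net(x) == abs(x)
```

Nothing mystical: piecewise-linear building blocks, summed, reproduce the target function.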
The people already doing this work today already do exactly that.
There's no goalpost shifting here - it's art for art's sake at its finest.
It'd be introducing an agent where no additional agent is required in the first place, i.e. telling a farmer how to do their job when they already know how to do it and are already doing it.
No one needs an LLM if you can just lease some land and tell someone to tend to it (i.e., do the actual work). It's baffling to me how out of touch with reality some people are.
Want to grow corn? Take some corn, put it in the ground in your backyard and harvest when it's ready. Been there, done that, not a challenge at all. Want to do it at scale? Lease some land, buy some corn, contract a farmer to till the land, sow the corn, and eventually harvest it. Done. No LLM required. No further knowledge required. Want to know when the best time for each step is? Just look at when other farmers in the area are doing it. Done.
That's one way to put it. The other would be 1 year of paid community service (which the alternative services ALWAYS were).