I think you're misinterpreting that. Everything other than food and fuel went up 2.8%. Everything (including food and fuel) went up 3.8%. Therefore food and fuel went up more than 3.8%.
We can see that advertised on every corner, too. Gas costs for me locally went from $3 pre-war to over $5 now. My "investment" in EVs and solar is feeling really good right now.
You have food and fuel, which is some fraction of the economy - call that F. You have a rate of inflation in fuel and food - call that f. And you have a rate of inflation in everything else - call that e. Then you have
3.8 = e(1-F) + fF.
You also have e = 2.8.
I think what you're claiming is that fF = 1.0, so that e(1-F) = 2.8. And I think that's wrong. When they say inflation apart from food and fuel is 2.8, they mean e, not e(1-F).
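To make the decomposition concrete, here's a quick numeric sketch. The share F is made up for illustration (14%); only e = 2.8 and the 3.8 headline come from the thread:

```python
# Decomposing overall inflation: overall = e*(1-F) + f*F
# F is a hypothetical basket share; e and overall are from the discussion above.
overall = 3.8   # headline inflation, %
e = 2.8         # inflation rate excluding food and fuel (the rate e, not e*(1-F))
F = 0.14        # assumed share of the basket that is food and fuel

fF = overall - e * (1 - F)   # percentage points of headline inflation from food and fuel
f = fF / F                   # implied inflation rate *within* food and fuel

print(f"contribution fF = {fF:.3f} percentage points")
print(f"implied f = {f:.2f}%")
```

With these assumed numbers, f comes out near 10%, well above the 3.8% headline, which is consistent with the point that food and fuel rose faster than everything else.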
You're overcomplicating it, because you don't need rates within subcategories when looking at the whole: e is given and f is useless.
3.8 - 2.8 = 1
The overall inflation is 3.8. Overall inflation without food and fuel is 2.8. The overall inflation attributable to food and fuel must then be 1 (this is different from the rate of inflation within food and fuel as a category, f).
They probably had a few stock owners in mind, which came ahead and keep coming ahead with strategically planned transactions placed right before another US major move - all by pure coincidence of course.
- Code lives longer than you expect. You forget sooner than you expect. Make a readme/architecture overview/theory of operation document. Put more in it than you think is needed. Check it in with the code.
So there was this same kind of feeling with webdev frameworks. You're not using this week's latest? What are you, some kind of barbarian? Get with it, or get left behind!
But people eventually found out that the additional productivity of the latest webdev framework wasn't worth rewriting everything to use it, at least not every time a new one came out. Yeah, there was a massive hype cycle for each new thing, but eventually the hype wasn't enough to make everybody care, because the payoff just wasn't enough.
The same thing happened with CPUs. There was a time when people really cared about new generations of CPUs. The performance difference really mattered. Now nobody cares. (All right, not totally nobody. Gamers may care. People who need absolute maximum performance care. Some enthusiasts care. For most people, though, the next greatest CPU really isn't going to move the needle much.)
I think AI is already at that place. Yeah, there's still a massive hype cycle. Who cares? Nobody has any mechanism for forcing you to care. Is the payoff really there for switching everything over to the new model, yet again? And if it's not, then who cares about the hype and the buzz? Ignore it and go on with your life. Move up every now and then, if the benefit is there, but not every new release. And don't feel guilty for not moving up.
Somebody could figure out a mathematical formula for percentage "better" a new release is, and how much disruption switching causes, and based on that how often it's worth switching. But I don't have to have the formula to know that if things come often enough, and the improvement is small enough, it's just not worth it.
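A back-of-the-envelope version of that formula, with every number made up: switching pays off when the extra output you'd get before the next release obsoletes this one outweighs the disruption of migrating.

```python
# Toy break-even model for "is switching worth it?" -- all parameters hypothetical.
def worth_switching(gain_pct, switch_cost_days, release_interval_days):
    """Switch if the extra output before the next release beats the disruption.

    gain_pct: productivity improvement of the new release, e.g. 0.05 for 5%
    switch_cost_days: workdays lost to migration and rewrites
    release_interval_days: days until the next release supersedes this one
    """
    benefit_days = gain_pct * release_interval_days  # extra output, in workday-equivalents
    return benefit_days > switch_cost_days

# Frequent releases with small gains: not worth it.
print(worth_switching(gain_pct=0.03, switch_cost_days=10, release_interval_days=90))   # False
# A rare release with a big gain: worth it.
print(worth_switching(gain_pct=0.30, switch_cost_days=10, release_interval_days=365))  # True
```

The two calls capture the intuition in the paragraph above: when releases come often enough and the improvement is small enough, the benefit window never covers the switching cost.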
Nah, Smith stands out because everybody knows it's common. Cooper is better. It's one of the 10 most common last names in the US, but it doesn't give off "super common, might be fake" vibes.
"Still" means "it always had hallucinations, and it still does, despite people thinking that it doesn't anymore". People think we've moved past that. We haven't.
And then getting bugs when they use a new version of the AI, just like people occasionally got bugs when they upgraded to new versions of the compiler...
They would get bugs on every invocation of the software, not just on a new version of the AI. It's equivalent to your compiler having a RAND function in it that chooses between a billion different options every time it compiles; it's absolutely not equivalent to a compiler having a bug.
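The distinction can be sketched in a toy example (not modeling any real compiler or model): a deterministic bug reproduces identically on every run, while a sampler gives you a different output each invocation.

```python
import random

def buggy_compiler(src):
    # Deterministic bug: same input, same wrong output, every single time.
    return src.replace("+", "-")

def nondeterministic_codegen(src, seed=None):
    # Picks among many possible outputs on every invocation -- a dice roll per run.
    rng = random.Random(seed)
    return rng.choice([src, src.replace("+", "-"), src.upper()])

# The buggy compiler is at least reproducible:
assert buggy_compiler("a + b") == buggy_compiler("a + b")
```

A reproducible miscompile can be diagnosed once and worked around; output that varies per invocation can't be pinned down the same way.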