
I am a believer that everyone should have their main flow be model/provider agnostic at a high level. I often run out of Claude tokens and use GLM-5 as a backup.

https://gist.github.com/ManveerBhullar/7ed5c01a0850d59188632...

A simple script I use to toggle which backend my Claude Code is using.
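The core of such a toggle can be sketched in a few lines. This is a minimal sketch, assuming Claude Code honors the `ANTHROPIC_BASE_URL` and `ANTHROPIC_AUTH_TOKEN` environment variables; the endpoint URL and the `GLM_API_KEY` variable below are illustrative placeholders, not the actual gist:

```shell
# toggle_backend: switch Claude Code between its default Anthropic
# backend and an Anthropic-compatible GLM endpoint by (un)setting the
# environment variables Claude Code reads. Source this file, then call
# `toggle_backend glm` or `toggle_backend anthropic`.
toggle_backend() {
  case "$1" in
    glm)
      # Placeholder endpoint; substitute your provider's compatible URL.
      export ANTHROPIC_BASE_URL="https://example-glm-provider.com/api/anthropic"
      export ANTHROPIC_AUTH_TOKEN="${GLM_API_KEY:-}"
      ;;
    anthropic|*)
      # Unset the overrides so Claude Code falls back to its default login.
      unset ANTHROPIC_BASE_URL ANTHROPIC_AUTH_TOKEN
      ;;
  esac
}
```

Since the variables must land in the current shell, the function has to be sourced (`. ./toggle.sh`) rather than run as a subprocess.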


I tried the agnostic thing for a while, but there are enough quirks between the providers that I gave up trying to normalize them. GPT-5.x wipes the floor with other models for my specific tool-calling scenarios. I am not going to waste time trying to bridge arbitrary and evolving gaps between providers.

I put my Amex details into OAI, I get tokens, it just works. I really don't understand what the hell is going on with Claude. The $200/mo thing is so confusing to me. I'd rather just buy however many tokens I plan to use. $200 worth of OAI tokens would go a really long way for me (much longer than a month), but perhaps I am holding it wrong.


Being model-agnostic and provider-agnostic are orthogonal concerns.

E.g. you can run Claude models on AWS Bedrock, giving you provider choice for the same model. Whether you need model agnosticism at that point is a very different question.


> e.g. you can run Claude models on AWS Bedrock giving you provider choice for the same model

Is anyone doing this for personal dev who isn't token-fed by an employer? Coding plans are subsidized for a reason, right? If I did API usage through a cloud provider, I'd be out tens of thousands already.


Interesting; do you find they actually react the same way to the harness?

There are differences for sure. Claude models feel the most 'stable' in that I see fewer tool-confusion messages and other mistakes, like the one I'm looking at right now:

"Wait, I'm editing the wrong sections. The edit tool tried to match but replaced with different prop names than what was in the file. Let me re-read the file and understand the current state properly."

And of course models are not 1-to-1 and have different strengths and weaknesses. I probably won't get the same quality of plan-mode output. It's a tradeoff.


I generally assume the differences could be minimized by tailoring the instructions to each model; they're not incapable of doing the same things, but the way they're instructed matters because it needs to draw on their training.

But I don't use any of the cloud stuff; I'm local4lyfe.


Around the part where Margaret explains the problem to Tom, I started to feel annoyed. I could tell it was an LLM trying to fit a sci-fi novella style of writing. And it was doing a good job; it was certainly better than 90% of the posts I've read in the last 6 months.

Don't know why that makes me annoyed; maybe because of the depressing seriousness of being a 'prompter' and the Americana framing of it.


Really really cool. I appreciate the article style a lot too.


Thanks! A lot of the credit goes to the people who helped write and edit.


I'll be honest: I got all the way through the installer until it asked me to install Java 17+ manually. I haven't installed Java in forever, and I remember it being a huge pain, so I'll pass. I'm heavy on code-orchestration tools at the moment, and this one looks useful, so hopefully I'll try it when it's further along in development.


Hey, thank you for trying anyway. To be honest, I also remember installing Java being painful, but apparently nowadays `brew install openjdk@21` plus adding the path to your shellrc does the trick (in case you want to give it a second try).
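For reference, that two-step route looks like the following. This is a setup sketch, not tested on every machine: Homebrew's `openjdk@21` formula is keg-only, so it isn't linked onto PATH automatically, and the prefix shown is for Apple Silicon (Intel Macs use `/usr/local` instead of `/opt/homebrew`):

```shell
# Install OpenJDK 21 (keg-only: Homebrew does not put it on PATH)
brew install openjdk@21

# Make it the default java for future shells (zsh shown; adjust for bash)
echo 'export PATH="/opt/homebrew/opt/openjdk@21/bin:$PATH"' >> ~/.zshrc

# Open a new shell, then verify
java -version
```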


I mean, he took a picture of the guy and posted it on his Twitter calling him a 'fat uncle'. I don't think he cares about being polite.


You have to understand, though, that this is X (RIP Twitter) we are talking about, and from the verified account and the 14k follower count, it is evident that this person either is, or is trying to be, a tech "influencer". Posting controversial rage bait is pretty much the pattern every influencer follows today to stir up discussion, increase their visibility, and get more followers.

> I don't think he cares about being polite.

If you're polite, debate civilly, say reasonable things and act like a normal person, you are a nobody on X. Nobody will see your tweet. Nobody will engage with it. You might as well have not said anything.


Real courageous of that guy, calling someone a "fat uncle" in a Twitter thread. He could've applied that same energy IRL and told him to tone it down.


Is "fat uncle" a slang I don't know about?


In some Asian cultures, "uncle" can be used to refer to any man older than yourself.


Is "uncle" just an old unmarried guy?


Great work. My only nitpick is that when you quit, it tells you it's not going to save, and then it's "yes" to confirm quitting. I instinctively want to type N to quit without saving.

That feels like the opposite behavior. For example, I use https://github.com/zyedidia/micro as my main terminal editor, which asks `Save changes to filename before closing? (y,n,esc)` when you try to quit.


Good point, I'll change it to something more sensible with action-based options instead of y/n. https://github.com/sinelaw/fresh/issues/226
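The appeal of action-based options is that each key names an action, so there is nothing ambiguous for y/n to confirm. A minimal sketch of the idea in shell; the wording and key bindings are illustrative, not fresh's actual implementation:

```shell
# confirm_quit: ask what to do with unsaved changes. Each key maps
# directly to an action, so the answer cannot be misread the way a
# bare yes/no can. The prompt goes to stderr; stdout carries only
# the chosen action for the caller to act on.
confirm_quit() {
  printf 'Save changes before closing? (s)ave / (d)iscard / (c)ancel: ' >&2
  read -r ans
  case "$ans" in
    s) echo save ;;
    d) echo discard ;;
    *) echo cancel ;;   # anything else is treated as the safe default
  esac
}
```

Defaulting unrecognized input to "cancel" keeps the destructive path (discard) reachable only by an explicit keypress.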


Anyone have suggestions for similar tooling for other languages?


JetBrains IDEs :)


The obnoxious site design and comments like this stopped me from clicking Buy in the Apple store.


What an absurd take. Because of some half-assed attempts to hand out small sums, the conclusion is that money doesn't matter, and the fallback is vague talk about culture? A few hundred dollars helps, but I don't know why that study is a big deal to the author; the limited impact is obvious. And what is the connection to the vague rambling about investing in values, or the claim that Biden lost because of spending in red states?

Besides making a bad overall point, what a badly written article.

> As a society, we are pretty good at transferring money to the poor, but we’re not very good at nurturing the human capital they would need to get out of poverty.

What a crock.


I remember when they pivoted from being infra.hq.

