Hacker News | michaelcampbell's comments

Same, and constructing at least drafts of huge documents that I can iteratively fine-tune, which (just last week) saved me tens of hours.

And it's based on reality (the code) rather than my feelz about what I vaguely remember the code doing at some point in the past.


My take is new capabilities will consume any price reductions, making them moot. At least in the medium term.

A RAM price drop due to some magic efficiencies assumes everything else doesn't change, which I doubt anyone honestly thinks will be the case.


I didn't have any "top of page" navigations that I didn't control going on.

Do you have any links to documentation of this? Andreessen has a definite bias as well, so I'm not about to just accept his say-so in a fit of appeal to authority.

(eg: "Cite?")


He was talking about it in the Lex Fridman interview after Trump was elected. And he was talking about a lot of things the Biden administration forced on Silicon Valley at that time (since then, Google lost a case about one of these backroom deals).

So no evidence then. Kind of like Lex touting his bona fides as a professor.

> We've been securing our systems in all ways possible for decades and then one day just said: oh hello unpredictable, unreliable, Turing-complete software that can exfiltrate and corrupt data in infinite unknown ways -- here's the keys, go wild.

These are generally (but not always) 2 different sets of people.


> Our uptime has a '9' in it! -- Anthropic

GitHub this month is very close to having zero 9s of reliability (unless they want to argue that 89% has a 9 in it).

I'm not sure I've had a day without GitHub hiccups this month, so that feels right.

The comment you are replying to is carefully written in a way that allows for 23.19%.

There is always 88.9%, or 88.89%.

By now, I'm nearly certain that they'd be down to 0 9s of uptime if they counted it conservatively.

Or as the British would say, "9, innit?"
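The nines banter above has real arithmetic behind it: n "nines" of availability means uptime of at least 1 - 10^-n, so 99.9% is three nines and 89% really is zero. A throwaway sketch (the function name and the epsilon fudge are my own, not anyone's official SLA math):

```python
import math

def nines(uptime: float) -> int:
    """Whole 'nines' of availability: 0.999 -> 3, 0.89 -> 0."""
    if not 0.0 <= uptime < 1.0:
        raise ValueError("uptime must be in [0, 1)")
    # Tiny epsilon guards against float error, e.g. 1 - 0.999
    # coming out a hair above 0.001 and flooring to 2.
    return math.floor(-math.log10(1.0 - uptime) + 1e-9)
```

By this count, the "has a '9' in it" claim is satisfied by anything at or above 90%, which is rather a low bar.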

Dumb enough to show a splash screen that might last anywhere from an infuriatingly long time to nigh infinity, I guess.

That it's designed for a thing and becoming the go-to choice for that thing can be far apart indeed.

> Swift could have easily dethroned Python.

Just IMO, but... no. To me a "could have easily" requires n-1 things to have happened, and 1 thing not to have happened. Like, we "could have easily" had a nuclear exchange with the USSR, were it not for the ONE Russian guy who decided to wait for more evidence. https://en.wikipedia.org/wiki/1983_Soviet_nuclear_false_alar...

But even in '15-'17, there were too many people doing too many things with Python by then (the big shift to data orientation started in the mid-to-late '90s, which paved the way for ML and massive Python usage).

The 'n' was large, and not nearly all of the 'n' things were in Swift's favor then.

Again, IMO.


I want to accuse you of using an LLM to write this with the temperature set to some absurdly high value, because on its face it sounds ridiculous.

And yet, here we are.


It's hard to make this up :)


An LLM making this up would be much closer to AGI than anything else I've seen.


It's just a fancified key-talent acquihire of people on the edge. With the amount of cash in LLMs, I expect to see more of this, given the pace of innovation in that field.

The story does sound ridiculous on its face, but that's the press spin.

