Hacker News | repelsteeltje's comments

I found this interview [0] on the subject of AI in CS education on the Oxide & Friends podcast very illuminating. Of course, Brown University CS != all of education, but it's an interesting angle nevertheless.

[0] Episode webpage: https://share.transistor.fm/s/31855e83


Seems only fair that we pay our taxes when those are used to subsidize such lofty endeavours.

I think there are plenty of issues with data center construction, but there are real economic benefits here. If there weren't, it would be pretty easy for states to thwart them. You would see the leverage switch, with companies paying states incentives.

This assumes that the legislators and regulators who approve projects like this are motivated by economic benefit and not by campaign donations and other favors.

Could you say what those benefits are?

UB != unsafe

I'm sure there have been attempts at defining a language that has no UB, but afaik all meaningful languages have UB lurking in some dark corner, whether implicit or enumerated explicitly. For example, Java thread execution order is UB.

> For example, Java thread execution order is UB.

In this context "UB" means something different than how you're using it. The UB being mentioned here is the "nasal demons" form, i.e., programs which contain undefined behavior have no defined meaning according to the language semantics.

What you're talking about is probably better described in this context as "unspecified behavior", which is behavior that the language standard does not mandate but that does not render programs meaningless. For example, IIRC in C++ the order in which g(), h(), and i() are evaluated in f(g(), h(), i()) is unspecified - an implementation can pick any order, and the order doesn't have to be consistent, but no matter the order the program is valid (approximately speaking).


Great example.

So this "unspecified behavior" might turn into the more nasal demon type when g(), h() and i() share mutable state and assume some particular sequential order of execution. No?


Not necessarily. Unspecified behavior and undefined behavior are independent concepts; a language can have one but not the other. As a result, you can have languages where incorrect reliance on unspecified behavior can lead to undefined behavior (e.g., C and C++) and languages where incorrect reliance on unspecified behavior can lead to bugs, but not nasal demons (e.g., Java).

That would depend entirely on the assumptions being made and the constructs being used. I think in most cases it would likely just result in regular garden variety bugs.

But sure, if you're writing C++ and (for example) g is depended on for initialization of pointed-to memory that the other two consume, you could end up with UB. But if you're writing Java then no, you will not end up with UB, just buggy code.


It's not difficult to have a Special Purpose language with no UB. There isn't any UB in WUFFS (a language for Wrangling Untrusted File Formats Safely) for example.

Those "dark corners" come into the picture when you decide you want a General Purpose language.

When your program might actually intend to respond to emails by executing the x86-64 machine code squirrelled away inside this logo PNG and running the output as SQL on your customer database, it's not possible for the language to ensure that programs which weren't intended to do that can't do that; how would it know? The translator isn't a mind reader, and your intent is unreadable.

I think we should use general purpose languages much less often; the industry doesn't seem to agree, and the results are obvious for everyone to see.

Rust has a subset, "safe Rust", which isn't a fully general purpose language but deliberately shares its syntax with the larger unsafe Rust you can use when it turns out that you need it. Safe Rust doesn't have UB.


Yeah, like key bindings in IntelliJ that might make sense on Windows or Macintosh, but conflict with Linux defaults. I switched to Linux a couple of decades ago, but this second-class treatment of the Linux desktop is one of the reasons I'm still doing most of my work in the terminal.

To be fair, for IntelliJ, just switch the keymap to the GNOME or KDE one (yes, it comes with them out of the box) and that problem is fixed.

... and the safety argument is a great way of saying "no" disguised as a "yes, if ..." to your prospects.

I get the sarcasm, but what about Neanderthals versus Homo Sapiens?

What about it?

Today's tech echoes the 1960s-1970s mainframe era: very centralized around a handful of companies controlling "massive cloud compute" in bespoke mainframe-like topology.

All of that will be legacy in a couple of years. Today's B200 clusters are tomorrow's e-waste. Decentralization might happen gradually or abruptly. But to me it's obvious that we'll be thinking of high-tech tensor processors and GPUs the way we thought of individual transistors and tube amplifiers in the 1980s.

If AI turns out to be the revolution it purports to be, then the underlying hardware will change much more rapidly than it did with ICs and microprocessors in the late 1970s. Today's hot hardware is tomorrow's junk.


> Today's B200 clusters are tomorrow's e-waste.

Hardware depreciation timescales are actually getting longer, not shorter, because frontier hardware like B200 clusters is in short supply. It's not just a RAMpocalypse out there; we're seeing early signs of production bottlenecks with GPUs and maybe even CPUs.


Which, in itself, is a major crack that AI has caused in the delicate foundation of our technological society.

One thing that is potentially different this time is that Moore's Law has stopped scaling. Computers aren't getting smaller exponentially; they're getting bigger, with multiple chips glued together to make up for it.

...But there's a new world dawning for photonic chips.

There's no reason to expect Moore's observation to apply there (though maybe it will?), but it will have big implications for power usage.


Photonic chips allow computers to get bigger, not smaller.

Free (but admittedly useless) advice when you plan to fall seriously ill:

- do not get on a cruise ship

- do not get off at a remote island


From what I gathered from the article, the person who got off was a resident of Tristan. They have such limited shipping options that this might have been the only way for them to travel from any mainland. I'm not sure, though; I don't think they got off there to seek medical assistance.

Hah. Coincidentally, I was listening to this exact episode of the Future of Coding podcast while stumbling upon this HN mention.

Access to these visuals really helps in understanding what they are talking about.


Is it the podcast "Feeling of computing" on Spotify?


Seconding: it's a great podcast! This is the second time in as many days that one of their research papers has popped up on here.

