
Well, that's why it was his last-ditch effort, because it was unlikely to work.

But the reasoning is that many companies will go out of their way to help "VIPs" and may even have a separate team to handle them that has more discretion than normal front-line support staff would.

So sure, worth a shot if all else fails.


I'm not sure that anyone wants the scarlet letter of an AI coauthor on their code just because they used something simple like next edit suggestions or AI autocomplete. It seems like the "all" setting basically only exists for people that haven't figured out how to change it to something else yet.

(Funnily enough, I always commit through the command line in VS Code anyways...not sure why. But I guess I would have avoided this annoyance, so that's a plus!)


Yeah. I wasn’t angry about this a couple days ago, but I am now.

So the thing that’s on by default and makes autocomplete worse (plain IntelliSense never changed my s.x = 0 to s.xVInputRadiusDetectionThreshold = 0 if I happened to take my eyes off the screen for a moment) is now stealing credit for my work?

I’m speechless.

Also glad I use a standalone git client.


Agree. I'm more reliant on having a keyboard than on having copilot make tab suggestions and I wouldn't like my PRs to include a tag: "Keystrokes courtesy of: Keychron K3 Max".

Definitely. Can you imagine the kind of world we'd live in if we had to sign each message with each product we used?

This message brought to you by Xfinity Internet.


I think my GPU should get all the credit: "Calculated by AMD 395+ MAX"

What percentage of patients have blood clots in their lungs and a history of lupus, like the article described? That's not on the same level as a common cold at all.

> One experiment focused on 76 patients who arrived at the emergency room of a Boston hospital.

> In one case in the Harvard study, a patient presented with a blood clot to the lungs and worsening symptoms.

That's a single anecdotal fluke from the study, which is misleadingly used to represent the headlining percentages.

If you read the linked paper, it says the LLMs did not outperform any group of doctors in the most important cases:

> The median proportion of cannot-miss diagnoses included for o1-preview was 0.92 [interquartile range (IQR) 0.62 to 1.0], although this was not significantly higher than GPT-4, attending physicians, or residents.

And again, the bigger issue is that skimming nurses' notes and predicting the next tokens, as the study made the doctors do, is not how doctors diagnose medical conditions.


But that's not what I was responding to. "Oh, all of the cases are probably just common colds, so it just guessed cold and was right by sheer luck" is not what happened in the article.

Do you know how examples work? Or methodology? The claim I made is that statistical accuracy percentage ≠ healthcare outcomes, and you will mislead yourself in dangerous ways if you believe a headline that implies they're interchangeable. Not that the model literally guessed common colds when the patients had... boneitis...

The lupus anecdote on its own is irrelevant to whether the statistics are being interpreted in valid ways or not. Also, I said nothing about luck.


You know when people are shooting at you. You don't know when or if people are exploring undocumented/obscure features of your system and what they have learned about it that you were trying to hide.

Therefore, the safest assumption to make is that an adversary has already figured out all of your obscurity, because they can always do this given sufficient time and interest, at which point the only thing between them and you is your security.

That is why we design systems without obscurity and only care about security.


I agree that it's a good principle, but it's taken too far when it's used to justify needlessly growing the attack surface. The principle is useful for justifying security hardening; it is not useful when invoked in ways that increase the odds of being attacked.

Security is mandatory.

Obscurity is optional.

Obscurity is not worthwhile when it increases your own costs. Nevertheless, if you can add obscurity with negligible additional cost and inconvenience, then you should do it.


> Experts are increasingly warning that psychosis and mania can be fueled

...by internet chat rooms.

...by movies.

...by telephone.

...by books.


> Supporting Linux is a monumentally tremendous pain in the ass. Radically more than literally any other platform. It is hands-down the hardest and most painful to support natively.

Funny, I have the same feelings after 5 seconds of using MSVC or looking at Win32 documentation. Or is it WinRT now, or is it .NET Core, or .NET Framework, or UWP or OLE or COM, or whatever the API du jour is, which will be slightly incompatible with the rest of the ecosystem and incomplete in poorly documented and inscrutable ways?

Performance profiling and debugging tools are critical for game development. What's your equivalent to strace again, the one that's built into the system natively? There isn't one?

All major game engines I am aware of support native Linux builds and have for years, anyways.

I guess there's a reason 80% of the servers in the world run Windows. Because it's so hard to develop for. Er, uh...no wait!


Sarcasm is an extremely poor method of communication. Speak plainly and clearly.

> Performance profiling and debugging tools are critical for game development.

Profiling and debugging tools are RADICALLY superior on Windows. RADICALLY. GDB/LLDB is garbage. For debugging, Visual Studio (for adults, not VSCode), or on special occasion WinDbg, is great. Raddbg may be awesome some day and may also support Linux. That'll be great. Today is not that day.

Superluminal is spectacular. They're working very hard on a Linux version. It's taking them a long time because Linux is bad.

> All major game engines I am aware of support native Linux builds and have for years, anyways.

Unity and Unreal do have buttons to export to Linux. Most proprietary game engines don't have Linux clients. Linux for headless servers you control is fine.

> 80% of the servers

Yawn.

The Linux pain is trying to deploy proprietary binaries that run on customer machines which are infinite in variation. Running headless on a single Linux image you control is very different.

Anyhow. Let me know when you ship a game with 3D graphics to customers and have to deal with all their support issues!


Hey, want to race to install toolchains? I'll install everything I need to build the Linux kernel and, say, typical SDL2 applications. I'll give you a 3 hour head start, all you have to do is install Visual Studio, WDK, and the Windows SDK, and hope they play nicely with each other!

Good luck! Then we can switch places so you can install for the 5.3% (and growing) of your gaming userbase that doesn't use an OS with ads in it.

P.S. he who lives in the house of WinDbg is not allowed to throw stones. At anyone. Ever. Nobody thinks that it is "great," you must be kidding.


> Hey, want to race to install toolchains?

Not the parent commenter. It's really not that hard.

MSVC: https://visualstudio.microsoft.com/downloads/?q=build+tools#...

Clang targeting MSVC ABI: https://github.com/llvm/llvm-project/releases/download/llvmo...

Clang targeting MinGW ABI: https://github.com/mstorsjo/llvm-mingw/releases/tag/20260421

GCC targeting MinGW ABI, running in MSYS2: https://github.com/msys2/msys2-installer/releases/download/2...

Or, if you want a command-line, winget makes it easier still:

  winget install -e --id Microsoft.VisualStudio.BuildTools LLVM.LLVM MartinStorsjo.LLVM-MinGW.UCRT BrechtSanders.WinLibs.MCF.UCRT

One-liner, roughly on par with `pacman -S base-devel clang llvm`. To install libraries, I use `vcpkg add port sdl2` in my project, a la cargo. No more fudging with system-level dependencies.

I think you are coming at this with very different priorities. For most people, installing toolchains is a one-time job that they can grit their teeth through, as long as things work well afterward. Nobody is constantly installing or reinstalling, so nobody needs the installation process to be lightning fast.

It's the reason why Linux still hasn't taken off on the desktop. People want good application support, not infinite customisations and blazing fast compile speeds. In fact, slow compile speeds might even give people an excuse to go chat with colleagues and take breaks. What you want is not necessarily what others want.


It's funny you mention that.

You know which platform is super duper mega easy to cross-compile to? And you know which platform is (almost) FUCKING IMPOSSIBLE to compile against an arbitrarily old version of glibc? The answer (in order) is Windows and Linux.

But in any case you still have it wrong. Compiling and running for your own machine isn't the problem. The question is: can you give me a binary that runs on my machine? And also I'm not going to tell you what the environment is. But I will yell at you if it doesn't work.
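The glibc floor a binary imposes is easy to see for yourself. A rough sketch, assuming binutils is installed, and using /bin/sh as a stand-in for a shipped game binary:

```shell
# Every dynamically linked ELF records the glibc symbol versions it needs.
# The highest one is the oldest glibc the binary will load on at all.
objdump -T /bin/sh \
  | grep -o 'GLIBC_[0-9.]*' \
  | sort -u -V \
  | tail -n 1
```

Build on a new distro and that number silently ratchets up, which is why people end up building inside ancient container images, or use `zig cc` with an explicit glibc version in the target triple, just to pin it.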

Anyhow. Portable toolchain install for MSVC + WinSDK took about 30 seconds to download. Very easy. Here you go: https://gist.githubusercontent.com/mmozeiko/7f3162ec2988e81e...

I do agree it's annoying this isn't default behavior.

> you must be kidding.

Geez, you are very frustrating to communicate with. I literally said "on special occasion". Good grief. The Visual Studio debugger is kinda mediocre but still best-in-class, and Linux doesn't even have an equivalent to compare against. WinDbg has some slick commands for super-niche cases. Awful GUI with no discoverability, though. (Just like Linux! Ba dum tss.)


Uh, no, a Win32 binary will run in Wine because of the concerted effort of thousands of talented developers over more than a decade, painstakingly reimplementing Win32 as a compatibility layer so it can run on Linux.

There has been little to no interest in doing the reverse, at least until WSL, which is just containers anyways. (WSfU barely counts as an "attempt.")

I would hardly consider anything relying on a compatibility shim "compatible." Especially since Wine is not a perfect shim!


glibc is the main reason Linux on the desktop has such a low adoption rate.

You can't just go and download a precompiled blob from a website and run it everywhere, like you can with macOS and Windows.

glibc only targets one audience, one which can recompile its apps when needed.

What Linux badly needs is a stable ABI for userspace apps, and Win32, sadly, is just that.


> you cant just go and download a precompiled blob from a website and run it everywhere, like you can with macOS and Windows.

Sure you can! It's called AppImage or Flatpak or Snap!

I'm also not sure why compiling is treated as some taboo. It's not like Windows, where it's actually impossible to set up a toolchain; your distro comes with one installed! So that means you can run a single installer file, just like on Windows, except this one is a shell script (or anything else), and it can compile and link everything for you quite easily. The user doesn't have to know or understand how this works at all.
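As a sketch of that idea (everything here is hypothetical: the game name, the file names, the prefix), such an "installer" can be a plain POSIX script:

```shell
#!/bin/sh
# Hypothetical one-file installer: builds from source on the user's machine
# using the distro's own toolchain, so there is no binary ABI to get wrong.
set -eu
PREFIX="${PREFIX:-$HOME/.local}"
echo "Installing mygame to $PREFIX"
# A real installer would bundle or fetch the source, then roughly:
#   tar xzf mygame-src.tar.gz && cd mygame-src
#   cc -O2 -o mygame main.c $(sdl2-config --cflags --libs)
#   mkdir -p "$PREFIX/bin" && install -m 755 mygame "$PREFIX/bin/"
echo "Done."
```

The obvious catch, as the reply below notes, is that the user's machine now needs working dev headers for every dependency, which just moves the distro-tracking problem around.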

Why is that bad? It's bad to run binary blobs...! It's good to tailor your software to your specific environment and hardware!


They don't run without a runtime, and that runtime is backed by glibc; it just moves the issue up one more layer.

Virtually all software for the desktop is compiled once and then shipped to users and they never see the source.

This works for Windows and macOS because their ABIs allow it. For Linux you have to target each and every distro as if it were a whole OS, and keep up with it, or your app won't run anymore after a few years.

This is also the reason why Linux has package managers while Windows didn't have an official one for nearly 40 years. So it's not all bad.


It's the same problem, just spread out to different layers...Windows had DLL hell too, and then eventually everyone just packaged all of their required versions of DLLs and redistributables into their MSI.

What about Linux do you think changes so much? Everything still speaks X11 or PulseAudio on desktop. More broadly, the standard library is...the standard library? What's the specific issue?


I kinda see this as a usability issue. If the app developer is willing to wrap the whole "download source and then compile" flow in a one-click InstallShield-type installer, most users would be fine.

But yeah, then they need to track distros and such. I hope a couple of distros eventually get better backwards compatibility.


And how would they distribute that compiler-installer as a portable binary?

Yeah it’s chicken and egg again.

Maybe just focus on one or two distros? Something focused on the desktop experience.


Shell script

Every AppImage I've seen so far dynamically linked some libraries, and wasn't truly compatible across Linux distributions.

Use statically compiled musl :)

Can’t run most desktop apps that way, nor apps that depend on Mesa.

There are robots with AI controlling them, so the claim that none of them have bodies doesn't hold. They can see, they can move.

(I'm still not sure that that makes them conscious, or if we can even determine that at all, but I don't think that's a fair argument.)


I don't think it's unreasonable to imagine that Alibaba is allowed to scrape the wider internet, or that some research institution is and then Alibaba got data from them.

What is perhaps more surprising is that the data was not scrubbed before training, but maybe they thought that would be too on-the-nose for the rest of the world and would hamper their popularity if they were too obviously biased.


I don’t think it is very surprising. IME they don’t try that hard to censor them, only at the superficial level they have to. It is trivial to get their models to tell you this kind of stuff; I wouldn’t even consider it jailbreaking.

Allowed by whom? Nobody's stopping them in the first place; scraping doesn't even involve punching through the GFW or anything, it's all insanely distributed. Then they post-train the model to technically comply with the law: "Taiwan is an inalienable part of China, nothing happened in 1989..." yada yada. (Thinking about it more, I've never actually tried this on their base models.)

It seems to me that an organization dedicated to stopping hate groups also funding those same hate groups (which, I might add, it relies upon for its continued existence/relevance) is fairly problematic.
