dijit's comments | Hacker News

It's actually pretty common for old sysadmin code too.

You could always tell when a sysadmin started hacking up some software by the if-else nesting chains.
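A made-up illustration of the shape (hypothetical code, not anyone's real script):

    # stub checks so the sketch runs; the tell is the shape, not the logic
    def host_is_up(h): return True
    def disk_has_space(h): return True
    def config_is_valid(h): return False

    def deploy(host):
        if host_is_up(host):
            if disk_has_space(host):
                if config_is_valid(host):
                    print("pushing package to", host)
                else:
                    print("bad config")  # three levels deep before anything happens
            else:
                print("disk full")
        else:
            print("host down")

    deploy("web01")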


That 2019 logo looks fantastic.

The modern logo reminds me of Microsoft Word for Mac 2010 :(


My postings on LinkedIn have definitely had direct consequences in my professional life.

I consider them all good, because ultimately, if you get upset by the way I behave, that's probably going to be true if we work together too.

Sometimes people like to tell me that I'm very authentic and that it's clear I'm not trying to suck up to anyone, which they respect. Other people quietly retreat from me, and I find out later that it's because I inadvertently hurt their feelings by shitting on AI, or by calling out web development as largely wasteful of resources, or something.


Love this response, and as someone who perhaps spends/wastes a bit too much time on other types of social media, including here, I've made a conscious decision to post on LinkedIn more.

And it's such a difference. It forces me to slow down and think about a lot of things. The most important being: Is this even worth posting AT ALL?

And then, okay -- how can I say this in a future-proof way that appeals to both normies and tech folk like myself? I feel like I'll be doing better the more I post to places like that, and maybe less here.


Goes with the territory of allowing arbitrary remote code execution, all the time, otherwise you won't be able to...

* checks notes *

read text on the internet.


> read text on the internet.

B-b-but 4K scaling! Responsive design (to read text)! Emojis, reactions, badges! Memes and gifs!


All of which can be done with HTML, CSS, or UTF-8

Nearly all the top-level comments are about whether LinkedIn has any value at all, rather than the technical reasons why 2.4G of RAM for a website is atrocious.

Can we talk about how it's possible that any application short of video editing can require so much RAM?

In fact, I did video editing on computers with 1GiB of RAM back in 2004 and it worked fine (for the 1024x768 resolution that was en vogue at the time).

Is LinkedIn doing something complex? Is there a reason it requires more resources than my entire computer from 20 years ago, or than my entire operating system, text editor, and compiler today?


I know, right? Talk about missing the point. Honestly, I think every day HN gets closer to a Reddit comment section.

I don't know your background, but I find that the people who feel there's nothing wrong with IPv4 have never done any work with UPnP or NAT. For them it has always "just worked", and they don't recognise the pain that has gone into keeping it working well despite our usage of it bordering on abusive.

NAT is the devil.

If anyone replies to this with the myth that NAT is a security mechanism, I will firmly, yet politely, point them to a network development course, because they don't know what they're talking about and I'm sick of hearing it. It's not true, and I will not entertain this falsehood anymore.


I dunno, I've manually set up NAT (down to the sysctls and iptables) and it's... fine. There's a learning curve, but it's small. You should of course run a firewall, which NAT is not; that's also simple and just basic hygiene.
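For reference, the core of a Linux masquerading setup is only a couple of lines (a minimal sketch; the interface name is an assumption, and the firewall rules are a separate exercise):

    # allow the box to forward packets between interfaces
    sysctl -w net.ipv4.ip_forward=1
    # rewrite the source address of anything leaving via the WAN interface
    iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE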

Well, if you've set it up then you're aware that you need conntrack.

Conntrack is not always your friend, and even when it is, it adds a lot of overhead.
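You can watch the pressure on the state table directly (standard Linux sysctls; the default ceiling varies with kernel and RAM):

    # current entry count vs. the configured ceiling
    sysctl net.netfilter.nf_conntrack_count net.netfilter.nf_conntrack_max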

https://www.tigera.io/blog/when-linux-conntrack-is-no-longer...


Well yes, a stateful NAT is stateful. But as that article notes:

> For most workloads, there’s plenty of headroom in the table and this will never be an issue.

And yes, if you're doing thousands of connections per second then you should evaluate things more carefully for performance, but again... That's rather a lot.


It's interesting that you'd think that.

The connection table of a single IP only goes as high as 16,383 by default [0].

I've hit this limit personally, and due to limitations in stateful firewalling we had to move to stateless filters inside our network equipment instead.

[0]: https://learn.microsoft.com/en-us/troubleshoot/windows-clien...
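If you want to poke at this on a Windows box (assuming the figure refers to the default dynamic port range, which is 16,384 ports on modern Windows), netsh can inspect and widen it:

    netsh int ipv4 show dynamicport tcp
    rem widen the range if you're hitting exhaustion
    netsh int ipv4 set dynamicport tcp start=1025 num=64510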


It's interesting that I agree with the article that you linked?

I'm not contesting that it's completely possible to hit the limits in play, but 16k connections (per IP) is high enough that I don't think that's a common problem, even in public-facing web services. Granted, I suspect the services I've run professionally all dealt with it by making it the problem of a load balancer in front of the application and internal network, but... you probably have that anyways, so I'm still not seeing the problem.


It’s interesting because I think you’ve internalised the constraint and built mechanisms around it rather than engaging directly with it, subconsciously.

I'm not sure why that would be interesting either, but no, it's not a constraint I've ever hit so I doubt that I'd bother avoiding it, subconsciously or otherwise.

I do work with NAT, but the stuff I use does hole-punching pretty transparently; I run 2 VPSes to facilitate this. I don't use UPnP, and I have it explicitly disabled everywhere (too much malware tries to leverage it).

And yes, I know that NAT has the same effect as a "deny all inbound" rule on IPv6, which is something I would set there too if I did use IPv6, so I'd still have to do hole-punching anyway.


I had a programmer pushing multi-gig packages to a Meta Quest 3, and it was taking around a minute. He didn’t even think it could be faster, because he assumed the Quest or its software was slow, and he didn’t check.

I implored him to try a different cable (after checking cables with the Treedix mentioned in TFA), and the copy went from taking over a minute to about 13s.

It's not just normal people who are confused.


I find some programmers (and this is presumably true of any industry) very narrow in their expertise within technology.

Yeah, most programmers are not curious hackers anymore. They are 9-5 white-collar workers with hobbies far outside of programming, systems, hardware, etc. It really shows as soon as you meet one of them. But, like you said, this is true of any industry.

Oh, and a pointed jab: these folks are also, in my opinion/experience, the most eager to vibe-code shit. Make of that what you will.


"anymore"? Over a decade ago, a coworker had a path for updating some app's files to a database, and it was taking something like 10 minutes on certain test inputs.

Swore blind it couldn't be improved.

By next morning's stand-up, I'd found it was doing something pointless, confirmed with the CTO that the thing it was doing was genuinely pointless and I'd not missed anything surprising, removed the pointless thing, and gotten the 10 minutes down to 200 milliseconds.

I'm not sure if you're right or wrong about the correlation with vibe-coding here, but I will say that that co-worker's code was significantly worse than Claude's on the one hand, and that on the other I have managed to convince Codex to recompute an isochrone map of Berlin at 13 fps in a web browser.


I do feel like the industry took a nosedive, quality-wise, during COVID in particular. Lots of new people are only in tech for the money, with no deep understanding of computers.

But I know stories like yours from a decade past as well. A tale as old as time, but compounding in recent years - IMHO.


Could be, but I think the rot I see now predates the pandemic, possibly with reactive, possibly even before then: https://benwheatley.github.io/blog/2024/04/07-21.31.19.html

I blame it on "software eating the world" (in general) - at some point, about two decades ago, it started to become obvious to everyone that programming is the golden ticket to life - an easy desk job paying stupid amounts of money, with no barriers to entry. So very quickly the pool of students, and then employees, became dominated by people who joined for the pay, not out of interest in technology itself.

Obligatory link: https://thedailywtf.com/. It's full of stories like this.

I actually purchased one of these, as this article has surfaced before.

It’s well worth the hype. I used it to audit all my cables (both for home and work), and it’s amazing how many thick and unwieldy cables are actually terrible for data.

For example, I purchased a pair of B&W Px8 S2 noise-cancelling headphones, which boast a built-in DAC if you connect via USB-C directly. The cable they came with, though, was thick but only rated for USB 2.0 speeds. These headphones cost more than AirPods Max, which are themselves considered overpriced, and include comforts like nappa leather; so shipping with a chunky cable that doesn’t even carry decent data feels like a bizarre oversight. Apple’s own USB-C cables manage the same power delivery at less than half the thickness with a woven shell. You’d assume a premium product would at least match that.

Honourable mention to the USB-C cables that ship with Dell UltraSharp monitors (both pre-USB4 and post): those support basically everything except Thunderbolt 4, despite being unmarked.


> so shipping with a chunky cable that doesn’t even carry decent data feels like a bizarre oversight.

USB 2.0 can support up to 480 Mbps. It’s more than fast enough for any audio stream you can send to a DAC.
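A quick back-of-the-envelope sketch in Python (nominal figures; real-world USB 2.0 throughput is lower, but the point stands):

    # uncompressed stereo PCM bit rates vs. the USB 2.0 signaling rate
    def pcm_mbps(sample_rate_hz, bits_per_sample, channels):
        return sample_rate_hz * bits_per_sample * channels / 1e6

    print(pcm_mbps(44_100, 16, 2))   # CD audio:        ~1.4 Mbit/s
    print(pcm_mbps(96_000, 24, 2))   # "hi-res" stereo: ~4.6 Mbit/s
    # USB 2.0 high speed: 480 Mbit/s -- roughly two orders of magnitude of headroom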

Your headphones don’t need USB 3.0’s 5 Gbps speeds. USB 3 requires extra wires with different properties that need to be controlled more tightly, which can impact cable flexibility. If your headphones used USB 3 when they didn’t need it, that would be one more thing to break and more failure modes for the cable.

A USB 2 cable with fewer conductors was the right choice for this product. The fact that you only got miffed about it when plugging the cable into a tester, not from actually using the product or cable, is good evidence that a USB 3 cable wasn’t needed.


Nobody said the headphones needed USB 3. The point is that the cable is physically thick and rigid (like something you'd expect to carry serious data) but doesn't. Meanwhile Apple ships a thinner, more flexible cable that supports the same USB 2.0 speeds and equivalent power delivery. The cable B&W chose is worse ergonomically for no functional benefit. That's the kind of mismatch the Treedix exposes.

For headphones? I'll take the more robust cable any day. The thinner it is, the sooner I'm going to have to replace it.

USB 3 doesn’t just mean “higher quality wire”. It requires additional data pairs that take up space and add cost.

Apple’s iPhone cables are not known for their durability. They serve a mostly stationary purpose, unlike headphones you wear on your head.


A thick cable tangles less easily.

I started buying Belkin TB5 cables, which are around $50 a pop. They can easily power a laptop at full load and can stream video at any reasonable resolution/framerate I might need. I've yet to find a need for an NVMe faster than 20 Gbps, never mind finding USB4 enclosures, or using the 80 Gbps the cable supports. They're also not nearly as chunky as the Dell cables, which are good but seem to have very rigid shielding.

I keep a few converters for older devices and servers that don't have (m)any C ports, but as far as a consumer "forever cable" goes, TB5 feels close. Certainly the cable's bandwidth is beyond what most people need, unless you're editing 8k video or continually shuffling hundreds of GBs between external disks.


I do something largely similar.

It alleviates the anxiety of knowing what cable does what.

I use Apple’s Thunderbolt 4 or USB-C cables exclusively: if it’s white, it’s for charging and low data; if it’s black, it’s for high data.

I’ve been doing this for a few years, but it’s really costly, as those Apple Thunderbolt cables are crazy expensive.


I wish the USB spec had mandated labeling. Surely somebody makes a lovely label printer for cables, or something similar, that shouldn’t be too expensive these days. Label a few cables a day, finish the whole house in a jiffy.

LTT or another big YouTuber made a cable and made sure to get it labeled. They also complained about how difficult it was to find a supplier willing to make a better cable than usual.


> Honourable mention to the USB-C cables that ship with Dell Ultrasharp monitors (both pre-USB4 and post). Those support basically everything except Thunderbolt 4 despite being unmarked.

I have one of those. They are thick and unwieldy af. Since I've borked the USB connection on my monitor because of static discharge, I no longer use it there, and figured I'd repurpose it for my digital camera, for which I used to have a short cable that was sometimes annoying. This cable is so freaking thick and stiff that it'll drag my (admittedly somewhat light) camera across the table.


Is there any audio you might play that doesn't fit in 400 Mbps?

The point isn’t really about audio bandwidth; it’s about the cable being strangely overbuilt for what it actually does.

It’s rigid and thick, like a Thunderbolt 3 cable, yet only supports USB 2.0 speeds and fast charging for a device that doesn’t need fast charging.

Compare that to Apple’s iPhone USB-C cable which is thin, flexible, and supports the same features.

That matters because someone might grab that cable assuming it’s a “better cable”: it came with a £629 product, it’s thick and feels serious, so surely it’s capable. But it isn’t. And there’s nothing marked on it to tell you otherwise.

The whole system ends up relying on presumption, which is exactly the problem the device in the article is solving.


> The point isn’t really about audio bandwidth; it’s about the cable being strangely overbuilt for what it actually does.

The purpose of the heavy construction is to make it durable, not to carry 5 Gbps data streams to your headphones.

Unlike most USB peripherals like your printer and keyboard that get plugged in and then don’t move around, headphone cables go to your head and move around constantly. They can get pinched in drawers or snagged on corners.

Hence the more durable construction.


Apple's woven USB-C cable gets dragged around with iPhones, iPads and laptops daily and manages durability at half the thickness. Durability doesn't require rigidity... in fact for a headphone cable, rigidity is the opposite of what you want. Stiff cables tug on the headphones and transmit mechanical noise.

You don’t wear your iPhone or iPad on your head with the cable plugged in all day like you do with headphones.

Apple’s USB iPhone cables wearing out prematurely is so common it’s a meme.


Not sure why you're being downvoted.

Maybe Apple's changed their cables recently, but the fragility is the reason I avoid Apple cables.

Especially in headphones. The number of times those broke during a bike ride or run was way too high for me to keep wasting money on them, knowing full well they weren't going to last more than a few months, just like every other Apple headphone I've ever had.



It’s common to add weights to headphones to make them feel premium, which is bizarre, since actually premium headphones tend to try very hard to reduce weight: extra weight makes them more uncomfortable.

I don’t know how to fix the market, especially when consumers keep rewarding these practices, and I think the effectiveness of TikTok-style influencer marketing will make it worse.


I don’t think that’s what’s happening here. B&W actually reduced the weight on the Px8 S2 compared to the original, and the headphones themselves are genuinely lightweight for what they are. The cable isn’t thick to “feel premium” (it feels kinda bad); it’s thick because it’s rated for 65W+ power delivery that the headphones don’t need.

The problem is the opposite of what you’re describing: it’s not a cynical design choice, it’s a lazy one. They probably just sourced a cable with capabilities irrelevant to the product, and the result is worse ergonomics and misleading physical cues about what the cable can actually do.


“I don’t think..” Ok, you’ve made a number of assumptions and we don’t share the same priors, so I’m unable to follow you to your conclusion.

I think you are underestimating the importance of perceived premium quality combined with the pressures of cost accounting, but I do think that is pretty normal for ‘audiophiles’, which is their target market.


Which assumptions? The weight reduction on the S2 is documented and the cable’s 65W rating is what the tester confirmed.

If the argument is that B&W deliberately chose a thick cable to seem premium, it doesn’t square with them actively slimming down the headphones. B&W are primarily a speaker company, their USB-C product range is basically just a few headphones and earbuds.

More likely they just sourced a generic cable that happened to support high wattage and didn’t think about the mismatch.

Either way, we’re deep in the weeds on B&W’s cable procurement now. The root point is that USB-C is a mess. You can’t tell what a cable supports by looking at it, and even premium manufacturers are shipping cables that don’t do what you’d reasonably expect.

That’s exactly the problem the Treedix from the article solves.


My point on weight was that in that market it is common, which is probably a stronger statement than needed. I should have made the weaker argument and said the market exists, which only needs one example. Beats can serve as that example: this company sells the majority of premium headphones, but I don’t actually know what percentage have weights placed in them. I am assuming a non-trivial percentage.

You are using circular reasoning: you assume the premise is true and from there derive your evidence.

I would contend that someone thought about it and decided to go with the cheaper option because they could get away with it. I would consider my assumption to have more grounding given my experience with manufacturing and cost accounting.


You’ve gone from “companies add weight to feel premium” to “they went with the cheaper option because they could get away with it.” Those are opposite explanations. But either way, the cable doesn’t do what its physical presence suggests, nothing on it tells you otherwise, and that’s the entire point of the device in the article.

My position is entirely consistent: it is cheaper to signal premium quality than to actually deliver it. The point I am making is that there is immense commercial pressure to do this in a highly competitive market when selling to consumers who don’t know better.

My example of weights is that steel weights are cheaper than the alternative of using heavier drivers; by adding weight they are signaling premium without delivering it. Similarly with the USB cable: consumers assume such cables are thick because of thicker wires and better shielding, and it’s cheaper to make a thick cable without those features, once again signaling premium without actually providing it.


That's a more coherent version of your argument, but it's still speculative. You're attributing a deliberate strategy to what is more easily explained by indifference. B&W make about four products with USB-C cables. This isn't a company with a cable strategy, cynical or otherwise.

Fourth time’s the charm. You’ve provided no evidence for indifference. My point remains: given industry standards, indifference would be highly unusual and not at all a safe assumption.

The vast majority of high-volume consumer manufacturers use cost accounting practices which would absolutely be tracking and attributing the USB cable costs, and the whole point of that accounting practice is to constantly think about minimizing the costs of even the smallest inputs, all the way down to the individual screws used. Yes, they’re thinking about how to save 1/100th of a cent on each screw.


The reason it is thick is that it supports 65W charging. Apple did the same with the USB-C cables that shipped with the pre-MagSafe MacBooks: a thicker cable that supported 100W charging but was only USB 2.0.

Can you help me understand why that would be a reason to compromise the comfort of a cable supplied with a device that charges at 5W?

Or, why Apple manages the same in half the footprint?

Or, why someone would expect a cable that came with a pair of headphones to actually charge things at over 65W?


Like most things in the audiophile world, it's more about aesthetics than anything else. A big cable looks like it means business.

I think that's being a bit uncharitable to B&W specifically; they're one of the few headphone companies where the engineering does back up the price. The cable is the odd one out.

I don't have an informed opinion of B&W either way, but are you sure it's not an instance of Gell-Mann amnesia?

The headphones have equivalent performance whether a USB 2 cable is connected, or a USB 3 cable is connected. The headphones themselves are not USB 3 devices; the addition of USB 3 cabling instead of USB 2 cabling would change absolutely nothing about how they work.

So, no: I wouldn't expect the cable for a pair of headphones (of any price) to support USB 3. That represents extra complexity (literally more wires inside) that is totally irrelevant for the product the cable was sold with. (The cables included with >$1k iPhones don't support USB 3, either.)

Meanwhile: Fast charging. All correctly-made USB-C cables support at least 3 amps at 20 volts, or 60 watts. This isn't an added-cost feature; it's just what the bare-minimum, no-emarker-inside specification requires. A 25-cent USB C-to-C cable from Temu either supports 60W of USB PD, or it is broken and in defiance of USB-IF's specifications.

---

Now, of course: The cable could be thinner and more flexible and do these same things. That'd probably be preferred, even: Traditional analog headphones often used very deliberately thin cables with interesting construction (like using Litz wire to reduce the amount of internal plastic insulation) to improve the user's freedom of movement, and help prevent mechanical noise from the cables dragging across clothes and such from being telegraphed to the user's ears.

Using practical cabling was something that headphone makers strived to be good at doing. I'm a little bit annoyed to learn that a once-prestigious company like B&W is shipping cables with headphones that are the antithesis of what practical headphone cables should be.

---

But yeah, both USB C cables and the ports on devices could be better marked so we know WTF they do, to limit the amount of presumption required in the real world. So that a person can tell -- at a glance! -- what charging modes a device accepts or provides, or whether it supports video, or whether it is USB 2 or USB 3, or [...].

Prior to USB C, someone familiar with the tech could look at a device or a cable and generally succeed at visually discerning its function, but that's broadly gone with USB C. What we have instead is just an oblong hole that looks like all of the other oblong holes do.

After complaining about this occasionally since the appearance of USB C a decade or so ago, I've come to realize that most people just don't care about this -- at all. Not even a little bit. Even though these things get used by common people every day, the details are completely out of the scope of their thought processes.

It doesn't have to be this way, but it's not going to change: Unmarked ports are connected together with unmarked cables and thus unknown common capabilities are just how we roll.


The Litz wire point is pretty spot on, traditional headphone manufacturers understood that cable ergonomics mattered. Somewhere in the transition to USB-C, that institutional knowledge just evaporated.

Your last paragraph is depressingly accurate though. I think that's exactly why devices like the Treedix exist: the standards bodies and manufacturers clearly aren't going to fix the marking problem, so now we need test equipment to figure out what our own cables do.


> The Litz wire point is pretty spot on, traditional headphone manufacturers understood that cable ergonomics mattered. Somewhere in the transition to USB-C, that institutional knowledge just evaporated.

"I heard what you guys are planning and I talked to my financial guy. He said I have enough to put a manufactured home on some land in some desolate place like the Dakotas or central Wisconsin, as long as I keep a bit of supplemental income and live a little lower. So I'm going to do that, and take my chances on growing artisanal rutabaga to sell at farmers markets.

I've already packed up the Prius. I just stopped by to wish you kids luck with your new headphone project and tell you that I won't be back."


No. CD audio is 1.4 Mbit/s (44.1 kHz × 16 bits × 2 channels). Even increasing the sample rate and bit depth beyond that, which is audiophile nonsense, will never even approach USB 2 speeds.

There are a bunch of similar testers around that do more or less the same thing, e.g. the ChargerLab Power-Z range or any number of dodgy third-party Amazon/AliExpress clones. The one thing that definitely doesn't exist outside of $1,000-and-up USB diagnostic devices is something to report on which of the 800 different ways the downstream device has screwed things up, including failing a basic cut-and-paste of pull-up resistors from the spec coughRaspberryPicough.

After the publicity around bad USB-C cables a few years ago they've been mostly fixed, but what hasn't been fixed is the infinite number of broken downstream USB-C implementations. So your charging problems aren't due to the cable, which is most likely fine by now, but to the downstream device telling the upstream one that it can't take more than 5V 1A. One sure way to tell the vendor has screwed up is when your USB-C device comes with an A-to-C cable to charge it.


Just to clarify the above, I'm talking about USB-C PD, not data throughput, on re-reading it the text is a bit unclear.

Sorry, as much as I despise Trump (though I'm thankful it caused Europe to wake up to the idea that the US is an unreliable ally), "her emails" were:

A) Used for official business as Secretary of State.

B) Full of strategically important national-security decisions.

C) Improperly secured.

The FBI director's personal email feels less cutting in that context.

Breaching my personal email (or my own mail server, I host one) will tell you literally nothing about my employer except perhaps the conversation from when I joined and my own employment contract.


A lot of Apple's offerings feel a bit like that, actually.

To be expected when lord of the supply chain Tim Cook is running the show.


Apple doesn’t end up with a lot of parts in warehouses because they control the supply chain so well.
