
It’s not that big a change; these forums have existed for a long time. AOL was a giant closed Goliath for a large number of users.

One could even argue that Google Search was/is THE platform of the internet. If you're not there you're not anywhere.

And before that, specific BBSs, for those that could afford the dial ups to them.

We have consistently made exceptions to this rule in situations with limited choices. We would not abide by the electric company dictating a range of things, even if you have the option to run your own generator.

The truth is there are two reasonable platforms, as long as that is the case we should apply scrutiny.


I'd go even farther than that. The US should adopt an equivalent of the second amendment regarding end user control over personal electronics and it should bind not only the government but also private enterprise. We are increasingly dependent on these devices to go about our day to day lives and they have not only been used against us for mass surveillance but are also quickly gaining the ability to exhibit intelligence and act autonomously.

> We would not abide by the electric company dictating a range of things, even if you have the option to run your own generator.

An absurdly dishonest comparison. Which you (hopefully) knew when you made it.


You are describing theory, which is fine as a model but it is an imperfect model. There are a host of reasons this falls apart in reality.

In reality these work very well for many of the important things. Ask any farmer who sells a 60-pound bushel of wheat for $6, any producer of $10 blue jeans, or any maker of those $400 60" TVs. They're not swimming in profit.

The exceptions are far fewer, but far more noticeable. Housing and health care don't follow the one price rule. The exceptions dominate our mindshare because they're so painful, but the non-exceptions outnumber the exceptions.


The one-price rule for commodities isn't super relevant for consumers, though? Commodified markets are well understood so long as there is decent competition.

What we are increasingly seeing on the consumer side of the market - even on grocery items - is price segmentation. Grocery stores (more so their suppliers) learned that many (most?) consumers are willing to pay much more for staple food items that are not commodities but are quite common buys, like chips or soda or branded packaged foods. Over time they set the regular retail price 50% higher than it was a few years ago, and then, to capture more of the price-sensitive consumers, they offer incentives like coupons, in-app deals, random sales, etc. to induce those consumers to purchase.

This is getting to be extremely aggressive and will continue to be for the foreseeable future. Uber/Instacart, for example, have plenty of whistleblower insider types who have written about how price segmentation happens on an individual basis, based on personal information and habits: the type of credit card on file (Amex holders get charged more), how much gift card credit balance you have, trends like having once accepted a higher price for a given location/destination pair and time, etc.

If you go to the McDonald's drive-thru and simply order at the window, you will likely pay considerably more than the person who has the app installed and orders through it.

Airline tickets perhaps follow this model as well: browser history and cookies will present a higher price to one consumer vs. another for the same booking at exactly the same time. Some recent court cases are attempting discovery on this, so it will be interesting to see if it's true.

The price of an individual consumer transaction is absolutely set to what the company charging it believes the market will bear. Increasingly that "market" is the size of exactly one consumer.

I listened to a few earnings calls for fast food and consumer staple companies during COVID. Executives were absolutely incredulous that they could continue to increase prices and have it not impact volume of sales much if at all. What was taught in MBA school simply was not reality on the ground, and COVID times exposed this fact. The US consumer at least as a whole has simply lost the ability to price shop and is not as price sensitive as the textbooks say. This may change, but it's the current state.

About the only thing producer prices set is a pricing floor.


Not a single post in this comment thread isn't just describing theory.

double negative, so... every comment does?

Yup.

Yeah, but that vagary is literally the exact same wording that can be applied to the "prices are set according to what consumers will pay, not its value" claim that sparked this thread.

Can you say more about the ISP connecting to any computer on your network? I can’t find any references to this aspect in googling the right terms and the concept is foreign to me.

There are a bunch of ways to break it, or misconfigure it. But I have no idea what this ISP method is.


It's just normal routing. If you send packets to a router, it'll route them.

More concretely, they can run the equivalent of `ip route add 192.168.1.0/24 via <your WAN IP>` on a machine that's connected to your WAN network, and then their machine will send packets with a dest of 192.168.1.x to your router. Your router will route them onto your LAN because that's what its own routing table says to do with them.

Anyone on your immediate upstream network can do this, not just your ISP. Also, if you use ISP-assigned GUAs then this inbound route will already exist and anyone on the Internet can connect. Applying NAT to your outbound connections will change their apparent source address, but it won't make that inbound route disappear.


Have you tried that?

I have yet to see a router that allows that forwarding unless explicitly configured. Still, I'm mostly using OpenWrt/OPNsense/MikroTik.

The default is to disallow/block forwarding packets from the public WAN to the private-range LAN.

The ISP can still inject packets on ports that NAT opens if it spoofs the source address/port, so the argument still has some validity.


Yup, repeatedly.

It's true that almost everything comes with a firewall rule that blocks new connections from the WAN to the LAN, so in practice these connections will be blocked on most things by default. But they come with this rule precisely because NAT doesn't do the job.
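To make that concrete, the protection comes from a stateful forward-chain rule, not from address rewriting. A minimal nftables sketch of the kind of rule consumer routers ship with (the interface names `wan`/`lan` are placeholders, not any particular router's config):

```shell
# Drop forwarded traffic by default; only allow LAN-originated flows
# and their return traffic. This rule does the real filtering work --
# NAT's address rewriting provides none of it.
nft add table inet filter
nft add chain inet filter forward '{ type filter hook forward priority 0 ; policy drop ; }'
nft add rule inet filter forward iifname "lan" accept
nft add rule inet filter forward ct state established,related accept
```

With the `forward` policy set to `drop`, the injected route described upthread delivers packets to the router, but the router refuses to forward them onto the LAN.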


> Yup, repeatedly

Cool, me too :)

Anyway, the other side of the argument:

It is the default, and the default is secure. Users don't have to reason about it; they can assume it works. How it works doesn't matter, and they may lack the training or willingness to figure it out.

You can't say the same for IPv6, where the default is allow (have things changed? I haven't checked in a long time).


Of course you can say the same for v6. Blocking connections that go from WAN to LAN by default has the same effect on both protocol families. If you assume that having the appropriate firewall rule to do that is the default then inbound connections will also be blocked on v6 by default.

NAT contributes nothing to your security in this scenario, and instead makes it harder (not easier) to understand and reason about what your router is doing.


> If you assume that having the appropriate firewall rule to do that is the default

That's the thing, it's not the default. The default is a public IPv6 address for everyone, and it's the user's duty to configure the firewall...

I could definitely set this up easily, but someone like my parents or friends would ask me "what's IPv6?"


Ah, okay. In that case v4 doesn't have a firewall by default either.

That's precisely why routers come configured with a firewall that blocks inbound connections from the WAN -- because the protocol itself doesn't have a firewall by default, and neither does NAT.


Twenty-some years ago, when cable broadband was new, you connected a computer and got a public IP. For this example let's just assume it was a public /24. Back then there was no firewall built into Windows, and it didn't ask whether you were connecting to a public or private network.

For some ISPs you could connect a switch or hub (hubs still existed when cable came out; 1 Gbps switches were expensive) and connect multiple computers, and they would all get different public IPs.

Back then a lot of network applications, like Windows file sharing, heavily used the local subnet broadcast IP to announce themselves to other computers on the network. Yes, this meant that when you opened up Windows file sharing you might see the share from Dave's computer across town. I don't recall if the hidden always-on shares like C$ were widely known about at the time.

ISPs fixed this by blocking most of the traffic to and from the subnet broadcast address at the modem/headend level. For some time after, I could still run a packet capture and see all the ARP packets and some other broadcasts from other modems on my node, but it wasn't enough to be able to interfere with them anymore.


I understand this aspect, and this conversation is tricky because most consumer routers have this barebones firewall built in to reject the routing mentioned by the OP. So what we think of as a "router doing NAT" is often subtly doing more. I'd hesitate to call what a barebones consumer router does a firewall, because it lacks important firewall features that are necessary for security.

I am not sure ethics have much to do with it, nor an implied contract.

In the past there was no ethical or contractual issue with going to the bathroom during a network commercial break, and no ethical issue with skipping multi-page magazine ads. We were free to change the radio station during ad breaks.

My parents would often mute the tv in commercial breaks and talk.


> are trained on what we told them we do. They don’t “think” at all. They’re a mixmaster of other people’s ideas, cleverly packaged in a way that we perceive as natural.

Sometimes I wonder how this is different from most of my education. Or my creativity, mixing ideas together to see if they still make sense with other things I have been told.


"what is 16929481231+22312333222?" is an easy way to test this claim. Pick large enough numbers and there's no way all the sums of that size would fit into the dataset (you don't need to stick to + either, but it's the simplest thing that works)


But if you were to ask that same question to a human with no specific math training there are exceedingly low odds they would get the right answer.

We spend hours and hours of reinforcement over years to get humans who can do it.


Indeed! So, where are you going with this?

For my contribution to the conversation: earlier/cheaper models can't do it either; they make mistakes and need a calculator/Jupyter kernel/what have you. "Medium" models will put the numbers underneath each other and do it "properly" in a table, checking themselves afterwards. Claude Opus 4.6 (the current Rolls-Royce today) just says the answer in one go sometimes (it's a monster). But all of them end up spending many seconds and thousands of tokens on a task that takes a calculator or an ALU a fraction of a second.


Yeah, that’s my main beef with this article. There is not even an attempt, just hand-waving and saying they are not.

Decades and decades of “Turing test” talk, until they can pass it.


Mind: the Turing test doesn't test for actual thinking, though, just functional indistinguishability. Turing sidestepped the problem way back when.


That seems a little harsh. GUI tools can give us a more vibrant and useful interface.

But I think the main problem is that, although there have been many attempts, we have not arrived at a standard way to compose different GUI tools easily or to repeat actions.


> but GUI based IDEs are generally useful and easier to use out of the box for development.

This is true, they are much better for discovery and affordance, but as you progress with your tooling and tool usage there is a much higher ceiling on your productivity with other tools and their composability. In my opinion, not putting effort into learning tools ultimately holds a lot of people back from their potential.


I use both and mostly agree, but for me I don’t think the ROI for learning terminal based tooling is there.

They make some parts of text manipulation faster, but those parts of text manipulation take up less than 1% of my time spent working.

Things like debugging, which take up a large portion of my time, are not so nice in terminal-based environments.


Yes, for things like Node, I do use tools like the chrome dev tools for debugging and such.

But I find a terminal-first approach leads me to other tools, like curl and jq, as I go. I see coworkers spending a ton of time repetitively executing code to inspect those spots in really inefficient ways, and ending up completely lost when they could be using tools like git bisect.

Another good example, for devops-type support: if one web server out of many seems to be misbehaving, I can use the AWS command line to get the internal IPs behind the LB, then curl and grep to find it in minutes after others have tried for hours. It becomes second nature if your mind goes there first.
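A rough sketch of that workflow, hedged heavily: the tag filter, port, and health-check path below are all hypothetical stand-ins, not anything from the comment above.

```shell
# Hypothetical: enumerate the private IPs of the web instances behind
# the LB, then curl each one directly and grep for errors to find the
# misbehaving host.
for ip in $(aws ec2 describe-instances \
    --filters "Name=tag:role,Values=web" \
    --query 'Reservations[].Instances[].PrivateIpAddress' \
    --output text); do
  echo "== $ip =="
  curl -s --max-time 2 "http://$ip:8080/healthz" | grep -i error
done
```

The point is the composability: each piece (`aws`, `curl`, `grep`) is independently useful, and chaining them is one line of glue.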


I work 99% in a terminal and fire up a JetBrains IDE when I need to do deep debugging. It’s so rare for me though that it’s worth more for me to get good at the terminal stuff. I’m sure this depends heavily on the type of work being done, game dev for example really needs a good debugger. That being said, gdb and others have perfectly fine text mode interfaces, albeit with a steeper learning curve.

As always, the “best” tool is the one you're most familiar with that gets the job done. Text vs. GUI doesn’t really matter at the middle of the bell curve.


> there is a much higher ceiling on your productivity with other tools and their composability

What exactly is the "ceiling" for GUI based IDEs?


Composition. I don’t think there’s any GUI that can be used for the git email workflow.

Versatility. In most TUI editors, running an external command is easy, including piping a part of the text to it and receiving its output. As this is a fundamental Unix principle, there’s basically no barrier between what you’re editing and other data on your system. In a GUI, everything is its own silo.

Presentation. Code is not linear, but most GUIs force you to use one window to see your text files, and when they allow splitting it’s cumbersome to use. Vim and Emacs have an easier way to divide your screen so that the relevant information can be presented at once. And there are terminal multiplexers for simpler editors.
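As one concrete illustration of that piping: in Vim, a visual selection or the whole buffer can be filtered in place through any external command (`jq` here is just an example and must be installed):

```vim
" Pretty-print the visually selected JSON by piping it through jq
:'<,'>!jq .

" Replace the whole buffer with its sorted, de-duplicated lines
:%!sort -u
```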


You’d also probably be surprised by how subjective and unevenly applied the law is… by design, to allow appropriate outcomes and discretion.

Edit: Consider the following words included in law.

“reasonable” “reckless” “due care”

