Yeah, that's tech for you! E4X was an ECMAScript standard, but Mozilla dropped it in 2014; then Facebook picked up the idea and made JSX, and now that's the de facto standard for front-end development.
Would also like to hear from DDG people. On another note, I really appreciate DDG's map functionality using Apple Maps, with the iaxm query parameter set to "maps" and q set to whatever you want to search for:
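A minimal sketch of building such a URL, assuming the parameter names mentioned above (`q` and `iaxm=maps`) and the standard duckduckgo.com endpoint:

```python
from urllib.parse import urlencode

def ddg_maps_url(query: str) -> str:
    # Parameter names taken from the comment above; treat them as an
    # observed convention, not documented API.
    params = {"q": query, "iaxm": "maps"}
    return "https://duckduckgo.com/?" + urlencode(params)

print(ddg_maps_url("coffee near Boston"))
# https://duckduckgo.com/?q=coffee+near+Boston&iaxm=maps
```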
Web pages do NOT need cookies or JavaScript, period.
And if you insist that your users must use JavaScript to view your web site, then cookies aren't even needed; one can do some fingerprinting, and there's your permanent cookie!
This development of the web has made many tech-savvy people stop using the internet on their phones and switch back to old dumb phones (and this comes from someone who has been promoting JS since before Node.js came along). Please bring back web pages that don't require cookies or JavaScript to function; a page that works in a terminal browser like lynx or links2 should be the baseline standard for a web page. JS and cookies are icing on the cake, not necessities.
Web pages don't need javascript? Get real. You have to use javascript for your web app if you want to deliver a reasonable product experience. This is a dead argument I wish people would just stop making. The ship sailed over a decade ago.
Without cookies, and knowing that HTTP is a stateless protocol, how do you suggest we implement a solution that allows me to connect to my bank to view my account balance and make a wire transfer from the comfort of my home?
The way we did that kind of thing before cookies was by encoding the state in the URL (for GET requests) or hidden fields (for POST). Higher-level Web frameworks at the time would often abstract this away from the app code while allowing the app to be deployed in either mode. Here's an example for ASP.NET: https://docs.microsoft.com/en-us/dotnet/api/system.web.confi...
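A rough sketch of that pre-cookie pattern: the server hands the client an opaque session token and embeds it either in every link (GET) or in a hidden form field (POST). All names here (`SESSION_PARAM`, the signing scheme) are illustrative, not from any particular framework:

```python
import hmac
import hashlib
from urllib.parse import urlencode

SECRET = b"server-side-secret"   # hypothetical; never hardcode in real code
SESSION_PARAM = "sid"            # illustrative parameter name

def _token(session_id: str) -> str:
    # Sign the session id so the client can't forge someone else's session.
    sig = hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()
    return f"{session_id}.{sig}"

def link_with_state(path: str, session_id: str) -> str:
    # GET navigation: state rides along in the query string of every link.
    return f"{path}?{urlencode({SESSION_PARAM: _token(session_id)})}"

def hidden_field(session_id: str) -> str:
    # POST navigation: the same token goes into a hidden input instead.
    return f'<input type="hidden" name="{SESSION_PARAM}" value="{_token(session_id)}">'

print(link_with_state("/account", "abc123"))
```

Frameworks of the era rewrote outgoing links and forms automatically, which is the abstraction the linked ASP.NET docs describe.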
If you encode the auth token in the URL, then a shared URL, accidental or otherwise, means the recipient is authenticated. There is a lot of existing infrastructure that assumes URLs are public knowledge while cookies are not.
If you do this through hidden form fields, then page navigation can no longer be done through hyperlinks and must instead be all form submissions, which means a malfunctioning back button and getting logged out when refreshing or opening a link in a new tab.
Please do not do this.
First party cookies are very useful and it's bad enough that people keep trying to replace them with javascript+localstorage despite the decades of security best practices that have been built into them.
I do agree that we can do away with third-party cookies, however.
FWIW I'm not suggesting that people do any of this today. But it is how it was actually done - it wasn't the case that advanced web apps that required per-client state management weren't possible at all without cookies.
Exactly! Web developers in the late '90s did that and it worked fine. Yeah, it's a bit more painful to use, but it keeps the web stateless, which is what it was supposed to be. If you want to build your fancy stateful desktop-style programs, do it with something that isn't document-based like HTML.
It didn't keep the web stateless, though. It implemented state on top of what we had at the time, with certain flaws that others have already pointed out in this thread.
I always wonder what specifically marked the turning point for web usability. Too many dark UX patterns, popups, ads and extremely heavy frameworks for extremely simple sites.
I feel that we got to a point where the community put out a couple recipes for deploying websites and everyone just jumped on it irrespective of the problem, mimicked the same patterns (email newsletter popups, ads, google analytics, 30 million external assets, etc) and called it a day.
While I think your opinion is a bit extreme, I agree that there are so many things that we don’t need, but companies are hiring and new devs are copying the recipes.
The web is SO much more usable now than at any time earlier.
For example, browsers used to actually allow pop-up windows (as in it would open a new desktop window (not a tab), sometimes off-screen). And then when you closed it the browser would let it spawn more pages.
There was a common pattern of spamming popup windows with a "Close" button in the same place a "Run" button would be when you downloaded an executable in Windows 95.
And then after closing 10 windows with the button in the same place they'd hit you with an executable download.
I agree and I remember those days. However, I think there is a distinction between browser improvements vs web usability.
I'm more so talking about _how_ people design sites for the web and how poor design choices lead to poor usability. I think the proliferation of tools that require not too much in-depth knowledge (React), heavy CSS frameworks, and large client-side JS libs have ruined a lot of the web surfing I used to enjoy.
> I always wonder what specifically marked the turning point for web usability. Too many dark UX patterns, popups, ads and extremely heavy frameworks for extremely simple sites.
It's funny that you include popups in that, because actual popups are almost nonexistent today, whereas they were ubiquitous in the early 2000s web.
Cookies are actually incredibly valuable as a place to store web auth tokens where JavaScript cannot get access to your valuable user information.
No matter what happens, if I store my JWT in an HttpOnly cookie that JavaScript can't read, it's safe. Nowhere else on the web is safe in that way.
I also feel it's a mistake to tell people to use fingerprinting instead of cookies. Users actually have control over cookies, so it's almost always better for them if we use cookies instead of fingerprinting.
Not just HttpOnly cookies (no JS access), but also the 'Secure' attribute (HTTPS only) and 'SameSite' for CSRF blocking. Not using cookies and storing your auth tokens in other places is a rookie mistake.
That said, 3rd-party cookies should be blocked by default. IDPs and other exceptional cases can request permission or use one-time query param hashes to exist without them.
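The three protections above can be sketched with Python's standard `http.cookies` module; the cookie name and value are hypothetical placeholders:

```python
from http.cookies import SimpleCookie

# Build a Set-Cookie header for a hypothetical auth token with all three
# protections discussed above.
cookie = SimpleCookie()
cookie["auth_token"] = "opaque-or-jwt-value"
cookie["auth_token"]["httponly"] = True       # document.cookie can't read it
cookie["auth_token"]["secure"] = True         # only sent over HTTPS
cookie["auth_token"]["samesite"] = "Strict"   # withheld on cross-site requests

print(cookie["auth_token"].OutputString())
```

The server emits this as a `Set-Cookie` response header; the browser then enforces the attributes so application JS never touches the token.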
How about using browser-supported auth mechanisms instead of manually reimplementing auth using cookies? There is HTTP basic auth, or TLS client certs and probably more.
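For a concrete sense of the Basic auth option: the browser just attaches credentials to every request in a header, no cookie involved. A sketch of what that header looks like (RFC 7617), with made-up credentials:

```python
import base64

def basic_auth_header(user: str, password: str) -> str:
    # HTTP Basic auth: "user:password" base64-encoded in an Authorization
    # header on every request. Base64 is encoding, not encryption, so this
    # is only sane over TLS.
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Authorization: Basic {creds}"

print(basic_auth_header("alice", "s3cret"))
# Authorization: Basic YWxpY2U6czNjcmV0
```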
Basic auth and Digest auth have some issues, but both of those and client certs mostly fail because of poor browser implementations.
Client certs would really be ideal if browsers handled them better and synced the certs between devices (like bookmarks), but I guess that still wouldn't solve signing in from a new/different (non-synced) device.
Interesting; I'd like more information on this. Could someone who works for Mozilla comment? If this is true, then I would rather use Brave or an ungoogled Chromium build than Firefox.
To what end and through what means, though? Do you really think Mozilla would never do anything nefarious with the data that they collect? They would never, ever, swear on their mother's grave, sell that data to a third party, is that what you believe?
However, the fact that it is NOT using V8 is quite a refreshing change. We need diversity in the JS ecosystem beyond Google's V8. I didn't realize JavaScriptCore is faster than V8.
This is the same weak argument that's being used to sell native compilation for Java. Tons of graphs showing startup-time improvement, never a word about what happens after. But the only cases where this metric makes a real difference are CLI utilities and serverless (if you know of others, please tell me).
That’s not true, as far as I've read. I feel it is spelled out pretty thoroughly: Startup time and memory footprint are way lower, but peak performance is also lower: JIT outperforms static compilation due to it having more information. There are some tools to gather such information by running the program for a while, gathering stats, but JIT still outperforms.
Yeah, what happens if, a few years down the road, the company decides to close the free accounts? For open source projects we need a way to archive the discussion and make it public, not tied to any one company.
Nobody wants their data stuck - people assess the likelihood of that as low and the impact as low, so rationally don’t care about it.
If one of my personal projects was unilaterally deleted right now by GitHub it’d be annoying to lose my issues but I could recover ok. And I don’t think it’s likely anyway, so why worry?
People only have so much energy to spend worrying about things. Most would spend it building instead.
Bingo, we are just a bunch of bacteria walking around. Some organisms became symbiotic over time, like mitochondria, which got incorporated into our cells to make energy. The whole concept that a human being is one organism needs to be revisited.
This is already understood, it's not like scientists aren't aware.
The simplified concept of humans being one organism will continue to be taught nonetheless, because it's extremely useful, and not even wrong in most contexts in which it's applied.
There's nothing special about this, you can say the same thing about any simple model in biology: that the brain is an organ inside the head is a simplification, any diagram of a homeostatic system or metabolic pathway that fits on a single page is a simplification, the "central dogma" of DNA -> RNA -> protein is a simplification...
All of these things are well-known. They continue to be used as models, because, well... they're useful models. And there's nothing wrong with that!
Even more generally, reductionism works but with known limitations which warrant a more holistic approach, and we can't work our way out of most real-world problems without this multi-layered approach. Welcome to empiricism in complex systems.