Noted and agreed. See the comment below. The original response should have been phrased as aspirational. Clearly there's no right to be forgotten in the law today. However, query whether it's possible for the law to truly "forgive" a crime after a sentence is served without some efforts being taken to "forget".
I don't think the process of repentance + forgiveness can or should include things returning to a state as if nothing had happened. Scars will always remain; they are an important part of the human experience and help society remember and improve.
I think the definition of "expert" can be pretty ambiguous. We have one type of "expert", the MLB hitter, who is an expert because he is more skilled than others at hitting. And we have another type of "expert" who is an expert because she knows more about something than others. These are two very different types of "experts", but what makes them experts is their skill or knowledge relative to the general populace rather than to some absolute standard.
Even the best batter in the league goes to batting practice, studies what other good batters do, and takes advice from coaches; some likely study the sports-science literature themselves rather than relying on coaches to do that for them. Batters always want to find ways to improve their mechanics and to gain an "edge" over pitchers, whose techniques vary.
The great hitters from decades ago did not have access to video footage of themselves, their rivals, or opposing pitchers, for example. So while some of them were exceptionally skilled at hitting baseballs (not just home runs, and usually at running bases and fielding too), they were not really experts, for want of a body of rigorous literature.
Conversely, one can of course have sub-major-league skills but enormous expertise -- there are batting coaches and sports science academics after all, and even popular analysts. And sports skills decay with age.
Nobel-prize-winning scientists can get senile dementia too; sadly, that wrecks not just their skills but also their expertise, as they forget much of what they've read and studied.
Google can't "simply preload resources of non-AMP pages". Using a CDN means they can get more consistent load times and avoid the inevitable privacy concerns that come with loading 3rd party content the user may never actually click on.
So it's not only possible to do, they've literally been doing it for 5 years now.
Firefox doesn't appear to do prerendering, but it will happily prefetch as well and has since Firefox 23.
AMP's cap on file sizes and other limitations help these relatively ancient prefetching and prerendering mechanisms work better, but you don't need AMP to get preloading. Not at all.
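For context, the prefetching and prerendering mentioned here are exposed to ordinary pages as plain resource hints; a minimal sketch (the URLs are placeholders, and browsers are free to ignore the hints):

```html
<!-- Resource hints: all are best-effort; the browser may ignore them. -->
<!-- Prefetch: fetch a document the user is likely to visit next and
     keep it in the HTTP cache (supported by Firefox and Chrome). -->
<link rel="prefetch" href="/next-article.html">
<!-- Prerender: fetch *and* render the page ahead of navigation
     (Chrome; later versions downgrade this to a NoState Prefetch). -->
<link rel="prerender" href="https://example.com/next-article.html">
```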
> Firefox doesn't appear to do prerendering, but it will happily prefetch as well and has since Firefox 23.
I don't know if it's necessarily "happily" in Firefox's case. It's hardly optional if you want to compete at all on speed with browsers that do full prerendering. And they are aware of the implications, which is why they don't do it fully.
Provided the server hosting the site to be prefetched uses HTTP headers correctly, you can even do it via AJAX from any website, regardless of the browser's configuration or capabilities: just load the page/image/whatever and throw away the result; now it's in the cache.
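A minimal sketch of that fetch-and-discard trick (the function name and URL are illustrative; as noted above, cross-origin use depends on the target server sending permissive caching and CORS headers):

```javascript
// Best-effort "prefetch via AJAX": request the resource and discard the
// body. If the origin's caching headers allow it (Cache-Control, ETag,
// ...), the response now sits in the browser's HTTP cache, so a later
// real navigation can be served locally.
async function prefetchIntoCache(url) {
  try {
    const res = await fetch(url, { credentials: "omit" });
    await res.arrayBuffer(); // drain the body, then throw it away
  } catch (err) {
    // prefetching is purely opportunistic; swallow failures
  }
}

// Fire-and-forget usage:
prefetchIntoCache("https://example.com/next-article.html");
```

Note the empty `catch`: a prefetch that fails should never surface an error to the page, since the real navigation will simply fetch normally.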
Being forced to load everything you read via Google is worse for privacy. There are already plenty of solutions for blocking 3rd-party scripts, like Privacy Badger and uMatrix.
Google already has your IP; it's their page. Preloading resources from its own CDN doesn't tell them anything they don't already know. Preloading resources from someone else's domain would.
Google doesn't have a list of everything you do online unless you're loading resources from their servers after you leave their site. AMP always loads from Google. If you block those 3rd party scripts, AMP pages literally take 8 seconds to load.
With an extension such as uMatrix or NoScript, blocking first-party scripts will cause `noscript` tags to be rendered, and one of these tags disables the CSS animation, causing the page to appear immediately.
When I found out about this, I tried to find the reasons for this artificial "delay" in the AMP documentation, but I couldn't find any valid justification for it.
The net result unfortunately is that most users wanting to block `ampproject.org` out of privacy concerns are going to feel the need to whitelist `ampproject.org` to "un-break" a site making use of it.
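For reference, both the "delay" and the `noscript` escape hatch come from the boilerplate `<style>` that AMP requires on every page (abridged here; the vendor-prefixed copies are omitted). The body is hidden by an 8-second CSS animation that the AMP runtime cancels once it loads; the `noscript` copy cancels it when scripts are blocked:

```html
<!-- AMP boilerplate (abridged) -->
<style amp-boilerplate>
  /* Hide the body for up to 8s unless the AMP runtime cancels this */
  body{animation:-amp-start 8s steps(1,end) 0s 1 normal both}
  @keyframes -amp-start{from{visibility:hidden}to{visibility:visible}}
</style>
<noscript>
  <!-- With scripts disabled or blocked, cancel the animation at once -->
  <style amp-boilerplate>body{animation:none}</style>
</noscript>
```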
Just how do you imagine they'd "CDN people's non-AMP content"? There's no mechanism by which they could tell the browser to load nytimes.com but to replace the URLs of random resources with different ones.
They'd need to host the actual page on google.com. And after solving all the problems that introduces, you've pretty much got AMP already.
Even if you can't package up and ship all of your traditional site to Google's CDN, you could do most of the burdensome/heavy bits. But then Google doesn't get to control your website and define the way it's allowed to look, which is what AMP is really for.
So it was not possible when AMP launched, is not possible now, and might or might not be possible sometime in the future after some specs are finished, but only in some browsers. Doesn't sound very practical, to be honest...
I also can't imagine the amount of shit Google would have taken if they'd started just randomly doing that kind of thing for existing web pages. Instead they introduced a totally new mechanism (i.e. AMP) where the caching was a core concept from the start.
> Even if you can't package up and ship all of your traditional site to Google's CDN, you could do most of the burdensome/heavy bits. But then Google doesn't get to control your website and define the way it's allowed to look, which is what AMP is really for.
But the "heavy/burdensome bits" are exactly the things that matter least for this use case. Ideally they would not exist at all. If they do exist, they should not be speculatively prefetched.
It'd also mean that these pages are now tied to Google's CDN, no matter what. Have a user click through the link from some other source than a Google search result? They'll still end up loading the resources from them. Is that really what you want?
That's true. I don't see any reason why they couldn't cache non-AMP content with a combination of speed benchmarks and schema markup. When you think about it that way, it seems they are more concerned with controlling the user experience than with speed improvements.