I started using poetry about 4 years ago and definitely hit a lot of bugs around that time, but it seems to have improved considerably. That said, my company has largely moved to uv as it does seem easier to use (particularly for devs coming from other languages).
Out of interest, how are you approaching this? Are you trying to "refactor" in specific areas, or is it somewhat more ad hoc? Do you have something in mind for the next "volume"?
I haven't touched CAD for a couple of years, but I get the impression that (inevitably) the generative design hype significantly exceeds the current capability.
Yes, it still exists. Most (all?) EU legislation had to be explicitly revoked to stop applying, since the UK was fairly diligent in transposing it into national legislation.
I think the idea here is that children need to have the opportunity to explore the wider world independently, not just a 50 metre circle around their home.
> - Super smart designs causing the page layout to change after it's loaded.
This seems to happen far too much. Nowadays it usually causes me to give up on a site and go somewhere else. Eliminating stuff like this feels like it should be low-hanging fruit UX-wise (albeit not especially exciting, I guess).
How is this an intense road? It looks pretty wide and clear of traffic with a few parked cars. If it can't handle this, it's got no chance in an average city in Western Europe.
Surely this is showing correlation, but not necessarily causation. There are probably many fields where improvements in X have happened, along with greater compute power. How much the improvement in X is attributable to greater computing power is surely very hard to quantify. Or am I missing something?
For the oil example, I can think of a very simple explanation that has nothing to do with computing power. They state that the drilling success rate was 10% in 1940 and 70% in 2010. In other words, out of 10 exploratory drills, 9 were dry in 1940 and only 3 in 2010. And since the companies use computers to predict the presence of oil, and the computing power has increased, voila, the computing power explains the increased success.
But the alternative explanation is that over time oil could only be found at greater and greater depths. The cost of exploration has increased. When it does not cost you much to drill a shallow hole (in 1940), you don't mind if you miss 9 times out of 10. But if you have to put in millions of dollars to drill to depths of thousands of meters, then you think twice (or maybe 10 times) before you drill.
Any prediction system will give you an estimated probability of success. If the cost of failure is not high, you may decide to drill as soon as p > 10%. But if it's high, you may increase the threshold to 90%.
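That threshold argument is just an expected-value calculation: drilling is worthwhile only when p × payoff exceeds the drilling cost, so the break-even probability is cost/payoff. A quick sketch with made-up numbers (nothing here comes from the article, the figures are purely illustrative):

```python
def breakeven_threshold(drill_cost: float, expected_payoff: float) -> float:
    """Minimum success probability p at which drilling has positive
    expected value: p * expected_payoff - drill_cost > 0  =>  p > cost/payoff."""
    return drill_cost / expected_payoff

# Hypothetical cheap shallow well (1940-style): cost is small relative to payoff,
# so even a 10% success rate is worth drilling on.
shallow = breakeven_threshold(drill_cost=100_000, expected_payoff=1_000_000)

# Hypothetical deep modern well: cost is a large fraction of the payoff,
# so you only drill when the predicted probability is very high.
deep = breakeven_threshold(drill_cost=9_000_000, expected_payoff=10_000_000)

print(shallow, deep)  # 0.1 0.9
```

So a rising success rate could partly reflect companies raising their threshold as wells got more expensive, independent of any improvement in prediction.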
Of course, I'm not saying this is all there is to it. But it's likely that this explains part of the increased success. Another part is no doubt due to more experience. Another is due to algorithmic advances. Yet another to the improvement in the seismic sensors and technology used.
> Yet another to the improvement in the seismic sensors and technology used.
This is a huge one. Almost all of that geoseismic downhole sensing technology uses radioactive emissions. In 1940, it was impossible to "see" anything downhole with any reliability. Nowadays it's passé.