> If the scan room door is closed when a quench occurs and helium escapes into the scan room, the expanding helium displaces the air and causes a critical increase in pressure in the room compared with the control area, which may prevent opening of the door. If this should happen, the glass partition between the scan and control rooms should be broken to release the pressure. The scan room door can then be opened as usual and the patient immediately evacuated and evaluated for asphyxia, hypothermia and ruptured eardrums.
OOH! Neat! I looked at this on my mobile phone just enough to get a sense of what it is.
I'm not in the petroleum industry, but about 45 years ago I was mesmerized at an energy fair at my elementary school by this Exxon magazine that showed the refinery flow with a bunch of little dots: https://archive.org/details/p-2330663/P2330670.JPG
How awesome that you were able to find this! I’m only missing a couple! The SRU is my final hurrah; I’m still trying to get the play dynamic right: balancing air demand against sulfur load without overwhelming the screen with bars and widgets.
> How awesome that you were able to find this! I’m only missing a couple!
Missing a couple of features you want to add?
The "able to find this" (the 1981 Exxon magazine) needs context to appreciate re: the dispersion of certain personal belongings over time. I would have picked the magazine up in 1981 or 1982 (possibly 1983) at the school energy fair, and it remained in my physical possession for the next 40-something years.
In elementary school I did not have a significant number of possessions outside of toys. Then in middle school I got a small desk, and in high school I got a larger desk, and the magazine ended up in a folder of "neat stuff" that I saved.
Then after college my stuff from the desk ended up in a banker's box, which I still have, along with a couple of other boxes of stuff from that era. This year I looked for maybe 20-30 minutes and found it.
I still have all of my copies of COMPUTE!'s Gazette from the mid-to-late 1980s.
I will say that it is amazing to be able to go online and find all sorts of old computer/electronics/whatever magazines from the 1950s-1980s on archive.org or bitsavers or worldradiohistory.org and there they are... unless they're not.
Perhaps, but I find it quite simple relative to the optimality properties it gives: there is no inherent need to say, "I know that better parameters likely exist, but the algorithm to find them would be hopelessly expensive," as is the case in many forms of minimax optimization.
Ideally either one is just a library call to generate the coefficients. Remez can get into trouble near the endpoints of the interval for asin and require a little bit of manual intervention, however.
While I'm glad to see the OP got a good minimax solution at the end, the article missed clarifying one of the key points: the shape of the error waveform over the specified interval is what matters, and if you don't see the characteristic minimax-like wiggle, you're wasting an easy opportunity for improvement.
Taylor series in general are a poor choice, and Pade approximants of Taylor series are equally poor. If you're going to use Pade approximants, they should be of the original function.
For a polynomial P (of degree n) to approximate a function F on an interval with minimal maximum absolute error, the max error value of |P - F| needs to be hit multiple times (n+2 times, to be precise). You need the polynomial to "wiggle" back and forth between the top of the error bound and the bottom.
And even more surprisingly, this is a necessary _and sufficient_ condition for optimality. If you find a polynomial of degree n whose error alternates in sign and hits its max error bound n+2 times, you know that no other polynomial of degree n can do better; that is the best error bound you can get for degree n.
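The alternation is easy to check numerically. A minimal sketch with NumPy (using `chebinterpolate`, which gives a near-minimax fit rather than the true Remez optimum — my choice of exp on [-1, 1] and degree 5 is just for illustration):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Degree-5 Chebyshev interpolant of exp on [-1, 1]: near-minimax,
# though not the exact Remez optimum.
coeffs = C.chebinterpolate(np.exp, 5)

x = np.linspace(-1.0, 1.0, 2001)
err = C.chebval(x, coeffs) - np.exp(x)

# The error "wiggles": it crosses zero at the 6 interpolation nodes,
# oscillating between its upper and lower envelope (n + 2 = 7 extrema).
sign_changes = int(np.count_nonzero(np.diff(np.sign(err)) != 0))
print("sign changes:", sign_changes)
print("max abs error:", np.abs(err).max())
```

If the error plot instead grows monotonically toward one end of the interval (the Taylor signature), that's the tell that easy improvement is being left on the table.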
Chebyshev polynomials cos(n arccos(x)) provide one of the proofs that every continuous function f:[0,1]->R can be uniformly approximated by polynomial functions. Bernstein polynomials provide a shorter proof, but perhaps not the best numerical method: https://en.wikipedia.org/wiki/Bernstein_polynomial#See_also
Those don't guarantee that they can be well approximated by a polynomial of degree N though, like we have here. You can apply Jackson's inequality to calculate a maximum error bound, but the epsilon for degree 5 is pretty atrocious.
I've been struggling to curve fit an aerodynamics equation relating Mach number to rocket nozzle exit/entrance area ratio for quite some time. It behaves like a 5th or 6th degree polynomial whose inverse doesn't have a closed-form solution.
But I was able to use a Chebyshev fit that is accurate to within a few percent at 3rd degree, and is effectively identical at 4th degree or higher. And 4th degree (quartic) polynomials do have a closed-form solution for their inverse. That lets me step up to higher abstractions for mass flow rate, power, etc. without having to resort to tables. At most, I might need to use piecewise-smooth sections, which are far easier to work with since they can just be dropped into spreadsheets, used for derivatives/integrals, etc.
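A sketch of what I understand the approach above to be, using the standard isentropic area-Mach relation; γ = 1.4 and the supersonic range M ∈ [1.2, 5] are my own assumptions, and the fit quality will differ on other ranges (especially near the sonic throat, where the inverse has a square-root branch point):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

GAMMA = 1.4  # assumed specific-heat ratio (air / ideal diatomic gas)

def area_ratio(M):
    """Isentropic area-Mach relation A/A* for a converging-diverging nozzle."""
    term = (2.0 / (GAMMA + 1.0)) * (1.0 + 0.5 * (GAMMA - 1.0) * M**2)
    return term ** ((GAMMA + 1.0) / (2.0 * (GAMMA - 1.0))) / M

# Quartic Chebyshev least-squares fit of the forward relation on an
# assumed supersonic range; a quartic can then be inverted in closed
# form (Ferrari's method) to recover M from A/A*.
M = np.linspace(1.2, 5.0, 500)
coeffs = C.chebfit(M, area_ratio(M), 4)

rel_err = np.abs(C.chebval(M, coeffs) - area_ratio(M)) / area_ratio(M)
print("max relative error:", rel_err.max())
```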
Anyway, I also discovered (ok, AI mentioned) that Chebyshev approximation is closely related to the discrete cosine transform (DCT).
The secret sauce is that Chebyshev approximation spreads the error as ripples across the function, rather than at its edges like with Taylor series approximation. That helps it fit more intricate curves and arbitrary data points, as well as mesh better with neighboring approximations.
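The DCT connection can be made concrete: the Chebyshev interpolation coefficients are, up to scaling, a cosine transform of the function's samples at the Chebyshev nodes. A small sketch comparing the explicit cosine sum against NumPy's `chebinterpolate` (exp and N = 8 are arbitrary choices for illustration):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

N = 8
j = np.arange(N)
x = np.cos(np.pi * (j + 0.5) / N)  # Chebyshev nodes of the first kind
f = np.exp(x)                      # samples of the target function

# c_k = (2/N) * sum_j f(x_j) * cos(k * pi * (j + 1/2) / N), with c_0
# halved -- exactly the structure of a type-II DCT of the samples.
k = np.arange(N)[:, None]
c = (2.0 / N) * (f * np.cos(k * np.pi * (j + 0.5) / N)).sum(axis=1)
c[0] /= 2.0

# Should match NumPy's Chebyshev interpolation coefficients.
ref = C.chebinterpolate(np.exp, N - 1)
print("max coefficient difference:", np.abs(c - ref).max())
```

In practice this is why large Chebyshev fits are cheap: the coefficients can be computed with a fast DCT rather than a dense least-squares solve.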
Is there any similar ecosystem hook for a zip-like archive? It would be great to have something like .zip file containers for zstd/brotli which could contain a small number of dictionaries, with the decompression utility using them automatically. For example, suppose you have a lot of .js / .css / .html files. Or Python files. Or whatever. It would be more efficient than individual .zst files.
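To be fair, zstd already supports external dictionaries (`zstd --train` to build one, `-D` to compress/decompress with it); what's missing is a standard container that bundles them. For a sense of the payoff, here's a sketch using Python's stdlib zlib, whose `zdict` parameter implements the same shared-dictionary idea — the JS snippets are invented for illustration:

```python
import zlib

# A hypothetical shared dictionary: boilerplate common to many small JS files.
dictionary = (
    b"document.addEventListener('DOMContentLoaded', function () {"
    b"console.log('loaded');});"
    b"export default function handler(request, response) {"
)

# One small "file" that shares long substrings with the dictionary.
data = (
    b"document.addEventListener('DOMContentLoaded', function () {"
    b"console.log('ready');});"
)

def deflate(payload, zdict=b""):
    # level 9, raw defaults otherwise; zdict seeds the compression window.
    comp = zlib.compressobj(9, zlib.DEFLATED, 15, 9, zlib.Z_DEFAULT_STRATEGY, zdict)
    return comp.compress(payload) + comp.flush()

plain = deflate(data)
with_dict = deflate(data, dictionary)
print(len(data), len(plain), len(with_dict))

# Decompression needs the exact same dictionary -- hence the desire for a
# container format that carries it alongside the compressed members.
decomp = zlib.decompressobj(zdict=dictionary)
assert decomp.decompress(with_dict) == data
```

The dictionary-seeded output is much smaller because the shared boilerplate compresses to back-references into the dictionary instead of being stored per file.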