
Many people pushing Ivermectin have a vested interest in proving the vaccines are unnecessary.

It may work, but that's why you get recommendations for "vaccine made me magnetic" and other garbage like that whenever you search for videos.

It's the same group that was pushing hydroxychloroquine as a miracle drug


> Many people pushing Ivermectin have a vested interest in proving the vaccines are unnecessary.

What is their vested interest in a generic, cheap, decades-old, off-patent drug? Vested interests tend to be in new, patented treatments. That's where big pharma profit is. So let's try a rephrase: Many people pushing patented treatments have a vested interest in proving off-patent alternatives are unnecessary.

For example, here's Merck warning against using Ivermectin for Covid-19:

https://www.merck.com/news/merck-statement-on-ivermectin-use...

Then a few months later, "Merck Announces Supply Agreement with U.S. Government for Molnupiravir, an Investigational Oral Antiviral Candidate for Treatment of Mild to Moderate COVID-19"

https://www.merck.com/news/merck-announces-supply-agreement-...


Being proven right. That's the anti-vax crowd's dearest interest.


Ok, but let's not get confused. This is different from being an anti-vaxxer. You can speak up for something like Ivermectin AND be a pro-vaxxer. The most prominent voice on this is Bret Weinstein (biology PhD), who is more vaccinated than average. He and his wife and kids are vaccinated for the typical things plus typhoid, rabies, yellow fever... They are pro-vaccine and so am I. Vaccines are one of the best inventions ever.


You can, but you'd have to have some proof.

And so far we do not have any. So until then, this story is being pushed way further than it has any right to go. Weinstein thrives on the censorship, but the fact of the matter is that there simply is no proven efficacy, and until there is, this has no business being promoted to a mainstream audience who might get themselves into a lot of trouble, or who might forgo getting vaccinated.


In my case that is far from true. The implications of an effective treatment such as ivermectin are huge: a) Covid passports go away; b) deaths and illnesses due to Covid are greatly reduced; c) it puts a spotlight on why these treatments have not had government-sponsored clinical trials, considering the ramifications. Incompetence in our governments in these kinds of situations should not be tolerated. For example, in Canada the province of BC allowed a trial to commence in May of this year, and it still has not started. Almost two years after the start of this thing? Ivermectin has proven safe over the 40 years it has been in use, and at similar doses (which is what is recommended for Covid) there is all gain and no loss in running even small trials. I would certainly have taken it, since I had severe effects from Covid. d) We don't have to have a phase 3 vaccine trial be a public trial, where normally that is phase 4.


That's the wrong side to look at this from. These are just people on the fringe with little influence and funding.

If ivermectin were authorized as a viable treatment, the emergency use authorisation the vaccines received wouldn't have been possible. There were, and still are, billions on the line.


I'm having difficulty understanding this line of reasoning. What does the authorization of ivermectin have to do with the vaccines? Xofluza, Relenza, and Tamiflu have been approved for treating influenza, and that has had no effect on the recommendation that people get their flu shots.


Someone else answered correctly, saying an EUA can't be issued if an already-licensed drug can help. I would further that point by saying: actually go dig up the true source of that policy on the FDA's website in your country (not someone's summary or interpretation). It's a great exercise that will leave you with some sort of ground truth in this mess.

Same as in coding, you eventually reach a point where you learn that when in doubt, you must read the source.


It's the way emergency use authorisation works with the FDA: they won't issue one if there are other safe, viable treatments. These vaccines got that approval because those other options were suppressed.


>Xofluza, Relenza, and Tamiflu

None of them are as effective as Ivermectin (allegedly). It is only logical that efficacy is also important.


We already have a drug that's 99% effective at keeping you out of the hospital. And it's only a single/double dose. It's the COVID vaccine.

Any other treatment is basically unnecessary at this point because COVID deaths in vaccinated people are less than 1 in a million.

That's why anti-vaxxers are so vested in other treatments. It's the only way they can rationalize not getting vaccinated.


Nope, that is not the only way. There are side effects. And there is the fact that the vaccine is still in its test phase. What are the long-term side effects? We don't know. And unless they are obvious, we will never know, because vaccines are like a religion to too many. You don't question a religion.


You're indirectly admitting that I'm right. The anti vaccine crowd is desperate for a non vaccine cure, because the vaccine is so incredibly effective their beliefs fall apart otherwise


The vaccine is empirically way more dangerous than every other commonly given vaccine. I got the vaccines, but I don't understand why we can't be honest about it.


We can believe the vaccine works but that the risk is too great to take it because it's so new. And for the record, I think the alternatives are also too risky.


Well, the Covid vaccine comes with some caveats, to me the most crippling to freedoms being the vaccine passports that seemingly all nations are already implementing. Having various viable treatments would obviously put an end to that. Since ivermectin has a long history of safety and is already widely available, it's reasonable that interest is strong.


But not for this particular application, and not in the doses where it apparently has some effect on COVID. And that is the problem with promoting this: as long as you weren't aware of that, you probably should not be part of the army of 'useful idiots' of the anti-vax crowd.


I don't think this comment really addresses my point. From what I've seen, the dosages (at least for prophylaxis) are similar to what you would normally take. No, I don't think it should be rammed down people's throats without solid evidence, but neither should the vaccine. The fact is, vaccines take around 7 years to exit trials, and so yes, I'm hopeful for a proven alternative that's not on trial, but the likelihood as far as I'm convinced is nil.


Create the problem, bring an unnecessary solution nobody would have accepted otherwise. Step 3: profit!

Vaccines and covid are going to come back every year for sure.


For all the talk of "vaccine passports", not a single state has done it. Biden said he wasn't going to do it months ago. Not a single country on earth has done it.

If I were so worried about such a thing, I might start to think maybe I was being manipulated to fear something that didn't exist.

Nobody has implemented vaccine passports. And nobody will.

But the idea of a treatment even close to as effective as a vaccine is attractive to those who engage in such magical thinking.

Ivermectin might be cheap, but the vaccine is literally free if you're in the US. There's zero reason not to get it, and zero reason to hang onto hope that alternatives will work. We already have an extremely effective treatment, so effective it can wipe out the virus entirely, and it's totally free.

Our government under Trump bought the vaccine en masse for a few dollars a dose. Those that refuse such a miracle treatment when the rest of the world dies of COVID are a stain on America's sheen


> for all the talk of "vaccine passports", not a single state has done it

New York's Excelsior Pass is a vaccine passport, and has been called such by New York Times and other sources.

https://www.nytimes.com/2021/06/09/nyregion/excelsior-pass-v...

https://www.msn.com/en-in/money/topstories/new-york-s-excels...


These are extreme arguments with little room for nuance. If you are so sure, then so be it. There's more than enough news on this countering most of your reassurances.


Ok, I was confused by this line of reasoning, so I did some digging. I believe it's one of the popular antivax disinfo conspiracy theories. Here's my reading.

The line of reasoning hinges on the vaccines being "experimental" and only being distributed under an Emergency Use Authorization, as opposed to a full approval. The FDA policies for EUA indicate they're only to be used when there is no adequate, approved, and available alternative. This makes a lot of sense - if (let's say) someone comes up with a new flu vaccine (an mRNA one, to continue this example, as that would be kinda exciting), you really want it to go through the full approval process instead of EUA, even if it is better. That's because we have plenty of good, approved flu vaccines.

So, the theory goes, if we had an approved treatment for Covid, then the EUA for the vaccines would be illegal. And so that creates incentives for the pharmaceutical companies to suppress a miracle cure like (they claim) ivermectin.

To anybody with the capacity for rational thought, this is obviously bullshit. We have fully approved treatments already, including remdesivir. The idea that a treatment for Covid, even a pretty good one, would make vaccines unnecessary makes no sense.

I am fairly confident in making the following prediction. Full FDA approval for the Pfizer/BioNTech and Moderna vaccines is likely by the end of the year[1], at which point the above line of reasoning will no longer be applicable. Antivaxxers will smoothly transition to another line of argument.

I do think this "theory" is one reason you see a significant overlap between pro-ivermectin and antivax, for example in the comments of Bret Weinstein videos.

[1]: https://www.cnbc.com/2021/05/18/covid-vaccines-what-full-fda...


I think what has separated the policy around Covid, vs. say the flu, is the death rate. If the death rate were closer to the flu, or below some threshold that takes spread into account, the response would have been much different.

If a better treatment was available that would lower the death rate that much, it would change the equation.


Yes. Anybody trying to rationalize not getting one of the extremely effective vaccines will be grasping at straws for any other effective treatment. It's a natural outcome.

The vaccines are as effective as the one that eliminated smallpox. To be against it, you need some serious FUD


FUD like the fact that the vaccines have had no completed studies on their long-term effects? Or being part of a demographic that has a low risk from COVID?


Yes, that's FUD. You can't have a completed long term study when a vaccine has not been on the market for a long enough term. That's pretty obvious.


Right, so how is it FUD if you simply want to avoid being part of a clinical trial? Vaccines take at least 7 years (usually 10ish) to complete long-term trials.

For young people, the cost-benefit analysis isn't clear at all--much less risk of anything bad at all from covid itself, plus far more years of life to lose or suffer from vaccine injury, which is a totally unknown risk.

To boot, the vaccines contain at least three entirely new technologies never before adopted in vaccine treatment.


The several newly minted billionaire Pfizer execs are certainly happy Ivermectin is only just recently starting to get a decent look.


> vaccines being "experimental"

Under the PREP Act, pharma companies have total immunity from liability. Why would that be? Maybe because the vaccines are still only in stage 3 of clinical trials? Because the long-term effects are unknown, because it hasn't been long? With worrying reports about side effects, including at least 5,000 deaths in the U.S. VAERS database, do you think those quotes are appropriate?

> The idea that a treatment for Covid, even a pretty good one, would make vaccines unnecessary makes no sense.

Actually, it makes perfect sense to many people.

You are shooting down the weak version of this argument. I think you are confounding necessity from the point of view of the state and institutions with necessity from the point of view of many people.

You are correct that the state and health institutions do want to get people vaccinated regardless of other cures, the evidence for that is overwhelming and existence and availability of some alternative strategy/cure isn't going to stop immediately that intent.

However, if there were, hypothetically, an accessible and efficient medication/treatment with prophylactic or curative effects for COVID-19, it would make a substantial portion of the population skeptical about getting the vaccine, especially now that the number of serious cases is low and manageable.


IMO the vaccines are pushed as a miracle drug. There are probably fanatical people on both sides, though. You might also want to check remdesivir, a drug that is not effective, but they pushed it far enough to get a contract of 1 billion, and effectively 0.2 billion has been spent on it.


The vaccines are a miracle drug. Less than one in a million people that have gotten both doses of mRNA vaccines have died of COVID.

It's one of the biggest infectious disease success stories since the invention of penicillin


Maybe, but do you know how many have died from the vaccine-induced side effects? How could we find this number?


Yes. Basically none. The vaccines have been given to almost a billion people. And side effects have been tracked. How can we find this number? We already have it.

If there was anything dangerous within even 3 orders of magnitude of those that have died because they didn't get the vaccine it would be front page news.

Get vaccinated


What about long-term effects of the vaccines?


I asked why Covid vaccines are absent from VAERS list of reportable AEFIs

https://vaers.hhs.gov/docs/VAERS_Table_of_Reportable_Events_...

And the person you are responding to basically said that the new vaccines are so safe that they didn't have to be on that list.

https://news.ycombinator.com/item?id=27513984

I don't mean this as a personal attack on that user, but it is really concerning how blind and apologetic some people are to the various things that can go wrong with a vaccine.


I agree. I just hope that such blindness doesn't affect us. I worry that it will.


> And side effects have been tracked. How can we find this number? We already have it.

Oh, that is great! Where can I find this tracker and the reports of side-effects? And this number we have, what is it? Can you write it here?

Should I rely for this information on my TV news?


It is and it is weird.

What do these people (the promoters) get out of it? Attention? A more dedicated following?


I think the most useful lens to understand the sociology of these kinds of questions is religious belief. And there is no shortage of people out there who want to convince others of their belief.

The narrative for "the Ivermectin story" is particularly compelling, as it involves brave maverick doctors working selflessly to get the word out, suppression and censorship by shadowy organizations (big pharma fearing competition, the big Internet companies just lusting after the power of thought-control), and the empowerment of people to take medical decisions into their own hands.

Incidentally, this was the exact same narrative as HCQ, and is being pushed by a lot of the same people. The end of the story may turn out differently, as the evidence on HCQ is overwhelming that it doesn't work, so you only see dead-enders pushing it, but there is a good chance that Ivermectin will turn out to be at least moderately effective, though the jury is still out.


Religious belief. Exactly. And what are your religious beliefs? Are you sure you have none? Is there something or someone you would never doubt?


How about having a discussion? Not everything has to be so complicated lol


I've had bad experiences with counterfeit components on AliExpress. It's rare to get anything real and unused.


SparkFun and Adafruit are great, but they're for casuals. If you're building breadboard prototypes there's nothing better than Digikey and Mouser.


The disdain for casual hobbyists you show here implies that you are probably not the target audience for the article, which is targeted even lower than hobbyists: people who need an early introduction. Professionals have very different needs than hobbyists and therefore need different materials and suppliers.


Eh true. And it's not disdain, those companies are great for what they do. The guy above just implied they sucked, which isn't the case. They're great for their target market


I actually disagree with you completely. (And I'm a professional, this is my day job.) For breadboarding I buy from Adafruit and Sparkfun, since they have good selections of breakout boards, MSOP to DIP adapters, etc. DK and Mouser have plenty of that, and some really nice stuff (like TI's adapter "EVMs")... but for prototyping on a breadboard, I start with the other guys.

(Then usually tack them on to a Digi-Key or Mouser order, because if it doesn't require something weird, I probably didn't need to prototype it.)


One day, Rust will need a GC. Reference counting is just a crappy GC. A modern GC can perform better than this, so Rust is actually hurting its own performance by not having one.

A good GC would make heavily concurrent apps much easier to build in Rust, and would perform better than the typical Arc<Mutex> objects passed around right now.


Tracing GC is troublesome for any non-memory resource, such as a network connection or file handle, due to its untimely release, but otherwise I actually agree: reference counting is a GC mechanism, not a very good one, but it's the only one I'm aware of that works for both memory and resources.

I would enjoy seeing someone test a model where the type system guarantees (or at least lets you detect the situation) that you cannot store such non-memory objects behind a traced GC node (these would include plain memory objects that need to be registered/unregistered precisely).

It might be that it would be needlessly annoying to use compared to plain RC. Or maybe it would be the best of both worlds?


Not when the language also supports value types and region allocators (e.g. IDisposable in .NET).

You can even turn it into proper RAII, by making it a compilation error not to handle those interfaces properly.

Again with .NET, there are SafeHandles as well, alongside the MarshalInterop APIs.

This is nothing new, actually. Mesa/Cedar at Xerox PARC used reference counting with a cycle collector, while other descendant languages down to Modula-3 and Active Oberon always combined value types, tracing GC, and C++-like resource handling capabilities.

Oh, Common Lisp also has similar capabilities, especially the ZetaLisp predecessor from the Lisp Machines.

Then there's Eiffel, which not only had this, but was also probably the first Algol-like language to support non-nullable references.

Sadly they decided to ignore all of this in Java, and then its world domination kind of made everyone else ignore it as well.

Thankfully even Java is improving its story in this regard, while languages like D, Nim, and yes, .NET kind of show what was already available for several decades.


I must be missing something. How is it possible to precisely collect a resource with a tracing GC? And if you need to update counters when you duplicate object references, you are not really using a tracing GC. The benefits of tracing are cheap duplication of object references, cheap allocations, and cheap (batched) releases; the downside is not being able to release precisely and automatically as soon as the value is available for collection.

Seems to me it is impossible to have both automatic precise release of resources and collection-based GC?

As I understand it, even the documentation for IDisposable in .NET says as much at https://docs.microsoft.com/en-us/dotnet/api/system.idisposab...:

> The primary use of this interface is to release unmanaged resources. The garbage collector automatically releases the memory allocated to a managed object when that object is no longer used. However, it is not possible to predict when garbage collection will occur. Furthermore, the garbage collector has no knowledge of unmanaged resources such as window handles, or open files and streams.

> Use the Dispose method of this interface to explicitly release unmanaged resources in conjunction with the garbage collector. The consumer of an object can call this method when the object is no longer needed.

So this is the interface you can use to explicitly release a resource, because the GC gets around to it only later at some unspecified time.

About SafeHandle it says at https://docs.microsoft.com/en-us/dotnet/api/system.runtime.i...:

> The SafeHandle class provides critical finalization of handle resources, preventing handles from being reclaimed prematurely by garbage collection and from being recycled by Windows to reference unintended unmanaged objects.

It doesn't seem helpful at all for automatic precise release of resources.


> the benefits are the cheap duplication of object references, cheap allocations and cheap (batched) releases, but the downside is not being able to precisely and automatically do it when the value is available for collection.

Note that you don't need a GC to reap these benefits, if desired. You can allocate an arena and do secondary allocations inside it, then deallocate everything in a single operation. Arena deallocation is not timely or precise, but it does happen deterministically.


True, but a GC gives those benefits automatically, compared to a naive program doing e.g. RC-based memory management.

And there is of course the question of safety: should you release an arena too early, you may have introduced a bug. Worse, it might not crash immediately.

There is actually some work on doing arena management automatically, called region inference: http://www.mlton.org/Regions

But the way I see it, it's just a way to make memory management even more efficient; it's not about precise release of resources, and indeed not all programs can be expressed so that releases happen only in batches per arena (assuming those arenas themselves aren't dynamically managed, which is certainly a valid strategy as well, but a manual one).


> should you release an arena too early, you may have introduced a bug.

A memory-safe programming language will detect any such bugs and reject the program. This is not hard; it's a clean application of existing lifetime checks.


So are there some languages that do it? I'm sure the devil is in the details.


You aren't reading it properly; the documentation you are quoting covers the case where you leave the work to the GC. You can take it into your own hands, C++ RAII style:

   {
      using var my_socket = new NetworkSocket();

   }

   // my_socket no longer exists when code arrives here

Or even better: if NetworkSocket is a struct, it gets stack-allocated, zero GC.


So how about this then:

    {
      using var my_socket = new NetworkSocket();
      my_socket.write("Started");
      register_callback(() => my_socket.write("Finished"));
    }

This is a case that RC solves well and tracing GC doesn't solve at all, regardless of the number of interfaces you implement. It is easy to find yourself in this situation given how much callbacks are used in modern codebases.


    NetworkComponent foo = new NetworkComponent();

    {
       using var my_socket = new NetworkSocket();
       foo.socket = my_socket;
    }

    foo.do_sth_with_socket(); // oops, runtime failure, socket closed


Trying to be clever?

Here is your Rust version, enjoy.

    use std::io::{self};

    struct NetworkComponent {
      socket : NetworkSocket
    }

    impl NetworkComponent {
        fn new() -> NetworkComponent {
            println!("Creating NetworkComponent");
            NetworkComponent {
                socket : NetworkSocket{}
            }
        }
        
        fn do_sth_with_socket(&self) {
            
        }
    }

    impl Drop for NetworkComponent {
        fn drop(&mut self) {
            println!("Dropping NetworkComponent");
        }
    }    


    struct NetworkSocket {
        
    }

    impl Drop for NetworkSocket {
        fn drop(&mut self) {
            println!("Dropping NetworkSocket");
        }
    }  

    fn main() -> io::Result<()> {
        let mut foo = NetworkComponent::new();
        
        {
            let socket = NetworkSocket{};
            foo.socket = socket;
        }
        
        foo.do_sth_with_socket(); // oops, runtime failure, socket closed
        
        Ok(())
    }
https://play.rust-lang.org/?version=stable&mode=debug&editio...


And what did you try to prove here? There is no use-after-free and no runtime error in this Rust code. The socket stays valid from the moment of its creation and for the whole lifetime of the network component. It gets moved out of the nested scope properly and gets closed after leaving the outer scope, after dropping the NetworkComponent struct.

The "oops" comment is invalid in your Rust example, because the socket is still valid at that point.

Which is totally different from what would happen in C#, where you'd get a use-after-free bug (actually use-after-close).

Try-with-resources is not RAII. It is a lot weaker.


> foo.do_sth_with_socket(); // oops, runtime failure, socket closed

Happens just as well in Rust, why do you think I gave you a Playground link.

If you want, I can shut up the cleverness with a cargo build example instead of a dummy playground example.


The playground link confirms the socket is closed after dropping networkComponent.

Last two lines of the output:

    Dropping NetworkComponent
    Dropping NetworkSocket
Btw: you probably fooled yourself by accidentally creating 2 sockets, and indeed the first one gets dropped immediately when you lose (overwrite) the reference to it. Use Option to avoid that.


You forgot another line, it was actually:

    Creating NetworkComponent
    Dropping NetworkSocket
    Dropping NetworkComponent
    Dropping NetworkSocket
Besides, you forgot another tiny detail:

By replacing the socket, the port number is now a different one, and all processes that had open connections to that port will now crash, or have messages dropped without knowing why.

I can also fabricate plenty of error situations with Rust if you feel so inclined.

And if you were actually serious, you would be aware that there are Roslyn analyzers that validate that IDisposable follows proper RAII patterns, like https://github.com/DotNetAnalyzers/IDisposableAnalyzers

Remember, Rust isn't perfect and only fixes 70% of existing error patterns; I can find plenty of inspiration in the remaining 30%.


> You forgot another line, it was actually:

No I didn't. That line is totally irrelevant and does not apply to the socket that was passed to the NetworkComponent. It applies to the initial socket you added, which was not even present in the original example. You should have used Option to make your code equivalent.

Anyway, your example failed to show use-after-close in Rust.

> Remember, Rust isn't perfect, and only fixes 70% of existing error patterns

Sure, no-one here debated that. But it fixes/protects from more error patterns than C#, and use-after-close is one of them.

You stated that try-with-resources plus struct types are functionally equivalent to RAII. My code proved they are not, because you can trivially produce a use-after-free, and it is even really easy to do by accident. There is nothing in the language that protects you from leaking a closeable reference out of the `using` scope and then using that reference after the scope is closed. And that leak can happen 10 layers below, where it is not as easily seen as in this trivial example I posted. In Rust you can do it only with explicit `unsafe`; otherwise the type system tracks that for you.


> In Rust you can do it only with explicit `unsafe`; otherwise the typesystem tracks that for you.

Exactly, so I can continue this charade by creating such an example.

Rust is not a magic bullet, and just like your "proof", I can provide a similar "proof" with unsafe.

Or I can provide an example in D, which has a GC and C++-like RAII, or Swift, which also has a GC (ARC is a GC algorithm) and C++-like RAII as well.

I have been playing this game of explaining how to do deterministic resource management in GC-enabled languages since I learned Oberon in 1995.

It is always the same pattern.

- "GC languages cannot do X"

- "Actually you can partially achieve X with Y"

- "Yeah, but ....."

So whatever.


I would need some data on that, but I have to say that it always makes me laugh when people only talk about the GC in threads about D. It's so good for productivity. I don't really like it, but I can't describe just how much of a non-issue it is for us (the company I work for).


Tracing GC has poor memory performance because it has to access rarely used or swapped-out pages to scan them for pointers. And of course, peak memory use is much higher, since it doesn't free everything as soon as possible.

There may be advantages if you can use it to add compaction, but I don't think you necessarily need a GC to do that.


Actually it is the other way around.

https://github.com/ixy-languages/ixy-languages

No wonder the M1 has architecture-specific optimizations that help streamline ARC boilerplate code, while Swift 5.5 will bring more aggressive optimizations (disabled by default, because applications can crash if weak/unowned references are annotated improperly; see the WWDC 2021 talk).


This isn't representative of application code and there isn't even any mention of the metrics I mentioned…

> No wonder that M1 has specific architecture optimizations that help streamline ARC boilerplate code

No it doesn't. I told you it didn't the last time you said this.


> This isn't representative of application code and there isn't even any mention of the metrics I mentioned…

Yeah, that is the usual answer when benchmarks prove how much of an urban myth reference-counting performance actually is.

> No it doesn't. I told you it didn't the last time you said this.

Did you?

There is more important stuff in life to store in my brain than a list of who replies to me on Hacker News.

Anyway,

https://github.com/apple/swift/blob/main/stdlib/public/Swift...

https://twitter.com/ErrataRob/status/1331735383193903104


> Yeah, that is the usual answer when benchmarks prove how much urban myth reference counting performance is actually like.

CPU/wall time benchmarks are not that relevant to system performance (seriously!) because second-order effects matter more. But if you had peak memory and page demand graphs that would matter.

For a network driver I don't know if it'd really look any different though. That's mostly arena allocations.

> https://twitter.com/ErrataRob/status/1331735383193903104

The fast atomics and JavaScript instructions do exist but aren't "special", they're just part of the ARM ISA.


Apple's atomics as of recently are almost magically fast, though.


Thanks for sharing these links! Super interesting. I do have a question though. The ixy benchmarks seem to imply that RC is generally slower than GC (Go and C# are much faster than Swift, and are only outdone by languages with manual memory management).

However in the tweet thread you shared, the poster said

> all that reference counting overhead (already more efficient than garbage collection) gets dropped in half.

Implying that reference counting is actually more efficient. I don't know how to reconcile these two observations. Do you have any insights?


That observation is made from the point of view of Swift developers.

The only reason Swift has reference counting is historical.

The Objective-C GC implementation failed because it was very hard to mix frameworks compiled with and without GC enabled, alongside the usual issues of C memory semantics.

https://developer.apple.com/library/archive/documentation/Co...

Check "Inapplicable Patterns" section.

So Apple made the right design decision: instead of trying to fix tracing GC in such an environment, they looked at Cocoa's [retain/release] pattern, just as Microsoft does in COM, automated it, and in a marketing swoop sold that solution as ARC.

Swift, as an Objective-C replacement, naturally had to build on top of ARC to keep compatibility with the Objective-C runtime without additional overhead (check RCW/CCW for how the .NET GC deals with COM).

Here is a paper about Swift performance,

http://iacoma.cs.uiuc.edu/iacoma-papers/pact18.pdf

> As shown in the figure, performing RC operations takes on average 42% of the execution time in client programs, and 15% in server programs. The average across all programs can be shown to be 32%. The Swift compiler does implement optimization techniques to reduce the number of RC operations similar to those described in Section 2.2.2. Without them, the overhead would be higher. The RC overhead is lower in server programs than in client programs. This is because server programs spend relatively less time in Swift code and RC operations; they spend relatively more time in runtime functions for networking and I/O written in C++.

It makes technical sense that Swift uses reference counting, as explained above, but it isn't due to performance; that story just sells better than explaining it was due to Objective-C's inherited C memory model, which, besides the memory corruption problems, doesn't allow for anything better than a conservative garbage collector, with very bad performance.

https://hboehm.info/gc/


LLVM has built-in hooks for GC, used by Azul's Zing JVM JIT compiler and perhaps Safari's LLVM JS JIT.

I'm not sure whether the GC integration is tied to JIT support or not. I think it's mostly related to the insertion of safepoints, which could be useful for a Rust implementation.


Safari/JavaScriptCore has moved away from LLVM and now uses its own B3 backend for advanced optimizations.


In the future, we may have better ways of escaping earth than our brute force rockets.

If we could use our atmosphere as the oxidizer, like planes do, we could get to orbit with a far better mass fraction and far less fuel: closer to a normal plane than a tin can full of explosives.

The problems are heat related: at the speeds and altitudes needed, regular jet engines would melt. There's ongoing research into precoolers to make the concept work with fairly conventional engines.

The British are working on this with Skylon and the SABRE engine. https://en.m.wikipedia.org/wiki/Skylon_(spacecraft)


What about a non-zero launch speed from some kind of electrical ramp, railgun style?


The atmosphere at low altitude burns off tons of energy. I don't think we'll ever achieve ballistic launch from Earth's surface, but that's just my opinion :)


Agreed. For all the problems with ERCOT, Texas does have an astoundingly high penetration of renewable sources that makes programs like this extremely useful. High renewable penetration causes grid stability issues, but not for the reasons you would think.

Electricity travels at the speed of light, so supply and demand must balance instant to instant; there's a huge dependency on spinning generator inertia to keep a stable supply. With renewables, this "grid inertia" is greatly reduced.

The reduction in grid inertia makes quick demand reduction important. It takes most generation sources several seconds to "spin up" and support load transients. Loads on the other hand can be cut within a few hundred milliseconds. HVAC is nearly ideal for load cuts because they use a ton of power and small variations in temperature can be absorbed by thermal inertia of buildings and product being cooled.

There's a great document on this and how it affects the Texas grid. https://www.nrel.gov/docs/fy20osti/73856.pdf

People who willingly sign up for these programs and then complain about it are a classic "leopards ate my face" scenario. The same thing happened to users of Griddy during the huge grid failure earlier this year. Griddy exposed users to real-time spot energy pricing, which pegs at $9,000/MWh when there's more demand than available generation.

A side note, ironically the same document praising ERCOT's innovative handling of renewable sources says the US national east/west grids are unlikely to face the same problems until the 2040's. Basically, if Texas grid was connected to one of the national ones, these problems they're trying to solve with innovation would just go away for at least 20 years.


>> trying to solve with innovation would just go away for at least 20 years.

Interesting! But wait... wouldn't that mean everyone would save effort and potentially money by hooking up to the national grid? One that has way more people working on innovation + a 20 year time line for it?

Honestly that sounds way simpler, more straightforward, and literally more innovative than "let's all just turn our temps down when peaking" lol.

Then again Texas politics... :-|


I got an Aeron and I don't like it. Mesh chairs aren't as cushioned, Aeron included. You can easily feel the border between mesh and frame and it gets painful after sitting for hours.

I don't buy that ergonomic chairs are any better for your body; there doesn't seem to be any real research showing this is so. They're just less comfortable.

It's extremely durable though so I'll probably be living with this mistake for a long time :)

I miss my padded chairs but the padding always wears out


I left a bad review for one of these company's products on Amazon a while back because it caught fire when plugged in.

Somehow they got my email address and keep sending me vaguely threatening untraceable emails to "reconsider" my review in exchange for gift cards.

Super shady. Glad Amazon is finally doing something.


After one less than positive review, I found that Amazon gave them my home address. I stopped posting reviews after that.


Excursion is related to frequency: a sure way to blow most speakers up is to turn the bass to the max. Some speakers have bandpass filters; cheap ones usually don't.

