Hacker News | cashsterling's comments

I too doubted, from the beginning, that neural networks would be the basis of AGI. As impressive and useful as LLMs are, they are still a long, long way from AGI.


Until a company can demonstrate a quantum volume of even just 2^16... their computer is just about worthless for any sort of half-real quantum computing.

I pay zero attention to any technical information and marketing speak coming from a quantum computing company until they demonstrate decent quantum volume.

Most companies' computers can't even hit 2^16, so they are finding ways to distract the market from the poor fidelity of their systems.


Until a company can demonstrate that a motor vehicle can be operated by a layperson, it's just about worthless for any sort of half-real transportation task.

I pay zero attention to any technical information and marketing speak coming from a motor vehicle company until they demonstrate decent improvements in operating their machines.

Most companies' motor vehicles have to be hand cranked and kick-started by a youthful person with good vitality who doesn't fear being run over by their motor vehicle.

-- Some guy in 1925 who worked in the horse and buggy industry...


The better analogy here, in my opinion, is until someone builds a car that actually drives forward...

A lot of these quantum computing papers are just loud engine revving.

But you are right... "zero attention" is hyperbole. I shouldn't have said "I pay zero attention"; I should have said, "I am underwhelmed and don't give a lot of credence to systems until they demonstrate the ability to perform high-fidelity calculations of reasonable complexity and depth."

I do think superconducting qubit approaches will continue to improve in fidelity and ability... there are just too many brilliant people working on these challenges to count them out.


The Ford Model T had been on the market for almost two decades by then. Over 15 million were sold, mostly to laypeople.

In contrast, no quantum computer today actually quantum computes. They just approximate quantum computing in specific scenarios and have yet to match, let alone exceed, the performance of regular computers.


The Model T entered production in 1908, and ordinary people being able to operate (though not necessarily afford) cars was already common before that.

It took only 30 years to go from Otto and Benz going "Look at this neat contraption" to "Mass produced, semi-affordable personal cars".


Quantum volume is a good metric, but it's kind of a one-dimensional take. Almost no interesting circuit requires all-to-all connectivity, and superconducting QCs are bad at all-to-all connected circuits, so we can have interesting NISQ experiments without a particularly large QV.


It is not a one-dimensional take... it is a stress test of qubit gate fidelity [across all qubits involved in the circuit], state preparation and measurement, lifetime (coherence), memory errors, etc.

Now I agree that there are other great stress tests of quantum computing systems... but most of the industry agreed several years ago that quantum volume was a great metric. As many companies' systems have been unable to hit decent QV, they have pivoted away from QV to other metrics... many of which are half baloney.
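For a rough sense of why everything gets stressed at once: here's a back-of-the-envelope sketch (my own crude depolarizing model, not the official QV protocol; the ~0.85 ideal heavy-output probability and the 2/3 pass threshold are standard, everything else, including ignoring SPAM and single-qubit errors, is a simplification) of how fast achievable QV collapses with two-qubit gate error:

```python
def max_quantum_volume(two_qubit_fidelity):
    """Estimate the largest passable QV under a crude depolarizing model.

    Assumptions (simplifications, not the official QV protocol):
    - a model circuit on n qubits has depth n, with n//2 two-qubit gates
      per layer, so ~n^2/2 two-qubit gates total
    - ideal heavy-output probability ~0.85, depolarized toward 0.5
    - pass threshold: heavy-output probability > 2/3
    - SPAM and single-qubit errors ignored (so this overestimates)
    """
    n = 1
    while True:
        gates = (n // 2) * n                      # ~n^2/2 two-qubit gates
        f_circuit = two_qubit_fidelity ** gates   # errors just multiply
        hop = 0.5 + (0.85 - 0.5) * f_circuit      # heavy-output probability
        if hop <= 2 / 3:
            return 2 ** (n - 1)                   # largest n that still passed
        n += 1

# e.g. a 99% two-qubit gate fidelity caps QV around 2^12 in this toy model
print(max_quantum_volume(0.99))
```

Even in this optimistic model, every extra factor-of-2 in QV demands quadratically more gates to survive, which is why QV punishes mediocre fidelity so hard.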


> fidelity [across all qubits involved in the circuit]

I don't see a scenario in which the fidelity of a two-qubit gate (2QG) between two far-away qubits matters. Stress tests should be somehow related to the real tasks the system is intended to solve.

In the case of quantum computers, the tasks are either NISQ circuits or fault-tolerant computation, and in both cases you can run them just fine without applying 2QG between far-away qubits, which would translate into a large number of swaps.

If you're interested in applying Haar-random unitaries, then surely QV is an amazing metric, and systems with all-to-all connectivity are your best shot (coincidentally, Quantinuum keeps publishing their quantum volume results). It's just not that interesting a task.
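To put a number on the swap-overhead point: a rough sketch (my own toy model, assuming a nearest-neighbor line topology, one SWAP = 3 CNOTs, and errors that simply multiply with no routing cleverness) of what routing a distant two-qubit gate costs:

```python
def routed_gate_fidelity(distance, cnot_fidelity):
    """Rough fidelity of a two-qubit gate between qubits `distance` sites
    apart on a nearest-neighbor line, assuming:
    - (distance - 1) SWAPs to bring the qubits adjacent
    - each SWAP decomposes into 3 CNOTs
    - errors simply multiply (no cancellation, no smarter routing)
    """
    swaps = max(distance - 1, 0)
    cnots = 3 * swaps + 1   # route the qubits together, then apply the gate
    return cnot_fidelity ** cnots

# neighboring qubits vs. qubits 10 sites apart, at 99.5% CNOT fidelity
print(routed_gate_fidelity(1, 0.995))
print(routed_gate_fidelity(10, 0.995))
```

Under these assumptions a single long-range gate on a line eats tens of CNOTs of error budget, which is exactly why QV (full of such gates) and NISQ circuits compiled for the native topology can tell very different stories.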


Modern high-end fabs have extremely expensive equipment and are highly automated... like they are so automated that people don't actually handle wafers... it is almost all robotic.

Thus, salaries and the cost of services do not factor into fab economics as heavily as you might think.

Data suggests that TSMC's per-wafer costs in Arizona are 10-30% higher than in Taiwan, and that Arizona fab is relatively new. Its economics will probably improve over time, narrowing the margin to 5-15%.

Looking towards the future, power costs and other global supply chain factors could very easily make TSMC's Arizona fab less expensive and more reliable to operate over time. For one, the US is completely energy independent... Taiwan is not.


Aren’t TSMC fabs in the US little islands of Taiwanese workers? How exactly is this domestic knowledge and manufacturing?


They are majority Taiwanese employees for now, but I’m sure they’re hiring Americans to grow and backfill. They just wanted to bootstrap the knowledge and wisdom, and over enough time that can be shared and spread among Americans as well.


Yeah, right... like they are just going to let that knowledge go.


Atlantic Quantum has some brilliant people and I'm sure they have some great ideas.

That said, I dislike the quantum computing craziness we're in where big claims are made without proof or data.

"We have the fastest clock speeds, lowest error rates, and most scalable architecture... no data, just take our word for it."

It will be interesting to see how they do in the DARPA QBI program.


I always try to communicate a strong sense of hope to my teenage kids. Many nations have an inverted population demographic, including the US (especially without immigration).

I think the US is "as little as 10 years" away from a significant skilled labor shortage.


The US is also "as little as 10 years" away from losing currency reserve status, defaulting or inflating away our debt... or hilariously, maybe even raising taxes on young people to reduce it. All of these outcomes mean Americans just become very poor.

People forget that our skilled labor force exists here because we have the resources to demand it. If we actually run this country into a ditch, as it looks like we're trying to do, there is no reason those jobs won't go to higher-demand countries like the Nordics, China, or even just Canada.

I think we'll be okay, but everyone is treating this outlook like it's a speed bump, when the consequences of wrecking your economy are a cascade of politically difficult problems that create incentives against investing in the future: see Argentina and Japan.


I think Ada is a great language, and it is completely possible that Ada will experience a resurgence in coming years... especially as LLMs are used more and more to generate software. Ada/SPARK can provide very robust guardrails for the correctness of 'AI'-generated code (so can Rust).


Yeah... my intro CS class was in C and Ada95 (I'm not a CS guy btw, just took the class). I actually preferred Ada over C... but continued to program in C for other classes because of compiler availability; I had to do all my Ada programming on SPARC workstations at school.

I personally think that AdaCore, and friends, missed an opportunity in the early 2000s to fully embrace open source... they left a big gap which Rust has filled nicely.

I still think Ada is a great programming language. When a question comes up along the lines of "what's the best programming language nobody's heard of?" or "what's the best programming language that is underused?", Ada is usually my first answer. I think other good answers include F#, <insert LISP flavor>, Odin, Prolog, Haxe, and Futhark (if you're into that sort of thing).


Nowadays Ada 202x is being discussed, and in a world where FOSS tool makers have trouble building a sustainable business and keep changing licenses, there are still 7 Ada vendors selling compilers.


And only AdaCore / GNAT will ever support Ada 202x. The language has left the legacy vendors behind.


Libre compilers do not impose restrictions on output.


In the embedded world, restrictions on the output are the least of one's worries, while fulfilling various certification requirements can be a big and costly headache.

And from the vendor's point of view, releasing a compiler under a libre license allows competitors to undercut you on the R&D behind compiler and tooling certification. So from a business point of view it just makes no sense. This is very different from contributing to, say, Clang, where the cost of maintaining your own closed fork outweighs any disadvantages of contributing.


As long as there are enough people around to actually work on them.


I really hope the best for Oxide and applaud their compensation model.

I applied to one of their roles, which required writing about 10 pages of text to answer all their questions... a big ask, I think, but I did it because "why not".

They took over 3 months to get back to me, but at least they got back to me (with an apology and a polite "no").


I never bothered applying because they explicitly said they didn't want candidates who didn't finish college.


Where did we say that? My general understanding of our hiring practices is that we do not have minimum education requirements.


No, it was more like dropping out of college is a red flag.


Hm, well I am not sure about that either.


> The completion of a formal education is much more important than the institution

From https://rfd.shared.oxide.computer/rfd/0003 -- which was top of mind since I started looking at the postings after reading the article. ;-)


It's not saying that doing so is a requirement, it's saying that, if you have a degree, the fact that you have one is more important than where it is from.

I dropped out of school, had terrible grades, and ended up going back and finishing, but still got like a 2.something GPA. The subject of school never came up. When I review applicants, I don't rate them poorly if they didn't get a degree.


That's totally fair! I just sympathize with gp as the quoted section and the sentence that emphasizes "completion" from the last paragraph are easy to misinterpret as "you should have finished a four-year degree".

e: hopefully clearer, but tired and still commenting on the internet, so who knows. =)


That section is comedic gold. It reads like a character from Silicon Valley opining on hiring "only the finest." Poe's Law hilarity.


Looks coherent and well considered to me. Oxide can't really get away with hiring friendly-but-incompetent. I liked this section from the written part, which looks worth doing as a self-reflection exercise:

> What work have you found most challenging in your career and why?

> What work have you done that you are particularly proud of and why?

> When have you been happiest in your professional career and why?

> When have you been unhappiest in your professional career and why?


>Looks coherent and well considered to me.

These are some of the blandest HR takes I've read.

Besides, I was referencing the hilarious credentialism, where a degree isn't required wink wink, but they wouldn't entertain you if you don't have one.


They require a lot just to apply, and they didn't get back to you until three months later? That's unacceptable.


Three months is long. We are upfront that the process takes a while. Part of the reason is that while we ask a lot of people, we also then give a lot: reviewing everyone's materials takes a significant amount of time. But the goal is usually half of that, six weeks or so.


100% agree with you!

I have worked in US manufacturing and manufacturing R&D for most of my career: pharmaceutical, microelectronics, materials, aerospace, etc. The US is awesome at manufacturing when we want to be.

One problem is that "modern MBA/business philosophy" views manufacturing and manufacturing employees as a cost center and there is so much emphasis on maximizing gross margin to increase shareholder value.

So business leaders scrutinize the hell out of anything that increases the cost of their cost centers:

- Employee training & development? To hell with that.

- Increasing pay to retain good employees in manufacturing? Why? Isn't everything mostly automated?

- Manufacturing technology development? Not unless you can show a clear and massive net present value on the investment... and, even then, the answer is still no for no good reason. I have pitched internal manufacturing development investments where we conservatively estimated ~50% internal rate of return and the projects still didn't get funded.
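To make the ~50% IRR claim concrete, here's a small sketch with made-up numbers (the cashflows are hypothetical; IRR is just the discount rate at which NPV hits zero, found here by bisection):

```python
def npv(rate, cashflows):
    """Net present value of yearly cashflows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=0.0, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection: the rate where NPV = 0.
    Assumes one sign change in the cashflows (NPV decreasing in rate)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical tooling project: $1M up front, $600k/yr savings for 5 years
flows = [-1_000_000] + [600_000] * 5
print(round(irr(flows), 3))
```

With numbers like these, a ~50% IRR means the project pays for itself in under two years, which is the kind of return most CFOs would jump at anywhere but a "cost center".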

There is also a belief that outsourcing is easy, and business people are often horrible at predicting and assessing its total cost. I have been on teams doing "insource vs. outsource" trade studies, and the range of costs and risks that MBA decision-makers don't think about in these situations really surprised me initially... but now I'm used to it.

Anyhow... the US (and Europe for that matter) can absolutely increase manufacturing. It is not "difficult"... but it would be a slow process. I think it is important to differentiate between difficulty and speed.


You could simply make taxes scale inversely with the number of employees. Make the tax scale with a lack of career path. Even more tax if you don't have a system to measure and reward performance. More tax for lack of R&D. They don't have to be huge amounts, just enough for the MBA to stfu.
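One way the parent's idea could be parameterized; every threshold and rate below is made up purely for illustration:

```python
def adjusted_tax_rate(base_rate, employees, has_career_paths,
                      has_perf_rewards, rd_spend_pct):
    """Toy version of the proposal above: nudge the corporate rate with
    small, behavior-linked surcharges. All numbers are hypothetical."""
    rate = base_rate
    rate -= min(employees / 10_000, 0.02)         # headcount relief, capped at 2 pts
    if not has_career_paths:
        rate += 0.01                              # no career paths: +1 pt
    if not has_perf_rewards:
        rate += 0.01                              # no performance system: +1 pt
    rate += max(0.0, 0.03 - rd_spend_pct) / 3     # mild R&D-shortfall surcharge
    return round(rate, 4)

# 21% base rate, 500 employees, no career paths, no perf system, 1% R&D spend
print(adjusted_tax_rate(0.21, 500, False, False, 0.01))
```

The point of keeping the adjustments to a point or two each, as the comment says, is that they only need to be large enough to show up in the spreadsheet the MBA is optimizing.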


I graduated in ChemE in southern California in 1999 when there was a major downturn in the job market. One or two big chemical engineering design firms closed their SoCal offices flooding the market with qualified chemE's, aerospace companies were consolidating, etc.

Some of my school colleagues got good jobs at refineries and whatnot... but they were the fortunate ones. It took me 12 months to land my first "I made it" engineering job with a good salary. In the interim, I worked hourly jobs making between $13 and $18 an hour.

Don't let the current job market deflate you. You are young, intelligent, and you have a degree from MIT... you are going to be fine.

