Hacker News | bufo's comments

Were you using pre-compiled headers before?


Grateful that this one supports Windows out of the box.


Brave.


Ah yes, the crypto shill browser. I'd rather take the thousand buggy stabs of Firefox than get down to that level.


The plan is to have many, many Mechazillas.


Because Go has massive traction both inside and outside of Google, whereas Dart/Flutter never got big traction.


Dart only found a really good use case fairly recently. Given its explosion in usage since then, I think it may very well be more popular than Go in several years.


Curious- what is the use case?


I think they mean Flutter, and Dart's truly cross-platform abilities.


The RTX 4090 has about the same BF16 Tensor Core TOPs as the A100. Assuming 50% MFU (like the A100 40 GB PCIe), it would take 8x longer on 1 RTX 4090 than on 8x A100 80GB SXM, so 12 hours. Datasheet with the TOPs figures: https://images.nvidia.com/aem-dam/Solutions/geforce/ada/nvid... 50% MFU should be achievable on the 4090.
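The scaling arithmetic above can be sketched as follows. Note the assumptions: the 8x A100 baseline run is taken to be ~1.5 hours (implied by "8x longer ... so 12 hours"), and per-GPU BF16 throughput at comparable MFU is taken as roughly 1:1, neither of which is stated as a measured number.

```python
# Back-of-the-envelope scaling (hypothetical numbers from the comment above):
# - assume the 8x A100 80GB SXM run took ~1.5 hours of wall-clock time
# - assume one RTX 4090 matches one A100 in BF16 Tensor Core TOPs at ~50% MFU

a100_run_hours = 1.5   # assumed baseline on 8x A100
num_a100s = 8
perf_ratio = 1.0       # 4090 BF16 throughput / A100 BF16 throughput

# With equal per-GPU throughput, one 4090 takes num_a100s times as long.
hours_on_one_4090 = a100_run_hours * num_a100s / perf_ratio
print(hours_on_one_4090)  # 12.0
```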


I recommend Phind.com; it's been much better and faster for me than Perplexity Pro. I typically use their custom 70B model, but you can also use GPT-4o, GPT-4 Turbo, or Claude 3 Opus.


It takes a while to take down job posts. Everyone likely learned of the decision only recently, and I don't think the employees who are about to be laid off care about updating the job posts at the moment…


“Vice website is shutting down” is 100% accurate. Note that they will also lay off most of their staff.


Yup. The parent comment disambiguates between Vice the website and Vice generally. Only the website is shutting down; Vice the org will still be around.


Was Google a chip designer before the first TPU?


Yes. Google had a number of chip products before that. Some made it to A1 silicon and worked. Just because they don't advertise it doesn't make it not so.


> Yes. Google had a number of chip products before that.

Is that true? I can't find anything suggesting it is. In fact, the little I can find suggests you are incorrect. I'll link them for the sake of referencing sources but they're both pretty awful ad-ridden sites...

A 2016 Tech Radar interview [0] with Norm Jouppi has him quoted as saying:

> [The] Tensor Processing Unit (TPU) is our first custom accelerator ASIC [application-specific integrated circuit] for machine learning [ML], and it fits in the same footprint as a hard drive.

And a 2023 Tom's Hardware post [1] begins:

> Google has made significant progress in its endeavor to develop its own data center chips, according to a new report. The Information says that a key milestone has just been reached, which means that Google can plan to roll out server systems powered by the new chips starting from 2025. This is not the first processor that Google has successfully put through R&D - the company has previously made an ASIC for servers and an SoC for mobile devices. The search giant started using its internally developed Tensor Processing Unit (TPU) as far back as 2015.

[0]: https://www.techradar.com/news/computing-components/processo...

[1]: https://www.tomshardware.com/news/google-reaches-self-develo...


I guess it depends on what you are defining as a chip and what you are defining as "Google" -- as in, if they have contractors design and build to their needs, does that count?

1/ https://www.wired.com/2012/03/google-microsoft-network-gear/

2/ I believe they had a few custom chips designed for the youtube workloads that predate the TPU.

I remember that in 2010 there was a building in Mountain View that focused on custom chips.

