Hacker News | digdugdirk's comments

Your numbers seem a bit off on the second Trump term. Trevor Milton was on the hook for over half a billion dollars of restitution alone.

Thanks for the heads up on that. There's a lot of massaging/cross-checking that still needs to be done. Right now the numbers are based purely on how the sentence is described on the DOJ website.

https://www.justice.gov/pardon/clemency-grants-president-don...

Cmd-F "trevor milton". If the text in the sentence column doesn't say anything about a fine or restitution, the system isn't going to be able to figure that out.
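For a concrete sense of why the system misses those figures, here's a minimal sketch of that kind of heuristic. This is my own illustration, not the actual scraper's code: the keyword list and the dollar-amount regex are assumptions.

```python
import re

def extract_monetary_penalty(sentence_text: str):
    """Return dollar amounts mentioned alongside 'fine'/'restitution'
    in a sentence description, or [] if none are stated in the text."""
    if not re.search(r"\b(fine|restitution|forfeiture)\b", sentence_text, re.I):
        return []
    # Match figures like "$5,000" or "$676 million".
    return re.findall(r"\$[\d,]+(?:\.\d+)?(?:\s*(?:million|billion))?",
                      sentence_text, re.I)

# If the DOJ text only lists prison time, the heuristic finds nothing:
print(extract_monetary_penalty("84 months' imprisonment"))  # -> []
print(extract_monetary_penalty(
    "Fine of $5,000 and restitution of $676 million"))
```

So a restitution order that the sentence column never mentions simply can't show up in the totals.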

The numbers for prison time reduced are also technically incorrect. Ross Ulbricht, Rod Blagojevich, and many others had already served many years in prison, so technically we should not count that as time reduced.
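The correction amounts to simple arithmetic: only the unserved remainder of a sentence is actually forgiven. A sketch, with purely hypothetical figures for illustration:

```python
def time_reduced(sentence_years: float, served_years: float) -> float:
    """Time actually forgiven by a commutation/pardon: the sentence
    minus what was already served (never negative)."""
    return max(sentence_years - served_years, 0.0)

# Hypothetical example: a 14-year sentence with 8 years already served
# means only 6 years were actually forgiven, not 14.
print(time_reduced(14.0, 8.0))  # -> 6.0
```

Life sentences are the awkward case, since there's no finite number to subtract from.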


I would be careful about this one. While the overall impact (in the global/national aggregate sense) may not be massive, the effect on individual communities near these new hyperscale datacenters is far greater than most people on this site might think.

Look at the grok datacenter in Memphis for one example. The "move fast and break things" mentality in this arena isn't about code anymore; it's being applied to communities.


Let’s say we grant the grok example.

A) How many other datacenters with similar problems can you name?

B) How does this industry compare to every other one on Earth? Then consider the disproportionate hate this one gets relative to industries that are substantially worse.


I'm going to flip this around: I know about the grok data center because I am under the impression it is unusual in terms of approvals and pollution.

How many other datacenters with similar problems can you name? If it is indeed not unique, I would appreciate you pointing out some other examples of the same behavior from non-AI related datacenters.


It's important to look at the whole picture though - most of the issues you raised are political issues, either brought about by NIMBYism or a lack of infrastructure investment over the past few decades. Without this investment (which, yes, is more expensive than it would be if it weren't also providing additional benefits tailored for renewables) the grid would continue to deteriorate. So clearly there is some amount of this cost that would need to be invested regardless.

There's also subsidies for fossil fuels to consider [0]. I don't hold these figures as gospel, but there's inarguably a massive amount of money going to propping up the (wildly profitable and hugely destructive) industry that's causing most of your raised issues in the first place - either through reduced maintenance and infrastructure investment (gotta get those shareholder returns) or lobbying/public influence campaigns.

To be clear - I absolutely agree with most of your complaints. I just see them as issues caused/exacerbated by entrenched political players, and I think the benefits to our society of getting off our fossil fuel addiction are worth the costs of modernizing our infrastructure for the long haul.

[0] - https://www.theguardian.com/environment/2023/mar/09/fossil-f...


I'm certainly not saying we should stay on fossil fuels. My main point is that the money being spent on various renewables could have been spent on a giant nuclear expansion at lower cost and with far higher reliability.

I don't really agree with this 'declining grid' narrative the renewable lobby has pushed. Yes, there are upgrades to be done, etc. But peak UK electricity demand is down from ~65GW to ~45GW (which may change, but doesn't look likely to).

Nearly all of the cost on the grid is to do with renewables, not 'general upgrades'. We would not otherwise be building 10GW of HVDC from Scotland to England, or doing a drastic 275kV -> 400kV rerating and duplication in the middle of nowhere in Scotland.


Again, we're in agreement, and I yearn for a world with massive buildout of scalable nuclear power.

But what is this "renewable lobby", and how are they doing funding-wise against the various extremely well funded and deeply entrenched fossil fuel industry lobbying groups (and companies!) that have been pouring money into UK politics ever since they stopped setting UK foreign policy directly and overtly?

All I ask is that people take a step back and look at the whole picture, and then question whether their arguments benefit an industry that is responsible for so much damage and destruction (environmental, economic, human health, societal, political, etc.) or if their arguments benefit individuals and society as a whole.

Nuclear power built out in such a way as to achieve the original "too cheap to meter" goal would be a dream. But don't let perfect be the enemy of good and all that.


This seems tailored to the Claude web/chat interface. Does anyone have any experience or systems specific to Claude code?

I've been using Opencode alongside Claude, trying to use Opencode for as much easy/rote functionality as possible so I don't blow through my Claude context, but it's a pain in the rear. I'm sure someone on here has solved this for themselves, and I'd love to hear what people are doing in the "token efficiency" realm.


In the Claude Code realm I use GSD, and it manages all that while keeping token use low. It also writes memory to a file, so I can type /clear to reset the token count and it still remembers where I left off for the next piece of work. Check it out here:

https://github.com/gsd-build/get-shit-done


Can you explain this? I haven't heard of JEPA, and from a quick search it seems to be vision/robotics based?

It's a self-supervised learning architecture, and it's pretty much universal. The loss function runs on embeddings, plus some other smart architectural choices throughout. Worth diving into for a few hours; Yann LeCun gives some interesting talks about it.
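To make "the loss function runs on embeddings" concrete, here is a toy sketch of a JEPA-style objective. The linear "encoders" and "predictor" are stand-ins of my own invention for real deep networks; the point is only that the prediction error is scored in embedding space, not in pixel/input space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear stand-ins for deep networks.
W_ctx = rng.normal(size=(8, 4))   # context encoder
W_tgt = rng.normal(size=(8, 4))   # target encoder (in practice, often an EMA copy)
W_pred = rng.normal(size=(4, 4))  # predictor

def jepa_loss(x_context, x_target):
    """JEPA-style loss: predict the *embedding* of the target from the
    context, and measure the error between embeddings."""
    z_ctx = x_context @ W_ctx   # embed the context view
    z_tgt = x_target @ W_tgt    # embed the target view (stop-gradient in real training)
    z_hat = z_ctx @ W_pred      # predict the target embedding
    return float(np.mean((z_hat - z_tgt) ** 2))

x = rng.normal(size=(2, 8))   # two toy inputs
loss = jepa_loss(x, x)        # in real JEPA the two views are different crops/masks
print(loss)
```

Because nothing is ever reconstructed in input space, the same recipe applies to vision, audio, or any other modality, which is why it reads as "pretty much universal."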


Where are you getting this opinion from? Because the vast number of matriarchal hunter-gatherer societies throughout history would disagree with you. Those "woke" policies existed prior to the agricultural and industrial revolutions and were stamped out by the "mentally developed" societal and economic systems that we invented along the way.

I'm not sure where you're getting your "fundamental truths" from, but as the fields of sociology and economics don't actually have anything of the sort, it'd be worthwhile to expand your reading list and add some history in there for good measure.


Interesting. I tend to imagine the exact opposite: it seems as though the big players are realizing there's not much left to gain from training better models, so they're focusing on scaffolding to get their improvements. If that's the case, why wouldn't a company completely devoted to scaffolding do a better job at that goal? And more importantly, why would I lock myself into their system?


Because the model labs can RL against their own scaffold, which will be borderline unbeatable


It's inherently limited by its geometry kernel. Most "real" CAD suites use something like parasolid, usually with a bunch of extras slapped on top. Making a new one from scratch is a massive undertaking, but I'll remain forever hopeful that we get a new, modern, open-source kernel one of these days...


This isn't really true. The vast majority of problems are in the UI. The geometry kernel is limited, but it's good enough for an open source project. Compared to say OpenSCAD, Open CASCADE is leagues ahead.


I don't necessarily agree in this case - OCCT is more than capable for what FreeCAD is offering. Add to that the development trajectory of OCCT also seems to be really taking off recently (with the 8.0-RC, they've re-worked how all B-Spline algorithms work, with implications for all operations).


Not gonna lie, I just hope the rewrite-it-in-Rust community takes a stab at it at some point.


There are already at least two geometry kernels being written from scratch in Rust (see fornjot.app for one) --- the problem is the first parts are obvious/easy, so initial progress is rapid; then one hits the difficult/intractable parts and progress stalls, usually to be abandoned.

There are a couple of doctorates available for folks who are willing to research and publish in this space --- the commercial products are all holding their solutions as trade secrets in their code. Even then, though, the edge cases are increasingly difficult to solve in such a way as to not break what is already working, hence the commercial kernels having _very_ large teams working on them --- or at least that is my understanding from what Michael Gibson (former lead developer of Rhino 3D, current developer of Moment of Inspiration 3D) has written on the topic.


What's wrong with OCCT?


Commenting here in case you or someone else remembers what this is. I'm always on the lookout for practice resources I can recommend to CAD beginners.



Honestly? I've been playing with using LLMs specifically for that reason. I'm far more likely to make prototypes that I specifically intend to throw away during the development process.

I try out ideas that are intended to explore some small aspect of a concept, and just ask the LLM to generate the rest of whatever scaffold is needed to verify the part I'm interested in. Or I use an LLM to generate the roughest MVP prototype you could imagine, and start using it immediately to calibrate my initial intuition about the problem space. Eventually you get to the point where you've tried out your top 3-5 ideas for each different corner of your codebase, and you can really nail down your spec, and then it's off to the races building your "real" version.

I have a mechanical engineering background, so I'm quite used to the concept of destructive validation testing. As soon as I made that connection while exploring a new idea via claude code, it all started feeling much more natural. Now my coding process is far more similar to my old physical product design process than I'd ever imagined it could be.

