

Don't get me started on the Psion 5mx...

I still have it; last time I checked, it worked well.


Still not accepting Codeberg's moral stance.

Yes, gitea (and originally gogs) are released under permissive licenses, so it's legally allowed to fork them.

But taking a complete, working project with years of work behind it, rebranding it with a "good guys" attitude, and progressively erasing the name and history (the mention of the Gitea fork has been moved down the FAQ now) is not fair.

Edit: even worse, the word "fork" no longer appears in the FAQ. The entry is "Comparison with Gitea" now (the fork is mentioned on that page).


> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software…

https://github.com/go-gitea/gitea/blob/main/LICENSE

If you don't want your software used like that, don't choose this licence.

You can't post-hoc decide how people behave.


open source is all fun and games until they fork you

I mean, you built a base for others' OSS tooling. If you then reject the very notion of being a base, what do you expect?

This is already a crazy take on its own, why would a fork have to describe their relation to the parent project front and center? Both the Readme and the comparison page link to the origin blog post [1] that describes the lineage clearly.

But even if there were some "ethical reason" to do this, I don't think Gitea is the right project to play up as a victim. Their homepage [2] doesn't mention that Gitea itself is a fork either. Their Readme does, but is this so much better?

[1]: https://forgejo.org/2022-12-15-hello-forgejo/ [2]: https://about.gitea.com/


Imagine this applied to coding.

- Do you want to add that _cool_ feature users will love?

- Yes

...

Yes

You may end up with a software art piece.


I wouldn't say it hasn't improved. Security has improved considerably, and it's one of the main reasons to use a Mac.

However, there are too many bundled apps. I just wrote about this last week: https://medium.com/@hbbio/let-me-uninstall-spotlight-1fe64a3...


Been playing with agent strategies, vibe-coding this: https://github.com/hbbio/rc, which uses Helix by default.

I'm using it already (the granular branch), but it's far from stabilized...


Strange that they apparently raised $169M (really?) and the website looks like this. Don't get me wrong: plain HTML would do if executed perfectly, or you would expect something heavily designed. But script-kiddie vibe-coded seems off.

The idea is good though and could work.


Strange that they raised money at all with an idea like this.

It's a bad idea that can't work well. Not while the field is advancing the way it is.

Manufacturing silicon is a long pipeline - and in the world of AI, a one-year capability gap isn't something you can afford. You build a SOTA model into your chips, and by the time you get those chips, it's outperformed at its tasks by open-weights models half its size.

Now, if AI advances somehow ground to a screeching halt, with model upgrades coming out every 4 years, not every 4 months? Maybe it'll be viable. As is, it's a waste of silicon.


Poverty of imagination here; there are plenty of uses for this, and it's a prototype at this stage.


What uses, exactly?

The prototype is: silicon with a Llama 3.1 8B etched into it. Today's 4B models already outperform it.

A five-digit token rate is a major technical flex, but does anyone really need to run a very dumb model at this speed?

The only things that come to mind that could reap a benefit are asymmetric exotics like VLA action policies and voice stages for V2V models. Both are "small fast low-latency model backed by a large smart model", and both depend on model-to-model comms, which this doesn't demonstrate.

In a way, it's an I/O accelerator rather than an inference engine. At best.
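The "small fast model backed by a large smart model" split mentioned above can be sketched as a confidence-gated router. Everything here is stubbed and illustrative: `fast_model`, `smart_model`, and the threshold are assumptions for the sketch, not any real API.

```python
# Sketch of the two-tier pattern: a low-latency model answers when it is
# confident, and escalates to a slower, smarter model otherwise.

def fast_model(prompt: str) -> tuple[str, float]:
    """Stub for the fast on-chip model: returns (answer, confidence)."""
    if "weather" in prompt:
        return "sunny", 0.9        # easy query: answer confidently
    return "unsure", 0.2           # hard query: low confidence

def smart_model(prompt: str) -> str:
    """Stub for the slow but capable backing model."""
    return f"detailed answer to: {prompt}"

def answer(prompt: str, threshold: float = 0.5) -> str:
    reply, confidence = fast_model(prompt)
    if confidence >= threshold:
        return reply               # fast path: no large-model round trip
    return smart_model(prompt)     # slow path: escalate

print(answer("what's the weather?"))   # fast path
print(answer("prove this theorem"))    # escalates to the smart model
```

The key property is that the fast model's speed only pays off on the fast path; every escalation still costs a full round trip to the large model, which is why the model-to-model link matters.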


With LLMs this fast, you could imagine using them as any old function in programs.


You could always have. Assuming you have an API or a local model.

Which was always the killer assumption, and this changes little.


Even if this first generation is not useful, the learning and architecture decisions in this generation will be. You really can't think of any value to having a chip which can run LLMs at high speed and locally for 1/10 of the energy budget and (presumably) significantly lower cost than a GPU?

If you look at any development in computing, ASICs are the next step. It seems almost inevitable. Yes, it will always trail behind state of the art. But value will come quickly in a few generations.


Maybe they're betting on model improvements plateauing, and that having a fairly stabilized, capable model that is orders of magnitude faster than running on GPUs can be valuable in the future?


A much better source is https://go.dev/doc/go1.26


Now they've acquired Photomator along with Pixelmator, but it's still an independent subscription... not even included in this bundle. Maybe they just forgot.


Makes a lot of sense for SQLite to be written in C. It's a heavily optimized and debugged database implementation: Just look at btree.c with all its gotos :)

The only language that would make sense for a partial/progressive migration is Zig, in large part due to its compatibility with C. It isn't mentioned in the article, though.


Zig hasn't even had its first stable release yet, and projects written in it still break with new releases. Given SQLite's stance on boringness and maturity, it would make no sense for them to consider Zig yet.


SQLite is never gonna be rewritten by its creators in another language. A rewrite is highly doubtful considering SQLite's age and its support roadmap, which I think runs until 2060 or the mid-2050s based on what I've read.


> SQLite is never gonna be rewritten by its creators in another language.

Almost certainly correct. It is, however, being rewritten in Rust by other people: https://github.com/tursodatabase/turso. This is probably best thought of as a separate, compatible project rather than a true rewrite.


I'm biased, but when I see Discord as the only means of communication, it doesn't feel like a serious project. I wish more projects would rely on IRC/Matrix plus forums.


For better or worse, plenty of serious projects are using Discord for communication. It's not great, but IRC and Matrix have their own problems (IMO Zulip is the best of the bunch, but doesn't seem to be particularly widely adopted).


You're right. Just look at their branding. Such poor taste. I'm sure it's a scam.


The number of completely obscure and exotic platforms that have a C compiler, and the amount of tooling and analysis tools C has — I'd be surprised if anything comparable exists.


What would be a reason to bring Zig in?

For example, Rust has additional memory guarantees when compared to C.


Zig has better ergonomics than C while not being as complex as Rust.


Maybe DRH will see this and update the page!

