Yes, Gitea (and originally Gogs) is released under a permissive license, so forking it is legally allowed.
But forking a complete, working project with years of work behind it, rebranding it with a "good guys" attitude, and progressively erasing the name and history (the mention that it's a Gitea fork has now moved down the FAQ) is not fair.
Edit: even worse, the word "fork" no longer appears in the FAQ. The entry is titled "Comparison with Gitea" now (the fork is mentioned on that page).
> Permission is hereby granted, free of charge, to any person obtaining a copy
> of this software and associated documentation files (the "Software"), to deal
> in the Software without restriction, including without limitation the rights
> to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
> copies of the Software…
This is already a crazy take on its own: why would a fork have to describe its relation to the parent project front and center? Both the Readme and the comparison page link to the original blog post [1], which describes the lineage clearly.
But even if there were some "ethical reason" to do this, I don't think Gitea is the right project to hold up as a victim. Its homepage [2] doesn't mention that Gitea itself is a fork either. Its Readme does, but is that so much better?
Strange that they apparently raised $169M (really?) and the website looks like this. Don't get me wrong: plain HTML would be fine if done well, and something heavily designed would also be expected. But a script-kiddie, vibe-coded look seems off.
Strange that they raised money at all with an idea like this.
It's a bad idea that can't work well. Not while the field is advancing the way it is.
Manufacturing silicon is a long pipeline, and in the world of AI, a one-year capability gap isn't something you can afford. You build a SOTA model into your chips, and by the time you get those chips, it's outperformed at its tasks by open-weight models half its size.
Now, if AI advances somehow ground to a screeching halt, with model upgrades coming out every 4 years, not every 4 months? Maybe it'll be viable. As is, it's a waste of silicon.
The prototype is silicon with Llama 3.1 8B etched into it. Today's 4B models already outperform it.
A five-digit token rate is a major technical flex, but does anyone really need to run a very dumb model at this speed?
The only things that come to mind that could reap a benefit are asymmetric exotics like VLA action policies and voice stages for V2V models. Both are "small, fast, low-latency model backed by a large smart model" setups, and both depend on model-to-model comms, which this doesn't demonstrate.
In a way, it's an I/O accelerator rather than an inference engine. At best.
Even if this first generation is not useful, the learning and architecture decisions from it will be. You really can't think of any value in a chip that can run LLMs locally, at high speed, for 1/10 of the energy budget and at (presumably) significantly lower cost than a GPU?
If you look at any development in computing, ASICs are the next step. It seems almost inevitable. Yes, it will always trail behind state of the art. But value will come quickly in a few generations.
Maybe they're betting that model improvement will plateau, and that having a fairly stabilized, capable model running orders of magnitude faster than on GPUs will be valuable in the future?
Makes a lot of sense for SQLite to be written in C.
It's a heavily optimized and debugged database implementation: Just look at btree.c with all its gotos :)
The only language that would make sense for a partial/progressive migration is Zig, in large part due to its compatibility with C. It's not mentioned in the article, though.
Zig hasn't even had its first stable release yet, and projects written in it still break on new releases. Given SQLite's stance on boringness and maturity, it would make no sense for them to consider Zig yet.
SQLite is never gonna be rewritten by its creators in another language. A rewrite is highly doubtful considering the age of SQLite and its support roadmap, which I believe runs until 2060 or the mid-2050s, based on what I've read.
> SQLite is never gonna be rewritten by its creators in another language.
Almost certainly correct. It is, however, being rewritten in Rust by other people: https://github.com/tursodatabase/turso. That is probably best thought of as a separate, compatible project rather than a true rewrite.
I'm biased, but when I see Discord as the only means of communication, it doesn't make for a serious project. I wish more projects would rely on IRC/Matrix plus forums.
For better or worse, plenty of serious projects are using Discord for communication. It's not great, but IRC and Matrix have their own problems (IMO Zulip is the best of the bunch, but doesn't seem to be particularly widely adopted).
The number of completely obscure and exotic platforms that have a C compiler, and the amount of tooling and analysis tools C has: I'd be surprised if anything comparable exists.
https://en.wikipedia.org/wiki/Clive_Sinclair