Good points, but Postgres has all those, along with a much better local testing story, easier and more reliable CDC, better UDFs (in Python, Go, etc.), a huge ecosystem of extensions for e.g. GIS data, no licensing issues ever, API compatibility with DuckDB, Doris, and other DBs, and (this is the big one) is not Oracle.
Unless I’ve missed something, Postgres doesn’t have automatic index creation, nor does it have JSON introspection that automatically converts documents to a normalized schema (which is insane; I love it). It also doesn’t do any kind of sharding on its own, though of course extensions like Citus exist. It definitely doesn’t do RAC / Exadata (not sure which part this falls under), where multiple nodes are connected and use RDMA to treat a bunch of SSDs as local storage.
I love Postgres, and am not a huge fan of Oracle as a corporation, but I can’t deny that their RDBMS has some truly astounding capabilities.
RAC is a default part of any cloud Oracle DB, I think! I must admit I'm not an expert in all the different SKUs, so there might be some that aren't, but if you rent an Autonomous Database in the cloud you're probably running on Exadata/RAC. That's why the uptime is advertised as 99.95% even without a separate region.
I was being ambiguous. That answer explains how to create indexes manually, and notes that you get an index for primary keys and unique constraints automatically. Sure, all databases do that. Oracle can create arbitrary indexes for any relation in the background, without being asked, if it notices that common queries would benefit from them.
Forgetting to create indexes is one of the most common issues people face when writing database apps, because performance will be fine on your laptop, or while the feature is new, and then it slows down as you scale up. Or worse, you deploy to prod and the site tanks because a new query that "works fine on my machine" is dog slow with real-world amounts of data. Oracle will just fix it for you; Postgres requires manual diagnosis and intervention. So this isn't the same capability.
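To make the difference concrete, here's a rough sketch (table and column names are invented; the Oracle call is the documented DBMS_AUTO_INDEX interface from 19c onwards, but check your version):

-- Oracle: flip automatic indexing on once; from then on the DB creates,
-- validates, and drops indexes on its own based on the observed workload.
EXEC DBMS_AUTO_INDEX.CONFIGURE('AUTO_INDEX_MODE', 'IMPLEMENT');

-- Postgres: you diagnose and fix by hand. First spot the sequential scan...
EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = 42;
--   -> Seq Scan on orders ...   <- the tell

-- ...then create the index yourself (CONCURRENTLY avoids blocking writes).
CREATE INDEX CONCURRENTLY orders_customer_id_idx ON orders (customer_id);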
What that utility is doing is quite different. For one, it assumes you start with a schema already. Oracle can infer a schema from a collection of documents even if you don't have one by figuring out which fields are often repeated, which values are unique, etc.
For another, what you get after running that utility is relational tables that you have to then access relationally via normal SQL. What JSON duality views give you is something that still has the original document layout and access mode - you GET/PUT whole documents - and behind the scenes that's mapped to a schema and then through to the underlying SQL that would be required to update the tables that the DB generated for you. So you get the performance of normalized relations but you don't have to change your code.
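From memory, declaring one looks roughly like this in 23ai (table and field names invented, syntax approximate):

-- A duality view over a hypothetical accounts table. The app GETs/PUTs
-- whole JSON documents against account_dv; the DB translates each PUT
-- into the INSERT/UPDATE/DELETE needed on the underlying table.
CREATE JSON RELATIONAL DUALITY VIEW account_dv AS
  SELECT JSON {'_id'     : a.account_id,
               'owner'   : a.owner_name,
               'balance' : a.balance}
    FROM accounts a WITH INSERT UPDATE DELETE;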
The nice thing about this is it lets developers focus on application features and semantics in the early stages of a startup by just reshaping their JSON documents at will, whilst someone else focuses on improving performance and data rigor fully asynchronously. The app doesn't know how the data is stored, it just sees documents, and the database allows a smooth transition from one data model to another.
I don't think Postgres has anything like this. If it does, it'll be in the form of an obscure extension that cloud vendors won't let you use, because they don't want to (or can't) support every possible Postgres extension out there.
Great work on the chip, I’m really on board with the trusted computing aim!
Is there a way to bootstrap binary code into the ReRAM? I’m thinking of being able to ‘hand-type’ in a few-hundred-byte kernel rather than use a flashing tool.
The chip comes from the factory with a boot0/boot1 chain that is fully reproducible and buildable from source. Developers can replace boot1 with their own version, where you could add the feature you're thinking about.
Sure, for example it generated this small demo of the type and contract safeguards. As you can see, it's mostly "Forth but with things":
# bank.ax — The Safe Bank (Milestone 2)
#
# Demonstrates property-based verification of financial operations.
# Each function has PRE/POST contracts. VERIFY auto-generates random
# inputs, filters by PRE, runs the function, and checks POST holds.
# DEPOSIT: add amount to balance
# Stack: [amount, balance] (amount on top)
# PRE: amount > 0 AND balance >= 0
# POST: result >= 0
DEF deposit : int int -> int
PRE { OVER 0 GTE SWAP 0 GT AND }
ADD
POST DUP 0 GTE
END
# WITHDRAW: subtract amount from balance
# Stack: [amount, balance] (amount on top)
# PRE: amount > 0 AND balance >= amount
# POST: result >= 0
DEF withdraw : int int -> int
PRE { OVER OVER GTE SWAP 0 GT AND }
SUB
POST DUP 0 GTE
END
# Verify both functions — 500 random tests each
VERIFY deposit 500
VERIFY withdraw 500
# Prove both functions — mathematically, for ALL inputs
PROVE deposit
PROVE withdraw
# Demo: manual operations
1000 200 deposit SAY
1000 300 withdraw SAY
Running it outputs:
VERIFY deposit: OK — 500 tests passed (1056 skipped by PRE)
VERIFY withdraw: OK — 500 tests passed (1606 skipped by PRE)
PROVE deposit: PROVEN — POST holds for all inputs satisfying PRE
PROVE withdraw: PROVEN — POST holds for all inputs satisfying PRE
1200
700
APIs in a B2B style would likely be much more prevalent, there'd be less advertising (yay!), and less money flowing through the internet, so more like the original internet, I guess.
The original Macintosh had similar specs as well – 128K of RAM with a 68000 clocked at about 8 MHz. It helps that both platforms put a significant amount of OS code in ROM.
It was a massive shame the TV-toy project at Sinclair did not work out. It was an SoC/low-cost computer based on the Inmos transputer (called the T400, an M212 without dedicated link hardware), from around 1983. That might have kept Inmos afloat; they were responsible for a lot of RAM chip innovation, the VGA standard, the transputer, etc., so the world would have looked very different.
I do wonder what could have been with that chip paired with the Slipstream chip. Oh well.
Tip of the hat there; it’s a very selfless thing to commit to caregiving. From a 50,000 ft view, we have an aging demographic globally, and the bet seems to be robotics; hopefully they will get good enough to help meaningfully in this capacity. What happens to an economic system predicated on having more kids (GDP growth) is another concern.
We already have the ability to take care of people now. All it needs is for someone in power to give a fuck and set up a system and fund it. The suggestion that we do nothing for 30 years so we can leave our loved ones home with a robot caretaker is kind of fucking angering.
The robotics thing to replace caregivers misses the point that elderly people also want connection. Yeah, it might free up caregivers, but we will still have a loneliness epidemic. I think this is more related to the desire for progress, which is the backbone of modern life (you see it in politics, school, your family, etcetera). This, I believe, has been slowly replacing the social glue of societies, like religion, public space, play, chatting, etcetera.
https://youtu.be/nHsgdZFk22M?si=Yt_m0W6OozSm4TkW
It allows Minecraft to run on a 16 MHz Falcon.