Phones catch fire and men tend to stick them in the front pant pocket. But yes, true: so far I haven't gotten into smart jewelry. I was pretty into jewelry for a while but never rings because of the cold finger situation. Necklaces can be problematic too. I chipped my front tooth slightly when I jumped wearing a big crystal and smacked myself in the face with it.
Totally agree, completely different skillset. Every engineer I've seen "promoted" as such becomes miserable, and frankly is not very good at their new role, effectively making it a double loss.
Honestly, I'm pretty good at it, but yes, quite miserable, particularly now, in this market. With hiring very slow, companies know people are trapped.
I always say "own the output". No need to do it by hand, but you'd better damn well research _why_ the AI chose a solution, what the alternatives are and why it didn't pick them, how it works, and so on. Ask the AI, ask a separate agent/model, Google for it, I don't care, but "I don't know, the LLM told me" is not acceptable.
Unless your company is investing in actually teaching your junior devs, this isn't really all that different from the days when jr devs just copied and pasted something out of Stack Overflow, or blindly copied entire class files around just to change one line in what could otherwise have been a shared method. And if your company is actually investing time and resources into teaching your junior devs, then whether they're copying and pasting from Stack Overflow, from another file in the project, or from AI doesn't really matter.
In my experience it is the very rare junior dev that can learn what's good or bad about a given design on their own. Either they needed to be paired with a sr dev to look at things and explain why they might not want to do something a given way, or they needed to wind up having to fix the mess they made when their code breaks something. AI doesn't change that.
It's easier than ever to be lazy. Hard to blame them, because the temptation to deliver and prove oneself as a junior is always high.
I can't count how many seniors have forgotten what it means to understand the code they're merging since AI coding tools became popular. So long as businesses only value quantity the odds are stacked against juniors.
For me, the hardest part of software development was learning incantations. These aren't interesting; they're conventions you have to learn to get stuff to work. AI makes this process easier.
If people use AI to generate code they don't understand, that will bite them. But it's an incredible tool for explaining code and teaching you boring, rote incantations.
> Indeed, it's relatively impossible without ties to real world identity.
I don't think that's true? The goal of vouch isn't to say "@linus_torvalds is Linus Torvalds", it's to say "@linus_torvalds is a legitimate contributor and not an AI slopper/spammer". It's not vouching for their real world identity, or that they're a good person, or that they'll never add malware to their repositories. It's just vouching for the most basic level of "when this person puts out a PR it's not AI slop".
NUL in the middle of a string is fine, types have no meaning, VARCHAR limits are just suggestions…
The flexible typing is the biggest WTF to me, especially because it necessitates insane affinity rules[0]. For example, you can declare that a column is of type “CHARINT” (or “VARCHARINT”, for that matter), and while that will match the rule for TEXT affinity (contains the string “CHAR”), it also matches the rule for INTEGER affinity (contains the string “INT”), and since that rule matches first, the column is given INTEGER affinity. "FLOATING POINT" maps to INTEGER since it ends in "INT", and "STRING" maps to NUMERIC since it doesn't match anything else.
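Those affinity rules are easy to demonstrate with Python's stdlib sqlite3 module (a minimal sketch; the table and column names are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Declared types only pick an affinity:
#   CHARINT        -> contains "INT", so INTEGER affinity (the "CHAR" rule never fires)
#   FLOATING POINT -> "POINT" contains "INT", so INTEGER affinity, not REAL
#   STRING         -> matches no rule, so NUMERIC affinity
con.execute("CREATE TABLE demo (a CHARINT, b FLOATING POINT, c STRING)")

# Insert the same text value into all three columns.
con.execute("INSERT INTO demo VALUES ('10', '10', '10')")

print(con.execute("SELECT typeof(a), typeof(b), typeof(c) FROM demo").fetchone())
# ('integer', 'integer', 'integer') - every text value was coerced on insert
```

Note that not one of the three columns kept the text it was given.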
Then there are the comparison rules (same link): NULL sorts below everything else, then INTEGER and REAL (compared numerically with each other) < TEXT < BLOB - but those may be altered at comparison time due to type conversion. Hex values as strings get coerced to 0 as INTEGER, but only if they're in the SQL text, not if they're stored in a table. Finally, no conversion takes place for ORDER BY operations.
This is particularly galling considering that most of sqlite3's output modes (this one is `markdown`) don't visually differentiate between string-typed and numeric-typed values - I manually added the quotes on rows (by PK) 2 and 4 to assist the explanation.
sqlite> CREATE TABLE foobar (id INTEGER NOT NULL PRIMARY KEY, b BLOB NOT NULL);
sqlite> INSERT INTO foobar (b) VALUES (10), ('10'), (0xA), ('0xA');
sqlite> SELECT id, b, 15 > b, '15' > b, 0xF > b, '0xF' > b FROM foobar ORDER BY b;
| id | b | 15 > b | '15' > b | 0xF > b | '0xF' > b |
|----|-------|--------|----------|---------|-----------|
| 1 | 10 | 1 | 1 | 1 | 1 |
| 3 | 10 | 1 | 1 | 1 | 1 |
| 4 | '0xA' | 0 | 1 | 0 | 1 |
| 2 | '10' | 0 | 1 | 0 | 0 |
SQLite is great, if and only if you use STRICT mode (and enable FK checks, if applicable). Otherwise, best of luck.
Yes, the problem is that many corporate resources cannot differentiate their roles from that of a glorified search engine. In fact, some experts on the human mind cannot effectively differentiate the human experience from that of a glorified search engine.
Some companies definitely do just exist on marketing. Some clothing brands are objectively overpriced crap and pure wealth signalling. Or something like a Juicero.
But I agree Apple doesn't, even though they've gone in a direction I couldn't follow them in.
Not really. They back it up with "good enough tech" that looks pretty and sucks people in with marketing, and then locks them into a closed ecosystem. Admittedly, some of their tech is actually very good (e.g. M-series ARM-based CPUs), but much of it is nothing special, or worse, just copying something else that competitors have been doing for years, presenting it as brand-new, and claiming credit for it.
They did this with the always-on screens for phones. My LGs had this many, many years ago. It was so bad that when Apple finally brought it out and acted like they had invented it, coworkers saw my LG and asked if I had gotten the latest iPhone, and I had to point out that it was a 5-year-old LG.
And then there's other stuff that Apple has which is just plain bad, but they present as new and wonderful, such as the "island" keyboard.
I'm planning to eventually launch an open source platform with the same name (peerweb.com) that I hope will be vastly more usable, with a distributed anti-abuse protocol, automatic asset distribution prioritization for highly-requested files, streaming UGC APIs (e.g. start uploading a video and immediately get a working sharable link before upload completion), proper integration with site URLs (no ugly uuids etc. visible or required in your site URLs), and adjustable latency thresholds to failover to normal CDNs whenever peers take too long to respond.
I put the project on hiatus years ago but I'm starting it back up soon! My project is not vibe coded and has thus far been manually architected with a deep consideration for both user and site owner expectations in the web ecosystem.
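That last feature, failing over to a normal CDN when peers exceed a latency threshold, can be sketched roughly like this (all names, URLs, and timings here are hypothetical; a real implementation would race actual peer connections rather than a sleep):

```python
import asyncio

async def fetch_from_peers(url: str) -> bytes:
    # Hypothetical stand-in for a P2P swarm lookup; simulated as a slow peer.
    await asyncio.sleep(0.2)
    return b"peer-data"

async def fetch_from_cdn(url: str) -> bytes:
    # Hypothetical stand-in for an ordinary CDN request.
    return b"cdn-data"

async def fetch(url: str, peer_timeout: float = 0.05) -> bytes:
    # Try the swarm first, but fail over to the CDN once the
    # adjustable latency threshold is exceeded.
    try:
        return await asyncio.wait_for(fetch_from_peers(url), timeout=peer_timeout)
    except asyncio.TimeoutError:
        return await fetch_from_cdn(url)

result = asyncio.run(fetch("https://example.com/asset.png"))
print(result.decode())  # cdn-data: the 200 ms "peer" missed the 50 ms threshold
```

The point of the threshold being adjustable is that a site owner can trade bandwidth cost against worst-case latency per asset.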
Well this is supposed to load a website in the browser like a "normal" website (doesn't work for me, stuck on "Connecting to peers...").
Just using a torrent client means that you have to download the website locally with a torrent client, and then open it in your browser. Most people wouldn't do that.
If it actually worked, I could certainly see the value prop of not making users download a separate program. Generally, downloading a separate program is a pretty big ask.