My software development skillset has improved. I’m learning and stress testing new patterns that would have taken far longer pre-AI. I’m also working in new domains and tech stacks that would have taken me much longer to get up to speed on.
“Cryptic” exit posts are basically noise. If we are going to evaluate vendors, it should be on observable behavior and track record: model capability on your workloads, reliability, security posture, pricing, and support. Any major lab will have employees with strong opinions on the way out. That is not evidence by itself.
We recently had an employee leave our team, posting an extensive LinkedIn essay "exposing" the company and alleging a whole host of wrongdoing; it went somewhat viral. The reality is that she just wasn't very good at her job and was fired after failing to improve on a performance plan. We all knew she was slacking and, despite liking her on a personal level, knew she wasn't right for what is a relatively high-functioning team. It was shocking to see some of the outright lies in that post, which effectively stemmed from bitterness at being let go.
The 'boy (or girl) who cried wolf' isn't just a story. It's a lesson for both the person, and the village who hears them.
Same thing happened to us. A C-level executive and I were personally attacked. It feels really bad when someone you genuinely tried to help fit in, but who just couldn't despite your wanting them to succeed, comes around and accuses you of things that clearly aren't true. HR eventually got them to remove the "review," but now there's a lingering worry about what the team really thinks, and whether they would do the same in some future layoff (we've never had one; the person just wasn't very good).
Thankfully it's been a while, but we had a similar situation at a previous job. There's absolutely no upside for the company or any (ex-)team members to weigh in unless the claims are absolutely egregious, so you're only ever going to get one side of the story.
Cultural obituaries are often premature, and the one for literacy is no exception. A nascent contrary impulse is emerging: readers deliberately turning to long-form works as a form of intellectual resistance. I’ve been working through Norman Lewis’s Word Power Made Easy and Tom Heehler’s The Well-Spoken Thesaurus, not just to expand vocabulary but to restore the sinew of productive speech.
That project led me to conscript AI as a private tutor. With custom instructions, ChatGPT and Gemini now surface new words and nudge my prose toward clarity, turning a vague fear of erosion into conviction. A dedicated subset of users will inevitably harness such tools to strengthen their expressive range and communicative precision.
Until recently, my writing rarely left emails and journals. Now, with AI as scaffold and sparring partner, I draft short stories from my own life and recast them in the voices of authors I admire. This feels less like a technology poised to supplant teachers, and more like the substrate for a renaissance in autodidactic education.
There are several strategies companies can employ. One common approach is to raise an extension or bridge round. Many startups are adopting this method, with estimates indicating that approximately 40% of current funding rounds fall into this category.
In these cases, companies raise funds at the same valuation as their previous round, often labeled "Series A+," "Series C+," or "Series B Extension."
Another, less common strategy involves using a SAFE (Simple Agreement for Future Equity), which will convert to equity during the next priced round.
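To make the SAFE mechanics concrete, here is a minimal sketch of the standard conversion arithmetic: a SAFE with a valuation cap and a discount typically converts at whichever implied share price is lower. All numbers, the function name, and its parameters are hypothetical illustrations, not figures from any actual round.

```python
def safe_conversion_shares(investment, valuation_cap, discount,
                           round_price, company_shares):
    """Shares issued to a SAFE holder when the next priced round closes.

    The SAFE converts at the lower of:
      - the price implied by the valuation cap, and
      - the new round's price reduced by the discount.
    All inputs are hypothetical example values.
    """
    cap_price = valuation_cap / company_shares        # price implied by the cap
    discounted_price = round_price * (1 - discount)   # price after the discount
    conversion_price = min(cap_price, discounted_price)
    return investment / conversion_price

# Example: a $500k SAFE with a $10M cap and 20% discount, where the next
# priced round comes in at $2.00/share on 10M existing shares.
shares = safe_conversion_shares(
    investment=500_000,
    valuation_cap=10_000_000,
    discount=0.20,
    round_price=2.00,
    company_shares=10_000_000,
)
# cap_price = $1.00, discounted_price = $1.60, so the SAFE converts at $1.00
print(round(shares))  # 500000
```

In this example the cap binds (the $1.00 cap price beats the $1.60 discounted price), so the SAFE holder ends up with more shares than a same-size investor buying at the round price.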