How is this different than the Coinbase Shift card? I had a shift card and the fees on purchases made it prohibitively expensive so I stopped using it after a few purchases.
Shift just shut down, apparently. So that’s at least one difference. From their website:
> We hope you enjoyed using the Shift Card and truly thank you for your loyalty. We, unfortunately, will be retiring the program in April of this year. All Shift Cards will be officially deactivated on April 11, 2019.
Signal is not much of a PGP replacement. There's a lot that PGP can do that Signal can't: signing, encryption of large blobs at rest, and key management.
I use PGP as part of my backup solution, encrypting my backups at rest with an asymmetric key. I can't do that with Signal.
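For anyone curious what that looks like in practice, here's a minimal self-contained sketch using GnuPG; the key name, file names, and throwaway keyring are assumptions for illustration, not my actual setup. The point is that encryption needs only the public key, so the private key never has to be on the backup host.

```shell
# Throwaway keyring so this demo doesn't touch a real one (assumption for the demo).
export GNUPGHOME="$(mktemp -d)"

# Generate an unprotected demo key (hypothetical key ID 'backup@example.com').
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'backup@example.com' default default never

# Make a tiny stand-in "backup" and encrypt it to the public key.
# In a real setup, only the public key needs to live on the backup host.
echo 'precious data' > backup.txt
gpg --batch --yes --encrypt --recipient 'backup@example.com' \
    --output backup.txt.gpg backup.txt

# Restoring requires the private key:
gpg --batch --yes --pinentry-mode loopback --passphrase '' \
    --decrypt --output restored.txt backup.txt.gpg
cmp backup.txt restored.txt
```

In a real backup pipeline you'd pipe `tar` output straight into `gpg --encrypt` instead of writing a plaintext file first, but the asymmetric round trip is the same.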
Signal does a better job of practically everything PGP does with regard to message encryption. Yes, PGP is more useful for encrypting files or signing updates. I agree that it's too early to write off PGP for those applications. But people should use Signal instead of PGP for message encryption.
> But people should use Signal instead of PGP for message encryption.
I won't disagree; when your use case is in Signal's wheelhouse, by all means, use it.
But as someone who uses PGP regularly: the extent to which I could use Signal instead of PGP is limited to the occasional transfer of PII, passphrases, and private keys (and I couldn't use Signal even for those, since they're typically sent between GUI-less hosts). A very tiny fraction of my PGP usage.
"Signal does a better job of practically everything PGP does with regards to message encryption."
Except for endpoint security. The ultra-portable, self-contained implementations of PGP can run on countless configurations of desktop or embedded systems. Transport methods also vary if they're running messages or files through other apps. All sorts of hardening or isolation techniques can be applied. Remote attackers have a lot to look at when trying to break or bypass GPG for an arbitrary user. Whereas the vast majority of Signal use relies on one app on two OSes.
The extra security tech and obfuscation you can layer on PGP/GPG is still an advantage in its favor until the competition catches up.
> But people should use Signal instead of PGP for message encryption.
Signal the protocol or Signal the service? There does not appear to be a mature FOSS toolchain for the former that can replace gnupg and Thunderbird/Enigmail, and the latter is only available on Android and iOS smartphones.
Any chance you could elaborate on this? What about email is inimical to secret messages? Why are they less secret over email than through some other medium?
How would that work? Last I checked, Signal used Signal servers for key exchange (or whatever the equivalent in their lingo is). Is there any way to use Signal without relying on their servers?
Reminds me of something Seymour Cray (allegedly) once said: "Anyone can build a fast processor - the trick is to build a fast computer around it". Given the performance boost an SSD gives, I imagine there's some room left for improving overall system architecture without increasing the clock speed of the CPU or replacing the CPU at all.
Part of it is the balance of value between hardware and software companies. When hardware stagnates, software has to become increasingly complex, and software companies dominate.
If you only use your computer for web browsing, then sure.
But the consumer market is more than just this. Several popular "consumer" applications, such as gaming and photo/video editing, continue to benefit from increases in CPU performance, and single-core performance (what most people actually want when they ask for higher clock speeds) continues to be very relevant to this day. Not all workloads are easily parallelizable.
Work grows to fill the available time. "IT" style tasks, like reading/writing documents, spreadsheets, emails, etc. don't require anything like the resources they currently use, or those they used in 2000. They use(d) those resources since there was no reason not to.
Lots of (most?) software development typically charges ahead without much regard to resource usage, until performance becomes a problem; then things are optimised until performance is no longer a problem, and the charge resumes. This results in software with performance which is just about acceptable, regardless of what resources are available. It was the case in 2000, it is the case now, and it would be the case if we had 50GHz machines.
This is the case for tasks where the main bottleneck is 'has anybody bothered to implement this yet?'; I'd say your examples of gaming and video editing are tasks where performance is a major part of the bottleneck. Arguably, Trello didn't exist in the 90s because nobody had bothered to make it yet; Skyrim didn't exist in the 90s because the machines weren't up to it.
For ordinary people's desktop computing? Maybe, but there are a lot of people and businesses that can always use more CPU power. High-end CPUs will also eventually trickle down to lower-powered devices, and I still think modern Ultrabooks could be faster than they are.
Same in Chicago. If the metered taxi takes an inefficient route, Uber legally cannot do a fare adjustment. It's nearly impossible to complain to the City of Chicago if it happens, so you end up having to pay whatever the meter displays.
The only site I allow to send me notifications is dictionary.com because I like getting a word of the day. Other than that, getting an alert from blogs every time a new article is posted gets old really fast.
I'm hesitant to recommend this, but Flume is a wonderful full-client macOS app. (Hesitant because I don't know how it hasn't been shut down; I love it dearly and use it daily.)