Hacker News | streamofdigits's comments

if the hardware landscape settles for a while (in terms of the capabilities of devices and networks - which is not a crazy precondition given Moore's law has run its course) then it may be that these server/client pendulum oscillations get damped around an "optimum" of sorts.

An optimum defined both in terms of what it enables users to do with software but also how easy it is for developers to deliver it. Since simple is better than complex it feels that any architecture that does not require two distinct ecosystems might have an advantage (all else being equal).


If it was just a "scam" it would be easier to resolve this. Alas it is a complex and evolving mix of all sorts of factors:

* new software architectures based on cryptography and additional communication patterns (aka consensus) which may or may not have long term utility, but are "real"

* a loose "anti-systemic" ideology that is in part deserved but also part half-baked / misinformed / delusional. In any case it creates lots of talking points in societies that are mismanaged and "angry"

* buy-in and (partial) legitimization from frustrated parties (eg NGO's) that want to "solve the world's problems" and think blockchain might be the tool to fix humanity's faults

* defensive engagement by systemic parties (the same people that have brought society to its knees) that want to embrace and extinguish (just in case it might actually work)

* opportunistic engagement by intermediaries that could not care less what "asset" they make a market in, provided they get a cut

* opportunistic engagement by marketeers that could not care less what channel they use, provided it generates some exposure for their clients (NFT)

* greedy techies that feel left out from the tech oligopolies and found an alternative exploitative path to riches (print your own money)

* a younger generation that 1) is flush with cash, 2) has never experienced fleecing by smooth operators and so has no behavioral immunity, and 3) thinks this is all very "tech-savvy" and modern and thus kosher

* actual scammers, that ride on all the above


The largest collection of talks that never walked


Both BeeWare and Kivy are great projects addressing a space that could surely produce some very interesting applications. Offering seamless integration of the versatile python stack on mobile devices would be a game changer as far as I am concerned, turning the mobile into more of a computing device rather than merely the fastest click to the cloud.

They seem under-resourced, though, especially BeeWare. Targeting platforms that do not support python natively feels like a task that would challenge even much larger teams.


R is frequently compared with python and julia, which are general purpose programming languages, but it is not really a proper comparison. Once you approach R as a domain specific language / system, its various quirks and peculiarities become more palatable and explainable: they are in a sense the price to pay for tapping a large domain of statistical analysis expertise that is not available elsewhere.


This is mental gymnastics. People have some job to do and are looking for an appropriate tool for it; sometimes that’s R and other times it isn’t. Who cares if you call it a DSL or a general purpose language. If I want to do something and the language makes it difficult, telling myself “oh but it’s a DSL” doesn’t get me any closer to solving my problem.


> If I want to do something and the language makes it difficult, telling myself “oh but it’s a DSL” doesn’t get me any closer to solving my problem.

Unless the thing that makes the language difficult is your expectations. In that case, offering you an alternative mental model that helps you make better decisions when using the language does get you closer to solving your problem.


>makes it more difficult

Yes, sure, as long as you recognize that as a very subjective determination.

From the statistician's non-programmer POV, the syntax of R or any other language is similarly opaque. Learning one vs. another will present similar investments in time. From their perspective, R does not make things more difficult, and the fact that it's more of the lingua franca within the field has its own benefits.

The people I see complain about R are usually people that learned a different general purpose language first and find that, when work requires data analysis, they much prefer the GPL for working through the non-analytical portions of their work. (Especially with python, where pandas and numpy have made the less specialized tasks much easier.)


From a statistician's POV the R syntax is great. Here is the t test:

t.test(x, y = NULL, alternative = c("two.sided", "less", "greater"), mu = 0, paired = FALSE, var.equal = FALSE, conf.level = 0.95, …)

A statistician opens the vignette and already knows what all of these variables represent mathematically, and can begin producing analysis immediately.
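To make the point concrete for non-R readers: each of those arguments maps directly onto a term in the textbook formula. A minimal stdlib-Python sketch of the statistic behind the `var.equal = FALSE` default (Welch's two-sample t) could look like this; `welch_t` is a hypothetical name, not part of any library:

```python
from statistics import mean, variance

def welch_t(x, y, mu=0.0):
    """Two-sample t statistic with unequal variances, i.e. the
    var.equal = FALSE default of R's t.test. `mu` is the
    hypothesized difference in means, as in the R signature."""
    nx, ny = len(x), len(y)
    vx, vy = variance(x), variance(y)      # sample variances (n - 1 denominator)
    se2 = vx / nx + vy / ny                # squared standard error of the difference
    t = (mean(x) - mean(y) - mu) / se2 ** 0.5
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df

x = [5.1, 4.9, 5.4, 5.0, 5.3]
y = [4.6, 4.8, 4.5, 4.9, 4.7]
t, df = welch_t(x, y)
```

The point being that a statistician reading the R signature already knows this algebra, which is why the vignette alone is enough to start working.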


Yes, precisely. Very much not the pythonic way but that only matters if your prior background before R was python. If your background was SPSS then many of these would be drop downs or check boxes, and (IMO) it's superior to the SPSS scripting language as well.

Heck, my background before using R was python and SPSS and I still prefer R for precisely the example you gave: fine-grained control built in as above, specifying how to handle missing values etc.

I end up using python for large scale data prep.


It's important to keep this in mind, though, because R (or rather S) is primarily meant to be used interactively. A prof of mine used to start the R REPL and work entirely from there: he called an editor from the REPL, wrote source files from the REPL, etc. Once you see someone working with R like that, you start seeing R as what it is.

As beautiful as it is for interactive use, it really takes a lot of practice to write reliable R code that doesn't abort with some error now and then.


I think the point about interactivity is pretty well understood. Another comment in the thread pointed out how the majority of people who write R do it in RStudio and RStudio's defaults push an interactive workflow on the users (the nature of the work you do has a similar effect). So even for someone very new to the language it's pretty obvious.


In a strange way the adoption of JS SPA's has made the "thinking-not-required" backend frameworks like Rails and Django less "opinionated" and thus requiring more thinking.

I don't know about the Ruby/Rails ecosystem but in Django there are now quite a few different ways to deliver the "same" user functionality depending on how much one splits the load between front / backend, whether and how much it is structured around DRF etc.
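For illustration, here is a framework-agnostic sketch of that split (hypothetical names, deliberately no Django/DRF imports): the same "task list" feature delivered either as server-rendered HTML, as a classic Django template view would, or as JSON for a JS front end, as a DRF-style endpoint would:

```python
import json

# Hypothetical feature data; in Django this would come from the ORM.
TASKS = [{"id": 1, "title": "write docs", "done": False}]

def tasks_html(tasks):
    """Server-rendered flavor: the backend owns the markup
    (the role a Django template view plays)."""
    items = "".join(f"<li>{t['title']}</li>" for t in tasks)
    return f"<ul>{items}</ul>"

def tasks_json(tasks):
    """API flavor: the backend ships only data, the SPA owns the
    markup (the role a DRF serializer + view plays)."""
    return json.dumps({"results": tasks})
```

The "thinking" the parent comment mentions is choosing, per feature, which of these shapes the response takes, and how much state moves to the front end as a result.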


This seems to be the main new data point that is now evident for all to see.

I mean, it is sort of obvious to anybody not captured and with basic morals, but such is the allure of greed that for ages people were cynically and hypocritically pretending otherwise.


It may really be time to re-examine the text-only proposition (gemini style). There is this fear of missing out on visual content but I suspect this too could be handled.


As people commented already, meta / facebook will be an enormous cash cow for quite some time. This raises the interesting question of whether it could actually reinvent itself in some way.

Given its almost 100% concentration on a business model that is (thankfully) increasingly seen as a socially detrimental aberration, it would need to diversify into more conventional tech business models the way, e.g., Alphabet/Google is trying to do [0]

The problem is, of course, that honest tech business models are a well-occupied ecological niche, and in the absence of some regulatory or politically granted monopoly the competition tends to turn lethal.

They could launch a cloud business for example, with the unique selling point: we know best how to collect and monetize your data, so we know best how to protect it :-).

[0] I am dismissing the "metaverse" thingy as some sort of smoke and mirrors that seems to be necessary to provide cover for precisely the kind of news now being discussed


Imagine this dismal planet dotted with decrepit nuclear plants, even as the societies that are supposed to keep them safe collapse, either of their own accord or due to interminable warfare.

When deciding on the wisdom of adopting a technology that will shape our fate for the next century you don't condition on the best (or even expected) scenario, you condition on the worst plausible scenario.

For as long as we haven't solved our obvious and serious socio-economic pathologies, it is actually safer to keep burning carbon. At least that damage is in principle reversible.

