Hacker News | chromacity's comments

I find the production and consumption of AI music to be uniquely... anti-human. You can make utilitarian arguments for most other uses of AI. For example, the code you're generating didn't exist before, and it would take serious time or money to write it. So, I get it, the economic argument is compelling enough.

But music? There's basically an inexhaustible supply of human-created tracks that can be accessed for next to nothing. Millions upon millions of them, in every conceivable style, for every conceivable mood. There's nothing you gain by listening to AI music day-to-day, so what's the argument for it - other than utmost indifference to human creativity?


But what if you like to listen to a specific genre? Say, electro-swing (https://en.wikipedia.org/wiki/Electro_swing)?

There isn't that much good electro-swing made by humans, and not much new coming out. One can easily consume it all and want to hear some new tunes in that genre, and maybe AI can help with that.


If you’re really listening to AI electro-swing then… I just have no words.

Neither does the electro-swing, probably

Many people just play music as background noise. Having a bland, generic, vanilla AI music playlist is a bonus.

I think that's probably the crux of the conflict here. There was a time in my life when I was definitely much more emotionally invested in the music I listened to. I thought I'd definitely kill myself if I ever went deaf. But these days, I really just have it on as background noise when I'm working, exercising, or doing chores. And it's all just electronic stuff – I don't like vocals (unless they're sufficiently unintelligible that they don't become a distraction to my thinking). At the end of the day, it's just some beats to me. AI or not.

I can recommend spending some free time really listening to music again: Beethoven, Hendrix, Gorillaz, Slayer, Sub Focus, whatever floats your boat. Your brain is wired to remember and sing along to music around a campfire, and will pump you full of exquisite drugs if you really give in to it, ideally together with other people. It alleviates stress and makes you happy.

Music demoted to mere background noise is unrelated to the social concept of music, which is so ingrained in our nature that none of us can escape it. And that to me is also why I agree with OP: AI-generated music is fundamentally a betrayal of our species.


I do object to the seven distributable cents of a Spotify subscription going to a lab instead of to Taylor.

(Assuming the lab didn't license anything fairly.)


Sometimes you can't even tell. I was on an Uber ride where the driver had this incredible playlist of Brazilian bossa nova. It was sublime – some of the best tracks I'd ever heard. He even said he loved the singer but couldn't find their name anywhere. It turned out to be a YouTube playlist that is fully AI-generated, and genuinely some of the best bossa nova you can imagine. I still listen to that playlist daily, tbh. Moreover, imagine you're an independent musician with a good voice who knows how to play instruments... you could ask AI to generate hit tracks for you, then play them at concerts or shows and claim them as your own.

What's the playlist? Curious as a bossa enjoyer.

>But music? There's basically an inexhaustible supply of human-created tracks that can be accessed for next to nothing.

Isn’t this an argument against all new music, even human made?

Either we have it all already, or there’s room for new things that we haven’t heard before.


[flagged]


This is what we call in the business a "fever dream"

> As far as I'm concerned we're content scarce and I don't care what makes the music - humans, robots, netherworld demons - I just want good music.

Presumably you've already listened to every piece of music ever recorded? Otherwise, it seems it would be more efficient to do that first than to wait for AI to generate it and chance upon it.


All good finds are chanced upon. Just now sometimes it's made by AI.

[flagged]


You’re not a machine. I’m also tired of hearing that ontological take spouted by AI enthusiasts.

I think humans are machines, they are just vastly more advanced than any machine invented by humans. This is something I thought long before the current AI hype cycle.

What do you think are some important differences between machines and humans?


Is there a difference in how you treat machines vs how you treat humans?

If you're just a machine, can I unplug you?

The same way you can unplug a laptop, I guess?

Oh that's what you're banging on about. You think AI is like a demon, or you think LLMs are people too, something like that, hence "I don't care what makes the music". That would otherwise be a spooky and implausible phrase that says something strange about what gives music quality, as if quality in music is something ethereal and mathematical and objective and detached from the human condition, and detached from artists. But if you think the AI counts as a person too then it seems less cold and abstract.

Are you really suggesting quality in music isn’t largely mathematical?

Is formulaic pop music produced by a corporate label that's designed to push all the right buttons more "human" than the average track you find published on Suno? I wouldn't say so. Pop music was already to some extent a commodity.

Actually, it is more human, because there are humans involved at each level. Doesn't matter if you think the music sucks, it's definitionally more human than AI music.

It is sort of a blend now. Beats and rhythm tracks are often generated. Vocals are auto-tuned. There's still some humanity in it, but it's not what it used to be.

AI music is generated from the result of training on far more human-made music than any human could ever consume in their lifetime, so there are even more humans involved in its creation.

Just like AI comments are more human than any human could ever produce... /s

There's a difference between entertainment and thoughtful content.

> Pop music was already to some extent a commodity.

The commodification of humanity predates human history. It may be a negative trend that alienates us from each other and from the products of our labor, but it is truly ancient.


> Pop music was already to some extent a commodity.

And as everyone knows, some commodification of some thing means we must go ahead and totally commodify all the things.


Also, a lot of the people who hate and resist AI slop also hate and resist corpo slop, we're just outnumbered.

That's disingenuous. The point is that "human" isn't a particularly good dividing line if you want to distinguish music with value vs music without.

Used Suno to reimagine a handful of my old demos late last year, and honestly the results floored me. I could never release those tracks though, purely out of shame. But it seems pretty practical to study the AI remixes to understand what I like about them, and use these as a practice tool for music production.

> what's the argument for it

Record companies can sell it and don't have to pay any royalties. They only pay the artists pennies as it is, but that's too much for them.


Electronic music exists but has limited commercial scope because most people don't see the point of music if they can't form an emotional connection with the artist through the music. Popular music has an intense focus on the artist.

AI "music" has the same issues as electronic music but worse: because it's trying to imitate humans rather than be its own thing like electronic music, it's not only emotionally unavailable but also creepy. Can you imagine listening to an AI "musician" laughing, for instance? It makes my skin crawl even thinking about it.


> because most people don't see the point of music if they can't form an emotional connection with the artist through the music

Strong disagree on "most"; most people listen to music simply because it sounds good. For that, AI serves the purpose very well.


That's a dangerous game to play, though—the only value record companies have is their intellectual property, especially if they are no longer financing recording new material. Convincing people to listen to slop is a great way to completely obsolete yourself.

Not only that, but music generated by AI is not copyrightable. If it's truly 100% AI generated, you can redistribute it to your heart's content without infringement. (IANAL)

Someone will surely attempt some kind of end-run around this, perhaps through ToS alterations at the service you obtain the music from, but it's undoubtedly a problem for the labels. In the meantime they have a strong incentive to keep human creativity in the loop.

To me the anti-AI crowd is looking at this through the wrong lens, it's now possible to generate an infinite library of music that isn't copyrighted, and can be freely shared, some of which is quite good. There is a pathway all the way from conception to mass distribution that doesn't require the major labels. Whatever else happens that seems like a silver lining at least.


> it's now possible to generate an infinite library of music that isn't copyrighted, and can be freely shared, some of which is quite good.

Many YouTube channels started using AI music because they were getting sick of copyright strikes, and I agree some of it is actually very good.


they def pay artists more than pennies on the dollar

Artists complaining about not making enough is like programmers complaining their 7-star repo on GitHub isn't making them enough on Ko-fi.

I mean my github is like that but I wouldn't expect to live off it unless I was Evan You


> There's basically an inexhaustible supply of human-created tracks that can be accessed for next to nothing

You train an AI on that, in order to create something that combines all of the best parts that you want. If anything, I think AI music is the natural progression of innate human desire to leverage and "stand on the shoulders of giants" to create something bigger from smaller pieces.


I guess using AI is just the logical continuation of what mainstream pop already did before that: reduce music to the lowest common denominator so it can appeal to as many listeners as possible. AI only speeds up that process.

While I don't like AI music, "Millions upon millions of them, in every conceivable style, for every conceivable mood" is something that's not true. There is very often a gap, which forces me to open up Ableton and make edits.

If you consider, say, elevator music – music that's just there to fill space rather than to be listened to – then I don't think there's much difference between using AI to produce it and using AI to produce clip art or boilerplate code.

Music as wallpaper vs music as artistic paintings.

We are fine with mass-producing wallpaper with machines. People buy this every day, no problem.

We are not fine with mass-producing framed paintings that are "art".

Both hang on the wall as decoration. Essentially the same purpose. But we have very different feelings about them and hold them to very different standards.

Music is the same. We have muzak - background music that isn't supposed to be listened to, it's just wallpaper. I don't think many people object to this being machine-made in bulk. And then we have music that is art and is supposed to be listened to explicitly. We hold this to a higher standard and expect it to be the product of human creative urges.



Relevant Basquiat quote:

“Art is how we decorate space, music is how we decorate time.”


I have the sudden urge to frame some wallpaper.

> We are not fine with mass-producing framed paintings that are "art".

Uhh... Cheap, basically AI generated art for home decor definitely exists.

> And then we have music that is art and is supposed to be listened to explicitly

Just like how most people are not sommeliers, most people just listen to pop music "slop"


> We are not fine with mass-producing framed paintings that are "art".

China is full of factories where exactly this is being done and people are fine with this.

https://news.ycombinator.com/item?id=15742507


Well, code and visual art are more differentiated, so the thing you need probably doesn't exist, and it would take effort and money to procure it. Not always, but often enough to make rational people default to AI.

With music... if there's a style you like, no matter how eclectic, there are probably thousands of matching human-recorded tracks you can listen to today.


Because human singers will usually sing about what they like. They will use their own life experience and imagination to write and sing songs. Other people may or may not like them.

AI will only sing songs that other people like, so AI singers will naturally attract more listeners.


AI will only sing songs that someone wanted sung, and that someone might not be a particularly good singer at all.

You've hit upon a bit of a paradox inherent in music: the average listener gives next to no shits about human creativity or the artistry and hard work that goes into being a musician capable of releasing music. They can't even comprehend it, so they don't. Music is something that comes out of a speaker, same as water is something that comes out of a tap.

You can repeat your argument with photos, poems, code??, and just about anything else that humans produce.

Not that you're wrong, but human creativity doesn't mean what it used to.


It's changed the way I DJ... I can be much more expressive.

It's not that people want to listen to AI music, per se. According to the article, this artist charting was part of an April Fools' gag. It's about ego, or maybe hubris. People think their idea for a record is good, but they don't want to learn musical composition. Instead, they put blind faith in AI generation. Gen AI is more for the idea men unwilling to put in the effort than for the consumers.

It isn't indifference, it's obliviousness. My mother keeps listening to AI music, and I'll be like "why are you listening to this slop," and she'll argue back that it isn't AI, that it's actually really very good and I'm just jealous, as the synthetic voice continues to warble nonsense like a fucking arcade machine full of snakes in the background.

It's an even more uncomfortable truth: your average Joe cannot tell the difference between human-made music and AI generations, just as they also really think that that 8-year-old African boy with a huge beard and no hands built a helicopter out of old bottles, or that that cat walked into a hairdresser wearing a suit and had its whiskers curled.

So there’s no argument for it apart from “people will buy the product because they can’t tell that it isn’t real”.


> I find the production and consumption of AI music to be uniquely... anti-human.

I mean, I'm a professional musician - not sure if that gives me more credibility or less - but I don't feel slighted by folks listening to music made by others (whether those others are other humans, or birds, or whales, or AI).

As you point out, music has an infinite edge; one can spend a lifetime exploring either its niches or its closures and still have an infinity of each to continue discovering.

As moat identification goes, I do feel slightly secure in the sense that AI music (and the information age generally) seems to stoke a hunger for dirty traditionals played well on thick steel strings, and it's going to be a minute before robots can pick 'em like we can.


Have you heard of dubstep? It sounds like robots falling down stairs, and humans made it and love it. If AI can make music less crappy, I'm all for it.

Eh. It doesn't start or stop with people like Altman, Zuckerberg, or Nadella. I think it's a symptom of a broader problem in tech. Half the people on this site made a decision to work at companies that do shady things, and they did that to maximize personal wealth.

The difference isn't that the average techie doesn't dream of making a billion by any means necessary; it's that most of us don't think we have a shot, so we stick to enabling lesser evils to retire with mere millions in the bank.


I don't think it's all that hard to avoid working on anything shady. It's not as easy to avoid being associated with anything shady due to widespread cynicism and a tendency to treat tech companies with thousands of projects as a monolith.

> The difference isn't that the average techie doesn't dream of making a billion by any means necessary

That's actually the difference, most people don't want a billion


> The difference isn't that the average techie doesn't dream of making a billion by any means necessary

I hope that's not true. If it is, we live in a bleak world indeed.

I can confidently say I've never once dreamed of having billions. I've never wanted billions. Not even in a fanciful manner. What would I do with that money? Buy mansions and megayachts? That's loser stuff

Most of what I want out of life cannot be bought. The pieces that come with a price tag, like a comfortable home, do not require billions

I think only sociopaths want billions because they don't understand spending your life seeking things that actually matter, like family and human connection


There are three possible paths that sort of substantiate current valuations:

1) Business: LLMs become essential to every company, and you become rich by selling the best enterprise tools to everyone.

2) Consumer: LLMs cannibalize search and a good chunk of the internet, so people end up interacting with your AI assistant instead of opening any websites. You start serving ads and take Google's lunch.

3) Superhuman AGI: you beat everyone else to the punch to build a life form superior to humans, this doesn't end up in a disaster, and you then steal underpants, ???, profit.

Anthropic is clearly betting on #1. Google decided to beat everyone else to #2, and they can probably do it better and more cheaply than others because of their existing infra and the way they're plugged into people's digital lives. And OpenAI... I guess banked on #3 and this is perhaps looking less certain now?


Indeed. Gotta keep that body in a tip-top shape so that we can pull off all-nighters at some dude's AI startup while eating pizza and pretzels.

I'm increasingly unsure if this is something to aspire to. I make an effort to only follow people I know, and I turn off algorithmic feeds on social media, but it doesn't matter because the people I know routinely reshare made-up political bait and AI slop that's coming from the broader ecosystem.

This sucks and there's no way to push back on it. First, if you do it too much, you're just a "reply guy" - you become part of the same suckiness of social media that you're trying to push back against. Second, the near-universal reaction you get is "maybe these specific immigrants were not eating pets, but you gotta agree with my broader concern about immigration". This is just an example; the reaction has its equivalent on all sides of the political spectrum. We just like to read stuff that aligns with our political identity and beliefs. The pursuit of truth is a distant second.

I think that for social networks or forums to be at least somewhat healthy, they need to be small, specifically to limit the interactions you have with complete strangers and content that doesn't interest you at all. If you open up the ecosystem too much, it devolves into some flavor of Facebook.


I have rarely been in a group chat that suffers from the same problems I see on all the other social networks (Google+ and its "Circles" seemed promising while it lasted...), but it could be because leaving a single chat is easier than leaving an entire network, and the group is well-defined. Federation is good, but it's not enough on its own. If I think back far enough, I do remember email chain letters, with people forwarding everything to everyone in their Eudora address book:

> Now for 180. Forward stupid chain letters to as many people as you can.

> Remember: Be annoying whenever possible

https://patorjk.com/misc/chainletters/179waystoannoypeople.h...

---

It seems to me that if a given social media network is not an effective way for you to connect with someone, then try something else. Expecting one platform to handle all our social connections is unreasonable. Some people live on Discord and others prefer a phone call, etc. A world with everyone on IRC would be convenient, but probably also a nightmare once someone figured out how to make money off of it.


Yeah, I agree that small social networks are better. But some people are just bad at using social media – even if they're great people in other ways – so they share AI slop and made-up political bait posts. You may have to curate your feed a bit.

> Everyone who depends on the good graces of a cloud provider for something (not just Google, but Amazon, Microsoft, Apple, whatever) needs to at the very least, take a moment, and figure out what their plan is when they are suddenly banned and locked out permanently, without any way to contact the company.

This is one of the most common sentiments I hear expressed on HN, second only to "if you're not building your software business around Claude Code, you're gonna get left behind".


It probably has to do with the fact that we condition children and adolescents to consider white-collar jobs as more noble than blue-collar jobs, then we tell them that to get a good white-collar job, they need a degree... and then we make STEM degrees hard by subjecting students to more math than most people realistically need. So we have a lot of frontend developers who know calculus and an oversupply of people with humanities degrees.

With that degree, you're generally pushed toward jobs in journalism, publishing, graphic design, teaching, administrative functions, and so on. Most of these pay relatively little.


Calculus is required for English degrees in other countries. Heck, a lot of countries require some amount of calculus just to graduate high school.

Same goes for the basics of statistics. A basic understanding of statistics is a requirement for any college degree in many countries, and for good reason. Stats comes up all the damn time: from proper A/B testing, to marketing, to understanding public health emergencies, to making informed medical decisions.
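To make the A/B-testing point concrete, here's a minimal sketch of the classic two-proportion z-test using only the standard library. The function name and the conversion numbers are made up for illustration, not from any real product.

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for two observed conversion counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided normal tail
    return z, p_value

# Variant B converts 120/1000 vs A's 100/1000. Looks like a 20% lift,
# but the test says it's not significant at the usual 0.05 level.
z, p = ab_test_z(100, 1000, 120, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

This is exactly the kind of intuition (effect size vs. sample size) that a basic stats course buys you.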


I understand the value of statistics. But calculus? I say this as someone who took 6 semesters of calculus in college.

6 semesters seems like... a lot? IIRC getting a math undergrad at my Uni didn't require that many classes of calc.

I think calc 1 and 2 are extremely valuable. The concept of rate of change is fundamental to so many things in life, and understanding "area under the curve" is essential to understanding how many ideas are communicated, including lots of graphs in physics, chemistry, and economics.

Beyond that I feel calculus starts getting into specific applications and is less generally applicable to the populace at large.
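Both ideas fit in a few lines of code, which is maybe the quickest way to see why they're so general. A toy numeric sketch (the function names and numbers are made up): differentiate position to get a rate of change, then integrate it back as area under the curve.

```python
def derivative(f, x, h=1e-6):
    """Rate of change of f at x via central difference."""
    return (f(x + h) - f(x - h)) / (2 * h)

def trapezoid(f, a, b, n=10_000):
    """Area under f on [a, b] via the trapezoid rule."""
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

position = lambda t: t ** 2                   # metres after t seconds
velocity = lambda t: derivative(position, t)  # rate of change of position

# Area under v(t) from 0 to 3 recovers the distance travelled: ~9 metres.
print(trapezoid(velocity, 0, 3))
```

Swap `position` for revenue, drug concentration, or a probability density and the same two operations keep showing up.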


6 quarters, not 6 semesters!

Decades later, I wish I had more linear algebra.


Publishing : standard English major career track :: Gaming : standard CS major career track.

It's not much more complicated than that.


I don't think it's a matter of being more "noble" – it's simply a more comfortable option if it's available to you. It has historically paid better and taken a lower toll on your body. The former is now less true, but the latter is still a big factor.

It's a shame that calculus isn't required for every college degree. Just because I'm not integrating functions in my normal work doesn't mean I don't benefit from understanding the fundamental principles.

Yes, totally. I was about to undergo surgery but found out the doctor didn't even know about Laplace transforms. He small-mindedly spent his formative years learning anatomy, never benefiting from the knowledge of frequency-domain derivatives. I dodged that bullet by storming out.

You joke, but if you talked to a doctor of radiology, odds are they at least took a class covering Fourier transforms.

Would you say the same about learning Christianity? Maybe not directly useful for your job, but it is rather foundational to much of English society.

Yeah! I've found that learning the foundations of religions is a great way to inoculate people against the worst aspects of those ideas.

The number of people with humanities degrees who also could successfully obtain a rigorous CS or engineering degree is not very large.

I suggest you revisit your hypothesis with a little less bias.


The reverse is also true.

My current hypothesis is that as AI forces software development down less and less deterministic pathways, I suspect that the value of a basic CS degree will diminish relative to humanities training. Comfort with ambiguity, an ability to construct a workable "theory of mind", and to construct unambiguous natural-language prompts will become more relevant than grokking standard algorithms.


The reverse most certainly is not true, and even if it were it wouldn't matter.

Humanities advocates have been hoping for the demise of valuable STEM degrees for at least the last 30 years. It's not happening, for many reasons, among them: all the skills you listed are also taught in an engineering or rigorous CS curriculum, plus those degrees provide validation that the individual is intelligent and determined enough to complete coursework that most people cannot.


I dunno, man. The difficulty (and resentment of having to even take them) most STEM majors had in my college-level writing classes causes me to doubt that, as does the general reaction on this board to any kind of problem / domain with irreducible ambiguity. But look, I'm not talking about the top ~10%, or whatever: the really smart kids can adapt to whatever gets thrown at them[0]. I'm doubtful that a 50th-percentile or below CS degree / student will retain the value that they've recently had - and given what I read on here about the present job market for new grads, that's maybe already happening.

Anyway, if I had to pick one, my money'd be on philosophy degrees rising in value: they're already sought out by financial firms. Have you seen the sort of analytical / symbolic reasoning they do?

[0] In fact, in case you didn't know, rigorous humanities programs and research involve an awful lot of statistics and coding, even though the dinosaurs that run the MLA and most English departments aren't able to handle it.


> I dunno, man. The difficulty (and resentment of having to even take them) most STEM majors had in my college-level writing classes causes me to doubt that, as does the general reaction on this board to any kind of problem / domain with irreducible ambiguity.

I don't think most STEM majors would be outstanding English Literature (or whatever humanities program you prefer) majors, but I do think they could manage to obtain a degree. Very, very few humanities majors could get an engineering degree.

And yes, the writing classes they force engineers to take are largely pointless and not enjoyable. Everyone with a degree got through them though, and I have to imagine the percentage of STEM students who washed out on that and not organic chemistry, compiler design, differential equations, etc. is extremely small (it was 0 out of the hundreds of people I knew at my school).

> But look, I'm not talking about the top ~10%, or whatever: the really smart kids can adapt to whatever gets thrown at them[0].

Sure. Very few of these kids are going into publishing, because they'll have more lucrative options and will pursue them.

> I'm doubtful that a 50th-percentile or below CS degree / student will retain the value that they've recently had - and given what I read on here about the present job market for new grads on here, that's maybe already happening.

That may be, but they're still in better shape than a 50% percentile humanities degree holder, who also is having the value of their skillset eroded by AI.

> Anyway, I had to pick one, my money'd be on philosophy degrees rising in value: they're already sought out by financial firms. Have you seen the sort of analytical / symbolic reasoning they do?

Lol, they are not "sought out" in any sense of the word. Philosophy majors at top tier schools are sought out because everyone at the school is sought out, not because they majored in philosophy.

And yes, I took a number of philosophy classes in college as an undergrad because they were easy (have you seen the analytical/symbolic reasoning required of EE or CS majors? It's a lot more difficult than what is required of philosophy majors).


> [50th percentile CS grads] are still in better shape than a 50% percentile humanities degree holder, who also is having the value of their skillset eroded by AI.

That's the crux of it, and right now it appears to me that the ability to write unambiguous natural language prompts - in a variety of contexts, not specifically heavy-duty dev work - is going to be increasingly valuable. The 50th percentile english / philosophy grad is better at that than the 50th percentile CS major - while, at the same time, the bottom rungs of the developer ladder appear to have been kicked out.

I'm trying very hard not to make this into a "who's smarter?" question. That's a well-trodden and pointless argument, particularly if money is going to be the measuring stick. Besides, if that's where we're going, the finance bros and C-suite win, and do either of us think they're the geniuses in the room?

But, we'll see. We're living in Interesting Times.


It's a meaningless, feel-good rule. Every country has countless carve-outs. To give you a trivial example: in the US, you can't get a passport if you owe more than $2,500 in child support.

As of 2015 (FAST Act), your passport will be revoked if you owe more than $66,000 in unpaid taxes.

Whilst I agree, to be fair, a passport is usually only needed when entering a country, not leaving one, right? Under the cited rule, the US needs to allow you to leave, not help you in entering some other country.

I have yet to leave a country (well, a state technically) without having to show a passport - with the exception of the Schengen area.

That's mostly because transport companies have to pay to ship you back if you get turned away at the border, so they will want to see your permission to enter your destination country before you leave. I've traveled internationally a fair bit and I've never had to show my passport to government officials when leaving the US.

Don't the TSA count as government agents? I don't have a problem with these checks, but I do believe the TSA does them, no?

TSA needs some form of ID but they’ll accept non-passport ID even if you’re traveling internationally.

TSA doesn't even need an ID, if you don't have one they just take your information and check it against some databases to "confirm" your identity.

Would they do that for an international departure? They know where you’re flying, and I’d think they’d just tell you to stop being an idiot and show them the passport you obviously must have. But policies can be weird, so maybe not.

Ah, that's right. But don't airlines check passports then? I vaguely remember needing to provide a passport at boarding time.

Yes, that's what I said above. The US government doesn't give a toss, but the airline has to fly you back if you're refused entry at your destination, so they will do their best to ensure you have the documents you need.

I can drive to Canada with my driver license.

I mean, really not trying to frame this in any way, but asylum seekers do it all the time.

Ok, fair enough, but if I were German, I don't really think I would seek asylum anywhere on the basis of Germany maybe intending to conscript me in the future.

I'm reasonably sure Russia would take you.

I rather doubt it, but - can you back that up by some examples at least?

You generally do present your passport when leaving. Most places you get an exit stamp (which matches your entry stamp). They usually confirm things such as not overstaying a visa.

ex:

overstaying in Thailand results in an on-the-spot fine

China lately has exit checks when traveling to SEA (they try to intercept people traveling to scam centers)


It is quite difficult to leave a country without simultaneously entering another

It is trivial for any country that is not land-locked. You just have to sail to international waters. What is difficult is to stay there.

> Delve doesn’t even have the ability to claim anything they’ve done as original, unless you count fraud as a service.

I'd wager there's some prior art...


I think the simple reason why small web / webring sites don't work is that if you're in the mood of "let me pull the handle on the internet slot machine and see what surprises me today", then social media does a better job. Without fail, it gives you something to be outraged about or impressed with.

And if you're looking for something specific - "I want to learn category theory" - then you don't visit a small web site because the content you're looking for is probably not on any woefully short, hand-curated list of URLs. So you do a normal web search (or ask your chatbot).

Another problem with web rings is that if you're hopping sites at random, you more often than not end up someplace weird in 3-5 hops. I guess it's the internet version of six degrees of separation: you're always at most six clicks away from neo-Nazis or SEO spammers.

