Well I think it's like, you come to enjoy an author's particular style of writing, and if people are just going to use ChatGPT to write things for them, then they're not going to develop any style. Everyone might end up sounding the same.
With a calculator this is a feature. We want computations to be the same, after all: everyone should get the same results when they enter the same numbers. But that homogeneity doesn't belong in writing.
What if people use them to try to learn a style? If someone is trying to write in a more active voice, or avoid gendered language, then are they suddenly not worth reading anymore?
I think it's worth pointing out that plenty of prose editors suggested words to remove from your sentences (iA Writer, for example) long before LLMs were a thing. If someone followed one of those tools' suggestions, would they no longer be worth reading? What about the green squiggle under grammar errors before that?
Part of the appeal of ChatGPT compared to traditional writing assistants and grammar checkers is that you can tell it to write in different personas, although IME it's still pretty spotty at this. Lots of conversations turn bland and repetitive quickly if you don't get lucky.
I'm not too worried about "default GPT style" becoming common, though, because I think it's more likely to be used by the people who have no style beyond "what I see on TV and in my family." Raising the floor, basically.
Anyone who wants their writing to stand out will still have to differentiate themselves. To put it another way: you're gonna be able to recognize the lazy users pretty quickly cause they're gonna have "GPT voice."