Hacker News

This is pretty clearly just a search engine with more parameters.

I thought there was something more going on with Copilot, but the fact that it regurgitates arbitrary code comments tells me there is zero semantic analysis of the code being pulled in.



It's more that the model is so large that it is capable of memorizing a lot. This can be seen in other language models, like GPT-3, as well.

Comments, I suspect, are more likely to be memorized: the model is trained to produce syntactically correct output, and a comment is always syntactically correct, so there is nothing to 'punish' a bad comment.


The model in this case is just a lossy compression of GitHub, and you search that.


It is decidedly not "just a search engine with more parameters." Language models are simply prone to repeating training examples verbatim when the prompt gives a strong signal. Arguably, in this case, that is the most correct continuation.
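To make the "strong signal" point concrete, here is a toy sketch (this is an assumption for illustration, not Copilot's actual architecture): a character-level n-gram model trained on one tiny snippet. Given a prompt it saw during training, greedy decoding reproduces the rest of the training text verbatim, comment included.

```python
from collections import defaultdict, Counter

# Hypothetical training "corpus": a single small snippet.
corpus = "def add(a, b):\n    # add two numbers\n    return a + b\n"

K = 8  # context length in characters
model = defaultdict(Counter)
for i in range(len(corpus) - K):
    # Count how often each character follows each K-character context.
    model[corpus[i:i + K]][corpus[i + K]] += 1

def complete(prompt, max_new=100):
    out = prompt
    for _ in range(max_new):
        ctx = out[-K:]
        if ctx not in model:
            break  # no training signal for this context
        # Greedy decoding: always emit the most frequent next character.
        out += model[ctx].most_common(1)[0][0]
    return out

# A prompt matching the training data regurgitates the snippet verbatim.
print(complete("def add(a, b):\n"))
```

The same mechanism, scaled up by many orders of magnitude, is why a sufficiently distinctive prompt can pull a memorized training example out of a large model.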


They openly claim it is an AI. What about the state of AI currently in use made you think that there was any intelligence behind it?


When do words lose their meaning? The word "intelligence" is right there in the name, and yet there is no intelligence in the concept as it exists today.




