kewp's comments | Hacker News

Tried to put it into my Svelte project. The build failed. Went onto GitHub to report it, but a reproduction URL is required ... so I'm not submitting a bug ... and I'm looking for something else to use ...


Hello kewp,

We appreciate your feedback.

Honestly, we've just shipped our first release and we will be creating guides for the framework integration soon.

https://flyonui.com/docs/getting-started/framework-guides/

However, you can mention localhost in the URL field if you're testing on your local machine, or you can share a StackBlitz URL with the setup you tried.


What is RAG?


Retrieval-Augmented Generation. A fancy way of saying: retrieve chunks from your document corpus that are similar to your input, using a similarity measure (usually cosine) between the embedding vectors of the chunks and of the input, then inject those relevant chunks into your prompt to the LLM. Useful for document intelligence.
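The retrieval step described above can be sketched in a few lines. This is a toy illustration, not a real pipeline: the `embed` function here is a stand-in bag-of-words embedding (a real system would use a learned embedding model), and the chunks and query are invented.

```python
import math

# Stand-in embedding: bag-of-words counts over a fixed vocabulary.
# A real RAG system would call a learned embedding model instead.
def embed(text, vocab):
    words = text.lower().split()
    return [words.count(w) for w in vocab]

# Cosine similarity between two vectors.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

chunks = [
    "the cat sat on the mat",
    "superconductors carry current with zero resistance",
    "svelte compiles components to efficient javascript",
]
vocab = sorted({w for c in chunks for w in c.lower().split()})

query = "what is a superconductor current"
qv = embed(query, vocab)

# Retrieve the chunk most similar to the query...
best = max(chunks, key=lambda c: cosine(embed(c, vocab), qv))

# ...and inject it into the prompt sent to the LLM.
prompt = f"Context:\n{best}\n\nQuestion: {query}"
print(best)
```

The only moving parts are the embedding model, the similarity measure, and the prompt template; everything else is plumbing.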


Thank you for the explanation, I remember hearing about this now


Retrieval Augmented Generation - using search (usually with some kind of semantic component) to find relevant context and provide it to the language model to help it respond, give it knowledge about a specific document, etc.


Oh right, I remember now. Thank you for explaining it


I wish he had tried instead to do the faster subset of TypeScript; that is a pet peeve of mine and I'd love to see how it would be done!


AssemblyScript?


This is great, have been looking for something like this.


This is how I felt when trying to learn Elm: the program had to be correct, exactly correct, or it wouldn't work. You had to make every piece, every function, fit precisely, to define its shape, its exact inputs and outputs and effects ... in the end I found it very restrictive. I like the idea of looseness by default and adding constraints gradually (like JavaScript -> TypeScript).


Someone on YouTube mentioned that LK99 is a ceramic, i.e. not a metal, which nullifies many of its potential uses even if it does turn out to be superconducting.


YBCO is also not a metal and has also been hard to use, but it has its uses.


Do you have any idea how many crazy things are said by "Someone on YouTube"?


I loved reading this, it made me emotional. I've always been afraid of losing my sight and this made me feel less scared.


Are these LLMs just answering the question "if you found this text on the internet (the prompt), what would most likely follow"?


In essence, yes I think, but... isn't that essentially not much different than what I'm doing in making this comment?


That's how they are trained initially, but the resulting model isn't all that useful (it was SOTA two years ago, but this field moves fast).

A lot of the utility comes from the later finetuning. You can see this using the examples from the article: every mistake they identify in GPT-3 (which is the un-finetuned version) is answered correctly by ChatGPT, which has gone through an extensive finetuning process called RLHF.


Yes, they are being trained, to simplify, to complete sentences. You can then use the resulting model to do lots of things.

How you train a model and the inference jobs it can do don't necessarily have to be the same.


That's how the text decoder works, but the model gets to define "most likely" and an RLHF model uses this to make the text decoder produce useful answers instead.
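The "what would most likely follow" framing in this thread can be made concrete with a toy bigram model. This is only a sketch under heavy simplification: the corpus here is invented, and a real LLM predicts over subword tokens with a neural network rather than raw word counts.

```python
from collections import Counter, defaultdict

# Tiny invented corpus; a real model trains on trillions of tokens.
corpus = "the cat sat on the mat and the cat slept".split()

# Count bigrams: how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

# "Most likely continuation" of a one-word prompt is just the argmax
# over the observed counts.
def most_likely_next(word):
    return following[word].most_common(1)[0][0]

print(most_likely_next("the"))  # "cat" follows "the" twice, "mat" once
```

The base-model objective is essentially this, scaled up enormously; the finetuning (RLHF) discussed above then reshapes which continuations the model treats as "most likely".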


Anyone know how to use this? The install instructions in the readme are kind of confusing.


If you don't care which tool in particular, https://github.com/AUTOMATIC1111/stable-diffusion-webui is the easiest to install, I think, and gives you lots of toys to play with (txt2img, img2img, teaching it your likeness, etc.).


If you're used to installing python packages it should be relatively easy. There are other projects with nice UIs but that's not what this library is for.


Isn't it possible that intelligence is P(words|every sequence of words you've ever heard)?


Then where would sensible words come from, for the first time, in hominid history?


P(sounds when object is present|random sounds other people made when object was present) ?


No, you’re conflating System 1 thinking and System 2 thinking.


Possible? Maybe. I believe we know enough now, however, to conclude that it’s astonishingly unlikely.

