Recently started using: https://github.com/babonet/Markco (can't find the similar extension I used previously since it got uninstalled). I add inline comments in the markdown, and I have a skill which uses the comments to update the plans and remove resolved comments.
The significance is responsiveness — instead of waiting for the LLM to finish generating the entire code block before anything happens, each statement executes as soon as it's complete. So API calls start, UIs render, and errors surface while the LLM is still streaming tokens.
Combined with a slot mechanism, complex UIs build up progressively — a skeleton appears first, then each section fills in as the LLM generates it.
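To make the mechanism concrete, here's a toy sketch (not the actual runtime described above) of statement-by-statement execution, using Python's `codeop` to detect when an accumulated buffer forms a complete statement. The stream, the `log` slot, and the `fill` helper are all invented for illustration:

```python
import codeop

def run_streamed(lines, env):
    # Execute each statement as soon as it parses as complete,
    # instead of waiting for the whole program to arrive.
    pending = ""
    for line in lines:
        pending = pending + "\n" + line if pending else line
        code = codeop.compile_command(pending, symbol="exec")
        if code is not None:   # statement complete: run it now
            exec(code, env)
            pending = ""
    # an invalid statement raises SyntaxError here, i.e. the error
    # surfaces mid-stream while later tokens are still arriving

env = {"log": []}
stream = [                              # simulated LLM output, line by line
    "log.append('skeleton rendered')",  # runs before later lines even exist
    "def fill(section):",
    "    log.append(section + ' filled')",
    "fill('header')",
    "fill('body')",
]
run_streamed(stream, env)
print(env["log"])  # ['skeleton rendered', 'header filled', 'body filled']
```

The first statement executes before the function below it has even been generated, which is the skeleton-first, sections-fill-in-later effect described above.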
That is super cool. Sorry to be nitpicky, but I would really like to know your mental model: I didn't understand from the blog why a user waiting for a functional UI is a problem? Isn't the partially streamed UI non-functional?
I can see the value in early user verification, and maybe in interrupting the LLM so it doesn't proceed down an invalid path, but I guess this is customer-facing, so that's not as valuable.
"In interactive assistants, that latency makes or breaks the experience." Why? Because the user might just jump off?
Maybe I am a bit overdramatic ;) For me this is mostly about user experience. If the agent creates a complex mini app, the user might have to wait 30 seconds. That's 30 seconds without feedback. It's way nicer to already see information appearing - especially if that information is helpful. Also the UI can be functional already, even if it's not 100% complete!
That's because there are hundreds of people coming from Hacker News clicking the button. If you just refresh the page you can see the number change. The current total seems to be saved server-side.
The (claimed) value proposition is leveraging block/template to decompose the HTML page and bind its computation to browser events over a pubsub channel. It's a toolkit, not a framework, so one can still use their favourite framework. The Go library isn't doing anything earth-shattering, but the same design would still need to be re-implemented in any other framework.
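As a rough sketch of that design (not the library's actual API; `PubSub`, `block`, and the `<name>:update` topic convention are all made up here), the idea is that each page fragment gets its own template, and an event on the channel recomputes only that fragment:

```python
from collections import defaultdict
from string import Template

class PubSub:
    """Minimal in-process stand-in for the browser-event channel."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, fn):
        self.handlers[topic].append(fn)

    def publish(self, topic, **payload):
        for fn in self.handlers[topic]:
            fn(**payload)

bus = PubSub()
page = {}  # rendered HTML fragments, keyed by block name

def block(name, template):
    """Bind an event topic to re-rendering one page fragment."""
    tpl = Template(template)
    def handler(**data):
        page[name] = tpl.substitute(data)  # patch just this block
    bus.subscribe(name + ":update", handler)

block("counter", "<span>count: $count</span>")
block("status",  "<p>$msg</p>")

# events arrive over the channel; only the named block recomputes
bus.publish("counter:update", count=1)
bus.publish("status:update", msg="connected")
bus.publish("counter:update", count=2)
print(page["counter"])  # <span>count: 2</span>
```

The point is only the decomposition: each event patches one block rather than re-rendering the whole page, which is the part any framework would have to re-implement.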
It’s a rather confusing demo. If that’s the intended behavior, I think you need to think of a less surprising use of it that can be understood as a demo, and perhaps even comes across as a benefit rather than a bug.
Yeah, I was afraid of that. WebSockets are costly. Thankfully the approach doesn't completely depend on WebSockets, so I have disabled them for now on the site.
It is intended as a longer-term project originating from the previous experiment: https://github.com/adnaan/gomodest. You are right, better selection criteria for the target audience should be spelled out. I have been working on this for a while, so I just got greedy for some early feedback.