amelius a day ago

I'd rather apt-get install something.

But that doesn't seem to be an option these days in software distribution, especially with GPU-dependent stuff like LLMs.

So yeah, I get why this exists.

  • halJordan 11 hours ago

    What is the complaint here? There are plenty of binaries you can invoke from your CLI that will query a remote LLM API.
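
    For example, a minimal sketch in TypeScript (the endpoint URL, model name, and env var here are generic placeholders for any OpenAI-compatible API, not this project's actual config):

      // Query an OpenAI-compatible chat completions endpoint.
      // Runs under Node 18+ ESM (built-in fetch, top-level await).
      const res = await fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
        },
        body: JSON.stringify({
          model: 'gpt-4o-mini', // placeholder model name
          messages: [{role: 'user', content: 'Hello from the CLI'}],
        }),
      });
      const data = await res.json();
      console.log(data.choices[0].message.content);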

xigoi 17 hours ago

It’s not clear from the README what providers it uses and why it needs your GitHub username.

ryancnelson a day ago

This is neat... whose Anthropic credits am I using, though? Sonnet 4 isn't cheap! Would I hit a rate limit if I used this for daily work?

ccbikai 3 days ago

I am the author; thank you for your support.

You're welcome to help maintain it with me.

kimjune01 3 days ago

Hey, I just tried it. It's cool! I wish it were more self-aware.

  • ccbikai 3 days ago

    Thank you for your feedback; I will optimize the prompt.

t0ny1 a day ago

Does this project send requests to LLM providers?

  • cap11235 a day ago

    Are you serious? Yeah, it's using Gemini 2.5 Pro without a server, sure, yeah.

eisbaw a day ago

Why not telnet?

  • accrual a day ago

    I'd love to see an LLM outputting over a Teletype. Just tschtschtschtsch as it hammers away at the paper feed.

    • cap11235 a day ago

      A week or so ago, an LLM finetune was posted that speaks like a 19th-century Irish author. I'm rather looking forward to having an LLModem model.

  • RALaBarge a day ago

    No HTTPS support

    • benterix a day ago

      I bet someone can write an API Gateway for this...

dncornholio a day ago

Using React to render a CLI tool is something. I'm not sure how I feel about that. It feels like 90% of the code is handling issues with rendering.

  • demosthanos a day ago

    I mean, it's a thin wrapper around LLM APIs, so it's not surprising that most of the code is rendering. I'm not sure what you're referring to by "handling issues with rendering", though; it looks like a pretty bog-standard React app. Am I missing something?
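
    For anyone unfamiliar with the pattern, here's a minimal sketch of React in a terminal via Ink, the usual library for this (an illustrative example, not this project's actual code):

      // Ink renders React components to the terminal instead of the DOM.
      import React from 'react';
      import {render, Box, Text} from 'ink';

      const App = () => (
        <Box flexDirection="column" padding={1}>
          <Text color="green">assistant:</Text>
          <Text>Hello from a React-rendered CLI.</Text>
        </Box>
      );

      render(<App />);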