Wed Oct 16, 2024 3:10pm PST
Ask HN: Any existing Node.js frameworks that give basic LLMs access to tools?
Basically what's described here:

* https://github.com/openai/openai-node?tab=readme-ov-file#automated-function-calls

* https://jrmyphlmn.com/posts/sequential-function-calls

Suppose I have a model that does not natively support this.

Has anyone written or come across a library that implements this logic?

Conceptually, I imagine it boils down to another LLM interpreting the user's question. The prompt for that LLM includes instructions describing the tools it has access to. The LLM then responds with a plan of execution: some JSON describing what to execute. After the first cycle, the plan is re-evaluated against the results, and this repeats until the LLM is satisfied with the response.
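
To make that concrete, here is a rough TypeScript sketch of the loop I have in mind. The `callModel` parameter, the tool registry, and the JSON plan shape are all illustrative, not any particular library's API:

```ts
// Minimal sketch of the loop described above. `callModel` is assumed to be
// any function that sends a prompt to a chat-completion-style endpoint and
// returns raw text; the tool names and plan schema are made up.

type Tool = {
  description: string;
  run: (args: Record<string, unknown>) => Promise<string>;
};

// Hypothetical tool registry.
const tools: Record<string, Tool> = {
  getWeather: {
    description: "getWeather({ city: string }) -> current weather as text",
    run: async (args) => `It is 18°C in ${String(args.city)}.`,
  },
};

// The model is asked to reply with either a final answer or a tool call.
type ModelStep =
  | { type: "final"; answer: string }
  | { type: "tool"; name: string; args: Record<string, unknown> };

async function answerWithTools(
  question: string,
  callModel: (prompt: string) => Promise<string>,
  maxCycles = 5,
): Promise<string> {
  const toolList = Object.entries(tools)
    .map(([name, t]) => `- ${name}: ${t.description}`)
    .join("\n");

  // Growing transcript fed back to the model on every cycle.
  let transcript =
    `You can use these tools:\n${toolList}\n\n` +
    `Respond ONLY with JSON, either {"type":"final","answer":"..."} ` +
    `or {"type":"tool","name":"...","args":{...}}.\n\nQuestion: ${question}\n`;

  for (let i = 0; i < maxCycles; i++) {
    const raw = await callModel(transcript);
    const step = JSON.parse(raw) as ModelStep; // real code needs validation/repair

    if (step.type === "final") return step.answer;

    const tool = tools[step.name];
    if (!tool) {
      transcript += `\nTool "${step.name}" does not exist. Try again.\n`;
      continue;
    }

    // Execute the step, append the observation, and let the model re-evaluate.
    const result = await tool.run(step.args);
    transcript += `\nTool ${step.name} returned: ${result}\n`;
  }

  throw new Error("No final answer within the cycle limit");
}
```

The re-evaluation is just the growing transcript; everything I glossed over (validating/repairing the JSON, failing tools, cost and loop limits) is exactly the edge-case handling mentioned below.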

In practice, this feels like _a lot_ of edge cases to handle (malformed JSON plans, unknown or failing tools, runaway loops), so I am wondering if there are existing abstractions I could use here.

For context, I am not using the native `tools` functionality provided by the OpenAI SDK because it does not provide enough flexibility. It is a black-box approach, and I need more granular control over when tools are used and which ones.
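
If I do end up rolling my own loop, my current thinking is that the "when/which" control is just a function that filters the tool registry before each cycle. Another hedged sketch, reusing the hypothetical registry from above (the policy and names are made up, not anything from the OpenAI SDK):

```ts
// Hypothetical per-request gating of tools, reusing the `tools` registry and
// `Tool` type from the sketch above. In a hand-rolled loop you decide which
// tools the model even hears about, and when.
function selectTools(
  question: string,
  cycle: number,
  all: Record<string, Tool>,
): Record<string, Tool> {
  // Example policy: stop offering tools after three cycles, and only expose
  // the weather tool when the question looks weather-related.
  if (cycle >= 3) return {};
  return Object.fromEntries(
    Object.entries(all).filter(([name]) =>
      name === "getWeather" ? /weather|temperature/i.test(question) : true,
    ),
  );
}
```

The loop would then build its tool list from `selectTools(question, i, tools)` instead of the full registry, which is the kind of control the SDK's automated runner hides from me.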
