Tool
Tools are functions that an LLM can invoke to extend an agent's capabilities beyond simple text generation. In Sensai, a tool is defined as a file following the naming convention tool.(name).(ts|js). Here's an example:
export default ({ city }) => {
  // return weather for city
};
Note that name is not optional: it is the identifier the LLM uses to understand and match user intent (weather in the example above).
The result of a tool call can either be passed back to the LLM for further reasoning or returned directly to the user, depending on your design.
For instance, if a user asks for the weather in Paris and a weather tool is available, the LLM can trigger that tool with Paris as input. The tool fetches the relevant weather data, and the LLM incorporates that result into its final response.
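To make this concrete, here is a sketch of what a fuller tool.weather.ts might look like. The WeatherReport shape and the hard-coded lookup table are invented for illustration; a real tool would call a weather API here.

```typescript
// Hypothetical tool.weather.ts — the data below is a stand-in, not a real API call.
type WeatherReport = { city: string; tempC: number; conditions: string };

const mockWeather: Record<string, WeatherReport> = {
  Paris: { city: "Paris", tempC: 18, conditions: "cloudy" },
};

const weatherTool = ({ city }: { city: string }): WeatherReport => {
  // Return weather for the requested city, or a placeholder if unknown.
  return mockWeather[city] ?? { city, tempC: NaN, conditions: "unknown" };
};

export default weatherTool;
```

When the user asks "What's the weather in Paris?", the LLM would call this tool with `{ city: "Paris" }` and fold the returned report into its answer.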
Schema
JSON Schemas are used to define the parameters passed to a tool during execution. In Sensai, you have access to the guard keyword to define your tool schema:
export default guard(
  ({ city }) => {
    // return weather for city
  },
  {
    input: {
      type: "object",
      properties: {
        city: { type: "string" },
      },
      required: ["city"],
    },
  }
);
Note that the schema is also used for input validation, ensuring your tool always receives the expected data.
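To illustrate what such validation involves, here is a minimal sketch of a guard-style wrapper. This is not Sensai's actual implementation; it only checks the `required` list and the `type` of each declared property before invoking the handler, whereas a real validator would cover the full JSON Schema vocabulary.

```typescript
// Minimal guard-style wrapper — a sketch, not Sensai's real guard.
type Schema = {
  type: "object";
  properties: Record<string, { type: string }>;
  required?: string[];
};

function guard(
  fn: (input: Record<string, unknown>) => unknown,
  opts: { input: Schema }
) {
  return (input: Record<string, unknown>) => {
    const { properties, required = [] } = opts.input;
    // Reject calls missing a required property.
    for (const key of required) {
      if (!(key in input)) throw new Error(`Missing required property: ${key}`);
    }
    // Reject calls where a declared property has the wrong runtime type.
    for (const [key, spec] of Object.entries(properties)) {
      if (key in input && typeof input[key] !== spec.type) {
        throw new Error(`Property ${key} must be of type ${spec.type}`);
      }
    }
    // Input looks valid: run the tool.
    return fn(input);
  };
}
```

With this wrapper, a call like `tool({ city: "Paris" })` reaches the handler, while `tool({})` fails fast with a descriptive error instead of passing malformed data to the tool.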