# AI Tools

AI Tools are specialized wrappers around Custom Endpoints that allow Large Language Models (LLMs) to interact with Curiosity Workspace data and perform actions.

## What is an AI Tool?

An AI Tool provides a clear description of an endpoint's purpose, expected input schema, and output format. This metadata allows an LLM agent to decide when and how to call the tool to fulfill a user request.

## Defining an AI Tool

To define an AI Tool:

  1. Identify the Endpoint: Select an existing Custom Endpoint or create a new one.
  2. Describe the Tool: Provide a clear name and description that explains what the tool does.
  3. Specify Parameters: Define the input parameters using a JSON schema.
  4. Register the Tool: Add the tool definition to your Workspace configuration under AI Integrations.

## Example: `search_documents` Tool

```json
{
  "name": "search_documents",
  "description": "Search for relevant documents in the workspace based on a keyword query.",
  "parameters": {
    "type": "object",
    "properties": {
      "query": {
        "type": "string",
        "description": "The search term to look for."
      }
    },
    "required": ["query"]
  },
  "endpoint": "/api/endpoints/document-search"
}
```
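To illustrate how an agent might act on this definition, here is a minimal Python sketch. The validation only checks the `required` fields from the schema, and the endpoint call is stubbed with a local handler returning hypothetical results; a real agent would make an authenticated HTTP request to the wrapped Custom Endpoint instead.

```python
# Tool definition as registered in the Workspace (copied from the example above).
TOOL_DEF = {
    "name": "search_documents",
    "description": "Search for relevant documents in the workspace based on a keyword query.",
    "parameters": {
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "The search term to look for."}
        },
        "required": ["query"],
    },
    "endpoint": "/api/endpoints/document-search",
}


def dispatch_tool_call(tool_def, arguments):
    """Validate LLM-supplied arguments against the tool schema, then
    forward them to the wrapped endpoint (stubbed here for illustration)."""
    missing = [p for p in tool_def["parameters"]["required"] if p not in arguments]
    if missing:
        # Return a structured error the LLM can read and recover from.
        return {"error": f"missing required parameter(s): {', '.join(missing)}"}
    # In a real agent this would be an authenticated POST to tool_def["endpoint"].
    return {
        "endpoint": tool_def["endpoint"],
        "results": [f"doc matching {arguments['query']!r}"],
    }


print(dispatch_tool_call(TOOL_DEF, {"query": "quarterly report"}))
print(dispatch_tool_call(TOOL_DEF, {}))
```

Returning the validation failure as data, rather than raising, lets the agent surface the message to the LLM so it can retry with corrected arguments.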

## Best Practices

  • Clear Descriptions: The LLM relies on the tool's description to decide when to call it. State what the tool does, what inputs it expects, and what it returns.
  • Granular Tools: Prefer small, focused tools over large, multi-purpose ones.
  • Error Handling: Ensure the underlying endpoint returns meaningful error messages that the LLM can understand and act upon.
  • Security: AI Tools respect the same permission and authentication rules as the endpoints they wrap.
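For the error-handling point above, a structured error body gives the LLM something actionable. The field names here are illustrative, not a fixed Curiosity schema:

```json
{
  "error": {
    "code": "INVALID_QUERY",
    "message": "The 'query' parameter must be a non-empty string.",
    "hint": "Retry with a short keyword phrase."
  }
}
```

A machine-readable `code` plus a human-readable `message` lets the model distinguish a retryable input mistake from a hard failure.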

## Next Steps