The currently available tools include the default search tool, an item details tool that retrieves information about a specific item, a comparison tool that helps evaluate two different items, and an ensemble queries tool that combines several queries, for example to help plan a tour or put together a reading list of articles on a set of topics.
More specialized tools can be added; the first release includes a set of tools for working with recipes, which can help substitute ingredients or find the right accompaniments. For your own use cases and content, you can define and add custom tools, working with the project code and extending it as necessary. Future developments may allow MCP servers to be used as tools.
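To make the idea concrete, here is a minimal, purely illustrative sketch of the shape a custom tool tends to take: a name, a description the query router can match against, and a handler that does the work. The class and method names here are hypothetical and are not NLWeb's actual tool API; consult the project code for the real extension points.

```python
# Hypothetical sketch only: NLWeb's real tool interface lives in the project
# code and will differ. This just illustrates the general shape of a tool:
# a name, a description used to route natural-language queries, and a handler.
from dataclasses import dataclass
from typing import Callable


@dataclass
class ToolDefinition:
    name: str                       # identifier the router uses
    description: str                # the kinds of queries this tool should handle
    handler: Callable[[str], dict]  # takes the user query, returns a structured result


def substitute_ingredient(query: str) -> dict:
    # A real handler would call the LLM and vector store; this stub returns
    # a canned suggestion so the sketch runs end to end.
    return {"query": query, "suggestion": "Try Greek yogurt in place of sour cream."}


ingredient_tool = ToolDefinition(
    name="ingredient_substitution",
    description="Suggests replacements for an ingredient the user does not have.",
    handler=substitute_ingredient,
)

if __name__ == "__main__":
    print(ingredient_tool.handler("What can I use instead of sour cream?"))
```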
Getting started with NLWeb
The NLWeb development team offers a handful of quick starts to help you deploy your first NLWeb instances. These start with a basic local instance, running in a Python virtual environment with a vector database. You will need access to a LLM endpoint, with the default being Azure OpenAI, for inferencing and for generating the embeddings that are stored in your vector database. The demonstration search works against a set of RSS feeds, and you can quickly add your own choices. RSS feeds are a good first choice for a structured source of web content, as the RDF format offers many of the features NLWeb requires to generate answers.
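Once a local instance is running, you can exercise the tools described above from a few lines of Python. The endpoint path, port, and parameter name below are assumptions based on a default local deployment, and if your instance streams responses by default the exact call will differ; check the project's documentation for the precise REST interface.

```python
# Query a locally running NLWeb instance. The /ask endpoint, port 8000, and
# the "query" parameter are assumptions for a default local setup; adjust to
# match your deployment.
import json
import urllib.parse
import urllib.request

BASE_URL = "http://localhost:8000/ask"  # assumed default local endpoint


def ask(query: str) -> dict:
    """Send a natural-language query and return the decoded JSON response."""
    url = f"{BASE_URL}?{urllib.parse.urlencode({'query': query})}"
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))


if __name__ == "__main__":
    # A plain search, a comparison, and an ensemble-style request; the service
    # decides which of its tools handles each query.
    for q in (
        "What are the latest articles about vector databases?",
        "Compare the two most recent posts about RAG pipelines",
        "Put together a reading list covering NLWeb, MCP, and schema.org",
    ):
        print(json.dumps(ask(q), indent=2)[:500])
```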