---
title: Rag Mcp Server
emoji: 🏢
colorFrom: indigo
colorTo: pink
sdk: gradio
sdk_version: 5.31.0
app_file: app.py
pinned: false
short_description: 'This space defines a RAG MCP server'
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

# RAG MCP server

This is a simple Gradio server that lets you run a RAG (Retrieval-Augmented Generation) pipeline through the MCP (Model Context Protocol) interface.

# Requirements

- uv: the Python package manager. Visit https://github.com/astral-sh/uv

The server uses the Serper API, so you need to set the `SERPER_API_KEY` environment variable to enable the search functionality. You can export it in your shell:

```bash
export SERPER_API_KEY=your_serper_api_key
```

# Installation

```bash
uv sync
```

# Usage

Just run the Gradio application:

```bash
uv run gradio app.py
```

# Access the server

Open your browser and go to `http://localhost:7860` to access the RAG MCP server. See the [Gradio docs](https://www.gradio.app/guides/building-mcp-server-with-gradio) for more information on how to use the interface.

# Using the MCP server with an MCP client

You can use the tiny-agents client to interact with the MCP server. To do this, install the huggingface_hub package:

```bash
pip install huggingface_hub
```

Then run the following to interact with the MCP server:

```bash
export HF_TOKEN=your_huggingface_token
tiny-agents run agent.json
```

The `agent.json` file contains the configuration for the MCP client. Here is an example of how to create this file:

```json
{
  "model": "meta-llama/Llama-3.3-70B-Instruct",
  "provider": "cerebras",
  "servers": [
    {
      "type": "sse",
      "config": {
        "url": "https://frascuchon-rag-mcp-server.hf.space/gradio_api/mcp/sse"
      }
    }
  ]
}
```

You can change the server URL to point to your own MCP server if you are running it locally or on a different host.
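
For example, assuming you started the server locally with the default Gradio port (7860) as described above, a minimal sketch of the same configuration pointing at the local endpoint could look like this; the endpoint path mirrors the hosted example:

```json
{
  "model": "meta-llama/Llama-3.3-70B-Instruct",
  "provider": "cerebras",
  "servers": [
    {
      "type": "sse",
      "config": {
        "url": "http://localhost:7860/gradio_api/mcp/sse"
      }
    }
  ]
}
```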