InfraNodus MCP Server: Installation Guide
The InfraNodus MCP Server uses the Model Context Protocol (MCP), a standardized protocol that connects large language models (LLMs) like Claude, ChatGPT, and other AI systems directly to InfraNodus's knowledge graph analysis engine via its API. This means you can perform text analysis and extract insights from knowledge graphs using natural language commands directly in your favorite LLM client, AI chat, code editor, or automated workflows.
The advantage of using the MCP server over the API is that the MCP server offers a collection of well-described tools for specific use cases, with no need to write code or set up parameters for an API connection.
The model can also combine different tools at its own discretion based on the task at hand, so this can also be a good option if you don't know how to use the InfraNodus graph interface and feel more comfortable giving the instructions via a chat interface.
Deploy InfraNodus MCP Server Locally - (e.g. for Cursor, Claude Desktop, local n8n server)
Deploy Remotely via InfraNodus MCP - (e.g. for Claude Web, ChatGPT, cloud n8n server)
Learn More about MCP Servers:
What Can the InfraNodus MCP Server Do?
InfraNodus MCP server has a collection of tools that can be called by the LLM models to perform specific tasks.
Here is a high-level overview of what you can do with the InfraNodus MCP server:
• Connect your existing InfraNodus knowledge graphs to your LLM workflows and AI chats
In this scenario, you can search through and query your existing text graphs directly from Claude or ChatGPT, retrieve related ideas from them, and generate new insights based on new connections — both between the graphs themselves and between the graphs and other texts.
• Identify the main topical clusters in discourse without missing the important nuances (works better than standard LLM workflows)
This is a useful workflow for content creation and research. You can use it to make sure that you cover all the important topics and keywords in your content. Network analysis of the knowledge graph helps identify bias towards a particular set of topics or concepts and rectify it if you're looking for a more balanced perspective.
• Identify the content gaps in any discourse (helpful for content creation and research)
This functionality is helpful for finding what's missing in your content or a public discourse based on the topical structure of the text. Content gap analysis can then help bridge the gaps with new ideas and stimulate thinking in new directions.
• Generate new knowledge graphs and ontologies from any text and use them to augment your LLM responses and reasoning logic
You can also ask InfraNodus to generate an ontology from a set of texts, documents, or from a particular topic. The ontology is then saved as a knowledge graph and can be invoked by the MCP server to augment your LLM responses and reasoning logic.
• Import data from external sources (e.g. Google results or search intent) and use them to augment your LLM responses
This is a very powerful workflow for SEO and LLM optimization. You can use it to understand what people search for in relation to a particular topic and what they actually find. The InfraNodus MCP server will highlight which topics you should focus on to improve your topical authority and underserved niches in search intent, so you can ensure that your content is relevant and useful to your audience.
How Do MCP Servers Work?
Once you have installed the MCP server in your LLM client, the model will have access to the tools the server provides. These tools are described in such a way that the model can understand what they do and how to use them.
As you converse with the model, you can explicitly ask it to use a particular tool or wait until the model suggests using one based on the conversation.
For instance, in the example below, we asked Claude Desktop to do SEO research for the topic of "MCP servers". As InfraNodus has SEO tools available, the model suggested using several of them to generate a comprehensive SEO report for the topic:
Available Tools in the InfraNodus MCP Server
The MCP server has a collection of tools that can be called by the LLM models to perform specific tasks. Here are the detailed tool descriptions; you can find more up-to-date information on the MCP Server npm package README page.
generate_knowledge_graph - Convert any text into a knowledge graph with topics, concepts, and structural analysis
analyze_existing_graph_by_name - Retrieve and analyze graphs from your InfraNodus account
generate_content_gaps - Generate content gaps from text
generate_topical_clusters - Generate topics and clusters of keywords from text using knowledge graph analysis
generate_contextual_hint - Generate a topical overview of a text and provide insights for LLMs to generate better responses
generate_research_questions - Generate research questions based on content gaps
generate_research_ideas - Generate innovative research ideas based on content gaps that can be used to improve the text and discourse
research_questions_from_graph - Generate research questions based on an existing InfraNodus graph
generate_responses_from_graph - Generate responses from an existing InfraNodus graph or ontology
develop_conceptual_bridges - Analyze text and develop latent ideas based on concepts that connect this text to a broader discourse
develop_latent_topics - Analyze text and extract underdeveloped topics with ideas on how to develop them
develop_text_tool - Comprehensive text analysis combining research questions, latent topics, and content gaps with progress tracking
create_knowledge_graph - Create a knowledge graph in InfraNodus from text and provide a link to it
overlap_between_texts - Create knowledge graphs from two or more texts and find the overlap (similarities) between them
difference_between_texts - Create knowledge graphs from two or more texts and find what's present in the other graphs but missing from the first
analyze_google_search_results - Generate a Google search results graph from search queries
analyze_related_search_queries - Generate a graph of search requests related to search queries provided
search_queries_vs_search_results - Find what people search for but don't yet find
generate_seo_report - Analyze content for SEO optimization by comparing it with Google search results and search queries
memory_add_relations - Add relations to InfraNodus memory from text with entity detection and save for future retrieval
memory_get_relations - Retrieve relations from InfraNodus memory for specific entities or contexts
search - Search through existing InfraNodus graphs
fetch - Fetch a specific search result for a graph
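Under the hood, each of these tools is invoked through the Model Context Protocol's standard JSON-RPC `tools/call` request. The client builds something like the following (a sketch following the MCP specification; the tool name is from the list above, but the `arguments` shape is illustrative, so check the npm package README for each tool's actual input schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_knowledge_graph",
    "arguments": {
      "text": "Your text to analyze goes here"
    }
  }
}
```

You never see this payload in the chat: the LLM selects the tool, fills in the arguments from the conversation, and summarizes the result for you.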
How to Deploy the InfraNodus MCP Server
There are two ways to deploy the InfraNodus MCP server: locally and remotely.
1. Locally: You can run the MCP server on your own machine. You can find the instructions on the MCP Server npm package README page or below. This is suitable for the local Claude Desktop app or a local IDE such as Cursor, VSCode, or Windsurf AI.
2. Remotely: You can use the MCP server deployed remotely on our server https://mcp.infranodus.com — the full instructions are available on the How to Install the InfraNodus MCP Server support page. This is a better option if you want a faster, one-click installation.
3. Legacy: You can deploy the MCP server via Smithery as shown in the video below, but we do not recommend this method anymore and it will be deprecated in the future.
Quick Start & Configuration Examples
Here are step-by-step configuration examples for the most popular platforms. First, obtain your InfraNodus API key from your API control panel.
1. Claude Web / Desktop Configuration
The easiest way to install the InfraNodus MCP server on a remote server is to use our MCP server URL:
1. Open Claude Desktop and go to the Settings > Connectors page.
2. Click the Add Custom Connector button.
3. Add the following URL to the connector's URL field: https://mcp.infranodus.com
4. Give it a short name (e.g. "InfraNodus") and then click "Add".
5. The server will appear in the list of connectors. Click on it and then click "Connect".
6. You will be redirected to InfraNodus's authentication page. There you can provide your InfraNodus API key which you can obtain from your API control panel.
7. Once you're authenticated, click "Configure" to ensure Claude has access to the MCP server tools and has the right permissions.
8. You can now use the InfraNodus MCP server in any chat in your Claude Web or Desktop. Try opening an existing conversation and asking InfraNodus to analyze it directly from your LLM chatbot.
2. Cursor IDE / Claude Desktop / Local Installation
If you are using a local LLM client you can add the InfraNodus MCP server directly via the configuration file. The advantage is that you don't rely on additional server infrastructure:
1. Open Cursor and go to the Settings > Cursor Settings > Tools & MCP page. Then click the Add New MCP Server button and Cursor will automatically open the settings file for you.
2. In Claude Desktop, open Settings > Developer > Local MCP Servers > Edit Config and edit the configuration file.
The Claude Desktop configuration file is usually located at:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
3. Add the following configuration to the file. Make sure to provide your InfraNodus API key in the "env" field below
{
"mcpServers": {
"infranodus": {
"command": "npx",
"args": [
"-y",
"infranodus-mcp-server"
],
"env": {
"INFRANODUS_API_KEY": "your-api-key-here"
}
}
}
}
This instructs Claude or Cursor to launch the InfraNodus MCP server locally using `npx` (the npm package runner) and the InfraNodus MCP server npm package.
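A malformed configuration file is the most common reason the server does not appear after a restart. Before relaunching Claude or Cursor, you can sanity-check the file. This sketch assumes `python3` is available and validates a sample config written to a temporary file; point `CONFIG` at your real file instead (macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`):

```shell
# Sketch: sanity-check an MCP config file before restarting the client.
# A single stray comma makes the whole file unreadable to Claude/Cursor.
CONFIG="$(mktemp)"
cat > "$CONFIG" <<'EOF'
{
  "mcpServers": {
    "infranodus": {
      "command": "npx",
      "args": ["-y", "infranodus-mcp-server"],
      "env": { "INFRANODUS_API_KEY": "your-api-key-here" }
    }
  }
}
EOF

# Fails with a parse error and a nonzero exit status if the JSON is invalid.
python3 -m json.tool "$CONFIG" > /dev/null && echo "valid JSON"

# Confirm the infranodus server entry is present.
grep -q '"infranodus"' "$CONFIG" && echo "infranodus entry found"
```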
3. ChatGPT Web Configuration
ChatGPT's support for MCP servers is limited: you can only use them in Developer Mode, and only in new conversations (you cannot activate them in existing conversations). So we recommend using Claude instead. But if you decide to use the InfraNodus MCP server with ChatGPT, here's how you can do it.
1. Open the Apps page
2. Click "Advanced Settings" and activate the "Developer Mode".
3. Then you'll have the "Create App" button appearing at the top right corner.
4. Click the "Create App" button and add the following URL to the connector's URL field:
https://mcp.infranodus.com
5. Give it a short name (e.g. "InfraNodus")
6. Keep the OAuth authentication mode on
7. Then click "Connect". You will be redirected to the InfraNodus MCP server authentication page.
8. Specify your InfraNodus API key there, then click Connect
9. You will be redirected back to ChatGPT. You will see the InfraNodus MCP server with the list of tools available.
10. You can now use the InfraNodus MCP server in any chat in your ChatGPT. Try opening an existing conversation and asking InfraNodus to analyze it directly from your LLM chatbot.
4. n8n Installation
n8n is a popular no-code automation tool that you can use to create workflows that use the InfraNodus MCP server. You can install the MCP server in n8n using our MCP server URL:
1. n8n Cloud
1. Create a new workflow, click +, and add the new MCP Client node. You can also create a new AI Agent node and add the MCP Server Tool to it.
2. Specify the MCP server URL: https://mcp.infranodus.com
3. Choose the MCP OAuth2 option and create new credentials with your InfraNodus API key. You can read in more detail how to set up n8n credentials in our How to Install the InfraNodus MCP Server support article.
4. Once connected, you will have access to a list of MCP tools. You can choose a specific one you want to use or make several of the tools available to your AI Agent node.
2. n8n Local
For the local n8n version, you can also use the setup above with the remote server.
You can also launch the InfraNodus MCP server locally and then make it available to your local n8n instance. See instructions on the InfraNodus MCP server GitHub repository.
5. Claude Code CLI
You can also configure the InfraNodus MCP server in Claude Code. The easiest way to install it is to run the following command in your terminal:
claude mcp add infranodus -s user \
-- env INFRANODUS_API_KEY=YOUR_INFRANODUS_KEY \
npx -y infranodus-mcp-server
In this case, a configuration entry will be added to your Claude Code configuration file, usually located at `~/.claude.json` (in your home directory). You can also add the configuration manually: open that file, find the `mcpServers` section (create it if it doesn't exist), and add the InfraNodus MCP server activation setting:
{
"mcpServers": {
"infranodus": {
"type": "stdio",
"command": "env",
"args": [
"INFRANODUS_API_KEY=YOUR_INFRANODUS_API_KEY",
"npx",
"-y",
"infranodus-mcp-server"
],
"env": {}
}
}
}
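The `"command": "env"` entry in this config relies on the standard Unix `env` utility: it sets the variable and then runs the remaining arguments as a command with that environment. The equivalent shell invocation, which you can run to confirm the pattern works on your system, is:

```shell
# `env VAR=value cmd` runs `cmd` with VAR set in its environment,
# which is exactly what the JSON config above asks Claude Code to do.
env INFRANODUS_API_KEY=YOUR_INFRANODUS_API_KEY \
  sh -c 'echo "key seen by child process: $INFRANODUS_API_KEY"'
```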
6. Local Installation (for development and debugging)
The InfraNodus MCP server can be installed locally on your own machine. It is open-source, so you can modify it, add more tool calls, and deploy it anywhere you want. This setup is useful if you want to add additional tools or modify the tool calls following our API documentation. Here's how you can install the InfraNodus MCP server locally:
Install the latest version of the MCP server from our GitHub repository using Terminal:
git clone https://github.com/yourusername/mcp-server-infranodus.git
cd mcp-server-infranodus
npm install
npm run build
Create a `.env` file and add your InfraNodus API key. You can run the server without an API key, but you will hit rate limits after a few requests.
INFRANODUS_API_KEY=your_api_key
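Note that `npm` scripts don't read `.env` automatically unless the server code loads it (e.g. via `dotenv`). If you need the key available in your current shell session as well, one way to export it is the sketch below, which assumes a simple `KEY=value` file with no quoting:

```shell
# Write a sample .env (use your real key in practice).
cat > .env <<'EOF'
INFRANODUS_API_KEY=your_api_key
EOF

# `set -a` marks every subsequent assignment for export, so sourcing
# the file makes the variable visible to child processes like npm.
set -a
. ./.env
set +a

# Verify the key is now in the environment.
[ -n "$INFRANODUS_API_KEY" ] && echo "INFRANODUS_API_KEY is set"
```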
Build and start the server with the inspector using Terminal:
npm run inspect
Claude Desktop Configuration (macOS)
open ~/Library/Application\ Support/Claude/claude_desktop_config.json
Add the InfraNodus server configuration:
{
"mcpServers": {
"infranodus": {
"command": "node",
"args": ["/absolute/path/to/mcp-server-infranodus/dist/index.js"],
"env": {
"INFRANODUS_API_KEY": "your-api-key-here"
}
}
}
}
Example Usage in Chat
Once configured, you can use natural language commands like:
- "Analyze this text and show me the main topical clusters"
- "What are the content gaps in this article?"
- "Generate research questions from this document"
- "Find my existing graphs where I talk about [topic]" - for logged in users
- "Compare these two texts and find overlapping concepts"
- "What do people search for about [topic] but don't find in results?"
Troubleshooting Common Issues
Issue: "Rate limit exceeded"
Solution: Free accounts have limited API calls. Upgrade to Advanced, Pro, or Premium for higher limits.
Issue: MCP server not appearing in Claude/Cursor
Solution: Restart the application completely after editing the config file.
Issue: "API key not found"
Solution: For some types of configurations (e.g. local installation using `npx`) you can use the MCP server for free for the first few requests. After that, you'll need to log in to InfraNodus and get your API key on the InfraNodus API Access page. Add this key to your client's MCP settings in the `env` field, or re-authorize if you are using the remote server at `mcp.infranodus.com`.
Issue: Wrong tools are used
Solution: Explicitly ask your LLM to use the InfraNodus tool in the prompt, and let us know about the issue.
Adding Tools to the InfraNodus MCP Server
If you install the InfraNodus MCP server locally, you can add more tools to it. The easiest way to do that is to read through our API documentation and create your own tools based on the already existing ones.
If you don't know how to code but would like to have a certain tool added, you can request it via our support portal, Discord server, or GitHub issues on the repository page.
Data Privacy
The InfraNodus MCP server uses the `doNotSave` parameter of its API to avoid saving graphs in your user account. This way, your data is not stored in our logs, and there is no trace of what you process on our servers. Most InfraNodus functionality doesn't require the use of AI; however, when it is needed and you explicitly ask for it, we use models from OpenAI (GPT), Anthropic (Claude), or Google (Gemini) via their APIs (you choose the model). All of these companies state that they don't use data processed through their APIs for training purposes, so you get an additional degree of privacy for your data. This makes the MCP server suitable for use in web-based applications such as Claude Web and ChatGPT (via Developer Mode > Connectors), and in cloud-based automation platforms such as n8n (via their MCP node).
Frequently Asked Questions (FAQ)
Here are some of the most frequently asked questions about MCP servers:
What's the difference between using the MCP server vs the API directly?
The MCP server provides a standardized interface with pre-configured tools that LLMs can call automatically based on your natural language requests. The API requires you to write code and manually configure parameters for each request. MCP server is more suitable for conversational workflows in your IDE or chat interface. You can use the InfraNodus API when building custom applications or automated pipelines where you need precise control over parameters.
Can I use InfraNodus MCP with multiple LLMs at the same time?
Yes! You can install the InfraNodus MCP server in multiple platforms simultaneously. For example, you can have it configured in both Claude Desktop and Claude Web, Cursor IDE, or in both Claude Web and ChatGPT. Each installation is independent, so you can use different API keys or configurations for different platforms if needed.
How many API calls does a typical analysis use?
Most simple operations (generating a knowledge graph, analyzing topical clusters, or identifying content gaps) use 1-2 API calls. More complex workflows like SEO reports that combine search results analysis, search intent analysis, and content comparison may use 3-5 API calls. The MCP server is designed to be efficient, and the LLM will typically batch related operations when possible.
What data does Smithery have access to when I use their deployment?
When you deploy via Smithery, they act as a proxy between your LLM client and the InfraNodus API. Smithery routes your requests but does not store the content you analyze. Your actual data is processed by InfraNodus using the doNotSave parameter, which means it won't be saved in your account or logs (unless you explicitly ask for the data to be saved in your InfraNodus account). For maximum privacy, you can deploy the MCP server locally instead of using Smithery.
Do I need an InfraNodus account to use the MCP server?
No, you can use the MCP server without an account using the free tier with rate limits. However, having an account and API key provides several benefits: higher rate limits (especially for Advanced, Pro, and Premium subscribers), the ability to save and retrieve knowledge graphs, access to your existing graphs for analysis, and memory/RAG functionality for persistent knowledge storage.
Which AI models can I use with the InfraNodus MCP server?
The MCP server works with any LLM client that supports the Model Context Protocol, including Claude Desktop, Claude Web, ChatGPT (Developer Mode), Cursor, VSCode with MCP extensions, Windsurf AI, Claude Code, and automation platforms like n8n. The protocol is LLM-agnostic, so as more platforms add MCP support, they'll automatically work with InfraNodus.
What happens if I exceed my API rate limit?
You'll receive a "Rate limit exceeded" error. Free tier users can either wait for the rate limit to reset or upgrade to an Advanced, Pro, or Premium plan for higher limits. If you're using Smithery without an InfraNodus API key, you can add your API key in the Smithery configuration settings, then disconnect and reconnect the server to apply the changes.
Can I customize or add new tools to the MCP server?
Yes! If you install the MCP server locally (via git clone), you can modify existing tools or create new ones using our API documentation as a reference. The MCP server is open source, so you can extend it to fit your specific needs. If you'd like to request new tools without coding, submit requests via our support portal or GitHub repository.
What if the LLM uses the wrong tool or doesn't use InfraNodus tools?
Sometimes LLMs may choose different tools or not recognize when InfraNodus tools would be helpful. To ensure the right tools are used, explicitly mention InfraNodus in your prompt: "Use InfraNodus to analyze this text and show topical clusters" or "Generate an SEO report using the InfraNodus MCP server." If this continues to be an issue, please let us know so we can improve the tool descriptions.
How are MCP servers different from Claude Skills?
Skills is a feature available in Claude Desktop and Web which allows you to write the extended prompts with code blocks to perform specific tasks or follow a certain logic. The models automatically select which skills to use depending on the context of the conversation. Theoretically, you could write a skill that would contain references to the InfraNodus API and achieve the same tasks as the MCP server, but that's going to be less structured and prone to bugs. It is better to use skills for high-level tasks and the MCP servers for specific interactions with the InfraNodus API or other external tools.
Try the InfraNodus MCP Server
You need to create an account on InfraNodus first and then obtain the API key. The API key is accessible to all users, but Advanced, Pro, and Premium subscribers get higher usage limits. Then simply deploy the MCP server locally or via our MCP server URL https://mcp.infranodus.com and add the API key to the settings.
Sign Up for an InfraNodus Account
Deploy InfraNodus MCP Server Locally
Deploy Remotely via MCP.INFRANODUS.COM