InfraNodus MCP Server: AI Text Knowledge Graphs with Network Analysis Insights


The InfraNodus MCP (Model Context Protocol) server connects large language models (LLMs) such as Claude, ChatGPT, and other AI systems directly to InfraNodus's knowledge graph analysis engine via its API. This means you can perform text analysis and extract insights from knowledge graphs using natural language commands directly in your favorite LLM client, AI chat, code editor, or automated workflows.

The advantage of using the MCP server over the API is that the MCP server comes with a collection of well-described tools that can be used for specific use cases without the need to write code or set up the API connection parameters.

The model can also combine different tools at its own discretion based on the task at hand, so this is a good option if you don't know how to use the InfraNodus graph interface and feel more comfortable giving instructions via a chat interface.


Deploy InfraNodus MCP Server Locally (e.g. for Cursor, Claude Desktop, or a local n8n server)
Deploy Remotely via Smithery (e.g. for Claude Web, ChatGPT, or a cloud n8n server)

 
 


What Can the InfraNodus MCP Server Do?

The InfraNodus MCP server has a collection of tools that LLMs can call to perform specific tasks.



Here is a high-level overview of what you can do with the InfraNodus MCP server:

• Connect your existing InfraNodus knowledge graphs to your LLM workflows and AI chats

In this scenario, you can search through and query your existing text graphs directly from Claude or ChatGPT, retrieve related ideas from them, and generate new insights based on new connections — both between the graphs themselves and between the graphs and other texts.

• Identify the main topical clusters in discourse without missing the important nuances (works better than standard LLM workflows)

This is a useful workflow for content creation and research. You can use it to make sure that you cover all the important topics and keywords in your content. Network analysis of the knowledge graph helps identify bias towards a particular set of topics or concepts and rectify it if you're looking for a more balanced perspective.

• Identify the content gaps in any discourse (helpful for content creation and research)

This functionality is helpful for finding what's missing in your content or a public discourse based on the topical structure of the text. Content gap analysis can then help bridge the gaps with new ideas and stimulate thinking in new directions.

• Generate new knowledge graphs and ontologies from any text and use them to augment your LLM responses and reasoning logic

You can also ask InfraNodus to generate an ontology from a set of texts, documents, or from a particular topic. The ontology is then saved as a knowledge graph and can be invoked by the MCP server to augment your LLM responses and reasoning logic.

• Import data from external sources (e.g. Google results or search intent) and use them to augment your LLM responses

This is a very powerful workflow for SEO and LLM optimization. You can use it to understand what people search for in relation to a particular topic and what they actually find. The InfraNodus MCP server will highlight the topics you should focus on to improve your topical authority, as well as underserved niches in search intent, so you can ensure that your content is relevant and useful to your audience.


 

How Do MCP Servers Work?

Once you have installed the MCP server in your LLM client, the model will have access to the tools provided by the server. These tools are described in such a way that the model can understand what they do and how to use them.

As you converse with the model, you can explicitly ask it to use a particular tool or wait until the model suggests using one based on the conversation.

For instance, in the example below, we asked Claude Desktop to do SEO research for the topic of "MCP servers". As InfraNodus has SEO tools available, the model suggested using several of them to generate a comprehensive SEO report for the topic:

[Screenshot: Claude Desktop using the InfraNodus MCP server tools for SEO research]
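Under the hood, the MCP client translates such requests into standardized tool calls. As a rough sketch, a single call travels over the wire as a JSON-RPC 2.0 message like the one below (the tool name is taken from the list further down this page; the argument name is illustrative and may differ from the actual tool schema):

	{
		"jsonrpc": "2.0",
		"id": 42,
		"method": "tools/call",
		"params": {
			"name": "generate_seo_report",
			"arguments": { "searchQuery": "MCP servers" }
		}
	}

The MCP server then runs the corresponding InfraNodus API requests and returns the results as structured content that the model can read and reason about.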
 

Available Tools in the InfraNodus MCP Server

The MCP server has a collection of tools that LLMs can call to perform specific tasks. Here are the tool descriptions; you can find more up-to-date details on the MCP Server npm package README page.

generate_knowledge_graph - Convert any text into a knowledge graph with topics, concepts, and structural analysis

analyze_existing_graph_by_name - Retrieve and analyze graphs from your InfraNodus account

generate_content_gaps - Generate content gaps from text

generate_topical_clusters - Generate topics and clusters of keywords from text using knowledge graph analysis

generate_contextual_hint - Generate a topical overview of a text and provide insights for LLMs to generate better responses

generate_research_questions - Generate research questions based on content gaps

generate_research_ideas - Generate innovative research ideas based on content gaps that can be used to improve the text and discourse

research_questions_from_graph - Generate research questions based on an existing InfraNodus graph

generate_responses_from_graph - Generate responses from an existing InfraNodus graph or ontology

develop_conceptual_bridges - Analyze text and develop latent ideas based on concepts that connect this text to a broader discourse

develop_latent_topics - Analyze text and extract underdeveloped topics with ideas on how to develop them

develop_text_tool - Comprehensive text analysis combining research questions, latent topics, and content gaps with progress tracking

create_knowledge_graph - Create a knowledge graph in InfraNodus from text and provide a link to it

overlap_between_texts - Create knowledge graphs from two or more texts and find the overlap (similarities) between them

difference_between_texts - Create knowledge graphs from two or more texts and find what's not present in the first graph that's present in the others

analyze_google_search_results - Generate a Google search results graph from search queries

analyze_related_search_queries - Generate a graph of search requests related to search queries provided

search_queries_vs_search_results - Find what people search for but don't yet find

generate_seo_report - Analyze content for SEO optimization by comparing it with Google search results and search queries

memory_add_relations - Add relations to InfraNodus memory from text with entity detection and save for future retrieval

memory_get_relations - Retrieve relations from InfraNodus memory for specific entities or contexts

search - Search through existing InfraNodus graphs

fetch - Fetch a specific search result for a graph



How to Deploy the InfraNodus MCP Server

There are two ways to deploy the InfraNodus MCP server: locally and remotely.

1. Locally: You can deploy the MCP server on your own machine or server. You can find the instructions on the MCP Server npm package README page or below. This is suitable for the local Claude Desktop app or a local IDE such as Cursor, VSCode, or Windsurf AI.

2. Remotely: You can deploy the MCP server remotely on Smithery. You can find the instructions on the Smithery MCP Server page. This is the better option if you want a faster, one-click installation.



 
 

Quick Start & Configuration Examples

Here are step-by-step configuration examples for the most popular platforms. First, obtain your InfraNodus API key from your API control panel.


1. Claude Desktop Configuration

Install the MCP server via Smithery:

Go to the Smithery MCP Server page, choose "Claude" and click the install button.

You don't need to add the InfraNodus API key at first; just install the server. You might need to log in to Smithery, though, to generate their API key, which gives you access to the InfraNodus server via Smithery. If you hit the API rate limit, open a free Smithery account, log in, and in the Configuration tab for the InfraNodus server add the API key that you can obtain on the InfraNodus API access page.

Restart Claude Desktop and you'll see InfraNodus tools available in your chat.

Smithery will add something like this to your Claude Desktop configuration file:

{
	"mcpServers": {
		"infranodus": {
			"command": "npx",
			"args": [
				"-y",
				"@smithery/cli@latest",
				"run",
				"@infranodus/mcp-server-infranodus",
				"--key",
				"YOUR_SMITHERY_API_KEY",
				"--profile",
				"YOUR_SMITHERY_PROFILE_NAME"
			]
		}
	}
}

The Claude Desktop configuration file is usually located at:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json


You can also install the MCP server locally, bypassing Smithery and using `npx` instead. This works fine: in this case, npx launches the MCP server on your machine directly from our official npm package.
{
	"mcpServers": {
		"infranodus": {
			"command": "npx",
			"args": [
				"-y",
				"infranodus-mcp-server"
			],
			"env": {
				"INFRANODUS_API_KEY": "your-api-key-here"
			}
		}
	}
}

2. Claude Web Configuration

The easiest way is to open the Connectors page, then click "Add Custom Connector" and add the following URL to the connector's URL field:

https://server.smithery.ai/@infranodus/mcp-server-infranodus/mcp

Give it a short name (e.g. "InfraNodus") and then click "Connect". You will be redirected to Smithery's authentication page, where you can provide additional settings, such as your InfraNodus API key (if you have an account). If you don't have an account, you can use the server without a key, but once you hit the rate limit, you'll need to disconnect the server and reconnect it, adding the InfraNodus API key this time.


3. ChatGPT Web Configuration

ChatGPT's support for MCP servers is limited: you can only use them in Developer Mode, and only in new conversations (you cannot activate them in existing conversations). So we recommend using the InfraNodus MCP server with Claude. But if you decide to use it with ChatGPT anyway, here's how.

Open the Connectors page, then click "Advanced Settings" and activate "Developer Mode". A "Create" button will then appear in the top right corner next to "Enabled Connectors". Click it and add the following URL to the connector's URL field:

https://server.smithery.ai/@infranodus/mcp-server-infranodus/mcp

Give it a short name (e.g. "InfraNodus") and then click "Connect". The rest of the setup is the same as for Claude Web above: you will be redirected to Smithery's authentication page, where you can optionally add your InfraNodus API key (or add it later, after disconnecting and reconnecting the server, if you hit the rate limit).


4. Cursor IDE or n8n Installation

Installation via Smithery

Go to the InfraNodus MCP server page on Smithery and choose Cursor in the "Add Your Own Client" > "Auto" menu.

Smithery will open Cursor and offer to add the server.

Make sure you give it a short name (e.g. "InfraNodus") and then click "Connect".

Smithery will add the following setting to your MCP server configuration:

{
	"mcpServers": {
		"InfraNodus": {
			"type": "http",
			"url": "https://server.smithery.ai/@infranodus/mcp-server-infranodus/mcp?api_key=YOUR_SMITHERY_API_KEY&profile=YOUR_SMITHERY_PROFILE_NAME",
			"headers": {}
		}
	}
}

You can verify it in Settings → Cursor Settings → Tools & MCP, where you can also add the server manually by clicking "Add new MCP Server".
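For n8n, the setup is similar. A minimal sketch, assuming you use n8n's built-in MCP Client Tool node in an AI Agent workflow (the node and field names may differ between n8n versions): point the node at the same Smithery endpoint, for example:

	Endpoint: https://server.smithery.ai/@infranodus/mcp-server-infranodus/mcp?api_key=YOUR_SMITHERY_API_KEY&profile=YOUR_SMITHERY_PROFILE_NAME

The Smithery key and profile are passed as part of the URL query string (as in the Cursor configuration above), so no separate authentication headers are needed in this setup.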


5. Claude Code MCP Server Configuration

You can also configure the InfraNodus MCP server in Claude Code. The easiest way to install it is to run the following command in your terminal:


	claude mcp add infranodus -s user \
	-- env INFRANODUS_API_KEY=YOUR_INFRANODUS_API_KEY \
		npx -y infranodus-mcp-server
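You can then check that the server was registered (assuming a recent Claude Code version that supports the `list` subcommand):

	claude mcp list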
								

In this case, a configuration will be added to your Claude Code configuration file, usually located at `~/.claude.json` (in your home directory). You can also add the configuration manually: open that file, find the `mcpServers` section (create it if it doesn't exist), and add the InfraNodus MCP server activation setting:


	{
		"mcpServers": {
			"infranodus": {
				"type": "stdio",
				"command": "env",
				"args": [
					"INFRANODUS_API_KEY=YOUR_INFRANODUS_API_KEY",
					"npx",
					"-y",
					"infranodus-mcp-server"
				],
				"env": {}
			}
		}
	}

6. Local Installation (for development and debugging)

You don't have to use Smithery to run the MCP server locally in your favorite IDE or Claude Desktop. This is a more flexible setup if you want to add extra tools or modify the tool calls following our API documentation. Here's how you can install the InfraNodus MCP server locally:

Install the latest version of the MCP server from our GitHub repository using Terminal:



git clone https://github.com/yourusername/mcp-server-infranodus.git
cd mcp-server-infranodus
npm install
npm run build

								

Create the `.env` file and add your InfraNodus API key. You don't strictly need an API key, but without one you will hit rate limits after a while.


INFRANODUS_API_KEY=your_api_key
								

Start the server with the inspector using Terminal:

npm run inspect

Claude Desktop Configuration (macOS)

   
 open ~/Library/Application\ Support/Claude/claude_desktop_config.json
   

Add the InfraNodus server configuration:


   
   {
   	"mcpServers": {
   		"infranodus": {
   			"command": "node",
   			"args": ["/absolute/path/to/mcp-server-infranodus/dist/index.js"],
   			"env": {
   				"INFRANODUS_API_KEY": "your-api-key-here"
   			}
   		}
   	}
   }


7. Example Usage in Chat

Once configured, you can use natural language commands like:

  • "Analyze this text and show me the main topical clusters"
  • "What are the content gaps in this article?"
  • "Generate research questions from this document"
  • "Find my existing graphs where I talk about [topic]" - for logged in users
  • "Compare these two texts and find overlapping concepts"
  • "What do people search for about [topic] but don't find in results?"

8. Troubleshooting Common Issues

Issue: "Rate limit exceeded"
Solution: Free accounts have limited API calls. Upgrade to Advanced, Pro, or Premium for higher limits.

Issue: MCP server not appearing in Claude/Cursor
Solution: Restart the application completely after editing the config file.

Issue: "API key not found"
Solution: You can use the MCP server for free for the first few iterations. Then you'll need to log in to InfraNodus and get your API key on the InfraNodus API Access page. Add this key to the Smithery Configuration page or to your `ENV` settings if you set up the server locally. You might need to disconnect, remove, and reconnect the server for it to work.

Issue: Wrong tools are used
Solution: Explicitly ask your LLM to use the InfraNodus tool in the prompt, and let us know about the issue.



Adding Tools to the InfraNodus MCP Server

If you install the InfraNodus MCP server locally, you can add more tools to it. The easiest way to do that is to read through our API documentation and create your own tools based on the existing ones, as shown in the sketch below.
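A new tool is essentially a named function with a description and an input schema that wraps an InfraNodus API call. Here is a minimal, self-contained sketch in TypeScript, assuming the official @modelcontextprotocol/sdk; the tool name, the endpoint path, and the authorization header below are placeholders for illustration, so check our API documentation for the actual routes and parameters:

	import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
	import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
	import { z } from "zod";

	const server = new McpServer({ name: "infranodus-custom-tools", version: "0.1.0" });

	// Hypothetical tool: summarize the main topics of a text via the InfraNodus API.
	server.tool(
		"summarize_graph_topics", // placeholder tool name
		"Summarize the main topics of a text using InfraNodus knowledge graph analysis",
		{ text: z.string() },
		async ({ text }) => {
			// Placeholder endpoint and auth scheme -- replace with the actual
			// route and authentication described in the InfraNodus API docs.
			const response = await fetch("https://infranodus.com/api/v1/<endpoint>?doNotSave=true", {
				method: "POST",
				headers: {
					"Authorization": `Bearer ${process.env.INFRANODUS_API_KEY}`,
					"Content-Type": "application/json",
				},
				body: JSON.stringify({ text }),
			});
			const data = await response.json();
			// Return the API result as text content that the LLM can read.
			return { content: [{ type: "text", text: JSON.stringify(data, null, 2) }] };
		}
	);

	// Expose the tool over stdio so that Claude Desktop, Cursor, or Claude Code can call it.
	await server.connect(new StdioServerTransport());

In practice you would add such a tool to the server instance that already exists in the cloned repository rather than spinning up a new one, but the overall shape of a tool definition is similar.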

If you don't know how to code but would like to have a certain tool added, you can request it via our support portal, Discord server, or GitHub issues on the repository page.


 

Data Privacy

The InfraNodus MCP server uses the `doNotSave` parameter from its API to avoid saving graphs in your user account. This way your data won't even be stored in our logs, and there will be no trace of what you process on our servers. Most of the InfraNodus functionalities don't require the use of AI; however, when AI is needed and you explicitly ask for it, we use models from OpenAI (GPT), Anthropic (Claude), or Google (Gemini) via their APIs (you choose the model). All of these companies state that they don't use data processed through their APIs for training purposes, so you get an additional degree of privacy for your data. This makes the server suitable for use in web-based applications, such as Claude Web and ChatGPT (via Developer Mode > Connectors), and in cloud-based automation platforms, such as n8n (via their MCP node).



Frequently Asked Questions (FAQ)

Here are some of the most frequently asked questions about MCP servers:

What's the difference between using the MCP server vs the API directly?

The MCP server provides a standardized interface with pre-configured tools that LLMs can call automatically based on your natural language requests. The API requires you to write code and manually configure parameters for each request. The MCP server is more suitable for conversational workflows in your IDE or chat interface, while the InfraNodus API is the better choice when building custom applications or automated pipelines where you need precise control over parameters.
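For illustration, a direct API call might look roughly like the sketch below, whereas with the MCP server you would simply type "analyze this text and show me the topical clusters" in the chat. The endpoint path and authorization scheme here are placeholders; see the API documentation for the exact routes and parameters.

	curl -X POST "https://infranodus.com/api/v1/<endpoint>?doNotSave=true" \
		-H "Authorization: Bearer $INFRANODUS_API_KEY" \
		-H "Content-Type: application/json" \
		-d '{"text": "Your text to analyze"}'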

Can I use InfraNodus MCP with multiple LLMs at the same time?

Yes! You can install the InfraNodus MCP server on multiple platforms simultaneously. For example, you can have it configured in Claude Desktop, Claude Web, and Cursor IDE at the same time, or in both Claude Web and ChatGPT. Each installation is independent, so you can use different API keys or configurations for different platforms if needed.

How many API calls does a typical analysis use?

Most simple operations (generating a knowledge graph, analyzing topical clusters, or identifying content gaps) use 1-2 API calls. More complex workflows like SEO reports that combine search results analysis, search intent analysis, and content comparison may use 3-5 API calls. The MCP server is designed to be efficient, and the LLM will typically batch related operations when possible.

What data does Smithery have access to when I use their deployment?

When you deploy via Smithery, they act as a proxy between your LLM client and the InfraNodus API. Smithery routes your requests but does not store the content you analyze. Your actual data is processed by InfraNodus using the doNotSave parameter, which means it won't be saved in your account or logs (unless you explicitly ask for the data to be saved in your InfraNodus account). For maximum privacy, you can deploy the MCP server locally instead of using Smithery.

Do I need an InfraNodus account to use the MCP server?

No, you can use the MCP server without an account using the free tier with rate limits. However, having an account and API key provides several benefits: higher rate limits (especially for Advanced, Pro, and Premium subscribers), the ability to save and retrieve knowledge graphs, access to your existing graphs for analysis, and memory/RAG functionality for persistent knowledge storage.

Which AI models can I use with the InfraNodus MCP server?

The MCP server works with any LLM client that supports the Model Context Protocol, including Claude Desktop, Claude Web, ChatGPT (Developer Mode), Cursor, VSCode with MCP extensions, Windsurf AI, Claude Code, and automation platforms like n8n. The protocol is LLM-agnostic, so as more platforms add MCP support, they'll automatically work with InfraNodus.

What happens if I exceed my API rate limit?

You'll receive a "Rate limit exceeded" error. Free tier users can either wait for the rate limit to reset or upgrade to an Advanced, Pro, or Premium plan for higher limits. If you're using Smithery without an InfraNodus API key, you can add your API key in the Smithery configuration settings, then disconnect and reconnect the server to apply the changes.

Can I customize or add new tools to the MCP server?

Yes! If you install the MCP server locally (via git clone), you can modify existing tools or create new ones using our API documentation as a reference. The MCP server is open source, so you can extend it to fit your specific needs. If you'd like to request new tools without coding, submit requests via our support portal or GitHub repository.

What if the LLM uses the wrong tool or doesn't use InfraNodus tools?

Sometimes LLMs may choose different tools or not recognize when InfraNodus tools would be helpful. To ensure the right tools are used, explicitly mention InfraNodus in your prompt: "Use InfraNodus to analyze this text and show topical clusters" or "Generate an SEO report using the InfraNodus MCP server." If this continues to be an issue, please let us know so we can improve the tool descriptions.

How are MCP servers different from Claude Skills?

Skills is a feature available in Claude Desktop and Claude Web that allows you to write extended prompts with code blocks to perform specific tasks or follow a certain logic. The models automatically select which skills to use depending on the context of the conversation. Theoretically, you could write a skill that contains references to the InfraNodus API and achieve the same tasks as the MCP server, but that approach is less structured and more prone to bugs. It is better to use skills for high-level tasks and MCP servers for specific interactions with the InfraNodus API or other external tools.



Try the InfraNodus MCP Server


You need to create an account on InfraNodus first and then obtain the API key. The API key is accessible to all users, but Advanced, Pro, and Premium subscribers get higher usage limits. Then simply deploy the MCP server locally or via Smithery and add the API key to the settings.

Sign Up for an InfraNodus Account

Deploy InfraNodus MCP Server Locally
Deploy Remotely via Smithery