InfraNodus MCP Server


InfraNodus MCP Server implements the Model Context Protocol (MCP), a standardized protocol that connects large language models (LLMs) such as Claude, ChatGPT, and other AI systems directly to InfraNodus's knowledge graph analysis engine via its API. This means you can perform text analysis and extract insights from knowledge graphs using natural language commands directly in your favorite LLM client, AI chat, code editor, or automated workflow.

The advantage of the MCP server over the raw API is that it exposes a collection of well-described tools for specific use cases, so there is no need to write code or configure API parameters yourself. The model can also combine different tools at its own discretion based on the task at hand.


To Set up with ChatGPT or Claude Web Connectors (remote authorization via OAuth):
https://mcp.infranodus.com

Activate it through Settings > Connectors. You will be asked for your InfraNodus API key to connect to your account using OAuth2 authentication.


To Set up with Your Local LLM Client (with a locally stored API key), add the following to your client's MCP configuration file (for example, claude_desktop_config.json for Claude Desktop):
{
	"mcpServers": {
		"infranodus": {
			"command": "npx",
			"args": ["-y", "infranodus-mcp-server"],
			"env": {
				"INFRANODUS_API_KEY": "your-api-key-here"
			}
		}
	}
}

You can also set up the MCP server locally, so you can adjust the tools and parameters, or follow our step-by-step guide to deploy the MCP server remotely.

Deploy Locally (GitHub) · Deployment Guides



Learn More about the InfraNodus MCP Server:

What Can the InfraNodus MCP Server Do?

The InfraNodus MCP server provides a collection of tools that LLMs can call to perform specific tasks.



Connect your existing InfraNodus knowledge graphs to your LLM workflows and AI chats. Search through and query your text graphs directly from Claude or ChatGPT.

Identify the main topical clusters in discourse without missing important nuances. Network analysis helps identify bias and rectify it for balanced perspectives.

Identify content gaps in any discourse and use AI to bridge them with new ideas and stimulate thinking in new directions.

Generate knowledge graphs and ontologies from any text and use them to augment your LLM responses and reasoning logic.

Import data from external sources (e.g. Google search results or search intent data) and use it to augment your LLM responses for SEO and LLMO optimization.

See the full list of available tools with detailed descriptions and examples.

How Do MCP Servers Work?

Once you have installed the MCP server in your LLM client, the model will have access to the tools the server provides. These tools are described in a way that lets the model understand what they do and how to use them.

As you converse with the model, you can explicitly ask it to use a particular tool, or wait for the model to suggest a tool based on the conversation.
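Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages: the client first asks the server for its tool descriptions (tools/list) and then invokes one (tools/call). The sketch below builds such a request envelope; the tool name and arguments are illustrative placeholders, not InfraNodus's actual tool schema:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build a minimal MCP 'tools/call' JSON-RPC 2.0 request envelope.

    The tool name and arguments passed in are placeholders for
    illustration; real tool schemas come from the server's tools/list
    response.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Example: what a client might send to invoke a (hypothetical) text-analysis tool.
request = make_tool_call(1, "analyze_text", {"text": "MCP servers connect LLMs to tools."})
print(json.dumps(request, indent=2))
```

Your LLM client constructs these messages for you; the point is that each tool call is an ordinary, inspectable JSON-RPC request rather than opaque model behavior.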

For instance, in the example below, we asked Claude Desktop to do SEO research for the topic of "MCP servers". As InfraNodus has SEO tools available, the model suggested using several of them to generate a comprehensive SEO report for the topic:

(Screenshot: Claude Desktop using the InfraNodus MCP tools to generate the SEO report)

Data Privacy

The InfraNodus MCP server uses the doNotSave parameter from its API to avoid saving graphs in your user account. This way, your data is not stored even in our logs, and there is no trace of what you process on our servers. Most InfraNodus functionality does not require AI at all; however, when it is needed and you explicitly ask for it, we use models from OpenAI (GPT), Anthropic (Claude), or Google (Gemini) via their APIs (you choose the model). All of these companies state that they do not use data processed through their APIs for training purposes, so you get an additional degree of privacy for your data.
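For illustration, here is roughly how a direct API call could attach that flag. Only the doNotSave parameter itself is confirmed above; the endpoint path in this sketch is a hypothetical example, so consult the InfraNodus API docs for the real routes:

```python
from urllib.parse import urlencode

def build_request_url(base="https://infranodus.com/api/v1/graphAndStatements"):
    """Build an API request URL with graph saving disabled.

    NOTE: the default endpoint path is a hypothetical placeholder used
    for illustration; `doNotSave` is the documented privacy flag.
    """
    params = {
        "doNotSave": "true",  # do not persist the graph in the user account
    }
    return base + "?" + urlencode(params)

print(build_request_url())
```

The MCP server sets this flag for you on every call, so no extra configuration is needed to get the same privacy behavior.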