SentinelX (beta)

Operate your Linux servers from Claude.ai or ChatGPT.

Install a small agent on your servers. Connect SentinelX as an MCP connector in your LLM. Then ask the LLM to check disk usage, restart a service, edit a config, or anything else you've allowed.

curl -fsSL https://get.sentinelx.app | sudo bash
Linux only. The installer needs root; after install, the agent itself runs as an unprivileged user.

How it works

  1. Install the agent. Run the command above on any Linux host. It clones the open-source agent, sets up a virtualenv, and registers a systemd service.
  2. Sign in with Google. The installer prints a URL. Open it, sign in with Google, copy the displayed token, and paste it back into the installer.
  3. Connect SentinelX in Claude.ai or ChatGPT. Settings → Connectors → Add custom MCP → https://mcp.sentinelx.app/mcp/mcp → authorize with the same Google account. Done.
Multiple servers? Run the install command on each one. Your hosts automatically get human-friendly aliases (their hostname). Ask the LLM to "list my servers" and it will show all of them; target a specific one by host_id, hostname, or a custom label.

What you can ask

Once it's connected, you can talk to your servers in plain English (or any other language). The LLM picks the right tool automatically:

# Health checks
"Show me uptime and disk usage on my-vps"
"Compare free memory across all my servers"

# Operations
"Restart nginx on prod-web and tell me when it's back"
"Tail the last 50 lines of /var/log/syslog on db1"

# Config edits
"Update the worker_processes setting in my nginx config to 4"
"Show me the diff before applying"
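Under the hood, a request like the ones above becomes an MCP tools/call sent to the hub. A minimal sketch of that JSON-RPC payload follows; the tool name (run_command) and argument names (host, command) are illustrative assumptions, since the actual tool schema is published by the SentinelX agent itself.

```python
import json

# Hypothetical MCP "tools/call" request an LLM client might issue for
# "Show me uptime on my-vps". Tool and argument names are assumed for
# illustration only; the real schema comes from the agent.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "run_command",
        "arguments": {"host": "my-vps", "command": "uptime"},
    },
}

# Serialize as it would travel over the wire to the hub.
print(json.dumps(request, indent=2))
```

The host argument is where host_id, hostname, or a custom label would let the hub route the call to the right agent.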

What's installed where

On your server

The agent goes to /opt/sentinelx-cloud-core and runs as systemd unit sentinelx-cloud-core.service under user sentinelx.

Allowed commands live in /etc/sentinelx/config.yaml — you fully control what the LLM can do.
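The schema of config.yaml isn't documented in this page, so the snippet below is a purely hypothetical sketch of what a command allowlist might look like; the key names and pattern syntax are assumptions, not the real format.

```yaml
# /etc/sentinelx/config.yaml — illustrative sketch only.
# Key names and globbing are assumed; consult the agent's docs for the real schema.
allowed_commands:
  - systemctl status *
  - systemctl restart nginx
  - df -h
  - tail -n * /var/log/*
```

Anything not matched by the allowlist would simply be refused by the agent, regardless of what the LLM asks for.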

In the cloud

A hosted hub at mcp.sentinelx.app relays MCP tool calls between your LLM and your agents. Authentication uses OAuth 2 with PKCE via Google.

Your agent connects out to the hub. No port forwarding, no inbound traffic to your server.
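For the curious, the PKCE part of that login (RFC 7636) boils down to two derived values: a random code verifier and its SHA-256 challenge. The sketch below shows only those primitives; the full flow is handled by Claude.ai/ChatGPT and the hub, not by you.

```python
import base64
import hashlib
import secrets

def make_pkce_pair():
    """Derive an RFC 7636 code_verifier / code_challenge pair (S256 method)."""
    # code_verifier: URL-safe randomness, 43-128 chars; 32 random bytes -> 43 chars.
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # code_challenge = BASE64URL(SHA256(code_verifier)), sent in the authorize request.
    digest = hashlib.sha256(verifier.encode()).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
print(len(verifier), challenge[:8])
```

The server (here, the hub) later checks that the token request's verifier hashes to the challenge it saw at authorization time, which blocks intercepted-code attacks.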

Security model

The agent runs as an unprivileged user, executes only the commands you allow in /etc/sentinelx/config.yaml, and makes only outbound connections to the hub. Sign-in is OAuth 2 with PKCE via Google.

Open source

The agent and its protocol are open source under the Apache 2.0 license.

Manual install

If you'd rather not curl | sudo bash, fetch and inspect first:

curl -fsSL https://get.sentinelx.app/install.sh -o install.sh
less install.sh
sudo bash install.sh

Status

SentinelX is in beta. The hub exposes healthz and readyz status endpoints.