Confidant
Receive secrets from humans securely: no chat exposure, no copy-paste, no history leaks.
🚨 CRITICAL FLOW - Read This First
This is a human-in-the-loop process. You CANNOT retrieve the secret yourself.
- Run the script → you get a secure URL
- SEND the URL to the user in chat - THIS IS MANDATORY
- WAIT for the user to open the URL in their browser and submit the secret
- The script handles the rest (receives, saves to disk, confirms)
❌ DO NOT curl/fetch the secret URL yourself - it's a web form for humans
❌ DO NOT skip sharing the URL - the user MUST receive it in chat
❌ DO NOT poll the API to check if the secret arrived - the script does this
❌ DO NOT proceed without confirming the secret was received
✅ Share URL → Wait → Confirm success → Use the secret silently
⚡ Quick Start
You need an API key from the user? One command:
{skill}/scripts/request-secret.sh --label "OpenAI API Key" --service openai
The script handles everything:
- ✅ Starts server if not running (or reuses existing one)
- ✅ Creates a secure request with web form
- ✅ Detects existing tunnels (ngrok or localtunnel)
- ✅ Returns the URL to share
If the user is remote (not on the same network), add `--tunnel`:
{skill}/scripts/request-secret.sh --label "OpenAI API Key" --service openai --tunnel
This starts a localtunnel automatically (no account needed) and returns a public URL.
Output example:
🔐 Secure link created!
URL: https://gentle-pig-42.loca.lt/requests/abc123
(tunnel: localtunnel | local: http://localhost:3000/requests/abc123)
Save to: ~/.config/openai/api_key
Share the URL above with the user. Secret expires after submission or 24h.
Share the URL → user opens it → submits the secret → done.
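Once the script confirms receipt, the key sits at the conventional path shown above (`~/.config/openai/api_key` for `--service openai`). A minimal sketch of consuming it in a later step - this is an assumption about your workflow, not part of the skill itself - without ever printing the value:
# Load the saved key into an env var for the next command (never echo it)
export OPENAI_API_KEY="$(cat ~/.config/openai/api_key)"
# Sanity-check that something was saved, without revealing it
[ -s ~/.config/openai/api_key ] && echo "key present" || echo "key missing"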
Scripts
request-secret.sh - Create a secure request (recommended)
# Save to ~/.config/<service>/api_key (convention)
{skill}/scripts/request-secret.sh --label "SerpAPI Key" --service serpapi
# Save to explicit path
{skill}/scripts/request-secret.sh --label "Token" --save ~/.credentials/token.txt
# Save + set env var
{skill}/scripts/request-secret.sh --label "API Key" --service openai --env OPENAI_API_KEY
# Just receive (no auto-save)
{skill}/scripts/request-secret.sh --label "Password"
# Remote user โ start tunnel automatically
{skill}/scripts/request-secret.sh --label "Key" --service myapp --tunnel
# JSON output (for automation)
{skill}/scripts/request-secret.sh --label "Key" --service myapp --json
| Flag | Description |
|---|---|
| `--label <text>` | Description shown on the web form (required) |
| `--service <name>` | Auto-save to `~/.config/<name>/api_key` |
| `--save <path>` | Auto-save to explicit file path |
| `--env <varname>` | Set env var (requires `--service` or `--save`) |
| `--tunnel` | Start localtunnel if no tunnel detected (for remote users) |
| `--port <number>` | Server port (default: 3000) |
| `--timeout <secs>` | Max wait for startup (default: 15) |
| `--json` | Output JSON instead of human-readable text (see the sketch below) |
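With `--json`, the whole request can be scripted. The exact schema isn't documented here, so the field names below (`url`, `save_path`) are assumptions - inspect the real `--json` output before relying on them. A rough sketch with `jq`:
# Create the request and capture machine-readable output
out="$({skill}/scripts/request-secret.sh --label "OpenAI API Key" --service openai --json)"
# Hypothetical field names - check the actual JSON first
url="$(echo "$out" | jq -r '.url')"
save_path="$(echo "$out" | jq -r '.save_path')"
# Share $url with the user in chat, then wait for the user to submit
echo "Please open this link and submit your key: $url"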
check-server.sh - Server diagnostics (no side effects)
{skill}/scripts/check-server.sh
{skill}/scripts/check-server.sh --json
Reports server status, port, PID, and tunnel state (ngrok or localtunnel).
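A pre-flight check can help decide whether `--tunnel` is needed before creating a request. The field name `tunnel` below is an assumption about the `--json` output (it isn't specified above), so verify it against the real response first:
# Assumed field name - confirm against real check-server.sh --json output
tunnel="$({skill}/scripts/check-server.sh --json | jq -r '.tunnel // empty')"
if [ -z "$tunnel" ]; then
  # No tunnel detected - ask request-secret.sh to start one for a remote user
  {skill}/scripts/request-secret.sh --label "API Key" --service myapp --tunnel
else
  {skill}/scripts/request-secret.sh --label "API Key" --service myapp
fi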
Rules for Agents
- NEVER ask users to paste secrets in chat - always use this skill
- NEVER reveal received secrets in chat - not even partially
- NEVER `curl` the Confidant API directly - use the scripts
- NEVER kill an existing server to start a new one
- NEVER try to expose the port directly (public IP, firewall rules, etc.) - use `--tunnel` instead
- ALWAYS share the URL with the user in chat - this is the entire point of the tool
- ALWAYS wait for the user to submit - do not poll, do not retry, do not try to retrieve the secret yourself
- Use `--tunnel` when the user is remote (not on the same machine/network)
- Prefer `--service` for API keys - cleanest convention
- After receiving: confirm success, use the secret silently
Example Agent Conversation
This is what the interaction should look like:
User: Can you set up my OpenAI key?
Agent: I'll create a secure link for you to submit your API key safely.
[runs: request-secret.sh --label "OpenAI API Key" --service openai --tunnel]
Agent: Here's your secure link - open it in your browser and paste your key:
🔗 https://gentle-pig-42.loca.lt/requests/abc123
The link expires after you submit or after 24h.
User: Done, I submitted it.
Agent: ✅ Received and saved to ~/.config/openai/api_key. You're all set!
⚠️ Notice: the agent SENDS the URL and WAITS. It does NOT try to access the URL itself.
How It Works
- Script starts a Confidant server (or reuses existing one on port 3000)
- Creates a request with a unique ID and secure web form
- Optionally starts a localtunnel for public access (or detects existing ngrok/localtunnel)
- User opens the URL in their browser and submits the secret
- Secret is received, optionally saved to disk (`chmod 600`, see the check below), then destroyed on the server
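Before telling the user everything is set, it can be worth double-checking that the file actually landed with the expected permissions. A sketch of that check, using the `--service openai` path convention and the `chmod 600` behavior described above:
key_file=~/.config/openai/api_key
# File exists and is non-empty
[ -s "$key_file" ] || { echo "secret not saved"; exit 1; }
# Mode should be owner read/write only (600); stat -c is GNU, stat -f is macOS/BSD
perms="$(stat -c '%a' "$key_file" 2>/dev/null || stat -f '%Lp' "$key_file")"
echo "saved with mode $perms"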
Tunnel Options
| Provider | Account needed | How |
|---|---|---|
| localtunnel (default) | No | `--tunnel` flag or `npx localtunnel --port 3000` |
| ngrok | Yes (free tier) | Auto-detected if running on same port |
The script auto-detects both. If neither is running and `--tunnel` is passed, it starts localtunnel.
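The script's detection logic isn't spelled out here, but a manual check looks roughly like this sketch: ngrok exposes a local inspection API on port 4040, and a localtunnel started via `npx localtunnel` shows up as a running process. Treat this as illustrative, not as what check-server.sh actually does:
# ngrok: the local inspection API lists active tunnels (default port 4040)
curl -s http://127.0.0.1:4040/api/tunnels | jq -r '.tunnels[].public_url'
# localtunnel: look for a process started with `npx localtunnel --port 3000`
pgrep -f "localtunnel --port 3000" >/dev/null && echo "localtunnel appears to be running"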
Advanced: Direct CLI Usage
For edge cases not covered by the scripts:
# Start server only
npx @aiconnect/confidant serve --port 3000 &
# Create request on running server
npx @aiconnect/confidant request --label "Key" --service myapp
# Submit a secret (agent-to-agent)
npx @aiconnect/confidant fill "<url>" --secret "<value>"
# Check a specific request
npx @aiconnect/confidant get <id>
⚠️ Only use the direct CLI if the scripts don't cover your case.