Quick Start
openclaw skills install llm-provider-forensics

by andyrenxu7255 · development
Forensically verify which model family or routing layer may actually sit behind a claimed LLM endpoint or model ID. Use when an agent must investigate whether a provider is genuine, proxied, aliased, aggregated, wrapped, or currently unusable. Works across OpenAI-compatible protocol layers and providers including GPT/OpenAI, Anthropic/Claude, Google Gemini, GLM/Zhipu, Qwen/Tongyi, Kimi/Moonshot, MiniMax, DeepSeek, and mixed compatibility gateways. Supports deeper family-fingerprint analysis, long-context tests, structured-output stress, refusal and variance profiling, streaming/error clues, repeated stability checks, and cross-provider comparison reports.
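The repeated stability checks described above can be sketched in a few lines: send the same deterministic probe (temperature 0) to an OpenAI-compatible endpoint several times and measure how often the answers agree. A single genuine backend should be highly consistent; a round-robin aggregator or silent alias tends to leak variance. This is a minimal illustration of the idea, not the skill's actual implementation; the function names and the payload shape beyond the standard chat-completions fields are assumptions.

```python
import json
from collections import Counter

def build_probe(model: str, prompt: str) -> bytes:
    """Build a minimal OpenAI-compatible /v1/chat/completions payload.

    temperature=0 makes responses as deterministic as the backend allows,
    so disagreement across identical probes is a routing/aggregation clue.
    (Payload fields here are the standard chat-completions ones; anything
    endpoint-specific would need to be added.)
    """
    return json.dumps({
        "model": model,
        "temperature": 0,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")

def stability_score(responses: list[str]) -> float:
    """Fraction of responses matching the most common answer.

    1.0 means every probe returned the same text; lower values suggest
    the endpoint may be load-balancing across different backends.
    """
    if not responses:
        return 0.0
    _, top_count = Counter(responses).most_common(1)[0]
    return top_count / len(responses)
```

For example, `stability_score(["Paris", "Paris", "Paris"])` is 1.0, while `stability_score(["Paris", "Paris", "paris city"])` drops to about 0.67, which would be worth a closer look in a forensics report.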
Or ask OpenClaw: "Install the LLM Provider Forensics skill"
Install and run LLM Provider Forensics instantly, no setup required.