What this review usually needs
An AI skills trust report is a buyer-facing or reviewer-facing evidence packet. It explains where a skill came from, what it can access, what changed, which risks were found, and what the reviewer recommends. The best reports are specific enough for IT review and clear enough for customers.
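The evidence a report carries can be sketched as a simple record. This is an illustrative data shape only; the field names and the `TrustReport` / `summarize` helpers are assumptions, not SkillProvenance Scan's actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a trust report's evidence fields; names are
# illustrative, not the tool's real schema.
@dataclass
class TrustReport:
    skill_name: str
    source_repo: str                                      # where the skill came from
    author: str
    version: str
    permissions: list[str] = field(default_factory=list)  # tools, network, file writes
    findings: list[str] = field(default_factory=list)     # scan results, risky steps
    recommendation: str = "needs-review"                  # reviewer's verdict

def summarize(report: TrustReport) -> str:
    """One-line summary a reviewer or customer can read at a glance."""
    return (f"{report.skill_name} v{report.version} from {report.source_repo}: "
            f"{len(report.permissions)} permissions, {len(report.findings)} findings, "
            f"recommendation: {report.recommendation}")
```

A record like this keeps the report specific: every claim maps to a field a reviewer can check.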
Where it applies
- A skill author needs evidence for a customer security questionnaire.
- A platform team wants every approved skill to carry a standard review record.
- A marketplace wants to distinguish reviewed skills from unverified uploads.
How to run the review
- Run a provenance scan for source, author, fork chain, version, and license.
- Generate a permission inventory for tools, network, external services, and file writes.
- Attach injection scan findings and risky installation steps.
- Include upgrade diff and rollback guidance when a baseline exists.
- Export the report as HTML or PDF with reviewer notes and support contact.
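The steps above can be sketched as a small pipeline that gathers each evidence section and renders it into one HTML report. The scanner functions here are stand-ins under stated assumptions, not SkillProvenance Scan's API.

```python
import html

# Hypothetical stand-in scanners; a real tool would inspect the skill package.
def provenance_scan(skill_dir: str) -> dict:
    """Source, author, version, and license for the skill."""
    return {"source": skill_dir, "author": "unknown",
            "version": "0.0.0", "license": "unspecified"}

def permission_inventory(skill_dir: str) -> list[str]:
    """Tools, network access, external services, and file writes."""
    return []  # a real scanner would enumerate these from the skill's manifest

def export_report(skill_dir: str, reviewer_notes: str) -> str:
    """Assemble the evidence sections into a single HTML trust report."""
    prov = provenance_scan(skill_dir)
    perms = permission_inventory(skill_dir)
    sections = [
        ("Provenance", ", ".join(f"{k}: {v}" for k, v in prov.items())),
        ("Permissions", ", ".join(perms) or "none detected"),
        ("Reviewer notes", reviewer_notes),
    ]
    body = "".join(f"<h2>{html.escape(title)}</h2><p>{html.escape(content)}</p>"
                   for title, content in sections)
    return f"<html><body><h1>Trust report</h1>{body}</body></html>"
```

Keeping export as the last step means the same evidence sections can be rendered to HTML or PDF without re-running the scans.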
Common risks to catch
- A generic assurance statement is not enough for enterprise review.
- A report should not overstate safety or claim formal certification unless such a review actually occurred.
- Outdated reports are misleading after a skill upgrade; regenerate the report whenever the skill changes.
Use SkillProvenance Scan for this review
SkillProvenance Scan exports a practical trust report that connects evidence to the install decision, then routes teams to Team annual checkout for full report generation.