CISA and G7 partners have released voluntary guidance outlining minimum elements for Software Bills of Materials tailored to artificial intelligence systems, giving public and private organizations a common structure for documenting AI components and dependencies.

The document, “Software Bill of Materials for AI — Minimum Elements,” was developed with G7 partners including Germany, Canada, France, Italy, Japan, the United Kingdom and the European Union.

CISA describes the guidance as a way to improve transparency in AI systems and supply chains, while noting that AI-specific recommendations should be considered alongside general SBOM requirements because AI systems are still software systems.

The new guidance builds on a broader SBOM program already moving through U.S. cybersecurity policy. In August 2025, CISA published draft “2025 Minimum Elements for a Software Bill of Materials” guidance for public comment, updating the 2021 NTIA baseline to reflect more mature SBOM tooling, wider use of machine-readable SBOMs and new expectations around software supply-chain data.

Expanding the definition of software components

That earlier SBOM work focused on software components. The AI guidance adds elements that conventional software inventories do not fully capture, including model identity, dataset properties, infrastructure dependencies, security measures and system performance indicators.

That shift mirrors a broader enterprise-security push toward AI bills of materials, as vendors such as Cisco, Wiz and Palo Alto Networks move to help organizations track shadow AI, models, agents, prompts, datasets, identities and cloud infrastructure.

The G7 structure organizes the AI SBOM elements into seven clusters: metadata, system-level properties, models, dataset properties, infrastructure, security properties and key performance indicators.

Metadata covers the SBOM document itself, while the other clusters describe the AI system, the models it uses, datasets across the model lifecycle, required physical and virtual infrastructure, cybersecurity measures and performance indicators.
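To make the structure concrete, the seven clusters can be pictured as a minimal AI SBOM skeleton. The sketch below is purely illustrative: the cluster names follow the G7 grouping described above, but the individual field names and values are hypothetical examples, not the guidance's normative element names.

```python
# Illustrative AI SBOM skeleton organized by the seven G7 clusters.
# Field names and values inside each cluster are hypothetical examples,
# not the element names defined in the guidance itself.
ai_sbom = {
    "metadata": {              # describes the SBOM document itself
        "sbom_author": "example-vendor",
        "created": "2026-01-15",
    },
    "system_properties": {     # the AI system as a whole
        "name": "support-chatbot",
        "version": "2.3.0",
    },
    "models": [                # models the system uses
        {"name": "base-llm", "version": "1.1", "supplier": "upstream-lab"},
    ],
    "dataset_properties": [    # datasets across the model lifecycle
        {"name": "training-corpus", "lifecycle_stage": "training"},
    ],
    "infrastructure": {        # required physical and virtual infrastructure
        "compute": "gpu-cluster",
        "runtime": "inference-service",
    },
    "security_properties": {   # cybersecurity measures applied
        "measures": ["model signing", "access control"],
    },
    "key_performance_indicators": {  # system performance indicators
        "accuracy": 0.92,
        "latency_ms": 120,
    },
}
```

A buyer or auditor could walk such a document cluster by cluster, checking that each of the seven areas is populated before accepting a vendor's AI SBOM as complete.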

Addressing unique supply-chain risks

Separate international guidance on AI and machine learning supply-chain risks, first published in October 2025 and updated in March 2026, warned that pre-trained models and third-party datasets can introduce unique supply-chain risks if not securely managed.

That guidance said the risks should inform vendor questions and requirements when organizations source third-party AI or ML systems.

Translating guidance into procurement standards

For CIOs, CISOs and procurement teams, the guidance translates AI supply-chain risk into concrete procurement questions: buyers can move beyond general security assurances and ask specifically about model versioning, training and testing data, infrastructure dependencies, security controls and performance measures.

That level of documentation could also support incident response and vulnerability work, particularly where teams need to identify affected models, datasets, infrastructure or dependencies.

CISA’s 2025 SBOM notice said SBOMs help provide software users with supply-chain data that can inform risk management and software security decisions. The AI version extends that logic to components that are harder to inspect through standard code-based inventories.

Practical limits and the path to adoption

The guidance is voluntary, and CISA’s 2025 SBOM notice also said statutes, regulations and binding governmentwide policies do not currently require federal agencies to obtain SBOMs from software vendors, even as stakeholder experience has shown the need for clearer and more precise specifications.
