The General Services Administration (“GSA”) recently released a proposed AI clause that the agency intends to incorporate into future federal contracts. While framed as a safeguarding measure, this clause introduces sweeping obligations that signal a major shift in how the federal government intends to procure and govern AI.

Below, we break down what we believe are the most impactful sections of the proposed AI clause and the practical impact these provisions will have on contractors if implemented as written.

The Government Owns All the Data

One of the more significant provisions in GSA’s proposed AI clause is the government’s complete ownership of all “Government Data” utilized and produced by an AI system.

According to the clause, “Government Data” includes any data, prompts, queries, instructions, source data, knowledge bases, or other information submitted into an AI system or any responses, results, analyses, derivative data, logs, or other data produced by the AI System. The only thing a contractor (or AI service provider) will retain is ownership in the underlying AI system or base model itself—that’s it.

This provision will require contractors to rethink their relationship with their own AI systems. Indeed, one of the chief advantages of an AI system is its ability to learn and continuously improve based on the data and feedback it receives. GSA's clause, however, will prevent contractors from owning or otherwise reusing prompts, outputs, or feedback received, eliminating the AI system's ability to improve outside of its limited environment—i.e., the specific GSA contract. Any improvements or efficiencies achieved during performance must be returned to the government, preventing contractors from carrying those improvements over to other AI models or other opportunities. GSA's clause also strictly bans training models with government data and mandates deletion of that data, and certification of the deletion, at the contract's end.

Ultimately, GSA’s AI clause creates a one-way door to data ownership that heavily favors the government. Contractors will be required to keep AI environments separate under GSA contracts and agree that all data generated will be deleted and not used to enhance other AI systems that they (or their subcontractors or service providers) own or operate.

“American AI Only” Requirement

GSA's proposed AI clause also explicitly prohibits foreign AI systems in GSA contracts. This ban covers not only foreign AI end products but also any system whose "components" are manufactured, developed, or controlled by non-U.S. entities. In short, to use AI in a GSA contract, the entire system (including all components) must be U.S.-made, U.S.-developed, and U.S.-controlled.

If GSA's "American Only" policy is implemented as written, contractors using foreign systems, or systems with global or open-source AI components, will need to rethink how they can deliver these services in compliance with this clause. Additionally, because the clause makes contractors responsible for subcontractor compliance, contractors will also need to ensure AI is properly implemented at each level of contract performance.

Significant Government Oversight and Evaluation Rights

In addition to segregating data from other customers and AI systems, GSA's clause requires AI systems to allow the government to run automated benchmarks to test the system's performance and identify any material gaps. Critically, these automated assessments can occur "at any time," and the government is entitled to use "its own benchmarks." Unlike a one-time compliance check, these ongoing audits will hold contractors accountable for the compliant performance of their AI systems throughout the life of the contract. This ongoing, performance-based enforcement power could lead to contract termination. Indeed, the clause states that failure to meet this requirement could result in suspension of the AI system until it is compliant, or cancellation of the contract for cause (a stark outcome).

The practical impacts of this provision are clear. Contractors will not be able to use a cookie-cutter, out-of-the-box AI system. Instead, they will need to build or adopt a highly controlled AI environment with a robust, auditable data management system. This could be difficult for smaller contractors that lack the resources or know-how to build or manage such a system. Contractors will need to be prepared for ongoing validation audits and to respond quickly to any issues the government identifies during those audits. Failure to do so could result in a termination for cause, which could affect future contracting opportunities for years to come.

“Unbiased AI” Requirements

In addition to the requirements above, the GSA clause requires AI systems to comply with the following Unbiased AI Principles. First, the AI system must be a neutral, nonpartisan tool that does not manipulate responses in favor of ideological dogmas such as Diversity, Equity, and Inclusion. Second, the system must be truthful in responding to user prompts seeking factual information or analysis. Third, it must prioritize historical accuracy, scientific inquiry, and objectivity, and must acknowledge uncertainty where reliable information is incomplete or contradictory. Finally, the contractor or service provider must not intentionally encode partisan or ideological judgments into the system's outputs.

Given the subjective nature of bias and neutrality, many contractors may be left wondering, "Is my AI system actually biased or neutral?" As it stands, the clause does not define either term. That uncertainty will create tension between contractors' existing business practices and GSA's regulatory expectations, and will make aligning the two difficult.

GSA's proposed AI clause is the first of its kind and serves as the initial blueprint for how the federal government wants AI to operate: in a controlled, auditable, and sovereign system removed from the commercial sector.

As it stands, the comment period for the proposed clause ends on April 3, 2026. We anticipate that many comments will center on potential waivers of the "American Only" requirement, or on how AI bias and neutrality will be defined, measured, and governed.

Ultimately, the opportunity for contractors to implement AI systems in their proposals and bids for federal government work remains significant. That said, contractors who proactively adapt to GSA’s proposed clause, if implemented, will be best positioned to compete in this evolving landscape.

If you have any questions about how this clause may impact future bid opportunities, please contact us for a free consultation!

What GSA’s Proposed AI Clause Could Mean for Federal Contractors was last modified: April 3rd, 2026 by Timothy Laughlin