{"id":51690,"date":"2026-03-25T12:45:32","date_gmt":"2026-03-25T11:45:32","guid":{"rendered":"https:\/\/www.eh.at\/?p=51690"},"modified":"2026-04-07T12:42:07","modified_gmt":"2026-04-07T10:42:07","slug":"ai-act-and-contracts-why-ai-compliance-starts-at-the-negotiation-table","status":"publish","type":"post","link":"https:\/\/www.eh.at\/en\/ai-act-and-contracts-why-ai-compliance-starts-at-the-negotiation-table\/","title":{"rendered":"AI Act and Contracts: Why AI Compliance Starts at the Negotiation Table"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"51690\" class=\"elementor elementor-51690 elementor-51683\" data-elementor-post-type=\"post\">\n\t\t\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-bab9b5b elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"bab9b5b\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-7b05da2\" data-id=\"7b05da2\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-102a797 elementor-widget elementor-widget-text-editor\" data-id=\"102a797\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<p>25.03.2026<\/p><p><em><a href=\"https:\/\/www.eh.at\/en\/team\/gernot-fritz\/\">Gernot Fritz<\/a>, <a href=\"https:\/\/www.eh.at\/en\/team\/tanja-pfleger\/\">Tanja Pfleger<\/a>, <a href=\"https:\/\/www.eh.at\/en\/team\/amina-kovacevic\/\">Amina Kovacevic<\/a><\/em><\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-fc56eac elementor-widget elementor-widget-text-editor\" data-id=\"fc56eac\" 
data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<p>When companies deploy AI today, they rarely do so entirely in-house. In practice, systems are typically procured, integrated, or accessed via platforms. Companies are therefore not merely acquiring technology, but often complex, dynamic systems whose functionality, underlying data, and behaviour continuously evolve.<\/p><p>This is precisely where a key difference from traditional software lies. AI systems are not static products; they change during operation. They rely on training data, learn from new inputs, and are continuously adapted through updates. This creates risks that can no longer be adequately addressed through traditional IT contractual clauses alone.<\/p><p>Against this background, contract design becomes a central element of AI compliance. The AI Act regulates not only the technology itself, but also the collaboration along the AI value chain. Many of its requirements can, in practice, only be met if they are contractually reflected between the involved actors.<\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-78a7838 elementor-widget elementor-widget-heading\" data-id=\"78a7838\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Allocation of roles as the starting point of any contract<\/h2>\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-71530e6 elementor-widget elementor-widget-text-editor\" data-id=\"71530e6\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<p>A central anchor point of the AI Act is the allocation of regulatory roles. 
The Regulation distinguishes, in particular, between providers and deployers of AI systems, attaching significantly different obligations to each role. While providers are responsible, for example, for risk management, technical documentation, and conformity assessments, deployers are primarily subject to requirements relating to use, monitoring, and documentation.<\/p><p>In practice, however, this allocation is rarely straightforward. Complex projects often involve multiple parties \u2013 such as model providers, integrators, platform operators, and end users. In addition, roles may shift. Companies may qualify as providers even if they did not originally develop the system \u2013 for instance, where they place it on the market under their own name, substantially modify it, or change its intended purpose.<\/p><p>This dynamic illustrates why contractual clarification of roles is not a mere formality, but a key prerequisite for effective compliance. A lack of clarity may result in companies assuming regulatory obligations without having consciously managed that outcome.<\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-81c2a29 elementor-widget elementor-widget-heading\" data-id=\"81c2a29\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Cooperation along the value chain<\/h2>\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-48fe6f4 elementor-widget elementor-widget-text-editor\" data-id=\"48fe6f4\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<p>Role allocation alone is not sufficient. 
The AI Act also requires structured cooperation between the actors involved \u2013 particularly where multiple parties contribute to the functionality and compliance of a system.<\/p><p>In the case of high-risk AI systems, providers will often depend on information, components, or support from third parties. To enable them to fulfil their regulatory obligations, these contributions must be reliably organised. In practice, this means that contracts must clearly define which information is to be provided, which technical access is required, and which cooperation obligations apply.<\/p><p>Similarly, deployers require clear contractual arrangements with providers to meet their own obligations \u2013 for example, regarding instructions for use, provision of documentation, data control, and access to logs.<\/p><p>As a result, the function of contracts is shifting. They no longer merely delineate commercial responsibilities, but become an operational instrument for managing compliance across the entire value chain.<\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c929850 elementor-widget elementor-widget-heading\" data-id=\"c929850\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Training data and data use as a critical pressure point<\/h2>\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d2c8b76 elementor-widget elementor-widget-text-editor\" data-id=\"d2c8b76\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<p>One particularly sensitive area concerns the handling of data. Many AI systems are continuously improved through further training or refinement based on new data. 
This raises a key contractual question: what happens to the customer\u2019s data?<\/p><p>Especially in the context of generative AI, it is by no means self-evident that inputs are used solely for the provision of the specific service. Providers often reserve the right to use such data \u2013 at least in aggregated or anonymised form \u2013 for training purposes. For companies, this can create significant risks, particularly with regard to trade secrets, personal data, and regulatory requirements.<\/p><p>Contracts must set clear boundaries in this respect. This includes defining whether and to what extent data may be used for training purposes, which forms of anonymisation are envisaged, and whether customers have a right to opt out. In practice, this issue often becomes one of the central negotiation points and has a decisive impact on the risk profile of an AI project.<\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-40574a0 elementor-widget elementor-widget-heading\" data-id=\"40574a0\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Dynamic systems require dynamic contractual mechanisms<\/h2>\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-3a3e8b5 elementor-widget elementor-widget-text-editor\" data-id=\"3a3e8b5\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<p>Another key difference from traditional software lies in the dynamic nature of AI systems. Models are continuously refined, functionalities are adjusted, and underlying risk profiles may change. Updates are therefore not only a technical issue, but also a legal one.<\/p><p>Contracts must take this into account. It is not sufficient to simply allow or exclude updates. 
Rather, the key questions are how changes are communicated, whether and under what conditions customers can object to them, and how it is ensured that regulatory assessments remain up to date.<\/p><p>In sensitive use cases, it may be crucial that changes are transparently documented and their impact remains traceable. Otherwise, there is a risk that a system initially considered low-risk gradually \u201cevolves\u201d into a regulated category without this being recognised in time.<\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-22f4396 elementor-widget elementor-widget-heading\" data-id=\"22f4396\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Liability and risk allocation in complex systems<\/h2>\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-46ba9e2 elementor-widget elementor-widget-text-editor\" data-id=\"46ba9e2\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<p>As AI systems become more complex, the importance of clear risk allocation increases. Errors may originate from training data, model decisions, integration, or actual use. As a result, attributing responsibility is often difficult.<\/p><p>Contracts play a central role here. They define who is responsible for which risks, which liability limitations apply, and in which cases indemnities are triggered. Particularly relevant are issues relating to the origin and quality of training data, as well as the outputs generated by the system.<\/p><p>In practice, it becomes clear that traditional liability models are often insufficient to capture the specific characteristics of AI systems. 
Contract design must therefore be more granular and reflect the different risk spheres along the value chain.<\/p><p>This gains additional importance in light of the revised Product Liability Directive (EU) 2024\/2853, which explicitly brings software and AI systems within the scope of product liability and thereby introduces new requirements for contractual risk allocation.<\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d1c88e7 elementor-widget elementor-widget-heading\" data-id=\"d1c88e7\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Transparency, audit, and the limits of the black box<\/h2>\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-b21c9ba elementor-widget elementor-widget-text-editor\" data-id=\"b21c9ba\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<p>A recurring challenge in AI projects is the limited transparency of many systems. At the same time, the AI Act \u2013 particularly for regulated use cases \u2013 requires a high degree of traceability and control.<\/p><p>This raises a key question for companies: how can compliance with regulatory requirements be verified if there is no insight into the system? Contractual clauses on audit and control rights become significantly more important in this context. 
They provide at least partial access to relevant information, for example regarding training data, model behaviour, or implemented security measures.<\/p><p>Even if full transparency will remain unrealistic in many cases, one point is clear: without contractually secured information and control rights, robust AI compliance is hardly achievable.<\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c6b7756 elementor-widget elementor-widget-heading\" data-id=\"c6b7756\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Conclusion: Contract design as the key to AI compliance<\/h2>\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-fd1be2f elementor-widget elementor-widget-text-editor\" data-id=\"fd1be2f\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<p>The AI Act does not only reshape regulatory requirements \u2013 it also changes how AI projects must be structured. Many obligations cannot be fulfilled in isolation within a single organisation, but require coordinated implementation across the entire value chain.<\/p><p>Contracts therefore become the central instrument for ensuring this coordination. 
They define roles, structure information flows, allocate risks, and create the foundation for transparency and control.<\/p><p>In practice, this means one thing above all: AI compliance begins not when a system is deployed, but at the negotiation table \u2013 because what is not reflected in the contract will be difficult to enforce later on.<\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-9417f14 elementor-widget elementor-widget-text-editor\" data-id=\"9417f14\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<p><em>If AI projects are to scale, they require robust contractual frameworks \u2013 we are happy to support you in getting them right.<\/em><\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>25.03.2026 Gernot Fritz, Tanja Pfleger, Amina Kovacevic When companies deploy AI today, they rarely do so entirely in-house. In practice, systems are typically procured, integrated, or accessed via platforms. Companies are therefore not merely acquiring technology, but often complex, dynamic systems whose functionality, underlying data, and behaviour continuously evolve. 
This is precisely where a key [&hellip;]<\/p>\n","protected":false},"author":21,"featured_media":51686,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"rank_math_lock_modified_date":false,"inline_featured_image":false,"footnotes":""},"categories":[235],"tags":[805,901,902],"group":[],"area":[],"location":[],"systype":[],"class_list":["post-51690","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-legal-update-en","tag-ai-2","tag-ai-act-2","tag-contracts"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/posts\/51690"}],"collection":[{"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/users\/21"}],"replies":[{"embeddable":true,"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/comments?post=51690"}],"version-history":[{"count":11,"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/posts\/51690\/revisions"}],"predecessor-version":[{"id":52077,"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/posts\/51690\/revisions\/52077"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/media\/51686"}],"wp:attachment":[{"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/media?parent=51690"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/categories?post=51690"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/tags?post=51690"},{"taxonomy":"group","embeddable":true,"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/group?post=51690"},{"taxonomy":"area","embeddable":true,"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/area?post=51690"},{"taxonomy":"location","embeddable":true,"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/location?post=51690"
},{"taxonomy":"systype","embeddable":true,"href":"https:\/\/www.eh.at\/en\/wp-json\/wp\/v2\/systype?post=51690"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}