{"id":33566,"date":"2025-09-24T13:24:46","date_gmt":"2025-09-24T11:24:46","guid":{"rendered":"https:\/\/opi.org.pl\/prawne-aspekty-transformacji-cyfrowej-i-ai\/"},"modified":"2025-10-13T10:51:27","modified_gmt":"2025-10-13T08:51:27","slug":"prawne-aspekty-transformacji-cyfrowej-i-ai","status":"publish","type":"post","link":"https:\/\/opi.org.pl\/en\/prawne-aspekty-transformacji-cyfrowej-i-ai\/","title":{"rendered":"Legal frameworks for digital transformation and artificial intelligence"},"content":{"rendered":"\n<p><strong>The National Information Processing Institute (OPI PIB) conducts extensive research on artificial intelligence and the digital transformation, including their legal implications.<\/strong><\/p>\n\n\n\n<p><strong>The law vs AI<\/strong><\/p>\n\n\n\n<p><em>\u2018Understanding the legal impact of artificial intelligence is vital in ensuring the protection of individuals\u2019 rights as automated decision-making becomes more widespread,\u2019<\/em> explains Marek Michaj\u0142owicz, Deputy Head for Software Development at OPI PIB.<em> \u2018In Poland, such analyses enable the adaptation of national legislation to new challenges, including data protection and liability for algorithmic errors. At the European level, common frameworks such as the AI Act guarantee uniform protection standards across the EU and help prevent market fragmentation. Assessing the legal effects also contributes to public trust in emerging technologies, which is crucial for innovation to be embraced.<\/em> <em>Finally, conscious and careful lawmaking ensures that AI development serves the common good, while preventing inequality and abuse.\u2019<\/em><\/p>\n\n\n\n<p>Although algorithms deliver substantial advantages, they also pose considerable risks. Their power lies in processing vast amounts of data to predict our choices and shape our behaviours. 
Platforms like Amazon and Netflix already influence our purchasing and viewing decisions, limiting the options we consider. Whoever controls the algorithm holds real power, and that power, in turn, shapes how wealth is distributed today.<\/p>\n\n\n\n<p><em>\u2018From a legal perspective, this highlights the pressing necessity of regulations that protect individuals and ensure their fundamental rights. Without prudent, well-considered regulation, we risk a \u201cDarwinian anarchy\u201d, in which the strongest prevail, regardless of whether they act in the public interest,\u2019<\/em> says Luigi Lai, research and technical expert at OPI PIB.<\/p>\n\n\n\n<p><strong>European regulations<\/strong><\/p>\n\n\n\n<p>The General Data Protection Regulation (GDPR) forms the foundation of the EU\u2019s data protection framework, safeguarding the right not to be subjected to decisions made solely by automated systems and ensuring human involvement in decision-making. The AI Act, another key legal instrument, classifies AI systems based on their level of risk, ranging from minimal to unacceptable. The act also specifies the responsibilities of algorithm developers and users, particularly in sensitive sectors such as healthcare, finance, and the judiciary. The goals of the act are to strengthen the protection of fundamental rights and to support transparency and accountability.<\/p>\n\n\n\n<p><strong>Regulating major online platforms<\/strong><\/p>\n\n\n\n<p>Major online platforms, many of them based across the Atlantic, are also governed in the EU by the Digital Services Act (DSA) and the Digital Markets Act (DMA). 
These regulations are intended to support fair competition, ensure transparency, and protect consumers in the digital marketplace.<\/p>\n\n\n\n<p><strong>A challenge for the future<\/strong><\/p>\n\n\n\n<p>Constantly evolving legislation reflects a growing recognition that laws must adapt to meet the demands of rapidly advancing technology. An unregulated digital transformation could have far-reaching consequences for both current and future generations.<\/p>\n\n\n\n<p>Shaping a fair digital future is a responsibility shared by legislators, technology firms, and individuals. Collaborative action is crucial to building a society in which innovation empowers individuals without restricting their rights and freedoms.<\/p>\n\n\n\n<p>Watch the latest episode of the OPI PIB Academy series on our YouTube channel, in which an OPI PIB expert discusses the legal aspects of AI.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"AKADEMIA OPI PIB #20 \u2013 Who Controls the Algorithm Shapes the Future: AI, Law and Civil Society\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/zLlnzdLMj_M?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>The ongoing digital transformation and the rapid growth of artificial intelligence (AI) are profoundly influencing societies, economies, and legal systems across the world. Central to the contemporary debate are questions of how to balance innovation, individual rights, and collective responsibility. 
Above all, it remains essential that humans, not machines, stay in control of decision-making processes.<\/p>\n","protected":false},"author":30,"featured_media":33500,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":"","_links_to":"","_links_to_target":""},"categories":[411],"tags":[835,990,991,838,491,840,989,473],"class_list":["post-33566","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-en","tag-ai-en","tag-artficialintelligence","tag-digitaltransformation","tag-it-en","tag-law","tag-opipib-2-en","tag-opipibacademy","tag-science"],"_links":{"self":[{"href":"https:\/\/opi.org.pl\/en\/wp-json\/wp\/v2\/posts\/33566","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/opi.org.pl\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/opi.org.pl\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/opi.org.pl\/en\/wp-json\/wp\/v2\/users\/30"}],"replies":[{"embeddable":true,"href":"https:\/\/opi.org.pl\/en\/wp-json\/wp\/v2\/comments?post=33566"}],"version-history":[{"count":1,"href":"https:\/\/opi.org.pl\/en\/wp-json\/wp\/v2\/posts\/33566\/revisions"}],"predecessor-version":[{"id":33568,"href":"https:\/\/opi.org.pl\/en\/wp-json\/wp\/v2\/posts\/33566\/revisions\/33568"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/opi.org.pl\/en\/wp-json\/wp\/v2\/media\/33500"}],"wp:attachment":[{"href":"https:\/\/opi.org.pl\/en\/wp-json\/wp\/v2\/media?parent=33566"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/opi.org.pl\/en\/wp-json\/wp\/v2\/categories?post=33566"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/opi.org.pl\/en\/wp-json\/wp\/v2\/tags?post=33566"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}