Liability for The Robot’s Own Deed
Keywords: artificial intelligence; subjects of law; civil law; legislation

Abstract
This research aims to demonstrate that civil legal liability can exist in the case of agents with artificial intelligence, not only in the case of human beings.
There are already several studies by scientists in this field, and they have produced not only theoretical concepts but also tangible results, namely robots. European legislation has likewise enshrined rules addressing the legal relationship between robots and humans.
The methods we intend to use are the quantitative, logical, sociological, and comparative methods.
The conclusions of the article underline the importance of recognizing robots as subjects of civil law, in keeping with their degree of understanding and perception of reality, but also of having them assume the consequences of their own deeds.
This analysis can be useful to university professors, researchers in the field, legal scholars, students, and jurists. The list is not exhaustive, as the research is relevant to anyone interested in this topic.
The novelty of the study consists in formulating ideas based on existing legislation, but also on a survey of professionals in the field. The research will show that robots are also responsible for their own deeds, presenting both arguments and counterarguments.
License
The author fully assumes responsibility for the originality of the content, and the holograph signature makes the author liable in case of litigation.