COVR—Legal risk assessment (LEGARA)—White paper

Publication: Book/anthology/thesis/report › Report › Advisory

Standard

COVR—Legal risk assessment (LEGARA)—White paper. / Van Rompaey, Léonard.

2021. 39 p.

Publication: Book/anthology/thesis/report › Report › Advisory

Harvard

Van Rompaey, L 2021, COVR—Legal risk assessment (LEGARA)—White paper.

APA

Van Rompaey, L. (2021). COVR—Legal risk assessment (LEGARA)—White paper.

Vancouver

Van Rompaey L. COVR—Legal risk assessment (LEGARA)—White paper. 2021. 39 p.

Author

Van Rompaey, Léonard. / COVR—Legal risk assessment (LEGARA)—White paper. 2021. 39 p.

Bibtex

@book{20935804d593445fa40b6fb070e24691,
title = "COVR—Legal risk assessment (LEGARA)—White paper",
abstract = "With the advancement of robotics and artificial intelligence, the nature of risks evolves in a way that makes them harder to foresee and mitigate against. While specific risks cannot be as easily foreseen as before, general risks can still be identified and mitigated against. This mitigation may gain from being done at both technical and non-technical levels. The way we address that changing nature of risks matters for generating and maintaining trust in the technology, and for setting the conditions for the widest possible societal deployment of the technology. This is relevant to all the actors in the robotics industry, and even for actors in neighbouring technological industries (IoT, AI, etc.) as public trust erosion in one technology or application would affect all other contemporary technologies. Current standards are not tackling this changing nature of risks, and are more generally unsuited to the current and near future technology. Compliance to those standards, or even to more precise and updated standards might not shield producers from damage claims in case of accidents. We cannot wait for public policymakers to find answers to those challenges and must engage in a self-regulation effort. This white paper thus attempts to explore the way engineers and lawyers deal with new types of risks, and tries to clarify the way lawyers and judges are likely to react when having to deal with accidents and other legal issues related to developmental technologies",
keywords = "Faculty of Law, collaborative robotics, artificial intelligence, product liability, machinery directive, safety",
author = "{Van Rompaey}, L{\'e}onard",
year = "2021",
language = "English",

}
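
The BibTeX record above can also be consumed programmatically. The following is a minimal sketch, assuming the third-party bibtexparser package (v1 API) is installed; the abbreviated record string and the variable names are illustrative and not part of the exported metadata.

import bibtexparser

# Abbreviated copy of the BibTeX entry exported above (illustrative only).
BIBTEX_RECORD = r"""
@book{20935804d593445fa40b6fb070e24691,
  title = "COVR—Legal risk assessment (LEGARA)—White paper",
  author = "{Van Rompaey}, L{\'e}onard",
  year = "2021",
  language = "English",
}
"""

# Parse the record into a list of plain dictionaries, one per entry.
database = bibtexparser.loads(BIBTEX_RECORD)
entry = database.entries[0]

# bibtexparser lowercases field names; ENTRYTYPE and ID are special keys.
print(entry["ID"])     # 20935804d593445fa40b6fb070e24691
print(entry["title"])  # COVR—Legal risk assessment (LEGARA)—White paper
print(entry["year"])   # 2021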

RIS

TY - RPRT

T1 - COVR—Legal risk assessment (LEGARA)—White paper

AU - Van Rompaey, Léonard

PY - 2021

Y1 - 2021

N2 - With the advancement of robotics and artificial intelligence, the nature of risks evolves in a way that makes them harder to foresee and mitigate. While specific risks cannot be foreseen as easily as before, general risks can still be identified and mitigated, and this mitigation may benefit from being done at both technical and non-technical levels. The way we address that changing nature of risks matters for generating and maintaining trust in the technology, and for setting the conditions for the widest possible societal deployment of the technology. This is relevant to all actors in the robotics industry, and even to actors in neighbouring technological industries (IoT, AI, etc.), as an erosion of public trust in one technology or application would affect all other contemporary technologies. Current standards do not tackle this changing nature of risks and are more generally unsuited to current and near-future technology. Compliance with those standards, or even with more precise and updated standards, might not shield producers from damage claims in case of accidents. We cannot wait for public policymakers to find answers to those challenges and must engage in a self-regulation effort. This white paper therefore attempts to explore the way engineers and lawyers deal with new types of risks, and tries to clarify how lawyers and judges are likely to react when dealing with accidents and other legal issues related to developmental technologies.

AB - With the advancement of robotics and artificial intelligence, the nature of risks evolves in a way that makes them harder to foresee and mitigate. While specific risks cannot be foreseen as easily as before, general risks can still be identified and mitigated, and this mitigation may benefit from being done at both technical and non-technical levels. The way we address that changing nature of risks matters for generating and maintaining trust in the technology, and for setting the conditions for the widest possible societal deployment of the technology. This is relevant to all actors in the robotics industry, and even to actors in neighbouring technological industries (IoT, AI, etc.), as an erosion of public trust in one technology or application would affect all other contemporary technologies. Current standards do not tackle this changing nature of risks and are more generally unsuited to current and near-future technology. Compliance with those standards, or even with more precise and updated standards, might not shield producers from damage claims in case of accidents. We cannot wait for public policymakers to find answers to those challenges and must engage in a self-regulation effort. This white paper therefore attempts to explore the way engineers and lawyers deal with new types of risks, and tries to clarify how lawyers and judges are likely to react when dealing with accidents and other legal issues related to developmental technologies.

KW - Faculty of Law

KW - collaborative robotics

KW - artificial intelligence

KW - product liability

KW - machinery directive

KW - safety

UR - https://www.safearoundrobots.com/toolkit/publications

M3 - Report

BT - COVR—Legal risk assessment (LEGARA)—White paper

ER -
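
The RIS record uses simple "TAG - value" lines, so it can be read with nothing beyond the Python standard library. The sketch below is an illustrative assumption rather than part of the export: the embedded record string is abbreviated, and the parser only handles the single-line tag layout shown on this page.

from collections import defaultdict

# Abbreviated copy of the RIS record above (illustrative only).
RIS_RECORD = """\
TY - RPRT
T1 - COVR—Legal risk assessment (LEGARA)—White paper
AU - Van Rompaey, Léonard
PY - 2021
KW - Faculty of Law
KW - collaborative robotics
M3 - Report
ER -
"""

def parse_ris(text):
    # Collect values per tag in lists, since tags such as KW repeat.
    fields = defaultdict(list)
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        tag, sep, value = line.partition(" - ")
        if not sep or tag.strip() == "ER":
            continue  # skip the end-of-record marker and malformed lines
        fields[tag.strip()].append(value.strip())
    return dict(fields)

record = parse_ris(RIS_RECORD)
print(record["TY"])  # ['RPRT']
print(record["KW"])  # ['Faculty of Law', 'collaborative robotics']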

ID: 328694909