
论我国算法解释权制度的完善

On Perfecting China’s Algorithm Explanation Right System

【Author】 Li Na (李娜)

【Supervisor】 Chen Yehong (陈业宏)

【Author Information】 Central China Normal University (华中师范大学), Juris Master (Non-Law) (professional degree), 2023, Master's thesis

【摘要】 With the continuous development of artificial intelligence technology, algorithmic decision-making has become increasingly widespread. While it brings convenience to people's lives, the "black box" nature of algorithmic technology has also given rise to harms such as algorithmic discrimination, algorithmic hegemony, and algorithmic manipulation, infringing on individuals' legitimate rights and interests such as privacy and freedom and undermining social fairness and justice. To address these harms and balance the interests of both parties to algorithmic decision-making, China's Personal Information Protection Law established the right to algorithmic explanation in Article 24, Paragraph 3 and Article 48. As an emerging right in China, its provisions still leave much room for improvement. On the definition of the right, this thesis starts from the legislative purpose and, drawing on the technical characteristics of automated decision-making algorithms, proposes that the right to algorithmic explanation means that when an algorithmic decision may have, or has already had, a legal effect or similarly significant effect on a data subject's (individual's) rights and interests, the data subject is entitled to demand that the user of the algorithmic decision explain the algorithm's operation and the decision. In determining the right-holders and obligors, the thesis considers both EU and Chinese legal provisions as well as problems that may arise at the implementation level, arguing that the right-holder is the natural person as data subject, excluding legal persons, and appropriately narrowing the scope of obligors: obligors include legal persons, public institutions, and other organizations, but not natural persons. It should also be noted that the obligor is the user of the algorithmic decision, not the algorithm designer; in practice, however, if the user finds it difficult to explain the algorithmic decision due to objective factors such as technology or knowledge, the algorithm designer should assist the user in providing the explanation. On the rules for exercising the right, the thesis draws on EU and US legal provisions while taking China's national conditions into account, clarifying the scope of application, the objects of explanation, and the stages of exercise. The scope of application should satisfy two conditions. First, the type of decision should not be limited to decisions based solely on automated processing but should also include hybrid decisions with a "human participation" element, because in practice an individual may rely entirely on the algorithm's recommendation, making the algorithmic decision the substantive basis of the final decision. Second, the decision must have a "significant effect" on personal rights and interests, where "significant effect" can be understood as a "legal effect or similarly significant effect." As to the objects of explanation, an individual may demand an explanation of the algorithmic system as well as of a specific decision. As to the stages of exercise, the right should cover the pre-decision, in-decision, and post-decision stages, so that the objects and stages of explanation work together to provide more comprehensive protection for personal rights and interests. On setting the standard of explanation for obligors, the thesis draws on EU law and proposes a special circumstance exempting the obligation to explain: when the legitimate interests of the user of the algorithmic decision overwhelmingly outweigh the rights and freedoms of the data subject, the data subject's exercise of the right to algorithmic explanation may be barred. In ordinary circumstances, given the diversity and complexity of the application scenarios of algorithmic decision-making in China, after comprehensively analyzing the various proposals advanced by scholars and considering the value orientation of the right, the thesis ultimately supports a scenario-based standard of explanation, i.e., setting the standard of explanation according to the different application scenarios of algorithmic decision-making and the substantive influence the algorithmic decision has on the final decision.

【Abstract】 The use of algorithmic decision-making is becoming increasingly common with the advancement of artificial intelligence technology. Although it provides convenience, its "black box" nature also results in potential harm such as algorithmic discrimination, dominance, and manipulation. Such harm infringes on the legitimate rights of individuals, including their privacy and freedom, and undermines social justice and fairness. To mitigate these risks and balance the interests of all parties involved, China's "Personal Information Protection Law" has established the right to algorithmic explanation in Article 24, Paragraph 3 and Article 48. This right is an emerging development in China, and further improvements to its provisions are still needed. Regarding the definition of the right to algorithmic explanation, this article takes a legislative-purpose perspective and considers the technical characteristics of automated decision-making algorithms. It posits that the right to algorithmic explanation grants data subjects (individuals) the right to demand an explanation of the operation and decision-making of an algorithm from its user when it may have, or has had, a significant legal or similar impact on their personal rights and interests. When determining the subjects and obligations related to the right to algorithmic explanation, this article considers both EU and Chinese legal provisions and potential implementation issues. It specifies that the right's subject is a natural person, excluding legal persons, and that the obligation subject includes legal persons, public institutions, and other organizations, but not natural persons. It is important to note that the obligation subject is the user of the algorithmic decision-making process, not the algorithm designer. However, in practice, if the user has difficulty explaining the algorithmic decision due to objective factors such as technology or knowledge, the algorithm designer should assist the user in providing an explanation. Regarding the rules for exercising the right to algorithmic explanation, this article draws on the legal provisions of the EU and the United States while taking into account China's national conditions. It clarifies the scope of application, the objects of explanation, and the stages of exercise. The scope of application should meet two conditions: first, the type of decision-making cannot be limited to decisions based solely on automated processing but should also include hybrid decision-making with "human participation" factors, because in practical applications individuals may rely entirely on the algorithm's suggestions, making the algorithmic decision the substantive basis for the final decision; second, the decision should have a "significant impact" on personal rights and interests, which can be understood as a "legal or similarly significant impact". Regarding the objects of explanation, individuals can demand an explanation of the algorithmic system or of a specific decision. Regarding the stages of exercising the right, it should cover the pre-decision, in-decision, and post-decision stages, so as to combine the objects and stages of explanation organically and provide more comprehensive protection for personal rights and interests. Regarding the standard of explanation for the obligation subject, this article draws on EU law and proposes a special circumstance exempting the obligation to explain: when the legitimate rights and interests of the user of the algorithmic decision overwhelmingly outweigh the rights and freedoms of the data subject, the exercise of the right to algorithmic explanation can be barred. In general circumstances, however, given the diversity and complexity of the application scenarios of algorithmic decision-making in China, and after analyzing the various proposals put forward by the academic community and considering the value orientation of the right to algorithmic explanation, this article supports the adoption of a scenario-based explanation standard, which sets the standard based on the different application scenarios of algorithmic decision-making and the substantive impact of the algorithmic decision on the decision made.

  • 【CLC Number】D922.16;D923
