In a webinar titled “Cutting Through the Noise,” David Copping and Ethan Ezra, lawyers at the firm Farrer & Co, discussed the legal implications of artificial intelligence (AI) in the higher education sector, emphasizing the breadth and variety of AI as a technological phenomenon.
There are evident gaps in certain areas, such as the legal treatment of deepfakes and the creation and deployment of AI tools posing existential risks, as well as policy areas that require further development, such as plagiarism. Nonetheless, an established legal framework allows many questions about AI usage to be addressed without the need for new laws or regulations. Students and academics perceive generative AI very differently: students largely favor its use, while academics predominantly see it as a risky vehicle for plagiarism. The challenges associated with student use of generative AI include misrepresentation of academic ability and potential infringement of third-party intellectual property. Given the widespread prevalence and popularity of generative AI among students, the webinar suggested that enforcing a complete ban on its use may prove impractical.
To manage and regulate the use of generative AI effectively, Copping and Ezra recommended that universities consider the following strategies:
On the question of whether an AI can own a patent, the UK Intellectual Property Office (UKIPO) rejected two patent applications in 2018, invoking sections 7 and 13 of the Patents Act 1977, which require the involvement of “a person” in the patent process. The applications named an AI machine called DABUS as the inventor; because DABUS was deemed not to qualify as “a person,” its eligibility as an inventor or patent owner was called into question. The case is currently before the Supreme Court, and the outcome is pending. Notably, a UKIPO consultation on the matter concluded that no changes to the existing law were necessary.
According to the Copyright Designs and Patents Act 1988, the “author” of a work is the individual who creates it. For computer-generated works, Section 9(3) CDPA stipulates that “the author shall be taken to be the person by whom the arrangements necessary for the creation of the work are undertaken.” The question of authorship for AI-generated art remains open to interpretation, with potential candidates including the creator of the AI system and/or the person instructing the system to generate the artwork.
The current copyright framework in the UK and other jurisdictions is considered applicable to AI-generated work. Ownership is less settled, but the general principle is that the author of a work is its owner, or, in an employment context, the employer may claim ownership. AI platforms take varied approaches; OpenAI, for example, assigns ownership of the generated output to the user.
According to Copping and Ezra, the use of AI poses a risk of intellectual property (IP) infringement, particularly because AI tools such as ChatGPT are “trained” on extensive third-party datasets drawn from the internet in order to generate output material.
While exceptions to infringement exist, such as text and data mining for non-commercial research, these are less likely to apply to larger-scale data extraction in a commercial context.
Concerning data protection, the associated risks involve users entering personal information into AI tools, as many tools retain or make onward use of input data. Additionally, there are risks related to AI tools targeting publicly available personal data, such as profiles and registries.
Higher education institutions should address these issues by conducting suitable data protection impact assessments, implementing AI policies that regulate permissible input data (e.g., limiting personal details), and employing technical measures to deter data scrapers.
In terms of bias, numerous studies have highlighted various social and political biases inherent in AI tools like ChatGPT. Higher education institutions should carefully consider the types of work and tasks for which they would permit AI to be used.
For example, queries on sensitive political issues, student admissions, and student welfare are areas where biases may have significant implications and should be scrutinized.
The symbiosis between artificial intelligence and education is on an upward trajectory, with the coming years promising significant expansion of AI applications within the sector, according to a report by market research firm Technavio.
In its report titled “Artificial Intelligence Market In The Education Sector Market 2023-2027”, Technavio noted that this growth underscores the rising importance of integrating AI into the educational landscape to improve learning outcomes. Over the next five years, the research found, the education sector is poised for a transformative leap fueled by AI: the report projected that AI in the education market will drive a net increase of $1,100.07 million by 2027, a compound annual growth rate of 41.14 percent over the forecast period, reflecting the rapidly accelerating pace of AI adoption in the sector.
A coalition of education and technology organizations, led by TeachAI, has unveiled the AI Guidance for Schools Toolkit aimed at equipping educational institutions with the necessary framework and resources to responsibly integrate artificial intelligence into the classroom.
The initiative is backed by industry leaders including Code.org, ETS, the International Society of Technology in Education, Khan Academy, and the World Economic Forum.