
ChatGPT put on trial: Can AI be a lawmaker?

Can OpenAI’s iconic AI chatbot effectively cater to human societal needs when writing laws? Or does the technology have a way to go?

Daniel Croft
Tue, 13 Feb 2024

The capabilities of artificial intelligence (AI) are being tested across all industries, with the technology aiding in analysis, manufacturing, and even creative avenues.

However, when it comes to our legal system, can AI replicate the discretion needed to determine the best interests of human society? Can AI write laws?

An academic from Charles Darwin University (CDU) has conducted research to determine just that. Associate Professor Guzyal Hill put everyone’s favourite AI chatbot, OpenAI’s ChatGPT, to the test, asking it to write legislation for domestic violence.


As CDU points out in its release, domestic violence is very much a human issue, one that destroys lives and families and one that kills dozens of Australians every year.

“Domestic violence represents a complex human problem, with up to 50 women dying every year in Australia alone,” said Hill.

“The federal, state and territory governments introduced the joint national plan to end violence against women and children within one generation. Can ChatGPT help in producing a high-quality definition of domestic violence?

“After running several tests and comparing with the definition produced by the Australian Law Council, the answer is ‘not yet’ – human drafting is still superior. ChatGPT, however, was very useful in classifying and identifying underlying patterns of types of domestic violence.”

Hill added that while ChatGPT did show some capacity for analysing the issue, AI tools like it should not be relied on for legal advice, particularly as these chatbots often base their advice on US law due to the datasets they are trained on.

“As tempting as it is, ChatGPT should not be used for legal advice. The chatbot is based on predicting text based on probability. The majority of text ChatGPT was trained on does not come from Australia,” said Hill.

“The Australian legal system is different to the US. There are also important differences between jurisdictions; for example, Northern Territory and New South Wales have vast differences in criminal law.”

“It is unlawful to provide legal advice without a practising certificate across all Australian jurisdictions. For example, in New South Wales, the maximum penalty for engaging in unqualified legal practice is a fine of 250 penalty units [$27,500] or imprisonment for two years or both, in accordance with section 10 of the Legal Profession Uniform Law 2014 (NSW).”

“The main dangers of using ChatGPT for legal advice are in sharing personal information, acting on hallucination or simply wrong advice.”

As Hill pointed out, ChatGPT now displays a disclaimer stating that it is unable to provide legal advice.

While AI is not yet capable of writing legislation or properly grasping the needs and nuances of the legal system, Hill argues that legal professionals and students should be taught how to use and understand AI tools, both to mitigate the risks the technology presents and to spur further research into AI's role in the legal system.

“For lawyers and law students, AI is an area where we must upskill,” said Hill.

“Eluding or ignoring AI has many unpredictable drawbacks and at least several predictable dangers, such as making major mistakes in misuse of AI; missing an opportunity to lead the debate on the development of law with the emergence of AI; and allowing experts from other fields to develop solutions that do not consider fundamental human rights or contradict foundational principles of rule of law.

“Without any doubt, AI poses serious risks and threats if used unchecked. Lawyers and law students should treat AI in a way that is practical, cautious, and yet curious.

“At this point, AI systems are an augmentation of human acuity rather than an abrogation of legal analysis and reasoning. We, as lawyers, have an opportunity to inhabit this new AI domain with the potential to transform law and the way we approach law globally.”

Daniel Croft

Born in the heart of Western Sydney, Daniel Croft is a passionate journalist with an understanding of, and experience writing in, the technology space. Having studied at Macquarie University, he joined Momentum Media in 2022, writing across a number of publications including Australian Aviation, Cyber Security Connect and Defence Connect. Outside of writing, Daniel has a keen interest in music and spends his time playing in bands around Sydney.
