Two AI projects will share A$250,000 in funding under the NSW Government’s Access to Justice Innovation Fund (AJIF). The Attorney General stated that the cutting-edge projects will help people with a disability, seniors, Aboriginal people, and those from culturally and linguistically diverse backgrounds understand and exercise their legal rights.
As part of the program, a not-for-profit organisation will receive A$174,000 to build a language processing model to help an estimated 50,000 people across NSW who may lack the legal literacy needed to search for legal resources and services online.
Separately, the University of Sydney has been awarded A$76,000 to develop a fairer assessment model for parents with a cognitive disability involved in care proceedings before the Children’s Court.
The not-for-profit organisation’s AI model will be free to other not-for-profit organisations in NSW and the University of Sydney’s resources will be available to parents, Children’s Court clinicians, the court and statutory caseworkers across the state at no cost.
The CEO of the not-for-profit organisation stated that his team will collect thousands of language samples from diverse groups in NSW to incorporate into the AI model’s training, with the help of hundreds of pro bono lawyers. They hope the project can serve as an example of AI for good and can ultimately be used by legal organisations across the justice sector.
Meanwhile, Dr Susan Collings from the University of Sydney’s Research Centre for Children and Families said parents with an intellectual disability are much more likely than other parents to lose their child to statutory care. The government has pledged A$1 million over four years to the AJIF, with applications for the next round of funding to open in the second half of 2021.
According to Artificial Intelligence: Australia’s Ethics Framework, a discussion paper released by the Australian government in 2019, artificial intelligence (AI) is changing societies and economies around the world.
Data61 analysis reveals that over the past few years, 14 countries and international organisations have announced A$86 billion for AI programs. Many of these technologies are powerful, with considerable potential for both improved ethical outcomes and ethical risks. The report identifies key principles and measures that can be used to achieve the best possible results from AI while keeping the well-being of Australians as the top priority.
According to the paper, recent advances in AI-enabled technologies have prompted a wave of responses across the globe, as nations attempt to tackle emerging ethical issues.
Germany has delved into the ethics of automated vehicles, rolling out the most comprehensive government-led ethical guidance available on their development. New York has put in place an automated decisions task force to review key systems used by government agencies for accountability and fairness. The UK has a number of government advisory bodies, notably the Centre for Data Ethics and Innovation. The European Union has explicitly highlighted ethical AI development as a source of competitive advantage.
The paper notes that Australia’s colloquial motto is a “fair go” for all. Ensuring fairness across the many different groups in Australian society will be challenging, but this cuts right to the heart of ethical AI. There are different ideas of what a “fair go” means, and algorithms cannot always treat every person exactly the same; they should, however, operate according to similar principles in similar situations.
But while like goes with like, justice sometimes demands that different situations be treated differently. When developers need to codify fairness into AI algorithms, there are various challenges in managing often inevitable trade-offs, and sometimes there is no “right” choice because what is considered optimal may be disputed. When the stakes are high, it is imperative to have a human decision-maker accountable for automated decisions; Australian laws already mandate this to a degree in some circumstances.
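To see why these trade-offs are often inevitable, consider two common statistical definitions of fairness for a binary decision: demographic parity (equal approval rates across groups) and equal opportunity (equal approval rates among those who genuinely qualify). The sketch below is purely illustrative and not drawn from the report; the groups, numbers, and metric choices are invented. On the same set of decisions, the two criteria can disagree, so narrowing one gap may widen the other.

```python
def selection_rate(decisions):
    """Fraction of a group that received a positive (approved) decision."""
    return sum(decisions) / len(decisions)

def true_positive_rate(decisions, outcomes):
    """Among group members who genuinely qualified (outcome == 1),
    the fraction that was approved."""
    approved_if_qualified = [d for d, y in zip(decisions, outcomes) if y == 1]
    return sum(approved_if_qualified) / len(approved_if_qualified)

# Synthetic data: 1 = approved / qualified, 0 = not. Entirely made up.
group_a = {"decisions": [1, 1, 1, 0, 0], "outcomes": [1, 1, 0, 1, 0]}
group_b = {"decisions": [1, 0, 0, 0, 0], "outcomes": [1, 1, 0, 0, 0]}

sr_gap = abs(selection_rate(group_a["decisions"])
             - selection_rate(group_b["decisions"]))
tpr_gap = abs(true_positive_rate(group_a["decisions"], group_a["outcomes"])
              - true_positive_rate(group_b["decisions"], group_b["outcomes"]))

print(f"demographic parity gap: {sr_gap:.2f}")   # 0.60 - 0.20 = 0.40
print(f"equal opportunity gap:  {tpr_gap:.2f}")  # 2/3 - 1/2 ≈ 0.17
```

The two gaps differ, and adjusting decisions to close one typically moves the other; which metric to prioritise is a value judgment, which is why the paper argues a human decision-maker must remain accountable.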