U.S.: Interpretability Methods Describe How AI Works

U.S. researchers have developed and tested explanation methods, also known as interpretability methods, to explain how black-box machine learning models make predictions. At the most fundamental level, explanation methods are either global or local. While global explanations aim to describe an entire model’s overall behaviour, local explanation methods concentrate on explaining how the model arrived at a single prediction.

This is frequently accomplished by building a separate, simpler model that approximates the larger black-box model. But because deep learning models function in fundamentally nonlinear and complex ways, it is particularly difficult to build a global explanation model that is effective.
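
As an illustration of this surrogate idea, the minimal sketch below trains a shallow decision tree to mimic the predictions of a black-box random forest and then measures how faithfully it does so. The dataset, the choice of models, and the fidelity check are illustrative assumptions, not the specific method discussed by the researchers.

```python
# Minimal sketch of a global surrogate: an interpretable model trained to
# mimic a black-box model's predictions (assumes scikit-learn is installed;
# the dataset and model choices are purely illustrative).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)

# The "black box" whose overall behaviour we want to approximate.
black_box = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Fit a shallow decision tree on the black box's *predictions*, not the labels.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, black_box.predict(X))

# Fidelity: how often the surrogate agrees with the black box.
fidelity = accuracy_score(black_box.predict(X), surrogate.predict(X))
print(f"Surrogate fidelity to black box: {fidelity:.2f}")
```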

According to Yilun Zhou, a graduate student in the Interactive Robotics Group of the Computer Science and Artificial Intelligence Laboratory (CSAIL) who studies models, algorithms, and evaluations in interpretable machine learning, this difficulty has caused researchers to shift much of their recent focus toward local explanation methods.

The most common local explanation techniques fall into three major groups. The earliest and most widely used is known as feature attribution. Feature attribution methods show which features the model prioritised when it made a particular decision.

Features are the input variables a machine-learning model uses to make predictions. In essence, feature attribution techniques reveal what the model concentrates on when making a prediction.
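
One simple way to approximate local feature attribution is an occlusion-style perturbation test: change one feature of a single input at a time and see how much the model’s predicted probability moves. The sketch below assumes scikit-learn and an illustrative dataset and model; it is not the specific attribution method the researchers evaluated.

```python
# Minimal sketch of local feature attribution for one prediction:
# perturb each feature of a single input (replace it with the training mean)
# and measure the change in the model's predicted probability.
# Assumes scikit-learn; the model and data are illustrative only.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier

data = load_breast_cancer()
X, y = data.data, data.target
model = GradientBoostingClassifier(random_state=0).fit(X, y)

x = X[0].copy()                         # the single input we want to explain
base = model.predict_proba([x])[0, 1]   # baseline probability for class 1

attributions = []
for j in range(X.shape[1]):
    x_pert = x.copy()
    x_pert[j] = X[:, j].mean()          # "remove" feature j by averaging it out
    attributions.append(base - model.predict_proba([x_pert])[0, 1])

# Features whose removal changes the prediction the most mattered most here.
top = np.argsort(np.abs(attributions))[::-1][:5]
for j in top:
    print(f"{data.feature_names[j]:<25} attribution {attributions[j]:+.3f}")
```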

The second category of explanation technique is the counterfactual explanation. Given an input and a model’s prediction, these techniques show how to alter the input so that it falls into a different class.

For instance, if a machine-learning algorithm predicts that a borrower will be denied credit, a counterfactual explanation reveals what must change for the loan application to be approved. Perhaps the person needs a higher credit score or income, two features the algorithm utilised when making its prediction.
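
To make the credit example concrete, the sketch below runs a naive greedy counterfactual search on a toy loan model: starting from a denied applicant, it nudges the two hypothetical features (credit score and income) until the decision flips to approved. The synthetic data, step sizes, and model are assumptions for illustration, not the counterfactual methods under study.

```python
# Minimal sketch of a counterfactual search on a toy credit model.
# Everything here (data, thresholds, step sizes) is illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic applicants: columns are [credit_score, annual_income_in_thousands]
X = np.column_stack([rng.normal(650, 60, 1000), rng.normal(55, 15, 1000)])
y = ((X[:, 0] > 660) & (X[:, 1] > 50)).astype(int)   # 1 = approved

model = LogisticRegression(max_iter=1000).fit(X, y)

applicant = np.array([620.0, 45.0])                   # a hypothetical applicant
print("Initial decision:", "approved" if model.predict([applicant])[0] else "denied")

counterfactual = applicant.copy()
step = np.array([5.0, 2.0])                           # nudge size per feature
for _ in range(100):
    if model.predict([counterfactual])[0] == 1:
        break
    # Greedily increase whichever feature improves approval probability most.
    gains = []
    for j in range(2):
        trial = counterfactual.copy()
        trial[j] += step[j]
        gains.append(model.predict_proba([trial])[0, 1])
    best = int(np.argmax(gains))
    counterfactual[best] += step[best]

print("Original applicant:   ", applicant)
print("Counterfactual needed:", counterfactual)
```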

The third category of explanation approach is the sample importance explanation. Unlike the others, this approach requires access to the data that were used to train the model.

A sample importance explanation highlights the training sample a model relied on most when it made a particular prediction; ideally, this sample will be the one closest to the input data. This kind of explanation is especially helpful when one notices a seemingly irrational prediction.

For example, a data entry error may have affected a particular sample that was used to train the model. With this knowledge, that sample can be corrected and the model retrained for more accurate behaviour.
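
A heavily simplified stand-in for sample importance is a nearest-neighbour lookup: for a given prediction, retrieve the training sample most similar to the input so it can be inspected for errors. The sketch below assumes scikit-learn and an illustrative dataset; a full influence-based approach would be considerably more involved.

```python
# Minimal sketch of a sample importance explanation, simplified to a
# nearest-neighbour lookup over the training data. Assumes scikit-learn;
# this is an illustrative proxy, not an influence-function method.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

data = load_breast_cancer()
X_train, y_train = data.data[:-1], data.target[:-1]
x_query = data.data[-1:]                       # the input being explained

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
prediction = model.predict(x_query)[0]

# Find the training sample most similar to the query (in standardised space).
scaler = StandardScaler().fit(X_train)
nn = NearestNeighbors(n_neighbors=1).fit(scaler.transform(X_train))
_, idx = nn.kneighbors(scaler.transform(x_query))
influential = idx[0][0]

print(f"Prediction for query: {prediction}")
print(f"Most similar training sample: index {influential}, label {y_train[influential]}")
# If this sample looks wrong (e.g. a mis-entered value), it can be corrected
# and the model retrained.
```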

While machine-learning practitioners may occasionally find explanation methods helpful for finding defects in their models or for understanding how a system operates, end-users should exercise caution when attempting to employ them in practice.

Researchers assert that the community should put more effort into understanding how information is presented to decision-makers so they can grasp it. They added that greater regulation is required to ensure machine-learning models are utilised responsibly in practice; the solution goes beyond better explanation techniques alone.

Researchers anticipate future studies on explanation approaches for specific use cases, such as model debugging, scientific discovery, fairness audits, and safety assurance, in addition to new work aimed at improving explanations.

By figuring out the fine-grained features of explanation methods and the needs of different use cases, researchers could come up with a theory that would match explanations to specific situations. This could help avoid some of the problems that come up when using explanations in the real world.

PARTNER

Qlik’s vision is a data-literate world, where everyone can use data and analytics to improve decision-making and solve their most challenging problems. A private company, Qlik offers real-time data integration and analytics solutions, powered by Qlik Cloud, to close the gaps between data, insights and action. By transforming data into Active Intelligence, businesses can drive better decisions, improve revenue and profitability, and optimize customer relationships. Qlik serves more than 38,000 active customers in over 100 countries.

PARTNER

CTC Global Singapore, a premier end-to-end IT solutions provider, is a fully owned subsidiary of ITOCHU Techno-Solutions Corporation (CTC) and ITOCHU Corporation.

Since 1972, CTC has established itself as one of the country’s top IT solutions providers. With 50 years of experience, headed by an experienced management team and staffed by over 200 qualified IT professionals, we support organizations with integrated IT solutions expertise in Autonomous IT, Cyber Security, Digital Transformation, Enterprise Cloud Infrastructure, Workplace Modernization and Professional Services.

Well-known for our strengths in system integration and consultation, CTC Global proves to be the preferred IT outsourcing destination for organizations all over Singapore today.

PARTNER

Planview has one mission: to build the future of connected work. Our solutions enable organizations to connect the business from ideas to impact, empowering companies to accelerate the achievement of what matters most. Planview’s full spectrum of Portfolio Management and Work Management solutions creates an organizational focus on the strategic outcomes that matter and empowers teams to deliver their best work, no matter how they work. The comprehensive Planview platform and enterprise success model enables customers to deliver innovative, competitive products, services, and customer experiences. Headquartered in Austin, Texas, with locations around the world, Planview has more than 1,300 employees supporting 4,500 customers and 2.6 million users worldwide. For more information, visit www.planview.com.

SUPPORTING ORGANISATION

SIRIM is a premier industrial research and technology organisation in Malaysia, wholly owned by the Minister of Finance Incorporated. With over forty years of experience and expertise, SIRIM is mandated as the machinery for research and technology development, and the national champion of quality. SIRIM has always played a major role in the development of the country’s private sector. By tapping into our expertise and knowledge base, we focus on developing new technologies and improvements in the manufacturing, technology and services sectors. We nurture the growth of small and medium enterprises (SMEs) with solutions for technology penetration and upgrading, making SIRIM an ideal technology partner for SMEs.

PARTNER

HashiCorp provides infrastructure automation software for multi-cloud environments, enabling enterprises to unlock a common cloud operating model to provision, secure, connect, and run any application on any infrastructure. HashiCorp tools allow organizations to deliver applications faster by helping enterprises transition from manual processes and ITIL practices to self-service automation and DevOps practices. 

PARTNER

IBM is a leading global hybrid cloud and AI, and business services provider. We help clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs and gain the competitive edge in their industries. Nearly 3,000 government and corporate entities in critical infrastructure areas such as financial services, telecommunications and healthcare rely on IBM’s hybrid cloud platform and Red Hat OpenShift to effect their digital transformations quickly, efficiently and securely. IBM’s breakthrough innovations in AI, quantum computing, industry-specific cloud solutions and business services deliver open and flexible options to our clients. All of this is backed by IBM’s legendary commitment to trust, transparency, responsibility, inclusivity and service.