
Created by mathematicians and machine learning experts from
the University of Cambridge, Darktrace’s Enterprise Immune System uses AI
algorithms that mimic the human immune system to defend enterprise networks of
all types and sizes.
OpenGov recently spoke to Sanjay Aurora, MD, Asia Pacific at Darktrace, about how cyber-attacks are evolving and how artificial intelligence can help defend against increasingly sophisticated attacks.
A new era of low and slow cyberattacks
Today, cyber-attacks make headlines all the time. But these are not the cyber-attacks of 15 to 20 years ago, defacing websites or stealing credit card information. Cyber-criminals are now carrying out significantly more sophisticated and stealthy 'low and slow' targeted attacks.
Mr. Aurora said, “You look at the DNC email hack in the US. They are still talking about whether there was an attack, whether there was an attempt. Trust in the information is getting lost. This is the new generation of cyberattacks that we are dealing with. People and organisations, from the largest of the large to the smallest, are unable to trust information.”
He gave another example of a client, a laboratory in Australia, which handles a great deal of confidential patient information. The lab is worried about somebody stealing that information, but it is even more worried about somebody coming in undetected and tweaking some of it, so that the company can no longer tell what is real and what is fake. These low and slow targeted attacks reside in organisations for a long time. The attackers are not going after organisations randomly; sometimes they are after a particular piece of information from a specific organisation.
“But organisations, unfortunately, still rely on finding
‘good’ and ‘bad’. They build walls. Like how did you enter this building? You
showed your ID to the gatekeeper, the gatekeeper checked your ID, gave you a
pass. You tap the pass and you are in,” Mr. Aurora said.
He continued, “Or I could have tapped you in. Then I would have become an insider threat. Until recently, cyber security relied heavily on rules and signatures, on locks and walls, whereas attackers are using so many mechanisms to get inside. If you have the tallest of firewalls, attackers will get even taller ladders. Notwithstanding policies, compliance and training, all of us sitting here are insider threats to our respective organisations.”
Beyond the reliance on locks and walls, the second issue is the lack of visibility. Today, everything with an IP address is a potential point of entry: a printer, the audio/video conferencing facilities in corporate boardrooms, even a connected coffee machine. Any of these could be an easy way in, and they are not even on the radar of many organisations.
Defining the normal to detect the abnormal
Mr. Aurora compared an organisation to the human body. The human body is attacked by unknown unknowns every second. Yet we have thrived and survived for millions of years. The skin is our firewall; there are still things that get in. The body’s immune system reacts and fights by first understanding what is normal and then detecting what is abnormal against that baseline.
“Around four years ago, Cambridge University’s mathematicians
came up with a concept largely based on this principle. If the human body can
understand and fight back autonomously, why would an organisation not be able
to do it? Because there is information. The information is in the data. By
using mathematics and unsupervised machine learning, if you are able to
establish a pattern of life, then anything abnormal which disturbs that normal
pattern will be detected,” explained Mr. Aurora.
The abnormal could be a device talking to a certain server at 2 in the morning, which it has never done before, or a user downloading huge amounts of information, which he has never done before. It is a very subtle change in the behaviour of a device, a user or a network. That user or device has not broken any rules, and using rules alone, an organisation would not be able to detect the anomalous behaviour.
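This kind of detection can be framed as unsupervised anomaly scoring. The snippet below is a minimal sketch of the general idea, not Darktrace’s proprietary algorithms: a model learns a “pattern of life” from unlabelled historical behaviour, then scores new events by how far they deviate. The features (megabytes transferred, hour of day, distinct servers contacted) and all numbers are illustrative assumptions.

```python
# Minimal sketch: learn "normal" behaviour without labels, flag deviations.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Hypothetical normal behaviour: modest transfers in office hours, few peers.
normal = np.column_stack([
    rng.normal(50, 10, 1000),   # MB transferred per hour
    rng.normal(13, 3, 1000),    # hour of day of activity
    rng.normal(5, 2, 1000),     # distinct servers contacted
])

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal)  # unsupervised: no labels, no rules

# A device suddenly moving huge volumes at 2 a.m. to a single new server.
suspect = np.array([[900.0, 2.0, 1.0]])
print(model.predict(suspect))        # -1 means anomalous
print(model.score_samples(suspect))  # lower score = more abnormal
```

No rule was ever written about 2 a.m. transfers; the event is flagged only because it deviates from the learned pattern of life.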
The organisation can then mathematically deduce that if this is happening here, and other parts of the network are behaving like that, the anomaly is a leading indicator (Mr. Aurora stressed that “leading indicator” is key here) of a much larger problem that might be brewing in the organisation. This approach allows organisations to take proactive measures.
‘The battle at the border is over’
Sitting inside the network and using the principle of the immune system, Darktrace establishes a pattern of life within a week. We asked whether the system is pre-trained on datasets.
Mr. Aurora replied, “We don’t even tell the system whether it is a bank or a law firm or a government entity. Because the moment you start putting in rules, it starts with presumptions and the learning goes wrong. Every entity operates differently. Even Bank A will be different from Bank B.”
Within a week, the system understands how the organisation works and how users and devices behave, and it is able to alert the organisation to anything abnormal that should be investigated.
Frequently, the large attacks we read about turn out to be the culmination of several leading indicators that existed for a long time. For instance, employees using a VPN to hide their activity, such as shopping or browsing prohibited websites, are naive insider threats; unknowingly, they present massive risks.
Mr. Aurora said, “That is a result of these leading
indicators which are already present in the organisation and were going
unnoticed.”
“The battle at the border is over. You cannot defend the border any more. The real battle is inside. The battle now is how you deduce from the leading indicators and proactively stop threats early in their tracks, before they become headlines.”
The system gets wiser as it processes more and more traffic. This is now being taken a step further: machines are autonomously responding to threats. By deducing the real issue behind a threat, the machines can take very precise action, such as slowing an attack’s progress, giving the human that little extra time to intervene and stop it. This is particularly handy for fast-moving threats like ransomware. During the WannaCry attack in May this year, Darktrace’s Enterprise Immune System successfully detected and contained the attacks for a number of its customers, including an NHS (National Health Service) agency.
This is accomplished by Darktrace’s Antigena solution. Its autonomous
response capability allows organisations to directly fight back, and networks
to self-defend against specific threats, without any disruptions.
Mr. Aurora said, “This is a cyber arms race. You cannot fight those machine- or AI-led attacks using conventional security teams, who will raise a ticket and try to work out from the logs what is going on. Traditional security people are like firefighters. They solve problems. We have shifted the paradigm. Instead of just firefighting the known issues, you discover the unknown issues and take action.”

The use of artificial intelligence (AI) has increased rapidly, accelerating during the pandemic. Machine learning and deep learning algorithms and models process massive amounts of data to enable faster, smarter and better decision-making. As a result, tech-enabled forecasting holds enormous promise for the financial industry, which has long been the steward of massive data sets.
Business leaders recognise the importance of data tailored to each function and the role analytics tools play in leveraging data. In this context, data-driven decision-making analytics software inherently provides a competitive advantage.
Advancements in data, analytics and machine learning mean that businesses with large amounts of data have an incredible opportunity to capitalise on it. However, they must do so with an eye toward scale, change management and a curiosity culture.
Data-driven decision making
Dr Geraldine Wong, Chief Data Officer, GXS, revealed in an exclusive interview with Mohit Sagar, CEO and Editor-in-Chief, OpenGov Asia, that collecting, extracting, structuring and analysing business insights was historically a time-consuming task that slowed data-driven decision-making.
Dr Geraldine, who is among the 2022 Global Top 100 Innovators in Data and Analytics Today, notes that business intelligence software has since enabled people without significant technical experience to analyse and extract insights from their data. Less technological expertise is needed to provide reports, trends, visualisations, and insights that aid decision-making.
AI technologies such as machine learning and natural language processing are evolving and, when combined with data, analytics and automation can assist organisations in achieving their objectives – whether improving customer service or optimising the supply chain.
Companies should clearly understand what AI means for their business and then recognise how it adds value. Getting the right skill sets in place, and establishing a shared definition of AI across a multi-cultural organisation, is significant.
According to Dr Geraldine, everyone can be a data scientist. The major challenge is finding the appropriate fit – the right individuals with the proper skill set – and then keeping them motivated and engaged.
Moreover, having the right set of digital tools to manage data insights content and digital marketing is essential. With this, organisations can create a strategy to engage their target customer segments from the start to the end of their customer journey. For example, companies can harness insights into customer behaviour and patterns, personas, conversion rate optimisation and many more digital metrics essential to anticipating customer needs and offering products and services which are most relevant to them.
“The way you market your products using these digital technologies will boost engagement because it is derived from data-driven insights,” Dr Geraldine believes.
Data privacy and trust concerns could also be a cultural component. Distinct cultures have varied ways of communicating and creating trust, as well as different approaches to cyber security and fraud prevention.
Dr Geraldine feels that it is essential for companies to take seriously their responsibility in protecting data privacy as well as to know how to build and earn the trust of their customers.
As part of the trust and innovation mesh, Dr Geraldine says there are fundamental questions that should be addressed – How do we make information more accessible? How can we make it simple for people to use our app? How do we ensure that our app is intuitive to our customers? She is convinced that companies have a role to play in bringing traditional physical business to a digital space. When integrated into a digital campaign, traditional marketing can reach more people, spread the message faster and increase the return on investment for the campaign.
However, this only happens when different products and services are promoted through a multi-channel approach as part of an integrated marketing strategy. To move prospects down the sales funnel, businesses need to ensure that the conversation with customers remains seamless across multiple communication channels, whether online, offline or both.
Effective data governance
Effective data governance allows business users to make decisions based on high-quality data and well-managed information assets. However, putting in place a data governance framework is not easy. Dr Geraldine strongly believes that data governance should be a priority in both the public and private sectors.
Data ownership issues, data inconsistencies between departments, and the growing collection and utilisation of big data in businesses are all common concerns. Data governance enables processes to run smoothly and reduces mistakes in a database, giving the business a solid place to start. It also saves time and money.
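One concrete expression of data governance is an automated data-quality gate that catches mistakes before data reaches decision-makers. The sketch below is a minimal, hypothetical illustration in Python with pandas; the dataset, column names and rules are invented, and a real framework would codify far more checks.

```python
# Minimal sketch of a data-quality gate a governance framework might mandate.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [101, 102, 102, 104],
    "monthly_income": [3500.0, None, 4200.0, -50.0],
})

issues = []
if df["customer_id"].duplicated().any():
    issues.append("duplicate customer_id values")
if df["monthly_income"].isna().any():
    issues.append("missing monthly_income values")
if (df["monthly_income"].dropna() < 0).any():
    issues.append("negative monthly_income values")

# A real pipeline might block the load and alert a data steward here.
print("quality check failed:" if issues else "quality check passed", issues)
```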
In terms of data governance across the private and public sectors, Dr Geraldine is convinced that it should be planned and organised carefully and intentionally. A successful data governance strategy involves careful preparation, the proper people and the right tools and technologies.
A data governance framework provides industry best practices for managing data governance initiatives and exploring data stewardship. Data quality, privacy, and governance as a means of building trust are consistently recognised as the most significant issues in data management. As the relevance of data democratisation for business transformation grows, so does the number of non-technical data consumers who desire convenient self-service access to data for their use but are ill-equipped to control data properly.
As anticipated, the outcomes of the data governance journey – data quality and privacy – are critical to promoting enterprise-wide data literacy that delivers commercial value while maintaining confidence. It is essential to eliminate operational risks and allow individuals to use data responsibly.
Even if increasing the use of AI and automation accelerates the process of creating value, everyone in the organisation must be able to use data the same way. This can benefit the organisation by making data exchange faster, more widespread and more straightforward. Introducing new data sources and information can aid operational reporting and analysis and make data-driven decisions easier.
Modern business requires data literacy
Data literacy is about educating stakeholders about the information available and organising it in a way that makes it easy to identify and consume. When a data governance team acknowledges the importance of data literacy in an organisation’s data governance strategy, the result is a well-defined data catalogue that any staff member can access.
Establishing trust is a holistic endeavour: it is both a leadership and a design issue. It requires both cultural and practical strategies, as well as the engagement of everyone. Without widespread data literacy and clearly defined data terms and frameworks, communication channels can break down, resulting in catastrophic results.
Dr Geraldine highlighted the recently established Digital Trust Centre (DTC) of the Ministry of Communications and Information that could assist the banking industry in gaining consumer trust. However, digital trust must be developed and started within the organisation before it extends to external stakeholders.
To embark on such a remarkable digital journey, organisations require solid, mutually beneficial partnerships so they can grow together and exploit new business opportunities. Customers can then benefit from rapid digital innovation to create new business models, enter new markets and accelerate profitable expansion.
Moreover, alternative data is becoming increasingly popular in Singapore. Dr Geraldine is excited about the use of securely shared data to make financial services more available to a broader range of customers.
With developing technology, the banking industry may well expand its use of alternative data, with appropriate protections, over the next five years.
In addition to the ethical and responsible use of AI, another data trend that Dr Geraldine is expecting in the next two to four years is an increase in the number of organisations that will be doubling down on using data to build customer engagement models and within their ecosystems. “What will be interesting to see is how this translates into better customer experience and how companies ensure stringent data protection.”
Given the positive attitude that nations across the world have to technology and the plethora of digital initiatives being put in place to better the citizen experience, Dr Geraldine is optimistic about the future of data governance and the opportunity to use data in a secure manner to improve customer experiences.
It is difficult to conduct business in today’s world without a dependable website, which is where professional web development services come in. Developing an online presence does not end with the creation of a basic website for a company or organisation.
Developers can use web development tools to work with a range of technologies and should be able to deliver faster, less expensive mobile development. Responsive site design improves the online browsing experience while also allowing for better SEO, lower bounce rates and less upkeep.
The tools an organisation selects should deliver a good ROI. Hence, cost-effectiveness, ease of use, scalability, portability and customisation are the factors to consider when choosing a web development tool.
Neo4j’s Vice President of Global Cloud and Strategic Sales, Kesavan Nair – or simply Kay – discusses in depth with Mohit Sagar, CEO and Editor-in-Chief of OpenGov Asia, how big companies set up their growth engine drivers.
With his proven track record of entrepreneurial leadership in the open-source domains, cloud, SaaS, big data and analytics with both early-stage technology firms and large public companies, Kay is an authority on the topic.
On-Premise vs Cloud
The location of the data is the key distinction between cloud-based and on-premises (on-prem) versions. Cloud software is hosted on the vendor’s servers and accessed through a web browser, whereas on-premises software is installed locally on the company’s PCs and servers.
When making a choice, a variety of factors must be considered in addition to accessibility – software ownership, cost of ownership, software upgrades, and additional services like support and implementation.
Kay explains that the cloud database-as-a-service (DBaaS) market is one of the fastest-growing markets in enterprise software. “We need to make sure that we are able to be where our customers want us to be, which is in the public cloud.”
As an example, he cited the use case of Levi’s, one of their longstanding customers. Levi’s had been running on-prem for a long time and wanted to switch as part of its digital transformation strategy. It had eight different applications across various business units running on-prem and wanted to move all of those services into a cloud service running on Amazon. Neo4j helped with the migration in about three months.
“That was an excellent example of how Neo4j AuraDB Enterprise aided in the execution of Levi’s digital transformation,” Kay enthusiastically stated. “For Levi’s, Neo4j became one of the main motivators for the enterprise to experiment and try new ideas, which accelerated their transformation quite quickly.”
Neo4j counts both start-ups and established companies in its fold. Its largest customers include the likes of Siemens and Dun & Bradstreet, alongside PwC Australia, PwC US, BMW and Walmart. Current, a neobank in the US, runs its core database system on Neo4j, and Qualicorp, the biggest healthcare insurance provider in Brazil, runs its mission-critical database systems on Neo4j.
Speaking of their journey, Kay shared that they started as a database company where most of their customers use the Neo4j database for transactional workloads. Now, interestingly, about 90% of their customers use either a public cloud or a cloud managed by Neo4j.
“We’ll soon cover all the major cloud service providers, so customers can choose where to deploy their apps and where to use the service. This will bring us closer to where our customers are growing,” says Kay confidently.
Graph Data Platforms: The First Choice for Application Development
According to Kay, their graph database promises data consistency, performance, and scalability. It can search for patterns and connections in data’s interconnected relationships. “Neo4j now includes a graph data science platform. Both data scientists and developers can use this platform to meet their demands. And I believe it gives us an extremely attractive product to the market at large.”
When governments had to locate community infections during the pandemic, the benefits of the Graph Data Platform were most evident. The Graph Data Platform with AI has proven to be a great tool for managing data in real-time, from tracking connections through complex social networks to understanding linkages.
On the other hand, graph data science assists organisations in addressing some of their most challenging and complicated problems. “Neo4j Graph Data Science is a platform for connected data analytics and machine learning that enables you to better anticipate the future by understanding the relationships in huge data.”
He shared that those two key strategic products under the Neo4j Aura portfolio of cloud products are AuraDS (built for data scientists) and AuraDB (built for developers).
Graph Database Technology is specifically designed and optimised for identifying patterns and hidden connections in highly interconnected datasets. Graph data stores are easy to use because they mimic how the human brain thinks and maps associations using neurons (nodes) and synapses (relationships).
A graph database stores and queries connected data in a node-and-relationships format efficiently. As a result, graph technology excels at problems where there is no prior knowledge of path length or shape by efficiently finding neighbouring data using graph storage and infrastructure.
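As a minimal sketch of what this looks like in practice, the snippet below stores and traverses connected data using the official Neo4j Python driver and Cypher. The connection URI, credentials and the tiny FOLLOWS graph are hypothetical, not drawn from the interview.

```python
# Minimal sketch: store and traverse connected data in Neo4j.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("neo4j://localhost:7687",
                              auth=("neo4j", "password"))

with driver.session() as session:
    # Create two nodes and the relationship between them.
    session.run(
        "MERGE (a:Person {name: $a}) "
        "MERGE (b:Person {name: $b}) "
        "MERGE (a)-[:FOLLOWS]->(b)",
        a="Alice", b="Bob",
    )
    # Traverse paths of unknown length: everyone reachable from Alice
    # through one or more FOLLOWS hops.
    result = session.run(
        "MATCH (p:Person {name: $name})-[:FOLLOWS*1..]->(other) "
        "RETURN DISTINCT other.name AS name",
        name="Alice",
    )
    for record in result:
        print(record["name"])

driver.close()
```

The variable-length pattern `[:FOLLOWS*1..]` is exactly the kind of query the paragraph above describes: no prior knowledge of path length or shape is needed.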
Kay listed some of the most typical graph use cases:
- Fraud Detection & Analytics
- Artificial Intelligence & Machine Learning
- Real-Time Recommendation Engines
- Knowledge Graphs
- Network & Database Infrastructure Monitoring
- Master Data Management (MDM)
All of these have one thing in common – to be successful, an enterprise needs to use datasets that dynamically change over time and are connected to each other.
Neo4j offers four benefits of using graph databases:
- Natural and easy data modelling
- Ability to adapt to changing data structures
- Support for real-time updates and queries running simultaneously
- Storage and a natively indexed data structure
Connected data in property graphs enables the enterprise to illustrate and traverse many interactions and find context for the next breakthrough application or analysis.
Kay encourages businesses to choose a cloud strategy that fits their needs and to look for a provider that lets them move their assets whenever they want, as many enterprises have evolving cloud strategies and flexibility is very important.
“With us, Neo4j, you find value. It was predicted that by 2025, all smart applications would use graph technology in some way. So, graph databases are a natural fit for any new application that is being built today. This is because it is much easier to get insights from them,” Kay believes.
Neo4j Graph Database Platform has developed into a common form of information technology and has benefited businesses in a variety of ways. Numerous corporate game-changing use cases in fraud detection, financial services, life sciences, data science, knowledge graphs, and other areas have been made possible by the Neo4j Graph Database’s speed and efficiency advantages.
In the current VUCA environment, data security is crucial, challenging and fluctuating, particularly when dealing with sensitive data and the laws that govern it. Neo4j offers both safety and compliance, and frequently updates, enhances and expands its platforms. It can secure data in a variety of ways, including access control, user roles, protected environments and system design.
The Neo4j graph database has emerged as a critical technology for hundreds of companies, government agencies and non-governmental organisations, and will remain so. Kay is optimistic about the future and confident that Neo4j will always be well placed to offer the best services to both the public and private sectors.
Business not as usual
COVID-19 has affected everyone; it is a global phenomenon that has forced all sectors to rethink and strategise, prompting many businesses to implement emergency work-from-home plans and the use of various digital platforms.
Simultaneously, many organisations are looking for a solution that offers content design and development with data analytics that would speed up software adoption and serve clients more effectively.
A low-code software platform has been developed to enable organisations to measure, drive and act to maximise the efficacy of their digital transformation and accelerate the return on investment in software applications. This low-code software is a Digital Adoption Platform (DAP) that enables teams to add on-screen navigation hints to websites and apps without recoding them.
In an exclusive interview with Mohit Sagar, Group Managing Director and Editor-in-Chief of OpenGov Asia, Rafael Sweary, President and Co-Founder of WalkMe, elaborated on the latest trends seen in digital transformation and innovation. The explanation was accompanied by a website demonstration during the interview.
Because of the numerous intricacies involved, digital transformation used to be a lengthy process that could take businesses months to complete. Nowadays, the transition can be completed in a couple of weeks or even days.
“The goal of technology is to help people. Instead of you trying to understand systems and know-how to run systems, you would tell the programme what to do and the platform would walk you through the process and do it for you, making you much more efficient and focused on your task,” said Rafael.
WalkMe guides end-users through business applications used in today’s workplace, identifying pauses and hesitations to provide real-time assistance onscreen without users having to toggle between interfaces. Rafael shared that digital adoption has three main objectives: 1) Data – unlock visibility into the tech stack and into the workflows required to complete a business process through software; 2) Action – take action right on top of the application to automate mundane tasks, allowing end-users to focus on their most valuable work; and 3) Experience – data and action together drive the perfect experience for the end-user, no matter where they sit within the organisation.
Trends that drive the next normal
In his article – Focus on the Future: The Dawn of the Next Normal is Brighter Than You Might Think – Rafael discusses the transition from crisis to a new era. Long-term, he sees four significant shifts that will alter corporate conventions.
The first is a new paradigm of business continuity planning (BCP). Continuity planning typically envisioned a short-term crisis, such as a data leak or an accident; most businesses did not plan for an event on the scale of COVID-19.
The pandemic has altered the context of planning, pushing organisations to accept and deal with a new reality. Most leaders now agree that BCP must address long-term possibilities as well, ensuring that a company is agile and adaptable in any situation.
Second, remote capabilities are now an essential component of businesses to remain functional in any situation. When the pandemic began in 2020, most businesses were forced to implement a work-from-home policy, regardless of their readiness. Organisations quickly recognised that their reliance on technology was growing, and to assist their staff, they required the appropriate digital tools.
However, companies will need to examine their technology to enhance communication, onboarding and training, productivity, and employee engagement as the trend toward permanent remote work continues.
Rafael feels that the third major change is in corporate culture, which will continue to evolve. Across the board, companies acknowledge that employees are their greatest asset. The more a company invests in and cares for its employees, the more likely it is to prosper in the next normal.
Undeniably, the Covid-19 pandemic has had a major and swift impact on the workplace, with companies making significant efforts to build a distinct culture that reflects their views and keeps the employees content, engaged, and feeling supported.
The fourth change, Rafael proposes, is that digital offerings will drive revenue in the future. Industries are entering into a contactless era where all goods and services can be obtained through technological means. To that end, companies will have to invest in digital offerings that are easy for their customers to navigate and their employees to use.
Businesses that cannot serve their customers digitally have suffered greatly and are struggling to recover. The Next Normal involves increasing the digitalisation of operations and the virtualisation of communication.
It is undeniable that technology priorities have shifted, Rafael opines. Companies may have dabbled with “nice to have” technology before COVID-19, but everything that isn’t critical to core business must go now. Budget cuts will affect all firms, requiring the need to make the best software options possible to maximise ROI. Companies that can find the best technology for their purposes will prosper.
Navigating the New and the Next Normal
Digital transformation is made up of several applications that must work together, with the focus on outcomes rather than on the implementing technologies. Rafael explained that most businesses fail to complete their digital transformation journey because they define it primarily as changing out software or platforms, and they begin digitising everything simultaneously.
The fundamental changes in digital transformation are in how organisations work and, as a result, how value is added for customers. Rafael recommends companies begin with their desired results, deploy their chosen platform and understand what they want to achieve based on the benchmarks. He added that when considering digital transformation, avoid thinking about systems.
“Consider the bottlenecks, obstacles and financial opportunities. Then define success, act on it, start working on it, and evaluate whether you met your goal,” he advises.
Because there is a possibility of multiple outcomes, an organisation does not have to fixate on a single transformation. Instead, it should think about the tasks that need to be completed and the aspects of the firm it wishes to alter.
Businesses employ new strategies and processes to stay relevant as technology rapidly evolves. Such changes may need to be implemented promptly for a company to reap the benefits, and it must constantly adapt and experiment with new technology.
“We could help them manage the complete cycle, beginning with review and finishing with benchmarks identifying friction and detailing project action, among other things,” Rafael offers confidently.
WalkMe apart, he says, if businesses want to be more successful in their digital transformation, they must focus on outcomes rather than systems implementation.
Without a doubt, WalkMe is a highly successful option. Close to two thousand organisations around the globe, from both the public and commercial sectors, utilise the system. Product managers and application owners can make use of the software and feature adoption tools, as well as the change management solutions, that the platform provides for web, desktop and mobile applications.
The platform aims to empower business leaders to achieve the potential of their people and technology investments, which he considers to be the most valuable assets of an organisation in the digital economy.
WalkMe’s enterprise-class Digital Adoption Platform combines guidance, engagement, insights and automation, enabling businesses to maximise the full value of their digital assets by giving executives greater visibility into digital usage and making employees more efficient and productive.
“The Next Normal is different. We can’t ever ‘go back’, but we are being offered incredible opportunities for better business processes, better work experiences, and stronger companies and products. Jump on, the time is now,” Rafael advises.
Governments around the world are under great pressure to deliver services to citizens and businesses quickly, accurately, and efficiently. At the same time, citizens and businesses are demanding that government leaders uncover and minimise possible vulnerabilities in existing programmes, as well as investigate and prosecute crime when it occurs, as countries begin to recover from the pandemic.
This tension is driving innovation – using data analytics – to allow quick delivery with strong integrity. Programme Integrity can be implemented by altering current programmes or creating new ones that are sufficiently resilient against fraud, waste, and abuse.
When adopting new programmes and allocating funding, governments can prevent harmful long-term impacts and be better prepared for future risks by focusing on monitoring, oversight and design. This applies especially to technology-inclined programmes and projects. As a result, the funds can achieve their goals and deliver the anticipated outcomes.
Governments have begun to invest considerably in analytics and data to explore and manage the dangers brought by digitisation, particularly as nations look to be more inclusive. There is a significant trend to invest in pandemic-related data analytics and innovative technologies to operate more economically, efficiently and effectively to prevent and detect fraud, waste and abuse in the government sector.
Data analytics boosts productivity, efficiency and revenue. Analysing data sets allows an organisation to know where it can optimise its processes to increase cost-effectiveness. Areas that are unnecessarily hoarding a company’s resources can be identified and decisions can be made about technologies that can reduce operational and production costs.

Shaun Barry, Global Director, Fraud & Security Intelligence, SAS, revealed in an exclusive interview with Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, that data and analytics are governments’ secret weapons to deal with fraud and scams in the digital era and is essential in any public sector fraud management strategy.
“There is no longer this trade-off between speed and accuracy; you can do real-time fraud detection in sub-seconds. And you can also use analytics on the vector side, using artificial intelligence or machine learning to look for patterns that government leaders may have never thought existed or did not know were happening,” says Barry.
The heart of the response that government leaders should have is to adopt data and analytics in real-time to be able to enforce and promote integrity throughout society.
Agencies should roll out data and analytics initiatives with great deliberation, incorporating real-time controls in their accounting and disbursement systems, and must be able to keep up with the volume of transactions.
Countries that have performed well in terms of integrity are those that have planned for it – it does not occur by luck or accident; they have built integrity into their systems. As governments roll out new programmes or evaluate how their present programmes are managed, they must ensure that honesty is at the forefront and centre of everything.
“What does integrity do? It builds trust among people and it helps to make sure that these people and businesses trust the government and recognise that it is a force for good in society,” Barry emphatically states.
3 Vs that Increased Significantly because of Digitisation
According to Barry, 3 Vs were the main reasons for the drastic changes in the government sector – Volume, Velocity and Vector.
The Volume of transactions in and for governments has increased substantially as services move online; subsequently, the volume of fraudulent transactions is increasing as services are being digitised. Fraudsters attempt to use volume to their advantage – hiding bogus claims among the millions of valid transactions that governments process.
Velocity is the speed at which those transactions come. They arrive virtually at the speed of light because of the digitisation of services that governments rolled out, especially during the pandemic.
Simultaneously, the speed of fraud is increasing exponentially. As governments undergo digital transformation – moving more services to electronic channels – fraud schemes can be perpetrated instantaneously.
Vector deals with where the threats are coming from: state actors targeting governments and programmes, non-state actors, and organised crime syndicates. Primarily, these threats are no longer the result of mistakes made by individuals or corporations; such mistakes still exist, but the vector has shifted.
Fraud schemes are now being perpetrated by criminal networks and organised crime syndicates. Fraudsters are creating synthetic identities with data stolen from breaches. They probe for control weaknesses and exploit vulnerabilities. This level of sophistication adds magnitude to fraud schemes and patterns.
Understanding the Context of Fraud, Waste and Abuse
Fraud in the government sector is a false representation or any deliberate misrepresentation intended to deprive a government of money, property or services.
Threats are accompanied by indicators such as a sudden spike or unexpected surge in expenditure, or unexplained entries or manipulated records in a certain programme area. The absence of substantiating documentation, unauthorised dealings and transactions using non-serialised numbers are reliable indicators of wrongdoing as well. Cash payments in abnormally huge amounts are red flags, as is a lack of internal controls.
High employee turnover can lead to, or indicate, fraud. People may also notice posts on social media where users discuss methods to cheat the system – a means to acquire money they should not be able to.
Of course, it is vital to understand whether these are signs of poor operational practices, systemic issues or staff challenges; the answer is often a mix of all three, as fraud can take place due to any of them. However, even a robust internal control environment cannot guarantee that no fraud will take place within an organisation.
Traditional fraud management is no longer sufficient, and intelligent technology is needed to mitigate fraud. It is best to combine technologies such as AI, behavioural analytics and data mining with the auditor’s experience and policy checking to help mitigate risks.
SAS uses industry-leading data analytics and machine learning to monitor payments, non-monetary transactions and events, enabling agencies to identify and respond to unwanted and suspicious behaviour in real-time.
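To make the “rules plus machine learning” combination concrete, here is a minimal sketch, not SAS’s actual product: deterministic rules encode known fraud patterns, while an unsupervised model flags novel deviations from historical behaviour. All thresholds, feature names and data are hypothetical.

```python
# Minimal sketch of hybrid fraud screening: rules for known patterns,
# an unsupervised model for novel ones.
import numpy as np
from sklearn.ensemble import IsolationForest

def rule_flags(amount, claims_today, has_documents):
    """Known-pattern checks an auditor might encode as rules."""
    flags = []
    if amount > 10_000:
        flags.append("abnormally large payment")
    if claims_today > 5:
        flags.append("claim velocity too high")
    if not has_documents:
        flags.append("no substantiating documentation")
    return flags

# Train the anomaly model on historical, mostly legitimate claims.
rng = np.random.default_rng(1)
history = np.column_stack([rng.normal(800, 200, 5000),  # claim amount
                           rng.poisson(1, 5000)])       # claims per day
model = IsolationForest(contamination=0.01, random_state=1).fit(history)

def score_claim(amount, claims_today, has_documents):
    flags = rule_flags(amount, claims_today, has_documents)
    if model.predict([[amount, claims_today]])[0] == -1:
        flags.append("anomalous versus historical behaviour")
    return flags  # non-empty -> route to an investigator

print(score_claim(12_000, 7, has_documents=False))
```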
The Pandemic and Trends in Cybersecurity
Barry is quick to point out that the pandemic has not caused an increase in fraud, but the digital response to the pandemic has created opportunities that fraudsters have taken advantage of. Bad actors certainly were aware of the significant digitisation in governments and have exploited vulnerabilities.
In their haste to serve people and control the pandemic, governments around the globe rolled out relief programmes quickly and without normal controls, opening up a wider attack surface for bad actors. It is pertinent to note that the pandemic also accelerated the pre-existing trend of digital transformation in government – yielding more opportunities for fraud.
COVID-19 caused huge disruptions which organisations across all sectors are still dealing with – all while fraud chances have multiplied and become more difficult to detect.
Bad actors are quickly growing in strength and effectiveness. Nearly 70% of organisations experiencing fraud reported that the most disruptive incidents came via an external attack or collusion between external and internal sources.
Many employees are now working in a less secure setting as a result of the unexpected move to remote and hybrid working environments. This has resulted in a dramatic increase in internet activity, making it harder to monitor and restrict fraudulent activities.
According to the Cybersource 2021 Global Fraud Report, there has been an increase in fraud attacks and the rate of fraud, especially for organisations based outside of North America. Companies based in the Asia Pacific region have been hit hardest, prompting an increased focus on fraud management and increased spending in this region.
Barry understands that citizens want good information – relevant and timely – not just raw data. The goal is to give leaders actionable intelligence and deep insights at the right time. This allows for better decision-making that weighs the risks and rewards comprehensively. The public sector must now implement and strengthen controls with robust cyber resilience initiatives.
Public Sector Fraud Management in the New Normal
PwC’s Global Economic Crime and Fraud Survey 2022 revealed that across organisations of all sizes, including the government and public sector, cybercrime poses the most significant threat, followed by asset misappropriation and customer fraud.
Even before the pandemic, the Asia Pacific region faced the highest incidence of medical claims fraud, according to a global claims fraud survey by reinsurance firm RGA. “Survey results suggest that the global incidence of claims fraud is 3.58%, with high claims fraud incidence in the Asia Pacific region.” That trend is widely expected to have increased during the pandemic.
The rise of digital fraud has forced organisations to work hard to enhance technical capabilities and implement more robust internal controls as well as reporting measures.
In the government and public sector, this fraud trend makes upgrading fraud management and technology even more crucial to prevent losses and misuse of funds and to safeguard the government’s integrity, as fraudsters are moving targets, becoming more specialised and professional. As soon as an agency identifies a scheme and puts in controls to mitigate it, the fraudsters quickly find new ways to exploit the system.
From a market growth perspective, the global market size of fraud detection and prevention solutions is predicted to grow from US$30.65 billion in 2022 to US$129.17 billion in 2029. Governments across the world, including in the Asia Pacific region, are investing in advanced fraud prevention solutions.
Investment in government fraud technology for detection and prevention can deliver big payoffs – typically 10 to 100 times ROI.
Border Protection During the Pandemic
“I believe that governments have increased their response to the pandemic to safeguard their borders. They must know the commodities and services entering or leaving their country,” Barry observes.
Of course, borders are only one of the many areas in which nations are beginning to invest in response to the pandemic. Governments across the world are beginning to extensively deploy analytics and big data solutions to assess the risks they may face at the border or within.
From the citizen’s viewpoint, people are concerned with financial fraud and want to be confident that the government collects the required customs duties, so those funds can be ploughed back into ongoing national development.
On the immigration side, people are aware of the potential risks associated with illegal immigration and overstays, as well as the risks of terrorism and contraband. Barry shared that many nations are undergoing massive reform, examining the entirety of their immigration and customs procedures and integrating real-time analytics to precisely identify these types of risks.
“There is certainly a very big trend that we’re seeing in the market, where government leaders, especially at borders, are investing in analytics coming from the pandemic,” Barry acknowledged.
Customs and border control are important operations not only because they have wide-ranging implications for a country, but also require close cooperation between many organisations to be truly effective.
Long before the pandemic, the ASEAN region was already one of the world’s largest trading blocs, placing its member states at greater risk of various transnational crimes. The ASEAN Political-Security Community Blueprint 2025 prioritises and encourages ASEAN countries to strengthen cooperation on border management in accordance with their respective domestic laws, rules, regulations and policies, and to jointly address matters of common concern, including forgeries of identification and travel documents. It also encourages them to explore the use of relevant technologies to manage borders, stem the flow of potential terrorists and criminals, and coordinate border patrols and inspections.
When COVID-19 emerged in 2019, many countries imposed strict border control measures to mitigate as well as slow the spread of the virus. As global travel restrictions begin to ease and countries reopen their borders, governments expect border control measures to be different in the post-pandemic world.
The effects of the pandemic will remain as the World Health Organisation (WHO) and experts predict that the disease will presumably become endemic. Border control systems are facing more complex challenges, and border officials have the monumental task of managing dynamic health control measures to ensure safer travel to protect their citizens. Therefore, governments have to rethink and redesign how border controls operate.
Advanced data analytics and automation technologies can support government agencies by giving them relevant information to make better decisions on a real-time basis. Once data is collected from multiple sources, artificial intelligence or machine learning is applied to collected and historical data to develop real-time watchlist management, risk assessment and investigation management systems. The automated system can accommodate unstructured data in different forms and from multiple sources.
Adopting a risk assessment engine that allows automation as part of this modernisation therefore helps governments ascertain risks and gives them relevant information to make better decisions in real-time, while adjusting to the new reality and strengthening recovery.
Automation is not to replace customs inspectors and immigration officers, but to enhance border control management. Data analytics help turn the data into meaningful information quickly, so immigration officers can focus on the outcomes of the analysis.
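As one small, hypothetical illustration of the watchlist-management component mentioned above, the sketch below screens names against a watchlist despite spelling variations, using only Python’s standard library. The names and similarity threshold are invented; production border systems use far more sophisticated entity resolution.

```python
# Minimal sketch: fuzzy name screening against a watchlist.
from difflib import SequenceMatcher

watchlist = ["John Albert Doe", "Maria K. Santos"]

def screen(name, threshold=0.85):
    """Return watchlist entries whose similarity to `name` exceeds the threshold."""
    hits = []
    for entry in watchlist:
        ratio = SequenceMatcher(None, name.lower(), entry.lower()).ratio()
        if ratio >= threshold:
            hits.append((entry, round(ratio, 2)))
    return hits  # non-empty -> flag for an officer's review

print(screen("Jon Albert Doe"))  # a hit despite the misspelling
print(screen("Jane Smith"))      # no match
```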
SAS Support for the Public Sector
SAS has the tools, expertise and experience to determine the best strategy for effective fraud detection and prevention in agencies. It is one of the leaders in advanced analytics and AI supporting the public sector. SAS helps governments predict and manage risk and identify opportunities by leveraging AI and analytics. Everything SAS does is designed to empower better decisions.
As fraudsters get more sophisticated with their tactics, agencies are also required to get more sophisticated at fighting back. Digital fraud needs an approach with a faster, more accurate response to new threats.
The key strategy is to move from a reactive to a proactive approach. The SAS Detection & Investigation for Government Solution proactively prevents fraud, waste, abuse and improper payments. It provides a holistic view of fraud based on multisource data points and takes a multifaceted approach to detect hidden relationships and seemingly unrelated events.
Advances in fraud detection technologies can give agencies a more accurate and efficient arsenal than ever for attacking fraud and financial crimes. Sophisticated technology-based approaches can eradicate fraud and find it before the losses mount. Whether it’s contract fraud or Medicaid fraud, a criminal act or the sole attempt of a dedicated fraudster, fraud can be discovered and prevented.
Here are 3 essentials for winning the battle against fraud, waste and abuse:
- Find the patterns: Once agencies can bring together the relevant data, they can develop more complete views of the individuals, providers and businesses in their programmes. The more information they have about these entities, the better they can determine what kind of behaviour is typical and what behaviour warrants closer scrutiny.
- Put advanced analytics to work: Traditional rules and outlier detection methods are adequate for known fraud patterns but are not very good at handling today’s sophisticated and evolving fraud schemes. Three forms of advanced analytics are taking centre stage in the war on fraud:
- Predictive capabilities: predictive modelling allows agencies to see the patterns or interrelationships among various data elements that point to potential fraud, waste and abuse. It enables agencies to move more into fraud prevention mode versus pay-and-chase or detect-and-recover mode.
- Robust social network analysis: this analysis reveals connections among entities to expose organised fraud rings or collusive activities (see the sketch after this list).
- Machine learning: a form of artificial intelligence (AI), machine learning is a powerful force for improving fraud detection accuracy and efficiency. It takes government fraud technology to an entirely new level.
- Empower staff for collaboration and efficiency: Responding to and tackling fraud requires the cooperation of multiple agencies and departments. While agencies need to ensure that the automation process runs and robust internal controls work effectively, employees need to be trained so that they can support agencies’ efforts by taking proper action based on insights extracted from the relevant data.
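The sketch below illustrates the social network analysis idea in a minimal form: link entities that share identifiers (phone numbers, addresses, bank accounts) and surface suspiciously large connected clusters. It is a toy example with invented claims data, not SAS’s implementation.

```python
# Minimal sketch: expose possible fraud rings via shared identifiers.
import networkx as nx

claims = [
    ("claim1", "555-0101", "12 Oak St"),
    ("claim2", "555-0101", "98 Elm Rd"),
    ("claim3", "555-0199", "12 Oak St"),
    ("claim4", "555-0123", "7 Pine Ave"),
]

G = nx.Graph()
for claim_id, phone, address in claims:
    # Each claim links to every identifier it uses.
    G.add_edge(claim_id, f"phone:{phone}")
    G.add_edge(claim_id, f"addr:{address}")

# Claims that share identifiers collapse into one connected component.
for component in nx.connected_components(G):
    linked_claims = {n for n in component if n.startswith("claim")}
    if len(linked_claims) >= 3:
        print("possible fraud ring:", sorted(linked_claims))
```

Here claims 1, 2 and 3 share a phone number and an address between them, so they surface as a single cluster worth an investigator’s attention.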
Barry is firmly convinced that utilising analytics in daily operations will spark innovative discoveries that propel the advancement of citizen outcomes and experiences – ones that dismantle siloes, deliver on mandates and offer efficiencies.
There are a plethora of ways that data analytics can be valuable; to what extent largely depends on the organisation. But at its foundation, Barry says, it is all about assisting the organisation in making the best business decisions to serve citizens.
The pandemic has turned traditional businesses in Malaysia into digital ones and forced them to move online, resulting in the digital economy growing rapidly in 2020.
To give a carefully considered vision and shape to this growth, the government created Malaysia’s Digital Economy Blueprint (MyDIGITAL) which lays the groundwork for the country’s transformation toward an advanced digital economy. It is designed to pave the way for the country to strategically position itself as a competitive force in this new era for the region and globally.
Fabian Bigar is the Chief Executive Officer of the Strategic Change Management Office (SCMO), formally known as MyDIGITAL Corporation since April 2021. He drives national change management and ensures the successful delivery of MyDIGITAL. Previously, he was the Undersecretary for Policy and International Relations in the Ministry of Health, Director of the Civil Service Delivery Unit in the Prime Minister’s Department, and Director for the National Key Economic Area – Healthcare as well as for the National Key Results Area – Low-Income Households in the Performance Management and Delivery Unit (PEMANDU) under the Prime Minister’s Department.
In an exclusive interview with Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, Fabian Bigar shared in-depth insights into what the Malaysian government intends to do to speed up and prepare local businesses and society to navigate the rapidly evolving digital and technological landscape.
Digital Ambitions of Malaysia
Malaysia intends to be a regional leader in the digital economy and achieve inclusive, responsible and sustainable socio-economic development.
MyDIGITAL encapsulates the government’s ambitions to transform Malaysia into a digitally-enabled and technology-enabled, high-income country. The Blueprint is designed to articulate the vision, set the direction, lay out the strategy and establish milestones to build the foundation of the growth of the digital economy.
Its three goals are to inspire industry players to become innovators, users and adopters of new business models; to harness human capital to prosper in the digital economy, and to cultivate an integrated ecosystem that allows society to embrace the digital economy.
Six strategic thrusts have been identified, supported by 48 national initiatives, 28 sectoral initiatives and 22 strategies: driving digital transformation in the public sector; boosting economic competitiveness through digitalisation; building enabling digital infrastructure; building agile and competent digital talent; creating an inclusive digital society; and creating a trusted, safe and ethical digital ecosystem.
One of MyDIGITAL’s main functions is to speed up some of the changes Malaysia wants for its economy. Part of this is increasing public-private partnerships and collaboration, and being fearless in introducing innovative ways to nudge things along, such as the concept of catalytic projects, which is likely to be a game-changer in fast-tracking some of the initiatives under the Malaysia Digital Economy Blueprint.
“We need to learn from others – from the experiences of other countries and of international bodies like the World Bank and the World Economic Forum. And I think we need to be open about what our shortcomings are, how we improve ourselves and how we collaborate with these people,” Bigar readily acknowledges.
Implementing MyDIGITAL
The implementation is divided into three phases. Phase 1, which began in 2021 and continues through 2022, aims to strengthen the foundation for digital adaptation. Phase 2 will drive inclusive digital transformation from 2023 to 2025. Phase 3 will position Malaysia as a regional leader in digital content and cyber security from 2026 to 2030.
Of course, during and at the end of each phase, the team will monitor and review progress to implement tweaks, changes or revamps. And at all times, the agency is open to collaboration.
“We want to increase digital literacy in Malaysia by working with several organisations to fill in the gaps that we’ve discovered,” says Fabian. “It is vital to quickly identify gaps and figure out how to obtain needed assistance available from both the private and public sectors.”
The Malaysia Cyber Security Strategy has clear plans to make the country more resilient to cyber-attacks. Regulatory policy demands that regulations be reviewed as needed to get rid of old ones that could slow the growth of digitalisation and compromise safety.
Addressing key challenges in digital adaptation
MyDIGITAL Corporation recognises the importance of growing the digital economy. It runs various programmes, such as leadership development, and is collaborating with various entities to produce joint publications and webinars that will engage more people and raise awareness of the digital economy.
According to Bigar, genuine and widespread digital adaptation in Malaysia will start with creating awareness. This can be followed by accelerating the adoption and use of these technologies and showcasing cases where organisations have successfully contributed to the growth of the digital economy.
The agency is also looking to MSMEs to drive social change and income generation. To bring them into the mainstream, the government must support MSMEs on their digitalisation journey towards e-commerce.
MyDIGITAL itself has several initiatives and programmes to support the digitalisation of MSMEs. It organises sessions where the agencies involved in the digital economy of Malaysia – federal or state – are invited to learn more, get upskilled and then roll out initiatives for MSMEs.
Simultaneously, the Malaysian government is eager to improve digital literacy and adoption among the small-medium players. They are exploring online financial initiatives, working with the Central Bank to introduce cashless transactions at all levels, including QR codes, wallets and similar options.
A defining moment with “Cloud-First”
Cloud usage is an integral part of the Blueprint, and the Digital-First programme aims to ensure the use of cloud services at both federal and state levels. The programme focuses on two main areas: reducing the use of physical storage files by switching to a “cloud-first” strategy, and adopting a “paperless culture” at work by finding and using the right technologies to make paperless workflows and transactions possible.
This is expected to make better use of government resources and automate tasks by setting up digital workflows. It will also improve access to data and information by centralising data storage in the cloud and make it easier for civil servants to work remotely.
By the end of 2022, MyDIGITAL is targeting 80% usage of cloud storage across the government, in addition to incorporating cloud computing so businesses can procure services without having to own and maintain assets.
Bigar elaborated on the plan to give all businesses a “Digital Compass” – a customised technology roadmap for different industries and businesses with different levels of digital maturity. The Compass will give businesses a step-by-step guide to the digital solutions they can use at different stages of their growth. As part of this initiative, there will also be a programme to raise awareness about the benefits of Intellectual Property (IP) registration to help and encourage businesses to do so.
Bigar believes that cloud should be introduced to smaller businesses as well to allow them to embrace new technology. If one looks at digital literacy across the spectrum of the population, a lot of work has been done for education or those who are in the workforce. This is primarily because of a top-down approach for schools and educational institutes or, in the case of employees, perhaps because they are more intrinsically motivated. MyDIGITAL wants the understanding and uptake of technology to be more widespread – across demographics and geographies – by creating programmes for ordinary citizens.
Bigar wants to bring awareness to the working level in every industry and to educate people on the digital economy and available technology. He firmly believes that government leaders and politicians should better understand what digitalisation is all about, and proposes specific programmes for them as well.
“We are facing this transition in the digital economy; we must collaborate and take the necessary steps to adopt the new standard and work together. The journey won’t be easy, but we need to be brave enough to take this huge step forward to improve the quality of life for all Malaysians,” Bigar ends optimistically.
The development, integration and implementation of information management are critical in an era when data quality is more important than ever. As attempts are made to create and manage data-driven agencies to achieve goals, public sector leaders must accelerate the current framework to connect dependencies across processes for reliable information.
The public sector is also working hard to gain access to information, which it sees as a valuable asset. To better support important decision-making and meet rising citizen expectations, it is necessary to quickly secure and analyse both structured and unstructured data. Obtaining trustworthy data while adhering to data governance and compliance will improve data quality, accuracy and accessibility.
The idea that the public sector should focus more on preventing crises rather than just responding to them is not new. What is new is the ability to successfully predict and mitigate critical events on a regular and consistent basis.
Recent advancements in data analytics, business intelligence, machine learning and artificial intelligence have allowed the public sector to better detect and forecast operational issues. This exponential improvement in the ability to track patterns and identify potential problems in massive historical data sets and millions of pages of unstructured text is revolutionary. Departments can instantly understand where to increase efficiencies, when to manage costs and how to satisfy citizens with the right services by creating business intelligence dashboards.
Further, data analytics develops a single source of truth for compliance and methods to build trust among citizens.
Data governance is the most successful technique for formalising accountability where employees define, produce, and use data to execute their job functions efficiently. Good data is needed to improve productivity and citizen experience – and organisations adopting technology that enables employees to gather instant data will more quickly accelerate their mandates than those that do not.
The rate at which data is generated is increasing all the time. In light of this, it is vital that people prepare themselves to function and engage with data in the larger environment and are empowered to do so. Hyperintelligence, a term that has only lately been coined, makes data accessible to employees at their convenience.
In times of uncertainty, the public sector must be more willing to explore and exploit new technology opportunities that have the potential to mitigate risk and ensure business continuity. Decision-making must be based on actionable insights and intelligence garnered from the vast amounts of data the government has access to.
Thus, the necessity to improve usable data analysis while becoming increasingly data-driven in decision-making is almost unanimously acknowledged. And this cannot be done without reliable analytics tools capable of breaking down and connecting previously siloed data, making it manageable from a single place.
The OpenGov Breakfast Insight held on 27 April 2022 at Sheraton Towers Singapore aimed to provide the latest information on how the public sector can use data analytics to drive mission outcomes.
Meeting demands of citizens and upholding data governance

To kickstart the session, Mohit Sagar, Group Managing Director and Editor-in-Chief at OpenGov Asia delivered the opening address.
Data on a global scale has taken on an entirely different dimension and Singapore is no different. In fact, compared to other countries in the region, the nation is well ahead of the curve and leads in data analytics. The public sector has spent a huge amount of money on technological innovations.
“Data can enable governments to make informed decisions,” Mohit asserts.
While Singapore collects massive amounts of data, quantity alone is not enough to make informed decisions. Where, how and when data is collected is critical, as is how it is structured and standardised. For better and more relevant data, information silos need to be broken down; democratisation, integration and sharing will all be key.
To democratise data, the public sector needs to empower its entire workforce – from top to bottom. For the most part, data is often only accessible to people in higher positions or specific departments, creating disparity and lacunae. The information gap must be bridged with appropriate empowerment – be it through awareness, training, or skill up-gradation.
Access to large data sets is essential for a government’s digital transformation journey. Of course, data in and of itself is not the end goal – data must serve as a tool to derive understanding that enables effective decision making. Actionable insights from analytics will ultimately enrich the citizen experience.
In closing, Mohit emphasised the importance of partnerships that can help an organisation leverage data analytics. “Use technology and minimise customisation,” he advised. “By working with the right people, a company can accelerate its digital journey towards effective digital transformation.”
Deepening insights through data analytics

Kyung-Whu Chung, Director, Sales Engineering, APAC at MicroStrategy spoke next on the criticality of data quality in digital transformation. To set the context, Kyung-Whu revealed that in a recent survey, 94% of respondents said that data and analytics are important to their business growth and digital transformation.
There are huge benefits for organisations in using data analytics, including improved efficiency and productivity. Better data analytics leads to faster and more effective decision-making and better financial performance. Data analytics also helps organisations identify and create promising new products and services.
While the internal benefits are clear, there are advantages for the consumer as well; customer satisfaction and experience are both critical for a company to thrive. Data analytics helps organisations better understand consumer behaviour, trends and demands, and identify issues. It has improved customer acquisition and retention through an enhanced customer experience.
However, 70% of people are not using any analytics tool, and the vast majority (97%) of real-time decisions are data-deprived. This indicates, surprisingly, that organisations and agencies still rely on intuition and manual analysis to solve complex problems with multiple variables.
Barriers that limit the uptake of analytics have been well articulated. Kyung-Whu identified the top three concerns – data and privacy concerns, limited access to analytics and lack of talent and training.
On the issue of privacy, 38% of organisations said more than 50% of their data is certified by an organisational authority or adheres to corporate policies. Despite this, customers remain concerned about their sensitive and personal data. Organisations need to build trust and communicate clearly that data is used responsibly; this will make customers more inclined to provide their information.
When it comes to access, data-driven culture often gets stuck at the top. Access to the organisation’s data and analytics is usually concentrated on specific roles. Democratising data is important as it empowers all departments and encourages data-driven decisions at all levels throughout the company.
The last challenge that organisations need to tackle is the lack of talent and training. While simple enough to understand, there needs to be a more intentional drive and strategy to reskill and upskill employees.

In closing, Kyung-Whu encouraged delegates to expand their thinking and embrace a multi-tool environment. A data-driven culture can only be built on data democratisation, enabling everyone to access every process and every app. Collecting data is only a start; organisations need to enrich the data to gain deeper insights.
The future of citizen experience

Lim Chinn Hwa, Senior Director, Smart Nation Platform Solutions, GovTech elaborated on GovTech’s experience in building a Smart Nation.
Quoting PM Lee, Chinn Hwa shared three ways to understand Singapore’s vision for a Smart Nation: “We see it in our daily living, where networks of sensors and smart devices enable us to live sustainably and comfortably; We should see it in our communities, where technology will enable more people to connect to one another more easily and intensely; We should see it in our future, where we can create possibilities for ourselves beyond what we imagined possible.”
A smart nation is about data and what we do with it, Chinn Hwa asserts. It involves the systematic use of technology integrated into a coherent whole, networks of connected smart devices and sensors, and a community connected by technology. It also requires government-built infrastructure and frameworks, secure and trusted systems, and a culture of experimentation.
Accordingly, the keys to a smart nation are as follows:
- Sense: Collect data from our physical environment
- Contextualise: Process data for actionable insights
- Act: Act on insights from contextualised data
Chinn Hwa shared that with the Smart Nation Sensor Platform (SNSP) the government is able to develop a 360° view of Singapore with:
- Data collected from land, air and sea sensor platforms
- Interoperability and integration with different Smart Districts across Singapore
SNSP helps to achieve 360° awareness with sensor data that comes from static sensor platforms, mobile sensor platforms and data exchange platforms.
With the SNSP in place, Chinn Hwa emphasises, data-driven decision-making becomes possible. He adds that GovTech pursues partnerships not only with agencies but also with industry.
GovTech is the centre of excellence for smart systems and processes, but it works with agencies to determine the optimal architecture for cyber-physical data collection. From the sensor lake, an API is created to extract knowledge from the data.
SNSP brings meaningful impact to agencies and citizens because of its secure and scalable Sensor & IoT (Internet of Things) network infrastructure based on GCC. It is a reliable, connected and interoperable sensor data platform where agencies can plug and play their sensor assets.
The other benefits include:
- Plug ’n’ play: a sensor-agnostic, manufacturer-independent approach that is easy to integrate
- Interoperability: uses IoT standards to bridge multiple communication protocols and exchange data seamlessly
- Connected: able to perform siloed functions while coordinating seamlessly within the ecosystem to come together as a total solution
- Real-time processing: for timely situational awareness and mission-critical operations
- Secure and scalable: supports IM8 compliance and utilises the power of the cloud (GCC)
Chinn Hwa shared that through GovTech’s work with JTC, they now have a better idea of what is going on in the country from a facilities-management point of view and can start thinking about how to do more with fewer staff.
In conclusion, Chinn Hwa forecasts that next-generation smart districts must have technologies that enhance economic competitiveness and sustainability. They must bring added convenience and security to tenants and visitors to improve their quality of life. He reiterates the importance of data-driven decision-making and its criticality within citizen services.
Data analytics in public health

Dr Tan Hwee-Pink, Chief Data Officer, Health Promotion Board (HPB) talked about HPB’s journey of harnessing data analytics to better understand its programmes and to strategise engagement based on data.
He shared that the HPB began the journey by envisioning a centralised and aggregated dashboard that pulls data from various HPB programmes onto one platform for visualisation and analytics. There was a desire to focus on a data-driven understanding of HPB programmes that can be translated into actionable insights.
According to Dr Tan, using data has helped in two ways:
- Enriching the understanding of citizens through geospatial information
Geospatial information (home addresses, Community Challenge GRCs, roadshow locations, event locations) has been collected regularly since NSC Season 1 in 2015 but has rarely been used to characterise customers.
- Empowering staff with self-help data analytics
The PowerBI dashboard provides a convenient yet highly flexible format for all to access:
- a relatively low barrier to entry for HPB employees as a visualisation and data analytics tool; and
- fairly streamlined access, as data is de-identified and aggregated.
There were many benefits to using data analytics. Using data from National Steps Challenge Season 5 as a case example, HPB can:
- Enrich data with geospatial information
Layering geospatial information over NSC 5 participants’ characteristics allows HPB to understand the distribution of participants and to evaluate the effectiveness of its various outreach channels. This aligns closely with:
- supporting partner management and expectations (e.g., MPs of specific regions would be most interested in their constituents); and
- evaluating on-ground engagement efforts (e.g., different partners have different strengths or rapport with different demographics).
- Develop a deeper understanding of target segments with ease
HPB can identify and rectify gaps in real time. Citizens have different levels of engagement after signing up for programmes. With the ability to pinpoint registrants who drop off, HPB can improve participant retention rates and enhance the effectiveness of its impact measures.
- HPB can educate the identified demographics on how to synchronise trackers or use the H365 mobile app at the time of tracker collection.
- HPB can send push notifications to participants to encourage more engagement during the programme.
- Uncover interactions between programmes
HPB can also better understand the relationships and interactions between programmes. This can help to gain a more holistic understanding of the participants’ lifestyles in terms of their activeness.
- Uncover opportunities for cross-marketing
HPB can enhance cross-marketing capabilities by reaching out to registrants or participants in similar programmes. For instance, HPB can encourage corporate challenge and/or youth challenge registrants to also register for the community challenge.
By understanding the sign-up rates amongst eligible registrants or active participants for programmes, HPB can better appeal to the different segments of interest to encourage sign-ups.
In concluding his presentation, Dr Tan Hwee-Pink shared that there is tremendous potential in extracting knowledge from data to make informed and data-driven decisions that can yield a tangible impact on the way citizen services are delivered.
Interactive Discussions
After the informative presentations, delegates participated in interactive discussions facilitated by polling questions. The session was designed to provide live audience interaction, promote engagement, surface real-life experiences and impart professional learning and development to participants. It was an opportunity for delegates to gain insight from subject matter experts, share their stories and take back strategies that can be implemented in their organisations.
The opening poll inquired about the main challenge delegates face in their data strategy journey. Most (38%) chose a lack of data culture/literacy/skill across employees as their primary challenge. One-third (33%) thought that missing an overall strategy that crosses departments and teams is their biggest obstacle. A lack of a centralised tool for sharing and collaboration was troubling for 19% of the delegates while 10% chose data privacy and security concerns.
A delegate opined that the main issue is that while there is a wealth of data, everyone wants their data warehouse and quick fixes rather than building long-term capability. Other delegates shared that policies are in place but that business users need to be receptive and willing to invest their time.
Another delegate shared the perspective that data is not owned by anyone. “Who has the KPI to ensure that data is being used?” he asked. “No one can hold others accountable for not using data.”
Chinn Hwa mentioned the importance of an augmented user – a user who can use data without needing to reach out to a data scientist. It is important to figure out what tools the user can use so that the person can be self-sufficient without going through specialists. The key is to understand the use cases for the people who need to use the data. When use cases are identified, tools can be customised.
Kyung-Whu shared his understanding of three kinds of users:
- People who ask business-related questions
- People who can offer answers
- People without questions
For the people who can offer answers, the issue is data privacy and classification: they are the ones who hold the data. When a business person approaches them, the solution tends to centre on building a system to deliver the answers rather than making business users self-sufficient in finding answers themselves.
The hesitation stems from two main concerns. The first is not knowing whether the business user will reach the right answer with the data. The second is that handing over the data requires teaching the business user about the governance surrounding security and classification, which is the least of their concerns.
Finally, when it comes to people without questions, Kyung-Whu suggests the importance of giving them the first touchpoint to change their mindset and create a learning path for them.
The second question inquired about the top analytics adoption challenge in their agency. Over one-third (35%) found data quality and accuracy concerns the biggest obstacle, while others pointed to the lack of talent and training (29%). The remaining delegates cited other, unstated factors (18%), limited access to analytics (12%) and complex, difficult tools (6%).
A delegate shared the difficulty of doing more when the priority is on operations and transactional work. Others brought up the inconsistency of data as a primary issue: everyone understands and captures data differently, and inconsistent data upstream creates problems downstream because of the time spent cleaning it.
Another delegate found it a challenge when faced with people who do not see how data analytics can bring impact and improve the quality of life. He emphasised the need for different levels of products for different needs.
When asked about their agency’s biggest data management barrier, most (35%) found data collection and cleansing the biggest barrier. Almost a quarter (24%) found providing trusted data to be a hindrance, while another quarter (24%) found data accessibility and sharing the biggest stumbling block. The remaining delegates found real-time insights, and the ability to analyse data in real-time (17%) the biggest challenge.
When asked what their business users do when they have new data requirements, almost two-thirds (68%) would approach data analysts in their business unit for support. One-fifth (20%) go by gut feeling, while the rest would raise a Helpdesk ticket for IT (Information Technology) support (6%) or do not face the challenge because they have a self-service analytics tool (6%).
On being queried about the application that delegates spend most of their working days on, an overwhelming majority (72%) spent their time on email, followed by productivity applications (like Microsoft Office) (22%) and their business intelligence application (6%).
Asked about whether delegates have considered zero-click experience for data, more than half (56%) have not considered it while 44% have.
Conclusion
The Breakfast Insight concluded with remarks from Kyung-Whu Chung who highlighted the role of data analytics and the need for agencies to begin leveraging it. He urged agencies to become data-driven and advised them to accelerate their digital transformation.
He suggests a paradigm shift that would help with the use of data analytics: bringing intelligence to the general audience, the 70% who would not ask questions about data. The key is to offer them “answers to their first questions.” Instead of getting people to reach out to analytics platforms, the strategy should be to inject intelligence into where people already are, through zero-click analytics.
In closing, he invited the delegates to reach out to his team to explore ways they could work together on their journey. He emphasised that it is not a one-off event but a long-term journey, one that MicroStrategy has walked and is willing to undertake with them.
In the wake of the pandemic, people across the world moved comprehensively online: for work, education, entertainment, shopping and financial transactions. This has dramatically increased the attack surface and created unprecedented opportunities for bad cyber actors.
The simple answer to the challenges faced by the financial services industry and other agencies is to make better use of all available data and advanced analytics to detect and prevent fraud.
Of course, this may be easier said than done. In fact, the plethora of tools, solutions and platforms available may make the task more complicated. The following provides some starting points.
Understand the Categories of Fraud-Detection Tools
The ‘market’ is flooded with potential solutions, all offering to address fraud. The utility of each toolset relies on the business context and available data. All need to be integrated with business processes and supported by policy settings.
Here is a short overview that can assist in mapping such tools in terms of their function(s).
Detect Known Knowns
A watch list that holds information about known criminal entities (people, organisations, addresses, events, etc) is a good, universal start-point.
The challenge is matching the known entity against a new transaction. Simple name-matching systems tend to be quickly overwhelmed with irrelevant matches (imagine searching for Mr Jones on Google – around 5,070,000,000 results!).
Data science can assist here by establishing a probabilistic matching system with variable threshold settings. An organisation can then tune the thresholds to match its risk tolerance.
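To make the idea concrete, below is a minimal sketch of threshold-based probabilistic matching using only Python’s standard library. The watch-list entries, the similarity function and the default threshold are illustrative assumptions, not a description of any particular product; production matchers add phonetics, transliteration and entity attributes.

```python
from difflib import SequenceMatcher

# Illustrative watch list of known criminal entities (names only, for brevity).
WATCH_LIST = ["Jonathan A. Jones", "Maria Petrova", "Acme Shell Holdings Ltd"]

def match_score(candidate: str, watch_entry: str) -> float:
    """Return a similarity score between 0 and 1 for two names."""
    return SequenceMatcher(None, candidate.lower(), watch_entry.lower()).ratio()

def screen_transaction(name: str, threshold: float = 0.85) -> list[tuple[str, float]]:
    """Return watch-list entries whose similarity meets the threshold.

    A lower threshold catches more spelling variants but produces more
    irrelevant matches; the setting is the organisation's risk-tolerance dial.
    """
    scores = ((entry, match_score(name, entry)) for entry in WATCH_LIST)
    return [(entry, score) for entry, score in scores if score >= threshold]

# A slightly different spelling still matches once the threshold is relaxed.
print(screen_transaction("Jonathon Jones", threshold=0.75))
```

Raising the threshold trades recall for precision, a trade-off an exact name search cannot make.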
Detect Networks
The next level of detecting suspicious entities is to see the connection between a current transaction and previously identified fraud. The link could range from ‘this person lives at the same address’ to ‘this phone number has been used to commit fraud before’, with countless variations on the theme.
Some of the most effective network analytics systems used for fraud detection use non-obvious data. For example, the links may well be established by connecting IP addresses, MAC codes etc. Some of the best data may well reside in system logs!
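As a rough illustration, the sketch below links a new transaction to previously identified fraud through shared attributes. The records, field names and entity labels are invented for the example; a real system would build this index from transaction stores and system logs, often inside a graph database.

```python
from collections import defaultdict

# Invented historical records: attribute values seen per entity.
HISTORY = [
    {"entity": "E1", "address": "12 High St", "phone": "555-0101", "ip": "203.0.113.7"},
    {"entity": "E2", "address": "9 Low Rd", "phone": "555-0199", "ip": "198.51.100.4"},
]
KNOWN_FRAUD = {"E1"}  # entities previously confirmed as fraudulent

# Index: (field, value) -> set of entities that have used that value.
index = defaultdict(set)
for rec in HISTORY:
    for field in ("address", "phone", "ip"):
        index[(field, rec[field])].add(rec["entity"])

def fraud_links(transaction: dict) -> list[str]:
    """Describe any links from a new transaction to known-fraud entities."""
    links = []
    for field in ("address", "phone", "ip"):
        for entity in index.get((field, transaction[field]), set()):
            if entity in KNOWN_FRAUD:
                links.append(f"shares {field} {transaction[field]!r} with {entity}")
    return links

# A new transaction reusing a known-bad IP address is flagged immediately.
print(fraud_links({"address": "3 New Ave", "phone": "555-0123", "ip": "203.0.113.7"}))
```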
Identify Patterns
Predictive models examine available data against known patterns associated with fraud. At a basic level, the technique can utilise simple attribute matching (eg gender, age, nationality, etc) but more sophisticated tools can substantially increase the accuracy and consume hundreds of variables.
Predictive models are usually based on data analytics but it is also possible to build intelligence-based models when current data holdings do not support sufficient accuracy. The range of processes that can fall into this category is only limited by data availability, the skills of the data science team and the capacity to integrate such systems.
A rich source of data is frequently-ignored metadata. For example, systems that monitor mouse movements and keystrokes and identify potential deceit based on the way a client completes an online form are available now.
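By way of illustration only, the sketch below trains a small scikit-learn classifier on labelled historical claims and scores a new one. The four features and the tiny training set are hypothetical; a production model would consume hundreds of variables and far more labelled history.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical features per claim: [claim_amount, account_age_days,
# form_completion_seconds, corrections_made]; label 1 = confirmed fraud.
X = np.array([
    [900,  30,  45, 12],
    [120, 800, 300,  1],
    [850,  10,  60, 15],
    [200, 600, 250,  2],
    [700,  20,  50, 10],
    [150, 900, 280,  0],
])
y = np.array([1, 0, 1, 0, 1, 0])

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# Score a new claim: predict_proba returns [P(legitimate), P(fraud)].
new_claim = np.array([[880, 15, 40, 14]])
print(f"fraud probability: {model.predict_proba(new_claim)[0][1]:.2f}")
```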
Monitor Trends
This often-overlooked tool can provide early warning if there is a variation in normal trends. For example, a sudden, non-seasonal surge in refund claims from a particular region may indicate the emergence of fraudulent behaviour.
Tools that can automatically monitor trend data at global and more granular levels are readily available and generate alerts when tolerances are breached. While some tools visualise the trend variation on a dashboard, the best tools also generate alerts automatically and do not rely on someone spotting a problem manually or even loading a dashboard.
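A minimal sketch of such a monitor, assuming a simple z-score rule against a historical baseline; the weekly refund-claim counts and the three-standard-deviation tolerance are illustrative. Real tools run this continuously at global and more granular levels and push the alert rather than waiting for a dashboard view.

```python
from statistics import mean, stdev

def check_trend(history: list[int], latest: int, tolerance: float = 3.0) -> str | None:
    """Alert when the latest observation deviates from the historical baseline.

    `tolerance` is the number of standard deviations treated as normal
    variation; a breach generates an alert automatically.
    """
    baseline, spread = mean(history), stdev(history)
    if spread and abs(latest - baseline) / spread > tolerance:
        return f"ALERT: {latest} deviates from baseline {baseline:.1f} (±{spread:.1f})"
    return None

# Weekly refund claims from one region: a sudden non-seasonal surge is flagged.
weekly_claims = [102, 98, 110, 95, 105, 101, 99, 104]
print(check_trend(weekly_claims, latest=190))
```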
An integrated, end-to-end, fraud detection and mitigation system may well consist of all or a number of these solutions and usually requires a level of integration with processing platforms. Fortunately, current solutions (eg containers) simplify the challenge.
Fraud Mitigation Framework
Most government agencies and financial institutions collect and maintain large volumes of data in support of their operations. Making optimal use of these data collections underpins the ability to identify and prevent fraud.
Data-driven decision-making relies on:
- being able to collect and see information (data);
- understanding the information and data;
- responding with appropriate counter-measures;
- monitoring/evaluating the effectiveness of these measures; and
- adjusting the system based on the continuous analysis.
Seeing Information/data – if it’s invisible, it is difficult to defeat
The ability to collect and store information and data for downstream processing within required timeframes is a fundamental building block to any fraud-mitigation process. Most organisations collect process data such as applications and claims. Most would also store the results of such processes (eg refused application/claim, approved application/claim).
An organisation that records incidents of identified malpractice in such applications and claims creates a powerful anti-fraud dataset.
Most data systems collect vast volumes of metadata, such as system logs. Much of this resource is generally stored but not used effectively to detect fraud. Tools exist that collect transaction metadata (eg mouse movements, keystrokes) and feed artificial intelligence models that can accurately predict potentially fraudulent intent.
Capturing contextual information for analysis provides additional attributes that can enrich the picture of identified fraud and may also yield valuable intelligence about existing but undetected fraud.
Understanding – ‘why’, ‘how’, ‘when’, ‘where’ and ‘what’ happened
Analysis of data and intelligence can reveal how the various fraudulent techniques work. Generally, this relies on a team of subject matter experts working with data science teams to develop deep insights.
Responding – see when suspicious things are happening and stop them
Once the fraudulent techniques are understood, a data science team can build predictive analytics models that detect the adverse patterns in the data and flag similar patterns in current (live) transactions. Such models can manage hundreds of variables in close to real time and identify problematic behaviour with a known level of accuracy.
There are many ways of using this process to respond to potential malpractice. One simple example (sketched in code after the list) is:
- Applications/claims that are identified as low risk by our risk systems can be expedited. This reduces the cost of processing and increases client satisfaction.
- Applications/claims that are identified as high-risk could be diverted to a process that enables more data collection and/or greater scrutiny.
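A minimal routing sketch along these lines; the score cutoffs are hypothetical and would in practice be calibrated against the model’s measured accuracy and the organisation’s risk tolerance.

```python
def route_claim(fraud_probability: float,
                low_cutoff: float = 0.05, high_cutoff: float = 0.80) -> str:
    """Route a claim based on a predictive model's fraud score."""
    if fraud_probability < low_cutoff:
        return "expedite"        # low risk: fast-track, cheaper to process
    if fraud_probability >= high_cutoff:
        return "manual-review"   # high risk: more data collection and scrutiny
    return "standard-processing"

print(route_claim(0.02), route_claim(0.45), route_claim(0.91))
```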
Monitoring – are countermeasures working?
Once a fraud detection system has been deployed, the world will have changed. Eventually, criminals will adjust their approaches and possibly develop new methodologies.
Automated monitoring of an analytics-based system is always desirable as it can detect when expected accuracy or other performance is no longer being achieved. There are many reasons why this will occur but one of them is that criminals have developed new techniques and workarounds.
Monitoring the performance of the analytics-based system and, importantly, collecting and analysing intelligence can close much of this gap.
Adjusting – respond quickly to changed circumstances
The final part of the process closes the loop: lessons learnt through the monitoring processes are fed back into the next version of the system to refresh predictive models and other components.
Why this process?
This process leverages data and intelligence, supports continuous improvement and builds the capacity to respond to changed circumstances. Importantly, it maximises the capacity to apply the most appropriate measures to mitigate fraud. In many cases, a response is based only on the detection of a problem; analysis of the problem provides insights into the method of operation in that case. Once this is understood, an analysis of current data may indicate whether it is an isolated case or whether more such cases have remained hidden.
Moreover, it ensures that any countermeasures target the real problem. If the problem is potentially widespread, then the effort to build a data-driven model to detect other such cases and a predictive model to identify similar cases in future transactions is warranted. Automated monitoring and feedback loops provide a level of assurance that our solution is still doing what is expected.