Big data is driving fundamental transformation across all
industries and sectors, and finance is no exception. With the proliferation of
mobile devices and rapidly increasing share of digital transactions, financial
institutions are striving to make sense of the massive volumes of data being
generated and captured to understand their customers and predict their needs, so
as to serve them better. Around the globe, they are also under increasing
regulatory pressure to improve risk reporting and controlling money laundering
As a leading bank in Asia, DBS encounters these
issues and hence, it started moving towards becoming a data-driven organisation
a few years ago, taking smart decisions based on data and not instincts.
However, the company’s traditional technology stack for
supporting advanced analytics was expensive to scale and not flexible enough to
support this work.
With Cloudera as a
partner, DBS built
a central data team and enterprise data hub, enabling DBS to scale out more
economically, and experiment more. The agility of the platform allows the bank
to explore use cases and iterate easily and quickly, without the need to worry
about ROI or to build an investment case beforehand.
With the ability to more easily store and analyse billions
of events in a modern data platform, DBS can answer questions before they’re
asked to more effectively engage customers and deliver better service.
This has enabled DBS staff to experiment more and be on the
forefront of innovation when it comes to understanding the customer experience
and applying human-centered design to its services.
For instance, machine learning can be used to understand customer
sentiments. All calls to the bank’s call centres are recorded. They can be
converted to text and then machine learning algorithms can be used on the
analytics platform to understand sentiment. Problems can be flagged so that the
bank can reach out to the customers.
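The pipeline described — recordings converted to text, then scored for sentiment so that problem calls can be flagged — can be illustrated with a deliberately simple sketch. This is not DBS's actual system: the word lists, threshold and sample transcripts below are invented for illustration, and a production pipeline would use speech-to-text plus trained models on the analytics platform rather than a hand-built lexicon.

```python
# Toy lexicon-based sentiment scorer for call transcripts.
# Word lists and threshold are invented for illustration only.
NEGATIVE = {"frustrated", "angry", "unacceptable", "complaint", "waiting", "problem"}
POSITIVE = {"thanks", "great", "helpful", "resolved", "happy"}

def sentiment_score(transcript: str) -> float:
    """Return a score in [-1, 1]; negative values suggest an unhappy caller."""
    words = transcript.lower().split()
    if not words:
        return 0.0
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    return (pos - neg) / len(words)

def flag_for_followup(transcript: str, threshold: float = -0.02) -> bool:
    """Flag calls whose score falls below the threshold for customer outreach."""
    return sentiment_score(transcript) < threshold

calls = [
    "I have been waiting for hours and this is unacceptable",
    "Thanks so much, the agent was great and my issue was resolved",
]
flags = [flag_for_followup(c) for c in calls]  # first call flagged, second not
```

The real value of a trained model over a lexicon is handling negation, sarcasm and context, but the flag-and-outreach loop is the same shape.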
Ultimately behavioural information and machine learning, in
combination with biometrics, could even enable ‘invisible authentication’,
where a customer no longer needs to provide many supporting documents or use a
physical device for transactions or answer questions like, ‘What is your
mother’s maiden name?’
In a video interview
with Wee Wu Neo from The Neo Dimension, David Gledhill, Head of Group
Technology and Operations at DBS, explained that the use of data goes beyond
customers. The transformation to a data-driven organisation has significantly
improved operations across the organisation.
Data can be used to find out where fraud is happening in the
company. To take a specific use case of this type, trade financing is highly
prone to fraud. To deal with this, DBS started looking at data other than
invoices and transactions to predict the possibility of fraud.
“You look at things like ship movements. If you know the
typical movement patterns of goods from one port to another, then anomalous
goods movement or timing that doesn’t look like typical timing for that type of
transaction or a behavioural shift in importers or exporters or in warehousing,
signals where potentially fraudulent trade might be going on,” Mr Gledhill said.
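The kind of timing anomaly Mr Gledhill describes can be sketched with a simple statistical test: compare a shipment's transit time against the historical distribution for its route. This is a toy illustration, not DBS's method — the route, historical transit times and z-score cutoff are all invented.

```python
# Toy anomaly check: flag shipments whose transit time deviates far from
# the historical norm for the route. All figures below are invented.
from statistics import mean, stdev

# Historical transit times (days) for a hypothetical route.
HISTORY = {"SIN->HKG": [3.1, 2.9, 3.0, 3.3, 2.8, 3.2, 3.0, 2.9]}

def is_anomalous(route: str, transit_days: float, z_cutoff: float = 3.0) -> bool:
    """Flag a shipment whose transit time is more than z_cutoff standard
    deviations from the route's historical mean."""
    past = HISTORY[route]
    mu, sigma = mean(past), stdev(past)
    return abs(transit_days - mu) > z_cutoff * sigma

# A 9-day transit on a roughly 3-day route stands out; a 3.1-day one does not.
suspicious = is_anomalous("SIN->HKG", 9.0)
normal = is_anomalous("SIN->HKG", 3.1)
```

A production system would combine many such signals — goods movement, importer/exporter behaviour shifts, warehousing patterns — rather than a single univariate test.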
Data analytics can predict the likelihood of a relationship
manager quitting within the next three months, so that HR staff can intervene early
to retain employees. Data can tell the audit department which branch might have
issues and should be audited next.
Operational staff can understand and predict customer flows,
ATM load, and call centre volumes using data. In fact, one of the first big
data projects DBS embarked upon was figuring out the sequence in which ATMs
should be filled. The bank went from hundreds of instances of ATMs running out
of cash to single digits.
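The refill-sequencing problem can be sketched as ordering machines by predicted time-to-empty. This is a hypothetical illustration — the cash levels, withdrawal rates and location names below are invented, and DBS's actual model is not public.

```python
# Toy ATM refill sequencing: refill the machines predicted to empty
# soonest first. All figures and names are invented for illustration.

def hours_to_empty(cash_level: float, hourly_withdrawals: float) -> float:
    """Estimate hours until an ATM runs dry at the current withdrawal rate."""
    if hourly_withdrawals <= 0:
        return float("inf")
    return cash_level / hourly_withdrawals

# (cash on hand, expected withdrawals per hour) per hypothetical location.
atms = {
    "Orchard": (50_000, 4_000),
    "Changi": (20_000, 5_000),
    "Jurong": (80_000, 2_000),
}

# Soonest-to-empty first: Changi (4h), Orchard (12.5h), Jurong (40h).
refill_order = sorted(atms, key=lambda a: hours_to_empty(*atms[a]))
```

In practice the withdrawal forecast itself is the hard part — demand varies by location, day and season — but the scheduling step reduces to a sort like this.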
The bank also moved its financial risk information and the data required for
regulatory reporting onto the Cloudera platform, simplifying these processes.
Mr Gledhill said, “We’ve applied it to a whole range of
different use cases and every single one, we see a massive uplift in terms of
the base case that we normally do.”
This has also been aided by the huge active worldwide community
of Hadoop contributors. It includes not just individuals but also tech giants,
such as Netflix, Amazon and Facebook (the platform itself was inspired by
technologies created inside Google). So, the platform keeps evolving and
improving steadily, and DBS can build on the contributions made by this vibrant community.
DBS wanted to make the data analytics capabilities available
to everyone in the bank, as opposed to having a separate team of data scientists
or little pockets of analytics.
However, the oft-repeated cliché of technology being easy
and the ‘people’ aspect being hard was true.
The more difficult part was opening up people’s minds to the
possibilities. The first few use cases played a key role in overcoming
scepticism. They generated a high level of interest and enthusiasm among
different teams within the bank. They began to explore how they could leverage
analytics in their area.
All this improvement in services and operational efficiency
has been achieved while reducing costs.
Mr Gledhill said, “We’ve seen anything in the region of 80%
reduction in operating cost in a much shorter build time. The real big benefit
lift though is the benefit it provides to the business. If you look at our
digitally engaged customers, we see material lift in how much revenue a digital
customer brings to the bank.”
This is an ongoing journey and DBS expects Cloudera to help
them continue along the path towards deeper, better insights.
The state of Punjab has inaugurated its first hi-tech integrated command and control centre (ICCC), which will supervise 1,401 closed-circuit television cameras that have been installed across the city of Ludhiana.
The ICCC will monitor traffic, LED lights, sewage treatment plants, common effluent treatment plants (CETPs), rooftop solar panels, and encroachments and defacements. It will oversee the revenue collection of the municipal corporation, including property tax, water and sewerage, disposal, and pet registration. It will measure air quality with data sourced from the central and state pollution control boards. It also has a GPS-based vehicle tracking system to monitor solid waste trucks, corporation vehicles, and city bus services.
As per reports, the centre was set up at a total cost of US$4.5 million. According to the state’s Local Bodies Minister, Inderbir Singh Nijjar, 330 more cameras are being installed in the city that will be linked to the ICCC. The cameras will also help to monitor secondary garbage collection points, compactors along the Buddha Nullah stream, and stray animals.
About 30 vehicle-mounted camera systems are also being installed on police and municipal corporation vehicles that will provide live-feed surveillance footage during protests, public gatherings, or other functions in the city. Additionally, 600 external IR illuminators with a 200-metre range will ensure better monitoring even during zero visibility. Officials believe the centre will bring sweeping change in the functioning of the civic body and police administration.
Punjab has been exploring the use of emerging technology in governance over the past few years. In 2020, it became the first state to roll out a business intelligence tool for big data collection. The tool was provided for free by the Ministry of Home Affairs (MHA). In 2021, the state announced it would integrate crime and criminal tracking networks and systems (CCTNS) following the roll-out of two data analytic tools. The systems enabled police officials in the field to analyse data in a web and mobile-based application. 1,100 tablets were given to police officials in the field and 1,500 mobile phones providing access to a comprehensive database were procured.
Other states around the country are also deploying technology to support public administration activities. Earlier this week, the southern state of Telangana inaugurated a US$ 75 million police ICCC, which will function as a nerve centre for operations and disaster management. It will collect information from multiple applications, CCTVs, and traffic systems for predictive policing.
The ICCC is divided into five blocks. Tower A is the headquarters of the Hyderabad City Police Commissionerate. Tower B is the Technology Fusion Tower that hosts backups-related units like Dial-100, SHE safety, cyber and narcotics cells, and crimes and incubation centres.
Tower C has an auditorium on the ground floor and Tower D has a media and training centre. Tower E houses a command control and data centre for multi-department coordination and CCTV monitoring. The CCTV room will have access to around 922,000 cameras installed across the state.
Police can check footage of 100,000 cameras at the same time. The ICCC has space for artificial intelligence (AI), data analytics, and social media units. The building also has a sewage treatment plant and solar panels that can generate up to 0.5 megawatts. As much as 35% of the land area is dedicated to greenery and other amenities such as a gym and a health and wellness centre.
To better serve and protect communities, maintain data security at scale, and perform essential tasks, all government agencies must establish a strong, contemporary data infrastructure that supports data innovation.
Government and the public sector stand to gain considerably by adopting AI into every element of their job. Government AI must consider privacy and security, compatibility with old systems, and changing workloads.
Artificial intelligence is already being used to help run the government, with cognitive applications doing everything from reducing backlogs and cutting costs to handling tasks that humans cannot easily do, such as predicting fraudulent transactions and identifying criminal suspects using facial recognition.
While AI-based technology may fundamentally transform how public-sector employees do their jobs in the coming years — such as eliminating some jobs, redesigning countless others, and even creating entirely new professions — it is already changing the nature of many jobs and revolutionising aspects of government operations.
AI in government services is centred on machine learning and deep learning, computer vision, speech recognition, and robotics. When used correctly, these techniques yield real, measurable results.
Cyber anomaly detection, on the other hand, has the potential to transform cybersecurity strategies in government systems. The possibilities are endless, but they are only now taking shape.
The OpenGov Breakfast Insight on 4 August 2022 offered the most cutting-edge innovative method for enabling large-scale analytics in the public sector.
Public Sector Services Powered by Data and AI
Kicking off the session, Mohit Sagar, CEO & Editor-in-Chief, OpenGov Asia acknowledges that data and artificial intelligence will drive the future of government services. “With a unified data platform, the public sector will be able to better serve citizens and protect their communities.”
Governments, in general, are one of the world’s largest employers, with numerous ministries, agencies and departments. The vast network of offices and services introduces significant complexity, operational inefficiencies and, frequently, a lack of transparency.
Agencies must deal with massive amounts of data in various structured and unstructured formats, which will only increase over time. Moreover, they are unable to recognise or take advantage of the full potential of data and analytics due to legacy systems and traditional data warehouses. Data is, more often than not, siloed within individual agencies and departments, undermining their efforts to undergo digital transformation.
To generate real-time actionable insights and make data-driven decisions, data must be securely shared and exchanged at a scale. Giving government organisations and policymakers access to deeper, more relevant insights into decision-making is only possible through data modernisation.
Much of the information that government agencies oversee is extremely sensitive, including information about the nation’s infrastructure, energy and education, as well as personal health and financial matters. Data protection at every level of the platform must be ensured through tight integration with granular cloud provider access control methods.
The fact is that citizens stand to gain from the more individualised and effective services, enhanced national security, and wiser resource management that a robust data strategy can deliver.
By integrating data with analytics and AI, government agencies can readily access all their data for downstream advanced analytics and support complex security use cases.
With such a platform, government security operations teams can quickly identify sophisticated threats, minimising the need for manual effort through analytical automation and collaboration, and speeding up investigations from days to minutes.
Data stored by public sector bodies can be extremely valuable when shared with other departments and used to elevate data-driven decision-making. The time has come to leverage the cloud’s scale and democratise secure data access to enable downstream BI and AI use cases, allowing government agencies to accelerate innovation.
Governments can improve citizen services while implementing smarter and more transparent governance by leveraging data, analytics and AI for actionable insights at scale. It eliminates data silos and improves communication and collaboration across agencies to achieve the best results for all citizens, delivering personalised citizen services while achieving data security and cyber resilience for a satisfied population.
Building a Scalable Data, Analytics and AI Strategy with Lakehouse Platform
Data infrastructure is an essential aspect of data processing and analysis, according to Chris D’Agostino, Global Field CTO, Databricks.
The complete backend computing support system needed to process, store, transfer and preserve data is referred to as the “data infrastructure.” Without the appropriate data infrastructures, businesses and organisations cannot extract value from their data.
“If there’s one thing that many of us all have in common, it’s that we believe in the impact that data and AI can and will have on the world,” says Chris. “Today, data and AI are transforming every major industry.”
On the other hand, with the ongoing globalisation of artificial intelligence and machine learning, there is an increasing need to rethink an organisation’s whole leadership and thought process, from product strategy and customer experience to strategies to increase the efficiency of human resources.
Cloud data architectures contain the rules, models and policies that specify how data is gathered, stored, used and managed in the cloud within a company or organisation. They control the flow, processing and distribution of data across stakeholders and other applications for reporting, analytics and other purposes.
Every year, data collection by businesses and organisations increases thanks to IoT and new digital streams. In this climate, cloud data architecture-based data platforms are displacing more conventional data platforms, which are unable to handle the growing data quantities and increasingly demanding end-user applications like machine learning and AI.
Companies are using all available data to expedite, automate and improve decision-making to increase resilience and obtain a competitive edge in the market. These methods for digital transformation are supported by AI and data literacy.
To fully realise the benefit of data and AI, change management is necessary, just like with any change in working practices. It is essential to create a cohesive and evolving plan. This can be based on three pillars: business strategy, operationalisation and architecture (after the technology barriers have been recognised).
Whether it’s a business strategy, data management, or organisational knowledge, it’s critical to assess the organisation’s level of maturity and data literacy.
Databricks Lakehouse Platform combines the best elements of data lakes and data warehouses to deliver the dependability, strong governance, and performance of data warehouses while also allowing for the openness, flexibility and machine learning support of data lakes.
By removing the data silos that normally segregate and complicate data engineering, analytics, BI, data science and machine learning, this unified approach streamlines the current data stack. To increase flexibility, it is created using open standards and open-source software.
Additionally, its shared approach to data management, security and governance works more productively and develops more quickly.
In a global research effort in collaboration with an institution, Databricks polled 117 data leaders and the survey’s findings were illuminating and instructive.
An analytics leader’s biggest regret and issue was not embracing an open standards-based data architecture. “This didn’t surprise us. We are seeing many of our clients adopting the best open-source technologies,” Chris reveals.
In addition, the poll showed that only a small group of organisations are succeeding with their AI projects, while multi-cloud is a growing reality.
Most executives say they are currently evaluating or implementing a new data platform to address their current data challenges. During these challenging times, cloud technologies allow businesses to respond and scale rapidly.
With scalable data, analytics and AI strategy, organisations can create significant value. They can implement real-time monitoring, create tailored customer experiences, deploy predictive analytics, and much more. Databricks offers tools that are specifically designed to address the challenges described.
In Conversation With: The Future of Government Services and Shared Data
All the government agencies’ data must be protected and every component must be safeguarded. Unified data with analytics and AI makes it simpler to provide quick access for the organisation’s teams and complete support for security use cases.
Joseph Tan, Deputy Director (Capability Development), Data Science & Artificial Intelligence Division, Government Technology Agency, emphasised the importance of data modernisation with a holistic approach. A policy-driven approach that organisations can entrust their data to will lead to better customer service.
Joseph is convinced that “As technology advances, most businesses are confronted with issues caused by an existing legacy system. Instead of providing companies with cutting-edge capabilities and services such as cloud computing and improved data integration, a legacy system keeps a business constrained.”
A legacy system is outdated computer software or hardware that is still in use. The system still meets the needs for which it was originally designed, but it does not allow for expansion. Because a legacy system can only do what it does now for the company, it will never be able to interact with newer systems.
“A business might keep using an old system for more than one reason. In the world of investments, for example, upgrading to a new system requires an initial investment of money and people, while keeping an old system running costs money over time,” Joseph explains.
On the other hand, when a whole company moves to a new system, there can be some internal resistance and worries about how hard it will be and what might go wrong. For example, legacy software might have been made with an old programming language, which makes it hard to find staff with the right skills to do the migration.
Additionally, there might not be much information about the system, and the people who made it might have left the company. It can be hard to just plan how to move data from an old system to a new one and figure out what needs the new system will have.
Increased security risk, instability and inefficiency, incompatibility with new technology, company perception and new-hire training, single points of failure, and lack of information are a few issues that older systems run up against.
At best, outdated legacy systems are a pain, and at worst, they can seriously jeopardise an organisation’s overall IT security strategy. Furthermore, the longer a business waits to update a legacy system, the more challenging the transition will be.
System modernisation is almost always a must before digital transformation can occur. Most businesses won’t be able to fully profit from contemporary technologies and solutions without it. “With this, finding the right talent would be very beneficial for the organisation to manage their modern technologies,” says Chris.
Updating legacy systems has several advantages. Enterprises can enhance their IT security and sustain it by taking advantage of future vendor upgrades and fixes. Modern systems and solutions, including retrofitted legacy systems, are built to deliver optimal performance without consuming excessive amounts of computational power.
Even a legacy system may be modernised to include new features, giving the business users additional capability and a better user experience. The truth is that updated legacy systems require less input from IT staff, freeing them up to focus on activities that really benefit a company.
Similarly, governments all over the world will undergo a fundamental upheaval because of big data and artificial intelligence. Even though the public sector has long used data, the potential and actual use of big data applications have an impact on some theoretical and practical aspects of decision-making. This is fuelled by both the data revolution and the concurrent advancement of advanced analytics.
The availability of data that may be employed in the computer learning process is a major aspect of the maturing of AI technology and the practicality of AI applications to public policy and administration.
However, without the underlying analytical technologies, the data revolution can be seen as only a change in the size of the data that is currently available rather than a fundamental change. As predictive analytics, innovative data and artificial intelligence gain prominence, it is critical to understand their roles in the public sector.
At the start of their data journey, organisations require data capture systems to discover information embedded in all levels of business operations. Following that, the data must be validated for informational accuracy and integrated to reduce the risk of drawing incorrect conclusions and to create a unified view of the business.
The final step is analysis, in which businesses collaborate with data analysts who use cutting-edge analytics tools to peel back layers of proprietary data in search of insights to power change.
Larger companies with more complex data integration and analytics processes can add predictive analytics as the fourth step.
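The capture, validate, integrate and analyse steps above can be sketched on toy records. This is an illustrative sketch only — the field names, validation rules and sample data are invented, and real pipelines would use dedicated data-quality and integration tooling.

```python
# Toy version of the capture -> validate -> integrate flow: records are
# checked for basic accuracy, then merged into a unified view keyed by id.
# Field names and rules are invented for illustration.

def validate(record: dict) -> bool:
    """Basic accuracy checks before a record enters the unified view."""
    return bool(record.get("id")) and record.get("amount", -1) >= 0

def integrate(sources: list[list[dict]]) -> dict:
    """Merge validated records from several systems into one view per id."""
    unified: dict = {}
    for source in sources:
        for rec in source:
            if validate(rec):
                unified.setdefault(rec["id"], {}).update(rec)
    return unified

# Two hypothetical source systems describing the same customer.
crm = [{"id": "c1", "name": "Acme", "amount": 120.0}]
billing = [
    {"id": "c1", "amount": 120.0, "overdue": False},
    {"id": "", "amount": 50.0},  # rejected by validation: missing id
]

view = integrate([crm, billing])  # one unified record for "c1"
```

The analysis step then runs against `view` rather than against each source separately, which is what reduces the risk of drawing conclusions from inconsistent copies of the data.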
When analysing enormous datasets (often referred to as “big data”), predictive data analytics, also referred to as advanced analytics, uses autonomous or semi-autonomous algorithms to make predictions based on information patterns. Data analysts may provide clients with greater service, which can result in more meaningful transformations, by delivering deeper insights into company data more quickly.
Think about how AI and machine learning might be used in the context of the data processing flow. Analytics tools assist data analysts in identifying areas for improvement in the business after private data has been collected, analysed and combined into a single view.
AI excels at discovering data patterns that humans cannot perceive, and it scales readily with the size of the dataset. To make data analytics frictionless, machine learning algorithms can also adapt to data pipeline inputs and patterns of human behaviour. This can be accomplished by using natural language processing to recode communications between individuals within an organisation so that algorithms can understand and act on them.
Artificial intelligence and machine learning have become the “next big thing” in the government sector, applying these advanced analytics techniques to the enormous datasets that agencies hold.
Smart solutions enable advances that are self-sustaining, and AI and ML are at the heart of these. Executives and practitioners agree that AI and ML are catalysts and drivers across both the public and private sectors. As an AI system gains a deeper understanding of data platforms and processes, it can continue to enhance its efficacy and its capacity to provide personalised insights from massive data silos.
In closing, Chris shared that Databricks was established in 2013 to assist data teams in resolving the most challenging issues facing the globe, and they have been investing in the Asia Pacific region to help this objective forward. “While there are countless possibilities, there are several challenges as well.”
It is insufficient to merely fund and use AI technologies. Businesses and organisations need a talent pool of experts that can use these AI tools in a way that can guarantee the greatest outcomes.
Currently, customers from a wide spectrum of businesses are collaborating with Databricks to tailor their clients’ experiences to improve their capacity to react to market dynamics and safeguard both their own and all stakeholders’ interests. This is most evident in real-time for financial services organisations to help deal with fraud.
“My particular favourite is Databricks’ assistance in Mitsubishi Tanabe’s efforts to quicken drug clinical trials in Japan. The possibilities for our collaboration are virtually endless,” Chris reflects.
Mohit recognises that digital transformation is vital in today’s VUCA environment. What is essential is that industry and government collaborate and work together. For long-term success and sustainability, there have to be partnerships between the public and private sectors.
Strategic alliances give businesses and government agencies a competitive edge. Partnerships are mutually beneficial, helping each partner grow and improve. When people genuinely try to help each other, “it can help to get over certain weaknesses and be first movers in their field.”
The development of the National Capital City (IKN) of the Archipelago has made the integration of spatial data and non-spatial data very strategic. “We need to push for precise, good spatial data that can be operated to support all development sectors in IKN,” says Muh Aris Marfai, Head of the Geospatial Information Agency (BIG) during the opening of the Regional Geospatial Information Network Coordination Meeting for Regency/City in East Kalimantan.
He added that the development of the Geospatial Information (GI) system in East Kalimantan is becoming increasingly important in line with the development of the IKN. Thus, Regional Geospatial Information Network Coordination Meetings were held to enhance the role of local governments in the construction of network nodes. The meetings also serve as an initial assessment of the condition of the regional network nodes.
The network node assessment covers five pillars: regulations and policies; institutions; human resources; technology; and standards for geospatial data and information. The government is seeking solutions for regions that raise problems and obstacles related to the construction of their respective regional network nodes.
Currently, only two areas in East Kalimantan have yet to be integrated into the Regional Geospatial Information Network (JIGD): Paser Regency and West Kutai.
One of the important factors in the development of the capital city is the availability of data, and the government hopes that Paser and West Kutai will soon build their JIGD nodes. The data in JIGD will later be integrated with the statistical and financial data being assembled under the One Data Indonesia (SDI) programme. Support from local governments in improving the operationalisation of the network-node functions of the Regional Apparatus Organisation (OPD) is therefore very important. This is necessary to integrate and synchronise Thematic Geospatial Information (IGT), as well as to resolve various spatial problems and conflicts.
Regulations and policies are also very important in realising the strengthening of regional network nodes. Coordination between stakeholders, academia, the private sector, and government partners must run synergistically so that the sharing of spatial data through network nodes can be carried out optimally.
Meanwhile, the nation’s Geospatial Information Agency recently signed an agreement with the Regent of Berau Sri Juniarsih Mas. The collaboration is for organising, developing and utilising Geospatial Data and Information in Berau.
Aris explained that Presidential Regulation Number 27 of 2014 concerning the National Geospatial Information Network (JIGN) regulates the need for the establishment of network nodes. Therefore, each region is obliged to organise a JIGD. Currently, there is the construction of the National Capital City of the Archipelago, so the integration of spatial and non-spatial data in the East Kalimantan region is very strategic.
The collaboration with BIG is an effort to foster the implementation of government affairs related to land use and development investment. The government hopes that the existing Geospatial Data and Information will be able to assist the decision-making process in the planning and programme of the Berau Regency Government.
In addition, the collaboration with BIG is a form of support from the Berau Regency Government for the One Map Policy (KSP) for the success of national development as the KSP can be used as a guideline for implementing regional policies that refer to one standard geospatial reference, one database, and one geoportal.
The KSP plays a very important role in addressing the problem of overlapping land use in the regions that hinder economic growth. The lack of certainty of land availability will greatly affect development investment.
The third update on the government’s measures to secure personal data has been released by Singapore’s Smart Nation and Digital Government Office (SNDGO). To increase transparency regarding how the Government utilises and safeguards citizen data, the Public Sector Data Security Review Committee (PSDSRC) made this annual update a fundamental recommendation.
The number of incidents involving government data increased from 108 in FY2020 to 178 in FY2021. While the number of data incidents reported has increased, none of these incidents was deemed severe enough to have a significant impact on the agency or the individuals affected.
The overall increase in reported data incidents mirrors trends seen in the private sector and globally, as data exchange and use continue to grow. The increase also reflects increased awareness among public officials of the importance of data security and reporting all incidents, regardless of severity.
Additionally, the government began developing the Central Account Management (CAM) Solution in August 2021 to improve the user account management process. The CAM solution automates the removal and deactivation of user accounts that are no longer required due to staff turnover. Since its launch in April 2022, 32 per cent of eligible Government IT systems have been configured for CAM onboarding.
In May 2022, the government also launched the Whole-Of-Government (WOG) Data Loss Protection (DLP) Suite. The WOG DLP Suite prevents sensitive data from being accidentally lost from government networks, systems, and devices. To detect risky user activities, the WOG DLP tools employ technical and process controls.
Since its inception in December 2020, the Data Privacy Protection Capability Centre (DPPCC) has been developing data privacy protection toolkits that agencies can use to promote data protection without limiting its use. Furthermore, DPPCC has been collaborating with agencies to co-create solutions to strengthen data privacy and key system protection. To reduce the risk of data exposure, these solutions include dataset segregation and stringent encryption standards.
At the same time, the government recognises that while it is impossible to eliminate data incidents entirely, it must have the expertise and capability to respond quickly when they occur. The government held the first central ICT and Data Incident Management exercises in September 2021 to ensure that the public sector is prepared to respond to data incidents at the WOG level. 33 agencies from five Ministries participated in the exercises.
Developing the public sector’s capabilities and instincts in data management and security is an ongoing process. Since May 20, 2021, the government has launched a series of engagement campaigns and workshops aimed at all government employees. These campaigns and workshops are intended to raise officers’ awareness of the importance of using data securely and to educate them on how to do so in their daily work.
Meanwhile, OpenGov Asia earlier reported that the Personal Data Protection Commission (PDPC) and the Infocomm Media Development Authority (IMDA) have introduced the Privacy Enhancing Technologies (PET) Sandbox to support firms looking to prototype PET initiatives that address common business problems.
The goal of the PET Sandbox is to work with participants from the commercial sector to determine the appropriate PET to use for a given instance and their technological limits to generate guidelines and best practices that will promote more adoption. The PET sandbox will provide a safe environment and a testing ground for PET concepts to achieve this.
Overall, the government’s initiatives have helped to improve the data security posture of the public sector. Singapore will continue to strengthen its security efforts to protect both citizens’ and businesses’ data. The third update on the Government’s personal data protection efforts is available on the microsite “A Secure Smart Nation” (go.gov.sg/publicsector-data-security-review).
COVID-19 has caused significant disruptions in the domestic economy, as community restrictions have limited people’s movement and business operations. The silver lining in the global catastrophe is that the pandemic drastically accelerated digital transformation across sectors. Digital technology has become critical for nations around the world in dealing with the crisis, moving toward economic recovery, and resuming long-term goals.
The application of digital technology to economic activities resulted in the emergence of the “digital economy,” which is defined as an economic system that achieves rapid optimisation of resource allocation and high-quality economic development by identifying, selecting, screening, storing and utilising large amounts of data.
As a result of the pandemic, many new digital businesses were established, and others abandoned traditional approaches in favour of tech-enabled strategies. Digitalisation provides a competitive edge for a country when used in conjunction with complementary policies and initiatives. The value of digitalisation is best harnessed when complementary technologies, resources, and capabilities are properly utilised along with appropriate legislative frameworks.
Malaysia, too, has had a forceful and robust response to the pandemic. Proactive and calibrated policies are assisting in the protection of vulnerable people and the revitalisation of the Malaysian economy. The country unveiled the MyDIGITAL strategy, a combination of re-evaluated efforts and new initiatives designed to develop Malaysia’s digital economy.
Discussing the latest research and case studies on the current use and possibilities for cloud computing was the focus of the OpenGovLive! Virtual Breakfast Insight on 27 July 2022 for senior digital executives of the Malaysian public sector.
Cloud Computing: An Enabler for Digital Government
Kicking off the session, Mohit Sagar, CEO & Editor-in-Chief, acknowledged that the COVID-19 crisis forced nations to move quickly to provide unprecedented emergency assistance to keep citizens safe, households and businesses afloat, protect jobs and incomes and keep the economy from collapsing.
Due to the unprecedented restrictive movement measures, people almost completely shifted to remote functioning – whether for work, education, entertainment, banking or commerce. This caused a massive uptick in the number of transactions online.
Businesses and agencies using cloud-based technologies were able to continue their operations without interruption. Others who were unprepared quickly realised the need for the deployment of digital solutions and platforms, infrastructure, data storage and processing capacities to adapt to the new normal.
What was formerly seen as either non-essential or difficult is now the preferred method of functioning for many. People have tasted, appreciated and gotten used to the ease and effectiveness of utilising information and communications technologies. While many people have had to return to conventional offices there is a continued preference for a hybrid model.
The entire shift demonstrated how critical it is to have stronger platforms for both the public and private sectors. Digitalisation has proven to be too effective to pass up given the promise of higher and safer living standards and greater social inclusion irrespective of the environment.
One of the primary enablers of this move to digital is cloud computing technology. It has enabled the delivery of government services in a more agile, fast and cost-effective manner than with traditional information technology infrastructure. Public service can be future-proofed by migrating government systems to the cloud and incorporating its full capabilities into new digital solutions.
Be that as it may, many governments still struggle to use cutting-edge technology effectively to deliver better services to citizens.
Some more technologically advanced nations have demonstrated how a cloud strategy made ever-more-innovative ways of improving public service delivery possible. Yet the deployment of cloud computing in many other nations’ public sectors still faces obstacles. This requires revising or creating government-wide policies to establish regulatory conditions more suitable for a robust cloud strategy.
In Malaysia, MyDIGITAL was set up as a national initiative to show how the government wants to transform the country into a digitally driven, high-income nation in the digital age. It is intended to support national development initiatives including the Wawasan Kemakmuran Bersama 2030 (WKB 2030) and the Twelfth Malaysia Plan (RMKe-12).
To make Malaysia a nation that grows sustainably, with fair economic distribution and equitable, inclusive growth, the digital economy was selected as a key economic growth area (KEGA) in realising WKB 2030.
With the help of MyDIGITAL, Malaysia will be able to successfully convert into a highly prosperous, digitally driven country that leads the region in the digital economy.
Responding to Urgent Necessity with Innovation
Seng Heng Chuah, Malaysia Country Manager, Public Sector, Amazon Web Services emphasised that while COVID-19 has disrupted traditional teaching methods, it has also prompted a rethinking of how education can be delivered.
As forward-thinking educational institutions reimagine their delivery models, they are paving the way for new ways to equip students with the skills necessary to succeed in the digital economy.
With this, the AWS Educate Programme provides resources for students and educators to build cloud skills. It is used in Malaysian educational institutions such as the Asia Pacific University of Technology and Innovation (APU).
At APU, they used serverless tools like AWS Lambda to run a secure multi-platform mobile application to improve the user experience for both their staff and students.
“This not only helped the university to reduce user complaints by 65%, but it also empowered the university to achieve 116 times faster delivery of educational resources,” says Chuah.
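The serverless pattern behind such an application can be sketched with a minimal AWS Lambda handler in Python. This is an illustrative sketch only: the event shape follows the standard API Gateway proxy format, but the endpoint, parameter names, and canned response are invented, not APU's actual application.

```python
import json

# Minimal sketch of a Lambda function behind an API Gateway endpoint.
# The endpoint, parameter names and response body are hypothetical.
def lambda_handler(event, context):
    """Return a student's timetable for the requested student ID."""
    params = event.get("queryStringParameters") or {}
    student_id = params.get("student_id")
    if not student_id:
        return {"statusCode": 400,
                "body": json.dumps({"error": "student_id is required"})}
    # A real deployment would query a backing store such as DynamoDB;
    # a canned response keeps this sketch self-contained.
    timetable = [{"course": "CS101", "time": "09:00"}]
    return {"statusCode": 200,
            "body": json.dumps({"student_id": student_id,
                                "timetable": timetable})}

# Lambda invokes the handler with an API Gateway event; simulate one locally:
response = lambda_handler(
    {"queryStringParameters": {"student_id": "A123"}}, None)
```

Because the function runs only when invoked and AWS manages the underlying servers, this style of deployment is what lets a university scale a mobile backend without provisioning infrastructure for peak load.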
He is excited to note that the most advanced cloud customers in Malaysia come from the Education sector. They don’t just use the cloud for R&D but also for day-to-day operations. They get students to develop solutions on the cloud because that’s the future of IT – it’s all about software and services.
More public sector agencies are utilising the cloud, and this development is widespread across all nations. As cloud use grows, citizens are demanding more intelligent applications beyond e-government portals. They want to communicate with the government more effectively, perhaps through chatbots and similar platforms.
The cloud is a vital facilitator of the digital economy, which is seen as a growth driver by many nations, whereas high-performance computing (HPC) is used most efficiently in the science and research sectors. Similarly, the government can also use the cloud to implement solutions for the Internet of Things (IoT), Blockchain (BC) and Artificial Intelligence (AI), which would be expensive and time-consuming to implement on-prem.
“At the end of the day, it’s about creating a better environment for everyone to thrive. And that includes carving out a space for local innovation – something we’re passionate about,” Chuah firmly believes.
Many government organisations use the cloud to enhance the delivery of their services. For instance, the Department of Statistics Malaysia (DOSM) moved to the AWS Cloud and made its census data accessible to 9 million consumers.
When DOSM switched from maintaining expensive on-site infrastructure to using resources from the cloud, the government could save 25%–50% on resource expenses. This improvement enabled DOSM to manage all traffic on the census portal, even at its high of 200,000 users.
AWS and the government are working together to set up a hybrid cloud data centre using AWS Outposts. The hybrid cloud data centre, intended solely for federal government use, is expected to be available by the end of the year.
Also, while most apps may be easily moved to the cloud, others must first be re-architected or “modernised,” and some must stay on-premises for the foreseeable future due to low latency and local data processing requirements, or data residency. These programmes must be installed in on-site datacentres, branch offices, manufacturing facilities, dining establishments, edge nodes in major metro areas, 5G networks, and other distant places.
Chuah shared three trends that underpin the need to support applications that may need to reside outside of traditional cloud regions and availability zones, in addition to existing legacy on-premises and edge workloads.
The first trend is the emergence of a new class of ultra-low latency applications, such as real-time gaming, video streaming, AR/VR, autonomous vehicles, content creation, engineering simulations and ML inference at the edge. These applications are used in on-premises datacentres, branch offices, hospitals, factory floors, retail locations, on the outskirts of cell tower sites and near groups of artists, scientists and engineers. End users frequently access these ultra-low latency applications via mobile devices, so they must be deployed at the 5G network edge to benefit from 5G’s speed and bandwidth benefits.
The processing of local data is a second trend. Customers’ digital transformation initiatives and increased use of IoT devices are producing massive amounts of data. Due to cost, size, bandwidth or scheduling limitations, some of these data sets must be processed locally because they can’t be transferred to the cloud.
Data residency is the third trend. Due to security and tax laws, data sovereignty and shifting geopolitical factors, customers may be required to keep their data in a certain nation, state, or municipality. When a customer’s data residency requirements cannot be met by a cloud region, they must maintain and/or install additional IT infrastructure to support those workloads.
Sustainability, according to Chuah, is still AWS’ top focus; but as they step up their fight against COVID-19, they haven’t forgotten about another pressing global issue: climate change. “We are dedicated to developing a sustainable business for both our clients and the environment.”
This commitment to sustainability is seen in AWS’s co-founding of The Climate Pledge in 2019. Under the pledge, the company aims to power all its operations with renewable energy by 2025 and to achieve net-zero carbon emissions by 2040.
As a significant technology firm, AWS is aware of its environmental impact and of the steps it can take to lessen it. Organisations can cut their energy use by up to 76 per cent by migrating to the cloud. “Ensuring we have the right components to thrive in this digital economy is necessary by building a plan, assessing the readiness for the cloud, and migrating and modernising the workloads.”
All innovations and initiatives shared by Chuah demonstrate not only the revolutionary power of digitisation and modernisation, but also the resilience of the human spirit in the face of hardship.
The Cloud Imperative – A Leadership Question
Andre Mendes, Chief Information Officer of the US Department of Commerce, shared some examples of his experiences over the last decade where cloud hosting became a de facto standard.
In 2009, the Special Olympics became almost 100% Cloud-based globally and they had a testing ground for many vendors with a minimal budget. This resulted in the Special Olympics experiencing no disruption and increased its athlete base from 1.4 million to 4.5 million over the next ten years.
“This is a sample of massive progress despite resistance,” says Andre. “Even though many donations came in from large IT players and lower-risk non-profits, there were sceptics and risk avoiders.”
Another example is the US Department of Commerce’s International Trade Administration (ITA), which became 100% cloud-based in 2018. Despite a minimal budget, ITA maintained a fully integrated environment spanning IT, communications and telephony.
From the start of the pandemic, ITA maintained seamless operations and led the way with Zero Trust Architecture (ZTA) and Borderless Networks as it reinvested in custom agency software functionality.
“Leadership must be adaptable as the environment evolves,” Andre reiterates. “Leaders identify a fundamental shift in the competitive environment and act to mitigate a potential disruption or, better yet, gain an advantage by seizing new opportunities before competitors do. For most businesses, digital transformation begins from the outside in.”
Even the most forward-thinking transformation strategies are doomed to fail if they do not place equal emphasis on the inside and outside of the organisation.
According to Andre, an organisation needs to support the shift and the new working methods that come with it; therefore, leaders must spearhead the transformation. Any digital transformation programme, including cloud migration, is more likely to be successful when the leadership is on board.
As more businesses migrate to the cloud, a growing number of internal cloud migrations happen as businesses switch between multiple cloud providers. It’s crucial to evaluate the organisation’s requirements and identify the variables that will control the transfer, including historical data, critical application data and application interoperability.
Next, it is necessary to classify data to identify which needs migrating, and what needs scrubbing. Determining these requirements will help the organisations create a sound plan for the tools they’ll need during migration. They’ll also be able to choose the right destination volumes and decide whether the data needs to be encrypted at rest and in transit.
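The classification step described above can be sketched as a simple mapping from a dataset's sensitivity tag to a migration decision. This is a hypothetical illustration, not any agency's actual tooling: the sensitivity tiers, dataset names, and rules are all invented for the example.

```python
# Hypothetical rules mapping a sensitivity tier to migration requirements.
# The tiers and decisions are illustrative, not a real classification scheme.
SENSITIVITY_RULES = {
    "public":       {"migrate": True,  "encrypt_at_rest": False},
    "internal":     {"migrate": True,  "encrypt_at_rest": True},
    "confidential": {"migrate": True,  "encrypt_at_rest": True},
    "restricted":   {"migrate": False, "encrypt_at_rest": True},
}

def plan_migration(datasets):
    """Map each dataset to a destination and encryption requirement
    based on its sensitivity tag."""
    plan = {}
    for name, sensitivity in datasets.items():
        rules = SENSITIVITY_RULES[sensitivity]
        plan[name] = {
            "destination": "cloud" if rules["migrate"] else "on-premises",
            "encrypt_at_rest": rules["encrypt_at_rest"],
            # Data in transit is always encrypted during migration.
            "encrypt_in_transit": True,
        }
    return plan

inventory = {"census_summaries": "public",
             "hr_records": "confidential",
             "citizen_ids": "restricted"}
plan = plan_migration(inventory)
```

Producing an explicit plan like this before any data moves is what lets a team choose the right migration tools and destination volumes, rather than discovering encryption or residency requirements mid-transfer.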
“Always look for innovation; the biggest risk is not moving forward,” Andre is convinced. “Innovation represents the enhancement of something that has already been, and the most innovative people will eventually experience long-term entrepreneurial success. Consumers and peers recognise businesses as true innovators and leaders when they take the biggest risks, close the widest gaps, and seize the newest chances.”
Following the informative talks, the delegates took part in discussions encouraged by polling questions. The goal of OpenGovLive! Virtual Breakfast Insight is to provide live audience engagement, inspire participation, and allow people to learn from and grow professionally from real-life experiences.
On being asked what the delegates’ cloud strategy was, most responded with a hybrid cloud. Delegates said that they could use cloud services where they are most effective while keeping certain operations on-premises or within a private cloud. This allows for greater flexibility.
On how organisations evaluate the success of their cloud adoption, the majority opted for high availability/downtime management, while others chose resource productivity, efficiency and cost savings.
Most organisations lack a system for evaluating the success of cloud adoption. Furthermore, there isn’t much information available on assessing the success of cloud adoption within an enterprise.
Delegates said that the number one criterion for choosing a cloud service provider is still price, followed by security and performance.
A delegate felt that before a business can effectively choose a good provider, it needs to know what its business needs are. When organisations know precisely what they need in terms of technical, service, security, data governance, and service management, they can ask their small group of potential providers better questions.
On being asked what they thought were barriers to going digital and using the cloud, management support and budget were seen as the greatest ones.
With speed and agility being a clear advantage of cloud adoption, cost can quickly become a barrier to success. Adopting the cloud makes deploying more environments and leveraging more resources easier and quicker, but it can also drive up costs and create significant security issues for careless teams.
In the last poll, the delegates were asked how they planned to update their legacy and application systems. The majority answered assessing applications to move to the cloud, while others planned to work with a cloud service provider or outsource to a system integrator.
For many organisations, legacy systems are seen as stifling business initiatives and processes; however, they have begun to recognise the importance of modernisation to help their business grow.
Mohit agrees that scaling a firm and preserving profitability calls for developing partnerships that simplify digital transformation for customers. “Partnerships are the way forward for companies who want to use the cloud.”
To market, sell, create, integrate, customise, deploy, and support new applications on-premises, in the public cloud, or in hybrid cloud architectures, partnerships can offer the necessary knowledge.
“Public service must be genuinely available for the citizens and Cloud is the future,” says Mohit. “Partnerships enable providers to diversify their offerings by adding things like managed security, IoT solutions, and analytics.”
In the complicated, developing world of cloud computing, IT companies often collaborate for financial gain, but increasingly they do so because customers expect things to simply work.
Andre was delighted to be invited as a speaker and was encouraged by the fact that many young people, particularly women, are represented in the IT arena and that many new skills will be developed.
Chuah said that the pandemic has taught the world that “changing is the only constant”. People can be confident that they can bend, respond and adapt without breaking when life throws them a curveball if digital innovation is at the core of a long-term economic plan.
AWS is steadfast in its stance. It is passionate about driving the public sector’s digital transformation and is committed to the development of cloud computing services, catalysing the development of sustainable digital government. With the appointment of the Cloud Service Provider (CSP) panel, it will continue to deliver on its commitment to support the Government’s strategic initiatives.
A cloud migration needs a clear plan and vision as the first step, and the “only way to go to the cloud is to try and be confident to use it.”
Chuah spoke about the AWS Migration Acceleration Programme (MAP), a comprehensive and tried-and-true cloud migration programme. Enterprises’ migrations can be complex and time-consuming, but with an outcome-driven methodology, MAP can help them accelerate their cloud migration and modernisation journey.
“We remain focused on supporting Malaysia to lead in today’s digital economy as we leverage our global experiences with more than 7,500 public sector agencies to enable our customers through Cost Savings, Staff productivity, Operational Resilience and Business Agility.”
Cloud computing services, artificial intelligence (AI), the Internet of Things (IoT), 5G technology, fixed broadband Internet, and blockchain technology are expected to lead the information technology and telecommunications sector over the next few years. According to a recent survey, technology companies are investing in core and fundamental technologies to serve digital transformation. Encouraging the digital transformation of business could be a crucial step as the Department of Enterprise Management earlier estimated that the country’s gross domestic product (GDP) could surge by US$30 billion if the country successfully digitises its small and medium-sized enterprises (SMEs).
Cloud computing services in Vietnam are forecast to develop with better security than physical servers, helping organisations increase productivity and reduce machinery and infrastructure costs. Vietnam’s cloud computing market is predicted to grow by nearly 26% annually, the fastest pace in Southeast Asia and higher than the global average of 16%.
About 66.67% of enterprises are applying AI to their digital transformation process. AI can manage and optimise infrastructure and customer support and is expected to reach all businesses in the future. Last year, the government issued a national strategy on the research, development, and application of AI until 2030, aiming to gradually turn Vietnam into an innovation and AI hub in ASEAN and the world.
Internet of Things
The rate of firms using the IoT this year has reached 86.67%, an increase from 66.67% in 2021. IT experts believe it has the greatest development potential at present. Businesses can connect to IoT devices remotely, collecting and managing their data, processing data on demand, and sharing data with devices outside the IoT network. In this way, the IoT can automate processes, minimise labour and infrastructure costs, manage supply chains, optimise energy use, and improve sustainability.
5G is expected to contribute about 7.34% to the country’s GDP by 2025, according to the Institute of Information and Communications Strategy under the Ministry of Information and Communications (MIC). The application of 5G services will help telecom enterprises boost the use of AI and IoT in smart city building and business operations and meet digital users’ demand for high-definition videos, virtual reality, and augmented reality.
According to the MIC’s Authority of Telecommunications, telecoms network infrastructure has been expanded to 100% of communal-level localities. The 2G, 3G, and 4G mobile networks have covered 99.8% of the population while 5G has been piloted in 16 provinces and cities.
By the end of last year, there were nearly 71 million mobile broadband subscribers and 18.8 million fixed ones, a 4% and 14.6% increase from 2020, respectively. Internet traffic in Vietnam also rose by over 40% last year. Currently, the proportion of adults using smartphones in Vietnam is 73.5%. Vietnam aims to increase the rate to 85% by the end of 2022.
With an increase in blockchain technology applications and rapid, large-scale digital transformation, Vietnam has the potential to compete in the global market and become a hub for technology. It is estimated that by 2030, blockchain will create 40 million jobs, and 10-20% of the country’s economic infrastructure will run on blockchain-enabled systems.
The Department of Statistics Malaysia (DOSM) inked a Memorandum of Understanding (MoU) with the Asia Pacific University of Technology and Innovation (APU), in driving data mining, data analysis and data analytics training among academicians and students.
The signing ceremony was held recently on APU’s campus situated at Bukit Jalil and was signed by the Chief Statistician of Malaysia and the Vice-Chancellor of APU. It was witnessed by senior officials from both institutions.
Through this MoU, DOSM and APU agreed to the sharing of data analysis, research sharing analysis results and new findings. The MoU also implements collaboration in the areas of statistics research and knowledge exchange pertaining to information supply, storage, exchange, and improvement of official information as mutually agreed upon and allowed by the law.
In addition, the partnership will also produce subject matter experts for both parties to satisfy the current and future needs for data by using the latest statistical techniques, the application of data science and analytics and other innovative approaches.
For the aspiration of new knowledge, experience, expertise and research, this collaboration signifies a smart partnership between both parties. DOSM will provide selected micro datasets for research purposes and the development of new knowledge among the APU community.
It is hoped that academics and students will make full use of this data to enhance their skills in data mining using a large-scale dataset and therefore improving analytical capability as well as evidence-based decision making, Malaysia’s Chief Statistician said during his speech.
APU’s Chief Operating Officer, who delivered the welcome speech, noted that while DOSM is in the process of establishing the National Big Data Analytics and Data Processing Centre, APU has long been passionate about Big Data, having launched Malaysia’s first postgraduate Big Data or Data Science programme in 2015.
“We are thrilled to establish this relationship with DOSM, as we think we can go beyond getting data and contribute to the creation of predictive analytics for our decision-making on institutional as well as national levels,” he concluded.
In 2020, the Data Analytics Market generated revenue of US$22.99 billion and is projected to reach US$346.24 billion by 2030, growing at a 30.7% CAGR.
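As a quick sanity check, the compound annual growth rate (CAGR) implied by the quoted revenue figures can be recomputed from the standard formula (end / start) raised to 1/years, minus 1:

```python
# Recompute the implied CAGR from the figures quoted above:
# US$22.99bn in 2020 growing to a projected US$346.24bn by 2030.
start, end, years = 22.99, 346.24, 10
cagr = (end / start) ** (1 / years) - 1  # roughly 0.31, i.e. about 31%
```

The result is close to the quoted 30.7%, with the small difference likely down to rounding in the source's revenue figures.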
Data analytics underpins many of enterprises’ core processes, helping them manage, process, and streamline large data sets in real time and improving decision-making. It also helps firms develop a better understanding of their clientele and narrow massive amounts of data down to a targeted audience, which can markedly improve a company’s marketing. Many companies are increasing their adoption of data and business analytics to analyse the mammoth volumes of data generated in both offline and online trading.
Factors contributing to the upsurge in data analytics include the adoption of big data analytics software by organisations to deliver and enhance decision-making, and the competitive advantage that data analysis provides by allowing choices to be made promptly. Further, the introduction of cloud-based data analytics software among small and medium enterprises has positively impacted the market’s growth.