
Singapore OpenGov Leadership Forum 2022 Day 1: Spurring Organisations’ Capability in Augmented Intelligence, Big Data and Advanced Analytics

In an unprecedented period in history, governments and enterprises have been compelled to accelerate and bring forward their digital transformation strategies. The pandemic has vaulted governments and businesses into the next stage of digital transformation and online services.

Personalisation, efficiency and effective services are only possible with a comprehensive, 360-degree view of citizens and customers. This understanding is built on and powered by data. More than ever, data has become integral for organisations looking to get ahead of the curve, gain a competitive advantage and engage their customers more effectively.

To become a truly data-driven organisation that operates in real-time, agencies must deploy multiple modernisation initiatives, including application modernisation, artificial intelligence, machine learning, cloud, edge computing and analytics.

In that regard, Singapore has taken the lead in championing the use of data. Singapore has unveiled two new programmes to drive the adoption of artificial intelligence (AI) in the government and financial services sectors. It also plans to invest another SG$180 million ($133.31 million) in the national research and innovation strategy to tap the technology in key areas, such as healthcare and education.

The fund is on top of the SG$500 million ($370.3 million) the government has already set aside in its Research, Innovation and Enterprise (RIE) 2025 Plan for AI-related activities, said the Smart Nation and Digital Government Office (SNDGO) in a statement in November 2021.

These investments have been earmarked to support research in areas that address the challenges of AI adoption, such as privacy-preserving AI, and in areas of societal and economic importance, including healthcare, finance and education. The funds will also facilitate research collaborations with industry to drive the adoption of AI.

The future lies in harnessing data to deliver more effective and personalised services and the government has signposted the future with their policies. Agencies need a platform that draws together disparate applications, systems and teams with data being the backbone and making it easier to gain actionable insights. This platform should be able to unlock and repurpose the existing data for countless modern applications and use cases securely and efficiently.

The focus of the first day at the OpenGov Leadership Forum was aimed at unpacking the importance of data in empowering the public and private sectors to power mission outcomes, better serve citizens, ensure security and compliance, enhance IT efficiency and maximise productivity.

Morning Session

Powering a new world reality through data

Mohit Sagar: The critical importance of technology in a data-driven world.

Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, kicked off the session with his opening address.

“We are in the age of the Metaverse,” Mohit opens. “Cryptocurrency was once viewed with suspicion – banks denounced it and people called it a hoax. Yet in 2022, it has become the currency of the future.”

The fact is that the world is rapidly changing, and organisations need to stay ahead of the curve to remain relevant – and people have shown they can accelerate change. Organisations were able to rapidly change governance and personalise information for customers and citizens.

Today, responsive citizen engagement is more important than ever. Organisations can deliver faster, more personalised and interactive experiences for citizens and other agency stakeholders with event streaming. “Data, and universal access to it, is the key to transforming organisations,” Mohit asserts.

Citing the example of Regeneron, Mohit points out that after developing a COVID-19 treatment in mere months, Regeneron adopted a data catalogue and is developing a data governance framework to speed up its drug development pipeline.

“Information and insights are all there only if you have data at the drop of a hat,” says Mohit. Emphatically, he points out that while organisations are talking about the importance of Artificial Intelligence (AI), the key lies in utilising the new technology and truly embracing it.

“Habits are not shifting enough,” says Mohit. “The real challenge is getting people to truly understand how to make decisions through intelligence and not emotions.”

Mohit acknowledges that there are costs involved in having data in real-time but asserts that it is also the future. When the tools, people and technology are aligned, the big question is: what is the next step and how can organisations be more relevant?

In closing, Mohit urges delegates to partner with organisations that can help them strategise ways to leverage data. Data is the most essential ingredient and catalyst of our time and partnerships can allow organisations to transform their operations. Experts can assist organisations in delivering responsive citizen engagements and making their digital transformation journey smoother, cost-effective and impactful.

The value of connected data in digital transformation

Robin Fong: From collected data to connected data to build relationships

Robin Fong, Regional Director – ASEAN, Neo4j, spoke next on the use of graph data to contextualise and reveal connections, especially indirect connections among dispersed data.

“Having data and being able to run business intelligence and analytics is not enough,” Robin comments. “The next step is to be able to identify relationships.”

Data relationships create context and interrelationships create structure. Graph data adds context by capturing and storing relationships natively and processing them efficiently. That is how knowledge is created – when data is contextualised.

Governments and enterprises have been amassing lots of data and allocating large budgets to store them. It is now time to make sense of all the collected data to uncover hidden gems of insights, knowledge and wisdom by connecting them in a graph data platform.

Connectivity and networks require organisations to move from collected data to connected data, he explains. While organisations can take data and add some basic organising principles to create a knowledge base, the context is shallow and quickly ages because the underlying infrastructure is not built for relationships. However, if organisations can combine data, semantics and a graph structure, they will end up with a knowledge graph that has dynamic and very deep context because it is built around connected data.
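As a rough illustration of the shift from collected to connected data, the sketch below stores a handful of records as a graph and walks it to surface indirect relationships – the kind of context a native graph platform captures at scale. The nodes, relationship types and traversal here are invented for illustration and are not Neo4j's API:

```python
from collections import deque

# A toy "connected data" store: nodes with typed relationships,
# kept as an adjacency map (all names here are illustrative).
graph = {
    "Alice":  [("OWNS", "Acct-1"), ("KNOWS", "Bob")],
    "Bob":    [("OWNS", "Acct-2")],
    "Acct-1": [("TRANSFERRED_TO", "Acct-2")],
    "Acct-2": [],
}

def indirect_connections(start, max_hops=3):
    """Breadth-first walk that surfaces indirect relationships --
    context a graph platform stores and traverses natively."""
    seen, frontier, found = {start}, deque([(start, 0)]), []
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for rel, neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                found.append((node, rel, neighbour, depth + 1))
                frontier.append((neighbour, depth + 1))
    return found

for src, rel, dst, hops in indirect_connections("Alice"):
    print(f"{src} -[{rel}]-> {dst} ({hops} hop(s) from Alice)")
```

The two-hop result (Alice's account transferring to Bob's) is exactly the kind of link that stays hidden when the same records sit in separate tables.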

Robin shares that Neo4j is in the business of helping the world make sense of data. In fact, it founded the graph database category and is the world’s leading Graph Data Platform, adopted by thousands of organisations globally. Moreover, Neo4j is the only Graph Data Platform vendor in the GovTech data bulk tender for Data Science & AI.

Graphs are not new, Robin acknowledges. They are already deployed in many situations today among leading companies – in banking and e-commerce. Graphs are used extensively across a wide range of sectors and use cases – fraud detection, supply chain management, customer experience, compliance and privacy management, personalisation and recommendations, employee, customer, patient or product 360, medical research and cybersecurity.

Doctor.ai is a great use case from the healthcare industry. Neo4j powers the voice chatbot for Doctor.ai in its work with the Singapore Healthcare AI Datathon and EXPO 2021 with NUHS-NUS.

Graph data enables patients to quickly access their private health records, monitors health and provides advice, while also creating alerts and making doctor appointments. For doctors, graph data has enabled quick access to patients’ health histories, assists in the decision-making process, makes machine learning predictions and pushes the newest research.

Robin strongly suggests delegates consider potential business problems and use cases where connected data (graph technology) may be useful and relevant. He provides steps on how organisations can get started and encourages delegates to contact Neo4j for a Discovery Workshop.

Before bringing the presentation to an end, he invited delegates to connect with him and the team if they would like to explore ways Neo4j can help and support agencies in transforming their organisation.

Harnessing Graph Data Technology in establishing a Smart Government

Damien Wong – Building real-time Smart Government with data in motion

Damien Wong, Vice President, Asia Pacific & Japan, Confluent elaborated on the use of data in motion in powering smart governments.

Event streaming is a real-time infrastructure revolution that is fundamentally changing how governments think about data and build applications. Rather than viewing data as stored records or transient messages, data can be considered a continually updating stream of events. Event-driven architecture is the future of data infrastructure.
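The core idea can be sketched in a few lines: events land in an append-only log, and each consumer reads the stream at its own pace. This is a toy in-memory stand-in for illustration only, not the Apache Kafka API, and the event names are invented:

```python
# A minimal sketch of the event-log idea behind event streaming.
class EventLog:
    """Append-only log; consumers track their own read offsets,
    so the same stream can feed many independent applications."""
    def __init__(self):
        self.events = []
        self.offsets = {}

    def append(self, event):
        self.events.append(event)

    def poll(self, consumer):
        """Return the events this consumer has not yet seen."""
        start = self.offsets.get(consumer, 0)
        new = self.events[start:]
        self.offsets[consumer] = len(self.events)
        return new

log = EventLog()
log.append({"type": "licence_renewed", "citizen": "C-100"})
log.append({"type": "address_changed", "citizen": "C-100"})

print(log.poll("notification-service"))  # both events so far
log.append({"type": "payment_received", "citizen": "C-200"})
print(log.poll("notification-service"))  # only the new event
print(log.poll("analytics-service"))     # all three: independent offset
```

Because consumers keep independent offsets, a new service can be attached to the same stream later and replay history without disturbing existing readers.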

“The world is changing,” Damien opines. “The world has changed for the current generation because technology is shaping how businesses need to respond to these changing expectations. The younger generation has never walked into a bank branch, and likely will never understand why anyone would ever need to do so since everything can be done online today.”

Most organisations today are “becoming software.” Ride-hailing, he said, is an excellent example. Not too long ago, when people needed a taxi, they would call a taxi dispatch service, wait for the ride to be confirmed and look out for the vehicle to arrive – there was no information on how long the taxi would take to arrive or the ETA to the destination. Today, all that information is given almost instantaneously on apps.

Today, software is the interface, Damien is convinced. Software was there before, but rather than being an adjunct to the business, it has become the business. However, to make this transition, organisations have had to move on from relying solely on traditional data architectures. New architectures need to be fast and responsive, and batch processing has moved to real-time processing.

“Data systems need to be connected not treated in silos,” Damien emphasises. “In the new reality, services would be fast, in real-time and connected.”

This transformation is happening everywhere, and it is drastically causing people to rethink their approaches and systems:

  • Cloud: Rethinking Data Centres

The cloud has changed how organisations think about data centres and running technical infrastructure. Today, every company is moving to the cloud.

  • Machine Learning: Rethinking Decision Making

Machine learning has changed how decisions are being made, and this happens increasingly in an automated manner, driven by software that communicates to other software.

  • Mobile: Rethinking User Experience

Mobile devices and internet connectivity have dramatically changed the user experience of how customers interact with organisations and have raised the bar for expectations.

  • Data in Motion: Rethinking Data

Event streaming has changed how people think about and work with the data that underlies all the other trends.

“Data in Motion is the central nervous system for today’s enterprises,” he asserts. “And Apache Kafka is the event streaming technology powering Data in Motion.”

For Damien, the traditional use of data at rest is to consolidate data into a warehouse and apply analytics. Data in motion is, on the other hand, understanding the predefined actions that will be taken when encountering a specific event or data stream.
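The contrast can be made concrete with a small sketch: the same records analysed in batch after the fact (data at rest) versus a predefined action that fires the moment a matching event arrives (data in motion). The sensor events and threshold below are invented for illustration:

```python
# Data at rest: collect first, analyse later in batch.
warehouse = [
    {"sensor": "pump-1", "temp": 71},
    {"sensor": "pump-1", "temp": 98},
    {"sensor": "pump-2", "temp": 65},
]
avg_temp = sum(r["temp"] for r in warehouse) / len(warehouse)

# Data in motion: a predefined action fires as soon as a
# matching event is encountered in the stream.
alerts = []

def on_event(event, threshold=90):
    if event["temp"] > threshold:
        alerts.append(f"ALERT: {event['sensor']} at {event['temp']}")

for event in warehouse:  # imagine these arriving as a live stream
    on_event(event)

print(round(avg_temp, 1))  # batch insight, available only after the fact
print(alerts)              # real-time reaction, fired mid-stream
```

The batch average is useful for reporting, but only the streaming handler could have reacted to the overheating pump while it was still overheating.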

The rise of event streaming can be traced back to 2010 when Apache Kafka was created by the future Confluent founders in Silicon Valley. From there, Kafka began spreading throughout Silicon Valley and across the US West Coast. In 2014, Confluent was created to turn Kafka into an enterprise-ready software stack and cloud offering, after which the adoption of Kafka started to accelerate. Today, tens of thousands of companies across all kinds of industries the world over are using Kafka for event streaming.

If Kafka is the engine (the core technology), then Confluent is the ready-to-use product around it. Confluent is a natural candidate for real-time operations like command and control, cyber security and other anomaly detection solutions. It can enable event-driven architecture that helps modernise IT applications and hasten the addition of new citizen services or capabilities. Beyond that, as the data infrastructure for data in motion, Confluent helps organisations move towards multi-cloud, hybrid-cloud and DR operations.

In conclusion, Damien encouraged delegates to consider some questions as they navigate through the paradigm shift:

  • Are you looking to become a real-time smart agency? If so, how mature are you in leveraging data-in-motion platforms to support this?
  • What are some of the use cases you’re implementing around this?
  • Are there challenges that are holding you back from successfully making this transformation?

Damien affirmed the need for organisations to embrace the importance of real-time data if they want to stay relevant. Data in Motion is the ultimate key when it comes to delivering better services and empowering business missions.

Polling Results

Throughout the session, delegates were polled on different topics.

In the first poll, delegates were asked to vote on their priority in 2022. Half of the delegates indicated digital acceleration as their priority, followed by workforce transformation (33%) and tech modernisation (17%).

On what their biggest challenge was, a majority of the delegates (35%) indicated the lack of skilled staff who understand big data analysis. The remaining were split between the lack of quality data and proper data storage (30%), the inability to synchronise disparate data sources (15%), the inability to derive meaningful insights through data analytics (15%) and the inability to get voluminous data onto a big data platform (5%).

Concerning the maturity of their organisations in using data and analytics, 38% indicated that their organisations use performance dashboards to slice, dice and drill down. Other delegates indicated that they distribute static reports regularly (24.8%), combine data with predictive modelling, AI and machine learning techniques (24%) and use self-service analytics (14%).

The delegates were asked if they are familiar with the advantages of graph technology and how it will enhance their daily decision-making process. Just over half (53%) were familiar but not currently using it, while about a quarter (26%) were not familiar but interested to know more. The remaining delegates are familiar with and currently using this technology (21%).

On the common Data Integration/Connection challenge faced by delegates, most (35%) indicated disparate data formats and sources as the main challenge, while others expressed that low-quality or outdated data (29%) was. The remaining delegates face the challenge of data that isn’t available where it needs to be (24%), followed by the issue of having too much data (12%).

With regard to processing real-time data, most (65%) felt that they were emergent (some processes and knowledge, non-standardised), followed by limited: ad-hoc, unstructured, uncontrolled, reactive (29%), and structured: standardised, governance, scale, proactive (6%).

When asked about what would be important for a successful AI adoption in their organisation, an overwhelming majority (94%) indicated that starting small and building the business case by demonstrating initial wins would be important. The remaining delegates indicated aligning all departments on a single vision and garnering support (6%).

Asked about the essential tenet for ethical AI to work, most delegates (40%) believe in the need for an effective and practical ethical framework/governance model for AI. The other delegates were split between AI solutions that should allow for auditability and traceability (26.7%), guaranteeing privacy by design in machine learning systems (26.7%), and the importance of training AI models with carefully assessed and representative data (6.7%).

In the final poll for the morning session, delegates were asked what they would invest in, if they had an unlimited budget. Just over a third (35%) said they would spend on integrating disparate systems, followed by spending on resources to improve delivery timeline (29%), updating legacy technologies (18%), improving security and compliance (12%) and staff training / upskilling (6%).

Afternoon Session

Data Virtualisation in supporting advanced analytics

Elaine Chan – Advanced analytics and cloud modernisation with data virtualisation

Elaine Chan, Regional Vice President Sales – ASEAN & Korea, Denodo spoke about how data virtualisation can help with advanced analytics and cloud modernisation.

As data analytics and data-driven intelligence take centre stage in today’s digital economy, logical data integration across the widest variety of data sources, with proper security and governance structures in place, has become mission-critical.

According to the Denodo Global Cloud Survey 2021, cloud adoption is on the rise, with a 25% increase year-over-year in advanced cloud workloads. This indicates that more complex workloads are moving to the cloud and that COVID-19 has perhaps driven that increase.

Today, the hybrid cloud model remains in the lead, with more than one-third of users leveraging that architecture. Private cloud also saw good gains, while nearly 25% of workloads are still run on-premises.

One of the key benefits that cloud technologies provide is the ability to scale faster, although performance and ease of data management also provide strong benefits, identified by 31% and 20% of participants, respectively.

Data virtualisation allows for flexibility and access from anywhere, and lowers the cost of operations. However, there are also concerns that the transition to the cloud might create new data silos and introduce security and latency issues.

Elaine believes that there is a need for logical data architecture. “Data Fabric is the best path to data management automation,” Elaine opines. In layman’s terms, it can be broken down as follows:

  1. “Integrate data” from disparate data sources, on-prem and in the cloud
  2. Securely deliver an “integrated view” of the different data objects
  3. Consume the “integrated data” for analytics and operational purposes
  4. Automate the entire process using AI/ML
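A minimal sketch of the "integrate data" and "integrated view" steps above, assuming two invented sources: the virtual view federates at query time, so the physical data is never copied. This is an illustration of the idea only, not Denodo's product or API:

```python
# Two disparate "sources" -- one standing in for an on-prem CRM,
# one for a cloud billing system (names and fields are invented).
on_prem_crm = {"C-100": {"name": "Tan Wei", "segment": "SME"}}
cloud_billing = {"C-100": {"balance": 420.50}}

class VirtualView:
    """Delivers an integrated view; physical data stays in place."""
    def __init__(self, *sources):
        self.sources = sources

    def get(self, key):
        merged = {}
        for source in self.sources:  # federate at query time
            merged.update(source.get(key, {}))
        return merged

customer_360 = VirtualView(on_prem_crm, cloud_billing)
print(customer_360.get("C-100"))  # one integrated record, zero replication
```

Because the view queries the sources on demand, either source can be moved or modernised without the consumer noticing – the abstraction point Elaine's list makes about avoiding lock-in.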

According to Elaine, the Denodo logical data fabric sits between the data sources and the consumers and has a few characteristics:

  • Unified Data Integration and Delivery
  • Allows reusing existing analytics systems
  • Allows using the best system for each need
  • Abstraction: No Lock-In
  • Evolve / Optimise infrastructure without affecting data consumers
  • Dramatically Increased Productivity
  • Minimise data replication: virtual or smart, selective data replication

Breaking down the essential capabilities of data virtualisation, Elaine highlights six aspects:

  • Data Abstraction: Decoupling applications and data usage from data sources and infrastructure
  • Zero Replication, Zero Relocation: Physical data remains where it is
  • Real-Time Information: Most reporting and analytical tools can easily connect for real-time data
  • Self Service Data Marketplace: A Dynamic Data Catalogue for self-service data discovery and data services available in the virtualisation layer
  • Centralised Metadata, Security & Governance: Manage access across all data assets in the virtualisation layer for enterprise data security, with support for dynamic data anonymisation
  • Location-agnostic Architecture: For hybrid and multi-cloud acceleration

Delving into the use case of Statistics Netherlands, Elaine elaborated on the requirements that the data management team was looking for:

  • Create tailored reports for government agencies that want to change public policies for people who need extra support
  • Add new data sources without affecting the continuity of other public service agencies, while making them more agile in the process
  • Expand the data services supporting more teams without increasing infrastructure costs for storage and servers.

With Denodo, the logical data warehouse created using data virtualisation enabled Statistics Netherlands to create one access point to explore and access all data, bringing data to its fingertips. It also created a self-service culture for data consumers that is easy to use, while enabling Statistics Netherlands to implement security and governance by centralising authentication and authorisation.

Summing up the presentation, Elaine pointed out that good infrastructure is necessary to support more advanced analytics. Data virtualisation helps to complete enterprise information, combining web, cloud, streaming and structured data. It promises ROI realisation within six months, with the flexibility to adjust to unforeseen changes, and an 80% reduction in integration costs, in terms of resources and technology. Most importantly, there is real-time integration and data access, enabling faster business decisions.

She encouraged delegates to reach out to her directly if they have any queries about the journey towards data virtualisation.

Generating incisive insights through Graph technology

Tony Tan – Advanced Analytics and Machine Learning on Connected Data

Tony Tan, Co-Founder & Deputy Chief Executive Officer, Imperium Solutions spoke about why graph analysis is possibly the single most effective competitive differentiator for organisations pursuing data-driven operations and decisions after the design of data capture.

“Optimising the supply chain is tricky,” Tony opens. “Even after more than 50 years, billions invested in R&D, the building of complex ERP systems and advancements in operations management, we are still facing supply chain problems.”

For example, in Singapore, many people like to own cars, but recent BMW models do not come with touch screens, satellite radios, digital keys or stop-start engines. Manufacturers are good at supplying first-tier suppliers, but many of the problems are further downstream.

This begs the question – is there a technology that will bring everyone closer to solving issues? And if so, where are the opportunities in the bottlenecks? Tony believes that creating a breakthrough in solving recurring issues requires methods outside of what has been tried.

Drawing a parallel to the issues with the supply chain, Tony says that fraud has many facades. PwC published a report last year based on a survey of over 5,000 respondents conducted between 2019 and 2020, which claimed that US$42 billion was lost to financial fraud.

The majority of this is based on four types of fraud – customer fraud, cybercrime, asset misappropriation and bribery/corruption. However, there are others, such as accounting fraud, procurement fraud, deceptive business practices, AML/sanctions, tax, IP theft and anti-trust. Problems are aplenty, Tony claims, and they take significant time and energy to resolve.

Problems also abound in the metaverse. Blockchain is here to stay, Tony feels, and decentralised finance enables the open and transparent exchange of digital currency. However, such a new, unregulated system can also be a breeding ground for criminals and hackers, ripe for exploitation. With scammers on the rise, there is a need to establish relationships between users to identify scammers more efficiently.

Tony acknowledges that technology is the key to solving many of the issues that companies face – and data is at the centre of it. The operations of major companies – LinkedIn, Google, Netflix, as well as the largest bank in the US – are powered by graph technology. Gartner counts it among the top eight technologies of the near future.

With graph technology, relationships between data points are established, which enables people to swiftly locate information and redefines the way we look at data today. It allows going deep into relationships and can be used for a variety of problems and domains such as:

  • Companies, markets
  • Countries, history, politics
  • Sciences, art, teaching
  • Technology, networks, machines, applications, users
  • Software, code, dependencies, architecture
  • Criminals, fraudsters, terrorists
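As a small illustration of the fraud-detection use case above, the sketch below links accounts through shared identifiers (a phone number, an address); a traversal then surfaces a ring of accounts that are only indirectly connected – the pattern graph platforms exploit at scale. All data and names here are invented:

```python
# Shared-identifier links between accounts (illustrative data).
links = [
    ("acct-A", "phone-1"), ("acct-B", "phone-1"),  # A and B share a phone
    ("acct-B", "addr-9"),  ("acct-C", "addr-9"),   # B and C share an address
    ("acct-D", "phone-7"),                         # D stands alone
]

# Build an undirected adjacency map from the links.
adjacency = {}
for a, b in links:
    adjacency.setdefault(a, set()).add(b)
    adjacency.setdefault(b, set()).add(a)

def ring_containing(node):
    """Depth-first walk over shared identifiers; returns every
    account reachable from `node`, however indirectly."""
    stack, seen = [node], set()
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(adjacency.get(n, ()))
    return {n for n in seen if n.startswith("acct-")}

print(sorted(ring_containing("acct-A")))  # A, B and C form one ring
```

Accounts A and C never share an identifier directly, yet the traversal places them in the same ring through B – a relationship a row-by-row query over flat tables would miss.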

TigerGraph is currently deployed by 8 of the largest banks in the world, including Goldman Sachs, JPMorgan, Bank of America, and ICBC (China).

“The time to use graph is today,” Tony says. To face mounting challenges, there is a real need to harness the insights through graph technology which can amplify the connected data.

Polling Results for Afternoon Session

Throughout the afternoon session, delegates were polled on different topics.

In the first poll, delegates were asked to vote on their priority in 2022. Most of the delegates (48%) indicated digital acceleration as their priority, followed by tech modernisation (30%) and workforce transformation (22%).

When asked about what their biggest challenge is, a third (33%) indicated the lack of skilled staff who understand big data analysis as the biggest challenge. The remaining votes were distributed between not being able to synchronise disparate data sources (29%), the lack of quality data and proper data storage (17%), not able to derive meaningful insights through data analytics (8%) and the inability to get voluminous data onto big data platform (13%).

On organisation maturity in using data and analytics, a majority (41%) indicated that their organisations use performance dashboards to slice, dice and drill down. Other delegates have embedded visualisation into their processes and transactional systems (23%), distribute static reports regularly (18%), combine data with predictive modelling, AI and machine learning techniques (12%) and use self-service analytics (6%).

The delegates were also asked if they are familiar with the advantages of graph technology and how it will enhance their daily decision-making process. Most (40%) were familiar but not currently using it, while others are familiar and currently using this technology (33%), and the rest were not familiar but interested to know more (27%).

On the common Data Integration/Connection challenge faced by delegates, just over half (52%) indicated disparate data formats and sources as the main challenge, while others (18%) expressed that low-quality or outdated data was. The remaining delegates face the challenge of data that isn’t available where it needs to be (12%), followed by the issue of having too much data (12%) and the use of wrong integration software (6%).

On the maturity of their organisations in processing real-time data, the majority (44%) felt that they were emergent (some processes and knowledge, non-standardised). The rest were split between limited: ad-hoc, unstructured, uncontrolled, reactive (28%), and structured: standardised, governance, scale, proactive (28%).

When asked about what would be important for a successful AI adoption in their organisation, a huge majority (65%) indicated that starting small and building the business case by demonstrating initial wins would be important. The remaining delegates were split between aligning all departments on a single vision and garnering support (17.5%) and establishing clear lines of authority and ownership across the entire organisation (17.5%).

Asked about the essential tenet for ethical AI to work, about half (52%) believe in the need for an effective and practical ethical framework/ Governance model for AI. The others were split between the belief that AI solutions should allow for auditability and traceability (22%), the importance of training AI models with carefully-assessed and representative data (17%) and guaranteeing privacy by design in machine learning systems (9%).

In the final poll for the afternoon session, delegates were asked what they would invest in if they had an unlimited budget. Most (36%) would spend on resources to improve delivery timelines, followed by updating legacy technologies (35%), integrating disparate systems (21%) and staff training/upskilling (8%).

Closing

To conclude the day, Mohit emphasised the importance of understanding and harnessing data to derive insights that will help organisations stand out among competitors. Data is the new future that can help to improve services for an increasingly data-driven world.
