In response to recent global events that are reshaping consumer behaviour, many organisations are accelerating their digital transformation efforts. Digital transformation has gained importance and is perceived as a strategy for both survival and growth in the new normal. This has increased the need to use innovative technologies to create new business models, products, or services.
As decades-old IT systems responsible for running traditional workloads look to modernise, there is still a need — as there always has been — for reliable, scalable and secure infrastructure. One technology that is both synonymous with, and non-negotiable for, such efforts is the cloud. Cloud services are now imperative, making a real difference in ensuring that important enterprise services can keep running in almost all scenarios.
However, one common problem that financial services firms, government agencies and businesses face when moving to cloud services is ensuring that ongoing services run well even as the organisation migrates to newer solutions. Adapting to current technological trends while eliminating the risk of breaking existing systems and interrupting current business operations is vital.
Any organisation would baulk at the prospect of migrating to a cloud environment in one massive move. Incremental modernisation allows them to continue running their mission-critical applications on their current infrastructure while adapting and building new cloud-native applications in parallel.
Organisations across both the private and public sectors have begun to alter their perception of migrating workloads and applications to the cloud. Beyond a doubt, making a shift from a legacy to a managed cloud infrastructure is daunting on many levels. Discarding proprietary technology accumulated over the years can hold organisations back from making the move. Concerns over data latency and volumes linger, especially when it comes to streaming data using the public cloud. Having the right people, processes and systems is a serious consideration. Combined with the cost of technology, infrastructure and reorganisation, these can give good reasons for pause.
The need of the hour is for these organisations to see a reduction in infrastructure cost, the ability to scale up and support a several-fold increase in traffic, reduced time to deploy and a simplified production rollout and recovery process. Enterprises need to have a rapid roll-out of digital capabilities by improving overall time-to-market and reducing the total cost of ownership. A great solution that can ease transformation is to use container-based technology to develop, build, package and deploy applications and business solutions in a more efficient, secure and scalable way. Cloud-native solutions contribute towards a reduction in long-term operations costs, better system resilience, more efficient processes and enhanced security.
This begs the question: Do organisations have the capability to support cloud-native solutions to enable holistic improvement of infrastructure, to enhance the efficiency, scalability and security of their operations?
The OpenGovLive! Virtual Breakfast Insight held on 24 November 2021 aimed to help delegates understand ways to overcome the barriers to successful cloud migration and modernise infrastructure and application delivery to better serve citizens and customers.
Embracing the inevitability of a hybrid cloud reality
Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, kicked off the session with his opening address.
The pandemic has vaulted governments and businesses headfirst into the next stage of digital transformation and online services. In the region, the Singapore government has taken the lead, with investment in public cloud estimated to reach US$3.6 billion by 2023 and 70% of eligible government systems expected to be on the commercial cloud by the same year.
There is a need to rethink cloud strategy, Mohit asserts. There are too many legacy systems and organisations cannot afford to hide behind those systems anymore.
Currently, citizen happiness is the most important benchmark for governments and enterprises because it is about the uptake of the technology, Mohit contends. Acknowledging that the technology is here to stay, there is a need to look into upskilling the workforce. When planning, he says, “Technology has to be seen as an investment and not an expense.”
What cloud offers is the flexibility to rapidly respond to the changes demanded by digitally-savvy citizens. Agencies now have the capability to not only move workloads between on-premises data centres and public cloud but also make a change and upload data instantly.
Agencies that embraced cloud services proved more responsive and were able to continue operating remotely and serving their citizens, demonstrating agility, scalability and speed even amid a pandemic.
Against such a backdrop, organisations must boldly accept the new digital reality. They must harness technology to enhance the working experience and drive organisation goals in the new normal. And there are a lot of solutions available now, Mohit acknowledges. Global companies have been looking into the design of high-performance computing solutions that will tackle some of the world’s toughest challenges.
At the same time, navigating this shifting terrain must be done securely – there is a need to bake security into the process and tools. This means that security is readily built into the infrastructure across workloads and applications. Compliance and regulation are also an intrinsic part of creating a safe environment. Policy and guidelines are established to provide accountability and build trust with citizens and consumers alike.
But neither security nor compliance concerns are a reason to not transform. “Do not hide behind safety or governance,” Mohit cautions. “These issues should not deter people from embracing technology – they need to be confronted, not avoided.”
Ultimately, cloud is here to stay, Mohit concludes. Organisations can get a head start now or play the more tenuous game of catch-up later.
Firmly convinced that the transformation need not be done alone, he urges delegates to partner with organisations with the expertise to facilitate digital transformation. The process needs to be done at scale and with speed. The right partners bring a wealth of expertise and experience that will make the journey far easier to manage and navigate.
Exploring international use cases of hybrid cloud platforms
An Nguyen, Director, Cloud Solutions, Red Hat, spoke next on trends in hybrid cloud adoption.
An observes that there is a general shift of enterprise workloads to the public cloud, and even traditional data-centre workloads have begun moving there. Nonetheless, he also notes that the use of on-premises private cloud infrastructure is still significant, pointing to an inevitable shift to a hybrid cloud model.
An emphasised the need to be adaptive. For An, the added benefit of using hybrid cloud is the ability to offer better customer services through quicker feedback from consumers. Cloud providers can monitor and keep everything updated for the organisations. Apart from that, public cloud infrastructure providers can also give organisations the most complete view of what they are using.
To become agile, cloud is an essential component. While the implementation and focus of hybrid cloud may not be easy initially, An contends that there are tremendous benefits to be reaped. This includes improved security, application or data portability, automation and orchestration, ease of management or operations, ease of implementation or deployment and architectural consistency.
Leading companies have demonstrated possible use cases and the solutions that Red Hat offers, depending on an organisation’s needs. The company has done work across a wide range of sectors, and banking has been particularly active.
An shared that approximately 2,000 customers around the world are using OpenShift for mission-critical systems. Red Hat possesses a full-stack container platform and the operating systems to support applications to deliver the best business impact.
For Deutsche Bank, the impetus to implement multi-cloud technology stems from the desire to standardise and unify the platform underlying its myriad applications. The journey first began with the standardisation of the operating system, followed by optimisation through the OpenShift container platform. Since OpenShift serves all kinds of container workloads, it helped to streamline processes at Deutsche Bank and enable automation.
Red Hat also provided support and expertise to Amadeus, on how they can move towards cloud-native workloads and navigate the complexities of their unique situation. In another instance, BMW needed help with expansion into new markets without investing in building data centres. To do that, they needed standardisation to move the workload from one country to another seamlessly. By adopting OpenShift Dedicated, BMW could connect devices across different public cloud providers.
An emphasises that Red Hat’s OpenShift is the industry’s leading enterprise Kubernetes application development platform, helping customers deliver new customer experiences, open new lines of business, and modernise their existing application portfolio. He encouraged delegates to reach out to him should they have any queries on the hybrid cloud model.
Peering into a digital future: Taking pre-emptive steps to stay ahead of the game
John Baddiley, Head of Strategic Relationships, Bank of New Zealand, shared how BNZ approached the move to hybrid cloud and elaborated on the journey thus far.
BNZ has been around for 160 years and employs over 5000 staff all over the country. They have a strong focus on customer outcomes and experience, which has been reflected in the awards that they have been recognised for over the past few years.
John says that a long history has many benefits but can be a potential roadblock when it comes to digital transformation: when something has worked in the past, it can be difficult to let go of. However, BNZ has taken the step to change, modernise and transform its operations.
Cloud adoption is a key part of their transformation story. Rapid change, innovation and adoption mean that BNZ customers expect more every day. “Approaches to technology that worked five years ago will not work today,” he believes.
Monolithic solutions do not provide the flexibility required to adapt or the resilience an ‘always on’ world demands. He is convinced that systems need to be able to change rapidly and securely to meet the expectations of customers, regulators, and shareholders. At these crossroads, technology can be a strategic advantage or a strategic inhibitor.
The desire to adopt cloud was a strong driver for BNZ when they first embarked on the journey. He shared that the various intents and goals BNZ had in mind led them to take different strategies in the hybrid cloud transformation, depending on the application and its needs.
One strategy was to refactor applications so that they are cloud-native. They also had to bite the bullet and lift and shift some applications from on-prem to cloud environments, because previous best practices were no longer ideal.
One approach was to lift and shift applications that were closest to the mainframe. Another strategy was the ‘Outside In’ one, where the customer or banker facing applications were brought into a zero trust architecture. The third approach was to build applications that are cloud-native using their engineering foundations to deliver and consume cloud services.
However, John cautions that several dependencies must be addressed and operationalised before shifting or building applications on the cloud. These include:
- Engineering platforms, including integration, deployment pipelines, monitoring and management
- Cloud skills and experience; choose a model (uplift, capability enhancement, outsourcing) that is right for the workload
- Connectivity to and between the cloud environment(s) from legacy data centres
- Finance and cloud accounting capabilities to ensure that business units have visibility of current and forecast spend
- Patterns for cloud-native software architecture and application transition, which must be developed, shared and enforced
- Security standards and patterns, which must be defined and deployable
Speaking from experience, John says that organisations embarking on this journey need to learn to manage risk during transformation. BNZ has built a cloud governance ecosystem that integrates all aspects of governance, risk management and regulatory compliance, producing an ecosystem that helps to manage risk rather than leaving it to a single party.
Apart from that, John stresses the importance of engaging with regulators to give them confidence that the shift of the workload to the cloud is robust.
To that end, BNZ adopted CAST, a NAB-designed framework that defines minimum controls, standards and techniques for the adoption of cloud services. The framework consists of 7 perspectives (or areas of focus), each with minimum mandatory standards, techniques and controls for migrating workloads to public cloud services. BNZ applies CAST to all material workloads (those rated as heightened or extreme in the Application Inherent Risk Assessment) and assesses it for use with all workloads.
In closing, John highlights that multi-cloud treatment varies by business significantly. He stresses that the most critical processes and applications must be designed to migrate quickly if required.
Being able to stay agile, nimble and relevant is the ultimate key to surviving in a rapidly changing world.
After the informative presentations, delegates participated in interactive discussions facilitated by polling questions. This activity is designed to provide live-audience interaction, promote engagement, hear real-life experiences, and facilitate discussions that impart professional learning and development for participants.
On being asked about their organisation’s biggest challenge in digital transformation strategy, delegates were evenly split between culture (48%) and skills (48%).
A delegate opined that culture is the biggest challenge because digital transformation requires a different way of working, understanding finances and budgeting.
Asked which elements of transformation are the most challenging in their organisation, half of the delegates felt that (IT/software) architecture and development (50%) was the most challenging element. About 40% thought leadership was an issue, while 5% felt (IT) operations was of concern.
A delegate said they had difficulty in bringing legacy products onto cloud-native platforms within the healthcare industry.
Mohit agrees that the journey is not an easy one; however, it is an inevitable one. He reminds delegates that this is where experts can help. The technologies used in 2020 and 2021 were “band-aid technologies.” Organisations need to prepare themselves to be ready for the next hit. Digital transformation is not a strategy but something that needs to be deployed. Security and privacy cannot be a stumbling block on that journey, Mohit cautions them.
The next question inquired about the percentage of workloads delegates see themselves moving to the cloud over the next 3 years. Just under half (42%) indicated that more than 50% would be moved to the cloud, followed by 30% – 50% of the workload (37%) and 10% – 30% of the workload (21%).
Mohit firmly believes that a cloud-first policy is necessary because it is possible to have both on-prem and cloud services. Critical data can be kept on-prem but others can go into the new environment.
John adds that it is a question of capacity and finance. The selection of applications and data that go on cloud is a matter of how much work is associated with shifting each application and the maintenance to ensure that everything is working. For BNZ, John estimated that 80% of the workload will run on cloud eventually and all new applications are cloud-native.
When asked how he manages compliance and regulators, John explained that CAST helped to accelerate shifting workloads to the cloud by demonstrating the measures that are in place.
Another delegate wanted to know how critical it is to have a mature strategy or process like CAST before migrating critical applications to cloud. John says it depends on the organisation’s risk appetite and how much regulators care about how organisations run workloads. He feels that every organisation needs some form of risk control framework but that it does not necessarily need to be as comprehensive as CAST.
It is also not about selecting one cloud for everything, John opines. When choosing where to deploy individual applications, organisations need to understand the capacity of their teams as well as the suitability of the cloud’s features.
John adds that it is important to examine the dependencies and first shift those applications that do not have them. By shifting things to the cloud, infrastructure gets taken care of, affording people more time to deliver value and focus on things that matter to the business.
On the most important outcome they are seeking in their digital transformation, delegates were equally split between the reliability of newly deployed changes (26%) and an innovative platform and culture to support new ideas (26%). Similarly, better security and governance models got 16%, as did operational efficiency (16%). The remaining delegates voted for reducing the cost of operations (11%) and the speed of developing and deploying changes (5%).
Polled about their top consideration in adopting / choosing multi-cloud, most delegates selected inter-communication and workload portability among the clouds (28%). This was followed by an even split between tools and services available on the new cloud (17%) and data sovereignty and residency (17%). The remaining delegates were equally divided in a three-way split between support within multi-clouds (11%), cost optimisation (11%) and complexity of migrating existing apps (11%). The remaining delegates (5%) chose the availability of skill set to navigate the new cloud as the top consideration.
The final question asked what the biggest benefit Edge Computing brings to their organisation as part of their digital transformation strategy. About a third (35%) indicated that fast-to-adopt IoT solutions was the biggest benefit, followed by a quarter (25%) who opted for fast, affordable networks at the edge. The remaining votes went to hardware-based security leadership (15%) and AI and computer vision expertise (5%).
Asked to share more on how BNZ baked security and compliance enforcement into the cloud deployments, John explained that BNZ has CSAMs for every cloud service, which defines how the service must be configured for each use case. On top of that, they use an attestation process with CAST to make sure that they have checks to ensure that implementation teams are following the architecture and policies correctly.
BNZ is working towards embedding as many of the CAST checks as possible into pipelines, although this is at a very early stage. He added that they are also building up patterns to enable Zero Trust Architectures, which will help bake the infrastructure aspects of security into their solutions.
Apart from that, John revealed that they run a secure code warrior programme, which teaches code security practices to all of their developers. He emphasised that it is important to remember that security is everyone’s responsibility, not just the security team.
In closing, Guan Hao, Industry Technical Specialist, Intel, thanked all the delegates for their participation and insights on the topic.
He reiterated that applications and data are growing and organisations will need an infrastructure that can handle the load. Given the changing reality the world is in, he stressed the importance of employing the right technologies to smooth the process of digital transformation. To that end, cloud is the cornerstone of digital transformation.
On top of that, organisations will need flexible infrastructure to handle the demands of storage, network and multiple cloud platforms.
Finally, An recapped the many use cases for hybrid cloud, which delegates need to understand to identify their own unique journey. He urged the delegates to consider inter-cloud connections and make sure that their architecture is cloud-native.
He invited delegates to reach out to him and the team if they had queries or wanted to understand the unique value that hybrid cloud can bring to their organisations.
To improve access to mental health support in Singapore, a digital mental health platform and a pharmaceutical firm have signed an exclusive partnership to provide greater access to biopsychosocial care for people in the country. The collaboration aims to counter the stigma surrounding mental health and bridge the gap in treatment.
This collaboration, which will connect psychosocial professionals such as Counsellors and Psychologists, to healthcare providers such as Psychiatrists and Primary Care doctors, is the first of its kind to join key facets of the mental health care ecosystem on one platform for users and aims to set the standard for holistic mental healthcare.
Under this partnership, the platform will provide free access to counselling or psychological support for three months via its mobile app. The app also includes composite self-service content, tracking and one-on-one behavioural coaching and therapy. The pharmaceutical firm will help connect psychosocial and pharmacological care in Singapore, which also adds value to providers’ practice.
Countering Stigma on Mental Health
Major Depressive Disorder (MDD) is reportedly one of the leading causes of out-of-pocket healthcare expenditures in the Asia-Pacific region, and up to 90% of people living with MDD do not seek help. In Singapore, the treatment gap for the condition stands at over 73%, according to a study by the Institute of Mental Health, due to stigma as well as accessibility issues attributed to fragmented care models between biological and psychological care.
The treatment gap for MDD can be significantly narrowed with proper mental healthcare infrastructure in place and timely care delivery. Countering stigmas associated with seeking help, increasing psychosocial education, and providing seamless access to psychological as well as pharmacological care are paramount in bridging the treatment gap.
Providing Mental Health Support via Mobile App
One of the main goals of the partnership is to pioneer more seamless access to biopsychosocial care for the community through the partners’ combined expertise in pharmacological and psychosocial care respectively. The initiative will not only deepen cross-sectorial synergies within the mental healthcare provider ecosystem but also provide access to psychosocial support via the mobile app and chat.
The pandemic has revealed the urgency and necessity for resources and opportunities for mental health support, given how the stressors of life can lead to and even exacerbate underlying mental health conditions. This is compounded by social distancing, which is important to decelerate the spread of COVID-19 yet disrupts social rhythm and deprives people of their regular coping mechanisms.
The partnership started based on a shared purpose to challenge the stigmas associated with mental health and to develop a digital mental health space that will help support the mental healthcare ecosystem in Singapore.
The platform’s focus lies in providing psychosocial support to users through the proprietary architecture of its mobile apps. The partnership will see the pharmaceutical firm complement this approach, elevating the biopsychosocial ecosystem for mental health support in Singapore by connecting psychosocial and pharmacological care, adding value to providers’ practice and thus addressing current treatment gaps. Ultimately, the partnership will enable greater accessibility to precise and timely mental health care.
As reported by OpenGov Asia, Singapore has implemented a National Mental Health Strategy to address these needs. A national effort to promote mental health and well-being beyond the COVID-19 pandemic has now been set up through a new interagency task force. This will expand the existing COVID-19 Task Force on Mental Wellness (CoMWT), which was first organised by the Ministry of Health (MOH) last October to address the worldwide pandemic’s mental health concerns.
A key part of the strategy is the development of an online portal through the Health Promotion Board of the Singapore Ministry of Health, which serves as an inventory of mental health resources. The site contains “expert-curated” content on mental health and well-being and is a resource for “individuals who need information for themselves or their loved ones.” A web and mobile app platform, on which the Ministry will host a range of health content, benefits and e-services, will also be introduced.
The public sector has a high potential for data and Artificial Intelligence (AI) to have a huge transformative impact. After all, governments have access to tremendous amounts of data and government operations affect everyone in small and large ways every day.
While it is no secret that only rich data catalyses Artificial Intelligence, its adoption among government entities appears to be uneven and generally lags behind the private sector. Many agencies struggle to bridge the gap that exists between their existing IT infrastructures, practices, and the value that new digital technologies make possible. However, for some governments, there are entire departments, or pockets within departments, where adoption is robust, advanced and successful.
Everyone agrees that the massive amounts of digital data generated by citizens’ activity represent an incredibly valuable resource. Unfortunately, the ever-expanding data resource is often underutilised today. Public sector agencies struggle to unlock the value of their data due to outdated legacy systems and limited analytics capabilities, being data-rich but insight-poor. They often grapple with the associated, yet unnecessary, challenges of big data – high costs, poor data quality, and inconsistent data sources and formats – without experiencing any of the enticing benefits.
There are many lessons to draw from the events of COVID-19 but perhaps one of the most critical is the importance of being able to use data to prepare for potential scenarios and inform our decision making. Public sector agencies require a multifaceted approach, including the ability to quickly integrate new data, make accurate, multilevel forecasts, and provide data-driven insights for policymakers.
Against this backdrop, having a robust data and AI strategy in place will help the public sector better harness the power of data.
The prevailing question is: What is the successful path to the adoption and deployment of AI?
There is a mixed picture of AI adoption in government, and it is likely owing to an environment that is often risk-averse, subject to myriad legislative hurdles and vast in its reach. That being the case, the use of AI has expanded beyond discrete use cases and experiments into wider adoption. There are obvious signs which point to the potential explosion of AI adoption even though gaps in capabilities and strategy are apparent.
Pursuing their missions every day, government agencies spend much of their time focused on operational issues. That time-consuming focus is required in government departments and offices that are held accountable for achieving clearly defined missions. If they fall short, the consequences can be devastating – for the citizens they serve, as well as for the government organisation itself.
In that context, it’s easy to see how AI remains a second-tier priority for some government leaders who have operational roles. This presents government leaders with a paradox. Many have no time to fully embrace AI due to everyday demands, but those AI advances could be instrumental in unlocking real, measurable operational improvements that have the effect of reducing strains on resources and giving them more time to fulfil their mission.
In light of this, how can people understand which AI capabilities are most likely to be adopted in government? What are the biggest untapped opportunities for AI adoption in government? What obstacles and challenges unique to the government are most important to understand today to ensure progress tomorrow?
The OpenGovLive! Virtual Breakfast Insight held on 3 December 2021 aimed at imparting knowledge on how government agencies can accelerate, innovate and transform their advanced analytics capabilities, make data an integral part of their decision making and adopt AI to better serve citizens.
Harnessing the game-changing potential of data and AI in government for optimal outcomes
Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, kicked off the session with his opening address.
The world has fundamentally changed and the challenges of these times will require sophisticated solutions to meet the new demands of the world. Without a doubt, technology is a priority and the enabler, Mohit asserts.
Decisions are made every day, but they should not be done blindly. To make informed decisions, people need actionable insights. Today, citizens expect government services to be personalised, intuitive, engaging and anticipatory. To deliver the best citizen experience and stay relevant, having data and universal access to it is the key to transforming organisations.
“Data is like a diamond,” Mohit posits. “Data that is not refined and polished will not produce insights – tools have to be used to achieve that.”
Data fuels AI, Mohit believes. Effectively building and deploying AI and machine learning systems require large data sets. Developing a machine learning algorithm depends on large volumes of data, from which the learning process draws many entities, relationships, and clusters.
As Singapore accelerates its Smart Nation efforts, data will only become a more precious commodity. The nation has unveiled two new programmes to drive the adoption of AI in the government and financial services sectors. It also plans to invest another SG$180 million ($133.31 million) in the national research and innovation strategy to tap the technology in key areas, such as healthcare and education.
The fund is on top of SG$500 million ($370.3 million) the government already has set aside in its Research, Innovation and Enterprise (RIE) 2025 Plan for AI-related activities, said the Smart Nation and Digital Government Office (SNDGO) in a statement in November 2021.
These investments have been earmarked to support various research in areas that address challenges of AI adoption, such as privacy-preserving AI, and areas of societal and economic importance including healthcare, finance, and education. The funds also will facilitate research collaborations with the industry to drive the adoption of AI.
For Mohit, AI will transform every industry and create huge economic value. Technology, like supervised learning, is automation on steroids. It is very good at automating tasks and will have an impact on every sector – from healthcare to manufacturing, logistics and retail.
Beyond a doubt, AI is becoming more commonplace, says Mohit, citing examples such as the robot dog Spot, the outdoor security robot O-R3 and the multi-purpose all-terrain autonomous robot, or Matar. AI is here to stay. While Singapore has been doing well in AI adoption, the country is still in its infancy – the government is only beginning to harness the technology of AI.
Mohit urged agencies to recognise the beneficial use cases of AI. He reminded the delegates that the complexity of the challenges besetting the world today requires sophisticated solutions. As such, it would be wise for delegates to partner with experts to better place themselves to respond with agility and efficiency in a rapidly evolving world.
Capitalising on the opportunities for AI adoption in government
Dr Steve Bennett, Director, Global Government Practice, SAS, spoke about the different challenges and success in AI for government applications.
Steve shares that the practice of using data to make better decisions was pioneered in government in WWII, giving rise to operations research, defined as “A scientific method of providing executive departments with a quantitative basis for decisions regarding the operations under their control.”
Today, using data to make better decisions may be identified as Artificial Intelligence, which supports better decisions by training systems to emulate specific human tasks through learning and automation.
Steve observes that AI is an increasing priority for the government – 75% of government managers want to deploy AI to help them “keep up.” At the same time, global government leadership sees an increasing opportunity; 80% of government data is estimated to be in formats not easily leveraged before AI.
He points out several opportunities where AI can make a real difference in how jobs are done in the public sector. In health, AI has been used to promote public health in India, improve cancer outcomes through better decision-making in Amsterdam, keep the U.S. food supply safe and make COVID-19 outbreak predictions that result in targeted policy-making decisions.
It is also extensively used in public safety and security, such as the F-35 predictive maintenance, keeping women safe from gender violence in Spain and reducing judicial case delays. In citizen services, AI has been used to reduce youth recidivism in Oregon and reduce unemployment in Denmark.
Attractive as AI is, there are challenges that public sector employees need to be aware of, Steve observes. He explains that these fall into two categories.
The first is technical and organisational. AI requires copious amounts of data that are well-organised, clean and in good shape. The data readiness of government agencies needs to be in place before AI models can be trained.
Apart from that, there is also a skills gap in government. Public sector employees need to understand how the models work so that they know when to trust and when to challenge a model. Then there are cultural realities, such as leaders who are not ready to accept the insights that come from AI models.
The second category is legal, ethical and societal: there are geopolitical concerns, issues of ethics and values, as well as legal implications related to AI adoption.
In summary, Steve reiterates that the complex problems of today herald a time of change. To stay relevant and efficient to citizens, government agencies need to understand the benefits and considerations of using technology and harness it accordingly.
Deploying AI in government services
Frederic R Clarke, Principal Data Scientist and Director, Machine Intelligence & Novel Data Sources (MINDS), Australian Bureau of Statistics, spoke next on the use case of his agency’s effort in unlocking data to support Australia’s effort in managing the COVID-19 pandemic. It involves using integrated and multisource data and machine intelligence to derive new insights on the economic and social impact of the pandemic.
According to Frederic, federal, state and territory governments seek to understand both the transient and enduring impact of the pandemic so that they can better target policies that assist Australia’s recovery in the aftermath. The pandemic is not a singular disruptive event, he says; it is a series of connected crises of varying duration that plays out on a local, national and global scale. It has amplified many existing problems while creating new ones.
For Frederic, a complex problem like a pandemic cannot be understood from a single perspective or a single source of data. The fundamental challenge is that the effects of the pandemic are deeply interconnected, dynamic and multifactorial. To connect the dots across a broad canvas of interrelated economic and social factors, policy analysts need a dynamic multisource-evidence base and new analytical techniques.
Economies and societies form complex systems, Frederic asserts. As a result, public policy is fraught with problems that are notoriously difficult to isolate and objectively specify – complex systems do not yield to the familiar linear analytical techniques based on reduction principles.
Frederic opines that the interrelated web of problems is like a spider’s web – one intervention tugs on the interrelated web of issues and can have a ripple effect that can create unintended consequences in many other areas. He suggests that these considerations are not specific to the pandemic but a general set of policy concerns that cut across traditional portfolios and jurisdictions.
Frederic shares that three paradigms underpin the analytical approach in his organisation:
- Data analysis is citizen-centric: The focus is on a system-wide framing in an analytical context
- Analysis is iterative: Defining the problem is part of the problem. There is a need to align with the objectives and changing needs of the policymakers at every stage and set directions for their analysis based on previous results
- Analyses draw on integrated data: Since no single source of data can provide all the observations needed in a complex policy space, being able to combine data sources is critical.
Frederic uses the example of the Australian government studying the impact of the pandemic on jobs and employment. To do so, they modelled the labour market as a system of connected entities – businesses, persons, households, jobs, locations, etc. – that interact through different types of relationships. Then, the concepts, entities, relationships and associated metadata are represented and stored in a knowledge graph. They use automated reasoning and machine learning approaches to integrate data and find new insights.
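The knowledge-graph approach Frederic describes can be illustrated with a minimal Python sketch. The entity and relationship names below are hypothetical, chosen only to mirror the labour-market example – they are not the ABS schema, and a production system would use a dedicated graph store rather than plain dictionaries.

```python
# Minimal illustration of a labour-market knowledge graph using plain
# Python dicts. Entity and relationship names are illustrative only.

# Nodes keyed by id, with a type and attributes.
nodes = {
    "person:1": {"type": "Person", "name": "Alex"},
    "business:1": {"type": "Business", "industry": "Retail"},
    "job:1": {"type": "Job", "title": "Store Manager"},
    "location:1": {"type": "Location", "name": "Sydney"},
}

# Directed, typed edges: (source, relationship, target).
edges = [
    ("person:1", "HOLDS", "job:1"),
    ("business:1", "OFFERS", "job:1"),
    ("business:1", "LOCATED_IN", "location:1"),
]

def related(node_id, relationship):
    """Return targets connected to node_id by the given relationship."""
    return [t for s, r, t in edges if s == node_id and r == relationship]

# A multi-hop question: where does Alex's employer operate?
job = related("person:1", "HOLDS")[0]
employer = [s for s, r, t in edges if r == "OFFERS" and t == job][0]
print(related(employer, "LOCATED_IN"))  # ['location:1']
```

Because relationships are stored explicitly alongside the entities, a multi-hop question like the one above is answered by walking edges rather than by joining tables.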
Believing firmly in the use of AI, Frederic encouraged its use in government services, where it can help drastically improve decision-making through high-quality insights.
After the informative presentations, delegates participated in interactive discussions facilitated by polling questions. This activity is designed to provide live-audience interaction, promote engagement, hear real-life experiences, and facilitate discussions that impart professional learning and development for participants.
The first poll asked what percentage of overall IT investment delegates foresee being committed to data and AI deployment over the next two years. Just over half (54%) of the delegates felt 10%–30% of their IT investment would go into data and AI deployment. About 42% predicted that between 30% and 50% would be allocated, while 4% said more than 50% would be deployed.
When asked about their biggest challenge in data analytics, most delegates (61%) cited the lack of skilled staff who understand big data analysis. The rest pointed to the inability to derive meaningful insights through data analytics (17%), the lack of quality data and proper data storage (17%) or the inability to synchronise disparate data sources (5%).
Delegates shared the sentiment that data seemed to be understood only by a few. Getting everyone to produce insights is a “management challenge,” one delegate opines. There is a gap between the data scientist and departments, as well as the lack of knowledge to ask the right questions. There were also other challenges such as the lack of domain knowledge among the data scientists and having to manage a huge amount of data and legacy systems.
In response to these challenges, Frederic shared that his organisation’s strategy is to build data science teams that include domain experts; they do not expect a single data scientist to have the full array of technical skills and domain knowledge. On the volume of data, he suggests looking at computing platforms as part of the capability: analysing data is not merely a matter of mathematical and statistical expertise – people need the tools to process large volumes of data.
In ranking the biggest challenge they face when implementing their AI strategy, almost half (46%) went with lack of properly skilled teams. Other delegates found the inflexible business processes and teams (21%), the lack of availability of data (21%), ineffective project management/governance (8%) and ineffective third-party partners (4%) as their biggest challenges.
Participants expressed a range of responses such as the culture of pushback when it comes to AI adoption, having the right skill set to achieve certain objectives, data classification frameworks, compliance requirements and high cost.
As far as cost goes, Steve offered his experience of extending algorithmic techniques to take small amounts of data and artificially build and sample training data sets out of small data sets.
Frederic echoed Steve’s point and asserted that sampling is a powerful strategy. However, the issue lies in being able to sample without introducing biases, such that the model can return results that reflect the presence or absence of characteristics. He also posited the idea of agile development for producing analytical and statistical results. He opines that the challenges of implementing AI are never singular – it involves the capabilities of multidimensional teams and issues of cloud deployment.
Frederic expanded on the considerations surrounding sampling. “It depends on your purpose,” he says. For instance, in the case of generating classification sets through coding and mapping responses to the code, there is no need to include all the data in basic questions. In those cases, the model integrity and model accuracy depends on choosing the right set of training cases. However, if the purpose is for analytics and exploratory model building, one needs to be very careful in the application of sampling, since one may not know what is to be tested or what could be found.
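One generic way to “artificially build and sample training data sets out of small data sets” is bootstrap resampling – drawing records with replacement. The sketch below illustrates that general idea only, not SAS’s specific technique; and, as Frederic cautions, naive resampling can reinforce whatever biases the small sample already contains.

```python
# Hedged sketch: bootstrap resampling to build larger training sets
# from a small dataset. Generic illustration, not a specific vendor method.
import random

def bootstrap_samples(data, n_sets, set_size, seed=0):
    """Draw n_sets training sets of set_size records, sampled with replacement."""
    rng = random.Random(seed)  # seeded for reproducibility
    return [[rng.choice(data) for _ in range(set_size)] for _ in range(n_sets)]

small_data = [("case", i) for i in range(10)]   # only 10 original records
training_sets = bootstrap_samples(small_data, n_sets=5, set_size=50)

print(len(training_sets), len(training_sets[0]))  # 5 50
```

Each resampled set is five times larger than the original, but every record it contains still comes from the original ten cases – which is exactly why sampling bias must be checked before trusting a model trained this way.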
On the most common use case of AI in their organisation, delegates were almost equally divided between developing smarter products or services (28%), driving intelligent business processes (24%), automating repetitive tasks (24%) and developing a more personalised relationship with stakeholders (24%).
On whether AI adoption remains a second-tier priority in the face of pressing requirements to deliver critical services, more than half of the delegates (56%) indicated that the lack of required skill sets is hindering the desired adoption. Other delegates indicated that AI has not been fully embraced due to everyday demands (28%) or that there is not enough budget to deploy the required AI solutions (11%). The remainder (5%) said AI adoption takes a back seat for some government leaders who have operational roles.
Besides the issue of privacy and security, some delegates felt that there is a lack of value proposition that businesses can come up with. Another delegate opined that organisations should not only look at people who develop AI but the managerial capability in understanding the potential and limitations of AI.
Mohit echoes that point of view, asserting the need to raise skill sets internally while noting that deeper insights require bringing in experts from outside.
On the most important ingredient for successful and wider AI adoption in the public sector, more than half of the delegates (55%) indicated that starting small and building the business case by demonstrating initial wins is the most important. That is followed by the belief in aligning all departments on the single vision and garnering support (20%), establishing clear lines of authority and ownership across the entire organisation (15%) and other considerations (10%).
The final poll asked delegates for their thoughts on the essential tenet for ethical AI to work. Most of the delegates believe in the need for an effective and practical ethical framework/governance model for AI (56%), followed by the belief that AI solutions should allow for auditability and traceability (22%) and training AI models with carefully-assessed and representative data (11%).
In closing, Steve expressed his gratitude to everyone for their participation and highly energetic discussion. Delegates believe that AI can make a difference in tailoring benefits for citizens and generating incredible insights. However, challenges such as the lack of data and ethical considerations remain important hurdles to cross.
He highlighted Frederic’s point about the application of agile approaches to insights delivery and reiterated that some of the best practices for AI adoption are starting small and having transparency and auditability in the data.
Steve emphasised the edge AI can offer organisations in their journey towards delivering better government services. He reiterated that the digital transformation is an ongoing and collaborative journey and encouraged the delegates to connect with him and the team to explore ways in which AI can help agencies improve their operations.
Singapore and the United Kingdom will work more closely to facilitate digital trade between the countries by signing three Memorandums of Understanding (MOUs) in the areas of Digital Trade Facilitation, Digital Identities and Cybersecurity. The partnership will make digital transactions by businesses easier, safer and cheaper.
Singapore has been working with like-minded countries to advance a global digital architecture that is open, inclusive, interoperable and secure. In this regard, I am pleased to sign these MOUs with a digitally-progressive partner like the United Kingdom, to further strengthen the links between Singapore and the UK in digital trade facilitation, digital identities and cybersecurity. Such partnerships enable businesses in both countries to seize opportunities in the growing digital economy as we seek to recover from the pandemic.
– Josephine Teo, Minister for Communications and Information and Minister-in-charge of Smart Nation and Cybersecurity
These MoUs will further support opportunities to grow digital delivery of cross-border services between the UK and Singapore, provide a basis for working closely with like-minded digital partners, and help set a global benchmark on high-standards digital cooperation to bring economic and societal benefits to both countries.
The MoUs will also support the shared goals and key tenets of the UK-Singapore Digital Economy Agreement, which seeks to promote trusted, robust and connected digital markets for people and businesses. The agreement will establish rules to enable trusted cross-border data flows and ensure high standards in data protection.
These partnerships in the areas of Digital Trade Facilitation, Digital Identities and Cyber Security between Singapore and the UK will strengthen the digital connectivity between our countries and will support the shared goals and key tenets of the UK-Singapore Digital Economy Agreement (DEA), where negotiations are ongoing and targeted for conclusion in the near-term.
Under the Digital Trade Facilitation MoU, the countries will share knowledge and implementation of pilot projects in areas such as electronic trade documents and invoicing. This will help drive the development and adoption of digital trade facilitation solutions at a bilateral and international level.
Benefits of the digitalisation of trade include improving accessibility for small and medium-sized enterprises to engage in cross-border trade, among other things. The sharing of best practices will also influence the creation of secure global supply chains and interoperable digital ecosystems.
Under the Digital Identities Cooperation MoU, Singapore and the UK will work more closely to develop mutual recognition of digital identities between the countries. The MOU is an important step in the route to achieve interoperability of digital identity regimes between different jurisdictions, which can allow for more reliable identity verification and faster processing of applications, among other things.
The Cybersecurity MoU acknowledges the shared vision between the UK and Singapore in maintaining the economic and social benefits of open, peaceful and secure cyberspace. The two countries also acknowledge their common interest in addressing the international challenges and promoting bilateral collaborations to strengthen cyber security.
The MOU will build on strong existing cyber cooperation between the UK and Singapore in seeking opportunities for collaboration in areas such as the Internet of Things (IoT) security, promoting cyber resilience and capacity building. As cyber security underpins the digital economy by promoting secure digital trade, the MOU will also build on existing workstreams between the UK and Singapore to build secure and resilient cyberspace for businesses and consumers.
Singapore and the UK have been collaborating in many fields, including the digital economy. As reported by OpenGov Asia, Singapore and the UK will officially launch negotiations concerning the UK-Singapore Digital Economy Agreement, which will include establishing rules to enable trusted cross-border data flows and ensure high standards in data protection. The partnership will benefit consumers and businesses from both countries.
The question nowadays is no longer whether organisations should migrate to the cloud, but how they can leverage the cloud for innovation, efficiency, and growth. According to a recently commissioned survey, more than half (51%) of respondents in Singapore said that their entire applicable infrastructure now resides in the cloud, while half said they plan to move as many of their workloads into the cloud as possible.
In addition, close to two-thirds (64%) of respondents’ compute workloads are now supported by public cloud, colocation and managed hosting services. At the same time, IT infrastructure spread has reached an equilibrium and leaders expect it to hold steady over the next three to five years.
The survey also revealed business growth (36%), efficiency (23%), and improved security (8%) as the top three factors that drive decisions on where businesses should run their cloud infrastructure in Singapore. More than a third (35%) of respondents say IT executives play a key role in driving the direction of the company, as silos between functional areas continue to dissolve.
Driven in large part by the power of the cloud, today’s technology landscape is evolving at a breakneck pace, while IT is penetrating all areas of the organisation. In this environment, IT leaders have the power to help companies and organisations see around corners to solve both their short-term and long-term business challenges and provide critical guidance in the areas of business growth, security, efficiency, and customer experience.
Singapore and largely APAC are adopting both public and private cloud and many organisations are looking to do so even more in the next few years to boost digital transformation securely and sustainably. Organisations will be increasingly faced with complex considerations when employing hybrid and multi-cloud strategy which makes it more crucial for them to engage solutions experts to help them fully realise the value of the cloud.
Over the next 12 months, respondents anticipate their infrastructure spending will include on-site data centres (55%), managed hosting (52%), public cloud (51%), and colocation (34%). However, 60% of respondents also said they envision not owning a data centre in the next five years. Around 54% say legacy apps are the main factor keeping them from abandoning data centres.
As part of the Singapore government’s objective to harness the capabilities of commercial cloud computing platforms to governmental systems, many public sector agencies are migrating their IT systems to the Government Commercial Cloud (GCC).
GCC Service brings the modern innovations and capabilities of commercial cloud computing platforms to less sensitive Government systems. These leading ICT capabilities are augmented by robust cybersecurity measures and systems to protect the data that resides on commercial cloud platforms.
Multinational conglomerates are leading the cloud computing revolution, providing organisations with Commercial Cloud options that are scalable and customisable. Rather than being mired in the cost and hassle of racking, stacking and maintaining computing hardware on-site, developer teams can instead focus on what they do best—build and deliver digital applications that create value for stakeholders within and beyond their organisation.
Government agencies can tap on commercial cloud software to incorporate advanced functionalities into their digital services instead of trying to reinvent the wheel. Application testing and deployment can be automated and done in real-time, speeding up the delivery of high-quality Government digital services to citizens and businesses.
As reported by OpenGov Asia, agencies require a reliable and secure data management platform that allows for quick migration to the GCC, high data quality and managed data access for users. As a result, choosing the correct data strategy and the long-term platform is even more crucial in their migration to the GCC. In light of this, Singapore’s Government Technology Agency (GovTech) is upgrading the GCC service to make it easier for government agencies to manage and safeguard their use of public cloud services.
The growing potency of an AI Platform combined with a Graph Data Platform is successfully enhancing machine learning models and ultimately tackling complex decision making effectively. Undeniably, both technologies are working hand-in-hand to make data relationships simpler by being scalable, performant, efficient and agile.
The most evident advantages of Graph Data Platform were seen during the pandemic when governments needed to track down community infections. From tracing connections via complicated social networks to comprehending interconnections, Graph Data Platform with AI Platform has proven to be an excellent tool for data management in real-time.
A Graph Data Platform helps organisations make data-driven, intelligent decisions. Additionally, it helps prevent fraud and potential information leaks, which have mushroomed disproportionately with the rapid COVID-driven digitalisation.
The added agility that a Graph Data Platform offers makes it clear that the combination should be the preferred decision-making methodology. Further, an AI Platform along with a Graph Data Platform has proven to be cost-effective for governments and financial institutions.
In times of crisis, obtaining information in real-time has become critical for decision-making. With a Graph Data Platform, information can be structurally arranged quickly. These powerful capabilities are the missing link for agencies to drive actionable outcomes from the data.
Organisations will benefit from an enhanced machine learning model to build an intelligent application that traverses today’s large, interconnected datasets in real-time. The copious volumes of data that organisations generate and collect need to be analysed and interpreted if they are to streamline methods in forecasting based on real-time information and serve as an effective decision-making tool.
OpenGovLive! Virtual Breakfast Insight held on 2 December 2021 provided the latest insights on delivering an effective and efficient citizen or customer experience using a Graph Data Platform. This was a closed-door, invitation-only, interactive session with senior representatives of Indonesia’s private and public sectors.
Mining and optimising the “new oil”
Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, kicked off the session with his opening address.
Data is referred to as the new oil, Mohit says, but in and of itself it holds no value. It needs to be mined, refined and optimised to become a performing asset.
The world has fundamentally shifted and the challenges of these times will require sophisticated solutions to generate actionable information that will be vital for decision-making in real-time. Technology and data are the key pillars, Mohit asserts. While both the public and private sectors have vast amounts of data, are they obtaining genuine value from it?
This raises two fundamental questions: What technology is being used to find data today? Is there untapped technology that has not been explored?
Globally, public and private institutions are looking for excellent tools for real-time data management. Obtaining analysed data in real-time helps critical decision-making and can move an organisation or business from “good to great.” He stressed that increased visibility helps organisations make better decisions and empowers people to make informed ones.
To enhance citizen experiences and to deal with the constant change, institutions need to be more intuitive to sense and respond to new technology opportunities to drive digital transformation. Websites need to be easy to use and safe to use across different mediums and devices. For Mohit, developing new competencies will increase trust and engagement, ease of use and ways of responding to a request.
Governments across the world are looking for excellent tools for data management in real-time that can provide insights into data, Mohit observes. The growing potency in an AI Platform combined with a Graph Data Platform has been proven to strengthen machine learning models and address complex decision making effectively, making it an ideal tool.
Graph Data Platform offers a tremendous edge in detecting and interpreting data, Mohit opines. Graph technology can now detect and interpret data to expand finance decision making and understand citizens better. It also offers fast screening, which is particularly effective for discovering money laundering, terrorist financing or corruption to improve governance and compliance.
One of the most obvious use cases for Graph Data Platform is contact tracing. Since COVID-19 proliferates through social interactions, Graph Data Platforms are perfectly suited to helping scientists and policymakers expose and understand connected data – from tracing connections through multifaceted social systems to understanding dependencies between people, places and events.
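Contact tracing maps naturally onto graph traversal. The sketch below, using an invented contact network, shows how a breadth-first search finds everyone within a given number of hops of a confirmed case – the kind of connected-data query a Graph Data Platform executes at scale.

```python
from collections import deque

# Contacts as an undirected graph: person -> set of direct contacts.
# The network is made up for illustration.
contacts = {
    "A": {"B", "C"},
    "B": {"A", "D"},
    "C": {"A"},
    "D": {"B", "E"},
    "E": {"D"},
}

def trace(start, max_hops):
    """Breadth-first search: everyone within max_hops of a confirmed case."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        person, hops = frontier.popleft()
        if hops == max_hops:
            continue  # do not expand beyond the tracing depth
        for contact in contacts.get(person, ()):
            if contact not in seen:
                seen.add(contact)
                frontier.append((contact, hops + 1))
    seen.discard(start)  # the index case is not their own contact
    return seen

print(sorted(trace("A", 2)))  # ['B', 'C', 'D']
```

Widening `max_hops` extends the trace to further degrees of separation, which is how interconnections across a complicated social network are uncovered step by step.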
Before closing, Mohit stressed that organisations need to get smarter about leveraging resources and tools around them to achieve their business goals. He reminded agencies of the complexity of the challenges besetting the world today and the need to elevate the technology they are using. Against this backdrop, it would be wise for delegates to partner with experts to better place themselves to respond with agility and efficiency in a rapidly evolving world.
Accelerating growth through harnessing insights
Joko Parmiyanto, Chief of IT Transformation Division, Statistics Indonesia, spoke next about the strategies to pivot towards being an insights-driven organisation.
The challenges in this day and age are endless: issues of unintegrated data collection, the accuracy and coherence of data, the lack of policy and quality assurance, little attention to data users, a lack of relevance and timeliness, and issues of data access.
He further explained what moving towards Indonesia One Data entails:
– Data Standard: Standards governing methodologies covering concepts, definitions, scope, classifications, measures, and units
– Metadata standard: Structured information that serves to describe the content and sources of data so that they can be easily found, used, or managed again
– Reference Code: The data generated must use the reference codes and master data available on the One Data Portal
– Interoperability: The ability of data to be exchanged or shared between interacting systems
Emphasising the importance of metadata-driven applications, Joko opines that reliable metadata gives the government the ability to know what data ministries/agencies have collected, what the data represents, how data moves through systems and who has access to it. For him, a metadata-driven approach is the key to success in realising data integration and orchestration among ministries/agencies.
Moving towards a single source of truth can help to streamline the flow of information and ensure information accuracy. Empowered by technology to manage, streamline and harness data, his organisation has launched Indonesia Data Hub (INDAH), which is a one-stop collaboration platform that aims to improve data literacy and value of statistics as well as support data interoperability and data exploration.
In summary, Joko reiterated the value of properly utilising and organising data. The insights generated through the proper use of technology can be the differentiating factor that propels the growth of the organisation.
Unlock the power of context and relationships with Graph technology
Benny Kusuma, Country Head – Indonesia, Neo4j, elaborated how Graph Data Platforms can elevate the operations and tackle issues that organisations and institutions are facing.
“Data is the new oil,” Benny agrees, building on Mohit’s opening analogy. “Data is the new plutonium.” In 2017, The Economist declared data to be the world’s most valuable resource while Forrester calls it “the new currency of business.”
Benny explains that a traditional database stores data in rows, columns and tables. These are great for quick storage, retrieval and aggregation of data. However, the architecture is not built for understanding relationships. Storing data as a graph, on the other hand – as a network or web of interconnected things – has specific advantages. “It can be a game-changer” when applied to the right use case, unlocking new insights for otherwise impossible decision-making.
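The difference Benny describes can be sketched in a few lines of Python. The data is invented for illustration: the relational style re-derives each hop with a join-like filter over rows, while the graph style walks adjacency sets directly.

```python
# Relational style: facts live in rows; answering "who are a person's
# friends-of-friends?" needs one join-like pass per hop.
friend_rows = [("ann", "bob"), ("bob", "cat"), ("bob", "dan")]

def friends_of_friends_relational(person):
    direct = {b for a, b in friend_rows if a == person}           # hop 1
    return {c for b, c in friend_rows if b in direct} - {person}  # hop 2

# Graph style: relationships are first-class, so the same question is a
# two-step walk over adjacency sets.
graph = {"ann": {"bob"}, "bob": {"cat", "dan"}, "cat": set(), "dan": set()}

def friends_of_friends_graph(person):
    return {fof for f in graph[person] for fof in graph[f]} - {person}

print(sorted(friends_of_friends_relational("ann")))  # ['cat', 'dan']
print(sorted(friends_of_friends_graph("ann")))       # ['cat', 'dan']
```

Both give the same answer here, but each extra hop costs the relational version another full scan of the rows, whereas the graph version simply follows the next set of edges – which is why deeply connected queries favour the graph model.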
When connected data is made accessible, it accelerates digital transformation and empowers decision-making like never before. Data shapes every facet of the organisation; it inspires ideas, solves problems and allows organisations to monetise their vast reserves of data.
Yet, Benny observes from a study that half of all data is still untapped and the pool of unconnected data is growing. One report forecasts that 175ZB of data will be generated by 2025; however, 55% of an organisation’s data will be “dark” – unquantified and untapped – according to another recent global study. There is tremendous business potential in curating data relationships from this untapped, unconnected data, Benny opines.
Today, business leaders recognise that data is key to success, yet very few can say that their organisations successfully tap the value of all of their data and data relationships. To become truly data-driven and data-proven requires a system and a method that not only makes data more intelligent with an organisation’s growing business and data strategy, but also helps agencies find and tap into connections within data.
This is where Graph Data Platforms come into the picture: establishing relationships and connecting data. With other modes of organisation, basic organising principles are added to data to create a knowledge base. However, the context is shallow and quickly ages because the underlying infrastructure is not built for relationships. If the system can combine data, semantics and a graph structure, organisations will end up with a knowledge graph that has dynamic and very deep context because it was built around connected data.
Neo4j is the creator of the Property Graph and the Cypher language at the core of the GQL ISO project. With thousands of customers worldwide, Neo4j is headquartered in Silicon Valley and has outposts in Singapore, Indonesia, China, Australia, India and Japan.
Articulating the value of Neo4j, Benny asserts that Neo4j’s Graph Data Platform technology gives organisations an edge in producing deep context by turning collected data into connected data. He points out that analysts have taken notice, ranking Graph Data Platforms as one of the top 10 trends in data and analytics over the last three years.
Graph Data Platforms are extremely versatile and can elevate the capability of companies and agencies – their use cases range from oversight, resource management, science and education, and planning to security. With Graph Data Platforms, people can solve the previously unsolvable. Top financial institutions, retailers and telecoms, and global governments overseeing civilian affairs, defence and intelligence use Neo4j to analyse, optimise and protect. Neo4j has enabled customers to manage financial fraud, patient outcomes, the mission to Mars, global fare pricing and vaccine distribution.
In closing, Benny reminded delegates that Neo4j created the graph category and that it is a tool that can catapult organisations in their growth through faster and better-quality insights.
After the informative presentations, delegates participated in interactive discussions facilitated by polling questions. This activity is designed to provide live-audience interaction, promote engagement, hear real-life experiences, and facilitate discussions that impart professional learning and development for participants.
The first poll asked about the biggest challenge that delegates face when analysing information to handle a critical decision-making situation during a crisis. Most delegates (37%) indicated that exploring data relationships is the biggest challenge, followed by the difficulty in drawing conclusions (29%). The rest of the delegates expressed that their challenge lies in the interpretation of data (17%), the effectiveness of the data (13%) and the ineffectiveness of the data (4%).
Mohit feels it is about how agencies look at data as a whole, identify relationships and contextualise the data. He also added that data has to be anonymised and shared, otherwise it is not “oil”.
On being asked what they experience as the greatest hurdle to becoming more data-driven, almost half (44%) of the delegates said that the skillset of the required workforce was the greatest hurdle. The rest felt their greatest hurdle was the annual IT budget or finance (28%), IT business or related projects alignment (24%) and challenges of IT infrastructure (4%).
On the pain points in their data-driven decision-making journey, an overwhelming majority (68%) found using data to drive business in a better, more effective way to be the major hurdle, while the rest (32%) saw the need to capture more data as the key issue.
Mohit believes that the issue is with generating insights. Capturing data is expensive but without proper organisation and sense-making of the data, the expenditure will not translate into usable insights. The key is to upskill so that agencies can harness the insights from data.
For use cases that best depict how Graph Data Platforms can be valuable to their organisation’s work, most (32%) found AI and Machine Learning the most compelling use case, followed by real-time analysis (24%). The rest of the delegates were split between identity graph (16%), customer 360 (12%), supply chain (12%) and fraud/money laundering (4%).
When asked about the current usage of Graph Data Platforms in their department or organisation, nearly half (46%) admit that they use it to a limited extent and are in the initial phase of exploring how it can be of value.
Other delegates use Graph Data Platforms at the enterprise level and are curious to find out more about scalability and distribution (advanced users/clients) (27%). The rest either use it on a small scale and have some understanding of how it works (18%) or use it in several projects but not at the production level – that is, not at large scale (9%).
When asked about their familiarity with the advantages of a Graph Data Platform and how it could enhance their daily decision-making, about half (46%) of the delegates said they were familiar with but had not implemented the technology. The rest were either not familiar with and had not implemented the technology (33%) or had already implemented and are currently using the technology (21%).
In closing, Benny expressed his gratitude to everyone for their participation and highly energetic discussion.
He is firmly convinced of the edge that Graph Data Platforms offer organisations in their journey towards digital transformation. Complex problems require innovative solutions and harnessing Graph Data Technology can boost capabilities by generating real-time information and deeper analysis.
Before ending the session, Benny highlighted the importance of a Graph Data Platform in vaulting organisations to greater heights. Reiterating that digital transformation is an ongoing and collaborative journey, Benny encouraged the delegates to connect with him and the team to explore ways forward.
Researchers from Nanyang Technological University, Singapore (NTU Singapore), have developed a technology, called Dynamis, that makes industrial robots nimbler and almost as sensitive as human hands, able to manipulate tiny glass lenses, electronics components, or engine gears that are just millimetres in size without damaging them. The breakthrough was first published in the top scientific journal Science and went viral on the internet when it could match the dexterity of human hands in assembling furniture.
“We have since upgraded the software technology, which will be made available for a large number of industrial robots worldwide. Mastering ‘touch sensitivity’ and dexterity like human hands has always been the holy grail for roboticists, as the programming of the force controller is extremely complicated, requiring long hours to perfect the grip just for a specific task.”
– Professor Pham Quang Cuong, NTU Associate Professor
Clients purchasing the latest robots will have the option to include this new technology as part of the force controller, which reads the force detected by a sensor on the robot’s wrist and responds accordingly: too little force and the items may not be assembled correctly, while too much force could damage them.
Today, Dynamis has made it easy for anyone to programme touch-sensitive tasks that are usually done by humans, such as assembly, fine manipulation, polishing or sanding. These tasks all share a common characteristic: the ability to maintain consistent contact with a surface. If human hands are deprived of their touch sensitivity, such as when wearing thick gloves, it becomes very hard to put tiny Lego blocks together, much less assemble the tiny components of a car engine or of a camera used in a mobile phone.
Dynamis is a force-feedback technology, an area that is becoming more and more important in the practical use of robotics. The system is advanced, yet easy to use and light enough to be integrated into standard robot controllers.
Known as “Force Sensor Robust Compliance Control”, the new software, powered by Dynamis, a complex Artificial Intelligence (AI) algorithm, requires only a single parameter to be set: the stiffness of the contact, whether it is soft, medium or hard. Despite its simple set-up, it has been shown to outperform conventional robotic controllers, which require an enormous amount of expertise and time to fine-tune.
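Dynamis itself is proprietary, but the general idea of a force-feedback loop tuned by a single stiffness parameter can be sketched in a few lines. The following is an illustrative admittance-style controller, not NTU’s actual algorithm; all names, values and the contact model are hypothetical:

```python
def compliance_step(position, measured_force, target_force, stiffness, dt=0.01):
    """One control step: nudge the tool so that the measured contact
    force converges on the target force.

    stiffness maps a force error (N) into a position correction (m):
    a 'softer' setting commands a larger corrective motion per newton.
    """
    force_error = target_force - measured_force
    return position + (force_error / stiffness) * dt

# Simulate pressing against a springy surface: the contact force
# grows linearly with penetration depth (a crude contact model).
k_surface = 2000.0   # N/m, a property of the environment
stiffness = 500.0    # the single tuning parameter ("medium")
pos, target = 0.0, 5.0  # metres past contact, target force in newtons

for _ in range(2000):
    force = k_surface * max(pos, 0.0)
    pos = compliance_step(pos, force, target, stiffness)

print(round(k_surface * pos, 2))  # steady-state force settles near 5.0 N
```

The point of the sketch is the interface, not the physics: the operator only chooses `stiffness`, while the loop continuously trades position against measured force – the kind of behaviour that conventionally takes long manual tuning to achieve.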
This backbone technology was further improved and first deployed in custom-built robots that can handle fragile optical lenses and mirrors with human-like dexterity, now used by multiple companies worldwide. Current robots on the market offer either high accuracy but low agility (robots performing the same movements repeatedly, as in a car factory) or low accuracy but high agility (such as robots handling packages of different sizes in logistics).
By deploying this technology, robotics engineers can now imbue robots with both High Accuracy and High Agility (HAHA) on a large scale, paving the way for industrial applications that were previously very difficult or impossible to implement, such as handling and assembly of delicate, fragile objects such as optical lenses, electronics components, or engine gears.
As reported by OpenGov Asia, Singapore’s IT manufacturer and NTU collaborated to enhance local DS&AI education, empowering students with the tech tools and skills needed to inspire a brighter future. The Lab will put together the IT firm’s cutting-edge deep-learning technology with NTU’s global strengths in artificial intelligence and data science, allowing local data scientists and AI experts to pioneer the development of meaningful AI solutions in important industries.
According to NTU, the Lab was still in the planning stages in 2018, and roughly 150 NTU students enrolled in the Bachelor of Science in Data Science and Artificial Intelligence programme have benefited from the Lab’s resources since then.
“Singapore has moved from preventing cyber threats to assuming breaches have occurred”, said Josephine Teo, Minister for Communications and Information, Singapore. When Minister Josephine Teo made this statement in Estonia during the Tallinn Digital Summit, she underscored the need to have a strong cybersecurity posture. Singaporeans have not forgotten the cyberattack in 2018, in which the healthcare records of a quarter of the city-state’s population were breached in an attack on the country’s healthcare system.
It was after the 2018 data breach that Singapore’s position on cybersecurity changed from one of trying to prevent attacks to one that assumes that an attack has already occurred. “It’s just a question of ‘when’, it’s not a question of ‘if’,” explained Minister Teo.
Without a doubt, the pandemic has drastically and unexpectedly accelerated the need for a new network security model. Zero trust security is not a new concept, but it has now taken centre stage and security leaders agree that it will improve security and simplify security processes for distributed teams and hybrid networks.
A widespread move to remote work and the corresponding need for better remote workforce security has spurred investment in zero-trust security. The ability to authenticate and monitor all traffic, regardless of its position inside or outside of an organisation’s network, promises to reduce or eliminate many security risks. However, rolling the model out has proven to be complicated, presenting organisations with a mixed bag of successes and obstacles. One key reason is that zero trust adoption is a logistical challenge, not just a technical one. Security modernisation often depends on the progress of user identity consolidation and cloud transformation, both complex and long-term projects.
Moreover, organisations are facing challenges with overall cloud transformation. Organisations have accelerated their cloud adoption plans but are not fully prepared. When large chunks of data have not yet moved to the cloud from isolated data centres, it can become harder to secure using a single security tool. Identity and access management (IAM) complexities also proved equally challenging for zero trust adoption. Teams are struggling to shift to a zero-trust approach due to the complexities of user access needs in their organisation.
Zero trust relies on a single source of truth for identity management, yet larger organisations, in particular, have often accumulated multiple incompatible identity providers over the years. They must also understand access patterns across a huge number of applications — most of which cannot be shut down even for a moment to be migrated to a new identity platform.
The pandemic has further exposed the weaknesses of the traditional ‘castle-and-moat’ security model. Remote work has expanded attack surfaces infinitely outwards – more than ever, agencies need to start from the assumption that their ‘castle’ is already compromised. Zero Trust has emerged as a compelling security framework to address the failures of existing perimeter-based security approaches.
This leads to fundamental questions: what does it take to adopt and deploy zero trust architecture? Are organisations equipped to enhance the efficiency and security of their mission-critical applications and websites?
This was the focus of OpenGovLive! Virtual Breakfast Insight on 1 December 2021, which aimed to impart knowledge on how to deploy the zero-trust model seamlessly and to overcome common obstacles in zero-trust adoption.
Embracing the security imperative in a hybrid world
COVID-19 has fundamentally changed culture, Mohit asserts. With remote working entrenched in the new normal, hybrid work is the new reality. For him, the world cannot and will not go back to what it used to be – to demand employees return to physical work in offices in the name of security will not bode well.
Organisations must learn how to keep individuals secure wherever they are working while keeping the work environment secure. “Culture has shifted and we must evolve with it,” Mohit says firmly.
Singapore is embracing a Zero Trust strategy. According to the Singapore Cybersecurity Strategy 2021, the three strategic pillars are: building resilient infrastructure, enabling safer cyberspace and enhancing international cyber cooperation.
Mohit observes that the prevailing priorities of the public sector are to roll out innovative and secured digital services quickly, encourage inter-agency collaboration, enable a hybrid workforce and increase availability and security.
There is no doubt that rapid digitalisation increases the risk that organisations will face. In May 2021, Asia Pacific experienced a 168% YoY increase in cyber-attacks. There were reported malicious attacks that destroyed data in destructive/wiper-style attacks (average cost of $4.52 million) and ransomware attacks ($4.44 million).
“But just because it is a little bit hard, it does not mean that organisations should go back in time and revert to the old model,” Mohit says. Instead, he stresses, organisations need to embrace the challenges of security head-on instead of eschewing them. There is no turning back when it comes to digitalisation. Organisations can no longer hide behind the word “security” as an excuse not to modernise.
Although the challenges of the future abound, Mohit remains optimistic because of partnerships that can enable organisations to expand their capacities. He urges delegates to partner with organisations with a wealth of expertise and experience that can make the journey of security far easier to manage and navigate.
Hedging against cyber-attacks with Zero Trust
Fernando Serto, Chief Technologist and Evangelist, Asia Pacific, Japan and China, Cloudflare spoke on the ways Cloudflare can support agencies in building a secure Zero Trust architecture.
Cloudflare operates one of the world’s fastest global networks, spanning 250 cities in more than 100 countries and trusted by millions of web properties. With direct connections to nearly every service provider and cloud provider, the Cloudflare network can reach 95% of the world’s population within 50 milliseconds.
As a company, Cloudflare provides Zero Trust Services, Cloudflare Network Services and Cloudflare Application Services. With the Zero Trust Services, Cloudflare helps to secure internal operations on a single global network by providing ZTNA with private routing, remote browser isolation, SWG with CASB and identity/endpoint integration.
Most people know Cloudflare for their application services such as WAF with API protection, rate limiting, load balancing, bot management, L7 DDoS protection, CDN and DNS. However, Fernando explains that Cloudflare also offers an integrated global edge platform and harnesses its unified software stack to run all its services. With the network services, Cloudflare offers WAN-as-a-Service, Firewall-as-a-service, L3 & L4 DDoS protection, network interconnection, and smart routing.
On the topic of Zero Trust Services, Fernando explains that the key concept is that it assumes that the network has been breached or that a breach is inevitable. Zero Trust is centred on requiring continuous verification through real-time information. Organisations need to identify and be able to decouple users from the network.
Yet the challenges are plentiful, Fernando warns delegates. He observes that today’s corporate WAN architecture is broken: perimeter security is a bottleneck that no longer works, applications are in the cloud, and remote users suffer high latency. The architecture is also difficult to scale and expensive. If anything, Fernando opines, “COVID-19 has taught us that the old model does not work.”
The security perimeter is, and will be, constantly susceptible to vulnerabilities. Pulse Connect Secure VPN software has reportedly been exploited by attackers, and many organisations are targeted indiscriminately. He adds that applications inside the WAN are also at risk, citing numerous reports of cyberattacks and system breaches. Vulnerabilities will always exist; how quickly organisations patch them makes the difference. Regardless, patching vulnerabilities takes time.
For Fernando, the switch to Zero Trust network access with private routing can help to mitigate these issues. With Cloudflare’s offering, security and connectivity are optimised, driving agencies’ speed and security in a work-from-anywhere world.
Cloudflare’s Zero Trust platform addresses two problems. Traditionally, multiple point products require multiple policy managers and multiple client deployments; Cloudflare offers one seamless platform with one policy manager and one client deployment. The other issue with traditional approaches is that each product integrates a single identity provider (IdP) separately, repeatedly and inconsistently. To address this, Cloudflare integrates multiple IdPs – and multiple tenants of the same IdP – just once.
Concluding his presentation, Fernando emphasises the simple and effective threat defence that Cloudflare offers. In a fast-changing environment, with shifting work models and culture, Cloudflare seamlessly secures the networks of agencies working with a remote workforce.
Starting a Zero Trust Government
Jeffrey Brown, Chief Information Security Officer, State of Connecticut spoke next on the establishment of a zero-trust government in his work in Connecticut.
The state of Connecticut has over 50 state agencies and three branches – executive, legislative and judicial. The key industries are financial services and insurance; aerospace and defence; bioscience and healthcare; film, TV and digital media; and advanced manufacturing.
In terms of management, the state government has the responsibility of handling a 24/7 Digital government, election infrastructure, 911 network, state critical infrastructure, healthcare, finance, transportation and the trust of 3.5 million citizens.
For Jeffrey, the “chewy centre” perimeter security model, whereby everyone inside the corporate network is trusted, is dead. Zero trust is now the dominant model of cybersecurity. The assumption with Zero Trust is that the network has already been compromised; it is an approach that scrutinises traffic both inside and outside the network. There is a stricter identity verification process whereby every user and device has to prove that they are not a cyber attacker.
Jeffrey believes that trust is a vulnerability that can be mitigated and that no one can achieve perfect trust. For him, “it is a balance.” He outlines three approaches that the state of Connecticut has undertaken:
- Know that all zero trust schools of thought make sense only if they support the business
- Learn about the most common frameworks (NIST 800-207)
- Understand that zero trust is a marathon, not a sprint. Not everyone can achieve zero trust, but everyone can adopt it.
In pursuing zero trust government, there were many lessons that Jeffrey learnt along the way. In essence, it is a process with different components: implementing multifactor authentication (MFA) everywhere, maintaining 24/7/365 security monitoring, addressing identity and access management, leaning on federal partners and, ultimately, planning for the future.
Before closing his segment, he encourages delegates to take a risk-based approach to ensure that the most important pieces are first addressed and look at how Zero Trust can be implemented within their agencies to enable the government to do more.
Understanding the fundamentals of Zero Trust architecture
Gerald Caron: The nuts and bolts of Zero Trust architecture
Gerald Caron, Chief Information Officer & Assistant Inspector General for Information Technology, U.S. Department of Health and Human Services, Office of the Inspector General, shared the various aspects and characteristics of Zero Trust architecture.
While most people focus on the identity aspect of Zero Trust, Gerald believes that it is the data that organisations are trying to protect – that is the goal of Zero Trust. Beginning with Zero Trust core principles, Gerald notes that Zero Trust hinges on five core principles related to trusting no one and right-sizing protections.
Trust no one
- Know your people and your devices: Validate identity at every step
- Design systems assuming they are all compromised: Distrust everything, so when a breach happens you are as protected as you can be
- Use Dynamic Access Controls: Access to services must be authenticated, authorised, encrypted at all times, and can be revoked during a session
- Constantly evaluate risk: Include context in risk decisions; Monitor and log in every location possible; Aggregate log, system, and user data
Right size protections
- Invest in defences based on the classification of data: Spend more money defending the systems at greater risk
Gerald adds that not all data is equally important. Organisations need to identify what is important. Zero Trust recognises these differences and categorises data based on its sensitivity and mission criticality. This categorisation is considered when protecting the data and granting access.
Apart from that, the paths are also not equal. In a Zero Trust environment, the path the data takes between the client and host impacts the level of risk, thus impacting how much a connection can be trusted. Connections with higher risk either restrict access to data/services or require a higher level of authentication.
While traditional authentication checks a user’s credentials once and uses that initial authentication for any subsequent activity before log-out, identity authentication is much stricter in Zero Trust. Multiple factors are considered when validating access, including the user’s role and location, the state of the device attempting access, and the data or services being accessed. Organisations need to look at all these factors to develop a risk-tolerance framework to decide what a user can or cannot do.
At the same time, Zero Trust assesses the state of each device attempting to access the network – for example, the device’s operating system version and patch level – to ensure that the client does not introduce additional risk to the environment.
Zero trust architecture features dynamic access control. While traditional authentication happens once, at the start of the session, and remains in place, Zero Trust authenticates dynamically each time new data is accessed or when something triggers a change in risk.
Gerald shares that a Step-up event in a Zero Trust environment can mitigate some of the potential risks a client may introduce. During the event, the system requires an additional authentication that can help control, although not entirely offset, the risk introduced by a client.
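The dynamic, risk-based access decisions and step-up events Gerald describes can be sketched as a simple policy function. The factors, weights and thresholds below are purely illustrative – not any specific product’s or agency’s policy engine:

```python
# Hypothetical contextual risk factors and their weights. In a real
# Zero Trust deployment these would come from device posture, identity
# and monitoring systems rather than a hard-coded table.
RISK_WEIGHTS = {
    "unmanaged_device": 30,
    "unpatched_os": 20,
    "unusual_location": 25,
    "sensitive_data": 25,
}

def evaluate_access(context, step_up_threshold=40, deny_threshold=70):
    """Return 'allow', 'step_up' (require extra authentication) or
    'deny', based on the combined risk of the request context."""
    score = sum(w for factor, w in RISK_WEIGHTS.items() if context.get(factor))
    if score >= deny_threshold:
        return "deny"
    if score >= step_up_threshold:
        return "step_up"
    return "allow"

print(evaluate_access({"unmanaged_device": True}))
print(evaluate_access({"unmanaged_device": True, "sensitive_data": True}))
```

Because the function is re-evaluated whenever the context changes – a new device posture, a new location, more sensitive data being requested – the same session can move from “allow” to a step-up challenge mid-stream, which is exactly the dynamic behaviour that distinguishes Zero Trust from one-time authentication.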
As for monitoring, continuous, detailed monitoring and logging are critical elements of Zero Trust as they contribute to a holistic picture of each user’s session and the overall environment. Data collected from monitoring and logging is linked with known threats and data/system sensitivity to drive cyber protections.
Gerald emphasises the importance of understanding the baseline and knowing what “normal” looks like – only then can organisations react to “abnormal.” As for risk evaluation in a Zero Trust environment, the authentication is evaluated dynamically, each time new data or resources are accessed or when something triggers a change in risk level.
Before ending his presentation, Gerald cautioned against getting caught up in the tools and technology. He emphasises the importance of first understanding the organisation’s risk tolerance, methodology and threshold for risk. He also recommended his capability model as a way to understand the organisation’s functional capability and identify where the gaps are.
Gerald hopes that his presentation offered a quick overview of Zero Trust architecture that could help kick start the journey for delegates thinking about adopting a Zero Trust approach.
A delegate asked the speakers about their experience of working within Zero Trust architectures. For him, Zero Trust environments have caused people to be dispossessed of services that they used to enjoy.
In response to that, Gerald points out that “humans are the weakest point” and it is not always a malicious person but someone trying to get their job done. It is vital to understand how users work and what data is needed. He sees Zero Trust as an opportunity to improve operations. By looking into various technologies, the government is essentially optimising processes and enhancing performance. However, he stresses the importance of understanding the user in the process of implementing new systems and not limiting it to the IT domain.
Fernando adds that it is when frameworks are implemented on top of legacy technologies that users get a negative experience. The end goal is to make the user experience as seamless as possible; addressing legacy technology will therefore be important in this process.
Mohit is convinced that it is not only about legacy technology but also legacy governance and processes. Change and transformation have to be holistic, encompassing all aspects.
The first poll inquired how delegates plan to implement Zero Trust across their extended environment. Most (34%) indicated that they have already started implementing Zero Trust with a primary focus on identifying their critical assets, followed by delegates who are not yet ready to implement Zero Trust due to a lack of the resources and skills needed (19%). The rest of the delegates indicated that they work with multiple security partners to build a practical and pragmatic roadmap for implementing Zero Trust (14%) or have made huge investments in different technologies and are not sure where to start due to operational complexities (14%).
A delegate observed that the consideration for Zero Trust needs to be grounded in yielding particular business value. At the same time, end-user friction needs to be considered – the processes need to be made less difficult for everyone in the company.
Fernando opines that a good user experience should smooth the process of building a Zero Trust architecture – the technology is there to allow people to move faster. Mohit felt that there was similar resistance when cloud first came out, but governments are slowly changing their policies to embrace the cloud.
On their organisation’s current security priority, over half the delegates indicated that enabling Endpoint Mobile Management & Protection (EMM)/BYOD/IAM is their highest priority (55%). The rest were equally divided between employing DDoS protection, Web Application Firewall, Bot Management and Data Loss Prevention (15%), ensuring secure access to applications hosted on cloud service providers (e.g. Microsoft, Amazon, Google) (15%), and ‘others’ (15%).
In response to the results, Fernando observes that the emphasis on the end-point could be because of the hybrid situation that organisations are in.
Exploring the key drivers for their organisation in initiating and augmenting an identity access/Zero Trust management programme, exactly half the delegates indicated security/data protection/breach prevention as the key driver, followed by internal/industry/regulatory compliance (19%) and response to audit or security incidents (13%). The rest of the votes were split evenly between operational efficiency (6%), reducing endpoint, insider and IoT security threats (6%) and others (6%).
On the scenario that best describes their organisation’s journey, nearly 3 out of 5 delegates (59%) are of the view that a ZTNA solution will work alongside VPN, serving different use cases, for years to come. Others see themselves shifting users gradually from a VPN to a ZTNA solution but will always keep VPN for a core set of users (29%). Just over a tenth (12%) acknowledged they would migrate all users to a ZTNA solution.
Looking at the polls, Fernando opines that the reason people might have a foot in each camp between ZTNA and VPN is the focus on user and identity in the marketing of services. Gerald adds that it might have to do with culture and resistance to change. He believes that VPN is not iron-clad and that he would rather be effective than compliant.
When asked about the Zero Trust tenets that are most compelling to their organisation, just under a third (30%) placed continuous authentication, authorisation/Trust earned through entity verification at the top. This was followed equally by end-to-end access visibility and audit (21%) and data protection, e.g. secure connection (21%). The rest of the delegates were compelled by the facilitation of least privileged access (14%), no trust distinction between an internal or external network (7%) and others (7%).
The final poll inquired about the most likely approach the delegates’ organisations might take in evolving to SASE (Secure Access Service Edge). An overwhelming number of delegates (77%) are likely to take a best-of-breed approach, selecting the partners most appropriate to their organisation’s needs. The remaining delegates were split between staying with existing partners and consolidating as necessary (15%) and looking for partners who can provide a complete SASE solution (8%).
In closing, Mark Huang, Product Director, Securecraft, acknowledged the mounting challenges in a drastically changed world. He emphasised that the journey of setting up a Zero Trust architecture need not be taken alone – Cloudflare and Securecraft can help government agencies with the task of making their services more secure.
Before ending the session, Mark thanked the delegates for the robust discussions and invited delegates to reach out to him and the team if they wanted a deeper understanding of how to get started on securing the government.
As the local distributor of Cloudflare, Mark emphasised that Securecraft would be more than happy to offer any support that delegates might need in their digital transformation journey.