As the volume, variety, and velocity of data continue to reach unprecedented levels, big data analytics has drawn significant interest. According to a recent report, the worldwide enterprise data analytics market is forecast to grow at a 9.4% compound annual rate through 2018, reaching $59.2 billion.
Many organizations are keen on adopting big data techniques to analyze huge volumes of data that conventional business intelligence solutions cannot handle, and to discover insightful knowledge for better decision making.
Recently, deep learning, which extracts high-level abstractions from data, has emerged and shows great potential for solving business problems. Many startups are using deep learning techniques in their applications because it is effective across a wide range of tasks.
OpenGov spoke to Ju Fan, Research Fellow, National University of Singapore, and Wei Wang, PhD Student, National University of Singapore, both of whom are working on developing a distributed deep learning platform, Apache SINGA.
Apache SINGA is a distributed deep learning platform that entered the Apache Incubator in March of this year. The project is funded by the National Research Foundation, Ministry of Education, and A*STAR. SINGA is a valuable tool for big data analytics because:
- It supports various deep learning models, and thus has the flexibility to allow users to customize the models that fit their business requirements
- It provides a scalable architecture to train deep learning models from huge volumes of data
- It provides a simple programming model, making the distributed training process transparent to users.
We talked to Ju Fan and Wei Wang about their research in deep learning, how they got into the Apache incubator, case examples, and what challenges them about their research.
The very beginnings…
“For this project, we started from a research problem on multi-model data retrieval. We were to use different data from different modalities, like image data or text data. Later, I found that deep learning was really effective for extracting features from different modalities,” said Wei Wang.
He then started developing his work with deep learning and has since published papers on algorithms and retrieval problems.
His mentor, Prof Ooi, is an expert on databases and distributed computing, and advised that Wei Wang work on the system side. Deep learning training is very time-consuming because it takes a long time to train a complex model over a large dataset.
To train the model, Wei Wang used the Stochastic Gradient Descent (SGD) algorithm, commonly used for deep learning models, which iteratively updates the model parameters based on gradients of the loss computed over the training data.
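In essence, each SGD step nudges the parameters in the direction opposite to the gradient of the loss. The sketch below fits a one-parameter linear model with plain NumPy; it is an illustration of vanilla SGD, not SINGA's actual training loop.

```python
import numpy as np

def sgd_step(params, grads, lr=0.01):
    """One SGD update: move each parameter against its gradient."""
    return [p - lr * g for p, g in zip(params, grads)]

# Toy problem: fit y = w*x with squared loss; the true weight is 3.0.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x

w = 0.0
for epoch in range(50):
    for xi, yi in zip(x, y):
        grad = 2 * (w * xi - yi) * xi   # d/dw of (w*xi - yi)^2
        (w,) = sgd_step([w], [grad], lr=0.1)
# w has now converged close to the true weight 3.0
```

Real deep learning training follows the same loop, only with millions of parameters and gradients computed by backpropagation over mini-batches, which is why it is so time-consuming at scale.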
Additionally, distributed training had to be applied because datasets can be quite large and models quite complex. Distributed training accelerates the process by using more computing resources, and SINGA's architecture was designed to run different training frameworks in a scalable manner.
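One common way to distribute SGD is synchronous data parallelism: each worker computes a gradient on its own shard of the data, and the gradients are averaged before a single update is applied. The sketch below simulates this idea with sequential Python loops standing in for workers; it illustrates the principle, not SINGA's actual communication protocol.

```python
import numpy as np

def parallel_sgd_step(w, data_shards, lr=0.1):
    """One synchronous data-parallel step: each 'worker' computes a
    gradient on its own shard, then the gradients are averaged and
    applied once to the shared parameter."""
    grads = []
    for x, y in data_shards:                        # one shard per worker
        grads.append(np.mean(2 * (w * x - y) * x))  # squared-loss gradient
    return w - lr * np.mean(grads)

# Toy problem: fit y = w*x (true weight 3.0), data split across 4 workers.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=200)
y = 3.0 * x
shards = [(x[i::4], y[i::4]) for i in range(4)]

w = 0.0
for _ in range(200):
    w = parallel_sgd_step(w, shards)
# w converges close to 3.0, just as in the single-worker case
```

Because the averaged gradient equals the gradient over the full batch, adding workers leaves the mathematics unchanged while dividing the per-worker computation, which is where the speed-up comes from.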
After they finished the first version of SINGA, Prof Ooi suggested that they try the Apache Incubator and get more people outside NUS to contribute to this project.
Wei Wang submitted his proposal for inclusion in the incubator and received comments from Apache mentors. At the time, it was the only Apache project of its kind, focused on deep learning.
Wei Wang told us that the team is now working on improving the system. “We are working to improve this system in terms of: scalability, efficiency, and the features to support different applications,” he said.
SINGA applied to Healthcare Data Analytics
With the data analytics power of SINGA, Ju Fan explained that they are collaborating with the National University Hospital System to work with data scientists and medical specialists in the healthcare domain.
They would look at data relating to diagnoses, medications, and lab test results, with the greater aim of reducing the cost of healthcare and improving the performance of services.
“Our approach is to draw knowledge from the healthcare data,” stated Ju Fan. “We are carrying out two applications of SINGA: the first is to predict the risk of hospital readmission, and the second is chronic disease progression modelling.”
These two applications show how SINGA is helpful in analysing electronic medical record (EMR) data because:
- Hospital readmission contributes a significant proportion of healthcare spending, while a large proportion of readmissions are potentially avoidable. Predicting risk of readmission for potentially fatal diseases can effectively yield lower costs and better healthcare quality.
- Chronic diseases tend to evolve and progress over a long time, and if their conditions are not properly managed, more serious comorbidities as well as complications may ensue. Disease progression modelling can help with the early detection and management of chronic diseases.
“Working with healthcare analytics is quite challenging for two reasons: the data is sparse, and medication needs to be personalised,” Ju Fan told us. “To address these problems we apply deep learning techniques, because deep learning is good at finding high-level abstractions in raw data.”
The benefits of having such a personalised system are clear: patients would receive better treatment, doctors would perform more efficiently, and hospitals would be able to reduce the overall cost of treatment.
Going forward, Apache SINGA will continue to develop and improve as SINGA could be useful to other data types and applications. The team is currently working with a local security company on malware detection, using deep learning techniques.
The team behind Apache SINGA will release version 2 of their programming model next month, January 2016.
For more technical details and the development schedule, interested readers can refer to http://www.comp.nus.edu.sg/~dbsystem/singa/
Ju Fan received his PhD in computer science from Tsinghua University, China in 2012. He is currently a research fellow in the School of Computing, National University of Singapore. His research interest includes big data analytics, crowdsourcing, and database management.
Wei Wang is a Ph.D. student in the computer science department of the National University of Singapore. Currently, he is working on an Apache incubator project (SINGA) for developing a general distributed deep learning system.
Every piece of data that travels over the internet — from paragraphs in an email to 3D graphics in a virtual reality environment — can be altered by the noise it encounters along the way, such as electromagnetic interference from a microwave or Bluetooth device. The data are coded so that when they arrive at their destination, a decoding algorithm can undo the negative effects of that noise and retrieve the original data.
Researchers from the U.S. and Ireland have now created the first silicon chip that can decode any code, regardless of its structure, with maximum accuracy, using a universal decoding algorithm called Guessing Random Additive Noise Decoding (GRAND).
By eliminating the need for multiple, computationally complex decoders, GRAND enables increased efficiency that could have applications in augmented and virtual reality, gaming, 5G networks, and connected devices that rely on processing a high volume of data with minimal delay.
One way to think of these codes is as redundant hashes added to the end of the original data. The rules for the creation of that hash are stored in a specific codebook. As the encoded data travel over a network, they are affected by noise or energy that disrupts the signal, which is often generated by other electronic devices. When that coded data and the noise that affected them arrive at their destination, the decoding algorithm consults its codebook and uses the structure of the hash to guess what the stored information is.
GRAND works by guessing the noise that affected the message and uses the noise pattern to deduce the original information. GRAND generates a series of noise sequences in the order they are likely to occur, subtracts them from the received data, and checks to see if the resulting codeword is in a codebook. While the noise appears random in nature, it has a probabilistic structure that allows the algorithm to guess what it might be.
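The guessing loop described above can be sketched in a few lines. The toy code below uses a hypothetical 4-bit codebook purely for illustration (real codes are far longer, and real noise statistics more elaborate); it enumerates noise patterns by Hamming weight, which is the most-likely-first order on a binary symmetric channel.

```python
from itertools import combinations

def grand_decode(received, codebook, n):
    """Guess noise patterns from most to least likely (fewest bit flips
    first), XOR each guess off the received word, and stop at the first
    result that appears in the codebook."""
    for weight in range(n + 1):
        for flips in combinations(range(n), weight):
            noise = sum(1 << i for i in flips)   # bit mask of flipped bits
            candidate = received ^ noise          # undo the guessed noise
            if candidate in codebook:
                return candidate
    return None  # unreachable when the codebook is non-empty

# Hypothetical 4-bit codebook with two codewords.
codebook = {0b0000, 0b1111}
received = 0b1101            # 0b1111 with one bit flipped by noise
print(bin(grand_decode(received, codebook, 4)))  # prints 0b1111
```

Note that the codebook is used only to verify each guess, which is exactly why GRAND is code-agnostic: swapping in a different codebook requires no change to the decoding logic.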
The GRAND chip uses a three-tiered structure, starting with the simplest possible solutions in the first stage and working up to longer and more complex noise patterns in the two subsequent stages. Each stage operates independently, which increases the throughput of the system and saves power.
The device is also designed to switch seamlessly between two codebooks. It contains two static random-access memory chips: one cracks codewords while the other loads a new codebook, after which the device switches to decoding with it without any downtime.
The researchers tested the GRAND chip and found it could effectively decode any moderate redundancy code up to 128 bits in length, with only about a microsecond of latency. Médard and her collaborators had previously demonstrated the success of the algorithm, but this new work showcases the effectiveness and efficiency of GRAND in hardware for the first time.
Developing hardware for the novel decoding algorithm required the researchers to first toss aside their preconceived notions. “We could not go out and reuse things that had already been done. This was like a complete whiteboard. We had to really think about every single component from scratch. It was a journey of reconsideration. When we do our next chip, there will be things with this first chip that we will realise we did out of habit or assumption that we can do better,” said the lead researcher.
Since GRAND only uses codebooks for verification, the chip not only works with legacy codes but could also be used with codes that have not even been introduced yet. In the lead-up to 5G implementation, regulators and communications companies struggled to find consensus as to which codes should be used in the new network. Regulators ultimately chose to use two types of traditional codes for 5G infrastructure in different situations. Using GRAND could eliminate the need for that rigid standardisation in the future.
Moving forward, the researchers plan to tackle the problem of soft detection with a retooled version of the GRAND chip. In soft detection, the received data are less precise. They also plan to test the ability of GRAND to crack longer, more complex codes and adjust the structure of the silicon chip to improve its energy efficiency.
The development, integration and adoption of information management and governance frameworks are a necessity, especially in an era where the quality of data collected is more important than ever. As information is a strategic asset, governments need to protect, leverage and analyse both structured and unstructured information to better serve and meet mission requirements.
Public sector leaders need to lay the groundwork to correlate dependencies across events, people, processes and information to establish data-driven organisations to accomplish their mission. This is why creating effective information and data governance strategies, policies and frameworks to drive the quality, accuracy and availability of insights are essential.
With recent advancements in data analytics, business intelligence, Machine Learning and Artificial Intelligence, governments can better predict and anticipate problems more accurately rather than react to them.
While this is not new, the difference today is the regularity, accuracy and consistency made possible by the current power of analytics. Massive data sets, millions of pages of unstructured text and information stored across silos and borders can now be analysed to identify patterns, forecast trends and mitigate problems.
Data analytics allows governments to see the bigger picture – understand where to increase efficiencies, cut waste, improve policies and monitor budgets.
Public sector agencies are working to radically improve their operations and services – driving the need to structure, collect and store data that will improve analysis and offer better actionable insights. Therefore, there has never been a more important time for data collaboration and a single source of truth.
The Singapore public sector has been leading the charge in digital transformation and data analytics in the region. The nation has developed new infrastructure to digitally industrialise the management, governance and use of data to support and scale data transformation initiatives.
In the world of data democratisation, breaking down information silos is the first step toward user empowerment. This can only be done with reliable analytics tools capable of desegregating and connecting previously siloed data, making it manageable from a single place. Governments need to be more intuitive to sense and respond to new technology opportunities that could drive digital transformation in times of constant change.
HyperIntelligence – a relatively recent concept – is about making data available to staff in a way that ensures convenience, access and safety. Considered by some to be the future of data analytics, it relies on trusted sources and on personalising information for specific roles within an organisation, surfacing critical data insights for specific keywords across web applications.
To effectively leverage data insights to deliver citizen-centric services, data and analytics are crucial for government agencies. While it cannot be used to solve every challenge in society today, it is a great step in the right direction.
This was the focal point of the OpenGov Breakfast Insight on 10 September 2021 – a closed-door, invitation-only, interactive session with Singapore’s top government agencies. This session aimed to provide the latest information on how government agencies can use data analytics to drive mission outcomes.
Finding Partners to Leverage Data Analytics
To kickstart the session, Mohit Sagar, Group Managing Director and Editor-in-Chief at OpenGov Asia delivered the opening address.
Data on a global scale has taken on an entirely different dimension and Singapore is no different. In fact, compared to other countries in the region, the nation is well ahead of the curve and leads in data analytics. The public sector has spent huge amounts of money on technological innovations.
While Singapore collects massive amounts of data, quantity alone is not enough to make informed decisions. Where, how and when is critical as is how the data is structured and made uniform. For better and more relevant data, information silos need to be broken down. Democratisation, integration and sharing will all be key to bettering citizen services and enhancing citizen experience.
To democratise data, the public sector needs to empower its entire workforce – from top to bottom. For the most part, data is often only accessible to people in higher positions or specific departments, creating disparity and lacunae. The information gap must be bridged with appropriate empowerment – be it through awareness, training or skill up-gradation.
Access to large data sets is essential for a government’s digital transformation journey. Of course, data in and of itself is not the end goal – data must serve as a tool to derive understanding that enables effective decision making. Actionable insights from analytics will ultimately enrich the citizen experience.
In closing, Mohit emphasised the importance of partnerships that could help leverage data analytics for an organisation. By working with the right people, a company can accelerate its digital journey towards effective digital transformation.
Global State of Enterprise Analytics
Kyung-Whu Chung, Director, Sales Engineering, APAC, MicroStrategy spoke next on the criticality of data quality in digital transformation. To set the context, Kyung-Whu revealed that a recent survey showed that 94% of respondents say that data and analytics are important to their business growth and digital transformation. While this may be obvious, it bears more elaboration and explanation.
There are huge benefits for organisations in using data analytics, including improved efficiency and productivity. Better data analytics leads to faster and more effective decision-making and, ultimately, results in better financial performance. Data analytics also helps organisations identify and create promising new products and services.
While the internal benefits are clear, there are advantages for the consumer as well. Customer satisfaction and experience are both critical for a company to thrive. Data analytics helps organisations better understand consumer behaviour, trends and demands, and identify issues. It has improved customer acquisition and retention through enhanced customer experience.
In contrast, the same survey showed that only 21% of potential business users are using data, and the vast majority (97%) of real-time decisions are data-deprived. This indicates, surprisingly, that organisations and agencies are still relying on intuition and manual analysis to solve complex problems with multiple variables.
Barriers that limit the uptake of analytics have been well articulated. Kyung-Whu identified the top three concerns – data and privacy concerns, limited access to analytics and lack of talent and training.
On the issue of privacy, 38% of organisations said more than 50% of their data is certified by an organisational authority or adheres to corporate policies. Despite this, customers are concerned about their sensitive and personal data. Organisations need to build trust and communicate clearly about using data responsibly. This will make customers more inclined to provide their information.
When it comes to access, data-driven culture often gets stuck at the top. Access to the organisation’s data and analytics is usually concentrated on specific roles. Democratising data is important as it empowers all departments and encourages data-driven decisions at all levels throughout the company.
The last challenge that organisations need to tackle is the lack of talent and training. While simple enough to understand, there needs to be a more intentional drive and strategy to reskill and upskill employees.
In closing, Kyung-Whu encouraged delegates to expand their thinking and embrace a multi-tool environment. A data-driven culture can only be built on data democratisation, enabling everyone to access every process and every app. Collecting data is only a start; organisations need to enrich the data to gain deeper insights.
Health AI Strategy
Delegates then heard from Sutowo Wong, Director, Analytics and Information Management Division, Ministry of Health, Singapore who elaborated on the AI strategy and use cases in the nation’s health division.
Sutowo acknowledged that as the nation shaped its health AI strategy, it needed to be mindful of external macro trends. One such trend is the democratisation of data and analytics: self-service analytics and the rising demand for data visualisation require a better user experience for both data and insights.
The next trend was the rise of analytics apps. Role-based actionable insights needed to be more easily consumed and deployed. Moreover, the ability to support decision making is still the most significant challenge to realising value from investments in analytics.
Singapore’s health AI strategy is aligned with the national AI strategy – a vision committed to making Singapore a leader in developing and deploying scalable, impactful AI solutions in key sectors of high value and relevance to citizens and businesses by 2030.
Specifically, the vision in the health field is to transform and enhance policy decision making, delivery of care and patient outcomes as well as internal operations through the development and deployment of scalable AI solutions in the healthcare sector.
To achieve the vision, the government has developed a strategy and framework for health, identified and driven impactful and feasible AI use cases that could be scaled across the healthcare system, and leveraged ecosystem enablers for AI in the health sector.
Sutowo shared the example of self-learning retinal screening technology as a successful use case of deploying AI in healthcare. The Singapore Eye LEsion ANalyser (SELENA+) is a deep learning system jointly developed by the Singapore National Eye Centre (SNEC) and the National University of Singapore (NUS) that can cut the time needed to screen for Diabetic Retinopathy (DR). SELENA+’s capabilities in analysing retinal images could be extended to a predictive risk assessment model for cardiovascular diseases.
Another use case is AI in the health grand challenge, JARVIS. The initiative aims to help primary care teams stop or slow disease progression and complication development in Diabetes, Hypertension and hyperLipidemia (DHL) patients by 20% in 5 years.
Singapore is upskilling talent based on the whole-of-government analytics competency framework. In closing, Sutowo said he believes that, beyond AI, the rapid growth in digital health presents opportunities to redefine Singapore’s care and financing models.
After the informative presentations, delegates participated in interactive discussions facilitated by polling questions. The session was designed to provide live-audience interaction, promote engagement, hear real-life experiences and impart professional learning and development for the participants. It was an opportunity for delegates to gain insight from subject matter experts, share their stories and take back strategies that can be implemented in their organisations.
The opening poll inquired about the main challenge delegates face in their data strategy journey. Almost half (47%) chose a lack of data culture/literacy/skill across employees as their primary challenge. A little less than one-third (32%) thought that missing an overall strategy that crosses departments and teams is their biggest obstacle. Data privacy and security concerns are the biggest challenges for 16% of delegates, while 5% chose a lack of a centralised tool for sharing and collaboration.
The second question inquired about the best option to overcome the people challenge. Again, almost half (44%) believed that their best choice is to increase data literacy by providing education and certification programs. A quarter chose the leadership team to mandate all employees to use the analytic tool as their best option while 19% opted to improve the current process for business users to get instant data. About a tenth (12%) indicated that providing employees with a self-service analytic tool would be their best option.
On being asked about what their business users do when they have new data requirements, almost two-thirds (65%) approached the IT department directly for support. While almost a quarter (23%) went by their gut feeling, 12% opted to raise a Helpdesk ticket.
The next question was about their agency’s biggest data management barrier. Delegates were equally divided (26%) between data collection and data accessibility and sharing. A little more than one-fifth (21%) identified data accuracy – providing a single source of truth – as their main barrier. While 16% chose real-time insights, about 11% went with regulatory compliance.
Delegates were asked what their agency is doing to manage their data management challenges. Almost half (48%) chose a combination of working with current service providers for better efficiencies and sourcing new service providers to bridge the gap. A third chose to work with current service providers to improve efficiencies and maintain costs. Almost a fifth (19%) chose to source new service providers to bridge the gap, alongside existing vendors.
On being queried about how many systems their agency stores its data in, almost three-quarters (71%) employed over 10 systems. One-fifth (20%) used 2-4 systems while 10% had between 5 and 9 systems.
The next poll asked which applications delegates spend most of their working day in. A majority (70%) spent their time in email, while a quarter most often used productivity applications (like Microsoft Office).
Asked whether they had considered a zero-click experience for data, just over two-thirds (68%) of delegates had not, while the remaining third (32%) had.
The final question asked for delegates’ top data strategy priority over the next 2 years. Well over half (60%) prioritised data sharing to generate insights across agency boundaries, to equip decision-makers with the information needed to execute operations better and plan for future contingencies. The remaining delegates were equally split (20% each) between empowering staff with meaningful data insights to drive decisions and accelerating legacy modernisation to improve resilience and agility.
The Breakfast Insight concluded with remarks from Kyung-Whu Chung who highlighted the role of data analytics and the need for agencies to begin leveraging it. He urged agencies to become data-driven and advised them to accelerate their digital transformation.
In closing, he invited the delegates to reach out to his team to explore ways they could work together to assist them on their journey.
Good citizen experience is one of the most essential components of an effective government. Unfortunately, it is still a far cry from the seamless, personalised engagements that citizens have and expect from the private sector. Getting information or accessing services from government agencies online continues to be a tedious process and often remains a frustrating experience in most countries. And whilst many governments are prioritising improvement in the way they engage with their customers, bureaucratic processes and outdated policies can often stymie good intentions.
The public sector must shift to citizen-centric digital offerings, with an effective strategy to deliver private-sector-level digital services. As reported by OpenGov Asia in an exclusive interview with John Mackenney, Principal Digital Strategist, APAC, Adobe, John believes that the strategy for personalisation goes further in government than in the private sector. Government has the responsibility of equity – to make sure everyone has access to what is needed and to ensure that no one is left behind within society.
To implement these strategies and plans, effective policies must be put in place that support and facilitate government objectives. Currently, misaligned policies, obsolete culture and a lack of leadership often hinder the public sector’s desire for meaningful transformation. That is why Adobe is helping governments update their policies, promote a citizen-centric culture, and encourage forward-thinking leadership as part of their long-term strategy.
Adobe is helping revolutionise public sector agencies through cutting-edge digital transactions because it recognises that great citizen experiences have the power to inspire, transform and revitalise agencies. Such experiences also engender trust and compliance on the part of citizens, just as it does in the private sector. Adobe connects content and data, introduces new technologies that democratise creativity, shape the next generation of storytelling and inspire entirely new business categories.
OpenGov Asia had the opportunity to speak exclusively to Jennifer Mulveny, Director of Government Relations, Asia-Pacific at Adobe on this topic.
Jennifer oversees all public policy issues that impact Adobe’s business in the Asia-Pacific region, including data, international trade, privacy, cybersecurity and intellectual property. She has held several advisory roles within government and business, specialising in international trade and technology policy matters. She currently co-chairs the special interest group on public policy for the Australia Information Industry Association (AIIA).
On the purpose of policies, Jennifer explained that they are designed to support outcomes that improve lives by providing products and services and, ultimately, helping mitigate negative situations. Jennifer divided policies into two categories – those that prioritise the needs and convenience of citizens, and those that facilitate a government’s ability to operate as efficiently as possible.
Companies can work to influence policies that advance their own business or help their customers improve theirs. Companies often promote policies that benefit an entire industry or ecosystem, or initiatives that are simply good for the community. Policy strategy can also be more reactionary, as companies or associations seek to improve existing, well-intended policies whose details are hard to implement.
To help governments understand what policies to change or update, Adobe first looks to get a comprehensive understanding of what the government is prioritising based on their goals and electoral pledges. Delivering on mandates and promises is essential – and policymakers recognise that building trust from citizens is more critical now than ever.
Adobe works to increase trust between citizens and governments by creating meaningful and dependable online engagements. This means providing the tools needed to delight citizens, whether the transaction is online or in person, and consolidating hundreds of hard-to-navigate, citizen-facing government websites filled with irrelevant information into a few personalised, interactive sites with meaningful and streamlined content.
To do this effectively, a strong leader in government has to promote policies that incentivise agencies to put the citizen first by consolidating websites, updating outdated content and digitising paper forms. In 2018 the United States Congress did this with support from the White House by passing the 21st Century IDEA Act, which aims to improve government customers’ digital experience and reinforce existing requirements for federal public websites.
As a result of this policy, US agencies are complying by turning paper forms into digital interactions, enabling digital signatures, modernising websites and overhauling portals for citizens to communicate more efficiently with public officials.
In Australia, a law will soon be passed that promotes the sharing of non-sensitive citizen data between agencies so that information can be consolidated into a “single view of the citizen” to streamline applications and other processes.
Certain policies can be appropriate for a particular time and place, but as society evolves, some policies become obsolete. Adobe either identifies specific things that need to change or probes the government to find out what it would like to improve regarding citizen engagement. Technologies can assist the public sector in addressing major issues such as improving healthcare and protecting the environment.
Adobe’s research on citizen engagement shows that while people often look for information on a government website, they frequently do not find what they need – or the process is so long and convoluted that they give up and look elsewhere. This can be measured by the time a citizen spends on a site before ultimately turning to a call centre with their enquiry, which often costs governments significant resources. Governments must invest in making the online citizen experience more convenient and intuitive.
An effective policy that facilitates a better citizen experience is one that gives agencies the ability to move data between government agencies. In Australia, sharing citizen data between agencies is an arduous process and one that dissuades agencies from transferring and sharing necessary data. This is why a citizen can often be asked several times for their name, date of birth and address when they have multiple transactions. The Australian government is introducing a policy that makes sharing data across platforms far easier.
Another example is a policy surrounding an electronic or digital signature that got updated out of necessity during the COVID-19 pandemic.
On the global scale, Jennifer opined that the US 21st Century Integrated Digital Experience Act (IDEA) is the best example of a policy that improves the citizen experience. The policy sets a threshold for all government agencies to create and consolidate better website experiences. As a result, many US agencies are stepping up to meet the threshold. Notably, the US Census Bureau launched a mobile app for the first time and the US Department of Energy has digitised paper forms.
Jennifer touched on the topic of measuring policy success, which is not always easy. Objectively measuring the effectiveness of a policy is difficult as each government agency has different Key Performance Indicators (KPIs). In general, however, a “citizen first” policy can be considered successful if it demonstrably reduces costs and saves time for citizens or public servants. Adobe conducted a study in Australia that showed an improved digital experience could save Australians about 2 days a year. She added that success also lies in creating a more enjoyable environment and culture for citizens.
Jennifer is passionate about advocating citizen-centric policies that prioritise citizens’ needs by creating personalised experiences instead of reactionary-based policies. To create such policies, the public sector needs to obtain and analyse citizen data to make informed decisions, but this does not need to be personal data. Even minimal information that the government can glean from citizens who do not sign into a website, such as their general geographic location and search terms, can be insightful.
To help the public sector improve citizen experience, Adobe systematically analyses the obstacles that impede agency aspirations, categorising the level of difficulty of each obstacle and prioritising which problem to solve first.
Jennifer conceded that one of the most difficult hurdles to overcome is culture. Strong leadership and a genuine team approach are required to achieve a significant transformation in citizen experiences.
For example, consolidating citizen-facing engagements such as tax returns, passport applications, social services and license applications into one portal will require buy-in from the agencies that have oversight across those services. Public servants may initially be reluctant to move out of their comfort zone which is why good policy, such as data sharing initiatives can help bolster these initiatives.
Emphasising the significance of partnerships, Jennifer is convinced that there should be a strong relationship between the public and private sectors when it comes to digital transformation. Collaboration between the public and private sectors is the best way to produce new solutions and policies that best serve citizens.
By collaborating with the public sector, Adobe has gained unparalleled insights about government goals and in exchange, the public service gains new skills and global experience from companies that have invested heavily in their products to deliver best-in-breed services.
Jennifer believes policies such as allowing agencies to share a certain level of non-sensitive citizen data to create “one view of the citizen” are essential to creating a better experience. She supports non-partisan global policy think tanks that recommend which technology policies governments should adopt based on the economy and demographics. It is essential for both large and small, domestic and foreign-based companies to advocate for policies that mutually provide private sector growth and public sector trust.
Transport for NSW is hoping that aggregated data collected by a Dutch consumer electronics company and LiDAR systems might provide it with more timely insight into conditions and hazards on the state’s road network. The agency, in collaboration with iMOVE Cooperative Research Centre (CRC), currently relies on videos taken by crews for safety assessments, from which certain road attributes are extracted.
However, TfNSW wants to speed up the process, and has embarked on a project that will “convert raw data… into an international standard five-star rating system”. The project will deliver 20,000 km of road attributes in NSW using TomTom’s MN-R map data, as well as prove feature extraction techniques and machine learning for LiDAR data.
MN-R is the model that the consumer electronics company uses to keep its mapping data up-to-date. It combines several layers of data collection techniques, including from the use of its navigation systems and from sensors. In addition to understanding road conditions and hazards, TfNSW hopes the project could also lead to the development of predictive algorithms around injuries and fatalities in the future. The project will feed into a global ‘AiRAP’ initiative from a non-profit roads rating agency, the International Road Assessment Programme (iRAP).
TfNSW is also working on the project with the University of Technology Sydney and an NSW software company specialising in geospatial data. The local company has previously partnered with the consumer electronics company to extract more than 50 road assets and safety features, such as road markings, safety barriers and trees, from LiDAR data.
The iRAP global innovation manager, who is overseeing the project, said AI had the “potential to reduce costs and increase the frequency and accuracy of data”. She noted that faster, more affordable data collection means safety assessments can be done annually across the whole road network.
The project comes at a time when the federal government is planning to tie infrastructure funding to “measurable improvements in safety”, according to the draft national road safety strategy 2021-30. Canberra has previously set targets for 90% of national highways and 80% of state highways to meet a three-star or better safety standard.
Raising the world’s roads to a three-star or better standard for all road users will help to focus policy and investment. With crash costs typically halving with each incremental improvement in star rating, the potential for three-star or better roads to save lives is significant. More than 1,100 people are killed on Australian roads each year, while around 40,000 are seriously injured.
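The halving rule of thumb implies an exponential relationship between star rating and crash cost. A minimal sketch, assuming costs halve exactly once per star (the function and baseline are illustrative assumptions, not iRAP’s actual cost model):

```python
# Illustrative only: relative crash cost under the rule of thumb that
# crash costs roughly halve with each one-star improvement in rating.
# This is an assumed simplification, not iRAP's published methodology.
def relative_crash_cost(stars: int, baseline_stars: int = 1) -> float:
    """Crash cost relative to a baseline-star road, halving per star gained."""
    return 0.5 ** (stars - baseline_stars)

for stars in range(1, 6):
    print(f"{stars}-star road: {relative_crash_cost(stars):.4f}x baseline crash cost")
```

Under this simplification, a three-star road carries only a quarter of the crash cost of a one-star road, which is why the three-star threshold features in the federal targets above.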
The iMOVE CRC Managing Director stated that using technologies such as AI to enhance the suite of safety policy tools is a great step forward. “These powerful and insightful tools can inform sound investment by the government that saves lives and unlocks significant benefits… through reduced road trauma,” he said.
The project is the second such project in recent years where TfNSW has sought to use AI and ML for road safety improvements. In 2019, TfNSW built a proof-of-concept using ML technology from Microsoft to identify potentially dangerous traffic intersections and fast-track remediation.
The Chinese Academy of Sciences (CAS) has launched a research centre to use and share earth observation data to help the world meet the United Nations Sustainable Development Goals (SDGs). The Chinese President sent a congratulatory letter for the launch, urging all sides to strengthen cooperation, contribute to the UN’s 2030 Agenda and establish a community with a shared future for mankind. The research centre helps China fulfil the promise made at the 75th session of the UN General Assembly last year.
The big data research centre will pool information from satellites, aircraft, drones, and ground arrays and sensors on issues like environmental commons, urban and rural-urban fringe development, food security, and energy decarbonisation.
We will establish monitoring and evaluation systems for these goals, and share collected data with relevant UN organisations and developing countries.
– Director, International Research Center of Big Data for Sustainable Development Goals
The UN attaches great importance to the collection and sharing of such data. The UN Under-Secretary-General and Executive Director of the UN Environment Programme commended CAS for working with others to bridge the data gaps for the SDGs and to improve the effectiveness of using environmental data.
Environment-related data and indicators at everyone’s fingertips are critical so that people can understand which of their actions are improving the environment and which are degrading it further. Investing in big data for the environment is no longer an option, but an absolute necessity.
A former assistant director-general for Natural Sciences at UNESCO praised the centre’s international vision, saying it will contribute to a worldwide workforce that knows how to work across disciplinary and national boundaries to find solutions to common problems.
Chinese researchers have promised to overcome challenges like lack of data, insufficient research, and uneven progress in implementation. The CBAS Director said that as long as human beings exist, they will face the challenge of maintaining sustainable development; applying big data to understand and promote research on the subject is therefore an eternal task. The research centre will provide critical support in the form of knowledge, expertise and technical means to help achieve balanced and sustainable development in step with the UN goals.
According to an article, the Chinese government has issued a Three Year Plan for new data centres, demanding that new facilities become more efficient. The Three-year Action Plan for the Development of New Data Centers (2021-2023) also limits the growth of data centres to 20% and sets out a national architecture supporting national cloud hubs, provincial data centres, and Edge data centres.
As reported by OpenGov Asia, China has been active lately in passing several new laws and regulations relating to data privacy and security. The two recent laws tend to focus on those handling data that affects national security and/or the public interest – namely, Critical Information Infrastructure and Important Data.
China passed the Data Security Law (DSL). The key focus of the DSL is the protection and security of critical data relating to national security and the public interest. The most significant element of the law is the so-called data classification system whereby the government will classify different types of data based on its level of importance and then publish a protection/security standard for each class of data.
The nation has released the Security Protection Regulations on Critical Information Infrastructure (CII Regulation). The CII Regulation is an implementing rule of the Cybersecurity Law (CSL). It applies only to Critical Information Infrastructure (CII), which refers to networks and IT systems that are critical to national security and the public interest, but it may also have implications for companies that supply or service such networks and systems.
China will introduce several policy measures, including promoting the institutional opening-up of the digital field, to develop digital trade and deepen international cooperation in this area. The government will benchmark international high-level economic and trade rules and advance the institutional opening-up of the digital field.
In order to build more demonstration zones, China will guide local governments to carry out pilot trials based on opening-up platforms such as the pilot free trade zones to stimulate innovative activities in trade in services. According to remarks at a digital trade development-themed forum during the 2021 China International Fair for Trade in Services in Beijing, China will also participate in the formulation of related international rules and safeguard the multilateral trading system.
In the next step, the government will deploy more resources to support the growth of national-level export bases for digital service to create favourable conditions to boost digital trade, he said. The Ministry of Commerce released a report on China’s digital trade development at the forum, noting the country’s digital trade makes up an increasingly larger part of its foreign trade.
The ministry data showed that during the 13th Five-Year Plan period (2016-2020), China’s digital trade volume soared from $200 billion in 2015 to $294.76 billion in 2020, an increase of 47.4%, and its share in trade in services surged from 30.6% to 44.5% during this period.
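The ministry’s figures are internally consistent; a quick check of the growth rate and the digital share (the total-service-trade number below is derived from the article’s figures, not stated in it):

```python
# Quick consistency check of the ministry figures cited above.
volume_2015 = 200.0    # digital trade volume, USD billions
volume_2020 = 294.76   # digital trade volume, USD billions

growth_pct = (volume_2020 - volume_2015) / volume_2015 * 100
print(f"2015-2020 growth: {growth_pct:.1f}%")  # matches the reported 47.4%

# Derived figure (an inference, not stated in the article): the total
# service trade implied by the 44.5% digital share in 2020.
digital_share_2020 = 0.445
implied_total_2020 = volume_2020 / digital_share_2020
print(f"Implied total service trade in 2020: ${implied_total_2020:.0f}B")
```

The derived total (roughly $660B in service trade for 2020) is only an implication of the reported share, offered to show how the percentages fit together.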
COVID-19 has spurred more consumers to shop online, meaning more businesses are transacting online and across borders. Increased trade stimulated by these initiatives will continue to drive more cross-border transactions and the need for low-cost, fast, transparent digital payment options.
An academician at the Chinese Academy of Engineering said that with computing power improvement and the acceleration of transmission, digital trade has become a new form of China’s foreign trade. Digital trade takes data as the production factor and digital delivery as the main method. An estimate based on statistics showed China’s digitally-delivered service trade hit $294.76 billion in market size last year, an increase of 8.4% year-on-year, accounting for 44.5% of its total service trade.
The digital economy has become an increasingly important driving force of China’s high-quality economic growth. China’s total market size of the digital economy ranked second worldwide last year, just behind the United States. China Academy of Information and Communications Technology released a white paper that showed the market size of China’s digital economy surged 9.6% year-on-year in 2020.
China will establish and improve relevant laws and regulations covering the classification of data resources and the protection of cross-border transmission, privacy and public safety during its 14th Five-Year Plan period (2021-2025).
In addition to participating in the formulation of international rules and technical standards on data security, currency and tax, the government will actively pursue international exchanges and cooperation in cyberspace in the coming years.
As reported by OpenGov Asia, China’s digital economy has taken centre stage and opened up new possibilities for international cooperation as it is becoming ubiquitous in modern life. China saw the display of a wide range of cutting-edge technologies and applications, including a cloud-computing processor the size of a business card, an AI machine that can grade homework and tests, and smart home technologies that allow users to voice-control heaters and floor-cleaning robots through a mobile app.
In recent years, China has been actively promoting digital industrialisation and industrial digitisation and has been pushing forward the deep integration of digital technologies with economic and social development. Digitisation, networking and intelligence should provide more momentum for economic and social development, thus creating a new chapter for digital economic cooperation.
The established financial sector is implementing digital strategies to remain competitive and relevant in the face of rapidly evolving customer expectations. However, fintech start-ups and big tech firms have already begun to fill that void by providing customers with better and more engaging experiences via intuitive, simple-to-use services. These new entrants have eaten away the profit margins of conventional financial firms. Traditional establishments must now be more innovative and think more creatively if they want to not only survive but thrive.
New-age technologies have created several platforms and solutions that can ensure the immediate survival and long-term growth of such institutions. Social media, cloud, big data and analytics, machine learning (ML), and artificial intelligence (AI) can all contribute to improving processes and customer experience in the existing setup.
That apart, companies must simultaneously reconsider traditional innovation models and launch new offerings. An innovation mindset allows banks to create the right digital strategy to meet their short-term, as well as long-term goals.
Data is becoming increasingly important in economies and societies, not least in the financial services sector. Technological advancements have vastly enhanced financial service providers’ ability to capture, store, combine and analyse a much broader range of customer data, ranging from the current or previous location to customer habits and preferences.
Data is simply the most recent way for banks to deliver on their core promise: to listen to their customers, create services that benefit them, and deliver personalised experiences. Critical banking technology architectures leverage this to include a frictionless process layer for the banks of the future to deliver on the foundational pillars of digital customer experience.
This, in turn, will allow organisations to build, optimise and secure digital interactions from frontend to backend, across the entire technology stack in a multi-cloud future environment.
Special-purpose applications will become more adaptable and self-configuring as programmes and devices become smarter. Machine learning encompasses dozens of approaches and algorithms for improving the performance of advanced technologies. Algorithms will improve greatly as they are used, and they will include mechanisms to optimise themselves.
Overarching all this is the pandemic. The crisis’ acceleration of digital transformation has driven rapid evolution in data architecture, allowing organisations to use the appropriate products and channels to provide better customer experiences.
This was the focal point of the OpenGovLive! Breakfast Insight on 3 September 2021 – a closed-door, invitation-only, interactive session with Singapore’s top financial institutions and organisations. The session aimed to provide the latest information on delivering an effective and efficient customer experience.
Finding Partners to leverage Data and Technology
To kickstart the session, Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia delivered the opening address.
Mohit observed that people today adopt technology at great speed. The retail, financial and public sectors are also rapidly rolling out new IT, tech software and solutions to survive and thrive in the digital age. Digital transformation is the latest buzzword used across the world in every imaginable area and sector.
While the adoption of various technologies increased significantly during the pandemic, the solutions cannot be referred to as digital transformation. For the most part, organisations relied on band-aid technologies and ad-hoc platforms to stay afloat.
With remote working models in place early in the pandemic, people have grown accustomed to accessing information at any time, on any platform and with any device – courtesy of retail businesses.
The retail industry has embraced personalisation and transformed itself in a variety of ways, including accessibility, options, ease of doing business and security. Customer expectations, based on the retail service delivery model, have had a significant impact on the financial and public sectors.
Mohit emphasised the importance of partnership to leverage data and cloud computing in an organisation. By partnering with the right people, a company can accelerate its digital journey towards digital transformation.
Technology and Data Strategies for the New Normal
Delegates next heard from Dan Brassington, Chief Technical Advisor, APAC, Splunk. Dan agreed with Mohit’s premise, stating that digital transformation is widely used by organisations across all industries and that the future of organisations must be empowered by data running in a multi-cloud world.
He acknowledged that Splunk’s customers have not changed; as consumers, they are still moving forward, and organisations are attempting to keep pace to ensure they are constantly evolving.
Dan agreed that there are fundamental shifts in data-driven decision-making as COVID-19 forced unprecedented change at unprecedented speed.
The first is forced adoption, where online, mobile and call centre channels play critical roles in reaching and communicating with customers. There was a clear tipping point when retail migrated to digital and contactless payments, in conjunction with how businesses started to consider the data behind their customers’ journeys.
Another significant fundamental shift that occurred as a result of the pandemic was overnight virtualisation. Every business shifted to remote working. Businesses needed to make data accessible and secure and, simultaneously, had to figure out how to enable the people involved in the process to run the organisation.
He observed that as technology evolves, businesses must become more resilient and take into consideration market structure and the realities of national economics.
Speaking on digital channels, he emphasised how they have become the new drivers – the latest ways used to communicate with customers. The caveat is that companies must continue to expand, grow and manage these digital channels. Dan has had to resolve various challenges for their customers – how they plan to continue to evolve within their digital channel, how they will support it with data and what the future of it will be.
The fact of the matter is that all of the innovation and transformation is so that businesses can meet their customers’ expectations and provide smooth experiences in a bid to remain relevant and profitable.
Dan firmly believes that adopting a cloud strategy is essential for organisations to enjoy a better data experience. He claimed that organisations currently use only 60%-70% of their data; they must leverage data better if they are to offer digital services and experiences everywhere they operate. This will then ensure the organisation’s digital transformation.
Dan knows that success is dependent on a cloud strategy. To achieve their goals, organisations must consider visibility, control security posture, the ability to detect an issue before it occurs and observability by understanding the end processes around the application for a better customer experience.
Dealing with an Ecosystem
Simron Sharma, Director of Client Experience, Standard Chartered Bank, revealed that the bank is always looking to their customers or clients, listening to their needs and exploring ways to meet their needs and demands.
Nonetheless, Simron says it is no longer adequate in and of itself as customer demands have increased over time. It is no longer just about listening to them but about listening to an entire ecosystem. Despite all the methods and solutions for connecting with their customers – surveys, ratings, NPS and complaint forms – the question remains, “Is it enough?”
Simron elaborates, “At Standard Chartered Bank, we firmly believe that our relationship managers are probably the bank’s biggest critics because they interact with clients every day and provide us with insights on what the clients demand.”
Simron agrees that the right partners help significantly in providing insights as well as future trends that can guide the bank’s digital journey effectively.
After the informative presentations, representatives from the different organisations participated in interactive discussions facilitated by polling questions. This activity is designed to provide live-audience interaction, promote engagement, hear the real-life experience, and impart professional learning and development for participants.
Delegates were first asked how mature their organisation’s cloud strategy is. Over half (54%) answered that they have defined a cloud strategy and were starting to implement it. About a third (31%) said they are still evaluating cloud and experimenting with different options. 15% confirmed that they have a comprehensive cloud strategy and are well ahead in its practical implementation.
Interestingly, one of the companies said it had actually started its cloud journey a few years ago, had recently developed a cloud strategy and has ever since been working on furthering its cloud adoption.
Commenting on barriers, a representative from a Japanese Bank said that not all banks are ready to store data in the cloud environment as they worry about governance and the risk of exposing client’s data.
A delegate from an insurance company shared that they are in between these adoptions as their cloud system does not have an appropriate database. The cost of migration was also a concern for their company to fully adopt the cloud.
The second question asked what the main drivers for data analytics and data capabilities within their organisation were. Similar to the previous question, over half (54%) said it was to gain better insights and to create a personalised customer experience. Over a third (35%) indicated it was to help generate additional revenue through highly targeted marketing strategies and 11% use it for risk mitigation and improving the employee experience (as organisations think that risk and security play the most important part to secure customers data).
The third question inquired about the challenges financial institutions face in getting data insights. An overwhelming majority (82%) agreed the issue is disparate data sources from multiple data silos. The remaining delegates were evenly split (7% each) between a shortage of staff who understand big data analytics and data storage and quality. The last group (4%) went with no proper software for data analytics.
With many financial institutions looking at digital transformation today, delegates were questioned on the importance of digital transformation to their organisations. Almost three fourths (73%) confirmed that it was very important, and they are undergoing transformation currently. Interestingly, about 15% acknowledged that it is important but said it came with certain constraints.
The fifth poll asked if customer-centric organisations are structured around the experiences of their customers. To answer this, delegates had to indicate which statement (presented to them) they most agree with.
A third (33%) opted for designing and offering the right services, products and experiences to the right customers. Another 33% said they ensure that customer requirements are filtered through to capabilities – as such, processes, capabilities, and systems are systematically aligned to customer needs. A fifth (21%) agreed that they understand who “banks with us” – the market, types of customers and how customers’ needs differ. Just over a tenth (13%) indicated that they are a collaborative organisation, where top-down and bottom-up decisions are aligned.
The final question asked what main challenge they faced when implementing a digital strategy. Over a third (35%) felt it was legacy technologies lacking integration capabilities – even though they understood that legacy issues are here to stay. Under 30% went with inflexible business processes and teams while 22% opted for regulatory constraints. About 13% said it was the lack of properly skilled teams.
The Breakfast Insight concluded with remarks from Dan Brassington, who re-emphasised the role of data technology and the need for businesses to begin adapting to it. He urged organisations to become data-driven, digital organisations and advised them to accelerate their digital transformation.
In closing, he clarified that Splunk was not there to sell their products but to assist organisations in leveraging data and the cloud. He invited the delegates to reach out to his team to explore ways they could work together to assist them on their journey.