Telematics has transformed the vehicle insurance industry over the past few years. Insurers are increasingly offering usage-based insurance (UBI) products based on data gathered from GPS-connected devices. Telematics, a portmanteau of telecommunications and informatics, enables the real-time transmission, receipt, and storage of data and information from remote objects such as vehicles. The information can include vehicle location tracking, driver behaviour, maintenance requirements, and data on accidents.
Now, developments in Internet of Things
(IoT) technology present a further opportunity for insurers to move beyond
connected vehicles and offer new insurance propositions in the home, commercial
property, life, and other insurance lines. IoT can connect insurers to the
assets they are insuring, whether that is a vehicle, a property, or a person.
This allows insurers to understand the exact status of any insured items and
potentially take action in response to an abnormal situation, such as detecting
a fire, a vehicle collision, or an abnormal heartbeat.
To take advantage of these next-generation IoT-based insurance propositions, Octo Telematics, a market-leading telematics service provider, developed a new platform, which it called the Next Generation Platform (NGP).
A pioneer in vehicle telematics
Established in Italy in 2002, Octo Telematics was one of the first companies to support insurers in the then-emerging vehicle telematics space. Since then, the company has established itself as the global leader among telematics service providers (TSPs). It holds one of the largest global databases of telematics data, with over 186 billion miles of driving data collected and 438,000 crashes and insurance events analysed (as of 31 December 2017).
Octo Telematics captures a comprehensive set of data from a vehicle, including speed, location, and journey duration, as well as aspects of a driver's behaviour, such as how harshly they accelerate or brake and how quickly they corner. This data is combined with further contextual information, such as local weather conditions, road type, and the current traffic situation, and analysed to provide the insurer with a detailed profile of the true risk posed by a specific driver at any particular time. Using machine learning, the company can build more accurate predictions and risk models, allowing the insurer to calculate a premium that accurately reflects the risk and usage of the individual policyholder.
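The kind of usage-based pricing described above can be illustrated with a minimal sketch. All feature names, weights, and rates below are invented for illustration; they are not Octo Telematics' actual model:

```python
# Hypothetical risk scoring and usage-based pricing sketch.
# Feature names, weights, and rates are illustrative assumptions only.

def risk_score(trip_stats):
    """Combine per-mile behaviour rates into a 0-100 risk score."""
    harsh_rate = trip_stats["harsh_events"] / trip_stats["miles"]      # harsh brakes/accelerations per mile
    night_share = trip_stats["night_miles"] / trip_stats["miles"]      # fraction of miles driven at night
    speeding_share = trip_stats["speeding_miles"] / trip_stats["miles"]
    return 100 * min(1.0, 0.5 * harsh_rate + 0.3 * night_share + 0.2 * speeding_share)

def usage_based_premium(base_premium, trip_stats):
    """Scale a base premium by risk (0.8x safest to 1.5x riskiest) plus a per-mile charge."""
    multiplier = 0.8 + 0.7 * risk_score(trip_stats) / 100
    per_mile_charge = 0.03  # assumed usage charge per mile
    return base_premium * multiplier + per_mile_charge * trip_stats["miles"]

safe_driver = {"miles": 800, "harsh_events": 8, "night_miles": 40, "speeding_miles": 16}
premium = usage_based_premium(500, safe_driver)  # a low-risk profile pays close to the discounted base
```

In practice a telematics provider would fit such weights with machine learning over billions of miles of driving data, rather than hand-setting them as here.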
Telematics also offers the ability to detect crash and claim incidents in real time and to react in the most appropriate way, whether by alerting emergency services or dispatching roadside assistance. The telematics data captured during a claim incident also provides insurers with a precise view of the claim, enabling decisions about liability, the likelihood of fraud, and the estimated cost of repair to be made in a fraction of the time. As claims expenses typically account for between 70% and 80% of an insurer's costs, this can have a significant effect on an insurer's profitability. It can also dramatically improve the customer's experience of the whole process and boost customer retention.
The challenge of dealing with growing volumes and varieties of IoT data
Octo Telematics had developed a proprietary telematics platform that had evolved over the preceding 15 years. However, with the
growth of IoT and an order of magnitude increase in the number and types of
connected sensors, Octo Telematics foresaw that the current platform would
increasingly become a constraint on the company's growth ambitions.
One of the biggest challenges was that the
existing platform was designed around the needs of vehicle telematics and could
not easily accommodate other types of sensors, such as wearables, smart
watches, smart locks, smoke detectors, and surveillance devices, which are
becoming increasingly important components in IoT insurance propositions.
The growth in sensors also implied the need to monitor data from potentially tens of millions of connected devices in the near future. Though the old platform supported over 5 million vehicles, it was reaching the limits of its scalability.
The data management platform also needed to
accommodate a broad range of inbound data types, ranging from real-time streams
from in-vehicle telematics devices to bulk uploads from other sources, such as
third-party weather data. There is also significant variation in the formats of
stream data depending on the application and capabilities of the sensor. This
can vary from relaying a simple journey start and finish time through to
detailed crash reconstruction data from units incorporating sophisticated
sensors, such as six-axis accelerometers with very high sampling rates. This
data diversity will increase further, potentially to include images and video.
The increased number of items being
monitored would also require significant growth in the compute and storage
capability needed to support real-time analysis across many millions of sensors.
The data captured by the platform, whether
from vehicles, properties, or people, needed to be closely coupled to the
incident response and claims processes to enable insurers to offer
policyholders a fully integrated IoT insurance proposition. The existing
platform lacked this high degree of integration.
In addition to accommodating data volumes potentially orders of magnitude greater, the data management and analytics infrastructure would also have to ensure the total security of all inbound data ("data in motion"), as well as of data residing within the platform on disk and other storage media ("data at rest"). The challenge is compounded by the sheer volume of data from connected sensors, numbering in the millions, distributed across a wide geography.
Formulating and implementing a co-innovation approach
Due to the scale, complexity, and
criticality of the development needed to realise the NGP in a time frame that
would allow Octo Telematics to capture the emerging IoT opportunity, the
company decided to adopt a co-innovation development model.
Octo Telematics used its understanding of
the evolving insurance market to define the functional requirements of an NGP
capable of supporting a broad range of new IoT-based insurance propositions.
Key technology partners were identified for
the development of the NGP: Cloudera, Software AG, Salesforce, SAS, and SAP.
Using this co-innovation approach, Octo Telematics and its partners were able
to accelerate the design and implementation of the NGP, delivering a complex
and challenging development project in under 24 months.
The initial phase of formulating the
approach and conducting a dialogue with the partners to refine and improve the
architecture of the NGP took seven months. A jointly agreed co-innovation
roadmap was created. The implementation took 18 months of development, with an
initial prelaunch version being released to key existing Octo Telematics
clients at the end of 2016. Following the beta testing phase, the full
commercial version of the NGP was released in July 2017. All new Octo
Telematics clients are now supported on the NGP, with a migration plan in place
to move the majority of existing clients to the new platform.
11 billion additional data points daily
The resulting NGP enables Octo Telematics to store, process, and analyse data generated by over 5.3 million drivers, totalling 175 billion driven miles and growing by over 11 billion additional data points daily. It also allows complete flexibility in the selection of sensors and in the analysis and output of data for all insurance propositions. The backbone of the NGP is Cloudera's machine learning and analytics platform. The Cloudera Enterprise suite
includes a set of tools to provide security, governance, and workload
management functionality operating within an integrated data and platform
model. The platform provides the underlying infrastructure to ingest, process,
and analyse huge volumes of structured and unstructured data, while being able
to perform analytics on both streaming and static data sources. All inbound
data, data moving between multiple clusters, as well as data stored within the
platform, is encrypted.
A "scale-out" hardware approach was adopted, as opposed to "scale-up". Scaling up adds more resources to the existing nodes of a system, while scaling out adds infrastructure capacity in the form of new nodes, which can be done using commodity on-premises and cloud-based hardware. This avoids the need for investment in expensive high-performance servers, as storage and compute capacity can be added incrementally as demand grows.
The NGP also utilises Cloudera's Shared
Data Experience (SDX)
module to define and enforce unified user and role-based access and security
policies, as well as provide auditing capabilities at the application, cluster,
and environment level.
Using Apache Spark, Octo Telematics is able
to leverage the huge volumes of data, the compute power of multiple clusters,
and a resilient distributed dataset (RDD) structure to quickly implement,
train, and test machine learning models.
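At a small scale, the map/reduce pattern that Spark's RDD API exposes can be sketched in plain Python; in a real deployment the list below would be an RDD (e.g. via `sc.parallelize(events)`) and the aggregation loop a `reduceByKey` distributed across the cluster. The event fields and aggregates are illustrative assumptions, not Octo Telematics' actual pipeline:

```python
from collections import defaultdict

# Each record mimics one telemetry event: (driver_id, speed_mph, harsh_brake_flag).
# Illustrative data only.
events = [
    ("d1", 62, 0), ("d1", 71, 1), ("d1", 55, 0),
    ("d2", 48, 0), ("d2", 50, 0),
]

# "map" step: turn each event into (key, partial aggregate)
mapped = [(driver, (speed, harsh, 1)) for driver, speed, harsh in events]

# "reduceByKey" step: combine partial aggregates per driver
totals = defaultdict(lambda: (0, 0, 0))
for driver, (speed, harsh, n) in mapped:
    s, h, c = totals[driver]
    totals[driver] = (s + speed, h + harsh, c + n)

# Final per-driver features: (mean speed, harsh-brake rate)
features = {d: (s / c, h / c) for d, (s, h, c) in totals.items()}
```

The resulting per-driver features (mean speed, harsh-brake rate) are the kind of aggregates that would then feed the training and testing of risk models.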
These models allow Octo Telematics and its
insurance customers to better understand, model, and price risk, and can form
the core of new innovative insurance products.
Inherent in the Cloudera Enterprise
platform's distributed computing model is the ability to operate the NGP both
on-premises and across private or public clouds. The ability to flexibly use major cloud service providers such as AWS, Google Cloud Platform, and Microsoft Azure means the NGP can support transient but compute-intensive projects, such as testing new pricing algorithms or developing risk models, on a usage-based cost basis.
The NGP has resolved the capacity issues of the previous platform and is now continuously scalable, requiring only additional cloud-based compute and storage resources to accommodate growth.
The enhanced functionality in areas such as
CRM and incident analytics, as well as the increased capacity of the NGP, means
that Octo Telematics can offer all insurance clients detailed, real-time crash
reconstruction capability. This will allow users to drive significant
efficiency improvement in claims processing, identify potential fraud, and
enhance the customer's claims experience.
Octo Telematics' insurance clients also benefit
from the additional functionality of the NGP by being able to introduce new
types of IoT-based insurance products. For instance, one client introduced a
property insurance product that uses a home hub, developed by Octo Telematics,
that is equipped with smoke, heat, flood, and intrusion sensors. Another
insurer introduced a pet insurance product using IoT-based GPS tags worn by the
pet. Yet another is piloting the use of smart watches as part of a health and
life insurance offering.
Furthermore, the NGP is reducing time to
market for new product launches by more than 50%. The time to implement a new
UBI product has been reduced from two to three months to four weeks.
Currently, most insurers implement UBI offerings as stand-alone projects requiring parallel core administration and claims systems. To address the inefficiencies and complications arising from this, Octo
Telematics is working with core insurance software vendors to develop a range
of connectors that will allow direct integration between the NGP and an
insurer's core processing systems. This direct integration will significantly
reduce the cost of entry and complexity for insurers wanting to offer IoT-based insurance products. As of November 2017, Octo Telematics had developed a connector allowing direct integration of the NGP with the policy administration and claims suite of Guidewire, a software provider for property and casualty (P&C) insurers.
Octo Telematics is also looking at adding vertical-specific functionality to the NGP beyond the telematics sector, in support of a wider spectrum of industries such as telecoms, energy, and utilities.
The Enterprise GIS (eGIS) system collates and presents geospatial data from diverse sources in an integrated manner, giving the Police better situational awareness and enabling a quick and informed response. With thousands of real-time GPS data points being ingested into the eGIS system every minute, the Police get a real-time view of events as they happen and can plan resources accordingly, keeping the public and officers safe.
This is a collaboration between the Singapore Police Force (SPF) and the Home Team Science and Technology Agency (HTX). The team received the Esri Special Achievement in GIS Award in July 2020 for its innovative application of mapping and analytics technology, and was the only team from Singapore to win this international award in 2020, out of over 300,000 candidates globally.
In a release, HTX explained that the system contains over 800 disparate data sets, including topographical maps, building locations, navigation routes, street views, terrain, and the locations of Neighbourhood Police Centres and police cameras.
The geospatial data from these diverse sources are layered on a single electronic platform, allowing the Police to see the 'big picture' and immediately identify crime hot spots, map crimes, deploy officers for emergency calls, and make data-driven decisions.
HTX Assistant Chief Executive Tay Yeow Koon said in a release that “With a quantum increase in data and information over the last few years, the Police required a system to help them see the data in a unified manner. The eGIS enables police officers to be ‘smarter’, react faster, and make data-driven decisions to prevent, deter and detect crimes”.
The eGIS platform also enables interconnectivity between systems, helps to develop new capabilities like blue force tracking, provides a visual context to location data and brings insights to the users with geospatial analysis and dashboards. Today, the eGIS exists as a map visualisation in Police command & control systems and other backend operational applications.
“The eGIS is one of the enterprise-wide solutions that HTX worked hand-in-gloves with SPF as an integrated joint ops-tech team to conceptualise, plan and implement capabilities to achieve SPF’s Capability Vision 2025 to safeguard Singapore. This project has also opened up opportunities to introduce the upcoming map hackathon (or internally known as SPF Mapathon) to proliferate the use of GIS among SPF officers to solve real-life problems at work”, explains Bernard Phang, Director/Policing Programme Management Centre (PPMC).
Some of the new capabilities, such as 3D geodata and line-of-sight analysis, were introduced to provide a new visual perspective to SPF officers when planning for significant events such as the National Day Parade. 3D models of buildings, used with line-of-sight analysis, allow the Police to analyse and plan for any visual obstructions before conducting a site survey. A heatmap analysis tool in the system allows ground officers to determine which areas of Singapore they should focus on each day.
Today, the prototype has evolved into an enterprise-level platform serving 16 SPF systems and various users from SPF, SCDF, and HTX. In addition to SPF, SCDF and CNB have also implemented their own eGIS platforms.
OpenGov Asia hosted the third installment of its Virtual Breakfast Insight, "Powering Smarter Data and Resilient Government with Advanced Analytics", on 29th July 2020. The audience comprised senior digital executives from the Indonesian public sector. The session once again witnessed a 100% turnout, with delegates from 16 different government agencies in attendance.
The session was opened by Mohit Sagar, Group Managing Director and Editor-in-Chief at OpenGov Asia. Mohit shared that the whole world came to a sudden halt when COVID-19 hit. Everyone was shocked and scared by the magnitude of its impact. However, governments didn't get a chance to slow down.
In fact, they were the ones who kept nations going by ensuring all necessary services were provided as uninterrupted as possible.
During this process governments collected a lot of data about their citizens’ needs and requirements.
Mohit emphasised that it is imperative for governments to extract relevant insights from this data to identify which services are in greater demand and how best to provide them.
Times like these, he stressed, require strong leadership. Leaders who can recover and respond to the current crisis and also plan for a better future.
He concluded his presentation by highlighting the importance of working with the right partners (both internal and external) who can help recognise the opportunity amid crisis and make the best of it.
Joseph Musolino, Global Sales & Strategy Consultant, Fraud and Security Intelligence for SAS shared his insights with the audience.
He began by sharing an interesting statistic: 61% of organisations surveyed in the last year picked machine learning and AI as the most significant tools for the coming year.
Joseph then elaborated on the numerous challenges that organisations face in making AI and Analytics a part of their current working paradigm.
He then expounded on the various SAS analytical capabilities that can help agencies and organisations overcome the aforementioned challenges and adopt analytics tools quickly.
To validate this, he shared actual examples of the various areas where governments are deploying analytics in serving citizens better: customs, healthcare, taxation and judicial issues.
Dr. Ian Opperman, Chief Executive Officer and Chief Data Scientist of the New South Wales Data Centre, then took over to share his learnings on data sharing during COVID-19.
He began by highlighting privacy concerns as the major issue when it comes to sharing data, especially between government organisations.
Ian further emphasised the importance of the source and context in which data is analysed, i.e. whether it comes from open sources or from a closed and controlled environment.
He shared an actual example of how his agency gained insights from open data sources during the COVID-19 pandemic.
Ian also shared how powerful and useful insights can be when carefully extracted from various open data sources and shared with the relevant parties.
After Dr. Opperman's presentation, the session moved into a more interactive phase, with polling questions addressed to the audience. On the question of the biggest impact COVID-19 had had on their organisation, a majority of the audience voted for increased demand for services with rising expectations from citizens (45%). Another major section voted for disrupted sectors looking to the government to provide innovative policies and processes (35%). A senior executive from the Ministry of Health shared that there has been increased demand for the government to transform digitally and serve citizens better, so the major focus and challenge has been the digital transformation of the government.
On the next question of how the pandemic has changed the functioning of agencies and departments, delegates responded with several interesting reflections. Pertinent to the topic of discussion, a majority of the audience were of the view that they have become dependent on data and analytics to make decisions (40%).
A delegate from the Ministry of Finance shared that he voted for the above option as a result of his personal experience.
He elaborated that he is heavily involved in policy development to overcome the challenges during COVID-19. In so doing, he has realised that data is of paramount importance when it comes to making well informed decisions. And analytics is a powerful tool for drawing useful insights from data.
On the question, “have advanced analytics and AI become a higher priority for your organisation as a result of the COVID-19 pandemic”, the audience was equally split between strongly agree (45%) and agree (45%).
A delegate from the Ministry of Education and Culture shared that there is an urgent need to make analytical tools a higher priority. During critical times like a pandemic, making the right choice of what is best for the citizens and students can be hard. Analytics can play a vital role in evaluating the various options and choosing the best out of them.
The session concluded with closing remarks by Febrianto Sibaro. He expressed his gratitude towards the delegates for attending the event and sharing their insights.
The delegates acknowledged that they gained much more information about data analytics and how it can improve their day-to-day work in serving citizens better.
The New Zealand government has launched a set of standards designed to act as a guideline for government agencies on how to use algorithms.
The new Algorithm Charter is the first of its type. According to New Zealand’s Minister of Statistics, the charter will help to improve data transparency and accountability.
The Algorithm charter for Aotearoa New Zealand demonstrates a commitment to ensuring New Zealanders have confidence in how government agencies use algorithms. The charter is one of many ways that government demonstrates transparency and accountability in the use of data.
This is most notably observed when algorithms are used to process and interpret large amounts of data. Using algorithms for data analysis and decision-making carries risks. The charter will help determine whether algorithms are being used in a fair, ethical, and transparent way.
So far, 21 agencies have signed the charter, including the Department of Corrections, the Ministry of Education, and the Ministry for the Environment.
In it, departments pledge to be publicly transparent about how decision-making is driven by algorithms, including giving "plain English" explanations, and to make available information about the processes used and how data is stored, unless forbidden by law. By signing the charter, these agencies have agreed to commit to a range of measures, such as explaining how decisions are informed by algorithms; making sure data is fit for purpose by identifying and managing biases; and ensuring that privacy, ethics, and human rights are maintained.
The development of the charter was recommended by the New Zealand government chief data steward and chief digital officer who said that safe and effective use of operational algorithms required more attention and greater consistency across the New Zealand government.
The recommendation followed calls for the New Zealand government to account for how its agencies were using algorithms to analyse data.
There were claims that New Zealand government agencies were potentially using citizen data collected through the country's visa application process to identify those breaching their visa conditions by filtering people based on their age, ethnicity, and gender.
A former New Zealand Immigration minister originally rejected the idea, stating that immigration looks at a range of issues, such as people who have made multiple visa applications and had them rejected.
He said that it looks at people who place the greatest burden on the health system and on the criminal justice system, and uses that data to prioritise those people. It is important that the integrity of New Zealand's immigration system is protected and that immigration resources are used as effectively as possible.
Departments committed to the charter included New Zealand’s accident compensation scheme – which was criticised in 2017 for using algorithms to detect fraud among those on its books – and the corrections agency, which has deployed algorithms to determine an inmate’s risk of reoffending. The immigration agency, found in March to be profiling applicants by algorithm, is also a signatory.
The New Zealand government added that the algorithm charter would evolve and will be reviewed next in 12 months to make sure that it has achieved its intended purpose without creating unnecessary burden or halting progress.
Agencies must also consider te ao Māori, or Indigenous, worldviews on data collection and consult with the groups affected by their algorithms. In New Zealand, Māori are disproportionately represented in the justice and prison systems.
In its endeavour to explore and understand how governments around the world are tackling the pandemic, OpenGov Asia organised another Virtual Breakfast Insight.
The session was held on 24th July 2020 with public sector agencies in Singapore to understand how they have adapted to these unexpected times.
Singapore’s government has been driving the adoption of digital and smart technologies throughout the city state as a part of its Smart Nation initiative well before COVID-19 hit the world.
Keeping up with the trend of the series, the session saw a 100% turnout from public sector executives in Singapore.
The session was opened by Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia.
Mohit emphasised the critical role that governments are playing at this point and the enormous amount of data they are producing in this process.
In the digital era, data is unequivocally the new currency; so, it is very important to understand how this data is managed.
Apart from responding to the current situation and recovering from it, governments also have to plan for a secure future.
Mohit advised the audience to also look for opportunity in crises and to collaborate with partners who have a similar vision and who add value to their organisations.
After the opening, Remco den Heijer, Vice President, ASEAN, SAS, addressed the audience. He began by briefly introducing his organisation and its mission of improving lives by making better decisions.
SAS supports better decision-making by providing organisations with software that helps them manage their data and analytics.
Remco shared interesting examples of how data and real-time analytics streams helped governments and the public stay updated with developments around the world over the last four months. As per his observation, governments around the world are tackling the COVID-19 pandemic in a phased approach resting on three basic pillars.
He concluded by highlighting the vital role of Data and Analytics tools in helping governments reimagine the world post the pandemic and emerge stronger and better able to meet the needs of our citizens.
Joseph Musolino, Global Sales and Strategy Consultant, Fraud Security Intelligence, SAS then shared his insights.
He appreciated countries like Singapore that are at a relatively mature stage in their digital transformation journey. For these countries, adopting AI and Analytics is not the challenge. For them the challenge is to deploy it fast and make it more effective.
Joseph opined that the focus for these countries should now be to take AI and Analytics to enterprises and make it faster and easier to deploy.
He highlighted some of the areas where governments are currently deploying advanced analytics to strengthen their delivery mechanisms. These include customs, pandemic response, medical services, taxation, and judicial systems.
To give the audience a detailed understanding of how the theory plays out, he demonstrated real-life situations where analytics helped governments serve citizens better.
He concluded by informing delegates about SAS's new platform, a step forward into next-generation analytics.
After these rich insights, Jeanne Holm, Chief Data Officer and Senior Technology Advisor to the Mayor of the City of Los Angeles, took the virtual stage. Jeanne shared a first-hand account of how governments can use predictive data analytics during critical times and for operations in general. She explained that the administration in LA is using advanced analytics for two major purposes:
- observing real-time data for city management
- predictive analytics that supports the recovery mentioned earlier and helps reimagine the city
The LA mayor’s office uses integrated data sets from different sources to have an overall view and make better informed decisions.
Communication of this information to the people is also a major priority of the office and they utilise technology to enable that.
Jeanne shared some of the technology-driven initiatives by the LA government that are serving people better during the pandemic and into the future:
- Angeleno App: a single portal through which people can access any city service and make e-payments.
- Shake Alert LA: a warning system that quickly sends out alerts during an earthquake and informs residents of its magnitude and intensity.
- Augmented reality public park games: letting LA residents, especially younger kids, visit zoos and parks virtually and learn from them while staying safe.
- Predicting what we breathe: a program that uses machine learning to understand urban air quality from satellite and ground data.
- Autonomous piloting on slow streets: enabling safe autopiloting on certain low-traffic streets, determined from real-time satellite and ground data.
- Data Science Federation: open data and technology is a team sport, and it is important that everyone in this ecosystem works towards the same goals. The federation brings together government and educational institutions working on technology.
After Jeanne's presentation, the session took a more interactive form, with polling questions for the audience. On the question of how the COVID-19 pandemic changed the way their department or agency functions, a major part of the audience voted for "more reliant on social/communication technology" (42%).
On the next question of which of the capabilities will be most useful if a situation like COVID-19 occurred in the future, a majority of participants voted for “better understanding of critical operational processes and human capital required to keep government and healthcare operations running during the lockdown” (52%).
The session also featured a demonstration of a SAS knowledge management solution that stores and catalogues information about the analytical assets of an organisation.
This demonstration helped the audience better understand how easy and useful these applications are in practice and the ways in which they can make their work more effective.
To give context to the solution demonstration, Mohit put a few questions to the audience, which brought out some interesting findings.
On the question of what the typical challenge is in starting a data science project, the audience was split between "lack of understanding of the available data assets" (31%) and "lack of a collaborative environment to support team effort" (36%).
The chief data officer from a government agency reflected that technology is rarely an issue when it comes to implementing data science projects. The challenge has more to do with organisational culture. Traditionally, people look at data as a mere record that needs to be stored; they do not understand that effective data management can help them solve several of their routine problems.
There is an urgent need to make data literacy part of the organisation's culture and make people realise the potential of data.
The session concluded with closing remarks by Joseph, who expressed gratitude to all the attendees for sharing their thoughts and experiences. The well-aligned content throughout the session will go a long way towards encouraging the delegates to utilise data and analytics in their work.
The year 2020 has left us all changed forever. The pandemic has impacted every aspect of our society and personal lives.
As the people struggle to cope, governments and public sector agencies across the globe are working relentlessly to remain sustainable and functional.
The public sector has not had any respite during the last few months. In fact, in its efforts to deal with the present and be better prepared for the future, it has redoubled its focus on digital transformation and resilience. OpenGov Asia hosted the second in its series of OpenGov Live! Virtual Breakfast Insights to discuss what "Powering Smarter and Resilient Government with Advanced Analytics and AI" entails.
The virtual session saw a full house, with public sector executives from various Malaysian agencies in attendance.
Technology is a powerful tool that can lead us towards a better future
Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, set the tone for the discussion by highlighting the pain and pressure on governments over the last four months.
Governments are dedicating all their energy and resources to providing whatever is necessary to maximise the wellbeing of their citizens.
Even as governments have been working hard to protect residents, they have been collecting huge volumes of data from the general public.
This data has been used for logistical purposes, for tracking and tracing, for infection control and for overall citizen care.
In such times, Mohit said, the role of leaders becomes very important. Leaders not only manage the response to the crisis but also the recovery and path to a better future.
Technology is a powerful tool to communicate and connect with people and has the potential to lead to a better life ahead.
Mohit concluded his opening on an optimistic note, pointing out that in a crisis, danger and opportunity coexist.
With the right mindset and the right people around, one can capitalise on the opportunities that present themselves.
Data Analytics can help government agencies plan better during stressful times
After the opening, the stage was taken by Remco den Heijer, Vice President – ASEAN, SAS.
Remco briefly introduced the audience to SAS and its mission of improving lives through better decisions.
He then shared his observations about the phased approach being followed by governments in tackling the pandemic and how SAS data analytics tools support that.
Remco presented the three stages he highlighted: Respond, Recover and Reimagine.
To expound on the Respond Stage, he talked about collaboration with governments in India and the United States to predict their medical infrastructure requirements using SAS’s data analytics.
Similarly, during the Recovery Stage, SAS worked closely with government agencies to prepare them as they were getting ready to reopen their economies.
Data analytics helped governments determine their revenue streams, expenses and the distribution patterns of stimulus packages.
Remco closed by saying that SAS technology can help governments Reimagine the future – to be better prepared for the next emergency well in advance, avoiding loss of lives and resources.
Data is the biggest commodity that governments have today
The next presenter was Joseph Musolino, Global Sales and Strategy Consultant for Fraud & Security Intelligence, SAS.
Joseph began by emphasizing the importance of data analytics as a tool that can yield valuable insights into the environment and the need of the hour.
He further emphasized the relevance of data analytics in the light of the pandemic, tying it to the three stages of Respond, Recover and Reimagine.
With governments investing billions of dollars in stimulus packages to support their citizens, it becomes imperative to make sure the money reaches the people who really need it.
Advanced data analytics is a useful tool that can help governments identify the needs of their people and fulfil them by accurately allocating resources.
Analytics can also help governments manage their revenue streams and distribute aid to people as they enter the Reimagine stage.
He concluded by advising delegates to learn a lesson from this pandemic and be well prepared for the next critical event that might hit us.
Analytics can help government get useful insights by measuring the impact of their response to the pandemic
Chris Buxton, Chief Digital Officer, Stats New Zealand shared his insights on the topic.
He began by talking about the three-horizon model by McKinsey and contextualising it to New Zealand’s response to the pandemic.
Horizon One is concerned with seeing the way the pandemic has impacted the nation. In this horizon, data and analytics tools help the government measure the virus’ impact and suitably plan its response.
Horizon Two is more focused on measuring the impact of the government’s initiatives.
The data related to this horizon helps the government identify the areas that need help more than others and then act accordingly.
Horizon Three is focused on getting the country back up to the speed it was at before.
Monitoring long-term data indicators like employment rates, GDP, etc. will allow governments to decide their course of action in the years to come.
With this, the session moved into an interaction with participants as they discussed various polling questions around the topic.
On the question of what was most impacted in their respective organisations due to COVID-19, the audience was fairly evenly split.
They were divided between increased demand for services with rising expectations from citizens (32%) and workforce planning and the need to test the resilience of working remotely in both the short and long term (32%).
The Director of Public Services, GLCs and Telco Sales for SAS Malaysia shared that in the current stage of the pandemic, organisations are already in reset mode. At this point, all areas mentioned in the poll question are equally impacted and need attention. Had this question been asked at the beginning of the pandemic, the answer would have been different.
On the question of which area their organisation needs to develop most to respond more efficiently to the next COVID-19-like crisis, the delegates seemed divided between integrated operations models to keep the government running efficiently and sustainably (40%) and the use of data and analytics to improve situational awareness for real-time decision-making (37%).
A senior technical delegate from MIMOS explained that data sharing among government agencies and organisations was a big challenge. For one, systems are not integrated; secondly, there are several restrictions on data sharing. This, he felt, is one area that needs more attention.
On the final question, on the biggest risk when spending billions on stimulus packages during the COVID-19 pandemic, almost half the delegates voted that the money would not be disseminated to the right citizens or businesses that need it the most.
A senior delegate from a social security organisation reflected that one of the major lacunae in the distribution of the stimulus packages was that people who were not affiliated to an organisation or not contributing to a social security fund were deprived of the benefits of the government’s stimulus package.
The session concluded with remarks by Nik Ariff Nik Omar, Director of Public Services, GLCs and Telco Sales for SAS Malaysia.
It was very interesting to learn that about 25% of the attendees were already using the analytical and AI tools discussed during the session.
Nik also encouraged the rest to make data analytics an integral part of their work to enhance their delivery to citizens.
He thanked all the delegates for taking time out and wished them the best on their journey back to normal.
It has been highlighted over the past few months that in cities, outbreaks or clusters of COVID-19-positive individuals can grow very fast in heavily populated, built-up areas.
Tracking how people move around urban areas can pinpoint where disease might transmit fastest and farthest.
Places with large gatherings of people have seen high infection rates – a few notable church gatherings, outbreaks from people socialising in nightclubs and restaurants, as well as outbreaks in residential blocks.
Governments have the task of predicting which places have the highest probability of spreading the disease, and they need to be equipped with the tools and technology to help them do this.
Big-data studies of human mobility need to be combined with epidemiological models. And the demographic profiles of people coming into contact at any particular location need to be included.
In many cities, the details of everyday interactions are not documented well enough to model risk factors accurately, as experiences with COVID-19 show. Resorts, conferences, religious gatherings and workplaces have all experienced notable outbreaks.
Groups living in close proximity are at very high risk. Almost 93% of Singapore’s COVID-19 cases in the first 48 days occurred in dormitories for migrant workers.
Each block houses hundreds or thousands of workers. Cases there increased rapidly in a very short space of time to more than 40,000, or more than 12% of that population, compared with fewer than 2,600 infections elsewhere in the city-state of 5.3 million people.
Mapping the Spread of COVID-19
A model of disease spread can be built and refined as data and knowledge on human flows improve, on three levels. First, city-wide: a map that highlights the main flows of people throughout the city. Second, the busiest locations and busiest timings should be mapped. Third, the demographics and types of human interactions should be recorded.
By combining all these insights, governments will be better able to anticipate superspreading locations and target precautionary measures, such as delaying reopening businesses, quarantining arrivals, tightening crowd control and intensifying cleaning and disinfection in particular places.
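As an illustration only – this sketch does not come from any of the governments or vendors mentioned – combining the three levels above could be as simple as a multiplicative risk score per location, where footfall stands in for city-wide flows, peak crowd density for the busiest locations and timings, and typical contact duration for the types of interaction. All names and parameter values here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Location:
    name: str
    daily_footfall: int          # level 1: volume of city-wide flow through this site
    peak_density: float          # level 2: people per square metre at the busiest time
    avg_contact_minutes: float   # level 3: typical face-to-face interaction length

def superspreading_risk(loc: Location, base_rate: float = 0.01) -> float:
    """Toy risk score: expected transmissions scale with footfall,
    crowd density and contact duration."""
    return loc.daily_footfall * loc.peak_density * loc.avg_contact_minutes * base_rate

sites = [
    Location("transport hub", 50_000, 4.0, 3.0),
    Location("suburban park", 2_000, 0.2, 10.0),
]

# Rank locations so precautionary measures can be targeted at the riskiest first
ranked = sorted(sites, key=superspreading_risk, reverse=True)
```

A real model would replace the crude product with a calibrated epidemiological term, but the ranking step – score every location, then concentrate crowd control, cleaning and reopening delays at the top of the list – is the part this passage describes.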
How Governments Can Tap Into Pre-Existing Resources
All sources of data on human mobility need to be tapped. For example, ‘smart’ cities such as Singapore have networks of cameras on lamp posts to track traffic flows. These could be reconfigured to track the density and mixing of people anonymously.
Data from geolocation and contact-tracing apps can map where people go, who they interact with and for what length of time.
Funding agencies can help fund studies of human movement and interactions in key superspreading locations such as transport hubs.
Urban analysts and modellers need to study and document the types of face-to-face interactions, networks and crowd mixing.
Governments should use these data and models to target their public-health strategies. More effective targeting of measures will help to avoid ‘virus fatigue’ among the public and help education and the economy by allowing places to minimize the risks of some kinds of reopening.
The Selangor government set up a task force to monitor, oversee and educate the public on the transmission of Covid-19 in the Malaysian state back in March 2020.
The task force is led by former Health Minister, Datuk Seri Dr Dzulkefly Ahmad and assisted by four professional individuals who had previously worked with him at the ministry, as well as representatives from the Selangor Health Department.
Analytical platforms using Big Data Analytics and Machine Intelligence have been set up by the Selangor state government to assist its state Task Force for Covid-19 (STFC) in battling the coronavirus outbreak.
STFC chairman and former health minister Datuk Seri Dzulkefly Ahmad said the introduction of the advanced computerised systems is to allow his task force to offer more informed recommendations and evidence-based advice to the Chief Minister when undertaking local interventional measures in handling the outbreak.
“As a guide to the Selangor Chief Minister who directs the state machinery in undertaking localised interventional measures, STFC is determined to put forward all recommendations and necessary actions in line with scientific proof and the latest information.
“To achieve this purpose, an analytical platform compatible with Big Data Analytics and Machine Intelligence has been developed so that STFC’s recommendations can be backed up by the latest scientific research, which is ever changing during the period of this epidemic.”
He explained the use of Big Data and Machine Learning would then allow the state to conduct communal screenings and contact tracing in a more efficient and organised manner.
Successfully Using Tech at a Local Level to Fight COVID-19
In a recent Health Summit online panel, Dr Helmi, Head of the Digital Epidemiology and Data Analytics portfolio in the Selangor Task Force for COVID-19, spoke of how the team delivered two solutions: a population and location risk-ranking analytics platform for COVID-19, and a digital contact-tracing initiative leveraging QR technology.
The team created a QR code system for digital contact tracing that does not require the sharing of personal data. The telephone number is the only data captured, an approach users found highly acceptable. Each shop and premises is given a unique QR code, which visitors scan when entering.
Out of Selangor state’s 6.2 million population, an impressive 5.9 million, or about 95%, used the codes. If an establishment is linked to a COVID-19 case, all its visitors can be traced and contacted, helping to slow transmission.
As of now, store visits have totalled some 30 million, and the team has managed to detect 225 patients and 1,075 contacts.
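A minimal sketch of how such QR check-in data could support tracing – assuming, as described above, that only a phone number and a time are captured per scan. The function names, data layout and two-hour contact window are hypothetical, not the task force’s actual implementation.

```python
from collections import defaultdict

# premises QR code -> list of (phone_number, check_in_hour) records
check_ins: dict = defaultdict(list)

def scan(qr_code: str, phone: str, hour: int) -> None:
    """Record a visitor scanning a premises' unique QR code on entry."""
    check_ins[qr_code].append((phone, hour))

def trace_contacts(infected_phone: str, window_hours: int = 2) -> set:
    """Find phone numbers that checked into the same premises within
    window_hours of any of the infected person's check-ins."""
    contacts = set()
    for visits in check_ins.values():
        infected_hours = [h for p, h in visits if p == infected_phone]
        for p, h in visits:
            if p != infected_phone and any(abs(h - ih) <= window_hours for ih in infected_hours):
                contacts.add(p)
    return contacts

scan("SHOP-001", "012-111", 9)
scan("SHOP-001", "012-222", 10)   # same shop, one hour after the first visitor
scan("SHOP-002", "012-333", 9)    # different premises, never overlaps

contacts = trace_contacts("012-111")  # -> {"012-222"}
```

Because only a phone number keys each record, the lookup yields exactly the list of numbers to contact when an establishment is linked to a case, without holding any other personal data.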
State Government Efforts to Complement Efforts at Federal Level
Datuk Seri Dzulkefly Ahmad has said the intelligence gathered by the STFC would ultimately complement the ongoing efforts being taken at the federal level.
“By supplying a little expertise and experience from the members of STFC, we are committed to raising the ability of the state of Selangor, and among others, submit recommendations to the State Health Department to contain the outbreak of the SARS CoV-2 virus, as well as reducing the number of Covid-19 patients in Selangor.
“The state government acts within the framework of complementing the efforts of the Health Ministry, who are the Custodians of Health for the entire Malaysia,” Datuk Seri Dzulkefly Ahmad said.
Since Selangor is the most populous state in Malaysia and a technology front-runner, its success has paved the way for the rest of the country. The QR approach was replicated by the federal government and then nationwide. The team’s area profiling and risk app enabled active detection, which has also been adopted throughout Malaysia.