Telematics has transformed the vehicle insurance industry over the past few years. Insurers are increasingly offering usage-based insurance (UBI) products based on the data gathered from GPS-connected devices.
Telematics, a portmanteau of telecommunications and informatics, enables the real-time transmission, receipt, and storage of data from remote objects such as vehicles. This information can include vehicle location tracking, driver behaviour, maintenance requirements, and data on accidents.
Now, developments in Internet of Things
(IoT) technology present a further opportunity for insurers to move beyond
connected vehicles and offer new insurance propositions in the home, commercial
property, life, and other insurance lines. IoT can connect insurers to the
assets they are insuring, whether that is a vehicle, a property, or a person.
This allows insurers to understand the exact status of any insured items and
potentially take action in response to an abnormal situation, such as detecting
a fire, a vehicle collision, or an abnormal heartbeat.
To take advantage of these next-generation IoT-based insurance propositions, Octo Telematics, a market-leading telematics service provider, developed a new platform, which it called the Next Generation Platform (NGP).
A pioneer in vehicle telematics
Established in Italy in 2002, Octo Telematics was one of the first companies to support insurers in the then-emerging vehicle telematics space. Since then, the company has established itself as the global leader among telematics service providers (TSPs). It has one of the largest global databases of telematics data, with over 186 billion miles of driving data collected and 438,000 crashes and insurance events analysed (as of 31 December 2017).
Octo Telematics captures a comprehensive
set of data from a vehicle, including the speed, location, and journey
duration, as well as aspects of a driver's behavior, such as how harshly they
accelerate or brake and how quickly they corner. This data is combined with
further contextual information, such as the local weather conditions, road
type, and current traffic situation, and analysed to provide the insurer with a
detailed profile of the true risk posed by a specific driver at any particular time. By applying machine learning to this data, the company can build more accurate predictions and risk models, allowing the insurer to calculate a premium that accurately reflects the risk and usage of the individual policyholder.
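The pricing flow described above — weighting behaviour events, adjusting for context, and mapping the result to a premium — can be sketched in miniature. This is a hypothetical Python illustration: the event names, weights, multipliers, and premium formula are invented for the example and are not Octo Telematics' actual model.

```python
# Illustrative only: score a driver's risk from telematics events plus
# context. All names, weights, and formulas are invented for the example.

BEHAVIOUR_WEIGHTS = {
    "harsh_brake": 3.0,
    "harsh_acceleration": 2.0,
    "fast_cornering": 1.5,
}

CONTEXT_MULTIPLIERS = {
    "rain": 1.3,           # wet roads amplify the risk of every event
    "night": 1.2,
    "heavy_traffic": 1.1,
}

def risk_score(events, context, miles_driven):
    """Weighted events per 100 miles, amplified by driving context."""
    raw = sum(BEHAVIOUR_WEIGHTS.get(e, 0.0) for e in events)
    for condition in context:
        raw *= CONTEXT_MULTIPLIERS.get(condition, 1.0)
    return raw / max(miles_driven, 1) * 100

def usage_based_premium(base_premium, score):
    """Scale a base premium by the individual risk score, capped at 2x."""
    return round(base_premium * min(1.0 + score / 100, 2.0), 2)

events = ["harsh_brake", "harsh_brake", "fast_cornering"]
score = risk_score(events, context=["rain"], miles_driven=250)
premium = usage_based_premium(500.0, score)
```

A real model would be learned from billions of miles of labelled driving data rather than hand-set weights; the sketch only shows how behavioural and contextual signals combine into a per-policyholder price.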
Telematics also offers the ability to
detect and respond to crash and claim incidents in real time, and to react in
the most appropriate way, by alerting emergency services or dispatching
roadside assistance. The telematics data captured during a claim incident also
provides insurers with a precise view of a claim, enabling decisions about
liability, the likelihood of fraud, and estimated cost of repair to be made in
a fraction of the time. As claims expenses typically account for between 70%
and 80% of an insurer's costs, this can have a significant effect on an
insurer's profitability. In addition, it can dramatically improve the
customer's experience of the whole process, and boost customer retention.
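Real-time crash detection of the kind described can be sketched as a simple threshold over an accelerometer stream. This is a deliberately simplified assumption for illustration — production telematics units combine far richer signals — and the 8 g threshold and sample format are invented for the example.

```python
# Illustrative crash-detection rule over an accelerometer stream.
# The 8 g threshold and the sample format are assumptions for the
# example, not the logic of any real telematics unit.
import math

CRASH_G_THRESHOLD = 8.0  # peak force (in g) above which we flag a crash

def detect_crash(samples):
    """samples: (x, y, z) accelerations in g from a 3-axis sensor.
    Returns the peak magnitude and whether it crossed the threshold."""
    peak = 0.0
    for x, y, z in samples:
        peak = max(peak, math.sqrt(x * x + y * y + z * z))
    return peak, peak >= CRASH_G_THRESHOLD

def respond(crash_detected):
    """Choose the real-time response: alert emergency services on a
    suspected crash, otherwise take no action."""
    return "alert_emergency_services" if crash_detected else "no_action"

normal_drive = [(0.1, 0.0, 1.0), (0.3, 0.1, 1.0)]  # gentle driving
collision = [(0.2, 0.1, 1.0), (9.5, 2.0, 1.1)]     # sudden spike
```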
The challenge of dealing with growing volumes and variety of IoT data
Octo Telematics had developed a proprietary
telematics platform that had evolved over the last 15 years. However, with the
growth of IoT and an order of magnitude increase in the number and types of
connected sensors, Octo Telematics foresaw that the current platform would
increasingly become a constraint on the company's growth ambitions.
One of the biggest challenges was that the
existing platform was designed around the needs of vehicle telematics and could
not easily accommodate other types of sensors, such as wearables, smart
watches, smart locks, smoke detectors, and surveillance devices, which are
becoming increasingly important components in IoT insurance propositions.
The growth in sensors also implied the need
to monitor data from potentially tens of millions of connected devices in the
near future. Though the old platform was able to support over 5 million vehicles, it was reaching the limits of its scalability.
The data management platform also needed to
accommodate a broad range of inbound data types, ranging from real-time streams
from in-vehicle telematics devices to bulk uploads from other sources, such as
third-party weather data. There is also significant variation in the formats of
stream data depending on the application and capabilities of the sensor. This
can vary from relaying a simple journey start and finish time through to
detailed crash reconstruction data from units incorporating sophisticated
sensors, such as six-axis accelerometers with very high sampling rates. This
data diversity will increase further, potentially to include images and video.
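One common way to handle such format diversity is to normalise every device-specific payload into a shared event schema at ingestion. The sketch below is a hypothetical Python illustration; the device types and field names are invented, not Octo Telematics' actual formats.

```python
# Illustrative normalisation of heterogeneous sensor payloads into a
# single event schema. Device types and field names are invented.

def normalise(payload):
    """Map a device-specific payload onto a common event record."""
    device = payload["device_type"]
    if device == "basic_obd":        # relays only journey start/finish
        return {"kind": "journey", "start": payload["trip_start"],
                "end": payload["trip_end"], "samples": None}
    if device == "crash_recorder":   # six-axis unit, high sampling rate
        return {"kind": "crash", "start": payload["t0"],
                "end": payload["t1"], "samples": payload["accel_series"]}
    raise ValueError(f"unknown device type: {device}")

records = [
    {"device_type": "basic_obd", "trip_start": 1000, "trip_end": 2800},
    {"device_type": "crash_recorder", "t0": 1500, "t1": 1502,
     "accel_series": [(0.1, 0.0, 1.0, 0.0, 0.0, 0.0)]},
]
normalised = [normalise(r) for r in records]
```

Downstream analytics can then treat every record uniformly, regardless of how capable the originating sensor was.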
The increased number of items being
monitored would also require significant growth in the compute and storage
capability needed to support real-time analysis across many millions of sensors.
The data captured by the platform, whether
from vehicles, properties, or people, needed to be closely coupled to the
incident response and claims processes to enable insurers to offer
policyholders a fully integrated IoT insurance proposition. The existing
platform lacked this high degree of integration.
In addition to accommodating data volumes potentially orders of magnitude greater than today's, the data management and analytics infrastructure would also have to ensure the total security of all inbound data ("data in motion") as well as of data residing within the platform on disk and other storage media ("data at rest"). The challenge
is compounded by the sheer volume of data from connected sensors, numbering in
the millions, distributed across a wide geography.
Formulating and implementing a co-innovation approach
Due to the scale, complexity, and
criticality of the development needed to realise the NGP in a time frame that
would allow Octo Telematics to capture the emerging IoT opportunity, the
company decided to adopt a co-innovation development model.
Octo Telematics used its understanding of
the evolving insurance market to define the functional requirements of an NGP
capable of supporting a broad range of new IoT-based insurance propositions.
Key technology partners were identified for
the development of the NGP: Cloudera, Software AG, Salesforce, SAS, and SAP.
Using this co-innovation approach, Octo Telematics and its partners were able
to accelerate the design and implementation of the NGP, delivering a complex
and challenging development project in under 24 months.
The initial phase of formulating the
approach and conducting a dialogue with the partners to refine and improve the
architecture of the NGP took seven months. A jointly agreed co-innovation
roadmap was created. The implementation took 18 months of development, with an
initial prelaunch version being released to key existing Octo Telematics
clients at the end of 2016. Following the beta testing phase, the full
commercial version of the NGP was released in July 2017. All new Octo
Telematics clients are now supported on the NGP, with a migration plan in place
to move the majority of existing clients to the new platform.
11 billion additional data points daily
The resulting NGP enables Octo Telematics
to store, process, and analyse data generated by over 5.3 million drivers
totaling 175 billion driven miles, a volume that grows by over 11 billion additional data points daily. It also allows complete flexibility in the selection of sensors and in the analysis and output of data for all insurance and related applications.
The backbone of the NGP is Cloudera's machine learning and analytics platform. The Cloudera Enterprise suite
includes a set of tools to provide security, governance, and workload
management functionality operating within an integrated data and platform
model. The platform provides the underlying infrastructure to ingest, process,
and analyse huge volumes of structured and unstructured data, while being able
to perform analytics on both streaming and static data sources. All inbound
data, data moving between multiple clusters, as well as data stored within the
platform, is encrypted.
A "scale-out" hardware approach was adopted, as opposed to "scale-up". Scaling up adds more resources to the existing nodes of a system, while scaling out adds capacity in the form of new nodes, which can be done using commodity on-premises and cloud-based hardware. This avoids the need for investment in expensive high-performance servers, as storage and compute capacity can be expanded incrementally as demand grows.
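The scale-out idea can be illustrated with a toy sharding scheme: sensors are hashed onto whichever nodes exist, so adding commodity nodes spreads the same load more thinly. This is a simplified sketch with invented node counts; real platforms use consistent hashing so that adding a node moves only a fraction of the data.

```python
# Toy illustration of scale-out: sensors are hashed across commodity
# nodes, so capacity grows by adding nodes rather than upgrading one
# large server. Node counts and sensor IDs are invented for the example.
import hashlib

def assign_node(sensor_id, node_count):
    """Deterministically map a sensor to one of node_count nodes."""
    digest = hashlib.sha256(sensor_id.encode()).hexdigest()
    return int(digest, 16) % node_count

def load_per_node(sensor_ids, node_count):
    """Count how many sensors each node ends up serving."""
    counts = [0] * node_count
    for sid in sensor_ids:
        counts[assign_node(sid, node_count)] += 1
    return counts

sensors = [f"sensor-{i}" for i in range(10_000)]
before = load_per_node(sensors, 4)   # four nodes in the cluster
after = load_per_node(sensors, 8)    # scale out to eight nodes
```

Doubling the node count roughly halves the per-node load, which is why commodity hardware can keep pace with sensor growth.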
The NGP also utilises Cloudera's Shared
Data Experience (SDX)
module to define and enforce unified user and role-based access and security
policies, as well as provide auditing capabilities at the application, cluster,
and environment level.
Using Apache Spark, Octo Telematics is able
to leverage the huge volumes of data, the compute power of multiple clusters,
and the resilient distributed dataset (RDD) structure to quickly implement,
train, and test machine learning models.
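Conceptually, the train-and-test loop looks like the plain-Python sketch below; Spark would run the same map and filter steps in parallel over an RDD partitioned across the cluster. The single "harsh event rate" feature, the threshold model, and the synthetic data are invented for the example.

```python
# Plain-Python sketch of training and testing a driver-risk classifier.
# On Spark the same map/filter steps would run over a distributed RDD;
# the single "harsh event rate" feature and the data are invented.
import random

def train(rows):
    """Fit a one-feature threshold model: the midpoint between the mean
    harsh-event rates of claim and no-claim drivers."""
    claims = [r["harsh_rate"] for r in rows if r["claim"]]
    clean = [r["harsh_rate"] for r in rows if not r["claim"]]
    return (sum(claims) / len(claims) + sum(clean) / len(clean)) / 2

def predict(threshold, row):
    """Flag a driver as claim-likely if their rate exceeds the threshold."""
    return row["harsh_rate"] >= threshold

random.seed(7)  # synthetic data: risky drivers brake harshly more often
data = [{"harsh_rate": random.gauss(8 if claim else 2, 1), "claim": claim}
        for claim in [True, False] * 500]
random.shuffle(data)
split = int(len(data) * 0.8)          # 80/20 train/test split
threshold = train(data[:split])
test_set = data[split:]
accuracy = sum(predict(threshold, r) == r["claim"]
               for r in test_set) / len(test_set)
```

Holding out a test set in this way is what lets the models be validated before they feed a live pricing decision.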
These models allow Octo Telematics and its
insurance customers to better understand, model, and price risk, and can form
the core of new innovative insurance products.
Inherent in the Cloudera Enterprise
platform's distributed computing model is the ability to operate the NGP both
on-premises and across private or public cloud. The ability to flexibly use
major cloud service providers such as AWS, Google Cloud Platform, and Microsoft Azure means the NGP can support transient but compute-intensive projects, such as testing new pricing algorithms or developing risk models, on a usage-based cost basis.
The NGP has resolved the capacity issues of
the previous platform and is now continuously scalable. It will only require
additional cloud-based compute and storage resources to accommodate the growth.
The enhanced functionality in areas such as
CRM and incident analytics, as well as the increased capacity of the NGP, means
that Octo Telematics can offer all insurance clients detailed, real-time crash
reconstruction capability. This will allow users to drive significant
efficiency improvement in claims processing, identify potential fraud, and
enhance the customer's claims experience.
Octo Telematics' insurance clients also benefit
from the additional functionality of the NGP by being able to introduce new
types of IoT-based insurance products. For instance, one client introduced a
property insurance product that uses a home hub, developed by Octo Telematics,
that is equipped with smoke, heat, flood, and intrusion sensors. Another
insurer introduced a pet insurance product using IoT-based GPS tags worn by the
pet. Yet another is piloting the use of smart watches as part of a health and
life insurance offering.
Furthermore, the NGP is reducing time to
market for new product launches by more than 50%. The time to implement a new
UBI product has been reduced from two to three months to four weeks.
Currently, most insurers implement UBI offerings
as stand-alone projects requiring parallel core administration and claims
systems. To address the inefficiencies and complications from this, Octo
Telematics is working with core insurance software vendors to develop a range
of connectors that will allow direct integration between the NGP and an
insurer's core processing systems. This direct integration will significantly
reduce the cost of entry and complexity for insurers wanting to offer IoT-based insurance products.
As of November 2017, Octo Telematics had
developed a connector allowing direct integration of the NGP with the policy
administration and claims suite of Guidewire, a software provider for property and casualty (P&C) insurers.
Octo Telematics is also looking at extending the NGP with vertical-specific functionality beyond the telematics sector, in support of a wider spectrum of industries, such as the telecoms, energy, and utilities sectors.
An American multinational developer of analytics software has committed to up-skill a minimum of 500 students in analytics across Malaysia by the end of 2020, in response to increased demand for data science expertise. Under the banner of the firm’s Software Certified Young Professionals (SCYP), the program will collaborate with the Malaysia Digital Economy Corporation (MDEC) to help drive the adoption of emerging technologies across the country.
Central to such efforts will be enabling students to work towards the certification in programming, machine learning and visual analytics through e-learning courses, supported by access to online communities and webinars.
The Managing Director of Malaysia at the firm stated that the company has a deep-rooted history in academia. Launching a program to empower Malaysian students with the firm’s analytics knowledge and expertise helps in answering the rising demand for technology professionals in Southeast Asia.
Business organisations need people who can make sense of data, manage and analyse it, build models and determine what information delivers the most value. Students with an analytical skillset will be highly sought after.
Once students have completed the e-learning courses and attended the associated webinars, a certification exam will follow before they are connected with SAS customers seeking young data science professionals.
Within Southeast Asia, "free or heavily subsidised" online courses are available to undergraduate, postgraduate and PhD students who are enrolled at a university, business school or university college in Malaysia, Indonesia or Vietnam. There are currently three courses available for students in Malaysia and Vietnam, and five courses on offer in Indonesia, spanning data analytics, statistics, machine learning and visualisation.
The CEO of MDEC stated that the agency's strategic partnership with the software company aligns perfectly with its commitment to delivering technology-relevant programmes to Malaysian students and helping Malaysians make the digital leap into the era of the Fourth Industrial Revolution.
The agency sees public-private partnership initiatives, such as this one with the tech firm, as contributing to the overall growth of the data science skills required in Malaysia's workforce to support the digitally driven economy, which is also critical to meeting the demands of the current and future job market.
Growing demand for tech professionals
OpenGov Asia earlier reported that Malaysians with niche skills in technology have far brighter prospects in 2020 as many sectors are hiring in their push forward with digitalisation. A Malaysia-based consultancy’s 2020 salary survey revealed that job opportunities and higher pay were expected for those in mid to high-level management positions in eight sectors.
Talents with niche skills who are changing jobs, on the other hand, are looking at an increment of up to 30 per cent due to demand outstripping supply, the firm’s Country Manager for Malaysia said in a statement accompanying the survey report.
The survey was also encouraging, as it suggested employers may be more open to hiring job seekers who have the necessary tech skills but less industry experience.
Moreover, the more Malaysia invests in its technological infrastructure, the more tech talent it will see flowing into the nation, thereby growing its digital economy and pushing forward its Industry 4.0 goals.
Vietnam's Ministry of Information and Communications (MIC) and the Cuban Ministry of Communications held an online training course, "Designing and developing big data systems", for Cuba. It was officially opened in Hanoi and Havana. The week-long course was coordinated with the Embassy of Cuba in Vietnam and two of Vietnam's leading ICT groups, VNPT and Viettel.
According to a press release, the objective of the course was to provide advanced knowledge about big data such as analysing, designing, and developing big data systems for IT application and e-government development in regulatory agencies.
The course will help Cuba address the challenges of big data and the related tools and content. It attracted nearly 50 attendees from Cuba's Ministry of Communications, other ministries, sectors, corporations, and ICT enterprises.
Topics conveyed by Vietnamese lecturers and experts from the Authority of Information Technology Application (MIC), VNPT, and Viettel included general knowledge about big data; big data processing; the storage and handling of big data; infrastructure requirements; managing big data using IPv6; analysis and presentation tools; and mathematical models, methods, and techniques for analysing and integrating big data.
The event is one of the activities in a series of activities celebrating the 60th anniversary of the establishment of diplomatic relations between Vietnam and Cuba and the Vietnam – Latin America Relationship Development Plan in 2020.
In the framework of cooperation between the two ministries, in July 2019, MIC coordinated with VNPT, Viettel, and Bkav to organise training courses on cybersecurity in Havana for Cuba. Furthermore, to promote specialised ICT cooperation between the two countries, MIC undertook several activities, such as participating in the Havana international book fair in Cuba, publishing two books in Spanish, copyright granting activities, exchanging radio and television programmes, and promoting images of and the relationship between the two countries.
In the coming months, MIC will host an investment promotion conference in the field of ICT with Latin American countries in October and will continue with an information security training course for Cuba, scheduled for November.
Vietnam has also been providing support to Laos’ digital transformation. As OpenGov Asia earlier reported, thanks to a program under Viettel, all citizenship data has been uploaded to the system, improving the capacity to manage data and information about people, and helping reduce administrative procedures. This is the first time that Laos has implemented the management of electronic civil status instead of the registration of civil status as before.
Unitel, the Viettel subsidiary in Laos, was the first company licensed by the Central Bank of Laos to officially deploy mobile money and is also the only company developing this service in the country, offering a new, secure, and quick payment method for more than six million people. This field is expected to generate 30-50% of Unitel's telecoms revenue in the future. Founded in October 2009, Unitel operates across all 17 provinces and cities in Laos and has led the market for eight consecutive years. It is also the Laos government's partner in implementing the country's key e-government systems.
With increased demand and far wider usage, the pandemic has significantly impacted the financial sector in a multitude of ways. With stay-at-home advisories and lockdowns in place, reliance on online banking and digital commerce shot up astronomically. And the industry had to keep pace with this transactional mushrooming to stay intact, relevant and to meet the consumer requirement.
The latest OpenGovLive! Virtual Breakfast Insight on 18 September 2020 explored how the financial sector industry in the Philippines is coping with the new normal and how it can better equip itself to last the long haul.
The packed OpenGov Asia virtual hub was a testament to the relevance and timeliness of the session. Senior digital executives from across the sector joined in to discuss and explore what has been done and what can be done in the field.
Data compliance for financial organisations as essential as ensuring the flow of money in the economy
The event opened with a welcome address from Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, and a quick round of introductions.
Mohit set the ball rolling by sharing how the pandemic has created a lot of chaos in the financial sector, forcing leaders to make digital transformation as much a priority as ensuring the continued availability of money in the system.
Data intelligence and governance are key issues for financial industries in the post-COVID-19 era. Data strategy – whether to have an integrated approach or a siloed outlook – depends on each organisation's culture and its thinking on how to survive in the current environment.
At the same time, of paramount importance in the GDPR era is for organisations to make sure their data strategy is compliant with industry and privacy regulations.
Since data compliance is of such significance, Mohit concluded by advising delegates to partner with champions in the field rather than trying to do everything in-house.
Drivers and pillars of data governance in organisations
After Mohit set the tone for the discussion, Sachin Tonk, Director, Data and Privacy Operations, Standard Chartered Bank shared his insights.
Sachin began by charting the journey of data governance and where it sits today. Data evolution is very challenging: not only is it complicated and complex, but the rate of change is also very fast.
He went on to address the question of why we need data governance in the first instance. The rationale sits in two main categories:
- Internal demand that includes in-depth analysis, agility for growth and real-time operations.
- External demand that comprises new data products, GDPR, MIFID ll and M&A.
All the factors under these two umbrellas make data governance both tedious and, often, convoluted.
Sachin opined that data governance and privacy is going to be the biggest priority for organisations in the coming year. He expounded on the major pillars of data protection policy in any organisation.
The first is a robust governance framework with policies and processes. This must be complemented by the second pillar, i.e. proper awareness and training to create a culture with compliance in its DNA. He also added that security and IT technology are the glue binding all the components together.
Continuing in the same vein, Sachin spoke about the various essential actors involved in the governance process. Data owners, data stewards and monitors have to come together and collaborate to get the right spirit of data governance.
He then shared that the key to having a robust data governance policy is creating a catalogue of questions related to the actors. When formulating the ideal governance policy, he advised teams to go for small wins rather than opt for prototyping and then scaling up across the organisation.
In closing, Sachin noted that data governance was not a one-time activity. He emphasised the importance of monitoring progress and measuring the success of the governance policy and constantly working to improve it.
Data governance is becoming increasingly challenging and complicated for organisations
After Sachin’s informative presentation, Varghese Mathew, Business Director, Hitachi Vantara, Philippines, spoke to the topic at hand.
Varghese began by sharing some interesting statistics about the challenges faced by organisations in the data governance domain.
Almost 74% of organisations have difficulty in evaluating quality and reliability of data, 61% have too many data sources and almost 90% need an intelligent data governance strategy.
Varghese further explained the need for organisations to have data governance in place in the current digital era.
The sheer volume of data makes its governance enormously complex, inevitably driving organisations to go tech in order to manage data more efficiently.
Other drivers include technological silos and regulations like GDPR, MIFID etc. Moreover, increasingly, countries are formulating their own regulations around data protection, making it tough for organisations to survive amid the complexity. Companies that are not compliant across the board pay a heavy price.
Varghese explained that the objectives of having a data governance policy are to manage the huge volumes of structured and unstructured data, the data being kept in silos within organisations, multiple business goals and the rapid speed and demand for compliance.
He then went on to share the Hitachi Vantara approach. The goal is to help businesses deal with different types of data silos and to make sure it is visible and governed well and on an intelligent platform where it can be analysed.
Their solutions help organisations ensure every bit of their data is available, insightful and actionable – making it easier to govern.
Varghese also explained how the Hitachi Vantara solution can help organisations make better sense of their data. It can help organisations save time and resources by avoiding unnecessary data forensics, regulatory reporting, etc.
He underscored this by sharing a case study where Hitachi Vantara helped a customer organise and make sense of their data. He shared how Hitachi helped Rabobank reduce time to discovery of data for governance and regulatory reporting by automating communication monitoring.
After Varghese’s presentation, it was time for polling questions and to involve the delegates in an interactive discussion.
On the first question regarding the current primary reason for data governance projects, a majority of the audience voted for compliance and regulatory requirements (46%).
A senior delegate from a major Philippines bank shared that she voted for compliance and regulatory requirements because once this is done, a better quality of information, data security and privacy will obviously follow.
On the next question about a centralised data compliance strategy, delegates were divided between data being stored and managed centrally (56%) and some data being centralised while the rest is managed by a specific department, country, or business (44%).
Another delegate shared that they voted for option one as their data is managed centrally and integrated in one place. Governance, data quality and analysis are done in one place for the consumption of management and operations.
On the final question about rating your organisation’s biggest concern in meeting GDPR requirements, a major chunk of the delegates voted for data protection: needing clearer details on how data is processed and secured with timely notification if data is compromised (57%).
A delegate reflected that data protection is the biggest priority of their organisation currently; with employees working from home, the data is more vulnerable to disruptions than ever before. Thus, it is very important to ensure that the data we are working on is fully secure and protected.
After the engaging discussions and deliberations, the session came to an end with closing remarks by Varghese Mathew.
He thanked all the delegates for taking the time to participate in the Virtual Breakfast Insight. He concurred that their ideas and reflections were in line with trends in the global space where Hitachi operates. He also echoed their sentiments regarding the struggle organisations face in making their data accessible while keeping it safe and protected from a compliance point of view.
In closing, Varghese assured the delegates that Hitachi Vantara solutions were available to assist them in the same way as they have done for numerous organisations thus far.
OpenGov Asia achieved another significant milestone and a new industry benchmark with the launch of its first-ever OpenGovLive! Virtual Tech Day on 16 September 2020. The one-of-a-kind hands-on virtual workshop not only enabled a high degree of interaction among remote participants but also let them collaborate virtually in groups on an in-house, dedicated, customised platform.
OpenGov Asia's innovative gamification model and novel delivery mechanism were among the highlights of the event. As importantly, the topic of discussion and the problem addressed during the workshop were extremely relevant and timely. This was borne out by the hundred per cent attendance, which comprised delegates from various government agencies in Indonesia.
The innovative OpenGovLive! Virtual Tech Day had delegates address a very real challenge that the Indonesian government currently faces – the issue of appropriate distribution of social benefits to the citizens who are invisible digitally and for whom the government has almost no data.
Jakarta has 56,175 active COVID-19 cases (data as of 15 September) and cases are constantly on the rise. Consequently, there is an urgent need for the government to act quickly, accurately, and efficiently.
SAS’s smart data analytics solutions like its data investigator, proactive alerting, etc. can be of great assistance to the Indonesian government in this testing time and can help them in serving citizens far more effectively.
During the OpenGovLive! Virtual Tech Day, SAS was able to practically illustrate the power and effectiveness of its solutions through the simulations and demonstrations in the gamification sessions. Each scenario required intense collaboration and detailed discussion to arrive at possible resolutions of the problems at hand. While delegates thoroughly enjoyed the gamification, they realised how effective the heuristic methodology is for collaborative learning.
An interesting observation was that even though the problems presented were hypothetical, the delegates could not help but think of them in their current context. This allowed them to apply themselves diligently and passionately.
In addressing each of the three scenarios, the delegates had specific questions about SAS solutions. Their engagement in the conversations, demonstrations, and simulations indicated that they were keen to understand how the solution would suit their current reality. The interactive, real-life scenario approach generated a lot of interest and curiosity among the delegates; it also sparked a desire to learn more about how SAS solutions could help them on their journey.
Challenge for governments to support digitally invisible citizens
The workshop revealed powerful and practical ways to work with data at a country level. The session was opened by Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia.
Right off the bat, he raised a pressing challenge that governments all over the world are facing while dealing with the pandemic: supporting digitally invisible citizens.
Taking responsibility for the well-being of their citizens, governments earmarked generous stimulus and support packages. But the lack of data on digitally invisible citizens – those who are not reflected in official digital records, do not use online banking, or do not have digital identities – led to the misallocation of resources and the most vulnerable being left out.
This digitally invisible population is unable to receive the benefits as there are no records or data on them making it impossible for governments to ensure that assistance reaches them.
The workshop addressed this problem through OpenGov Asia's unique gamification model.
Delegates were divided into 5 different teams (Black Widow, Scarlet Witch, Captain America, Iron Man, Thor) allowing them to collaborate and devise relevant digital options and strategies that could assist governments to resolve these critical issues.
Data is of the essence to governments
After Mohit’s opening, Fauzi Efendi, Sales Director, SAS Indonesia spoke with the delegates. Taking the lead from Mohit, he shared how data is of the essence to governments as they are trying to help citizens through these trying times. He also shared how the SAS solutions have the potential to help governments better serve citizens effectively, efficiently and comprehensively.
Gamification: solving problems while playing
After Fauzi’s address the session geared into a more interactive and practical (and fun) exercise: learning through gamification.
The simulation was played out in three phases/scenarios where each one focused on a specific aspect that needed to be addressed to solve the hypothetical problem. OpenGov Asia’s platform is designed in such a way that contending teams are not privy to the discussions of other teams during the breakout sessions.
Scenario 1 emphasised the very first aspect of consideration in the process: Data Management and Data Preparedness. The teams worked together in a breakout session to chalk out detailed steps of how they would resolve the problem at hand.
After the breakout session, each team shared its answers with the moderator, informing a session of rich discussion and exchange among the different groups. Teams explained why they considered certain options optimal or non-optimal given the hypothetical problem.
Another attraction of these breakout sessions was the wild card, which let teams propose a solution outside the given options. This made the discussion even more insightful, as OpenGov accepts that no one can have all the answers; collaboration and outside-the-box thinking are the ultimate problem-solving approach.
To give the delegates a better understanding of how exactly SAS can assist them in data management and preparedness, Wibowo Leksono, Senior Consultant, SAS demonstrated how their solution manages data smartly.
After the demonstration, it was time for the delegates to tackle Scenario 2, which highlighted the importance of Data Visualisation and Communication.
The teams again went into a breakout session to chart out their next steps for Scenario 2. After active engagement and participation, the teams came back with their answers to the moderator. The delegates once more engaged in reflections and discussions based on the answers revealed by different teams.
After this intense round of sharing and debate, Wibowo once again stepped up to demonstrate how SAS’s visualisation solution could help delegates in this scenario.
Finally, it was time for delegates to address Scenario 3, which focused on Data Modelling and Analysis using AI and Advanced Analytics.
Once again, the teams went into discussion sessions to determine their immediate next steps for Scenario 3. After contemplation and discussion, teams shared their answers. This was followed by reflections on the optimal and non-optimal options by delegates representing different teams.
After the three scenarios, Team Captain America was declared the winner of the day for its farsightedness and teamwork.
The declaration was followed by a final demonstration to help delegates understand how SAS’s visual investigator tool can help government agencies analyse citizen data better. With search, proactive alerting and analysis of inter-entity relationships, the tool is all-encompassing and powerful.
Wibowo also addressed questions from the audience to clarify and expound on the various solutions demonstrated during the session.
The Virtual Tech Day concluded with closing remarks from Febrianto Siboro, Managing Director, SAS Indonesia.
Febrianto thanked the delegates for joining the session and enthusiastically participating.
He clarified that the demonstrations were intentionally designed as slides rather than videos, as the concepts are complex and best understood through interaction. He thanked his colleagues for their easy-to-understand and informative demonstrations.
He encouraged the delegates to believe in the power of data and the potential to derive insights from it. He solicited the participants’ contributions as their genuine information and data would be invaluable to SAS’s research.
Before signing off, Febrianto invited delegates to get in touch with him and the team to explore how best they could deploy any of the SAS solutions. In addition to what had been shared in the Tech Day, Febrianto shared that SAS has an AML solution with unlimited users.
He introduced Gumilar and Fauzi to the delegates as the right people to contact for further information and collaboration on data management strategies and solutions.
OpenGov Asia has a legacy of challenging the norm, constantly pushing boundaries and being ahead of the curve.
Entrenched in its functioning are a remote working culture and a structure that employs a global workforce. While OpenGov Asia could not foresee the scope and depth of the pandemic, it was already well-positioned to adjust rapidly to the restrictions imposed.
Nonetheless, OpenGov Asia knew it had to realign its offerings to suit the new norm. OpenGov Asia takes immense pride in the fact that it was among a select few organisations in the industry that reinvented themselves and were able to pivot from a face-to-face (F2F) way of doing business to a virtual one.
Within weeks, the organisation reshaped its business model and invested heavily in infrastructure to transform all its content platforms into virtual offerings – 100% audio-visually interactive. Over the last few months, OpenGov Asia has pioneered and championed the delivery of its unique and powerful OpenGov Breakfast Insights as OpenGovLive! Virtual Breakfast Insights.
Not only did OpenGov Asia take its offerings online, it was also determined to continue its tradition of service excellence and social responsibility. To make the events truly Virtual Breakfast Insights, they ensured that delegates received a real (not virtual) breakfast. Partnering with local providers in each country in which they operate, they had real food delivered to participants in time for the start of each virtual event.
This not only retained the integrity of the “breakfast” in their signature virtual offering but also ensured that technology partners, food aggregators, restaurants and delivery companies got business in this difficult time.
The national data portal was launched by the Ministry of Information and Communications (MIC) during a ceremony in Hanoi on 31 August.
The portal provides data from state agencies in service of political and socio-economic activities, contributing to the process of e-government building in Vietnam. New digital services and open data will be provided on the portal as the digital government takes shape, making it easier for the public to use them for research, study or product development, as well as to offer feedback that helps state agencies improve their operating efficiency.
According to a press release, at the event, leaders of the ministries of information and communications, natural resources and environment, science and technology, health, education, and training signed a cooperation agreement with the Vietnam Social Security, the Vietnam National University-Hanoi, and the Vietnam Post Corporation.
The organisations agreed to promote open data and develop the portal. Speaking at the event, the Deputy Minister of Information and Communications, Nguyen Thanh Hung, expressed his belief that Vietnam’s e-government ranking, currently 86th in the world, will improve.
The national database on Vietnamese enterprises is essentially completed while the national population database will be completed next year, he said, adding that apart from the government’s determination and efforts, the involvement of leaders of ministries, agencies, and localities is also needed because they own important data.
The Deputy Head of the MIC’s Authority of Information Technology Application, Do Cong Anh, said the goal of building the portal is to promote data governance in state agencies and to develop data sustainably for e-government.
Further, as OpenGov Asia reported, MIC announced that the Vietnam 2020 White Book on ICT is scheduled to be issued by 20 December. At a meeting in Hanoi on 26 August, MIC Deputy Minister, Phan Tam, requested relevant units of this ministry to ensure compilation quality and progress so that the document will be published before 20 December.
In the White Book, the first section will provide an overview of the ICT development in Vietnam and the world during the year, the message by the MIC Minister, along with the highlights and ranking of the country’s ICT sector in the world. It will also include articles on the importance of the national digital transformation program to national socio-economic development, as well as orientations for IT and telecommunication development.
The second section will feature data about the top 20 localities in IT revenue, the number of their IT businesses and workers, and the top 20 IT enterprises in terms of revenue, workforce, and contribution to the state budget. The last section will be reserved for introducing ICT agencies and organisations.
Apart from this, the 2020 White Book will also include an assessment of the main outcomes of the implementation of Vietnam’s IT industry development program for 2015-2019, an overview of the national digital transformation program, and statistics on foreign investment in the IT sector.
The emergence of artificial intelligence (AI) and machine learning techniques is changing the world dramatically with novel applications such as the internet of things, autonomous vehicles, real-time imaging processing and big data analytics in healthcare.
In 2020, the global data volume is estimated to reach 44 zettabytes and will continue to grow beyond the current capacity of computing and storage devices. At the same time, the related electricity consumption is projected to increase 15-fold by 2030, consuming 8% of global energy demand. The need to reduce energy consumption and increase the speed of information storage technology is therefore urgent.
Berkeley researchers led by the HKU President (when he was at Berkeley), in collaboration with a team at Stanford University, announced that they have invented a new data storage method: they make odd-numbered layers slide relative to even-numbered layers in tungsten ditelluride, which is only 3 nm thick. The arrangement of these atomic layers represents 0 and 1 for data storage.
The researchers creatively make use of quantum geometry – the Berry curvature – to read the information out. This material platform therefore works ideally for memory, with independent ‘write’ and ‘read’ operations. The energy consumption of this novel data storage method can be over 100 times lower than that of traditional methods.
This work is a conceptual innovation for non-volatile storage types and can potentially bring technological revolution. For the first time, the researchers prove that two-dimensional semi-metals, going beyond traditional silicon material, can be used for information storage and reading. The team’s work was published in the latest issue of the journal Nature Physics.
Compared with existing non-volatile memory (NVM), this new material platform is expected to increase storage speed by two orders of magnitude and decrease energy cost by three orders of magnitude, and it can greatly facilitate the realization of emerging in-memory computing and neural network computing.
The research was inspired by the HKU Professor’s team’s work on the “Structural phase transition of single-layer MoTe2 driven by electrostatic doping”, published in Nature in 2017, and by the Stanford team’s research on the “Use of light to control the switch of material properties in topological materials”, published in Nature in 2019.
Previously, researchers found that when the two-dimensional material tungsten ditelluride is in a topological state, the special arrangement of atoms in these layers can produce so-called “Weyl nodes”, which exhibit unique electronic properties, such as zero-resistance conduction. These points are considered to have wormhole-like characteristics, where electrons tunnel between opposite surfaces of the material.
In previous experiments, the researchers found that the material structure can be adjusted by terahertz radiation pulse, thereby quickly switching between the topological and non-topological states of the material, effectively turning the zero-resistance state off and then on again.
The team led by the HKU President has proved that the atomic-level thickness of two-dimensional materials greatly reduces the screening effect of the electric field, so their structure is easily affected by the electron concentration or electric field. Topological materials at the two-dimensional limit can therefore allow optical manipulation to be turned into electrical control, paving the way towards electronic devices.
In this work, the researchers stacked three atomic layers of tungsten ditelluride, like a nanoscale deck of cards. By injecting a small number of carriers into the stack or applying a vertical electric field, they caused each odd-numbered layer to slide laterally relative to the even-numbered layers above and below it. Through the corresponding optical and electrical characterizations, they observed that this slip is permanent until another electrical excitation triggers the layers to rearrange. Furthermore, to read the data stored between these moving atomic layers, the researchers used the extremely large “Berry curvature” of the semi-metallic material. This quantum characteristic acts like a magnetic field, steering the propagation of electrons and resulting in the nonlinear Hall effect. Through this effect, the arrangement of the atomic layers can be read without disturbing the stacking.
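The write/read cycle described above can be pictured with a toy model (an illustrative sketch only; the class, method names and signal values below are assumptions for exposition, not the researchers’ actual measurement setup): the stacking order acts as the stored bit, an electrical excitation toggles it, and a nonlinear-Hall readout senses it without rearranging the layers.

```python
# Toy model of a stacking-order memory cell (illustrative only; the
# names and signal values are assumptions, not measured physics).

class StackingMemoryCell:
    """Stores one bit as the lateral stacking order of the atomic layers."""

    def __init__(self):
        self.stacking = 0  # 0 or 1: the two possible stacking arrangements

    def write(self, bit: int) -> None:
        # An electrical excitation slides the odd-numbered layers,
        # switching the stacking order; the state is non-volatile.
        self.stacking = bit

    def read(self) -> int:
        # The sign of the nonlinear Hall signal distinguishes the two
        # stackings without rearranging the layers (non-destructive read).
        hall_sign = +1 if self.stacking == 1 else -1
        return 1 if hall_sign > 0 else 0


cell = StackingMemoryCell()
cell.write(1)
assert cell.read() == 1  # reading does not disturb the stored state
assert cell.read() == 1
cell.write(0)
assert cell.read() == 0
```

The key point the sketch captures is the independence of the two operations: writing changes the stacking, while reading only senses it.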
Using this quantum characteristic, different stackings and metal polarization states can be distinguished well. This discovery solves the long-standing difficulty of reading ferroelectric metals due to their weak polarization. It makes ferroelectric metals not only interesting for fundamental physics but also shows that such materials may have application prospects comparable to conventional semiconductors and ferroelectric insulators. Changing the stacking order involves only the breaking of van der Waals bonds.
The energy consumption is therefore theoretically two orders of magnitude lower than that consumed by breaking covalent bonds in traditional phase-change materials. This provides a new platform for the development of more energy-efficient storage devices and helps move towards a sustainable and smart future.
On 21 August 2020, OpenGov Asia organised another highly interactive OpenGovLive! session to inform and empower delegates from a wide spectrum of government agencies in Thailand.
As governments around the world struggle to cope with these stressful times, OpenGov Asia supports them by sharing knowledge on how technology and analytics can help better manage people and nations. The overwhelming attendance and engagement from the Thailand audience was itself a testimonial to the relevance of the topic.
The session opened with a presentation by Mohit Sagar, Group Managing Director and Editor-in-Chief at OpenGov Asia.
Mohit opined that the world is at a crucial point right now and that the decisions governments take currently will shape the future.
With almost every city shut down or constrained as a result of the pandemic, public services were under immense pressure to ensure that all citizens had access to the basic amenities needed to survive the lockdown and stay safe.
However, not all services were seeing the same kind of demand, making it critical for governments to identify high priority areas and act accordingly.
During the last few months, governments have also collected a huge trove of data. Deriving insights from this data can help them validate decisions for current and future needs.
Not only do governments need to analyse these huge volumes of data, they also need to rethink their policies and practices in these unprecedented times. They no longer just need to respond to the pandemic and recover from it; they must also learn and plan to thrive in these times.
Mohit emphasised deploying the latest technology and data analytics tools as a smart and effective way of doing things. He also highlighted that technology will be effective only when complemented by sound leadership.
He concluded by advising governments and public sector executives to partner with champions in technology and use their expertise to serve citizens better.
After Mohit’s presentation, Nutapone Apiluktiyanut, Managing Director, SAS Thailand came forward to share his perspectives with delegates.
Nutapone briefly introduced the audience to SAS and its mission of improving lives through better decision making.
He then shared his observations about the phased approach being followed by governments in tackling the pandemic and how SAS’s data analytics tools support that.
Nutapone then presented the three stages he highlighted: Respond, Recover and Reimagine.
To expound on the Respond Stage, he talked about collaboration with governments in India and the United States to predict their medical infrastructure requirements using SAS’s data analytics.
Similarly, during the Recovery Stage, SAS worked closely with government agencies to prepare them as they were getting ready to reopen their economies. Data analytics helped governments determine their revenue streams, expenses and distribution patterns of stimulus packages.
Nutapone closed by saying that SAS technology can help governments Reimagine the future – to be better prepared for the next emergency well in advance, avoiding loss to lives and resources.
After Nutapone, Joseph Musolino, Global Sales and Strategy Consultant Fraud and Security Intelligence, SAS spoke on the topic from another angle.
Joseph began by sharing interesting statistics that pointed to the fact that organisations globally consider Machine Learning and AI the most significant data initiative for the next year.
He felt that the focus should now be on taking AI and analytics into enterprises and making both technologies faster and easier to deploy.
He highlighted some of the areas where governments are currently deploying advanced analytics to strengthen their delivery mechanisms, including customs, pandemic response, medical services, taxation and judicial systems.
In order to give the audience a detailed understanding of how exactly the theory plays out, he demonstrated real-life situations where analytics had helped governments serve citizens better.
He concluded by informing the delegates about their new platform which is a step forward into next-gen analytics.
After Joseph’s presentation, Evelyn Wareham, Chief Data and Insights Officer at the Ministry of Business, Innovation and Employment, New Zealand shared her insights on the topic. Her goal was to update all the delegates on the latest developments in data strategy in New Zealand Public sector and specifically in the Ministry of Business, Innovation and Employment.
She began by stating New Zealand’s current vision for data strategy, i.e. a future where data is regarded as an essential part of New Zealand’s infrastructure and where data use is underpinned by public trust and confidence. In the same vein, she also shared the major aims of this vision that include:
- Investing in making the right data available at the right time
- Building partnerships within and outside government
- Implementing open and transparent practices
- Growing data capability and supporting good practice
Evelyn then spoke about New Zealand’s strong integrated data infrastructure, which brings together streams of information from tax, housing, census, education, benefits and other sources, helping the government make wise policy decisions for the country.
This integrated data structure is of particular importance in the Business, Innovation and Employment Ministry as it helps track indicators of economic growth of the nation.
The aim of this data and these insights is to help the government formulate policy; analytics helps enhance the service experience, build operational intelligence for compliance and improve performance.
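An integrated data infrastructure of this kind can be pictured as linking records from separate administrative sources on a common identifier. The sketch below is a minimal illustration of that idea; the field names, person IDs and values are hypothetical, not actual New Zealand data:

```python
# Minimal sketch of linking administrative data streams on a shared ID
# (hypothetical field names and values, for illustration only).
tax = {"p1": {"income": 52000}, "p2": {"income": 31000}}
benefits = {"p2": {"benefit": 4800}}
education = {"p1": {"qualification": "degree"}}


def integrate(*sources):
    """Merge per-person records from several sources into one view."""
    merged = {}
    for source in sources:
        for person_id, record in source.items():
            # Records from each stream accumulate under the same ID.
            merged.setdefault(person_id, {}).update(record)
    return merged


view = integrate(tax, benefits, education)
assert view["p2"] == {"income": 31000, "benefit": 4800}
```

In a real integrated data infrastructure the linkage, anonymisation and governance are far more involved, but the core value is the same: a single joined view across streams that no individual agency could produce alone.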
Evelyn also shared some examples of how they are currently using data analytics. During the different stages of the COVID-19 pandemic, the government relied on various real-time data resources that helped it gauge the status of the economy and take decisions accordingly.
One example is tracked spending data, which helped the government identify requirements and make suitable fiscal provisions.
Another example of this was a graph that helped the government see how businesses were coping with the pandemic and how it was impacting them.
The last example Evelyn shared was keeping track of travel data across borders to ensure the safety of people in New Zealand.
She concluded by saying that the country is still on a journey towards perfecting its use of data and analytics to serve the public better.
After Evelyn’s informative presentation, it was time for the polling session. The delegates showed great interest and engagement during the session.
On the first question – the biggest impact on your organisation as a result of the COVID-19 pandemic – a majority of delegates (42%) voted for workforce planning and the need to test the resilience of working remotely in both the long and short term.
A delegate shared that working from home has been the biggest impact on almost all organisations as nobody was expecting it and they were not prepared for it either. Providing the right infrastructure and ensuring a strong network for all employees as they log in to the systems at the same time was a huge challenge for their organisation.
On the next question – the area where your organisation needs to develop most to respond more efficiently to the next pandemic of such magnitude – the audience was split between integrated operations models to keep the government running (37%) and the use of data and analytics to improve situational awareness for real-time decision making (31%).
A delegate shared that they voted for use of data and analytics to improve situational awareness for real-time decision making as governments in this region do not use analysis as much as they ideally should. Using analytics would speed up fundamental processes, giving them more time to work on more complex things.
On the final question – the major change in how your department/organisation works due to COVID-19 – most of the delegates voted for increased digitisation of back-office administration and processes (55%).
A senior delegate reflected that before using data analytics and advanced technology in operations, organisations need to move their back-office functions from manual to digital. Once that is done, they can deploy AI and machine learning to enhance performance.
After the polling session, Joseph addressed the audience with concluding remarks. He thanked the audience for taking time out of their busy schedules and said that they had only touched the tip of the iceberg on this topic, with much more to share and discuss.
He encouraged delegates to reach out and engage in more discussions about analytics and the surrounding ecosystem.