Recently OpenGov spoke to Mr Lup Yuen Lee, Chief Technology
Officer at UnaBiz,
the exclusive network operator of Sigfox’s low-power
wide-area network (LPWAN) in Singapore and Taiwan. UnaBiz is the
first IoT-dedicated network operator in Asia to roll out a nationwide Sigfox network.
Recently, UnaBiz enabled full indoor coverage of the Sigfox
IoT network at all 4 terminals of Changi Airport. Smart solutions providers and
system integrators have developed Sigfox-enabled solutions such as temperature
sensors and other applications for the Smart Airport.
UnaBiz partnered with the Taipei City Government last year to build an IoT Innovation Lab. It is working
with Airbus to advance research in the
digitalisation of aircraft maintenance operations through the adoption of IoT
solutions. In collaboration with bike-sharing company, oBike, UnaBiz is rolling
out geolocation services for one million bikes on Sigfox Global LPWAN.
In a wide-ranging discussion, Mr Lee talked about different
types of IoT applications and their network and power requirements and shared
his views on the complexities of IoT development.
Deep vs Wide IoT
Mr Lee explained that there are two classes of IoT: ‘deep’
IoT and ‘wide’ IoT. Amazon Echo and Alexa are examples of deep IoT. Deep IoT
devices require high bandwidth and a constant power supply.
With the Amazon devices, the voice command goes to the cloud
for processing and generating an output command. This entails high computational power requirements and
hence, these devices do not work well on a low-bandwidth network or on low battery
power. As a result, they tend to stay fixed in offices or homes.
UnaBiz looks at wide IoT, which refers to devices that are very
light, battery-powered and operate on pervasive networks. They can work anytime,
anywhere in Singapore and do not rely on WiFi or the cellular network.
“We don’t think everyone will be able to afford an Amazon
Echo. It is a very powerful device but it is not cheap, because there’s so much
complexity inside it. In short, we’re just trying to be a very simple kind of
IoT network, where you press a button and it triggers the backend. The device does
not need to pair with Bluetooth or WiFi. It is simple and fuss-free, even for
the elderly,” Mr Lee added.
Subscription also tends to be easier, and there is no need
to worry about SIM cards because all the devices have a built-in ID. The ID
indicates that the device belongs to a certain company, and all messages can be
directed there, without any routing elsewhere. The packets are a mere 12 bytes,
so bandwidth requirement is limited and users are not expected to pay an
exorbitant price to use the network.
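To make the 12-byte constraint concrete, here is a minimal sketch of how sensor readings could be packed into a Sigfox-sized uplink payload. The field layout (temperature, humidity, battery voltage, sequence counter) is purely illustrative, not an official Sigfox or UnaBiz format:

```python
import struct

def pack_payload(temp_c: float, humidity: float, battery_mv: int, seq: int) -> bytes:
    """Pack sensor readings into a 12-byte Sigfox uplink payload.

    The field layout here is illustrative, not an official format:
    temperature (int16, hundredths of a degree C), humidity (uint16,
    hundredths of a percent), battery (uint16, millivolts) and a
    sequence counter (uint16), plus 4 spare bytes of padding.
    """
    payload = struct.pack(
        "<hHHH4x",               # little-endian: 8 data bytes + 4 pad bytes
        int(temp_c * 100),
        int(humidity * 100),
        battery_mv,
        seq,
    )
    assert len(payload) == 12    # Sigfox uplinks carry at most 12 bytes
    return payload

print(len(pack_payload(27.5, 80.0, 3300, 1)))  # → 12
```

Fitting readings into fixed-point fields like this is why a 12-byte budget is workable for most monitoring use cases.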
Energy-efficient IoT solutions
When Sigfox was created in Europe, it was primarily for outdoor applications – low-power sensor networks that need to send data intermittently. UnaBiz has been working on such applications to help collect data on the outdoor environment, such as weather and haze conditions. UnaBiz has been exploring indoor applications as well, such as tackling rodent infestations in F&B outlets and retail shopping malls.
The trouble with
rat traps is that if a rat gets caught, it has to be removed immediately.
Otherwise the rodent will decompose, and the other rats will disperse. So how
can building owners know if there is a rat stuck in the trap and alert someone
to clean it up? Doing regular manual checks is simply a waste of manpower.
“The problem with
this kind of use case is that the rats run around in very strange places, deep
inside the building. You cannot guarantee that there’s WiFi network in air con
vents, ducts etc.,” said Mr Lee.
The solution needs to be able to penetrate into distant locations, without being
constrained to just public areas or by WiFi coverage. Sigfox was found to be a
good solution because of its pervasiveness. One base station can provide
coverage for the whole building.
Mr Lee said, “We’re actually trying a few types of tracking solutions. You can
install a GPS module; however, as we all know, running GPS on any device uses up
a lot of power.”
“The second idea
being explored is WiFi geolocation –
like an Android or iPhone which can use WiFi hotspots for locations –
but if you think about it, they might not work in the wild in areas such as a
reservoir because there is no network there, or in secluded areas such as big
drains or canals for flood monitoring, on the rooftops of buildings with solar
panels to monitor power storage and usage (UnaBiz is currently working with Sunseap on power metering), or at the basements of industrial buildings
for monitoring water leakage.”
The third alternative
would be to use the Sigfox network for geolocation.
Mr Lee said, “You
can either use a high-power option, which will drain your battery faster, or you can
choose something like Sigfox geolocation, which requires negligible power as long as it
transmits only one message a day.”
“For Sigfox, it’s
easy: just one base station can penetrate the whole building indoors. There’s
no need to shift the base station around, and you do not need to put in additional equipment.”
“When we talk about networks, power and costs matter. If the
rat trap needs to be hooked up to the mains, then it’s not going to work. You
cannot be assured that there will be a power source anywhere you go, so it has to
be battery-powered. Battery power means that it has to be a very low-power kind
of network; WiFi will probably drain it because it consumes too much power,” he said.
The reason why Sigfox is so energy efficient is the way
it transmits: in the form of a broadcast, sending out very small packets, as
mentioned earlier. Every time a message is sent, three packets are sent at
three different frequencies (this is called frequency hopping). When running on
unlicensed frequencies, some of the packets might get blocked. If one is
blocked, the others can still go through, ensuring that the message is delivered.
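The redundancy scheme described above can be sketched numerically. This toy simulation abstracts the three frequencies into independent per-copy blocking events; the probabilities are illustrative, not measured Sigfox figures:

```python
import random

def uplink_succeeds(p_block: float, copies: int = 3) -> bool:
    """One uplink: the same message is repeated on several randomly
    chosen narrow channels; it gets through if any one copy does.
    p_block is the per-copy probability of being blocked by interference."""
    return any(random.random() > p_block for _ in range(copies))

def estimate_delivery_rate(p_block: float, copies: int, trials: int = 100_000) -> float:
    """Monte Carlo estimate of the end-to-end delivery rate."""
    ok = sum(uplink_succeeds(p_block, copies) for _ in range(trials))
    return ok / trials

# Even if each individual copy has a 30% chance of being blocked,
# three copies give roughly 1 - 0.3**3 ≈ 97% delivery.
print(round(estimate_delivery_rate(0.3, 3), 2))
```

The design choice is to buy reliability with repetition rather than with acknowledgements, which keeps the radio awake for far less time.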
“Because it is ‘broadcast’, the communication is very simple.
There’s no need to negotiate – 3G and WiFi networks need to authenticate with
the hotspot. They need to make sure the password is correct. After that they
need to keep the session alive, whereas Sigfox can shut down after each broadcast,
reducing power consumption,” Mr Lee said.
Therefore, Sigfox is ideally suited to applications that
need to be delivered at a very low cost, have less frequent communication
requirements, and require exceptional battery performance.
There are numerous smart city applications that require
such monitoring sensors, where deployment needs to be pervasive. If we think of waste
management, building management, critical infrastructure monitoring, and imagine
the need to put a sensor on all the fire hydrants, all AED (Automated external defibrillator)
devices, all the power meters, all of the trees in Singapore, the cost and
simplicity of deployment becomes crucial.
And how does Sigfox achieve wide penetration? Because it is
an ultra-narrow band technology. Transmissions on wireless networks are divided
into different channels. With Sigfox the communication channels are very
narrow. Each message is 100 Hz wide. Because these channels are so small, the possibility
of interference is very low.
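A quick back-of-envelope calculation shows why 100 Hz channels keep interference low: a great many of them fit into even a small slice of unlicensed spectrum. The 192 kHz band width below is an illustrative figure, not a claim from the interview:

```python
# Rough back-of-envelope: how many 100 Hz ultra-narrow-band channels
# fit in a modest slice of unlicensed sub-GHz spectrum?
band_width_hz = 192_000      # illustrative band allocation
channel_width_hz = 100       # each Sigfox transmission occupies ~100 Hz
channels = band_width_hz // channel_width_hz
print(channels)  # → 1920
```

With nearly two thousand slots to hop across, the chance of two devices colliding on the same 100 Hz channel at the same moment is small.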
Other networks like LoRa have the advantage of being able to
send bigger packets, but bigger packets also imply a higher risk of interference.
Barriers to take-off in IoT technologies
IoT technologies have been around for a while. And there is
a market for interesting applications. Even if it is a small market like
Singapore, technologies developed here can be exported worldwide.
Then what is holding back development and deployment?
Mr Lee has been an Adjunct Lecturer at Temasek Polytechnic School of Informatics & IT since April 2015, teaching and mentoring the next
generation of ICT professionals in networking and IoT technologies. From his teaching experience, he realised that working on IoT
technologies is quite difficult.
“Because it involves a whole range of skills. You have to
know about hardware. You have to know about these devices. You have to know
what powers these devices, what is the transmission range of these devices.”
“Next you have to go up to the cloud. These devices will
transmit to some base station, and the base station will be connected to the cloud.
You have to figure out how the data goes into the cloud. Then, how do you build
a cloud that can handle all these devices? It’s quite easy to handle one device
at a time, for prototyping. But to handle hundreds of thousands of devices is
quite challenging,” he explained. “It is very difficult to find people with that
wide a skillset.”
Then there is the question of how to analyse all the data
from the devices.
“Very few jobs in Singapore have that kind of data processing
requirement. We are one of the first to actually do this kind of large-scale
analytics. We need tools to be able to massage the data.”
Mr Lee also said that today we see a lot of devices created
just for the sake of it. These are examples of technology looking for a problem
to solve. Identifying problems is a crucial step.
He provided an example of a very real problem UnaBiz is
trying to address.
At a home for patients suffering from disabilities, some of
the residents go out for work. The officials want to make sure that they report
to work on time and that they also come back on time. It is about ensuring that
they are safe and are not getting lost. The home cannot afford to give the
residents expensive phones or trackers. Even if they do, the devices will run
out of battery when their clients do not return.
UnaBiz proposed using one of its motion-triggered Sigfox
devices. Residents can carry the device around and every time they move, it
sends a message to the cloud. Then, a machine-learning algorithm is used to
figure out where the person is. Being mindful of privacy concerns, the technology
is kept accurate to a radius of around 1 km, which is enough to know if the
person is safe, without pinpointing their exact location.
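One simple way to enforce that kind of deliberate coarseness is to snap every fix to a ~1 km grid before storing it. This is a hypothetical sketch of the idea only; UnaBiz's actual algorithm is not described in the interview:

```python
import math

def coarsen_location(lat: float, lon: float, cell_km: float = 1.0):
    """Snap a position to a coarse grid so the stored fix is only
    accurate to roughly cell_km, preserving the wearer's privacy.
    (Hypothetical sketch; the operator's real method is not public.)"""
    km_per_deg_lat = 111.0                                  # ~111 km per degree of latitude
    km_per_deg_lon = 111.0 * math.cos(math.radians(lat))    # shrinks away from the equator
    lat_step = cell_km / km_per_deg_lat
    lon_step = cell_km / km_per_deg_lon
    return (round(lat / lat_step) * lat_step,
            round(lon / lon_step) * lon_step)

# A fix near Changi Airport is reported only to ~1 km resolution.
print(coarsen_location(1.3644, 103.9915))
```

The caregiver sees which neighbourhood the person is in, but never an exact doorstep.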
This is only one example of tracking solutions for
non-motorised assets. Other use cases include tracking bicycles, people and pets.
The device must be affordable and accessible for mainstream users to adopt
and benefit from.
In Mr Lee’s view, accelerating IoT development and deployment would require connecting the people
with expertise in devices and in cloud computing with the business people,
placing them all on one team. This would enable the creation of solutions with
real value, solving real-life problems.
The Hong Kong University of Science and Technology (HKUST) and a major cloud computing company recently signed a Memorandum of Understanding (MoU) to further their collaboration in nurturing local technology talent, collaborating on cutting-edge technology research and facilitating the research works of HKUST researchers.
Under the MoU, HKUST and the global leader in cloud technologies will nurture local talent in knowledge relating to data analytics, cloud computing and artificial intelligence (AI) to further develop Hong Kong’s pool of technology professionals and meet industry demands.
The two parties will roll out joint talent development programmes, providing workshops and seminars to prepare students with the practical skills they will require when using advanced technologies as well as promote a culture centred around new technology innovations. Internship opportunities will also be offered to HKUST students by the tech firm.
This partnership aims to support HKUST to move forward as an international leader in education and research. Drawing on its robust cloud experience, the tech firm will continue supporting HKUST in its exploration of Elastic High-Performance Computing (E-HPC) to accelerate HKUST’s research activities in pure science and engineering, as well as AI and machine learning. HKUST will also receive advisory support to build an integrated, secure and flexible research platform connecting its campus in Hong Kong and the soon-to-be-opened Guangzhou campus.
The Vice-President for Research and Development at HKUST noted that the tech firm is an active supporter of pioneering research and academic-industry cooperation. Since 2018, both parties have jointly launched 10 collaborative research projects to advance the frontiers of innovative technologies and address challenges in the industry. HKUST aims to strengthen this partnership to pave the way for an even closer collaboration, bringing greater benefits not just to the university but to the society and region as a whole.
The General Manager for Hong Kong SAR, Macau SAR and the Philippines of the tech firm stated that the partnership with HKUST is an affirmation of the company’s commitment to nurturing technology talent and fostering local innovation ecosystems. HKUST is a leading university with world-class research across disciplines.
He added that the company is delighted to work with HKUST to continuously prepare young talent for the future and looks forward to co-developing more advanced technologies to empower various industries and advance the GBA as a technology hub in Asia.
Through this collaboration, HKUST and the tech firm will also work together to explore new solutions in areas such as AI, next generation intelligent data processing platforms and serverless computing, in view of addressing technology-related challenges and opening up fresh opportunities for academic and industry experts to cooperate on projects.
The two parties have been long-term collaborators. Supported by the tech firm and its parent group’s AIR Programme, the two organisations have worked together on research projects spanning machine intelligence and data computing. Supported by the cloud tech firm, HKUST organised a FinTech hackathon to encourage its students to identify and solve industry challenges and develop innovation solutions.
The global cloud computing market size is expected to grow from US$445.3 billion in 2021 to US$947.3 billion by 2026, at a Compound Annual Growth Rate (CAGR) of 16.3% during the forecast period. While technology spending in APAC has increased, the recent COVID-19 pandemic has been a setback. Cloud technology adoption is expected to increase in sectors where work-from-home (WFH) initiatives are helping to sustain enterprise business functions.
Malaysia’s Public Sector Data Center (PDSA) services will be upgraded to specific Cloud Computing services for use by all Government agencies. The Government is looking to integrate its services into what will be known as the government hybrid cloud or the MyGovCloud which combines services from MyGovCloud@PDSA and the Public Cloud and has chosen four Cloud Service Providers (CSPs) to make this happen.
The MyGovCloud initiative is in line with the Fifth Initiative under Malaysia’s Digital Economy Blueprint (MyDIGITAL), which targets 80% cloud storage usage across the public sector this year. It aims to drive the digital transformation of the public sector holistically, stimulating the growth of the digital economy and the digitisation agenda throughout the country.
The CSP selection was done following an evaluation of the proposals sent by both domestic and foreign cloud service providers. As a result of this assessment, the Government agreed to appoint a panel of four companies as CSPs to provide the best and most secure cloud services for the government portals.
To empower MyGovCloud, a centralised cloud computing contract, known as a Cloud Framework Agreement (CFA), was signed between the Government, the CSPs and locally appointed providers known as Managed Service Providers (MSPs). The agreement involves four CSP companies and four MSP companies.
The involvement of the CSPs and MSPs is expected to open up investment potential in the country worth between RM12 billion and RM15 billion by 2025. This investment will serve as the ‘backbone’ of a sustainable digital ecosystem supporting sustainable economic growth, helping to make Malaysia the main digital hub in the ASEAN region.
The MyGovCloud initiative is expected to produce more experts in cloud computing through ongoing training and programme certification provided by the CSPs and MSPs. It should also improve the competence and ability of civil servants in tandem with current needs and technology.
Through the implementation of the CFA, public sector agencies will be able to enjoy better, faster and more efficient cloud services at competitive, bulk-discounted prices offered by the CSPs and MSPs. Overall, the agreement is aimed at innovating the procurement of cloud services across all public sector agencies and will help strengthen the function and role of MAMPU in the future.
With this deal sealed, it is hoped that more benefits will be enjoyed by the citizens of Malaysia as well as public sector agencies, promoting developmental digital technology among all Malaysians.
The government cloud market is expected to grow from US$15.4 billion in 2017 to US$28.8 billion by 2022, at a Compound Annual Growth Rate (CAGR) of 13.4%.
Government departments across the world are realising the importance of maintaining and controlling cloud data for continuity and compliance purposes. The government cloud enables these agencies to manage and store data securely and efficiently, resulting in enhanced and unified teams that can handle bigger projects at an effective cost.
Cloud hosting services provide a slew of advantages for government agencies and other small departments. These services may be taken on rent to fulfil computing power and data storage requirements, as often as needed, instead of a one-time investment being incurred for the procurement of servers and the handling of ongoing expenses towards maintaining expensive data centres.
In addition, the government cloud provides greater computing capability particularly in the implementation of disaster recovery and relief situations, as it enables government agencies to develop customised solutions for backup, with regards to the data and application types, sequence, and backup location. It involves the replication of application and data on a virtual machine, and the data can be recovered automatically when a disruption occurs.
Data is information that has been organised in a way that makes it simple to move or process. It is a piece of information that has been converted into binary digital form for computers and modern methods of information transmission.
Connected data, on the other hand, is a method of displaying, using, and preserving relationships between data elements. Graph technology aids in uncovering links in data that conventional approaches are unable to uncover or analyse.
Different sectors have invested in big data technologies because of the promise of valuable business insights. As a result, various industries express a need for connected data, particularly when it comes to connecting people such as employees or customers to products, business processes and other Internet-enabled devices (IoT).
In an exclusive interview with Mohit Sagar, CEO and Editor-in-Chief of OpenGov Asia, Chandra Rangan, Chief Marketing Officer of Neo4j shared his knowledge on how a connected data strategy becomes of paramount importance in building a smart nation.
Connected data enables businesses
A great example of the power of graph technology, and a very common use case for Neo4j, is its use in the financial sector to uncover fraud. Finding fraud is all about trying to make connections and understand relationships, Chandra elaborates. A graph-based system could detect if fraud is taking place in one location and determine if the same scenario has occurred in other locations.
“How does one make sense of this? Essentially, you are traversing a network of interconnected data using the relationships between that data. Then you begin to see patterns develop and these patterns provide you with answers so that you can conclude whether there is fraud.”
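The pattern-traversal idea Chandra describes can be illustrated with a toy example: accounts that share identifiers (a phone number, an address) form a graph, and walking the relationships surfaces rings that row-by-row checks would miss. This is an illustrative sketch in plain Python; in Neo4j the same idea would be expressed as a Cypher pattern query:

```python
from collections import defaultdict, deque

# Toy data: each account holds identifiers that may be shared by fraudsters.
accounts = {
    "acct1": {"phone": "p1", "addr": "a1"},
    "acct2": {"phone": "p1", "addr": "a2"},   # shares a phone with acct1
    "acct3": {"phone": "p2", "addr": "a2"},   # shares an address with acct2
    "acct4": {"phone": "p9", "addr": "a9"},   # unconnected
}

# Build adjacency: link accounts that share any identifier value.
by_value = defaultdict(list)
for acct, ids in accounts.items():
    for value in ids.values():
        by_value[value].append(acct)

graph = defaultdict(set)
for linked in by_value.values():
    for a in linked:
        graph[a].update(x for x in linked if x != a)

def ring(start: str) -> set:
    """All accounts reachable from `start` via shared identifiers (BFS)."""
    seen, queue = {start}, deque([start])
    while queue:
        for nbr in graph[queue.popleft()]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen

print(sorted(ring("acct1")))  # → ['acct1', 'acct2', 'acct3']
```

No single account looks suspicious on its own; it is the chain of relationships that exposes the ring.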
What is of great concern is that fraud is occurring with much greater frequency and with a higher success rate nowadays. The key to stopping and mitigating the impact is time. Instead of detecting a fraud that occurred hours or days ago,
“What if the organisation could detect it almost immediately and in real-time as it occurs?” asks Chandra. “Graph offers this kind of response and is why it’s a great example of value!”
Supply chain and management are other excellent examples of RoI. One of Neo4j’s clients, which operates arguably the largest rail network in North America, created a digital twin of the entire rail network and all the goods. With graph technology across their network, they can now do all kinds of interesting optimisation much faster, leading to better, more efficient outcomes for their entire system.
The pandemic has taught the world about the value and fragility of supply chains. Systems across the globe are being reimagined as the world’s economies realise the need to become more digital and strategic. More supply sources, data, data sharing, customer demands and increased complexity necessitate modern, purpose-built solutions.
Apart from all the new expectations and requirements for modern supply chains, systems need to and are becoming more interconnected because of new technologies.
Maintaining consistent profitability is difficult for firms with a high proportion of assets. Executives must oversee intricate worldwide supply chains, extensive asset inventories and field operations that dispatch workers to dangerous or inaccessible places.
With this, organisations need a platform that connects their workforces and makes them more capable, productive and efficient. A platform that provides enterprises with real-time visibility and connectivity, while also assuring efficiency, safety, and compliance.
Modern technologies are required to improve interconnectivity, maximise the value of data, automate essential procedures, and optimise the organisation’s most vital workflows.
Modern data applications require a connected platform
“When we programme, when we create applications, we think in what we are calling a graph. This is the most intuitive approach that you can have,” says Chandra.
Any application development begins with understanding the types of questions people want to solve and then mapping it to a wide range of outcomes that they want to achieve. These are typically mapped in what is known as an entity relationship diagram.
People increasingly rely on systems that work in a way that makes sense to them and supports them, which raises the stakes when those systems fail. In such situations, Neo4j makes sense of complexity and simplifies what needs to be done, resulting in a significant acceleration.
As the world becomes more collaborative, integrated, and networked, nations must respond more quickly to changes in their business environment brought on by the digital era; otherwise, they risk falling behind or entering survival mode.
The proliferation of new technologies, platforms, and devices, as well as the evolving nature of work, are compelling businesses to recognise the significance of leveraging the most recent technology to achieve greater operational efficiencies and business agility.
A graph platform connects individuals to what they require, when and where they require it. It augments their existing processes by facilitating the effective recording and management of personnel data. Neo4j Graph Data Science assists data scientists in finding connections in huge datasets to resolve important business issues and enhance predictions.
Businesses employ insights from graph data science to discover activities that point to fraud, find entities or people who are similar, enhance customer happiness through improved suggestions, and streamline supply chains. The dedicated workspace combines intake, analysis, and management for simple model improvement without workflow reconstruction.
As a result, people are more engaged, productive, and efficient with connected data. Nations can bridge information and communication gaps between executive teams, field technicians, plant operators, warehouse operators and maintenance engineers. Increasing agility and productivity offers obvious commercial benefits.
In short, organisations easily integrate their whole industrial workforce to increase operational excellence and decrease plant downtime, hence maximising revenues. This methodology is based on a collaborative platform direction.
Contextualising data increases its value
According to Chandra, data is a representation of the world in which people live, and people use data to represent that world. The world is becoming more connected; people no longer live in silos but remain interconnected in society.
“If you think about data as the representation of the world that we live in, it is connected data and we can deal with all the complexities that we need to deal with when we try to make sense out of it,” explains Chandra.
Closer to home, connected data is crucial to Singapore’s development as a smart nation. “Connected data is at the centre of each of those conversations around developing the nation. When you think of Singapore as a connected ecosystem and when you think about citizens, services, logistics, contract tracing, and supply chain.”
Chandra believes that the attributes have saved the connection between data and people, which is why connections are important. Once people understand those connections, it becomes much easier and much faster to derive the insights that are required.
Without connected data, organisations lack key information needed to gain a deeper understanding of their customers, build a complete network topology, deliver relevant recommendations in real-time, or gain the visibility needed to prevent fraud.
Thus, “knowing your customer is understanding connected data.” With the right tools, data may be a real-time, demand-driven asset that a financial institution can utilise to reinvent ineffective processes and procedures and change how it interacts with and comprehends its consumers.
“Me as a person – who I am, my name, where I live – these are all properties of who I am. But what really makes me me, are the relationships I have built over time. And so, the notion that almost every problem has data that you can really make sense of with graphs is the larger “Aha” moment,” Chandra ends.
COVID-19 has caused significant disruptions in the domestic economy, as community restrictions have limited people’s movement and business operations. The silver lining in the global catastrophe is that the pandemic drastically accelerated digital transformation across sectors. Digital technology has become critical for nations around the world in dealing with the crisis, moving toward economic recovery, and resuming long-term goals.
The application of digital technology to economic activities resulted in the emergence of the “digital economy,” which is defined as an economic system that achieves rapid optimisation of resource allocation and high-quality economic development by identifying, selecting, screening, storing and utilising large amounts of data.
As a result of the pandemic, many new digital businesses were established, and others abandoned traditional approaches in favour of tech-enabled strategies. Digitalisation provides a competitive edge for a country when used in conjunction with complementary policies and initiatives. The value of digitalisation is best harnessed when complementary technologies, resources, and capabilities are properly utilised along with appropriate legislative frameworks.
Malaysia, too, has had a forceful and robust response to the pandemic. Proactive and calibrated policies are assisting in the protection of vulnerable people and the revitalisation of the Malaysian economy. This country unveiled the MyDIGITAL strategy, which is a combination of re-evaluated efforts and new initiatives designed to develop Malaysia’s Digital Economy.
Discussing the latest research and case studies on the current use and possibilities for cloud computing was the focus of the OpenGovLive! Virtual Breakfast Insight on 27 July 2022 for senior digital executives of the Malaysian public sector.
Cloud Computing: An Enabler for Digital Government
Kicking off the session, Mohit Sagar, CEO & Editor-in-Chief, acknowledged that the COVID-19 crisis forced nations to move quickly to provide unprecedented emergency assistance to keep citizens safe, households and businesses afloat, protect jobs and incomes and keep the economy from collapsing.
Due to the unprecedented restrictive movement measures, people almost completely shifted to remote functioning – whether for work, education, entertainment, banking or commerce. This caused a massive uptick in the number of transactions online.
Businesses and agencies using cloud-based technologies were able to continue their operations without interruption. Others who were unprepared quickly realised the need for the deployment of digital solutions and platforms, infrastructure, data storage and processing capacities to adapt to the new normal.
What was formerly seen as either non-essential or difficult is now the preferred method of functioning for many. People have tasted, appreciated and gotten used to the ease and effectiveness of utilising information and communications technologies. While many people have had to return to conventional offices there is a continued preference for a hybrid model.
The entire shift demonstrated how critical it is to have stronger platforms for both the public and private sectors. Digitalisation has proven to be too effective to pass up given the promise of higher and safer living standards and greater social inclusion irrespective of the environment.
One of the primary enablers of this move to digital is cloud computing technology. It has enabled the delivery of government services in a more agile, fast and cost-effective manner than with traditional information technology infrastructure. Public service can be future-proofed by migrating government systems to the cloud and incorporating its full capabilities into new digital solutions.
Be that as it may, many governments still struggle to effectively use cutting-edge technology to deliver better services to citizens.
Some more technologically advanced nations have demonstrated how the cloud strategy made it possible for ever-innovative ways to improve the delivery of public services. Yet, the deployment of cloud computing in many other nations’ public sectors still faces obstacles. This requires revisions or creations of government-wide policy enabling regulatory conditions more suitable for a robust cloud strategy.
To make Malaysia a nation that is growing sustainably with fair economic distribution as well as equitable and inclusive growth, the digital economy was selected as a key economic growth area (KEGA) in realising WKB 2030.
In Malaysia, MyDIGITAL was set up as a national initiative to show how the government wants to transform the country into a digitally driven, high-income country in the digital age. It is intended to support national development initiatives including the Wawasan Kemakmuran Bersama 2030 (WKB 2030) and the Twelfth Malaysia Plan (RMKe-12).
With the help of MyDIGITAL, Malaysia will be able to successfully convert into a highly prosperous, digitally driven country that leads the region in the digital economy.
Responding to Urgent Necessity with Innovation
Seng Heng Chuah, Malaysia Country Manager, Public Sector, Amazon Web Services emphasised that while COVID-19 has disrupted traditional teaching methods, it has also prompted a rethinking of how education can be delivered.
As forward-thinking educational institutions reimagine their delivery models, they are paving the way for new ways to equip students with the skills necessary to succeed in the digital economy.
With this, the AWS Educate Programme provides resources for students and educators to build cloud skills. It is used in Malaysian educational institutions such as the Asia Pacific University of Technology and Innovation (APU).
At APU, they used serverless tools like AWS Lambda to run a secure multi-platform mobile application to improve the user experience for both their staff and students.
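APU’s actual implementation is not public in detail, but the shape of a Lambda-backed mobile API of the kind described can be sketched in a few lines of Python. The function and field names below are illustrative assumptions, not APU’s real code:

```python
import json

def lambda_handler(event, context):
    """A minimal AWS Lambda handler behind an API gateway:
    returns a timetable lookup for a student ID in the query string."""
    params = event.get("queryStringParameters") or {}
    student_id = params.get("student_id")
    if not student_id:
        return {"statusCode": 400,
                "body": json.dumps({"error": "student_id is required"})}
    # In production this would query a database; a stub response is used here.
    return {"statusCode": 200,
            "body": json.dumps({"student_id": student_id,
                                "timetable": ["09:00 Cloud Computing",
                                              "11:00 Data Structures"]})}
```

Because the handler is just a function that runs on demand, the university pays only for invocations rather than for idle servers, which is where the cost and delivery-speed gains of serverless come from.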
“This not only helped the university to reduce user complaints by 65%, but it also empowered the university to achieve 116 times faster delivery of educational resources,” says Chuah.
He is excited to note that the most advanced cloud customers in Malaysia come from the Education sector. They don’t just use the cloud for R&D but also for day-to-day operations. They get students to develop solutions on the cloud because that’s the future of IT – it’s all about software and services.
More public sector agencies are utilising the cloud, a development that is widespread across all nations. In turn, citizens demand more intelligent applications beyond e-government portals; they want to communicate with the government more effectively, perhaps through chatbots and similar platforms.
The cloud is a vital facilitator of the digital economy, which many nations see as a growth driver, while high-performance computing (HPC) finds its most efficient use in science and research. Similarly, governments can use the cloud to implement solutions for the Internet of Things (IoT), blockchain and Artificial Intelligence (AI) that would be expensive and time-consuming to build on-premises.
“At the end of the day, it’s about creating a better environment for everyone to thrive. And that includes carving out a space for local innovation – something we’re passionate about,” Chuah firmly believes.
Many government organisations use the cloud to enhance the delivery of their services. For instance, the Department of Statistics Malaysia (DOSM) moved to the AWS Cloud and made its census data accessible to 9 million consumers.
When DOSM switched from maintaining expensive on-site infrastructure to using cloud resources, the government was able to save 25%–50% on resource expenses. The move enabled DOSM to manage all traffic on the census portal, even at its peak of 200,000 users.
AWS and the government are working together to set up a hybrid cloud data centre using AWS Outposts. The hybrid cloud data centre, intended solely for use by the federal government, is expected to be available by the end of the year.
Also, while most apps can be moved to the cloud easily, others must first be re-architected or “modernised,” and some must stay on-premises for the foreseeable future because of low-latency, local data processing or data residency requirements. These applications must be deployed in on-site data centres, branch offices, manufacturing facilities, restaurants, edge nodes in major metro areas, 5G networks and other remote locations.
Chuah shared three trends that underpin the need to support applications that may need to reside outside of traditional cloud regions and availability zones, in addition to existing legacy on-premises and edge workloads.
The first trend is the emergence of a new class of ultra-low latency applications, such as real-time gaming, video streaming, AR/VR, autonomous vehicles, content creation, engineering simulations and ML inference at the edge. These applications are used in on-premises data centres, branch offices, hospitals, factory floors, retail locations, at cell tower sites and near groups of artists, scientists and engineers. End users frequently access these ultra-low latency applications via mobile devices, so they must be deployed at the 5G network edge to take advantage of 5G’s speed and bandwidth.
The processing of local data is a second trend. Customers’ digital transformation initiatives and increased use of IoT devices are producing massive amounts of data. Due to cost, size, bandwidth or scheduling limitations, some of these data sets must be processed locally because they can’t be transferred to the cloud.
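As a rough sketch of what such local processing can look like (the article names no specific device or pipeline, so the function and fields below are assumptions), an edge node might aggregate raw sensor readings and ship only a compact summary to the cloud instead of every data point:

```python
def summarise(readings):
    """Reduce a batch of raw sensor readings to a small summary record,
    so only this summary (not the full batch) needs to cross the network."""
    if not readings:
        return None
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }
```

A device sampling once a second would upload four numbers per batch rather than thousands of raw values, which is the essence of working around the cost, size and bandwidth limits described above.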
Data residency is the third trend. Due to security and tax laws, data sovereignty and shifting geopolitical factors, customers may be required to keep their data in a certain nation, state, or municipality. When a customer’s data residency requirements cannot be met by a cloud region, they must maintain and/or install additional IT infrastructure to support those workloads.
Sustainability, according to Chuah, remains AWS’ top focus; even as AWS stepped up its fight against COVID-19, it has not forgotten another pressing global issue: climate change. “We are dedicated to developing a sustainable business for both our clients and the environment.”
This commitment to sustainability is seen in AWS’s co-founding of The Climate Pledge in 2019. Its goal is to have all its operations powered by renewable energy by 2025 and to achieve net-zero carbon emissions by 2040.
As a significant technology firm, AWS is aware of its environmental impact and the steps it can take to lessen it. Organisations can reduce their energy use by up to 76 per cent by migrating to the cloud. “Ensuring we have the right components to thrive in this digital economy is necessary by building a plan, assessing the readiness for the cloud, and migrating and modernising the workloads.”
All innovations and initiatives shared by Chuah demonstrate not only the revolutionary power of digitisation and modernisation, but also the resilience of the human spirit in the face of hardship.
The Cloud Imperative – A Leadership Question
Andre Mendes, Chief Information Officer of the US Department of Commerce, shared some examples of his experiences over the last decade where cloud hosting became a de facto standard.
In 2009, the Special Olympics became almost 100% cloud-based globally, serving as a testing ground for many vendors on a minimal budget. As a result, the Special Olympics experienced no disruption and grew its athlete base from 1.4 million to 4.5 million over the next ten years.
“This is a sample of massive progress despite resistance,” says Andre. “Even though many donations came in from large IT players and lower-risk non-profits, there were sceptics and risk avoiders.”
Another example is the US Department of Commerce’s International Trade Administration (ITA), which became 100% cloud-based in 2018. Despite a minimal budget, ITA maintained a fully integrated environment spanning IT, communications and telephony.
From the start of the pandemic, ITA maintained seamless operations and led the way with ZTA and Borderless Networks as they reinvested in custom Agency software functionality.
“Leadership must be adaptable as the environment evolves,” Andre reiterates. “Leaders identify a fundamental shift in the competitive environment and act to mitigate a potential disruption or, better yet, gain an advantage by seizing new opportunities before competitors do. For most businesses, digital transformation begins from the outside in.”
Even the most forward-thinking transformation strategies are doomed to fail if they do not place equal emphasis on the inside and outside of the organisation.
According to Andre, an organisation needs to support the shift and the new working methods that come with it; therefore, leaders must spearhead the transformation. Any digital transformation programme, including cloud migration, is more likely to be successful when the leadership is on board.
As more businesses migrate to the cloud, a growing number of internal cloud migrations happen as businesses switch between multiple cloud providers. It’s crucial to evaluate the organisation’s requirements and identify the variables that will control the transfer, including historical data, critical application data and application interoperability.
Next, it is necessary to classify data to identify which needs migrating, and what needs scrubbing. Determining these requirements will help the organisations create a sound plan for the tools they’ll need during migration. They’ll also be able to choose the right destination volumes and decide whether the data needs to be encrypted at rest and in transit.
“Always look for innovation; the biggest risk is not moving forward,” Andre is convinced. “Innovation represents the enhancement of something that has already been, and the most innovative people will eventually experience long-term entrepreneurial success. Consumers and peers recognise businesses as true innovators and leaders when they take the biggest risks, close the widest gaps, and seize the newest opportunities.”
Following the informative talks, the delegates took part in discussions encouraged by polling questions. The goal of OpenGovLive! Virtual Breakfast Insight is to provide live audience engagement, inspire participation, and allow people to learn from and grow professionally from real-life experiences.
On being asked what the delegates’ cloud strategy was, most responded with a hybrid cloud. Delegates said that they could use cloud services where they are most effective while keeping certain operations on-premises or within a private cloud. This allows for greater flexibility.
On how organisations evaluate the success of their cloud adoption, the majority opted for high availability/downtime management, while others cited resource productivity, efficiency and cost savings.
Most organisations lack a system for evaluating the success of cloud adoption. Furthermore, there isn’t much information available on assessing the success of cloud adoption within an enterprise.
Delegates said that price remains the number one criterion for choosing a cloud service provider, followed by security and performance.
A delegate felt that before a business can effectively choose a good provider, it needs to know what its business needs are. When organisations know precisely what they need in terms of technical, service, security, data governance, and service management, they can ask their small group of potential providers better questions.
On being asked what they thought were barriers to going digital and using the cloud, management support and budget were seen as the greatest ones.
With speed and agility being clear advantages of cloud adoption, cost can quickly become a barrier to success. Adopting the cloud makes it easier and quicker to deploy more environments and leverage more resources, but it also brings higher bills and significant security issues for teams that are not careful.
In the last poll, the delegates were asked how they planned to update their legacy and application systems. The majority answered application assessment to move to the cloud, while others worked with a cloud service provider and outsourced to a system integrator.
For many organisations, legacy systems are seen as stifling business initiatives and processes; however, they have begun to recognise the importance of modernisation to help their business grow.
Mohit agrees that scaling a firm and preserving profitability call for developing partnerships that simplify digital transformation for customers. “Partnerships are the way forward for companies who want to use the cloud.”
Partnerships can offer the knowledge needed to market, sell, create, integrate, customise, deploy and support new applications on-premises, in the public cloud or in hybrid cloud architectures.
“Public service must be genuinely available for the citizens and Cloud is the future,” says Mohit. “Partnerships enable providers to diversify their offerings by adding things like managed security, IoT solutions, and analytics.”
In the complicated, evolving world of cloud computing, IT companies frequently collaborate for financial gain, but increasingly they also do so because customers expect things to simply work.
Andre was delighted to be invited as a speaker and was encouraged by the fact that many young people, particularly women, are entering the IT arena and that many new skills will be developed.
Chuah said that the pandemic has taught the world that “changing is the only constant”. People can be confident that they can bend, respond and adapt without breaking when life throws them a curveball if digital innovation is at the core of a long-term economic plan.
AWS is steadfast in its stance. It is passionate about driving the public sector’s digital transformation and committed to developing cloud computing services that catalyse sustainable digital government. With the appointment of the Cloud Service Provider (CSP) panel, it will continue to deliver on its commitment to support the Government’s strategic initiative.
A cloud migration begins with a clear plan and vision, and the “only way to go to the cloud is to try and be confident to use it.”
Chuah spoke about the AWS Migration Acceleration Programme (MAP), a comprehensive and tried-and-true cloud migration programme. Enterprises’ migrations can be complex and time-consuming, but with an outcome-driven methodology, MAP can help them accelerate their cloud migration and modernisation journey.
“We remain focused on supporting Malaysia to lead in today’s digital economy as we leverage our global experiences with more than 7,500 public sector agencies to enable our customers through Cost Savings, Staff productivity, Operational Resilience and Business Agility.”
Cloud computing services, artificial intelligence (AI), the Internet of Things (IoT), 5G technology, fixed broadband Internet, and blockchain technology are expected to lead the information technology and telecommunications sector over the next few years. According to a recent survey, technology companies are investing in core and fundamental technologies to serve digital transformation. Encouraging the digital transformation of business could be a crucial step as the Department of Enterprise Management earlier estimated that the country’s gross domestic product (GDP) could surge by US$30 billion if the country successfully digitises its small and medium-sized enterprises (SMEs).
Cloud computing services in Vietnam are forecast to develop with better security than physical servers, helping organisations increase productivity and reduce machinery and infrastructure costs. Vietnam’s cloud computing market is predicted to grow by nearly 26% annually, the fastest pace in Southeast Asia and higher than the global average of 16%.
About 66.67% of enterprises are applying AI in their digital transformation process. AI can manage and optimise infrastructure and customer support and is expected to reach all businesses in the future. Last year, the government issued a national strategy on the research, development and application of AI until 2030, aiming to gradually turn Vietnam into an innovation and AI hub in ASEAN and the world.
Internet of Things
The rate of firms using the IoT this year has reached 86.67%, up from 66.67% in 2021. IT experts believe it has the greatest development potential at present. Businesses can connect to IoT devices remotely, collecting and managing their data, processing it on demand and sharing it with devices outside the IoT network. In this way, the IoT can automate processes, minimise labour and infrastructure costs, manage supply chains, optimise energy use, and improve sustainability.
5G is expected to contribute about 7.34% to the country’s GDP by 2025, according to the Institute of Information and Communications Strategy under the Ministry of Information and Communications (MIC). The application of 5G services will help telecom enterprises boost the use of AI and IoT in smart city building and business operations and meet digital users’ demand for high-definition videos, virtual reality, and augmented reality.
According to the MIC’s Authority of Telecommunications, telecoms network infrastructure has been expanded to 100% of communal-level localities. The 2G, 3G, and 4G mobile networks have covered 99.8% of the population while 5G has been piloted in 16 provinces and cities.
By the end of last year, there were nearly 71 million mobile broadband subscribers and 18.8 million fixed ones, a 4% and 14.6% increase from 2020, respectively. Internet traffic in Vietnam also rose by over 40% last year. Currently, the proportion of adults using smartphones in Vietnam is 73.5%. Vietnam aims to increase the rate to 85% by the end of 2022.
With an increase in blockchain technology applications and rapid, large-scale digital transformation, Vietnam has the potential to compete in the global market and become a hub for technology. It is estimated that by 2030, blockchain will create 40 million jobs, and 10-20% of the country’s economic infrastructure will run on blockchain-enabled systems.
The use of artificial intelligence (AI) has increased rapidly, accelerating during the pandemic. Machine learning, deep learning algorithms and models process massive amounts of data to enable faster, smarter, and better decision-making. As a result, tech-enabled forecasting holds enormous promise for the financial industry which has long been the steward of massive data sets.
Business leaders recognise the importance of data tailored to each function and the role analytics tools play in leveraging data. In this context, data-driven decision-making analytics software inherently provides a competitive advantage.
Advancements in data, analytics and machine learning mean that businesses with large amounts of data have an incredible opportunity to capitalise on it. However, they must do so with an eye toward scale, change management and a curiosity culture.
Data-driven decision making
Dr Geraldine Wong, Chief Data Officer, GXS, revealed in an exclusive interview with Mohit Sagar, CEO and Editor-in-Chief, OpenGov Asia, that collecting, extracting, structuring and analysing business insights was historically a time-consuming task that slowed data-driven decision-making.
Dr Geraldine, who is among the 2022 Global Top 100 Innovators in Data and Analytics Today, notes that business intelligence software has since enabled people without significant technical experience to analyse and extract insights from their data. Less technological expertise is needed to provide reports, trends, visualisations, and insights that aid decision-making.
AI technologies such as machine learning and natural language processing are evolving and, when combined with data, analytics and automation can assist organisations in achieving their objectives – whether improving customer service or optimising the supply chain.
Companies should clearly understand what AI means for their business and then recognise how it adds value. Questions of skill sets, and of how AI is understood across different cultures, are equally significant.
According to Dr Geraldine, everyone can be a data scientist. The major challenge is finding the appropriate fit – the right individuals with the proper skill set – and then keeping them motivated and engaged.
Moreover, having the right set of digital tools to manage data insights content and digital marketing is essential. With this, organisations can create a strategy to engage their target customer segments from the start to the end of their customer journey. For example, companies can harness insights into customer behaviour and patterns, personas, conversion rate optimisation and many more digital metrics essential to anticipating customer needs and offering products and services which are most relevant to them.
“The way you market your products using these digital technologies will boost engagement because it is derived from data-driven insights,” Dr Geraldine believes.
Data privacy and trust concerns could also be a cultural component. Distinct cultures have varied ways of communicating and creating trust, as well as different approaches to cyber security and fraud prevention.
Dr Geraldine feels that it is essential for companies to take seriously their responsibility in protecting data privacy as well as to know how to build and earn the trust of their customers.
As part of the trust and innovation mesh, Dr Geraldine says there are fundamental questions that should be addressed – How do we make information more accessible? How can we make it simple for people to use our app? How do we ensure that our app is intuitive to our customers? She is convinced that companies have a role to play in bringing traditional physical business to a digital space. When integrated into a digital campaign, traditional marketing can reach more people, spread the message faster and increase the return on investment for the campaign.
However, this only happens when different products and services are promoted through a multi-channel approach as part of an integrated marketing strategy. To move prospects down the sales funnel, businesses need to ensure that the conversation with customers remains seamless across multiple communication channels, whether online, offline or both.
Effective data governance
Effective data governance allows business users to make decisions based on high-quality data and well-managed information assets. However, putting in place a data governance framework is not easy. Dr Geraldine strongly believes that data governance should be a priority in both the public and private sectors.
Data ownership issues, data inconsistencies between departments, and the growing collection and utilisation of big data in businesses are all common concerns. Data governance enables processes to run smoothly and reduces mistakes in a database, giving the business a solid place to start. It also saves time and money.
In terms of data governance across the private and public sectors, Dr Geraldine is convinced that it should be planned and organised carefully and intentionally. A successful data governance strategy involves careful preparation, the proper people and the right tools and technologies.
A data governance framework provides industry best practices for managing data governance initiatives and exploring data stewardship. Data quality, privacy, and governance as a means of building trust are consistently recognised as the most significant issues in data management. As the relevance of data democratisation for business transformation grows, so does the number of non-technical data consumers who desire convenient self-service access to data for their use but are ill-equipped to control data properly.
As anticipated, the outcomes of the data governance journey, data quality and privacy are critical to promoting enterprise-wide data literacy to deliver commercial value while maintaining confidence. It is essential to eliminate operational risks and allow individuals to use data responsibly.
Even if increasing the use of AI and automation accelerates the process of creating value, everyone in the organisation must be able to use data the same way. This can benefit the organisation by making data exchange faster, more widespread and more straightforward. Introducing new data sources and information can aid operational reporting and analysis and make data-driven decisions easier.
Modern business requires data literacy
Data literacy is about educating stakeholders about the information available and organising it in a way that makes it easy to identify and consume. When a data governance team acknowledges the importance of data literacy in an organisation’s data governance strategy, the result is a well-defined data catalogue that any staff member can access.
Establishing trust is a holistic endeavour: it is both a leadership and a design issue. It requires both cultural and practical strategies, as well as the engagement of everyone. Without widespread data literacy and clearly defined data terms and frameworks, communication channels can break down, resulting in catastrophic results.
Dr Geraldine highlighted the recently established Digital Trust Centre (DTC) of the Ministry of Communications and Information that could assist the banking industry in gaining consumer trust. However, digital trust must be developed and started within the organisation before it extends to external stakeholders.
Organisations require solid, mutually beneficial partnerships to successfully grow together and exploit new business opportunities to embark on such a remarkable digital journey. Customers may utilise rapid digital innovation to create new business models, enter new markets and accelerate their profitable expansion.
Moreover, alternative data is becoming increasingly popular in Singapore. Dr Geraldine is excited about the use of securely shared data to make financial services more available to a broader range of customers.
With developing technology, the banking industry may expand its use of alternative data, with stronger protections, within five years.
In addition to the ethical and responsible use of AI, another data trend that Dr Geraldine is expecting in the next two to four years is an increase in the number of organisations that will be doubling down on using data to build customer engagement models and within their ecosystems. “What will be interesting to see is how this translates into better customer experience and how companies ensure stringent data protection.”
Given the positive attitude that nations across the world have to technology and the plethora of digital initiatives being put in place to better the citizen experience, Dr Geraldine is optimistic about the future of data governance and the opportunity to use data in a secure manner to improve customer experiences.
It is difficult to conduct business in today’s world without a dependable website, which is where professional web development services come in. Building an online presence for a business or organisation does not end with the creation of a basic website.
Web development tools let developers work with a range of technologies and should enable faster, less expensive mobile development. Responsive site design improves the browsing experience while also allowing better SEO, lower bounce rates and less upkeep.
The tools an organisation selects should deliver a good return on investment. Hence, cost-effectiveness, ease of use, scalability, portability and customisation are the factors to consider when choosing a web development tool.
Neo4j’s Vice President of Global Cloud and Strategic Sales, Kesavan Nair, or simply Kay, discusses in depth how big companies set up their growth engine drivers with Mohit Sagar, CEO and Editor-in-Chief of OpenGov Asia.
With his proven track record of entrepreneurial leadership in the open-source domains, cloud, SaaS, big data and analytics with both early-stage technology firms and large public companies, Kay is an authority on the topic.
On-Premise vs Cloud
The location of the data is the key distinction between cloud-based and on-premises (on-prem) software. Cloud software is hosted on the vendor’s server and accessed through a web browser, whereas on-premises software is installed locally on the company’s PCs and servers.
When making a choice, a variety of factors must be considered in addition to accessibility – software ownership, cost of ownership, software upgrades, and additional services like support and implementation.
Kay explains that the cloud database as a service (DBaaS) market is one of the fastest-growing markets in enterprise software. “We need to make sure that we are being able to be where our customers want us to be, which is in the public cloud.”
As an example, he cited the use case of Levis, one of their longstanding customers. Levis had been running on-prem for a long time and wanted to switch as part of its digital transformation strategy. It had eight different applications across various business units running on-prem and wanted to move them all into a cloud service running on Amazon. Neo4j helped with the migration in about three months.
“That was an excellent example of how Neo4j AuraDB Enterprise aided the execution of Levis’ digital transformation,” Kay enthusiastically stated. “For Levis, Neo4j became one of the main motivators for the enterprise to experiment and try new ideas, which accelerated their transformation quite quickly.”
Neo4j counts both start-ups and established companies in its fold. Its largest customers include the likes of Siemens and Dun & Bradstreet. Other customers include PwC Australia, PwC US, BMW and Walmart, while Current, a neobank in the US, runs its core database system on Neo4j, as does Qualicorp, the biggest healthcare insurance provider in Brazil, for its mission-critical database systems.
Speaking of their journey, Kay shared that Neo4j started as a database company, with most customers using the Neo4j database for transactional workloads. Now, interestingly, about 90% of their customers use either a public cloud or a cloud managed by Neo4j.
“We’ll soon cover all the major cloud service providers, so customers can choose where to deploy their apps and where to use the service. This will bring us closer to where our customers are growing,” says Kay confidently.
Graph Data Platforms: The First Choice for Application Development
According to Kay, their graph database promises data consistency, performance, and scalability. It can search for patterns and connections in data’s interconnected relationships. “Neo4j now includes a graph data science platform. Both data scientists and developers can use this platform to meet their demands. And I believe it gives us an extremely attractive product to the market at large.”
When governments had to locate community infections due to the pandemic, the benefits of the Graph Data Platform were most evident. The Graph Data Platform with AI has shown to be a great tool for data management in real-time, from tracking connections via complex social networks to understanding linkages.
On the other hand, graph data science assists organisations in addressing some of their most challenging and complicated problems. “Neo4j Graph Data Science is a platform for connected data analytics and machine learning that enables you to better anticipate the future by understanding the relationships in huge data.”
He shared that the two key strategic products under the Neo4j Aura portfolio of cloud products are AuraDS (built for data scientists) and AuraDB (built for developers).
Graph Database Technology is specifically designed and optimised for identifying patterns and hidden connections in highly interconnected datasets. Graph data stores are easy to use because they mimic how the human brain thinks and maps associations using neurons (nodes) and synapses (relationships).
A graph database stores and queries connected data efficiently in a node-and-relationships format. As a result, graph technology excels at problems where there is no prior knowledge of path length or shape, efficiently finding neighbouring data through its graph storage and infrastructure.
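The idea can be illustrated with a toy example in Python. This is a simplified sketch, not Neo4j’s actual storage engine or the Cypher query language: nodes are keys, and each node holds a list of (relationship type, neighbour) pairs, so a connection can be found by walking neighbours outward without knowing the path length in advance.

```python
from collections import deque

# A toy property graph as adjacency lists. Real graph databases such as
# Neo4j store direct pointers between records, which is what keeps
# neighbour lookups cheap regardless of total graph size.
graph = {
    "alice": [("KNOWS", "bob")],
    "bob":   [("KNOWS", "carol"), ("WORKS_AT", "acme")],
    "carol": [("WORKS_AT", "acme")],
    "acme":  [],
}

def shortest_path(start, goal):
    """Breadth-first search from start to goal; returns the list of
    nodes on the shortest path, or None if no path exists."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for _, neighbour in graph.get(path[-1], []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(path + [neighbour])
    return None
```

Here `shortest_path("alice", "acme")` discovers the two-hop connection through bob; in a relational database, the same question would require an open-ended chain of self-joins.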
Kay listed some of the most typical graph usage cases:
- Fraud Detection & Analytics
- Artificial Intelligence & Machine Learning
- Real-Time Recommendation Engines
- Knowledge Graphs
- Network & Database Infrastructure Monitoring
- Master Data Management (MDM)
All of these have one thing in common – to be successful, an enterprise needs to use datasets that dynamically change over time and are connected to each other.
Neo4j offers four benefits of using graph databases:
- Natural and easy data modelling
- Ability to adapt to changing data structures
- Support for real-time updates and queries running simultaneously
- Storage and a natively indexed data structure
Connected data in property graphs enable the enterprise to illustrate and traverse many interactions and find context for the next breakthrough application or analysis.
Kay encourages businesses to choose a cloud strategy that fits their needs and to look for a provider that lets them move their assets whenever they want, as many enterprises themselves have evolving cloud strategies. Flexibility, he stresses, is very important.
“With us, Neo4j, you find value. It was predicted that by 2025, all smart applications would use graph technology in some way. So, graph databases are a natural fit for any new application that is being built today. This is because it is much easier to get insights from them,” Kay believes.
Neo4j Graph Database Platform has developed into a common form of information technology and has benefited businesses in a variety of ways. Numerous corporate game-changing use cases in fraud detection, financial services, life sciences, data science, knowledge graphs, and other areas have been made possible by the Neo4j Graph Database’s speed and efficiency advantages.
In the current VUCA environment, data security is crucial and challenging, particularly when dealing with sensitive data and the laws that govern it. Neo4j offers both safety and compliance, and frequently updates, enhances and expands its platforms. It can secure data in a variety of ways, including access control, user roles, protected environments and system design.
The Neo4j Graph Database has emerged as a critical technology for hundreds of companies, government agencies and non-governmental organisations, and will continue to be so. Kay is optimistic about the future and confident that Neo4j will always be well placed to offer the best services to both the public and private sectors.