Citizen engagement is always a top priority
for governments, and a high
level of citizen engagement is considered an indicator of a developed society.
The ladder of participation introduced
by Sherry R. Arnstein in 1969 describes three different zones of citizen
participation. The two bottom rungs, manipulation and therapy, describe a zone of non-participation in which
governments seek to "cure" and "educate" participants instead
of enabling and empowering citizens to participate.
The second zone, tokenism, is where governments
allow citizens to have a voice and be heard. Informing and consultation are the
rungs at which governments inform citizens about their decisions and directions
and invite citizens to give feedback to powerholders. Here a voice is heard,
but it carries no muscle: there is no real change and no right to decide.
The third zone represents the highest level
of power, where the relationship between governments and citizens is more of a partnership and
citizen control develops with increasing degrees of decision-making power.
Ladder of citizen engagement
Governments around the world have experienced
one or more of the stages above, from informing to empowering: from providing
citizens with objective information on government plans to the highest
possible level of engagement, where the government opens all doors to
hear customers' voices (suggestions and complaints) that can drive change.
Crowdsourcing is an effective tool
for citizen participation. It first appeared as a business practice in which an
activity is outsourced to the end customers, or the crowd. The word crowdsourcing
also connotes efficiency, since it offers a low-cost solution; customer
centricity, since it involves large numbers of people; and viability as a business model.
Crowdsourcing is a type of smart, online
activity in which an individual, an organisation or a private business proposes a task to
a group of individuals of varying knowledge and interests, through a call
to the contact centre, a text message, or even a photo or a video.
Undertaking the task is completely voluntary.
Crowdsourcing is a practice that
should complement efforts to build smart cities. It is a tool that ensures
services are provided in a satisfactory manner and that the element of smartness is shared by both citizens and
governments.
Collecting customer feedback via traditional methods,
including websites, long emails and phone calls, is no longer relevant to our
smart era nor convenient for the smart customer. Rather, social media channels such as WhatsApp,
Twitter and Facebook have become more convenient for customers, though not necessarily for governments.
Most crowdsourcing solutions include the following steps:
1. A mobile application (customised to meet different needs and scenarios) gathers
information (customers' complaints or feedback) from individuals and public
or private parties.
2. To solve the problem in a systematic manner, the application is
equipped with tools to identify the location and assign the issue to the
concerned department.
3. The concerned department takes corrective action and addresses the issue within
a well-defined timeline.
4. The end user (the customer) is updated on progress, sustaining customer satisfaction.
Crowdsensing = Crowdsourcing + Analytics + IoT.
Crowdsensing is simply the next generation of
crowdsourcing, in which two more components are added to the above-mentioned steps in
order to back the solution with an analytics arm and an IoT flavour.
5. Use the power of data to analyse the customer's voice alongside other complaints from the same location, and correlate customer demographic information with customer insights.
6. To sustain the solution and maintain proactivity, innovative IoT solutions are used to monitor the location.
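The steps above can be sketched as a small program. This is a hypothetical illustration only: the department names, routing rules and complaint fields are invented for the example, not taken from any real crowdsourcing product.

```python
from collections import defaultdict

# Illustrative routing table: complaint category -> concerned department.
ROUTING = {"pothole": "Roads", "streetlight": "Utilities", "litter": "Sanitation"}

complaints = []  # step 1: complaints gathered via the mobile application

def submit(category, location, description):
    """Steps 1-2: capture a complaint and assign it to the concerned department."""
    issue = {
        "category": category,
        "location": location,  # step 2: location identified by the app
        "description": description,
        "department": ROUTING.get(category, "General"),
        "status": "assigned",
    }
    complaints.append(issue)
    return issue

def resolve(issue):
    """Steps 3-4: the department takes corrective action; the end user is updated."""
    issue["status"] = "resolved"
    return f"Update sent to customer: issue at {issue['location']} resolved"

def hotspots(min_reports=2):
    """Step 5: analytics arm - correlate complaints from the same location."""
    counts = defaultdict(int)
    for c in complaints:
        counts[c["location"]] += 1
    return {loc: n for loc, n in counts.items() if n >= min_reports}

submit("pothole", "Main St", "deep pothole near junction")
submit("litter", "Main St", "overflowing bin")
submit("pothole", "Main St", "second report")
print(hotspots())  # locations flagged for proactive IoT monitoring (step 6)
```

In a real deployment, the `hotspots` output would feed step 6, where IoT sensors monitor the flagged locations proactively rather than waiting for new complaints.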
Data is increasingly at the core of any business or organisation and is underpinning digital strategies and initiatives more than ever. Data has become a key component of digitalisation and the driving force behind and fuel for analytics, machine learning, edge computing, cloud and other cutting-edge technologies.
As the need to respond more quickly – indeed, in as near real-time as possible – intensifies, data will rapidly become the key competitive advantage. A company's capacity to compete will be determined by its ability to leverage data: apply analytics and generate intelligence.
Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, agrees that "data is the new oil". But like oil, raw data is not particularly useful in and of itself. Information – processed data – swiftly becomes a decision-making tool that allows businesses to react to market dynamics and make proactive, intentional decisions. The real value of data lies in the timely, actionable insights, trends and projections that can help organisations survive and thrive in a VUCA world.
Generating data is not really the issue at hand. Both the public and private sectors, for the most part, hold massive volumes of data and continue to add to it. However, much of this data has been fairly unorganised and siloed, making it difficult to access and process.
The question is: how can agencies and organisations best derive real value from these mountains of data, which are often distinct, distant and diverse? How do they collect, analyse and rationally build patterns and interconnections to improve decision-making and planning?
While organisations have been deploying AI and ML to gain and analyse insights from the data, a new platform has emerged that has the potential to offer deeper insights – Graph Database Technology.
OpenGov Asia had the opportunity to speak with Nik Vora, Vice President, Asia-Pacific, Neo4j to gain his insights on the importance of graph data and how organisations can derive actual value from it.
Nik Vora is the Vice President of Asia-Pacific at Neo4j. Nik has over 12 years of expertise in the tech industry and joined Neo4j as the company was looking to grow its operations into the Asia-Pacific region. In his present position, he oversees the APAC business, helping businesses and communities see the connections and linkages among massive amounts of data so they can make better decisions.
Genuine innovation or repackaging?
Mohit is keen to know: is this just old technology in new packaging, or is there legitimate value-add? If so, what do organisations gain from Graph Database Technology?
Nik Vora is quick to clarify that the tool is important because it has the capability to extract the inherent value in the data itself. Data needs to be seen as a network and not merely discrete data points – and the best way to visualise these relationships is in graphs.
Graph database technology considers the relationships between data to be just as significant as the data itself. The purpose of the technology is to store information without restricting it to a pre-defined model. The data is maintained in the same way it is initially collected, with each unique item connected or related to others. In a native graph database, accessing nodes and relationships is a speedy, constant-time operation that allows the engine to traverse millions of connections per second per core.
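The idea behind that constant-time access can be illustrated with a toy model. This is not Neo4j's actual engine; it is a minimal sketch of the principle (often called index-free adjacency) that each node stores direct references to its relationships, so expanding a node's neighbours is a pointer hop rather than an index lookup or a relational join.

```python
# Toy illustration of index-free adjacency: each node keeps direct
# references to its neighbours, so traversal cost is proportional to
# the edges touched, not to the total size of the database.

class Node:
    def __init__(self, name):
        self.name = name
        self.neighbours = []  # direct references to related nodes

def relate(a, b):
    """Create an undirected relationship between two nodes."""
    a.neighbours.append(b)
    b.neighbours.append(a)

alice, acme, bob = Node("Alice"), Node("Acme Corp"), Node("Bob")
relate(alice, acme)  # Alice works at Acme Corp
relate(bob, acme)    # Bob works at Acme Corp

# Two-hop traversal: who is connected to the things Alice is connected to?
two_hops = {n2.name for n1 in alice.neighbours for n2 in n1.neighbours}
print(two_hops - {alice.name})  # -> {'Bob'}
```

A relational database would typically answer the same question with a self-join over an edge table; the graph model answers it by following stored references.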
Companies, agencies and any organisation in the ecosystem, according to Nik, are looking to extract value from data. Over the last 24 months, there has been a massive acceleration of digitisation – of supply chains, processes, services and transactions. This has pushed more information online and allowed more data to be captured. In turn, businesses rely increasingly on data, with optimisations depending on how much value an organisation can create from it.
As data becomes more distributed, dynamic and diverse, it is important to capture it in real-time and process it to drive rapid action and feed into strategy, Mohit opines. This means that data needs to be on hand for those who need it. The importance of data availability and accessibility anytime and anywhere is even more pronounced in the current crisis. This is especially true for organisations engaged in providing mission-critical, customer-centric services.
Wholeheartedly agreeing, Nik says the greater the demand for data-driven insight and intelligence, the more important it is to grasp the importance of connectedness in existing data. Graph database technology is uniquely positioned to do this. Since the data is modelled as a graph and a network, a graph database platform is the 'most obvious approach' to looking at connections. "The value of relationships itself is the underlying driver for this technology," he explains.
Investing in data analytics and technologies without first determining what your specific organisation needs to succeed is indeed a waste. It is necessary to first build a big data strategy to get the most out of the data a company already has or plans to collect. A big data strategy lays out how data will be used in practice and what kinds of data a business needs to meet specific business goals.
However, does this mean that all organisations should reconsider their entire data collection strategy, including how they acquire, store and distribute data? This, Mohit feels, would be markedly prohibitive.
The answer to this is actually a bit of both and while there is an investment involved it is not unreasonable, says Nik. Organisations do not need to modify their data, but they do need to change their perspective.
The key concern should be how data is connected and how it relates to other data sets and points. Organisations have spent many years building data lakes and data warehouses, and all the data that any organisation could need already exists. What they need to do now is turn on the tap and start looking at the relationships between different data connected across silos, processes, networks and transactions.
The challenge, and the advantage, is that it is a very dynamic world. Given this new understanding of how interconnected everything is, an organisation that does not have a linked data strategy – one that looks at data, how it connects and what relationships and dependencies exist – is missing out on huge potential.
Many businesses rely on data to assist rather than drive their operations. But why is that? After all, data is only valuable if it can be turned into actionable insights. Finding out what you want from your data and determining its worth is the first step in gaining these insights.
"We are all gaining insight from our existing data in some way," posits Nik. "Organisations should be more intentional about it if they are to gain genuine advantages."
Within an organisation’s ecosystem, there are many existing relationships and connections. With the plethora of technologies, ecosystems and capabilities available, Nik believes that the ideal time to start investing is NOW. But investment is not just in technology but in people!
Data and analytics leaders are often perceived as the gurus of graph technology, but the truth is, Mohit points out, many still don’t comprehend it themselves. This means that there has to be an upskilling of the entire workforce if a company wants to gain real value from data. So, how do companies get started?
The strategy, Nik believes, is two-pronged: training and staffing. Organisations must empower their existing workforce to understand the value of and how to use Graph Database Technology. Above this, they need to bolster organisational capacity by hiring the right people. Although there is a lot of great talent in the market and a relatively large pool, Nik advises caution in recruitment as skills are relatively easy to fake.
"When you embark on a project or a journey, you have enough (and more) talent in the partner ecosystem, as well as the deployed developer ecosystem, where you can source people from," Nik acknowledges. "However, it is essential to ensure that potential candidates go through a rigorous selection process."
Big Data can have ‘infinite value’
There are a lot of one-line proverbs and truisms used to push unnecessary products. One is that big data can have "infinite value". Is this factual, or just another way to justify more expenses on the books?
“The simple answer is that it is up to an organisation to decide how they budget their funds. But it is better to look at it differently. It’s not intrinsically about just money,” Nik explains. “It’s the perspective organisations have of tech. Do they see it as an expense or an investment?”
Yes, companies, in the short term, and tangibly, invest their resources, time and effort, but, more significantly, they are investing their company’s future based on the decisions they make. A case in point is fraud.
"If you look at fraud detection alone, fraud detection has gone offline as well as online; it's omnichannel; it's not just one fraudster dealing with one credit card somewhere."
Fraud detection and anti-money laundering depend enormously on exposing connections and patterns. With the new Neo4j graph data platform, which incorporates both Neo4j Graph Data Science and the core database, detecting fraud is considerably easier, and Neo4j's technology has helped uncover millions of instances of fraud.
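One common pattern behind this kind of detection can be sketched in a few lines. The example below is a hedged illustration, not Neo4j's implementation: accounts that share identifiers (a phone number, a device) are linked, and connected clusters of accounts larger than expected are flagged as potential fraud rings. All account and attribute values are invented.

```python
from collections import defaultdict

# Invented sample data: accounts and the identifiers they registered with.
accounts = {
    "acct1": {"phone": "555-0100", "device": "D1"},
    "acct2": {"phone": "555-0100", "device": "D2"},  # shares a phone with acct1
    "acct3": {"phone": "555-0199", "device": "D2"},  # shares a device with acct2
    "acct4": {"phone": "555-0777", "device": "D9"},  # unconnected
}

# Link accounts that share any identifier value.
by_value = defaultdict(list)
for acct, attrs in accounts.items():
    for value in attrs.values():
        by_value[value].append(acct)

adjacency = defaultdict(set)
for group in by_value.values():
    for a in group:
        for b in group:
            if a != b:
                adjacency[a].add(b)

def component(start):
    """Depth-first traversal collecting every account reachable from start."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(adjacency[node] - seen)
    return seen

ring = component("acct1")
print(sorted(ring))  # -> ['acct1', 'acct2', 'acct3'], a suspicious cluster
```

In a graph database, the same traversal is expressed as a query over relationships rather than computed in application code, which is what makes these ring-detection patterns fast at scale.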
So going in for a graph database platform now does mean an expense in terms of investing in the technology, training people and setting systems; but it has massive RoI down the line, in addition to protecting a company’s most valuable assets – market reputation and customer trust.
There is a whole new thrust of marketing to customers personally – it is no longer just a store or an e-commerce website. With people on social media, rating platforms and a host of mobile apps, getting a complete customer 360 view is much more difficult and complex than before. Organisations are increasingly relying on numerous consumer touchpoints to gain a more comprehensive picture of each customer.
To add to the complexity, an organisation can have a million customers or more, with data points spread across billions of records from transactions, events and sensors.
One of their customers, AirAsia – one of the largest and most well-known airlines in Southeast Asia – saw a 300% uplift in its test group after employing Neo4j's graph data science. This was because AirAsia was able to gain a significantly deeper grasp of the customer from a single customer view.
To do this, Neo4j did not discard any of the company's existing technology; instead, it layered its platform on top, connecting all the company's assets, such as data lakes, data warehouses and data science notebooks, to the power of the graph data platform. As a result, there was a massive performance improvement.
Proof of the pudding is in the eating
While claims are easy to make, the test of the effectiveness of a technology is the success it has in real-life applications. AirAsia apart, the company has a wide spectrum of financial organisations that deploy their solutions.
Neo4j counts a whole host of banks as satisfied customers, including Standard Chartered, prominent banks in Singapore and one of Australia's largest banks. Most recently, Neo4j partnered with DBS for its hackathon – DBS's flagship event.
At the same time, Neo4j has a large number of on-premises start-ups, as well as cloud and digital-native accounts, all of which are using the Neo4j cloud experience in APAC.
These organisations represent the best in their sectors, and they are at the top because they invest in technologies that help them progress. Findings indicate that graph database technology is used in 50% of all Artificial Intelligence projects. This is because incorporating Graph Database Technology into an organisation's existing AI strategy offers significant improvement at low cost. The investment is minimal, and corporations can increase confidence scores and outcomes for a fraction of the expenditure on other solutions.
Embarking on a new transformational journey
Graph analytics applications use algorithms to traverse and analyse graphs to uncover and potentially identify intriguing patterns that represent business prospects.
By analysing data, business operators can better understand what they are doing efficiently and inefficiently within their businesses. Professionals with an analytics background are capable of answering critical questions once a problem has been recognised.
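A small example of the kind of traversal-based pattern these algorithms surface: scoring candidate links by overlap between neighbourhoods, a simple basis for recommendations or link prediction. The customers, products and baskets below are invented purely for illustration.

```python
# Invented bipartite graph: customers and the products they have bought.
graph = {
    "CustomerA": {"ProductX", "ProductY"},
    "CustomerB": {"ProductX", "ProductY", "ProductZ"},
    "CustomerC": {"ProductZ"},
}

def recommend(customer):
    """Score products bought by customers with overlapping baskets."""
    scores = {}
    for other, basket in graph.items():
        if other == customer:
            continue
        overlap = graph[customer] & basket        # purchases in common
        for product in basket - graph[customer]:  # products not yet bought
            scores[product] = scores.get(product, 0) + len(overlap)
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("CustomerA"))  # -> ['ProductZ'], via overlap with CustomerB
```

Real graph analytics libraries apply far richer algorithms (centrality, community detection, similarity), but the underlying move is the same: follow relationships to find patterns that individual records cannot reveal.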
While Mohit concedes that businesses are at the top because they invest in technologies that help them progress, the pertinent question is: what made them decide to use this technology, and how did they get started?
Answering with another truism – the early bird catches the worm – Nik feels that, in all likelihood, the leading companies had a combination of higher risk appetite, vision and gut instinct. With trailblazers leading the way, the question now is not so much how companies get started but when.
With the gains seen in the companies that already deploy graph database technology, others are eager to climb on board. But they seem to be unsure about timing and the most opportune stage to do so.
"We are seeing a lot of other companies that are inspired by these pioneering companies' successes and are putting a lot of faith and stock in our technology," Nik acknowledges. "Leaders in any organisation have to understand that technology is an investment – one that everyone must embark on. The time is always right to invest in such technology!"
For more information on Neo4j, visit https://neo4j.com/
Bank Indonesia and the central bank of the United Arab Emirates have officially signed a memorandum of understanding (MoU) to strengthen payment system cooperation. The agreement aims to make transactions safer and more efficient and to strengthen cooperation on cross-border payment systems, particularly retail payments, as well as anti-money laundering and counter-terrorism financing frameworks.
The UAE’s central bank said in a statement that the agreement also seeks to improve bilateral collaboration in the fields of payment systems and digital financial innovation, including conventional and Islamic finance.
The Memorandum of Understanding covers three primary areas: digital innovation in financial services and payments, cross-border payment systems, particularly retail payment systems, and the AML-CFT framework. The Memorandum of Understanding also serves as a framework for collaboration in both traditional and Islamic financial systems. In addition, the Memorandum of Understanding will be implemented through a variety of initiatives, including policy discourse, information sharing, technical cooperation, a fintech introduction programme, a working-level committee, and any other collaboration that BI and CBUAE deem relevant.
The President of the Republic of Indonesia, Joko Widodo, signed the Memorandum of Understanding with the Vice President and Prime Minister of the United Arab Emirates and Ruler of Dubai, Sheikh Mohammed bin Rashid Al Maktoum.
The Memorandum of Understanding with CBUAE, according to Bank Indonesia Governor Perry Warjiyo, is an endeavour by Bank Indonesia to deepen cooperation between Bank Indonesia and strategic partners in numerous vital areas. This Memorandum of Understanding is also a demonstration of Bank Indonesia’s commitment to combating money laundering and terrorism funding, as well as helping the Government of Indonesia’s attempts to become a member of the Financial Action Task Force (FATF).
The signing of the Memorandum of Understanding with Bank Indonesia, according to CBUAE Governor Khaled Mohamed Balama, illustrates CBUAE’s objective to establish an efficient payment system within a robust financial infrastructure. Working with partner central banks to satisfy international standards to boost market trust is one example. This is also a concrete step toward CBUAE reaching a common understanding in terms of developing solutions and bolstering collaborative efforts to combat illegal money activity.
OpenGov Asia reported that Indonesia's central bank, Bank Indonesia (BI), has continued its innovative initiatives by launching the long-awaited BI-FAST, a real-time retail payment system infrastructure, to meet public demand for fast, mobile, secure and low-cost transactions. BI-FAST was established to serve industrial consolidation and end-to-end integrated digital economy and finance (EKD), a national step toward realising the Indonesia Payment System Blueprint (BSPI) 2025 and meeting public demand for a fast, easy, affordable, safe and reliable payment system, according to BI Governor Perry Warjiyo.
BI-FAST is a national retail payment system that allows people to make payments using a variety of instruments and channels in real-time, 24 hours a day, seven days a week. Its highlights include real-time transaction settlement at the bank and customer levels, 24/7 availability, real-time validation and notification, use of a proxy address as an alternative to the recipient's account number, and reliable security features such as fraud detection and an anti-money laundering/countering the financing of terrorism (AML/CFT) system. Individual credit transfer services are expected to be available by the end of 2021.
The new BI-FAST payment system was scheduled to be operational in the second week of December, with the initial phase focusing on individual credit transfers. Following that, BI-FAST services would be expanded in stages, including bulk credit, direct debit and payment requests. By providing an alternative to the existing national payment system infrastructure, BI-FAST is projected to boost the national retail payment system.
In response to the recent global events that are causing consumer shifts, many organisations are accelerating their digital transformation efforts. Digital transformation has gained importance and is perceived as a strategy for both survival and growth in the new normal. This has increased the need to use innovative technologies to create new business models, products, or services.
As decades-old IT systems responsible for running traditional workloads look to modernise, there is still a need – as there always has been – for reliable, scalable and secure infrastructure. One technology that is both synonymous with and non-negotiable for such efforts is the cloud. Cloud services are now imperative and make a real difference in ensuring that important enterprise services can keep running in almost all scenarios.
However, one common problem that financial services, government agencies and businesses face when moving to cloud services is to ensure that ongoing services run well even as the organisation migrates to newer solutions. Adapting to current technological trends while eliminating the risks of breaking existing systems and interrupting current business operations is vital.
Any organisation would baulk at the prospect of migrating to a cloud environment in one massive move. Incremental modernisation allows them to continue running their mission-critical applications on their current infrastructure while adapting and building new cloud-native applications in parallel.
Organisations across both the private and public sectors have begun to alter their perception of migrating workloads and applications to the cloud. Beyond a doubt, making the shift from a legacy to a managed cloud infrastructure is daunting on many levels. Discarding proprietary technology accumulated over the years can hold organisations back from making the move. Concerns over data latency and volumes linger, especially when it comes to streaming data using the public cloud. Having the right people, processes and systems is a serious consideration. Combined with the cost of technology, infrastructure and reorganisation, these can give good reason for pause.
The need of the hour is for these organisations to see a reduction in infrastructure cost, the ability to scale up and support a several-fold increase in traffic, reduced time to deploy and a simplified production rollout and recovery process. Enterprises need a rapid roll-out of digital capabilities, improving overall time-to-market and reducing the total cost of ownership. A great solution that can ease the transformation is to use container-based technology to develop, build, package and deploy applications and business solutions in a more efficient, secure and scalable way. Cloud-native solutions contribute towards a reduction in long-term operations costs, better system resilience, more efficient processes and enhanced security.
This begs the question: Do organisations have the capability to support cloud-native solutions to enable holistic improvement of infrastructure, to enhance the efficiency, scalability and security of their operations?
The OpenGovLive! Virtual Breakfast Insight held on 24 November 2021 aimed to help delegates understand ways to overcome the barriers to successful cloud migration and modernise infrastructure and application delivery to better serve the citizens and customers.
Embracing the inevitability of a hybrid cloud reality
Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, kicked off the session with his opening address.
The pandemic has vaulted governments and businesses headfirst into the next stage of digital transformation and online services. In the region, the Singapore government has taken the lead, with investment in public cloud estimated to reach US$3.6 billion by 2023 and 70% of eligible government systems slated to be on commercial cloud by 2023.
There is a need to rethink cloud strategy, Mohit asserts. There are too many legacy systems and organisations cannot afford to hide behind those systems anymore.
Currently, citizen happiness is the most important benchmark for governments and enterprises because it is about the uptake of the technology, Mohit contends. Acknowledging that the technology is here to stay, there is a need to look into upskilling the workforce. When planning, he says, “Technology has to be seen as an investment and not an expense.”
What cloud offers is the flexibility to rapidly respond to the changes demanded by digitally-savvy citizens. Agencies now have the capability to not only move workloads between on-premises data centres and public cloud but also make a change and upload data instantly.
Agencies that embraced cloud services proved more responsive and were able to continue operating remotely and serving their citizens, demonstrating agility, scalability and speed even amid a pandemic.
Against such a backdrop, organisations must boldly accept the new digital reality. They must harness technology to enhance the working experience and drive organisational goals in the new normal. And there are a lot of solutions available now, Mohit acknowledges. Global companies have been looking into the design of high-performance computing solutions that will tackle some of the world's toughest challenges.
At the same time, navigating this shifting terrain must be done securely – there is a need to bake security into the process and tools. This means that security is readily built into the infrastructure across workloads and applications. Compliance and regulation are also an intrinsic part of creating a safe environment. Policy and guidelines are established to provide accountability and build trust with citizens and consumers alike.
But neither security nor compliance concerns are a reason to not transform. “Do not hide behind safety or governance,” Mohit cautions. “These issues should not deter people from embracing technology – they need to be confronted, not avoided.”
Ultimately, cloud is here to stay, Mohit concludes. Organisations can get a head start now or play the more tenuous game of catch-up later.
Firmly convinced that the transformation need not be done alone, he urges delegates to partner with organisations with the expertise to facilitate digital transformation. The process needs to be done at scale and with speed. The right partners bring a wealth of expertise and experience that will make the journey far easier to manage and navigate.
Exploring international use cases of hybrid cloud platforms
An Nguyen, Director, Cloud Solutions, Red Hat spoke next on trends in hybrid cloud adoptions.
An observes that there is a general shift in enterprises moving to public cloud and even data centres have begun moving to the public cloud. Nonetheless, he also notes that the use of on-premise private cloud infrastructure is still significant, pointing to the inevitable shift to a hybrid cloud model.
An emphasises the need to be adaptive. For An, the added benefit of using hybrid cloud is the ability to offer better customer service through quicker feedback from consumers. Cloud providers can monitor and keep everything updated for organisations. Apart from that, public cloud infrastructure providers can also give the most complete view of what an organisation is using.
To become agile, cloud is an essential component. While the implementation and focus of hybrid cloud may not be easy initially, An contends that there are tremendous benefits to be reaped. This includes improved security, application or data portability, automation and orchestration, ease of management or operations, ease of implementation or deployment and architectural consistency.
Leading companies have demonstrated the possible use cases and the solutions that Red Hat offers depending on the organisation’s needs. The company has done work across a wide range of sectors, and banking has been particularly active.
An shared that approximately 2,000 customers around the world use OpenShift for mission-critical systems. Red Hat possesses a full-stack container platform and the operating systems to support applications that deliver the best business impact.
For Deutsche Bank, the impetus to implement multi-cloud technology stems from the desire to standardise and unify the platform for its myriad applications. The journey began with the standardisation of the operating system, followed by optimisation through the OpenShift container platform. Since OpenShift serves all kinds of container workloads, it helped streamline processes at Deutsche Bank and enable automation.
Red Hat also provided support and expertise to Amadeus on how to move towards cloud-native workloads and navigate the complexities of its unique situation. In another instance, BMW needed help expanding into new markets without investing in building data centres. To do that, it needed standardisation to move workloads from one country to another seamlessly. By adopting OpenShift Dedicated, BMW could connect devices across different public cloud providers.
An emphasises that Red Hat's OpenShift is the industry's leading enterprise Kubernetes application development platform, helping customers deliver new customer experiences, open new lines of business and modernise their existing application portfolios. He encouraged delegates to reach out to him with any queries on the hybrid cloud model.
Peering into a digital future: Taking pre-emptive steps to stay ahead of the game
John Baddiley, Head of Strategic Relationships, Bank of New Zealand, shared how BNZ approached the move to hybrid cloud and elaborated on the journey thus far.
BNZ has been around for 160 years and employs over 5000 staff all over the country. They have a strong focus on customer outcomes and experience, which has been reflected in the awards that they have been recognised for over the past few years.
John says that a long history has many benefits but can be a potential roadblock when it comes to digital transformation: when something has worked in the past, it can be difficult to let go of. However, BNZ has taken the step to change, modernise and transform its operations.
Cloud adoption is a key part of their transformation story. Rapid change, innovation and adoption mean that BNZ customers expect more every day. “Approaches to technology that worked five years ago will not work today,” he believes.
Monolithic solutions do not provide the flexibility required to adapt or provide the resilience of an ‘always on’ world. He is convinced that systems need to be able to change rapidly and securely to meet the expectations of customers, regulators, and shareholders. At these crossroads, technology can be a strategic advantage or a strategic inhibitor.
The desire to adopt cloud was a strong driver for BNZ when they first embarked on the journey. He shared that the various intents and goals BNZ had in mind led them to take different strategies in the hybrid cloud transformation, depending on the application and its needs.
He shared that BNZ pursued three broad strategies, sometimes having to bite the bullet because previous best practices were no longer ideal. One approach was to lift and shift the applications closest to the mainframe from on-prem to cloud environments. Another was the ‘Outside In’ strategy, where customer- and banker-facing applications were brought into a zero-trust architecture. The third was to build cloud-native applications, using their engineering foundations to deliver and consume cloud services.
However, John cautions that several dependencies must be addressed and operationalised before shifting or building applications on the cloud. These include:
- Engineering Platforms including integration, deployment pipelines, monitoring and management.
- Cloud skills and experience will be required. Choose a model (uplift, capability enhance, outsource) that is right for the workload.
- Connectivity to and between the cloud environment(s) from legacy data centres
- Finance and Cloud Accounting capabilities to ensure that business units have visibility of current and forecast spend
- Patterns for cloud-native software architecture and application transition must be developed, shared, and enforced
- Security standards and patterns must be defined and deployable
Speaking from experience, John says that organisations embarking on this journey need to learn to manage risk during transformation. BNZ has built a cloud governance ecosystem that integrates all aspects of governance, risk management and regulatory compliance, producing an ecosystem in which risk is managed collectively rather than by a single party.
Apart from that, John stresses the importance of engaging with regulators to give them confidence that the shift of the workload to the cloud is robust.
To that end, BNZ adopted CAST, a NAB-designed framework that defines minimum controls, standards and techniques for the adoption of cloud services for material workloads. The CAST Framework consists of 7 perspectives (or areas of focus), each with minimum mandatory standards, techniques, and controls for migrating material workloads to public cloud services. BNZ applies CAST to all material workloads (those rated as heightened or extreme in the Application Inherent Risk Assessment) and assesses it for use with all other workloads.
In closing, John highlights that multi-cloud treatment varies significantly by business. He stresses that the most critical processes and applications must be designed to migrate quickly if required.
Being able to stay agile, nimble and relevant is the ultimate key to surviving in a rapidly changing world.
After the informative presentations, delegates participated in interactive discussions facilitated by polling questions. This activity is designed to provide live-audience interaction, promote engagement, hear real-life experiences, and facilitate discussions that impart professional learning and development for participants.
When asked about their organisation’s biggest challenge in digital transformation strategy, delegates were evenly split between culture (48%) and skills (48%).
A delegate opined that culture is the biggest challenge because digital transformation requires a different way of working, understanding finances and budgeting.
Asked which elements of transformation are the most challenging in their organisation, half of the delegates felt that (IT/software) architecture and development (50%) was the most challenging element. About 40% thought leadership was an issue, while 5% felt (IT) operations was of concern.
A delegate said they had difficulty in bringing legacy products onto cloud-native platforms within the healthcare industry.
Mohit agrees that the journey is not an easy one; however, it is an inevitable one. He reminds delegates that this is where experts can help. The technologies used in 2020 and 2021 were “band-aid technologies.” Organisations need to prepare themselves to be ready for the next hit. Digital transformation is not a strategy but something that needs to be deployed. Security and privacy cannot be a stumbling block on that journey, Mohit cautions them.
The next question asked what percentage of workloads delegates see themselves moving to cloud over the next 3 years. Just under half (42%) indicated that more than 50% would be moved to cloud, followed by 30% – 50% of the workload (37%) and 10% – 30% of the workload (21%).
Mohit firmly believes that a cloud-first policy is necessary because it is possible to have both on-prem and cloud services. Critical data can be kept on-prem but others can go into the new environment.
John adds that it is a question of capacity and finance. The selection of applications and data that go on cloud is a matter of how much work is associated with shifting each application and the maintenance to ensure that everything is working. For BNZ, John estimated that 80% of the workload will run on cloud eventually and all new applications are cloud-native.
When asked how he manages compliance and regulators, John explained that CAST helped to accelerate shifting workloads to the cloud by demonstrating the measures that are in place.
Another delegate wanted to know how critical it is to have a mature strategy or process like CAST before migrating critical applications to cloud. John says it depends on the organisation’s risk appetite and how much regulators care about how organisations run workloads. He feels that every organisation needs some form of risk control framework but that it does not necessarily need to be as comprehensive as CAST.
It is also not about selecting one cloud for everything, John opines. When choosing where to deploy individual applications, organisations need to understand the capacity of their teams as well as the suitability of the features of each cloud.
John adds that it is important to examine the dependencies and then shift those without dependencies first. By shifting things to cloud, infrastructure gets taken care of, affording people more time to deliver value and focus on things that matter to the business.
On the most important outcome they are seeking in their digital transformation, delegates were equally split between the reliability of newly deployed changes (26%) and an innovative platform and culture to support new ideas (26%). Similarly, better security and governance models got 16%, as did operational efficiency (16%). The remaining delegates voted for reducing the cost of operations (11%) and the speed of developing and deploying changes (5%).
Polled about their top consideration in adopting / choosing multi-cloud, most delegates selected inter-communication and workload portability among the clouds (28%). This was followed by an even split between tools and services available on the new cloud (17%) and data sovereignty and residency (17%). The remaining delegates were equally divided in a three-way split between support within multi-clouds (11%), cost optimisation (11%) and complexity of migrating existing apps (11%). The remaining delegates (5%) chose the availability of skill set to navigate the new cloud as the top consideration.
The final question asked what is the biggest benefit that Edge Computing brings to their organisation as part of their digital transformation strategy. About a third (35%) indicated that Fast-to-Adopt IoT Solutions was the biggest benefit, followed by a quarter (25%) who opted for fast, affordable networks at the edge. The remaining votes went to hardware-based security leadership (15%) and AI and computer vision expertise (5%).
Asked to share more on how BNZ baked security and compliance enforcement into the cloud deployments, John explained that BNZ has CSAMs for every cloud service, which defines how the service must be configured for each use case. On top of that, they use an attestation process with CAST to make sure that they have checks to ensure that implementation teams are following the architecture and policies correctly.
BNZ is working towards embedding as many of the CAST checks as possible into pipelines, although this is at a very early stage. He added that they are also building up patterns to enable Zero Trust Architectures, which will help bake the infrastructure aspects of security into their solutions.
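Embedding controls into pipelines in this way is commonly implemented as a policy-as-code gate: before a release proceeds, the deployment manifest is validated against a set of mandatory checks. The sketch below is a minimal, hypothetical illustration of that pattern (the control names and rules are invented for the example, not taken from BNZ's CAST or CSAM tooling):

```python
# Minimal policy-as-code gate: validate a deployment manifest against
# mandatory controls before allowing the pipeline to proceed.
# The control names and rules below are hypothetical, for illustration only.

MANDATORY_CONTROLS = {
    "encryption_at_rest": lambda m: m.get("encryption_at_rest") is True,
    "approved_region": lambda m: m.get("region") in {"ap-southeast-1", "ap-southeast-2"},
    "logging_enabled": lambda m: m.get("logging") == "centralised",
}

def evaluate(manifest: dict) -> list:
    """Return the list of failed control names (empty means compliant)."""
    return [name for name, check in MANDATORY_CONTROLS.items() if not check(manifest)]

# Example manifest: compliant except for a non-approved region.
manifest = {"encryption_at_rest": True, "region": "us-east-1", "logging": "centralised"}
failures = evaluate(manifest)
if failures:
    print(f"Blocked: failed controls -> {failures}")
```

In a real pipeline, a non-empty failure list would fail the build step, producing the attestation trail that implementation teams are following architecture and policy.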
Apart from that, John revealed that they run a secure code warrior programme, which teaches code security practices to all of their developers. He emphasised that it is important to remember that security is everyone’s responsibility, not just the security team.
In closing, Guan Hao, Industry Technical Specialist, Intel, thanked all the delegates for their participation and insights on the topic.
He reiterated that applications and data are growing and organisations will need an infrastructure that can handle the load. With the changing reality that the world is in currently, he stressed the importance of employing the right technologies to help to lubricate the process of digital transformation. To that end, cloud is the cornerstone of digital transformation.
On top of that, organisations will need flexible infrastructure to handle the demands of storage, network and multiple cloud platforms.
Finally, An recapped the many use cases for hybrid cloud that delegates need to understand to be able to identify their unique journey. He urged the delegates to consider the intercloud connection and make sure that architecture is cloud-native.
He invited delegates to reach out to him and the team if they had queries or wanted to understand the unique value that hybrid cloud can bring to their organisations.
As organisations seek to redefine how they create, deliver and capture value, many are looking to digital technologies, which in turn are driving transformative changes across industries. However, the digital transformation process calls for far more than just updating technology or redesigning products.
Failure to connect the effort with employee values and behaviours can bring extra risks to an organisation’s culture. A holistic and collaborative effort can help teams shift their perspective to one where they feel empowered to embrace, and drive digital transformation.
OpenGov Asia had the opportunity to speak exclusively to Donna Benjamin, Engagement Lead, Red Hat Open Innovation Labs in Australia and New Zealand about her perspective on how digital transformation impacts culture and vice versa.
Donna’s professional career has been deeply shaped by her involvement in the global open source community. In her current role, Donna works with Red Hat customers to facilitate project success by helping them embrace an open approach to transformation.
As an engagement lead with Red Hat’s Open Innovation Labs, she is focused on supporting customers through the sustainable, resourceful and effective application of open source technology. Donna firmly believes culture plays a key role in putting strategy into practice when implementing technology solutions.
Red Hat’s Open Innovation Labs help bring people, processes and technology together. Through an immersive teaming residency, customers are armed with the skills, tools and processes to deliver better software, more quickly, to meet the demands of today’s market. Open Innovation Labs combines technical practices with cultural practices – to help teams be more collaborative and impactful.
Moreover, as part of an Open Innovation Labs consulting engagement, customers work collaboratively in direct partnership with Red Hat experts to jumpstart their innovation and software development initiatives. The aim is to help organisations meet their goals and stay ahead of competitors through the use of open source technologies.
Red Hat has a long and established reputation for being a leader in open source technology. To fully benefit from cloud technology, Gunasekharan Chellappan, Country Manager, Singapore, believes that organisations should adopt a Hybrid Multi-Cloud strategy. With a good Hybrid Multi-Cloud strategy, tools from across the various clouds can be made to work together seamlessly.
When defining success, organisations need to ensure their intention and purpose are built on their core values and vision. While this direction often comes from the top, organisations that holistically embrace open principles of inclusivity, adaptability, transparency, collaboration and community are able to accelerate their progress towards digital transformation.
For Donna, the top-down and bottom-up approaches complement each other. Leadership sets the direction for organisations, however, employees must be empowered to contribute to planning and goal setting.
In measuring success, the common criterion is revenue – but retention is also crucial. She feels there is tremendous value to be had from employee satisfaction and the benefits of caring about each other within organisations. Organisations need to create an environment where staff want to stay because they know they are valued. Culture has to focus on respect and treating employees with dignity. This means putting people at the centre.
Apart from trying to increase revenue, retention, customer or employee satisfaction, organisations are also trying to reduce cost wastage and unproductive effort. Success can also be measured by reducing the time it takes to deliver products to market.
Staff at all levels have a significant impact on these aspects. Therefore, organisations must have an intentional and meaningful conversation about value, goals and the way to measure success right from the beginning.
As government agencies and corporations have distinct drivers, the measure of success is going to be different even though they may face similar challenges of scale and technology. In the government space, the measure of success is driven by citizens. Agencies need to be inclusive to ensure wider access to their digital services and more participation of citizens.
To create a culture that is conducive to organisations’ strategy, empowerment is critical. This is relatively more challenging in the Asia-Pacific (APAC) region given how diverse the region is. This diversity brings different layers of culture across the spectrum at a national, organisational and individual level.
Leaders need to be aware of the varying contexts and how it fits into what they are trying to achieve. To create an enabling environment, it is vital to have clarity of vision as well as to get the team onboard. From the cultural point of view, organisations need to create an atmosphere where people feel empowered to embrace the fast-paced change of technology and want to learn new tools and upskill themselves.
The entire situation has been upended by the pandemic. In the new normal, remote or hybrid working has become the default, at least for the foreseeable future. When it comes to these new models, Donna acknowledged that there are trade-offs.
One of the downsides is the loss of real human interactions. Donna feels that organisations must intentionally create space and guidelines for the human aspect in online conversations. For example, Red Hat’s Labs invite teams to create a working agreement to define common ground rules, such as having a cameras-on policy and core working hours for distributed teams. They may also encourage people to use hand gestures during AV interaction, adopting a social contract that recognises the importance of non-verbal communication.
Besides acknowledging the downsides, Donna pointed out that the pandemic has been responsible, to a great extent, for making distributed working the norm. This, in turn, has meant that the metric for employee productivity has changed – organisations no longer assess employees’ productivity by merely the time they spend at work but by the real outcomes they deliver.
Without a doubt, leadership has a huge influence on culture. While she acknowledges that there is no universally accepted definition of leadership, it is a position of influence. Leadership, she says, is less about a position or role and more about modelling behaviours, empowering people, and fostering an enabling environment.
Leadership styles are shifting from managing change to facilitating change. The role is more about helping people become part of the decision-making process, to be adaptable, and to constantly evolve in response to ongoing change.
Information and knowledge is power. When information was less democratised and centralised with top management, people had to accept (at face value) what leaders said. However, as information becomes more widely accessible, the entire workforce can become part of the decision making process. Leaders are not expected to know everything, but are there to source, locate, validate and disseminate information that is beneficial to the organisations’ goals.
When organisations truly empower people – from a top-down and a bottom-up approach – everyone can be a leader and help contribute to the journey. In this way, leadership is not just a single charismatic leader, but a sense of collective responsibility to achieve common goals.
Another impactful practice that Open Innovation Labs adopt is “pairing”. Working together to finish a piece of work in parallel might take longer in the short term, but in the long run, it can be more effective. When organisations support people to work together more effectively, they see how significantly collaboration impacts outcomes.
Donna revealed how Red Hat’s Open Innovation Labs has brought two revolutionary forces together – the technology of open source software and agile development methodologies. This is a paradigm-shifting idea.
Open source has traditionally been an asynchronous activity where people work independently online without real-time, in-person communications. On the other hand, agile methodologies have traditionally been co-located and time-boxed. Merging the two aspects builds on both their strengths, creating far more impactful outcomes.
Looking ahead, Donna believes we can all design our future. However, many people do not feel they have the freedom or capacity to do so. This is where the role of leaders is of the utmost importance. Good leadership will openly and freely empower people to achieve their potential. She firmly believes leaders should help staff define their goals and work together on the necessary steps to achieve them.
Donna’s perspective is in line with the concept of “Open Leadership” that Red Hat advocates. It involves connecting to others, extending trust, being transparent, being collaborative, and promoting diversity and inclusion. Open leaders invite cooperation and productive dialogue to create better solutions as well as empower others to share ideas and value solutions from a broad base of contributors.
In closing, Guna elaborated on the issue of leadership more deeply. He agrees with Donna that great results come from leadership and foundational values that permeate the organisational culture.
Guna is a firm believer in the concept of Level 5 Leadership that was popularised in a book, “Good to Great”. Level 5 refers to the highest level in a hierarchy of executive capabilities. Leaders at the other four levels in the hierarchy can produce high degrees of success but not enough to elevate companies from mediocrity to sustained excellence.
Guna compared the main difference between authoritarian and Open Leadership. Companies with an authoritarian style tend to have one distinctive leader that is almost synonymous with the company itself. Meanwhile, companies that adopt Open Leadership do not usually have a single leader that everyone can point to – the team effort is front-stage. Open Leadership allows like-minded people who have a common set of values to discuss ideas openly to create innovative solutions.
When it comes to organisational culture, there must be a climate for growth and collaboration that exists at all levels. Guna says he cannot overstate how critical leadership style is to the success of an organisation. While most governments tend to have a strong, firm structure, Guna believes that change can happen by having the right leadership and organisational culture.
As part of the Singapore government’s objective to harness the capabilities of commercial cloud computing platforms for governmental systems, many public sector agencies are migrating their IT systems to the Government Commercial Cloud (GCC). Government agencies can use commercial clouds to incorporate advanced functionality into their digital services thanks to the GCC, which eliminates the need for them to set up their own data centres.
Agencies require a reliable and secure data management platform that allows for quick migration to the GCC, high data quality and managed data access for users. As a result, choosing the correct data strategy and the long-term platform is even more crucial in their migration to the GCC. In light of this, Singapore’s Government Technology Agency (GovTech) is upgrading the Government Commercial Cloud (GCC) service to make it easier for government agencies to manage and safeguard their use of public cloud services.
According to Kevin Ng, Director of Core Operations Development Environment and Exchange at GovTech, the upgraded service, dubbed GCC 2.0, boasts enhancements in user onboarding and security, among other areas. Speaking at the virtual conference, Ng said the enhancements are being made in response to feedback from GCC users and the learnings that GovTech has gleaned from managing the service. The change in mindset about cloud as code and software, rather than a distinct form of on-premise hardware infrastructure, is the foundation for GCC 2.0’s advancements.
“Today we still think of a cloud as a piece of hardware. We still like to review our architecture diagrams, but it’s also useful to put this architecture into code and deploy it,” said Goh. “And if it is incorrect, let’s tear it down and redeploy again. We no longer need to be constrained by the art of planning in a waterfall manner.”
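The "cloud as code" mindset described here is essentially declarative infrastructure: the desired architecture lives in version-controlled code, and tooling computes what to create, tear down, or redeploy so that the running environment matches it. The toy sketch below (not GovTech's actual tooling; the resource names are invented) illustrates that reconcile step:

```python
# Toy declarative reconciler: compare the desired infrastructure (expressed
# as code) with the currently deployed state, and compute the actions needed.
# Resource names are hypothetical, for illustration only.

def reconcile(desired: set, deployed: set) -> dict:
    """Return the create/destroy actions that make 'deployed' match 'desired'."""
    return {
        "create": sorted(desired - deployed),   # in code but not yet running
        "destroy": sorted(deployed - desired),  # running but removed from code
    }

desired = {"vpc", "subnet-a", "app-service"}   # architecture as code
deployed = {"vpc", "subnet-a", "legacy-vm"}    # what is currently running
print(reconcile(desired, deployed))
```

If the deployed architecture turns out to be incorrect, the code is changed and the same reconcile step tears down and redeploys, rather than replanning everything in a waterfall manner.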
According to GovTech, the government has around 600 systems on the cloud and is on track to have 70% of eligible systems on the cloud by the end of the year. This opens up opportunities for individuals and businesses, as well as the addition of new public services.
In addition, GovTech is making things easier with TechPass, a single sign-on service that gives users access to cloud management portals, public cloud services, and engineering tools in the Singapore Government Technology Stack (SGTS), a compendium of shared software and infrastructure services for quickly developing and testing new applications.
TechPass is part of the Seed security suite, which combines the concept of zero trust with other elements of cloud-based access controls to create a secure endpoint device platform. As a result, only secure and authorised devices will be able to develop and manage government cloud apps.
OpenGov Asia reported that the official opening of a Data Science and Artificial Intelligence (DS&AI) Lab was recently announced with the support of the Singapore Economic Development Board (EDB). Singapore’s IT manufacturer and Nanyang Technological University (NTU) collaborated to enhance local DS&AI education, empowering students with the tech tools and skills needed to inspire a brighter future.
The Lab will put together the IT firm’s cutting-edge deep-learning technology with NTU’s global strengths in artificial intelligence and data science, allowing local data scientists and AI experts to pioneer the development of meaningful AI solutions in important industries. According to NTU, the Lab was still in the planning stages in 2018, and roughly 150 NTU students enrolled in the Bachelor of Science in Data Science and Artificial Intelligence programme have benefited from the Lab’s resources since then.
These undergraduates attended lessons at the Lab and used the IT company’s sponsored servers to access computing capabilities for their projects and other activities. This effort also aims to help Singapore’s AI programme and its transition to Industry 4.0.
The U.S. Air Force Research Laboratory has chosen to deploy a collection of cloud computing, productivity and collaboration tools, software and products developed and marketed by a giant cloud platform as a pilot program among a segment of its workforce of scientists and engineers. The initial deployment has dramatically enhanced engagement with its worldwide network of external research partners.
AFRL is a global research enterprise supporting two services, the U.S. Air Force and the U.S. Space Force. From laser-guided optics enabling telescopes to see deeper into the universe than ever before, to fundamental science that has spawned innovations in quantum computing and artificial intelligence, AFRL rapidly scales discovery to deliver leading-edge technologies for the military. Core to the success of AFRL’s mission is engaging with world-leading scientists, small businesses, large industries, and other government agencies to build communities that drive innovation.
AFRL teams are using integrated cloud-based tools to simultaneously share, discuss, and chat about critical information—eliminating the toil of email chains and hours-long data file exchanges. Through the video conferencing service, AFRL research teams are hosting flexible, virtual meetings to exchange ideas anywhere, anytime.
The recent collaboration, combined with the company’s Zero Trust security philosophy, provides AFRL with additional safeguards while keeping security measures invisible to end-users. AFRL scientists using cloud technology can collaborate and innovate safely and securely under the standards defined by the U.S. Defence Information Systems Agency (DISA).
COVID-19 significantly limited the physical presence of researchers in the lab. The cloud-based tools eliminated what would have otherwise been almost a total work stoppage. In fact, new insights into 2D nanomaterials, critical to future Department of the Air Force capabilities, were discovered using Workspace that would have otherwise been impossible.
– Dr Joshua Kennedy, research physicist, Materials and Manufacturing Directorate at AFRL
A recent survey of 240 researchers involved in the preliminary deployment revealed an average time savings of three hours per week. For AFRL’s highly trained workforce of PhDs, this means more time to dedicate to the mission. The U.S. Air Force places a strong emphasis on modernisation and innovation, and this is apparent in the groundbreaking work of AFRL researchers. Members of AFRL rely on cloud-based tools not only to securely and successfully achieve their mission but also to power new discoveries.
In fact, early in the fiscal year 2021, Air Force Research Laboratory commander, Maj. Gen. Heather Pringle directed AFRL to prioritise ongoing efforts of digitally transforming AFRL and issued a charter establishing the AFRL Digital Transformation Team. The team’s mission centres on the creation of “One AFRL,” a flexible enterprise that capitalises on the seamless integration of data and information through the use of modern methods, digital processes and tools and IT infrastructure.
As reported by OpenGov Asia, The U.S. Department of Defence (DOD) outlined its goals that would help support service members outside of the U.S. by way of cloud computing. The agency establishes the vision and goals for enabling a dominant all-domain advantage through cloud innovation at the tactical edge. It identifies areas requiring modernisation to realise the potential of cloud computing, specifically: security, redundancy, reliability and availability.
The strategy is broken down into three parts: resilient connectivity, providing the right computing power, and training members to utilise the technology. All the goals can be achieved, but some will take much longer than others, and accomplishing all three will require more than just the efforts of the Defence Department. The approach needs to be holistic, involving the whole of government (members of Congress, federal partners and internal DOD stakeholders) as well as the cloud service providers, to develop a cohesive strategy that works for the department to deliver these much-needed services where they are needed.
The Enduring Security Framework (ESF) hosted a 5G study group comprised of government and industry experts to explore potential threat vectors and vulnerabilities inherent to 5G infrastructures. The experts then recommended identifying and assessing threats posed to 5G, determining what standards and implementations can achieve a higher baseline of 5G security; and identifying risks inherent to the cloud that affect 5G security.
In support of this task, the National Security Agency (NSA) and the Cybersecurity and Infrastructure Security Agency (CISA) have published cybersecurity guidance to securely build and configure cloud infrastructures in support of 5G. “Security Guidance for 5G Cloud Infrastructures: Prevent and Detect Lateral Movement” is the first of a four-part series created by the ESF.
This series provides key cybersecurity guidance to configure 5G cloud infrastructure. Our team examined priority risks so that we could provide useful guidance, disseminated in an actionable way to help implementers protect their infrastructure.
– Natalie Pittore, Chief of ESF in NSA’s Cybersecurity Collaboration Centre
The series builds on the ESF Potential Threat Vectors to 5G Infrastructure analysis paper, which focused specifically on threats, vulnerabilities, and mitigations that apply to the deployment of 5G infrastructures. Based on preliminary analysis and threat assessment, the top 5G cloud infrastructure security challenges were identified by ESF and a four-part series of instructional documents covering those challenges will be released over the next few weeks. Topics include securely isolating network resources; protecting data in transit, in use, and at rest; and ensuring the integrity of the network infrastructure.
Part I focuses on detecting malicious cyber actor activity in 5G clouds to prevent the malicious cyberattack of a single cloud resource from compromising the entire network. The guidance provides recommendations for mitigating lateral movement attempts by malicious cyber actors who have successfully exploited a vulnerability to gain the initial access into a 5G cloud system.
This series exemplifies the national security benefits resulting from the joint efforts of ESF experts from CISA, NSA, and industry. Service providers and system integrators that build and configure 5G cloud infrastructures who apply this guidance will do their part to improve cybersecurity for our nation.
– Rob Joyce, NSA Cybersecurity Director
Strong and vibrant partnerships are critical to the overall effort to reduce cyber risk. Along with the public and private partners in the ESF, CISA is proud to partner with NSA to present the Security Guidance series for 5G Infrastructure. Protecting 5G cloud infrastructure is a shared responsibility and we encourage 5G providers, operators and customers to review the new guidance.
5G cloud providers, integrators, and network operators share the responsibility to detect and mitigate lateral movement attempts within their 5G cloud infrastructure. This document provides best practices to secure the 5G cloud from specific cyber threats of lateral movement that could compromise a network.
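As a rough illustration of the kind of detection this shared responsibility implies, the sketch below flags east-west connections between cloud workloads that fall outside an expected-communication allowlist, a common first step in spotting lateral movement. It is a hypothetical toy (workload names and flows are invented; this is not taken from the NSA/CISA document):

```python
# Toy lateral-movement detector: flag east-west flows between workloads
# that are not on the expected-communication allowlist.
# Workload names and flows are hypothetical, for illustration only.

ALLOWED_FLOWS = {
    ("web-frontend", "api-gateway"),
    ("api-gateway", "user-db"),
}

def suspicious_flows(flows: list) -> list:
    """Return observed (source, destination) flows not covered by the allowlist."""
    return [f for f in flows if f not in ALLOWED_FLOWS]

observed = [
    ("web-frontend", "api-gateway"),   # expected
    ("api-gateway", "user-db"),        # expected
    ("web-frontend", "user-db"),       # unexpected: possible lateral movement
]
for src, dst in suspicious_flows(observed):
    print(f"ALERT: unexpected flow {src} -> {dst}")
```

Production systems would derive the allowlist from network policy and analyse real flow logs, but the principle is the same: a compromised resource reaching somewhere it never normally talks to should raise an alert before the whole network is compromised.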
As reported by OpenGov Asia, CISA is asking researchers and entrepreneurs for information on developing a ubiquitous and robust 5G/ Internet-of-Things (IoT) Situational Awareness System (5i SAS). The system must enhance situational awareness of current platforms and identify potentially dangerous 5G components and internet-of-things devices.
Without a way to distinguish normal 5G and IoT conditions from suspicious environments, exploits on personnel or systems could go undetected and cyberattacks would be untraceable. As the introduction of 5G will enable billions of devices connected to the network with direct communication to one another, the development of a 5i SAS capability is essential.
Although the request for the technology has been made on behalf of CISA, other federal, state, local, tribal and territorial governments may need to use it. If enough 5i SAS devices are issued, they could not only detect unhealthy/insecure situations but also triangulate the physical location of suspicious IoT and 5G devices, jamming sources or anomalous network behaviour.