Revenues from the information technology (IT) sector in the first six months of 2022 were estimated at US$ 72.5 billion, up 17.8% year on year. According to data from the Ministry of Information and Communications (MIC), of the total, revenues from hardware and electronics exports were estimated at US$ 57 billion, rising 16.4% year on year. Computer shipments totaled US$ 29.1 billion, up 21.8%. Electronic export earnings reached US$ 27.9 billion, an 11.2% increase.
Further, revenues from products made in Vietnam accounted for nearly 27% of the total or US$ 19.4 billion. Revenues from the information and communications technology (ICT) sector reached approximately US$ 77 billion, rising 17% compared to the same period last year. Profit was estimated at US$ 5.9 billion, up 13%, according to the report released on 18 July. This year, the Ministry set a revenue target of around US$ 140 billion for the ICT sector, a year-on-year increase of 14%. These figures will help the government formulate targets and tasks for the second half (H2) of the year.
Regarding the IT landscape, around 3,400 digital technology enterprises were established in the first half (H1) of this year, according to the Ministry. The figure brings the ministry’s target of 70,000 firms working in this sector in 2022 within reach, following a sharp rise over the years. In 2021 there were 64,000 firms working in the sector, up from 58,000 in 2020, and 45,600 in 2019.
As of the end of June, the number of digital technology companies in Vietnam was estimated at 67,300, an increase of 3,422 companies compared to last December. Vietnam aims to have 80,000 digital technology enterprises operating in the sector by 2025, and 100,000 by 2030. It expects that the technology industry will contribute 6% – 6.5% to the country’s gross domestic product (GDP) by 2025.
OpenGov Asia reported recently that in June, government departments and industry players completed major projects in the post, telecommunications, and IT sectors. Among them were the organisation of a symposium on The Future of the Internet and the Vietnam Security Summit 2022, an international seminar and exhibition.
Also in the month, MIC launched the Make-in-Vietnam digital products award 2022 to encourage, promote, and find outstanding Vietnamese technology products. Directing the sector’s operations in the time to come, the Minister of Information and Communications, Nguyen Manh Hung, underscored the opening of digital museums, the shutting off of 2G networks, and the running of frequency auctions in the last six months of the year. He also requested the completion of several laws, including those regarding electronic transactions, the digital technology industry, and telecommunications, among others.
Meanwhile, in June, several provinces across the country launched digital-centric initiatives and set digital transformation goals for the future. For example, Ha Long, a city in Quang Ninh province, announced its plans to have a 20%-25% average annual growth rate in the number and value of cashless payments by 2025. Officials held a teleconference discussing the implementation of a project for non-cash payment development for 2022-2025 across the city’s 33 communes and wards.
The National Environment Agency (NEA) and the Singapore Land Authority (SLA) have signed a Memorandum of Understanding (MOU) to develop the use of Global Navigation Satellite System (GNSS) data from SLA’s Singapore Satellite Reference Network (SiReNT) to help NEA better monitor island-wide atmospheric moisture. The goal of the five-year partnership is to support Singapore’s weather monitoring by providing more data and enabling exploratory studies for weather forecasting.
“The collaboration between NEA and SLA highlights our commitment to achieve synergies and tap on enablers across the public sector. This partnership provides a platform for NEA to utilise SLA’s expertise in GNSS data collection and processing, enabling NEA to explore non-traditional methods to enhance our weather monitoring and forecasting capabilities,” says Luke Goh, CEO, NEA.
For his part, Colin Low, CEO of SLA, said that SLA’s partnership with NEA is part of its ongoing efforts to collaborate with parties from the public and commercial sectors to open up new applications for SiReNT and its other geospatial products.
SLA believes that combining the knowledge of multiple parties can lead to more innovation and to the discovery of workable solutions that benefit Singapore and its industries.
Colin added that SLA is eager to collaborate with NEA to research unique uses of SiReNT data for improved weather monitoring and for research projects on weather forecasting and climate change. The experience gathered and shared during this partnership will serve as a foundation for future developments in this area.
The production of accurate weather forecasts, climate monitoring, and timely warnings of dangerous weather events all depend on meteorological measurements. The Meteorological Service Singapore (MSS) routinely gathers a variety of observational data from ground-based and aircraft sensors, such as temperature, wind, and moisture.
To measure these weather components at various altitudes of the atmosphere, sensors linked to a weather balloon are routinely launched twice a day at MSS’ Upper Air Observatory (UAO). To enhance the sounding data from the weather balloon, MSS erected a GNSS reference station at UAO in 2019.
This station will provide continuous estimates of moisture in an atmospheric column known as the integrated precipitable water vapour.
In accordance with the MOU, SiReNT will incorporate MSS’s GNSS station, giving MSS access to continuous, almost real-time atmospheric moisture readings for the entire island. By supplying greater resolution and more frequent observation data, this non-conventional moisture data will complement MSS’s current observation network data and enable research into possible uses for weather forecasting.
The partnership will also help grow SLA’s SiReNT network, which currently consists of nine reference stations dispersed throughout Singapore. With the integration of NEA’s GNSS base receiver station at UAO and two anticipated additional coastal reference stations, the network will grow to 12 stations capable of receiving more data. The SiReNT system can correct positional inaccuracies in GNSS signals and produce precise positioning data with an accuracy of up to 3 cm.
The SiReNT technology fosters innovation across a range of sectors, including autonomous driving, logistics and automation in the building industry, and monitoring of changes in Singapore’s land height and sea level.
The addition of stations by the end of 2022 will further increase the stability of the services and applications SiReNT now supports in several important industries. It can also be used in novel ways for scientific research on climate change.
Several domestic banks in Vietnam have 90% of their transactions conducted on digital platforms, surpassing the target of 70% set for 2025. Half of the country’s banking services are expected to be digitalised and 70% of transactions will be carried out online by 2025.
The Vietnamese Prime Minister, Pham Minh Chinh, recently stated that the banking sector has played a significant role in national digital transformation by deploying products and services for people and businesses. He urged the sector to further reform its management methods towards modernity and transparency and diversify and improve the quality of its products and services to curb money laundering.
Addressing an event called “Digital Transformation Day of the Banking Sector”, Chinh explained that the sector should work to better understand the demands of people, businesses, and credit institutions to devise suitable legal documents, facilitating the application of digital technologies in banking services.
He asked the State Bank of Vietnam (SBV) to continue its close coordination with ministries and agencies to formulate a decree on cashless payments and submit it to the government. Common infrastructure such as payment and credit information infrastructure should be promoted. He also suggested stronger connectivity between banks and credit organisations.
Chinh also requested the sector ensure cybersecurity and safety in digital transformation, given the rise of high-tech crime. The sector should raise public awareness about the benefits of digital transformation, enhance personnel training capabilities, and boost international cooperation in digital transformation.
The Prime Minister also attended an exhibition showcasing products and services that promote the digital transformation of the banking sector. Chinh had a working session with representatives from the SBV and commercial banks. He congratulated the sector on its effective operations amid a host of difficulties, especially those caused by the COVID-19 pandemic. He suggested the sector further cut interest rates to support businesses and actively engage in the state’s policies, particularly housing credit for workers and low-income earners. Participants attributed the developments of banks to supportive policies adopted by the state, the management of the government, and stability in the country.
Vietnam’s financial technology market could grow to US$ 18 billion by 2024. The country is a leader among ASEAN members in terms of the volume of financing for fintech, second only to Singapore. Over 93% of all venture investments in the country are directed at e-wallets and the e-money segment. The total number of fintech companies has grown to 97 since 2016, an 84.5% increase. However, the number of newly-launched start-ups each year decreased from 11 to 2.
As OpenGov Asia reported, the market features high competitiveness and a high entry bar. Transaction volume has seen a 152.8% growth since 2016, with 29.5 million new fintech users. As a result, every second Vietnamese citizen uses at least one fintech service. Demand for digital services (transactions, payments, and wallets) in the country is high. According to industry analysts, Vietnam’s fintech sector is young and promising. The market valuation has increased from US$ 0.7 billion to US$ 4.5 billion since 2016.
Michael G. Regino, President and CEO of SSS, announced that self-employed individuals, voluntary members, non-working spouses, and land-based Overseas Filipino Workers can pay their contributions through the online method of their choice. This was done in cooperation with various financial institutions and private-sector partners.
“We encourage our members and employers to pay their contributions using our online channels as through these payment facilities, they no longer must go to our branches. These can be accessed at the safety and convenience of their homes or offices,” says Michael.
Individual members may also use the websites and mobile apps of other SSS-accredited collecting partners, such as most banks in the country’s public and commercial sectors. Both commercial and domestic employers likewise have access to online payment methods.
SSS is a state-run social insurance programme mandated by the Philippine government to provide coverage to all wage earners in the private, public, and unorganised sectors.
The agency is mandated to set up, develop, promote, and perfect a sound, tax-free social security system that fits the needs of everyone in the Philippines. The system should encourage social justice through savings and protect members and their beneficiaries from the risks of disability, illness, maternity, old age, death, and other contingencies that could cause a loss of income or a financial burden.
OpenGov Asia earlier reported that digitalising SSS pension fund services remains one of the top priorities in the Philippines and that more online services will be added to its digital channels.
More than 30 member services and more than 20 employer services are currently easily accessible on the SSS website. Transactions for membership, contributions, loan granting and repayment, and benefit distributions are only a few examples of the services offered. Other SSS internet platforms also extend some of these features.
Further, almost all new online services are made available via the agency’s website, which serves as its main online platform. However, more work is being done to make the services on this portal accessible to smartphone users via the SSS Mobile App.
The agency is gradually making it mandatory for transactions under its programmes to be done online. Members who lack their own means of transacting online can use the e-Centres in its branches.
In the meantime, the Department of Education (DepEd) worked with Young Southeast Asian Leaders Initiative (YSEALI) exchange alumni to improve education about climate change through an online programme called Climate Changemakers.
The National Educators Academy of the Philippines (NEAP) has recognised Climate Changemakers as the first climate change training course as part of the Department’s Professional Development Priorities.
Through online training and other digital education initiatives, the programme aims to better equip teachers to teach climate change concepts, integrate them into their lessons, and act on climate change in the country.
The ten-week online course, which used synchronous and asynchronous modalities to address common misconceptions about climate change, was successfully completed by 400 instructors. It also gave teachers a space to reflect on their own learning and to exchange challenges and effective methods.
The Young Southeast Asian Leaders Initiative Professional Fellows Program (YSEALI PFP) is a two-way exchange programme run by the U.S. Department of State. Its goal is to help young leaders from different countries in Asia and the United States to get to know each other better and strengthen economic relationships.
Data is information that has been organised in a way that makes it simple to move or process. It is a piece of information that has been converted into binary digital form for computers and modern methods of information transmission.
Connected data, on the other hand, is a method of displaying, using, and preserving relationships between data elements. Graph technology aids in uncovering links in data that conventional approaches are unable to uncover or analyse.
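As an illustrative sketch (not from the article, with hypothetical names and data), the difference can be shown in a few lines of Python: flat records leave relationships implicit, while a graph stores them as first-class edges that can be followed directly.

```python
# Flat records: the relationship between customers is implicit and must be
# reconstructed by scanning or joining at query time.
purchases = [
    {"customer": "An", "product": "router"},
    {"customer": "Binh", "product": "router"},
]

# Connected data: entities are nodes and relationships are explicit edges,
# here modelled as an adjacency map keyed by (node, relationship type).
graph = {
    ("An", "BOUGHT"): ["router"],
    ("Binh", "BOUGHT"): ["router"],
    ("router", "BOUGHT_BY"): ["An", "Binh"],
}

def also_bought(graph, customer):
    """Who else bought what this customer bought? A direct edge traversal."""
    others = set()
    for product in graph.get((customer, "BOUGHT"), []):
        for buyer in graph.get((product, "BOUGHT_BY"), []):
            if buyer != customer:
                others.add(buyer)
    return others

print(also_bought(graph, "An"))  # {'Binh'}
```

A dedicated graph database generalises this idea, indexing relationships so such traversals stay fast as the data grows.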
Different sectors have invested in big data technologies because of the promise of valuable business insights. As a result, various industries express a need for connected data, particularly when it comes to connecting people such as employees or customers to products, business processes and Internet of Things (IoT) devices.
In an exclusive interview with Mohit Sagar, CEO and Editor-in-Chief of OpenGov Asia, Chandra Rangan, Chief Marketing Officer of Neo4j, shared his insights on why a connected data strategy is of paramount importance in building a smart nation.
Connected data enables businesses
A great example of the power of graph technology, and a very common use case for Neo4j, is its use in the financial sector to uncover fraud. Finding fraud is all about trying to make connections and understand relationships, Chandra elaborates. A graph-based system could detect if fraud is taking place in one location and determine if the same scenario has occurred in other locations.
“How does one make sense of this? Essentially, you are traversing a network of interconnected data using the relationships between that data. Then you begin to see patterns develop and these patterns provide you with answers so that you can conclude whether there is fraud.”
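The traversal Chandra describes can be sketched as a toy example (entirely hypothetical data, not Neo4j’s implementation): accounts and the attributes they share, such as devices and addresses, form a graph, and a breadth-first search flags an account that sits within a few hops of a known-fraudulent one.

```python
from collections import deque

# Hypothetical graph: accounts link to the devices/addresses they use, and
# each attribute links back to every account that uses it.
edges = {
    "acct:A": ["device:1", "addr:X"],
    "acct:B": ["device:1"],            # shares a device with A
    "acct:C": ["addr:X", "device:2"],  # shares an address with A
    "device:1": ["acct:A", "acct:B"],
    "addr:X": ["acct:A", "acct:C"],
    "device:2": ["acct:C"],
}

def within_hops(graph, start, targets, max_hops):
    """Breadth-first search: is any target reachable within max_hops edges?"""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if node in targets and node != start:
            return True
        if dist < max_hops:
            for nbr in graph.get(node, []):
                if nbr not in seen:
                    seen.add(nbr)
                    frontier.append((nbr, dist + 1))
    return False

known_fraud = {"acct:B"}
# acct:A reaches acct:B in two hops via the shared device.
print(within_hops(edges, "acct:A", known_fraud, max_hops=2))  # True
```

A production graph database applies the same traversal logic over millions of nodes, which is what makes the near-real-time detection described below feasible.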
What is of great concern is that fraud now occurs with much greater frequency and with a higher success rate. The key to stopping it and mitigating its impact is time. Instead of detecting fraud that occurred hours or days ago,
“What if the organisation could detect it almost immediately and in real-time as it occurs?” asks Chandra. “Graph offers this kind of response and is why it’s a great example of value!”
Supply chain management is another excellent example of ROI. One of Neo4j’s clients, which operates arguably the largest rail network in the United States and North America, created a digital twin of the entire rail network and all the goods moving across it. With graph technology across their network, they can now perform all kinds of interesting optimisation much faster, leading to better, more efficient outcomes for their entire system.
The pandemic has taught the world about the value and fragility of supply chains. Systems across the globe are being reimagined as the world’s economies realise the need to become more digital and strategic. More supply sources, data, data sharing, customer demands, and increased complexity necessitate modern, purpose-built solutions.
Apart from all the new expectations and requirements for modern supply chains, systems need to and are becoming more interconnected because of new technologies.
Maintaining consistent profitability is difficult for asset-intensive firms. Executives must oversee intricate worldwide supply chains, extensive asset inventories and field operations that dispatch workers to dangerous or inaccessible places.
With this, organisations need a platform that connects their workforces and makes them more capable, productive and efficient. A platform that provides enterprises with real-time visibility and connectivity, while also assuring efficiency, safety, and compliance.
Modern technologies are required to improve interconnectivity, maximise the value of data, automate essential procedures, and optimise the organisation’s most vital workflows.
Modern data applications require a connected platform
“When we programme, when we create applications, we think in what we are calling a graph. This is the most intuitive approach that you can have,” says Chandra.
Any application development begins with understanding the types of questions people want to answer and then mapping them to the range of outcomes they want to achieve. These are typically captured in what is known as an entity relationship diagram.
As individuals increasingly rely on systems that work in ways that make sense to them and support them, those systems become ever more critical. And frequently, when these systems struggle under complexity, Neo4j makes sense of that complexity and simplifies what needs to be done, resulting in a significant acceleration.
As the world becomes more collaborative, integrated, and networked, nations must respond more quickly to changes in their business environment brought on by the digital era; otherwise, they risk falling behind or entering survival mode.
The proliferation of new technologies, platforms, and devices, as well as the evolving nature of work, are compelling businesses to recognise the significance of leveraging the most recent technology to achieve greater operational efficiencies and business agility.
A graph platform connects individuals to what they require, when and where they require it. It augments their existing processes by facilitating the effective recording and management of personnel data. Neo4j Graph Data Science assists data scientists in finding connections in huge data sets to resolve important business issues and enhance predictions.
Businesses employ insights from graph data science to discover activities that point to fraud, find similar entities or people, enhance customer satisfaction through improved recommendations, and streamline supply chains. The dedicated workspace combines data ingestion, analysis, and management for simple model improvement without workflow reconstruction.
As a result, people are more engaged, productive, and efficient with connected data. Nations can bridge information and communication gaps between executive teams, field technicians, plant operators, warehouse operators and maintenance engineers. Increasing agility and productivity offers obvious commercial benefits.
In short, organisations can easily integrate their whole industrial workforce to increase operational excellence and decrease plant downtime, hence maximising revenues. This methodology is grounded in a collaborative platform approach.
Contextualising data increases its value
According to Chandra, data is a representation of the world in which people live. As the world becomes more connected, people no longer live in silos but remain interlinked in society.
“If you think about data as the representation of the world that we live in, it is connected data and we can deal with all the complexities that we need to deal with when we try to make sense out of it,” explains Chandra.
Closer to home, connected data is crucial to Singapore’s development as a smart nation. “Connected data is at the centre of each of those conversations around developing the nation, when you think of Singapore as a connected ecosystem and when you think about citizens, services, logistics, contact tracing, and supply chains.”
Chandra believes that these attributes preserve the connection between data and people, which is why connections matter. Once people understand those connections, it becomes much easier and faster to derive the insights that are required.
Without connected data, organisations lack key information needed to gain a deeper understanding of their customers, build a complete network topology, deliver relevant recommendations in real-time, or gain the visibility needed to prevent fraud.
Thus, “knowing your customer is understanding connected data.” With the right tools, data can be a real-time, demand-driven asset that a financial institution can use to reinvent ineffective processes and procedures and change how it interacts with and understands its customers.
“Me as a person – who I am, my name, where I live – these are all properties of who I am. But what really makes me me, are the relationships I have built over time. And so, the notion that almost every problem has data that you can really make sense of with graphs is the larger “Aha” moment,” Chandra ends.
Legacy systems are pieces of hardware or software that are out of date but still in use. These systems frequently have problems and are incompatible with more modern ones. Although they can be used in the manner intended by their creators, they cannot be improved.
Such systems are the backbone of many organisations, which rely on software, apps, and IT solutions that are crucial to the general operation of the business but are obsolete and, in some cases, no longer supported by the original software vendor or developer.
While running legacy systems may not appear to be a big deal, they do present a unique set of challenges and potential issues that organisations would be remiss to ignore.
Thus, obsolete legacy systems are at best a nuisance and, at worst, can undermine an organisation’s entire IT security strategy, severely impeding productivity. Furthermore, the longer a company waits to modernise a legacy system, the more difficult the transition becomes.
However, system modernisation is always a prerequisite for digital transformation. Most firms will be unable to fully grasp the benefits of new technologies and solutions without it.
Due to the rapid development of technology, businesses must maintain compatibility with legacy systems that impede the implementation of contemporary technologies.
Against this backdrop, the Centre for Strategic Infocomm Technologies (CSIT) employs technology to facilitate and advance Singapore’s national security. Due to the highly classified nature of its work, its environment must be air-gapped.
This means that development and deployment are conducted in networks that are not connected to the internet. Consequently, all platforms had to be installed on-premises.
Despite not being able to utilise internet-connected services, CSIT has a Cloud Infrastructure and Services section that offers developers the necessary infrastructure to concentrate on software development.
Further, a monolithic system is a large application consisting of code built by several developers over many years. Frequently, such code is inadequately maintained, and some of those developers may have left the development team or the organisation, leaving knowledge gaps.
Due to a lack of expertise and the difficulty of modifying a system that is constantly in use in production, refactoring the code is comparable to replacing the tyres on a moving car.
Keeping a legacy system results in greater maintenance and support costs and decreased efficiency. Since the monolith system was still essential, CSIT opted for a more manageable strategy: decomposing it into smaller services using the microservices methodology.
Microservices, on the other hand, are software programmes that execute a business function as part of a larger system yet are separate services. These services are intended to be lightweight and straightforward to implement.
Microservices have the following advantages: each service is independently scalable; services have smaller code bases that make them easier to maintain and test; and problems are isolated to a single service, allowing for faster troubleshooting.
In addition, there are two main microservice architectures to consider when implementing the microservices approach, each with advantages and disadvantages suited to specific use cases. Orchestration, as the name suggests, requires an orchestrator that actively controls the work of each service, whereas choreography takes a less stringent approach, allowing each service to carry out its work independently.
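The contrast between the two styles can be sketched in a few lines of Python (all names and the “order placed” flow here are hypothetical, not CSIT’s actual services): an orchestrator owns the whole workflow and calls each service in turn, while choreographed services subscribe to events and react independently.

```python
# Two hypothetical services, modelled as pure functions on an order record.
def bill(order):
    return {**order, "billed": True}

def ship(order):
    return {**order, "shipped": True}

# Orchestration: a central coordinator invokes each service in sequence and
# owns the end-to-end flow.
def orchestrator(order):
    order = bill(order)
    order = ship(order)
    return order

# Choreography: services register for events; no single component owns the
# flow, each service reacts to what happened before it.
subscribers = {"order_placed": [], "order_billed": []}

def publish(event, order):
    for handler in subscribers[event]:
        order = handler(order)
    return order

subscribers["order_placed"].append(lambda o: publish("order_billed", bill(o)))
subscribers["order_billed"].append(ship)

print(orchestrator({"id": 1}))             # {'id': 1, 'billed': True, 'shipped': True}
print(publish("order_placed", {"id": 2}))  # {'id': 2, 'billed': True, 'shipped': True}
```

Both paths yield the same result; the trade-off is that orchestration makes the workflow easy to read and monitor in one place, while choreography removes the central bottleneck at the cost of a flow that is spread across event handlers.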
Microservices architecture may not be appropriate for all projects, and the choice of architecture should be based on the needs of the project; CSIT therefore advises teams to expect new problems to arise and to be prepared to adapt to them.
Malaysia’s Minister of Science, Technology and Innovation recently witnessed the signing of a Memorandum of Understanding (MoU) between two leading firms. Under this MoU, one of the firms, a leading company from Japan and well-known internationally, has been granted the exclusive right to distribute the graphite and graphene produced by a Malaysia-based graphene and graphite producer in Japan.
Given the Japanese firm’s strong track record and Japan’s status as one of the main hubs of the automotive industry, this collaboration is expected to help promote Malaysian-made graphene materials.
The graphene producer, meanwhile, has pioneered an innovative approach to producing graphene from palm kernels. This patented technology is a solution that will advance the development of graphene. The technology, which uses the palm kernel as the main ingredient, also gives Malaysia an advantage in supplying raw materials for graphene production, since the country is one of the world’s largest producers of palm oil.
According to the Minister, this collaboration can provide great benefits to both firms since they are major players in their respective industries. It will provide significant revenue growth to both companies, further driving the Malaysian and Japanese economies.
In the long term, the aim is for the marketing of graphene produced by the Malaysian firm to benefit Malaysia, as it has the potential to supply local companies with graphene products under the ‘Graphenovation’ programme and to enable Malaysia to become one of the world’s graphene exporters one day. In addition, it will serve as an incentive for companies that want to take advantage of the great potential of graphene to invest in Malaysia.
The Ministry of Science, Technology and Innovation (MOSTI) also suggested that the firm establish partnerships with business development services in Malaysia, particularly in the supply chain and in the development of downstream graphene products and applications, in addition to obtaining GrapheneVerify certification for the product to strengthen its presence through domestic and international recognition.
The global graphene market was valued at US$ 87.5 million in 2019 and is projected to reach US$ 876.8 million by 2027, growing at a CAGR of 40.2% from 2020 to 2027.
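As a quick sanity check on the reported growth figure, the compound annual growth rate can be computed directly from the endpoint values (assuming seven compounding years over 2020-2027 from the 2019 base; the exact base-year convention is an assumption, which is why the result only approximates the reported rate).

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# US$ 87.5 million (2019) to US$ 876.8 million (2027), seven compounding years.
rate = cagr(87.5, 876.8, 7)
print(f"{rate:.1%}")  # ≈ 39.0%, in the ballpark of the reported 40.2% CAGR
```

The small gap between ~39% and the reported 40.2% likely comes down to which base year and intra-year values the analysts compounded from.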
Graphene, the world’s first 2-D carbon material, is widely regarded as a “wonder material” and is ideal for many applications. It is resistant to fire, an effective conductor, extremely versatile, 200 times stronger than steel and ultra-light.
Moreover, graphene is regarded in the chemical industry as an effective catalyst because of its properties such as high surface area and adsorption power. The rise in demand for chemicals worldwide is expected to increase graphene demand and thus, drive the growth of the global graphene industry.
Factors expected to fuel the growth of the graphene market include growing purchasing power and increasing demand for consumer electronics such as tablets and mobile phones. In addition, graphene oxide-based transparent conductive films are used as a raw material in automobiles to make them safer and lighter.
However, the toxic nature of graphene and the risk involved in the graphene production process are expected to hamper the global graphene market growth over the projected period. Furthermore, continuous R&D activities around the globe and large-scale graphene production through renewable sources, in particular the use of value-added chemicals, are expected to give the industry enormous opportunities for growth.
To better serve and protect communities, maintain data security at scale, and perform essential tasks, all government agencies must establish a strong, contemporary data infrastructure that supports data innovation.
Government and the public sector stand to gain considerably by adopting AI into every element of their job. Government AI must consider privacy and security, compatibility with old systems, and changing workloads.
Artificial intelligence is already being used to help run the government, with cognitive applications doing everything from reducing backlogs and cutting costs to handling tasks that humans cannot easily do, such as predicting fraudulent transactions and identifying criminal suspects using facial recognition.
While AI-based technology may fundamentally transform how public-sector employees do their jobs in the coming years — such as eliminating some jobs, redesigning countless others, and even creating entirely new professions — it is already changing the nature of many jobs and revolutionising aspects of government operations.
AI in government services is centred on machine learning and deep learning, computer vision, speech recognition, and robotics. When used correctly, these techniques yield real, measurable results.
Meanwhile, cyber anomaly detection has the potential to transform cybersecurity strategies in government systems. The possibilities are endless, but they are only now taking shape.
The OpenGov Breakfast Insight on 4 August 2022 offered the most cutting-edge innovative method for enabling large-scale analytics in the public sector.
Public Sector Services Powered by Data and AI
Kicking off the session, Mohit Sagar, CEO & Editor-in-Chief of OpenGov Asia, acknowledged that data and artificial intelligence will drive the future of government services. “With a unified data platform, the public sector will be able to better serve citizens and protect their communities.”
Governments, in general, are one of the world’s largest employers, with numerous ministries, agencies and departments. The vast network of offices and services introduces significant complexity, operational inefficiencies and, frequently, a lack of transparency.
Agencies must deal with massive amounts of data in various structured and unstructured formats, which will only increase over time. Moreover, legacy systems and traditional data warehouses leave them unable to recognise or take advantage of the full potential of data, analytics and AI. Data is, more often than not, siloed within agencies and departments, undermining their efforts to undergo digital transformation.
To generate real-time actionable insights and make data-driven decisions, data must be securely shared and exchanged at scale. Giving government organisations and policymakers access to deeper, more relevant insights for decision-making is only possible through data modernisation.
Much of the information that government agencies oversee is extremely sensitive, including information about the nation’s infrastructure, energy and education, as well as personal health and financial matters. Data protection at every level of the platform must be ensured through tight integration with granular cloud-provider access controls.
The fact is that citizens stand to gain from the more individualised and effective services, enhanced national security and wiser resource management that a robust data strategy can deliver.
By integrating data with analytics and AI, government agencies can readily access all their data for downstream advanced analytics that support complex security use cases.
With such a platform, government security operations teams can quickly identify sophisticated threats, minimising the need for manual effort through analytical automation and collaboration, and speeding up investigations from days to minutes.
Data stored by public sector bodies can be extremely valuable when shared with other departments and used to elevate data-driven decision-making. The time has come to leverage the cloud’s scale and democratise secure data access to enable downstream BI and AI use cases, allowing government agencies to accelerate innovation.
Governments can improve citizen services while implementing smarter, more transparent governance by leveraging data, analytics and AI for actionable insights at scale. Doing so eliminates data silos and improves communication and collaboration across agencies, delivering personalised citizen services alongside the data security and cyber resilience that citizens expect.
Building a Scalable Data, Analytics and AI Strategy with Lakehouse Platform
Data infrastructure is an essential aspect of data processing and analysis, according to Chris D’Agostino, Global Field CTO, Databricks.
The complete backend computing support system needed to process, store, transfer and preserve data is referred to as the “data infrastructure.” Without the appropriate data infrastructures, businesses and organisations cannot extract value from their data.
“If there’s one thing that many of us all have in common, it’s that we believe in the impact that data and AI can and will have on the world,” says Chris. “Today, data and AI are transforming every major industry.”
At the same time, with the ongoing globalisation of artificial intelligence and machine learning, there is an increasing need to rethink an organisation’s entire leadership and thought process, from product strategy and customer experience to strategies for increasing the efficiency of human resources.
Cloud data architectures contain the rules, models and policies that specify how data is gathered, stored, used and managed in the cloud within a company or organisation. They govern how that data flows, is processed, and is distributed to stakeholders and other applications for reporting, analytics and other purposes.
Every year, data collection by businesses and organisations increases thanks to IoT and new digital streams. In this climate, cloud data architecture-based data platforms are displacing more conventional data platforms, which are unable to handle the growing data quantities and increasingly demanding end-user applications like machine learning and AI.
Companies are using all available data to expedite, automate and improve decision-making to increase resilience and obtain a competitive edge in the market. These methods for digital transformation are supported by AI and data literacy.
To fully realise the benefit of data and AI, change management is necessary, just like with any change in working practices. It is essential to create a cohesive and evolving plan. This can be based on three pillars: business strategy, operationalisation and architecture (after the technology barriers have been recognised).
Whether it’s a business strategy, data management, or organisational knowledge, it’s critical to assess the organisation’s level of maturity and data literacy.
Databricks Lakehouse Platform combines the best elements of data lakes and data warehouses to deliver the dependability, strong governance, and performance of data warehouses while also allowing for the openness, flexibility and machine learning support of data lakes.
By removing the data silos that normally segregate and complicate data engineering, analytics, BI, data science and machine learning, this unified approach streamlines the current data stack. To increase flexibility, it is created using open standards and open-source software.
Additionally, its shared approach to data management, security and governance lets teams work more productively and develop more quickly.
In a global research effort in collaboration with an institution, Databricks polled 117 data leaders and the survey’s findings were illuminating and instructive.
An analytics leader’s biggest regret and issue was not embracing an open standards-based data architecture. “This didn’t surprise us. We are seeing many of our clients adopting the best open-source technologies,” Chris reveals.
In addition, the poll showed that only a small group have been successful with their AI projects, even as multi-cloud becomes a growing reality.
Most executives say they are currently evaluating or implementing a new data platform to address their current data challenges. During these challenging times, cloud technologies allow businesses to respond and scale rapidly.
With scalable data, analytics and AI strategy, organisations can create significant value. They can implement real-time monitoring, create tailored customer experiences, deploy predictive analytics, and much more. Databricks offers tools that are specifically designed to address the challenges described.
In Conversation With: The Future of Government Services and Shared Data
All the government agencies’ data must be protected and every component must be safeguarded. Unified data with analytics and AI makes it simpler to provide quick access for the organisation’s teams and complete support for security use cases.
Joseph Tan, Deputy Director (Capability Development), Data Science & Artificial Intelligence Division, Government Technology Agency, emphasised the importance of data modernisation with a holistic approach. A policy-driven approach that organisations can entrust with their data will lead to better customer service.
Joseph is convinced that “As technology advances, most businesses are confronted with issues caused by an existing legacy system. Instead of providing companies with cutting-edge capabilities and services such as cloud computing and improved data integration, a legacy system keeps a business constrained.”
A legacy system is computer software or hardware that remains in use despite being outdated. The system still meets the needs for which it was originally designed, but it does not allow for expansion. Because a legacy system can only do what it does now for the company, it will never be able to interact with newer systems.
“A business might keep using an old system for more than one reason. In the world of investments, for example, upgrading to a new system requires an initial investment of money and people, while keeping an old system running costs money over time,” Joseph explains.
On the other hand, when a whole company moves to a new system, there can be some internal resistance and worries about how hard it will be and what might go wrong. For example, legacy software might have been made with an old programming language, which makes it hard to find staff with the right skills to do the migration.
Additionally, there might not be much information about the system, and the people who made it might have left the company. It can be hard to just plan how to move data from an old system to a new one and figure out what needs the new system will have.
Increased security risk, instability and inefficiency, incompatibility with new technology, damaged company perception, costly new-hire training, single points of failure and a lack of documentation are a few of the issues that older systems run into.
At best, outdated legacy systems are a pain, and at worst, they can seriously jeopardise an organisation’s overall IT security strategy. Furthermore, the longer a business waits to update a legacy system, the more challenging the transition will be.
System modernisation is almost always a must before digital transformation can occur. Most businesses won’t be able to fully profit from contemporary technologies and solutions without it. “With this, finding the right talent would be very beneficial for the organisation to manage their modern technologies,” says Chris.
Updating legacy systems has several advantages. Enterprises can enhance and sustain their IT security by taking advantage of future vendor upgrades and fixes. Modern systems and solutions, including retrofitted legacy systems, are built to deliver optimal performance without consuming excessive amounts of computational power.
Even a legacy system may be modernised to include new features, giving the business users additional capability and a better user experience. The truth is that updated legacy systems require less input from IT staff, freeing them up to focus on activities that really benefit a company.
Similarly, governments all over the world will undergo a fundamental upheaval because of big data and artificial intelligence. Even though the public sector has long used data, the potential and actual use of big data applications have an impact on some theoretical and practical aspects of decision-making. This is fuelled by both the data revolution and the concurrent advancement of advanced analytics.
The availability of data that can be employed in the machine learning process is a major aspect of the maturing of AI technology and the practicality of AI applications in public policy and administration.
However, without the underlying analytical technologies, the data revolution can be seen as only a change in the size of the data that is currently available rather than a fundamental change. As predictive analytics, innovative data and artificial intelligence gain prominence, it is critical to understand their roles in the public sector.
At the start of their data journey, organisations require data capture systems to discover information embedded in all levels of business operations. Following that, the data must be validated for informational accuracy and integrated to reduce the risk of drawing incorrect conclusions and to create a unified view of the business.
The final step is analysis, in which businesses collaborate with data analysts who use cutting-edge analytics tools to peel back layers of proprietary data in search of insights to power change.
Larger companies with more complex data integration and analytics processes can add predictive analytics as the fourth step.
When analysing enormous datasets (often referred to as “big data”), predictive data analytics, also referred to as advanced analytics, uses autonomous or semi-autonomous algorithms to make predictions based on information patterns. Data analysts may provide clients with greater service, which can result in more meaningful transformations, by delivering deeper insights into company data more quickly.
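The journey described above can be sketched end to end. Everything here is hypothetical: the two source systems, the figures and the naive growth-based forecast stand in for real capture, integration, analysis and predictive tooling.

```python
import statistics

# Step 1: capture — monthly service-request volumes from two hypothetical
# agency systems, each holding only its own slice of the picture.
capture_a = {"2022-01": 120, "2022-02": 135, "2022-03": 150}
capture_b = {"2022-01": 80, "2022-02": 85, "2022-03": 90}

# Step 2: validate and integrate into a single unified view.
def integrate(*sources):
    months = sorted(set().union(*sources))
    unified = {}
    for month in months:
        values = [src[month] for src in sources if month in src]
        assert all(v >= 0 for v in values), "invalid negative volume"
        unified[month] = sum(values)
    return unified

unified = integrate(capture_a, capture_b)

# Step 3: analyse — month-on-month growth across the unified series.
volumes = list(unified.values())
growth = [(b - a) / a for a, b in zip(volumes, volumes[1:])]

# Step 4: predictive analytics — a naive next-month forecast
# that extrapolates the average observed growth rate.
forecast = volumes[-1] * (1 + statistics.mean(growth))
print(round(forecast))  # → 263
```

Real pipelines swap the naive extrapolation for trained models and add governance around each stage, but the capture → validate/integrate → analyse → predict progression is the same.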
Think about how AI and machine learning might be used in the context of the data processing flow. Analytics tools assist data analysts in identifying areas for improvement in the business after private data has been collected, analysed and combined into a single view.
AI excels at discovering data patterns that humans cannot perceive, and this scales readily with the size of the dataset. To make data analytics frictionless, machine learning algorithms can also adapt to data-pipeline input and human behaviour patterns. This can be accomplished by using natural language processing to recode communications between individuals within an organisation so that algorithms can comprehend and act on them.
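A toy stand-in for that recoding step, with hypothetical rules and tags: real systems use trained language models, but simple keyword rules show how free-text messages can be turned into structured actions an algorithm can act on.

```python
import re

# Hypothetical rules mapping free-text staff messages to structured tags.
RULES = [
    (re.compile(r"\b(urgent|asap|immediately)\b", re.I), "priority:high"),
    (re.compile(r"\b(fraud|suspicious|anomal\w*)\b", re.I), "route:security"),
    (re.compile(r"\b(report|dashboard)\b", re.I), "route:analytics"),
]

def recode(message):
    """Return structured tags for a message; unmatched messages go to triage."""
    tags = [tag for pattern, tag in RULES if pattern.search(message)]
    return tags or ["route:triage"]

print(recode("Suspicious transactions in the report, please check ASAP"))
# → ['priority:high', 'route:security', 'route:analytics']
```

In place of keyword rules, a production system would classify intent and priority with a trained model, but the output is the same kind of machine-actionable structure.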
Artificial intelligence and machine learning have become the “next big thing” in the government sector.
Smart solutions enable advances that are self-sustaining and AI and ML are at the heart of these. Executives and practitioners agree that AI and ML are catalysts and drivers across both the public and private sectors. As an AI system has a deeper understanding of data platforms and processes, it can continue to enhance its efficacy and capacity to provide personalised insights from massive data silos.
In closing, Chris shared that Databricks was established in 2013 to assist data teams in resolving the most challenging issues facing the globe, and that they have been investing in the Asia Pacific region to further this objective. “While there are countless possibilities, there are several challenges as well.”
It is insufficient to merely fund and use AI technologies. Businesses and organisations need a talent pool of experts that can use these AI tools in a way that can guarantee the greatest outcomes.
Currently, customers from a wide spectrum of businesses are collaborating with Databricks to tailor their clients’ experiences, improve their capacity to react to market dynamics, and safeguard both their own and all stakeholders’ interests. This is most evident in real time, where it helps financial services organisations deal with fraud.
“My particular favourite is Databricks’ assistance in Mitsubishi Tanabe’s efforts to quicken drug clinical trials in Japan. The possibilities for our collaboration are virtually endless,” Chris reflects.
Mohit recognises that digital transformation is vital in today’s VUCA environment. What is essential is that industry and government collaborate and work together. For long-term success and sustainability, there have to be partnerships between the public and private sectors.
Strategic alliances give businesses and government agencies a competitive edge. Partnerships are mutually beneficial, helping each partner grow and improve. When people genuinely try to help each other, “it can help to get over certain weaknesses and be first movers in their field.”