As part of CSA’s efforts to reach a wider audience, the Cyber Savvy Machine Pop-Up is an interactive machine that aims to educate citizens on cyber hygiene. It has been stationed at a different library each month since November 2018 to reach a diverse group of Singapore citizens, including the young and the elderly, who may need assistance online.
Cultivating good cyber hygiene
The Pop-Up is an interactive experience that aims to cultivate good cyber hygiene among internet users. Library-goers can participate in an interactive quiz on the Cyber Savvy Machine and pick up cyber tips from the information panels and flyers. By answering five simple cybersecurity questions correctly, participants will receive a small prize.
The initiative reminds library-goers to use anti-virus software, use strong passwords, enable two-factor authentication (2FA), spot the signs of phishing, and update their software regularly and as soon as updates are available.
Cyber Savvy Machine popular with library-goers
Since its inception, the machine has registered almost 20,000 attempts with a constant stream of people lining up to attempt the quiz on weekends. It is encouraging to see library-goers brushing up their cybersecurity knowledge by studying the information panels and the recommended list of books on cybersecurity.
Cyber initiative backed up by Nanyang Polytechnic student ambassadors
Nanyang Polytechnic student ambassadors have been supporting this initiative by showcasing their cyber savvy game one Saturday a month. The game requires participants to answer cybersecurity-related questions to stop bombs from destroying a city.
The student ambassadors will also help interested library-goers conduct non-intrusive mobile phone health checks by scanning their phones to check for possible connections to malicious websites.
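At its core, a health check like the one the ambassadors conduct compares the domains a phone has contacted against a list of known-malicious domains. A minimal sketch in Python, using entirely made-up connection and blocklist data (this is not CSA’s or the students’ actual tooling):

```python
# Hypothetical data: domains a phone recently connected to,
# and a blocklist of known-malicious domains.
recent_connections = [
    "updates.example.com",
    "cdn.example.net",
    "free-prizes.bad.example",
]
blocklist = {"free-prizes.bad.example", "phish.bad.example"}

def health_check(connections, blocked):
    """Return the subset of contacted domains that appear on the blocklist."""
    return sorted(set(connections) & blocked)

flagged = health_check(recent_connections, blocklist)
if flagged:
    print("Possible malicious connections:", flagged)
else:
    print("No known-malicious connections found")
```

Real checks would also inspect app permissions and compare against regularly updated threat feeds, but the set-intersection idea is the same.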
Cyber Savvy Machine touring the island of Singapore
The Pop-Up has made its appearance at various libraries islandwide, including Jurong Regional Library, Woodlands Regional Library, Bedok Public Library and Ang Mo Kio Public Library. It will visit different public libraries until October 2019 and is stationed at the Central Public Library throughout March 2019.
A multidisciplinary team of Massachusetts Institute of Technology (MIT) researchers led by Iddo Drori, a lecturer in the MIT Department of Electrical Engineering and Computer Science (EECS), has used a neural network model to solve university-level math problems at a human level in a matter of seconds.
“It will help students improve, and it will help teachers create new content, and it could help increase the level of difficulty in some courses. It also allows us to build a graph of questions and courses, which helps us understand the relationship between courses and their pre-requisites, not just by historically contemplating them, but based on data,” explained Drori, who is also an adjunct associate professor at Columbia University’s Department of Computer Science.
Additionally, the model automatically explains solutions and rapidly generates new math problems for university-level courses. When the researchers presented these machine-generated questions to university students, the students were unable to distinguish whether the questions were created by a human or an algorithm.
This approach might be used to simplify the creation of course content, which would be particularly beneficial for big residential courses and massive open online courses (MOOCs) with thousands of students. The technology might also be used as an automated tutor that demonstrates to students how to solve basic math problems.
In the past, researchers employed neural networks, such as GPT-3, that were pretrained only on text: the model is shown millions of examples of text so that it learns the patterns of natural language. This time, they employed a neural network that was pretrained on text and then “fine-tuned” on code.
This network, known as Codex, goes through what is effectively an additional pre-training step, which improves the performance of the resulting machine learning model.
Codex was exposed to millions of code examples from internet repositories. Because its training data contained millions of natural language words as well as millions of lines of code, the model learns the relationships between text and code.
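A distinctive feature of this approach is that, rather than emitting an answer directly, the code-tuned model generates a program whose execution produces the answer. As a hand-written illustration (not the model’s actual output) of what such a generated program might look like for the question “What is the probability that two fair dice sum to 7?”:

```python
from fractions import Fraction
from itertools import product

# A program of the kind a code-tuned model might generate:
# enumerate all 36 rolls of two dice and count those summing to 7.
def probability_sum_is_seven():
    outcomes = list(product(range(1, 7), repeat=2))
    favourable = [o for o in outcomes if sum(o) == 7]
    return Fraction(len(favourable), len(outcomes))

print(probability_sum_is_seven())  # 1/6
```

Executing the program, rather than trusting the model’s arithmetic, is what lets the system reach exact answers in seconds.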
The machine-generated questions were evaluated by showing them to university students. The researchers assigned students 10 problems from each undergraduate math course in random order; five questions were prepared by people and the remaining five were generated by a computer.
Students were unable to discern whether the machine-generated questions were produced by an algorithm or a human, and they scored the difficulty level and course-appropriateness of questions generated by humans and machines similarly.
Researchers emphasised that this effort is not meant to take the place of actual teachers. They claim that although automation has reached 80 per cent accuracy, it will never reach 100 per cent. Every time someone figures something out, someone else will pose a more challenging problem.
Put simply, this work opens the door for people to begin using machine learning to answer ever-harder questions, and academics are optimistic that it will have a significant impact on higher education.
The team has expanded the work to handle math proofs because of the approach’s effectiveness, although there are several limits they intend to address. Due to computational complexity, the model is currently unable to answer questions with a visual component or resolve computationally intractable issues.
Beyond these obstacles, the team is scaling the model up to hundreds of courses. Those hundreds of courses will produce more data, which the researchers can use to improve automation and offer insights into course design and curricula.
The Science and Technology Academic and Research-Based Openly Operated Kiosks or STARBOOKS of the Department of Science and Technology (DOST) have arrived on the island of San Miguel in Tabaco, Albay, providing easy access to S&T learning.
STARBOOKS is the country’s first digital science library, created by the Science and Technology Information Institute (DOST-STII). It is a stand-alone information source intended for those who have limited or no access to S&T information resources.
The project’s goal is to provide Science, Technology, and Innovation (ST&I) content to geographically isolated schools and communities across the country. STARBOOKS contains many digitised S&T resources in various formats, such as text, video and audio, organised in specially designed “pods” with an easy-to-use interface.
STARBOOKS, as SMNHS teacher John Darnell Balbastro put it, is “one way of elevating the scientific and technological literacy” of their students. Its wide range of digitised S&T resources in various formats will “intensify the curiosity among our young learners,” and its offline access will address the lack of S&T learning resources in San Miguel.
Through this programme, DOST Region V, in collaboration with its dedicated Provincial S&T Centres and implementers, will continue to promote and empower S&T knowledge and education.
Meanwhile, Jamaica Pangasinan, Senior Science Research Specialist at the Space Mission Control and Operations Division (SMCOD) of the Philippine Space Agency (PhilSA), said that she was impressed by the level of environmental and social awareness of the incoming senior high school students, which was shown in their work at the “LIFT OFF: PhilSA Space Science Camp 2022.”
She said that the mission goals showed how eager the students were to solve the problems and threats facing the environment right now.
Fourteen science high schools from the 16 divisions of Metro Manila, chosen by the Department of Education (DepEd) to attend the camp, presented their space missions. Each team had five minutes to talk about their satellite’s mission, its most important technical features, and why it was important.
The students came up with a wide range of missions, from observing Earth to keeping an eye on space junk to sending probes to other planets.
Two missions stood out from the rest. These are the Monitoring Illegal Mining Activities in Remote Areas (MIMA) by Bianca Louise B. Cruz and Oscar A. Araja II of the City of Mandaluyong Science High School, and the Venus Seismic Activity Monitoring Satellite (V-SAMS) by Peter James Lyon and Ysabela Juliana Bernardo of the Caloocan City Science High School.
The students who work on MIMA said that the goal of their satellite mission is to protect the environment and make sure that mining laws and rules are followed better in the country. Based on their plan, MIMA would be a Synthetic Aperture Radar (SAR) satellite that could see through clouds to spot changes in areas where mining could be happening. It would take pictures with the help of optical imagers.
The goal of V-SAMS, on the other hand, would be to learn more about Venus, which is like Earth’s twin, and especially about its earthquakes. To do this, V-SAMS would use infrared imaging to track the surface temperature of Venus’s volcanoes, figure out which ones will erupt, and find other volcanoes that are still active on the planet.
It would also have an interferometric SAR (InSAR) to look for changes on Venus’s surface and signs of earthquakes. V-SAMS would also have an optical payload that would let it take high-resolution pictures.
The National Environment Agency (NEA) and the Singapore Land Authority (SLA) have signed a Memorandum of Understanding (MOU) to develop the use of Global Navigation Satellite System (GNSS) data from SLA’s Singapore Satellite Reference Network (SiReNT) to help NEA better monitor island-wide atmospheric moisture. The goal of the five-year partnership is to help Singapore with weather monitoring by giving it more data and making it easier to do exploratory studies for weather forecasting.
“The collaboration between NEA and SLA highlights our commitment to achieve synergies and tap on enablers across the public sector. This partnership provides a platform for NEA to utilise SLA’s expertise in GNSS data collection and processing, enabling NEA to explore non-traditional methods to enhance our weather monitoring and forecasting capabilities,” says Luke Goh, CEO, NEA.
On the other hand, Colin Low, CEO of SLA, said that SLA’s partnership with NEA is a part of its ongoing efforts to collaborate with parties from the public and commercial sectors to open up new applications for SiReNT and its other geospatial products.
SLA believes that combining the knowledge of multiple parties can lead to more innovation and the discovery of workable solutions that benefit Singapore and its industries.
Colin continued by saying that they are eager to collaborate with NEA to research the unique uses of SiReNT data for improved weather monitoring and research projects on weather forecasting and climate change. The many experiences that were gathered and shared during this partnership will serve as a foundation for upcoming developments in this area.
The production of accurate weather forecasts, climate monitoring, and timely warnings of dangerous weather events all depend on meteorological measurements. The Meteorological Service Singapore (MSS) routinely gathers a variety of observational data from ground-based and aircraft sensors, such as temperature, wind, and moisture.
To measure these weather components at various altitudes of the atmosphere, sensors linked to a weather balloon are routinely launched twice a day at MSS’ Upper Air Observatory (UAO). To enhance the sounding data from the weather balloon, MSS erected a GNSS reference station at UAO in 2019.
This station will provide continuous estimates of moisture in an atmospheric column known as the integrated precipitable water vapour.
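The conversion from a GNSS-derived zenith wet delay (ZWD) to integrated precipitable water vapour (PWV) is a standard calculation in GNSS meteorology. A simplified Python sketch of the widely used formulation follows; the constants are approximate textbook values and the surface-temperature shortcut for the mean atmospheric temperature is an approximation, so this is illustrative rather than MSS’s actual processing chain:

```python
RHO_W = 1000.0   # density of liquid water, kg/m^3
R_V = 461.5      # specific gas constant for water vapour, J/(kg*K)
K2_PRIME = 22.1  # atmospheric refractivity constant, K/hPa
K3 = 3.739e5     # atmospheric refractivity constant, K^2/hPa

def mean_temperature(surface_temp_k):
    """Approximate weighted mean atmospheric temperature Tm from surface temperature."""
    return 70.2 + 0.72 * surface_temp_k

def pwv_from_zwd(zwd_mm, surface_temp_k):
    """Convert zenith wet delay (mm) to precipitable water vapour (mm)."""
    tm = mean_temperature(surface_temp_k)
    # Dimensionless conversion factor Pi; /100 converts K/hPa to K/Pa.
    pi = 1e6 / (RHO_W * R_V * ((K3 / tm + K2_PRIME) / 100.0))
    return pi * zwd_mm

# Example: a 150 mm wet delay at a 300 K (~27 C) surface temperature
print(round(pwv_from_zwd(150.0, 300.0), 1))
```

The conversion factor works out to roughly 0.15 to 0.16, so a tropical wet delay of 150 mm corresponds to about 24 mm of precipitable water.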
In accordance with the MOU, SiReNT will incorporate MSS’s GNSS station, giving MSS access to continuous, almost real-time atmospheric moisture readings for the entire island. By supplying greater resolution and more frequent observation data, this non-conventional moisture data will complement MSS’s current observation network data and enable research into possible uses for weather forecasting.
The partnership will also help grow SLA’s SiReNT network, which currently consists of nine reference stations dispersed throughout Singapore. With the installation of NEA’s GNSS base receiver station at UAO, which will be integrated into SiReNT, and two anticipated additional coastal reference stations, the network will grow to 12 stations with more data receivable. The SiReNT system corrects positional inaccuracies in GNSS signals and can produce positioning data with an accuracy of up to 3 cm.
The SiReNT technology fosters innovation across a range of sectors, including autonomous driving, logistics and automation in the building industry, and monitoring of changes in Singapore’s land height and sea level.
The addition of stations by the end of 2022 will further increase the stability of the services and applications SiReNT now supports in several important industries. It can also be used in novel ways for scientific research on climate change.
Several domestic banks in Vietnam have 90% of their transactions conducted on digital platforms, surpassing the target of 70% set for 2025. Half of the country’s banking services are expected to be digitalised and 70% of transactions will be carried out online by 2025.
The Vietnamese Prime Minister, Pham Minh Chinh, recently stated that the banking sector has played a significant role in national digital transformation by deploying products and services for people and businesses. He urged the sector to further reform its management methods towards modernity and transparency and diversify and improve the quality of its products and services to curb money laundering.
Addressing an event called “Digital Transformation Day of the Banking Sector”, Chinh explained that the sector should work to understand more about the demands of people, businesses, and credit institutions to devise suitable legal documents, facilitating the application of digital technologies in banking services.
He asked the State Bank of Vietnam (SBV) to continue its close coordination with ministries and agencies to formulate a decree on cashless payments and submit it to the government. Common infrastructure, such as payment and credit information infrastructure, should be promoted. He also suggested stronger connectivity between banks and credit organisations.
Chinh also requested the sector ensure cybersecurity and safety in digital transformation, given the rise of high-tech crime. The sector should raise public awareness about the benefits of digital transformation, enhance personnel training capabilities, and boost international cooperation in digital transformation.
The Prime Minister also attended an exhibition showcasing products and services that promote the digital transformation of the banking sector. Chinh had a working session with representatives from the SBV and commercial banks. He congratulated the sector on its effective operations amid a host of difficulties, especially those caused by the COVID-19 pandemic. He suggested the sector further cut interest rates to support businesses and actively engage in the state’s policies, particularly housing credit for workers and low-income earners. Participants attributed the developments of banks to supportive policies adopted by the state, the management of the government, and stability in the country.
Vietnam’s financial technology market could grow to US$ 18 billion by 2024. The country is a leader among ASEAN members in terms of the volume of financing for fintech, second only to Singapore. Over 93% of all venture investments in the country are directed at e-wallets and the e-money segment. The total number of fintech companies has grown to 97 since 2016, an 84.5% increase. However, the number of newly-launched start-ups each year decreased from 11 to 2.
As OpenGov Asia reported, the market features high competitiveness and a high entry bar. Transaction volume has seen a 152.8% growth since 2016, with 29.5 million new fintech users. As a result, every second Vietnamese citizen uses at least one fintech service. Demand for digital services (transactions, payments, and wallets) in the country is high. According to industry analysts, Vietnam’s fintech sector is young and promising. The market valuation has increased from US$ 0.7 billion to US$ 4.5 billion since 2016.
Data is information that has been organised in a way that makes it simple to move or process. It is a piece of information that has been converted into binary digital form for computers and modern methods of information transmission.
Connected data, on the other hand, is a method of displaying, using, and preserving relationships between data elements. Graph technology aids in uncovering links in data that conventional approaches are unable to uncover or analyse.
Different sectors have invested in big data technologies because of the promise of valuable business insights. As a result, various industries express a need for connected data, particularly when it comes to connecting people, such as employees or customers, to products, business processes and Internet of Things (IoT) devices.
In an exclusive interview with Mohit Sagar, CEO and Editor-in-Chief of OpenGov Asia, Chandra Rangan, Chief Marketing Officer of Neo4j, shared his insights on why a connected data strategy is of paramount importance in building a smart nation.
Connected data enables businesses
A great example of the power of graph technology, and a very common use case for Neo4j, is its use in the financial sector to uncover fraud. Finding fraud is all about trying to make connections and understand relationships, Chandra elaborates. A graph-based system could detect if fraud is taking place in one location and determine if the same scenario has occurred in other locations.
“How does one make sense of this? Essentially, you are traversing a network of interconnected data using the relationships between that data. Then you begin to see patterns develop and these patterns provide you with answers so that you can conclude whether there is fraud.”
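The traversal Chandra describes can be illustrated with a toy graph. In this hypothetical sketch, accounts are linked when they share a device or phone number, and a breadth-first traversal surfaces the ring of accounts connected to a known fraudulent one (plain Python rather than Neo4j’s Cypher, purely for illustration; the account names are invented):

```python
from collections import deque

# Hypothetical account graph: an edge means "shares a device or phone number".
links = {
    "acct_A": ["acct_B"],
    "acct_B": ["acct_A", "acct_C"],
    "acct_C": ["acct_B"],
    "acct_D": ["acct_E"],
    "acct_E": ["acct_D"],
}

def connected_ring(graph, start):
    """Breadth-first traversal: every account reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen

# If acct_A is known to be fraudulent, its whole ring warrants review.
print(sorted(connected_ring(links, "acct_A")))  # ['acct_A', 'acct_B', 'acct_C']
```

A graph database performs exactly this kind of relationship-following natively, which is why it can surface fraud rings that row-by-row queries miss.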
What is of great concern is that fraud is occurring with much greater frequency and with a higher success rate nowadays. The key to stopping it and mitigating its impact is time. Instead of detecting fraud that occurred hours or days ago,
“What if the organisation could detect it almost immediately and in real-time as it occurs?” asks Chandra. “Graph offers this kind of response and is why it’s a great example of value!”
Supply chain management is another excellent example of RoI. One of Neo4j’s clients, which operates arguably the largest rail network in North America, created a digital twin of the entire rail network and all the goods moving across it. With graph technology across their network, they can now do all kinds of interesting optimisation much faster, leading to better, more efficient outcomes for their entire system.
The pandemic has taught the world about the value and fragility of supply chains. Systems across the globe are being reimagined as economies worldwide realise the need to become more digital and strategic. More supply sources, more data, data sharing, rising customer demands, and increased complexity necessitate modern, purpose-built solutions.
Beyond all the new expectations and requirements for modern supply chains, systems need to become, and are becoming, more interconnected because of new technologies.
Maintaining consistent profitability is difficult for firms with a high proportion of assets. Executives must oversee intricate worldwide supply chains, extensive asset inventories and field operations that dispatch workers to dangerous or inaccessible places.
With this, organisations need a platform that connects their workforces and makes them more capable, productive and efficient. A platform that provides enterprises with real-time visibility and connectivity, while also assuring efficiency, safety, and compliance.
Modern technologies are required to improve interconnectivity, maximise the value of data, automate essential procedures, and optimise the organisation’s most vital workflows.
Modern data applications require a connected platform
“When we programme, when we create applications, we think in what we are calling a graph. This is the most intuitive approach that you can have,” says Chandra.
Any application development begins with understanding the types of questions people want to solve and then mapping it to a wide range of outcomes that they want to achieve. These are typically mapped in what is known as an entity relationship diagram.
Individuals increasingly rely on systems that work in a way that makes sense to them and supports them, which makes those systems ever more critical. When they grow unwieldy or fail, Neo4j makes sense of the complexity and simplifies what needs to be done, resulting in a significant acceleration.
As the world becomes more collaborative, integrated, and networked, nations must respond more quickly to changes in their business environment brought on by the digital era; otherwise, they risk falling behind or entering survival mode.
The proliferation of new technologies, platforms, and devices, as well as the evolving nature of work, are compelling businesses to recognise the significance of leveraging the most recent technology to achieve greater operational efficiencies and business agility.
A graph platform connects individuals to what they require, when and where they require it. It augments their existing processes by facilitating the effective recording and management of personnel data. Neo4j Graph Data Science assists data scientists in finding connections in huge datasets to resolve important business issues and enhance predictions.
Businesses employ insights from graph data science to discover activities that point to fraud, find entities or people who are similar, enhance customer happiness through improved suggestions, and streamline supply chains. The dedicated workspace combines intake, analysis, and management for simple model improvement without workflow reconstruction.
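One of the graph-data-science patterns mentioned above, improving suggestions by finding similar entities, can be sketched with simple neighbourhood overlap (Jaccard similarity) on a toy purchase graph. The customers and products are invented, and production systems such as Neo4j Graph Data Science use far richer algorithms, but the core idea is the same:

```python
# Hypothetical purchase data: customer -> set of products bought.
purchases = {
    "alice": {"laptop", "mouse", "keyboard"},
    "bob":   {"laptop", "mouse", "monitor"},
    "carol": {"phone", "charger"},
}

def jaccard(a, b):
    """Overlap between two customers' product sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b)

def recommend(target, data):
    """Suggest products bought by the most similar other customer."""
    others = [c for c in data if c != target]
    best = max(others, key=lambda c: jaccard(data[target], data[c]))
    return sorted(data[best] - data[target])

print(recommend("alice", purchases))  # ['monitor']
```

Here bob overlaps most with alice, so alice is recommended the item bob owns that she does not.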
As a result, people are more engaged, productive, and efficient with connected data. Nations can bridge information and communication gaps between executive teams, field technicians, plant operators, warehouse operators and maintenance engineers. Increasing agility and productivity offers obvious commercial benefits.
In short, organisations can easily integrate their whole industrial workforce to increase operational excellence and decrease plant downtime, thereby maximising revenues. This approach rests on a collaborative platform strategy.
Contextualising data increases its value
According to Chandra, data is a representation of the world in which people live. That world is becoming more connected; people do not live in silos but remain linked throughout society.
“If you think about data as the representation of the world that we live in, it is connected data and we can deal with all the complexities that we need to deal with when we try to make sense out of it,” explains Chandra.
Closer to home, connected data is crucial to Singapore’s development as a smart nation. “Connected data is at the centre of each of those conversations around developing the nation. When you think of Singapore as a connected ecosystem and when you think about citizens, services, logistics, contract tracing, and supply chain.”
Chandra believes that these attributes preserve the connections between data and people, which is why relationships matter. Once people understand those connections, it becomes much easier and faster to derive the insights that are required.
Without connected data, organisations lack key information needed to gain a deeper understanding of their customers, build a complete network topology, deliver relevant recommendations in real-time, or gain the visibility needed to prevent fraud.
Thus, “knowing your customer is understanding connected data.” With the right tools, data may be a real-time, demand-driven asset that a financial institution can utilise to reinvent ineffective processes and procedures and change how it interacts with and comprehends its consumers.
“Me as a person – who I am, my name, where I live – these are all properties of who I am. But what really makes me me, are the relationships I have built over time. And so, the notion that almost every problem has data that you can really make sense of with graphs is the larger “Aha” moment,” Chandra ends.
Legacy systems are pieces of hardware or software that are out of date but still in use. These systems frequently have problems and are incompatible with more modern ones. Although they can be used in the manner intended by their creators, they cannot easily be improved.
They are nonetheless the backbone of many excellent organisations, which rely on software, applications, and IT solutions that are crucial to day-to-day operations but are obsolete and, in some cases, no longer supported by the original vendor or developer.
While running legacy systems may not appear to be a big deal, they do present a unique set of challenges and potential issues that organisations would be remiss to ignore.
Thus, obsolete legacy systems are at best a nuisance and, at worst, can undermine an organisation’s entire IT security strategy, severely impeding productivity. Furthermore, the longer a company waits to modernise a legacy system, the more difficult the transition becomes.
However, system modernisation is always a prerequisite for digital transformation. Most firms will be unable to fully grasp the benefits of new technologies and solutions without it.
Due to the rapid development of technology, businesses must maintain compatibility with legacy systems that impede the implementation of contemporary technologies.
The Centre for Strategic Infocomm Technologies (CSIT) is a case in point: it employs technology to facilitate and advance Singapore’s national security. Due to the highly classified nature of its work, its environment must be air-gapped.
This means that development and deployment are conducted in networks that are not connected to the internet. Consequently, all platforms had to be installed on-premises.
Despite not being able to utilise internet-connected services, CSIT has a Cloud Infrastructure and Services section that offers developers the necessary infrastructure to concentrate on software development.
Further, a monolith system is a big application consisting of code built by several developers over many years. Frequently, the code is inadequately maintained. Some of these developers may have left the development team or the organisation, leaving knowledge gaps.
Due to a lack of expertise and the difficulty of modifying a system that is constantly in use in production, refactoring the code is comparable to replacing the tyres on a moving car.
Having a legacy system results in greater maintenance and support costs and decreased efficiency. Since the monolith system was still essential, CSIT opted for a more manageable strategy: decomposing it into smaller services using the microservices methodology.
Microservices, by contrast, are software programmes that each execute a business function as part of a larger system yet run as separate services. These services are intended to be lightweight and straightforward to deploy.
Microservices have the following advantages: each service is independently scalable; services have smaller code bases that make them easier to maintain and test; and problems are isolated to a single service, allowing for faster troubleshooting.
In addition, there are two main microservice architectures to consider when implementing the microservices approach, each with advantages and disadvantages that suit specific use cases. Orchestration, as the name suggests, requires an orchestrator that actively controls the work of each service, whereas choreography takes a less stringent approach, allowing each service to carry out its work independently.
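The contrast between the two styles can be sketched in a few lines of Python. The order-processing services below are invented for illustration: in the orchestrated version one coordinator calls each service in turn, while in the choreographed version services react to events on a shared bus without a central controller:

```python
# --- Orchestration: a central coordinator calls each service in order. ---
def validate(order):  return {**order, "valid": True}
def charge(order):    return {**order, "charged": True}
def ship(order):      return {**order, "shipped": True}

def orchestrator(order):
    for step in (validate, charge, ship):
        order = step(order)
    return order

# --- Choreography: services subscribe to events and emit new ones. ---
subscribers = {}

def on(event):
    def register(handler):
        subscribers.setdefault(event, []).append(handler)
        return handler
    return register

def emit(event, order):
    for handler in subscribers.get(event, []):
        handler(order)

@on("order_placed")
def charge_service(order):
    order["charged"] = True
    emit("order_charged", order)

@on("order_charged")
def shipping_service(order):
    order["shipped"] = True

choreographed = {"id": 1}
emit("order_placed", choreographed)
print(orchestrator({"id": 2}), choreographed)
```

Orchestration makes the end-to-end flow easy to read and debug in one place; choreography removes the single point of control but spreads the flow across event handlers.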
Microservices architecture may not be appropriate for all projects, and the choice of architecture should be based on the needs of the project; CSIT therefore advises teams to expect new problems to arise and to be prepared to adapt to them.
To better serve and protect communities, maintain data security at scale, and perform essential tasks, all government agencies must establish a strong, contemporary data infrastructure that supports data innovation.
Government and the public sector stand to gain considerably by adopting AI into every element of their job. Government AI must consider privacy and security, compatibility with old systems, and changing workloads.
Artificial intelligence is already being used to help run the government, with cognitive applications doing everything from reducing backlogs and cutting costs to handling tasks that humans cannot easily do, such as predicting fraudulent transactions and identifying criminal suspects using facial recognition.
While AI-based technology may fundamentally transform how public-sector employees do their jobs in the coming years — such as eliminating some jobs, redesigning countless others, and even creating entirely new professions — it is already changing the nature of many jobs and revolutionising aspects of government operations.
AI in government services is centred on machine learning and deep learning, computer vision, speech recognition, and robotics. When used correctly, these techniques yield real, measurable results.
Cyber anomaly detection, on the other hand, has the potential to transform cybersecurity strategies in government systems. The possibilities are endless, but they are only now taking shape.
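A common starting point for cyber anomaly detection is flagging a metric that deviates sharply from its historical baseline. A minimal z-score sketch on made-up hourly failed-login counts follows; real government deployments would use far more sophisticated models, but the statistical intuition carries over:

```python
import statistics

# Hypothetical hourly failed-login counts; the final hour spikes.
failed_logins = [4, 5, 3, 6, 4, 5, 4, 6, 5, 4, 5, 48]

def is_anomalous(series, threshold=3.0):
    """Flag the latest point if it sits more than `threshold` standard
    deviations above the mean of the preceding baseline window."""
    baseline = series[:-1]
    mean = statistics.mean(baseline)
    stdev = statistics.pstdev(baseline)
    z = (series[-1] - mean) / stdev
    return z > threshold

print(is_anomalous(failed_logins))
```

The spike to 48 is dozens of standard deviations above the quiet baseline, so it is flagged, whereas ordinary fluctuation would not be.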
The OpenGov Breakfast Insight on 4 August 2022 offered the most cutting-edge innovative method for enabling large-scale analytics in the public sector.
Public Sector Services Powered by Data and AI
Kicking off the session, Mohit Sagar, CEO & Editor-in-Chief, OpenGov Asia acknowledges that data and artificial intelligence will drive the future of government services. “With a unified data platform, the public sector will be able to better serve citizens and protect their communities.”
Governments, in general, are one of the world’s largest employers, with numerous ministries, agencies and departments. The vast network of offices and services introduces significant complexity, operational inefficiencies and, frequently, a lack of transparency.
Agencies must deal with massive amounts of data in various structured and unstructured formats, which will only increase over time. Moreover, because of legacy systems and traditional data warehouses, they can neither recognise nor take advantage of the full potential of their data and analytics. These systems are, more often than not, siloed by agency and department, undermining efforts to undergo digital transformation.
To generate real-time actionable insights and make data-driven decisions, data must be securely shared and exchanged at scale. Giving government organisations and policymakers access to deeper, more relevant insights for decision-making is only possible through data modernisation.
Much of the information that government agencies oversee is extremely sensitive, including information about the nation’s infrastructure, energy and education, as well as personal health and financial data. Data protection at every level of the platform must be ensured through tight integration with granular cloud-provider access control mechanisms.
Citizens stand to gain from the more individualised and effective services, enhanced national security, and wiser resource management that a robust data strategy can deliver.
By integrating data with analytics and AI, government agencies can readily access all their data for downstream advanced analytics, supporting complicated security use cases.
With such a platform, government security operations teams can quickly identify sophisticated threats, minimising manual effort through analytical automation and collaboration and speeding up investigations from days to minutes.
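As a minimal sketch of the kind of analytical automation described above (illustrative only, not Databricks’ actual tooling), a security team might flag anomalous login activity by scanning authentication logs with a simple rule:

```python
from collections import Counter

# Hypothetical log records: (user, outcome) pairs from an auth system.
events = [
    ("alice", "success"), ("bob", "failure"), ("bob", "failure"),
    ("bob", "failure"), ("bob", "failure"), ("carol", "success"),
    ("bob", "failure"), ("alice", "success"),
]

def flag_suspicious(events, threshold=3):
    """Flag users whose failed-login count meets the threshold."""
    failures = Counter(user for user, outcome in events if outcome == "failure")
    return sorted(user for user, n in failures.items() if n >= threshold)

print(flag_suspicious(events))  # ['bob']
```

In practice such rules would be one layer among many, feeding correlated alerts into an investigation workflow rather than acting in isolation.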
Data stored by public sector bodies can be extremely valuable when shared with other departments and used to elevate data-driven decision-making. The time has come to leverage the cloud’s scale and democratise secure data access to enable downstream BI and AI use cases, allowing government agencies to accelerate innovation.
Governments can improve citizen services while implementing smarter and more transparent governance by leveraging data, analytics and AI for actionable insights at scale. It eliminates data silos and improves communication and collaboration across agencies to achieve the best results for all citizens, delivering personalised citizen services while achieving data security and cyber resilience for a satisfied population.
Building a Scalable Data, Analytics and AI Strategy with Lakehouse Platform
Data infrastructure is an essential aspect of data processing and analysis, according to Chris D’Agostino, Global Field CTO, Databricks.
The complete backend computing support system needed to process, store, transfer and preserve data is referred to as the “data infrastructure.” Without the appropriate data infrastructures, businesses and organisations cannot extract value from their data.
“If there’s one thing that many of us all have in common, it’s that we believe in the impact that data and AI can and will have on the world,” says Chris. “Today, data and AI are transforming every major industry.”
On the other hand, with the ongoing globalisation of artificial intelligence and machine learning, there is an increasing need to rethink an organisation’s whole leadership and thought process, from product strategy and customer experience to strategies to increase the efficiency of human resources.
Cloud data architectures contain the rules, models and policies that specify how data is gathered, stored, used and managed in the cloud within a company or organisation. They control the flow, processing and distribution of that data across stakeholders and other applications for reporting, analytics and other purposes.
Every year, data collection by businesses and organisations increases thanks to IoT and new digital streams. In this climate, cloud data architecture-based data platforms are displacing more conventional data platforms, which are unable to handle the growing data quantities and increasingly demanding end-user applications like machine learning and AI.
Companies are using all available data to expedite, automate and improve decision-making to increase resilience and obtain a competitive edge in the market. These methods for digital transformation are supported by AI and data literacy.
To fully realise the benefit of data and AI, change management is necessary, just like with any change in working practices. It is essential to create a cohesive and evolving plan. This can be based on three pillars: business strategy, operationalisation and architecture (after the technology barriers have been recognised).
Whether it’s a business strategy, data management, or organisational knowledge, it’s critical to assess the organisation’s level of maturity and data literacy.
Databricks Lakehouse Platform combines the best elements of data lakes and data warehouses to deliver the dependability, strong governance, and performance of data warehouses while also allowing for the openness, flexibility and machine learning support of data lakes.
By removing the data silos that normally segregate and complicate data engineering, analytics, BI, data science and machine learning, this unified approach streamlines the current data stack. To increase flexibility, it is created using open standards and open-source software.
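The unified approach can be pictured with the “bronze/silver/gold” staging pattern commonly associated with lakehouse designs, in which raw data is progressively cleaned and aggregated in one place rather than scattered across silos. The snippet below is a plain-Python sketch of that idea; a real deployment would use Spark and governed tables, not in-memory lists:

```python
# Bronze: raw ingested records, possibly dirty or incomplete.
bronze = [
    {"agency": "Health", "requests": "120"},
    {"agency": "Transport", "requests": None},   # bad record
    {"agency": "Health", "requests": "80"},
]

# Silver: cleaned and typed records; invalid rows are dropped.
silver = [
    {**r, "requests": int(r["requests"])}
    for r in bronze if r["requests"] is not None
]

# Gold: aggregated, analytics-ready view (requests per agency).
gold = {}
for r in silver:
    gold[r["agency"]] = gold.get(r["agency"], 0) + r["requests"]

print(gold)  # {'Health': 200}
```

Because every stage lives on the same platform, the same governance and access controls apply from raw ingestion through to the analytics-ready view.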
Additionally, its shared approach to data management, security and governance helps teams work more productively and develop more quickly.
In a global research effort in collaboration with an institution, Databricks polled 117 data leaders and the survey’s findings were illuminating and instructive.
An analytics leader’s biggest regret and issue was not embracing an open standards-based data architecture. “This didn’t surprise us. We are seeing many of our clients adopting the best open-source technologies,” Chris reveals.
In addition, the poll showed that only a small proportion of organisations are succeeding with their AI projects, even as multi-cloud becomes a growing reality.
Most executives say they are currently evaluating or implementing a new data platform to address their current data challenges. During these challenging times, cloud technologies allow businesses to respond and scale rapidly.
With scalable data, analytics and AI strategy, organisations can create significant value. They can implement real-time monitoring, create tailored customer experiences, deploy predictive analytics, and much more. Databricks offers tools that are specifically designed to address the challenges described.
In Conversation With: The Future of Government Services and Shared Data
All the government agencies’ data must be protected and every component must be safeguarded. Unified data with analytics and AI makes it simpler to provide quick access for the organisation’s teams and complete support for security use cases.
Joseph Tan, Deputy Director (Capability Development), Data Science & Artificial Intelligence Division, Government Technology Agency, emphasised the importance of a holistic approach to data modernisation. A policy-driven approach that earns trust in how organisations handle data, he noted, will lead to better customer service.
Joseph is convinced that “As technology advances, most businesses are confronted with issues caused by an existing legacy system. Instead of providing companies with cutting-edge capabilities and services such as cloud computing and improved data integration, a legacy system keeps a business constrained.”
A legacy system is computer software or hardware that is outdated but still in use. The system still meets the needs for which it was originally designed, but it does not allow for expansion. Because a legacy system can only do what it does now for the company, it will never be able to interact with newer systems.
“A business might keep using an old system for more than one reason. In the world of investments, for example, upgrading to a new system requires an initial investment of money and people, while keeping an old system running costs money over time,” Joseph explains.
On the other hand, when a whole company moves to a new system, there can be some internal resistance and worries about how hard it will be and what might go wrong. For example, legacy software might have been made with an old programming language, which makes it hard to find staff with the right skills to do the migration.
Additionally, there might not be much information about the system, and the people who made it might have left the company. It can be hard to just plan how to move data from an old system to a new one and figure out what needs the new system will have.
Increased security risk, instability and inefficiency, incompatibility with new technology, poor company perception, costly new-hire training, single points of failure and a lack of documentation are a few issues that older systems run into.
At best, outdated legacy systems are a pain, and at worst, they can seriously jeopardise an organisation’s overall IT security strategy. Furthermore, the longer a business waits to update a legacy system, the more challenging the transition will be.
System modernisation is almost always a must before digital transformation can occur. Most businesses won’t be able to fully profit from contemporary technologies and solutions without it. “With this, finding the right talent would be very beneficial for the organisation to manage their modern technologies,” says Chris.
Updating legacy systems brings several advantages. Enterprises can enhance and sustain their IT security by taking advantage of future vendor upgrades and fixes. Modern systems and solutions, including retrofitted legacy systems, are built to deliver optimal performance without consuming excessive amounts of computational power.
Even a legacy system may be modernised to include new features, giving the business users additional capability and a better user experience. The truth is that updated legacy systems require less input from IT staff, freeing them up to focus on activities that really benefit a company.
Similarly, governments all over the world will undergo a fundamental upheaval because of big data and artificial intelligence. Even though the public sector has long used data, the potential and actual use of big data applications have an impact on some theoretical and practical aspects of decision-making. This is fuelled by both the data revolution and the concurrent advancement of advanced analytics.
The availability of data that may be employed in the computer learning process is a major aspect of the maturing of AI technology and the practicality of AI applications to public policy and administration.
However, without the underlying analytical technologies, the data revolution can be seen as only a change in the size of the data that is currently available rather than a fundamental change. As predictive analytics, innovative data and artificial intelligence gain prominence, it is critical to understand their roles in the public sector.
At the start of their data journey, organisations require data capture systems to discover information embedded in all levels of business operations. Following that, the data must be validated for informational accuracy and integrated to reduce the risk of drawing incorrect conclusions and to create a unified view of the business.
The final step is analysis, in which businesses collaborate with data analysts who use cutting-edge analytics tools to peel back layers of proprietary data in search of insights to power change.
Larger companies with more complex data integration and analytics processes can add predictive analytics as the fourth step.
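The stages described above (capture, validation, integration and analysis) can be sketched as a simple pipeline. The function names below are illustrative, not any specific product’s API:

```python
def capture(sources):
    """Stage 1: gather raw records from every operational source."""
    return [rec for src in sources for rec in src]

def validate(records):
    """Stage 2: keep only records that pass basic integrity checks."""
    return [r for r in records if r.get("value") is not None]

def integrate(records):
    """Stage 3: merge records into one unified view keyed by entity."""
    view = {}
    for r in records:
        view.setdefault(r["entity"], []).append(r["value"])
    return view

def analyse(view):
    """Stage 4: derive an insight, here a per-entity average."""
    return {k: sum(v) / len(v) for k, v in view.items()}

sources = [
    [{"entity": "permits", "value": 10}, {"entity": "permits", "value": None}],
    [{"entity": "permits", "value": 30}, {"entity": "grants", "value": 5}],
]
print(analyse(integrate(validate(capture(sources)))))
# {'permits': 20.0, 'grants': 5.0}
```

The predictive fourth step mentioned above would slot in after `analyse`, training a model on the unified view rather than on each raw source separately.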
When analysing enormous datasets (often referred to as “big data”), predictive data analytics, also referred to as advanced analytics, uses autonomous or semi-autonomous algorithms to make predictions based on information patterns. Data analysts may provide clients with greater service, which can result in more meaningful transformations, by delivering deeper insights into company data more quickly.
Think about how AI and machine learning might be used in the context of the data processing flow. Analytics tools assist data analysts in identifying areas for improvement in the business after private data has been collected, analysed and combined into a single view.
AI excels at discovering data patterns that humans cannot perceive. This is quickly scalable based on the amount of the dataset. To make data analytics frictionless, machine learning algorithms can also adapt to data pipeline input and human behaviour patterns. This can be accomplished by utilising natural language processing to recode communications between individuals within an organisation so that algorithms can comprehend and act on them.
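A minimal example of the kind of pattern-finding described, flagging data points that deviate strongly from the mean (a z-score test), shows how such checks scale mechanically with dataset size in a way manual review cannot:

```python
import statistics

def anomalies(values, z_threshold=2.0):
    """Return values lying more than z_threshold sample standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > z_threshold]

readings = [10, 11, 9, 10, 12, 10, 11, 50]  # 50 is the outlier
print(anomalies(readings))  # [50]
```

Production systems replace this simple statistic with learned models, but the principle is the same: the algorithm surfaces the deviations, and analysts decide what they mean.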
Artificial intelligence and machine learning have become the “next big thing” in the government sector, building directly on this predictive-analytics foundation. Once data has been collected, analysed and consolidated into a single view, AI extends what analysts can discover and act on, allowing them to deliver deeper insights more quickly and drive more profound transformations.
Smart solutions enable advances that are self-sustaining and AI and ML are at the heart of these. Executives and practitioners agree that AI and ML are catalysts and drivers across both the public and private sectors. As an AI system has a deeper understanding of data platforms and processes, it can continue to enhance its efficacy and capacity to provide personalised insights from massive data silos.
In closing, Chris shared that Databricks was established in 2013 to assist data teams in resolving the most challenging issues facing the globe, and they have been investing in the Asia Pacific region to help this objective forward. “While there are countless possibilities, there are several challenges as well.”
It is insufficient to merely fund and use AI technologies. Businesses and organisations need a talent pool of experts that can use these AI tools in a way that can guarantee the greatest outcomes.
Currently, customers from a wide spectrum of businesses are collaborating with Databricks to tailor their clients’ experiences to improve their capacity to react to market dynamics and safeguard both their own and all stakeholders’ interests. This is most evident in real-time for financial services organisations to help deal with fraud.
“My particular favourite is Databricks’ assistance in Mitsubishi Tanabe’s efforts to quicken drug clinical trials in Japan. The possibilities for our collaboration are virtually endless,” Chris reflects.
Mohit recognises that digital transformation is vital in today’s VUCA environment. What is essential is that industry and government collaborate and work together. For long-term success and sustainability, there have to be partnerships between the public and private sectors.
Strategic alliances give businesses and government agencies a competitive edge. Partnerships are mutually beneficial, helping each party grow and improve. When people genuinely try to help each other, “it can help to get over certain weaknesses and be first movers in their field.”