Both the public and the financial services sectors have made it their goal to serve their country by developing and establishing better policies: policies that allow the integrated use of digital technologies and the timely implementation of insight-driven choices and processes supported by data and automation.
Data is vital to all stakeholders in the digital age, and it will only increase in value as a commodity as businesses ramp up their digital offerings.
However, the public sector and financial services struggle to realise the full value of their data and frequently have difficulty integrating it due to high costs, data duplicated across numerous locations, ageing systems, and a lack of interoperability and real-time data access.
Organisations in both the public and private sectors have spread their data across a variety of on-premises and cloud sources, with each serving a different function. As a result, after gathering data from various sources, they must replicate it in a new physical location, diverting time and resources away from crucial reporting activities.
Because of these issues, organisations are less efficient and effective, holding a lot of data but gaining little insight.
The OpenGov Breakfast Insight on 15 July 2022 convened digital executives from Singapore to discuss how data-as-a-service can help organisations improve their use of technology to make better decisions.
The Impact of Digitalisation on Business Growth
To kickstart the session, Mohit Sagar, CEO and Editor-in-Chief, OpenGov Asia, delivered the opening address. He believes that organisations cannot afford to rely on yesterday’s methods to address today’s issues. This calls for the use of cutting-edge data management and integration tools while maintaining data sovereignty.
“Organisations will be better able to harness the power of data and accomplish their goals if they have a solid data integration strategy in place,” says Mohit. “Moreover, embracing agility to improve efficiency and cut costs could lead to a complete and cost-effective digital transformation.”
He adds that businesses must repurpose their resources in a way that enables them to create an interoperable, connected data landscape where all user-collected data is readily available where it is needed. In doing so, security and privacy would be protected and sufficient to deal with any legal, technical, or organisational challenges that could lead to data misuse.
Additionally, a Data Fabric, which enables safe and controlled data exchange while maintaining an organisational level of control over what data is shared and with whom, is the key to a data ecosystem.
However, this raises several issues, and clarity is needed on the importance of a Data Fabric in terms of how an organisation can:
- Ensure that the entire data ecosystem is accessible on-demand without replication;
- Maintain security while sharing data across agencies; and
- Deliver data in any format in real-time while keeping the costs low.
Many parts of the financial and public sectors still struggle to access the huge amounts of information they hold on clients, facing high costs, data duplicated in many places, older systems that do not work well together, no real-time data access and a lack of centralisation.
He urges organisations to turn unstructured data into structured data to gain insights that will help them make better decisions. With comprehensive data integration, an organisation can deliver better services and run data queries quickly to help its stakeholders.
“The key to this data environment is to enable safe and regulated data sharing while maintaining organisational control over what data is exchanged and with whom and the best way to accomplish this is to find the right technology partner,” Mohit advises.
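As a minimal sketch of the "unstructured to structured" idea discussed above, a few lines of Python can lift free-text entries into queryable records. The entry format and field names here are hypothetical, invented purely for illustration:

```python
import re

# Hypothetical free-text service entries (unstructured data)
raw_entries = [
    "2022-07-15 | payment of SGD 120.50 by customer C-1043",
    "2022-07-15 | refund of SGD 35.00 by customer C-0210",
]

# A simple pattern that lifts the text into named, structured fields
pattern = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2}) \| (?P<kind>\w+) of SGD "
    r"(?P<amount>[\d.]+) by customer (?P<customer>C-\d+)"
)

records = [pattern.match(entry).groupdict() for entry in raw_entries]
for record in records:
    record["amount"] = float(record["amount"])  # cast for numeric queries

# Once structured, the data can be queried quickly
total_payments = sum(r["amount"] for r in records if r["kind"] == "payment")
print(total_payments)  # → 120.5
```

In practice the parsing rules would come from the organisation's own data formats, but the principle is the same: structured records, unlike raw text, can be filtered, aggregated and joined on demand.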
One Logical Platform for All Data
Depending on their requirements, organisations must select from a variety of data management systems and databases, according to Paul Moxon, SVP Data Architectures & Chief Evangelist, Denodo. “A lot can go wrong on a day-to-day basis if proper infrastructure is not in place.”
When a hard drive needs to be replaced or a patch upgrade fails, businesses with on-premises solutions may experience more unplanned downtime than anticipated. On the other hand, if the data remains siloed rather than consolidated into a data warehouse, the analyst may spend a significant amount of time manually blending data, reducing overall efficiency.
“Data and its associated infrastructure are constantly evolving for every organisation. As a result, business data will always be distributed,” explains Paul.
The Denodo Platform enables IT organisations to evolve their data strategies, migrate to the cloud, or logically unify data warehouses and data lakes without disrupting business operations.
That is why it is critical to ensure that the data infrastructure meets the primary business goals. A solid data analytics infrastructure strategy increases efficiency and productivity, facilitates collaboration and allows businesses to access information easily from anywhere, provided the proper authentication steps are in place.
Reducing operational costs, increasing focus on core competencies, and increasing efficiency through process analysis and improvement are some of the advantages of a proper data management system.
The Denodo Platform speeds up data provisioning by reducing data replication, provides consistent security and governance across multiple systems, and allows business users to choose their preferred applications. Hence, what is needed is a logical data fabric powered by data virtualisation, as found in the Denodo Platform.
Fireside Chat: An Approach to Data Integration that Helps Organisations Find Answers, Accomplish Their Objectives and Reduce Expenses
In a data-driven organisation, data science expertise is not, and should not be, restricted to a select few specialists. All employees will eventually work with data applications, so they should be involved in the transition and receive training appropriate to their roles, and even for use in daily life.
According to Tung Whye Loon, Director, Data, AI & Research (SP Digital), SP Group, the success of data-driven decision-making will increase as more individuals become aware of its benefits. This “demands a systematic approach that monitors the required Dataset, Toolset, Skillset, and Mindset.”
In the digital age, data enables organisations to better assist people and customers. To integrate data securely, businesses must use advanced analytics and find solutions that optimise operations for long-term use. Both the public and business sectors are making efforts to recognise the worth of their data.
Paul believes, “If you have an organisation that encourages experimentation and allows people to fail, you will get more innovation and drive.”
Moreover, many companies still lack a strong, data-driven culture, and data is not always used to make decisions. The biggest obstacles to building data-driven businesses are cultural, not technical.
Meanwhile, enterprise data is dispersed among various on-premises and cloud sources, each serving a specific operational area. As a result, organisations become data-rich but insight-poor, which hinders their efficiency and effectiveness.
Companies with strong data-driven cultures usually have leaders who make it clear that decisions should be based on facts. Big data can be complicated and hard to interpret, but companies that create systems and methods to collect, analyse and use data will see results in several different areas.
One way to get the most out of data science integration is to treat it like a consulting project. With an intelligent, data-driven approach, businesses get a unique view of their operations and customers, which helps them improve business strategy, improve operations, and embrace digital transformation.
This digital transformation makes it possible for business leaders to use technology all over their businesses and make big changes that take their businesses to the next level.
In real life, this applies to a wide range of situations, such as when AI and cloud computing are used to improve the customer experience or when companies digitise their supply chains to feed data into machine learning.
A solid data strategy can be an organisation’s best asset when it comes to bridging the gap between business strategy and implementation. Even though the exact reasons such strategies fail are many and varied – poor communication, lack of buy-in and lack of clarity, among others – they all boil down to one problem: a disconnect between planning and implementation.
While top management may have spent a lot of time and energy coming up with business goals and the strategies that support them, it takes the combined efforts of everyone in the organisation to make those goals come true.
The data strategy tells those who implement the strategy step-by-step what to do. This gives them the power to make decisions and take actions that move the organisation closer to its goals.
When data is accessed and shared, people, companies and governments all face similar problems. For policymakers, the main challenge is to enable and promote improved access and sharing.
Weighing the dangers and rewards of increased data access while considering genuine individual, national and public interests is essential. For this, unreasonable impediments to cross-border data transfers may need to be removed.
Through proactive stakeholder interactions, community building and trust-building, it is possible to promote data sharing and help maximise the value of data re-use. The development of data-related skills, infrastructure and standards as well as maintaining community engagement may incur large expenses.
Policymakers should also encourage the availability of data through sensible incentive structures and viable business models, while recognising the limitations of (data) marketplaces. To do this, it may be necessary to clarify privacy obligations, the role of intellectual property rights (IPRs) and other ownership-like rights. These tasks should ideally be handled by the relevant expert agencies and organisations.
After the informative discussion, delegates participated in poll-based conversations. The session was meant to provide live audience engagement, stimulate participation and give individuals real-world insights that could help in their professional development.
When asked to rate how well their company uses analytics and data to make decisions, a majority answered fair or good: they use some data and tools in their decision-making process, although analysis remains primarily a manual process.
Data analytics enables comprehension of the employees’ and customers’ interactions and collaboration with the IT and marketing departments to improve them. As a result, the company can better allocate budgets based on customer feedback.
The top motivator for data sharing within the organisation was improving the speed and accuracy of business decisions.
Improving the decision-making process is well worth the effort and will be beneficial if the organisation adopts an agile attitude, enabling it to move forward with modest steps and get test-and-learn insights that will speed up decision-making.
In the third poll, most delegates indicated compliance with data security and privacy requirements as the biggest obstacle to data sharing within the organisation.
The delegates were unified in their belief that a compliant organisation implements robust administrative, technical and physical security measures to protect the confidentiality, integrity and availability of data. This includes the capability to effectively detect and block illegal or inappropriate data access.
On their current data strategy, most said that they are in the process of migrating selected data and applications to the cloud.
An enterprise can perform various types of cloud migrations, one of which is to transfer data and applications from a local on-premises data centre to the public cloud.
The separation between IT and business was cited by the delegates as the largest obstacle to advancement in their organisation’s data journey. Because there is a disconnect between business and IT, organisations do not include long-term IT innovation in their annual business planning. They favour resolving current problems.
Asked how their use of data and analytics would change over the next two to four years, delegates believe that by adding AI/ML analytics to decision-making, both the private and public sectors can foresee outcomes and make optimal judgments.
Mohit encouraged the delegates to put their data into the cloud because “the cloud database reduces overhead expenses. It enables the company to devote additional resources and time to improve its infrastructure.”
Because cloud databases have very little downtime and offer the most efficient recovery plan, they can provide more reliable retrieval of data and applications. He feels that a successful digital transformation leader must be well-versed in a wide range of details: they must have a strong bond with and understanding of the customer, and they must also appreciate the company’s business model, business processes and supporting technology.
“Data is valuable, but if unrefined, it cannot really be used. Undoubtedly, mindset flexibility in building the data structure is important,” says Alex Hoehl, Regional Vice President, ASEAN & Korea, Denodo.
To keep up with the huge amount of data generated, it needs to be properly managed and stored. Businesses should therefore treat their data as an asset: proper management is a prerequisite both for ensuring trust and for incorporating data into business processes. It should thus be clear that data management is not only an IT-department concern but a structural issue for the entire company.
A logical data fabric is a single platform for delivering data: it hides the complexity of accessing different data systems while exposing data in formats that are easy for businesses to use. It also ensures that data is delivered according to pre-defined semantics and data governance rules.
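To make the idea concrete, here is a toy sketch of data virtualisation: a single logical view dispatches queries to heterogeneous sources on demand, without copying the data. The classes, sources and rows are all hypothetical stand-ins for illustration; a real platform such as Denodo does far more (optimisation, security, governance):

```python
# Toy sketch of data virtualisation: one logical view over two
# heterogeneous "sources", queried without replicating the data.

class DictSource:
    """An in-memory source standing in for, say, a NoSQL store."""
    def __init__(self, rows):
        self.rows = rows
    def fetch(self):
        return list(self.rows)

class CsvSource:
    """A source standing in for a relational/CSV system."""
    def __init__(self, text):
        self.text = text
    def fetch(self):
        header, *lines = self.text.strip().splitlines()
        cols = header.split(",")
        return [dict(zip(cols, line.split(","))) for line in lines]

class LogicalView:
    """Unified view: pulls from all sources on demand, presents one
    consistent schema, and never stores a copy of the data."""
    def __init__(self, *sources):
        self.sources = sources
    def query(self, predicate):
        return [row for source in self.sources
                for row in source.fetch() if predicate(row)]

crm = DictSource([{"id": "1", "country": "SG"}, {"id": "2", "country": "MY"}])
erp = CsvSource("id,country\n3,SG\n4,PH")
view = LogicalView(crm, erp)

sg_rows = view.query(lambda r: r["country"] == "SG")
print([r["id"] for r in sg_rows])  # → ['1', '3']
```

The point of the sketch is the access pattern: consumers query one logical layer, and the underlying systems can change or move without the consumer noticing.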
Data virtualisation enables the logical data fabric vision. Whether an organisation needs a data integration and management solution for a simple project, such as dashboarding or reporting, or for a complex one, one of the Denodo subscriptions will have the right features and functionality to fit its budget and needs.
In conclusion, Alex notes, “Whether you need dashboarding, reporting, data management in hybrid or multi-cloud settings or prescriptive or predictive analytics, Denodo can serve your needs!”
A multidisciplinary team of Massachusetts Institute of Technology (MIT) researchers led by Iddo Drori, a lecturer in the MIT Department of Electrical Engineering and Computer Science (EECS), has used a neural network model to solve university-level math problems at a human level in a matter of seconds.
“It will help students improve, and it will help teachers create new content, and it could help increase the level of difficulty in some courses. It also allows us to build a graph of questions and courses, which helps us understand the relationship between courses and their pre-requisites, not just by historically contemplating them, but based on data,” explained Iddo, who is also an adjunct associate professor at Columbia University’s Department of Computer Science.
Additionally, the model automatically explains solutions and rapidly generates new math problems for university-level courses. When the researchers presented these machine-generated questions to university students, the students were unable to distinguish whether the questions were created by a human or an algorithm.
This approach might be used to simplify the creation of course content, which would be particularly beneficial for big residential courses and massive open online courses (MOOCs) with thousands of students. The technology might also be used as an automated tutor that demonstrates to students how to solve basic math problems.
In the past, researchers employed neural networks such as GPT-3 that were pretrained only on text; that is, they were shown millions of examples of text to learn the patterns of natural language. This time, the team employed a neural network that was pretrained on text and then fine-tuned on code.
This network, known as Codex, adds what is effectively an additional pre-training step that helps the model perform better. Codex was exposed to millions of code examples from internet repositories; because its training data contained millions of natural-language words as well as millions of lines of code, it learned the relationships between text and code.
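The underlying idea can be illustrated with a toy example: rather than predicting an answer as text, a code-tuned model emits a short program, and executing the program produces the answer. The "generated" snippet below is hand-written for illustration and is not output from the MIT system:

```python
# Toy illustration of the program-synthesis idea: solve a math
# question by running a generated program instead of predicting
# the answer directly as text.

question = "What is the derivative of x**3 at x = 2?"

# A program such a model might generate for the question above
# (hand-written here for illustration):
generated_code = """
def f(x):
    return x**3

h = 1e-6
answer = (f(2 + h) - f(2 - h)) / (2 * h)  # central-difference derivative
"""

namespace = {}
exec(generated_code, namespace)           # execute the generated program
print(round(namespace["answer"], 3))      # → 12.0, since d/dx x^3 = 3x^2 = 12 at x = 2
```

One appeal of this approach is that the program's execution is exact and checkable, which is harder to guarantee when a language model emits the final answer directly.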
The machine-generated questions were evaluated by showing them to university students. The researchers assigned students 10 problems from each undergraduate math course in random order; five questions were prepared by people and the remaining five were generated by a computer.
Students were unable to discern whether the machine-generated questions were produced by an algorithm or a human, and they scored the difficulty level and course-appropriateness of questions generated by humans and machines similarly.
Researchers emphasised that this effort is not meant to take the place of actual teachers. They claim that although automation has reached 80 per cent accuracy, it will never reach 100 per cent. Every time someone figures something out, someone else will pose a more challenging problem.
Rather, this work opens the door for people to begin using machine learning to answer ever-harder questions, and the academics are optimistic that it will have a significant impact on higher education.
The team has expanded the work to handle math proofs because of the approach’s effectiveness, although there are several limits they intend to address. Due to computational complexity, the model is currently unable to answer questions with a visual component or resolve computationally intractable issues.
Beyond these obstacles, the model is being scaled up to hundreds of courses. Those courses will produce more data, which the team can use to improve automation and to offer insights into course design and curricula.
The Science and Technology Academic and Research-Based Openly Operated Kiosks or STARBOOKS of the Department of Science and Technology (DOST) have arrived on the island of San Miguel in Tabaco, Albay, providing easy access to S&T learning.
STARBOOKS is the country’s first digital science library, created by the Science and Technology Information Institute (DOST-STII). It is a stand-alone information source intended for those who have limited or no access to S&T information resources.
The project’s goal is to provide Science, Technology, and Innovation (ST&I) content to geographically isolated schools and communities across the country. STARBOOKS contains many digitised S&T resources in various formats, such as text, video and audio, organised in specially designed “pods” with an easy-to-use interface.
STARBOOKS, as SMNHS teacher John Darnell Balbastro put it, is “one way of elevating the scientific and technological literacy” of their students. Its wide range of digitised S&T resources in various formats will “intensify the curiosity among our young learners,” and its offline access will address the lack of S&T learning resources in San Miguel.
Through this programme, DOST Region V, in collaboration with its dedicated Provincial S&T Centres and implementers, will continue to promote and empower S&T knowledge and education.
Meanwhile, Jamaica Pangasinan, Senior Science Research Specialist at the Space Mission Control and Operations Division (SMCOD) of the Philippine Space Agency (PhilSA), said that she was impressed by the level of environmental and social awareness of the incoming senior high school students, which was shown in their work at the “LIFT OFF: PhilSA Space Science Camp 2022.”
She said that the mission goals showed how eager the students were to solve the problems and threats facing the environment right now.
Fourteen science high schools from the 16 divisions of Metro Manila, chosen by the Department of Education (DepEd) to attend the camp, presented their space missions. Each team had five minutes to discuss its satellite’s mission, its most important technical features and why it was important.
The students came up with a wide range of missions, from observing Earth to keeping an eye on space junk to sending probes to other planets.
Two missions stood out from the rest: the Monitoring Illegal Mining Activities in Remote Areas (MIMA) mission by Bianca Louise B. Cruz and Oscar A. Araja II of the City of Mandaluyong Science High School, and the Venus Seismic Activity Monitoring Satellite (V-SAMS) by Peter James Lyon and Ysabela Juliana Bernardo of the Caloocan City Science High School.
The students who worked on MIMA said that their satellite mission’s goal is to protect the environment and ensure better compliance with the country’s mining laws and regulations. Under their plan, MIMA would be a Synthetic Aperture Radar (SAR) satellite able to see through clouds to spot changes in areas where mining may be occurring, supplemented by optical imagers for taking pictures.
The goal of V-SAMS, on the other hand, would be to learn more about Venus, which is like Earth’s twin, and especially about its earthquakes. To do this, V-SAMS would use infrared imaging to track the surface temperature of Venus’s volcanoes, figure out which ones will erupt, and find other volcanoes that are still active on the planet.
It would also have an interferometric SAR (InSAR) to look for changes on Venus’s surface and signs of earthquakes. V-SAMS would also have an optical payload that would let it take high-resolution pictures.
The National Environment Agency (NEA) and the Singapore Land Authority (SLA) have signed a Memorandum of Understanding (MOU) to develop the use of Global Navigation Satellite System (GNSS) data from SLA’s Singapore Satellite Reference Network (SiReNT) to help NEA better monitor island-wide atmospheric moisture. The goal of the five-year partnership is to help Singapore with weather monitoring by giving it more data and making it easier to do exploratory studies for weather forecasting.
“The collaboration between NEA and SLA highlights our commitment to achieve synergies and tap on enablers across the public sector. This partnership provides a platform for NEA to utilise SLA’s expertise in GNSS data collection and processing, enabling NEA to explore non-traditional methods to enhance our weather monitoring and forecasting capabilities,” says Luke Goh, CEO, NEA.
On the other hand, Colin Low, CEO of SLA, said that SLA’s partnership with NEA is a part of its ongoing efforts to collaborate with parties from the public and commercial sectors to open up new applications for SiReNT and its other geospatial products.
SLA believes that combining the knowledge of multiple parties can lead to more innovation and the discovery of workable solutions that benefit Singapore and its industries.
Colin continued by saying that they are eager to collaborate with NEA to research the unique uses of SiReNT data for improved weather monitoring and research projects on weather forecasting and climate change. The many experiences that were gathered and shared during this partnership will serve as a foundation for upcoming developments in this area.
The production of accurate weather forecasts, climate monitoring, and timely warnings of dangerous weather events all depend on meteorological measurements. The Meteorological Service Singapore (MSS) routinely gathers a variety of observational data from ground-based and aircraft sensors, such as temperature, wind, and moisture.
To measure these weather components at various altitudes of the atmosphere, sensors linked to a weather balloon are routinely launched twice a day at MSS’ Upper Air Observatory (UAO). To enhance the sounding data from the weather balloon, MSS erected a GNSS reference station at UAO in 2019.
This station will provide continuous estimates of moisture in an atmospheric column known as the integrated precipitable water vapour.
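As a hedged sketch of how integrated precipitable water vapour (PWV) is commonly derived from GNSS signal delays, the calculation below uses textbook-style approximations: the total zenith delay is split into a hydrostatic part (estimated from surface pressure) and a wet part, and the wet part is scaled by a conversion factor of roughly 0.15. The constants and example numbers are illustrative assumptions, not MSS's or SLA's actual processing chain:

```python
import math

def zenith_hydrostatic_delay_mm(pressure_hpa, lat_deg, height_km):
    """Saastamoinen-style hydrostatic delay (mm) from surface pressure."""
    phi = math.radians(lat_deg)
    return (2.2768 * pressure_hpa /
            (1 - 0.00266 * math.cos(2 * phi) - 0.00028 * height_km))

def pwv_mm(ztd_mm, pressure_hpa, lat_deg=1.35, height_km=0.04):
    """PWV ≈ Pi * ZWD, where ZWD = ZTD - ZHD and Pi ≈ 0.15 is a
    dimensionless factor that in practice depends on the mean
    atmospheric temperature along the signal path."""
    zwd = ztd_mm - zenith_hydrostatic_delay_mm(pressure_hpa, lat_deg, height_km)
    return 0.15 * zwd

# Illustrative example: a total zenith delay of 2.60 m with surface
# pressure 1010 hPa at roughly Singapore's latitude and elevation
pwv = pwv_mm(ztd_mm=2600.0, pressure_hpa=1010.0)
print(round(pwv, 1))  # a few tens of mm, typical of a humid tropical atmosphere
```

The continuous GNSS estimates described above amount to running this kind of conversion in near real-time, which is what makes the data useful for weather monitoring.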
In accordance with the MOU, SiReNT will incorporate MSS’s GNSS station, giving MSS access to continuous, almost real-time atmospheric moisture readings for the entire island. By supplying greater resolution and more frequent observation data, this non-conventional moisture data will complement MSS’s current observation network data and enable research into possible uses for weather forecasting.
The partnership will also help grow SLA’s SiReNT station network, which currently consists of nine reference stations dispersed throughout Singapore. With the installation of NEA’s GNSS base receiver station at UAO, which will be integrated into SiReNT, and two anticipated additional coastal reference stations, the network will grow to 12 stations and receive more data. The SiReNT system can produce positioning data accurate to 3 cm and correct positional inaccuracies in GNSS signals.
The SiReNT technology fosters innovation across a range of sectors, including autonomous driving, logistics and automation in the building industry, and monitoring of changes in Singapore’s land height and sea level.
The addition of stations by the end of 2022 will further increase the stability of the services and applications SiReNT now supports in several important industries. It can also be used in novel ways for scientific research on climate change.
Several domestic banks in Vietnam have 90% of their transactions conducted on digital platforms, surpassing the target of 70% set for 2025. Half of the country’s banking services are expected to be digitalised and 70% of transactions will be carried out online by 2025.
The Vietnamese Prime Minister, Pham Minh Chinh, recently stated that the banking sector has played a significant role in national digital transformation by deploying products and services for people and businesses. He urged the sector to further reform its management methods towards modernity and transparency and diversify and improve the quality of its products and services to curb money laundering.
Addressing an event called “Digital Transformation Day of the Banking Sector”, Chinh explained that the sector should work to understand more about the demands of people, businesses and credit institutions to devise suitable legal documents, facilitating the application of digital technologies in banking services.
He asked the State Bank of Vietnam (SBV) to continue its close coordination with ministries and agencies to formulate a decree on cashless payments and submit it to the government. Common infrastructure such as payment and credit information infrastructure should be promoted. He also suggested stronger connectivity between banks and credit organisations.
Chinh also requested the sector ensure cybersecurity and safety in digital transformation, given the rise of high-tech crime. The sector should raise public awareness about the benefits of digital transformation, enhance personnel training capabilities, and boost international cooperation in digital transformation.
The Prime Minister also attended an exhibition showcasing products and services that promote the digital transformation of the banking sector. Chinh had a working session with representatives from the SBV and commercial banks. He congratulated the sector on its effective operations amid a host of difficulties, especially those caused by the COVID-19 pandemic. He suggested the sector further cut interest rates to support businesses and actively engage in the state’s policies, particularly housing credit for workers and low-income earners. Participants attributed the developments of banks to supportive policies adopted by the state, the management of the government, and stability in the country.
Vietnam’s financial technology market could grow to US$ 18 billion by 2024. The country is a leader among ASEAN members in terms of the volume of financing for fintech, second only to Singapore. Over 93% of all venture investments in the country are directed at e-wallets and the e-money segment. The total number of fintech companies has grown to 97 since 2016, an 84.5% increase. However, the number of newly-launched start-ups each year decreased from 11 to 2.
As OpenGov Asia reported, the market features high competitiveness and a high entry bar. Transaction volume has seen a 152.8% growth since 2016, with 29.5 million new fintech users. As a result, every second Vietnamese citizen uses at least one fintech service. Demand for digital services (transactions, payments, and wallets) in the country is high. According to industry analysts, Vietnam’s fintech sector is young and promising. The market valuation has increased from US$ 0.7 billion to US$ 4.5 billion since 2016.
Michael G. Regino, President and CEO of SSS, announced that self-employed members, voluntary members, non-working spouses and land-based Overseas Filipino Workers can pay their contributions through the online method of their choice. This was done in cooperation with various financial institutions and private-sector partners.
“We encourage our members and employers to pay their contributions using our online channels as through these payment facilities, they no longer must go to our branches. These can be accessed at the safety and convenience of their homes or offices,” says Michael.
Individual members may also use the websites and mobile apps of other SSS-accredited collecting partners, including most of the country’s public and commercial banks. Commercial and domestic employers likewise have access to online payment methods.
SSS is a publicly funded social insurance programme mandated by the Philippine government to provide coverage to all wage earners in the private, public and informal sectors.
The agency is mandated to establish, develop, promote and perfect a sound, tax-exempt social security system suited to the needs of the people throughout the Philippines. The system should promote social justice through savings and protect members and their beneficiaries against disability, sickness, maternity, old age, death and other contingencies resulting in loss of income or financial burden.
OpenGov Asia earlier reported that digitalising SSS pension fund services remains one of the top priorities in the Philippines and that more online services will be added to its digital channels.
More than 30 member services and more than 20 employer services are currently easily accessible on the SSS website. Transactions for membership, contributions, loan granting and repayment, and benefit distributions are only a few examples of the services offered. Other SSS internet platforms also extend some of these features.
Further, almost all new online services are made available via the agency’s website, which serves as its main online platform. However, more work is being done to make the services on this portal accessible to smartphone users via the SSS Mobile App.
The agency is gradually making online transactions mandatory for its programmes. Those who lack their own means to transact online can use the e-Centres in its branches.
In the meantime, the Department of Education (DepEd) worked with Young Southeast Asian Leaders Initiative (YSEALI) exchange alumni to improve climate change education through an online programme called Climate Changemakers.
The National Educators Academy of the Philippines (NEAP) has recognised Climate Changemakers as the first climate change training course as part of the Department’s Professional Development Priorities.
Through online training and other digital education initiatives, the programme aims to equip teachers to teach climate change concepts, integrate them into the curriculum and act on climate change in the country.
The ten-week online course, which used synchronous and asynchronous modalities to address common misconceptions about climate change, was successfully completed by 400 teachers. It also gave them a space to reflect on their own learning and to exchange challenges and effective practices.
The Young Southeast Asian Leaders Initiative Professional Fellows Program (YSEALI PFP) is a two-way exchange programme run by the U.S. Department of State. Its goal is to help young leaders from Southeast Asia and the United States get to know each other better and strengthen economic ties.
Data is information that has been organised in a way that makes it simple to move or process. It is a piece of information that has been converted into binary digital form for computers and modern methods of information transmission.
Connected data, on the other hand, is a method of displaying, using, and preserving relationships between data elements. Graph technology aids in uncovering links in data that conventional approaches are unable to uncover or analyse.
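To make the distinction concrete, here is a minimal, hypothetical sketch in plain Python (not any particular graph database) of the same facts stored as flat records versus as a graph of nodes with explicit, typed relationships; all names and relationship types are illustrative:

```python
# Flat, disconnected records: relationships between people are implicit at best.
rows = [
    {"person": "alice", "product": "laptop"},
    {"person": "bob", "product": "laptop"},
]

# The same data as a graph: nodes plus explicit, traversable relationships.
edges = [
    ("alice", "BOUGHT", "laptop"),
    ("bob", "BOUGHT", "laptop"),
    ("alice", "KNOWS", "bob"),
]

def neighbours(node, rel=None):
    """Follow outgoing relationships from a node, optionally filtered by type."""
    return [dst for src, r, dst in edges if src == node and (rel is None or r == rel)]

# A question the flat rows cannot answer directly: which of alice's
# acquaintances bought the same product she did?
shared = [
    friend
    for friend in neighbours("alice", "KNOWS")
    if set(neighbours(friend, "BOUGHT")) & set(neighbours("alice", "BOUGHT"))
]
print(shared)  # ['bob']
```

The flat rows can only be joined on matching columns, whereas the graph form makes the KNOWS relationship a first-class piece of data that a query can traverse directly.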
Different sectors have invested in big data technologies because of the promise of valuable business insights. As a result, various industries express a need for connected data, particularly when it comes to connecting people, such as employees or customers, to products, business processes and Internet of Things (IoT) devices.
In an exclusive interview with Mohit Sagar, CEO and Editor-in-Chief of OpenGov Asia, Chandra Rangan, Chief Marketing Officer of Neo4j, shared his insights on how a connected data strategy is of paramount importance in building a smart nation.
Connected data enables businesses
A great example of the power of graph technology, and a very common use case for Neo4j, is its use in the financial sector to uncover fraud. Finding fraud is all about trying to make connections and understand relationships, Chandra elaborates. A graph-based system could detect if fraud is taking place in one location and determine if the same scenario has occurred in other locations.
“How does one make sense of this? Essentially, you are traversing a network of interconnected data using the relationships between that data. Then you begin to see patterns develop and these patterns provide you with answers so that you can conclude whether there is fraud.”
What is of great concern is that fraud now occurs with much greater frequency and a higher success rate. The key to stopping it and mitigating its impact is time; detecting fraud hours or days after the fact is no longer enough.
“What if the organisation could detect it almost immediately and in real-time as it occurs?” asks Chandra. “Graph offers this kind of response and is why it’s a great example of value!”
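One common graph-based fraud pattern is linking accounts through shared identifiers. The following is a simplified, hypothetical sketch of the idea, not Neo4j's actual implementation: accounts and the phone numbers and devices they use form a graph, and a breadth-first traversal from a flagged account surfaces every account connected to it through shared attributes:

```python
from collections import defaultdict, deque

# Hypothetical accounts with identifying attributes; accounts that share a
# phone or device become suspicious once one of them is flagged for fraud.
accounts = {
    "acct1": {"phone": "555-0101", "device": "dev-A"},
    "acct2": {"phone": "555-0101", "device": "dev-B"},  # shares a phone with acct1
    "acct3": {"phone": "555-0202", "device": "dev-B"},  # shares a device with acct2
    "acct4": {"phone": "555-0303", "device": "dev-C"},  # unconnected
}

# Build an undirected graph linking each account to its attribute values.
graph = defaultdict(set)
for acct, attrs in accounts.items():
    for value in attrs.values():
        graph[acct].add(value)
        graph[value].add(acct)

def fraud_ring(flagged):
    """Breadth-first traversal from a flagged account: every account reachable
    through chains of shared attributes belongs to the same suspected ring."""
    seen, queue = {flagged}, deque([flagged])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(n for n in seen if n in accounts)

print(fraud_ring("acct1"))  # ['acct1', 'acct2', 'acct3']
```

Because the traversal runs over an in-memory graph, the same check can be applied at transaction time rather than in an overnight batch, which is the real-time responsiveness Chandra describes.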
Supply chain management is another excellent example of ROI. One of Neo4j’s clients, which operates arguably the largest rail network in North America, created a digital twin of the entire rail network and all the goods on it. With graph technology across their network, they can now perform all kinds of interesting optimisation much faster, leading to better, more efficient outcomes for their entire system.
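To illustrate what optimisation over a digital twin can look like, here is a toy sketch, with invented yards and travel times that bear no relation to the client's actual network, using Dijkstra's algorithm to find the quickest route for a shipment across a small graph:

```python
import heapq

# A toy "digital twin" of a rail network: hypothetical yards and travel
# times in hours between them. Real networks are vastly larger, but the
# same graph traversal applies.
network = {
    "Chicago": {"KansasCity": 8, "Memphis": 9},
    "KansasCity": {"Dallas": 7, "Memphis": 6},
    "Memphis": {"Dallas": 7, "NewOrleans": 6},
    "Dallas": {"Houston": 4},
    "NewOrleans": {"Houston": 6},
    "Houston": {},
}

def fastest_route(start, goal):
    """Dijkstra's algorithm: the quickest path for a shipment across the twin."""
    queue = [(0, start, [start])]  # (accumulated hours, current yard, path)
    settled = {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if cost >= settled.get(node, float("inf")):
            continue  # already reached this yard more cheaply
        settled[node] = cost
        for nxt, hours in network[node].items():
            heapq.heappush(queue, (cost + hours, nxt, path + [nxt]))
    return float("inf"), []

print(fastest_route("Chicago", "Houston"))
# (19, ['Chicago', 'KansasCity', 'Dallas', 'Houston'])
```

The same traversal generalises to questions like rerouting around a blocked segment or finding every shipment affected by a delay at one yard.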
The pandemic has taught the world about the value and fragility of supply chains. Systems across the globe are being reimagined as economies realise the need to become more digital and strategic. More supply sources, data, data sharing, customer demands, and increased complexity necessitate modern, purpose-built solutions.
Beyond all the new expectations and requirements for modern supply chains, systems need to be, and are becoming, more interconnected because of new technologies.
Maintaining consistent profitability is difficult for asset-intensive firms. Executives must oversee intricate global supply chains, extensive asset inventories and field operations that dispatch workers to dangerous or inaccessible places.
With this, organisations need a platform that connects their workforces and makes them more capable, productive and efficient. A platform that provides enterprises with real-time visibility and connectivity, while also assuring efficiency, safety, and compliance.
Modern technologies are required to improve interconnectivity, maximise the value of data, automate essential procedures, and optimise the organisation’s most vital workflows.
Modern data applications require a connected platform
“When we programme, when we create applications, we think in what we are calling a graph. This is the most intuitive approach that you can have,” says Chandra.
Any application development begins with understanding the types of questions people want to answer and then mapping them to the range of outcomes they want to achieve. These are typically captured in what is known as an entity-relationship diagram.
People increasingly rely on systems that work in ways that make sense to them and support them, which raises the stakes when those systems fall short. In such situations, Neo4j makes sense of the complexity and simplifies what needs to be done, resulting in a significant acceleration.
As the world becomes more collaborative, integrated, and networked, nations must respond more quickly to changes in their business environment brought on by the digital era; otherwise, they risk falling behind or entering survival mode.
The proliferation of new technologies, platforms, and devices, as well as the evolving nature of work, are compelling businesses to recognise the significance of leveraging the most recent technology to achieve greater operational efficiencies and business agility.
A graph platform connects individuals to what they require, when and where they require it. It augments their existing processes by facilitating the effective recording and management of personnel data. Neo4j Graph Data Science assists data scientists in finding connections in huge datasets to resolve important business issues and enhance predictions.
Businesses employ insights from graph data science to discover activities that point to fraud, find similar entities or people, enhance customer satisfaction through improved recommendations, and streamline supply chains. The dedicated workspace combines data intake, analysis and management for simple model improvement without workflow reconstruction.
As a result, people are more engaged, productive, and efficient with connected data. Nations can bridge information and communication gaps between executive teams, field technicians, plant operators, warehouse operators and maintenance engineers. Increasing agility and productivity offers obvious commercial benefits.
In short, organisations can easily integrate their whole industrial workforce to increase operational excellence and decrease plant downtime, hence maximising revenue. This approach is built on a collaborative platform.
Contextualising data increases its value
According to Chandra, data is a representation of the world in which people live. That world is becoming more connected; people no longer live in silos but remain linked throughout society.
“If you think about data as the representation of the world that we live in, it is connected data and we can deal with all the complexities that we need to deal with when we try to make sense out of it,” explains Chandra.
Closer to home, connected data is crucial to Singapore’s development as a smart nation. “Connected data is at the centre of each of those conversations around developing the nation, whether you think of Singapore as a connected ecosystem or think about citizens, services, logistics, contact tracing and supply chains.”
Chandra believes that relationships, as much as attributes, capture the connection between data and people, which is why connections are important. Once people understand those connections, it becomes much easier and much faster to derive the insights that are required.
Without connected data, organisations lack key information needed to gain a deeper understanding of their customers, build a complete network topology, deliver relevant recommendations in real-time, or gain the visibility needed to prevent fraud.
Thus, “knowing your customer is understanding connected data.” With the right tools, data can be a real-time, demand-driven asset that a financial institution can utilise to reinvent ineffective processes and procedures and change how it interacts with and understands its customers.
“Me as a person – who I am, my name, where I live – these are all properties of who I am. But what really makes me me, are the relationships I have built over time. And so, the notion that almost every problem has data that you can really make sense of with graphs is the larger “Aha” moment,” Chandra ends.
Legacy systems are pieces of hardware or software that are out of date but still in use. These systems frequently cause problems and are incompatible with more modern ones. Although they can be used in the manner intended by their creators, they cannot be improved.
Such systems are the backbone of many otherwise excellent organisations, which rely on software, apps and IT solutions that are crucial to the general operation of the business but are obsolete and, in some cases, no longer supported by the original software vendor or developer.
While running legacy systems may not appear to be a big deal, they do present a unique set of challenges and potential issues that organisations would be remiss to ignore.
Thus, obsolete legacy systems are at best a nuisance and, at worst, can undermine an organisation’s entire IT security strategy, severely impeding productivity. Furthermore, the longer a company waits to modernise a legacy system, the more difficult the transition becomes.
At the same time, system modernisation is a prerequisite for digital transformation. Most firms will be unable to fully realise the benefits of new technologies and solutions without it.
Due to the rapid development of technology, businesses must maintain compatibility with legacy systems that impede the implementation of contemporary technologies.
To this end, the Centre for Strategic Infocomm Technologies (CSIT) employs technology to facilitate and advance Singapore’s national security. Due to the highly classified nature of its environment, it must be air-gapped.
This means that development and deployment are conducted in networks that are not connected to the internet. Consequently, all platforms have to be installed on-premises.
Despite not being able to utilise internet-connected services, CSIT has a Cloud Infrastructure and Services section that offers developers the necessary infrastructure to concentrate on software development.
Further, a monolithic system is a large application consisting of code built by several developers over many years. Frequently, the code is inadequately maintained, and some of those developers may have left the team or the organisation, leaving knowledge gaps.
Due to a lack of expertise and the difficulty of modifying a system that is constantly in use in production, refactoring the code is comparable to replacing the tyres on a moving car.
Having a legacy system results in greater maintenance and support costs and decreased efficiency. Since the monolith was still essential, CSIT opted for a more manageable strategy: decomposing it into smaller services using the microservices methodology.
Microservices, on the other hand, are software programmes that execute a business function as part of a larger system yet are separate services. These services are intended to be lightweight and straightforward to implement.
Microservices have the following advantages: each service is independently scalable; services have smaller code bases that make them easier to maintain and test; and problems are isolated to a single service, allowing for faster troubleshooting.
In addition, there are two main microservice architectures to consider when implementing the microservices approach, each with advantages and disadvantages suited to specific use cases. Orchestration, as the name suggests, requires an orchestrator that actively controls the work of each service, whereas choreography takes a less rigid approach, allowing each service to carry out its work independently.
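The contrast can be sketched in a few lines of Python. This is a deliberately simplified illustration, with three stand-in services for a hypothetical order workflow, rather than a real distributed deployment:

```python
# Three stand-in "services" for a hypothetical order workflow.
def validate(order):
    order["valid"] = True
    return order

def charge(order):
    order["charged"] = True
    return order

def notify(order):
    order["notified"] = True
    return order

# Orchestration: a central coordinator explicitly calls each service in turn
# and therefore owns the overall sequence.
def orchestrator(order):
    return notify(charge(validate(order)))

# Choreography: services subscribe to events and react on their own; the
# sequence emerges from events rather than from a central coordinator.
subscribers = {}

def subscribe(event, handler):
    subscribers.setdefault(event, []).append(handler)

def publish(event, order):
    for handler in subscribers.get(event, []):
        handler(order)

subscribe("order_placed", lambda o: (validate(o), publish("order_valid", o)))
subscribe("order_valid", lambda o: (charge(o), publish("order_charged", o)))
subscribe("order_charged", lambda o: notify(o))

order_a = orchestrator({"id": 1})
order_b = {"id": 2}
publish("order_placed", order_b)
# Both orders end up validated, charged and notified.
```

The trade-off mirrors the article's point: the orchestrator makes the workflow easy to read and debug in one place, while choreography removes the single point of control but spreads the sequence across event handlers.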
Microservices architecture may not be appropriate for every project, and the choice of architecture should be based on the needs of the project; CSIT therefore advises teams to expect new problems to arise and to be prepared to adapt to them.