Data is increasingly at the core of any business or organisation and is underpinning digital strategies and initiatives more than ever. Data has become a key component of digitalisation and the driving force behind, and fuel for, analytics, machine learning, edge computing, cloud and other cutting-edge technologies.
As the need to respond quickly – in as near real-time as possible – grows, data will rapidly become the key competitive advantage. A company’s capacity to compete will be determined by its ability to leverage data: to apply analytics and generate intelligence.
Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia agrees that “data is the new oil”. But like oil, raw data is not particularly useful in and of itself. Information – processed data – swiftly becomes a decision-making tool that allows businesses to react to market dynamics and make proactive, intentional decisions. The real value of data lies in the timely, actionable insights, trends and projections it offers, which can help organisations survive and thrive in a VUCA (volatile, uncertain, complex and ambiguous) world.
Generating data is not really the issue at hand. Both the public and private sectors, for the most part, hold massive volumes of data and continue to add to it. However, this data has largely been unorganised and siloed, making it difficult to access and process.
The question is: how can agencies and organisations best derive real value from these mountains of data, which are often distinct, distant and diverse? How do they collect, analyse and rationally build patterns and interconnections to improve decision-making and planning?
While organisations have been deploying AI and ML to extract and analyse insights from their data, a new platform has emerged with the potential to offer deeper insights – the Graph Data Platform.

OpenGov Asia had the opportunity to speak with Nik Vora, Vice President, Asia-Pacific, Neo4j to gain his insights on the importance of a Graph Data Platform and how organisations can derive actual value from it.
Nik Vora is the Vice President of Asia-Pacific at Neo4j. He has over 12 years of experience in the tech industry and joined Neo4j as the company was looking to grow its operations in the Asia-Pacific region. In his present position, he oversees the APAC business, which develops solutions that help businesses and communities see the connections and linkages among massive amounts of data and make better decisions.
Genuine innovation or repackaging?
Mohit is keen to know: is this just old technology in new packaging, or is there legitimate value-add? If so, what do organisations gain from a Graph Data Platform?
Nik Vora is quick to clarify that the tool is important because it has the capability to extract the inherent value in the data itself. Data needs to be seen as a network and not merely discrete data points – and the best way to visualise these relationships is in graphs.
A Graph Data Platform considers the relationships between data to be just as significant as the data itself. The purpose of the technology is to store information without restricting it to a pre-defined model. The data is maintained in the same way that it is initially collected, with each unique item connected or related to others. In a native Graph Data Platform, accessing nodes and relationships is a speedy, constant-time operation that allows a company to traverse millions of connections per second per core.
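To make this concrete, here is a minimal sketch using Neo4j’s official Python driver, in which items are stored as nodes with explicit relationships and then traversed directly. The connection details, labels and data are hypothetical.

```python
# A minimal sketch of storing and traversing connected data with Neo4j's
# official Python driver. The URI, credentials, labels and data are
# hypothetical, invented for illustration.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Store each item as a node and its connection as a relationship,
    # mirroring the shape in which the data was originally collected.
    session.run(
        "MERGE (c:Customer {name: $name}) "
        "MERGE (p:Product {sku: $sku}) "
        "MERGE (c)-[:PURCHASED]->(p)",
        name="Alice", sku="SKU-42",
    )
    # Traversal follows relationships directly rather than joining tables:
    # here, other customers who bought the same products as Alice.
    result = session.run(
        "MATCH (c:Customer)-[:PURCHASED]->(:Product)<-[:PURCHASED]-(other) "
        "WHERE c.name = $name RETURN DISTINCT other.name AS name",
        name="Alice",
    )
    for record in result:
        print(record["name"])

driver.close()
```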
Companies, agencies and other organisations in the ecosystem, according to Nik, are looking to extract value from data. Over the last 24 months, there has been a massive acceleration of digitisation – of supply chains, processes, services and transactions. This has pushed more information online, allowing more data to be captured. In turn, businesses rely increasingly on data, leading to further optimisation depending on how much value an organisation can create from its data.
As data becomes more distributed, dynamic and diverse, it is important to capture it in real-time and process it to drive rapid action and feed into strategy, Mohit opines. This means that data needs to be on hand for those who need it. The importance of data availability and accessibility anytime and anywhere is even more pronounced in the current crisis. This is especially true for organisations engaged in providing mission-critical, customer-centric services.
Wholeheartedly agreeing, Nik says the greater the demand for data-driven insight and intelligence, the more important it is to grasp the importance of connectedness in existing data. A Graph Data Platform is uniquely positioned to do this. Since it is modelled as a graph and a network, a Graph Data Platform is the ‘most obvious approach’ to look at connections. “The value of relationships itself is the underlying drive for this technology,” he explains.
Investing in data analytics and technologies without first determining what your specific organisation needs to succeed is indeed a waste. It is necessary to first build a big data strategy to get the most out of the data a company already has or plans to collect. A big data strategy lays out how data will be used in practice and what kinds of data a business needs to meet specific business goals.
However, does this mean that all organisations should reconsider their entire data collection strategy, including how they acquire, store and distribute data? This, Mohit feels, would be markedly prohibitive.
The answer to this is actually a bit of both and while there is an investment involved it is not unreasonable, says Nik. Organisations do not need to modify their data, but they do need to change their perspective.
The key concern should be how data is connected and how it relates to other data sets and points. Organisations have spent many years building data lakes and data warehouses, and all the data that any organisation could need already exists. What they need to do now is turn on the tap and start looking at the relationships between different data connected across silos, processes, networks and transactions.
The challenge – and the advantage – is that it is a very dynamic world. Given this new understanding of how interconnected everything is, an organisation that does not have a linked data strategy – one that looks at data, how it connects and what relationships and dependencies exist – is missing out on huge potential.
Many businesses rely on data to assist rather than drive their operations. But why is that? After all, data is only valuable if it can be turned into actionable insights. Finding out what you want from your data and determining its worth is the first step in gaining these insights.
“We are all gaining insight from our existing data in some way,” posits Nik. “Organisations should be more intentional about it if they are to gain genuine advantages.”
Within an organisation’s ecosystem, there are many existing relationships and connections. With the plethora of technologies, ecosystems and capabilities available, Nik believes that the ideal time to start investing is NOW. But investment is not just in technology but in people!
Data and analytics leaders are often perceived as the gurus of graph technology, but the truth is, Mohit points out, many still don’t comprehend it themselves. This means that there has to be an upskilling of the entire workforce if a company wants to gain real value from data. So, how do companies get started?
The strategy, Nik believes, is two-pronged: training and staffing. Organisations must empower their existing workforce to understand the value of a Graph Data Platform and how to use it. Beyond this, they need to bolster organisational capacity by hiring the right people. Although there is a lot of great talent in the market and a relatively large pool, Nik advises caution in recruitment as skills are relatively easy to fake.
“When you embark on a project or a journey, you have enough (and more) talent in the partner ecosystem, as well as the deployed developer ecosystem, where you can source people from,” Nik acknowledges. “However, it is essential to be careful that potential candidates go through a rigorous selection process.”
Big Data can have ‘infinite value’
There are a lot of one-line proverbs and truisms used to push unnecessary products. One is that big data can have “infinite value”. Is this factual or just another way to justify more expenses on the books?
“The simple answer is that it is up to an organisation to decide how they budget their funds. But it is better to look at it differently. It’s not intrinsically about just money,” Nik explains. “It’s the perspective organisations have of tech. Do they see it as an expense or an investment?”
Yes, in the short term, companies tangibly invest their resources, time and effort; but, more significantly, they are investing in their company’s future based on the decisions they make. A case in point is fraud.
“If you look at fraud detection alone, fraud has gone offline as well as online; it’s omnichannel; it’s not just one fraudster dealing with one credit card somewhere.”
Fraud detection and anti-money laundering depend enormously on exposing connections and patterns. With the new Neo4j Graph Data Platform, which incorporates both Neo4j Graph Data Science and the graph database, detecting fraud is considerably easier, and Neo4j’s technology has helped uncover millions of new fraud cases.
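As a hedged illustration of what connection-centric detection can look like, the sketch below queries a graph for accounts that share a phone number – a classic fraud-ring signal. The schema, labels and connection details are invented for the example and are not Neo4j’s actual fraud model.

```python
# A sketch of a shared-identifier fraud query. The Account/Phone schema
# and the connection details are hypothetical, for illustration only.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Two distinct accounts using the same phone number is a pattern
    # worth flagging for review; the graph exposes it in one match.
    result = session.run(
        "MATCH (a1:Account)-[:USES]->(p:Phone)<-[:USES]-(a2:Account) "
        "WHERE a1.id < a2.id "
        "RETURN p.number AS shared_phone, a1.id AS account_a, a2.id AS account_b"
    )
    for record in result:
        print(record["shared_phone"], record["account_a"], record["account_b"])

driver.close()
```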
So going in for a Graph Data Platform now does mean an expense in terms of investing in the technology, training people and setting up systems; but it has massive RoI down the line, in addition to protecting a company’s most valuable assets – market reputation and customer trust.
Case Study
There is a whole new thrust of marketing to customers personally – it is no longer just a store or an e-commerce website. With people on social media, rating platforms and a host of mobile apps, getting a complete customer 360 view is much more difficult and complex than before. Organisations are increasingly relying on numerous consumer touchpoints to gain a more comprehensive picture of each customer.
To add to the complexity, an organisation can have a million customers or more, with data points spread across billions of records from transactions, events and sensors.
One of Neo4j’s customers, AirAsia – one of the largest and most well-known airlines in Southeast Asia – saw a 300% spike in its test group after employing Neo4j graph data science. This was because the airline was able to gain a significantly deeper grasp of the customer from a single consumer perspective.
To do this, Neo4j did not discard any of the company’s existing technology, but instead layered its platform on top, connecting all the company’s assets – data lakes, data warehouses and data science notebooks – to the power of the Graph Data Platform. As a result, there was a massive performance improvement.
Proof of the pudding is in the eating
While claims are easy to make, the test of a technology’s effectiveness is its success in real-life applications. AirAsia apart, Neo4j serves a wide spectrum of financial organisations that deploy its solutions.
Neo4j counts a whole host of banks among its satisfied customers, including Standard Chartered, prominent banks in Singapore and one of Australia’s largest banks. Most recently, Neo4j partnered with DBS for its hackathon – DBS’s flagship event.
At the same time, Neo4j has a large number of on-premises start-ups, as well as cloud and digital-native accounts, all of which are using the Neo4j cloud experience in APAC.
These organisations represent the best in their sectors, and they are at the top because they invest in technologies that help them progress. Findings indicate that Graph Data Platforms were used in 50% of all artificial intelligence projects. This is because incorporating a Graph Data Platform into an organisation’s existing AI strategy offers significant improvement at low cost. The investment is minimal and corporations can increase confidence scores and outcomes for a fraction of the expenditure on other solutions.
Embarking on a new transformational journey
Graph analytics applications use algorithms to traverse and analyse graphs to uncover and potentially identify intriguing patterns that represent business prospects.
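As a small, self-contained illustration of graph algorithms at work, the sketch below uses the open-source networkx library to run PageRank over a toy network and surface the nodes where connections converge; the data is invented for demonstration.

```python
# A small illustration of graph analytics: PageRank-style centrality
# over a toy directed network, using the open-source networkx library.
# The nodes and edges are made up for demonstration.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("A", "B"), ("B", "C"), ("C", "A"),  # a tight cluster of references
    ("D", "A"), ("E", "A"),              # many paths converge on A
])

# PageRank traverses the link structure to score each node's importance.
scores = nx.pagerank(G)
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.3f}")
```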
By analysing data, business operators can better understand what they are doing efficiently and inefficiently within their businesses. Professionals with an analytics background can answer critical questions once a problem has been recognised.
While Mohit concedes that businesses are on the top because they invest in technologies that help them progress, the pertinent question is: what made them decide to use this technology and how did they get started?
Answering with another truism – the early bird catches the worm – Nik feels that, in all likelihood, the leading companies had a combination of higher risk appetite, vision and gut instinct. With trailblazers leading the way, the question now is not so much how companies get started but when.
With the gains seen in the companies that already deploy Graph Data Platforms, others are eager to climb on board. But they seem to be unsure about timing and the most opportune stage to do so.
“We are seeing a lot of other companies that are inspired by these pioneering companies’ successes and are putting a lot of faith and stock in our technology,” Nik acknowledges. “Leaders in any organisation have to understand that technology is an investment that everyone must make. The time is always right to invest in such technology!”
For more information on Neo4j visit https://neo4j.com/
The Ministry of Heavy Industries (MHI) launched an Automated Online Data Transfer system to collect critical domestic value addition (DVA) data from a Production Linked Incentive (PLI) scheme applicant’s enterprise resource planning (ERP) system.
The PLI scheme was launched to boost domestic manufacturing, investments, and the export of telecom and networking products. The PLI Scheme for Automobile and Auto Component Industry in India (PLI Auto) offers financial incentives to boost the domestic manufacturing of Advanced Automotive Technology (AAT) products and attract investments in the automotive manufacturing value chain.
Through the new automated online data transfer mechanism, MHI’s PLI Auto Portal will receive data from the applicant’s ERP system. All approved applicants under the PLI scheme have their own ERP system, which is software that enables organisations to manage business activities.
According to a press release, the application programming interface (API) will be embedded with the applicant’s ERP system, making processes in the scheme automatic and paperless. An API is a set of rules that lets different programmes communicate with each other, exposing data and functionality across the Internet in a consistent format. It is an architectural pattern that describes how distributed systems can expose a consistent interface in a secure cyber environment.
Through the previous system, PLI applicants were required to file voluminous claims. The new system eliminates a large amount of paperwork through automation. It reduces the compliance burden for applicants and speeds up claim processing. The release stated that it was created after exhaustive stakeholder consultations with leading original equipment manufacturers (OEMs) and auto component manufacturing companies.
MHI Minister, Mahendra Nath Pandey, noted that the system is an important step in enhancing transparency, ease of doing business, faceless and self-certification-based assessment, and the paperless delivery of services.
The government approved PLI Auto to enhance the country’s manufacturing capabilities for AAT products with a budgetary outlay of US$ 3.9 billion. The scheme has attracted proposed investments of US$ 8.5 billion against the target estimate of US$ 5.3 billion over five years. FY 2022-23 is the first financial year for which an approved applicant can claim incentives on determined sales. Sales of AAT products with a minimum DVA of 50%, made from 1 April onwards, shall be eligible for incentives for a period of five years.
Applicants should maintain a detailed DVA calculation for all their eligible products in their own ERP systems, recording the DVA calculation for each batch, product and model with details of component-wise values, component-wise DVA and the final DVA at the AAT product level. Applicants’ ERP systems will push the product-wise DVA to the PLI Auto portal on a quarterly basis through the API.
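The release does not publish the API contract, but a quarterly push of this kind would look roughly like the following sketch; the endpoint, token and payload fields are hypothetical.

```python
# A hedged sketch of the kind of quarterly DVA push an applicant's ERP
# system might make to the PLI Auto portal. The URL, token and payload
# schema are hypothetical; the actual API contract is not public.
import requests

payload = {
    "quarter": "2022-Q2",
    "products": [
        {"model": "EV-X1", "component_values": 118_000, "dva_percent": 54.2},
    ],
}

response = requests.post(
    "https://pli-auto.example.gov.in/api/v1/dva",  # hypothetical endpoint
    json=payload,
    headers={"Authorization": "Bearer <token>"},
    timeout=30,
)
response.raise_for_status()  # surface any transfer failure to the ERP
```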
Over the past year, the government has launched several portals and applications to automate the delivery of public services across several sectors. For example, in May, it launched a single national portal for biotech researchers and start-ups that seek regulatory approval for biological research and development projects. The Biological Research Regulatory Approval Portal (BioRRAP) allows stakeholders to see the approvals accorded against a particular application through a unique BioRRAP ID, as OpenGov Asia reported.
In June, the Department of Pension and Pensioners’ Welfare launched a mobile phone version of Bhavishya, an artificial intelligence-enabled common portal for pensioners and elder citizens. The portal aids the seamless processing, tracking, and disbursal of pensions.
The Hong Kong University of Science and Technology (HKUST) and a major cloud computing company recently signed a Memorandum of Understanding (MoU) to further their collaboration in nurturing local technology talent, collaborating on cutting-edge technology research and facilitating the research works of HKUST researchers.
Under the MoU, HKUST and the global leader in cloud technologies will nurture local talent in knowledge relating to data analytics, cloud computing and artificial intelligence (AI) to further develop Hong Kong’s pool of technology professionals and meet industry demands.
The two parties will roll out joint talent development programmes, providing workshops and seminars to prepare students with the practical skills they will require when using advanced technologies as well as promote a culture centred around new technology innovations. Internship opportunities will also be offered to HKUST students by the tech firm.
This partnership aims to support HKUST to move forward as an international leader in education and research. Drawing on its robust cloud experience, the tech firm will continue supporting HKUST in its exploration of Elastic High-Performance Computing (E-HPC) for accelerating HKUST’s research activities in pure science and engineering, as well as AI and machine learning. HKUST will also receive advisory to build an integrated, secure and flexible research platform connecting its campus in Hong Kong and the soon-to-be-opened Guangzhou campus.
The Vice-President for Research and Development at HKUST noted that the tech firm is an active supporter of pioneering research and academic-industry cooperation. Since 2018, the two parties have jointly launched 10 collaborative research projects to advance the frontiers of innovative technologies and address industry challenges. HKUST aims to strengthen this partnership to pave the way for even closer collaboration, bringing greater benefits not just for the university but for society and the region as a whole.
The tech firm’s General Manager for Hong Kong SAR, Macau SAR and the Philippines stated that the partnership with HKUST is an affirmation of its commitment to nurturing technology talent and fostering local innovation ecosystems, noting that HKUST is a leading university with world-class research across disciplines.
He added that the company is delighted to work with HKUST to continuously prepare young talent for the future and looks forward to co-developing more advanced technologies to empower various industries and advance the Greater Bay Area (GBA) as a technology hub in Asia.
Through this collaboration, HKUST and the tech firm will also work together to explore new solutions in areas such as AI, next generation intelligent data processing platforms and serverless computing, in view of addressing technology-related challenges and opening up fresh opportunities for academic and industry experts to cooperate on projects.
The two parties have been long-term collaborators. Supported by the tech firm and its parent group’s AIR Programme, the two organisations have worked together on research projects spanning machine intelligence and data computing. With the cloud tech firm’s support, HKUST also organised a FinTech hackathon to encourage its students to identify and solve industry challenges and develop innovative solutions.
The global cloud computing market is expected to grow from US$ 445.3 billion in 2021 to US$ 947.3 billion by 2026, a compound annual growth rate (CAGR) of 16.3% over the forecast period. While the COVID-19 pandemic has been a setback even as technology spending in APAC has increased, cloud adoption is expected to rise in sectors where work-from-home (WFH) initiatives are helping to sustain enterprise business functions.
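A quick back-of-the-envelope check shows the two figures are consistent with the quoted growth rate:

```python
# Verify the quoted CAGR: growing US$445.3B (2021) at 16.3% per year
# for five years should land near the US$947.3B forecast for 2026.
start, cagr, years = 445.3, 0.163, 5
forecast = start * (1 + cagr) ** years
print(f"{forecast:.1f}")  # ~947.5, matching the quoted US$947.3B
```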
Data is information that has been organised in a way that makes it simple to move or process. It is a piece of information that has been converted into binary digital form for computers and modern methods of information transmission.
Connected data, on the other hand, is a method of displaying, using, and preserving relationships between data elements. Graph technology aids in uncovering links in data that conventional approaches are unable to uncover or analyse.
Different sectors have invested in big data technologies because of the promise of valuable business insights. As a result, various industries express a need for connected data, particularly when it comes to connecting people such as employees or customers to products, business processes and Internet of Things (IoT) devices.
In an exclusive interview with Mohit Sagar, CEO and Editor-in-Chief of OpenGov Asia, Chandra Rangan, Chief Marketing Officer of Neo4j shared his knowledge on how a connected data strategy becomes of paramount importance in building a smart nation.
Connected data enables businesses
A great example of the power of graph technology, and a very common use case for Neo4j, is its use in the financial sector to uncover fraud. Finding fraud is all about trying to make connections and understand relationships, Chandra elaborates. A graph-based system could detect if fraud is taking place in one location and determine if the same scenario has occurred in other locations.
“How does one make sense of this? Essentially, you are traversing a network of interconnected data using the relationships between that data. Then you begin to see patterns develop and these patterns provide you with answers so that you can conclude whether there is fraud.”
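A toy version of this idea can be sketched with the open-source networkx library: link accounts through attributes they share (here, a device fingerprint, invented for illustration) and inspect the clusters that emerge. Larger-than-expected clusters are candidates for review.

```python
# A toy illustration of pattern-finding over connected data: link
# accounts that share a device fingerprint and inspect the connected
# components. All names and data are invented for demonstration.
import networkx as nx

shared_device = {
    "acct1": "dev-A", "acct2": "dev-A", "acct3": "dev-A",  # suspicious cluster
    "acct4": "dev-B", "acct5": "dev-C",                    # normal usage
}

G = nx.Graph()
for account, device in shared_device.items():
    G.add_edge(account, device)

# Components with several accounts hanging off one device stand out.
for component in nx.connected_components(G):
    accounts = [n for n in component if n.startswith("acct")]
    if len(accounts) > 1:
        print("possible fraud ring:", sorted(accounts))
```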
What is of great concern is that fraud is occurring with much greater frequency and with a higher success rate nowadays. The key to stopping it and mitigating its impact is time – instead of detecting a fraud that occurred hours or days ago, organisations need to catch it in the act.
“What if the organisation could detect it almost immediately and in real-time as it occurs?” asks Chandra. “Graph offers this kind of response and is why it’s a great example of value!”
Supply chain management is another excellent example of RoI. One of Neo4j’s clients, which operates arguably the largest rail network in the United States and North America, created a digital twin of the entire rail network and all the goods moving across it. With graph technology across the network, it can now perform all kinds of interesting optimisation much faster, leading to better, more efficient outcomes for the entire system.
The pandemic has taught the world about the value and fragility of supply chains. Systems across the globe are being reimagined as the world’s economies realise the need to become more digital and strategic. More supply sources, more data and data sharing, growing customer demands and increased complexity necessitate modern, purpose-built solutions.
Apart from all the new expectations and requirements for modern supply chains, systems need to and are becoming more interconnected because of new technologies.
Maintaining consistent profitability is difficult for asset-heavy firms. Executives must oversee intricate worldwide supply chains, extensive asset inventories and field operations that dispatch workers to dangerous or inaccessible places.
Given this, organisations need a platform that connects their workforces and makes them more capable, productive and efficient – a platform that provides enterprises with real-time visibility and connectivity while also assuring efficiency, safety and compliance.
Modern technologies are required to improve interconnectivity, maximise the value of data, automate essential procedures, and optimise the organisation’s most vital workflows.
Modern data applications require a connected platform
“When we programme, when we create applications, we think in what we are calling a graph. This is the most intuitive approach that you can have,” says Chandra.
Any application development begins with understanding the types of questions people want to answer and then mapping them to the range of outcomes they want to achieve. These are typically captured in what is known as an entity relationship diagram.
People increasingly rely on systems that work in ways that make sense to them and support them, which raises the criticality of those systems. When these systems fall short, Neo4j helps make sense of the complexity and simplifies what needs to be done, resulting in significant acceleration.
As the world becomes more collaborative, integrated, and networked, nations must respond more quickly to changes in their business environment brought on by the digital era; otherwise, they risk falling behind or entering survival mode.
The proliferation of new technologies, platforms, and devices, as well as the evolving nature of work, are compelling businesses to recognise the significance of leveraging the most recent technology to achieve greater operational efficiencies and business agility.
A graph platform connects individuals to what they require, when and where they require it. It augments their existing processes by facilitating the effective recording and management of personnel data. Neo4j Graph Data Science assists data scientists in finding connections in huge data sets to resolve important business issues and enhance predictions.
Businesses employ insights from graph data science to discover activities that point to fraud, find similar entities or people, enhance customer happiness through improved suggestions, and streamline supply chains. A dedicated workspace combines ingestion, analysis and management for simple model improvement without workflow reconstruction.
As a result, people are more engaged, productive, and efficient with connected data. Nations can bridge information and communication gaps between executive teams, field technicians, plant operators, warehouse operators and maintenance engineers. Increasing agility and productivity offers obvious commercial benefits.
In short, organisations can easily integrate their whole industrial workforce to increase operational excellence and decrease plant downtime, hence maximising revenues. This methodology is based on a collaborative platform approach.
Contextualising data increases its value
According to Chandra, data is a representation of the world in which people live. That world is becoming more connected; people do not live in silos but remain associated with one another in society.
“If you think about data as the representation of the world that we live in, it is connected data and we can deal with all the complexities that we need to deal with when we try to make sense out of it,” explains Chandra.
Closer to home, connected data is crucial to Singapore’s development as a smart nation. “Connected data is at the centre of each of those conversations around developing the nation – when you think of Singapore as a connected ecosystem, and when you think about citizens, services, logistics, contact tracing and supply chains.”
Chandra believes that it is relationships, as much as attributes, that carry the connection between data and people, which is why connections are important. Once people understand those connections, it becomes much easier and much faster to derive the insights that are required.
Without connected data, organisations lack key information needed to gain a deeper understanding of their customers, build a complete network topology, deliver relevant recommendations in real-time, or gain the visibility needed to prevent fraud.
Thus, “knowing your customer is understanding connected data.” With the right tools, data may be a real-time, demand-driven asset that a financial institution can utilise to reinvent ineffective processes and procedures and change how it interacts with and comprehends its consumers.
“Me as a person – who I am, my name, where I live – these are all properties of who I am. But what really makes me me, are the relationships I have built over time. And so, the notion that almost every problem has data that you can really make sense of with graphs is the larger “Aha” moment,” Chandra ends.
The state of Punjab has inaugurated its first hi-tech integrated command and control centre (ICCC), which will supervise 1,401 closed-circuit television cameras installed across the city of Ludhiana.
The ICCC will monitor traffic, LED lights, sewage treatment plants, common effluent treatment plants (CETPs), rooftop solar panels, and encroachments and defacements. It will oversee the revenue collection of the municipal corporation, including property tax, water and sewerage, disposal, and pet registration. It will measure air quality with data sourced from the central and state pollution control boards. It also has a GPS-based vehicle tracking system to monitor solid waste trucks, corporation vehicles, and city bus services.
As per reports, the centre was set up at a total cost of US $4.5 million. According to the state’s Local Bodies Minister, Inderbir Singh Nijjar, 330 more cameras are being installed in the city that will be linked to the ICCC. The cameras will also help to monitor secondary garbage collection points, compactors along the Buddha Nullah stream, and stray animals.
About 30 vehicle-mounted camera systems are also being installed on police and municipal corporation vehicles to provide live-feed surveillance footage during protests, public gatherings and other functions in the city. Additionally, 600 external IR illuminators with a 200-metre range will ensure better monitoring even in zero-visibility conditions. Officials believe the centre will bring sweeping changes to the functioning of the civic body and police administration.
Punjab has been exploring the use of emerging technology in governance over the past few years. In 2020, it became the first state to roll out a business intelligence tool for big data collection. The tool was provided for free by the Ministry of Home Affairs (MHA). In 2021, the state announced it would integrate crime and criminal tracking networks and systems (CCTNS) following the roll-out of two data analytic tools. The systems enabled police officials in the field to analyse data in a web and mobile-based application. 1,100 tablets were given to police officials in the field and 1,500 mobile phones providing access to a comprehensive database were procured.
Other states around the country are also deploying technology to support public administration activities. Earlier this week, the southern state of Telangana inaugurated a US$ 75 million police ICCC, which will function as a nerve centre for operations and disaster management. It will collect information from multiple applications, CCTVs, and traffic systems for predictive policing.
The ICCC is divided into five blocks. Tower A is the headquarters of the Hyderabad City Police Commissionerate. Tower B is the Technology Fusion Tower that hosts backups-related units like Dial-100, SHE safety, cyber and narcotics cells, and crimes and incubation centres.
Tower C has an auditorium on the ground floor and Tower D has a media and training centre. Tower E houses a command control and data centre for multi-department coordination and CCTV monitoring. The CCTV room will have access to around 922,000 cameras installed across the state.
Police can check footage from 100,000 cameras at the same time. The ICCC has space for artificial intelligence (AI), data analytics and social media units. The building also has a sewage treatment plant and solar panels that can generate up to 0.5 megawatts. As much as 35% of the land area is dedicated to greenery and other amenities such as a gym and a health and wellness centre.
To better serve and protect communities, maintain data security at scale, and perform essential tasks, all government agencies must establish a strong, contemporary data infrastructure that supports data innovation.
Government and the public sector stand to gain considerably by adopting AI into every element of their job. Government AI must consider privacy and security, compatibility with old systems, and changing workloads.
Artificial intelligence is already being used to help run the government, with cognitive applications doing everything from reducing backlogs and cutting costs to handling tasks that humans cannot easily do, such as predicting fraudulent transactions and identifying criminal suspects using facial recognition.
While AI-based technology may fundamentally transform how public-sector employees do their jobs in the coming years — such as eliminating some jobs, redesigning countless others, and even creating entirely new professions — it is already changing the nature of many jobs and revolutionising aspects of government operations.
AI in government services is centred on machine learning and deep learning, computer vision, speech recognition, and robotics. When used correctly, these techniques yield real, measurable results.
Cyber anomaly detection, on the other hand, has the potential to transform cybersecurity strategies in government systems. The possibilities are endless, but they are only now taking shape.
The OpenGov Breakfast Insight on 4 August 2022 explored cutting-edge methods for enabling large-scale analytics in the public sector.
Public Sector Services Powered by Data and AI

Kicking off the session, Mohit Sagar, CEO & Editor-in-Chief, OpenGov Asia acknowledges that data and artificial intelligence will drive the future of government services. “With a unified data platform, the public sector will be able to better serve citizens and protect their communities.”
Governments, in general, are one of the world’s largest employers, with numerous ministries, agencies and departments. The vast network of offices and services introduces significant complexity, operational inefficiencies and, frequently, a lack of transparency.
Agencies must deal with massive amounts of data in various structured and unstructured formats, which will only increase over time. Moreover, they are unable to recognise or take advantage of the full potential of data and analytics due to legacy systems and traditional data warehouses. These are, more often than not, siloed within agencies and departments, undermining their efforts to undergo digital transformation.
To generate real-time actionable insights and make data-driven decisions, data must be securely shared and exchanged at scale. Giving government organisations and policymakers access to deeper, more relevant insights for decision-making is only possible through data modernisation.
It is a given that much of the information government agencies oversee is extremely sensitive, including information about the nation’s infrastructure, energy and education as well as personal health and financial matters. Data protection at every level of the platform must be ensured through tight integration with granular cloud-provider access-control mechanisms.
The fact is that citizens stand to gain from the more individualised and effective services, enhanced national security and wiser resource management that a robust data strategy can deliver.
By integrating data with analytics and AI, government agencies can readily access all their data for the downstream advanced analytics capabilities that support complicated security use cases.
With such a platform, government security operations teams can quickly identify sophisticated threats, minimising the need for human resources through analytical automation and collaboration, and speeding up investigations from days to minutes.
Data stored by public sector bodies can be extremely valuable when shared with other departments and used to elevate data-driven decision-making. The time has come to leverage the cloud’s scale and democratise secure data access to enable downstream BI and AI use cases, allowing government agencies to accelerate innovation.
Governments can improve citizen services while implementing smarter and more transparent governance by leveraging data, analytics and AI for actionable insights at scale. It eliminates data silos and improves communication and collaboration across agencies to achieve the best results for all citizens, delivering personalised citizen services while achieving data security and cyber resilience for a satisfied population.
Building a Scalable Data, Analytics and AI Strategy with Lakehouse Platform

Data infrastructure is an essential aspect of data processing and analysis, according to Chris D’Agostino, Global Field CTO, Databricks.
The complete backend computing support system needed to process, store, transfer and preserve data is referred to as the “data infrastructure.” Without the appropriate data infrastructures, businesses and organisations cannot extract value from their data.
“If there’s one thing that many of us all have in common, it’s that we believe in the impact that data and AI can and will have on the world,” says Chris. “Today, data and AI are transforming every major industry.”
On the other hand, with the ongoing globalisation of artificial intelligence and machine learning, there is an increasing need to rethink an organisation’s whole leadership and thought process, from product strategy and customer experience to strategies to increase the efficiency of human resources.
Cloud data architectures contain the rules, models and policies that specify how data is gathered, stored, used and managed in the cloud within a company or organisation. They govern the flow, processing and distribution of that data to stakeholders and other applications for reporting, analytics and other purposes.
Every year, data collection by businesses and organisations increases thanks to IoT and new digital streams. In this climate, platforms built on cloud data architectures are displacing more conventional data platforms, which are unable to handle the growing data volumes and increasingly demanding end-user applications like machine learning and AI.
Companies are using all available data to expedite, automate and improve decision-making to increase resilience and obtain a competitive edge in the market. These methods for digital transformation are supported by AI and data literacy.
To fully realise the benefit of data and AI, change management is necessary, just like with any change in working practices. It is essential to create a cohesive and evolving plan. This can be based on three pillars: business strategy, operationalisation and architecture (after the technology barriers have been recognised).
Whether it’s a business strategy, data management, or organisational knowledge, it’s critical to assess the organisation’s level of maturity and data literacy.
Databricks Lakehouse Platform combines the best elements of data lakes and data warehouses to deliver the dependability, strong governance, and performance of data warehouses while also allowing for the openness, flexibility and machine learning support of data lakes.
By removing the data silos that normally segregate and complicate data engineering, analytics, BI, data science and machine learning, this unified approach streamlines the current data stack. To increase flexibility, it is created using open standards and open-source software.
Additionally, its shared approach to data management, security and governance lets teams work more productively and develop more quickly.
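As a rough sketch of the lakehouse pattern, the example below keeps a single copy of data in the open-source Delta Lake format and serves both warehouse-style SQL and machine-learning feature preparation from it. The paths and columns are hypothetical, and the Spark session is assumed to be configured with the delta-spark package.

```python
# A minimal sketch of the lakehouse idea with PySpark and open-source
# Delta Lake: one copy of the data serving both BI-style SQL and ML
# feature preparation. Paths and columns are hypothetical; the session
# is assumed to be configured with the delta-spark package.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse-sketch").getOrCreate()

# Land raw events once, in an open format on cheap object storage.
events = spark.read.json("/data/raw/events/")
events.write.format("delta").mode("overwrite").save("/data/lake/events")

# The same table then serves warehouse-style analytics...
spark.read.format("delta").load("/data/lake/events").createOrReplaceTempView("events")
daily = spark.sql("SELECT date, count(*) AS n FROM events GROUP BY date")

# ...and feature preparation for machine learning, with no second copy.
features = spark.read.format("delta").load("/data/lake/events").select("user_id", "date")
```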
In a global research effort in collaboration with an institution, Databricks polled 117 data leaders and the survey’s findings were illuminating and instructive.
An analytics leader’s biggest regret and issue was not embracing an open standards-based data architecture. “This didn’t surprise us. We are seeing many of our clients adopting the best open-source technologies,” Chris reveals.
In addition, the poll showed that only a small group succeeds with their AI projects, while multi-cloud is a growing reality.
Most executives say they are currently evaluating or implementing a new data platform to address their current data challenges. During these challenging times, cloud technologies allow businesses to respond and scale rapidly.
With scalable data, analytics and AI strategy, organisations can create significant value. They can implement real-time monitoring, create tailored customer experiences, deploy predictive analytics, and much more. Databricks offers tools that are specifically designed to address the challenges described.
In Conversation With: The Future of Government Services and Shared Data
All the government agencies’ data must be protected and every component must be safeguarded. Unified data with analytics and AI makes it simpler to provide quick access for the organisation’s teams and complete support for security use cases.

Joseph Tan, Deputy Director (Capability Development), Data Science & Artificial Intelligence Division, Government Technology Agency, emphasised the importance of data modernisation with a holistic approach. A policy-driven industry that can be entrusted with organisations’ data will lead to better customer service.
Joseph is convinced that “As technology advances, most businesses are confronted with issues caused by an existing legacy system. Instead of providing companies with cutting-edge capabilities and services such as cloud computing and improved data integration, a legacy system keeps a business constrained.”
A legacy system is outdated computer software or hardware that is still in use. The system still meets the needs for which it was originally designed, but it does not allow for growth. Because a legacy system can only do what it does now for the company, it will never be able to interact with newer systems.
“A business might keep using an old system for more than one reason. In the world of investments, for example, upgrading to a new system requires an initial investment of money and people, while keeping an old system running costs money over time,” Joseph explains.
On the other hand, when a whole company moves to a new system, there can be some internal resistance and worries about how hard it will be and what might go wrong. For example, legacy software might have been made with an old programming language, which makes it hard to find staff with the right skills to do the migration.
Additionally, there might not be much information about the system, and the people who made it might have left the company. It can be hard to just plan how to move data from an old system to a new one and figure out what needs the new system will have.
Increased security risk, instability and inefficiency, incompatibility with new technology, company perception and new-hire training, single points of failure and lack of documentation are a few issues that older systems run up against.
At best, outdated legacy systems are a pain, and at worst, they can seriously jeopardise an organisation’s overall IT security strategy. Furthermore, the longer a business waits to update a legacy system, the more challenging the transition will be.
System modernisation is almost always a must before digital transformation can occur. Most businesses won’t be able to fully profit from contemporary technologies and solutions without it. “With this, finding the right talent would be very beneficial for the organisation to manage their modern technologies,” says Chris.
Updating legacy systems has clear advantages: enterprises can enhance their IT security and sustain it by taking advantage of future vendor upgrades and fixes. Modern systems and solutions, including retrofitted legacy systems, are built to deliver optimal performance without consuming excessive amounts of computational power.
Even a legacy system may be modernised to include new features, giving the business users additional capability and a better user experience. The truth is that updated legacy systems require less input from IT staff, freeing them up to focus on activities that really benefit a company.
Similarly, governments all over the world will undergo a fundamental upheaval because of big data and artificial intelligence. Even though the public sector has long used data, the potential and actual use of big data applications have an impact on some theoretical and practical aspects of decision-making. This is fuelled by both the data revolution and the concurrent advancement of advanced analytics.
The availability of data that may be employed in the computer learning process is a major aspect of the maturing of AI technology and the practicality of AI applications to public policy and administration.
However, without the underlying analytical technologies, the data revolution can be seen as only a change in the size of the data that is currently available rather than a fundamental change. As predictive analytics, innovative data and artificial intelligence gain prominence, it is critical to understand their roles in the public sector.
At the start of their data journey, organisations require data capture systems to discover information embedded in all levels of business operations. Following that, the data must be validated for informational accuracy and integrated to reduce the risk of drawing incorrect conclusions and to create a unified view of the business.
The final step is analysis, in which businesses collaborate with data analysts who use cutting-edge analytics tools to peel back layers of proprietary data in search of insights to power change.
Larger companies with more complex data integration and analytics processes can add predictive analytics as the fourth step.
When analysing enormous datasets (often referred to as “big data”), predictive data analytics, also referred to as advanced analytics, uses autonomous or semi-autonomous algorithms to make predictions based on patterns in the information. By delivering deeper insights into company data more quickly, data analysts can provide clients with greater service, which can result in more meaningful transformations.
Think about how AI and machine learning might be used in the context of the data processing flow. Analytics tools assist data analysts in identifying areas for improvement in the business after private data has been collected, analysed and combined into a single view.
AI excels at discovering data patterns that humans cannot perceive, and this scales quickly with the size of the dataset. To make data analytics frictionless, machine learning algorithms can also adapt to data-pipeline input and human behaviour patterns. This can be accomplished by using natural language processing to recode communications between individuals within an organisation so that algorithms can comprehend and act on them.
Artificial intelligence and machine learning have become the “next big thing” in the government sector.
Smart solutions enable advances that are self-sustaining and AI and ML are at the heart of these. Executives and practitioners agree that AI and ML are catalysts and drivers across both the public and private sectors. As an AI system has a deeper understanding of data platforms and processes, it can continue to enhance its efficacy and capacity to provide personalised insights from massive data silos.
Conclusion
In closing, Chris shared that Databricks was established in 2013 to assist data teams in resolving the world’s most challenging problems, and the company has been investing in the Asia-Pacific region to further this objective. “While there are countless possibilities, there are several challenges as well.”
It is insufficient to merely fund and use AI technologies. Businesses and organisations need a talent pool of experts that can use these AI tools in a way that can guarantee the greatest outcomes.
Currently, customers from a wide spectrum of businesses are collaborating with Databricks to tailor their clients’ experiences to improve their capacity to react to market dynamics and safeguard both their own and all stakeholders’ interests. This is most evident in real-time for financial services organisations to help deal with fraud.
“My particular favourite is Databricks’ assistance in Mitsubishi Tanabe’s efforts to quicken drug clinical trials in Japan. The possibilities for our collaboration are virtually endless,” Chris reflects.
Mohit recognises that digital transformation is vital in today’s VUCA environment. What is essential is that industry and government collaborate and work together. For long-term success and sustainability, there have to be partnerships between the public and private sectors.
Strategic alliances give businesses and government agencies a competitive edge. Partnerships are mutually beneficial, helping each party grow and improve. When people genuinely try to help each other, “it can help to get over certain weaknesses and be first movers in their field.”
The development of the National Capital City (IKN) of the Archipelago has made the integration of spatial data and non-spatial data very strategic. “We need to push for precise, good spatial data that can be operated to support all development sectors in IKN,” says Muh Aris Marfai, Head of the Geospatial Information Agency (BIG) during the opening of the Regional Geospatial Information Network Coordination Meeting for Regency/City in East Kalimantan.
He added that the development of the Geospatial Information (GI) system in East Kalimantan is becoming increasingly important in line with the development of the IKN. Thus, Regional Geospatial Information Network Coordination Meetings were held to enhance the role of local governments in the construction of network nodes and to serve as an initial assessment of the condition of the regional network nodes.
The network node assessment covers five pillars: regulations and policies; institutions; human resources; technology; and standards for geospatial data and information. The government is looking for solutions for regions that report problems and obstacles in building their respective network nodes.
Currently, only two areas in East Kalimantan have not been integrated into the Regional Geospatial Information Network (JIGD): Paser Regency and West Kutai Regency.
One of the important factors in the development of the capital city is the availability of data, and the government hopes that Paser and West Kutai Regencies will soon build their JIGD nodes. The data in JIGD will later be integrated with the statistical and financial data being pursued under the One Data Indonesia (SDI) programme. Local government support for improving the operationalisation of the network-node functions of the Regional Apparatus Organisation (OPD) is therefore very important. This is necessary to integrate and synchronise Thematic Geospatial Information (IGT) and to resolve various spatial problems and conflicts.
Regulations and policies are also very important in realising the strengthening of regional network nodes. Coordination between stakeholders, academia, the private sector, and government partners must run synergistically so that the sharing of spatial data through network nodes can be carried out optimally.
Meanwhile, the nation’s Geospatial Information Agency recently signed an agreement with the Regent of Berau, Sri Juniarsih Mas. The collaboration covers organising, developing and utilising geospatial data and information in Berau.
Aris explained that Presidential Regulation Number 27 of 2014 concerning the National Geospatial Information Network (JIGN) regulates the establishment of network nodes; each region is therefore obliged to organise a JIGD. With the National Capital City of the Archipelago now under construction, the integration of spatial and non-spatial data in the East Kalimantan region is very strategic.
The collaboration with BIG is an effort to foster the implementation of government affairs related to land use and development investment. The government hopes that the existing Geospatial Data and Information will be able to assist the decision-making process in the planning and programme of the Berau Regency Government.
In addition, the collaboration with BIG is a form of support from the Berau Regency Government for the One Map Policy (KSP) for the success of national development as the KSP can be used as a guideline for implementing regional policies that refer to one standard geospatial reference, one database, and one geoportal.
The KSP plays a very important role in addressing the problem of overlapping land use in the regions that hinder economic growth. The lack of certainty of land availability will greatly affect development investment.
The third update on the government’s measures to secure personal data has been released by Singapore’s Smart Nation and Digital Government Office (SNDGO). To increase transparency regarding how the Government utilises and safeguards citizen data, the Public Sector Data Security Review Committee (PSDSRC) made this annual update a fundamental recommendation.
The number of incidents involving government data increased from 108 in FY2020 to 178 in FY2021. While the number of data incidents reported has increased, none of these incidents was deemed severe enough to have a significant impact on the agency or the individuals affected.
The overall increase in reported data incidents mirrors trends seen in the private sector and globally, as data exchange and use continue to grow. The increase also reflects increased awareness among public officials of the importance of data security and reporting all incidents, regardless of severity.
Additionally, the government began developing the Central Account Management (CAM) solution in August 2021 to improve the user-account management process. CAM automates the removal and deactivation of user accounts that are no longer required due to staff turnover. Since its rollout in April 2022, 32 per cent of eligible government IT systems have been configured for CAM onboarding.
In May 2022, the government also launched the Whole-of-Government (WOG) Data Loss Protection (DLP) Suite, which prevents sensitive data from being accidentally lost from government networks, systems and devices. The WOG DLP tools employ technical and process controls to detect risky user activities.
Since its inception in December 2020, the Data Privacy Protection Capability Centre (DPPCC) has been developing data privacy protection toolkits that agencies can use to promote data protection without limiting data use. The DPPCC has also been collaborating with agencies to co-create solutions that strengthen data privacy and key system protection, including dataset segregation and stringent encryption standards to reduce the risk of data exposure.
Likewise, the government recognises that it is impossible to eliminate data incidents entirely, so agencies must have the expertise and capability to respond quickly when they occur. The government held its first central ICT and data incident management exercises in September 2021 to ensure that the public sector is prepared to respond to data incidents at the WOG level; 33 agencies from five ministries participated.
Developing the public sector’s capabilities and instincts in data management and security is an ongoing process. Since May 20, 2021, the government has launched a series of engagement campaigns and workshops aimed at all government employees. These campaigns and workshops are intended to raise officers’ awareness of the importance of using data securely and to educate them on how to do so in their daily work.
Meanwhile, OpenGov Asia earlier reported that the Personal Data Protection Commission (PDPC) and the Infocomm Media Development Authority (IMDA) have introduced the Privacy Enhancing Technologies (PET) Sandbox to support firms looking to prototype PET initiatives that address common business problems.
The goal of the PET Sandbox is to work with participants from the commercial sector to determine the appropriate PET to use for a given instance and their technological limits to generate guidelines and best practices that will promote more adoption. The PET sandbox will provide a safe environment and a testing ground for PET concepts to achieve this.
Overall, the government’s initiatives have helped to improve the data security posture of the public sector. Singapore will continue to strengthen its security efforts to protect both citizens’ and businesses’ data. The third update on the Government’s personal data protection efforts is available on the microsite “A Secure Smart Nation” (go.gov.sg/publicsector-data-security-review).