
Insights derived from data can help healthcare providers understand health outcomes not just for individuals but for entire groups of individuals or populations. They can identify and predict high-risk segments within a population and help take preventive action, creating long-term benefits for patients, hospitals, governments and society at large.
To unlock the true potential of data for population health, data from a range of disparate sources, including clinics, hospitals, pharmacies, fitness centres and even homes and workplaces, would have to be brought together and analysed. However, traditional healthcare IT solutions tended to be limited in scope and restricted to a particular source of data.
This was the challenge faced by Cerner Corporation (Cerner), a leader in the healthcare IT space, whose solutions are used in over 35 countries at more than 27,000 provider facilities, such as hospitals, integrated delivery networks, ambulatory offices, and physicians’ offices.
Cerner was expanding its historical focus on electronic
medical records (EMR) to help improve health and care across the board. To do
so, it aimed to assimilate and normalise the world's healthcare data in order
to reduce cost and increase efficiency of delivering healthcare, while
improving patient outcomes.
Mr David Edwards, Vice President and Fellow at Cerner, explained, "Our vision is to bring all of this information into a common platform and then make sense of it — and it turns out, this is actually a very challenging problem."
The firm accomplished this by building a comprehensive view
of population health on a Big
Data platform that’s powered by a Cloudera enterprise data hub (EDH).
Management tooling, scalability, performance, price, security, partner
integration, training, and support options were key criteria for the selection
of a partner.
Today, the EDH contains more than two petabytes (PB) of data in a multi-tenant environment, supporting several hundred clients.
Building the data hub
The platform ingests data from multiple electronic medical record (EMR) systems, Health Level Seven International (HL7[1]) feeds, Health Information Exchange information, claims data, and custom extracts from a variety of proprietary or client-owned data sources.
It uses Apache Kafka, a high-throughput, low-latency open-source streaming platform, to ingest real-time data streams. The data is then pushed to the appropriate data store: an HDFS (Hadoop Distributed File System) cluster or HBase (a NoSQL database that enables random, real-time read/write access to data).
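To make the ingestion pattern concrete, the sketch below shows a producer publishing a clinical event to a Kafka topic and a consumer reading the stream and deciding where each record should land. It is a minimal, hypothetical illustration using the open-source kafka-python client; the broker address, topic name and message fields are placeholders, and it is not Cerner's actual pipeline, where routing to HDFS or HBase would be handled by dedicated connectors or stream-processing jobs.

```python
# Hypothetical sketch of Kafka-based ingestion (not Cerner's actual pipeline).
# Assumes a local broker and a topic named "hl7-feed"; both are placeholders.
import json
from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"   # placeholder broker address
TOPIC = "hl7-feed"          # placeholder topic for incoming clinical messages

# Producer side: a source system publishes one JSON message per clinical event.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"patient_id": "12345", "event": "lab_result", "value": 7.2})
producer.flush()

# Consumer side: a downstream job reads the stream and decides whether a
# record belongs in bulk storage (e.g. HDFS) or a random-access store
# (e.g. HBase). The "write" here is just a print to keep the sketch runnable.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    consumer_timeout_ms=5000,   # stop iterating if no new messages arrive
)
for message in consumer:
    record = message.value
    destination = "hbase" if record.get("event") == "lab_result" else "hdfs"
    print(f"routing record for patient {record['patient_id']} to {destination}")
```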
A blog post by Micah Whitacre, a senior software architect on Cerner's Big Data Platforms team, explains how Apache Kafka helped Cerner overcome scalability challenges in the near real-time streaming system and streamline data ingestion from multiple sources, including ones outside Cerner's data centres.
Cerner has also taken steps to ensure the security and data
integrity of its Big Data platform. A technical solution must provide a
mechanism for threat mitigation in order to be considered a viable data
management technology in the healthcare space.
Cloudera advised Cerner on its approach to encrypting data at rest and on its Kerberos (a network authentication protocol for client/server applications) integration.
Mr Edwards emphasised that the focus should be on outcomes rather than on the technology itself.
“Our real aim is to get the technology out of the way so all that users see is the value that comes from their efforts. We really want the focus to be on the outcomes and results, not on what it takes to deliver them. The Cloudera platform is the technology that’s driving the value and it’s allowing us to build applications that help healthcare systems improve how they manage the chronic conditions of their populations. We’re now able to aggregate the information, stratify it, and offer the opportunity to look at this data in a way that has never been possible before,” he said.
Improved insights
The uniqueness of Cerner’s EDH is that it allows data to be
brought together from an almost unlimited number of sources, and that data can
be used to build a far more complete picture of any patient, condition or
trend.
The end result: better use of health resources.
Cerner’s EDH helps its clients understand the most significant risks and opportunities for improvement across a population. Cerner computes quality scores for managing a number of chronic conditions, and analysts can see which conditions would gain the most from improving those scores.
For instance, Cerner can accurately determine the
probability that a person has a bloodstream infection, such as sepsis.
Sepsis is an uncontrolled inflammatory response to an infection. It is a complex condition that is difficult for a junior doctor or nurse to recognise. Once sepsis takes hold, healthcare professionals have only the first six hours after diagnosis to deliver a group of interventions to the patient. These interventions require close and rapid interaction between teams in the Emergency Department, the general ward and Critical Care. For an individual patient, getting the right interventions at the right time may mean a 20-30% better chance of surviving.
Cerner developed an evidence-based algorithm, the St. John Sepsis agent, which allows early intervention for patients at risk and helps prevent deterioration into severe sepsis. The algorithm continuously monitors key clinical indicators and recognises a potentially septic pattern. The integrated system, created in cooperation with Methodist North Hospital in Memphis, Tennessee, gathers patient information from many sources, helping clinicians identify patients at risk for sepsis early and supporting accurate diagnosis and treatment.
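The St. John Sepsis agent itself is proprietary, but rule-based sepsis screening commonly builds on the widely published SIRS criteria (abnormal temperature, heart rate, respiratory rate and white-cell count). The sketch below is a simplified, hypothetical illustration of monitoring against such criteria and is not Cerner's algorithm.

```python
# Hypothetical, simplified illustration of rule-based sepsis screening.
# This is NOT the St. John Sepsis agent; thresholds follow the widely
# published SIRS criteria and are for illustration only.
from dataclasses import dataclass

@dataclass
class Vitals:
    temperature_c: float      # body temperature in degrees Celsius
    heart_rate: int           # beats per minute
    respiratory_rate: int     # breaths per minute
    wbc_per_uL: float         # white blood cell count per microlitre

def sirs_criteria_met(v: Vitals) -> int:
    """Count how many SIRS criteria the current observations meet."""
    count = 0
    if v.temperature_c > 38.0 or v.temperature_c < 36.0:
        count += 1
    if v.heart_rate > 90:
        count += 1
    if v.respiratory_rate > 20:
        count += 1
    if v.wbc_per_uL > 12_000 or v.wbc_per_uL < 4_000:
        count += 1
    return count

def flag_possible_sepsis(v: Vitals, suspected_infection: bool) -> bool:
    """Raise an alert when two or more SIRS criteria coincide with a
    suspected infection, so clinicians can review the patient early."""
    return suspected_infection and sirs_criteria_met(v) >= 2

# Example: an alert would fire for this (hypothetical) patient.
patient = Vitals(temperature_c=38.6, heart_rate=112, respiratory_rate=24, wbc_per_uL=13_500)
print(flag_possible_sepsis(patient, suspected_infection=True))  # True
```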
Ryan Brush, Senior Director and Distinguished Engineer,
Cerner, said, “Our clients are reporting that the new system has actually saved
hundreds of lives by being able to predict if a patient is septic more
effectively than they could before."
Content from a Cloudera customer success story on www.cloudera.com and related resources.
[1] HL7
provides a comprehensive framework and related standards for the exchange,
integration, sharing, and retrieval of electronic health information that supports
clinical practice and the management, delivery and evaluation of health
services.

Dr Gay Jane P. Perez, Deputy Director-General for Space Science and Technology at the Philippine Space Agency (PhilSA), is addressing the country’s growing vulnerability to climate change using space data collected by the agency’s sovereign satellites, curated satellite data sources, ground stations, and high-performance computing systems. She noted that the data is readily accessible and the technical capacity to house it exists, but the challenge lies in translating it into useful insights.
To realise the economic value, we must maximise what we can get from space data, such as maps, forecasts, and advisories that serve as an impetus to actionable insights that benefit our end-users, such as our fishermen or farmers.
– Dr Gay Jane P. Perez, Deputy Director-General, Space Science and Technology, Philippine Space Agency
An important part of space data mobilisation is a needs assessment, which is being done right now through the 2021–2030 Decadal Survey. This survey has brought together scientists and people from the public and private sectors in the country to identify the most important challenges and goals in Earth observation and other satellite applications for the next ten years.
The results of the Decadal Survey will be used to determine the country’s most important and urgent problems and the most important and urgent space missions to address them. The survey focuses on six categories:
- Hydrologic Cycles and Climate Studies
- Weather, Air Quality, and Atmospheric Processes
- Earth Surface and Interior: Dynamics and Processes
- Hazards and Disaster Risk Reduction and Management
- Aquatic Ecosystems and Water Resources Management
- Terrestrial Ecosystems and Land Resources Management
Furthermore, PhilSA runs programmes that directly improve the capability to utilise Earth Observation (EO) data by analysing user requirements and existing capabilities.
ISKUELA, or Inclusive SSTA Know-how Utilisation, Exchange, and Localisation Activities, is one of the initiatives PhilSA has implemented to utilise and cascade EO data. ISKUELA comprises numerous initiatives and activities designed to educate and strengthen the capacity of partners from academia, industry, media, communities, and the public sector to use space data for their own purposes.
The Space Information Infrastructure Capacity Building and Training Programme is one of these projects, with the goal of increasing awareness and understanding of space information infrastructures and their applications through webinars, short courses or even resource person support. The project has successfully held two webinars, with students, researchers, and representatives from various local government units participating.
PhilSA also hosts maps derived from space data via the Space Data Dashboard, which was developed collaboratively by PhilSA, DOST-ASTI, and the STAMINA4Space Programme. The dashboard includes publicly available satellite data maps of ship traffic, air quality, water quality, and night lights.
The agency hosts an annual Space Data Dashboard Media Workshop to help the public understand these open maps. Journalists are taught to navigate and produce articles using space data. PhilSA hopes to expand the workshop to include training for campus journalists.
There are still gaps in how space data is used to act on climate change. One way to close them is to improve how space science is taught and communicated. It is also important to reach out to a wide range of people and groups to demonstrate how information from space can help in dealing with environmental uncertainties.
The Department of Science and Technology (DOST) of the Philippines is now hiring more researchers, scientists, and engineers to assist the government and industry in making and implementing more science-based decisions and policies.
COVID-19 established the need for Science and Technology (S&T) Fellows, but the Philippines had been using science, technology, and innovation as decision-making inputs even prior to the pandemic, according to Rowena Cristina L. Guevara, Undersecretary, DOST.
Recognising the potential threats humanity may face in the coming years, the DOST deemed it vital to implement strategic measures that would ensure a stable and sustainable supply of Filipino researchers, scientists, and engineers. Guevara indicated that the DOST research and development (R&D) section will be staffed by master’s and doctoral degree holders on five-year contracts under the DOST S&T Fellows programme.
It has been over a year since the concept of increasing DOST agencies’ workforces with MS and PhD graduates from various S&T specialisations became a reality. To commemorate this momentous occasion, DOST hosted its first S&T Fellows Convention. DOST cited that it is in the hearts of Filipino scientists to share their knowledge and experience with the country, regardless of the money offered.
This initiative is not new in nations that strive to be leaders in technological growth and advancement. Around 51,000 students are receiving financial assistance from the DOST through the Science Education Institute (SEI), which administers all scholarship programmes at the undergraduate and graduate levels. The department also has 4,308 students enrolled in graduate programmes and 1,550 students working on their PhDs.
In its search for outstanding individuals with backgrounds in science and technology from all over the world, the DOST is looking for exactly such academics. There are now a total of 32 S&T Fellows whose services are being utilised by various departments and agencies within the DOST. In general, the S&T Fellows programme offers researchers, scientists, and engineers the chance to participate in important work within the country while receiving income on par with similar programmes.
There is more to the work that the S&T Fellows conduct than simply the results and outcomes of research and development. They put in a lot of hours working in laboratories, coming up with ideas for road plans, analysing materials, and conducting experiments; the results of their effort will be beneficial to hundreds of thousands, if not millions, of Filipinos. Furthermore, the National Academy of Science and Technology of the Philippines (NAST PHL) recently held the Grand Launching of PAGTANAW 2050.
DOST is optimistic about the institutionalisation of PAGTANAW 2050, as it is a long-term policy instrument that requires regular review and updating to remain relevant to the times. Aside from its review, there is also a need to continuously fund the initiative, which extends beyond term limits and administrative appointments.
The initiative is a significant step toward designing and implementing integrated yet time-specific strategies for a prosperous, inclusive, and agile Philippine future in which the shared vision of the Philippines as a Prosperous, Archipelagic, and Maritime Nation can only be realised by diplomatically asserting rights over marine resources.
The foresight will provide clearer direction on developing enabling mechanisms to further accelerate scientific growth and innovation that will serve the nation’s interests and benefit the Filipino people.
Nanyang Technological University, Singapore (NTU, Singapore) climate scientists have extended the known record of Singapore’s sea level to almost 10,000 years ago, giving a more solid dataset to improve future sea-level rise projections.
This more refined sea-level record also has wider implications. For instance, it would lead to more robust and accurate local projection of sea-level rise, offering a strategic guide for Singapore as it moves to adapt to climate change.
– Dr Stephen Chua, Lead Author
Stephen added that by dating the Singapore sea-level record to 10,000 years ago, the team retrieved crucial new information from the early Holocene, a period of rapid sea-level rise that has remained poorly understood until now. Reconstructing sea-level history over thousands of years is one of the most difficult aspects of studying climate change; to better understand the possible causes and repercussions of future developments, scientists must study and comprehend the past.
An international team led by NTU researchers extracted ancient sediments from up to 40 m underground at a site in Singapore’s Marina South. The samples were then subjected to rigorous laboratory methods such as identifying microfossils like foraminifera and statistical analysis to reconstruct Singapore’s sea-level history.
The longer the sea-level record goes back in time, the clearer the picture becomes for future predictions, according to climate scientists. The Holocene transition (10,000-7,000 years ago) was the last major episode of natural global warming in Earth’s history, with melting ice sheets and rising oceans producing a 20-metre rise in sea level. Before the recent increase in the twentieth century due to climate change, the sea level in Singapore had been stable for the last 3,000 years.
Researchers believe that this is the type of crucial information needed to plan adaptation measures effectively in the face of ongoing sea-level rise due to global warming. The team chose Marina South for its investigations because creating an accurate ancient sea-level record required extracting sediment from an ‘ideal’ site with deposits such as marine mud and mangrove peats.
Sea-level rise is a potentially disastrous outcome of climate change, as rising temperatures melt ice sheets and warm ocean waters. Scenarios of future rise are dependent upon understanding the response of sea level to climate changes. Accurate estimates of past sea-level variability in Singapore provide a context for such projections.
– Professor Benjamin Horton, Co-author & Director, Earth Observatory of Singapore
Singapore’s coastal defence plan against rising sea levels will benefit from the findings. The study also discovered the first clear evidence that mangroves only existed for roughly 300 years in the Marina South area before succumbing to flooding caused by rising sea levels at the time.
Researchers discovered abundant mangrove pollen at a depth of 20 metres below contemporary sea level, indicating that a mangrove shoreline existed in southern Singapore nearly 10,000 years ago. According to the NTU findings, sea-level rise during that time was as much as 10-15 mm per year, which likely contributed to the demise of the mangroves.
The findings are useful for present and future adaptation strategies in Singapore, as the island nation seeks to move beyond engineering solutions and use natural approaches to protect its shoreline.
Despite their adaptability and usefulness as coastal defence, mangroves have limitations in the event of a fast sea-level rise, according to the study. This research backs up a previous study co-authored by NTU that found mangroves will perish if sea levels rise faster than 7 mm per year under a high carbon emissions scenario.
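Placing the article’s two figures side by side shows why the ancient Marina South mangroves drowned: even the lower end of the reconstructed early-Holocene rise rate exceeds the survival threshold cited above. The snippet below simply performs that comparison.

```python
# Simple comparison of the rates quoted in the article.
reconstructed_rise_mm_per_yr = (10, 15)   # early-Holocene rise at Marina South
mangrove_threshold_mm_per_yr = 7          # rate above which mangroves perish

exceeds = all(rate > mangrove_threshold_mm_per_yr for rate in reconstructed_rise_mm_per_yr)
print(exceeds)  # True: even the lower estimate exceeds the survival threshold
```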
The sea-level change was modelled without deglaciation, meltwater discharge, and other considerations. This important systematic contribution from Singapore and the vicinity spans the post-glacial Holocene period, allowing a broad pattern of sea-level change to be established.
The Earth Observatory of Singapore (EOS) will lead the collaboration between the Nanyang Technological University (NTU Singapore) and Singapore Land Authority (SLA) in using Global Navigation Satellite System (GNSS) data for scientific studies.
The EOS, as the NTU Singapore Research Centre of Excellence, provides researchers with access to GNSS data collected by SLA’s Singapore Satellite Positioning Reference Network (SiReNT), as well as to its archived historical data.
Leveraging NTU’s strengths in areas such as sustainability and earth sciences, this collaboration also provides us with valuable data to contextualise more accurate projections to augment Singapore’s climate change response.
– Associate Professor Emma Hill, Acting Chair, Asian School of the Environment and Principal Investigator, Earth Observatory of Singapore
Hill acknowledged that Singapore’s historical GNSS data is very important for understanding how the land and coast have changed over time. Using precise positioning technology like SLA’s SiReNT can help with more than just positioning and mapping; it can also open many new ways to deal with the increasingly complicated problems caused by climate and environmental change.
With the combined knowledge of SLA and EOS, they want to use the rich historical data to co-create solutions for a new era of predicting and preparing for coastal and land changes to help Singapore deal with and lessen the effects of climate change.
The collaboration between NTU and SLA supports the university’s NTU 2025 strategic plan, which aspires to address humanity’s grand challenges in sustainability and accelerate the translation of academic discoveries into solutions that lessen the human impact on the environment.
Together with EOS’s development of new coastal GNSS reference stations in Singapore, this will enable research into more accurate methods of measuring land height and sea-level changes around the country, as well as the effect of the atmosphere on the weather and climate on the island nation.
GNSS refers to various satellite navigation systems, including the well-known Global Positioning System (GPS), which can be used by systems such as SLA’s SiReNT to produce precise positioning data with a 3 cm accuracy.
The NTU-SLA agreement will establish four-year cooperation that will contribute to the Centre for Climate Research Singapore’s National Sea Level Programme (NSLP), which is supported by the National Research Foundation and the National Environment Agency.
Furthermore, over the duration of the collaboration, EOS will analyse historical GNSS data provided by SLA to determine how land height has changed at specific locations. This will increase the accuracy of elevation measurements generated from Interferometric Synthetic Aperture Radar (InSAR), the technique currently employed by NTU to map ground deformation over Singapore and other cities in the region.
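Estimating vertical land motion from archived GNSS data typically reduces to fitting a trend to a station’s daily height series. The sketch below shows that step on a synthetic series; the rate and noise values are illustrative and do not come from SiReNT data, and a real analysis would also model offsets, seasonal signals and correlated noise.

```python
# Minimal sketch: estimate vertical land motion from a GNSS height time series.
# The data here are synthetic; real analysis would use SiReNT daily solutions
# and account for offsets, seasonal signals and noise models.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2010.0, 2022.0, 1 / 365.25)          # ~12 years of daily epochs
true_rate_mm_per_yr = -2.0                              # hypothetical subsidence
heights_mm = true_rate_mm_per_yr * (years - years[0]) + rng.normal(0, 3, years.size)

# Least-squares linear fit: the slope is the vertical velocity in mm/yr.
rate, intercept = np.polyfit(years, heights_mm, 1)
print(f"estimated vertical rate: {rate:.2f} mm/yr")
```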
In Singapore, EOS and SLA will deploy up to four additional coastal GNSS stations for data gathering to develop innovative approaches for monitoring both land height and sea-level changes. These stations will be incorporated into the SiReNT infrastructure and services to maximise resource utilisation, and existing SiReNT station data will also be used to support this goal.
Simultaneously, EOS will study unique ways to use data from existing GNSS, such as investigating the amount of water vapour in the atmosphere. By characterising the atmospheric processes that impact Singapore at different timeframes, scientists may determine where and when localised weather systems are likely to cause heavy precipitation.
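In GNSS meteorology, the water-vapour content is usually obtained by converting the zenith wet delay (ZWD) estimated in the GNSS solution into precipitable water vapour (PWV) using a dimensionless factor that depends on the atmosphere’s weighted mean temperature. The sketch below applies the standard Bevis-style conversion; the physical constants are commonly cited values and the input numbers are illustrative.

```python
# Hedged sketch: convert GNSS zenith wet delay (ZWD) to precipitable water
# vapour (PWV) using the standard Bevis-style conversion factor. Constants
# are commonly cited values; the input ZWD and mean temperature are illustrative.

RHO_W = 1000.0        # density of liquid water, kg/m^3
R_V = 461.5           # specific gas constant of water vapour, J/(kg*K)
K2_PRIME = 0.221      # refractivity constant, K/Pa  (22.1 K/hPa)
K3 = 3739.0           # refractivity constant, K^2/Pa (3.739e5 K^2/hPa)

def zwd_to_pwv(zwd_m: float, tm_kelvin: float) -> float:
    """Return precipitable water vapour (m) from zenith wet delay (m),
    given the weighted mean temperature of the atmosphere Tm (K)."""
    pi_factor = 1e6 / (RHO_W * R_V * (K3 / tm_kelvin + K2_PRIME))
    return pi_factor * zwd_m

# Example: a 0.25 m wet delay in a tropical atmosphere (Tm ~ 280 K)
# corresponds to roughly 40 mm of precipitable water vapour.
print(f"{zwd_to_pwv(0.25, 280.0) * 1000:.1f} mm")
```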
EOS researchers will also attempt to employ GNSS data in regional meteorological studies. By comparing and analysing GNSS and meteorological data in detail, the scientists hope to gain a deeper understanding of precipitation and extreme weather events.
The rate of cloud adoption has accelerated because of the pandemic with various industries throughout the world transitioning to remote working and practically everything going digital. In this environment, the hybrid cloud, which offers enterprises the best of both worlds because of its inherent flexibility, agility and efficiencies, is foundational to success in the new normal.
Demand for infrastructure has impacted cloud services and hybrid cloud use is amping up significantly. A hybrid is a blend of public and private cloud platforms, delivering the best of both. Without a doubt, as technological barriers between the platforms decrease, hybrid cloud adoption will expand.
The OpenGovLive! Virtual Breakfast Insight held on 08 June 2022 focused on adopting a secure government-commercial cloud policy to hyper-scale digitalisation that results in greater levels of efficiency in public service.
The New Normal of Hybrid Work

Starting off the discussion, Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, is convinced that hybrid work will be the new normal for most industries. This will allow employees to split their time between working in the office and working from home.
All technologies should be cloud-native, according to him, allowing businesses to design and deploy scalable applications in modern, dynamic settings including public, private and hybrid clouds. Cloud-native software is designed and engineered to take advantage of the cloud’s scale, elasticity, robustness and adaptability.
“The pandemic impacted office culture quickly. Global lockdown and travel bans have changed work and corporate-government interactions. People have learned they can perform most things remotely,” observes Mohit.
Cloud-native solutions let enterprises design and run scalable applications in public, private and hybrid clouds – a huge advantage as both the private and public sectors attempt to save money by embracing technology.
To maintain public services, government organisations must keep expenses low while collecting, storing, updating and protecting data. Organisational leaders, especially in the government, must decide which culture changes to keep and which to combat as they adjust to the new normal and prepare for recovery.
Setting an example for the nation, the Government of Thailand recently looked at bolstering its tourist industry and enacted policies to encourage international travellers. The measures are promising and the World Bank predicts that the Thai economy will come back to pre-pandemic levels by the end of 2022.
Towards an Open Hybrid Cloud Future

Supannee Amnajmongkol, Country Manager, Red Hat Thailand, spoke next on the future of the open hybrid cloud.
As society works to cope with the impact of COVID-19, whether in the public or private sectors, the need to adapt is growing, and innovation in the new normal has never been more important than today.
“Innovation remains the number one priority in the new normal,” according to Supannee. “Open-source communities are the innovation engine as the private sectors and even the government can use open-source viewpoints and collaboration to accelerate cloud innovation and deliver better products and services faster.”
Supannee added that the developers have designed a streamlined process to offer any software, anyplace, utilising best-of-breed services regardless of location.
The primary distinction between cloud and on-premises software, she believes, is where it is physically located. On-premises software is installed on the organisation’s computers and servers, whereas cloud software is hosted on the vendor’s server and accessed via a web browser.
Supannee acknowledges that hybrid is the reality for teams and IT environments, whereas the uniformity of environments is desired by operators to reduce operational complexity, lower costs and facilitate cross-environment interoperability. They require visibility throughout the full IT footprint of the organisation.
She also discussed the difference between the private and local clouds. Private clouds, often known as data centres, are hosted on a company’s infrastructure and are typically firewalled and physically secured. A local cloud is a dedicated cloud service that runs on-premises and is managed by a third party.
Most companies use more than one cloud because each cloud has its own features and benefits to offer. Cyber-security, however, is a must whichever cloud service provider is chosen and used. She added that a cloud service partner with a proven track record of successful cloud migrations guarantees success. Red Hat, the leading provider of enterprise open-source software solutions, makes an ideal partner.
Using ICT Capabilities to Transform the Public Sector

Janek Rozov, Head of Strategy at the Information Technology and Development Centre, Ministry of Interior, Estonia, elaborated on the uncontrolled and controlled cloud in the government sector.
The design of both individual services and public sector organisations has shaped the country’s innovation and digitalisation policies. As digital technology becomes more extensively adopted, public service providers face increasing demands and expectations.
Firms should not entrust their access management and encryption to cloud service providers. This limits cloud service provider lock-in and preserves more control over who has access to which data.
If data is encrypted, unauthorised individuals can access only ciphertext. Not all data is sensitive, however, and risk-based encryption requires data classification. Visibility of unauthorised cloud usage, such as shadow IT and unsanctioned IT, is therefore essential for establishing security controls or offering end-users and business units secure alternatives.
Governments have been compelled to maximise the value of becoming digital, with benefits ranging from increased efficiency to transparency. Janek feels it is essential for governments to keep in mind that they cannot replicate digital processes, but they can replicate the principles of how they might improve public procedures and lives in the face of the rapid digital shift.
Governments and agencies must determine how diverse social processes occur and how they are interconnected. After mapping the current level of public service supply, the institutional framework, and the cultural context, digitisation could create further benefits.
It is anticipated that citizens will demand and expect more from the public sector as services continue to shift to the digital platform. It may seem that governments are forced to play a never-ending game of catch-up, but efficiencies and transparency are improved over time.
As with most problem-solving, the first step is to separate the means from the goals. Before digitalisation moves forward, the government needs to figure out what people need. To get more people to use digital services, governments should avoid repeating complicated steps that slow down adoption.
Interactive Discussion
Following the informative presentations, delegates engaged in participatory dialogues aided by polling questions. This is intended to provide participants with professional learning and growth through live audience interaction, participation, and sharing of real-life experiences.
Delegates had the opportunity to learn from subject matter experts, share their experiences, and take methods back to their organisations.
The first poll asked the delegates what their organisation’s biggest challenge in digital transformation was. The majority (72%) went with people and skillsets, while others opted for scalability (21%) or a common framework and platform (7%).
On being asked what cloud strategy delegates are interested in implementing, an overwhelming majority indicated hybrid cloud (72%). The remaining were evenly split between private cloud (14%) and multi-cloud (14%).
Inquiring about delegates’ top consideration in adopting multi-cloud, just under a third (30%) agreed that it was data sovereignty and residency. The remainder were equally divided across cost optimisation (20%), tools and services available on the new cloud (20%) and inter-communication and workload portability among the clouds (20%).
Looking to see how delegates characterise their current stage of digital transformation in their respective organisations, well over half (57%) said that planning was conducted but there were delays in implementation and execution. About a quarter (23%) opted for pilot projects rolled out successfully and just under a fifth (19%) chose full-scale implementation of more than one program or project.
Asked how the delegates leverage data between other public sector agencies, about half (52%) said they used API Management, while the rest were equally split between inability due to security constraints (21%) and data integration (21%).
Janek shared that point-to-point integration is still an option but does not offer the same characteristics as API management. He added that it is advisable to use third-party infrastructure to secure the data.
Queried on the key value and driver of a public sector cloud offering, about 40% went with standardisation and governance, followed by the total cost of ownership and price for investment (36%) and security (22%).
Ongkarn Nakprada, Head of Research and Development, National Intelligence Agency, said that they use the public cloud because the government provides it for free and it is very efficient for personnel as well. Pongpan Itsuwan, Chief of Computer Operation Section, Expressway Authority of Thailand, shared that they prioritise privacy and personal information security.
On their plans to develop new applications and modernise their legacy applications, just under half (47%) opted to outsource, while others chose to re-write (26%) or adopt SaaS (17%).
Siwat Panyachaiwatthanakool, Chief of Transport System Research and Development Section, Expressway Authority of Thailand, felt that a new company or organisation should outsource its platforms to ensure sound security protocols.

Inquiring as to what external assistance delegates think is needed most to accelerate their digital transformation journey, most (54%) felt that a mindset change and new ways of working were most important. Others opted for building a framework and standard platform (31%) or agile integration (9%).
Conclusion
The Breakfast Insight concluded with remarks from Christopher Tan, Global Partner Revenue Acceleration Director, APJ, Intel, who believes that digital ecosystems are frequently created and rapidly influencing change in a variety of sectors, including government.
Christopher urged the delegates to invest in their technology, as that is one of the most crucial elements to the ecosystem’s success. He was confident that Intel had innovative solutions for all their digital concerns.
Whether a company wants to safeguard a brand, intellectual capital and customer information, or provide controls for vital infrastructure, the tools for incident detection and response that defend organisational interests consist of people, procedures and technology.
Data security refers to the process of securing data throughout its lifecycle from illegal access and corruption. Globally, businesses and even the government sector are investing extensively in information technology (IT) cyber security skills to safeguard their most valuable assets.
He encouraged the delegates to ensure that their cloud is fully set up by working with trusted partners. Cloud and cloud service providers like Red Hat make adding and changing services straightforward without having to add or remove infrastructure. Thus, there is no need to worry about restricted resources, buying and housing servers and hardware, updating software or data protection.
“Thailand has incredible tourist destinations and production will go up in the manufacturing industry; so every aspect of government service must be fully functional,” Mohit concludes.
Thailand is on a massive digital transformation journey, Mohit acknowledges, and he reminds delegates that there are people who are willing to help them. He invited them to reach out to the session’s resource persons and experts to explore how to best progress on their transformation journeys.
The Ministry of Electronics and Information Technology (MeitY) has issued a draft National Data Governance Framework to mobilise citizen non-personal data for use by public and private entities in a bid to improve services.
The draft policy proposes launching a non-personal data-based India datasets programme. It also addresses the methods and rules to ensure that non-personal and anonymised data from both the government and private entities are safely accessible by the research and innovation ecosystem.
The Minister of State (MoS) for Electronics and Information Technology, Rajeev Chandrasekhar, stated that the National Data Governance Framework will appeal to artificial intelligence (AI) startups, AI research entities, and government departments. He called it an important piece of policy framework that will help the country achieve its target to be a US$1 trillion digital economy. The policy will apply to all government departments and entities. Its rules and standards will be applicable to all data collected and managed by any government entity.
The framework will also accelerate the digitisation of government operations. Currently, digital government data is stored, managed, and accessed in differing and unpredictable ways across different government entities, weakening the efficacy of data-driven governance and preventing an innovative and seamless ecosystem of data science, analytics, and AI. According to the draft, the power of data must be harnessed for more effective digital governance and innovation.
State governments will be encouraged to adopt the provisions of the policy and its rules, standards, and protocols where appropriate. The framework also proposes establishing an India Data Management Office (IDMO) under MeitY’s Digital India Corporation, which will develop rules and guidelines. It will be responsible for framing, managing, and periodically reviewing and revising the policy. The draft published on MeitY’s website is open to comments by stakeholders. The last date to submit comments is 11 June.
The rise of data and digital technologies is rapidly transforming economies and societies, with significant implications for governments’ daily operations. The Indian government has said it believes in transparent and accessible public systems that rely on technology-based infrastructure and data-driven decision-making. Recently, the country’s policy commission, the National Institution for Transforming India (NITI Aayog), launched the National Data and Analytics Platform (NDAP). It aims to democratise access to public government data by making information interoperable, interactive, and available on a user-friendly platform.
As OpenGov Asia reported, NDAP hosts foundational datasets from various government agencies and provides tools for analytics and visualisation. All datasets on the platform can be downloaded and merged freely.
NDAP follows a use-case-based approach to ensure that the datasets hosted on the platform are tailored to the needs of data users from government, academia, journalism, civil society, and the private sector. All datasets are standardised to a common schema, which makes it easy to merge datasets. A key feature of NDAP is that it makes foundational datasets interoperable with each other, enabling easy cross-sectoral analysis. The platform has datasets from sectors like agriculture, power and natural resources, transport, housing, finance, health, tourism, science and technology, communications, and industries.
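Standardising datasets to a common schema is what makes the cross-sectoral analysis described above straightforward. The sketch below illustrates the general idea with two tiny, made-up tables keyed on a shared district identifier; the column names and values are hypothetical and do not reflect NDAP’s actual schema.

```python
# Illustrative sketch of merging two datasets that share a common key,
# as a standardised schema allows. The data and column names are made up
# and do not reflect NDAP's actual schema.
import pandas as pd

health = pd.DataFrame({
    "district_code": ["D01", "D02", "D03"],
    "clinics_per_100k": [12.4, 8.1, 15.0],
})
agriculture = pd.DataFrame({
    "district_code": ["D01", "D02", "D03"],
    "irrigated_area_pct": [34.0, 51.2, 27.5],
})

# Because both tables use the same district identifier, a cross-sectoral
# view is a simple join.
combined = health.merge(agriculture, on="district_code", how="inner")
print(combined)
```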
The linkage techniques provide an increased level of security because they operate on encrypted data, which means there is no requirement to release information that could potentially identify an individual.
Under a new Memorandum of Understanding, the Curtin Centre for Data Linkage will work with the Department of Health to link GP, hospital, pathology and non-health data records with analytical tools to support the individual-centred research and service evaluation necessary to improve health outcomes at both a state and national level.
Professor Gavin Pereira, from the Curtin Health Research and Data Analytics Hub at Curtin University, said the new privacy-preserving record linkage methods had the potential to provide new individual-level data for research discovery and to inform government services, policies and programmes. It will now be possible to study a person’s interactions with the health system and overlay big data analytics with the ultimate aim of new research discoveries.
The linkage of big data has traditionally relied on matching personally identifiable information. Although this improves the ability to investigate individual health outcomes and provide personalised health care, there remain concerns pertaining to privacy as the matching requires the exchange of identifiable information.
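Privacy-preserving record linkage is often implemented by encoding personal identifiers into Bloom filters and comparing only the encodings, never the identifiers themselves. The sketch below shows that general technique; it is a simplified illustration, not the Centre for Data Linkage’s actual method, and production systems add keyed hashing, salting and parameters agreed between the linking parties.

```python
# Hedged sketch of Bloom-filter encoding for privacy-preserving record linkage.
# This illustrates the general technique only; it is not the Curtin Centre
# for Data Linkage's actual method, and real systems use keyed hashing,
# salting and parameters agreed between the linking parties.
import hashlib

FILTER_BITS = 256   # size of the Bloom filter
NUM_HASHES = 4      # hash functions applied to each bigram

def bigrams(value: str) -> list[str]:
    v = value.lower().strip()
    return [v[i:i + 2] for i in range(len(v) - 1)]

def encode(value: str) -> set[int]:
    """Encode a string as the set of Bloom-filter bit positions it sets."""
    bits = set()
    for gram in bigrams(value):
        for seed in range(NUM_HASHES):
            digest = hashlib.sha256(f"{seed}:{gram}".encode()).hexdigest()
            bits.add(int(digest, 16) % FILTER_BITS)
    return bits

def dice_similarity(a: set[int], b: set[int]) -> float:
    """Dice coefficient between two encodings: 1.0 means identical bit patterns."""
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

# Two slightly different spellings of the same name still score highly,
# while only the encodings (never the names) need to be exchanged.
print(dice_similarity(encode("Catherine Smith"), encode("Katherine Smith")))
print(dice_similarity(encode("Catherine Smith"), encode("John Citizen")))
```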
The professor added that the COVID-19 pandemic increased awareness that health cannot be solely attributable to clinical care. Health is a state of complete physical, mental and social wellbeing. It is not just about genetics, diet and exercise. “Each life activity is a transaction that generates new data, which brings both new opportunities to gain a more holistic understanding of health as well as challenges to overcome.”
The Curtin Centre for Data Linkage has developed a means to connect data across general practices, hospitals, registries, and government departments and yet also preserve privacy. The Department of Health’s Director of Data and Information Systems said WA had a long history of data linkage activities for medical research and health service planning. A collaborative effort by the department, Curtin University, The University of WA and Telethon Kids Institute led to the establishment of the internationally recognised WA Data Linkage System (WADLS).
Professor Pereira noted that the WADLS platform is housed and managed by the Department of Health. The infrastructure has contributed to several improvements to health policy and care in WA and has also supported hundreds of research projects.
The government recognises and supports the significant value of linked data for purposes that extend well beyond the historic usage in medical research. This has been reflected in significant investments in data skills and functions, enabling legislation, and partnerships with universities and industry.
Supporting the use of data linkage services in WA has led to better decision-making and high-quality healthcare throughout the state, which has been showcased through the State’s COVID-19 pandemic response.
Further, the collaboration provides an opportunity for the Department of Health to effectively leverage the data skills and expertise available within Curtin University to support digital innovation and business transformation throughout the WA health system, using customer-focused processes and emerging technologies in direct alignment with the strategic priorities of the Digital Strategy for the Western Australian Government 2021-2025 and the Sustainable Health Review 2019.
Using privacy-preserving record linkage models, in collaboration with Curtin University’s Centre for Data Linkage, provides a safe and effective approach to integrating data from different sources, consistent with the Australian Privacy Principles and the Department of Health’s strong commitment to ensuring information is appropriately protected from misuse or inappropriate disclosure.