During Strata + Hadoop World, OpenGov sat down with Mr. Doug Cutting, Founder of Hadoop and current Chief Architect at Cloudera, to discuss how he never imagined what Hadoop would produce, some of the most heartwarming examples of data analytics use, and what we must do to combat the growing skills gap in the industry.
With this exclusive opportunity to sit down with the man behind Hadoop, who is as humble as he is intelligent, we had a lot of questions to ask.
How did you start this journey?
I got in by chance as much as anything. I needed a job, liked programming, and landed a job looking into search problems. I knew some people who hired me to invent some things, and I developed an understanding of how to cope with large amounts of data. Then, I subsequently got involved in open source.
When Google published some papers about the way they were doing things internally, managing their data systems, I had the experience to see that they were really better methods. I had been working on building search engines myself and I knew this was a big step up from what I was doing. I had enough experience with open source to recognize that if these methods were available as open source they would probably be widely adopted.
So I put two and two together, and started implementing these as open source, which became Hadoop a year later. There wasn’t really a grand design, I just happened to be the right guy with the right information at the right time.
What did you imagine Hadoop would become?
I never imagined Hadoop would be what it is now. I had grown up in a world where enterprise software was very different from what researchers and websites used. There is a different universe of software development and style of building systems. Enterprises are based on relational databases running on big iron (mainframes), while researchers tended to have PCs and workstations.
I think I never contemplated that the two worlds might merge. I thought that I could build some technology that would have a great impact on the research and web sphere, but that it would not likely leave that space. Now we have seen that enterprises are really adopting open source, adopting Unix, to a much higher degree than I ever would have guessed.
In retrospect, I guess this is not so surprising: if you look at Moore’s Law, technology is pervading every industry. Data is permitting institutions to better understand themselves, their users, and their context, and to improve. Now we really see that data is driving growth in almost every industry. I am very pleased to see that something I worked on is enjoying so much use, but this was not my original plan.
There is a growing skills gap in this industry, how would you propose to address this issue?
The adoption of the technology can’t grow any faster than there are people to use it. We are seeing that as a limiting factor for growth. We also see institutions have a lot of other reasons for not adopting new technology. Institutions may evolve slowly and to adopt this platform requires a lot of change, in many cases. Especially cultural change.
So far, all of those things have been paced together: the rate at which new people are learning and the rate at which culture and institutions are changing to adopt these new technologies. In some ways, we do not want it to be too fast, or we fall on our faces. Having a moderate pace is great.
It is important that people get more trained on this. Cloudera has a program to work with universities. We provide a curriculum so that they can teach students, and they come out of college familiar with these techniques. We are working with over 100 universities worldwide and eager to add more to this program.
These days, people are starting to learn about these new technologies anyway. It is the technology that people are becoming familiar with, so to some degree it is generational. Some people will learn new technologies in the course of their careers and some people won’t. But the next generation will have those skills.
I do not think this is a fatal problem; I think public institutions will find people, although it may be more difficult. I don’t think it is a unique problem for these new technologies, but we are working as best we can and offering training from the very beginning as a strategy to help the technology spread. Cloudera has helped over 40,000 people so far in using these tools.
How will the public sector get more organisations on board with open data initiatives?
I think it is really important that organisations have buy-in from the top down to tell them that data is really valuable and can really help them improve.
They need to start taking advantage of it, thinking about it, planning around it, and think about the policies about data. What are the ethics for appropriate use of data? For private and public organisations? How can they make people trust them?
Do you think that this top-down approach is best?
It is essential that you have that; it is necessary but not sufficient. You also need people who are familiar with these technologies and who understand them. I think it helps so much, coming from the top. Everyone I have met here in Singapore, and from neighboring countries, seems to understand this and take it very seriously.
Are these open data initiatives and policies integral to the success of Smart City programmes?
It is hard to say that but it is certainly the smart way to operate. If you are trying to build a Smart City or Smart Nation, you want to try and take as many advantages as you can. When a government operates openly, it operates more efficiently and more effectively.
I think it is very important that governments open up all of their processes as governments. It also permits more value to be extracted from the data when it is open. It allows processes to improve, with help from the private sector.
What would you advise organisations who hesitate to open up their data?
Security is a technical problem, I think, and it now goes hand in hand with Hadoop. There are facilities where you can keep your data encrypted at all times and control who can see what.
It can be harder with an open government initiative to decide issues about privacy. What can you publish? Because when you are operating openly, you are intentionally disabling a certain amount of security. You would like most data that you publish to be anonymous, because you do not want to reveal private details of someone’s life. But oftentimes these things just leak out.
While you want to publish data, you want to protect identities. There are a variety of ways to do this: you can anonymise the data; you can aggregate it and provide information only about groups rather than individuals; or you can use legal controls to prevent sharing with the public. In the last case, the data would be provided to any person or institution that agrees to follow certain rules and be audited to ensure they comply with those rules.
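The group-aggregation approach can be sketched in a few lines. This is a minimal illustration only, not any agency's actual release process; the records, the attribute names, and the minimum group size of five are all invented for the example:

```python
from collections import Counter

# Hypothetical record set: each tuple is (age_band, district) for one individual.
records = [
    ("30-39", "North"), ("30-39", "North"), ("30-39", "North"),
    ("30-39", "North"), ("30-39", "North"),
    ("40-49", "South"), ("40-49", "South"),
]

K = 5  # minimum group size before a count may be published (illustrative threshold)

def publishable_counts(rows, k=K):
    """Aggregate individual records into group counts, suppressing any
    group smaller than k so individuals cannot be singled out."""
    counts = Counter(rows)
    return {group: n for group, n in counts.items() if n >= k}

print(publishable_counts(records))
# The ("40-49", "South") group has only 2 members, so it is suppressed.
```

Suppressing small groups is the simplest of the safeguards mentioned; real releases typically layer on further techniques, since even aggregated counts can leak information when combined with other datasets.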
What is the most interesting story you have heard about people using this technology you created?
One project I was very impressed by was at a children’s hospital in Atlanta. They were gathering data from the neonatal ICU, with premature babies. The way they used this data was not for a big ambitious project; instead, they just gathered all of the data and asked the nurses what they would like to know.
The nurses had various questions about how quickly the baby’s vitals returned to normal after different procedures. Then, they would try to modify the way they would do these procedures, in order to have less detrimental impact on these children.
It was a really neat study to see: the technology that I had worked on, helping at this children’s hospital. This was something I could see in person.
Caterpillar Tractor, on the other hand, has huge machines working all over the world. They transmit readings 60 times per second, from hundreds of sensors, back to Peoria. Caterpillar can then analyse how these products are being used, detect when they might run into a problem, and do maintenance before it becomes a problem.
When I started working in software, I never would have guessed that I would be working on software which would be used in either of these sorts of situations. It is very exciting to see these things.
The New Zealand government has launched a set of standards designed to act as a guideline for government agencies on how to use algorithms.
The new Algorithm Charter is the first of its type. According to New Zealand’s Minister of Statistics, the charter will help to improve data transparency and accountability.
The Algorithm Charter for Aotearoa New Zealand demonstrates a commitment to ensuring New Zealanders have confidence in how government agencies use algorithms. The charter is one of many ways that government demonstrates transparency and accountability in the use of data.
This is most notably observed when algorithms are being used to process and interpret large amounts of data.
Using algorithms for data analysis and decision-making is a risky task. The charter will help determine whether the algorithms are being used in a fair, ethical, and transparent way.
So far, 21 agencies have signed the charter, including the Department of Corrections, the Ministry of Education and the Ministry for the Environment.
In it, departments pledge to be publicly transparent about how decision-making is driven by algorithms, including giving “plain English” explanations, and to make available information about the processes used and how data is stored, unless forbidden by law.
By signing the charter, these agencies have agreed to commit to a range of measures, such as explaining how decisions are informed by algorithms; making sure data is fit for purpose by identifying and managing biases; and ensuring that privacy, ethics and human rights are maintained.
The development of the charter was recommended by the New Zealand government chief data steward and chief digital officer who said that safe and effective use of operational algorithms required more attention and greater consistency across the New Zealand government.
The recommendation came after calls to the New Zealand government to explain how government agencies were using algorithms to analyse data.
There were claims that New Zealand government agencies were potentially using citizen data collected through the country’s visa application process to determine who was breaching their visa conditions by filtering people based on their age, ethnicity, and gender.
A former New Zealand Immigration Minister originally rejected the idea, stating that immigration looks at a range of issues, such as people who have made multiple visa applications that were rejected.
He said that it looks at people who place the greatest burden on the health system, people who place the greatest burden on the criminal justice system and uses that data to prioritise those people.
It is important that the integrity of New Zealand’s immigration system is protected and that immigration resources are used as effectively as possible.
Departments committed to the charter included New Zealand’s accident compensation scheme – which was criticised in 2017 for using algorithms to detect fraud among those on its books – and the corrections agency, which has deployed algorithms to determine an inmate’s risk of reoffending. The immigration agency, found in March to be profiling applicants by algorithm, is also a signatory.
The New Zealand government added that the algorithm charter would evolve and would next be reviewed in 12 months to make sure that it has achieved its intended purpose without creating unnecessary burden or halting progress.
Agencies must also consider te ao Māori, or Indigenous, worldviews on data collection and consult with groups affected by their algorithms. In New Zealand, Māori are disproportionately represented in the justice and prison system.
In its endeavour to explore and understand how governments around the world are tackling the pandemic, OpenGov Asia organised another Virtual Breakfast Insight.
The session was held on 24th July 2020 with public sector agencies in Singapore to understand how they have adapted to these unexpected times.
Singapore’s government has been driving the adoption of digital and smart technologies throughout the city state as a part of its Smart Nation initiative well before COVID-19 hit the world.
Keeping up with the trend of the series, the session saw 100% attendance from public sector executives in Singapore.
The session was opened by Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia.
Mohit emphasised the critical role that governments are playing at this point and the enormous amount of data they are producing in this process.
In the digital era, data is unequivocally the new currency; so, it is very important to understand how this data is managed.
Apart from responding to the current situation and recovering from it, governments also have to plan for a secure future.
Mohit advised the audience to also look for the opportunity in crises and to collaborate with partners who have a similar vision and who add value to their organisations.
After the opening, Remco den Heijer, Vice President, ASEAN, SAS, addressed the audience. He began by briefly introducing his organisation and its mission of improving lives by making better decisions.
Better decision-making is enabled by providing organisations with software that helps them manage their data and analytics.
Remco shared interesting examples of how data and real-time analytics streams have helped governments and the public stay updated with developments around the world over the last four months.
As per his observation, governments all around the world are tackling the COVID-19 pandemic in a phased approach that rests on three basic pillars: Respond, Recover and Reimagine.
He concluded by highlighting the vital role of Data and Analytics tools in helping governments reimagine the world post the pandemic and emerge stronger and better able to meet the needs of our citizens.
Joseph Musolino, Global Sales and Strategy Consultant, Fraud Security Intelligence, SAS then shared his insights.
He appreciated countries like Singapore that are at a relatively mature stage in their digital transformation journey. For these countries, adopting AI and Analytics is not the challenge. For them the challenge is to deploy it fast and make it more effective.
Joseph opined that the focus for these countries should now be to take AI and Analytics to enterprises and make it faster and easier to deploy.
He highlighted some of the areas where governments are currently deploying advanced analytics to strengthen their delivery mechanisms.
These include customs, pandemics, medical, taxation and judicial systems.
In order to give the audience a detailed understanding of how exactly the theory plays out, he demonstrated real-life situations where analytics helped governments serve citizens better.
He concluded by informing the delegates about their new platform, a step forward into next-generation analytics.
After these rich insights, Jeanne Holm, Chief Data Officer and Senior Technology Advisor to the Mayor of the City of Los Angeles, took the virtual stage.
Jeanne shared a first-hand account of how governments can use predictive data analytics during critical times and for operations in general.
She explained that the administration in LA is using advanced analytics for two major purposes:
- observing data in real time for city management
- predictive analytics that echoes the aforementioned recovery and the reimagining of the city
The LA mayor’s office uses integrated data sets from different sources to have an overall view and make better informed decisions.
Communication of this information to the people is also a major priority of the office and they utilise technology to enable that.
Jeanne shared some of the technology-driven initiatives by the LA government that are serving people better during the pandemic and will continue to in the future. They are:
- Angeleno App: a single point of access that allows people to reach any city service and make e-payments for it.
- Shake Alert LA: a warning system that quickly sends out alerts during an earthquake and informs residents of its magnitude and intensity.
- Augmented reality public park games: these let LA residents, especially younger kids, visit zoos and parks virtually and learn from them while staying safe.
- Predicting what we breathe: a program that uses machine learning to understand urban air quality using satellite and ground data.
- Autonomous piloting on slow streets: this program enables safe autopiloting on certain streets with less traffic, determined from real-time satellite and ground data.
- Data Science Federation: open data and technology is a team sport, and it is important that everyone in this ecosystem works towards the same goals. The federation does exactly that, bringing together government agencies and educational institutions working on technology.
After Jeanne’s presentation, the session took a more interactive form with polling questions for the audience.
On the question of how the COVID-19 pandemic changed the way their department or agency functions, the majority of the audience voted for “more reliant on social/communication technology” (42%).
On the next question, about which capabilities would be most useful if a situation like COVID-19 occurred in the future, a majority of participants voted for “better understanding of critical operational processes and human capital required to keep government and healthcare operations running during the lockdown” (52%).
The session also featured a demonstration of a SAS knowledge management solution that stores and catalogues information about the analytical assets of an organisation.
This demonstration helped the audience better understand how easy and useful these applications are to use, and the ways in which they can make their operations more effective.
To give context to the solution demonstration, Mohit put up a few questions for the audience, which brought out some interesting findings.
On the question of what the typical challenge is in starting a data science project, the audience was split between “lack of understanding of the available data assets” (31%) and “lack of a collaborative environment to support team effort” (36%).
The chief data officer from a government agency reflected that technology is rarely the issue when it comes to implementing data science projects; the challenge has more to do with organisational culture. Traditionally, people look at data as a mere record that needs to be stored. They do not understand that effective data management can help them solve several of their routine problems.
There is an urgent need to make data literacy a part of the organisation’s culture and to make people realise the potential of data.
The session concluded with closing remarks by Joseph, who expressed gratitude to all the attendees for sharing their thoughts and experiences.
The well-aligned content throughout the session will go a long way towards encouraging the delegates to utilise data and analytics in their work.
The year 2020 has left us all changed forever. The pandemic has impacted every aspect of our society and personal lives.
As people struggle to cope, governments and public sector agencies across the globe are working relentlessly to remain sustainable and functional.
The public sector has not had any respite during the last few months. In fact, in efforts to deal with the present and to be better prepared for the future, the focus on digital transformation and resilience has redoubled.
OpenGov Asia hosted the second in a series of OpenGov Live! Virtual Breakfast Insights to discuss what “Powering Smarter and Resilient Government with Advanced Analytics and AI” entails.
The virtual session saw a full house, with public sector executives from various Malaysian agencies in attendance.
Technology is a powerful tool that can lead us towards a better future
Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, set the tone for the discussion by highlighting the pain and pressure on governments over the last four months.
Governments are dedicating all their energy and resources to providing whatever is necessary to maximise the wellbeing of their citizens.
Even as governments have been working hard to protect residents, they have been collecting huge volumes of data from the general public.
This data has been for logistical purposes, for tracking and tracing, for infection control and overall citizen care.
In such times, Mohit said, the role of leaders becomes very important. Leaders not only manage the response to the crisis but also the recovery and path to a better future.
Technology is a powerful tool to communicate and connect with people and has the potential to lead to a better life ahead.
Mohit concluded his opening on an optimistic note, pointing out that in a crisis, danger and opportunity coexist.
With the right mindset and the right people around, one can capitalise on the opportunities that present themselves.
Data Analytics can help government agencies plan better during stressful times
After the opening, the stage was taken by Remco den Heijer, Vice President – ASEAN, SAS.
Remco briefly introduced the audience to SAS and its mission of improving lives through better decisions.
He then shared his observations about the phased approach being followed by governments in tackling the pandemic and how SAS data analytics tools support it.
Remco presented the three stages he highlighted: Respond, Recover and Reimagine.
To expound on the Respond Stage, he talked about collaboration with governments in India and the United States to predict their medical infrastructure requirements using SAS’s data analytics.
Similarly, during the Recovery Stage, SAS worked closely with government agencies to prepare them as they were getting ready to reopen their economies.
Data analytics helped governments determine their revenue streams, expenses and the distribution patterns of stimulus packages.
Remco closed by saying that SAS technology can help governments reimagine the future: to be prepared for the next emergency well in advance, avoiding loss of lives and resources.
Data is the biggest commodity that governments have today
The next presenter was Joseph Musolino, Global Sales and Strategy Consultant for Fraud & Security Intelligence, SAS.
Joseph began by emphasizing the importance of data analytics as a tool that can help get valuable insights to understand the environment and the need of the hour.
He further emphasized the relevance of data analytics in light of the pandemic, tying it to the three stages of Respond, Recover and Reimagine.
With governments investing billions of dollars in stimulus packages to support their citizens, it becomes imperative to make sure the money reaches the people who really need it.
Advanced data analytics is a useful tool that can help governments identify the needs of their people and fulfil them by accurately allocating resources.
Analytics can also help governments manage their revenue streams and distribute aid to people as they enter the reimagining stage.
He concluded by advising delegates to learn a lesson from this pandemic and be well prepared for the next critical event that might hit us.
Analytics can help government get useful insights by measuring the impact of their response to the pandemic
Chris Buxton, Chief Digital Officer, Stats New Zealand shared his insights on the topic.
He began by talking about the three-horizon model by McKinsey, contextualising it to New Zealand’s response to the pandemic.
Horizon One is concerned with seeing how the pandemic has impacted the nation. In this horizon, data and analytics tools help the government measure the virus’ impact and suitably plan its response.
Horizon Two focuses on measuring the impact of the government’s initiatives.
The data related to this horizon helps the government identify the areas that need help more than others and then act accordingly.
Horizon Three is focused on getting the country back up to its pre-pandemic pace.
Monitoring long-term data indicators like employment rates, GDP, etc. will allow governments to decide their course of action in the years to come.
With this, the session moved into an interaction with participants as they discussed various polling questions around the topic.
On the question of what was most impacted in their respective organisations due to COVID-19, the audience was fairly evenly split.
They were divided between “increased demand for services with rising expectations from citizens” (32%) and “workforce planning and the need to test the resilience of working remotely, in both the short and long term” (32%).
The Director of Public Services, GLCs and Telco Sales for SAS Malaysia shared that, at the current stage of the pandemic, organisations are already in reset mode. At this point, all the areas mentioned in the poll question are equally impacted and need attention. Had the question been asked at the beginning of the pandemic, the answer would have been different.
On the question of which area their organisation needs to develop most to respond more efficiently to the next COVID-19-like event, the delegates seemed divided between “integrated operations models to keep the government running efficiently and sustainably” (40%) and “use of data and analytics to improve situational awareness for real-time decision-making” (37%).
A senior technical delegate from MIMOS explained that data sharing among government agencies and organisations was a big challenge. For one, they are not integrated; secondly, there are several restrictions on data sharing. This is one area he felt needed more attention.
On the final question, on the biggest risk when spending billions on stimulus packages during the COVID-19 pandemic, almost half the delegates voted that the money had not been disseminated to the right citizens or businesses that need it the most.
A senior delegate from a social security organisation reflected that one of the major lacunae in the distribution of the stimulus packages was that people who were not affiliated with an organisation or not contributing to a social security fund were deprived of the benefits of the government’s stimulus package.
The session concluded with remarks by Nik Ariff Nik Omar, Director of Public Services, GLCs and Telco Sales for SAS Malaysia.
It was very interesting to know that about 25% of the attendees were already using the analytical and AI tools that were discussed during the session.
Nik also encouraged the rest to make data analytics an integral part of their work to enhance their delivery to citizens.
He thanked all the delegates for taking time out and wished them the best on their journeys back to normal.
It has been highlighted over the past few months that, in cities, outbreaks or clusters of COVID-19-positive individuals can grow very fast in heavily populated built-up areas.
Tracking how people move around urban areas can pinpoint where disease might transmit fastest and farthest.
Places with large gatherings of people have seen high infection rates, such as a few notable church gatherings, outbreaks from people socialising in nightclubs and restaurants, and outbreaks in residential blocks.
Governments have the task of predicting which places have the highest probability of spreading the disease, and they need to be equipped with the tools and technology to help them do this.
Big-data studies of human mobility need to be combined with epidemiological models. And the demographic profiles of people coming into contact at any particular location need to be included.
In many cities, the details of everyday interactions in cities are not documented well enough to model risk factors accurately, as experiences with COVID-19 show. Resorts, conferences, religious gatherings and workplaces have all experienced notable outbreaks.
Groups living in close proximity are at very high risk. Almost 93% of Singapore’s COVID-19 cases in the first 48 days occurred in dormitories for migrant workers.
Each block houses hundreds or thousands of workers. Cases there increased rapidly in a very short space of time to more than 40,000, or more than 12% of that population, compared with fewer than 2,600 infections elsewhere in the city-state of 5.3 million people.
Mapping the Spread of COVID-19
A model of disease spread can be built and refined, as data and knowledge improve, on human flows at three levels: first, city-wide, a map that highlights the main flows of people throughout the city; second, the busiest locations and busiest timings; and third, the demographics and types of human interactions at each location.
By combining all these insights, governments will be better able to anticipate superspreading locations and target precautionary measures, such as delaying reopening businesses, quarantining arrivals, tightening crowd control and intensifying cleaning and disinfection in particular places.
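As a rough illustration of how busiest-location data (the second level above) might be turned into a ranking of likely superspreading sites, the sketch below scores hypothetical locations by summing the square of hourly visit counts, on the crude assumption that contact opportunities grow roughly with the square of crowd size. The location names, the numbers, and the scoring rule are all invented for the example, not taken from any real model:

```python
# Hypothetical hourly visit counts per location.
visits = {
    "transport_hub": [1200, 300, 150, 900],
    "market":        [400, 800, 700, 200],
    "park":          [100, 120, 90, 80],
}

def crowding_risk(hourly_counts):
    """A crude proxy for superspreading risk: potential pairwise contacts
    in a crowd of n people scale like n^2, so sum n^2 over time slots."""
    return sum(n * n for n in hourly_counts)

# Rank locations from highest to lowest estimated risk.
ranked = sorted(visits, key=lambda loc: crowding_risk(visits[loc]), reverse=True)
print(ranked)
```

A real model would feed such rankings into an epidemiological simulation and weight them by the demographic mixing described at the third level; this sketch only shows how raw mobility counts become an ordered target list for precautionary measures.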
How Government can tap into pre-existing resources
All sources of data on human mobility need to be tapped. For example, ‘smart’ cities such as Singapore have networks of cameras on lamp posts to track traffic flows. These could be reconfigured to track the density and mixing of people anonymously.
Data from geolocation and contact-tracing apps can map where people go, who they interact with and for what length of time.
Funding agencies can help fund studies of human movement and interactions in key superspreading locations such as transport hubs.
Urban analysts and modellers need to study and document the types of face-to-face interactions, networks and crowd mixing.
Governments should use these data and models to target their public-health strategies. More effective targeting of measures will help to avoid ‘virus fatigue’ among the public and help education and the economy by allowing places to minimize the risks of some kinds of reopening.
The Selangor government set up a task force to monitor, oversee and educate the public on the transmission of Covid-19 in the Malaysian state back in March 2020.
The task force is led by former Health Minister, Datuk Seri Dr Dzulkefly Ahmad and assisted by four professional individuals who had previously worked with him at the ministry, as well as representatives from the Selangor Health Department.
Analytical platforms using Big Data Analytics and Machine Intelligence have been set up by the Selangor state government to assist its state Task Force for Covid-19 (STFC) in battling the coronavirus outbreak.
STFC chairman and former health minister Datuk Seri Dzulkefly Ahmad said the introduction of the advanced computerised systems is to allow his task force to offer more informed recommendations and evidence-based advice to the Chief Minister when undertaking local interventional measures in handling the outbreak.
“As a guide to the Selangor Chief Minister who directs the state machinery in undertaking localised interventional measures, STFC is determined to put forward all recommendations and necessary actions in line with scientific proof and the latest information.
“To achieve this purpose, an analytical platform compatible with Big Data Analytics and Machine Intelligence has been developed so that STFC’s recommendations can be backed up by the latest scientific research, which is ever changing during the period of this epidemic.”
He explained the use of Big Data and Machine Learning would then allow the state to conduct communal screenings and contact tracing in a more efficient and organised manner.
Successfully Using Tech at a Local Level to Fight COVID-19
In a recent Health Summit online panel, Dr Helmi, Head of the Digital Epidemiology and Data Analytics portfolio in the Selangor Task Force for COVID-19, spoke of how the team delivered two solutions: a population and location risk-ranking analytics platform for COVID-19, and a digital contact tracing initiative leveraging QR technology.
The team created a QR code system for digital contact tracing that does not require sharing of personal data. The telephone number is the only data captured, which users have found highly acceptable. Each shop and premises is given a unique QR code, which visitors scan when entering.
Out of Selangor state’s population of 6.2 million, an impressive 5.9 million, or about 95%, have used the codes. If an establishment is linked to a COVID-19 case, all its visitors can be traced and contacted, slowing transmission.
From some 30 million store visits to date, the team has detected 225 patients and 1,075 contacts.
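The check-in and tracing flow described above can be sketched in a few lines of Python. This is an illustrative toy model only: the function names, premise IDs, phone numbers and the 14-day lookback window are assumptions for the sketch, not details of the actual Selangor system.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical model: each premise's QR code maps to a check-in log,
# and only the visitor's phone number and scan time are recorded.
check_ins = defaultdict(list)  # premise_id -> [(phone, timestamp), ...]

def scan_qr(premise_id, phone, when):
    """Record a visit; the phone number is the only personal data captured."""
    check_ins[premise_id].append((phone, when))

def trace_contacts(premise_id, case_time, window_days=14):
    """If a premise is linked to a case, return visitors' numbers
    within the lookback window so they can be contacted."""
    window = timedelta(days=window_days)
    return sorted({phone for phone, when in check_ins[premise_id]
                   if abs(when - case_time) <= window})

# Example usage with made-up numbers:
scan_qr("shop-01", "0123456789", datetime(2020, 9, 1, 10, 0))
scan_qr("shop-01", "0198765432", datetime(2020, 9, 2, 12, 30))
scan_qr("shop-02", "0111222333", datetime(2020, 9, 2, 9, 0))
```

Because the log is keyed by premise rather than by person, a single linked case lets the operator pull every recent visitor to that establishment without ever collecting names or identity numbers.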
State Government Efforts to Complement Efforts at Federal Level
Datuk Seri Dzulkefly Ahmad has said the intelligence gathered by the STFC is ultimately meant to complement the ongoing efforts at the federal level.
“By supplying a little expertise and experience from the members of STFC, we are committed to raising the ability of the state of Selangor and, among other things, to submitting recommendations to the State Health Department to contain the outbreak of the SARS-CoV-2 virus, as well as reducing the number of COVID-19 patients in Selangor.
“The state government acts within the framework of complementing the efforts of the Health Ministry, who are the Custodians of Health for the entire Malaysia,” Datuk Seri Dzulkefly Ahmad said.
Since Selangor is the most populous state in Malaysia and a technology front-runner, its success has paved the way for the rest of the country. The QR approach was replicated by the federal government and rolled out nationwide. The team’s area-profiling and risk app enabled active detection and has also been adopted throughout Malaysia.
While the global pandemic has not completely vanished, economies around the world are gradually reopening and bringing their employees back into the physical office.
With the risk of the virus still looming large, ensuring the safety of employees is mission critical for organisations – especially in certain industries where remote working is not a viable option.
In order to better understand the process of critical event management, OpenGov Asia spoke with Graeme Osborn, Vice-President, International Critical Event Management for Everbridge.
The discussion revolved around how different industries and organisations are formulating their return to office plans.
Graeme shared that it is a challenging time for all organisations and they are all following different approaches to deal with the current critical event.
While some do not anticipate having their employees back before the end of the year, others are coming up with new ways of tracking and ensuring the safety of their staff.
One industry that is particularly struggling is construction, as, unlike office spaces, construction sites are not technologically enabled.
Traditional ways of carrying out everyday activities have to be altered to keep physical contact between workers to a minimum.
The office ecosystem is facing a different kind of challenge as employees are more resistant to being traced. With that in mind, employers are exploring new ways to ensure safety at work without constantly keeping an eye on their staff.
One of the ways of doing this is contact tracking. Graeme emphasized that contact tracking is very different from contact tracing.
In contact tracking, multiple points of information are leveraged to understand the potential impact of the virus. Other approaches include heat detection, daily health surveys and self-health certification.
He expounded further on the process involved in contact tracking. Once the employee installs the application, the app keeps track of whoever they encounter within a 2-metre radius. If at any point someone reports an infection, everyone they contacted, along with others in the organisation, is informed through the application.
In case of a self-report, the app not only alerts the employees and the control team, it also helps contain the exposure quickly by scanning the exposure area within the organisation.
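The encounter-logging and alerting flow just described can be sketched as a small Python model. This is a hedged illustration of the general technique, not Everbridge's actual implementation; the names, the symmetric encounter log and the one-hop alert rule are all assumptions made for the sketch.

```python
from collections import defaultdict

# Hypothetical model: the app logs every encounter within ~2 metres
# as a symmetric relationship between two employees.
encounters = defaultdict(set)  # employee_id -> set of employee_ids met

def log_encounter(a, b):
    """Record that employees a and b came within proximity of each other."""
    encounters[a].add(b)
    encounters[b].add(a)

def self_report(employee_id):
    """On a self-reported infection, return the reporter's direct
    contacts so they (and the control team) can be alerted."""
    return sorted(encounters[employee_id])

# Example usage with made-up employee IDs:
log_encounter("alice", "bob")
log_encounter("bob", "carol")
```

A real system would add timestamps and an exposure window, but even this minimal form shows why the app can "scan the exposure area" instantly: the contact set is already materialised when the self-report arrives.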
Critical events or disasters are never over; there is one after another, and multiple critical events can hit us at the same time. Many parts of the world are dealing with floods, typhoons and bushfires alongside COVID-19.
Graeme highlighted that in order to effectively manage multiple critical events, two aspects are very important: 1) planning (having all the information and resources readily available) and 2) testing out the systems and processes that have been planned.
It is also important to understand that when an organisation is hit with a natural disaster or any other critical event, business continuity, life safety of employees, cybersecurity and operations are threatened.
Therefore, an organisation’s critical event management approach should unify all these business components rather than them operating in silos.
As the executive leader of an organisation, the CEO is ultimately the person responsible to lead its critical event management initiatives.
In times of crisis and immense pressure, s/he is the one who will take the decision for the whole organisation. As such, it is very important for the CEO to stay ahead of the critical event management approach.
In contrast, when one looks at governments and the public sector, it is very difficult to pinpoint any one agency or organisation as responsible for handling critical event management, since pandemics and disasters impact multiple public utilities (health, transportation, food supplies, education, communication, etc.).
Of course, there is a collective responsibility among the different agencies focusing on these aspects individually, and they must be directed by the leader of the country to ensure the safety of all citizens.
He concluded by emphasising that irrespective of the type of organisation it is imperative to have an integrated approach and clear leadership to effectively tackle critical events.
OpenGov Asia had earlier shared an interview with Graeme Orsborn on the value of critical event management for any organisation, the basic steps to take to put in place a successful critical event management plan, and how that applies in the global COVID-19 crisis today.
The coronavirus pandemic has been a sobering wake-up call to swiftly abolish corporate inertia plaguing critical event management.
COVID-19 has brought two primary goals into sharp focus – keeping employees safe while minimising business disruption.
This was particularly telling in a recent high-level meeting with around 30 senior executives from major brands from Australia and other Asia-Pacific countries.
Only 7 per cent said they had a scalable solution to deal with the next critical event – bushfires, tsunami, terrorism, earthquake, flood or another economic or life-threatening situation – in a post-COVID world.
Yet, 89 per cent said critical event management was important to their business outcomes.
Most were in the dark and had no idea where to begin but all understood the dire consequences of doing nothing.
The current pandemic is an opportune time to ask whether your organisation can quickly identify threats and assess the risk environment, then easily identify and locate the people, assets and operational functions that could potentially be impacted.
It may be 2020 but many organisations still rely on a manual call tree to disseminate accurate information to ensure staff safety. This may be acceptable in small organisations but even they struggle to keep the basics, such as mobile phone numbers, up to date.
The process of managing a critical event is often very manual, even disjointed (some large organisations still have employee details spread across multiple Excel sheets and even in binders).
It’s often siloed across multiple applications, and it takes a significant amount of time to work through. Why?
An organisation needs to know what’s happening and why it’s happening – what is the threat and the nature of the threat. What’s the potential impact? Is it related to physical security, inclement weather, or is it digital disruption due to a cyberattack or ransomware?
Is that an IT outage or application latency?
Is it a localised or national disaster?
How many different sources of data are being used to monitor threats? How effective are they and is any of it automated or filtered, and tailored to your specific business?
And can the sources of information be trusted?
Based on all that, organisations need to understand and locate what and/or who is impacted – their people, assets, and operational functions.
This is especially challenging if some staff members are on the move or the risk event is changing – as we’re faced with in the current pandemic.
Trying to correlate the two may involve accessing multiple systems and having multiple applications running at the same time.
How many different systems do you currently have that store information about your people or assets?
And is this information integrated with your threat data to determine who or what might be impacted?
Before employees can safely return to the office, organisations must have the capability to respond effectively to another wave or to a worker testing positive.
Critical event management systems can’t be a one-way street as staff need the ability to confirm, acknowledge or respond to alerts, information, safety check-ins, and questions or polls – no matter where they are or what device they use.
It’s time organisations stopped outsourcing employee safety and well-being to spreadsheets or pieces of paper. Drowning in data during a pandemic without a single source of truth will surely sound the death knell for any business.