Western Union’s business
model has changed over the years. From connecting customers through communications
with its telegraph system in the mid-1800s, today it connects people
financially through cross-border, consumer-to-consumer money transfers, bill
payments, and other financial services. Through it all, customer-centricity
and connecting people have always remained at the core.
But operationally supporting these
transactions is not enough. The massive volume of transactions generates an
equally enormous amount of transactional information, including data about both
senders and receivers. This data contains insights that can help the company create
products and services that are relevant to customers and help differentiate
Western Union in a competitive marketplace. These insights can help the company
simplify transactions across different channels and devices, protect those transactions,
and better understand each customer.
The company identified two key areas that
would benefit from a solution that could bring together structured and
unstructured data stores.
The first was customer experience. Based on
user behaviour data, clickstream information, and mobile usage patterns, the
company could drive a better, more personalised experience for senders as well as receivers.
The second area was security, risk, and compliance.
By ingesting, processing, and applying analytic capabilities to
multi-structured data streaming from mobile, web, and retail sources, Western Union
could minimise risk and enhance anti-money laundering (AML) compliance.
To accomplish these objectives, the team
would need to revamp their engineering stack. Data would be the foundational
pillar, on top of which they could apply insights.
In evaluating different technologies, the
team identified several criteria. The first was the performance and agility to
handle structured, unstructured, and semi-structured information. Second, the
system should demonstrate rapid time to value and the ability to make a meaningful
impact. Third, the technology’s customisation capabilities should be able to support a
global enterprise. Finally, management capabilities, including data segregation,
auditing, and monitoring, along with long-term cost efficiency at large scale, were also weighed.
Western Union looked at several vendors, and
Cloudera scored highest in aggregate
across the above-mentioned criteria.
For an implementation of its size, Western
Union anticipated completing the project in a year. Exceeding expectations, the
first production-ready Cloudera system was set up within just five months.
Western Union’s enterprise data hub (EDH) was
powered by a 64-node CDH
cluster that, at the time the case study was written, was soon expected to grow to 100 nodes.
The hub feeds in structured data from
multiple data warehouses as well as unstructured data, including clickstreams,
behavioural data, logs, and sentiment data, collected from
transactional, marketing, and other outreach systems.
A combination of Apache Flume, Apache
Sqoop, and Informatica Big Data Edition (BDE) was used to collect data from the
various sources. High-density Cisco Unified Computing System (UCS) servers
formed the backbone of Western Union’s EDH.
The team was also building a transactional
capability on top of the 100-terabyte (TB) hosted data set to provide rapid
response times to critical systems serving its customers across channels: the
web, mobile, and even retail agent locations.
The company’s 100 internal end
users—including members of the business and engineering communities as well as
data scientists—access the data in the EDH via Hue,
which offers a web interface for Hadoop, and Apache
Hive, which offers a SQL-like interface.
Cloudera provided rigorous training for Western
Union’s engineers and hosted multiple training classes for about forty of its
internal end users, which accelerated adoption.
Business users also have access to several
visualisation tools that integrate with the Hadoop cluster, offering an
interactive, 360-degree view of the business against important trends and
timelines, across its digital offerings.
Western Union’s data is secured and segregated
with Apache Sentry (incubating) and Kerberos, and is monitored by Cloudera
Navigator. This is of critical importance as Western Union is the custodian of
its customers’ financial information. It was essential to ensure compliance and
proper monitoring and auditing.
Responding to customer needs
The Cloudera enterprise data hub serves as
a single repository to help Western Union understand its customers. It provides
important insights, from the initial touchpoint and qualification and compliance
checks through the entire customer lifecycle, starting from the moment customers
enter one of Western Union’s networks: retail, web, or mobile. It allows customers
to have a more seamless experience across multiple channels, to use products
and services, and to discover new ones.
This allows the company to push
relevant and meaningful offers. For example, in San Francisco, Western Union
delivers targeted offers that are tailored to the Chinese culture at its
Chinatown retail agent locations, messages tailored to Filipinos in Daly City,
and to the Mexican community in the Mission District.
One insight revealed that many web and
mobile customers frequently process repeat transactions. They send the same
amount of money to the same recipient at the same time each month.
This then prompted Western Union to add a
“Send Again” button to make the process of repeating payments much more
convenient for the customer.
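The insight above amounts to grouping transactions by sender, recipient, and amount and flagging patterns that recur. A minimal Python sketch of that idea follows; the field names and the repeat threshold are hypothetical illustrations, not Western Union's actual schema:

```python
from collections import Counter

def find_repeat_transfers(history, min_repeats=3):
    """Flag (sender, recipient, amount) patterns that recur often enough
    to be candidates for a 'Send Again' shortcut."""
    counts = Counter(
        (tx["sender"], tx["recipient"], tx["amount"]) for tx in history
    )
    return [pattern for pattern, n in counts.items() if n >= min_repeats]

# Toy history: the same $200 transfer three months running, plus a one-off.
history = [
    {"sender": "A", "recipient": "B", "amount": 200, "month": m}
    for m in range(1, 4)
] + [{"sender": "A", "recipient": "C", "amount": 50, "month": 1}]

print(find_repeat_transfers(history))  # [('A', 'B', 200)]
```

A production system would of course also consider timing (the "same time each month" signal) and run at far larger scale, but the grouping logic is the core of the feature.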
Risk management and compliance
The EDH delivered immediate value to
Western Union by supporting predictive analysis on structured and unstructured
data sets, at the time of transaction.
Consequently, the company was able to
act on transactions in real time and drive customer compliance in a way that
delivered better conversions for customers.
For instance, Western Union’s data hub
revealed high transaction volumes between the US and one Asian community when
it’s early morning on Wall Street, because the tech-savvy senders understand
when new foreign exchange (FX) rates have just been published. This is just one
example of the many variables Western Union could now use to anticipate behaviour and
make risk decisions about its customers and their transactions.
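Risk-decisioning at the time of transaction can be thought of as scoring each transfer against the sender's historical profile. The sketch below is purely illustrative; the signals, weights, and thresholds are invented for this example, and Western Union's actual models are not public:

```python
def risk_score(tx, profile):
    """Score a transaction against simple, hypothetical risk signals."""
    score = 0
    if tx["amount"] > 3 * profile.get("avg_amount", tx["amount"]):
        score += 2  # unusually large for this sender
    if tx["corridor"] not in profile.get("known_corridors", set()):
        score += 1  # first transfer on this country corridor
    if tx["hour"] not in profile.get("usual_hours", range(24)):
        score += 1  # outside the sender's usual sending window
    return score

def decide(tx, profile, review_threshold=2):
    """Route high-scoring transactions to review, pass the rest."""
    return "review" if risk_score(tx, profile) >= review_threshold else "approve"

profile = {"avg_amount": 200, "known_corridors": {"US-PH"}, "usual_hours": range(6, 12)}
print(decide({"amount": 1500, "corridor": "US-CN", "hour": 3}, profile))  # review
print(decide({"amount": 180, "corridor": "US-PH", "hour": 8}, profile))   # approve
```

The FX-rate insight in the article would show up here as one more profile variable: transactions clustered around rate publication times are expected for some communities, so that timing signal lowers rather than raises risk for those senders.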
In addition to the above benefits, implementing
the EDH has lowered Western Union’s total cost of ownership (TCO). Managing
these volumes of data in a standard data warehouse structure would have led to exponentially
higher costs.
The savings from the EDH can be invested
in new opportunities and new product innovation, rather than simply
paying for heavy-duty storage and database costs.
A real estate investment trust that invests in carrier-neutral data centres and provides colocation and peering services is building a new data centre in Hong Kong, its second in the administrative region.
HKG11 will be a 21,000 square metre (210,000 square foot) building holding up to 24MW of IT capacity. It is expected to be online in mid-2021, around the same time as Digital Realty’s upcoming Seoul facility in South Korea.
In 2012, Digital Realty acquired its first Hong Kong data centre on the Tseung Kwan O industrial estate, HKG10, capable of up to 18MW of IT capacity.
Its planned sister facility, HKG11, is located in the nearby Kwai Chung district of Hong Kong and will operate as an auxiliary to HKG10.
The Chief Executive Officer of the firm stated that its investment in Hong Kong is another important milestone on its global platform road map, enabling customers’ digital transformation strategies while demonstrating its commitment to supporting their future growth on PlatformDIGITAL.
As the firm continues to expand in Asia, the launch of the second facility in Hong Kong underscores its importance as a major data hub, providing customers with the coverage, capacity, and connectivity requirements to support their digital ambitions.
The HKG11 facility will be built up to a total of 12 floors, eight of which will be dedicated to customer deployments.
The firm’s MD for the Asia Pacific region noted that Hong Kong is a regional leader in cloud readiness and has significant potential for further cloud adoption along with a strong base of customers with an appetite for digital technologies.
He stated: “We are delighted to launch our new facility, which will go a long way towards meeting the rapidly growing demand and bringing value to customers across the region, especially from China.”
Aside from Hong Kong, Digital Realty is also establishing facilities in Tokyo, Osaka, Singapore, Sydney, and Melbourne.
Hong Kong – a data centre hub
In February 2020, OpenGov Asia reported that a major telecommunications company, currently operating one of the largest globally connected IPv4+IPv6 networks in the world, completed a round of upgrades and improvements to their Hong Kong data centre.
The aim is to boost network performance for end-users throughout China and across the APAC region.
The addition of new local and international connectivity partners has improved network performance and reliability for businesses seeking to reach one of Asia’s busiest centres of international finance, trade, and enterprise.
The data centre market in the Asia Pacific has been forecasted to reach US$32 billion by 2023, behind only North America in terms of regional revenue.
According to the findings of a data analytics and consulting company, the surge in spending during the next four years will stem from enterprise customers “increasingly migrating” existing resources to data centres to “reap benefits from data”.
By 2023, Asia Pacific will account for nearly 30 per cent of the global data centre market, behind North America with 34.2 per cent but ahead of Western Europe with 24 per cent.
The lead analyst on the report stated that data centre and hosting market growth in the Asia Pacific will be driven by growing demand for cloud services and digitisation from both enterprises and investors.
Investment will continue in new data centre projects by existing players and new entrants looking to expand their presence in the region and serve additional customers.
In addition, with the commercial availability of 5G services in the next one to two years, data consumption is expected to grow many times over.
This will drive requirements for constant connectivity and for data centre-supported features to support the critical business applications and activities of enterprises.
Governments are exceptional institutions that did not slow down during the pandemic. Although not all of their services were essential during the crisis, some were in high demand by default.
All over the world, governments are packed with great volumes of data that are hard to make sense of in their current setting, primarily because the data is unstructured, but also because much of it sits on legacy systems that are not accessible or compatible with modern infrastructure.
Merely hoarding data is neither useful nor valuable. Data needs to be organised and analysed, and it must make sense if crucial decisions are to be based on it.
In light of this, OpenGov Asia held another Virtual Breakfast Insight: High-performance Digital Government – Intelligent Cloud Data Management Strategies on 25 June 2020.
The high-level session had a cross section of chief information officers and IT heads from various government and public sector organisations in Singapore, Malaysia and India. All joined in to discuss Simple, Scalable and Seamless Cloud Data Management for a high-performance digital government.
Mohit Sagar, Group Managing Director and Editor-in-Chief at OpenGov Asia, opened the session with thought-provoking insights into cloud data management.
Mohit explained that data management is a process that might look like a herculean task, especially at such a busy time for governments. Breaking it down into smaller pieces makes it easier to implement and practise.
He advised the delegates to set manageable goals for the future and focus on those, rather than trying to fix the mountains of unstructured data from the past.
And this is where cloud becomes relevant. It offers a simple, seamless, and scalable solution to data storage problems.
Organisations no longer have to worry about the location of valuable data in physical data centres. Cloud enables accessing data from multiple points remotely.
However, enabling remote access to data from multiple points has not been the natural structure of government organisations. It was necessitated by the pandemic.
Mohit also cautioned the audience that the security of data in the cloud is critical; otherwise, the data is prone to breaches and misuse.
He concluded by saying that government institutions should be open to adopting new technology, as it will help them become more efficient and effective in serving the public.
Raymond Goh, Director of Systems Engineering, Veeam shared his view that all organisations, be it public or private, are gradually getting more inclined towards a customer/citizen centric approach in their operations. Their data management strategies are also in line with that.
Keeping data classified is highly important for organisations; leaving it unclassified can create impediments to growth and make the data complex. It can also make compliance and security hard to achieve.
Apart from making operations difficult, unmanaged data can incur unnecessary costs for organisations and erode customer confidence.
Raymond shared key points in the process of cloud data management from ensuring backup and recovery to the last important bit about compliance with regulations.
He then articulated the three critical aspects for organisations to focus on in their cloud data management journey:
- Digital transformation and cloud data management to go hand in hand
- Have parameters in place to measure the success rate of managing data
- The 4 C’s to enable effective data management: Cloud, Capabilities, Culture, and Confidence
Having understood the importance of Data Management in cloud, Chris Buxton, Chief Digital Officer at Stats New Zealand threw light on how management of data over the cloud can be more effective than the traditional approach to storing data.
He shared some useful insights from his experience of rapidly transitioning to a cloud-based environment.
Chris believes that managing data on cloud is not very different from the traditional way of managing data.
Additionally, it opens a new range of capability and connectedness that is not available in the old way.
In this age of technology, data is no longer generated manually; technology is being leveraged to collect and harvest data from the web. This, in turn, makes collaboration and storage easier.
At the same time, huge amounts of data are being ingested, making it crucial to ensure that all data and content are secure. As such, data security becomes an integral part of cloud data management.
Chris re-emphasised five key areas that were highlighted by Raymond earlier: security, compliance, cost management, automation, performance and monitoring.
Chris concluded by sharing the various steps to be undertaken as organisations begin their journey towards cloud data management.
He completed the circle beginning from having a plan to governing your data.
After Chris’s presentation the session went into a more interactive phase with polling questions for the attendees.
On the first question, about how long a typical IT outage lasts in their organisations, delegates were divided between less than 15 minutes (36%) and 16 to 60 minutes (26%).
An IT executive from Malaysia shared that she voted for 1 to 4 hours as the average time. She opined that the duration varies and depends on how critical the application in question is.
Her opinion was totally in line with the findings that Raymond shared from the recent survey conducted by Veeam. On average, in most organisations, IT outage does not last more than 2 hours.
The next question was about the primary reasons for IT outages in an organisation. Here, the majority of the audience voted for infrastructure and networking (60%).
A senior executive from the insurance sector shared that they faced several IT outages when their organisation moved to a remote working model. With a staff of 3,000 employees, working remotely put a lot of pressure on their networks and infrastructure.
Raymond concurred, as the Veeam survey also showed infrastructure failure to be the major reason for outages, besides cybersecurity threats and application software failures.
On the final question about why digital transformation is important for an organisation, the participants, for the most part, leaned towards transforming business operations and processes (50%) and transforming customer services (35%).
A delegate from India shared that transforming business processes is definitely the primary driving force behind digital transformation. However, it is still important to be mindful of the limitations of budgets.
The survey reflected similar trends, with transforming customer services as the top motivation for digital transformation, followed by transforming business processes.
The interactive Q&A session offered a plethora of reflections and insights from delegates. This rich dialogue was extremely beneficial. It allowed the participants to understand cloud data management from diverse perspectives that were set in a range of contexts and settings.
Raymond concluded the session by urging organisations to take the next step in their digital transformation journey and urged them to work towards the four C’s he mentioned in his presentation.
Delegates of OpenGov Asia’s Virtual Breakfast Insight gained key insights from the digital experts who presented. They left better informed by each other’s diverse perspectives on digital transformation and on using cloud technologies to manage their data.
The Department of Information and Communications Technology (DICT) recently amended its Cloud First Policy to provide clearer instructions on policy coverage, data classification, and data security.
It also covers policies on sovereignty, residency, and ownership as the government transitions to the ‘new normal’ amidst the COVID-19 pandemic.
In a press release, DICT said that the Philippine government’s Cloud First Policy promotes cloud computing as the preferred technology for government administration and the delivery of government services.
Shifting to cloud computing is expected to foster flexibility, security, and cost-efficiency among users. Cloud computing also offers key advantages such as access to global systems of solutions, innovations, and services, as well as up-to-date cybersecurity.
The recent amendments clarify which institutions are covered by the policy and which are only encouraged to adopt it. This distinction was absent in the former version.
As amended, the Cloud First Policy covers all departments, bureaus, offices, and agencies of the executive branch, government-owned and/or controlled corporations (GOCCs), state universities and colleges (SUCs), local government units (LGUs), and all cloud service providers and private entities rendering services to the government.
Meanwhile, the Congress, the Judiciary, the Independent Constitutional Commissions, and the Office of the Ombudsman are encouraged to adopt the Cloud First Policy, the release added.
The amendments also clarify the government’s policy on data sovereignty, a concept that was confused with data residency in the previous version.
In the latest version of the policy, the application of Philippine laws over its foreign counterparts is asserted over data owned or processed by the Philippine government or any entity that has links to the Philippines.
Additional provisions on ICT capacity building and development of essential skills to meet international and local standards are also included.
Data classifications are updated to include the following: highly sensitive government, above-sensitive government, sensitive government, and non-sensitive government data.
The new classifications provide a more consistent structure to guide the application of safety protocols on the access, storage, processing, and transmission of data in the cloud.
The DICT Secretary, Gregorio B. Honasan II, said that the department is continuously updating policies to adapt to the present times.
With the amended Cloud First Policy, it is paving the way to an ICT policy environment that is more responsive to current needs, further filling gaps in the country’s digitalisation efforts.
The recent amendments to the Cloud First Policy are expected to further enable government agencies to serve the public more efficiently.
With concise guidelines, government agencies can now implement cloud-based services that are at par with global standards.
DICT-10 has also planned a project to implement free WiFi sites in 353 locations across the country, a press release has noted.
The DICT Assistant Regional Director, Frederick D.C. Amores, shared that even before the pandemic, the department was looking to make connectivity available in various institutions.
DICT has training programmes for digital marketing and online jobs, among others.
In the coming years, the department intends to have a stronger network for government, schools, and SUCs to connect to the internet.
Among the key programmes the DICT is promoting is the Public Key Infrastructure (PKI), which will use digital signatures.
The Director explained that there are still areas with no internet connection because they are not viable for commercial telecoms to come in while others are beset by armed conflict, making it difficult to install communication facilities.
One possible, but expensive, solution is the use of satellites to bring connectivity to those areas with no signal or connection.
DICT also proposes a broadband network for the government across the region and Mindanao. It is important to build internal capacity for the government to provide connectivity.
Australian national security agencies must develop a national security cloud and finally catch up to the private sector in terms of cloud adoption, according to the Australian Strategic Policy Institute (ASPI).
In a new report, ASPI argued that agencies’ slow adoption of cloud services due to initial concerns about the security of cloud technology has left them years behind the adoption curve.
“For agencies that rely on cutting-edge high technology for their capability edge, this is disastrous,” the report states.
Unless this is addressed rapidly and comprehensively, Australia will quite simply be at a major disadvantage against potential adversaries who are using this effective new technology at scale to advance their analysis and operational performance.
Australia will also fall further behind its allies, ASPI said, arguing that the US national security community has a lead of at least five years over Australian partner agencies.
This change must be driven by ministers and agency heads rather than CIOs and security staff, ASPI said.
The report states that this is because security accreditation standards and processes can’t lead technological change. By definition and by design, security standards are lag controls, based on what’s already understood and formed from experience with past and present technical systems.
Ministers and agency heads have both the responsibility and perspective to look beyond the important current technical security standards and rules and think about the capability benefit that cloud computing can bring to Australia’s national security.
Accordingly, ASPI has called for the government to commit to major investments in cloud infrastructure and services for Australian intelligence agencies as part of any government stimulus to Australia’s digital economy.
The intelligence community needs to make this shift as a community, not as a rag-tag band of loosely coordinated agencies with agency heads making separate risk-based decisions, the report adds.
This collaboration should involve the development of a national security cloud that has agencies’ interoperability as a core principle, ASPI said.
The most powerful cloud infrastructure and applications are useless without the fuel they need to operate — data. So, the maximum data needs to be brought into the national security cloud by each agency in the intelligence community.
The report also notes that decisions will be divisive and difficult, but national capability, not agency fiefdoms, needs to be the overriding interest.
Another key attribute for the national security cloud must be security. Information hosted on the cloud must be protected from both state and non-state cyber actors who are already targeting Australian government systems.
As a result, data must be hosted onshore, and security must go beyond personal and system security to include the resilience and integrity of the supply chains that cloud infrastructure and service providers rely on to produce their products.
This is a newly obvious priority exposed by the vulnerabilities seen in global supply chains through the pandemic — and high-technology supply chains are particularly exposed to Chinese state influence unless security is a design principle baked in from the start.
ASPI also advised against what it anticipates will be a tendency to adopt cloud infrastructure at lower levels of classification first, before moving to more highly classified data.
The institute argued that combining valuable top-secret information with the huge trove of lower classification and open-source data is a source of distinctive advantage that agencies can offer the government.
“So, failing to incorporate highly classified data holdings with the analytic horsepower and flexibility that cloud infrastructure and applications bring would be a bit like adopting jet propulsion for reconnaissance aircraft during World War II but sticking with piston-engine aircraft for your fighter fleet, even as your enemy chooses otherwise,” the report concludes.
Despite the havoc that the COVID-19 pandemic is wreaking on firms across the globe, the incubatees at Hong Kong’s Smart Government Innovation Lab have been able to sustain their release of solutions.
Recently, another one of its supported firms developed a solution that is now ready to be acquired by other firms, government agencies and academic institutions.
The solution, called V-OS Cloud Authentication, offers the following benefits:
- it secures remote access to corporate services or Microsoft Office 365
- it is simple and quick to integrate
- it can also provide V-OS App Protection and Smart OTP/PKI Tokens over the cloud
The solution operates as follows:
- The user launches VPN software on a laptop to access their corporate network
- The authentication process starts when the user inputs their username and password (1FA) on the laptop
- Upon successful verification of 1FA, the V-Key App installed on the user’s mobile device is activated to request 2FA
- The user then further verifies their identity through the V-Key App, via facial or fingerprint biometrics or a passcode
- When the 2FA is successfully verified, the user is able to access the corporate network
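The two-step flow above can be sketched in code. All names and the in-memory stores below are hypothetical stand-ins; a real deployment would call the V-OS cloud service and push to the V-Key app rather than check local dictionaries:

```python
# Hypothetical credential and enrolment stores for illustration only.
CREDENTIALS = {"alice": "s3cret"}          # 1FA: username/password
ENROLLED_DEVICES = {"alice": "device-42"}  # device running the mobile app

def verify_1fa(username, password):
    """First factor: check the username/password pair."""
    return CREDENTIALS.get(username) == password

def request_2fa(username):
    """Activate the app on the user's enrolled device; None if not enrolled."""
    return ENROLLED_DEVICES.get(username)

def verify_2fa(biometric_ok):
    """Second factor: the app confirms identity via biometrics or passcode."""
    return bool(biometric_ok)

def authenticate(username, password, biometric_ok):
    """Grant access only when 1FA, device enrolment, and 2FA all succeed."""
    if not verify_1fa(username, password):
        return "denied"
    if request_2fa(username) is None:
        return "denied"
    if not verify_2fa(biometric_ok):
        return "denied"
    return "access granted"

print(authenticate("alice", "s3cret", True))  # access granted
print(authenticate("alice", "wrong", True))   # denied
```

The key property of the flow is that each factor gates the next: a stolen password alone never reaches the network, because access also requires the enrolled device and a biometric or passcode confirmation.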
The solution can be applied in the areas of Broadcasting, City Management, Climate and Weather, Commerce and Industry, Development, Education, Employment and Labour, Environment, Finance, Food, Health, Housing, Infrastructure, Law and Security, Population, Recreation and Culture, Social Welfare as well as Transport.
The solution employs the latest mobile technologies:
- as a cloud solution, it reduces the costs of on-premise deployments
- it is secure yet easy to deploy
- updates and upgrades are delivered over the air
Cloud authentication technology
One article notes that COVID-19 has accelerated customer demand for digital technologies to ensure resilient enterprise business operations across the Asia Pacific, resulting in cloud-based offerings outshining traditional products.
Key verticals such as banking and finance, healthcare, and manufacturing are witnessing a surge in demand for cloud-based solutions, owing to features like remote data storage capabilities and the provisioning of privileges for hosted applications.
This crisis has derailed the economy of Asian countries to a certain extent. Governments in these countries, therefore, are encouraging the adoption of advanced digital capabilities amongst small and medium-sized enterprises (SMEs) and large enterprises.
For example, the Infocomm Media Development Authority of Singapore (IMDA) has introduced GoCloud to support SMEs in migrating from legacy software development procedures and architectures to cloud-native applications and services. This will help businesses perform effectively during the pandemic, in addition to supporting the country’s economy.
Cloud service providers are also witnessing a surge in the adoption of cloud-based communication and collaboration tools through a rise in audio conference calls, video collaboration solutions and virtual schooling.
During the pandemic, the majority of organisations are providing a work from home facility to employees in countries such as India, Singapore, Australia, Hong Kong and New Zealand.
There is a high demand for software-as-a-service (SaaS) based offerings from enterprises, specifically for teleworking and remote conferencing. Low staff presence to monitor local servers or data centres has compelled them to opt for public cloud offerings.
The benefits of cloud services align directly with broader enterprise strategies, such as new product and service development, helping enterprises create new revenue streams and adopt an agile transformation model that aligns their operations with changing business requirements.
About the Smart Government Innovation Lab
In 2018, the Government established the Smart Government Innovation Lab to explore hi-tech products such as AI and relevant technologies, including machine learning, big data analytics, cognitive systems and intelligent agents, as well as blockchain and robotics, from firms, especially local start-ups.
The Lab is always on the lookout for innovation and technology (I&T) solutions that are conducive to enhancing public services or their operational effectiveness.
I&T suppliers are encouraged to regularly visit the Lab’s website to check on the current business and operational needs in public service delivery and propose innovative solutions or product suggestions to address them.
Once again, a company operating within Hong Kong’s Smart Government Innovation Lab has announced a new solution.
Following the usual protocol, the company is now seeking start-ups, SMEs, other companies or government agencies to acquire and apply the technologies.
The solution, called Earthy, is a geo-intelligent smartphone application for advanced chat. Users can write in their native language and communicate across more than 100 languages through automated translation powered by AI; each message is translated as the user hits send.
The app’s geo-intelligence lets users easily find locals and local businesses.
Users can also create public and private chat-grid groups and share these grids with friends, family, and business partners. The app was designed with broad human connection in mind.
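The translate-on-send behaviour described above can be sketched as follows. The `translate` function is a stub standing in for the AI translation service, and all names and the tiny phrase table are illustrative only:

```python
def translate(text, target_lang):
    """Stub translator: a real system would call an AI translation service."""
    stub = {("hello", "es"): "hola", ("hello", "fr"): "bonjour"}
    return stub.get((text.lower(), target_lang), text)

class ChatGroup:
    def __init__(self):
        self.members = {}  # member name -> preferred language

    def join(self, name, lang):
        self.members[name] = lang

    def send(self, sender, text):
        """Translate the message for each recipient as it is sent."""
        return {
            name: translate(text, lang)
            for name, lang in self.members.items()
            if name != sender
        }

group = ChatGroup()
group.join("amy", "en")
group.join("luis", "es")
group.join("zoe", "fr")
print(group.send("amy", "Hello"))  # {'luis': 'hola', 'zoe': 'bonjour'}
```

The design point is that translation happens per recipient at send time, so every member reads and writes in their own preferred language without any shared language in the group.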
The areas of application for the technology include City Management, Commerce and Industry, Development, Education, Employment and Labour, Finance, Food, Health, Infrastructure, Law and Security, Population, Recreation and Culture, Social Welfare, as well as Transport.
The solution employs Artificial Intelligence (AI), Cloud Computing, Deep Learning, Machine Learning, Mobile Technologies, and Natural Language Processing technology.
There are many applications and use cases for Earthy that increase information sharing and safety while decreasing costs for the government.
If a city employee is trying to communicate with a foreigner, they can do so even without a shared language.
Earthy can be used in a modern international marketplace setting where sellers, vendors, clients and suppliers can coordinate more efficiently.
A team of city employees from diverse backgrounds can coordinate throughout their day in a complex city, sharing information and locations.
If the city wants to hire immigrants or help integrate them into the culture of their city, Earthy is a valuable tool to track and communicate with them.
In another example, an airport is always filled with a huge variety of people from many destinations. Security at the airport has a straightforward job of protecting public safety, but officers must navigate passengers and travellers from every corner of the globe. With Earthy, they can speak 100+ languages as easily as they use their native language.
Australia’s Monash University has secured AU$4.3 million from the Australian Research Data Commons (ARDC) to lead four major data and cloud infrastructure science projects.
These projects will reportedly advance the artificial intelligence (AI), data science and research technology capabilities at the university.
About the Initiative
ARDC, which is run through the Federal Government’s National Collaborative Research Infrastructure Strategy (NCRIS), awarded funding to four projects led by the Monash eResearch Centre.
All of the projects focus on building scalable environments for data-centric research and sensitive data, and on strengthening the use of AI techniques such as machine learning (ML).
The University will work in partnership with other leading research organisations and universities to deliver these projects.
Doing so will harness the combined resources and knowledge to achieve improved high-performance data environments for researchers.
In one of these projects, the University was one of four organisations that received funding to upgrade its node of ARDC’s Nectar Research Cloud.
This national research cloud infrastructure provides core services to more than 16,000 researchers in approximately 1,600 currently active projects, enabling Australia’s research community to store, access, and analyse data.
The funding is critically important as the research community is now generating more data than ever and needs new solutions.
Researchers are producing incredible amounts of complex and in some cases, unstructured data.
As stated by the ARDC, the successful platforms projects cover all of the National Science and Research Priorities and National Research Infrastructure Roadmap focus areas.
- Establishing Australia’s Scalable Drone Cloud (ASDC)
Unmanned Aerial Vehicles (UAVs), commonly known as drones, provide sensing capabilities that address the critical scale-gap between ground and satellite-based observations.
This capability offers researchers a competitive advantage through the ability to deliver near real-time, societally relevant information.
The ASDC will establish a national best-practice approach for experimental and scalable drone data analytics, driven by exemplar data-processing pipelines.
The platform will integrate sensing capabilities with easy-to-use storage, processing, visualisation and data analysis tools, which include computer vision / deep learning techniques, to establish a national ecosystem for drone data management.
- Environments to Accelerate ML Based Discovery
The convergence of big data and ML techniques is spreading through all aspects of people’s lives.
However, the access of researchers to necessary tools, training and resources is still patchy and uncoordinated.
This platform will accelerate the adoption of these techniques by Australian researchers, building on an international survey of research groups.
It will support core ML tools for pre-processing, annotating, training, and validation. It will also integrate with software development environments to provide a consolidated platform for ML-based research.
- Australian Characterisation Commons at Scale (ACCS)
The ACCS will develop a coherent and accessible compute and data environment that promotes collaboration, increases return on investment for characterisation instruments, and delivers value to thousands of researchers across a range of domains.
These domains include health, advanced manufacturing, soil and water, food, energy and transport, and resources.
Building on the Characterisation Virtual Laboratory, the proposed infrastructure will be a rich ecosystem of computing systems, data repositories, workflows, and services, connected with instruments.
- Infrastructure Refresh – ARDC Nectar Research Cloud at Monash (Generation 2)
The University is committed to significant continued involvement with ARDC’s Nectar Research Cloud. ARDC’s support will refresh cloud compute and storage infrastructure for Nectar-funded equipment that has reached the end of its useful life.
It will also maintain the capacity required to meet the demand for cloud resources from nationally prioritised research activities.