Telematics has transformed the vehicle
insurance industry over the past few years. Insurers are increasingly offering
usage-based insurance (UBI) products based on data gathered from GPS-connected devices in vehicles.
Telematics, a portmanteau of telecommunications and informatics, enables the real-time transmission, receipt, and storage of
data and information from remote objects such as vehicles. The information
could include location tracking of vehicles, driver behaviour, maintenance
requirements and data on accidents.
Now, developments in Internet of Things
(IoT) technology present a further opportunity for insurers to move beyond
connected vehicles and offer new insurance propositions in the home, commercial
property, life, and other insurance lines. IoT can connect insurers to the
assets they are insuring, whether that is a vehicle, a property, or a person.
This allows insurers to understand the exact status of any insured items and
potentially take action in response to an abnormal situation, such as detecting
a fire, a vehicle collision, or an abnormal heartbeat.
To take advantage of these next-generation IoT-based insurance propositions, Octo Telematics, a market-leading telematics service provider, developed a new
platform, which they called the Next Generation Platform (NGP).
A pioneer in vehicle telematics
Established in Italy in 2002, Octo Telematics was one of the first companies to support insurers in the then-emerging vehicle telematics space. Since then, the company has established itself as the
global leader in the telematics service provider (TSP) space. It has one of the
largest global databases of telematics data, with over 186 billion miles of
driving data collected and 438,000 crashes and insurance events analysed (as of
31 December 2017).
Octo Telematics captures a comprehensive
set of data from a vehicle, including the speed, location, and journey
duration, as well as aspects of a driver's behaviour, such as how harshly they
accelerate or brake and how quickly they corner. This data is combined with
further contextual information, such as the local weather conditions, road
type, and current traffic situation, and analysed to provide the insurer with a
detailed profile of the true risk posed by a specific driver at any particular time. By applying machine learning, the company can build more accurate predictions and risk models, allowing the insurer to calculate a premium level that accurately reflects the risk and usage of the individual policyholder.
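As a concrete illustration of the idea, the sketch below folds a few behavioural signals into a usage-based monthly premium. It is a minimal toy, not Octo's model: the signal names, weights, and rates are all hypothetical.

```python
# Toy usage-based pricing sketch; all weights and rates are hypothetical,
# chosen only to show the shape of the calculation, not Octo's actual model.

def risk_score(harsh_brakes_per_100km: float,
               night_driving_share: float,
               speeding_share: float) -> float:
    """Combine behavioural signals into a 0-1 risk score."""
    score = (0.5 * min(harsh_brakes_per_100km / 10.0, 1.0)  # cap extreme values
             + 0.2 * night_driving_share
             + 0.3 * speeding_share)
    return min(score, 1.0)

def monthly_premium(base_rate: float, miles_driven: float, risk: float) -> float:
    """Pay-how-you-drive: base rate plus a per-mile charge scaled by risk."""
    return base_rate + miles_driven * 0.05 * (1.0 + risk)

# A moderately risky driver covering 600 miles in a month.
premium = monthly_premium(20.0, 600, risk_score(4.0, 0.15, 0.08))
print(f"premium: {premium:.2f}")  # -> 57.62
```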
Telematics also offers the ability to
detect and respond to crash and claim incidents in real time, and to react in
the most appropriate way, by alerting emergency services or dispatching
roadside assistance. The telematics data captured during a claim incident also
provides insurers with a precise view of a claim, enabling decisions about
liability, the likelihood of fraud, and estimated cost of repair to be made in
a fraction of the time. As claims expenses typically account for between 70%
and 80% of an insurer's costs, this can have a significant effect on an
insurer's profitability. In addition, it can dramatically improve the
customer's experience of the whole process, and boost customer retention.
The challenge of dealing with growing volumes and variety of IoT data
Octo Telematics had developed a proprietary
telematics platform that had evolved over the preceding 15 years. However, with the
growth of IoT and an order of magnitude increase in the number and types of
connected sensors, Octo Telematics foresaw that the current platform would
increasingly become a constraint on the company's growth ambitions.
One of the biggest challenges was that the
existing platform was designed around the needs of vehicle telematics and could
not easily accommodate other types of sensors, such as wearables, smart
watches, smart locks, smoke detectors, and surveillance devices, which are
becoming increasingly important components in IoT insurance propositions.
The growth in sensors also implied the need
to monitor data from potentially tens of millions of connected devices in the
near future. Though the existing platform supported over 5 million vehicles, it was reaching the limits of its scalability.
The data management platform also needed to
accommodate a broad range of inbound data types, ranging from real-time streams
from in-vehicle telematics devices to bulk uploads from other sources, such as
third-party weather data. There is also significant variation in the formats of
stream data depending on the application and capabilities of the sensor. This
can vary from relaying a simple journey start and finish time through to
detailed crash reconstruction data from units incorporating sophisticated
sensors, such as six-axis accelerometers with very high sampling rates. This
data diversity will increase further, potentially to include images and video.
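To illustrate the normalisation problem this variety creates, here is a minimal sketch that maps two very different message shapes onto one common event record. The schema and field names are our own invention, not Octo's.

```python
# Hypothetical common event schema for heterogeneous sensor payloads.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any, Dict

@dataclass
class SensorEvent:
    device_id: str
    device_type: str                 # e.g. "obd_dongle", "smoke_detector"
    timestamp: datetime
    event_kind: str                  # e.g. "journey_start", "crash", "alarm"
    payload: Dict[str, Any] = field(default_factory=dict)  # sensor-specific detail

def from_journey_msg(msg: Dict[str, Any]) -> SensorEvent:
    """A simple journey start/stop message carries almost no detail."""
    return SensorEvent(msg["id"], "obd_dongle",
                       datetime.fromisoformat(msg["ts"]), msg["event"])

def from_crash_msg(msg: Dict[str, Any]) -> SensorEvent:
    """A crash message carries a high-rate six-axis accelerometer trace."""
    return SensorEvent(msg["id"], "obd_dongle",
                       datetime.fromisoformat(msg["ts"]), "crash",
                       {"accel_trace_g": msg["samples"]})
```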
The increased number of items being
monitored would also require significant growth in the compute and storage
capability needed to support real-time analysis across many millions of sensors.
The data captured by the platform, whether
from vehicles, properties, or people, needed to be closely coupled to the
incident response and claims processes to enable insurers to offer
policyholders a fully integrated IoT insurance proposition. The existing
platform lacked this high degree of integration.
In addition to accommodating potentially orders-of-magnitude growth in data volumes, the data management and analytics infrastructure would also have to ensure the total security of all inbound data ("data in motion"), as well as that of data within the platform residing on disk and other storage media ("data at rest"). The challenge
is compounded by the sheer volume of data from connected sensors, numbering in
the millions, distributed across a wide geography.
Defining and implementing a co-innovation approach
Due to the scale, complexity, and
criticality of the development needed to realise the NGP in a time frame that
would allow Octo Telematics to capture the emerging IoT opportunity, the
company decided to adopt a co-innovation development model.
Octo Telematics used its understanding of
the evolving insurance market to define the functional requirements of an NGP
capable of supporting a broad range of new IoT-based insurance propositions.
Key technology partners were identified for
the development of the NGP: Cloudera, Software AG, Salesforce, SAS, and SAP.
Using this co-innovation approach, Octo Telematics and its partners were able
to accelerate the design and implementation of the NGP, delivering a complex
and challenging development project in under 24 months.
The initial phase of formulating the
approach and conducting a dialogue with the partners to refine and improve the
architecture of the NGP took seven months. A jointly agreed co-innovation
roadmap was created. The implementation took 18 months of development, with an
initial prelaunch version being released to key existing Octo Telematics
clients at the end of 2016. Following the beta testing phase, the full
commercial version of the NGP was released in July 2017. All new Octo
Telematics clients are now supported on the NGP, with a migration plan in place
to move the majority of existing clients to the new platform.
Processing 11 billion additional data points daily
The resulting NGP enables Octo Telematics
to store, process, and analyse data generated by over 5.3 million drivers
totalling 175 billion driven miles, a dataset that grows by over 11 billion additional data points daily. It also allows complete flexibility in the selection of sensors and in the analysis and output of data for all insurance and other IoT propositions.
The backbone of the NGP is powered by
Cloudera’s machine learning and analytics platform. The Cloudera Enterprise suite
includes a set of tools to provide security, governance, and workload
management functionality operating within an integrated data and platform
model. The platform provides the underlying infrastructure to ingest, process,
and analyse huge volumes of structured and unstructured data, while being able
to perform analytics on both streaming and static data sources. All inbound
data, data moving between multiple clusters, as well as data stored within the
platform, is encrypted.
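As a toy illustration of the "data at rest" half of that requirement (not how Cloudera implements encryption), the snippet below encrypts a telematics record before it would be written to storage, using the widely available Python cryptography package.

```python
# Toy "data at rest" illustration: encrypt a record before writing it to disk.
# Not Cloudera's mechanism; key handling here is deliberately simplified.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, kept in a key management service
cipher = Fernet(key)

record = b'{"vehicle_id": "V123", "speed_kph": 87, "lat": 45.07, "lon": 7.69}'
on_disk = cipher.encrypt(record)        # only ciphertext ever touches storage
assert cipher.decrypt(on_disk) == record
```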
A "scale out" hardware approach was
adopted, as opposed to “scale up”. Scale-up is done by adding more resources to
the existing nodes of a system, while scaling out involves adding additional
infrastructure capacity in the form of new nodes, which can be done through the
use of commodity on-premises and cloud-based hardware. This avoids the need for investment in expensive high-performance servers, as storage and compute capacity can be added incrementally as demand grows.
The NGP also utilises Cloudera's Shared
Data Experience (SDX)
module to define and enforce unified user and role-based access and security
policies, as well as provide auditing capabilities at the application, cluster,
and environment level.
Using Apache Spark, Octo Telematics is able
to leverage the huge volumes of data, the compute power of multiple clusters,
and a resilient distributed dataset (RDD) structure to quickly implement,
train, and test machine learning models.
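A minimal sketch of that workflow, using Spark's RDD-based MLlib API; the HDFS path, feature layout, and model choice are our assumptions for illustration, not Octo's actual pipeline:

```python
# Sketch: train and test a claim-risk classifier on per-trip telematics
# features with Spark's RDD-based MLlib API (path and features hypothetical).
from pyspark import SparkContext
from pyspark.mllib.regression import LabeledPoint
from pyspark.mllib.tree import RandomForest

sc = SparkContext(appName="risk-model-sketch")

def parse(line):
    # Hypothetical CSV: label (1 = claim filed), then features such as
    # harsh-braking rate, night-driving share, average speed over limit.
    parts = [float(x) for x in line.split(",")]
    return LabeledPoint(parts[0], parts[1:])

data = sc.textFile("hdfs:///telematics/trips.csv").map(parse)
train, test = data.randomSplit([0.8, 0.2], seed=42)

model = RandomForest.trainClassifier(train, numClasses=2,
                                     categoricalFeaturesInfo={},
                                     numTrees=50, maxDepth=8, seed=42)

# Evaluate on the held-out split.
predictions = model.predict(test.map(lambda p: p.features))
labels_and_preds = test.map(lambda p: p.label).zip(predictions)
accuracy = labels_and_preds.filter(lambda lp: lp[0] == lp[1]).count() / float(test.count())
print("held-out accuracy:", accuracy)
sc.stop()
```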
These models allow Octo Telematics and its
insurance customers to better understand, model, and price risk, and can form
the core of new innovative insurance products.
Inherent in the Cloudera Enterprise
platform's distributed computing model is the ability to operate the NGP both
on-premises and across private or public cloud. The ability to flexibly use
major cloud service providers such as AWS, Google Cloud Platform, and Microsoft Azure means the NGP can support transient but compute-intensive projects, such as testing new pricing algorithms or developing risk models, on a usage-based cost basis.
The NGP has resolved the capacity issues of
the previous platform and is now continuously scalable; accommodating future growth requires only the addition of cloud-based compute and storage resources.
The enhanced functionality in areas such as
CRM and incident analytics, as well as the increased capacity of the NGP, means
that Octo Telematics can offer all insurance clients detailed, real-time crash
reconstruction capability. This will allow users to drive significant efficiency improvements in claims processing, identify potential fraud, and
enhance the customer's claims experience.
Octo Telematics' insurance clients also benefit
from the additional functionality of the NGP by being able to introduce new
types of IoT-based insurance products. For instance, one client introduced a
property insurance product that uses a home hub, developed by Octo Telematics,
that is equipped with smoke, heat, flood, and intrusion sensors. Another
insurer introduced a pet insurance product using IoT-based GPS tags worn by the
pet. Yet another is piloting the use of smart watches as part of a health and
life insurance offering.
Furthermore, the NGP is reducing time to
market for new product launches by more than 50%. The time to implement a new
UBI product has been reduced from two to three months to four weeks.
Currently, most insurers implement UBI offerings
as stand-alone projects requiring parallel core administration and claims
systems. To address the inefficiencies and complications arising from this, Octo
Telematics is working with core insurance software vendors to develop a range
of connectors that will allow direct integration between the NGP and an
insurer's core processing systems. This direct integration will significantly reduce the cost of entry and complexity for insurers wanting to offer IoT-based insurance propositions.
As of November 2017, Octo Telematics had
developed a connector allowing direct integration of the NGP with the policy
administration and claims suite of Guidewire, a software provider for the property and casualty (P&C) insurance industry.
Octo Telematics is also looking at
extending vertical-specific functionality to the NGP beyond the telematics
sector, in support of a wider spectrum of industries, such as the telecoms,
energy and utilities sectors.
The Indonesian government disclosed four potential uses of Big Data and AI to improve its e-government programmes. These two technologies, it believes, can support disaster identification and preventive action, prevent illegal activities and cyber-attacks, and increase workforce effectiveness.
The Director General of Informatics Applications, Semuel A. Pangerapan, explained several scenarios for Big Data. According to him, the government can use Big Data to improve critical event management and the quality of the response by identifying problem points through Big Data analytics. For example, agencies can be better prepared to prevent and mitigate disasters such as droughts, epidemics or massive accidents.
In addition, Big Data can also enhance the government’s ability to prevent money laundering and fraud through better surveillance to detect such illegal activities.
Furthermore, Big Data can significantly reduce the likelihood of cyber-attacks, which can originate from external parties, data leaks or insiders for a variety of reasons. An analysis of patterns and unusual activities can help in preventing or managing such cyber issues.
Big Data and analytics can contribute to workforce effectiveness by increasing monitoring. In addition, it can be used for policy design, decision-making and gaining insights.
Semuel stressed the importance of data analysis after collecting all data in the right fashion: data is only valuable if it is collected correctly and then analysed, and it will only provide benefits if processed in the right way. “In its implementation, AI helps analyse existing Big Data, providing data understanding or insight to help make decisions,” he explained.
Another advantage of AI is the ability to speed up the implementation of new services and make corrections in real time. At the evaluation stage, AI can also provide suggestions for adjustments and improvements to subsequent policies.
Currently, the government encourages the improvement of the quality of Big Data and AI innovation through the development of e-government. The Indonesian government is also open to third parties to accelerate Big Data and AI use.
E-government has made progress in recent years and received appreciation from the United Nations in 2020. The UN said that Indonesia's e-government development index rose to 88th place, from 107th in 2018. Indonesia's e-participation index has also climbed, from 92nd in 2018 to 57th in 2022.
“The two rankings show an increase in the quality of Indonesia’s e-government and the level of community activity in using e-government services,” said Semuel.
However, the government faced challenges in implementing these two technologies. Overlapping and duplicated data is one of the main problems. “Regulatory obstacles in the procurement of government Big Data infrastructure also need to be overcome. Then compliance with international standards for the national Big Data ecosystem is also still the government's homework.”
To optimise AI use, Semuel emphasised the need for a skilled workforce, regulations governing the ethics of using AI, infrastructure, and industrial and public sector adoption of AI innovations.
The government is implementing several solutions to overcome challenges. First, they have provided suitable facilities in the form of National Data Centres (NDCs) in four separate locations. The NDCs will accommodate Government Cloud and contain national data across sectors.
Optimisation of data centre utilisation needs to be supported by staff with qualified expertise. For this reason, the government is holding digital skills training on AI and Big Data through the Digital Talent Scholarship (DTS) and Digital Leadership Academy (DLA) programs.
Apart from facilities and upskilling, Indonesia is looking to develop a business ecosystem that utilises AI and Big Data. Support for this comes from the National Movement of 1000 Digital Startups, Startup Studio Indonesia (SSI) and HUB.ID.
The Cyberspace Administration of China (CAC) announced a new certification for personal information protection and its implementation. The office decided to introduce the certification to enhance information protection capabilities and to promote the rational processing of personal information.
The certification implementation follows the Personal Information Protection Certification Implementation Rules. The implementation rules clarify that personal information processors must comply with the requirements of GB/T 35273 Information Security Technology Personal Information Security Specifications. The rules outline requirements for on-site audits, the evaluation and approval of certification results, post-certification supervision and certification time limits.
Organisations engaged in personal information protection certification work need approvals to carry out activities. The regulation applies to every personal information processor that carries out private information collection, storage, use, processing, transmission, provision, disclosure, deletion and cross-border processing activities.
The State Administration for Market Regulation and the State Internet Information Office decided to implement personal information protection certification, a step that follows provisions of the Personal Information Protection Law of the People's Republic of China (PIPL). For cross-border personal information processing, the body requires compliance with the Specifications for Security Certification of Cross-Border Processing of Personal Information.
The latest versions of the standards include technical verification, on-site audit, and post-certification supervision. In addition, the certification body shall clarify the requirements for certification entrustment materials, including but not limited to the basic materials of the certification client, the certification power of attorney, and relevant certification documents.
To get certified, an organisation must submit certification entrustment materials according to the certification body’s requirements and the certification body shall give timely feedback on whether it is accepted after reviewing the materials.
The materials are then used for determining the certification plan, including the type and quantity of personal information, the scope of personal information processing activities, information on technical verification institutions, etc., before notifying the organisation seeking certification.
The CAC stated that certification is valid for three years. An organisation must submit a new certification entrustment within six months before the expiration of the validity period. The certification body shall adopt the method of post-certification supervision and reissue new certificates to those that meet the certification requirements.
Violations, cheating, and other behaviours by the certification client or personal information processor that seriously affect the implementation of the certification will result in cancellation of the certificate. Therefore, certification bodies shall adopt appropriate methods to implement post-certification supervision to ensure that certified personal information processors continue to meet certification requirements. The certification body comprehensively evaluates the post-certification surveillance conclusions and other relevant information; if the evaluation is passed, the certification certificate can be maintained.
The organisation shall actively cooperate with certification activities during the validity period of the certification certificate. If the name or registered address of the certified personal information processor, or the certification requirements, certification scope, etc., change, the certification principal shall submit a change entrustment to the certification body.
When changes happen, the certification body must evaluate the change in entrustment materials. The result will determine whether the body can approve the change. If technical verification or on-site audit is required, the body shall conduct technical and on-site audits before the change is approved.
When a certified personal information processor no longer meets the certification requirements, the certification body will promptly suspend or revoke the certification certificate. The certification principal can apply for the suspension and cancellation of the certification certificate within the validity period of the certification certificate.
In the new normal, everything is moving online, including employee workloads, leadership insights, and how the services and businesses interact with customers or clients. Organisations must undergo a digital transformation to create entirely digital processes, better experiences and streamlined operations.
Successful digital transformation allows all processes and systems to communicate with one another. Users have a single source of truth, updates occur in real-time, and data is integrated.
The transformation enables organisations to effortlessly pivot when necessary because all their systems and teams are interconnected. Everything can be done quickly and without impacting the operations – whether it is to add more users, connect new business software or begin automating tasks.
In a cloud-first strategy, organisations are not merely adding a new layer of technology when they transform. They are expanding their IT capability in an entirely new way. Data and systems are hosted in the cloud, allowing for a seamless, effective and adaptable connection of all their IT.
Increasingly, companies of all sizes are aware of the potential and power of the cloud. Due to the increased security, scalability and convenience, more businesses and services are moving their apps and data onto the cloud.
Within this suite, one capability that offers consumers a significant advantage is cloud communications. As remote and hybrid work models become the norm, cloud communication is quickly gaining importance.
The OpenGov Breakfast Insight with Indonesia’s top public sector leaders on 1 December 2022 at the Westin Jakarta provided the current information on the benefits of the most recent cloud technology that can help the nation’s public, education, financial services and healthcare sectors.
The Cloud at the Heart of the Digital Transformation
Mohit Sagar, CEO & Editor-in-Chief of OpenGov Asia, believes cloud-based strategies are being adopted and implemented by companies of all sizes to spur growth and increase profits. The cloud has fundamentally altered business communications.
Cloud transforms how people communicate, collaborate and conduct business in today’s digital world. It has sparked advancements in machine learning, the Internet of Things (IoT), devices, healthcare and autonomous vehicles.
“The cloud offers cutting-edge features and functionality that let staff members collaborate and communicate in ways – and places – they never imagined,” says Mohit. “Organisations can outsource systems management tasks like provisioning, switching, data storage, and security to cloud communications providers.”
Moreover, with remote and hybrid models, employees report higher productivity and greater satisfaction.
Nonetheless, according to Mohit, even though remote and hybrid models are becoming increasingly popular, they will not be successful if they are not based on the right technology. Cloud communications are a crucial component of any hybrid or remote work environment.
With cloud-based communication tools, staff can easily switch to working remotely, teams can keep meeting, and operations can go on as usual.
“Technology for collaboration will be more crucial than ever with employees working in different time zones and locations. Hence, teams have the resources to connect with coworkers across boundaries thanks to cloud communications,” Mohit explains.
Organisations can make the most of their resources with cloud communications, which can quicken implementation, increase flexibility, and provide limitless high-volume information exchange. Moreover, cloud communication security features guarantee adherence to data privacy laws.
The technology, protocols and best practices that safeguard cloud computing environments, cloud-based applications and cloud-stored data collectively constitute cloud security. Understanding exactly what needs to be secured and the system components that must be managed is the first step in securing cloud services.
As an overview, cloud service providers are responsible for backend development against security vulnerabilities. Clients should concentrate primarily on the proper service configuration, safe use habits, and selecting a security-conscious provider.
“Clients should also confirm that any end-user networks and hardware are properly secured,” Mohit says.
Every step taken to secure the cloud aims to facilitate data recovery in the event of data loss; guard against malicious data theft on networks and storage; prevent human error or carelessness that results in data leaks, and minimise the effects of any data or system compromise.
The transition to cloud-based computing has resulted in a significant evolution of traditional IT security. While cloud models offer greater convenience, always-on connectivity necessitates new security measures. There are a few ways in which cloud security differs from conventional IT models as a modernised cyber security solution.
According to Nathan Guy, Zoom Phone Leader, Asia Pacific, Zoom, the macro business environment has significantly changed. Businesses are under tremendous pressure to increase productivity, adapt quickly as competition heats up and be productive to keep up with the rapid pace of innovation and technological advancements.
This problem is becoming even more pressing because of economic uncertainty. Without effective communication between customers, prospects and employees, it will be impossible to address these issues.
Nathan highlighted that the workforce is also experiencing a generational shift. People prefer the option of remote employment, and they are asking for the cutting-edge equipment and communication systems they need to do their jobs.
With every new tool and app that is made available, communication becomes more complex and confusing. Employees, clients, and potential customers are just a few stakeholders with preferences and expectations about how, when, and where they conduct business.
“Due to this, many businesses choose their battles carefully when it comes to facilitating communication,” says Nathan.
Among the routes they take are sticking with currently used systems deemed adequate, relying on embedded communication tools included with other software packages, and deploying multiple solutions depending on the situation. “These strategies are meant to provide the organisation with fundamental communication.”
Such methods allow for some flexibility but also change the environment for prospects, employees and customers. People are compelled to alternate between various solutions based on their needs.
Some consumers “separate” from a favourite brand after just one disappointing interaction. Today's harsh reality is that communication is a critical-path activity: if it fails, the business fails with it.
Nathan believes that organisations must go beyond essential communication to universal communication. Creating intuitive connections to all parties – employees, customers, and investors – regardless of location, device, or business activity – will have a tremendous advantage in this uncertain business environment.
“You do this by combining the connection needs of the individual and organisation by delivering a consistent and quality experience for all participants, making human connection effortless, and enabling rapid innovation to maintain relevance,” says Nathan.
These steps could result in:
- Meeting both the organisations’ core business needs and the demands of their customers;
- Refocusing internal resources away from administering communications and towards new services and capabilities; and
- Improving the agility and the perceived value both in the company and the market
An organisation's reputation is directly linked to the quality of its communication services. Employees, clients, and customers can now work from anywhere, and people returning to the office do not want an experience that falls short of the home-office setup to which they have grown accustomed.
Expectations have increased; a session that fails due to dropped participants or subpar audio/video is unacceptable and embarrassing. Organisations must adapt to this new hybrid environment and guarantee that everyone receives high-quality service regardless of circumstance or location.
“When communications are disrupted in today’s world, business transactions become impossible,” claims Nathan. “Organisations can eliminate a work-limiting unpredictability risk by doing this. They provide a controlled experience by enabling the staff to work without concern about the underlying technology.”
By using a top-notch infrastructure specially built to prevent failures, Zoom will protect organisations from communications breakdowns. Organisations could troubleshoot the underlying cause of environmental problems and take preventative measures. This allows the workforce to concentrate on their work without unneeded interruptions or uncertainty. Hence, employees will have confidence that the communication system they provide will work as expected.
Differences in network performance and bandwidth can seriously impair audio and video quality and lead to intermittent problems, preventing some users from participating fully. Even with severe packet loss, organisations can use Zoom to deliver a productive meeting experience. This makes it possible to eliminate local network and infrastructure variability, which is crucial when doing business internationally.
More complexity is being added to communications. “Now you have workers returning to the office, frequently in a hotel setting, as well as those travelling or working remotely,” says Nathan.
Three main working contexts have resulted: remote, office and mobile. Unfortunately, all too frequently, people are forced to juggle a patchwork of disjointed point solutions created during the pandemic: a personal cellphone, a videoconferencing option for small meetings and another tool for large events.
Nathan believes that employees and clients are forced to learn a different interface for each tool. Even if an organisation sticks with a single vendor, many vendors have expanded through acquisitions, leading to various products with no shared characteristics.
“There’s no doubt that communication platforms are a big part of how hybrid teams work,” Nathan asserts. “A modern communications platform like Zoom could help boost productivity, add to what can be done, and show how engaged employees are.”
Fireside Chat: How to Prepare for the Transition to the “Cloud Culture”
According to Deddy Kartika Utama, Head of Information Security, Ministry of Home Affairs (Kemendagri), policies regarding political and general governance and regional autonomy are developed, determined and implemented by the Ministry of Home Affairs.
The Ministry also plays a role in establishing regional and village administration, governing issues, regional finance, demographics and civil records.
Given the number of parties involved and the nature of the hybrid organisation, including the Ministry, maintaining consistency may prove difficult. Because of this, compelling and trustworthy means of communication are crucial.
Cloud communications, Deddy emphasised, continue to be the preferred method of meeting the growing demand for efficient organisational communications, considering the advent of the hybrid workplace. With cloud computing and communications, organisations can quickly expand or contract to meet fluctuating demand.
In the public sector, by using internet-based connectivity to reduce lag time and unreliable connections, organisations can communicate with their team and customers through various channels, including email, voice calls, chat and video.
Through the advancements in IT, organisations now have access to a flexible, instant, scalable, stable, and conveniently located environment. Organisations that switch to cloud-based communication technology can take advantage of full cloud communication’s mobility, scalability, security, reliability, and cost-effectiveness.
The rapid development of cloud computing services and collaboration technologies has apparent benefits for remote and hybrid workforces. It enables teams to work together and achieve their shared goals even when they are not physically present in the same office.
“Using a cloud collaboration strategy, coworkers can work together on documents stored in the cloud while having access to the same files and making changes to them in real-time,” Deddy explains. “One method for cutting costs while maximising organisational resources despite growing communication capabilities and reach is to concentrate on the quality of the technology.”
By utilising the cloud, businesses have found cheaper alternatives while ensuring that their customers can access their data and systems from any location at any time. Transitioning from traditional to cloud office culture is exciting and promising. To protect the organisations and their operations, a solid security foundation must first be established.
According to Deddy, the potential of cloud computing is becoming increasingly apparent to various organisations, and it is also growing. “Organisations are already transitioning from the traditional office culture to the cloud culture, and doing so is profitable. They can save money and space by switching to cloud technology.”
Nathan emphasised the significance of cloud security, even though most organisations already utilise cloud computing in some form. “Organisations are still hesitant to move more data and applications to the cloud due to security, governance, and compliance concerns when storing their content in the cloud.”
By partnering with Zoom, organisations can simplify human connection while building in security. They can capitalise on the habits and competencies individuals have developed over the past two years, while ensuring consistency across multiple use cases.
“By partnering with Zoom, businesses will be able to maintain their relevance through rapid innovation. They have access to a constant stream of new capabilities that reflect actual user requirements,” Nathan claims.
According to Mohit, a critical component of cloud security is the protection of data and business content such as customer orders, secret design documents and financial records, among others.
Preventing leaks and data theft is critical for maintaining customer trust and safeguarding assets that contribute to competitive advantage. “The ability of cloud security to protect your data and assets makes it critical for any organisations that are transitioning to the cloud.”
Development partners can assist organisations in meeting a broader range of customer needs, resulting in increased market reach. As a result, when developing cloud applications, make sure to include platform or integration capabilities as well as a partner strategy.
“Your cloud partner strategy should be based on business potential, engineering capability, and platform marketing. A balanced strategy will enable a larger partner ecosystem, more comprehensive customer solutions, and increased revenue potential,” Mohit concludes.
Caltech engineers collaborated with the University of Southampton in England to design an ultrahigh-speed data transfer chip. The chip integrates both an electronics chip and a photonics chip which uses light to transfer data. It took four years to complete, from the initial idea to the final test in the lab.
“As the world becomes increasingly connected, and every device generates more data, it is exciting to show that we can achieve such high data rates while burning a fraction of power compared to the traditional techniques. We had to optimise the entire system all at the same time, which enabled achieving a superior power efficiency,” said Azita Emami, the Andrew and Peggy Cherng Professor of Electrical Engineering and Medical Engineering, Executive Officer for Electrical Engineering and senior author of the paper.
The research paper is titled “A 100Gb/s PAM4 Optical Transmitter in A 3D-Integrated SiPh-CMOS Platform Using Segmented MOSCAP Modulators.” Rockley Photonics and the U.K. Engineering and Physical Sciences Research Council funded this research.
The need for high processing power and data transmission inevitably creates excess heat, and heat is the enemy of the speed and the amount of data a computing device can manage. This applies not just to personal computers and laptops but also to data centres.
While a laptop may heat up when in use, servers in data centres also heat up as they work, but at a much grander scale. Managing heat in the data centre is therefore essential: the less heat, the more computing power is available and the greater the volume of information that can be handled.
Hence, engineers sought a way to increase processing speed while keeping heat low. The solution was to design and co-optimise an electronics chip and a photonics chip together: the design is innovative because it integrates an electronic circuit, essential for data processing, with a photonics chip, the most efficient medium for data transmission.
The Caltech/Southampton integrated chip can transmit 100 gigabits of data per second. Moreover, it generates minimal heat, consuming just 2.4 picojoules per transmitted bit, an electro-optical power efficiency 3.6 times better than current technology.
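A quick back-of-envelope check of those figures (our arithmetic, not the paper's): energy per bit multiplied by bit rate gives the transmitter's power draw.

```python
# At 2.4 pJ per bit, a 100 Gb/s transmitter dissipates about a quarter watt.
bits_per_second = 100e9     # 100 Gb/s
joules_per_bit = 2.4e-12    # 2.4 pJ per transmitted bit
power_watts = bits_per_second * joules_per_bit
print(f"{power_watts:.2f} W")  # -> 0.24 W
```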
Handling Next-level Computing
In the future, data centres will manage very high volumes of data compared to today. The new design integrated chip will answer a continuous demand for increasing data communication speed in data centres and high-performance computers.
“As the computing power of the chips scale, the communication speed can become the bottleneck, especially under stringent energy constraints,” Emami explained.
Data-demanding tasks, such as a video call, streaming a movie, or playing an online video game, require high-volume data transmission and processing power in the data centre.
“There are more than 2,700 data centres in the U.S. and more than 8,000 worldwide, with towers of servers stacked on top of each other to manage the load of thousands of terabytes of data going in and out every second,” says Caltech graduate student Arian Hashemi Talkhooncheh (MS ’16), lead author of the paper describing the two-chip innovation, published in the IEEE Journal of Solid-State Circuits.
The Indian Space Research Organisation’s (ISRO) Polar Satellite Launch Vehicle (PSLV) has launched nine satellites, including eight nanosatellites, into space from the first launch pad at the Satish Dhawan Space Centre in Andhra Pradesh.
The 44-metre-long rocket's primary payload is the Earth Observation Satellite-6 (EOS-6) or Oceansat-3, a third-generation satellite to monitor oceans. It is a follow-up to Oceansat-1 (IRS-P4) and Oceansat-2, launched in 1999 and 2009, respectively. Oceansat-3 will provide ocean colour, sea surface temperature, and wind vector data for oceanography, climatology, and meteorological applications.
Oceansat-3 was placed in polar orbit at a height of about 740 kilometres above sea level. It weighs approximately 1,100 kilogrammes, only slightly heavier than Oceansat-1, yet for the first time in this series it houses three ocean-observing sensors: an Ocean Colour Monitor (OCM-3), a Sea Surface Temperature Monitor (SSTM), and a Ku-band scatterometer (SCAT-3). There is also an ARGOS payload, a press release mentioned.
The OCM-3, with a high signal-to-noise ratio, is expected to improve accuracy in the daily monitoring of phytoplankton. This has a wide range of operational and research applications, including fishery resource management, ocean carbon uptake, harmful algal bloom alerts, and climate studies. The SSTM will provide ocean surface temperature, a critical parameter for forecasts ranging from fish aggregation to cyclone genesis and movement. Temperature is also key to monitoring the health of coral reefs and, if needed, issuing coral bleaching alerts. The Ku-band pencil-beam scatterometer will provide high-resolution wind vectors (speed and direction) at the ocean surface, useful for seafarers, including fishermen and shipping companies. Temperature and wind data are also particularly important inputs for ocean and weather models to improve their forecast accuracy.
ARGOS is a communication payload jointly developed with France and it is used for low-power (energy-efficient) communications including marine robotic floats (Argo floats), fish-tags, drifters, and distress alert devices valuable in search and rescue operations.
The Minister of State (Independent Charge) for Science and Technology, Jitendra Singh, stated that ISRO will continue to maintain the orbit of the satellite and its standard procedures for data reception and archiving. Major operational users of this satellite include Ministry of Earth Sciences (MoES) institutions such as the Indian National Centre for Ocean Information Services (INCOIS) and the National Centre for Medium Range Weather Forecasting (NCMRWF).
INCOIS has also established a state-of-the-art satellite data reception ground station within its campus with technical support from the National Remote Sensing Centre (ISRO-NRSC). Singh asserted that ocean observations such as this will serve as a solid foundation for India’s blue economy and polar region policies. A representative from MoES noted that the launch of Oceansat-3 is significant as it is the first major ocean satellite launch from India since the initiation of the UN Decade of Ocean Science for Sustainable Development (UNDOSSD, 2021-2030).
The Indian Space Research Organisation is the national space agency of India, headquartered in Bengaluru. It operates under the Department of Space, which is overseen by the country’s Prime Minister.
Astronomers from the California Institute of Technology (Caltech) have fully automated the classification of 1,000 supernovae using a machine-learning (ML) algorithm. The algorithm analysed data collected by the Zwicky Transient Facility (ZTF), a sky-survey instrument located at Caltech's Palomar Observatory.
“We needed a helping hand, and we knew that once we trained our computers to do the job, they would take a big load off our backs,” says Christoffer Fremling, a staff astronomer at Caltech and the mastermind behind the new algorithm tagged as SNIascore.
A year and a half after SNIascore classified its first supernova in April 2021, the team is approaching the milestone of 1,000 supernovae. Every night, ZTF scans the sky for changes known as transient events, covering everything from moving asteroids to stars recently devoured by black holes to exploding stars known as supernovae.
ZTF notifies astronomers worldwide of these transient events by sending out hundreds of thousands of alerts each night. Other telescopes are then used by astronomers to monitor and learn more about the nature of the shifting objects. Thousands of supernovae have so far been found thanks to ZTF data.
Members of the ZTF team cannot organise all the data on their own due to the constant flow of data that comes in every night. According to Matthew Graham, project scientist for ZTF and research professor of astronomy at Caltech, “the traditional notion of an astronomer sitting at the observatory and sieving through telescope images carries a lot of romanticism but is drifting away from reality.”
Instead, to help with the searches, the team has created ML algorithms. SNIascore was created to categorise potential supernovae. There are two main categories of supernovae: Type I and Type II. In contrast to Type II supernovae, Type I supernovae are devoid of hydrogen.
When material from a companion star flows onto a white dwarf star, causing a thermonuclear explosion, a Type I supernova is produced. When a massive star collapses due to its own gravity, a Type II supernova happens. Type Ia supernovae, or the “standard candles” in the sky, can be classified by SNIascore. These are dying stars that explode with a steady-state thermonuclear blast.
Astronomers can gauge the universe’s expansion rate thanks to Type Ia supernovae. Fremling and colleagues are currently expanding the algorithm’s capabilities to classify additional types of supernovae soon.
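As a toy illustration of that Type I/Type II distinction (far simpler than SNIascore, which uses machine learning on real spectra), the sketch below flags a spectrum as Type II if it shows a strong feature near the hydrogen H-alpha line. The thresholds and function names are hypothetical.

```python
# Toy spectral check: Type II supernovae show hydrogen lines; Type I do not.
# Real classification (e.g. SNIascore) is ML-based; this is only illustrative.
import numpy as np

H_ALPHA = 6563.0  # rest-frame wavelength of the H-alpha line, in angstroms

def has_hydrogen_feature(wavelengths: np.ndarray, flux: np.ndarray,
                         window: float = 100.0, threshold: float = 0.2) -> bool:
    """True if flux near H-alpha deviates strongly from the overall continuum."""
    near = np.abs(wavelengths - H_ALPHA) < window   # assumes grid covers H-alpha
    continuum = np.median(flux)
    deviation = np.max(np.abs(flux[near] - continuum)) / continuum
    return deviation > threshold

def toy_classify(wavelengths: np.ndarray, flux: np.ndarray) -> str:
    return "Type II" if has_hydrogen_feature(wavelengths, flux) else "Type I"
```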
Every night, after ZTF has recorded sky flashes that may be supernovae, it sends the data to the SEDM (Spectral Energy Distribution Machine) spectrograph at Palomar, housed in a dome a short distance away.
To determine which supernovae are likely Type Ias, SNIascore collaborates with SEDM. As a result, the ZTF team is working quickly to compile a more trustworthy data set of supernovae that will allow astronomers to conduct additional research and, ultimately, learn more about the physics of the potent stellar explosions.
“SNIascore is incredibly precise. We have observed the performance of the algorithm in the real world after 1,000 supernovae,” says Fremling. Since the initial launch in April 2021, the team has found no clearly misclassified events and is now planning to implement the same algorithm with other observing facilities.
According to Ashish Mahabal, who oversees ZTF’s machine learning initiatives and is the centre’s lead computational and data scientist at Caltech, their work demonstrates how ML applications are maturing in near real-time astronomy.
SNIascore was created as part of ZTF's Bright Transient Survey (BTS), currently the most comprehensive supernova survey available to the astronomical community. The entire BTS dataset contains nearly 7,000 supernovae, 90 per cent of which were discovered and classified by ZTF, while the remaining 10 per cent were contributed by other groups and facilities.
The Victoria University of Wellington’s division of Science, Health, Engineering, Architecture, and Design Innovation (SHEADI) will inaugurate a Centre of Data Science and Artificial Intelligence in the first half of 2023.
According to a statement from the University, the centre will offer expertise in modelling and statistical learning; evolutionary and multi-objective learning; deep learning and transfer learning; image, text, signal, and language processing; scheduling and combinatorial optimisation; and interpretable AI/ML.
These technological themes will be applied across a wide range of areas, including primary industry, climate change and the environment; health, biology and medical outcomes; security, energy and high-value manufacturing; and social, public policy and ethics applications. On top of traditional research, the centre will also establish a pipeline of scholarships and internships for Maori students, train early-career researchers, and focus on industry, intellectual property, and commercialisation.
The centre will build on the current success and international leadership in this space at the University, the Pro Vice-Chancellor of the division, Ehsan Mesbahi, stated. The institute is continuing to grow its national and international partnerships to create local and global value. The centre will provide a distinctive identity for the growing excellence and innovation in data science and AI research at the University, capabilities which domestic and global partners are increasingly demanding across a vast array of application domains.
In May, the University announced it would offer the first undergraduate major in Artificial Intelligence in the country. It provides students with knowledge of AI concepts, techniques, and tools. They learn how to apply that knowledge to solve problems, combined with programming skills that will enable them to build software tools incorporating AI technology that will help shape the future.
Students studying AI at the University are taught by academics from its internationally renowned AI/ML research group, which is one of the largest in the southern hemisphere. The major is designed to open doors for graduates to opportunities nationally and around the world. There has been an increase in the adoption of AI technologies globally, and a growing demand for people who can apply AI techniques to address a wide range of problems, which the University aims to address.
After completing their degree, graduates will have a wide variety of career options, such as AI scientist, business consultant, AI architect, data analyst, machine learning engineer, and robotic scientist among others. They will also have the option to further their study through the University’s Master of Artificial Intelligence.
OpenGov Asia reported earlier that New Zealand’s Education Technology (EdTech) is set to become one of the country’s key industries. Worth NZ$ 173.6 million in 2020, EdTech software is poised to grow to NZ$ 319.6 million by 2025. At the heart of the digital transformation of education technology has been the pandemic. COVID-19 is seen as the driving force behind the digital transformation of learning, permanently changing the way education is consumed and delivered — right from preschool through post-tertiary education and lifelong learning. The global EdTech market size was valued at US$ 254.8 billion in 2021. Experts believe the market will reach US$ 605.4 billion by 2027.