We’ve entered the era of the information economy where data has become the most critical asset of every organization. Data-driven strategies are now a competitive imperative to succeed in every industry. To support business objectives such as revenue growth, profitability, and customer satisfaction, organizations are increasingly reliant on data to make decisions. Data-driven decision making is at the heart of your digital transformation initiatives.
But in order to provide the business with the data it needs to fuel digital transformation, organizations must solve two problems at the same time.
The data must be timely, because digital transformation is all about speed and accelerating time to market, whether that’s providing real-time answers for your business teams or delivering personalized customer experiences. However, most companies are behind the curve when it comes to delivering technology initiatives quickly.
But while speed is critical, it’s not enough. For data to enable effective decision-making and deliver remarkable customer experiences, organizations need data they can trust. This is also a major challenge for organizations. Being able to trust your data is about remaining on the right side of regulation and customer confidence, and it’s about having the right people using the right data to make the right decisions.
The Indonesian government has disclosed four potential uses of Big Data and AI to improve its e-government programmes. Officials believe the two technologies can support disaster identification and preventive action, prevent illegal activities, guard against cyber-attacks, and increase workforce effectiveness.
The Director General of Informatics Applications, Semuel A. Pangerapan, explained several scenarios for Big Data. According to him, the government can use Big Data to improve critical event management and the quality of the response by identifying problem points through Big Data Analytics. For example, agencies can be better prepared to prevent and mitigate disasters such as droughts, epidemics, or large-scale accidents.
In addition, Big Data can also enhance the government’s ability to prevent money laundering and fraud through better surveillance to detect such illegal activities.
Furthermore, Big Data can significantly reduce the likelihood of cyber-attacks, which may come from external parties, data leaks, or internal actors for a variety of reasons. Analysing patterns and unusual activity can help prevent or manage such incidents.
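The pattern-analysis idea can be made concrete with a small sketch. The example below is purely illustrative and not any government system: it trains scikit-learn’s IsolationForest on synthetic activity records and flags an out-of-pattern event.

```python
# Minimal sketch of anomaly detection over activity logs (illustrative only).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Features per event: [requests_per_minute, megabytes_transferred, hour_of_day]
normal = np.column_stack([
    rng.normal(20, 5, 1000),   # typical request rates
    rng.normal(3, 1, 1000),    # typical transfer sizes
    rng.normal(13, 3, 1000),   # activity clustered around office hours
])
suspicious = np.array([[400.0, 250.0, 3.0]])  # a large burst at 3 a.m.

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# predict() returns -1 for anomalies and 1 for inliers
print(model.predict(suspicious))  # expected output: [-1]
```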
Big Data and analytics can contribute to workforce effectiveness by increasing monitoring. In addition, it can be used for policy design, decision-making and gaining insights.
Semuel stressed the importance of analysing data once it has been collected in the right fashion: data is only valuable if it is gathered correctly and then processed in the right way. “In its implementation, AI helps analyse existing Big Data, providing data understanding or insight to help make decisions,” he explained.
Another advantage of AI is the ability to speed up new implementation services and corrections in real-time. At the evaluation stage, AI can also provide suggestions for adjustments and improvements to subsequent policies.
Currently, the government encourages improvements in the quality of Big Data and AI innovation through the development of e-government. The Indonesian government is also open to third parties to accelerate Big Data and AI use.
E-government has made progress in recent years and received appreciation from the United Nations in 2020. The UN reported that Indonesia’s e-government development index rose to rank 88, up from rank 107 in 2018. Indonesia’s e-participation index also climbed from rank 92 in 2018 to rank 57 in 2020.
“The two rankings show an increase in the quality of Indonesia’s e-government and the level of community activity in using e-government services,” said Semuel.
However, the government faced challenges in implementing these two technologies. Overlapping and duplicated data is one of the main problems. “Regulatory obstacles in the procurement of government Big Data infrastructure also need to be overcome,” he said. “Then compliance with international standards for the national Big Data ecosystem is also still the government’s homework.”
To optimise AI use, Semuel emphasised the need for a skilled workforce, regulations governing the ethics of using AI, infrastructure, and industrial and public sector adoption of AI innovations.
The government is implementing several solutions to overcome challenges. First, they have provided suitable facilities in the form of National Data Centres (NDCs) in four separate locations. The NDCs will accommodate Government Cloud and contain national data across sectors.
Optimisation of data centre utilisation needs to be supported by staff with qualified expertise. For this reason, the government is holding digital skills training on AI and Big Data through the Digital Talent Scholarship (DTS) and Digital Leadership Academy (DLA) programs.
Apart from facilities and upskilling, Indonesia is looking to develop a business ecosystem that utilises AI and Big Data. Support for this comes from the National Movement of 1000 Digital Startups, Startup Studio Indonesia (SSI) and HUB.ID.
Caltech engineers collaborated with the University of Southampton in England to design an ultrahigh-speed data transfer chip. The design integrates an electronics chip with a photonics chip, which uses light to transfer data. It took four years to complete, from the initial idea to the final test in the lab.
“As the world becomes increasingly connected, and every device generates more data, it is exciting to show that we can achieve such high data rates while burning a fraction of power compared to the traditional techniques. We had to optimise the entire system all at the same time, which enabled achieving a superior power efficiency,” said Azita Emami, the Andrew and Peggy Cherng Professor of Electrical Engineering and Medical Engineering, Executive Officer for Electrical Engineering and senior author of the paper.
The research paper is titled “A 100Gb/s PAM4 Optical Transmitter in A 3D-Integrated SiPh-CMOS Platform Using Segmented MOSCAP Modulators.” Rockley Photonics and the U.K. Engineering and Physical Sciences Research Council funded this research.
High processing power and data transmission inevitably generate excess heat, and heat is the enemy of speed and of the amount of data a computing device can manage. This applies not just to personal computers and laptops but also to data centres.
While a laptop may heat up when in use, servers in data centres heat up as they work, but at a much grander scale. Managing heat in the data centre is therefore essential: the less heat, the more computing power can be sustained and the greater the volume of information handled.
Hence, engineers sought a way to increase processing speed while keeping heat low. The solution was to design and co-optimise an electronics chip and a photonics chip. The result is innovative because it pairs an electronic circuit, essential for data processing, with a photonics chip, the most efficient component for data transmission.
The Caltech/Southampton integrated chip can transmit 100 gigabits of data per second while generating minimal heat, burning just 2.4 picojoules per transmitted bit. That is a 3.6-fold improvement in electro-optical power efficiency over current technology.
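Those figures imply a tiny power budget for the link itself. The back-of-the-envelope check below uses only the numbers quoted above.

```python
# Back-of-the-envelope power check from the quoted figures.
data_rate = 100e9          # bits per second (100 Gb/s)
energy_per_bit = 2.4e-12   # joules per bit (2.4 pJ/bit)

power = data_rate * energy_per_bit
print(f"Transmitter power: {power:.2f} W")  # 0.24 W

# A link 3.6x less efficient burns proportionally more power:
print(f"Comparable conventional link: {power * 3.6:.3f} W")  # 0.864 W
```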
Handling Next-level Computing
In the future, data centres will manage far higher volumes of data than today. The newly designed integrated chip answers the continuously growing demand for data communication speed in data centres and high-performance computers.
“As the computing power of the chips scale, the communication speed can become the bottleneck, especially under stringent energy constraints,” Emami explained.
Data-demanding tasks, such as a video call, streaming a movie, or playing an online video game, require high transmission and processing power in the data centre.
“There are more than 2,700 data centres in the U.S. and more than 8,000 worldwide, with towers of servers stacked on top of each other to manage the load of thousands of terabytes of data going in and out every second,” says Caltech graduate student Arian Hashemi Talkhooncheh (MS ’16), lead author of the paper describing the two-chip innovation, published in the IEEE Journal of Solid-State Circuits.
The Indian Space Research Organisation’s (ISRO) Polar Satellite Launch Vehicle (PSLV) has launched nine satellites, including eight nanosatellites, into space from the first launch pad at the Satish Dhawan Space Centre in Andhra Pradesh.
The 44-metre-long rocket’s primary payload is the Earth Observation Satellite-6 (EOS-6), or Oceansat-3, a third-generation ocean-monitoring satellite. It is a follow-up to Oceansat-1 (IRS-P4) and Oceansat-2, launched in 1999 and 2009, respectively. Oceansat-3 will provide ocean colour, sea surface temperature, and wind vector data for oceanography, climatology, and meteorological applications.
Oceansat-3 was placed in a polar orbit at a height of about 740 kilometres above sea level. It weighs approximately 1,100 kilogrammes, only slightly more than Oceansat-1, yet for the first time in the series it houses three ocean-observing sensors: an Ocean Colour Monitor (OCM-3), a Sea Surface Temperature Monitor (SSTM), and a Ku-band scatterometer (SCAT-3). There is also an ARGOS payload, a press release mentioned.
The OCM-3, with a high signal-to-noise ratio, is expected to improve accuracy in the daily monitoring of phytoplankton. This has a wide range of operational and research applications, including fishery resource management, ocean carbon uptake, harmful algal bloom alerts, and climate studies. The SSTM will provide ocean surface temperature, a critical parameter for forecasts ranging from fish aggregation to cyclone genesis and movement. Temperature is also key to monitoring the health of coral reefs and, if needed, issuing coral bleaching alerts. The Ku-band pencil-beam scatterometer will provide high-resolution wind vectors (speed and direction) at the ocean surface, useful for seafarers, including fishermen and shipping companies. Temperature and wind data are also particularly important for improving the forecast accuracy of ocean and weather models.
ARGOS is a communication payload jointly developed with France. It is used for low-power (energy-efficient) communications with devices including marine robotic floats (Argo floats), fish tags, drifters, and distress-alert devices valuable in search-and-rescue operations.
The Minister of State (Independent Charge) for Science and Technology, Jitendra Singh, stated that ISRO will continue to maintain the orbit of the satellite and its standard procedures for data reception and archiving. Major operational users of this satellite include Ministry of Earth Sciences (MoES) institutions such as the Indian National Centre for Ocean Information Services (INCOIS) and the National Centre for Medium Range Weather Forecasting (NCMRWF).
INCOIS has also established a state-of-the-art satellite data reception ground station within its campus with technical support from the National Remote Sensing Centre (ISRO-NRSC). Singh asserted that ocean observations such as this will serve as a solid foundation for India’s blue economy and polar region policies. A representative from MoES noted that the launch of Oceansat-3 is significant as it is the first major ocean satellite launch from India since the initiation of the UN Decade of Ocean Science for Sustainable Development (UNDOSSD, 2021-2030).
The Indian Space Research Organisation is the national space agency of India, headquartered in Bengaluru. It operates under the Department of Space, which is overseen by the country’s Prime Minister.
Astronomers from the California Institute of Technology (Caltech) have fully automated the classification of 1,000 supernovae using a machine-learning (ML) algorithm. The algorithm analyses data collected by the Zwicky Transient Facility (ZTF), a sky-survey instrument located at Caltech’s Palomar Observatory.
“We needed a helping hand, and we knew that once we trained our computers to do the job, they would take a big load off our backs,” says Christoffer Fremling, a staff astronomer at Caltech and the mastermind behind the new algorithm tagged as SNIascore.
A year and a half after SNIascore classified its first supernova in April 2021, the team is approaching the milestone of 1,000 classified supernovae. Every night, ZTF scans the sky for changes known as transient events. These cover everything from moving asteroids to stars newly devoured by black holes to exploding stars known as supernovae.
ZTF notifies astronomers worldwide of these transient events by sending out hundreds of thousands of alerts each night. Astronomers then use other telescopes to follow up and learn more about the nature of the changing objects. Thousands of supernovae have so far been found thanks to ZTF data.
Members of the ZTF team cannot sort through all the data on their own, given the constant flow arriving every night. According to Matthew Graham, project scientist for ZTF and research professor of astronomy at Caltech, “the traditional notion of an astronomer sitting at the observatory and sieving through telescope images carries a lot of romanticism but is drifting away from reality.”
Instead, to help with the searches, the team has created ML algorithms. SNIascore was created to categorise potential supernovae. There are two main categories of supernovae: Type I and Type II. In contrast to Type II supernovae, Type I supernovae are devoid of hydrogen.
When material from a companion star flows onto a white dwarf star, causing a thermonuclear explosion, a Type I supernova is produced. When a massive star collapses due to its own gravity, a Type II supernova happens. Type Ia supernovae, or the “standard candles” in the sky, can be classified by SNIascore. These are dying stars that explode with a steady-state thermonuclear blast.
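SNIascore itself is a deep-learning model trained on SEDM spectra; the sketch below is not that algorithm. It is only a toy illustration of the general idea of scoring candidate spectra with a supervised classifier, using synthetic stand-ins for binned spectral fluxes and scikit-learn’s logistic regression.

```python
# Toy illustration of supervised supernova classification (not SNIascore).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_bins = 32  # pretend each spectrum is reduced to 32 flux bins

# Synthetic "spectra": a Type Ia-like template and a non-Ia template plus noise
ia_template = np.sin(np.linspace(0, 3, n_bins))
other_template = np.cos(np.linspace(0, 3, n_bins))
X = np.vstack([ia_template + rng.normal(0, 0.3, (200, n_bins)),
               other_template + rng.normal(0, 0.3, (200, n_bins))])
y = np.array([1] * 200 + [0] * 200)  # 1 = Type Ia, 0 = other

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new candidate; a high probability flags a likely Type Ia
candidate = ia_template + rng.normal(0, 0.3, n_bins)
print(f"P(Type Ia) = {clf.predict_proba([candidate])[0, 1]:.2f}")
```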
Astronomers can gauge the universe’s expansion rate thanks to Type Ia supernovae. Fremling and colleagues are currently expanding the algorithm’s capabilities to classify additional types of supernovae soon.
Every night, after ZTF has recorded sky flashes that may be supernovae, it sends the data to the Spectral Energy Distribution Machine (SEDM), a spectrograph housed in a dome a short distance away at Palomar.
SNIascore works with SEDM to determine which supernovae are likely Type Ia. As a result, the ZTF team can quickly compile a more trustworthy dataset of supernovae that will allow astronomers to conduct additional research and, ultimately, learn more about the physics of these potent stellar explosions.
“SNIascore is incredibly precise. We have observed the performance of the algorithm in the real world after 1,000 supernovae,” says Fremling. Since the initial launch in April 2021, the team has found no clearly misclassified events and is now planning to implement the same algorithm with other observing facilities.
According to Ashish Mahabal, who oversees ZTF’s machine learning initiatives and is the centre’s lead computational and data scientist at Caltech, their work demonstrates how ML applications are maturing in near real-time astronomy.
SNIascore was created as part of ZTF’s Bright Transient Survey (BTS), currently the most comprehensive supernova survey available to the astronomical community. The full BTS dataset contains nearly 7,000 supernovae, 90 per cent of which were discovered and classified by ZTF, while the remaining 10 per cent were contributed by other groups and facilities.
The Victoria University of Wellington’s division of Science, Health, Engineering, Architecture, and Design Innovation (SHEADI) will inaugurate a Centre of Data Science and Artificial Intelligence in the first half of 2023.
According to a statement from the University, the centre will offer expertise in modelling and statistical learning; evolutionary and multi-objective learning; deep learning and transfer learning; image, text, signal, and language processing; scheduling and combinatorial optimisation; and interpretable AI/ML.
These technological themes will be applied across a wide range of areas, including primary industry, climate change and environment; health, biology, and medical outcomes; security, energy, and high-value manufacturing; and social, public policy, and ethics applications. On top of traditional research, the centre will also establish a pipeline of scholarships and internships for Māori students, train early career researchers, and focus on industry, intellectual property, and commercialisation.
The centre will build on the current success and international leadership in this space at the University, the Pro Vice-Chancellor of the division, Ehsan Mesbahi, stated. The institute is continuing to grow its national and international partnerships to create local and global value. The centre will provide a distinctive identity for the growing excellence and innovation in data science and AI research at the University, capabilities which domestic and global partners are increasingly demanding across a vast array of application domains.
In May, the University announced it would offer the first undergraduate major in Artificial Intelligence in the country. It provides students with knowledge of AI concepts, techniques, and tools. They learn how to apply that knowledge to solve problems, combined with programming skills that will enable them to build software tools incorporating AI technology that will help shape the future.
Students studying AI at the University are taught by academics from its internationally renowned AI/ML research group, which is one of the largest in the southern hemisphere. The major is designed to open doors for graduates to opportunities nationally and around the world. There has been an increase in the adoption of AI technologies globally, and a growing demand for people who can apply AI techniques to address a wide range of problems, which the University aims to address.
After completing their degree, graduates will have a wide variety of career options, such as AI scientist, business consultant, AI architect, data analyst, machine learning engineer, and robotic scientist among others. They will also have the option to further their study through the University’s Master of Artificial Intelligence.
OpenGov Asia reported earlier that New Zealand’s Education Technology (EdTech) is set to become one of the country’s key industries. Worth NZ$ 173.6 million in 2020, EdTech software is poised to grow to NZ$ 319.6 million by 2025. At the heart of the digital transformation of education technology has been the pandemic. COVID-19 is seen as the driving force behind the digital transformation of learning, permanently changing the way education is consumed and delivered — right from preschool through post-tertiary education and lifelong learning. The global EdTech market size was valued at US$ 254.8 billion in 2021. Experts believe the market will reach US$ 605.4 billion by 2027.
With the introduction of its Kooha Version 2.0 during the recently held 2022 National Science and Technology Week celebration, the Department of Science and Technology-Advanced Science and Technology Institute (DOST-ASTI) offered photo enthusiasts helpful tips on interactive smartphone photography.
Kooha is a photo-sharing app derived from the Filipino word “kuha,” which means “to take.” It capitalises on the Philippines’ status as “the selfie capital of the world,” with thousands of photographs shared on various social media platforms every day.
With the help of the camera app Kooha, users can take pictures that go beyond simple snapshots. Mobile devices carry multiple embedded sensors; Kooha reads these sensors as users snap pictures and embeds the readings in the image.
Users can quickly see where a photo was shot, the background noise level when they shoot a selfie, the network provider’s signal strength, the device battery level, camera settings, environment-sensor data, motion-sensor data, and more. All the photographs captured by the app are shared on the Kooha Community. Users’ photos become more than just images when they post them to the community; they become contributions.
When the sensor data from the images is combined with the large pool of sensor data from other users, the data becomes societally important. The data can assist data scientists in generating insights and fresh knowledge that can be used by decision-makers across the country. Kooha is a free app that can be downloaded from Google Play.
According to the DOST-ASTI, Kooha uses the built-in sensors of a mobile device to gather real-time data like sound level, temperature, and humidity and embeds it into a snapshot, making it particularly valuable in research operations across industries thanks to the fresh knowledge it produces.
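How such embedding might work can be sketched in a few lines. The snippet below is an illustration under stated assumptions, not Kooha’s actual code: it uses the third-party piexif library to write a JSON blob of hypothetical sensor readings into a JPEG’s EXIF UserComment field.

```python
# Illustrative sketch: embed sensor readings into a photo's EXIF metadata.
# Not Kooha's implementation; uses the piexif library (pip install piexif).
import json
import piexif

# Hypothetical readings captured at shutter time
readings = {
    "lat": 14.5995, "lon": 120.9842,  # location
    "noise_db": 62.4,                 # ambient sound level
    "signal_dbm": -71,                # network signal strength
    "battery_pct": 81,                # device battery level
}

# EXIF UserComment values begin with an 8-byte character-code header
payload = b"ASCII\x00\x00\x00" + json.dumps(readings).encode("ascii")

exif_bytes = piexif.dump({"Exif": {piexif.ExifIFD.UserComment: payload}})
piexif.insert(exif_bytes, "photo.jpg")  # writes metadata into the JPEG in place
```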
It added that even more useful Kooha features include the ability to contribute images to the community section, rate shared photos based on “awards” from other users, map the locations of pinned photos, and unlock “badges” by completing specific “achievements.”
As a useful tool, Kooha reflects the reality that science and the arts can collaborate effectively to produce meaningful results. In addition, the DOST-ASTI’s Quality Management System (QMS) was recertified in accordance with the ISO 9001:2015 standard.
Director of DOST-ASTI Franz A. de Leon stated that the ISO recertification demonstrates the DOST-ASTI’s dedication to continuously enhance its operations and assure successful service delivery – bringing science and technology closer to the people.
He added that their partners and stakeholders can be confident that the institute will constantly offer high-quality products and services because they adhere to the quality policy of developing relevant, timely, and impactful ICT- and electronics-based innovations.
The ISO certificate was the result of the DOST-ASTI management and staff’s collaborative efforts to expand its technologies and ensure the smooth execution of its mandate and functions. Reviewing and improving processes is critical to achieving the agency’s purpose of contributing to the achievement of national development priorities and the growth of Philippine firms through the provision of creative solutions centred on ICT and electronics technology.
This is DOST-ASTI’s second recertification since transitioning to the ISO 9001:2015 standard in 2018. Subject to regular surveillance assessments, the certificate is valid until November 2025.
The Second Minister for Trade and Industry, Tan See Leng, and the Republic of Korea (RoK) Minister for Trade, Dukgeun Ahn, have signed the Korea-Singapore Digital Partnership Agreement (KSDPA).
Under the agreement, the two sides will work to establish digital trade rules and norms to promote interoperability between digital systems. This will enable more seamless cross-border data flows and build a trusted and secure digital environment for businesses and consumers. A government press release noted that the KSDPA will also deepen bilateral cooperation in new emerging areas such as personal data protection, e-payments, artificial intelligence, and source code protection.
The Ministers also signed a memorandum of understanding (MoU) on Implementing the Korea-Singapore Digital Economy Dialogue, which will act as a platform to promote digital economy collaboration between industry players and academic experts from both sides. The MoU is part of bilateral efforts to develop cooperative projects to implement the KSDPA. Key features of the KSDPA include:
Facilitating end-to-end digital trade
Electronic Payments (e-payments): The two sides will adopt transparent and facilitative rules (e.g. encouraging open Application Programming Interfaces (APIs)) to promote secure cross-border e-payments.
Paperless Trading: Singapore and RoK will accept electronic versions of trade administration documents to support the digitalisation and seamless exchange of key commercial documents.
Open Government Data: Both countries will ensure that government data will be publicly available in a machine-readable and open format, with easy-to-use and freely available APIs.
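To make the commitment to machine-readable data and freely available APIs concrete, the sketch below shows how a client might consume such a dataset. The endpoint URL is hypothetical and purely illustrative.

```python
# Illustrative client for a machine-readable open-data API.
# The endpoint below is hypothetical; real portals publish their own URLs.
import requests

url = "https://data.example.gov/api/v1/datasets/trade-statistics"
response = requests.get(url, params={"format": "json", "year": 2022}, timeout=10)
response.raise_for_status()

for record in response.json().get("records", []):
    print(record)
```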
Enabling trusted data flows
Cross-border Data Flows (including for financial services): Businesses in Singapore and RoK will be allowed to transfer information, including information generated or held by financial institutions, across borders, provided the requisite regulations are met and adequate personal data protection safeguards are in place.
Prohibiting Data Localisation: The two nations will establish rules against data localisation requirements so that businesses can choose where their data is stored and processed, and their cloud technology of choice.
Facilitating trust in digital systems and participation in the digital economy
Artificial Intelligence (AI): The countries will promote the adoption of AI governance and ethical frameworks that support the trusted, safe, and responsible use of AI-based technologies.
Cryptography: Neither country will require the transfer of or access to private keys and related technologies, as a condition of market access.
Source Code Protection: To ensure software developers can trust the market within which they operate and ensure that source code is protected, neither country will require the transfer of, or access to, source code as a condition of market access. This includes the algorithm expressed in the source code.
Online Consumer Protection: The two sides will adopt laws that guard against fraudulent or deceptive conduct that causes harm to consumers engaged in online commercial activities.
Small and Medium Enterprises Cooperation: Singapore and RoK will promote jobs and growth for SMEs. They will also encourage their participation in platforms that help link them with international suppliers, buyers, and other potential business partners.
Digital Identities: The countries will promote interoperability of digital identity regimes, which can lead to reliable identity verification and the faster processing of applications. This will enable businesses and consumers to navigate the digital economy with ease and security.