Recently we sat down with Mr. Mike Tuchen (above), CEO of Talend, for a discussion on big data trends. Mr. Jason Bissell, GM & SVP of Sales, APAC, also participated in the chat, providing his perspective on the local markets.
Mr. Tuchen said that Talend is basically a data integration company. He said that data integration is about helping people take advantage of their data. Every company has data all over the place, out in the cloud, on-premise, in different applications, in different databases, in different file formats. The company needs to pull it all together, blend it, transform it to the right format to be able to analyse it and draw insights. That’s the problem Talend solves for its customers.
For example, Keolis, a Talend customer, is the world leader in trams and automated metros, transporting 3 billion passengers annually across 16 countries. It is the largest provider of public-transportation services in France.
“They have 70 different systems to manage their data. So, if you want simple things to work, like buying a ticket online and have it show up in an app on your phone, have it be consistent with what’s on the kiosk if you want to print it out, have the conductor be able to scan it using his/her smartphone application and confirm that the ticket’s valid, all those different systems need to connect together. That’s what we do,” Mr. Tuchen explained.
Four big trends in big data
Mr. Tuchen highlighted four big trends that are reshaping the way companies and governments use data right now. The first is that the amount of data is simply exploding, so companies are moving to so-called big data or noSQL platforms, like Hadoop and Spark.
The second is a move to the cloud. Though the trend has been stronger in the commercial space than in the public sector, Mr. Tuchen thinks the move will happen in government as well. The primary constraint preventing the shift to cloud for government is concern over privacy and security. Usually, there are stringent data sovereignty compliance requirements for government data, even if it is not sensitive or classified data.
Mr. Bissell said that in several countries in south-east Asia (outside of Singapore) where data centres aren’t being provided by the majors, the telcos are stepping in. By the very nature of their being a local company, the data sovereignty issues are dealt with.
The third trend is a shift towards more self-service. Rather than requiring an IT developer to always do all the data manipulation, you can have someone like a data analyst, who’s comfortable using Excel, to do it in suitable situations.
And finally, there is a move to real-time data and applying machine learning. “Those 4 trends we believe will completely remake how companies use data over the next decade. In those 4 areas, we see ourselves as a leader. We are competing primarily with large incumbents, who haven’t done a great job of adapting to the changing marketplace. In those 4 areas, we do a far better job,” Mr. Tuchen said.
Mr. Bissell added, “The governance, the compliance, the regulation, applying them requires you to have full data lineage. You need to know where it is, what was done with it, if it was masked, how it was masked. Is it being masked throughout the entire process? Is the personal data held in one system and the attributes held in a separate system? You need an integration platform to bring those two together, once you have separated them. The great thing about Talend is that we provide one environment, where you can not only construct those governance frameworks, but also see them at any one point, regardless of where the data sits, whether it’s on the cloud or on-premise.”
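As a loose illustration of the masking-and-lineage problem Mr. Bissell describes, here is a minimal Python sketch (not Talend code; the system names, records and hashing scheme are all invented): personal identifiers live in one system, attributes in another, and the integration step joins them while keeping the personal data masked and recording what was done to each field.

```python
import hashlib

def mask(value: str) -> str:
    """Irreversibly mask a personal identifier (simple salted hash, for illustration only)."""
    return hashlib.sha256(("demo-salt:" + value).encode()).hexdigest()[:12]

# System A holds the personal data; System B holds the attributes (both invented).
identities = [{"id": 1, "name": "Alice Tan", "email": "alice@example.com"}]
attributes = [{"id": 1, "segment": "commuter", "trips_per_month": 42}]

# Integration step: join on id, expose only masked identifiers downstream,
# and keep a lineage record of what was done and where the data came from.
joined, lineage = [], []
for person in identities:
    attrs = next(a for a in attributes if a["id"] == person["id"])
    record = {"masked_email": mask(person["email"])}
    record.update({k: v for k, v in attrs.items() if k != "id"})
    joined.append(record)
    lineage.append({"id": person["id"], "email": "masked:sha256",
                    "sources": ["system_a", "system_b"]})

print(joined[0])  # masked identifier plus attributes; no raw email or name survives
```

The lineage list is the point: for each record it answers "where did it come from" and "was it masked", which is the audit trail compliance teams need.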
Big data in the Asian public sector
When asked how well governments in south-east Asia are making use of big data, Mr. Tuchen replied that Asian governments, led by Singapore, are at the forefront.
He said that Singapore has really been visionary in their data strategy. “They are betting on data not just because they believe it makes their government more effective. They are betting on data because they believe it is a growth area and they want to make sure they have a skill base of highly trained, highly skilled people, that will drive growth for the next couple of decades,” he elaborated.
In Singapore, the government is a leading buyer, and it is supporting that with training programmes, investments in companies, incentives to put development teams here in Singapore, resulting in an all-encompassing, holistic initiative.
Developing countries might not be at the same level at the moment, but they are not burdened with legacy infrastructure. Since they are building that infrastructure from scratch, they can build it brand new and leapfrog.
Mr. Bissell agreed, saying that the emerging economies are able to adopt a much more modern, contemporary approach, such as the use of noSQL platforms, which are especially suited to handling big data, instead of traditional relational, SQL-based databases.
Mr. Bissell went on to add, “What we knew about IT systems ten years ago, if we use that same lens and apply that to Asia we will get it wrong. We have to apply a contemporary lens which says these organisations in emerging economies are using much more open source, much more big data practices, than you see in Europe or even in the US. Because they can download open-source platforms, they can get developers skilled on them a lot faster. We as an open source vendor see a lot of downloads of our software in emerging economies even in countries like Myanmar, Laos.”
The Open Source ecosystem
Talend follows the open core model of open source, where there is a “core” of open source and proprietary software built around it.
Mr. Tuchen talked about a number of benefits of open source for customers. The first one is they can try it out before they buy, just to understand what the product does. He said, “By the time they choose to work with us, they are already confident that we can solve their problem. It’s very different from an old traditional software model, where you had to spend a lot of money in advance only to find out a year or two later if you have actually solved your problem.”
Mr. Bissell told us that Talend gets approximately 6000 free downloads of their products in Asia every month. Talend doesn’t track who the downloaders are.
He said, “That’s the whole promise. That they can download our product, they can use it, and then when they recognise that they want to have a commercial relationship with us, they want enterprise capabilities, and service, they come to us."
The second one is that there tends to be less lock-in. The third benefit is that the community can contribute back and add more value to the products.
Mr. Tuchen explained that integration, in particular, is a long-tail problem, because there is an infinite number of different systems that we could connect to. The community can solve those problems themselves and then contribute those back.
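A hedged sketch of why community contributions attack this long tail: if each source system needs only one small connector registered against a shared registry, niche connectors can be contributed independently and accumulate. This is illustrative Python, not Talend's actual API; the system names are invented.

```python
# Illustrative connector registry (not Talend's actual API).
# Each new source system needs only one small reader function.
CONNECTORS = {}

def connector(system_name):
    """Decorator that registers a reader for one source system."""
    def register(fn):
        CONNECTORS[system_name] = fn
        return fn
    return register

# A connector shipped with the core product:
@connector("csv")
def read_csv(path):
    ...  # parse a CSV file into rows

# A community member who needs a niche system contributes one more:
@connector("legacy_ticketing")
def read_legacy_ticketing(path):
    ...  # parse the hypothetical legacy ticketing export

print(sorted(CONNECTORS))  # the registry grows with each contribution
```

The design point is that the core stays small while coverage grows: no single vendor has to anticipate every system, which is the asymmetry Mr. Tuchen credits for out-connecting far larger competitors.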
“That means that even though we are only a 100-million-dollar company, and we are competing with billion-dollar companies that are ten times our size, we actually have more connectivity than they do because of this community, where everyone is helping each other to solve these problems.”
Open source platforms also encourage developers to make use of open data. They can download the open source platform, join the community if they want to, discuss with like-minded developers and solve problems.
Open source tools can also help developers to explore the opportunities provided by government initiatives, such as the smart lampposts in Singapore, part of the Smart Nation Sensor Platform. Half of each lamppost might be available for commercial use, but not all developers might want to invest at an early stage. Tools such as those provided by Talend can step in here.
Mr. Tuchen described the radical transformation, “What we are seeing is that the entire infrastructure is being reinvented. We are seeing the database layer being reinvented, we are seeing the integration on top of that being reinvented. We are seeing the analytics on top of that being reinvented. And almost all of those new players are open source. So, it’s clearly becoming the preferred model. The reason why that’s happening is that customers are saying they want it that way. And governments are saying they want it that way.”
The Indonesian government disclosed four potential uses of Big Data and AI to improve its e-government programmes. These two technologies, it feels, have the potential to support disaster identification and preventive action, the prevention of illegal activities and cyber-attacks, and increased workforce effectiveness.
The Director General of Informatics Applications, Semuel A. Pangerapan, explained several scenarios for Big Data. According to him, the government can use Big Data to improve critical event management and the quality of the response by identifying problem points through Big Data analytics. For example, agencies can be better prepared to prevent and mitigate disasters such as droughts, epidemics or massive accidents.
In addition, Big Data can also enhance the government’s ability to prevent money laundering and fraud through better surveillance to detect such illegal activities.
Furthermore, Big Data significantly reduces the possibility of cyber-attacks. Cyber-attacks can come from external parties, data leaks or internally for a variety of reasons. An analysis of patterns and unusual activities can help in preventing or managing such cyber issues.
Big Data and analytics can contribute to workforce effectiveness by increasing monitoring. In addition, it can be used for policy design, decision-making and gaining insights.
Semuel stressed the importance of data analysis after collecting all data in the right fashion. Data is only valuable if it is collected correctly and then analysed; it provides benefits only if processed in the right way. “In its implementation, AI helps analyse existing Big Data, providing data understanding or insight to help make decisions,” he explained.
Another advantage of AI is the ability to speed up new implementation services and corrections in real-time. At the evaluation stage, AI can also provide suggestions for adjustments and improvements to subsequent policies.
Currently, the government encourages the improvement of the quality of Big Data and AI innovation through the development of e-government. The Indonesian government is also open to third parties to accelerate Big Data and AI use.
E-government has made progress in recent years and received appreciation from the United Nations in 2020. The UN said that Indonesia’s e-government development index rose to rank 88, from 107 in 2018. Indonesia’s e-participation index also rose, from rank 92 in 2018 to 57 in 2022.
“The two rankings show an increase in the quality of Indonesia’s e-government and the level of community activity in using e-government services,” said Semuel.
However, the government faces challenges in implementing these two technologies. Overlapping and replicated data are among the main problems. “Regulatory obstacles in the procurement of government Big Data infrastructure also need to be overcome. Then compliance with international standards for the national Big Data ecosystem is also still the government’s homework.”
To optimise AI use, Semuel emphasised the need for a skilled workforce, regulations governing the ethics of using AI, infrastructure, and industrial and public sector adoption of AI innovations.
The government is implementing several solutions to overcome challenges. First, they have provided suitable facilities in the form of National Data Centres (NDCs) in four separate locations. The NDCs will accommodate Government Cloud and contain national data across sectors.
Optimisation of data centre utilisation needs to be supported by staff with qualified expertise. For this reason, the government is holding digital skills training on AI and Big Data through the Digital Talent Scholarship (DTS) and Digital Leadership Academy (DLA) programs.
Apart from facilities and upskilling, Indonesia is looking to develop a business ecosystem that utilises AI and Big Data. Support for this comes from the National Movement of 1000 Digital Startups, Startup Studio Indonesia (SSI) and HUB.ID.
The Cyberspace Administration of China (CAC) announced a new certification for personal information protection. The office decided to implement the certification to enhance its information protection capabilities and to promote the rational processing of personal information.
The certification implementation follows the Personal Information Protection Certification Implementation Rules. The implementation rules clarify that personal information processors must comply with the requirements of GB/T 35273 Information Security Technology Personal Information Security Specifications. The rules outline requirements for on-site audits, the evaluation and approval of certification results, post-certification supervision and certification time limits.
Organisations engaged in personal information protection certification work need approval to carry out their activities. The regulation applies to every personal information processor that carries out personal information collection, storage, use, processing, transmission, provision, disclosure, deletion and cross-border processing activities.
The State Administration for Market Regulation and the State Internet Information Office decided to implement personal information protection certification. The step follows provisions of the Personal Information Protection Law of the People’s Republic of China (‘PIPL’). The body requires the Specifications for Security Certification of Cross-Border Processing of Personal Information for cross-border personal information processing.
The latest versions of the standards include technical verification, on-site audit, and post-certification supervision. In addition, the certification body shall clarify the requirements for certification entrustment materials, including but not limited to the basic materials of the certification client, the certification power of attorney, and relevant certification documents.
To get certified, an organisation must submit certification entrustment materials according to the certification body’s requirements and the certification body shall give timely feedback on whether it is accepted after reviewing the materials.
The materials are then used for determining the certification plan, including the type and quantity of personal information, the scope of personal information processing activities, information on technical verification institutions, etc., before notifying the organisation seeking certification.
The CAC stated the certification is valid for three years. An organisation must submit a new certification entrustment within the six months before the validity period expires. The certification body shall adopt the method of post-certification supervision and reissue new certificates to those that meet the certification requirements.
Violations, cheating, and other behaviours by the certification client or personal information processor that seriously affect the implementation of the certification will result in cancellation of the certificate. Therefore, certification bodies shall adopt appropriate methods of post-certification supervision to ensure that certified personal information processors continue to meet certification requirements. The certification body comprehensively evaluates the post-certification surveillance conclusions and other relevant information; if the evaluation is passed, the certification certificate is maintained.
The organisation shall actively cooperate with the certification activities throughout the validity period of the certification certificate. If the name or registered address of the certified personal information processor, or the certification requirements, certification scope, etc., change, the certification principal shall submit a change entrustment to the certification body.
When changes happen, the certification body must evaluate the change in entrustment materials. The result will determine whether the body can approve the change. If technical verification or on-site audit is required, the body shall conduct technical and on-site audits before the change is approved.
When a certified personal information processor no longer meets the certification requirements, the certification body will promptly suspend or revoke the certification certificate. The certification principal can apply for the suspension and cancellation of the certification certificate within the validity period of the certification certificate.
Caltech engineers collaborated with the University of Southampton in England to design an ultrahigh-speed data transfer chip. The device integrates an electronics chip with a photonics chip that uses light to transfer data. It took four years to complete, from the initial idea to the final test in the lab.
“As the world becomes increasingly connected, and every device generates more data, it is exciting to show that we can achieve such high data rates while burning a fraction of power compared to the traditional techniques. We had to optimise the entire system all at the same time, which enabled achieving a superior power efficiency,” said Azita Emami, the Andrew and Peggy Cherng Professor of Electrical Engineering and Medical Engineering, Executive Officer for Electrical Engineering and senior author of the paper.
The research paper is titled “A 100Gb/s PAM4 Optical Transmitter in A 3D-Integrated SiPh-CMOS Platform Using Segmented MOSCAP Modulators.” Rockley Photonics and the U.K. Engineering and Physical Sciences Research Council funded this research.
The need for high processing power and transmission inevitably creates excess heat, and heat is the enemy of the speed and volume of data a computing device can manage. This is true not just for personal computers and laptops but also for data centres.
While a laptop may heat up while in use, servers in data centres heat up as they work, at a much grander scale. Managing heat in the data centre is therefore essential: the less heat, the more computing power can be generated and the greater the volume of information handled.
Hence, engineers tried to find a way to increase the processing speed while keeping the heat low. The solution was to design and co-optimise an electronics chip and a photonics chip. The chip is innovative because it integrates an electronic circuit essential for data processing, combined with a photonics chip which is the most efficient piece for data transmission.
The Caltech/Southampton integrated chip can transmit 100 gigabits of data per second while generating minimal heat, consuming just 2.4 picojoules per transmitted bit. The result is a 3.6-fold improvement in electro-optical power efficiency over current technology.
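The efficiency figure translates directly into power draw, since power equals energy per bit times bit rate. A quick back-of-envelope check (our arithmetic, not a figure from the paper):

```python
# Back-of-envelope check of the reported figures.
energy_per_bit = 2.4e-12  # joules per bit (2.4 pJ/bit)
bit_rate = 100e9          # bits per second (100 Gb/s)

power_watts = energy_per_bit * bit_rate
print(power_watts)  # roughly 0.24 W to sustain a 100 Gb/s link
```

A quarter of a watt per 100 Gb/s link is what makes the heat argument above concrete: multiplied across thousands of links in a data centre, a 3.6-fold efficiency gain is a substantial reduction in waste heat.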
Handling Next-level Computing
In the future, data centres will manage very high volumes of data compared to today. The new design integrated chip will answer a continuous demand for increasing data communication speed in data centres and high-performance computers.
“As the computing power of the chips scale, the communication speed can become the bottleneck, especially under stringent energy constraints,” Emami explained.
The high-demand data transmission and processing from a data-demanding task, such as a video call, streaming a movie, or playing an online video game, need high processing power in the data centre.
“There are more than 2,700 data centres in the U.S. and more than 8,000 worldwide, with towers of servers stacked on top of each other to manage the load of thousands of terabytes of data going in and out every second,” says Caltech graduate student Arian Hashemi Talkhooncheh (MS ’16), lead author of the paper describing the two-chip innovation, published in the IEEE Journal of Solid-State Circuits.
Both in normal circumstances and in times of crisis, Thai people are known to generate a lot of innovative ideas and continue to develop products that make their lives better. This spirit is encapsulated in the nation’s most recent campaign, Innovation Thailand, which promotes Thai creativity to a global audience.
The Innovation Thailand Alliance consists of partners from a variety of sectors including government agencies, private organisations, educational institutions, and civil societies. Through it, the National Innovation Agency of Thailand (NIA), is expanding the scope of its Innovation Thailand platform.
The fundamental goal is to use national/local ideas to revitalise the nation by promoting awareness of and pride in inventive Thai works. Allies will serve as ambassadors in the effort to promote Thailand as an innovative nation. They will be able to exchange knowledge and skills with one another at the same time.
All stakeholders are enthusiastic to help Thailand achieve its goal of being one of the world’s top 30 innovative nations by 2030 and turning Thailand into an innovation-driven country.
Innovation Capabilities of Thai People
The National Innovation Agency’s mission is to support and develop Thailand’s innovation system to promote economic restructuring and competitive enhancement.
“We began the Innovation Thailand campaign before COVID-19 because we faced a significant challenge in terms of how not only Thai people but also global clients, perceive the nation’s unique products and services,” explains Dr Pun-Arj.
Even though this may not be directly related to innovation, the NIA has attempted to communicate and brand national innovation in such a way that it can be easily connected not only with Thais but also with international customers – this is how they started the Innovation Thailand platform.
Thailand is one of the world’s top three tourist destinations, which has driven the country to innovate in lifestyle as well as livelihood.
Thai culture places a high value on craftsmanship and attention to detail. Thai innovation for artful living is a process created exclusively by the fusion of modern technology and knowledge passed down from one generation to the next.
“We have created ingenious solutions through this method that enhance the standard of living in terms of society, prosperity, health, safety, and the environment,” Dr Pun-Arj adds.
They began to construct a community to exchange ideas, develop, and manage innovation that would result in delivering some information or any significant strategic movement that the government could initiate.
They are recruiting more Chief Innovation Officers from not only the private sector but also the public sector and universities, as part of their primary target group.
Dr Pun-Arj is looking to enhance the opportunities brought in by innovation, particularly at the regional level in the city. This is because they are working not only on economic development but also on the skillset of the social innovation division and platform.
“As a result, our primary focus is on regionalisations of innovation possibilities, as well as startups – innovation-based firms,” reveals Dr Pun-Arj.
He believes that every successful community is built upon a robust and well-functioning infrastructure. Hence, Thailand’s industries and infrastructure will be modernised to meet upcoming challenges.
“In the past, one of our five-year priorities included buildings which we identify as system integrators. As the system and ecosystem become more robust, we are transitioning from system integrators to full core facilitators.”
He emphasised the need to consider the impact of being a system integrator before transforming themselves into focal facilitators. Furthermore, the country wants to make better use of the enormous resource of innovation in universities to conduct research and technology in collaboration with other organisations across the world.
Through the City Innovation Index, which focuses primarily on districts and cities, the NIA promotes and monitors the continuous innovation and evaluation of diverse organisations. Periodically, it performs surveys in particular industries to evaluate and propose answers to the difficulties they face.
A strong innovation strategy will evaluate the overall objectives, the target portfolio for innovation initiatives, and the process for allocating the necessary resources. The portfolio clearly defines innovation-critical benchmarks and bounds. Therefore, the nation will become democratic and transparent.
“I believe the government’s most essential innovation strategy focuses on three specific concerns. You must have highly strong and capable businesses of all sizes that will establish a very strong enterprise on its own. And secondly, you must have laws and regulations,” Dr Pun-Arj asserts. “In addition, governance is also required and identifying future risks.”
Thailand is struggling with several issues, including inequality in the form of limited access to public services, digital technology and education, as well as environmental problems. High manufacturing costs and new types of competition in the global supply chain have also become challenges for Thailand; in response, innovation has emerged as the country’s answer.
Additionally, there are many challenges in terms of digital transformation and government service and the nation is pushing for innovation that can deliver a good policy and deploy it into practice.
In the previous five-year plan, NIA focused primarily on its role as a system integrator, which it is now evolving into that of a core facilitator. “That is why the short-term strategy is to train management in the methods, programmes, and activities that we have implemented over the last five years.”
NIA is primarily concentrated on strengthening the potential of regional innovation in several key sectors such as new technologies, assistance for startups, venture capital creation or investment for innovation, and internationalisation of Thailand’s innovation.
Dr Pun-Arj envisions a stronger Thai economy and society, with innovation playing a key role in propelling it. The Bio-Circular-Green Economy (BCG) model is a plan for the country’s growth and post-pandemic recovery. The BCG model focuses on four strategic sectors: agriculture and food; wellness and medicine; energy, materials and biochemicals; and tourism and the creative economy.
It emphasises using science, technology, and innovation to turn Thailand’s comparative advantage in biological and cultural diversity into a competitive advantage. The primary aim is to support the sustainability of biological resources, develop local economies and communities and make Thai BCG industries more competitive and resilient to societal changes.
The approach is meant to make Thailand’s economy, society, and environment more sustainable and inclusive. “To achieve the 2030 goal, we must work incredibly hard to encourage innovation in this BCG economy. At the same time, the national policy needs to be improved.”
Dr Pun-Arj has been recognised as a pioneer in the domains of foresight and innovation management in the country. He counsels anyone aspiring to be a great innovator to fully comprehend the concepts of uncertainty and failure.
“Innovation will help us grow as a community or nation by making ourselves and others aware of the importance of innovation,” Dr Pun-Arj concludes.
The Indian Space Research Organisation’s (ISRO) Polar Satellite Launch Vehicle (PSLV) has launched nine satellites, including eight nanosatellites, into space from the first launch pad at the Satish Dhawan Space Centre in Andhra Pradesh.
The 44-metre-long rocket’s primary payload is the Earth Observation Satellite-6 (EOS-6) or Oceansat-3, a third-generation satellite to monitor the oceans. It is a follow-up to Oceansat-1 (IRS-P4) and Oceansat-2, launched in 1999 and 2009, respectively. Oceansat-3 will provide ocean colour, sea surface temperature, and wind vector data for oceanography, climatology, and meteorological applications.
Oceansat-3 was placed in a polar orbit at a height of about 740 kilometres above sea level. It weighs approximately 1,100 kilogrammes, only slightly heavier than Oceansat-1, but for the first time in the series it houses three ocean-observing sensors: an Ocean Colour Monitor (OCM-3), a Sea Surface Temperature Monitor (SSTM), and a Ku-band scatterometer (SCAT-3). There is also an ARGOS payload, a press release mentioned.
The OCM-3, with a high signal-to-noise ratio, is expected to improve accuracy in the daily monitoring of phytoplankton. This has a wide range of operational and research applications including fishery resource management, ocean carbon uptake, harmful algal bloom alerts, and climate studies. The SSTM will provide ocean surface temperature, a critical parameter for forecasts ranging from fish aggregation to cyclone genesis and movement. Temperature is also key to monitoring the health of coral reefs and, if needed, issuing coral bleaching alerts. The Ku-band pencil-beam scatterometer will provide a high-resolution wind vector (speed and direction) at the ocean surface, which will be useful for seafarers, including fishermen and shipping companies. Temperature and wind data are also particularly important for ocean and weather models to improve their forecast accuracies.
ARGOS is a communication payload jointly developed with France and it is used for low-power (energy-efficient) communications including marine robotic floats (Argo floats), fish-tags, drifters, and distress alert devices valuable in search and rescue operations.
The Minister of State (Independent Charge) for Science and Technology, Jitendra Singh, stated that ISRO will continue to maintain the orbit of the satellite and its standard procedures for data reception and archiving. Major operational users of this satellite include Ministry of Earth Sciences (MoEs) institutions such as the Indian National Centre for Ocean Information Services (INCOIS) and the National Centre for Medium Range Weather Forecasting (NCMRWF).
INCOIS has also established a state-of-the-art satellite data reception ground station within its campus with technical support from the National Remote Sensing Centre (ISRO-NRSC). Singh asserted that ocean observations such as this will serve as a solid foundation for India’s blue economy and polar region policies. A representative from MoES noted that the launch of Oceansat-3 is significant as it is the first major ocean satellite launch from India since the initiation of the UN Decade of Ocean Science for Sustainable Development (UNDOSSD, 2021-2030).
The Indian Space Research Organisation is the national space agency of India, headquartered in Bengaluru. It operates under the Department of Space, which is overseen by the country’s Prime Minister.
Astronomers from the California Institute of Technology (Caltech) have fully automated the classification of 1,000 supernovae using a machine-learning (ML) algorithm. The Zwicky Transient Facility, or ZTF, a sky survey instrument located at Caltech’s Palomar Observatory, collected the data that the algorithm then analysed.
“We needed a helping hand, and we knew that once we trained our computers to do the job, they would take a big load off our backs,” says Christoffer Fremling, a staff astronomer at Caltech and the mastermind behind the new algorithm tagged as SNIascore.
A year and a half after SNIascore classified its first supernova in April 2021, the team is approaching the milestone of 1,000 supernovae. Every night, ZTF scans the sky for changes known as transient events. These cover everything from asteroids in motion, to stars recently devoured by black holes, to exploding stars known as supernovae.
ZTF notifies astronomers worldwide of these transient events by sending out hundreds of thousands of alerts each night. Astronomers then use other telescopes to follow up and learn more about the nature of the changing objects. Thousands of supernovae have so far been found thanks to ZTF data.
The constant stream of data arriving every night is more than the members of the ZTF team can sort through on their own. According to Matthew Graham, project scientist for ZTF and research professor of astronomy at Caltech, “the traditional notion of an astronomer sitting at the observatory and sieving through telescope images carries a lot of romanticism but is drifting away from reality.”
Instead, to help with the searches, the team has created ML algorithms. SNIascore was created to categorise potential supernovae. There are two main categories of supernovae: Type I and Type II. In contrast to Type II supernovae, Type I supernovae are devoid of hydrogen.
When material from a companion star flows onto a white dwarf star, causing a thermonuclear explosion, a Type I supernova is produced. When a massive star collapses due to its own gravity, a Type II supernova happens. SNIascore can classify Type Ia supernovae, the “standard candles” of the sky: dying stars that explode in thermonuclear blasts of remarkably consistent peak brightness.
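To make the classification idea concrete: Type Ia spectra carry a characteristic silicon absorption trough (the Si II feature near 6150 Å) that other supernova types lack. SNIascore’s actual implementation is not described here; the toy sketch below is purely illustrative, using synthetic spectra and a simple hand-rolled logistic regression to show how a single spectral feature can separate Type Ia from non-Ia events. All names, bin positions, and numbers are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_spectrum(is_type_ia, n_bins=100):
    """Toy spectrum: flat continuum plus noise; Type Ia examples get
    an absorption dip standing in for the Si II 6150 A feature."""
    flux = 1.0 + 0.05 * rng.standard_normal(n_bins)
    if is_type_ia:
        flux[45:55] -= 0.4  # synthetic absorption trough
    return flux

# Build a labelled synthetic training set (1 = Type Ia, 0 = other)
X = np.array([make_spectrum(i % 2 == 0) for i in range(200)])
y = np.array([1 if i % 2 == 0 else 0 for i in range(200)])

# Single feature: mean flux over the trough region
feature = X[:, 45:55].mean(axis=1)

# Plain gradient-descent logistic regression on that one feature
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * feature + b)))
    w -= lr * ((p - y) * feature).mean()
    b -= lr * (p - y).mean()

preds = (1.0 / (1.0 + np.exp(-(w * feature + b))) > 0.5).astype(int)
accuracy = (preds == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

A real classifier works on full spectra with a neural network rather than one hand-picked feature, but the principle is the same: learn a decision boundary that maps spectral shape to supernova type.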
Astronomers can gauge the universe’s expansion rate thanks to Type Ia supernovae. Fremling and colleagues are currently expanding the algorithm to classify additional types of supernovae.
Every night, after ZTF has recorded sky flashes that may be supernovae, it sends the data to the SEDM (Spectral Energy Distribution Machine) spectrograph at Palomar, housed in a dome a short distance away.
SNIascore works in tandem with SEDM to determine which supernovae are likely Type Ia. This lets the ZTF team rapidly compile a more trustworthy dataset of supernovae that will allow astronomers to conduct additional research and, ultimately, learn more about the physics of these powerful stellar explosions.
“SNIascore is incredibly precise. We have observed the performance of the algorithm in the real world after 1,000 supernovae,” says Fremling. Since the initial launch in April 2021, the team has found no clearly misclassified events, and it is now planning to implement the same algorithm at other observing facilities.
According to Ashish Mahabal, who oversees ZTF’s machine learning initiatives and is the centre’s lead computational and data scientist at Caltech, their work demonstrates how ML applications are maturing in near real-time astronomy.
SNIascore was created as part of ZTF’s Bright Transient Survey (BTS), which is currently the most comprehensive supernova survey available to the astronomical community. The entire BTS dataset contains nearly 7,000 supernovae, 90 per cent of which were discovered and classified by ZTF while the remaining 10 per cent were contributed by other groups and facilities.
The Victoria University of Wellington’s division of Science, Health, Engineering, Architecture, and Design Innovation (SHEADI) will inaugurate a Centre of Data Science and Artificial Intelligence in the first half of 2023.
According to a statement from the University, the centre will offer areas of expertise in modelling and statistical learning; evolutionary and multi-objective learning; deep learning and transfer learning; image, text, signal, and language processing; scheduling and combinatorial optimisation; and interpretable AI/ML.
These technological themes will be applied across a wide range of areas including primary industry, climate change and environment; health, biology, medical outcomes; security, energy, high-value manufacturing; and social, public policy, and ethics applications. On top of traditional research, the centre will also establish a pipeline of scholarships/internships for Maori students, train early career researchers, and focus on industry, intellectual property, and commercialisation.
The centre will build on the current success and international leadership in this space at the University, the Pro Vice-Chancellor of the division, Ehsan Mesbahi, stated. The institute is continuing to grow its national and international partnerships to create local and global value. The centre will provide a distinctive identity for the growing excellence and innovation in data science and AI research at the University, capabilities which domestic and global partners are increasingly demanding across a vast array of application domains.
In May, the University announced it would offer the first undergraduate major in Artificial Intelligence in the country. It provides students with knowledge of AI concepts, techniques, and tools. They learn how to apply that knowledge to solve problems, combined with programming skills that will enable them to build software tools incorporating AI technology that will help shape the future.
Students studying AI at the University are taught by academics from its internationally renowned AI/ML research group, which is one of the largest in the southern hemisphere. The major is designed to open doors for graduates to opportunities nationally and around the world. There has been an increase in the adoption of AI technologies globally, and a growing demand for people who can apply AI techniques to address a wide range of problems, which the University aims to address.
After completing their degree, graduates will have a wide variety of career options, such as AI scientist, business consultant, AI architect, data analyst, machine learning engineer, and robotic scientist among others. They will also have the option to further their study through the University’s Master of Artificial Intelligence.
OpenGov Asia reported earlier that New Zealand’s Education Technology (EdTech) is set to become one of the country’s key industries. Worth NZ$ 173.6 million in 2020, EdTech software is poised to grow to NZ$ 319.6 million by 2025. At the heart of the digital transformation of education technology has been the pandemic. COVID-19 is seen as the driving force behind the digital transformation of learning, permanently changing the way education is consumed and delivered — right from preschool through post-tertiary education and lifelong learning. The global EdTech market size was valued at US$ 254.8 billion in 2021. Experts believe the market will reach US$ 605.4 billion by 2027.