OpenGov visited Dr. Ole Nielsen at his office in Geoscience Australia in Canberra to discuss digital transformation at the organisation. He is the Director of Scientific Computing and Systems Engineering. Dr. Nielsen looks after a team of software engineers who are developing models, software and cloud infrastructure. He coordinates software development practices and makes sure things are aligned. He also plays a strategic role for the agency, as the digital transformation coordinator.
Geoscience Australia is the government’s technical adviser on all aspects of geoscience, and custodian of the geographic and geological data and knowledge of the nation. Its function is to apply science and technology to describe and understand the Earth for the benefit of Australia. It operates within the Industry, Innovation and Science portfolio.
Talking about the newly unveiled Digital Science Strategy, Dr. Nielsen tells us how he is leading the movement towards a truly open organisation, using open source tools and standards, organising and releasing open data and learning from other public sector agencies, as well as private sector pioneers.
What are your major areas of focus in digital transformation?
We recently developed a Digital Science Strategy. Digital technology available now can do things that we could never do before, in terms of computing power and connectivity. The data sets we are working with are exponentially growing. The complexity of the questions we are trying to answer is unprecedented. The expectation of users is that everything be available at their fingertips, on their smart phones.
As an agency, we must become more innovative, adaptable, open, collaborative and quantitative. Those are the five guiding principles.
We need to collaborate more and recognise that we need different skills. We need to be adaptable in a changing environment. We can’t plan everything down to the last detail: first, because things change, and second, because it’s so complex we can’t possibly predict everything.
We should be open and transparent. That is the precursor for being collaborative. It also means that we embrace open data and open-source as mechanisms for collaboration.
What steps are you taking in terms of training and enhancing skills?
We believe in learning by doing. We are for example setting up coaching for our teams to learn how to work in an agile world.
We had four staff seconded to Australia Post to work for three months. We’re also observing the Taxation Office to see how they’re working.
We also need to think differently about recruitment. We are looking at what we can do to be smarter about finding talent. The ‘science’ in Geoscience Australia is exciting. But if we can build a culture where staff are truly empowered, I think we would have a better chance of attracting and retaining talent.
One component of the digital strategy is “Establishing a quantitative science platform”, involving HPC, cloud services and big data analytics. Could you tell us more about it?
We want to scale up to do our analyses at the national level. For example, if we want to predict the underground presence of minerals, we might be able to do it on a 100×100 kilometre grid or at just one location. But we want to be able to make national-scale predictions at high resolution in the future, so we will need to use High Performance Computing (HPC) and Big Data systems to achieve this.
It is important that we can scale up whatever we develop on a small scale, without having to re-engineer the whole thing. We cannot do that if it is tied to a particular operating system on the desktop. It is much easier to do that when we use open source tools and standards.
You are taking a completely open and collaborative approach. Sometimes we hear that people have concerns regarding security when it comes to sharing. Can you tell us your views on that?
We have a mandate to make most of our data available to the public. There might be a small amount that we do not want to put out there. But we don’t have personal user data. Most of our data is open scientific data meant for public consumption.
But we’re very serious about security as it is going to be increasingly complex with cloud infrastructure, web services and distributed systems all accessing the Internet of Things. Our security is more about preventing attacks. An attack might take our services down or it might try to use our services as a starting point for other attacks or for sending out spamming emails.
Our new principle on security is that “Security is everyone’s business, everyone’s responsibility”. We need to train everyone up and make them responsible for the security of the systems they build.
We’ve got security training coming up. Instead of telling people not to do this or that, we ask a security company to look for vulnerabilities in our applications. We will ask them to show what they did and tell us what we can do to improve security. That is far more constructive than the traditional approach.
We want to bake security into our systems and processes, so that when we produce new virtual infrastructure, we make sure that we have taken security into account from the start.
You also talked about using open source software. Are you able to fulfil the requirements you need just using open source software and platforms?
I was involved in a programme where we developed a hydrodynamic model, called ANUGA Hydro, to simulate impacts of tsunami, flooding or storm surge disasters on the built environment and to present the results in forms that are easily interpreted. We made that code open source in 2006 and it has been used in Indonesia, Japan and in New Zealand. The NSW State Government is modelling tsunami impact in each estuary using our code. This demonstrates the value of open source and collaboration to me.
Isaac Newton said, “If I have seen further than anyone, it’s because I stood on the shoulders of giants”. If you are at the leading edge of science, you will naturally have to work with other science agencies. NASA, for example, released about 250 open source projects in June. We’re using a lot of the tools released by science labs such as NASA and the Lawrence Livermore National Laboratory. Netflix has also just released a lot of open source tools for automating cloud services, which we are looking at as well.
There’s a plethora of options out there, that far exceeds what we can buy off the shelf. There is so much potential in open collaboration. Also, some of our requirements are niche, which cannot be met by off-the-shelf products.
For example, we have a programme around dating rocks. The data analysis software for that is currently running on an Excel 2003 spreadsheet. There are about 50 labs in the world using these tools. Those 50 labs are getting together to think about building a community-based data analysis application, led by the University of Charleston in the US. By doing that, we can all chip in a little bit and then get a tool that we all can use. Because no one has the resources to build it by themselves.
But we don’t want to build anything if we can find an existing open source tool or buy it off the shelf at the right price. Or even better, if we can get it as a service off the cloud we prefer that. If none of that exists, then we’ll have to engage in development but make it open source to help others and generate collaboration.
Do you align your strategy with that of the DTO?
The DTO has 12 or 13 service standards that they are pushing very hard from the top. We will be aligning ourselves with those. We look at it as another enabler to help us make the change. The DTO can make presentations and offer training, and it provides good principles around openness, open source, collaboration and agile.
We also look at GDS in the UK and 18F from the US. Our digital science strategy is our interpretation of the international and Australian government principles and what they mean for us as a science agency. They serve as a reference for us, allow us to build on existing work and ensure we are aligned with international trends.
Could you tell us about some of the research projects being conducted by Geoscience Australia in this new framework in which ICT is playing that kind of a role?
The new way of working only really started in July, so it’s very early days and we have a very long way to go. However, it is the result of a year of consultation and research to work things out.
For example, we have a bush fire warning system called Sentinel Hotspots. We developed it a few years ago to help inform the public about the progression of bushfires. It uses satellite data to provide regular updates on a website, so people can track where the bushfires are.
That was running internally on the old infrastructure. It was deployed in a very manual way. There were different teams responsible for different parts of the system. To make changes, five different teams had to be motivated to act and that’s difficult because they have other priorities.
We re-developed the code so that it could be deployed through continuous delivery. That means that from the minute you make a change to the code, it can be in production within 10 minutes. It used to take weeks, because of manual processes: people would have to look at a document and type in commands by hand. So, we have automated that. It’s running live on Amazon Web Services. By using commercial cloud, you can also have automatic scaling when web traffic spikes. We tested it with 20 million hits in 40 minutes and the system didn’t miss a beat.
You also have self-healing. If a server goes down, a new one automatically starts up. You have the ability to make a quick change without affecting production. These are some of the things we are learning from Netflix, which is among the best at this kind of thing.
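The auto-scaling and self-healing behaviour described above can be sketched in plain Python. This is only an illustration of the logic a cloud platform applies; the function names, thresholds and capacities here are hypothetical, not Geoscience Australia’s actual configuration:

```python
# Illustrative sketch of auto-scaling and self-healing decisions,
# as a commercial cloud applies them. All names and numbers are hypothetical.

def desired_instances(requests_per_min, per_instance_capacity=500,
                      min_instances=2, max_instances=20):
    """Scale out when traffic spikes, scale in when it subsides."""
    needed = -(-requests_per_min // per_instance_capacity)  # ceiling division
    return max(min_instances, min(needed, max_instances))

def heal(fleet):
    """Replace unhealthy servers so the fleet recovers automatically."""
    return ["replacement" if status == "unhealthy" else status
            for status in fleet]

print(desired_instances(500_000))                 # traffic spike: capped at 20
print(heal(["healthy", "unhealthy", "healthy"]))  # failed server replaced
```

On a real platform these decisions are made by managed services (for example, health checks and scaling policies) rather than hand-written code, but the underlying logic is the same.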
We are also starting to work in a more agile way. We have teams holding stand-ups every morning so everybody can see what they are doing, showcases every two weeks, and regular get-togethers to plan for the immediate future, i.e. about 10 weeks ahead.
Can you tell us about your software development process?
In the traditional waterfall, there’s a tendency to stick with a project, even if it’s the wrong thing. In agile, you can get away from the wrong path much faster.
We have started growing an agile culture but, as I said, it is very early days and we have a lot to learn.
Let me explain the concept:
If it’s worth prototyping, you get a few people in a team to write a narrative stating what it is we want to achieve (not overly detailed requirements, because they go “stale” quickly as things change). Then you develop a prototype rapidly and pass it to the user to check if it is useful for them. If you think it is worth building, you expand the team and work on it until you have a ‘minimum viable product’. You get that out as soon as you can. We picked up this approach from Spotify, where they will release it to a small percentage of users. Then if it is good enough, you do the broader roll-out and tweak it as you go.
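The “release to a small percentage of users first” step is a common canary-release technique and can be sketched as a deterministic percentage bucket. The function and identifiers below are illustrative, not Spotify’s or Geoscience Australia’s actual tooling:

```python
import hashlib

def in_rollout(user_id, percent):
    """Deterministically place a user in the first `percent`% of buckets.

    Hashing the user ID gives a stable bucket in 0..99, so the same user
    always gets the same decision while the rollout percentage grows.
    """
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    return bucket < percent

# Roll a new feature out to 5% of users first, then widen it over time.
print(in_rollout("user-1234", 5))
print(in_rollout("user-1234", 100))  # full roll-out: everyone is included
```

Because the bucketing is deterministic, widening the rollout from 5% to 50% only ever adds users; no one flips back and forth between old and new versions.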
So, it’s not that you write code faster. But you can throw out the bad ideas faster. It results in more efficient allocation of resources. There is less reporting because the process is open and collaborative. And the users are on board from an early stage and are not going to be surprised later.
Are there any ongoing data consolidation projects?
Some of the data sits on a tape robot in the basement. There are petabytes of data in all sorts of standards and formats. Traditionally it sits in all sorts of places in the agency. To get it out and behind an API is a lot of work. With large amounts of data, you can’t move it often. We have to choose if we want to put it in the cloud. With transient data, it’s a lot easier. With master data, it’s more difficult.
We’re putting a lot of the data, such as satellite data in the National Computational Infrastructure (NCI). For example, we are turning lakes on a map into statistical objects: how often were they wet? How often were they dry?
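The lake example above can be sketched in a few lines: reducing a time series of wet/dry satellite observations to a statistical object. The field names and labels here are illustrative, not NCI’s actual data schema:

```python
# Hypothetical sketch: summarising a lake's satellite observation history
# into a statistical object ("how often was it wet? how often dry?").

def lake_statistics(observations):
    """observations: list of 'wet'/'dry' labels, one per satellite pass."""
    total = len(observations)
    wet = observations.count("wet")
    return {
        "observations": total,
        "wet_fraction": wet / total,
        "dry_fraction": (total - wet) / total,
    }

history = ["wet", "wet", "dry", "wet", "dry", "dry", "dry", "wet"]
print(lake_statistics(history))
# -> {'observations': 8, 'wet_fraction': 0.5, 'dry_fraction': 0.5}
```

At national scale the same reduction would run over petabytes of imagery on HPC infrastructure, but the per-lake statistic is this simple.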
If we put data on the cloud, there are costs. If you start serving a lot of data in commercial cloud, that could be an unpredictable cost because of the pricing model.
We have a whole section called Scientific Data that’s working on all this. They are focused on critical areas such as metadata, provenance, interoperability, discoverability, archiving and digital continuity.
What are the kinds of challenges that you face in change management?
Leading change is a big part of what I do. We all fear change. It’s human nature to say I don’t want to change the way I work. “Everyone wants change but nobody wants to change”. Some are afraid that if change happens their relevance or their job might be at risk.
The challenge is to demonstrate the value of it.
We need to prove that it’s scarier not to change, given that the world around us is changing. If we don’t change to meet the needs of the future, we all lose our relevance, our influence and ultimately our jobs. If we do change we will all have to adapt, but we – and our organisation – will all be better for it!
Singapore’s Infocomm Media Development Authority (IMDA) has recently updated its platform known as Chief Technology Officer-as-a-Service (CTO-as-a-Service). The platform enables SMEs to self-assess their digital readiness and needs at any time and from any location, as well as access market-proven and cost-effective digital solutions and engage digital consultants for in-depth advisory and project management services.
This is for any business entity that wants to know how to start going digital, understand what type of solutions to adopt for its specific business challenge, or choose the solution that best meets its needs.
An enterprise can use CTO-as-a-Service to:
- Conduct a self-evaluation of its digital readiness and pinpoint its gaps and needs in terms of digitalisation;
- Study other Small and Medium Sized Enterprises (SMEs) that have carried out digitalisation projects successfully;
- Receive digital solution suggestions based on the business’s needs and profile; and
- Evaluate the features and costs of various digital solutions.
There are more than 450 subsidised digital solutions available for selection, including those that address industry-specific or general business needs, as well as those that serve to streamline operations, increase business sales revenue, or ensure business resiliency.
The business can also work with digital consultants from the designated operators through CTO-as-a-Service, for digital advisory to help it:
- Seek a deeper comprehension of its business priorities and needs;
- Create training plans and digital solutions specifically for its business; and
- Include fundamental data usage, protection, and cybersecurity risks in the digitalisation process.
The business may also ask digital consultants to assist with project managing the rollout of its digitalisation initiatives.
Eligible businesses can use digital advisory and project management services for free for the first time. Should the businesses want to keep using digital consultants, future usage or service enhancement will be based on commercial agreements.
Any company that satisfies the requirements below is qualified to use free project management and digital advisory services for the first time:
- Licensed and active in Singapore;
- A minimum of 30 per cent local shareholding;
- Enterprise’s group employment size is no more than 200 employees, or the group’s annual sales turnover is no more than S$100 million;
- Has never previously used CTO-as-a-Service digital consultants.
Meanwhile, SMEs are the backbone of Singapore’s economy. They employ two-thirds of the country’s workers and contribute almost half of Singapore’s GDP. Since digital technology is changing every part of Singapore’s economy, SMEs need to take advantage of digital technologies to grow and do well.
The SMEs Go Digital programme, which was started by the IMDA in April 2017, is meant to make going digital easy for SMEs. More than 80,000 SMEs have used the programme’s digital solutions.
Enterprises can also use advanced and integrated solutions to improve their capabilities, strengthen business continuity measures, and build longer-term resilience. Solutions that are supported by government agencies solve common problems at the enterprise level on a large scale, help enterprises adopt new technologies, and make it easier for enterprises to do business within or across sectors.
IMDA works with sector-led agencies and industry players to find advanced and integrated digital solutions that can be supported and are relevant to their sectors. Companies that want to use these solutions can check the IMDA website to find out when they can apply for each one.
Costs for hardware, software, infrastructure, connectivity, cybersecurity, integrations, development, improvement, and project management can be covered by funding support. With this, the agency has kept helping businesses, and the list of solutions that are supported will grow, with an emphasis on AI-enabled and cloud-based solutions.
Taiwan’s City Science Lab @ Taipei Tech demonstrated a series of cutting-edge AI applications. The lab exhibited advanced AI applications and its research and development results, such as a mobile robot, an AI robotic fish and the Campus Rover.
The cross-disciplinary R&D and teaching laboratory aims to be a global technology and talent exchange platform. The Massachusetts Institute of Technology (MIT) and Taipei Tech have come together to jointly establish the City Science Lab @ Taipei Tech.
“Through developing advanced AI technology and big data systems, we plan to make Taiwan the island of high-end technology,” said Yao Leehter, Taipei Tech Chair Professor of the Department of Electrical Engineering.
Yao indicated that Taipei Tech alumni highly support the lab, which also collaborates with Kent Larson, leader of the MIT City Science Lab.
Taipei Tech is working with MIT to jointly promote and implement the undergraduate research programme known as UROP, which provides sufficient resources for students and cultivates a new generation of scientific researchers. The programme was initially rolled out at MIT in 1969.
To let students learn the most modern, state-of-the-art technology applications, the lab provides advanced equipment for R&D purposes, such as mobile robots. The agile mobile robot can adapt to complex terrains and is equipped with LIDAR, infrared, and stereo vision sensors, which allow it to draw 3D point-cloud maps in real time and detect and dodge obstacles. The mobile robot is used in decommissioned nuclear power plants, factories, construction sites, and offshore oil drilling platforms. Other use cases include patrol, troubleshooting, and leak detection.
In addition, the lab showcased its R&D results, ranging from the AI robotic fish to advanced instrumental equipment. The robotic fish is a streamlined robot designed to resemble a real fish; it comprehends and mimics the motion model of swimming fish through machine learning.
The robot can swim underwater, simulating a real fish. To mimic fish movement faithfully, researchers spent significant time collecting massive movement data from real fish, documenting and analysing their swimming performance. They then used AI technology and programming to control the motor movement of the robotic fish.
The team spent a further year adjusting the robotic fish to make its swimming look like that of a real fish. Propulsion efficiency and swimming performance of robotic fish are considered among the most critical subjects in bionics.
“The robotic fish is useful for biological research and can also be used to carry out underwater operations and examine water quality,” said Yao.
Recently, the fish robot was involved in a movie production. During the design process, the production house suggested adding a “cloth” of fish skin and scales to make it more lifelike, and came up with the idea of using magnets to attach the scales to the body of the robotic fish. The Taiwan Textile Research Institute and a local design research group joined the brainstorming and production process to finish the golden fish’s final look onscreen.
Moreover, the Campus Rover, developed by Professor Yao’s team in cooperation with the Taipei Tech Department of Industrial Design, demonstrates practical AI applications in real life. For example, campus or express hospital delivery services can use the self-charging robot to ensure delivery safety.
Around 30,000 rural homes and communities will soon have access to faster and improved connectivity with an expansion of the Rural Capacity Upgrade programme. Twenty-one new contracts have been signed by Crown Infrastructure Partners to accelerate upgrades to towers and broadband connections in areas with poor coverage.
The announcement was made by the Minister for Rural Communities, Damien O’Connor, and the Minister for the Digital Economy and Communications, David Clark. This round of the Rural Capacity Upgrade will see many existing towers upgraded and new connections established in rural areas experiencing poor performance. Areas that will benefit from these improvements include, but are not limited to, settlements in the Far North, Gisborne, the Manawatu-Whanganui region, Taranaki, Southland, and Waikato.
The project is expected to significantly boost the economic productivity of homes and businesses with a slow, unreliable, or unusable connection, Clark noted. The government is committed to improving rural connectivity and is on track to see 99.8% of New Zealanders receive access to improved broadband because of the Ultra-Fast Broadband rollout, Rural Broadband Initiative, the Marae Digital Connectivity programme, and the Mobile Black Spot Fund by the end of 2023, he explained.
The investment in rural connectivity will work alongside Land Information NZ’s rollout of the Southern Positioning Augmentation Network (SouthPAN) service. As OpenGov Asia had reported earlier, SouthPAN is the Southern Hemisphere’s first satellite navigation augmentation service. It will improve the availability and accuracy of positioning, taking it from 5-10 metres to as little as 10 centimetres across the country.
This will boost rural productivity through precision agriculture and horticulture, fenceless farming, and improve the safety of search and rescue in the backcountry. The government, along with private sector contributions, has invested more than $2.5 billion into improving digital connectivity to date.
The government has also released “Lifting Connectivity in Aotearoa”, which sets out the high-level connectivity vision for New Zealand over the next decade. This includes the goal that all New Zealanders have access to high-speed connectivity networks, and that the country is in the top 20% of nations with respect to international connectivity measures.
Last month, the government launched the Remote Users Scheme to provide broadband and connect New Zealand’s most remote communities. Clark had announced the scheme, noting that it would equip as many remote households as possible with the connectivity infrastructure needed to access broadband services. As reported on OpenGov Asia, the Remote Users Scheme will help connect people to online health services and educational tools. Through Budget 2022, $15 million was allocated towards funding the scheme, as part of the broader $60 million rural connectivity package announced earlier in the year.
The Crown Infrastructure Partners (CIP), which was established by the government, will administer the Remote Users Scheme and is calling for applications from potentially eligible households and communities. A request for proposal from Internet service providers will follow. It is expected that new broadband connectivity infrastructure for the eligible areas and households can begin being built in mid-2023.
In a process that could be compared to travelling through a wormhole, researchers from the Massachusetts Institute of Technology, California Institute of Technology, Harvard University, and other institutions sent quantum information across a quantum system. The Sycamore quantum processor was used in the experiment, which paves the way for more quantum computer research into gravitational physics and string theory in the future.
Calculations from the experiment showed that qubits moved from one system of entangled particles to another in a model of gravity, even though the experiment did not disrupt physical space and time in the sense that one might understand the term “wormhole” from science fiction.
A wormhole connects two far-off regions of spacetime. In the general theory of relativity, nothing is allowed to travel through a wormhole. But in 2019, some scientists hypothesised that a wormhole created by entangled black holes might be traversable.
By introducing a direct interaction between the distant spacetime regions and using a straightforward quantum dynamical system of fermions, physicists have discovered a quantum mechanism to make wormholes traversable. This type of “wormhole teleportation” was also created by researchers using entangled quantum systems, and the outcomes were confirmed using classical computers.
In this experiment, researchers used the Sycamore 53-qubit quantum processor to teleport a quantum state from one quantum system to another to send a signal “through the wormhole.” The research team had to find entangled quantum systems that behaved as predicted by quantum gravity while also being small enough to run on current-generation quantum computers.
Finding a simple enough many-body quantum system that maintains gravitational properties was a key challenge for this work. The team gradually reduced the connectivity of highly interacting quantum systems using machine learning (ML) techniques to accomplish this. Each example of a system with behaviour that is consistent with quantum gravity that emerged from this learning process only needed about 10 qubits, making it the ideal size for the Sycamore processor.
It was crucial to find such tiny examples because larger systems with hundreds of qubits would not have been able to run on the quantum platforms currently in use. After inserting a qubit into one system and sending an energy shockwave across the processor, the team observed the same information emerge in the other 10-qubit quantum system.
Depending on whether a positive or negative shockwave was applied, the team measured how much quantum information was transferred between the two quantum systems. The researchers demonstrated that a causal path between the two systems can be established if the wormhole is kept open for long enough by the negative energy shockwaves: the qubit inserted into one system indeed appears in the other.
The team then used conventional computer calculations to confirm these and other properties. This is not the same as running the experiment on a traditional computer: although the system can be simulated on a classical computer, and this was done as described in the paper, a conventional simulation, which manipulates classical bits, zeros and ones, does not create a physical system.
Future quantum gravity experiments could be conducted using more advanced entangled systems and larger quantum computers because of this new research. This research does not replace direct observations of quantum gravity, such as those obtained through the Laser Interferometer Gravitational-wave Observatory’s detection of gravitational waves.
The Counter Ransomware Task Force (CRTF), which was formed to bring together Singapore Government agencies from various domains to strengthen Singapore’s counter-ransomware efforts, has issued its report.
Singapore’s efforts to promote a resilient and secure cyber environment, both domestically and internationally, to combat the rising ransomware threat are guided by the recommendations in the CRTF report.
According to David Koh, Commissioner of Cybersecurity, Chief Executive of CSA and Chairman of the CRTF, ransomware poses a threat to both businesses and individuals. It can be detrimental economically, socially, and even in terms of national security, and it is a problem that cuts across borders and domains.
“It requires us to collaborate and draw on our knowledge in a variety of fields, including cybersecurity, law enforcement, and financial supervision. It also necessitates that we work with like-minded international partners to identify a common problem and develop solutions,” David explains.
He exhorts businesses and individuals to contribute as well, strengthening the nation’s overall defence against the ransomware scourge.
Cybercriminals use malicious software known as ransomware. When ransomware infects a computer or network, it either locks the system or encrypts the data on it. For the release of the data, cybercriminals demand ransom money from their victims.
A vigilant eye and security software are advised to prevent ransomware infection. Following an infection, victims have three options: pay the ransom, attempt to remove the malware, or restart the device.
Extortion Trojans frequently employ the Remote Desktop Protocol, phishing emails, and software vulnerabilities as their attack vectors. Therefore, a ransomware attack can target both people and businesses.
The ransomware threat has significantly increased in scope and effect, and it is now a pressing issue for nations all over the world, including Singapore.
The fact that attackers operate internationally to elude justice makes it a global issue. Ransomware has created a criminal ecosystem that offers criminal services ranging from unauthorised access to targeted networks to money laundering services, all fed by illicit financial gains.
Singapore must approach the ransomware issue as a cross-border and cross-domain problem if it is to effectively combat the ransomware threat.
Other nations should adopt comparable domestic measures to coordinate their financial regulatory, law enforcement, and cybersecurity agencies to combat the ransomware issue and promote international cooperation.
The CRTF’s work culminated in three significant results. First, it developed a comprehensive understanding of the ransomware kill chain so that government agencies can collaborate and create anti-ransomware solutions.
Second, it examined Singapore’s stance on paying ransom to cybercriminals. Third, for the government to effectively combat ransomware, the CRTF suggested the following policies, operational plans, and capabilities under four main headings:
Pillar 1: Enhance the security of potential targets (such as government institutions, critical infrastructure, and commercial organisations, especially small and medium-sized businesses) to make it more difficult for ransomware attackers to carry out successful attacks.
Pillar 2: Disrupt the ransomware business model to lower the reward for ransomware attacks.
Pillar 3: Support recovery, so that ransomware attack victims do not feel pressured to pay the ransom, which feeds the ransomware industry.
Pillar 4: Assemble a coordinated international strategy to combat ransomware by cooperating with international partners. Singapore should concentrate on and support efforts to promote international cooperation in three areas identified by the CRTF: law enforcement, anti-money laundering measures, and discouraging ransom payments.
The appropriate government agencies will take the recommendations of the CRTF under consideration for additional research and action.
An international team led by The Chinese University of Hong Kong (CUHK)’s Faculty of Medicine (CU Medicine) has successfully developed the world’s first artificial intelligence (AI) model that can detect Alzheimer’s disease solely through fundus photographs or images of the retina. The model is more than 80% accurate after validation.
Fundus photography is widely accessible, non-invasive and cost-effective. This means that the AI model incorporated with fundus photography is expected to become an important tool for screening people at high risk of Alzheimer’s disease in the community. Details have been published in The Lancet Digital Health, a journal in The Lancet family.
Limitations of current Alzheimer’s disease detection methods
In Hong Kong, 1 in 10 people aged 70 or above suffers from dementia, with more than half of those cases attributed to Alzheimer’s disease. This disease is associated with an excessive accumulation of abnormal amyloid plaque and neurofibrillary tangles in the brain, leading to the death of brain cells and resulting in progressive cognitive decline.
The Clinical Professional Consultant of the Division of Neurology in CU Medicine’s Department of Medicine and Therapeutics stated that memory complaints are common among middle-aged and elderly people, and are often considered a sign of Alzheimer’s disease.
It is sometimes difficult to make an accurate diagnosis of Alzheimer’s disease based on cognitive tests and structural brain imaging. However, methods that detect Alzheimer’s pathology directly, such as an amyloid-PET scan or testing of cerebrospinal fluid collected via lumbar puncture, are invasive and less accessible.
To address the current clinical gap, CU Medicine has led several medical centres and institutions from Singapore, the United Kingdom and the United States to successfully develop an AI model using state-of-the-art technologies which can detect Alzheimer’s disease using fundus photographs alone.
Studying disorders of the central nervous system via the retina
The S.H. Ho Professor of Ophthalmology and Visual Sciences and Chairman of CU Medicine’s Department of Ophthalmology and Visual Sciences explained that the retina is an extension of the brain in terms of embryology, anatomy and physiology. In the entire central nervous system, only the blood vessels and nerves in the retina allow direct visualisation and analysis.
Thus, it is widely considered a window through which disorders in the central nervous system can be studied. Through non-invasive fundus photography, a range of changes in the blood vessels and nerves of the retina that are associated with Alzheimer’s disease can be detected.
The team developed and validated their AI model using nearly 13,000 fundus photographs from 648 Alzheimer’s disease patients (including patients from the Prince of Wales Hospital) and 3,240 cognitively normal subjects. Upon validation, the model showed 84% accuracy, 93% sensitivity and 82% specificity in detecting Alzheimer’s disease. In the multi-ethnic, multi-country datasets, the AI model achieved accuracies ranging from 80% to 92%.
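The reported figures follow the standard definitions of binary-classification metrics. The sketch below is illustrative only: the confusion-matrix counts are hypothetical (not taken from the study) and are chosen simply to show how accuracy, sensitivity and specificity are derived from true/false positives and negatives.

```python
def classification_metrics(tp, fn, tn, fp):
    """Return (accuracy, sensitivity, specificity) for a binary classifier.

    tp/fn: Alzheimer's patients correctly / incorrectly classified
    tn/fp: cognitively normal subjects correctly / incorrectly classified
    """
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # true positive rate: patients correctly flagged
    specificity = tn / (tn + fp)   # true negative rate: healthy subjects correctly cleared
    return accuracy, sensitivity, specificity


# Hypothetical validation counts for illustration (not the study's data)
acc, sens, spec = classification_metrics(tp=93, fn=7, tn=82, fp=18)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, accuracy={acc:.3f}")
```

Note that accuracy also depends on how many patients versus healthy subjects are in the validation set, which is why a model can report, say, 93% sensitivity and 82% specificity alongside an overall accuracy that sits between the two.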
The accessibility, non-invasiveness and cost-effectiveness of the AI model using fundus photography aid the detection of Alzheimer’s cases in both the clinic and the community
A Professor of Medicine and Director of the Therese Pei Fong Chow Research Centre for Prevention of Dementia at CU Medicine stated that in addition to its accessibility and non-invasiveness, the accuracy of the new AI model is comparable to imaging tests such as magnetic resonance imaging (MRI).
It shows the potential to become not only a diagnostic test in clinics but also a screening tool for Alzheimer’s disease in community settings. Looking ahead, the team aims to validate its efficacy in identifying high-risk cases of the disease hidden in the community, so that various preventive treatments such as anti-amyloid drugs can be initiated early to slow down cognitive decline and brain damage.
The Associate Professor in the Department of Ophthalmology and Visual Sciences at CU Medicine said that in addition to applying novel AI technologies in the model, the team also tested it in different scenarios. Notably, their AI model retained a robust ability to differentiate between subjects with and without Alzheimer’s disease, even in the presence of concomitant eye diseases like macular degeneration and glaucoma which are common in city-dwellers and the older population.
Their results further support the hypothesis that the team’s AI analysis of fundus photographs is an excellent tool for the detection of memory-depriving Alzheimer’s disease. To move this research towards clinical application, the team is developing an integrated, AI-based platform to combine information from both blood vessels and nerves of the retina captured by fundus photography and optical coherence tomography for the detection of Alzheimer’s disease. Their findings should provide more evidence to move AI from code to the real world.
The Ministry of Information and Communications (MIC) announced it would roll out Internet advertising management measures at a conference in Hanoi earlier this week. Participants at the event discussed how advertising in cyberspace has become the norm. Domestic and foreign firms choose it because it is easier to access customers and it offers flexible costs and larger reach. However, the limited management of ads poses potential risks to the safety of brands, the Ministry has said.
According to a press release by MIC, ad agents affirmed that without the cooperation of cross-border platforms in modifying algorithms to filter and censor content, ad violations will remain rampant. The Ministry will penalise agents and brands that cooperate with platforms that do not fall in line with MIC regulations. On the other hand, the Ministry will support ads on domestic and foreign digital platforms that comply with domestic laws, MIC’s Deputy Minister, Nguyen Thanh Lam, noted. This will protect brands and build a healthy, safe, and fair ad business environment.
The Ministry will also step up inspections and clamp down on violations in Internet ad activities, he said. Cross-border ad firms that fail to comply with Vietnam’s laws will not be allowed to operate in the country. MIC has also compiled a Whitelist consisting of licensed e-newspapers, magazines, general information websites, and social media. Other websites, registered accounts, and information channels are also in the pipeline for the list, the release said. The list will be publicised on the portals of the Ministry and the Authority of Broadcasting and Electronic Information. Ad service providers, agents, and brands were also urged to use the list in their work.
Nearly 80% of the population in Vietnam are digital consumers, as OpenGov Asia reported earlier in October. Over the past year, the contribution of e-commerce to total retail has continued to grow at 15%, higher than in India (10%) and China (4%), with an online-to-total retail share of 6%. Now that the world is in the post-pandemic stage, regional consumers are prioritising an integrated shopping experience, combining online and in-person services. During the ‘discovery’ phase of their shopping, 84% of Vietnamese shoppers use the Internet to browse and find items. This is a period when they use more platforms than ever before, with the e-commerce market dominating at 51% of online spending.
At the same time, social networking sites account for nearly half of online discoveries, including images (16%), social media videos (22%), and related tools such as messaging (9%). These tools were paramount channels for 44% of survey respondents. Consumers’ openness to interaction and experimentation has also led to behavioural changes, with 64% of respondents saying they have interacted with a business account in the past year. As customers seek more engagement, the content creation economy is able to grow exponentially.
In the context of digital consumption, Vietnamese users switch brands more often and increase the number of platforms they use to find a better value, with 22% of online orders made on various e-commerce platforms. The number of online platforms Vietnamese consumers use has doubled from 8 in 2021 to 16 in 2022. Therefore, it is important to put in place proper ad regulations as Internet usage grows.