OpenGov had the opportunity to interview Dan Rothman, Chief Technology Officer at the City of Boston. He talked about providing and maintaining all of the City's enterprise-wide IT infrastructure, and discussed the consolidation of data centres and the sharing of infrastructure with communities surrounding Boston. He explained how the city produced near-limitless bandwidth through the Boston Optical Fibre Network (BoNET) with very limited capital expenditure.
Can you tell us about your role at the City of Boston?
I am in the Department of Innovation and Technology (DOIT), which is the enterprise IT organisation for the City of Boston. Anything that is enterprise-wide is done within our group. The overall infrastructure that supports daily communications and computing needs is supported within DOIT. This includes IT security and networking, data centre operations, mainframe operations, service management, telecommunications, radio networks, and video networks.
All applications and infrastructure that are not agency-specific are supported through the DOIT department. For instance, the enterprise financial system, the enterprise system for hiring and payroll, and the CRM systems that are used across multiple agencies are done through us. Those are permanent systems with broad reach. There are other silos of IT that are specific to the agencies.
Are there instances where something that should be enterprise-wide is not?
Sure, there are instances. For example, I really think that video surveillance should be managed on an enterprise-wide basis. There should be a common set of cameras that provide multiple agencies the video infrastructure they need.
But different agencies own their own infrastructure. Our transportation division, the police department, the school system, they all have their own infrastructure. We make them interoperable and we work on sharing them. But the reality is that this stuff should be a common resource that should be deployed for multiple uses.
So, there is room for improvement in that. We have been trying to get multiple users to transfer their budgets into a common video infrastructure budget. But that hasn't happened yet.
Initially, 7 different video management systems (VMS) were being used within the city. We achieved some consolidation and got it down to 3 systems. Then we put a system on top of that which allowed sort of one-stop-shop access to all the systems. It didn't really work out very well.
We ended up trying to make it better by making a separate network architecture available for video. Maybe we could have some common architecture to make it simpler. We also built up capability within the city for other agencies to adopt an existing VMS and to scale it up. We leveraged inexpensive storage. If they adopted this, it would be cheaper than building their own infrastructure and storage.
We were also able to set some standards. We made sure we had the capacity to get everyone into compliance. We still have 3. But we got pretty much all the outliers to combine with the one system. That same system is shared by the state government, the state transportation authority, and by a bunch of state agencies.
There are other examples. There are often outliers who have not adopted a city-wide deployed system.
The complexity of city government is such that there are lots of different kinds of agencies. Some of them are purely under the city; some are hybrids, quasi-independent agencies. Maybe they are not reporting directly to the mayor but to some other type of government body. So, a lot of the time we don't necessarily have a mandate to force change. We have to cajole; we have to use a carrot-and-stick approach.
So sometimes we will make a facility available. We will get participation in the process from a group. We will try to get other silos of IT to participate. It’s never 100% successful. But even if we get 90% of city agencies to adopt something, it’s better than nothing.
We were dependent upon telecoms for data services. Agencies had T1s or T3s, lines that offered sufficient bandwidth but were very expensive and not terribly scalable.
Around 7 years ago, the City of Boston was able to get Comcast to provide dark fibre in lieu of a legal mandate for shadow conduit during construction. The legal permitting process mandated that a certain amount of shadow conduit had to be put in place for the city any time trenches were dug within the city.
In lieu of them putting in that conduit, we said: give us x number of strands of fibre in these locations. That allowed us to get this fibre network for free. Initially its use was focused on public safety and the main hub for city operations, providing connectivity to a few key large buildings and the public safety infrastructure.
Basically, with that we could push as much data as we wanted. We initially rolled out 1 Gbps, with a 2 Gbps backhaul. We expanded over the years. Now we are up to a 100 Gbps backbone, and at some locations we are pushing 10 Gbps at the edge.
With very limited capital expenditure, we were able to produce near-limitless bandwidth for the people within the city. It let us do a lot of things that wouldn't be practical otherwise. In most cases, if you have a thousand video cameras, you don't want that traffic going across significant distances, because video is a monster. But having this infrastructure, we can consolidate that video.
It also lets us sort of become the Internet provider for all the city employees, for libraries, for schools, because we are able to throw as much bandwidth as we need across that network, to the end-point.
You were talking about consolidation of data centres (in a pre-interview chat). What is being done in that area?
Initially we had 8 data centres within the City of Boston footprint. These were not purpose-built data centres. They tended not to meet the highest tiers for redundancy and resilience, because they were just ordinary city buildings converted to that use.
So, we have been doing a couple of things. One, we have been trying to migrate city infrastructure into purpose-built data centres, managed professionally and meeting higher standards. So, initially we moved our production environments into premier data centre space in the city, which was way better than our environment.
Our main data centre for city hall is in the floodplain, below sea level. There were concerns around that. We would probably need a multi-million dollar investment to upgrade the environmental controls to meet our future standards for energy and many other things.
So, the next step was to do a Request for Proposal (RFP) for data centre space outside of the city's limits, to give ourselves some geographical diversity and be able to avert concerns of regional disasters. We ended up building space within another commercial data centre, around 240 miles (386 km) away. That was far enough to provide geographical diversity in terms of weather issues like hurricanes. It was also far enough to be on separate power grids. That helps improve resilience.
Once we had built that, the agencies behind all the other city data centres wanted to have a presence outside the city's physical footprint for resilience. So, we made that available. We have been in the process of building out the second data centre and making space available to other agencies to collapse those 8 into 2. But it's not a mandate. We make it available; they can adopt it or not.
How are you preparing for the future?
I don't think we are going to be limited by bandwidth in the foreseeable future, at least for a decade or two.
There might be a paradigm shift in technology. But right now, technologies such as small cells are all dependent on fibre for backhaul. They still need to get to that fibre connectivity at some point. At the moment, a lot of the wireless technology is very vulnerable; it's fragile. So, having that fibre pathway in place, which we do, is going to be critical for the future, because there is no resilient, non-fibre-based solution at this point.
In dealing with IT infrastructure, what are the primary cybersecurity concerns you face?
Today we have systems which are not traditional IT systems, and the people installing them are not IT companies. The guy installing security cameras may know enough to put an IP address in there and configure it. But he doesn't know enough to properly turn off the ports and protocols that are unnecessary. He may, for convenience, leave the default passwords in place so that any of his customers can get into the camera easily and tweak it. He is not an IT guy, and he doesn't know that it is bad practice.
There are similar issues with, say, an intelligent air-conditioning system, where you are collecting power consumption data and using it to fine-tune the air conditioning to reduce power consumption.
As more and more things become part of the Internet of Things, the attack surface expands. These devices are not properly secured, and some of them may not allow you to secure them properly at all. Typically, they don't get patched on a monthly basis like a computer operating system. At best, it might happen once or twice a year.
If someone gets into these devices, they can then use that to get into other things. It's a challenge, and it means that you have that much more surface area to cover. And the traditional tools used to manage the IT environment do not have reach into these IoT devices.
You need specific expertise and knowledge, and you usually have to go the extra mile to understand how these devices can be secured and what you need to do for security. Someone has to audit them and see, for example, what the passwords on the cameras are.
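The kind of audit described above can be sketched in a few lines. This is a hypothetical illustration, not the city's actual tooling: the device names, port list, and credential pairs are assumptions, and in practice the inventory would come from a network scan rather than being supplied by hand.

```python
# Hypothetical sketch of an IoT hygiene audit: flag devices that still expose
# risky ports or ship with factory-default credentials. Device data is supplied
# directly here so the logic is self-contained; a real audit would gather it
# from a scan of the camera network.

DEFAULT_CREDENTIALS = {("admin", "admin"), ("admin", "12345"), ("root", "root")}
RISKY_PORTS = {23: "telnet", 21: "ftp", 80: "http (unencrypted admin UI)"}

def audit_device(name, open_ports, credentials):
    """Return a list of human-readable findings for one device."""
    findings = []
    for port in open_ports:
        if port in RISKY_PORTS:
            findings.append(
                f"{name}: port {port} open ({RISKY_PORTS[port]}), disable if unused"
            )
    if credentials in DEFAULT_CREDENTIALS:
        findings.append(f"{name}: factory-default password still set, change it")
    return findings

# Example inventory, as an installer might leave it: telnet open, default login
report = audit_device("camera-17", open_ports=[23, 554], credentials=("admin", "admin"))
for line in report:
    print(line)
```

Even a simple check like this catches the two failure modes Rothman describes: unnecessary ports left open and default passwords never changed.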
Does the ICT infrastructure of the City of Boston have connections with the ICT infrastructure of the state, or other cities?
We have specific state agencies that we bridge to. Some of the buildings that we are in are state buildings.
MBTA (Massachusetts Bay Transportation Authority) runs all the transport infrastructure in the Greater Boston area. We have inter-connectivity with them for camera sharing. We also have some connectivity to state police and some other agencies like that.
We also have connectivity with the surrounding cities. We got federal funds to interconnect the fibre-optic infrastructure of the communities surrounding Boston with ours.
There's a federal program, E-Rate, which subsidises telecommunications and Internet access for schools and libraries. We compete with the telcos, with Verizon and AT&T, and we bid against them to provide services to the schools and libraries in Boston; then we get federal reimbursement. Now we have extended that out to the surrounding communities. We are providing E-Rate-eligible ISP services and managed security services to the surrounding communities at a very low rate.
We are almost giving it to them for the cost of the federal reimbursement. It gives them not only the data services but some security services too, which the communities don't have the budget for. They benefit from infrastructure investment in security features like next-generation firewalls.
The federal reimbursement subsidises our network infrastructure. Also, this helps in our negotiations for funds with the federal government.
Boston is the 23rd biggest city in the country. But the metro region is the 10th biggest. As a group, we are much bigger, which makes it easier to get federal funds. We are leveraging infrastructure and spreading the funds across a larger area. And we are helping out.
 Dark fibre is optical fibre infrastructure that is not in use. Much of the cost of installing cables comes from the civil engineering work required. Hence, the cable owners usually plan for, and install, significantly more fibre than is needed for current demand, to provide for future expansion and provide for network redundancy.
 In Boston, the "shadow conduit" policy demands that the first company to dig ask other companies about their potential needs, so that shadow conduits can be reserved for future users.
The National Heart Centre Singapore (NHCS) has been on a remarkable journey of advancements in cardiovascular research, particularly in the prevention, diagnosis, and management of heart diseases. With the global rise in heart disease cases, NHCS’s dedication to scientific knowledge and innovation has become increasingly vital.
Since its establishment in 2014, the National Heart Research Institute of Singapore (NHRIS) at NHCS has positioned itself as a leading institution for cardiovascular research in the region. Over the years, NHRIS has achieved significant breakthroughs that hold the potential to transform patient outcomes.
NHRIS's research encompasses a wide spectrum of disciplines within cardiovascular medicine, spanning basic, translational, and clinical research. Notable achievements include heart stem cell therapy and the prevention of fibrosis.
By studying patients’ heart stem cells, researchers have uncovered new treatments for heart diseases. For example, a breakthrough treatment using myeloperoxidase has been discovered for hypertrophic cardiomyopathy, an inherited condition characterised by thickening of the heart muscle.
Also, through the study of heart tissue from patients undergoing surgery, NHRIS researchers have identified a potential treatment involving interleukin-11 antibodies to prevent inflammation and fibrosis in the heart and other organs. This innovative therapy has the potential to improve outcomes for patients with various inflammatory and fibrotic conditions.
The next phase of NHCS’s research efforts over the coming years will focus on three key areas:
- Discovery of New Treatments: Ongoing research aims to develop new treatments for heart diseases, enhancing patient outcomes.
- Utilising Artificial Intelligence: NHCS is at the forefront of integrating artificial intelligence (AI) into cardiovascular care. AI holds promise in predicting, diagnosing, and monitoring heart diseases with greater precision and efficiency. The APOLLO study, initiated in 2021, is building an AI-driven national platform for coronary angiography analysis, offering detailed reports on patients’ conditions and future cardiovascular disease risk.
- Clinical Trials and Population Health Studies: NHCS’s research agenda includes conducting clinical trials and population health studies to prevent the onset of heart disease.
NHRIS is pioneering innovative approaches, including Visualising Energy Pathways and AI Applications.
Disturbances in the energy-producing pathways of heart muscle contribute to heart conditions. Hyperpolarised magnetic resonance spectroscopy, a novel imaging technology available in only a few centres worldwide, allows the measurement of these metabolic pathways, potentially leading to new treatments for heart disease.
On the other hand, AI accelerates research in the field of cardiovascular science. By processing vast datasets and identifying patterns, AI systems assist researchers in identifying novel treatment methods, risk factors, and disease mechanisms. These insights lead to breakthroughs in treatment and prevention methods, advancing the overall understanding of cardiovascular diseases.
With this, NHCS is leveraging AI to detect, predict, and diagnose heart diseases by analysing complex imaging data. AI provides clinicians with invaluable insights, enabling personalised care and early intervention.
In addition, NHCS collaborates with other heart research institutes and hospitals through CADENCE (Cardiovascular Disease National Collaborative Enterprise), a national platform that combines heart research capabilities in data science, clinical trials, and AI. This collaboration ensures a collective effort to advance cardiovascular research and improve patient care.
NHCS’s groundbreaking research initiatives in AI applications, clinical trials, and collaborative efforts underscore its commitment to enhancing patient care. As NHCS continues its pursuit of research excellence, its impact extends beyond Singapore, benefiting individuals across the region and around the world. The institution is poised to make substantial progress in preventing, diagnosing, and managing cardiovascular diseases, ultimately reshaping the future of cardiovascular medicine.
An innovative microscope developed by a research team at the Hong Kong University of Science and Technology (HKUST) is poised to revolutionise the field of cancer surgery. This cutting-edge microscope, powered by artificial intelligence, has the potential to transform the way surgeons detect and remove cancerous tissue during operations, thereby sparing patients from the distressing prospect of secondary surgeries.
Lung cancer, a leading cause of cancer-related deaths worldwide, has been a focal point for this ground-breaking research. Professor Terence Wong Tsz-Wai, the principal investigator of the project and an assistant professor in the Department of Chemical and Biological Engineering at HKUST, highlights the urgency of their work.
He notes that between 10% to 20% of lung cancer surgery cases require patients to return for a second operation due to incomplete removal of cancer cells. This uncertainty has long plagued surgeons, who often struggle to determine if they’ve successfully excised all cancerous tissue during the initial surgery.
The HKUST research team, led by Prof. Wong, is eager to see their innovation make a significant impact. Collaborating with five hospitals, including Queen Mary Hospital, Prince of Wales Hospital in Hong Kong, and three mainland Chinese hospitals, they have embarked on a large-scale clinical trial involving around 1,000 patient tissue samples. The goal is to have the microscope officially in service locally by 2024 and on the mainland by 2025.
The current methods for imaging cancer tissue offer either accuracy with lengthy delays or speed at the cost of accuracy. Traditional microscopy, considered the gold standard, is highly accurate but can take up to a week to generate results. This means patients must endure a week of anxious waiting to know the outcome of their surgery. In cases where the operation is deemed unsuccessful, patients face the daunting prospect of a second surgery to remove the remaining cancer cells.
The alternative, known as the frozen section, provides quicker results within 30 minutes but sacrifices accuracy, with an estimated accuracy rate of only around 70%.
The HKUST research team’s breakthrough technology, termed “Computational High-throughput Autofluorescence Microscopy by Pattern Illumination” (CHAMP), has changed this landscape. It can detect cancer cells in just three minutes with an accuracy rate exceeding 90%, rivalling the gold standard but with significantly faster results.
CHAMP employs ultraviolet (UV) light excitation to image tissue surfaces at a specific wavelength. Subsequently, a deep learning algorithm transforms the obtained greyscale image into a histological image, facilitating instant interpretation by doctors. This real-time feedback empowers surgeons to ensure they have completely removed all cancer cells during the operation.
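The "greyscale to histological image" step can be illustrated with a toy stand-in. This is not CHAMP's actual algorithm: the real system uses a trained deep learning model, whereas the fixed two-colour mapping below only illustrates the general idea of "virtual staining", turning a single-channel autofluorescence image into an H&E-like colour image a pathologist can read. The colour values are illustrative.

```python
import numpy as np

# Toy stand-in for CHAMP's learned translation step: map a greyscale
# autofluorescence image to an H&E-like RGB image. A fixed linear blend
# between an eosin-like and a haematoxylin-like colour replaces the real
# deep network; the colours and the mapping itself are assumptions.

def virtual_stain(grey):
    """grey: 2-D float array in [0, 1]. Returns an (H, W, 3) RGB image."""
    grey = np.clip(grey, 0.0, 1.0)
    eosin = np.array([0.92, 0.55, 0.70])         # pinkish, low-signal regions
    haematoxylin = np.array([0.40, 0.25, 0.60])  # purplish, high-signal regions
    # Blend the two "stains" per pixel, driven by the greyscale intensity
    rgb = (1.0 - grey)[..., None] * eosin + grey[..., None] * haematoxylin
    return rgb

img = np.linspace(0.0, 1.0, 16).reshape(4, 4)  # synthetic 4x4 test image
stained = virtual_stain(img)
print(stained.shape)  # (4, 4, 3)
```

The deep learning model in CHAMP learns this mapping from paired examples instead of using a hand-picked blend, which is what lets the output resemble genuine histology closely enough for instant interpretation.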
CHAMP’s potential has garnered local, regional, and international acclaim, leading to the establishment of a start-up supported by HKUST and funded by the Technology Start-up Support Scheme for Universities (TSSSU). Beyond developing the technology, the company plans to manufacture CHAMP microscopes for medical institutions in Hong Kong, mainland China, and overseas markets.
This endeavour represents the culmination of years of meticulous research, starting with Prof. Wong’s PhD training at Washington University in St. Louis and the California Institute of Technology. During this period, Prof. Wong, under the guidance of biomedical imaging expert Prof. Lihong Wang, developed a microscope capable of analysing breast cancer tumours with an accuracy rate comparable to the gold standard but with results in just one to two hours.
The shift in focus to lung cancer occurred when a pulmonologist approached Prof. Wong, recognising the potential of the technology to enhance precision during lung cancer surgery. This decision led to the development of CHAMP microscopy, which is approximately 100 times faster than Prof. Wong’s earlier work during his PhD training. This breakthrough makes CHAMP clinically useful and impactful.
The applications of CHAMP extend beyond lung and breast cancers. The research team is conducting tests on smaller scales for conditions such as liver, colorectal, kidney, and skin cancers, as well as prostate gland conditions. Prof. Wong is confident that CHAMP will elevate medical imaging and diagnosis to new heights, benefiting not only Hong Kong hospitals but also healthcare institutions nationwide and abroad. This pioneering technology represents a beacon of hope for cancer patients, offering the promise of quicker, more accurate surgeries and improved outcomes.
OpenGov Asia reported that the Hong Kong Science and Technology Parks Corporation (HKSTP) spearheaded an initiative aimed at promoting innovation and technology in the biotech sector, showcasing Hong Kong’s pioneering advancements and entrepreneurial spirit.
This initiative was part of the “Think Business, Think Hong Kong” event organised by the Hong Kong Trade Development Council (HKTDC) in Paris recently. The event was a platform to underscore the potential for cross-border collaboration between Hong Kong and France in the field of biotechnology and innovation.
The government has unveiled the Intelligent Grievance Monitoring System (IGMS) 2.0 Public Grievance Portal and Automated Analysis in the Tree Dashboard portal under the Department of Administrative Reforms and Public Grievances (DARPG). It was unveiled by Jitendra Singh, the Union Minister of State (Independent Charge) for Science and Technology.
The IGMS 2.0 Dashboard was developed by the Indian Institute of Technology, Kanpur (IIT-Kanpur) as part of an agreement with the DARPG through a memorandum of understanding (MoU) signed in 2021. It enhances DARPG's Centralised Public Grievance Redress and Monitoring System (CPGRAMS) by integrating artificial intelligence (AI) capabilities. CPGRAMS is an online platform available to citizens round-the-clock to lodge their grievances with the public authorities on any subject related to service delivery.
The dashboard offers instant tabular analyses of both grievances filed and disposed of. It provides data categorised by state and district for grievances filed, and it also offers Ministry-wise data. Additionally, the dashboard can help officials identify the root causes of grievances.
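The tabular roll-ups described above amount to grouping grievance records by state and by ministry. The sketch below is illustrative only; the record fields and values are assumptions, not DARPG's actual schema.

```python
from collections import Counter

# Illustrative sketch of the aggregations an IGMS-style dashboard produces:
# grievances filed grouped by state, and grievances disposed of grouped by
# ministry. Field names and sample records are hypothetical.

grievances = [
    {"state": "Uttar Pradesh", "ministry": "Railways", "disposed": True},
    {"state": "Uttar Pradesh", "ministry": "Finance",  "disposed": False},
    {"state": "Kerala",        "ministry": "Railways", "disposed": True},
]

filed_by_state = Counter(g["state"] for g in grievances)
disposed_by_ministry = Counter(g["ministry"] for g in grievances if g["disposed"])

print(dict(filed_by_state))        # {'Uttar Pradesh': 2, 'Kerala': 1}
print(dict(disposed_by_ministry))  # {'Railways': 2}
```

At the portal's real scale of roughly 2 million grievances a year, the same grouping logic would run over a database rather than an in-memory list, but the shape of the analysis is the same.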
The CPGRAMS portal receives an increasingly high caseload of issues raised by the general public, approximately 2 million grievances annually, and the public expects the timely resolution of their grievances.
Due to the substantial volume of grievances received, the manual classification and monitoring of cases is not feasible. The IGMS portal will assist the DARPG in generating draft letters for specific schemes or ministries. This automation expedites the grievance redressal process carried out by the respective ministries and departments involved.
According to Minister Singh, the Prime Minister has repeatedly emphasised the significance of grievance redressal as a crucial element to keep the government accountable and promote citizen-centric governance. In alignment with this vision, a more robust human interface mechanism has been introduced, which includes counselling services provided after the resolution of grievances.
The Minister praised DARPG for ensuring that the CPGRAMS portal is accessible in 22 Scheduled languages, in addition to English, ensuring that the benefits of the portal are accessible to the common man. He also emphasised the importance of integrating state public grievance (PG) portals and other government portals with CPGRAMS for more effective and streamlined grievance redressal processes.
He claimed that thanks to the reforms implemented by DARPG in CPGRAMS, the average time it takes for central ministries and departments to resolve public grievances has decreased. There has been a decline of around 44% in the average disposal time for central ministries and departments, from 32 days in 2021 to 18 days in 2023.
Minister Singh also launched the Swachhata Special Campaign 3.0 and unveiled the Precedent Book (e-book) developed by the department. He praised the DARPG for achieving the transition to a fully paperless office, where all communication is conducted through the eOffice portal.
During the past two Swachhata campaigns, an impressive 9 million square feet of prime office space has been successfully cleared and repurposed for productive use. Additionally, 456,000 public grievances have been effectively redressed, and 8,998 references from Members of Parliament (MPs) have been addressed. The Swachhata campaign has also played a pivotal role in promoting an eOffice work culture within the government, resulting in over 90% of file work being transitioned to an online format.
Public transportation is a crucial service for enhancing public satisfaction with government services. In light of this, the Indonesian government has established high-speed rail infrastructure for Jakarta-Bandung mobility.
The Ministry of Communication and Information Technology (Kominfo) fully supports the Jakarta-Bandung High-Speed Train (KCJB) WHOOSH operation. Kominfo's Budi Arie Setiadi said the ministry continuously monitors the availability and reliability of digital connectivity, particularly the telecommunications networks along the first high-speed rail route in Indonesia.
“We, along with the telecommunications ecosystem, conducted tests. Kominfo is tasked with supporting signal-related issues. We assessed the signal quality along our journey and found that we could use devices and frequencies for communication,” he explained.
Minister Budi Arie emphasised that KCJB, as a technological leap for Indonesia’s progress, needs full support from the latest telecommunications technology. With advancements in transportation paralleled by digital technology, it will undoubtedly facilitate more efficient access for the public.
“This is a technological leap for Indonesia’s progress. Because this train is solid, the tracks are seamless, and the signal is robust. Our duty and responsibility are to support it,” he added.
Kominfo assured that the quality of telecommunications services would sustain the overall KCJB service. According to them, the journey from KCJB Halim Station to KCJB Padalarang Station and vice versa proceeded smoothly.
“Overall, the management and governance of the high-speed train are excellent,” he noted.
At this trial event, Minister Budi Arie Setiadi was joined by Deputy Minister of Kominfo Nezar Patria and senior officials from the Ministry of Communication and Information Technology. Minister Budi Arie encouraged the telecommunications service provider network to oversee and guarantee the quality of the network.
Ismail, the Director-General of Resources and Equipment of Posts and Information Technology at Kominfo, explained that the test conducted by Kominfo officials and telecommunications service providers is part of the initial process to support digital connectivity for KCJB. Kominfo has prepared radio frequency spectra for quality telecommunications signal transmission.
“And, fortunately, the signal used, or the frequency used, is now in collaboration with one of the biggest telecommunication companies in Indonesia. This cooperation began about two or three years ago. And, thank God, we witnessed today that the train’s communication system worked well. No signal interruptions,” he stated.
Director-General Ismail said that 5G telecommunication networks are available at Halim KCJB Station and Padalarang KCJB Station. This network supports connectivity and signifies that Indonesia is ready for full-scale and comprehensive digital transformation, even in minor details.
“For these two station locations here (Halim) and in Padalarang, the 5G signal has already been covered. Passengers at these stations can now enjoy 5G services. The remaining task is to improve the signal for passengers during the journey. So, from Jakarta to Padalarang and Bandung, we hope there will be no frequency or cellular signal interruptions,” he explained.
Next, Henry Mulya Syam, the President and Director of the Telecommunication company, stated that they would address several remaining telecommunications service challenges at various points along the KCJB route.
“There are several sites to be added, both outdoor and on the KCJB panel. We have conducted evaluations, so hopefully, within 6 to 9 months, because new towers need to be built,” he clarified.
Previously, together with President Joko Widodo and several members of the Indonesia Maju Cabinet, Minister of Communication and Information Technology Budi Arie Setiadi conducted a test journey on the KCJB from Halim Station, East Jakarta, to Padalarang Station, West Bandung Regency. The KCJB, WHOOSH, travels 350 kilometres per hour, making it the first high-speed train in Indonesia and Southeast Asia.
Oak Ridge National Laboratory (ORNL) has introduced the Centre for AI Security Research (CAISER) to confront the threats stemming from the widespread adoption of artificial intelligence by governments and industries worldwide. The move recognises the potential benefits of AI in data processing, operational streamlining, and decision-making, while acknowledging the associated security challenges.
ORNL and CAISER will collaborate with federal agencies such as the Air Force Research Laboratory’s Information Directorate and the Department of Homeland Security Science and Technology Directorate. Together, they will conduct a comprehensive scientific analysis to assess the vulnerabilities, threats, and risks associated with emerging and advanced artificial intelligence, addressing concerns ranging from individual privacy to international security.
Susan Hubbard, Deputy for Science and Technology at ORNL, emphasised this endeavour, “Understanding AI vulnerabilities and risks represents one of the most significant scientific challenges of our time. ORNL is at the forefront of advancing AI to tackle critical scientific issues for the Department of Energy, and we are confident that our laboratory can assist DOE and other federal partners in addressing crucial AI security questions, all while providing valuable insights to policymakers and the general public.”
CAISER represents an expansion of ORNL’s ongoing Artificial Intelligence for Science and National Security initiative, which leverages the laboratory’s unique capabilities, infrastructure, and data to accelerate scientific advancements.
Prasanna Balaprakash, Director of AI Programmes at ORNL, emphasised that AI technologies substantially benefit the public and government. CAISER aims to apply the lab’s expertise to comprehensively understand threats and ensure AI’s safe and secure utilisation.
Previous research has highlighted vulnerabilities in AI systems, including the potential for adversarial attacks that can corrupt AI models, manipulate output, or deceive detection algorithms. Additionally, generative AI technologies can generate convincing deepfake content.
Edmon Begoli, Head of ORNL's Advanced Intelligent Systems section and CAISER's founding director, emphasised the importance of addressing AI vulnerabilities. CAISER aims to pioneer AI security research, developing strategies and solutions to mitigate emerging risks.
CAISER’s research endeavours will provide federal partners with a science-based understanding of AI risks and effective mitigation strategies, ensuring the reliability and resilience of AI tools against adversarial threats.
CAISER will also provide educational outreach and disseminate information to inform the public, policymakers, and the national security community.
CAISER’s initial focus revolves around four national security domains aligned with ORNL’s strengths: AI for cybersecurity, biometrics, geospatial intelligence, and nuclear nonproliferation. Collaboration with national security and industry partners is critical to these efforts.
Col Fred Garcia, Director of the Air Force Research Laboratory (AFRL) Information Directorate, expressed confidence in CAISER’s role in studying AI vulnerabilities and safeguarding against potential threats in an AI-driven world.
Moreover, as ORNL celebrates its 80th anniversary, CAISER embodies the laboratory’s commitment to solving complex challenges, advancing emerging scientific fields, and making a global impact. With its established cybersecurity and AI research programmes, ORNL is well-suited to pioneer AI security research through CAISER.
Moe Khaleel, Associate Laboratory Director for National Security Sciences at ORNL, highlighted the laboratory’s legacy of scientific discovery across many fields and emphasised CAISER’s role in scientifically observing, analysing, and evaluating AI models to meet national security needs.
The Digital Government Development Agency (DGA) recently reported on Thailand’s digital government progress and its plans to enhance nationwide digital services. The agency intends to expand its government application to serve all age groups; digital services have already been used more than 400 million times, excluding infrastructure services.
The estimated economic value exceeds 8 billion baht. The DGA’s strategy focuses on making access to government services easier, faster, and more transparent, with the agency acting as a Smart Connector. This raises the level of digital government, promoting a Smart Nation and a Smart Life for Thai citizens in line with the agency’s quality-of-life goals. Dr Supot Tiarawut, Director of the DGA, presented these 2023 mission results, emphasising the agency’s commitment to effectively serving citizens, businesses, and government entities.
At the Government-to-Citizens (G2C) level, the DGA has linked more than 112 government services via its government application, which functions as a comprehensive government SUPER APP. The app integrates services from multiple government agencies to meet citizens’ needs effectively and has recorded over 7.5 million cumulative users and 607,041 downloads. It offers citizens a convenient single channel for accessing government services, streamlining the process for all age groups and reducing the complexities of traditional government service usage. The plan for 2024 involves introducing critical services such as personal land tax checks, insurance information (Life/Non-Life), and interest payment services (pawning).
The elevation of the Government Open Data Centre aims to provide high-quality open datasets that meet the needs of both the public and software developers, enabling appropriate and optimal use of the data and strengthening future competitiveness. Currently, there are 10,226 open datasets with 3,871,796 users.
The plan for 2024 includes boosting information exchange and utilisation among the public, private, and international sectors. Additionally, the Digital Transcript project enhances convenience for students, reduces financial burdens, eases document verification for staff, and trims university expenditure on document issuance. The initiative has already produced over 1 million digital transcripts across 82 universities nationwide.
The DGA promotes transparency and public engagement through the central legal system, where the government seeks general feedback on law drafts and assesses their effectiveness. Over 1,000 regulations have been open for public comment, with 191,683 submissions. Additionally, the Tax Pai Pai system, providing government expenditure data, enhances public participation in monitoring corruption, with 16,187,604 projects disclosed.
In the G2B sector, the Biz Portal streamlines government-business interactions, benefiting SMEs. Over 124 government licenses have been obtained by 15,881 active operators, simplifying business startup processes. The Digital Entrepreneur Centre for Government Agencies (Me-D e-Marketplace) lists 595 digital technology entrepreneurs from various agencies for government procurement.
In G2G collaboration, the DGA enhances data sharing through the Government Data Exchange Centre (GDX), linking 13 agencies through 74 service data APIs with 133.44 million data exchanges. The Digital Government Personnel Development Institute (TDGA) has already benefited 1,942,443 individuals, with plans to expand to local-level staff in 2024, offering region-specific digital courses and on-site training through a system with over 300,000 learners.
The Digital Local System, a cornerstone of local-level digital government adoption, streamlines the administration and services of 659 Local Administrative Organisations, incorporating systems from 117 agencies. This makes service provision accessible and convenient nationwide, ultimately improving people’s quality of life across the regions.
During a visit to Bang Saray Subdistrict Municipality in Chonburi Province, the DGA observed the successful Digital Local System pilot project, which enables convenient access to services, reducing the need for physical visits to government offices and improving efficiency and cost-effectiveness. The initiative also established B-Buddy Bang Saray, a network of volunteers aiding those unfamiliar with digital systems to promote inclusivity.
In his closing remarks, Dr Supot highlighted these projects as examples of the DGA’s role in advancing Thailand towards becoming a Smart Nation, enhancing citizens’ quality of life. These efforts have consistently improved Thailand’s digital government development rankings assessed by the United Nations.
Government agencies in New Zealand are entering the digital age by launching their new Government Electronic Tender Service (GETS) and All-of-Government (AoG) collaborative contracts dashboards. These innovative digital tools are set to revolutionise procurement practices, offering unprecedented insights into spending patterns and benchmarking features.
The GETS and AoG dashboards have been developed with a digital-first approach to provide agencies with comprehensive insights into their procurement practices. One of the key goals of these dashboards is to enhance transparency in government spending, allowing agencies to make more informed decisions and facilitating strategic, intelligence-led procurement processes.
The GETS and AoG dashboards leverage cutting-edge data visualisation technologies to present complex procurement data in a clear and accessible manner. Interactive charts, graphs, and visual representations make it easier for users to gain insights from the data, promoting better decision-making.
Early agency feedback has been positive, with many highlighting the value of the benchmarking features. These features enable agencies to compare their procurement practices with others, fostering healthy competition and sharing best practices. This benchmarking capability not only improves transparency but also helps agencies identify areas for improvement.
One of the core objectives of this initiative is to make the dashboards even more user-friendly and comprehensive in future versions. The development team aims to streamline the user experience, making it easier for agencies to access and interpret the available data. Additionally, the dashboards will be expanded to include data from all participating agencies, further enhancing procurement data transparency.
In the pursuit of transparency and efficiency, government agencies actively seek input from users and stakeholders. They have invited agencies and individuals to share their suggestions and ideas on improving the dashboards. This collaborative approach ensures that the tools meet the needs of agencies and the broader public, fostering a culture of continuous improvement.
Moreover, the commitment to making the GETS dashboards more user-friendly reflects a user-centric design approach. Agencies will likely collaborate with UX designers to ensure the dashboards are intuitive and tailored to users’ needs, ultimately improving the overall user experience.
A user-friendly UX not only makes a statement about the New Zealand government’s commitment to improving public services but also acknowledges that the success of these dashboards hinges on their adoption and utilisation by a diverse user base. In government procurement, where stakeholders such as procurement officers, administrators, and policymakers all interact with these tools, catering to their varied needs is paramount.
The platform is also expected to employ artificial intelligence (AI) to provide intelligent insights. AI algorithms can analyse historical spending data more deeply and accurately, surfacing trends that help agencies identify cost-saving opportunities and optimise procurement strategies.
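As a simple illustration of the kind of trend analysis described (a sketch with hypothetical figures, not the GETS implementation), a linear fit over historical quarterly spend can flag categories whose costs are rising:

```python
import numpy as np

# Hypothetical quarterly spend (in millions) for one procurement category.
spend = np.array([4.1, 4.3, 4.8, 5.0, 5.6, 6.1])
quarters = np.arange(len(spend))

# Fit a straight line; the slope estimates spend growth per quarter.
slope, intercept = np.polyfit(quarters, spend, 1)

# Flag the category if spend is rising faster than an illustrative cutoff.
RISING_THRESHOLD = 0.2  # millions per quarter (assumed value)
if slope > RISING_THRESHOLD:
    print(f"rising trend: +{slope:.2f}M per quarter")
```

A production system would apply richer models across many categories and agencies, but even this basic fit shows how historical data can be turned into an actionable signal.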
The GETS and AoG dashboards represent a significant milestone as government agencies continue their digital transformation journey. These tools provide a glimpse into the future of procurement practices, where data-driven decisions and transparency take centre stage. With ongoing efforts to improve user-friendliness and expand data coverage, these dashboards will play a pivotal role in shaping the procurement landscape for years to come.
In the era of digital government, the commitment to harnessing technology for improved governance and public service is evident. As agencies embrace innovative digital tools, the government sets a precedent for other sectors, fostering a culture of digital innovation and data-driven decision-making for the New Zealand government.