
The growing potency of an Enterprise AI Platform combined with a Graph Data Platform is enhancing machine learning models and empowering more effective decision-making. Undeniably, the two technologies work hand in hand to make data relationships simpler by being scalable, performant, efficient and agile.
From tracing connections through complicated social networks to comprehending interconnections, graph databases paired with an Enterprise AI Platform have proven to be an excellent tool for real-time data management. The most evident advantages of a Graph Data Platform were seen during the current pandemic, when governments needed to track down community infections.
A Graph Data Platform aids governments in making data-driven, intelligent decisions. Additionally, it helps prevent fraud and potential information leaks, which have mushroomed disproportionately with the rapid COVID-driven digitalisation.
The added agility that an Enterprise AI Platform and a Graph Data Platform offer makes it clear that the combination should be the preferred decision-making methodology. Further, an Enterprise AI Platform along with a Graph Data Platform has proven to be cost-effective for governments.
In times of crisis, obtaining information in real-time has become critical for decision-making. With a Graph Data Platform integrated with an Enterprise AI Platform, information can be structurally arranged quickly and analysed to draw conclusions that influence decision-making and drive change. These powerful capabilities are the missing link for governments to drive actionable outcomes from data.
The pandemic heralds an age where digital transformation in the public sector must take centre stage if governments want to be able to lead and navigate citizens through increasingly complex times. An enhanced machine learning model is the key to helping government agencies build intelligent applications that traverse today’s large, interconnected datasets in real-time. The copious volumes of data that organisations generate and collect need to be analysed and interpreted if governments are to streamline their forecasting methods and serve policymakers in effective decision-making.
The main inquiry of this OpenGovLive! Virtual Breakfast Insight centred on the use of a Graph Data Platform and an Enterprise AI Platform to generate deep insights for incisive decision-making. This was a closed-door, invitation-only, interactive session with top-level executives from Singapore’s public sector.
Tackling complex challenges in the public sector through Enterprise AI Platform and Graph Data Platform

Mohit Sagar, Group Managing Director and Editor-in-Chief, OpenGov Asia, kicked off the session with his opening address. The world has fundamentally changed, and the challenges of these times will require sophisticated solutions that will be critical for decision-making in real-time. Without a doubt, technology is a priority, Mohit asserts.
Governments across the world are looking for excellent tools for real-time data management that can provide insights into data, Mohit acknowledges. The growing potency of an Enterprise AI Platform combined with a Graph Data Platform has been proven to strengthen machine learning models and address complex decision-making effectively, making it an ideal tool.
In Mohit’s opinion, Singapore is in its infancy when it comes to the adoption of AI technology. “Where does Graph Data fit in if there are already enough tools we are using for AI?” he asks. For him, there is a gap between good and great, and it is the combination of these technologies – AI and Graph Database – that makes the difference.
Graph Data Technology, Mohit firmly believes, is an eventuality; organisations will need it at some point. “You are going to absorb the technology in the future – it is here to stay,” he contends.
AI and Graph Data technology complement the Singapore government’s initiative to make data relationships simpler by being scalable, performant, efficient and agile. Mohit acknowledges that the Singapore government has already begun its drive towards a digital government, harnessing AI and graph databases to curb COVID in Singapore. Citing current examples and practices of AI and graph data, Mohit elaborated on the tremendous benefits and practicalities of these combined technologies.
Singapore has been doing well in utilising insights to inform decision-making. One of the most obvious use cases for graphs is contact tracing for COVID-19 infections. Since COVID-19 proliferates through social interactions, graphs are perfectly suited to helping scientists and policymakers expose and understand connected data – from tracing connections through complex social networks to understanding dependencies between people, places, and events.
He urged the agencies represented at the session to recognise the need to elevate the technology that organisations are using. Mohit reminded the delegates of the complexity of the challenges besetting the world today. Against this backdrop, it would be wise for delegates to partner with experts to better place themselves to respond with agility and efficiency in a rapidly evolving world.
Transforming collected data to connected data with Graph technology

Robin Fong, Regional Director – ASEAN, Neo4j, spoke next on the uses of Graph Database technology and how it can springboard agencies in their alignment with the priority of the Singapore government.
Whether it is humans or AI, “context is key in decision-making,” Robin argues. Making decisions requires going beyond the numbers to understand relationships. Humans make tens of thousands of decisions daily, most of which depend on perceptions of surrounding circumstances.
Similarly, machine learning and AI need to be able to access and process a great deal of contextual and connected information, so it can learn from adjacent information, make judgements and adjust to circumstances.
As data is everywhere, the first step is collecting it – data ingestion. This is the acquisition and transportation of data from assorted sources to a storage medium. The next level is in providing deeper context and moving beyond merely collecting data to connecting the dots.
For business leaders to decide swiftly, they require the maximum amount of context they can gather through technology. “Our challenge is to make context practical and actionable for humans, automated processes and AI.”

Where Neo4j’s graph technology gives an edge is in producing deep context by processing collected data into connected data. “How do you solve deep problems with deep relationships?” Robin asks.
If organisations can combine data, semantics and a graph structure, they will end up with a knowledge graph that has dynamic and very deep context because it is built around connected data.
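As a rough illustration of what connecting data – rather than merely collecting it – can look like, the hedged Python sketch below uses the Neo4j driver and Cypher to record people and events as nodes and relationships, then traverses the graph to surface everyone within two hops of a confirmed case, the kind of contact-tracing question discussed earlier. The connection details, labels and property names are purely illustrative assumptions, not a description of any agency’s actual data model.

```python
from neo4j import GraphDatabase

# Placeholder connection details for a local Neo4j instance.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Connect the dots: people and events become nodes, interactions become relationships.
    session.run(
        """
        MERGE (a:Person {name: $a})
        MERGE (b:Person {name: $b})
        MERGE (e:Event {name: $event, date: date($date)})
        MERGE (a)-[:ATTENDED]->(e)
        MERGE (b)-[:ATTENDED]->(e)
        """,
        a="Alice", b="Bob", event="Community dinner", date="2021-06-01",
    )

    # Deep context from connected data: everyone within two hops of a confirmed case.
    result = session.run(
        """
        MATCH (c:Person {name: $case})-[*1..2]-(contact:Person)
        WHERE contact <> c
        RETURN DISTINCT contact.name AS name
        """,
        case="Alice",
    )
    print([record["name"] for record in result])

driver.close()
```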
Neo4j is the creator of the Property Graph and the Cypher language at the core of the GQL ISO project. With thousands of customers worldwide, Neo4j is headquartered in Silicon Valley and has outposts in Singapore, Indonesia, China, Australia, India and Japan.
Graph technology is extremely versatile and can elevate the capability of companies and agencies. With graph technology, people can solve the previously unsolvable. Top financial institutions, retailers and telecoms, as well as global governments overseeing civilian affairs, defence and intelligence, use Neo4j to analyse, optimise and protect. They have enabled customers to manage financial fraud, patient outcomes, the mission to Mars, global fare pricing and vaccine distribution.
There are many use cases in resource management, oversight, security, planning, science and education. Robin offered examples where Neo4j graph technology is commonly used in the public sector.
In the context of the pandemic, the technology is extremely competent in the tracking, isolation and vaccination processes for COVID-19. Further, it can be used for recruitment and talent management, which aligns well with the government’s priorities about being future-ready.
Before Graph Technology, connections were tabular, but with Graph technology, relationships are fleshed out for a single individual. This will impact the way teams are built. For instance, when people are put into special projects, graph data can connect and recommend the optimal combination.
In closing, Robin reminded delegates that Neo4j created the graph category and that it is a tool that can catapult organisations in their growth through faster and better-quality insights.
Levelling up business and agency outcomes through a unified AI platform

Alvin Pang, Sales Director, Asia, Dataiku, spoke on how AI can be integrated into operations and processes to solve problems and deliver results for businesses and agencies.
Dataiku is a software company that provides an end-to-end data science and machine learning platform. The company is headquartered in New York and Paris, with a regional base in Singapore for its Asia operations.
“AI technology is becoming commonplace,” Alvin opines. To stand out and deliver extraordinary results, the challenge is in utilising AI at scale and deftly integrating technology, people, and processes.
The question is: How can you holistically drive a process across technology and people in a coherent manner to deliver results fast and in a sustainable manner?
Continuing with an examination of the AI maturity journey, he says that as organisations move past the experimentation stage into the established stage and operationalise use cases, they start to encounter conflicting objectives that they need to satisfy.

Some of these objectives include choosing between giving teams freedom or complying with Information Technology standards, and promoting innovation across all business units while striking a balance in governance and not introducing shadow IT.
With Dataiku, organisations can have the best of both worlds by systematising AI. He is convinced that the process is about empowering people (experts, citizen data scientists and others), accelerating AI from months to days, and governing AI lifecycles company-wide to ensure good visibility across all data assets.
The unique value Dataiku offers is a unified platform to systematise AI operations through a centralised workbench for everyone, streamlined paths to production and integration with agencies’ stack that is governed at scale. Taken together, what Dataiku offers is the ability to drive greater collaboration at higher quality and enable good governance over data.
Dataiku would be happy to work with organisations thinking of accelerating their growth. As the platform has been shown to drive 423% ROI over three years, he feels delegates would be well served by collaborating with them.
Making deep connections and elevating your work from “good” to “great”

Dr David R. Hardoon, Managing Director of Aboitiz Data Innovation and Senior Advisor, Data & Artificial Intelligence, UnionBank of the Philippines, talked about the critical nature of understanding relationships across all forms of data.
Explaining the theory of six degrees of connection, David believes “everything is fundamentally situated and based on relationships and connections.”
At the moment organisations are at the point of understanding data, although more are moving to the next stage. To unlock the next level, organisations need to master the stage they are currently at.
“Connections, networks and graphs cut across every field,” David asserts. Understanding someone through their underlying relationships, influence and productivity unearths the underpinning motivations and rationale people have.
This begs the question: how then, do we make those connections and leverage that information? How do we understand how to identify or detect an event using that insight?
The first step would be to focus on what organisations want to achieve, identify the “why” and work back in terms of the “how.”
For example, if the desire is to find out how to encourage people to get vaccinated, it is about working backwards to understand the type of data you need and the relationships required.
When asked about what constitutes “good” and “great,” David felt that the difference lies in operationalisation. The biggest challenge is being able to execute and turn insights into operational decisions. The work becomes great when insights are operational – that is, when information forms decision-support pillars that lead to implementation and execution.
His advice is to focus on how the insights are operational, “achieve greatness, then go for the good to have.” Regardless, David pointed out that organisations should not “be distracted by perfection.”
Interactive Discussions
After the informative presentations, delegates participated in interactive discussions facilitated by polling questions. This activity is designed to provide live-audience interaction, promote engagement, hear real-life experiences, and facilitate discussions that impart professional learning and development for participants.
In the first poll, delegates were asked about the most important factor in their analytics journey. Just over half of the delegates indicated that evolving their data infrastructure/architecture is the most important (53%). The rest of the delegates were split between the time to deliver results (13%), consolidation and digitalisation of assets (13%) and data security for data science (13%). The remaining votes went to hiring data scientists/analysts (8%).
A delegate said that knowing what data to collect and how to leverage data for policy formulation is important; organisations need to think about what they are trying to achieve before identifying the data sources.
For David, all the options are important but, to him, everything stems from the time to deliver. Using the banking industry as an example, he shared that the time to deliver end-to-end used to be about 8-10 months. By setting a target to reduce it to three days, all other considerations will follow, for it would involve evolving the data infrastructure in terms of requirements for security procedures.
The following question asked where delegates thought their organisations were in terms of analytics or AI maturity. Most of the delegates selected self-service visualisation (37%), followed by predictive analytics (25%). Other delegates voted for the collection and consolidation of data (19%). The rest of the votes were split between dashboarding (13%) and standard reporting (6%).
A delegate remarked that while they were one of the early adopters of AI, the technology has not moved very much from that. Although they use tools in visualisation, they are lacking the ability to understand data across organisations that can help with service improvements, practical decisions, and operational decisions. He believes that users do not know what they want – they need guidance on what data to collect, how much to collect and how much personalisation.
Nuancing the position, David believes users know what they want but require conversations that bridge the different points of view to elicit firm answers.
Mohit agrees that businesses often do not know what they want. Additionally, he points out, goalposts are regularly shifting, indicating that people may have to reassess what they want.
On the challenges faced while implementing analytics or data science practice, most delegates felt business and IT restrictions in delivering analytical projects/work were the main challenge (37%). The remaining votes were split between understanding where to get the data from to build a practice (19%) and understanding business needs and requirements (19%). The rest of the delegates cited the lack of skillsets – proprietary languages or different systems (13%), the time to deliver analytics projects to production (6%) and not having enough manpower (6%) as their main challenges.
One delegate expressed that their struggle lies in using data to formulate strategies. End-users must find correlations and work out how best to do so. The difficulty is in nailing down the policy question.
When asked about the areas in which delegates saw their organisation expanding its data science practice, most expressed that self-service analytics – citizen data scientists – is the priority (57%), followed by cyber / C3 ops (22%), data platforms and consolidation (14%) and MLOps (7%).
On that note, David explained that he selected MLOps because the point of using technology is not innovation per se but turning data into an operational reality.
Regarding the biggest challenge that they faced, most delegates indicated connecting data effectively as a challenge (35%). Other delegates were equally split between drawing insights (29%) and exploring data relationships (29%). The remaining found data interpretation challenging (7%).
In the conversation on this issue, delegates spoke of other prevailing challenges such as not having a data warehouse where data can be accessed easily and the inability to explore data relationships to cross-reference to other data sources.
David pointed out that when managing data, there is rethinking to be done. Too often, organisations are collecting data that they do not need. He stated the need to “understand that data is there for specific purposes.”
Mohit added that it is about surfacing the storyline and connections in the data.
The last poll inquired about the common data integration and connection challenges faced. Half of the delegates indicated disparate data formats and sources as the challenge that they face (50%). The rest of the votes were split between the data not being available where it needs to be (22%), other challenges (14%), having low-quality or outdated data (7%) and having too much data (7%).
Mohit remarked that even in a smart nation like Singapore, there are challenges that affect organisations – data collection, data storage and quality of data.
David is of the view that the problem with data is people. “We have all the tools,” he remarked, “but we put limitations on ourselves.” For him, we are not leveraging data enough in the way that we handle it.
Agreeing with David, Mohit added that the issue might be that “people do not know what tools to use.” Organisations have too much data but are unsure of how they can harness the information to generate insights.
Conclusion
In closing, Robin expressed his gratitude towards everyone for their participation and highly energetic discussion.
Summarising the discussion, Robin pointed out that organisations need to begin with the question of what they want to achieve and link it to policy questions. That would provide clarity and allow them to map the desired outcome onto the data to be collected.
After identifying the goal, it is about knowing where to get the data and grappling with the over-collection of data. Finally, the next question would be about finding tools and ways to cross-refer insights and links across data sets.
Robin emphasised the edge that AI and graph technology can offer organisations in their journey towards digital transformation. Complex problems require innovative solutions. Harnessing the twin capabilities of AI and graph can boost capabilities by generating real-time information and deeper analysis.
Before ending the session, Robin echoed David in highlighting the importance of not being distracted by what organisations do not have. He urged delegates to start with what they have and operationalise the insights by connecting data and applying AI.
Mohit added, “Data is only gold if it is giving us the insights and when people have access to them.”
Reiterating that digital transformation is an ongoing and collaborative journey, Robin encouraged the delegates to connect with him and the team to explore ways forward.

Vietnam’s Ministry of Health recently announced that it has issued more than 14 million electronic COVID-19 vaccine passports to the general public, a month after the official rollout on 15 April. The passport is available on the government’s mobile application, PC COVID-19, which is available on both the iOS and Android stores, or on the Digital Health (So suc khoe dien tu) app. By providing a secure and easy-to-use digital mechanism to verify vaccination statuses, governments can accelerate the re-opening of the economy and build a secure and trusted foundation for further digital healthcare initiatives in the future.
The vaccine passports have 11 fields of information: name, date of birth, nationality, the targeted disease, doses of vaccines received, date of vaccination, lot number of the vaccine batch, type of vaccine, vaccine product received, the vaccine manufacturer, and a code for the certification. The digital passports display all vaccine data in both Vietnamese and English. Data has been encoded into a QR code, which expires after 12 months. Following their expiry, people will be notified, and a new QR code will be created.
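As a purely illustrative sketch of how such certificate fields might be packaged into a QR code, the Python snippet below serialises eleven fields as JSON and renders a QR image with the open-source qrcode library. The field names, values and encoding are assumptions for demonstration; the actual Vietnamese passport schema, digital signing and expiry handling are not specified in this article.

```python
import json
import qrcode

# Illustrative certificate only: keys mirror the eleven fields listed above, but the
# real passport's schema, encoding and digital signature are not described here.
certificate = {
    "name": "Nguyen Van A",
    "date_of_birth": "1990-01-01",
    "nationality": "Vietnamese",
    "targeted_disease": "COVID-19",
    "doses_received": 2,
    "date_of_vaccination": ["2021-08-01", "2021-09-05"],
    "lot_number": ["AB1234", "CD5678"],
    "vaccine_type": "mRNA",
    "vaccine_product": "Example product",
    "manufacturer": "Example manufacturer",
    "certificate_code": "VN-0000001",
}

# In the real system the payload would be digitally signed and the QR code set to
# expire after 12 months; here we simply encode the JSON payload into an image.
img = qrcode.make(json.dumps(certificate))
img.save("vaccine_passport_qr.png")
```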
According to a government statement, the health ministry has urged relevant authorities and subordinate units to complete updating information on 34 million more doses before 1 June to facilitate the issuance of COVID-19 vaccine passports. The ministry has also requested localities to implement vaccine information clarification procedures. Medical staff and police officers in the localities are in charge of the process. As regulated, immunisation facilities must check and verify information on vaccination data. Inaccurate information will be sent to local police officers and the corrected data will be sent back to the Department of Preventive Medicine for a digital signature. The data with a digital signature will be sent to the management system for the issuance of a vaccine passport.
The vaccine passports are issued free of charge to all citizens, according to officials. Citizens are not required to go through any additional procedures except to check that their data is correct and complete. In case the information is not correct or not available, they must send feedback on the vaccination portal system. The vaccine passports were rolled out on a trial basis in late March for those vaccinated against COVID-19 at Ha Noi’s three major hospitals. Vietnam has so far reached a mutual recognition of vaccine passports with 27 European Union countries and 54 nations and territories.
Earlier this month, ASEAN member countries announced their support for a digital technology convergence to develop a globally-accepted vaccine passport. The Indonesian Health Minister, Budi Gunadi Sadikin, said at a press conference that ASEAN will issue a joint statement on its countries’ adoption of health protocol standards. The proposed vaccine passport will adopt an overseas travel passport mechanism utilised by each country’s immigration authority for ascertaining a traveller’s identity. Sadikin also noted that ASEAN health ministers have approved the establishment of an ASEAN Centre for Public Health Emergencies and Emerging Diseases (ACPHEED) as a collaborative effort to deal with extraordinary events and future pandemics. The three pillars of ACPHEED are surveillance or detection, response, and risk management, which are supported by three ASEAN representative countries, namely Vietnam, Thailand, and Indonesia.
Thailand’s Digital Economy Promotion Agency (DEPA) offers a Smart Living Solutions programme that intends to link the demands of digital technology applications in the government, municipal, and regional sectors with the private sector, which is willing to work on Smart City Services via public-private partnerships (PPP).
Along with increasing expertise, the goal is to create awareness and prepare cities and the business sector to develop initiatives for sustainable smart city services. DEPA promoted cooperative partnerships to build and extend a model for offering smart city services to local governments in the future. It also encourages collaboration in the creation of tangible smart city services.
The initiative, which was the first of its kind to create a matching channel between the city and private-sector digital service providers, was carried out with the participation of more than a hundred people.
Meanwhile, Nattapon Nimmanphatcharin, DEPA Chief Executive, stated that to drive the development of smart cities, a city must have a clear and ongoing roadmap; smart cities need a clear, continuing strategy to enhance residents’ quality of life and ensure sustainability.
“The city must have a clear and ongoing roadmap and efficient management of city-data as well as the infrastructure investment must be planned to improve the quality of life of people. These would find available solutions to meet the needs of different areas of the city and are supervised by residents to ensure sustainability,” said Nimmanphatcharin during the recently held seminar titled Smart City Roadshow 2022 organized by Surat Thani Provincial Administrative Organization and partner agencies from both the public and private sectors.
Surat Thani Province has joined the Smart Cities of Thailand programme, and a combination of the government, corporate, academic and people’s sectors will help meet its digital demands.
For Surat Thani Province to reach its goal of being a vibrant smart city with a high quality of life, the province continuously organises conferences and seminars on urban development with technology and innovation, as well as exhibitions and talks to promote knowledge of technology and smart city innovation.
Furthermore, DEPA recently took part in a seminar called Intensive Cybersecurity Fundamentals for Smart Cities. The cybersecurity professionals from Carnegie Mellon University’s Software Engineering Institute led the workshops and provided the training material for the event.
It is essential for those driving the development of smart cities to be aware of the primary cybersecurity components that make up a smart city. Participants in the event had access to suggestions to help them develop strategies for the integration of all key industries.
Activities that are useful to the development of smart cities in Thailand will be organised, and new activities will be created to secure a foundation for the development of intelligent cities while maintaining a focus on data privacy and protection.
Training courses may cover a broad range of subjects, from improving the understanding of what precisely a “Smart City” is to discovering the most effective methods for governance and risk management across a spectrum of different sorts of smart cities.
The Centre for Development of Telematics (C-DOT) recently inked a memorandum of understanding (MoU) with one of India’s largest telecom operators to help simplify the deployment of Internet of Things (IoT) solutions and foster interoperability among devices and applications as per oneM2M (machine to machine) architecture.
IoT adoption has become critical in any organisation’s digital transformation journey. However, in current deployments, certain operational challenges prevent businesses from tapping into the technology’s true potential. Some issues include device-network compatibility, over-the-air firmware upgrades, remote device configuration, security vulnerabilities, and implementation in silos with proprietary protocols.
To address these challenges, C-DOT and the telecom operator have agreed to evaluate applications and devices from various solution providers against oneM2M specifications and offer joint certificates. A government official said that the partnership is an opportunity to “see the oneM2M specifications in action” in a diverse set of sectors and applications, from smart energy to connected cars. C-DOT’s indigenously-developed oneM2M-based Common Services Platform (CCSP) is expected to benefit the IoT industry. The collaboration presents opportunities for device and application providers to deploy their solutions in telecom operators’ networks. The platform will enable application providers to use a robust middleware framework with all necessary underlying common services to deploy a secure oneM2M-compliant solution.
C-DOT is a leading telecommunications research and development organisation that runs under the Ministry of Communications. It carries out advanced research activities in optical communication, wireless technologies, switching and routing, IoT/M2M, artificial intelligence, and advanced security solutions, among others.
Over the years, the automotive, energy, healthcare, smart cities, and logistics industries have ramped up IoT investments. A recent survey showed that the IoT market in India could touch US$ 9.28 billion by 2025, up from US$ 4.98 billion in 2020.
Government agencies are also working together to foster the IoT ecosystem in the country. For instance, earlier this month, C-DOT signed an MoU with the Centre for Development of Advanced Computing (C-DAC) to collaborate in areas of telecommunications and information communication technologies (ICT), activities in 4G/5G services, broadband, IoT/M2M, packet core, and computing. As OpenGov Asia reported, the two sides also planned to sign Specific Project Agreements as and when required to enumerate the specific roles and responsibilities.
C-DOT is keen on aligning its indigenous R&D endeavours with C-DAC’s to meet the overarching objectives of national development, an official had stated. Both C-DOT and C-DAC are leaders in their respective areas and the MoU can foster strong cooperation and develop state-of-the-art technologies. The agreement will strengthen and secure national networks, boost seamless connectivity, and deploy advanced tech-based applications to make India self-reliant.
C-DAC is a premier institute for the design, development, and deployment of electronic and ICT technologies and applications for socio-economic advancement. It aims to expand the frontiers of ICT in the country, and evolve technology solutions, architectures, systems, and standards for India-specific problems. It rapidly and effectively spreads digital knowledge by overcoming language barriers through cutting-edge technologies, sharing IT experience and expertise, fostering digital inclusion, and utilising the intellectual property generated by converting it into business opportunities.
Researchers from the California Institute of Technology (Caltech) have shown that a deep-learning tool, known as Neural-Fly, could assist flying robots, known as “drones”, in adapting to changing weather conditions.
Drones are now flown under controlled conditions, without wind, or by people using software or remote controls. The flying robots have been trained to take off in formation in the open air, although these flights are typically undertaken under perfect conditions.
However, for drones to autonomously perform important but mundane duties, such as package delivery or airlifting injured drivers from traffic accidents, they must be able to adapt to real-time wind conditions.
With this, a team of Caltech engineers has created Neural-Fly, a deep-learning technology that enables drones to adapt to new and unexpected wind conditions in real-time by merely adjusting a few essential parameters. Neural-Fly is discussed in newly published research titled “Neural-Fly Enables Rapid Learning for Agile Flight in Strong Winds” in Science Robotics.
The issue is that the direct and specific effect of various wind conditions on aircraft dynamics, performance, and stability cannot be accurately characterised as a simple mathematical model.
– Soon-Jo Chung, Bren Professor of Aerospace and Control and Dynamical Systems and Jet Propulsion Laboratory Research Scientist
Chung added that they employ a combined approach of deep learning and adaptive control that enables the aircraft to learn from past experiences and adapt to new conditions on the fly, with stability and robustness guarantees, as opposed to attempting to qualify and quantify each effect of the turbulent and unpredictable wind conditions they frequently encounter when flying.
Neural-Fly was evaluated at Caltech’s Center for Autonomous Systems and Technologies (CAST) utilising its Real Weather Wind Tunnel, a 10-foot-by-10-foot array of more than 1,200 tiny computer-controlled fans that enables engineers to mimic everything from a mild breeze to a gale.
Numerous models derived from fluid mechanics are available to researchers but getting the appropriate model quality and tweaking that model for each vehicle, wind condition, and operating mode is difficult.
Existing machine learning methods, on the other hand, demand massive amounts of data for training, but cannot match the flying performance attained by classical physics-based methods. Adapting a complete deep neural network in real-time is a monumental, if not impossible, undertaking.
According to the researchers, Neural-Fly addresses these challenges by utilising a technique known as separation, which requires only a few parameters of the neural network to be altered in real-time. This is accomplished using their innovative meta-learning technique, which pre-trains the neural network so that only these critical parameters need to be changed in order to successfully capture the changing environment.
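A minimal, hypothetical Python sketch of that separation idea is shown below: a pre-trained feature network is kept frozen, and only a small vector of linear coefficients is adapted online from the observed force residual. It is a conceptual stand-in, not the published Neural-Fly algorithm – the dimensions, update rule and gain are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
STATE_DIM, FEATURE_DIM = 6, 8

# Stand-in for the meta-learned basis network: these weights stay frozen in flight.
PHI_W = rng.normal(size=(FEATURE_DIM, STATE_DIM))
PHI_B = rng.normal(size=FEATURE_DIM)

def learned_basis(state):
    """Pre-trained features of the drone state (and, in practice, wind measurements)."""
    return np.tanh(PHI_W @ state + PHI_B)

# Only these few coefficients are adapted online: residual force ~= learned_basis(state) . a
a = np.zeros(FEATURE_DIM)
GAIN = 0.1  # adaptation gain

def adapt(state, measured_residual):
    """One cheap online update of the coefficient vector from the observed residual."""
    global a
    phi = learned_basis(state)
    error = measured_residual - phi @ a
    a = a + GAIN * error * phi      # updates FEATURE_DIM numbers, not a whole network
    return phi @ a

# Toy loop: the 'true' wind effect drifts slowly and the coefficients track it.
true_a = np.zeros(FEATURE_DIM)
for _ in range(500):
    true_a += 0.01 * rng.normal(size=FEATURE_DIM)   # changing wind conditions
    state = rng.normal(size=STATE_DIM)
    adapt(state, learned_basis(state) @ true_a)

print("coefficient tracking error:", float(np.linalg.norm(a - true_a)))
```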
After only 12 minutes of flying data, autonomous quadrotor drones outfitted with Neural-Fly learn how to respond to severe winds so well that their performance improves dramatically as judged by their ability to precisely follow a flight route.
When compared to drones equipped with current state-of-the-art adaptive control algorithms that identify and respond to aerodynamic effects but lack deep neural networks, the error rate in following that flight path is between 2.5 and 4 times lower.
Landing may appear more difficult than flight; however, unlike previous systems, Neural-Fly can learn in real-time. As a result, it can react on the fly to wind variations and does not require post-processing.
In-flight tests were done outside of the CAST facility; Neural-Fly functioned just as well as it did in the wind tunnel. Additionally, the researchers showed that flight data collected by one drone can be transferred to another, establishing a knowledge pool for autonomous cars.
The drones were outfitted with a typical, off-the-shelf flight control computer utilised by the drone research and enthusiast communities. Neural-Fly was built into an onboard Raspberry Pi 4 computer, which is the size of a credit card and costs roughly $20.
The Minister for Communications, Electronics, and Information Technology (MeitY) recently launched a portal for centralised right of way (RoW) approvals called GatiShakti Sanchar. It enables telecom service providers (TSPs) and infrastructure providers (IPs) to apply for RoW permissions to lay down optical fibre cables and set up mobile towers. It is a collaborative institutional mechanism between central, state, and union territory governments, local bodies, and service providers.
As all applicants can apply through a single common website, the portal makes the process of RoW permissions and the subsequent approvals faster and more efficient. This, in turn, could help roll out 5G services more quickly, for which base transceiver stations (BTS) are installed at short intervals, an official at the launch event noted. The portal has a dashboard displaying state- and district-wise pendency statuses. It also offers automated alerts on application processing updates and centralised help desk availability.
According to a press release, GatiShakti Sanchar was developed in line with the National Broadband Mission (NBM). Launched in 2019 by the Department of Telecommunication, NBM aims to facilitate universal and equitable access to broadband services across the country, especially in rural and remote areas. To achieve these targets, the government plans to create an efficient digital communications infrastructure, and the GatiShakti Sanchar portal is a step in this direction.
The government expects the portal to enhance the ease of doing business, which will lead to:
- The faster laying of more optical fibre cables and accelerated fiberisation
- Increased tower density, enhanced connectivity and improved quality of various telecom services
- Increased fiberisation of telecom towers, ensuring better broadband speeds across the country.
Soon, the portal will be integrated with the central RoW portals of several other central ministries and departments, including defence, environment forests and climate change, road transport and highways, railways, petroleum and natural gas, housing and urban affairs, ports, shipping and waterways, and civil aviation.
Public and private partnerships are also a key factor in ensuring a strong and effective 5G ecosystem. Earlier this month, the Telecommunication Engineering Centre (TEC) signed a five-year memorandum of understanding (MoU) with an Indian product engineering and manufacturing company that works in 5G, networking and the Internet of Things (IoT). TEC is a technical arm of the Department of Telecommunications.
The MoU will facilitate registered start-ups, innovators, and MSMEs working in Open Radio Access Network (ORAN) to test their products at the company’s existing labs for interoperability among ORAN components from different vendors. Components include the (remote) radio unit (RRU/RU), distributed unit (DU), and central unit (CU). Start-ups can also use the labs for radio conformance, protocol, and interface testing. As OpenGov Asia reported, facilities will be offered at a subsidised tariff, which will be decided by both the MoU partners. The products offered for testing will be certified by TEC.
The testing certification will accelerate research, innovation, and domestic design and manufacturing. India aims to be a front-runner in 5G and ORAN, and this test certification ecosystem is expected to make the country a leading design, testing, and certification hub in Asia.
Australia’s national science agency, CSIRO, and a Finnish industrial machinery company have signed a global exclusive cooperation agreement on the delivery of SwirlFlow® agitation technology for the Bauxite and Alumina sector outside of China.
The combination of the companies’ leading expertise in their respective fields will allow the parties to create the strongest offering to the market for the use of this technology in the refinery precipitation tanks.
The Director of Light Metals at the industrial machinery company stated that sustainability is a top priority for the firm. In addition to their own investments to develop technology for sustainable alumina processing, they announced their cooperation with CSIRO. This partnership will allow the firm to meet its customers’ growing demands such as lower capital installation, reduced spare parts costs and an increase in precipitation tank availability.
CSIRO’s leading SwirlFlow® agitation technology has been pioneered in a tier-one refinery’s precipitation tanks, leading to significantly reduced maintenance costs and improved operational time between descaling events, stated the Research Program Director for Processing at CSIRO.
In the minerals processing industry, large mixing tanks are utilised to provide a variety of continuous hydrometallurgical processes including leaching (digestion), precipitation, adsorption, oxidation, tailings washing and neutralisation. Usually, single or multiple impellers with vertical baffles inside these tanks are utilised for mixing and to create suspensions of solid materials.
Traditional long-shaft agitators are expensive and difficult to clean during maintenance shutdowns. They may also bog in solids that settle on the bottom of the tank. These issues result in losses of production as well as high maintenance costs.
The technology has significantly lower capital and operating costs compared to traditional agitation systems, cutting installation costs by up to a third. It incorporates a short shaft and a novel impeller design to create a tornado-like vortex flow. As it integrates a short shaft, the technology does not bog in settled solids and is easier to clean. This reduces downtime and maintenance costs. Furthermore, it can achieve the same mixing performance as traditional agitators with lower power consumption, further reducing operating costs.
The technology has been deployed at the Queensland Alumina refinery in Australia and is being evaluated for other alumina refineries in Australia and overseas. In addition, it is also being tested for leaching applications in iron ore, gold, and uranium plants.
The technology has been designed for slurry tanks:
- as a short-shaft system to reduce the mechanical failure risks common in conventional agitator systems
- as a low-weight, lower-cost replacement or new agitator system for gold carbon-in-pulp (CIP) leach and process tanks.
- where downstream pumps are starved of feed due to sedimentation blockage of the pump inlet pipe
- to address the build-up of inventory, scale or sediment that reduces tank online time, or results in a premature stoppage of the tanks.
The capital cost of the technology is around 50% less than traditional technologies and, similarly, the maintenance costs are also much lower, in part due to the lower wear rates than for the impellers used in traditional systems.
Conversion to the technology is both a major capital cost saving and provides long-term operating advantages including a significantly lower tank scaling rate. This means that the tank can stay operational for much longer, increasing production and reducing costs.
The Material Engineering Student Association (MTM) of the Bandung Institute of Technology (ITB) in Indonesia has built a 1 kWh-capable electric turbine. This activity was a part of the institute’s Bright Wind Programme, a community service whose primary focus is on advancing the local community.
In their latest project, the Bright Wind MTM Team conducted a preliminary site inspection to gather information on the site’s location, soil qualities, wind conditions, and the quantity of power (kWh) required by the site. After performing a survey and collecting data, Kanaan Elementary School was selected as the recipient of a 1 kWh system.
“First, when electricity is successfully supplied through this PLTB, the children in Indragiri [District] Village are happy and excited because they can do gymnastics using songs electronically,” said Dede Iskandar Usman, Kanaan Elementary School Principal.
Usman continued by saying that the Bright Wind Project will undoubtedly bring about changes for the Indragiri residents’ livelihoods. The Bright Wind MTM Team then proceeded to design the wind power plant (Pembangkit Listrik Tenaga Bayu/PLTB) after first determining the amount of electricity to be produced and its precise location. A PLTB is a power plant that generates electricity by harnessing the power of the wind.
The design is geared toward making the PLTB meet its requirements, which include things like height, the shape of the turbine blades, the structure, the material, and other requirements.
The manufacturing process was supported by the programme’s partners, beginning with the provision of workshops and continuing through assistance in producing draft drawings from existing blueprints and the supervision of the students’ work through instructions on producing the PLTB itself.
The Bright Wind Team assembled the PLTB in the same workshop where its component parts were fabricated. After the assembly was finished, it was evaluated to determine whether it met the requirements. The team then disassembled the PLTB so it could be transported to Kanaan and re-installed there – an essential step given the treacherous nature of the landscape that must be traversed to reach the site.
During installation, the Bright Wind team was able to save time because a mains cable had already been laid in previous years for earlier turbine installations, although the electricity those turbines produced was of lower quality.
To supply electricity to the entire Kanaan Elementary School building, the Bright Wind team only needed to connect the control panel to the existing main cable.
The availability of electricity from the Bright Wind project also helps teaching and learning activities at Kanaan Elementary School run more smoothly.
ITB was Indonesia’s first technical high school, and it was the first school in the country to provide socialization classes for elementary children as well as entrance exams for state universities for high school students. After being without electricity for 39 years, the village was finally able to get power thanks to a combined effort from many research projects and business partners.