On 28 March 2018, ICT leaders from a range
of public sector agencies in Singapore and institutes of higher learning
gathered for a discussion on adoption of artificial intelligence (AI), at a
Breakfast Insight session organised by OpenGov Asia and Mellanox Technologies.
Mr Mohit Sagar, Editor-in-Chief of OpenGov Asia, started the session by talking about the imperative of AI adoption in the public sector. He said that even among leaders in the technology and AI field, there is an entire spectrum of views when it comes to where AI is headed and whether and how we can trust AI technologies.
But AI technology as it exists today can be used by governments in healthcare, defence and smart cities to ultimately provide better, safer lives for citizens.
Mr Gilad Shainer, Vice President, Marketing
at Mellanox Technologies, and Chairman of the non-profit HPC-AI Advisory Council, which drives
outreach and education in the areas of high performance computing and AI, took
the floor next.
He noted that neural networks were designed
twenty years ago, but they couldn’t be implemented because there was not enough
data. Now the situation has transformed, with huge amounts of data being
generated and consumed. In fact, only a small fraction of the data being
generated is used. If we were able to use a bigger proportion of the data, we
might be able to build even more amazing products and services. This explosion
in data volumes and improvement in the ability to collect and use the data is
driving the AI revolution today.
He said, “Some people look on AI as a market.
I don’t think AI is a market. It is a technology which will impact a lot of
markets.”
To take a few examples, applications range from real-time fraud detection, credit/risk analysis and high-frequency trading in finance; to self-driving cars, image/facial recognition, and logistics and mapping in the automotive and transport sectors; to drug discovery and diagnostic assistance in medicine.
In order to build those solutions which would
take us to the next level, Mr Shainer emphasised that we need more data, better
models, as well as better systems that can move the data and analyse it faster.
If we cannot move the data, we cannot analyse it. He added that Mellanox
provides fast networking solutions which move the data in the most efficient
way.
He went on to say that there is change in
the data centre architecture which drives the need to analyse the data wherever
the data is, enabling faster generation of insights. The network has to become
much more than a pipe that moves the data.
Dr Shengen Yan, Research Director at SenseTime
Group Limited, a cutting-edge artificial intelligence company from China,
talked about some of the work being done by the company.
SenseTime has over 400 partners today,
including China Mobile, Wanda Group, Meitu, graphics processor maker Nvidia,
China UnionPay, JD Finance, Sina Weibo, China Merchants Bank, and mainland
smartphone giants Huawei Technologies, Oppo, Vivo and Xiaomi. It deploys its
technology in a range of areas, including Fintech, smart cities, mobile phones,
medicine and autonomous driving.
For instance, SenseTime provides a crowd analysis system, which can convert the unstructured data of a video stream into structured data. It detects cars, humans and bicycles, and the technology can even identify the colour of clothes and cars, as well as the type of car. The data can then be stored for search.
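The structured output of such a system can be imagined as simple, searchable records. The sketch below is purely illustrative: the field names and the search helper are hypothetical, not SenseTime's actual schema or API.

```python
from dataclasses import dataclass

# Hypothetical sketch of the structured records a video-analytics
# pipeline like the one described might emit per detected object.
# Field names and values are illustrative only.

@dataclass
class Detection:
    frame: int          # video frame index the object was seen in
    obj_type: str       # "car", "person", "bicycle", ...
    colour: str         # dominant colour of the object or clothing
    sub_type: str = ""  # e.g. car body type; empty if unknown

def search(detections, **criteria):
    """Filter stored detections by attribute, e.g. all red cars."""
    return [d for d in detections
            if all(getattr(d, k) == v for k, v in criteria.items())]

detections = [
    Detection(frame=10, obj_type="car", colour="red", sub_type="sedan"),
    Detection(frame=10, obj_type="person", colour="blue"),
    Detection(frame=11, obj_type="car", colour="white", sub_type="SUV"),
]

red_cars = search(detections, obj_type="car", colour="red")
```

Once the video stream is reduced to records like these, ordinary database queries replace frame-by-frame inspection, which is what makes the footage searchable.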
There are several popular frameworks available for AI, such as Caffe2, Torch, and TensorFlow. But the existing frameworks have limitations, such as poor support for distributed learning (in the case of Caffe2), unsatisfactory efficiency, and constraints around technology development and intellectual property. To deal with these problems, SenseTime has developed its own deep learning framework from scratch, called Parrots. Parrots is built for distributed training of very complex models and is highly scalable.
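Distributed training, highlighted here as a weak point of some frameworks, typically follows a data-parallel pattern: each worker computes gradients on its own shard of the data, and the gradients are combined before the shared model is updated. Below is a minimal pure-Python sketch of that pattern on a toy linear model; it is illustrative only and has nothing to do with Parrots' actual implementation.

```python
# Minimal data-parallel training sketch (illustrative, not Parrots):
# each "worker" holds a shard of the data, computes a local gradient,
# and the gradients are averaged before updating the shared weight.

def gradient(w, shard):
    # d/dw of mean squared error for the model y = w * x on one shard
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def train_data_parallel(shards, w=0.0, lr=0.05, steps=200):
    for _ in range(steps):
        # In a real cluster this averaging step is an all-reduce over
        # the network -- which is why interconnect speed matters.
        g = sum(gradient(w, s) for s in shards) / len(shards)
        w -= lr * g
    return w

# Two "workers", each holding half of the data for y = 3x
shards = [[(1, 3), (2, 6)], [(3, 9), (4, 12)]]
w = train_data_parallel(shards)  # converges towards 3.0
```

The gradient-averaging step is the part that stresses the interconnect in real clusters, which connects back to the earlier point that the network has to become much more than a pipe.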
In order to accelerate the training of the
AI models, SenseTime also built a deep learning training supercomputer. It has
more than 8000 GPUs (graphics processing units) and over 10 GPU clusters.
Polling questions and discussion
When asked about the interconnect speed in their data centre, 47% of delegates selected the 10 Gigabit Ethernet option. Mr Shainer said that today there is no price difference between 10 and 25 Gigabit Ethernet. For the same price and the same infrastructure, a 2.5 times improvement in bandwidth can be achieved.
Around 66% of attendees responded that they
are already using AI or HPC, while another 24% have a plan in place to use
those technologies. In terms of applications, 53% of delegates are using AI to
drive business intelligence, while 20% are adopting facial recognition or voice
recognition technologies and another 13% are exploring computer vision. Several
delegates also said that they are using combinations of different AI
technologies.
Dr John Kan, Chief Information Officer (CIO) at the Agency for Science, Technology and Research (A*STAR), said that procurement, fraud and demand aggregation are key concerns for the agency, as it buys products worth millions of dollars every year for research purposes. The agency is therefore using deep learning for procurement analytics. A*STAR has developed an electronic procurement system with random sampling, low-level, mid-level and deep-dive analytics using supercomputing.
Ms Samantha Fok, Director, Enterprise Development, Infocomm Media Development Authority (IMDA), said that there is a team at IMDA which develops AI algorithms, both to build its own internal capabilities and to work with partners on specific problem areas. The companies are brought in to co-develop, so that once an algorithm is developed they can commercialise it.
Mr David Toh, Assistant Director, Investment at SGInnovate, said that the organisation has a talent intelligence team which uses algorithms to match talent, based on their profiles, with the companies SGInnovate is investing in, to help those companies develop.
AI Singapore is working with a number of different institutions and putting in place plans to start its 100 Experiments project, said Mr Maurice Manning, Head, AI Applications at AI Singapore. If an enterprise has a problem statement which it is unable to solve with commercial off-the-shelf solutions, but for which existing AI technologies can be quickly built with limited research, then AI Singapore will facilitate matching the statement to the work areas of researchers from NUS, NTU, SUTD, SMU and A*STAR.
A majority of delegates (63%) said that there are no hurdles they are facing in the adoption of AI. Around 31% picked lack of knowledge as the major hurdle. Mr Manning said that in his view, the lack of knowledge of AI techniques is not usually the barrier. The problem lies with networks. Researchers today are using very large datasets, but inadequate attention is paid to networks, or to questions such as whether the network should be considered when designing processes and workflows. Very few electrical engineers and almost no computer science graduates understand networks, and that knowledge is not being imparted at the graduate level.
Mr Shainer agreed with the comment, adding that previously the CPU was the most important component and most investment went into buying CPUs, with connectivity and storage lower in priority. Nowadays the network is critical, and it is also becoming a distributed CPU. There are performance bottlenecks that cannot be solved at the server or by adding more computing power. In fact, adding more CPUs might make the situation worse, as there is then more information to be combined. Enhanced capabilities on the network itself can be used to solve this problem.
Another highlighted challenge was inadequate annotated data, which is essential because much of deep learning today is supervised. (Supervised machine learning infers a function from labelled training data consisting of a set of training examples, while the objective of unsupervised learning is to find the structure or relationships between different inputs.)
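The distinction between the two settings can be shown with toy one-dimensional data. The sketch below is purely illustrative: a one-nearest-neighbour rule stands in for supervised learning on labelled examples, and a simple two-means clustering stands in for unsupervised structure-finding on the same values without labels.

```python
# Supervised: labelled examples (value, label) -> learn to predict labels.
labelled = [(1.0, "low"), (1.2, "low"), (8.0, "high"), (8.3, "high")]

def predict(x):
    # 1-nearest-neighbour: copy the label of the closest labelled example
    return min(labelled, key=lambda p: abs(p[0] - x))[1]

# Unsupervised: the same values with NO labels -> find structure (2 clusters).
values = [1.0, 1.2, 8.0, 8.3]

def two_means(xs, iters=10):
    c1, c2 = min(xs), max(xs)  # initialise cluster centres at the extremes
    for _ in range(iters):
        a = [x for x in xs if abs(x - c1) <= abs(x - c2)]
        b = [x for x in xs if abs(x - c1) > abs(x - c2)]
        c1, c2 = sum(a) / len(a), sum(b) / len(b)
    return c1, c2

label = predict(1.1)         # uses the labels
centres = two_means(values)  # discovers two groups without labels
```

The supervised rule needs every training example to carry a label, which is exactly why inadequate annotated data is a bottleneck; the unsupervised routine needs no labels but can only report structure, not named categories.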
When asked which machine learning/deep learning framework they are using, nearly 45% of attendees responded that they are using TensorFlow, while 36% have developed their own framework.
Around 33% of delegates are planning to use
existing infrastructure for AI development, 20% plan to build dedicated
infrastructure, and 40% are using external resources.
However, the answer is sometimes a mix. As Mr Paul Gagnon, Director, E-Learning, IT Systems and Services, Nanyang Technological University – Lee Kong Chian School of Medicine, said, they are working with IBM to develop a virtual tutor designed to assist the learning of medical students, using 'external' resources. However, they are also building their own dedicated infrastructure, and existing infrastructure is being transformed to support that dedicated infrastructure, which in turn is necessary to use the external resources.
Dr William Tjhi, Associate Director, AI Engineering at AI Singapore, also expects it to be a mix in the future.
Around 12.5% of delegates are planning to deploy AI infrastructure within the next 6 months, while 38% are aiming for the next 12 months. Another 38% have plans but no definite schedule.
Mr Charlie Foo, Vice President & GM, Mellanox Technologies, Asia-Pacific, concluded the discussion by saying that there is growing maturity in AI adoption in the public sector. He said that inertia is the biggest threat, and that Mellanox can help organisations make their infrastructure more intelligent and robust and future-proof their investments.