Conference keynote speakers



Guoliang Chen
Academician of the Chinese Academy of Sciences
Professor at Nanjing University of Posts and Telecommunications

Title: Foundations of Computation Theory for Big Data
Abstract
In computational science, computation theory mainly covers computability, computational complexity, and algorithm design and analysis. This talk discusses only the first two, focusing on computational complexity theory for big data: computation models and theories; the computation of P problems and their parallel counterparts in NC; and the computation of NP problems and their interactive counterparts in IP. Finally, we present the inclusion relations among these complexity classes and research countermeasures for P and NP problems in the big data setting.
Speaker Biography:
Guoliang Chen is an Academician of the Chinese Academy of Sciences and a Professor at Nanjing University of Posts and Telecommunications, where he is a PhD supervisor, Honorary Dean of the School of Computer Science and Technology, Director of the Institute of High Performance Computing and Big Data Processing, Director of the University's Academic Committee, and Deputy Director of the Academic Committee of the Jiangsu Provincial High-tech Key Lab for Wireless Sensor Networks. He is among the first group of National Distinguished Teachers in higher education and receives a special government allowance from the State Council. He graduated from Xi'an Jiaotong University in 1961. Professor Chen also holds part-time positions as Dean of the School of Software Science and Technology at the University of Science and Technology of China, Dean of the School of Computer Science at Shenzhen University, Director of the National High-Performance Computing Center, Director of the Instructional Committee for Computer Basic Courses of the Ministry of Education, Director of International High-Performance Computing (Asia), Director of the China Computer Federation and of its High Performance Computing Technical Committee, and Director of the Academic Committee of the State Key Laboratory of Computer Science. His research interests include parallel algorithms and high-performance computing and its applications. Professor Chen has undertaken more than 20 research projects under the National 863 Plan, the National "Climbing" Plan, the National 973 Plan, the National Natural Science Foundation of China and other programs. Many of his research results have been widely cited at home and abroad and have reached an internationally advanced level.
He has published more than 200 papers and more than 10 academic books and textbooks. His awards include the Second Prize of the National Science and Technology Progress Award; the First and Second Prizes of the Ministry of Education Science and Technology Progress Award; the First Prize of the Chinese Academy of Sciences Science and Technology Progress Award; the Second Prize of the National Teaching Achievement Award; the First Prize of the Ministry of Water Resources Science and Technology Progress Award; the Second Prize of the Anhui Province Science and Technology Progress Award; and the 2009 Anhui Province Major Science and Technology Achievement Award. Professor Chen also received the Important Contribution Award for Advanced Individuals on the 15th anniversary of the National 863 Plan, the Baosteel Education Fund Outstanding Teacher Special Award, and the honorary title of Model Worker of Anhui Province.
Over the years, Professor Chen has built a complete parallel-algorithm discipline spanning "algorithm theory – algorithm design – algorithm implementation – algorithm application" around his teaching and research on parallel algorithms. He proposed the "parallel machine architecture – parallel algorithm – parallel programming" research methodology for parallel computing, established China's first National High-Performance Computing Center in 1995, built a research and teaching base for parallel algorithms in China, and has trained more than 200 postdoctoral researchers, doctoral students and postgraduates. Professor Chen is the academic leader of non-numerical parallel algorithm research in China and is influential in academic and education circles at home and abroad. Between 2007 and 2014, his group developed the KD-50, KD-60 and KD-90 high-performance computers based on China's first domestic general-purpose Godson processors (single-core, quad-core and eight-core), providing infrastructure for cloud computing, big data processing and general high-performance computing in China.




Yiu-ming Cheung
Professor, Department of Computer Science, and Associate Director, Institute of Computational and Theoretical Studies
Hong Kong Baptist University

Title: Objective-Domain Dual Decomposition: An Effective Approach to Optimizing Partially Differentiable Objective Functions
Abstract
This paper addresses a class of optimization problems in which either part of the objective function is differentiable while the rest is nondifferentiable or the objective function is differentiable in only part of the domain. Accordingly, we propose a dual-decomposition-based approach that includes both objective decomposition and domain decomposition. In the former, the original objective function is decomposed into several relatively simple subobjectives to isolate the nondifferentiable part of the objective function, and the problem is consequently formulated as a multiobjective optimization problem (MOP). In the latter decomposition, we decompose the domain into two subdomains, that is, the differentiable and nondifferentiable domains, to isolate the nondifferentiable domain of the nondifferentiable subobjective. Subsequently, the problem can be optimized with different schemes in the different subdomains. We propose a population-based optimization algorithm, called the simulated water-stream algorithm (SWA), for solving this MOP. The SWA is inspired by the natural phenomenon of water streams moving toward a basin, which is analogous to the process of searching for the minimal solutions of an optimization problem. The proposed SWA combines the deterministic search and heuristic search in a single framework. Experiments show that the SWA yields promising results compared with its existing counterparts.
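The deterministic/heuristic split described above can be illustrated with a toy population-based minimizer (an illustrative sketch only, not the published SWA; the objective function, update rules and parameters below are assumptions): each "water particle" takes a gradient step where the objective is differentiable and a random downhill step where it is not.

```python
import random

# Toy partially differentiable objective (an assumed example, not from the talk):
# differentiable branch x^2 for x >= 0, non-differentiable branch |x| + 1 for x < 0.
def f(x):
    return x * x if x >= 0 else abs(x) + 1

def grad(x):  # valid only on the differentiable subdomain x >= 0
    return 2 * x

def toy_water_stream(pop_size=20, steps=200, lr=0.1, noise=0.5, seed=0):
    """Population of 'water particles' flowing toward a basin (a minimum)."""
    rng = random.Random(seed)
    pop = [rng.uniform(-5.0, 5.0) for _ in range(pop_size)]
    for _ in range(steps):
        nxt = []
        for x in pop:
            if x >= 0:                       # deterministic (gradient) search
                cand = x - lr * grad(x)
            else:                            # heuristic (random) search
                cand = x + rng.gauss(0.0, noise)
            nxt.append(cand if f(cand) <= f(x) else x)  # streams only flow downhill
        pop = nxt
    return min(pop, key=f)

best = toy_water_stream()
print(f(best))  # very close to the global minimum f(0) = 0
```

The greedy "only flow downhill" acceptance mirrors the basin analogy; the published SWA additionally formulates the problem as an MOP over the decomposed subobjectives.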



Chengqi Zhang
Associate Vice President
University of Technology Sydney (UTS)

Title: Interactive Deep Metric Learning 
Abstract
Embedding-based data mining transforms raw data into useful information that downstream tasks such as classification, predictive analysis, and clustering can easily consume. The embedding function was traditionally dominated by various pattern mining algorithms and is now increasingly driven by deep learning-based embedding techniques. In this talk, I will briefly introduce our recent data mining practice in the application domain of big healthcare data, specifically Interactive Deep Metric Learning.



Yun Yang
Full professor at School of Software and Electrical Engineering
Swinburne University of Technology

Title: Cost Effective Data Placement in the Cloud for Efficient Data Access of Online Social Networks 
Abstract
Online social networks are organised around users who have certain expectations of their network provider, such as low-latency access to both their own data and their friends' data, which are often very large (e.g. videos and pictures). Replication of data can meet these requirements, and geo-distributed cloud services with virtually unlimited capacity are suitable for large-scale data storage. However, social network service providers usually do not have the budget to store every piece of data everywhere to minimise users' data access latency. It is therefore crucial to optimise data placement so as to meet users' acceptable latency requirements at minimum cost to the provider. In this seminar, we address the key problems of finding the optimal number of replicas, placing the datasets optimally, and distributing requests across datacentres.



Peizhuang Wang
Full professor
Liaoning Technical University

Title: Highest algorithm for linear programming 
Abstract
The talk is based on Wang's cone-cutting theory, which yields a group of special techniques. Combining the highest principle with those techniques, we expect to build strongly polynomial algorithms for linear programming.



Jing He
University of Oxford

Title: Is NP=P? A Polynomial-time solution for finite graph isomorphism 
Abstract
This talk introduces a polynomial-time solution for finite graph isomorphism. It aims to provide a solution related to one of the seven Millennium Prize Problems: P versus NP. Three new representations of a graph are proposed: the vertex adjacency matrix, the edge adjacency matrix, and the triple tuple. A duality between edges and vertices and a reflexivity between the vertex adjacency matrix and the edge adjacency matrix are first introduced to present the core idea. Beyond this, the mathematical proof rests on an equivalence between permutation and bijection. Because addition and multiplication satisfy the commutative law, we propose a permutation theorem to check quickly whether one of two arrays is a permutation of the other. The permutation theorem is proved using integer factorization theory, the Pythagorean triples theorem and the fundamental theorem of arithmetic. For each of the two n-ary arrays, the linear and squared sums of the elements are calculated and compared.
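The linear/squared-sum check mentioned in the abstract can be sketched in a few lines (an illustrative sketch, not the talk's full permutation theorem): equal linear and squared sums are a fast necessary condition for two arrays to be permutations of each other, though, as the counterexample in the comments shows, they are not sufficient on their own, so an exact check is included for comparison.

```python
def sums_match(a, b):
    """Fast necessary condition: equal linear and squared element sums, O(n)."""
    return sum(a) == sum(b) and sum(x * x for x in a) == sum(x * x for x in b)

def is_permutation(a, b):
    """Exact multiset equality via sorted comparison, O(n log n)."""
    return sorted(a) == sorted(b)

# A true permutation passes both checks.
assert sums_match([3, 1, 2], [2, 3, 1]) and is_permutation([3, 1, 2], [2, 3, 1])

# Matching sums alone are not sufficient: [1, 4, 4] and [2, 2, 5] share the
# same linear sum (9) and squared sum (33) but are different multisets.
assert sums_match([1, 4, 4], [2, 2, 5]) and not is_permutation([1, 4, 4], [2, 2, 5])
```

The sum-based test is useful as a cheap early-exit filter before running an exact comparison; the talk's theorem adds further machinery beyond this sketch.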



Philip S. Yu
Professor and the Wexler Chair in Information Technology at the Department of Computer Science
University of Illinois at Chicago

Title: Broad Learning: A New Perspective on Mining Big Data 
Abstract
In the era of big data, abundant data are available across many different sources in various formats. "Broad Learning" is a new type of learning task that focuses on fusing multiple large-scale information sources of diverse varieties and carrying out synergistic data mining across these fused sources in one unified analytic framework. Great challenges exist in "Broad Learning" for the effective fusion of relevant knowledge across different data sources, which depends not only on the relatedness of these sources but also on the target application problem. In this talk we examine how to fuse heterogeneous information to improve mining effectiveness in various applications, including social networks, recommendation, and malware detection.


Jun Shen
Jun Shen was awarded his Ph.D. in 2001 at Southeast University, China. He held positions at Swinburne University of Technology in Melbourne and the University of South Australia in Adelaide before 2006. He is currently an Associate Professor in the School of Computing and Information Technology at the University of Wollongong, NSW, Australia, where he has been Head of Postgraduate Studies and Chair of the School Research Committee since 2014. He is a senior member of three institutions: IEEE, ACM and ACS. He has published more than 230 papers in journals and conferences in CS/IT areas. His expertise includes computational intelligence, bioinformatics, cloud computing and learning technologies including MOOCs. He has served as Editor, PC Chair, Guest Editor and PC Member for numerous journals and conferences published by IEEE, ACM, Elsevier and Springer. A/Prof Shen was also a member of the ACM/AIS Task Force on the MSIS 2016 curriculum. His publications have appeared in IEEE Transactions on Learning Technologies, Briefings in Bioinformatics, the International Journal of Information Management and many others.


Title: Computational Intelligence Applications in Multiple Disciplines 
Abstract
Deep learning (DL) is very popular nowadays. In many disciplines, however, the available data and the real research questions may not suit applying DL everywhere. In this talk, some generic computational intelligence methods, and their combinations with AI/DL-based approaches, are explored in application areas such as education, bioinformatics and transportation. Hopefully, this will prompt us to rethink the key challenges in data science.


Andre Van Zundert
Professor & Chairman Discipline of Anesthesiology
The University of Queensland

Title: Lessons learnt from COVID-19 and the new normal as I see it 
Abstract
Pandemics are silent killers. As one author described it, viruses are among the tiniest and most primitive forms of life, invisible to the naked eye, yet they have the world under their control. Humans are no longer the masters of the world: the virus has the world in its grip, and we all struggle to survive. Plagues, major outbreaks and pandemics have occurred throughout history and have probably killed more people than all wars combined. Yet we tend to remember wars, not pandemics. Hence we have forgotten to prepare for pandemics, and governments lack ready plans for the next epidemic. The US, India, Brazil, Russia and Argentina have each topped one million positive cases, with many other countries soon to follow. These figures almost certainly underreport reality, and second waves show we are far from controlling the virus.


Yong Shi
Professor of Management
University of Chinese Academy of Sciences

Title: How to Deal with COVID-19 Using Data Analysis 
Abstract
To determine the right timing for resuming work and life, the talk first provides a retrospective analysis of COVID-19 to gain an in-depth understanding of age-specific, contact-based disease transmission. This is then followed by a prospective analysis of different work resumption plans to assess not only their respective economic implications but, most importantly, the associated disease transmission risks. The key to characterizing COVID-19 transmission patterns lies in modeling the interactions among people. Specifically, this talk considers four representative settings of social contact that may spread the disease: (1) households; (2) schools; (3) workplaces; and (4) public places. It develops a computational method to measure the contact intensity between different age groups in those settings. With such an in-depth characterization of social contact-based transmission, it is possible to analyze and explain the ins and outs of the COVID-19 outbreak, including past and future risks, intervention effectiveness, and the risks of restoring social activities.
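The setting-based contact modelling described above can be sketched as follows (a minimal illustration; the age groups, matrices and weights are invented numbers, not the talk's calibrated data): an overall age-by-age contact intensity matrix is a weighted sum of per-setting matrices, and a work-resumption plan is expressed by scaling the school, workplace and public-place weights.

```python
# Two age groups (young, old); entry [i][j] is the contact intensity of
# group i with group j in each social setting (illustrative numbers only).
settings = {
    "household": [[2.0, 1.0], [1.0, 1.5]],
    "school":    [[4.0, 0.5], [0.5, 0.1]],
    "workplace": [[3.0, 1.0], [1.0, 0.8]],
    "public":    [[1.0, 0.7], [0.7, 0.6]],
}

def overall_contacts(weights):
    """Weighted sum of setting matrices -> overall contact intensity matrix."""
    n = 2
    total = [[0.0] * n for _ in range(n)]
    for name, matrix in settings.items():
        w = weights.get(name, 1.0)  # unmentioned settings stay fully open
        for i in range(n):
            for j in range(n):
                total[i][j] += w * matrix[i][j]
    return total

# A partial-resumption plan: schools closed, half workforce back, limited
# public activity, versus fully normal activity.
partial = overall_contacts({"school": 0.0, "workplace": 0.5, "public": 0.2})
normal = overall_contacts({})
print(partial[0][0], normal[0][0])  # young-young contact intensity: 3.7 vs 10.0
```

Feeding such aggregated matrices into a transmission model is what allows different resumption plans to be compared by their contact-based risk, as the abstract describes.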


Wanlei Zhou
Vice Rector (Academic Affairs), Dean of Institute of Data Science
City University of Macau

Title: Threats and Defenses in Data Security Games 
Abstract
One of the main threats to data security is the Advanced Persistent Threat (APT) attack. An APT attacker is a stealthy threat actor that gains unauthorized access to a computer network and remains undetected for an extended period in order to access and corrupt data throughout the data lifecycle. A typical APT attack has five stages: reconnaissance, establishing a foothold, lateral movement, exfiltration, and post-exfiltration. In this talk, we discuss the use of game theory-based deception technology to defend against APT attacks. After an introduction to data security and its major threats, we focus on two case studies. The first is a countermeasure against reconnaissance, in which we introduce differential privacy into a deception game; with differential privacy, the attacker cannot deduce the real configuration of each system. The second is a countermeasure against lateral movement, in which we develop an effective repair strategy for an organization using differential game theory. Our findings help to better understand and effectively defend against APTs. The talk is based on the following two recently published papers from our group:
Dayong Ye, Tianqing Zhu, Sheng Shen, Wanlei Zhou: "A Differentially Private Game Theoretic Approach for Deceiving Cyber Adversaries". IEEE Transactions on Information Forensics and Security. 16: 569-584 (2021).
Lu-Xing Yang, Pengdeng Li, Yushu Zhang, Xiaofan Yang, Yong Xiang, Wanlei Zhou: "Effective Repair Strategy Against Advanced Persistent Threat: A Differential Game Approach". IEEE Transactions on Information Forensics and Security. 14(7): 1713-1728 (2019).
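As a hedged illustration of the first case study's idea (this is standard k-ary randomized response, not necessarily the mechanism of the cited paper; the configuration names are made up): a defender can answer an attacker's configuration probes through an epsilon-differentially-private mechanism, so that the observed responses reveal little about the true system configuration.

```python
import math
import random

def dp_report_config(true_config, configs, epsilon, rng=random):
    """k-ary randomized response over candidate configurations.

    Reports the true configuration with probability e^eps / (e^eps + k - 1)
    and each decoy with probability 1 / (e^eps + k - 1); the probability ratio
    between any two answers is at most e^eps, i.e. epsilon-differential privacy
    over the choice of true configuration.
    """
    k = len(configs)
    p_true = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if rng.random() < p_true:
        return true_config
    decoys = [c for c in configs if c != true_config]
    return rng.choice(decoys)

configs = ["Windows/IIS", "Linux/Apache", "Linux/Nginx", "BSD/Nginx"]
rng = random.Random(1)
reports = [dp_report_config("Linux/Nginx", configs, epsilon=0.5, rng=rng)
           for _ in range(10000)]
# With a small epsilon the reported configuration is close to uniform over the
# four candidates, so a probing attacker learns little about the true one.
print(reports.count("Linux/Nginx") / len(reports))
```

With epsilon = 0.5 and four candidates, the true configuration is reported only about 35% of the time; raising epsilon trades privacy (deception strength) for utility, the same tension a deception-game defender must balance.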


Jian Cao
Dr. Jian Cao is currently a tenured professor in the Department of Computer Science and Engineering, Shanghai Jiao Tong University, where he is also director of the Research Institute of Network Computing and Service Computing. Dr. Cao's research interests include intelligent data analytics, service computing, collaborative computing and software engineering. Besides national and provincial government research grants, his research is also supported by many industry partners, such as Morgan Stanley, Ctrip, Samsung and Docomo, to name a few. He has published more than 300 research papers in prestigious journals and conferences. Dr. Cao has won 10 ministerial or provincial level scientific and technological achievement awards. He is a Distinguished Member of the CCF and a Senior Member of the IEEE.

Title: Fairness in Recommender Systems
Abstract
With the wide application of recommender systems, their potential impact on customers, item providers and other parties is attracting more and more attention. Fairness, the quality of treating people equally, is becoming important in recommender system evaluation and algorithm design, and recent years have seen growing interest in measuring and assuring fairness in recommender systems. In this talk, the concept of fairness will be discussed in detail in the various contexts of recommender systems. A framework for classifying fairness metrics along different dimensions will be proposed, and strategies for eliminating unfairness in recommendations will be reviewed. Research by the speaker's team will be presented as case studies. Finally, open challenges and future work will be discussed.