Conference keynote speakers



Guoliang Chen
Academician of the Chinese Academy of Sciences
Professor at Nanjing University of Posts and Telecommunications

Title: Foundations of Computation Theory for Big Data
Abstract
In computational science, computation theory mainly covers computability, computational complexity, and algorithm design and analysis. This talk discusses the first two of these and focuses on computational complexity theory for big data, covering computation models and computation theories; P problems and the parallel class NC; and NP problems and the interactive class IP. Finally, the conclusion presents the inclusion relations among these complexity classes and research countermeasures for the P and NP problems in the big data setting.
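For reference (these inclusions are standard results and are not part of the original abstract), the relations among the classes named above can be summarized as:

```latex
% NC: problems solvable in polylogarithmic parallel time on polynomially many processors.
% IP: problems decidable by interactive proof systems; IP = PSPACE (Shamir, 1992).
\[
  \mathrm{NC} \subseteq \mathrm{P} \subseteq \mathrm{NP} \subseteq \mathrm{IP} = \mathrm{PSPACE},
  \qquad
  \mathrm{NC} \stackrel{?}{=} \mathrm{P} \ \text{ and } \ \mathrm{P} \stackrel{?}{=} \mathrm{NP} \ \text{ remain open.}
\]
```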
Speaker Biography:
Guoliang Chen is an Academician of the Chinese Academy of Sciences and a Professor at Nanjing University of Posts and Telecommunications, where he is a PhD supervisor and Honorary Dean of the School of Computer Science and Technology. Professor Chen is also the Director of the Institute of High Performance Computing and Big Data Processing and the Director of the Academic Committee of Nanjing University of Posts and Telecommunications, as well as the Deputy Director of the Academic Committee of the Jiangsu Provincial High-Tech Key Laboratory for Wireless Sensor Networks. He was among the first recipients of the National Distinguished Teacher Award in higher education and receives a special government allowance. He graduated from Xi'an Jiaotong University in 1961. Professor Chen also holds part-time positions as Dean of the School of Software Science and Technology, University of Science and Technology of China; Dean of the School of Computer Science, Shenzhen University; Director of the National High-Performance Computing Center; Director of the Instructional Committee of Computer Basic Courses of the Ministry of Education; Director of International High-Performance Computing (Asia); and director of the China Computer Federation and of its High Performance Computing Technical Committee. He also serves as Director of the Academic Committee of the National Key Laboratory in computer science. His research interests mainly include parallel algorithms and high-performance computing and its applications. Professor Chen has undertaken more than 20 research projects, including projects under the National 863 Plan, the National "Climbing" Plan, the National 973 Plan and the National Natural Science Foundation of China. A number of his research achievements have been widely cited at home and abroad and have reached an internationally advanced level. He has published more than 200 papers and more than 10 academic works and textbooks. He has won the Second Prize of the National Science and Technology Progress Award, the First and Second Prizes of the Science and Technology Progress Award of the Ministry of Education, the First Prize of the Science and Technology Progress Award of the Chinese Academy of Sciences, the Second Prize of the National Teaching Achievement Award, the First Prize of the Ministry of Water Resources, the Second Prize of the Anhui Province Science and Technology Progress Award, and the 2009 Anhui Province Major Science and Technology Achievement Award, among others. He also received the Important Contribution Award for Advanced Individuals on the 15th anniversary of the National 863 Plan, the Baosteel Education Fund Outstanding Teacher Special Award, and the honorary title of Model Worker of Anhui Province.
For years, Professor Chen has built, around the teaching and research of parallel algorithms, a complete parallel-algorithm discipline spanning algorithm theory, algorithm design, algorithm implementation and algorithm application. He proposed the parallel computing research methodology of "parallel machine architecture - parallel algorithm - parallel programming", established China's first national high-performance computing center, built a research and teaching base for parallel algorithms in China, and has trained more than 200 postdoctoral researchers, doctoral students and postgraduate students. Professor Chen is the academic leader of non-numerical parallel algorithm research in China and is influential in academic and educational circles at home and abroad. After founding the National High Performance Computing Center in 1995, he led the development of the KD-50, KD-60 and KD-90 systems between 2007 and 2014, built on China's first domestic general-purpose high-performance processor, the Godson (Loongson) chip, in its single-core, quad-core and eight-core versions; these systems provide infrastructure for cloud computing, big data processing and general high-performance computing in China.




Yiu-ming Cheung
Professor, Department of Computer Science, and Associate Director, Institute of Computational and Theoretical Studies
Hong Kong Baptist University

Title: Objective-Domain Dual Decomposition: An Effective Approach to Optimizing Partially Differentiable Objective Functions
Abstract
This talk addresses a class of optimization problems in which either part of the objective function is differentiable while the rest is nondifferentiable, or the objective function is differentiable in only part of the domain. Accordingly, we propose a dual-decomposition-based approach that includes both objective decomposition and domain decomposition. In the former, the original objective function is decomposed into several relatively simple subobjectives to isolate the nondifferentiable part of the objective function, and the problem is consequently formulated as a multiobjective optimization problem (MOP). In the latter, we decompose the domain into two subdomains, that is, the differentiable and nondifferentiable domains, to isolate the nondifferentiable domain of the nondifferentiable subobjective. Subsequently, the problem can be optimized with different schemes in the different subdomains. We propose a population-based optimization algorithm, called the simulated water-stream algorithm (SWA), for solving this MOP. The SWA is inspired by the natural phenomenon of water streams moving toward a basin, which is analogous to the process of searching for the minimal solutions of an optimization problem. The proposed SWA combines deterministic search and heuristic search in a single framework. Experiments show that the SWA yields promising results compared with its existing counterparts.
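As a rough, hedged illustration of the objective and domain decomposition idea (this is not the authors' SWA implementation; the toy objective, the subdomain test and the acceptance rule below are hypothetical, and the multiobjective formulation is collapsed into a simple sum for brevity):

```python
import random

# Toy partially differentiable objective:
#   f(x) = x^2 (differentiable everywhere) + |x - 2| (nondifferentiable at x = 2)
def f_smooth(x):
    return x * x

def f_nonsmooth(x):
    return abs(x - 2.0)

def grad_smooth(x):
    return 2.0 * x

def in_nondifferentiable_region(x, eps=1e-2):
    # Domain decomposition: isolate a small neighbourhood of the kink at x = 2.
    return abs(x - 2.0) < eps

def water_stream_like_search(pop_size=20, iters=300, step=0.05):
    """Population-based search that applies a deterministic gradient move on the
    smooth subobjective and a heuristic random move inside the nondifferentiable
    subdomain, accepting a move only if the combined objective does not worsen."""
    population = [random.uniform(-5.0, 5.0) for _ in range(pop_size)]
    total = lambda x: f_smooth(x) + f_nonsmooth(x)
    for _ in range(iters):
        updated = []
        for x in population:
            if in_nondifferentiable_region(x):
                candidate = x + random.uniform(-step, step)   # heuristic move
            else:
                candidate = x - step * grad_smooth(x)         # deterministic move
            updated.append(candidate if total(candidate) <= total(x) else x)
        population = updated
    return min(population, key=total)

if __name__ == "__main__":
    best = water_stream_like_search()
    print(f"approximate minimizer x = {best:.4f}")   # expected near x = 0.5
```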



Chengqi Zhang
Associate Vice President
University of Technology Sydney (UTS)

Title: Thoughts on the Intelligent Industrial Internet
Abstract
With the increasingly wide application of artificial intelligence (AI) in various fields and the rapid rise of the industrial internet, how to empower the industrial internet with AI technology is a question well worth exploring. Drawing on my own research in artificial intelligence and the achievements of the University of Technology Sydney in AI research, this talk will mainly present some thoughts on combining artificial intelligence with the industrial internet. The intelligent industrial internet will play a very important role in the digital transformation of enterprises.
Speaker Biography:
Chengqi Zhang, Associate Vice President (University of Technology Sydney), Distinguished Professor (Artificial Intelligence), PhD in Computer Science, Doctor of Science (DSc), Chair-Elect of the IJCAI-2024 conference (International Joint Conference on Artificial Intelligence).
Professor Zhang has served as Associate Vice President of the University of Technology Sydney since 1 December 2017, and was appointed Distinguished Professor of UTS on 27 February 2017.
In 2019 he was appointed Chair-Elect of IJCAI-2024 and a member of the IJCAI Board of Trustees (2022-2026). In 2021 he was invited to serve as a judge for the MIT Technology Review Innovators Under 35 (TR35) awards in both the China and Asia-Pacific regions.
Professor Zhang received a Bachelor of Science degree from Fudan University in March 1982, a Master of Science degree from Jilin University in March 1985, a PhD from the University of Queensland in October 1991, and a Doctor of Science (higher doctorate) from Deakin University in October 2002, all in computer science (artificial intelligence).
To date, Professor Zhang has published more than 350 research papers, over 100 of them in first-class international journals such as Artificial Intelligence (AIJ) and IEEE and ACM Transactions. In 1992 he became the first author from mainland China to publish in Artificial Intelligence, the world's top AI journal. According to Google Scholar, his publications have been cited more than 16,000 times, with an h-index of 59. He has delivered 28 invited keynote speeches at international conferences. He has supervised more than 30 PhD students to completion, eight of whom have become full professors. In 2011 he received the NSW Science and Engineering Award in the Engineering and ICT category, as well as the UTS Vice-Chancellor's Research Excellence Award for leadership.
From 2012 to 2014, Professor Zhang served as a part-time expert for the Australian Research Council (ARC). He has also served as conference chair of three of the world's top conferences in artificial intelligence (ICDM-2010, KDD-2015 and IJCAI-2024).
In addition, Professor Zhang was engaged as an overseas assessor for the Chinese Academy of Sciences (2004-2008). He has participated in reviews for China's National Science and Technology Awards, the 973 and 863 Programs, the National Science Fund for Distinguished Young Scholars, and the Changjiang Scholars Program.

Yun Yang
Full Professor at the School of Software and Electrical Engineering
Swinburne University of Technology

Title: Cutting the Unnecessary Long Tail: Cost-Effective Big Data Clustering in the Cloud 
Abstract
Clustering big data often requires tremendous computational resources, and cloud computing is undoubtedly one of the most promising solutions. However, the computation cost in the cloud can be unexpectedly high if it is not managed properly. The long tail phenomenon has been widely observed in big data clustering: the majority of the time is often consumed in the middle to late stages of the clustering process. In this research, we try to cut the unnecessary long tail in the clustering process to achieve a sufficiently satisfactory accuracy at the lowest possible computation cost. A novel approach is proposed to achieve cost-effective big data clustering in the cloud. By training a regression model on sampled data, we can make the widely used k-means and EM (Expectation-Maximization) algorithms stop automatically at an early point once the desired accuracy is reached. Experiments are conducted on four popular data sets, and the results demonstrate that both k-means and EM can achieve high cost-effectiveness in the cloud with our proposed approach. For example, in the case studies with the much more efficient k-means algorithm, achieving 99% accuracy requires only 47.71%-71.14% of the computation cost of achieving 100% accuracy, while the less efficient EM algorithm requires 16.69%-32.04%. To put this into perspective, in the United States land use classification example, our approach can save the government up to $94,687.49 per use.
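As a rough sketch of the early-stopping idea (this is not the authors' method; the synthetic data, the inertia-based accuracy proxy, the 1/t regression feature and the stopping threshold are all hypothetical choices):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# Hypothetical data; in practice this would be a sample drawn from the big data set.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 8))

def kmeans_with_early_stop(X, k=10, max_iter=50, target_ratio=0.99):
    """Run k-means one Lloyd iteration at a time and stop once a regression model
    fitted on the observed inertia curve predicts that further iterations would
    improve the result by less than (1 - target_ratio)."""
    inertias, centers = [], None
    for it in range(1, max_iter + 1):
        km = KMeans(n_clusters=k,
                    init=centers if centers is not None else "k-means++",
                    n_init=1, max_iter=1)
        km.fit(X)
        centers = km.cluster_centers_
        inertias.append(km.inertia_)
        if it >= 5:
            # Model inertia(t) ~ L + c / t and extrapolate the converged value L.
            t = np.arange(1, it + 1, dtype=float)
            reg = LinearRegression().fit(np.column_stack([1.0 / t]), inertias)
            predicted_final = reg.intercept_   # value as 1/t -> 0
            # Stop when the current inertia is within target_ratio of the predicted limit.
            if predicted_final > 0 and predicted_final / inertias[-1] >= target_ratio:
                break
    return centers, inertias

centers, inertias = kmeans_with_early_stop(X)
print(f"stopped after {len(inertias)} iterations, final inertia {inertias[-1]:.1f}")
```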



Peizhuang Wang
Full Professor
Liaoning Technical University

Title: The Highest Algorithm for Linear Programming
Abstract
The idea of the talk is based on Wang's cone-cutting theory, which yields a group of special techniques. By combining the highest principle with those techniques, we expect to build strongly polynomial algorithms for linear programming.
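For context (this formulation is standard and is not taken from the talk), a strongly polynomial algorithm for linear programming would solve the problem below using a number of arithmetic operations that is polynomial in m and n alone, independent of the bit size of A, b and c:

```latex
\[
  \min_{x \in \mathbb{R}^n} \; c^{\top} x
  \quad \text{subject to} \quad A x = b, \quad x \ge 0,
  \qquad A \in \mathbb{R}^{m \times n},\; b \in \mathbb{R}^m,\; c \in \mathbb{R}^n .
\]
```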



Jing He
University of Oxford

Title: Is NP=P? A Polynomial-time solution for finite graph isomorphism 
Abstract
This talk will introduce a polynomial-time solution for finite graph isomorphism. It aims to provide a solution relevant to one of the seven Millennium Prize Problems: NP versus P. Three new representations of a graph, as a vertex adjacency matrix, an edge adjacency matrix and a triple tuple, are proposed. A duality between edges and vertices and a reflexivity between the vertex adjacency matrix and the edge adjacency matrix are first introduced to present the core idea. Beyond this, the mathematical proof is based on an equivalence between permutation and bijection. Because only the addition and multiplication operations satisfy the commutative law, a permutation theorem is proposed to check quickly whether one of two sets of arrays is a permutation of the other. The permutation theorem is proved using integer factorization theory, the Pythagorean triples theorem and the Fundamental Theorem of Arithmetic. For each of the two n-ary arrays, the linear and squared sums of the elements are calculated to produce the result.
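As a rough illustration of the sum-based check described above (this is only a sketch of the idea as stated in the abstract, not the speaker's theorem or proof; note that matching linear and squared sums are a necessary condition for two integer arrays to be permutations of each other, not a sufficient one in general):

```python
def sums_signature(arr):
    """Linear sum and squared sum of an integer array, as described in the abstract."""
    return sum(arr), sum(x * x for x in arr)

def maybe_permutation(a, b):
    """Fast necessary-condition check: if b is a permutation of a, both signatures match.
    Matching signatures alone do not guarantee a permutation in general;
    e.g. [0, 3, 3] and [1, 1, 4] share both sums."""
    return len(a) == len(b) and sums_signature(a) == sums_signature(b)

def is_permutation(a, b):
    """Exact check, for comparison: sort both arrays (O(n log n))."""
    return sorted(a) == sorted(b)

if __name__ == "__main__":
    a, b = [3, 1, 4, 1, 5], [1, 5, 4, 3, 1]
    print(maybe_permutation(a, b), is_permutation(a, b))   # True True
    c, d = [0, 3, 3], [1, 1, 4]
    print(maybe_permutation(c, d), is_permutation(c, d))   # True False
```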



Philip S. Yu
Professor and the Wexler Chair in Information Technology at the Department of Computer Science
University of Illinois at Chicago

Title: Broad Learning: A New Perspective on Mining Big Data 
Abstract
In the era of big data, abundant data are available across many different data sources in various formats. “Broad Learning” is a new type of learning task that focuses on fusing multiple large-scale information sources of diverse varieties and carrying out synergistic data mining tasks across these fused sources in one unified analytic framework. Great challenges exist in “Broad Learning” for the effective fusion of relevant knowledge across different data sources, which depends not only on the relatedness of these data sources but also on the target application problem. In this talk we examine how to fuse heterogeneous information to improve mining effectiveness over various applications, including social networks, recommendation, malware detection, etc.


Jun Shen
Jun Shen was awarded his PhD in 2001 at Southeast University, China. He held positions at Swinburne University of Technology in Melbourne and the University of South Australia in Adelaide before 2006. He is currently an Associate Professor in the School of Computing and Information Technology at the University of Wollongong in Wollongong, NSW, Australia, where he has been Head of Postgraduate Studies and Chair of the School Research Committee since 2014. He is a senior member of three institutions: IEEE, ACM and ACS. He has published more than 230 papers in journals and conferences in the CS/IT area. His expertise includes computational intelligence, bioinformatics, cloud computing and learning technologies, including MOOCs. He has been an Editor, PC Chair, Guest Editor and PC Member for numerous journals and conferences published by IEEE, ACM, Elsevier and Springer. A/Prof Shen was also a member of the ACM/AIS Task Force on the MSIS 2016 curriculum. His publications have appeared in IEEE Transactions on Learning Technologies, Briefings in Bioinformatics, International Journal of Information Management and many others.


Title: Computational Intelligence Applications across Multiple Disciplines
Abstract
Deep learning is very popular nowadays. However, in many disciplines the available data and the real research questions may not be suitable for applying DL everywhere. In this talk, some generic computational intelligence methods, and their combinations with AI/DL-based approaches, are explored in applications in areas such as education, bioinformatics and transportation. Hopefully, we can rethink what the key challenges in data science are.


Andre Van Zundert
Professor and Chairman, Discipline of Anesthesiology
The University of Queensland

Title: Lessons Learnt from COVID-19 and the New Normal as I See It
Abstract
Indeed, pandemics are silent killers. As one author described it, viruses are among the tiniest and most primitive forms of life, invisible to the naked eye, yet they hold the world under their control. Humans are no longer the masters of the world. The virus has the world in its grip, and we all struggle to survive. However, plagues, major outbreaks and pandemics have occurred throughout history and have probably killed more people than all previous wars combined. We tend to remember wars, not pandemics. Hence, we have forgotten to prepare for pandemics, and governments lack a ready plan for the next epidemic. We now see that the US, India, Brazil, Russia and Argentina have each topped the one million mark of positive cases, with many other countries soon to follow in their steps. These figures are surely an underreporting of the reality, and second waves show that we are far from controlling the virus.


Yong Shi
Professor of Management
University of Chinese Academy of Sciences

Title: How to deal with COVID-19 by using Data Analysis 
Abstract
To determine the right timing for resuming work and life, the talk first provides a retrospective analysis of COVID-19 to gain an in-depth understanding of age-specific, contact-based disease transmission. This is then followed by a prospective analysis of different work resumption plans to assess not only their respective economic implications but, most importantly, the associated disease transmission risks. The key to characterizing the COVID-19 transmission pattern lies in modeling the interactions among people. Specifically, this talk considers four representative settings of social contact that may cause the disease to spread: (1) households; (2) schools; (3) workplaces; and (4) public places. It develops a computational method to measure the contact intensity between different age groups in those social settings. With such an in-depth characterization of social contact-based transmission, it is possible to analyze and explain the ins and outs of the COVID-19 outbreak, including past and future risks, intervention effectiveness, and the corresponding risks of restoring social activities.
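As a rough, hedged illustration of contact-intensity modeling across the four settings (the age groups, matrices, resumption weights and the spectral-radius proxy below are hypothetical and are not the speaker's data or method):

```python
import numpy as np

# Three hypothetical age groups: children, working-age adults, the elderly.
age_groups = ["0-19", "20-64", "65+"]

# Hypothetical setting-specific contact matrices C[i, j]: average daily contacts
# a person in age group i has with people in age group j, in each social setting.
households = np.array([[2.0, 1.5, 0.3], [1.5, 1.8, 0.4], [0.3, 0.4, 0.9]])
schools    = np.array([[4.0, 0.8, 0.0], [0.8, 0.2, 0.0], [0.0, 0.0, 0.0]])
workplaces = np.array([[0.0, 0.2, 0.0], [0.2, 3.0, 0.1], [0.0, 0.1, 0.0]])
public     = np.array([[1.0, 1.0, 0.5], [1.0, 2.0, 0.7], [0.5, 0.7, 0.8]])

def overall_contacts(openness):
    """Aggregate contact matrix under a resumption plan; `openness` gives the
    fraction of normal activity in each setting (households stay fully open)."""
    return (households
            + openness["school"] * schools
            + openness["work"] * workplaces
            + openness["public"] * public)

# Compare full resumption with a partial resumption plan.
full    = overall_contacts({"school": 1.0, "work": 1.0, "public": 1.0})
partial = overall_contacts({"school": 0.5, "work": 0.7, "public": 0.3})

# The spectral radius of the aggregate contact matrix is a common proxy for how
# strongly a resumption plan scales transmission in age-structured models.
for name, C in [("full", full), ("partial", partial)]:
    print(name, round(max(abs(np.linalg.eigvals(C))), 2))
```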


Wanlei Zhou
Vice Rector (Academic Affairs), Dean of Institute of Data Science
City University of Macau

Title: Countermeasures to Defend against Advanced Persistent Threats
Abstract
Advanced persistent threat (APT) poses a great threat to modern organizations. Unlike traditional cyberattacks, an APT attacker is typically a well-resourced and well-organized entity that intends to steal sensitive data covertly, and on a long-term basis, from the target organization. Through extended reconnaissance and sophisticated social engineering techniques, an APT can often evade traditional cyber defense measures and infiltrate the organization, causing serious data leakage. Consequently, how to defend effectively against APTs has become a major concern in cybersecurity. An APT attack has five stages: reconnaissance, establishing a foothold, lateral movement, exfiltration, and post-exfiltration. To mitigate the impact of an APT on an organization, defense measures should be imposed at all stages of the APT attack life cycle, and compromised systems in the organization must be quarantined and recovered in a timely and effective way. In this talk we first introduce some background on advanced persistent threats, and then describe three case studies of defending against APT attacks: a countermeasure against reconnaissance, a countermeasure against lateral movement, and a customized dynamic quarantine and recovery (QAR) scheme that minimizes the APT impact.
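As a minimal sketch of the five-stage life cycle and the idea of matching countermeasures to stages (the specific countermeasures listed below are generic illustrative examples, not the speaker's three case studies):

```python
from enum import Enum

class APTStage(Enum):
    RECONNAISSANCE = 1
    ESTABLISH_FOOTHOLD = 2
    LATERAL_MOVEMENT = 3
    EXFILTRATION = 4
    POST_EXFILTRATION = 5

# Hypothetical mapping of each stage to example defensive measures,
# following the structure of the talk (not an exhaustive or official list).
COUNTERMEASURES = {
    APTStage.RECONNAISSANCE: ["deception / decoy topology", "limit public information exposure"],
    APTStage.ESTABLISH_FOOTHOLD: ["phishing awareness training", "endpoint hardening"],
    APTStage.LATERAL_MOVEMENT: ["network segmentation", "anomaly detection on internal traffic"],
    APTStage.EXFILTRATION: ["egress filtering", "data loss prevention"],
    APTStage.POST_EXFILTRATION: ["dynamic quarantine and recovery (QAR)", "forensics and patching"],
}

def plan_defense(detected_stage: APTStage):
    """Return the example countermeasures associated with a detected attack stage."""
    return COUNTERMEASURES[detected_stage]

print(plan_defense(APTStage.LATERAL_MOVEMENT))
```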
Short Bios:
Professor Wanlei Zhou is currently the Vice Rector (Academic Affairs) and Dean of the Faculty of Data Science, City University of Macau, Macao SAR, China. He received the BEng and MEng degrees from Harbin Institute of Technology, Harbin, China, in 1982 and 1984, respectively, and the PhD degree from The Australian National University, Canberra, Australia, in 1991, all in computer science and engineering. He also received a DSc degree (a higher doctorate) from Deakin University in 2002. Before joining City University of Macau, Professor Zhou held various positions, including Head of the School of Computer Science at the University of Technology Sydney, Australia, and Alfred Deakin Professor, Chair of Information Technology, Associate Dean, and Head of the School of Information Technology at Deakin University, Australia. Professor Zhou also served as a lecturer at the University of Electronic Science and Technology of China, a system programmer at HP in Massachusetts, USA, a lecturer at Monash University, Melbourne, Australia, and a lecturer at the National University of Singapore. His main research interests include security, privacy, and distributed computing. Professor Zhou has published more than 500 papers in refereed international journals and refereed international conference proceedings, including many articles in IEEE transactions and journals.

Jian Cao
Dr. Jian Cao is currently a tenured professor in the Department of Computer Science and Engineering, Shanghai Jiao Tong University, where he is also the director of the Research Institute of Network Computing and Service Computing. Dr. Cao's research interests include intelligent data analytics, service computing, collaborative computing and software engineering. Besides national and provincial government research grants, his research is also supported by many industry partners, such as Morgan Stanley, Ctrip, Samsung and Docomo, to name a few. He has published more than 300 research papers in prestigious journals and conferences. Dr. Cao has won 10 ministerial or provincial level scientific and technological achievement awards. He is currently a distinguished member of the CCF and a senior member of the IEEE.

Title: Fairness in Recommender Systems 
Abstract
With the wide application of recommender systems, the potential impacts of recommender systems on customers, item providers and other parties are attracting more and more attention. Fairness, the quality of treating people equally, is also becoming important in recommender system evaluation and algorithm design. Therefore, in recent years there has been growing interest in fairness measurement and assurance in recommender systems. In this talk, the concept of fairness will be discussed in detail in the various contexts of recommender systems. A framework for classifying fairness metrics along different dimensions will be proposed. The strategies for eliminating unfairness in recommendations will then be reviewed, and some research done by the speaker's team will be presented as case studies. Finally, the challenges and future work will be discussed.
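As a rough, hedged illustration of one common style of fairness metric (provider-side exposure fairness measured by a Gini coefficient; the toy data and the metric choice below are hypothetical and not necessarily those used by the speaker's team):

```python
import numpy as np

def gini(values):
    """Gini coefficient of a non-negative array: 0 means perfectly equal exposure,
    values near 1 mean exposure concentrated on a few providers."""
    v = np.sort(np.asarray(values, dtype=float))
    n = v.size
    if v.sum() == 0:
        return 0.0
    cum = np.cumsum(v)
    return (n + 1 - 2 * (cum / cum[-1]).sum()) / n

def provider_exposure(recommendation_lists, item_to_provider):
    """Count how often each provider's items appear across users' top-k lists."""
    exposure = {}
    for top_k in recommendation_lists:
        for item in top_k:
            p = item_to_provider[item]
            exposure[p] = exposure.get(p, 0) + 1
    return exposure

# Hypothetical toy data: 3 users, top-3 recommendations, 4 providers.
item_to_provider = {"i1": "A", "i2": "A", "i3": "B", "i4": "C", "i5": "D"}
recs = [["i1", "i2", "i3"], ["i1", "i2", "i4"], ["i2", "i1", "i3"]]

exposure = provider_exposure(recs, item_to_provider)
print(exposure)                                   # {'A': 6, 'B': 2, 'C': 1}
print(round(gini(list(exposure.values())), 3))    # exposure inequality across providers
```

For a complete picture, providers whose items never appear (provider D here) should also be counted with zero exposure before computing the metric.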