Abstract and 1. Introduction

  2. Related Work

  3. Preliminaries and Notations

  4. Differentiable Structural Information

    4.1. A New Formulation

    4.2. Properties

    4.3. Differentiability & Deep Graph Clustering

  5. LSEnet

    5.1. Embedding Leaf Nodes

    5.2. Learning Parent Nodes

    5.3. Hyperbolic Partitioning Tree

  6. Experiments

    6.1. Graph Clustering

    6.2. Discussion on Structural Entropy

  7. Conclusion, Broader Impact, and References

Appendix

A. Proofs

B. Hyperbolic Space

C. Technical Details

D. Additional Results

5.2. Learning Parent Nodes

A primary challenge is that the number of nodes at each internal level of the partitioning tree is unknown. To address this issue, we introduce a simple yet effective method: setting a sufficiently large node number Nh at the h-th level. A large Nh may introduce redundant nodes and thus yields a relaxed partitioning tree. According to Theorem 4.8 established in Sec. 4.3, redundant nodes in the partitioning tree do not affect the value of structural entropy, and they eventually present as empty leaf nodes once our objective is optimized. Conversely, if an internal level has insufficient nodes, the self-organization of the graph can still be described across the multiple levels of the partitioning tree.
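To illustrate the relaxation, consider a toy NumPy sketch (the sizes, shift, and variable names here are ours for illustration, not LSEnet's actual implementation): each leaf is softly assigned to one of Nh over-provisioned candidate parents, and after training the redundant parents attract near-zero assignment mass, i.e., they present as empty nodes.

```python
import numpy as np

rng = np.random.default_rng(0)

N_leaf, N_h = 8, 5           # over-provisioned: the toy data only needs 3 parents
logits = rng.normal(size=(N_leaf, N_h))
logits[:, 3:] -= 20.0        # mimic a trained model: redundant parents get tiny logits

# Row-wise softmax: soft assignment of each leaf to a candidate level-h parent.
assign = np.exp(logits - logits.max(axis=1, keepdims=True))
assign /= assign.sum(axis=1, keepdims=True)

mass = assign.sum(axis=0)    # total assignment mass per candidate parent
empty = mass < 1e-3          # redundant parents present as (near-)empty nodes
```

Because structural entropy is unchanged by such empty nodes (Theorem 4.8), over-provisioning Nh is a safe default rather than a hyperparameter that must be tuned exactly.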

Remark. In fact, the geometric centroid in Theorem 5.2 is also equivalent to the gyro-midpoint in the Poincaré ball model of hyperbolic space, as detailed in Appendix B.4.
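A minimal NumPy sketch of computing this gyro-midpoint, using the standard route through the Klein model (map to Klein coordinates, take the Einstein midpoint weighted by Lorentz factors, map back); the function names are ours, and this is a sketch rather than the paper's implementation:

```python
import numpy as np

def poincare_to_klein(p):
    # p: point(s) in the Poincare ball, ||p|| < 1
    return 2.0 * p / (1.0 + np.sum(p**2, axis=-1, keepdims=True))

def klein_to_poincare(k):
    return k / (1.0 + np.sqrt(1.0 - np.sum(k**2, axis=-1, keepdims=True)))

def gyro_midpoint(points, weights=None):
    # points: (n, d) array in the Poincare ball; optional per-point weights (n,)
    k = poincare_to_klein(points)
    gamma = 1.0 / np.sqrt(1.0 - np.sum(k**2, axis=-1))  # Lorentz factors
    w = gamma if weights is None else gamma * weights
    m_klein = (w[:, None] * k).sum(axis=0) / w.sum()    # Einstein midpoint in Klein model
    return klein_to_poincare(m_klein)
```

For example, the gyro-midpoint of two symmetric points `[0.3, 0]` and `[-0.3, 0]` is the origin, matching the intuition of a hyperbolic centroid.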

Authors:

(1) Li Sun, North China Electric Power University, Beijing 102206, China (ccesunli@ncepu.edu);

(2) Zhenhao Huang, North China Electric Power University, Beijing 102206, China;

(3) Hao Peng, Beihang University, Beijing 100191, China;

(4) Yujie Wang, North China Electric Power University, Beijing 102206, China;

(5) Chunyang Liu, Didi Chuxing, Beijing, China;

(6) Philip S. Yu, University of Illinois at Chicago, IL, USA.


This paper is available on arXiv under the CC BY-NC-SA 4.0 Deed (Attribution-NonCommercial-ShareAlike 4.0 International) license.