Stop Thinking, Just Do!

Sungsoo Kim's Blog

Dynamic Graph Information Bottleneck


1 April 2024


Article Source: Dynamic Graph Information Bottleneck

  • Temporal Graph Learning Reading Group
  • Paper: "Dynamic Graph Information Bottleneck"
  • Speaker: Haonan Yuan
  • Date: March 21, 2024

Abstract

Dynamic graphs are widespread in the real world and carry complicated spatial and temporal feature patterns, which challenges their representation learning. Dynamic Graph Neural Networks (DGNNs) have shown impressive predictive ability by exploiting these intrinsic dynamics. However, DGNNs exhibit limited robustness and are prone to adversarial attacks. This paper presents the novel Dynamic Graph Information Bottleneck (DGIB) framework to learn robust and discriminative representations. Building on the Information Bottleneck (IB) principle, we first propose that the optimal representations should satisfy the Minimal-Sufficient-Consensual (MSC) Condition. To compress redundant information while conserving meritorious information in the latent representation, DGIB iteratively directs and refines the structural and feature information flow passing through the graph snapshots. To meet the MSC Condition, we decompose the overall IB objective into DGIB_MS and DGIB_C, in which the DGIB_MS channel aims to learn the minimal and sufficient representations, while the DGIB_C channel guarantees the predictive consensus. Extensive experiments on real-world and synthetic dynamic graph datasets demonstrate the superior robustness of DGIB against adversarial attacks compared with state-of-the-art baselines on the link prediction task. To the best of our knowledge, DGIB is the first work to learn robust representations of dynamic graphs grounded in the information-theoretic IB principle.
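To make the abstract's IB-based training objective concrete, here is a minimal sketch of a variational IB loss of the kind such frameworks typically optimize: a prediction term (binary cross-entropy for link prediction) plus a β-weighted compression term (KL divergence from a diagonal-Gaussian posterior to a standard-normal prior). This is a generic illustration, not the paper's actual DGIB_MS/DGIB_C objectives; all function and parameter names are hypothetical.

```python
import math

def gaussian_kl(mu, sigma):
    # KL( N(mu, diag(sigma^2)) || N(0, I) ), summed over latent dimensions.
    # This is the standard closed-form "compression" term of a variational IB.
    return sum(0.5 * (m * m + s * s - 2.0 * math.log(s) - 1.0)
               for m, s in zip(mu, sigma))

def vib_loss(p_link, y, mu, sigma, beta=0.01):
    # Prediction term: binary cross-entropy for a single link (y in {0, 1}).
    bce = -(y * math.log(p_link) + (1 - y) * math.log(1.0 - p_link))
    # Total objective: fit the label while compressing the latent code.
    return bce + beta * gaussian_kl(mu, sigma)

# A posterior equal to the prior contributes zero KL, so only the
# prediction term remains.
loss = vib_loss(p_link=0.9, y=1, mu=[0.0, 0.0], sigma=[1.0, 1.0], beta=0.01)
```

Increasing `beta` trades predictive accuracy for stronger compression of the latent representation, which is the mechanism the abstract credits for discarding redundant (and attack-exploitable) information.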

동적 κ·Έλž˜ν”„μ˜ κ²¬κ³ ν•œ ν‘œν˜„ ν•™μŠ΅μ„ μœ„ν•œ 동적 κ·Έλž˜ν”„ 정보 병λͺ© (DGIB) ν”„λ ˆμž„μ›Œν¬

μ‹€ μ„Έκ³„μ—λŠ” λ‹€μ–‘ν•œ 곡간적, μ‹œκ°„μ  νŠΉμ§• νŒ¨ν„΄μ„ κ°€μ§€λŠ” 동적 κ·Έλž˜ν”„κ°€ 널리 μ‘΄μž¬ν•˜λ©°, μ΄λŠ” κ·Έλž˜ν”„ ν‘œν˜„ ν•™μŠ΅μ— μžˆμ–΄ 어렀움을 μ œμ‹œν•©λ‹ˆλ‹€. 동적 κ·Έλž˜ν”„ 신경망 (DGNN)은 λ‚΄μž¬μ  동적 νŠΉμ„±μ„ ν™œμš©ν•˜μ—¬ 인상적인 예츑 λŠ₯λ ₯을 λ³΄μ—¬μ£Όμ—ˆμ§€λ§Œ, μ λŒ€μ  곡격에 μ·¨μ•½ν•˜λ‹€λŠ” ν•œκ³„κ°€ μžˆμŠ΅λ‹ˆλ‹€. 이 λ…Όλ¬Έμ—μ„œλŠ” κ²¬κ³ ν•˜κ³  νŒλ³„λ ₯ μžˆλŠ” ν‘œν˜„μ„ ν•™μŠ΅ν•˜κΈ° μœ„ν•œ μƒˆλ‘œμš΄ 동적 κ·Έλž˜ν”„ 정보 병λͺ© (DGIB) ν”„λ ˆμž„μ›Œν¬λ₯Ό μ œμ•ˆν•©λ‹ˆλ‹€.

Leveraging the Information Bottleneck (IB) principle, the authors first propose that the expected optimal representations should satisfy the Minimal-Sufficient-Consensual (MSC) Condition. Using the IB principle, DGIB compresses redundant information and retains meritorious information in the latent representation by iteratively directing and refining the structural and feature information flow passing through the graph snapshots. To satisfy the MSC Condition, the overall IB objective is decomposed into DGIB_MS and DGIB_C, where the DGIB_MS channel aims to learn minimal and sufficient representations and the DGIB_C channel guarantees predictive consensus.
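The MSC decomposition described in the abstract follows the general shape of the IB objective. A generic formulation (the symbols here are illustrative, not the paper's exact notation) is:

```latex
\min_{p(Z \mid \mathcal{G}_{1:T})} \; -\,I(Z; Y) \;+\; \beta \, I\!\left(Z; \mathcal{G}_{1:T}\right)
```

where $Z$ is the latent representation, $Y$ the link-prediction target, $\mathcal{G}_{1:T}$ the sequence of graph snapshots, and $\beta$ the trade-off between predictive sufficiency (the $-I(Z;Y)$ term) and compression of input information (the $I(Z;\mathcal{G}_{1:T})$ term).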

Extensive experiments on real-world and synthetic dynamic graph datasets demonstrate that DGIB achieves superior robustness on the link prediction task compared with state-of-the-art baseline models. According to the authors, DGIB is the first work to learn robust representations of dynamic graphs grounded in the information-theoretic IB principle.

