黄瓜 posted on 2025-3-25 04:24:31

Tackling Oversmoothing in GNN via Graph Sparsification
…GNNs and pooling models, such as GIN, SAGPool, GMT, DiffPool, MinCutPool, HGP-SL, DMonPool, and AdamGNN. Extensive experiments on different real-world datasets show that our model significantly improves the performance of the baseline GNN models in the graph classification task.
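A minimal, generic sketch of graph sparsification (not the algorithm proposed in the paper): randomly dropping a fraction of edges before message passing, which limits how quickly node representations mix and thereby over-smooth. The function name and the keep_ratio parameter are invented for the illustration.

```python
import numpy as np

def sparsify_edges(edge_index: np.ndarray, keep_ratio: float = 0.7,
                   seed: int = 0) -> np.ndarray:
    """Keep a random subset of edges; edge_index has shape (2, num_edges)."""
    rng = np.random.default_rng(seed)
    keep = rng.random(edge_index.shape[1]) < keep_ratio
    return edge_index[:, keep]

# Toy graph: a 4-cycle with one chord, stored as a (2, num_edges) array.
edges = np.array([[0, 1, 2, 3, 0],
                  [1, 2, 3, 0, 2]])
sparse_edges = sparsify_edges(edges, keep_ratio=0.6)
print(sparse_edges)
```

The sparsified edge list would then be fed to whichever GNN or pooling backbone (GIN, DiffPool, etc.) is being trained.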

审问,审讯 posted on 2025-3-25 08:46:46

Enhancing Shortest-Path Graph Kernels via Graph Augmentation
…augmented graphs, we employ the Wasserstein distance to track the changes. Our novel graph kernel is called the Augmented SP (ASP). We conduct experiments on various benchmark graph datasets to evaluate ASP, which outperforms the state-of-the-art graph kernels on most datasets.
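A minimal sketch of the two ingredients the abstract names, shortest-path statistics and the Wasserstein distance, assuming networkx and scipy are available: it compares the shortest-path length distributions of a graph and an edge-augmented copy, and is not the ASP kernel itself.

```python
import networkx as nx
from scipy.stats import wasserstein_distance

def sp_length_distribution(g: nx.Graph) -> list:
    """Flatten all-pairs shortest-path lengths into one sample."""
    lengths = []
    for _, dists in nx.all_pairs_shortest_path_length(g):
        lengths.extend(d for d in dists.values() if d > 0)
    return lengths

g = nx.cycle_graph(6)
g_aug = g.copy()
g_aug.add_edge(0, 3)  # a hypothetical augmentation: one extra chord

# 1-Wasserstein distance between the two path-length distributions
print(wasserstein_distance(sp_length_distribution(g),
                           sp_length_distribution(g_aug)))
```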

繁重 posted on 2025-3-25 12:52:49

Hyperbolic Contrastive Learning with Model-Augmentation for Knowledge-Aware Recommendation
…augmentation techniques to assist hyperbolic contrastive learning. Different from classical structure-level augmentation (e.g., edge dropping), the proposed model augmentations can avoid preference shifts between the augmented positive pair. Finally, we conduct extensive experiments to demonstrate…
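A minimal Euclidean sketch of contrastive learning with a model-level augmentation, assuming PyTorch: the two views of each input come from running the same encoder twice with dropout active, rather than from perturbing the graph structure. It is not the paper's hyperbolic formulation, and all module and parameter names are illustrative.

```python
import torch
import torch.nn.functional as F

class TinyEncoder(torch.nn.Module):
    def __init__(self, dim_in: int = 16, dim_out: int = 8, p_drop: float = 0.2):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim_in, dim_out),
            torch.nn.Dropout(p_drop),  # the stochastic part that differs per pass
        )

    def forward(self, x):
        return self.net(x)

def info_nce(z1, z2, temperature: float = 0.2):
    """InfoNCE loss with in-batch negatives; positives lie on the diagonal."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature
    labels = torch.arange(z1.size(0))
    return F.cross_entropy(logits, labels)

enc = TinyEncoder().train()      # keep dropout on for both passes
x = torch.randn(32, 16)
loss = info_nce(enc(x), enc(x))  # two model-augmented views of the same input
print(float(loss))
```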

altruism posted on 2025-3-25 20:04:45

Adaptive Knowledge Distillation for Classification of Hand Images Using Explainable Vision Transformers
…results demonstrate that ViT models significantly outperform traditional machine learning methods, and that the internal states of ViTs are useful for explaining the model outputs in the classification task. By averting catastrophic forgetting, our distillation methods achieve excellent performance on data…
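A minimal sketch of a generic soft-label distillation loss (temperature-scaled KL divergence mixed with cross-entropy), assuming PyTorch; the adaptive weighting and the ViT teacher/student from the paper are not reproduced, and any classifier producing logits could stand in.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature: float = 4.0, alpha: float = 0.5):
    """alpha balances the hard-label CE term against the soft teacher term."""
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, targets)
    return alpha * ce + (1.0 - alpha) * kd

# Toy usage with random logits for a 10-class problem.
student = torch.randn(8, 10, requires_grad=True)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(float(distillation_loss(student, teacher, labels)))
```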

LINES posted on 2025-3-26 06:32:37

Lifelong Hierarchical Topic Modeling via Nonparametric Word Embedding Clustering
…that our method can generate a rational, flexible, and coherent topic structure. Lifelong learning evaluations also validate that our method is less influenced by catastrophic forgetting than baseline models. Our code is available at ..
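A minimal sketch of nonparametric word-embedding clustering, assuming scikit-learn: a density-based method that does not fix the number of clusters in advance, with each resulting cluster read as a flat "topic". The hierarchical and lifelong components of the paper are not modeled, and the vocabulary and embeddings are synthetic.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
vocab = [f"word_{i}" for i in range(60)]  # placeholder vocabulary
embeddings = np.concatenate([             # three well-separated synthetic groups
    rng.normal(loc=c, scale=0.05, size=(20, 8)) for c in (0.0, 1.0, 2.0)
])

labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(embeddings)

topics = {}
for word, label in zip(vocab, labels):
    if label != -1:                        # -1 marks noise / unassigned words
        topics.setdefault(label, []).append(word)
print({topic: words[:5] for topic, words in topics.items()})
```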

View full version: Titlebook: Machine Learning and Knowledge Discovery in Databases. Research Track and Demo Track; European Conference, Albert Bifet, Povilas Daniušis, In…