Please use this identifier to cite or link to this item: http://dspace.dtu.ac.in:8080/jspui/handle/repository/21760
Title: ADVANCEMENTS IN GRAPH NEURAL NETWORK EFFICIENCY: MODEL COMPRESSION AND COMPARATIVE EVALUATION OF RPHGNN AND GNTK
Authors: SINHA, NIKHIL
Keywords: GRAPH NEURAL NETWORK
COMPARATIVE EVALUATION
RPHGNN
GNTK
Issue Date: Jun-2025
Series/Report no.: TD-8034;
Abstract: Heterogeneous Graph Neural Networks (HGNNs) are designed to model complex networks that contain many different types of nodes and connections. Unlike homogeneous graph models, they can represent heterogeneous real-world data such as social networks or recommendation systems. By tracking patterns across relations and focusing on the most informative features, they support fast analysis for tasks such as predicting missing links in a network or classifying nodes into groups, with applications in fields like biology and social media. The Random Projection Heterogeneous Graph Neural Network (RpHGNN) is an HGNN built for large, complex networks with many node and connection types. It simplifies data using random projections, which speeds up processing with little loss of information, so it runs faster and performs better than conventional HGNNs. We further lowered its memory use by storing data in lower-precision number formats such as FP16 and INT8, while maintaining 94% accuracy. The compressed RpHGNN can now power real-time features and AI tools that work smoothly on low-power devices and in large cloud systems. We also evaluate the Graph Neural Tangent Kernel (GNTK), a theoretically grounded, non-parametric approach to graph learning that eliminates the need for backpropagation. Our analysis reveals that while RpHGNN compression offers efficiency benefits for deep learning pipelines, GNTK provides a powerful and scalable alternative, especially for tasks where interpretability, analytical clarity, and efficient inference are prioritized. By comparing these models on multiple datasets, we offer practical insights into their suitability across varying computational and application scenarios.
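
The abstract describes RpHGNN as simplifying data through random projections. The following minimal Python sketch illustrates the general idea of a Gaussian random projection that compresses high-dimensional node features while approximately preserving pairwise distances; the dimensions and variable names are hypothetical and are not taken from the thesis itself.

import numpy as np

rng = np.random.default_rng(0)

n_nodes, in_dim, out_dim = 1000, 512, 64            # hypothetical sizes
features = rng.standard_normal((n_nodes, in_dim))   # raw node features

# Gaussian random projection matrix, scaled so expected norms are preserved
projection = rng.standard_normal((in_dim, out_dim)) / np.sqrt(out_dim)
compressed = features @ projection                   # shape (n_nodes, out_dim)

# Check how well a pairwise distance survives the projection
orig = np.linalg.norm(features[0] - features[1])
proj = np.linalg.norm(compressed[0] - compressed[1])
print(f"original distance {orig:.2f} vs projected distance {proj:.2f}")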
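
The memory reduction via FP16 and INT8 can likewise be illustrated with a simple sketch. The example below stores a hypothetical embedding matrix in half precision and in symmetric per-tensor INT8, then compares sizes and reconstruction error; it is an assumed, simplified scheme, not the exact compression pipeline used in the thesis.

import numpy as np

rng = np.random.default_rng(0)
embeddings = rng.standard_normal((100_000, 64)).astype(np.float32)

# FP16: halves the memory with minimal code changes
emb_fp16 = embeddings.astype(np.float16)

# INT8: quantize with a single per-tensor scale, dequantize on load
scale = np.abs(embeddings).max() / 127.0
emb_int8 = np.clip(np.round(embeddings / scale), -127, 127).astype(np.int8)
emb_restored = emb_int8.astype(np.float32) * scale

print(f"FP32 size: {embeddings.nbytes / 1e6:.1f} MB")
print(f"FP16 size: {emb_fp16.nbytes / 1e6:.1f} MB")
print(f"INT8 size: {emb_int8.nbytes / 1e6:.1f} MB")
print(f"max dequantization error: {np.abs(embeddings - emb_restored).max():.4f}")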
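
Finally, the abstract notes that GNTK is non-parametric and needs no backpropagation. As a rough illustration of why: once a graph kernel (Gram) matrix is available, prediction reduces to kernel ridge regression, as in the sketch below. The random positive semi-definite matrix is only a stand-in for a precomputed kernel; it is not an actual GNTK computation.

import numpy as np

rng = np.random.default_rng(0)
n_train, n_test = 80, 20

# Stand-in for a precomputed kernel matrix over all graphs (train + test)
A = rng.standard_normal((n_train + n_test, n_train + n_test))
K = A @ A.T                                      # symmetric positive semi-definite
y_train = rng.integers(0, 2, n_train) * 2 - 1    # labels in {-1, +1}

K_train = K[:n_train, :n_train]
K_test = K[n_train:, :n_train]

# Solve (K + lambda * I) alpha = y, then predict by weighting kernel similarities
lam = 1e-2
alpha = np.linalg.solve(K_train + lam * np.eye(n_train), y_train)
predictions = np.sign(K_test @ alpha)
print(predictions)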
URI: http://dspace.dtu.ac.in:8080/jspui/handle/repository/21760
Appears in Collections:MTech Data Science

Files in This Item:
File: Nikhil Sinha M.Tech.pdf
Size: 1.02 MB
Format: Adobe PDF

