GAT4Rec: Sequential Recommendation with a Gated Recurrent Unit and Transformers
By: Huaiwen He, Xiangdong Yang, Feng Huang, Feng Yi, Shangsong Liang
Format: Article
Published: MDPI AG, 2024-07-01
Description
Capturing long-term dependencies from historical behaviors is key to the success of sequential recommendation; however, existing methods focus on extracting global sequential information while neglecting deep representations of subsequences. Previous research has revealed that restricted inter-item transfer is fundamental to sequential modeling, and that certain substructures of a sequence can help models learn long-term dependencies more effectively than the whole sequence. To automatically find better subsequences and learn from them efficiently, we propose GAT4Rec, a sequential recommendation model with a gated recurrent unit and Transformers, which employs Transformers with parameters shared across layers to model users' historical interaction sequences. The representation learned by the gated recurrent unit serves as a gating signal that identifies the optimal substructure in each user's sequence. The encoding layer extracts a fused representation of the subsequence and edge information to produce the corresponding recommendations. Experimental results on four well-known publicly available datasets demonstrate that GAT4Rec outperforms other recommendation models, with improvements of 5.77%, 1.35%, 11.58%, and 1.79% in normalized discounted cumulative gain (NDCG@10) on the respective datasets.
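The abstract describes the architecture only at a high level. The PyTorch sketch below illustrates one plausible reading of it: a GRU summarizes the interaction history and emits a per-position gating signal that softly selects a subsequence, and a single Transformer encoder layer is reused across depth so that parameters are shared across layers. All layer sizes, the sigmoid gating rule, and names such as `GAT4RecSketch` are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class GAT4RecSketch(nn.Module):
    """Minimal sketch of the GAT4Rec idea (assumed, not the authors' code):
    GRU -> gating signal -> gated sequence -> parameter-shared Transformer."""

    def __init__(self, num_items, d_model=64, n_heads=2, n_layers=2, max_len=50):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, d_model, padding_idx=0)
        self.pos_emb = nn.Embedding(max_len, d_model)
        self.gru = nn.GRU(d_model, d_model, batch_first=True)
        self.gate = nn.Linear(d_model, 1)  # per-position "keep this item" score
        # A single encoder layer reused n_layers times => shared parameters.
        self.shared_layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.n_layers = n_layers

    def forward(self, seq):                        # seq: (batch, max_len) item ids, 0 = padding
        positions = torch.arange(seq.size(1), device=seq.device)
        x = self.item_emb(seq) + self.pos_emb(positions)
        gru_out, _ = self.gru(x)                   # GRU representation of the history
        gate = torch.sigmoid(self.gate(gru_out))   # gating signal in [0, 1]
        x = gate * x                               # soft selection of a subsequence
        pad_mask = seq.eq(0)
        for _ in range(self.n_layers):             # same layer object => shared weights
            x = self.shared_layer(x, src_key_padding_mask=pad_mask)
        return x[:, -1]                            # representation of the last position


if __name__ == "__main__":
    model = GAT4RecSketch(num_items=1000)
    user_repr = model(torch.randint(1, 1001, (8, 50)))  # -> shape (8, 64)
    print(user_repr.shape)
```

Reusing one `nn.TransformerEncoderLayer` object in a loop is a simple way to realize "Transformers with shared parameters across layers"; the paper's actual gating and fusion with edge information may differ from this soft, sigmoid-based selection.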