Using Word2Vec Recommendation for Improved Purchase Prediction

Ramazan Esmeli, Mohamed Bader-El-Den, Hassana Abdullahi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

16 Citations (Scopus)

Abstract

Purchase prediction can help e-commerce planners manage their stock and personalise offers. Word2Vec is a well-known method for exploring word relations in sentences, for example in sentiment analysis, by creating vector representations of words. Word2Vec models have also been used in many works for product recommendation. In this paper, we analyse the effect of item similarities within sessions on purchase prediction performance. We choose items from different positions in the session and derive recommendations for the selected items using a Word2Vec model. We assess the similarity between items by counting the number of common recommendations of the selected items. We then train classification algorithms after including the similarity calculations of the selected items as session features. Computational experiments show that using similarity values of the interacted items in the session improves purchase prediction performance in terms of F1 score.
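The following is a minimal sketch of the idea described in the abstract, not the authors' exact pipeline: sessions are treated as "sentences" of item IDs, a gensim Word2Vec model provides per-item recommendations, and the similarity of two selected items is approximated as the overlap of their top-N recommendation lists, which could then be appended to a session's feature vector. The toy sessions, hyperparameters, and the `common_recommendations` helper are illustrative assumptions.

```python
# Sketch: Word2Vec-based item similarity feature for purchase prediction
# (assumptions: sessions as lists of item IDs, gensim 4.x Word2Vec).
from gensim.models import Word2Vec

# Hypothetical toy sessions: ordered item IDs a user interacted with.
sessions = [
    ["item_1", "item_2", "item_3"],
    ["item_2", "item_3", "item_4"],
    ["item_1", "item_3", "item_5"],
    ["item_4", "item_5", "item_2"],
]

# Train item embeddings; hyperparameters here are illustrative only.
model = Word2Vec(sessions, vector_size=32, window=3, min_count=1, sg=1, epochs=50)

def common_recommendations(item_a, item_b, topn=3):
    """Count items shared by the top-N recommendation lists of two items."""
    recs_a = {item for item, _ in model.wv.most_similar(item_a, topn=topn)}
    recs_b = {item for item, _ in model.wv.most_similar(item_b, topn=topn)}
    return len(recs_a & recs_b)

# Example: similarity feature between the first and last item of a session,
# which a classifier could use alongside other session features.
session = sessions[0]
print(common_recommendations(session[0], session[-1]))
```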

Original language: English
Title of host publication: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728169262
DOIs
Publication status: Published - 28 Sept 2020
Externally published: Yes
Event: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Virtual, Glasgow, United Kingdom
Duration: 19 Jul 2020 - 24 Jul 2020

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Conference

Conference: 2020 International Joint Conference on Neural Networks, IJCNN 2020
Country/Territory: United Kingdom
City: Virtual, Glasgow
Period: 19/07/20 - 24/07/20

Keywords

  • Classification
  • Machine Learning
  • Purchase Intent
  • Purchase behaviour prediction
  • Word2vec Product Recommendation
  • browsing behaviour
