Abstract
In this paper, we present the solution that placed 10th in Task 2 of the KDD Cup
2023 Challenge (Next Product Recommendation for Underrepresented
Languages/Locales). Our approach consists of two steps: (i) identifying candidate
item sets based on co-visitation, and (ii) re-ranking the candidates with LightGBM
using locale-independent features, including session-based features and product
similarity. Our experiments show that the locale-independent model performed
consistently well across the test locales, and performed even better when data
from other locales was incorporated into the training.
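To make the two-step pipeline concrete, the following is a minimal sketch (not the authors' code) of co-visitation candidate generation followed by LightGBM re-ranking. The `sessions` DataFrame with columns `session_id` and `item_id`, and the feature/label inputs to the ranker, are hypothetical placeholders for illustration.

```python
from collections import defaultdict
from itertools import combinations

import lightgbm as lgb
import pandas as pd

# --- Step (i): candidate generation based on co-visitation -----------------
def build_covisitation(sessions: pd.DataFrame) -> dict:
    """Count how often two items appear together in the same session."""
    covis = defaultdict(int)
    for _, items in sessions.groupby("session_id")["item_id"]:
        for a, b in combinations(set(items), 2):
            covis[(a, b)] += 1
            covis[(b, a)] += 1
    return covis

def candidates_for(last_item: str, covis: dict, top_k: int = 100) -> list:
    """Return the top-k items most frequently co-visited with `last_item`."""
    scored = [(b, c) for (a, b), c in covis.items() if a == last_item]
    return [item for item, _ in sorted(scored, key=lambda x: -x[1])[:top_k]]

# --- Step (ii): re-rank candidates with a LightGBM ranker ------------------
def train_reranker(features: pd.DataFrame, labels: pd.Series, groups: list):
    """Fit a LambdaRank model on locale-independent session and similarity features.

    `groups` gives the number of candidate rows per session, as required by
    LGBMRanker's `group` argument.
    """
    ranker = lgb.LGBMRanker(objective="lambdarank", n_estimators=200)
    ranker.fit(features, labels, group=groups)
    return ranker
```

Because the candidate and feature construction here uses only item co-occurrence and session statistics, nothing in the sketch depends on the locale, which mirrors the locale-independent design described above.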