Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Harmonized system code classification using supervised contrastive learning with sentence BERT and multiple negative ranking loss

Anggoro, Angga, Corcoran, Padraig ORCID: https://orcid.org/0000-0001-9731-3385, De Widt, Dennis ORCID: https://orcid.org/0000-0002-7299-5663 and Li, Yuhua ORCID: https://orcid.org/0000-0003-2913-4478 2024. Harmonized system code classification using supervised contrastive learning with sentence BERT and multiple negative ranking loss. Data Technologies and Applications 10.1108/DTA-01-2024-0052

PDF - Accepted Post-Print Version
Available under License Creative Commons Attribution Non-commercial.

Abstract

Purpose — International trade transactions, extracted from customs declarations, include several fields, among which the product description and the product category are the most important. The product category, also referred to as the Harmonised System (HS) code, is pivotal for determining tax rates and for administrative purposes. A tool that predicts the product category, or HS code, is therefore an important resource that helps traders choose a suitable code, preventing misclassification arising from ambiguities in product nomenclature and mitigating the challenges of code interpretation. Deploying such a tool would also streamline the validation process for government officers dealing with extensive volumes of transactions, optimising their workload and enhancing tax revenue collection in this domain.

Design/methodology/approach — This study introduces a methodology for generating sentence embeddings of trade transactions, employing the Sentence Bidirectional Encoder Representations from Transformers (SBERT) framework together with the Multiple Negative Ranking (MNR) loss function, following a contrastive learning paradigm. The procedure involves constructing pairwise samples of anchor and positive transactions. The proposed method is evaluated on two publicly available real-world datasets, the India Import 2016 and United States Import 2018 datasets, which are used to fine-tune the SBERT model. Several configurations of pooling strategies, loss functions, and training parameters are explored in the experimental setup. The resulting representations serve as inputs to traditional machine learning algorithms for predicting the product categories of trade transactions.
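The MNR objective named above treats each anchor's paired positive as the correct match and the other positives in the batch as in-batch negatives. A minimal NumPy sketch of that objective (illustrative only — the function name and `scale` value are assumptions, and the paper computes this loss inside the SBERT fine-tuning loop rather than standalone):

```python
import numpy as np

def mnr_loss(anchors, positives, scale=20.0):
    """Multiple Negative Ranking loss for a batch of (anchor, positive) pairs.

    Each anchor's positive is the same-index row of `positives`; every other
    positive in the batch acts as an in-batch negative.
    """
    # L2-normalise so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sim = scale * (a @ p.T)                      # (batch, batch) similarity matrix
    # Cross-entropy with the diagonal as each anchor's correct "class"
    sim = sim - sim.max(axis=1, keepdims=True)   # numerical stability
    log_softmax = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_softmax)))
```

Driving this loss to zero pushes each anchor toward its own positive and away from the rest of the batch, which is what makes same-class transactions converge in the embedding space.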
Findings — Encoding trade transactions using SBERT with MNR loss produces enhanced embeddings with improved representational capacity. These fixed-length embeddings serve as adaptable inputs for training machine learning models, including Support Vector Machine (SVM) and Random Forest, for the downstream task of HS code classification. Empirical evidence supports the superior performance of our proposed approach compared to fine-tuning transformer-based models for trade transaction classification.

Originality/value — Our approach generates more representative sentence embeddings by constructing the network architecture from scratch within the SBERT framework. Instead of relying on the data augmentation methods generally used in contrastive learning to measure similarity between samples, we arranged positive samples following a supervised paradigm and determined the loss through distance learning metrics. This process continuously updates the Siamese, or bi-encoder, network to produce embeddings derived from commodity transactions, ensuring that transactions with similar concepts within the same class converge in the feature embedding space and thereby improving the performance of downstream tasks.
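The downstream step takes the fixed-length embeddings as ordinary feature vectors. As a simplified, hypothetical stand-in for the SVM and Random Forest classifiers used in the paper, a cosine nearest-centroid classifier over NumPy arrays illustrates the idea (function name and HS codes below are invented for the example):

```python
import numpy as np

def nearest_centroid_predict(train_emb, train_labels, test_emb):
    """Assign each test embedding the HS code of the nearest class centroid
    by cosine similarity — a lightweight proxy for the SVM / Random Forest
    models the paper trains on the SBERT embeddings.
    """
    classes = sorted(set(train_labels))
    cents = []
    for c in classes:
        m = np.array([e for e, l in zip(train_emb, train_labels) if l == c]).mean(axis=0)
        cents.append(m / np.linalg.norm(m))      # unit-length class centroid
    cents = np.stack(cents)
    t = test_emb / np.linalg.norm(test_emb, axis=1, keepdims=True)
    return [classes[i] for i in (t @ cents.T).argmax(axis=1)]
```

Because the contrastive fine-tuning pulls same-class transactions together, even this simple geometric classifier separates the codes; the paper's SVM and Random Forest exploit the same structure with stronger decision boundaries.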

Item Type: Article
Date Type: Published Online
Status: In Press
Schools: Computer Science & Informatics
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Uncontrolled Keywords: Sentence BERT, Multiple Negative Ranking Loss, Harmonised System Code, Trade Transactions, Support Vector Machine, Random Forest
Publisher: Emerald Publishing
ISSN: 2514-9288
Funders: Indonesia Government - Ministry of Finance
Date of First Compliant Deposit: 7 January 2025
Date of Acceptance: 19 October 2024
Last Modified: 07 Jan 2025 13:00
URI: https://orca.cardiff.ac.uk/id/eprint/174481
