Aug 2023 · Fri 11, 09:00 AM – 06:00 PM IST
Meghana Negi
Problem:
At the Trust and Safety (TnS) team at Swiggy, the overarching goal has been building powerful fraud-detection models that operate at high precision while still capturing as much fraud as possible. Our system currently operates at a high level of complexity, combining various interventions, modelling techniques, and semi-supervised training methods while maintaining robustness.
For the final downstream model, we have always relied on tree-based learners over neural networks. Since our data is primarily tabular, tree-based learners significantly outperformed DNNs on our winning metrics. While tree-based learners perform well on the final metrics we optimise for, they come with a few challenges:
1. They inherently restrict us from using more complex data types such as images or sequential data. We have tried to integrate such signals through a separate model whose final score is fed into the tree-based learner, but this adds significant complexity to the system.
2. A major pain point for fraud models historically has been a lack of explainability in predictions. We have experimented with LIME- and SHAP-based approaches to build an explainability layer, but they are computationally expensive to run for every record.
Solution:
While tree-based methods have all these challenges as a deployable model, what works in their favour is that they have historically outperformed DL-based methods by a significant margin. This changes with TabNet: in the original paper (Ref), the authors claim that TabNet can match or even outperform tree-based methods while also providing sample-level explainability, which can be visualised. We explored a TabNet-based model for our use case and found it to be on par with its tree-based counterpart (XGBoost). TabNet also allowed us to compute and store feature-level attention within the model logs without any computational overhead.
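The feature-level attention TabNet exposes comes from its attentive transformer, which produces per-sample feature masks via sparsemax rather than softmax: low-scoring features get exactly zero weight, so each prediction comes with a sparse, readable feature mask. A minimal NumPy sketch of sparsemax (Martins & Astudillo, 2016) illustrates the mechanism; the logits below are illustrative, not from our model:

```python
import numpy as np

def sparsemax(z):
    """Project logits onto the probability simplex so that the result
    sums to 1 but low-scoring entries are exactly zero (unlike softmax)."""
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]          # sort logits in descending order
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, len(z) + 1)
    support = 1 + k * z_sorted > cumsum  # features kept in the support
    k_z = k[support][-1]                 # size of the support set
    tau = (cumsum[support][-1] - 1) / k_z  # threshold subtracted from logits
    return np.maximum(z - tau, 0.0)

# Toy "attention logits" over four features for one sample:
mask = sparsemax([2.0, 1.0, 0.1, -0.5])
# -> array([1., 0., 0., 0.]): only the top feature is selected
```

Because the mask is produced inside the forward pass, logging it per prediction costs nothing extra, which is what removes the SHAP/LIME-style post-hoc overhead.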
Outline:
In the presentation, we'll cover the following in depth:
Current pipeline and solution
Challenges in depth
Motivation for TabNet and what it unlocks
Experimental results and conclusion