Can someone please build a framework for store assistants to quickly diagnose a customer's intent? For example:
- Customer entered the store and casually wandered around, which equates to casual shopping.
- Customer walked directly to a specific section and is browsing there.
- Customer has one hand in a pocket, or is clasping a handbag.
I don’t know the specifics, but the idea is a model so that store assistants can recognise the behaviour that predicts a certain outcome and adjust their service accordingly. A win/win for both parties.
Wouldn’t be too hard: grab some security footage for a day, monitor what happens, classify customers into purchasing behaviours, link those behaviours to commonalities, then test again against new footage.
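The footage-to-model loop above could be sketched as a tiny frequency table: hand-label each observed customer with behavioural cues and an outcome, count which outcomes each cue pattern led to, and predict the majority outcome for new customers. A minimal sketch, assuming hand-labelled data; all cue names, labels, and example records here are invented for illustration, not real data.

```python
from collections import Counter, defaultdict

# Hypothetical labelled observations from one day of security footage.
# Each record: behavioural cues -> the outcome eventually observed.
observations = [
    ({"entry": "wander", "hands": "free"},    "browse_only"),
    ({"entry": "wander", "hands": "free"},    "browse_only"),
    ({"entry": "direct", "hands": "free"},    "purchase"),
    ({"entry": "direct", "hands": "clasped"}, "purchase"),
    ({"entry": "direct", "hands": "free"},    "purchase"),
    ({"entry": "wander", "hands": "clasped"}, "browse_only"),
]

def train(observations):
    """Count outcomes per cue pattern (the 'link to commonalities' step)."""
    table = defaultdict(Counter)
    for cues, outcome in observations:
        key = tuple(sorted(cues.items()))  # cues as a hashable pattern
        table[key][outcome] += 1
    return table

def predict(table, cues):
    """Return the most common outcome seen for this cue pattern, if any."""
    key = tuple(sorted(cues.items()))
    if key not in table:
        return "unknown"  # pattern never seen in the training footage
    return table[key].most_common(1)[0][0]

model = train(observations)
print(predict(model, {"entry": "direct", "hands": "free"}))  # -> purchase
```

Testing against a second day of footage is then just running `predict` on the new observations and comparing to what actually happened.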
Tags: purchasing mode