Traders can’t predict the market, but maybe their faces can



“Everyone on the street talks,” says one trader in London (not part of the study) who said they’d find such alerts about their peers’ sentiment useful. “The whole part of doing what we do is discuss ideas and share information… Non-verbal communication is massive.” Years back, trading floors were loud places where people would often talk on three or four phone lines at the same time; now many communicate over chat rooms and talking is minimal.

But the study also points to another uncomfortable phenomenon: facial recognition is here to stay and its more controversial cousin, facial analysis, might be as well. For all the concern that has bubbled up around facial recognition, including over the mistakes it can make as a surveillance tool, tens of millions of us still use it unhesitatingly to unlock our phones.

‘We have a whole toolbox of searching algorithms that we’ll be testing to see if they correlate to a market signal.’

Mario Savvides, lead scientist on the project

Facial analysis, like the kind being used by Carnegie Mellon, opens a bigger can of worms. Last summer, Microsoft vowed to eliminate its facial-analysis tools, which estimated a person’s gender, age and emotional state, admitting that the system could be unreliable and invasive. That might not matter too much for traders, eager to lap up whatever data they can for an edge. But this study — if successful — could embolden research into analysing faces for other purposes, such as assessing someone’s emotional state during a work meeting.

“If you’re doing a business deal over Zoom, can you have an AI read the face to tell if someone is calling your bluff, or being a hard negotiator?” asks Savvides. “It’s possible. Why not?”

Zoom Video Communications introduced a feature last year that tracks sentiment in recorded work meetings. Called Zoom IQ, the software, aimed at sales professionals, gives meeting participants a score between 0 and 100, with anything over 50 indicating greater engagement in the conversation. The system doesn’t use facial analysis; instead it tracks signals such as how engaged speakers are and how long they wait to respond, and it delivers the score at the end of the meeting.
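
As a rough illustration only, and not Zoom’s actual method, a score of this kind could be assembled from a few observable meeting signals. The signal names, weights and thresholds below are assumptions made for the sketch.

    # Hypothetical sketch of a 0-100 engagement score built from meeting signals.
    # The signals and weights are illustrative assumptions, not Zoom IQ's real formula.
    def engagement_score(avg_response_delay_s: float,
                         talk_time_ratio: float,
                         questions_asked: int) -> float:
        """Return a score between 0 and 100; higher suggests more engagement."""
        # Faster replies score higher; cap the delay considered at 30 seconds.
        latency = max(0.0, 1.0 - min(avg_response_delay_s, 30.0) / 30.0)
        # Talk time close to an even split scores higher than dominating or staying silent.
        balance = 1.0 - abs(talk_time_ratio - 0.5) * 2.0
        # A handful of questions suggests active participation; cap the contribution.
        questions = min(questions_asked, 5) / 5.0
        return round(100.0 * (0.4 * latency + 0.4 * balance + 0.2 * questions), 1)

    # Example: quick replies, balanced talk time and a few questions land well above 50.
    print(engagement_score(avg_response_delay_s=4.0, talk_time_ratio=0.45, questions_asked=3))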

More than two dozen rights groups have called on Zoom to stop working on the feature, arguing that sentiment analysis is underpinned by pseudoscience and is “inherently biased”. A spokesperson for Zoom said the company still sells the software, and that it “turns customer interactions into meaningful insights”.

You can argue that Carnegie Mellon’s researchers shouldn’t care what their facial-analysis tool tells them about traders’ emotions; they just need to spot the patterns that point to correlations, and pivot those figures into a searching algorithm. But the downside of turning emotions into a number is just that: it risks devaluing one of the most fundamental features of being human. It might be better if it doesn’t catch on.
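
In the simplest terms, and only as an assumed sketch of that pattern-spotting step rather than the researchers’ actual pipeline, the check amounts to measuring whether a sentiment series moves with a market series. The data and threshold below are invented for illustration.

    # Assumed, minimal sketch: test whether an aggregated facial-sentiment series
    # correlates with a market signal. The numbers are invented for illustration.
    from statistics import correlation  # Pearson correlation, Python 3.10+

    sentiment = [0.2, 0.5, 0.1, -0.3, -0.6, 0.4, 0.7, -0.2]   # hypothetical per-minute sentiment
    returns   = [0.1, 0.3, 0.0, -0.2, -0.4, 0.2, 0.5, -0.1]   # hypothetical index returns (%)

    r = correlation(sentiment, returns)
    print(f"correlation: {r:.2f}")
    if abs(r) > 0.5:  # arbitrary cut-off for the sketch
        print("Candidate signal worth testing further.")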

Bloomberg
