Quants see promise in DeBerta’s untangled reading

Newer language models are better able to grasp context

Context, as they say, is everything – which is a big problem for investors when they try to use so-called large language models to weigh the sentiment of financial news. The models are notorious for misreading terms that could be either good or bad depending on what’s being talked about at the time.

A few methods have been tried to solve the problem, mostly using the idea that models can look at the words around those they want to make sense of. One such model, FinBert, is a version of Google's open-source Bert model that has been fine-tuned on financial text to classify the sentiment of news and company statements.
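To illustrate the kind of contextual sentiment scoring described here, the sketch below uses the open-source Hugging Face transformers library with the publicly released ProsusAI/finbert checkpoint. It is a minimal, assumed setup for illustration only; the example headlines are invented and are not drawn from the article.

    # Minimal sketch: scoring the sentiment of financial headlines with FinBert.
    # Assumes the Hugging Face "transformers" library and the publicly available
    # ProsusAI/finbert checkpoint (Bert fine-tuned on financial text).
    from transformers import pipeline

    # Load FinBert as a text-classification pipeline; for each input it returns
    # a label ("positive", "negative" or "neutral") and a confidence score.
    classifier = pipeline("text-classification", model="ProsusAI/finbert")

    headlines = [
        "Company earnings beat expectations despite rising costs",   # illustrative
        "Regulator opens probe into the bank's trading desk",        # illustrative
    ]

    for text, result in zip(headlines, classifier(headlines)):
        print(f"{result['label']:>8}  {result['score']:.2f}  {text}")

Because the underlying model reads each word in the context of its neighbours, the same term can be scored differently in different headlines, which is the behaviour the contextual approaches discussed here aim to improve on.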
