Natural Language Processing (NLP) has come a long way in working with unsupervised data (self-learning), yet in most industries it is still equated with virtual assistants ('chatbots') that can handle basic customer queries. Most discussions of AI-enabled language technology fail to distinguish between Natural Language Understanding (NLU) and Natural Language Generation (NLG), or to recognize how advances in deep learning (specifically recurrent neural networks) are ushering in a sophisticated era in which virtual assistants go beyond answering basic customer queries.
Even more amusing is how many financial institutions fervently market their virtual assistant as the next big breakthrough, so much so that they christen it with a human name. In this research, we move past this familiar refrain to more sophisticated yet highly practical NLU and NLG use cases:
Premium article: "Unsupervised NLP Ushers in Sophisticated VA Era: Compliance, Trading, Issue Resolution, Information Exchange w/ Intention" (paid access, 2500 USD).
In uncertain and precarious times, most enterprises fall into one of two camps: those wanting to capitalize on lurking opportunities at any cost (arbitrage), and those wanting to retain customers while carefully helping them navigate uncertainty. Which approach a business adopts is strictly its prerogative, as the ultimate objective of any business is to maximize profit for its shareholders, ideally while keeping the interests of all stakeholders (such as customers) in mind.
In dire times, it is not uncommon for enterprises to fend for themselves while also seeking to capitalize (even unfairly or unethically) on imbalances between demand and supply. To combat exploitation and unfair opportunistic advantage, consumers look to regulators and compliance officers.
Premium article: "Next Wave of RegTech: Less Manual Oversight via Greater AI Hindsight" (paid access, 1000 USD).
In 2018, Google introduced a new neural-network-based transfer learning technique for Natural Language Processing (NLP) called Bidirectional Encoder Representations from Transformers (BERT). According to Google, 'this technology enables anyone to train their own state-of-the-art question answering system'.
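As a sketch of the question-answering capability Google describes, a BERT model fine-tuned for extractive QA can be called through the Hugging Face `transformers` library. The helper name, example text, and model choice below are illustrative assumptions, not part of the original article:

```python
# Hypothetical sketch (not from the article): extractive question answering
# with a BERT model fine-tuned on SQuAD, via Hugging Face `transformers`.

def answer_question(question: str, context: str) -> str:
    """Return the answer span a BERT QA model extracts from `context`."""
    # Imported lazily so the sketch can be loaded without the library installed.
    from transformers import pipeline

    # A publicly available BERT checkpoint fine-tuned for question answering.
    qa = pipeline(
        "question-answering",
        model="bert-large-uncased-whole-word-masking-finetuned-squad",
    )
    # The pipeline returns a dict with "answer", "score", "start", and "end".
    result = qa(question=question, context=context)
    return result["answer"]

# Example usage (downloads the model on first run):
# answer_question(
#     "What fee applies to late payments?",
#     "Late payments incur a fee of 25 USD after a 10-day grace period.",
# )
```

Because BERT reads context bidirectionally, the model locates the answer span from the surrounding passage rather than matching keywords, which is what makes such systems practical for customer-facing query resolution.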
Premium article: "2020 Linguistic Cheatsheet for Banking: BERT, Lexical Chaining, NLP and CX" (paid access, 1000 USD).