
Learn About Natural Language Processing And Its Challenges In Artificial Intelligence

BY KATHLEEN PREDDY, LINGUISTICS ENGINEER

Read time: 4 minutes

As part of AllyO’s new All-yo questions video series, Linguistic Engineer Kathleen Preddy discusses natural language processing (NLP) and its challenges in AI with co-founder Ankit Somani. Read or watch below as Kathleen explains the type of dialogue system we use here at AllyO, what resources and information our computers need to understand us, and how to tackle language understanding.

Task-oriented dialogue systems

There are many different kinds of dialogue systems. The earliest human-computer dialogue systems were active listeners that parroted responses back to the human. Another common kind of human-computer dialogue system is the question answering system, which answers factoid questions like “what’s the capital of Yemen?” At AllyO, we build a task-oriented dialogue system, in which the human and computer work together to accomplish the task of recruiting. AllyO understands not only what the user is asking but also what information the user is trying to obtain and what information AllyO needs to obtain to accomplish the task, all while balancing the give-and-take of various tasks.
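To make that idea concrete, here is a minimal sketch of the kind of slot-filling state tracking a task-oriented system relies on: the system fills in the information it still needs and asks for whatever is missing. The slot names, prompts, and extraction rules are hypothetical examples for this post, not AllyO’s actual implementation.

```python
import re

# The slots, prompts, and extraction patterns below are illustrative only.
SLOTS = {
    "desired_role": r"(?:applying for|apply for|interested in)\s+(?:an?\s+)?([\w ]+?)\s+(?:role|position|job)",
    "zip_code": r"\b(\d{5})\b",
}

PROMPTS = {
    "desired_role": "Which role are you interested in?",
    "zip_code": "What zip code would you like to work near?",
}

def update_state(state, user_message):
    """Fill any slots we can extract from the user's latest message."""
    for slot, pattern in SLOTS.items():
        if state.get(slot) is None:
            match = re.search(pattern, user_message, re.IGNORECASE)
            if match:
                state[slot] = match.group(1).strip()
    return state

def next_prompt(state):
    """Ask for the first piece of information the system still needs."""
    for slot, prompt in PROMPTS.items():
        if state.get(slot) is None:
            return prompt
    return "Great, I have everything I need to start your application."

state = {"desired_role": None, "zip_code": None}
state = update_state(state, "Hi, I'm interested in a warehouse associate role")
print(next_prompt(state))  # -> "What zip code would you like to work near?"
```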


Does my computer understand me?

In a world where AI is improving at a dramatic rate and is built into our everyday lives, it can seem like our computers understand almost everything we say—and it can be frustrating when they misinterpret us!  With unprecedented computing power and data access, why is language hard for computers?

“What resources and information do computers need to understand us?”

Language is a very complicated phenomenon.  In fact, scientists don’t have a good understanding of how humans process language!  AI doesn’t have the hardware a human has, nor does it have the world experience and context that humans use when interpreting language, so language understanding remains a very challenging problem in the world of AI.  The question becomes: what resources and information do computers need to understand us?  This is the problem we are tackling at AllyO.  

Some things are easier to understand than others.  Grammatical language with unambiguous meanings is something we can teach to computers because we can break it down into heuristics the computer can apply.  However, anything semantically ambiguous—think sarcasm, metaphors, or even homophones like chilly and chili—is nearly impossible for even a very good language processing system to learn.
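As a toy illustration of what such a heuristic can look like, the snippet below pulls one unambiguous piece of information, a years-of-experience figure, out of free text. The rule is invented for this post and is far simpler than anything production-grade.

```python
import re

def years_of_experience(text):
    """Toy heuristic: pull a 'years of experience' number out of free text."""
    match = re.search(r"(\d+)\s*\+?\s*years?", text, re.IGNORECASE)
    return int(match.group(1)) if match else None

print(years_of_experience("I have 6 years of forklift experience"))  # 6
print(years_of_experience("I love forklifts"))                       # None
```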

As we move into the future of natural language processing at AllyO, I’m excited about the new trend of using deep learning and structured knowledge together to teach computers language.  This seems in line with how humans develop cognitively—building a structure of knowledge from the world around us based on as much data as we have gathered from our environment. For example, structured semantic data could give our computers a sense of the differences between chilly and chili and how they may be related to actions like closing windows.  There are a lot of reasons AI can’t compete with humans at understanding other humans, but we can certainly use our huge amounts of available real world knowledge to avoid frustrations like mixing up chilly and chili.  
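Here is one way to picture how structured semantic data could keep chilly and chili apart: a tiny, hand-built sense inventory in which each sense is linked to related concepts and plausible actions. The entries are invented for illustration and are not drawn from any real knowledge base.

```python
# A tiny, hand-built sense inventory: each sense points to related concepts
# and plausible follow-up actions. All entries are invented for illustration.
SENSES = {
    "chilly": {"concepts": {"cold", "weather", "temperature"},
               "actions": {"close the window", "grab a jacket"}},
    "chili":  {"concepts": {"food", "pepper", "stew", "spicy"},
               "actions": {"add beans", "serve with cornbread"}},
}

def pick_sense(context_words):
    """Choose the sense whose related concepts overlap the context the most."""
    return max(SENSES, key=lambda s: len(SENSES[s]["concepts"] & set(context_words)))

sense = pick_sense({"the", "weather", "is", "getting", "cold"})
print(sense, SENSES[sense]["actions"])  # chilly -> close the window / grab a jacket
```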


Can we tackle language understanding a piece at a time?

Language processing is a challenging problem in AI for many reasons. One reason is that the scope of language is so broad, which leads to a lot of ambiguity: “Did you see that play?” could be a question about your theater attendance, about a particular play you watched in a sports game, or about whether you noticed some creature at play. If we don’t know what topic we’re on, it’s very hard to predict the meaning of the word play.

So how do we tackle not only the problem of play, but the problem of all the ambiguous words and phrases we could find?  It’s a bit overwhelming.

One way many natural language processing (NLP) systems handle this is by tackling certain topics, or domains, one at a time. Here at AllyO, we cover the domain of recruiting, which allows us to build state-of-the-art NLP systems without having to contend with as much ambiguity as if we covered all of the English language.

The best way to tackle domain-specific NLP is to build domain-specific knowledge resources. One important knowledge resource is an ontology: a structured model of the concepts in a domain and how they relate to one another, which applications can use to reason about meaning and relatedness in real-world input. Using a domain-specific ontology, we can see how the words we care about are related to each other without being bogged down by all the words in the language. This lets us build NLP systems that are complete for our domain without overdoing it.
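To give a feel for what a domain-specific ontology can look like in code, here is a minimal sketch with a handful of recruiting concepts, is-a links, and related-to links. The terms, relations, and relatedness measure are hypothetical and vastly simpler than a production resource.

```python
# A toy recruiting ontology: concepts, is-a links, and related-to links.
# The terms and relations are illustrative, not AllyO's actual ontology.
from collections import defaultdict

IS_A = {
    "forklift operator": "warehouse worker",
    "warehouse worker": "hourly employee",
    "registered nurse": "healthcare worker",
}

RELATED_TO = defaultdict(set, {
    "forklift operator": {"forklift certification", "pallet", "shift work"},
    "registered nurse": {"RN license", "patient care", "shift work"},
})

def ancestors(concept):
    """Walk the is-a hierarchy upward to find broader concepts."""
    chain = []
    while concept in IS_A:
        concept = IS_A[concept]
        chain.append(concept)
    return chain

def relatedness(a, b):
    """Very rough relatedness: how many associated terms two roles share."""
    return len(RELATED_TO[a] & RELATED_TO[b])

print(ancestors("forklift operator"))                          # ['warehouse worker', 'hourly employee']
print(relatedness("forklift operator", "registered nurse"))    # 1 (shift work)
```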
