
The AI Recruiter Who Really Gets It: Part II — Linguistics in AI

As described in Part I of this article, we humans do a lot of unconscious work when communicating. We apply linguistic rules and knowledge to interpret phrases and sentences, and even individual words. We can also apply contextual information to derive meaning from otherwise ambiguous expressions. Finally, understanding one another depends on having shared knowledge of the world and objects around us. Here in Part II, I will explain how computers, and specifically our AI recruiter, AllyO, can take advantage of these same language-understanding skills.

Linguistics in AI — rules and knowledge

We will re-use the examples from Part I, starting with those highlighting linguistic knowledge:

  1. Coordination: I can repair electronics and operate hand tools and heavy machinery.
  2. Negation: I have worked in a cafe, never in a restaurant.
  3. Spelling: I have on year of experience.

Coordination:

Example 1, coordination, is the most strictly rule-based example here. The rules for constructing and understanding utterances in a language are fairly regular, and can be assembled into a grammar. The grammar can then be used to parse text, and the underlying structure can be discovered. In this case, a parse would reveal that “hand tools” and “heavy machinery” are both arguments of the verb “operate”, and that “repair electronics” and “operate hand tools and heavy machinery” are arguments of the verb “can”.
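
As a rough sketch of what such a parse looks like (not AllyO’s own grammar), a general-purpose dependency parser such as spaCy exposes this coordination structure directly. The model name and dependency labels below assume spaCy’s standard English pipeline:

```python
# Sketch: surface coordination structure with an off-the-shelf dependency parser.
# Assumes spaCy and its en_core_web_sm English model are installed.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("I can repair electronics and operate hand tools and heavy machinery.")

for token in doc:
    if token.dep_ == "conj":   # coordinated phrase, attached to its first conjunct
        print(f"'{token.text}' is coordinated with '{token.head.text}'")
    if token.dep_ == "dobj":   # direct object, attached to its verb
        print(f"'{token.text}' is an object of '{token.head.text}'")
```

A parse along these lines links “machinery” back to “tools” and “operate” back to “repair”, which is the argument structure described above.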

Negation:

Negation, as in example 2, tends to follow some rules, but can also arise from the meanings of individual words. A first pass at recognizing negation can simply look for negative words like “not” and “never”, or prefixes like “un-”. However, people can also answer a question negatively by making positive assertions about the opposite of what is asked (“Can you work weekends?” “I am only available to work on weekdays.”). Subtle negations like this can be understood with explicitly defined knowledge resources, such as an ontology where the concepts of “working weekends” and “working weekdays” are encoded as having contradictory meanings.
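
To make those two passes concrete, here is a minimal sketch, assuming a hand-built set of negation cues and a toy table of mutually exclusive concepts (the concept names and the structure of the table are illustrative, not AllyO’s actual ontology):

```python
# Sketch: two-level negation handling -- surface cues plus a tiny hand-built
# table of contradictory concepts.
NEGATION_CUES = {"not", "never", "no", "n't"}

# Concepts we explicitly encode as mutually exclusive.
CONTRADICTS = {
    "works_weekdays_only": {"works_weekends"},
}

def has_surface_negation(tokens):
    """First pass: look for explicit negative words."""
    return any(t.lower() in NEGATION_CUES for t in tokens)

def contradicts(asserted_concept, asked_concept):
    """Second pass: a positive assertion can still negate the question."""
    return asked_concept in CONTRADICTS.get(asserted_concept, set())

# "Can you work weekends?" / "I am only available to work on weekdays."
print(has_surface_negation("I am only available to work on weekdays".split()))  # False
print(contradicts("works_weekdays_only", "works_weekends"))                     # True
```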

Spelling:

Although spelling is governed by rules, we cannot necessarily use these rules to fix misspelled words, because we don’t know what word the writer meant (or even that the given spelling was definitely a mistake). One good way to overcome misspellings is to use edit distance when words are unrecognizable, or when phrases fail to produce felicitous meanings as given. In example sentence 3, if AllyO has some expectation of receiving numbers with messages about “years of experience”, perhaps of the form “I have <number> years of experience”, then we could recover “one” from “on” with an edit distance of 1.
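
Here is a minimal sketch of that recovery step, assuming a plain Levenshtein edit distance and a small list of expected number words (both are illustrative choices, not a description of AllyO’s implementation):

```python
# Sketch: recover a likely intended word when a token does not fit the
# expected pattern "I have <number> years of experience".
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

NUMBER_WORDS = ["one", "two", "three", "four", "five", "six", "seven"]

def recover_number(token: str, max_distance: int = 1):
    """Return the closest number word within the allowed edit distance."""
    distance, best = min((levenshtein(token, w), w) for w in NUMBER_WORDS)
    return best if distance <= max_distance else None

print(recover_number("on"))  # -> "one" (edit distance 1)
```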

Contextual Information

In order to fully understand the second set of example sentences, we need some contextual information:

      1. I was a sales lead at ABC Corp. from 2014 to 2016 and have been doing the same thing at XYZ Inc. since then.

      2. I received my bachelor’s in finance in 2010.  

          I’ve been a financial analyst at National Realty Group since college.

      3. I was in charge of the servers at a bank. – (Applying for an IT position)

Recognizing entities and concepts:

In sentence 1 above, we would start with recognizing entities and concepts, like “sales lead”, and creating a parse of the sentence.  This analysis would give us something like <job role: sales lead>, <job employer: ABC Corp>, <job role: the same thing>, <job employer: XYZ Inc.>.  With a predefined set of referential expressions like {“it”, “that”, “there”, “the same thing”}, AllyO knows to look back and copy the previously found job role “sales lead” in place of the referential expression.
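
A minimal sketch of that look-back step, assuming the parse has already been flattened into (slot, value) pairs (the slot names and the resolve_references helper are illustrative):

```python
# Sketch: copy the most recent concrete value over a referential expression.
REFERENTIAL = {"it", "that", "there", "the same thing"}

def resolve_references(slots):
    last_seen = {}   # slot name -> most recent concrete value
    resolved = []
    for name, value in slots:
        if value.lower() in REFERENTIAL and name in last_seen:
            value = last_seen[name]   # look back and copy, e.g. "sales lead"
        last_seen[name] = value
        resolved.append((name, value))
    return resolved

parsed = [("job role", "sales lead"), ("job employer", "ABC Corp"),
          ("job role", "the same thing"), ("job employer", "XYZ Inc.")]
print(resolve_references(parsed))
# [('job role', 'sales lead'), ('job employer', 'ABC Corp'),
#  ('job role', 'sales lead'), ('job employer', 'XYZ Inc.')]
```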

Conception of time and events:

Sentence 2 requires a conception of time and events.  Similar to the analysis of sentence 1, we can construct a representation of concepts and events like <type of degree: bachelor’s in finance> <time of degree completion: 2010>, <job role: financial analyst>, <job start: college>.  This example requires inferring that “since college” really means “since the end of attending college”, but with some basic knowledge representation, we can recognize that a bachelor’s degree is received by attending college, so <job start> is equivalent to <time of degree completion>, which we know is 2010.
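
A minimal sketch of that inference, assuming the slot representation above and a tiny hand-encoded table of event equivalences (both are illustrative):

```python
# Sketch: "since college" is treated as "since the end of attending college",
# which our toy knowledge representation equates with degree completion.
EQUIVALENT_EVENTS = {
    # A bachelor's degree is received by attending college, so the end of
    # college can stand in for the degree-completion time.
    "college": "time of degree completion",
}

def resolve_job_start(slots):
    start = slots.get("job start")
    if start in EQUIVALENT_EVENTS:
        return slots.get(EQUIVALENT_EVENTS[start])
    return start

slots = {
    "type of degree": "bachelor's in finance",
    "time of degree completion": "2010",
    "job role": "financial analyst",
    "job start": "college",
}
print(resolve_job_start(slots))  # -> "2010"
```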

Graph-based representation of concepts:

Sentence 3 is in some ways the easiest of this set to understand.  AllyO doesn’t have the chance to read books and newspapers and talk to people to learn new vocabulary, so the only known concepts are those that we have explicitly encoded.  This means we can avoid spurious interpretations like “bank” as a river bank by simply excluding them from our knowledge representation. Given our domain of jobs and recruiting across all industries, we would need to know both the web-server and food-server senses of “server”, but this ambiguity can be resolved by using a graph-based representation of these concepts.  We can measure the distance in the graph from the concept of the position being applied for to the possible senses of the word “server”. In this case, “web-server” would have a much shorter distance from IT professions than “food server”.
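
A minimal sketch of that distance measurement, using a toy hand-built concept graph and breadth-first search (a production taxonomy would be far larger, and the node names below are illustrative):

```python
# Sketch: pick the word sense whose concept sits closest to the target position.
from collections import defaultdict, deque

EDGES = [
    ("occupations", "IT profession"),
    ("occupations", "hospitality"),
    ("IT profession", "computing"),
    ("computing", "web server"),
    ("hospitality", "food service"),
    ("food service", "food server"),
]

def build_graph(edges):
    graph = defaultdict(set)
    for a, b in edges:          # undirected edges
        graph[a].add(b)
        graph[b].add(a)
    return graph

def distance(graph, start, goal):
    """Breadth-first search distance between two concepts."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, d = queue.popleft()
        if node == goal:
            return d
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return float("inf")

graph = build_graph(EDGES)
senses = ["web server", "food server"]
print(min(senses, key=lambda s: distance(graph, "IT profession", s)))  # -> "web server"
```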

Predefined knowledge representations

The final set of examples relies further on predefined knowledge representations:

      1. Q: Do you have a reliable means of transportation?

          A: I don’t have a car, but my apartment is along the bus route.

      2. I’d like a job in the Bay Area

The question in example 1 is framed as a yes/no question, but the response is a sentence describing the applicant’s available means of transportation. The absence of a clear affirmative or negative answer would be problematic for a naive system. But with our knowledge representation, we can include an expected topic of <transportation type> for this exchange, and recognize concepts including “car”, “bus” and “bicycle” in addition to a simple “yes” or “no”.
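
A minimal sketch of that topic-aware interpretation, assuming a hand-built topic table and simple token matching (the concept lists and the interpret_reply helper are illustrative):

```python
# Sketch: when a yes/no question has an expected topic, scan the reply for
# concepts under that topic as well as for a plain "yes" or "no".
EXPECTED_TOPICS = {
    "reliable transportation": {"car", "bus", "bicycle", "train", "carpool"},
}
AFFIRMATIVE = {"yes", "yeah", "yep", "sure"}
NEGATIVE = {"no", "nope"}

def interpret_reply(topic, reply):
    tokens = {t.strip(".,!?").lower() for t in reply.split()}
    if tokens & AFFIRMATIVE:
        return "affirmative"
    mentioned = tokens & EXPECTED_TOPICS[topic]
    if mentioned:
        # Not a literal yes/no, but the reply addresses the expected topic.
        return f"describes transportation: {sorted(mentioned)}"
    if tokens & NEGATIVE:
        return "negative"
    return "needs clarification"

print(interpret_reply("reliable transportation",
                      "I don't have a car, but my apartment is along the bus route."))
# -> describes transportation: ['bus', 'car']
```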

In the final example, a prospective applicant has used a colloquial term to express interest in jobs within a specific geographic region. Understanding requests like this can benefit from a combination of several techniques. Most simply, there already exist publicly available geographic search tools to help find the most likely meaning of an unofficial place name. Additionally, we can learn from our data and include these place names in our own knowledge base, mapping them to standard cities and postal codes.
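
A minimal sketch of such a mapping, assuming a small lookup table of colloquial region names learned from data (the cities and postal-code prefixes shown are illustrative, not an exhaustive definition of the region):

```python
# Sketch: map a colloquial region name found in a request to standard
# cities and postal-code prefixes.
COLLOQUIAL_REGIONS = {
    "bay area": {
        "cities": ["San Francisco", "Oakland", "San Jose"],
        "postal_prefixes": ["941", "946", "951"],
    },
}

def expand_region(text):
    """Return the standard locations behind a colloquial region name, if known."""
    for name, region in COLLOQUIAL_REGIONS.items():
        if name in text.lower():
            return region
    return None

print(expand_region("I'd like a job in the Bay Area"))
# -> {'cities': ['San Francisco', 'Oakland', 'San Jose'], 'postal_prefixes': ['941', '946', '951']}
```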

From linguistic knowledge to world knowledge, computers can make use of pre-compiled information in many of the same ways humans do. They can analyze and deconstruct the parts of a sentence, make judgements about the similarity or relevance of word senses, and recognize places and entities referred to by non-standard names. By using linguistics in AI, we can better understand the way that humans present information.
