What Is Natural Language Processing (NLP)?

Natural language processing (NLP) is a branch of artificial intelligence that helps computers understand, interpret and manipulate human language.

NLP draws from many disciplines, including computer science and computational linguistics, in its pursuit to bridge the gap between human communication and computer understanding.

Challenges in NLP frequently involve speech recognition, natural language understanding, and natural language generation.


How has NLP evolved?


While natural language processing isn't a new science, the technology is rapidly advancing thanks to increased interest in human-to-machine communication, along with the availability of big data, powerful computing and enhanced algorithms.

In the early days, many language-processing systems were designed by hand-coding a set of rules, such as by writing grammars or devising heuristic rules for stemming.

However, hand-written rules are rarely robust to natural-language variation. Since the so-called "statistical revolution" of the late 1980s and mid-1990s, much natural language processing research has relied heavily on machine learning.

The machine-learning paradigm instead calls for using statistical inference to automatically learn such rules through the analysis of large corpora of typical real-world examples. These algorithms take as input a large set of "features" generated from the input data and make soft, probabilistic decisions by attaching real-valued weights to each feature, producing more reliable results that express the relative certainty of many different possible answers.
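As a rough, hypothetical illustration of that paradigm (the example texts, labels and library choice are assumptions, not from the source), the sketch below trains a small scikit-learn classifier that attaches a real-valued weight to each bag-of-words feature and returns probabilistic decisions:

    # Minimal sketch (illustrative only): a statistical classifier that learns
    # real-valued weights for bag-of-words features and outputs probabilities.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    # Tiny invented corpus; real systems learn from large corpora of examples.
    texts = ["great movie, loved it", "terrible plot, boring acting",
             "wonderful and fun", "awful, a waste of time"]
    labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

    vectorizer = CountVectorizer()            # generates "features" from raw text
    X = vectorizer.fit_transform(texts)

    model = LogisticRegression()              # attaches a weight to each feature
    model.fit(X, labels)

    # Soft, probabilistic decision for a new input
    new = vectorizer.transform(["boring but wonderful ending"])
    print(model.predict_proba(new))           # relative certainty of each answer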


How does NLP work?


NLP includes many different techniques for interpreting human language, ranging from statistical and machine learning methods to rules-based and algorithmic approaches.

A broad array of approaches is needed because text- and voice-based data vary widely, as do the practical applications.

Basic NLP tasks include tokenization and parsing, lemmatization/stemming, part-of-speech tagging, language detection and identification of semantic relationships.
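For instance, a minimal sketch of a few of these basic tasks might look like the following; NLTK is just one possible library choice, and the sample sentence is invented:

    # Rough sketch of basic NLP tasks using NLTK (one library option among many).
    import nltk
    from nltk.stem import PorterStemmer, WordNetLemmatizer

    # One-time resource downloads; exact resource names can vary across NLTK
    # versions (e.g. "punkt" vs. "punkt_tab").
    nltk.download("punkt")
    nltk.download("averaged_perceptron_tagger")
    nltk.download("wordnet")

    text = "The striped bats were hanging on their feet."

    tokens = nltk.word_tokenize(text)                            # tokenization
    tags = nltk.pos_tag(tokens)                                  # part-of-speech tagging
    stems = [PorterStemmer().stem(t) for t in tokens]            # stemming
    lemmas = [WordNetLemmatizer().lemmatize(t) for t in tokens]  # lemmatization

    print(tags)
    print(stems)
    print(lemmas)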

In general terms, NLP tasks break down language into shorter, elemental pieces, try to understand relationships between the pieces and explore how the pieces work together to create meaning. Some higher-level capabilities include:

  • Content categorization. Linguistic-based document summarization, including search and indexing, content alerts and duplicate detection.

  • Topic discovery and modeling. Accurately capture the meaning and themes in text collections, and apply advanced analytics to text, such as optimization and forecasting (a minimal sketch follows this list).

  • Contextual extraction. Automatically pull structured information from text-based sources.

  • Sentiment analysis. Identify the mood or subjective opinions within large amounts of text, including average sentiment and opinion mining.

  • Speech-to-text and text-to-speech conversion. Transform voice commands into written text, and vice versa.

  • Document summarization. Automatically generate synopses of large bodies of text.

  • Machine translation. Automatically translate text or speech from one language to another.
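As a hypothetical sketch of the topic discovery capability above (the documents, library and parameters are illustrative assumptions, not from the source), latent Dirichlet allocation can surface recurring themes in a small collection:

    # Illustrative topic-discovery sketch using LDA (one possible technique).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = ["the court ruled on the new tax law",
            "the team won the championship game last night",
            "lawyers argued the case before the judge",
            "the coach praised the players after the match"]

    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(docs)

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    lda.fit(X)

    # Print the top words for each discovered topic
    # (use get_feature_names() instead on older scikit-learn versions).
    terms = vec.get_feature_names_out()
    for i, topic in enumerate(lda.components_):
        top = [terms[j] for j in topic.argsort()[-4:]]
        print(f"Topic {i}: {top}")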


Why is NLP important?

Large volumes of textual data

Natural language processing helps computers communicate with humans in their own language and scales other language-related tasks.

For example, NLP makes it possible for computers to read text, hear speech, interpret it, measure sentiment and determine which parts are important.

Today’s machines can analyze more language-based data than humans, without fatigue and in a consistent, unbiased way.

Structuring a highly unstructured data source

Human language is astoundingly complex and diverse.

We express ourselves in infinite ways, both verbally and in writing. Not only are there hundreds of languages and dialects, but within each language is a unique set of grammar and syntax rules, terms and slang. When we write, we often misspell or abbreviate words, or omit punctuation. When we speak, we have regional accents, and we mumble, stutter and borrow terms from other languages.

While supervised and unsupervised learning, and specifically deep learning, are now widely used for modeling human language, there’s also a need for syntactic and semantic understanding and domain expertise that are not necessarily present in these machine learning approaches.

NLP is important because it helps resolve ambiguity in language and adds useful numeric structure to the data for many downstream applications, such as speech recognition or text analytics.
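One common way to add such numeric structure, shown here only as an illustrative sketch with invented documents, is to turn raw text into TF-IDF vectors that downstream analytics can consume:

    # Sketch of turning unstructured text into numeric structure (TF-IDF vectors)
    # that downstream applications such as text analytics can work with.
    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = ["I can't wait for the show",
            "We had to wait an hour for a table",
            "The show was worth the wait"]

    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform(docs)    # one numeric row per document

    print(vectorizer.get_feature_names_out())  # the learned vocabulary
    print(matrix.toarray().round(2))           # TF-IDF weights per word, per document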


What are NLP’s major applications?


Text analytics

NLP goes hand in hand with text analytics, which counts, groups and categorizes words to extract structure and meaning from large volumes of content. Text analytics is used to explore textual content and derive new variables from raw text that may be visualized, filtered, or used as inputs to predictive models or other statistical methods.
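At its simplest, deriving new variables from raw text can mean counting and grouping words; the toy sketch below (with invented example text) computes word frequencies and a document-length variable:

    # Toy text-analytics sketch: count and group words to derive simple
    # new variables (word frequencies, document length) from raw text.
    from collections import Counter
    import re

    raw = "Customers love the new app. The app crashes, but customers still love it."

    words = re.findall(r"[a-z']+", raw.lower())   # crude tokenization
    counts = Counter(words)

    print(counts.most_common(3))   # most frequent terms
    print(len(words))              # a derived variable: document length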

NLP and text analytics are used together for many applications, including:

  • Investigative discovery. Identify patterns and clues in emails or written reports to help detect and solve crimes.

  • Subject-matter expertise. Classify content into meaningful topics so you can take action and discover trends.

  • Social media analytics. Track awareness and sentiment about specific topics and identify key influencers.

Everyday NLP examples

There are many common and practical applications of NLP in our everyday lives. Beyond conversing with virtual assistants like Alexa or Siri, here are a few more examples:

Have you ever looked at the emails in your spam folder and noticed similarities in the subject lines? You’re seeing Bayesian spam filtering, a statistical NLP technique that compares the words in spam with those in legitimate emails to identify junk mail.
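A minimal sketch of the idea, assuming a multinomial Naive Bayes classifier and invented example messages (neither is prescribed by the source), might look like this:

    # Hedged sketch of Bayesian spam filtering with a Naive Bayes classifier.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    emails = ["win a free prize now", "limited offer, claim your prize",
              "meeting moved to 3pm", "lunch tomorrow with the team?"]
    labels = [1, 1, 0, 0]   # 1 = spam, 0 = valid ("ham")

    vec = CountVectorizer()
    X = vec.fit_transform(emails)

    clf = MultinomialNB()            # compares word statistics in spam vs. valid mail
    clf.fit(X, labels)

    test = vec.transform(["claim your free offer"])
    print(clf.predict_proba(test))   # probability of ham vs. spam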

Have you ever missed a phone call and read the automatic transcript of the voicemail in your email inbox or smartphone app? That’s speech-to-text conversion, an NLP capability.

Have you ever navigated a website by using its built-in search bar, or by selecting suggested topic, entity or category tags? Then you’ve used NLP methods for search, topic modeling, entity extraction and content categorization.

A subfield of NLP called natural language understanding (NLU) has begun to rise in popularity because of its potential in cognitive and AI applications.

NLU goes beyond the structural understanding of language to interpret intent, resolve context and word ambiguity, and even generate well-formed human language on its own.

NLU algorithms must tackle the extremely complex problem of semantic interpretation – that is, understanding the intended meaning of spoken or written language, with all the subtleties, context and inferences that we humans are able to comprehend.

The evolution of NLP toward NLU has a lot of important implications for businesses and consumers alike. Imagine the power of an algorithm that can understand the meaning and nuance of human language in many contexts, from medicine to law to the classroom.


As the volumes of unstructured information continue to grow exponentially, we will benefit from computers’ tireless ability to help us make sense of it all.




Data sources: Wikipedia and SAS