Course Catalog - 2023-2024

COMP 652 - NATURAL LANGUAGE PROCESSING

Long Title: NATURAL LANGUAGE PROCESSING
Department: Computer Science
Grade Mode: Standard Letter
Language of Instruction: Taught in English
Course Type: Lecture
Credit Hours: 3
Restrictions:
Must be enrolled in one of the following Program(s):
Online Master of Data Science
Online Master of Computer Science
Must be enrolled in one of the following Level(s):
Graduate
Prerequisite(s): COMP 614 AND COMP 642 AND COMP 680
Description: This is an introductory graduate-level course in Natural Language Processing (NLP), where students will learn the fundamental concepts of computational linguistics, probabilistic language models, neural representations of language, and text parsing. The course further includes several applications of these concepts to real-world problems, such as sentiment analysis, information extraction, question answering, and the design of chatbots. Students will complete individual assignments, a literature review, and a team project designed to give them hands-on exposure to designing, building, and training NLP methods and models for different real-world tasks. In addition, relevant state-of-the-art algorithms and architectures will be discussed.
Recommended Prerequisite(s):
Familiarity with fundamental concepts of calculus, including partial derivatives, the chain rule, total derivatives, and derivatives and partial derivatives of vectors and matrices.
Familiarity with fundamental concepts of probability and statistics, including probability distributions, density functions, computing probabilities, expectation, variance, multivariate distributions, random variables, and multivariate random variables.
Familiarity with fundamental concepts of linear algebra, such as inner products, vector spaces, vector and matrix norms, the rank of a matrix, positive definite matrices, and matrix factorization, e.g., spectral decomposition and singular value decomposition.
Familiarity with fundamental concepts of machine learning and optimization theory, such as loss functions, gradient descent, maximum likelihood estimation, MAP estimation, dimensionality reduction, principal component analysis, the Naive Bayes algorithm, and logistic regression.