Online Course
Intro to LLMs

Start your AI Engineer career journey: Master Transformer Architecture and the Essentials of Modern AI

4.8

863 reviews on Trustpilot
6,133 students already enrolled
  • Institute of Analytics
  • The Association of Data Scientists
  • E-Learning Quality Network
  • European Agency for Higher Education and Accreditation
  • Global Association of Online Trainers and Examiners

Skill level:

Intermediate

Duration:

3 hours
  • Lessons (3 hours)

CPE credits:

4
CPE stands for Continuing Professional Education and represents the mandatory credits a wide range of professionals must earn to maintain their licenses and stay current with regulations and best practices. One CPE credit typically equals 50 minutes of learning. For more details, visit NASBA's official website: www.nasbaregistry.org

Accredited certificate

What you learn

  • Create AI-driven applications for specific business needs.
  • Boost your career by mastering AI engineering skills.
  • Understand key LLM concepts like attention and self-attention.
  • Integrate OpenAI’s API to connect products with AI models.
  • Gain an introduction to LangChain and Hugging Face.

Topics & tools

AI, Transformers, LLMs, Attention, GPT, LangChain, BERT, Hugging Face, XLNet, Natural Language Processing, Python

Your instructor

Course OVERVIEW

Description

CPE Credits: 4
Field of Study: Information Technology
Delivery Method: QAS Self Study
In recent years, large language models (LLMs) have dominated the tech news for their incredible ability to write poetry, essays, social media content, code, and more. They’re the hot new topic in natural language processing. This Intro to Large Language Models course teaches you the knowledge and skills required to experiment with and create your own language model solutions. Through a combination of video lessons and practical coding exercises, we’ll cover what these language models are, how they function, and ways to implement them in your own projects. Whether you want to generate content, create a chatbot, or train these models on your own custom data and NLP tasks, this course equips you with the fundamental tools and concepts to fine-tune LLMs and tackle these challenges.
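To give a flavor of the concepts the course builds toward, here is a minimal sketch of scaled dot-product attention, the mechanism at the heart of every transformer. This is an illustrative toy, not course material: the function name, shapes, and random inputs are our own, and real models use learned projection matrices and multiple attention heads.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)   # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over the keys
    return weights @ V, weights

# Toy input: a sequence of 3 tokens, embedding dimension 4
rng = np.random.default_rng(42)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
output, weights = scaled_dot_product_attention(Q, K, V)
print(output.shape)          # (3, 4): one context vector per token
print(weights.sum(axis=-1))  # each row of attention weights sums to 1
```

Each output row is a weighted mixture of the value vectors, with the weights determined by how well that token’s query matches every key; the course’s transformer section unpacks this step by step.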

Prerequisites

  • Python (version 3.8 or later), Hugging Face Transformers library, and a code editor or IDE (e.g., VS Code or Jupyter Notebook)
  • Basic understanding of Python programming is required.
  • No prior experience with natural language processing or machine learning is necessary.

Curriculum

46 lessons, 19 exercises, 1 exam
  • 1. Introduction to Large Language Models
    16 min
We’ll begin our journey with an introduction to large language models. We’ll study the world of LLMs, their applications, training processes, and what datasets they’ve been trained on.
    Introduction to the course Free
    Course Materials and Notebooks Free
    What are LLMs? Free
    How large is an LLM? Free
    General purpose models Free
    Pre-training and fine tuning Free
    Exercise Free
    What can LLMs be used for? Free
  • 2. The Transformer Architecture
    24 min
In this segment of the LLM course, we’ll break down the transformer architecture and explain the mechanics behind encoders and decoders, embeddings, multi-headed attention, and the significance of the feed-forward layer. You’ll learn the advantages of transformers over RNNs.
    Deep learning recap Free
    Exercise
    The problem with RNNs Free
    The solution: attention is all you need Free
    Exercise
    The transformer architecture
    Exercise
    Input embeddings
    Exercise
    Multi-headed attention
    Exercise
    Feed-forward layer
Masked multi-head attention
    Exercise
    Predicting the final outputs
  • 3. Getting started with GPT models
    33 min
We’ll examine GPT models closely and begin the practical part of the LLM tutorial. We’ll connect to OpenAI’s API and implement a simple chatbot with a personality: a poetic chatbot. I’ll also show you how to use LangChain to work with your own custom data, feeding information from the 365 Data Science web pages to our model.
    What does GPT mean?
    The development of ChatGPT
Important Update
    OpenAI API
    Generating text
    Customizing GPT Output
Keyword text summarization
    Exercise
    Coding a simple chatbot
Introduction to LangChain in Python
LangChain
    Exercise
    Adding custom data to our chatbot
  • 4. Hugging Face Transformers
    27 min
The Hugging Face package is an open-source library that offers an alternative way to interact with LLMs. We’ll learn about pre-trained and customized tokenizers and how to integrate Hugging Face into PyTorch and TensorFlow deep learning workflows.
    Hugging Face package
    Exercise
    The transformer pipeline
    Pre-trained tokenizers
    Special tokens
    Exercise
    Hugging Face and PyTorch, TensorFlow
    Saving and loading models
  • 5. Question and answer models with BERT
    32 min
This section of our Intro to Large Language Models course will explore BERT's architecture and contrast it with GPT models. It will delve into the workings of question-answering systems, both theoretically and practically, and examine variations of BERT, including the optimized RoBERTa and the smaller, lightweight DistilBERT.
    GPT vs BERT
    Exercise
    BERT architecture
    Exercise
    Loading the model and tokenizer
    BERT embeddings
    Calculating the response
    Creating a QA bot
    BERT, RoBERTa, DistilBERT
    Exercise
  • 6. Text classification with XLNet
    26 min
In the final Intro to Large Language Models course section, we’ll look under the hood of XLNet, a novel LLM that trains on permutations of the token order. We’ll also compare XLNet with our previously discussed models, BERT and GPT.
GPT vs BERT vs XLNet
    Exercise
    A note on the following lecture
    Preprocessing our data
    XLNet Embeddings
    Fine tuning XLNet
    Evaluating our model
  • 7. Course exam
30 min
    Course exam
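The tokenizer lessons in section 4 deal with pre-trained tokenizers and special tokens. The toy below only illustrates the shape of that interface: it splits on whitespace and wraps every sequence in [CLS] and [SEP] markers, the way BERT-style tokenizers do. The class name, vocabulary, and token ids are invented for this sketch; real Hugging Face tokenizers use learned subword vocabularies.

```python
# Reserved ids for special tokens, mimicking BERT-style conventions
SPECIAL_TOKENS = {"[CLS]": 0, "[SEP]": 1, "[UNK]": 2}

class ToyTokenizer:
    """A whitespace tokenizer that mimics the encode() interface of
    real subword tokenizers, including special and unknown tokens."""

    def __init__(self, vocab_words):
        self.vocab = dict(SPECIAL_TOKENS)
        # Real words get ids starting after the special tokens
        for i, word in enumerate(vocab_words, start=len(SPECIAL_TOKENS)):
            self.vocab[word] = i

    def encode(self, text):
        """Lowercase, split on whitespace, wrap in [CLS] ... [SEP]."""
        ids = [self.vocab["[CLS]"]]
        for word in text.lower().split():
            ids.append(self.vocab.get(word, self.vocab["[UNK]"]))
        ids.append(self.vocab["[SEP]"])
        return ids

tok = ToyTokenizer(["large", "language", "models"])
print(tok.encode("Large language models rock"))  # [0, 3, 4, 5, 2, 1]
```

The out-of-vocabulary word "rock" maps to [UNK], and the sequence is bracketed by the [CLS] and [SEP] ids, which is the same behavior you will see (at subword granularity) from the pre-trained tokenizers covered in the course.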

Free lessons

Introduction to the course

1.1 Introduction to the course

2 min

Course Materials and Notebooks

1.2 Course Materials and Notebooks

1 min

What are LLMs?

1.3 What are LLMs?

3 min

How large is an LLM?

1.4 How large is an LLM?

3 min

General purpose models

1.5 General purpose models

1 min

Pre-training and fine tuning

1.6 Pre-training and fine tuning

3 min

Start for free

96% of our students recommend 365 Data Science.

4.8 average rating, based on 863 reviews. #1 most reviewed AI and data learning platform on Trustpilot.

94% of AI and data science graduates successfully change or advance their careers.

ACCREDITED certificates

Craft a resume and LinkedIn profile you’re proud of—featuring certificates recognized by leading global institutions.

Earn CPE-accredited credentials that showcase your dedication, growth, and essential skills—the qualities employers value most.

  • Institute of Analytics
  • The Association of Data Scientists
  • E-Learning Quality Network
  • European Agency for Higher Education and Accreditation
  • Global Association of Online Trainers and Examiners

Certificates are included with the Self-study learning plan.

A LinkedIn profile mockup on a mobile screen showing Parker Maxwell, a Certified Data Analyst, with credentials from 365 Data Science listed under Licenses & Certification. A 365 Data Science Certificate of Achievement awarded to Parker Maxwell for completing the Data Analyst career track, featuring accreditation badges and a gold “Verified Certificate” seal.

How it WORKS

  • Lessons
  • Exercises
  • Projects
  • Practice exams
  • AI mock interviews

Lessons

Learn through short, simple lessons—no prior experience in AI or data science needed.

Try for free

Exercises

Reinforce your learning with mini recaps, hands-on coding, flashcards, fill-in-the-blank activities, and other engaging exercises.

Try for free

Projects

Tackle real-world AI and data science projects—just like those faced by industry professionals every day.

Try for free

Practice exams

Track your progress and solidify your knowledge with regular practice exams.

Try for free

AI mock interviews

Prep for interviews with real-world tasks, popular questions, and real-time feedback.

Try for free

Student REVIEWS

A collage of student testimonials from 365 Data Science learners, featuring profile photos, names, job titles, and quotes or video play icons, showcasing diverse backgrounds and successful career transitions into AI and data science roles.