Intro to LLMs

with Lauren Newbould
4.7/5
(274)

Start your AI Engineer career journey: Master Transformer Architecture and the Essentials of Modern AI

3 hours of content · 3,569 students

$99.00

Lifetime access

Buy now
14-Day Money-Back Guarantee

What you get:

  • 3 hours of content
  • 19 Interactive exercises
  • 3 Downloadable resources
  • World-class instructor
  • Closed captions
  • Q&A support
  • Future course updates
  • Course exam
  • Certificate of achievement

What You Learn

  • Create your own AI-driven applications tailored to solve specific business problems
  • Boost your career prospects by mastering AI Engineering skills
  • Gain a solid understanding of key LLM concepts such as attention and self-attention, crucial for building intuitive AI systems
  • Learn how to integrate OpenAI’s API and create a bridge between your products and powerful AI foundation models
  • Get an introduction to LangChain, the framework that streamlines the creation of AI-driven apps
  • Explore Hugging Face to access the cutting-edge AI Engineering tools it offers

Top Choice of Leading Companies Worldwide

Industry leaders and professionals globally rely on this top-rated course to enhance their skills.

Course Description

In recent years, large language models (LLMs) have dominated the tech news for their incredible ability to write poetry, essays, social media content, code, and more. They’re the hot new topic in natural language processing. This Intro to Large Language Models course teaches you the knowledge and skills required to experiment with and create your own language model solutions. Through a combination of video lessons and practical coding exercises, we’ll cover what these language models are, how they function, and ways to implement them in your own projects. Whether you want to generate content, create a chatbot, or train these models on your own custom data and NLP tasks, this course equips you with the fundamental tools and concepts to fine-tune LLMs and tackle these challenges.

Learn for Free

  • 1.1 Introduction to the course (2 min)
  • 1.2 Course Materials and Notebooks (1 min)
  • 1.3 What are LLMs? (3 min)
  • 1.4 How large is an LLM? (3 min)
  • 1.5 General purpose models (1 min)
  • 1.6 Pre-training and fine tuning (3 min)

Curriculum

  • 1. Introduction to Large Language Models
    7 Lessons 16 Min

    We’ll begin our journey with an introduction to large language models. We’ll study the world of LLMs, their applications, their training processes, and the datasets they’ve been trained on.

    Introduction to the course
    2 min
    Course Materials and Notebooks
    1 min
    What are LLMs?
    3 min
    How large is an LLM?
    3 min
    General purpose models
    1 min
    Pre-training and fine tuning
    3 min
    What can LLMs be used for?
    3 min
  • 2. The Transformer Architecture
    9 Lessons 24 Min

    In this segment of the LLM course, we’ll break down the transformer architecture and explain the mechanics behind encoders and decoders, embeddings, multi-headed attention, and the feed-forward layer. You’ll also learn the advantages of transformers over RNNs. (A minimal attention code sketch appears after the curriculum.)

    Deep learning recap
    3 min
    The problem with RNNs
    4 min
    The solution: attention is all you need
    3 min
    The transformer architecture
    1 min
    Input embeddings
    3 min
    Multi-headed attention
    4 min
    Feed-forward layer
    3 min
    Masked multi-headed attention
    1 min
    Predicting the final outputs
    2 min
  • 3. Getting started with GPT models
    10 Lessons 31 Min

    We’ll examine GPT models closely and begin the practical part of the LLM tutorial. We’ll connect to OpenAI’s API and implement a simple chatbot with a personality: a poetic chatbot. I’ll also show you how to use LangChain to work with your own custom data, feeding information from the 365 web pages to our model. (A short API sketch appears after the curriculum.)

    What does GPT mean?
    1 min
    The development of ChatGPT
    2 min
    OpenAI API
    3 min
    Generating text
    2 min
    Customizing GPT Output
    4 min
    Keyword text summarization
    4 min
    Coding a simple chatbot
    6 min
    Introduction to LangChain in Python
    1 min
    LangChain
    3 min
    Adding custom data to our chatbot
    5 min
  • 4. Hugging Face Transformers
    6 Lessons 27 Min

    The Hugging Face transformers package is an open-source library that gives us an alternative way to interact with LLMs. We’ll learn about pre-trained and customized tokenizers and how to integrate Hugging Face into PyTorch and TensorFlow deep learning workflows. (A short pipeline-and-tokenizer sketch appears after the curriculum.)

    Hugging Face package
    3 min
    The transformer pipeline
    6 min
    Pre-trained tokenizers
    9 min
    Special tokens
    3 min
    Hugging Face and PyTorch, TensorFlow
    5 min
    Saving and loading models
    1 min
  • 5. Question and answer models with BERT
    7 Lessons 32 Min

    This section of our Intro to Large Language Models course explores BERT's architecture and contrasts it with GPT models. We’ll delve into the workings of question-answering systems, both theoretically and practically, and examine variations of BERT, including the optimized RoBERTa and the smaller, lighter DistilBERT. (A condensed question-answering sketch appears after the curriculum.)

    GPT vs BERT
    3 min
    BERT architecture
    5 min
    Loading the model and tokenizer
    2 min
    BERT embeddings
    4 min
    Calculating the response
    6 min
    Creating a QA bot
    9 min
    BERT, RoBERTa, DistilBERT
    3 min
  • 6. Text classification with XLNet
    5 Lessons 25 Min

    In the final Intro to Large Language Models course section, we’ll look under the hood of XLNet, a novel LLM trained with permutation-based language modeling, which considers different orderings of the input tokens during training. We’ll also compare XLNet with our previously discussed models, BERT and GPT. (A minimal fine-tuning sketch appears after the curriculum.)

    GPT vs BERT vs XLNET
    4 min
    Preprocessing our data
    10 min
    XLNet Embeddings
    4 min
    Fine tuning XLNet
    4 min
    Evaluating our model
    3 min
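
To make the attention mechanics from Section 2 a little more concrete, here is a minimal sketch of scaled dot-product attention (a single head) in plain NumPy. The shapes and random inputs are toy values for illustration and are not taken from the course materials.

```python
# A minimal sketch of scaled dot-product attention (single head) in NumPy.
# Toy shapes and random inputs, for illustration only.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: arrays of shape (seq_len, d_k)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # how much each position attends to every other
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability for the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the key positions
    return weights @ V                              # weighted sum of the value vectors

# Toy example: 4 tokens with 8-dimensional embeddings.
# (In a real transformer, Q, K, and V come from learned linear projections of the input.)
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(x, x, x).shape)  # (4, 8)
```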
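
For Section 3, the sketch below shows the general shape of a "poetic chatbot" request against OpenAI's API, assuming the openai Python SDK v1+. The model name, prompts, and temperature are illustrative placeholders; the course may use a different SDK version or model.

```python
# A hedged sketch of a "poetic chatbot" request with the openai SDK (v1+).
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        # The system message is what gives the chatbot its "personality"
        {"role": "system", "content": "You are a poetic assistant who replies in rhyming couplets."},
        {"role": "user", "content": "Explain what a large language model is."},
    ],
    temperature=0.7,  # higher values make the output more varied and creative
)

print(response.choices[0].message.content)
```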
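
For Section 4, this is a short sketch of the two Hugging Face entry points the module covers: the high-level pipeline and a pre-trained tokenizer. The checkpoints are common public models chosen for illustration, not necessarily the ones used in the course.

```python
# A short sketch of the Hugging Face pipeline and a pre-trained tokenizer.
from transformers import AutoTokenizer, pipeline

# High-level entry point: a ready-made sentiment-analysis pipeline
classifier = pipeline("sentiment-analysis")
print(classifier("Large language models are surprisingly capable."))

# Lower-level entry point: inspect what a pre-trained tokenizer produces
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoded = tokenizer("Hello, transformers!")
print(encoded["input_ids"])                                   # token IDs, with special tokens added
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))  # e.g. ['[CLS]', 'hello', ',', ...]
```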
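
For Section 5, here is a condensed sketch of an extractive question-answering bot built on the Hugging Face question-answering pipeline with a DistilBERT checkpoint fine-tuned on SQuAD (an illustrative choice). The course builds its QA bot step by step from the model and tokenizer; this version only illustrates the end result.

```python
# A hedged sketch of an extractive QA bot using the question-answering pipeline.
from transformers import pipeline

qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

context = (
    "BERT is an encoder-only transformer released by Google in 2018. "
    "Smaller variants such as DistilBERT trade a little accuracy for speed."
)

result = qa(question="What kind of transformer is BERT?", context=context)
print(result["answer"], "(score:", round(result["score"], 3), ")")
```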
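
For Section 6, the sketch below shows a minimal XLNet text-classification setup with Hugging Face transformers and PyTorch. The two-example "dataset" and single training step are purely illustrative stand-ins for the preprocessing, fine-tuning, and evaluation steps the module walks through.

```python
# A minimal, hedged sketch of fine-tuning XLNet for binary text classification.
# The toy texts, labels, and single optimization step are illustrative only.
import torch
from transformers import XLNetForSequenceClassification, XLNetTokenizer

tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")
model = XLNetForSequenceClassification.from_pretrained("xlnet-base-cased", num_labels=2)

texts = ["I loved this course!", "This was a waste of time."]
labels = torch.tensor([1, 0])  # 1 = positive, 0 = negative

# Tokenize, pad, and return PyTorch tensors
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
optimizer.zero_grad()
outputs = model(**batch, labels=labels)  # forward pass returns loss and logits
outputs.loss.backward()                  # one gradient step on the toy batch
optimizer.step()

print("loss:", float(outputs.loss))
```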

Topics

AI, Transformers, LLMs, Attention, GPT, LangChain, BERT, Hugging Face, XLNet, Natural Language Processing, Python

Tools & Technologies

Python
LangChain

Course Requirements

  • Taking the Intro to Python course first is highly recommended

Who Should Take This Course?

Level of difficulty: Intermediate

  • Aspiring data analysts, data scientists, data engineers, machine learning engineers, and AI engineers

Exams and Certification

A 365 Data Science Course Certificate is an excellent addition to your LinkedIn profile—demonstrating your expertise and willingness to go the extra mile to accomplish your goals.


Meet Your Instructor

Lauren Newbould

Data Scientist at

2 Courses

676 Reviews

8172 Students

Guided by a comprehensive background in social science and statistics, Lauren has built a data science career spanning several pivotal roles, from creating custom NLP solutions for non-profits in Nepal to providing insights for BBC Sport and the 2020 Olympics. She has spoken at several conferences on how NLP can benefit those in developing countries and advocates for ethical and open data science. She aims to empower individuals and organizations to make confident, data-driven decisions and to ensure AI is fair and accessible to all.

365 Data Science Is Featured at

Our top-rated courses are trusted by businesses worldwide.
