<?xml version="1.0" encoding="utf-8" ?>
<!DOCTYPE FL_Course SYSTEM "https://www.flane.de/dtd/fl_course095.dtd"><?xml-stylesheet type="text/xsl" href="https://portal.flane.ch/css/xml-course.xsl"?><course productid="34464" language="fr" source="https://portal.flane.ch/swisscom/fr/xml-course/nvidia-bnlpa" lastchanged="2025-07-29T12:18:27+02:00" parent="https://portal.flane.ch/swisscom/fr/xml-courses"><title>Building Transformer-Based Natural Language Processing Applications</title><productcode>BNLPA</productcode><vendorcode>NV</vendorcode><vendorname>Nvidia</vendorname><fullproductcode>NV-BNLPA</fullproductcode><version>1.0</version><objective>&lt;ul&gt;
&lt;li&gt;How transformers are used as the basic building blocks of modern LLMs for NLP applications&lt;/li&gt;&lt;li&gt;How self-supervision improves upon the transformer architecture in BERT, Megatron, and other LLM variants for superior NLP results&lt;/li&gt;&lt;li&gt;How to leverage pretrained modern LLMs to solve multiple NLP tasks such as text classification, named-entity recognition (NER), and question answering&lt;/li&gt;&lt;li&gt;How to manage inference challenges and deploy refined models for live applications&lt;/li&gt;&lt;/ul&gt;</objective><essentials>&lt;ul&gt;
&lt;li&gt;Experience with Python coding and use of library functions and parameters&lt;/li&gt;&lt;li&gt;Fundamental understanding of a deep learning framework such as TensorFlow, PyTorch, or Keras&lt;/li&gt;&lt;li&gt;Basic understanding of neural networks&lt;/li&gt;&lt;/ul&gt;</essentials><contents>&lt;h5&gt;Introduction&lt;/h5&gt;&lt;ul&gt;
&lt;li&gt;Meet the instructor.&lt;/li&gt;&lt;li&gt;Create an account at courses.nvidia.com/join&lt;/li&gt;&lt;/ul&gt;&lt;h5&gt;Introduction to Transformers&lt;/h5&gt;&lt;p&gt;Explore how the transformer architecture works in detail:&lt;/p&gt;&lt;ul&gt;
&lt;li&gt;Build the transformer architecture in PyTorch.&lt;/li&gt;&lt;li&gt;Calculate the self-attention matrix.&lt;/li&gt;&lt;li&gt;Translate English to German with a pretrained transformer model.&lt;/li&gt;&lt;/ul&gt;&lt;h5&gt;Self-Supervision, BERT, and Beyond&lt;/h5&gt;&lt;p&gt;Learn how to apply self-supervised transformer-based models to concrete NLP tasks using NVIDIA NeMo:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Build a text classification project to classify abstracts.&lt;/li&gt;&lt;li&gt;Build an NER project to identify disease names in text.&lt;/li&gt;&lt;li&gt;Improve project accuracy with domain-specific models.&lt;/li&gt;&lt;/ul&gt;&lt;h5&gt;Inference and Deployment for NLP&lt;/h5&gt;&lt;p&gt;Learn how to deploy an NLP project for live inference on NVIDIA Triton:&lt;/p&gt;&lt;ul&gt;
&lt;li&gt;Prepare the model for deployment.&lt;/li&gt;&lt;li&gt;Optimize the model with NVIDIA&amp;reg; TensorRT&amp;trade;.&lt;/li&gt;&lt;li&gt;Deploy the model and test it.&lt;/li&gt;&lt;/ul&gt;&lt;h5&gt;Final Review&lt;/h5&gt;&lt;ul&gt;
&lt;li&gt;Review key learnings and answer questions.&lt;/li&gt;&lt;li&gt;Complete the assessment and earn a certificate.&lt;/li&gt;&lt;li&gt;Take the workshop survey.&lt;/li&gt;&lt;li&gt;Learn how to set up your own environment and discuss additional resources and training.&lt;/li&gt;&lt;/ul&gt;</contents><objective_plain>- How transformers are used as the basic building blocks of modern LLMs for NLP applications
- How self-supervision improves upon the transformer architecture in BERT, Megatron, and other LLM variants for superior NLP results
- How to leverage pretrained modern LLMs to solve multiple NLP tasks such as text classification, named-entity recognition (NER), and question answering
- How to manage inference challenges and deploy refined models for live applications</objective_plain><essentials_plain>- Experience with Python coding and use of library functions and parameters
- Fundamental understanding of a deep learning framework such as TensorFlow, PyTorch, or Keras
- Basic understanding of neural networks</essentials_plain><contents_plain>Introduction


- Meet the instructor.
- Create an account at courses.nvidia.com/join
Introduction to Transformers


Explore how the transformer architecture works in detail:

- Build the transformer architecture in PyTorch.
- Calculate the self-attention matrix.
- Translate English to German with a pretrained transformer model.
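The self-attention calculation named in the module above can be sketched as follows (a minimal single-head NumPy illustration, not course material; the matrix shapes and random weights are assumptions for demonstration only):

```python
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention for one head."""
    # Project the token embeddings to queries, keys, and values
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = Q.shape[-1]
    # Pairwise attention scores, scaled by sqrt of the key dimension
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable row-wise softmax -> attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weight-averaged mix of the value vectors
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                      # 4 tokens, model dim 8
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, W_q, W_k, W_v)           # shape (4, 8)
```

In a full transformer layer this is repeated across multiple heads and followed by an output projection, residual connection, and layer normalization.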
Self-Supervision, BERT, and Beyond

Learn how to apply self-supervised transformer-based models to concrete NLP tasks using NVIDIA NeMo:


- Build a text classification project to classify abstracts.
- Build an NER project to identify disease names in text.
- Improve project accuracy with domain-specific models.
Inference and Deployment for NLP


Learn how to deploy an NLP project for live inference on NVIDIA Triton:

- Prepare the model for deployment.
- Optimize the model with NVIDIA® TensorRT™.
- Deploy the model and test it.
Final Review


- Review key learnings and answer questions.
- Complete the assessment and earn a certificate.
- Take the workshop survey.
- Learn how to set up your own environment and discuss additional resources and training.</contents_plain><duration unit="d" days="1">1 day</duration><pricelist><price country="DE" currency="EUR">995.00</price><price country="AT" currency="EUR">995.00</price><price country="US" currency="USD">500.00</price><price country="IT" currency="EUR">995.00</price><price country="SI" currency="EUR">995.00</price><price country="GB" currency="GBP">420.00</price><price country="CA" currency="CAD">690.00</price><price country="CH" currency="CHF">995.00</price></pricelist><miles/></course>