<?xml version="1.0" encoding="utf-8" ?>
<!DOCTYPE FL_Course SYSTEM "https://www.flane.de/dtd/fl_course095.dtd"><?xml-stylesheet type="text/xsl" href="https://portal.flane.ch/css/xml-course.xsl"?><course productid="25934" language="en" source="https://portal.flane.ch/swisscom/en/xml-course/splunk-isdsp" lastchanged="2025-07-29T12:18:09+02:00" parent="https://portal.flane.ch/swisscom/en/xml-courses"><title>Implementing Splunk Data Stream Processor (DSP)</title><productcode>ISDSP</productcode><vendorcode>SP</vendorcode><vendorname>Splunk</vendorname><fullproductcode>SP-ISDSP</fullproductcode><version>1.2</version><objective>&lt;ul&gt;
&lt;li&gt;Introduction to Splunk Data Stream Processor&lt;/li&gt;&lt;li&gt;Deploying a DSP cluster&lt;/li&gt;&lt;li&gt;Prepping Sources and Sinks&lt;/li&gt;&lt;li&gt;Building Pipelines - Basics&lt;/li&gt;&lt;li&gt;Building Pipelines - Deep Dive&lt;/li&gt;&lt;li&gt;Working with 3rd party Data Feeds&lt;/li&gt;&lt;li&gt;Working with Metric Data&lt;/li&gt;&lt;li&gt;Monitoring DSP Environment&lt;/li&gt;&lt;/ul&gt;</objective><essentials>&lt;p&gt;&lt;strong&gt;Required:&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span class=&quot;cms-link-marked&quot;&gt;&lt;a class=&quot;fl-href-prod&quot; href=&quot;/swisscom/en/course/splunk-sesa&quot;&gt;&lt;svg role=&quot;img&quot; aria-hidden=&quot;true&quot; focusable=&quot;false&quot; data-nosnippet class=&quot;cms-linkmark&quot;&gt;&lt;use xlink:href=&quot;/css/img/icnset-linkmarks.svg#linkmark&quot;&gt;&lt;/use&gt;&lt;/svg&gt;Splunk Enterprise System Administration &lt;span class=&quot;fl-prod-pcode&quot;&gt;(SESA)&lt;/span&gt;&lt;/a&gt;&lt;/span&gt;&lt;/li&gt;&lt;li&gt;&lt;span class=&quot;cms-link-marked&quot;&gt;&lt;a class=&quot;fl-href-prod&quot; href=&quot;/swisscom/en/course/splunk-seda&quot;&gt;&lt;svg role=&quot;img&quot; aria-hidden=&quot;true&quot; focusable=&quot;false&quot; data-nosnippet class=&quot;cms-linkmark&quot;&gt;&lt;use xlink:href=&quot;/css/img/icnset-linkmarks.svg#linkmark&quot;&gt;&lt;/use&gt;&lt;/svg&gt;Splunk Enterprise Data Administration &lt;span class=&quot;fl-prod-pcode&quot;&gt;(SEDA)&lt;/span&gt;&lt;/a&gt;&lt;/span&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p&gt;&lt;strong&gt;Recommended:&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span class=&quot;cms-link-marked&quot;&gt;&lt;a class=&quot;fl-href-prod&quot; href=&quot;/swisscom/en/course/splunk-ased&quot;&gt;&lt;svg role=&quot;img&quot; aria-hidden=&quot;true&quot; focusable=&quot;false&quot; data-nosnippet class=&quot;cms-linkmark&quot;&gt;&lt;use xlink:href=&quot;/css/img/icnset-linkmarks.svg#linkmark&quot;&gt;&lt;/use&gt;&lt;/svg&gt;Architecting Splunk Enterprise Deployments &lt;span class=&quot;fl-prod-pcode&quot;&gt;(ASED)&lt;/span&gt;&lt;/a&gt;&lt;/span&gt;&lt;/li&gt;&lt;li&gt;Working knowledge of:&lt;ul&gt;
&lt;li&gt;Distributed system architectures&lt;/li&gt;&lt;li&gt;Apache Kafka (user level)&lt;/li&gt;&lt;li&gt;Apache Flink (user level)&lt;/li&gt;&lt;li&gt;Kubernetes (admin level)&lt;/li&gt;&lt;/ul&gt;&lt;/li&gt;&lt;/ul&gt;</essentials><audience>&lt;p&gt;This 4-day module is designed for experienced Splunk administrators who are new to Splunk DSP.&lt;/p&gt;</audience><contents>&lt;p&gt;This hands-on module provides the fundamentals of deploying a Splunk DSP cluster and designing pipelines for core use cases. It covers installation, source and sink configurations, pipeline design and backup, and monitoring a DSP environment.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Please note that this class may run across four days, in 4.5-hour sessions each day, and contains 18 hours of content in total.&lt;/strong&gt;&lt;/p&gt;</contents><outline>&lt;p&gt;&lt;strong&gt;Topic 1 &amp;ndash; Introduction to DSP&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Review Splunk deployment options and challenges&lt;/li&gt;&lt;li&gt;Describe the purpose and value of Splunk DSP&lt;/li&gt;&lt;li&gt;Understand DSP concepts and terminologies&lt;/li&gt;&lt;/ul&gt;&lt;p&gt;&lt;strong&gt;Topic 2 &amp;ndash; Deploying a DSP Cluster&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;List DSP core components and system requirements&lt;/li&gt;&lt;li&gt;Describe installation options and steps&lt;/li&gt;&lt;li&gt;Check DSP service status&lt;/li&gt;&lt;li&gt;Learn to navigate the DSP UI&lt;/li&gt;&lt;li&gt;Use scloud&lt;/li&gt;&lt;/ul&gt;&lt;p&gt;&lt;strong&gt;Topic 3 &amp;ndash; Prepping Sources and Sinks&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Ingest data with DSP REST API service&lt;/li&gt;&lt;li&gt;Configure DSP source connections for Splunk data&lt;/li&gt;&lt;li&gt;Configure DSP sink connections for Splunk indexers&lt;/li&gt;&lt;li&gt;Create Splunk-to-Splunk pass-through pipelines&lt;/li&gt;&lt;/ul&gt;&lt;p&gt;&lt;strong&gt;Topic 4 &amp;ndash; Building Pipelines - Basics&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Describe the basic elements of a DSP pipeline&lt;/li&gt;&lt;li&gt;Create data pipelines with the DSP canvas and SPL2&lt;/li&gt;&lt;li&gt;List DSP pipeline commands&lt;/li&gt;&lt;li&gt;Use scalar functions to convert data types and schema&lt;/li&gt;&lt;li&gt;Filter and route data to multiple sinks&lt;/li&gt;&lt;/ul&gt;&lt;p&gt;&lt;strong&gt;Topic 5 &amp;ndash; Building Pipelines - Deep Dive&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Manipulate pipeline options:&lt;ul&gt;
&lt;li&gt;Extract&lt;/li&gt;&lt;li&gt;Transform&lt;/li&gt;&lt;li&gt;Obfuscate&lt;/li&gt;&lt;li&gt;Aggregate and conditional trigger&lt;/li&gt;&lt;/ul&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p&gt;&lt;strong&gt;Topic 6 &amp;ndash; Working with 3rd party Data Feeds&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Read data from and write data to pub-sub systems such as Kafka&lt;/li&gt;&lt;li&gt;List sources supported with the collect service&lt;/li&gt;&lt;li&gt;Transform and normalize data from Kafka&lt;/li&gt;&lt;li&gt;Write to S3&lt;/li&gt;&lt;/ul&gt;&lt;p&gt;&lt;strong&gt;Topic 7 &amp;ndash; Working with Metric Data&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Onboard metric data into DSP&lt;/li&gt;&lt;li&gt;Transform metric data for Splunk indexers and SignalFx&lt;/li&gt;&lt;li&gt;Send metric data to Splunk indexers&lt;/li&gt;&lt;li&gt;Send metric data to Splunk SignalFx&lt;/li&gt;&lt;/ul&gt;&lt;p&gt;&lt;strong&gt;Topic 8 &amp;ndash; Monitoring DSP Environment&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Back up DSP pipelines&lt;/li&gt;&lt;li&gt;Monitor DSP environment&lt;/li&gt;&lt;li&gt;Describe steps to isolate DSP service issues&lt;/li&gt;&lt;li&gt;Scale DSP&lt;/li&gt;&lt;li&gt;Replace DSP master node&lt;/li&gt;&lt;li&gt;Upgrade DSP cluster&lt;/li&gt;&lt;/ul&gt;</outline><objective_plain>- Introduction to Splunk Data Stream Processor
- Deploying a DSP cluster
- Prepping Sources and Sinks
- Building Pipelines - Basics
- Building Pipelines - Deep Dive
- Working with 3rd party Data Feeds
- Working with Metric Data
- Monitoring DSP Environment</objective_plain><essentials_plain>Required:


- Splunk Enterprise System Administration (SESA)
- Splunk Enterprise Data Administration (SEDA)
Recommended:


- Architecting Splunk Enterprise Deployments (ASED)
- Working knowledge of:
- Distributed system architectures
- Apache Kafka (user level)
- Apache Flink (user level)
- Kubernetes (admin level)</essentials_plain><audience_plain>This 4-day module is designed for experienced Splunk administrators who are new to Splunk DSP.</audience_plain><contents_plain>This hands-on module provides the fundamentals of deploying a Splunk DSP cluster and designing pipelines for core use cases. It covers installation, source and sink configurations, pipeline design and backup, and monitoring a DSP environment.

Please note that this class may run across four days, in 4.5-hour sessions each day, and contains 18 hours of content in total.</contents_plain><outline_plain>Topic 1 – Introduction to DSP


- Review Splunk deployment options and challenges
- Describe the purpose and value of Splunk DSP
- Understand DSP concepts and terminologies
Topic 2 – Deploying a DSP Cluster


- List DSP core components and system requirements
- Describe installation options and steps
- Check DSP service status
- Learn to navigate the DSP UI
- Use scloud
Topic 3 – Prepping Sources and Sinks


- Ingest data with DSP REST API service
- Configure DSP source connections for Splunk data
- Configure DSP sink connections for Splunk indexers
- Create Splunk-to-Splunk pass-through pipelines
Topic 4 – Building Pipelines - Basics


- Describe the basic elements of a DSP pipeline
- Create data pipelines with the DSP canvas and SPL2
- List DSP pipeline commands
- Use scalar functions to convert data types and schema
- Filter and route data to multiple sinks
Topic 5 – Building Pipelines - Deep Dive


- Manipulate pipeline options:
- Extract
- Transform
- Obfuscate
- Aggregate and conditional trigger
Topic 6 – Working with 3rd party Data Feeds


- Read data from and write data to pub-sub systems such as Kafka
- List sources supported with the collect service
- Transform and normalize data from Kafka
- Write to S3
Topic 7 – Working with Metric Data


- Onboard metric data into DSP
- Transform metric data for Splunk indexers and SignalFx
- Send metric data to Splunk indexers
- Send metric data to Splunk SignalFx
Topic 8 – Monitoring DSP Environment


- Back up DSP pipelines
- Monitor DSP environment
- Describe steps to isolate DSP service issues
- Scale DSP
- Replace DSP master node
- Upgrade DSP cluster</outline_plain><duration unit="d" days="0">18 hours</duration><pricelist><price country="DE" currency="EUR">2000.00</price><price country="CH" currency="CHF">2600.00</price></pricelist><miles><milesvalue country="SI" vendorcurrency="SPC" vendorcurrencyname="Splunk Training Units">200.00</milesvalue><milesvalue country="DE" vendorcurrency="SPC" vendorcurrencyname="Splunk Training Units">200.00</milesvalue><milesvalue country="AT" vendorcurrency="SPC" vendorcurrencyname="Splunk Training Units">200.00</milesvalue><milesvalue country="CH" vendorcurrency="SPC" vendorcurrencyname="Splunk Training Units">200.00</milesvalue></miles></course>