Ebook Preview
Learning Spark Streaming
Best Practices For Scaling And Optimizing Apache Spark
François Garillot
Gerard Maas

Audience: Developers, Architects
Technical level: Introductory
To build analytics tools that deliver faster insights, you need to know how to process data in real time, which means moving from batch processing to stream processing. Fortunately, the Spark in-memory data-processing framework includes an extension devoted to fault-tolerant stream processing: Spark Streaming. If you're familiar with Apache Spark and want to apply it to streaming jobs, this practical book shows you how.
- Understand how Spark Streaming fits in the big picture
- Learn core concepts such as Spark RDDs, Spark Streaming clusters, and the fundamentals of a DStream
- Discover how to create a robust deployment
- Dive into streaming algorithms
- Learn how to tune, measure, and monitor Spark Streaming
Grab your copy
Please enter your information to receive your ebook chapter(s) of Learning Spark Streaming and to sign up for the Lightbend Newsletter. Once you submit the form, the PDF will be emailed to your address.