Spark and Streaming with Matei Zaharia - a podcast by Greatest Hits – Software Engineering Daily
from 2018-02-26T10:00:14
Apache Spark is a system for processing large data sets in parallel. The core abstraction of Spark is the resilient distributed dataset (RDD), a working set of data that sits in memory for fast, iterative processing. Matei Zaharia created Spark with two goals: to provide a composable, high-level set of APIs for performing distributed processing; …
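To make the RDD idea concrete, here is a minimal conceptual sketch in plain Python: an immutable, partitioned dataset where transformations (`map`, `filter`) are recorded lazily and only an action (`collect`, `reduce`) forces evaluation. This is illustrative only, with a hypothetical `MiniRDD` class — it is not the real Spark API.

```python
from functools import reduce as _reduce

class MiniRDD:
    """Toy sketch of the RDD idea: an immutable, partitioned dataset
    with lazy transformations and eager actions. Not the Spark API."""

    def __init__(self, partitions, transforms=None):
        self.partitions = partitions          # list of per-partition lists
        self.transforms = transforms or []    # deferred per-element ops

    def map(self, f):
        # Transformations are lazy: record the op and return a new RDD.
        return MiniRDD(self.partitions, self.transforms + [("map", f)])

    def filter(self, p):
        return MiniRDD(self.partitions, self.transforms + [("filter", p)])

    def _materialize(self, part):
        # Apply the recorded transformation pipeline to one partition.
        out = part
        for kind, f in self.transforms:
            if kind == "map":
                out = [f(x) for x in out]
            else:
                out = [x for x in out if f(x)]
        return out

    def collect(self):
        # Actions force evaluation across all partitions.
        return [x for part in self.partitions for x in self._materialize(part)]

    def reduce(self, f):
        return _reduce(f, self.collect())

# Two partitions, as if the data were spread across two workers.
rdd = MiniRDD([[1, 2, 3], [4, 5, 6]])
squares = rdd.map(lambda x: x * x).filter(lambda x: x % 2 == 0)
print(squares.collect())                    # [4, 16, 36]
print(squares.reduce(lambda a, b: a + b))   # 56
```

In real Spark, the same lazy-pipeline design is what lets the scheduler fuse transformations and recompute lost partitions from lineage instead of replicating data.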