Apache Spark Use Cases & Applications
Knowledge Hut
MAY 2, 2024
Although the majority of Spark use cases rely on HDFS as the underlying storage layer, HDFS is not mandatory: Spark works with a variety of other data sources such as Cassandra, MySQL, and AWS S3. A typical use case is building a data warehouse for batch processing and daily reporting.