Spark - What are Transformations? #spark #python #programming #learning #datascience #jeenu #sql
Mukesh Singh
What are Transformations in Spark?
💎Transformations in Spark are operations that are applied to RDDs (Resilient Distributed Datasets) to create a new RDD.
💎When a transformation is applied to an RDD, it does not compute the result immediately. Instead, it creates a new RDD representing the transformed data but keeps track of the lineage (dependencies) between the original RDD and the transformed RDD.
💎Transformations are lazily evaluated, meaning Spark delays the actual computation until an action is triggered.
💎Examples of transformations include map(), filter(), flatMap(), groupByKey(), reduceByKey(), sortByKey(), etc. (see the sketch right after this list).
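The short PySpark sketch below illustrates these points under a couple of assumptions: a local Spark installation and an illustrative app name "transformations-demo". Each transformation returns a new RDD immediately and only extends the lineage; nothing is computed until the collect() action runs.

# A minimal sketch, assuming a local Spark setup; the app name is illustrative.
from pyspark import SparkContext

sc = SparkContext("local[*]", "transformations-demo")

lines = sc.parallelize(["spark makes big data simple",
                        "transformations are lazy",
                        "actions trigger computation"])

# Transformations: each call returns a new RDD right away, nothing runs yet.
words       = lines.flatMap(lambda line: line.split())      # split lines into words
long_words  = words.filter(lambda w: len(w) > 4)             # keep words longer than 4 chars
word_pairs  = long_words.map(lambda w: (w, 1))               # pair each word with a count of 1
word_counts = word_pairs.reduceByKey(lambda a, b: a + b)     # sum counts per word

# The lineage (dependency chain) is already visible before anything is computed.
print(word_counts.toDebugString().decode())

# Action: collect() forces Spark to evaluate the whole chain of transformations.
print(word_counts.collect())

sc.stop()

Running this prints the RDD lineage first and only then the computed word counts, which makes the lazy-evaluation behaviour easy to observe.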
To learn more, please follow us - 🔊 http://www.sql-datatools.com
To learn more, please visit our YouTube channel at - 🔊 http://www.youtube.com/c/Sql-datatools
To learn more, please visit our Instagram account at - 🔊 https://www.instagram.com/asp.mukesh/
To learn more, please visit our Twitter account at - 🔊 https://twitter.com/macxima