What are the optimization techniques in Spark, or what optimizations have you done during your Spark project?
>>> df = spark.createDataFrame([
...     ('1', 'true'), ('2', 'false'),
...     ('1', 'true'), ('2', 'false'),
...     ('1', 'true'), ('2', 'false'),
...     ('1', 'true'), ('2', 'false'),
...     ('1', 'true'), ('2', 'false'),
... ])
>>> df.rdd.getNumPartitions()
8
# Now performing a groupBy operation
>>> group_df = df.groupBy("_1").count()
>>> group_df.show()
+---+-----+
| _1|count|
+---+-----+
|  1|    5|
|  2|    5|
+---+-----+
>>> group_df.rdd.getNumPartitions()
200
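The jump from 8 to 200 partitions happens because groupBy triggers a shuffle, and Spark SQL uses the spark.sql.shuffle.partitions setting (default 200) for the post-shuffle partition count regardless of data size. For a small dataset like this, 200 mostly-empty partitions waste scheduling overhead. A common optimization is to lower this setting before the shuffle; the sketch below assumes the same SparkSession named spark and is illustrative, not the only fix (on Spark 3.x, adaptive query execution can coalesce shuffle partitions automatically):

# Reduce the shuffle partition count before the groupBy (hypothetical value of 8)
>>> spark.conf.set("spark.sql.shuffle.partitions", "8")
>>> group_df = df.groupBy("_1").count()
>>> group_df.rdd.getNumPartitions()
8
# Alternatively, on Spark 3.x, let AQE pick the partition count at runtime:
>>> spark.conf.set("spark.sql.adaptive.enabled", "true")
>>> spark.conf.set("spark.sql.adaptive.coalescePartitions.enabled", "true")

Choosing this value well matters on real workloads too: too few shuffle partitions cause large tasks and memory pressure, while too many cause tiny tasks and scheduler overhead.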