Hi @Rishi,
Yes, the number of Spark tasks can be greater than the number of executor cores. In that case, the extra tasks simply wait, with their task threads sitting in the TIMED_WAITING state, until a core frees up. Each running task occupies one executor core, and as soon as a core finishes its task, the scheduler automatically assigns the next waiting task to it. You can increase the number of executors, but that depends on the memory available on your cluster.
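Here is a minimal PySpark sketch of how this plays out. The app name, resource values, and partition count below are illustrative assumptions, not recommendations, and assume a cluster manager that honors these settings:

```python
from pyspark.sql import SparkSession

# Hypothetical resource settings: 2 executors x 4 cores = 8 tasks in parallel.
spark = (
    SparkSession.builder
    .appName("task-vs-core-demo")              # illustrative app name
    .config("spark.executor.instances", "2")   # number of executors
    .config("spark.executor.cores", "4")       # cores per executor
    .config("spark.executor.memory", "4g")     # raise executor count only if memory allows
    .getOrCreate()
)

# 100 partitions -> 100 tasks for this stage, but only 8 run at once;
# the remaining 92 wait until a core becomes free.
rdd = spark.sparkContext.parallelize(range(1_000_000), numSlices=100)
print(rdd.map(lambda x: x * 2).count())

spark.stop()
```

You can watch this happen in the Spark UI: the stage shows 100 tasks total, with at most 8 in the RUNNING state at any moment.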
To learn more about PySpark, it's recommended that you join a PySpark Certification course today.