Spark-submit jobs are also launched from the client/edge node, but the execution environment depends on the --master and --deploy-mode parameters: with --deploy-mode cluster, the driver part of the code runs on a node inside the Spark cluster, whereas with --deploy-mode client, the driver runs on the edge node itself.
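For illustration, a minimal sketch of the two invocations against a YARN cluster; the application class (com.example.MyApp) and jar (my-app.jar) are hypothetical placeholders:

```
# Cluster mode: the driver runs on a node inside the cluster.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  my-app.jar

# Client mode: the driver runs on the edge node where spark-submit
# was invoked, so its console output stays local.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.MyApp \
  my-app.jar
```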
1. To monitor a Spark application, use the Spark Web UI, which shows the application's current status as well as its history. A running application serves its UI from the driver (port 4040 by default); finished applications are viewable through the Spark History Server when event logging is enabled (see the configuration sketch after this list).
2. The Spark Web UI shows the jobs running across all worker nodes as a whole; it does not call out a specific worker per job, because the scheduler automatically decides which worker gets assigned which task, and only the master node's log has those assignment details. The UI's Executors tab (or the REST API, sketched below) shows which executors are actually doing the work.
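If application history is needed after a job finishes, event logging has to be enabled so the History Server can replay the UI. A minimal sketch of the relevant spark-defaults.conf entries; the HDFS log directory (hdfs:///spark-logs) is an assumed placeholder:

```
# spark-defaults.conf -- enable event logging for the History Server
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs:///spark-logs
spark.history.fs.logDirectory    hdfs:///spark-logs
```

The History Server itself is started with the script shipped in the Spark distribution:

```
$SPARK_HOME/sbin/start-history-server.sh
```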
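To inspect executor placement without clicking through the UI, Spark's monitoring REST API can be queried. A sketch assuming the driver UI is reachable on the default port 4040; <driver-host> and <app-id> are placeholders:

```
# List the applications known to this UI
curl http://<driver-host>:4040/api/v1/applications

# Per-executor details (host, active tasks, storage) for one application
curl http://<driver-host>:4040/api/v1/applications/<app-id>/executors
```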