You can view all Step logs for a pipeline step in Stackdriver Logging by clicking the Stackdriver link on the right side of the logs pane.
Here is a summary of the types of log files available:
job-message - logs contain job-level messages that various components of Cloud Dataflow generate, such as the autoscaling configuration. Worker-level errors that originate from crashing user code and appear in the worker logs also propagate up to the job-message logs.
worker - logs are produced by Cloud Dataflow workers, which do most of the pipeline work, such as applying ParDos to the data. Worker logs contain messages logged by your code and by Dataflow; see the logging sketch after this list.
worker-startup - logs are present on most Cloud Dataflow jobs and capture messages related to the startup process, including downloading the job's jars from Cloud Storage.
shuffler - logs contain messages from the workers that consolidate the results of parallel pipeline operations.
Other logs include docker and kubelet. They contain messages about these public technologies, which are used on Cloud Dataflow workers.
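To illustrate how your own messages end up in the worker logs, here is a minimal sketch of a DoFn in the Beam Java SDK that logs through SLF4J; the class name LogExampleFn is hypothetical and chosen only for this example.

```java
import org.apache.beam.sdk.transforms.DoFn;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Hypothetical DoFn: messages written through SLF4J from worker code
// show up in the worker logs for the step that runs this transform.
public class LogExampleFn extends DoFn<String, String> {
  private static final Logger LOG = LoggerFactory.getLogger(LogExampleFn.class);

  @ProcessElement
  public void processElement(ProcessContext c) {
    String element = c.element();
    if (element.isEmpty()) {
      // Warnings and errors are easy to filter on by severity in Stackdriver Logging.
      LOG.warn("Skipping empty element");
      return;
    }
    LOG.info("Processing element: {}", element);
    c.output(element);
  }
}
```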