You can create a file directly in HDFS using put; passing - as the source makes it read the file contents from standard input:
hadoop fs -put - /path/to/file/filename
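For example, piping a string into the command creates the file with that content (the target path is just a placeholder):

echo "some data" | hadoop fs -put - /path/to/file/filename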
Or you can use the HdfsCLI Python module:
# The client setup below is an example; adjust the WebHDFS URL
# (e.g. http://namenode:9870) and user to match your cluster.
from hdfs import InsecureClient
client = InsecureClient('http://namenode:9870', user='hdfs')

# Loading a file in memory.
with client.read('features') as reader:
    features = reader.read()

# Directly deserializing a JSON object.
with client.read('model.json', encoding='utf-8') as reader:
    from json import load
    model = load(reader)
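The snippets above read files back out of HDFS; to actually create a file, the same client exposes a write method. A minimal sketch, assuming the client from above and placeholder paths/data:

# Writing a string to a new file in HDFS.
client.write('/path/to/file/filename', data='some data', encoding='utf-8', overwrite=True)

# Streaming a serialized JSON object through a writer.
from json import dump
model = {'example': True}  # placeholder object
with client.write('model.json', encoding='utf-8') as writer:
    dump(model, writer)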
Link to HdfsCLI: https://hdfscli.readthedocs.io/en/latest/quickstart.html#command-line-interface