Azure Data Lake Storage (ADLS) is a popular data storage service from Microsoft. In this post I walk through a simple Python program that writes a file into ADLS Gen2.

The following Python package is required to run this program.

pip install azure-storage-file-datalake

After installing the package, perform the following steps.

  • Login to the Azure Portal
  • Go to Storage Accounts
  • Create or use an existing storage account
  • Create or use an existing container within the storage account
  • Create a directory in the storage container
  • Get the access key from the Access Keys section under the storage account's Settings
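Rather than pasting the access key directly into the script, you can keep it out of source control by reading it from environment variables. A minimal sketch (the variable names here are my own choice, not anything the SDK requires):

```python
import os

def load_adls_credentials():
    # Read the account name and access key from environment variables
    # (ADLS_ACCOUNT_NAME / ADLS_ACCOUNT_KEY are assumed names) so that
    # secrets never appear in the source code itself.
    name = os.environ.get("ADLS_ACCOUNT_NAME", "")
    key = os.environ.get("ADLS_ACCOUNT_KEY", "")
    return name, key
```

Set the two variables in your shell before running the program, then call `load_adls_credentials()` where the hard-coded values appear below.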

With that, all the prerequisites are in place. The full program is given below; update the values for the storage account name, access key, container name, and directory name before running it.

# install the following package
# pip install azure-storage-file-datalake
from azure.storage.filedatalake import DataLakeServiceClient

# Get the below details from your storage account
storage_account_name = ""
storage_account_key = ""
container_name = ""
directory_name = ""

# ADLS Gen2 is reached through the "dfs" endpoint of the storage account
service_client = DataLakeServiceClient(
    account_url="https://{}.dfs.core.windows.net".format(storage_account_name),
    credential=storage_account_key)

file_system_client = service_client.get_file_system_client(file_system=container_name)
dir_client = file_system_client.get_directory_client(directory_name)

data = """
Sample data for testing.
This is a multiline text for testing the ADLS Gen2 file system operations.
"""

# Create the file, append the data, then flush to commit the upload
file_client = dir_client.create_file("sampledata.txt")
file_client.append_data(data, offset=0, length=len(data))
file_client.flush_data(len(data))
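Note that append_data takes an explicit offset and length: for larger payloads you can append several chunks at increasing offsets and commit them with a single flush_data call at the total length. A minimal sketch of how the offsets line up (the helper name is my own, not part of the SDK):

```python
def chunk_offsets(total_size, chunk_size):
    # Yield (offset, length) pairs covering total_size bytes, matching
    # successive append_data(chunk, offset=offset, length=length) calls.
    offset = 0
    while offset < total_size:
        length = min(chunk_size, total_size - offset)
        yield offset, length
        offset += length

print(list(chunk_offsets(10, 4)))  # → [(0, 4), (4, 4), (8, 2)]
```

After appending every chunk, a single `file_client.flush_data(total_size)` commits the whole file.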