DataLakeFileClient class

A DataLakeFileClient represents a URL to an Azure Storage Data Lake file, allowing you to manipulate its contents.

Extends

DataLakePathClient

Constructors

DataLakeFileClient(string, Pipeline)

Creates an instance of DataLakeFileClient from url and pipeline.

DataLakeFileClient(string, StorageSharedKeyCredential | AnonymousCredential | TokenCredential, StoragePipelineOptions)

Creates an instance of DataLakeFileClient from url and credential.

Properties

fileSystemName

Name of current file system.

name

Name of current path (directory or file).

Inherited Properties

accountName
credential

Such as AnonymousCredential, StorageSharedKeyCredential or any credential from the @azure/identity package to authenticate requests to the service. You can also provide an object that implements the TokenCredential interface. If not specified, AnonymousCredential is used.

url

Encoded URL string value.

Methods

append(RequestBodyType, number, number, FileAppendOptions)

Uploads data to be appended to a file. Data can only be appended to a file. To commit previously uploaded data to the file, call flush.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update

create(FileCreateOptions)

Create a file.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

create(PathResourceTypeModel, PathCreateOptions)

Create a file.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

createIfNotExists(FileCreateIfNotExistsOptions)

Create a file if it doesn't already exist.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

createIfNotExists(PathResourceTypeModel, PathCreateIfNotExistsOptions)

Create a file if it doesn't already exist.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

flush(number, FileFlushOptions)

Flushes (writes) previously appended data to a file.

generateSasStringToSign(FileGenerateSasUrlOptions)

Only available for clients constructed with a shared key credential.

Generates string to sign for a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the shared key credential of the client.

See https://docs.microsoft.com/en-us/rest/api/storageservices/constructing-a-service-sas

generateSasUrl(FileGenerateSasUrlOptions)

Only available for clients constructed with a shared key credential.

Generates a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the shared key credential of the client.

See https://docs.microsoft.com/en-us/rest/api/storageservices/constructing-a-service-sas

query(string, FileQueryOptions)

Quick query for a JSON or CSV formatted file.

read(number, number, FileReadOptions)

Downloads a file from the service, including its metadata and properties.

  • In Node.js, data returns in a Readable stream readableStreamBody
  • In browsers, data returns in a promise contentAsBlob

See https://docs.microsoft.com/en-us/rest/api/storageservices/get-blob

readToBuffer(Buffer, number, number, FileReadToBufferOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Reads a Data Lake file in parallel to a buffer. Offset and count are optional; pass 0 for both to read the entire file.

Warning: Buffers can only support files up to about one gigabyte on 32-bit systems or about two gigabytes on 64-bit systems due to limitations of Node.js/V8. For files larger than this size, consider readToFile.

readToBuffer(number, number, FileReadToBufferOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Reads a Data Lake file in parallel to a buffer. Offset and count are optional; pass 0 for both to read the entire file.

Warning: Buffers can only support files up to about one gigabyte on 32-bit systems or about two gigabytes on 64-bit systems due to limitations of Node.js/V8. For files larger than this size, consider readToFile.

readToFile(string, number, number, FileReadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Downloads a Data Lake file to a local file. Fails if the given file path already exists. Offset and count are optional; pass 0 and undefined respectively to download the entire file.

setExpiry(FileExpiryMode, FileSetExpiryOptions)

Sets an expiry time on a file; once that time is reached, the file is deleted.

upload(Blob | ArrayBuffer | ArrayBufferView | Buffer, FileParallelUploadOptions)

Uploads a Buffer (Node.js), Blob, ArrayBuffer, or ArrayBufferView to a Data Lake file.

uploadFile(string, FileParallelUploadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Uploads a local file to a Data Lake file.

uploadStream(Readable, FileParallelUploadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Uploads a Node.js Readable stream into a Data Lake file. This method first attempts to create the file, then uploads it chunk by chunk. Make sure the potential size of the stream doesn't exceed FILE_MAX_SIZE_BYTES and the potential number of chunks doesn't exceed BLOCK_BLOB_MAX_BLOCKS.

PERFORMANCE IMPROVEMENT TIPS:

  • Set the input stream's highWaterMark to the same value as the options.chunkSize parameter; this avoids Buffer.concat() operations.

Inherited Methods

delete(boolean, PathDeleteOptions)

Delete current path (directory or file).

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/delete

deleteIfExists(boolean, PathDeleteOptions)

Delete current path (directory or file) if it exists.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/delete

exists(PathExistsOptions)

Returns true if the Data Lake file represented by this client exists; false otherwise.

NOTE: Use this function with care, since an existing file might be deleted by other clients or applications, and conversely new files might be added by other clients or applications after this function completes.

getAccessControl(PathGetAccessControlOptions)

Returns the access control data for a path (directory or file).

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/getproperties

getDataLakeLeaseClient(string)

Get a DataLakeLeaseClient that manages leases on the path (directory or file).

getProperties(PathGetPropertiesOptions)

Returns all user-defined metadata, standard HTTP properties, and system properties for the path (directory or file).

WARNING: The metadata object returned in the response will have its keys in lowercase, even if they originally contained uppercase characters. This differs from the metadata keys returned by the methods of DataLakeFileSystemClient that list paths using the includeMetadata option, which will retain their original casing.

See https://docs.microsoft.com/en-us/rest/api/storageservices/get-blob-properties

move(string, PathMoveOptions)

Move a directory or file within the same file system.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

move(string, string, PathMoveOptions)

Move a directory or file to another file system.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

removeAccessControlRecursive(RemovePathAccessControlItem[], PathChangeAccessControlRecursiveOptions)

Removes the Access Control on a path and sub paths.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update

setAccessControl(PathAccessControlItem[], PathSetAccessControlOptions)

Set the access control data for a path (directory or file).

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update

setAccessControlRecursive(PathAccessControlItem[], PathChangeAccessControlRecursiveOptions)

Sets the Access Control on a path and sub paths.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update

setHttpHeaders(PathHttpHeaders, PathSetHttpHeadersOptions)

Sets system properties on the path (directory or file).

If no value is provided, or no value is provided for the specified blob HTTP headers, these blob HTTP headers without a value will be cleared.

See https://docs.microsoft.com/en-us/rest/api/storageservices/set-blob-properties

setMetadata(Metadata, PathSetMetadataOptions)

Sets user-defined metadata for the specified path (directory or file) as one or more name-value pairs.

If no option is provided, or no metadata is defined in the parameter, the path metadata will be removed.

See https://docs.microsoft.com/en-us/rest/api/storageservices/set-blob-metadata

setPermissions(PathPermissions, PathSetPermissionsOptions)

Sets the file permissions on a path.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update

toDirectoryClient()

Convert current DataLakePathClient to DataLakeDirectoryClient if current path is a directory.

toFileClient()

Convert current DataLakePathClient to DataLakeFileClient if current path is a file.

updateAccessControlRecursive(PathAccessControlItem[], PathChangeAccessControlRecursiveOptions)

Modifies the Access Control on a path and sub paths.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update

Constructor Details

DataLakeFileClient(string, Pipeline)

Creates an instance of DataLakeFileClient from url and pipeline.

new DataLakeFileClient(url: string, pipeline: Pipeline)

Parameters

url

string

A URL string pointing to an Azure Storage Data Lake file, such as "https://myaccount.dfs.core.windows.net/filesystem/file". You can append a SAS if using AnonymousCredential, such as "https://myaccount.dfs.core.windows.net/filesystem/directory/file?sasString".

pipeline
Pipeline

Call newPipeline() to create a default pipeline, or provide a customized pipeline.
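
For example, a client can be constructed with a default pipeline (a minimal sketch; the account name and key are placeholders, and shared key authentication is available only in Node.js):

import { DataLakeFileClient, StorageSharedKeyCredential, newPipeline } from "@azure/storage-file-datalake";

const sharedKeyCredential = new StorageSharedKeyCredential("myaccount", "myAccountKey");
const pipeline = newPipeline(sharedKeyCredential);
const fileClient = new DataLakeFileClient(
  "https://myaccount.dfs.core.windows.net/filesystem/file",
  pipeline
);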

DataLakeFileClient(string, StorageSharedKeyCredential | AnonymousCredential | TokenCredential, StoragePipelineOptions)

Creates an instance of DataLakeFileClient from url and credential.

new DataLakeFileClient(url: string, credential?: StorageSharedKeyCredential | AnonymousCredential | TokenCredential, options?: StoragePipelineOptions)

Parameters

url

string

A URL string pointing to an Azure Storage Data Lake file, such as "https://myaccount.dfs.core.windows.net/filesystem/file". You can append a SAS if using AnonymousCredential, such as "https://myaccount.dfs.core.windows.net/filesystem/directory/file?sasString".

credential

StorageSharedKeyCredential | AnonymousCredential | TokenCredential

Such as AnonymousCredential, StorageSharedKeyCredential or any credential from the @azure/identity package to authenticate requests to the service. You can also provide an object that implements the TokenCredential interface. If not specified, AnonymousCredential is used.

options
StoragePipelineOptions

Optional. Options to configure the HTTP pipeline.
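
For example, with a token credential from @azure/identity (a sketch; the URL is a placeholder):

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

// DefaultAzureCredential reads credentials from the environment.
const fileClient = new DataLakeFileClient(
  "https://myaccount.dfs.core.windows.net/filesystem/file",
  new DefaultAzureCredential()
);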

Property Details

fileSystemName

Name of current file system.

string fileSystemName

Property Value

string

name

Name of current path (directory or file).

string name

Property Value

string

Inherited Property Details

accountName

accountName: string

Property Value

string

Inherited From DataLakePathClient.accountName

credential

Such as AnonymousCredential, StorageSharedKeyCredential or any credential from the @azure/identity package to authenticate requests to the service. You can also provide an object that implements the TokenCredential interface. If not specified, AnonymousCredential is used.

credential: StorageSharedKeyCredential | AnonymousCredential | TokenCredential

Property Value

Inherited From DataLakePathClient.credential

url

Encoded URL string value.

url: string

Property Value

string

Inherited From DataLakePathClient.url

Method Details

append(RequestBodyType, number, number, FileAppendOptions)

Uploads data to be appended to a file. Data can only be appended to a file. To commit previously uploaded data to the file, call flush.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update
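
Example usage (Node.js). A minimal sketch; assumes fileClient points to a new or empty file:

const content = "Hello world!";

await fileClient.create();
// Stage the data at offset 0; it is not visible in the file yet.
await fileClient.append(content, 0, content.length);
// Commit everything staged so far by flushing at the total length.
await fileClient.flush(content.length);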

function append(body: RequestBodyType, offset: number, length: number, options?: FileAppendOptions): Promise<FileAppendResponse>

Parameters

body
HttpRequestBody

Content to be uploaded.

offset

number

Append offset in bytes.

length

number

Length of content to append in bytes.

options
FileAppendOptions

Optional. Options when appending data.

Returns

Promise<FileAppendResponse>

create(FileCreateOptions)

Create a file.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

function create(options?: FileCreateOptions): Promise<FileCreateResponse>

Parameters

options
FileCreateOptions

Optional. Options when creating file.

Returns

Promise<FileCreateResponse>

create(PathResourceTypeModel, PathCreateOptions)

Create a file.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

function create(resourceType: PathResourceTypeModel, options?: PathCreateOptions): Promise<PathCreateResponse>

Parameters

resourceType
PathResourceTypeModel

Resource type, must be "file" for DataLakeFileClient.

options
PathCreateOptions

Optional. Options when creating file.

Returns

Promise<PathCreateResponse>

createIfNotExists(FileCreateIfNotExistsOptions)

Create a file if it doesn't already exist.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

function createIfNotExists(options?: FileCreateIfNotExistsOptions): Promise<FileCreateIfNotExistsResponse>

Parameters

options
FileCreateIfNotExistsOptions

Optional. Options when creating file.

Returns

Promise<FileCreateIfNotExistsResponse>

createIfNotExists(PathResourceTypeModel, PathCreateIfNotExistsOptions)

Create a file if it doesn't already exist.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

function createIfNotExists(resourceType: PathResourceTypeModel, options?: PathCreateIfNotExistsOptions): Promise<PathCreateIfNotExistsResponse>

Parameters

resourceType
PathResourceTypeModel

Resource type, must be "file" for DataLakeFileClient.

options
PathCreateIfNotExistsOptions

Optional. Options when creating file.

Returns

Promise<PathCreateIfNotExistsResponse>

flush(number, FileFlushOptions)

Flushes (writes) previously appended data to a file.
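
Example usage (a sketch; position must equal the cumulative number of bytes appended):

const part1 = "Hello ";
const part2 = "world!";

// Stage two contiguous chunks, then commit both with a single flush.
await fileClient.append(part1, 0, part1.length);
await fileClient.append(part2, part1.length, part2.length);
await fileClient.flush(part1.length + part2.length);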

function flush(position: number, options?: FileFlushOptions): Promise<FileFlushResponse>

Parameters

position

number

File position to flush. This parameter allows the caller to upload data in parallel and control the order in which it is appended to the file. It is required when uploading data to be appended to the file and when flushing previously uploaded data to the file. The value must be the position where the data is to be appended. Uploaded data is not immediately flushed, or written, to the file. To flush, the previously uploaded data must be contiguous, the position parameter must be specified and equal to the length of the file after all data has been written, and there must not be a request entity body included with the request.

options
FileFlushOptions

Optional. Options when flushing data.

Returns

Promise<FileFlushResponse>

generateSasStringToSign(FileGenerateSasUrlOptions)

Only available for clients constructed with a shared key credential.

Generates string to sign for a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the shared key credential of the client.

See https://docs.microsoft.com/en-us/rest/api/storageservices/constructing-a-service-sas

function generateSasStringToSign(options: FileGenerateSasUrlOptions): string

Parameters

options
FileGenerateSasUrlOptions

Optional parameters.

Returns

string

The string to sign for a Service SAS URI, derived from the client properties and the parameters passed in.

generateSasUrl(FileGenerateSasUrlOptions)

Only available for clients constructed with a shared key credential.

Generates a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the shared key credential of the client.

See https://docs.microsoft.com/en-us/rest/api/storageservices/constructing-a-service-sas
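
Example usage (a sketch; assumes the client was constructed with a StorageSharedKeyCredential):

import { DataLakeSASPermissions } from "@azure/storage-file-datalake";

// Generate a read-only SAS URL that expires in one hour.
const sasUrl = await fileClient.generateSasUrl({
  permissions: DataLakeSASPermissions.parse("r"),
  expiresOn: new Date(Date.now() + 60 * 60 * 1000),
});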

function generateSasUrl(options: FileGenerateSasUrlOptions): Promise<string>

Parameters

options
FileGenerateSasUrlOptions

Optional parameters.

Returns

Promise<string>

The SAS URI consisting of the URI to the resource represented by this client, followed by the generated SAS token.

query(string, FileQueryOptions)

Quick query for a JSON or CSV formatted file.

Example usage (Node.js):

// Query and convert a file to a string
const queryResponse = await fileClient.query("select * from BlobStorage");
const downloaded = (await streamToBuffer(queryResponse.readableStreamBody)).toString();
console.log("Query file content:", downloaded);

async function streamToBuffer(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => {
      resolve(Buffer.concat(chunks));
    });
    readableStream.on("error", reject);
  });
}
function query(query: string, options?: FileQueryOptions): Promise<FileReadResponse>

Parameters

query

string

The query expression in SQL.

options
FileQueryOptions

Optional. Options when querying file.

Returns

Promise<FileReadResponse>

read(number, number, FileReadOptions)

Downloads a file from the service, including its metadata and properties.

  • In Node.js, data returns in a Readable stream readableStreamBody
  • In browsers, data returns in a promise contentAsBlob

See https://docs.microsoft.com/en-us/rest/api/storageservices/get-blob

Example usage (Node.js):

// Download and convert a file to a string
const downloadResponse = await fileClient.read();
const downloaded = await streamToBuffer(downloadResponse.readableStreamBody);
console.log("Downloaded file content:", downloaded.toString());

async function streamToBuffer(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => {
      resolve(Buffer.concat(chunks));
    });
    readableStream.on("error", reject);
  });
}

Example usage (browser):

// Download and convert a file to a string
const downloadResponse = await fileClient.read();
const downloaded = await blobToString(await downloadResponse.contentAsBlob);
console.log("Downloaded file content", downloaded);

async function blobToString(blob: Blob): Promise<string> {
  const fileReader = new FileReader();
  return new Promise<string>((resolve, reject) => {
    fileReader.onloadend = (ev: any) => {
      resolve(ev.target!.result);
    };
    fileReader.onerror = reject;
    fileReader.readAsText(blob);
  });
}
function read(offset?: number, count?: number, options?: FileReadOptions): Promise<FileReadResponse>

Parameters

offset

number

Optional. Offset from which to read the file; default value is 0.

count

number

Optional. How many bytes to read; by default it reads from the offset to the end.

options
FileReadOptions

Optional. Options when reading file.

Returns

Promise<FileReadResponse>

readToBuffer(Buffer, number, number, FileReadToBufferOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Reads a Data Lake file in parallel to a buffer. Offset and count are optional; pass 0 for both to read the entire file.

Warning: Buffers can only support files up to about one gigabyte on 32-bit systems or about two gigabytes on 64-bit systems due to limitations of Node.js/V8. For files larger than this size, consider readToFile.
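
Example usage (Node.js; a sketch that reads the first 4 KiB into a pre-allocated buffer):

// The buffer must be at least `count` bytes long.
const buffer = Buffer.alloc(4 * 1024);
await fileClient.readToBuffer(buffer, 0, 4 * 1024);
console.log("First 4 KiB:", buffer.toString());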

function readToBuffer(buffer: Buffer, offset?: number, count?: number, options?: FileReadToBufferOptions): Promise<Buffer>

Parameters

buffer

Buffer

Buffer to fill; it must have a length larger than count.

offset

number

From which position of the Data Lake file to read.

count

number

How much data to read. Reads to the end when passing undefined.

Returns

Promise<Buffer>

readToBuffer(number, number, FileReadToBufferOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Reads a Data Lake file in parallel to a buffer. Offset and count are optional; pass 0 for both to read the entire file.

Warning: Buffers can only support files up to about one gigabyte on 32-bit systems or about two gigabytes on 64-bit systems due to limitations of Node.js/V8. For files larger than this size, consider readToFile.

function readToBuffer(offset?: number, count?: number, options?: FileReadToBufferOptions): Promise<Buffer>

Parameters

offset

number

From which position of the Data Lake file to read (in bytes).

count

number

How much data (in bytes) to read. Reads to the end when passing undefined.

Returns

Promise<Buffer>

readToFile(string, number, number, FileReadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Downloads a Data Lake file to a local file. Fails if the given file path already exists. Offset and count are optional; pass 0 and undefined respectively to download the entire file.
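
Example usage (Node.js; a sketch with a placeholder local path that must not exist yet):

// Pass 0 and undefined to download the entire file.
await fileClient.readToFile("./downloaded.txt", 0, undefined);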

function readToFile(filePath: string, offset?: number, count?: number, options?: FileReadOptions): Promise<FileReadResponse>

Parameters

filePath

string

offset

number

From which position of the file to download.

count

number

How much data to download. Downloads to the end when passing undefined.

options
FileReadOptions

Options to read Data Lake file.

Returns

Promise<FileReadResponse>

The response data for file read operation, but with readableStreamBody set to undefined since its content is already read and written into a local file at the specified path.

setExpiry(FileExpiryMode, FileSetExpiryOptions)

Sets an expiry time on a file; once that time is reached, the file is deleted.
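
Example usage (a sketch that expires the file 24 hours from now):

await fileClient.setExpiry("RelativeToNow", {
  timeToExpireInMs: 24 * 60 * 60 * 1000,
});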

function setExpiry(mode: FileExpiryMode, options?: FileSetExpiryOptions): Promise<FileSetExpiryResponse>

Parameters

mode
FileExpiryMode

options
FileSetExpiryOptions

Optional.

Returns

Promise<FileSetExpiryResponse>

upload(Blob | ArrayBuffer | ArrayBufferView | Buffer, FileParallelUploadOptions)

Uploads a Buffer (Node.js), Blob, ArrayBuffer, or ArrayBufferView to a Data Lake file.
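
Example usage (a sketch; in Node.js the payload is typically a Buffer, in browsers a Blob):

// Creates the file and uploads the content, chunking in parallel for large payloads.
const content = "Hello world!";
await fileClient.upload(Buffer.from(content));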

function upload(data: Blob | ArrayBuffer | ArrayBufferView | Buffer, options?: FileParallelUploadOptions): Promise<FileUploadResponse>

Parameters

data

Blob | ArrayBuffer | ArrayBufferView | Buffer

Buffer (Node.js), Blob, ArrayBuffer, or ArrayBufferView.

options
FileParallelUploadOptions

Returns

Promise<FileUploadResponse>

uploadFile(string, FileParallelUploadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Uploads a local file to a Data Lake file.
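
Example usage (Node.js; a sketch with a placeholder local path; the chunkSize option is optional):

await fileClient.uploadFile("./localfile.txt", {
  chunkSize: 4 * 1024 * 1024, // upload in 4 MiB chunks
});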

function uploadFile(filePath: string, options?: FileParallelUploadOptions): Promise<FileUploadResponse>

Parameters

filePath

string

Full path of the local file.

options
FileParallelUploadOptions

Returns

Promise<FileUploadResponse>

uploadStream(Readable, FileParallelUploadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Uploads a Node.js Readable stream into a Data Lake file. This method first attempts to create the file, then uploads it chunk by chunk. Make sure the potential size of the stream doesn't exceed FILE_MAX_SIZE_BYTES and the potential number of chunks doesn't exceed BLOCK_BLOB_MAX_BLOCKS.

PERFORMANCE IMPROVEMENT TIPS:

  • Set the input stream's highWaterMark to the same value as the options.chunkSize parameter; this avoids Buffer.concat() operations (see the sketch below).
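
Example usage (Node.js; a sketch that follows the highWaterMark tip above; the path is a placeholder):

import { createReadStream } from "fs";

const chunkSize = 4 * 1024 * 1024; // 4 MiB
// Matching highWaterMark to chunkSize avoids Buffer.concat() work in the SDK.
const stream = createReadStream("./localfile.txt", { highWaterMark: chunkSize });
await fileClient.uploadStream(stream, { chunkSize });
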
function uploadStream(stream: Readable, options?: FileParallelUploadOptions): Promise<FileUploadResponse>

Parameters

stream

Readable

Node.js Readable stream.

options
FileParallelUploadOptions

Returns

Promise<FileUploadResponse>

Inherited Method Details

delete(boolean, PathDeleteOptions)

Delete current path (directory or file).

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/delete

function delete(recursive?: boolean, options?: PathDeleteOptions): Promise<PathDeleteResponse>

Parameters

recursive

boolean

Required and valid only when the resource is a directory. If "true", all paths beneath the directory will be deleted.

options
PathDeleteOptions

Optional. Options when deleting path.

Returns

Promise<PathDeleteResponse>

Inherited From DataLakePathClient.delete

deleteIfExists(boolean, PathDeleteOptions)

Delete current path (directory or file) if it exists.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/delete

function deleteIfExists(recursive?: boolean, options?: PathDeleteOptions): Promise<PathDeleteIfExistsResponse>

Parameters

recursive

boolean

Required and valid only when the resource is a directory. If "true", all paths beneath the directory will be deleted.

options
PathDeleteOptions

Optional. Options when deleting path.

Returns

Promise<PathDeleteIfExistsResponse>

Inherited From DataLakePathClient.deleteIfExists

exists(PathExistsOptions)

Returns true if the Data Lake file represented by this client exists; false otherwise.

NOTE: Use this function with care, since an existing file might be deleted by other clients or applications, and conversely new files might be added by other clients or applications after this function completes.
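
Example usage (a sketch):

if (await fileClient.exists()) {
  // Note: the file may still be deleted by another client right after this check.
  console.log("File exists");
}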

function exists(options?: PathExistsOptions): Promise<boolean>

Parameters

options
PathExistsOptions

Optional. Options for the exists operation.

Returns

Promise<boolean>

Inherited From DataLakePathClient.exists

getAccessControl(PathGetAccessControlOptions)

Returns the access control data for a path (directory or file).

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/getproperties

function getAccessControl(options?: PathGetAccessControlOptions): Promise<PathGetAccessControlResponse>

Parameters

options
PathGetAccessControlOptions

Optional. Options when getting file access control.

Returns

Promise<PathGetAccessControlResponse>

Inherited From DataLakePathClient.getAccessControl

getDataLakeLeaseClient(string)

Get a DataLakeLeaseClient that manages leases on the path (directory or file).
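
Example usage (a sketch; lease durations are in seconds, between 15 and 60, or -1 for an infinite lease):

const leaseClient = fileClient.getDataLakeLeaseClient();
await leaseClient.acquireLease(20); // hold the lease for 20 seconds
// ... perform work while holding the lease ...
await leaseClient.releaseLease();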

function getDataLakeLeaseClient(proposeLeaseId?: string): DataLakeLeaseClient

Parameters

proposeLeaseId

string

Optional. Initial proposed lease Id.

Returns

DataLakeLeaseClient

Inherited From DataLakePathClient.getDataLakeLeaseClient

getProperties(PathGetPropertiesOptions)

Returns all user-defined metadata, standard HTTP properties, and system properties for the path (directory or file).

WARNING: The metadata object returned in the response will have its keys in lowercase, even if they originally contained uppercase characters. This differs from the metadata keys returned by the methods of DataLakeFileSystemClient that list paths using the includeMetadata option, which will retain their original casing.

See https://docs.microsoft.com/en-us/rest/api/storageservices/get-blob-properties

function getProperties(options?: PathGetPropertiesOptions): Promise<PathGetPropertiesResponse>

Parameters

options
PathGetPropertiesOptions

Optional. Options when getting path properties.

Returns

Promise<PathGetPropertiesResponse>

Inherited From DataLakePathClient.getProperties

move(string, PathMoveOptions)

Move a directory or file within the same file system.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create
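
Example usage (a sketch with a placeholder destination path):

// Moves (renames) the file within the same file system.
await fileClient.move("destinationDirectory/destinationFile");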

function move(destinationPath: string, options?: PathMoveOptions): Promise<PathMoveResponse>

Parameters

destinationPath

string

Destination directory path like "directory" or file path "directory/file". If the destinationPath is authenticated with SAS, add the SAS to the destination path like "directory/file?sasToken".

options
PathMoveOptions

Optional. Options when moving directory or file.

Returns

Promise<PathMoveResponse>

Inherited From DataLakePathClient.move

move(string, string, PathMoveOptions)

Move a directory or file to another file system.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/create

function move(destinationFileSystem: string, destinationPath: string, options?: PathMoveOptions): Promise<PathMoveResponse>

Parameters

destinationFileSystem

string

Destination file system like "filesystem".

destinationPath

string

Destination directory path like "directory" or file path "directory/file". If the destinationPath is authenticated with SAS, add the SAS to the destination path like "directory/file?sasToken".

options
PathMoveOptions

Optional. Options when moving directory or file.

Returns

Promise<PathMoveResponse>

Inherited From DataLakePathClient.move

removeAccessControlRecursive(RemovePathAccessControlItem[], PathChangeAccessControlRecursiveOptions)

Removes the Access Control on a path and sub paths.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update

function removeAccessControlRecursive(acl: RemovePathAccessControlItem[], options?: PathChangeAccessControlRecursiveOptions): Promise<PathChangeAccessControlRecursiveResponse>

Parameters

acl

RemovePathAccessControlItem[]

The POSIX access control list for the file or directory.

options
PathChangeAccessControlRecursiveOptions

Optional. Options.

Returns

Promise<PathChangeAccessControlRecursiveResponse>

Inherited From DataLakePathClient.removeAccessControlRecursive

setAccessControl(PathAccessControlItem[], PathSetAccessControlOptions)

Set the access control data for a path (directory or file).

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update

function setAccessControl(acl: PathAccessControlItem[], options?: PathSetAccessControlOptions): Promise<PathSetAccessControlResponse>

Parameters

acl

PathAccessControlItem[]

The POSIX access control list for the file or directory.

options
PathSetAccessControlOptions

Optional. Options when setting path access control.

Returns

Promise<PathSetAccessControlResponse>

Inherited From DataLakePathClient.setAccessControl

setAccessControlRecursive(PathAccessControlItem[], PathChangeAccessControlRecursiveOptions)

Sets the Access Control on a path and sub paths.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update

function setAccessControlRecursive(acl: PathAccessControlItem[], options?: PathChangeAccessControlRecursiveOptions): Promise<PathChangeAccessControlRecursiveResponse>

Parameters

acl

PathAccessControlItem[]

The POSIX access control list for the file or directory.

options
PathChangeAccessControlRecursiveOptions

Optional. Options.

Returns

Promise<PathChangeAccessControlRecursiveResponse>

Inherited From DataLakePathClient.setAccessControlRecursive

setHttpHeaders(PathHttpHeaders, PathSetHttpHeadersOptions)

Sets system properties on the path (directory or file).

If no value is provided, or no value is provided for the specified blob HTTP headers, these blob HTTP headers without a value will be cleared.

See https://docs.microsoft.com/en-us/rest/api/storageservices/set-blob-properties

function setHttpHeaders(httpHeaders: PathHttpHeaders, options?: PathSetHttpHeadersOptions): Promise<PathSetHttpHeadersResponse>

Parameters

httpHeaders
PathHttpHeaders

Returns

Promise<PathSetHttpHeadersResponse>

Inherited From DataLakePathClient.setHttpHeaders

setMetadata(Metadata, PathSetMetadataOptions)

Sets user-defined metadata for the specified path (directory or file) as one or more name-value pairs.

If no option is provided, or no metadata is defined in the parameter, the path metadata will be removed.

See https://docs.microsoft.com/en-us/rest/api/storageservices/set-blob-metadata
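
Example usage (a sketch with placeholder metadata; note that getProperties returns the keys in lowercase):

await fileClient.setMetadata({ project: "demo", owner: "data-team" });
// Calling setMetadata() with no arguments removes all existing metadata.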

function setMetadata(metadata?: Metadata, options?: PathSetMetadataOptions): Promise<PathSetMetadataResponse>

Parameters

metadata
Metadata

Optional. Replace existing metadata with this value. If no value is provided, the existing metadata will be removed.

options
PathSetMetadataOptions

Optional. Options when setting path metadata.

Returns

Promise<PathSetMetadataResponse>

Inherited From DataLakePathClient.setMetadata

setPermissions(PathPermissions, PathSetPermissionsOptions)

Sets the file permissions on a path.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update
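
Example usage (a sketch granting rwx to the owner, r-x to the group, and r-- to others):

await fileClient.setPermissions({
  owner: { read: true, write: true, execute: true },
  group: { read: true, write: false, execute: true },
  other: { read: true, write: false, execute: false },
  stickyBit: false,
  extendedAcls: false,
});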

function setPermissions(permissions: PathPermissions, options?: PathSetPermissionsOptions): Promise<PathSetPermissionsResponse>

Parameters

permissions
PathPermissions

The POSIX access permissions for the file owner, the file owning group, and others.

options
PathSetPermissionsOptions

Optional. Options when setting path permissions.

Returns

Promise<PathSetPermissionsResponse>

Inherited From DataLakePathClient.setPermissions

toDirectoryClient()

Convert current DataLakePathClient to DataLakeDirectoryClient if current path is a directory.

function toDirectoryClient(): DataLakeDirectoryClient

Returns

DataLakeDirectoryClient

Inherited From DataLakePathClient.toDirectoryClient

toFileClient()

Convert current DataLakePathClient to DataLakeFileClient if current path is a file.

function toFileClient(): DataLakeFileClient

Returns

DataLakeFileClient

Inherited From DataLakePathClient.toFileClient

updateAccessControlRecursive(PathAccessControlItem[], PathChangeAccessControlRecursiveOptions)

Modifies the Access Control on a path and sub paths.

See https://docs.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/update

function updateAccessControlRecursive(acl: PathAccessControlItem[], options?: PathChangeAccessControlRecursiveOptions): Promise<PathChangeAccessControlRecursiveResponse>

Parameters

acl

PathAccessControlItem[]

The POSIX access control list for the file or directory.

options
PathChangeAccessControlRecursiveOptions

Optional. Options.

Returns

Promise<PathChangeAccessControlRecursiveResponse>

Inherited From DataLakePathClient.updateAccessControlRecursive