Class: StreamUpload

StreamUpload(sourceStream, blob, authData)

Uploads a stream to Azure Blob Storage.

Azure Blob Storage supports up to 50,000 blocks of up to 100MB each per block blob, for a total size of approximately 4.8TB per blob (this module uses block blobs). For streams larger than that, multiple blobs are created, with the suffixes .000, .001, .002, etc. appended to the blob name.
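For illustration, the suffix scheme described above could be sketched as follows; the helper name is hypothetical and not part of the library's API:

```javascript
// Hypothetical helper showing the multi-blob naming scheme: the blob
// sequence number becomes a zero-padded three-digit suffix
function blobNameWithSuffix(blob, seqId) {
    return blob + '.' + String(seqId).padStart(3, '0')
}

blobNameWithSuffix('/container/backup.tar', 0)  // → '/container/backup.tar.000'
blobNameWithSuffix('/container/backup.tar', 12) // → '/container/backup.tar.012'
```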

Constructor

new StreamUpload(sourceStream, blob, authData)

Constructor: initialize a StreamUpload object.

Parameters:
Name Type Description
sourceStream Object

Stream containing the data to be sent

blob string

Name of the blob (starting with /)

authData Object

Authentication data

Properties
Name Type Description
storageAccountName string

Name of the Azure Storage Account (required)

storageAccountKey string

Key of the Azure Storage Account (required if storageAccountSasToken is not set)

storageAccountSasToken string

SAS token (required if storageAccountKey is not set)

Members

blobUrl :string

Base URL for the blob.

This is a read-only value.

Type:
  • string

blockSize :number

Size of each block uploaded, in bytes.

The maximum size imposed by Azure Blob Storage is 100MB; by default, we are using a smaller block size to reduce the memory footprint.

Note that the maximum number of blocks per blob remains 50,000, regardless of the size of each block.

Type:
  • number

blocksPerBlob :number

Number of blocks in each blob. Each block is at most 100MB in size.

When using smaller values, you will potentially end up with a larger number of separate blobs inside your Azure Storage Account. Because performance targets are applied to each individual blob, retrieving data might be faster if you have more blobs, as you can download them in parallel. This has no effect on upload speed.

The maximum value is 50,000, as imposed by Azure Blob Storage.

Type:
  • number
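The limits above translate into simple arithmetic; the constants below mirror the documented Azure Blob Storage maximums, and the helper is a sketch (not part of the library's API) of how many blobs a given stream would require:

```javascript
// Maximums imposed by Azure Blob Storage, per the limits described above
const MAX_BLOCK_SIZE = 100 * 1024 * 1024 // 100MB per block
const MAX_BLOCKS_PER_BLOB = 50000        // blocks per block blob

// Largest possible single blob: 50,000 × 100MB ≈ 4.77TB
const maxBlobSize = MAX_BLOCK_SIZE * MAX_BLOCKS_PER_BLOB

// With a smaller blocksPerBlob, a large stream is split across more,
// smaller blobs that can later be downloaded in parallel
const blobsNeeded = (streamSize, blockSize, blocksPerBlob) =>
    Math.ceil(streamSize / (blockSize * blocksPerBlob))

blobsNeeded(6e12, MAX_BLOCK_SIZE, 20000) // a 6TB stream → 3 blobs
```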

concurrency :number

Number of parallel upload tasks.

Please note that the higher the number of parallel uploads, the more memory is required.

Type:
  • number

endpoint :string

Set what endpoint to use for Azure Blob Storage.

The default value is 'blob.core.windows.net' (Azure Global). Other options you might want to leverage:

  • Azure China: 'blob.core.chinacloudapi.cn'
  • Azure Germany: 'blob.core.cloudapi.de'

For Azure Stack, use your custom endpoint.

Type:
  • string
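As a sketch of how the storage account name and endpoint could combine into the blobUrl member — the exact composition is an assumption, not taken from the source:

```javascript
// Hypothetical reconstruction of how the base blob URL may be derived
// from the storage account name, the endpoint, and the blob name
// (which starts with /)
function buildBlobUrl(storageAccountName, endpoint, blob) {
    return 'https://' + storageAccountName + '.' + endpoint + blob
}

buildBlobUrl('myaccount', 'blob.core.windows.net', '/container/data.bin')
// → 'https://myaccount.blob.core.windows.net/container/data.bin'
```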

md5 :boolean

Calculate the MD5 hash of blocks before uploading them, to ensure integrity during transfer. This is enabled by default.

Type:
  • boolean

singleBlob :boolean

Do not append a suffix to the blob name. This ensures that, for files or streams that fit in a single blob, no ".000" suffix is added. However, uploads of larger files will fail. This is disabled by default.

Type:
  • boolean

Methods

commitBlockBlob(blockCount [, seqId]) → {Promise}

Commit a block blob into Azure Blob Storage by sending the list of blocks.

Parameters:
Name Type Attributes Description
blockCount number

Number of blocks uploaded. This will be used to re-generate the block ids.

seqId string <optional>

Optional suffix for the blob name (for storing files bigger than 4.8TB)

Returns:

Promise containing the result of the operation

Type
Promise
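Under the hood, committing a block blob maps to Azure Blob Storage's "Put Block List" REST operation, whose request body lists the ids of the blocks to assemble. A minimal sketch of such a body (the helper name is hypothetical; only the XML shape comes from the Azure REST API):

```javascript
// Build the XML body for Azure's "Put Block List" REST operation.
// Each <Latest> entry references a block id previously sent via Put Block.
function buildBlockListXml(blockIds) {
    const entries = blockIds.map((id) => '<Latest>' + id + '</Latest>').join('')
    return '<?xml version="1.0" encoding="utf-8"?><BlockList>' + entries + '</BlockList>'
}
```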

generateBlockId(blockNum) → {string}

Generate a block id. This implementation returns the number of the block, padded with leading zeros and encoded as Base64.

Parameters:
Name Type Description
blockNum int

Number of the block

Returns:

Block id for use with Azure Blob Storage

Type
string
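A minimal sketch consistent with that description; the padding width of 5 is an assumption (Azure only requires that all block ids within a blob have the same length before Base64 encoding):

```javascript
// Zero-pad the block number to a fixed width, then Base64-encode it.
// All block ids in a blob must share the same pre-encoding length;
// the width of 5 here is an assumption, not taken from the source.
function generateBlockId(blockNum) {
    return Buffer.from(String(blockNum).padStart(5, '0')).toString('base64')
}

generateBlockId(7) // → 'MDAwMDc='
```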

putBlock(block, blockId [, seqId]) → {Promise}

Upload a block of data to the block blob in Azure Blob Storage

Parameters:
Name Type Attributes Description
block Buffer

Block of data to upload (maximum 100MB in size)

blockId String

ID of the block

seqId string <optional>

Optional suffix for the blob name (for storing files bigger than 4.8TB)

Returns:

Promise containing the result of the operation

Type
Promise

upload() → {Promise}

Start upload of the stream

Returns:

Promise containing the result of the upload

Type
Promise