- @azure/storage-blob
- @azure/storage-file
- @azure/storage-queue
- API Reference documentation
This project provides an SDK in JavaScript that makes it easy to consume Microsoft Azure Storage services.
Please note that this version of the SDK is a complete overhaul of the current Azure Storage SDK for Node.js and JavaScript in browsers, and is based on the new Storage SDK architecture.
- Blob Storage
- Get/Set Blob Service Properties
- Create/List/Delete Containers
- Create/Read/List/Update/Delete Block Blobs
- Create/Read/List/Update/Delete Page Blobs
- Create/Read/List/Update/Delete Append Blobs
- File Storage
- Get/Set File Service Properties
- Create/List/Delete File Shares
- Create/List/Delete File Directories
- Create/Read/List/Update/Delete Files
- Queue Storage
- Get/Set Queue Service Properties
- Create/List/Delete Queues
- Enqueue/Dequeue/Peek/Clear/Update/Delete Queue Messages
- New features
- Asynchronous I/O for all operations using the async methods
- HttpPipeline, which enables a high degree of per-request configurability (see the sketch after this list)
- 1-to-1 correlation with the Storage REST API for clarity and simplicity
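As a hedged illustration of that configurability, the sketch below passes pipeline options when creating a client; the option names (retryOptions, telemetry) are assumed from the v10 API surface, so verify them against the API Reference documentation.

```javascript
// Sketch only: customizing the HTTP pipeline when constructing a ServiceURL.
// The option names below (retryOptions, telemetry) are assumptions based on the v10 API surface.
const { SharedKeyCredential, StorageURL, ServiceURL } = require("@azure/storage-blob");

const credential = new SharedKeyCredential("account", "accountkey");
const pipeline = StorageURL.newPipeline(credential, {
  retryOptions: { maxTries: 4 },    // retry each failed request up to 4 times
  telemetry: { value: "MyApp/1.0" } // extra string appended to the User-Agent header
});

const serviceURL = new ServiceURL("https://account.blob.core.windows.net", pipeline);
```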
This SDK is compatible with Node.js and browsers, and is validated against LTS Node.js versions (>=6.5) and the latest versions of Chrome, Firefox and Edge.
You need polyfills to make this library work with IE11. The easiest way is to use @babel/polyfill or a polyfill service; alternatively, you can load separate polyfills for the missing ES feature(s). This library depends on the following ES features, which need external polyfills loaded (a minimal loading example follows the list):
Promise
String.prototype.startsWith
String.prototype.endsWith
String.prototype.repeat
String.prototype.includes
Array.prototype.includes
Object.keys (override IE11's Object.keys with the ES6 polyfill to force ES6 behavior)
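For example, loading @babel/polyfill once at the application entry point, before any SDK code runs, covers all of the features above. This is only a minimal sketch, and the polyfill package must be installed separately (npm install @babel/polyfill).

```javascript
// Minimal sketch: load @babel/polyfill before the storage SDK so that Promise,
// String.prototype.includes, Array.prototype.includes, etc. exist in IE11.
import "@babel/polyfill";
import * as Azure from "@azure/storage-blob";

console.log(typeof Azure.ServiceURL); // "function" once the SDK is loaded
```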
There are differences between the Node.js and browser runtimes. When getting started with this SDK, pay attention to APIs or classes marked with "ONLY AVAILABLE IN NODE.JS RUNTIME" or "ONLY AVAILABLE IN BROWSERS".
- Shared Key Authorization based on account name and account key
SharedKeyCredential
- Shared Access Signature (SAS) generation (see the sketch after this list)
generateAccountSASQueryParameters()
generateBlobSASQueryParameters()
generateFileSASQueryParameters()
generateQueueSASQueryParameters()
- Parallel uploading and downloading in Node.js
uploadFileToBlockBlob()
uploadStreamToBlockBlob()
downloadBlobToBuffer()
uploadFileToAzureFile()
uploadStreamToAzureFile()
downloadAzureFileToBuffer()
- Parallel uploading and downloading in browsers
uploadBrowserDataToBlockBlob()
uploadBrowserDataToAzureFile()
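As an example of the Node.js-only SAS generation, the sketch below builds a read-only blob SAS. The account, container and blob values are placeholders, and the field names (containerName, blobName, permissions, expiryTime) are assumed from the v10 API surface, so check them against the API Reference documentation.

```javascript
// Sketch only (Node.js): generate a read-only SAS token for a single blob.
// Account, container and blob names are placeholders; field names are assumed from the v10 API.
const {
  SharedKeyCredential,
  generateBlobSASQueryParameters,
  BlobSASPermissions
} = require("@azure/storage-blob");

const credential = new SharedKeyCredential("account", "accountkey");

const sas = generateBlobSASQueryParameters(
  {
    containerName: "mycontainer",
    blobName: "myblob",
    permissions: BlobSASPermissions.parse("r").toString(), // read-only
    expiryTime: new Date(Date.now() + 60 * 60 * 1000)      // valid for one hour
  },
  credential
);

// Append the SAS to the blob URL; a client holding this URL can read the blob with AnonymousCredential.
const blobUrlWithSAS = `https://account.blob.core.windows.net/mycontainer/myblob?${sas.toString()}`;
console.log(blobUrlWithSAS);
```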
The preferred way to install the Azure Storage SDK for JavaScript is to use the npm package manager. Take "@azure/storage-blob" for example.
Simply type the following into a terminal window:
npm install @azure/storage-blob
In your TypeScript or JavaScript file, import it as follows:
import * as Azure from "@azure/storage-blob";
Or
const Azure = require("@azure/storage-blob");
To use the SDK with a JS bundle in browsers, simply add a script tag to your HTML pages pointing to the downloaded JS bundle file(s):
<script src="https://mydomain/azure-storage.blob.min.js"></script>
<script src="https://mydomain/azure-storage.file.min.js"></script>
<script src="https://mydomain/azure-storage.queue.min.js"></script>
The bundled JS files are compatible with the UMD standard. If no module system is found, the following global variable(s) will be exported (a usage sketch follows this list):
azblob
azfile
azqueue
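Once a bundle script tag has loaded, the exported global can be used much like the npm package. The sketch below uses the azblob global with a placeholder account URL and SAS token; treat it as an illustration to verify against the bundle you download.

```javascript
// Sketch only (browsers): use the azblob global exported by azure-storage.blob.min.js.
// The account URL and SAS token are placeholders.
const { AnonymousCredential, StorageURL, ServiceURL, ContainerURL, Aborter } = azblob;

const pipeline = StorageURL.newPipeline(new AnonymousCredential());
const serviceURL = new ServiceURL(
  "https://account.blob.core.windows.net?sv=placeholder-sas", // must carry a SAS or allow public access
  pipeline
);

const containerURL = ContainerURL.fromServiceURL(serviceURL, "mycontainer");
containerURL
  .listBlobFlatSegment(Aborter.none)
  .then(response => console.log(response.segment.blobItems.map(blob => blob.name)))
  .catch(err => console.log(err.message));
```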
Download the latest released JS bundles from the links on the GitHub release page, or directly from the following links:
- Blob https://aka.ms/downloadazurestoragejsblob
- File https://aka.ms/downloadazurestoragejsfile
- Queue https://aka.ms/downloadazurestoragejsqueue
You need to set up Cross-Origin Resource Sharing (CORS) rules for your storage account if you develop for browsers. In the Azure portal or Azure Storage Explorer, find your storage account and create new CORS rules for the blob/queue/file/table service(s).
For example, you can create the following CORS settings for debugging, but please customize them carefully according to your requirements in a production environment (a programmatic sketch follows this list).
- Allowed origins: *
- Allowed verbs: DELETE,GET,HEAD,MERGE,POST,OPTIONS,PUT
- Allowed headers: *
- Exposed headers: *
- Maximum age (seconds): 86400
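The same rules can also be applied from code through the blob service properties. The sketch below is an assumption-heavy illustration using the v10 setProperties call with generated model field names (cors, allowedOrigins, maxAgeInSeconds, and so on) that should be verified against the API Reference documentation.

```javascript
// Sketch only: apply the debugging CORS rules above to the blob service from Node.js.
// The shape of the properties object is assumed from the v10 generated models.
const { Aborter, SharedKeyCredential, StorageURL, ServiceURL } = require("@azure/storage-blob");

async function setDebugCors() {
  const credential = new SharedKeyCredential("account", "accountkey");
  const serviceURL = new ServiceURL(
    "https://account.blob.core.windows.net",
    StorageURL.newPipeline(credential)
  );

  await serviceURL.setProperties(Aborter.none, {
    cors: [
      {
        allowedOrigins: "*",
        allowedMethods: "DELETE,GET,HEAD,MERGE,POST,OPTIONS,PUT",
        allowedHeaders: "*",
        exposedHeaders: "*",
        maxAgeInSeconds: 86400
      }
    ]
  });
}

setDebugCors().catch(err => console.log(err.message));
```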
The Azure Storage SDK for JavaScript provides low-level and high-level APIs. Take the Blob SDK as an example:
- ServiceURL, ContainerURL and BlobURL objects provide the low-level API functionality and map one-to-one to the Azure Storage Blob REST APIs.
- The high-level APIs provide convenience abstractions such as uploading a large stream to a block blob (using multiple PutBlock requests); a sketch using these helpers follows the sample below.
const {
  Aborter,
  BlobURL,
  BlockBlobURL,
  ContainerURL,
  ServiceURL,
  StorageURL,
  SharedKeyCredential,
  AnonymousCredential,
  TokenCredential
} = require("@azure/storage-blob");

async function main() {
  // Enter your storage account name and shared key
  const account = "account";
  const accountKey = "accountkey";

  // Use SharedKeyCredential with storage account and account key
  const sharedKeyCredential = new SharedKeyCredential(account, accountKey);

  // Use TokenCredential with OAuth token
  const tokenCredential = new TokenCredential("token");
  tokenCredential.token = "renewedToken"; // Renew the token by updating the token field of the token credential

  // Use AnonymousCredential when the URL already includes a SAS signature
  const anonymousCredential = new AnonymousCredential();

  // Use sharedKeyCredential, tokenCredential or anonymousCredential to create a pipeline
  const pipeline = StorageURL.newPipeline(sharedKeyCredential);

  // List containers
  const serviceURL = new ServiceURL(
    // When using AnonymousCredential, the following URL should include a valid SAS or support public access
    `https://${account}.blob.core.windows.net`,
    pipeline
  );

  let marker;
  do {
    const listContainersResponse = await serviceURL.listContainersSegment(
      Aborter.none,
      marker
    );

    marker = listContainersResponse.nextMarker;
    for (const container of listContainersResponse.containerItems) {
      console.log(`Container: ${container.name}`);
    }
  } while (marker);

  // Create a container
  const containerName = `newcontainer${new Date().getTime()}`;
  const containerURL = ContainerURL.fromServiceURL(serviceURL, containerName);
  const createContainerResponse = await containerURL.create(Aborter.none);
  console.log(
    `Created container ${containerName} successfully`,
    createContainerResponse.requestId
  );

  // Create a blob
  const content = "hello";
  const blobName = "newblob" + new Date().getTime();
  const blobURL = BlobURL.fromContainerURL(containerURL, blobName);
  const blockBlobURL = BlockBlobURL.fromBlobURL(blobURL);
  const uploadBlobResponse = await blockBlobURL.upload(
    Aborter.none,
    content,
    content.length
  );
  console.log(
    `Uploaded block blob ${blobName} successfully`,
    uploadBlobResponse.requestId
  );

  // List blobs
  marker = undefined;
  do {
    const listBlobsResponse = await containerURL.listBlobFlatSegment(
      Aborter.none,
      marker
    );

    marker = listBlobsResponse.nextMarker;
    for (const blob of listBlobsResponse.segment.blobItems) {
      console.log(`Blob: ${blob.name}`);
    }
  } while (marker);

  // Get blob content from position 0 to the end
  // In Node.js, get downloaded data by accessing downloadBlockBlobResponse.readableStreamBody
  // In browsers, get downloaded data by accessing downloadBlockBlobResponse.blobBody
  const downloadBlockBlobResponse = await blobURL.download(Aborter.none, 0);
  console.log(
    "Downloaded blob content",
    await streamToString(downloadBlockBlobResponse.readableStreamBody)
  );

  // Delete the container
  await containerURL.delete(Aborter.none);
  console.log("Deleted container");
}

// A helper method used to read a Node.js readable stream into a string
async function streamToString(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", data => {
      chunks.push(data.toString());
    });
    readableStream.on("end", () => {
      resolve(chunks.join(""));
    });
    readableStream.on("error", reject);
  });
}

// An async function returns a Promise object, which is compatible with the then().catch() coding style
main()
  .then(() => {
    console.log("Successfully executed the sample.");
  })
  .catch(err => {
    console.log(err.message);
  });
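The sample above sticks to the low-level API. For large files, the Node.js-only high-level helpers listed earlier perform parallel transfers in a few lines. The following is a sketch only: the block size and parallelism values are arbitrary, and the option names are assumed from the v10 high-level API, so verify them against the API Reference documentation.

```javascript
// Sketch only (Node.js): parallel upload/download with the high-level helpers.
// Block size and parallelism values are arbitrary; option names are assumed from the v10 API.
const fs = require("fs");
const {
  Aborter,
  SharedKeyCredential,
  StorageURL,
  ServiceURL,
  ContainerURL,
  BlobURL,
  BlockBlobURL,
  uploadFileToBlockBlob,
  downloadBlobToBuffer
} = require("@azure/storage-blob");

async function transferLargeFile(localFilePath) {
  const credential = new SharedKeyCredential("account", "accountkey");
  const serviceURL = new ServiceURL(
    "https://account.blob.core.windows.net",
    StorageURL.newPipeline(credential)
  );
  const containerURL = ContainerURL.fromServiceURL(serviceURL, "mycontainer");
  const blobURL = BlobURL.fromContainerURL(containerURL, "largeblob");
  const blockBlobURL = BlockBlobURL.fromBlobURL(blobURL);

  // Upload with multiple PutBlock requests in parallel; abort if it takes longer than 30 minutes.
  await uploadFileToBlockBlob(Aborter.timeout(30 * 60 * 1000), localFilePath, blockBlobURL, {
    blockSize: 4 * 1024 * 1024, // 4 MB blocks
    parallelism: 20
  });

  // Download the blob back into a pre-allocated buffer, also in parallel.
  const buffer = Buffer.alloc(fs.statSync(localFilePath).size);
  await downloadBlobToBuffer(Aborter.timeout(30 * 60 * 1000), buffer, blockBlobURL, 0, undefined, {
    blockSize: 4 * 1024 * 1024,
    parallelism: 20
  });
  return buffer;
}
```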
- Blob Storage Examples
- Blob Storage Examples - Test Cases
- File Storage Examples
- File Storage Examples - Test Cases
- Queue Storage Examples
- Queue Storage Examples - Test Cases
This project is licensed under MIT.
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.