fetch-blob

A Blob implementation in Node.js, originally from node-fetch.

Installation

npm install fetch-blob

Upgrading from 2.x to 3.x

Updating from 2.x to 3.x should be a breeze, since there are not many changes to the Blob specification itself. The main reason for a major release is coding standards:
  • internal WeakMaps were replaced with private fields
  • internal Buffer.from was replaced with TextEncoder/TextDecoder
  • internal Buffers were replaced with Uint8Arrays
  • CommonJS was replaced with ESM
  • the Node.js stream returned by calling blob.stream() was replaced with a WHATWG stream
  • (read "Differences from other Blobs" for more info)

Differences from other Blobs
  • Unlike Node.js buffer.Blob (added in v15.7.0) and the browser-native Blob, this polyfilled version can't be sent via postMessage.
  • This Blob version is more permissive: it can be constructed with blob parts that aren't instances of itself. A part only has to look and behave like a blob to be accepted as a blob part.
    • The benefit of this is that you can create other kinds of blobs that don't hold any internal data and have to be read in other ways, such as the BlobDataItem created in from.js, which wraps a file path in a blob-like item and reads it lazily (Node.js plans to implement this as well).
  • blob.stream() is the most noticeable difference: it now returns a WHATWG ReadableStream. To keep it as a Node.js stream you would have to do:
  import {Readable} from 'stream'
  const stream = Readable.from(blob.stream())

Usage

// Ways to import
// (PS: it's a dependency-free ESM package, so regular http-imports from a CDN work too)
import Blob from 'fetch-blob'
import File from 'fetch-blob/file.js'

import {Blob} from 'fetch-blob'
import {File} from 'fetch-blob/file.js'

const {Blob} = await import('fetch-blob')


// Ways to read the blob:
const blob = new Blob(['hello, world'])

await blob.text()
await blob.arrayBuffer()
for await (let chunk of blob.stream()) { ... }
blob.stream().getReader().read()
blob.stream().getReader({mode: 'byob'}).read(view)
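
The File class from fetch-blob/file.js implements the File interface on top of Blob; a minimal sketch (the name and contents below are made up):

// File extends Blob with a name and a lastModified timestamp
const file = new File(['<q>Hello world</q>'], 'hello.html', { type: 'text/html' })
console.log(file.name)         // 'hello.html'
console.log(await file.text()) // '<q>Hello world</q>'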

Blob part backed up by filesystem

fetch-blob/from.js comes packed with tools to convert any filepath into either a Blob or a File. It will not read the content into memory; it only stats the file for its size and last-modified date.

// The default export is sync and uses fs.stat to retrieve the size & last modified date as a blob
import blobFromSync from 'fetch-blob/from.js'
import {File, Blob, blobFrom, blobFromSync, fileFrom, fileFromSync} from 'fetch-blob/from.js'

const fsFile = fileFromSync('./2-GiB-file.bin', 'application/octet-stream')
const fsBlob = await blobFrom('./2-GiB-file.mp4')

// Not a 4 GiB memory snapshot - it just holds references
// to where the data is located on disk
const blob = new Blob([fsFile, fsBlob, 'memory', new Uint8Array(10)])
console.log(blob.size) // ~4 GiB

blobFrom|blobFromSync|fileFrom|fileFromSync(path, [mimetype])
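
For example, the async variants accept the same optional mimetype (the path below is only an illustration):

import {fileFrom} from 'fetch-blob/from.js'

// Hypothetical path; the second argument becomes the resulting File's type
const movie = await fileFrom('./some-movie.mp4', 'video/mp4')
console.log(movie.name, movie.type) // 'some-movie.mp4 video/mp4'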

Creating Blobs backed up by other async sources

Our Blob & File classes are more generic than other polyfills in that they can accept any blob look-a-like item. For example, our Blob implementation can be constructed with parts coming from a BlobDataItem (i.e. a file path) or from buffer.Blob. A part does not have to implement all the Blob methods - just enough that it can be read/understood by our Blob implementation. The minimum requirement is that it has Symbol.toStringTag, size, slice() and either a stream() or an arrayBuffer() method. If you then wrap it in our Blob or File with new Blob([blobDataItem]), you get all of the other methods a blob or file should implement.
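
As a rough sketch of that minimum interface (the customPart object and its payload are made up for illustration and are not part of the library):

import {Blob} from 'fetch-blob'

// A hypothetical, minimal blob look-a-like: just Symbol.toStringTag, size,
// slice() and a stream() that yields Uint8Array chunks.
const payload = new TextEncoder().encode('hello from a custom part')

const customPart = {
  size: payload.byteLength,
  slice () { return customPart }, // a real implementation would honour start/end
  async * stream () { yield payload },
  get [Symbol.toStringTag] () { return 'Blob' }
}

// Wrapping it in fetch-blob's Blob gives you text(), arrayBuffer(), stream(), slice(), …
const blob = new Blob([customPart])
console.log(await blob.text()) // 'hello from a custom part'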

An example of this could be to create a file- or blob-like item backed by a remote HTTP request or by a database.
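
A lazy, HTTP-backed part might look roughly like this (the HttpBlobItem class, the URL and the use of the global fetch with Range requests are assumptions for illustration; error handling is omitted):

import {Blob} from 'fetch-blob'

// Hypothetical blob-like item that only downloads bytes when it is read
class HttpBlobItem {
  #url
  #start
  #end

  constructor (url, start, end) {
    this.#url = url
    this.#start = start
    this.#end = end
  }

  get size () { return this.#end - this.#start }

  // slice() just produces another lazy item over a narrower byte range
  slice (start = 0, end = this.size) {
    return new HttpBlobItem(this.#url, this.#start + start, this.#start + end)
  }

  // Only now is anything downloaded (assumes the server honours Range requests)
  async * stream () {
    const res = await fetch(this.#url, {
      headers: { range: `bytes=${this.#start}-${this.#end - 1}` }
    })
    yield new Uint8Array(await res.arrayBuffer())
  }

  get [Symbol.toStringTag] () { return 'Blob' }
}

// Suppose a HEAD request (not shown) told us the resource is 1234 bytes long
const remotePart = new HttpBlobItem('https://example.com/big-file.bin', 0, 1234)
const blob = new Blob([remotePart])
console.log(blob.size) // 1234 - nothing has been downloaded yet
// await blob.arrayBuffer() would trigger the actual download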

See the MDN documentation and tests for more details on how to use the Blob.