I'm trying to read a file chunk by chunk and upload each chunk to a remote server, but the end offset logged for the final chunk seems to exceed the file size.
I'm using "read-chunk": "^3.2.0" because newer versions are ESM-only and don't support require, which my project has to use.
firstChunk result {"d":{"StartUpload":"5242880"}}
uploading continue chunks 5242880 10485760 51387344
...
other chunk result {"d":{"ContinueUpload":"47185920"}}
finished uploading... 47185920 52428800 51387344
const fs = require('fs');
// read-chunk@3 exposes readChunk(filePath, startPosition, length), returning a Promise<Buffer>
const ReadChunk = require('read-chunk');

const stats = fs.statSync(path);
const fileSizeInBytes = stats.size;
console.log('fileSizeInBytes', fileSizeInBytes);

for (let offset = 0; offset < fileSizeInBytes; offset += chunkSize) {
  const chunk = await ReadChunk(path, offset, chunkSize);
  if (offset === 0) {
    console.log('uploading firstChunk', path, offset, offset + chunkSize, fileSizeInBytes);
  } else if (offset >= fileSizeInBytes - chunkSize) {
    console.log('finished uploading...', offset, offset + chunkSize, fileSizeInBytes);
  } else {
    console.log('uploading continue chunks', offset, offset + chunkSize, fileSizeInBytes);
  }
}
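For context, here is the boundary math I expected, sketched as plain Node with no file I/O or upload (the `chunkLength` helper is mine, not part of read-chunk): the length of the last chunk should be clamped so the end offset never passes the file size.

```javascript
// Length of the chunk starting at `offset`, clamped so it never runs past the file.
function chunkLength(offset, chunkSize, fileSize) {
  return Math.min(chunkSize, fileSize - offset);
}

// Using the sizes from the logs above: 5242880-byte chunks, 51387344-byte file.
const chunkSize = 5242880;
const fileSize = 51387344;
const lengths = [];
for (let offset = 0; offset < fileSize; offset += chunkSize) {
  lengths.push(chunkLength(offset, chunkSize, fileSize));
}

// 9 full chunks of 5242880 bytes plus a final partial chunk of 4201424 bytes.
console.log(lengths.length, lengths[lengths.length - 1]); // → 10 4201424
```

With this clamping, the final logged range would be 47185920..51387344 instead of 47185920..52428800.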