I have the following code (please ignore its ugliness):
const convertToVideoBlob = async (dataURL: string, storageRef: StorageReference): Promise<string> => {
  console.log("dataURL: ", dataURL);
  const promiseRes: any = await fetch(dataURL);
  const blob: Blob = await promiseRes.blob();
  const reader = new FileReader();
  reader.onloadend = async () => {
    const newDataURL: string | ArrayBuffer = reader.result || '';
    console.log(`testing: ${newDataURL}`);
    console.log(`testing type: ${typeof newDataURL}`);
    await uploadString(storageRef, (newDataURL as string), 'base64');
    const videoStorageDownloadURL: string = await getDownloadURL(storageRef);
    console.log("videoStorageDLlink: ", videoStorageDownloadURL);
    return videoStorageDownloadURL;
  };
  // console.log("outside newDataURL: ", newDataURL);
  reader.readAsDataURL(blob);
  return 'notAvailable';
};
where dataURL is the following string, representing a screen recording that my React TypeScript application captured:
blob:https://www.junojourney.com/a40cbd71-ea64-4bf9-b563-c6dcb34bdf53
In order to upload to Firebase, I needed to turn it into a Blob object and then use reader.readAsDataURL(blob), which gives me the following result in newDataURL:
data:video/mp4;base64,GkXfo59ChoEBQveBAULygQRC84EIQoKEd2VibUKHgQRChYECGFOAZwH/////////FUmpZpkq17GDD0JAT....
which, as far as I'm aware, is the data format required by the uploadString() method for uploading to Firebase.
But, weirdly, I'm getting the following error:
content.js:205 Uncaught (in promise) FirebaseError: Firebase Storage: String does not match format 'base64': Invalid character found (storage/invalid-format)
and if I change the line:
await uploadString(storageRef, (newDataURL as string), 'base64');
to
await uploadString(storageRef, newDataURL, 'base64');
I get the following error:
const newDataURL: string | ArrayBuffer
Argument of type 'string | ArrayBuffer' is not assignable to parameter of type 'string'.
Type 'ArrayBuffer' is not assignable to type 'string'.ts(2345)
Any ideas? Regards!
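A minimal sketch of one direction that seems likely to address both errors (not verified against this exact setup): readAsDataURL() produces a full data URL, including the data:video/mp4;base64, prefix, which the 'base64' string format rejects, so passing 'data_url' as the format should fix the storage/invalid-format error, and narrowing reader.result with a typeof check avoids the string | ArrayBuffer cast and the TS2345 error:
reader.onloadend = async () => {
  const result = reader.result;
  // typeof narrowing: bail out unless FileReader produced a string
  if (typeof result !== 'string') return;
  // readAsDataURL() yields "data:video/mp4;base64,...", so use the 'data_url'
  // format (or strip the prefix before using 'base64')
  await uploadString(storageRef, result, 'data_url');
  const videoStorageDownloadURL = await getDownloadURL(storageRef);
  console.log("videoStorageDLlink: ", videoStorageDownloadURL);
};
Alternatively, uploadBytes(storageRef, blob) accepts the Blob directly and skips the FileReader round-trip entirely.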
I'm attempting to download CSV files from S3, perform some transforms on the data (in this example, hardcoding an ID), and then re-upload them to S3 as 'processed' versions of the files, using streams to avoid running out of memory. fast-csv looked like a good library to do this with.
Consider the following code:
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';
import { PassThrough } from 'node:stream';
import * as csv from 'fast-csv';

const s3Client = new S3Client({ region: 'eu-west-2' });

const getFileFromS3 = async () => {
  const command = new GetObjectCommand({
    Bucket: 'mybucket',
    Key: 'originaldata.csv',
  });
  const getFile = await s3Client.send(command);
  const stream = await getFile.Body;
  return stream;
};

const csvParser = csv
  .parse({ headers: true })
  .transform((data) => ({
    ...data,
    id: 'TEST',
  }))
  .on('error', (error) => console.error(error))
  .on('data', (row) => console.log(row))
  .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

const fileStream = await getFileFromS3();
const transformationStream = new PassThrough();
fileStream.pipe(csvParser).pipe(transformationStream);

const upload = new Upload({
  client: s3Client,
  params: {
    Bucket: 'mybucket',
    Key: 'processeddata.csv',
    Body: transformationStream,
  },
});

await upload.done();
But when doing this, I get the following error:
TypeError [ERR_INVALID_ARG_TYPE]: The "chunk" argument must be of type string or an instance of Buffer or Uint8Array. Received an instance of Object
It would seem another person has encountered this in the fast-csv repo, but the solution was never given.
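fast-csv's parser emits row objects, while PassThrough (and the S3 upload body) expect strings or buffers, which is what produces the "chunk" TypeError. A minimal sketch of one likely fix, assuming fast-csv's format() serializer and the same csvParser and transformationStream as above:
// Re-serialize the parsed row objects back into CSV text before handing
// them to the byte-oriented PassThrough / S3 upload body.
const csvFormatter = csv.format({ headers: true });
fileStream.pipe(csvParser).pipe(csvFormatter).pipe(transformationStream);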
I'm using Node 18.7 on Ubuntu. I'm trying to parse a bunch of CSV files into objects (using csv-parse), ultimately to load into a database. Because there are large numbers of these files, I decided to try streams, and I'd like to use the async/await style.
Based on Using async/await syntax with node stream, I have changed my code to:
const fs = require('fs');
const { parse } = require('csv-parse');
const { pipeline } = require('node:stream/promises');

const path = __dirname + '/file1.csv';
const opt = { columns: true, relax_column_count: true, skip_empty_lines: true, skip_records_with_error: true };
console.log(path);

async function readByLine(path, opt) {
  const readFileStream = fs.createReadStream(path);
  const writeFileStream = fs.createWriteStream(__dirname + '/file2');
  const csvParser = parse(opt, function (err, records) {
    if (err) throw err;
  });
  await pipeline(readFileStream, csvParser, writeFileStream);
}

readByLine(path, opt);
When I run the file, I'm getting:
TypeError [ERR_INVALID_ARG_TYPE]: The "chunk" argument must be of type string or an instance of Buffer or Uint8Array. Received an instance of Object
at new NodeError (node:internal/errors:387:5)
at _write (node:internal/streams/writable:315:13)
at Writable.write (node:internal/streams/writable:337:10)
at Parser.ondata (node:internal/streams/readable:766:22)
at Parser.emit (node:events:513:28)
at Readable.read (node:internal/streams/readable:539:10)
at Parser.<anonymous> (/home/gmail-username/node/maricopa/node_modules/csv-parse/dist/cjs/index.cjs:1357:28)
at Parser.emit (node:events:513:28)
at emitReadable_ (node:internal/streams/readable:590:12)
at process.processTicksAndRejections (node:internal/process/task_queues:81:21) {
code: 'ERR_INVALID_ARG_TYPE'
}
How can I fix this?
Reads from the csvParser stream are in object mode.
An fs.WriteStream requires strings or buffers to be written to it.
A simple transform stream can be implemented to convert the objects to JSON strings by setting the writableObjectMode option:
const { Transform } = require('node:stream');

class JsonTransform extends Transform {
  constructor(opt) {
    // Accept object-mode writes while still emitting strings downstream.
    super(Object.assign({}, { writableObjectMode: true }, opt));
  }
  _transform(obj, enc, callback) {
    const ret = JSON.stringify(obj) + '\n';
    this.push(ret);
    callback();
  }
}
Then use it between the parser and the file write:
const toJSON = new JsonTransform()
await pipeline(readFileStream, csvParser, toJSON, writeFileStream);
I believe the csv-stringify project does something similar for writing CSVs.
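If the goal is to write CSV rather than JSON lines, a minimal sketch with csv-stringify (assuming the same readFileStream, csvParser, and writeFileStream as above) would take the place of the custom transform:
const { stringify } = require('csv-stringify');

// stringify() returns a Transform that accepts row objects and emits CSV text,
// so it can sit where JsonTransform did in the pipeline.
const toCsv = stringify({ header: true });
await pipeline(readFileStream, csvParser, toCsv, writeFileStream);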
I am trying to send a YAML file as a base64 string so that this code works:
const response = await octokit.request('GET /repos/{owner}/{repo}/git/blobs/{file_sha}', {
  owner: 'DevEx',
  repo: 'hpdev-content',
  file_sha: fileSha,
  headers: {
    authorization: `Bearer ${githubConfig?.token}`,
  },
});
const decoded = Buffer.from(response.data.content, 'base64').toString('utf8');
In the above code, response.data.content should hold the data.
I have this route:
router.get('/repos/:owner/:repo/git/blobs/:file_sha', (req, res) => {
  // TODO: do we need to do anything with the path params?
  // eslint-disable-next-line @typescript-eslint/no-unused-vars
  const { owner, repo, file_sha } = req.params;
  const contents = writeUsersReport();
  const encoded = Buffer.from(contents, 'binary').toString('base64');
  res.send(encoded);
});
The code is working fine except that the client code expects the base64 string in a property called content in the following code:
const decoded = Buffer.from(response.data.content, 'base64').toString('utf8');
But the string is in response.data.
How can I set the content property instead?
How about sending a JSON response containing an object with a content property from your server side, instead of the encoded string directly?
// ...
const encoded = Buffer.from(contents, 'binary').toString('base64');
res.json({ content: encoded });
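If you want the route to mirror the real GET /repos/{owner}/{repo}/git/blobs/{file_sha} response more closely, the GitHub API also returns an encoding field alongside content, so a sketch of the handler (same writeUsersReport() helper as in the question) could send both:
router.get('/repos/:owner/:repo/git/blobs/:file_sha', (req, res) => {
  const contents = writeUsersReport();
  const encoded = Buffer.from(contents, 'binary').toString('base64');
  // Same shape the client already expects: response.data.content, plus encoding
  res.json({ content: encoded, encoding: 'base64' });
});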
I am trying to set the types for a file upload, but I can't believe I have to define every single property on the file object:
export type FileProps = {
  path: string
  lastModified: number
  slice: () => void
  stream: () => void
  text: () => void
  arrayBuffer: ArrayBuffer
  name: string
  size: number
  type: string
}
const [files, setFiles] = useState<FileProps[]>([])
I upload a few files and store them in state, but then when I try to add them to the form data
const formData = new FormData()
for (const file of files) {
  formData.append('files', file)
}
I get an error on file.
If you just use File, then you get exactly what you want:
const [files, setFiles] = useState<File[]>([])

const formData = new FormData()
for (const file of files) {
  formData.append('files', file)
}
That should get you all the fields documented here: https://developer.mozilla.org/en-US/docs/Web/API/File
See playground
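For completeness, a minimal sketch of how the File objects might get into that state in the first place, assuming a standard <input type="file" multiple> change handler (the handler name is made up):
// Items in e.target.files are already File objects, so they can go straight
// into state with no custom FileProps type.
const onFilesSelected = (e: React.ChangeEvent<HTMLInputElement>) => {
  setFiles(Array.from(e.target.files ?? []))
}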
The second argument of the FormData.append method is a Blob or USVString.
You had mentioned in a comment that the entire structure needs to be sent to the backend, so you need to convert each file object to a Blob:
for (const file of files) {
  formData.append('files', new Blob([JSON.stringify(file)], { type: 'text/html' }))
}
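One caveat with the snippet above: JSON.stringify on a native File produces "{}" because its properties are not own-enumerable, so if the metadata itself is what the backend needs, a sketch like this (the chosen fields are illustrative) picks them out explicitly:
for (const file of files) {
  // File properties live on the prototype as getters, so select them by hand
  const meta = { name: file.name, size: file.size, type: file.type, lastModified: file.lastModified }
  formData.append('files', new Blob([JSON.stringify(meta)], { type: 'application/json' }))
}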
Hi, I have a function that downloads an Excel file coming from the backend.
component.ts
getExcelExport(resultsFilterRootObject: ResultsFilterRootObject) {
  return this.http.post(urls.getExcelExportCPPMetrics, resultsFilterRootObject, {
    responseType: 'arraybuffer',
    observe: 'response'
  })
    .pipe(
      tap(
        data => {
          const blob = new Blob([data.body], { type: 'application/vnd.ms-excel' });
          const filename = 'vehicle-metrics-template.xls';
          FileSaver.saveAs(blob, filename);
        },
        catchError(MetricsService.handleError)
      )
    );
}
component.spec.ts
it('should download Excel ', () => {
  // const expectedResult: ArrayBuffer = new ArrayBuffer(8); Tried this, fails too
  const expectedResult = new TextEncoder();
  expectedResult.encode("This is a string converted to a Uint8Array");
  httpClientSpy.post.and.returnValue(asyncData(expectedResult));

  metricsService.getExcelExportCPPMetrics(resultsFilterRootObject).subscribe(
    heroes => expect(heroes).toEqual(expectedResult, 'expected VehicleSalesResultRootObject'),
    fail
  );
  expect(httpClientSpy.post.calls.count()).toBe(1, 'one call');
});
I keep getting the error: error TS2345: Argument of type 'TextEncoder' is not assignable to parameter of type 'Expected<HttpResponse<ArrayBuffer>>'.
Type 'TextEncoder' is missing the following properties from type 'ObjectContaining<HttpResponse<ArrayBuffer>>': jasmineMatches, jasmineToString
Basically, if I could create a variable of type ArrayBuffer in the unit test, this problem would be solved. Any ideas on this?
Note that the post method with the options responseType: 'arraybuffer' and observe: 'response' returns an Observable<HttpResponse<ArrayBuffer>>, which is not an ArrayBuffer directly, as this overload shows:
post(url: string, body: any | null, options: {
  headers?: HttpHeaders | {
    [header: string]: string | string[];
  };
  observe: 'response';
  params?: HttpParams | {
    [param: string]: string | string[];
  };
  reportProgress?: boolean;
  responseType: 'arraybuffer';
  withCredentials?: boolean;
}): Observable<HttpResponse<ArrayBuffer>>;
What you can do is return an Observable of a simple object that has the property you are actually using, body:
it('should download Excel ', () => {
  const expectedResult: ArrayBuffer = new ArrayBuffer(8);
  // httpClientSpy.post.and.returnValue(asyncData(expectedResult));
  httpClientSpy.post.and.returnValue(of({ body: expectedResult })); // Or the commented line above, if "asyncData" returns an Observable

  metricsService.getExcelExportCPPMetrics(resultsFilterRootObject).subscribe(
    data => expect(data.body).toEqual(expectedResult, 'expected VehicleSalesResultRootObject'),
    fail
  );
  expect(httpClientSpy.post.calls.count()).toBe(1, 'one call');
});
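If you'd rather keep the mock fully typed, Angular's HttpResponse class can wrap the buffer, so a sketch along these lines (same spy, service, and helpers as above) also satisfies the Observable<HttpResponse<ArrayBuffer>> signature:
import { HttpResponse } from '@angular/common/http';
import { of } from 'rxjs';

it('should download Excel (typed response)', () => {
  const expectedResult: ArrayBuffer = new ArrayBuffer(8);
  // A real HttpResponse<ArrayBuffer>, so the spy's return value matches the
  // Observable<HttpResponse<ArrayBuffer>> that http.post(...) is typed to return.
  httpClientSpy.post.and.returnValue(of(new HttpResponse({ body: expectedResult })));

  metricsService.getExcelExportCPPMetrics(resultsFilterRootObject).subscribe(
    data => expect(data.body).toEqual(expectedResult),
    fail
  );
  expect(httpClientSpy.post.calls.count()).toBe(1);
});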