pdf-lib merge pdfs on frontend - javascript

I'm trying to merge two PDF files on the frontend using JavaScript and the pdf-lib library. I found this snippet in the pdf-lib GitHub repository:
async function mergePdfs(pdfsToMerge: string[]) {
  const mergedPdf = await PDFDocument.create();
  for (const pdfCopyDoc of pdfsToMerge) {
    const pdfBytes = fs.readFileSync(pdfCopyDoc);
    const pdf = await PDFDocument.load(pdfBytes);
    const copiedPages = await mergedPdf.copyPages(pdf, pdf.getPageIndices());
    copiedPages.forEach((page) => {
      mergedPdf.addPage(page);
    });
  }
  const mergedPdfFile = await mergedPdf.save();
  return mergedPdfFile;
}
But as I can see, this snippet is for Node.js (there's no fs.readFileSync in browser JavaScript). So I have 2 questions:
What should I put in pdfsToMerge (string[])? I have variables containing URLs to pdf1 and pdf2.
I also have two variables containing the base64 code of these PDFs. How can I use this snippet on the frontend, without fs.readFileSync like in Node.js?
Many thanks in advance!

The PDFDocument.load() method will accept base64 strings as the parameter, so you don't need to transform those at all.
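For example, with base64Pdf standing in for one of your base64 variables (the name is just for illustration):

// Load a PDF directly from a base64 string
const pdfDoc = await PDFDocument.load(base64Pdf);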
As for your variables storing URL paths to PDF documents, you can use fetch instead of Node's file system. As described in the pdf-lib docs, you can store the ArrayBuffer and pass that into PDFDocument.load() like so:
const url = 'https://pdf-lib.js.org/assets/with_update_sections.pdf'
const arrayBuffer = await fetch(url).then(res => res.arrayBuffer())
const pdfDoc = await PDFDocument.load(arrayBuffer)
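Putting the two together, a browser version of the merge function could look like this (a minimal sketch; the pdfUrls array standing in for your two URL variables is an assumption):

// Browser-friendly take on the snippet above: fetch each PDF as an
// ArrayBuffer instead of reading it from disk with fs.
async function mergePdfs(pdfUrls) {
  const mergedPdf = await PDFDocument.create();
  for (const url of pdfUrls) {
    // fetch replaces fs.readFileSync in the browser
    const pdfBytes = await fetch(url).then(res => res.arrayBuffer());
    const pdf = await PDFDocument.load(pdfBytes);
    const copiedPages = await mergedPdf.copyPages(pdf, pdf.getPageIndices());
    copiedPages.forEach((page) => mergedPdf.addPage(page));
  }
  return mergedPdf.save(); // resolves to a Uint8Array
}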

Your version of pdf-lib should be the newest.
Then the sequence of events matters. Here is the function I use; the steps must run in this order.
I use it with data or empty to get filled or unfilled PDF files.
async copyPages(sale: Sale, url1, urlArray, isWithData, isEmptyForm) {
  this.pdfService.getIsEmpty().subscribe(data => { isEmptyForm = data; });
  this.pdfService.getIsWithData().subscribe(data => { isWithData = data; });
  console.log(urlArray);
  const donorBytes = [];
  const donorBytesFinal = [];
  const donorPage = [];
  const donorDoc = [];
  /**
   * First page: get the bytes from the url, then load the data,
   * then convert the data bytes to a PDFDocument.
   * Later in the routine this first donor document's pages are inserted, not added.
   */
  let firstDonorPdfBytes = await fetch(url1).then(res => res.arrayBuffer());
  await this.loadDataTodocument(firstDonorPdfBytes, sale, isWithData, isEmptyForm)
    .then(data => { firstDonorPdfBytes = data; });
  /** Load the first document */
  const firstDonorPdfDoc = await PDFDocument.load(firstDonorPdfBytes);
  /** Load the url array, convert to bytes, send the bytes off to populate the text fields with data */
  for (let i = 0; i < urlArray.length; ++i) {
    console.log(urlArray.length);
    donorBytes[i] = await fetch(urlArray[i].url).then(res => res.arrayBuffer());
  }
  /* Insert data into donorBytes and create the donorBytesFinal array with data */
  // tslint:disable-next-line:prefer-for-of
  for (let i = 0; i < donorBytes.length; ++i) {
    await this.loadDataTodocument(donorBytes[i], sale, isWithData, isEmptyForm)
      .then(data => { donorBytesFinal.push(data); });
  }
  // console.log(donorBytesFinal);
  /* Convert the donor bytes (now containing data, i.e. donorBytesFinal) to PDFDocuments */
  for (let i = 0; i < donorBytesFinal.length; ++i) {
    donorDoc[i] = await PDFDocument.load(donorBytesFinal[i]);
  }
  /* Create the output document */
  const pdfDoc = await PDFDocument.create();
  /** Copy the first page... not in the array */
  const [firstDonorPage] = await pdfDoc.copyPages(firstDonorPdfDoc, [0]);
  /**
   * Copy all the array pages of the individual documents into the output pdfDoc.
   * Notice these are insertPage calls, not addPage.
   */
  for (let i = 0; i < donorBytes.length; ++i) {
    [donorPage[i]] = await pdfDoc.copyPages(donorDoc[i], [0]);
    pdfDoc.insertPage(0, donorPage[i]);
  }
  /** The first page is an addPage, not an insert */
  pdfDoc.addPage(firstDonorPage);
  /** Create the base64 and Uint8Array outputs and update them globally */
  const u8 = await pdfDoc.save();
  const n64 = await pdfDoc.saveAsBase64();
  this.pdfService.changeUint8ByteArray(u8);
  this.pdfService.changeBase64Array(n64);
  const pdfBytes = u8;
  /** Redundant: empty the urlArray */
  urlArray = [];
}

Related

How do I create a video that has seek-able timestamps from an unknown number of incoming video blobs/chunks, using ts-ebml (on-the-fly)?

I am creating a live stream component that utilizes the videojs-record component. Every x milliseconds, the component triggers an event that returns a blob. As seen, the blob contains data from the video recording; it's not the full recording but a piece, for it was returned x seconds into the recording.
After saving it in the backend and playing it back, I find that I am unable to skip through the video; it's not seek-able.
Because this is a task that I'm trying to keep in the frontend, I have to inject this metadata within the browser using ts-ebml. After injecting the metadata, the modified blob is sent to the backend.
The function that receives this blob looks as follows:
timestampHandler(player) {
  const { length: recordedDataLength } = player.recordedData;
  if (recordedDataLength != 0) {
    const { convertStream } = this.converter;
    convertStream(player.recordedData[recordedDataLength - 1]).then((blob) => {
      console.log(blob);
      blob.arrayBuffer().then(async response => {
        const bytes = new Uint8Array(response);
        let binary = '';
        let len = bytes.byteLength;
        for (let i = 0; i < len; i++) {
          binary += String.fromCharCode(bytes[i]);
        }
        this.$backend.videoDataSendToServer({ bytes: window.btoa(binary), id: this.videoId })
      })
      .catch(error => {
        console.log('Error Converting:\t', error);
      })
    })
  }
}
convertStream is a function located in a class called TsEBMLEngine. This class looks as follows:
import videojs from "video.js/dist/video";
import { Buffer } from "buffer";
window.Buffer = Buffer;
import { Decoder, tools, Reader } from "ts-ebml";

class TsEBMLEngine {
  // constructor() {
  //   this.chunkDecoder = new Decoder();
  //   this.chunkReader = new Reader();
  // }
  convertStream = (data) => {
    const chunkDecoder = new Decoder();
    const chunkReader = new Reader();
    chunkReader.logging = false;
    chunkReader.drop_default_duration = false;
    // save timestamp
    const timestamp = new Date();
    timestamp.setTime(data.lastModified);
    // load and convert blob
    return data.arrayBuffer().then((buffer) => {
      // decode
      const elms = chunkDecoder.decode(buffer);
      elms.forEach((elm) => {
        chunkReader.read(elm);
      });
      chunkReader.stop();
      // generate metadata
      const refinedMetadataBuf = tools.makeMetadataSeekable(
        chunkReader.metadatas,
        chunkReader.duration,
        chunkReader.cues
      );
      const body = buffer.slice(chunkReader.metadataSize);
      // create new blob
      const convertedData = new Blob([refinedMetadataBuf, body], { type: data.type });
      // store convertedData
      return convertedData;
    });
  }
}

// expose plugin
videojs.TsEBMLEngine = TsEBMLEngine;
export default TsEBMLEngine;
After recording for more than 10 seconds I stop the recording, go to the DB, and watch the retrieved video. The video is seek-able for the first 3 seconds before the dot reaches the very end of the seek-able line. When I'm watching the video in a live stream, the video freezes after the first 3 seconds.
When I look at the size of the file in the DB, it increases after x seconds which means it's being appended to it, just not properly.
Any help would be greatly appreciated.
To be seekable, a video (at least speaking of EBMLs) needs to have a SeekHead tag, Cues tags, and a defined duration in the Info tag.
To create the new metadata for the video you can use ts-ebml's exported function makeMetadataSeekable.
Then slice off the beginning of the video and replace it with the new metadata, like it was done in this example:
const decoder = new Decoder();
const reader = new Reader();
const webMBuf = await fetch("path/to/file").then(res=> res.arrayBuffer());
const elms = decoder.decode(webMBuf);
elms.forEach((elm)=>{ reader.read(elm); });
reader.stop();
const refinedMetadataBuf = tools.makeMetadataSeekable(reader.metadatas, reader.duration, reader.cues);
const body = webMBuf.slice(reader.metadataSize);
const refinedWebM = new Blob([refinedMetadataBuf, body], {type: "video/webm"});
And voilà! The new video file becomes seekable.

Unable to remove data from json file on disk

I'm unable to find a way to remove a whole line of JSON data after it's used.
For some reason delete is not working, or rather, not doing anything.
.JSON
[
{"id":"1","code":"super-S","expires_in":"","gives_currencies":"1.5","gives_items":"","number_of_attempts":"1","attempts_used":""},
{"id":"2","code":"wow!","expires_in":"","gives_currencies":"3","gives_items":"","number_of_attempts":"1","attempts_used":""},
{"id":"3","code":"slashme","expires_in":"","gives_currencies":"4","gives_items":"","number_of_attempts":"1","attempts_used":""},
{"id":"4","code":"randombla","expires_in":"","gives_currencies":"5","gives_items":"","number_of_attempts":"1","attempts_used":""}
]
code
//fs configuration
const fs = require('fs');
let rawdata = fs.readFileSync('test.json');
let mycodes = JSON.parse(rawdata);
//something above
const randomcode = mycodes[Math.floor(Math.random() * mycodes.length)];
console.log('Your code is:', randomcode['code']); //logs me a random code value
delete mycodes[randomcode];
The goal here is to select a random code, which is done, but then I need to remove it from the .JSON file so it won't repeat. I tried several things (delete.randomcode etc.) but it's not working; the line is never removed from the .JSON file.
Use Array.prototype.splice(index, deleteCount) instead of delete.
delete, on an Array, will just leave an empty slot behind without removing it, so the array's length doesn't change.
Save back your modified data using JSON.stringify(mycodes) to that same file.
const fs = require('fs');
const mycodes = JSON.parse(fs.readFileSync('./test.json'));
const randomIndex = Math.floor(Math.random() * mycodes.length);
const randomObject = mycodes[randomIndex];
console.log('Your code is:', randomObject.code); // Log a random code value
mycodes.splice(randomIndex, 1); // Remove one key at randomIndex
// Write back to file
fs.writeFileSync('test.json', JSON.stringify(mycodes, 0, 4), 'utf8');
If you already have that Object out of your Array, and since Objects are passed by reference (like a pointer in memory), make use of Array.prototype.indexOf(someObject) like:
const fs = require('fs');
const mycodes = JSON.parse(fs.readFileSync('./test.json'));
const randomIndex = Math.floor(Math.random() * mycodes.length);
const randomObject = mycodes[randomIndex];
// later in your code....
const objIndex = mycodes.indexOf(randomObject); // Get Object index in Array
mycodes.splice(objIndex, 1); // Remove it from array at that index
// Write back to file
fs.writeFileSync('test.json', JSON.stringify(mycodes, 0, 4), 'utf8');
You need to persist your data by writing it back to your JSON file after using JSON.stringify().
While you're at it, you can move your code into functions, which will make it easier to read and work with.
You might also want to read about editing arrays using Array.prototype.splice().
The delete operator is for deleting properties from objects. While you can use it to delete elements from an array, it will leave the index empty rather than closing the gap in the array after deletion.
const fs = require('fs');

// get a random element from any array
function getRandomElement (array) {
  const randomElement = array[Math.floor(Math.random() * array.length)];
  return randomElement;
}

// remove an element from an array, if present
function deleteElementFromArray (array, element) {
  const index = array.indexOf(element);
  if (index < 0) return false;
  array.splice(index, 1);
  return true;
}

// move the reading work inside a function
function readJson (filePath) {
  const json = fs.readFileSync(filePath, {encoding: 'utf8'});
  const data = JSON.parse(json);
  return data;
}

// move the writing work inside a function
function writeJson (filePath, data) {
  const json = JSON.stringify(data);
  fs.writeFileSync(filePath, json);
}
const jsonFilePath = 'test.json';
const mycodes = readJson(jsonFilePath);
const randomcode = getRandomElement(mycodes);
console.log('Your code is:', randomcode['code']);
deleteElementFromArray(mycodes, randomcode);
writeJson(jsonFilePath, mycodes);

get data from another add-on on Excel javascript api

I saved data on the workbook with the following code:
export const storeSettingsToWorkbook = async (settingsType: Settings, storeData: WorkbookModel) => {
  return Excel.run(async (context) => {
    const originalXml = createXmlObject(storeData);
    const customXmlPart = context.workbook.customXmlParts.add(originalXml);
    customXmlPart.load("id");
    await context.sync();
    // Store the XML part's ID in a setting
    const settings = context.workbook.settings;
    settings.add(settingsTitles[settingsType], customXmlPart.id);
    await context.sync();
  })
}
When I get the data, it works normally. But when I want to get this data from another add-in in Excel, I cannot get it:
const { settings } = context.workbook;
const sheet = context.workbook.worksheets.getActiveWorksheet().load("items");
const xmlPartIDSetting = settings.getItemOrNullObject(settingsTitles[settingsType]).load("value");
await context.sync();
if (xmlPartIDSetting.value) {
  const customXmlPart = context.workbook.customXmlParts.getItem(xmlPartIDSetting.value);
  const xmlBlob = customXmlPart.getXml();
  await context.sync();
  const parsedObject = parseFromXmlString(xmlBlob.value);
  const normalizedData = normalizeParsedData(parsedObject);
}
Any ideas?
Thanks for reaching out to us.
This is by design: each add-in has its own settings and cannot share them with other add-ins.
You can use 'context.workbook.properties.custom' as a workaround.
You can also use 'context.workbook.worksheets.getActiveWorksheet().customProperties', but the two add-ins are required to be on the same worksheet.
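A minimal sketch of the first workaround, reusing the settingsTitles/settingsType and originalXml names from the question (workbook-level custom properties are visible to every add-in):

// In the add-in that stores the data:
await Excel.run(async (context) => {
  const customXmlPart = context.workbook.customXmlParts.add(originalXml);
  customXmlPart.load("id");
  await context.sync();
  // Store the XML part's ID as a workbook-level custom property
  context.workbook.properties.custom.add(settingsTitles[settingsType], customXmlPart.id);
  await context.sync();
});

// In the other add-in, read it back:
await Excel.run(async (context) => {
  const prop = context.workbook.properties.custom.getItemOrNullObject(settingsTitles[settingsType]);
  prop.load("value");
  await context.sync();
  if (!prop.isNullObject) {
    const customXmlPart = context.workbook.customXmlParts.getItem(prop.value);
    const xmlBlob = customXmlPart.getXml();
    await context.sync();
    // parse xmlBlob.value as before
  }
});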

Creating separated files for each request with createWriteStream

I'm making multiple URL requests using Axios and collecting the data using Cheerio.
Everything works great, I just can't figure out how to prevent the data from being overwritten by the previous response that was written to a file using the createWriteStream method.
I'm trying to create a different file for each request, preferably with unique names, but haven't found any solution in the docs.
const axios = require("axios").default;
const cheerio = require('cheerio');
const fs = require('fs');

const writeStream = fs.createWriteStream('./names/names.text')

const getTitle = (res) => {
  const $ = cheerio.load(res.data);
  const names = $('.name_wrap > .name')
  names.each(function (i, el) {
    const item = $(el).text().replace(/^\s*$/g, '')
    writeStream.write(`${item}\n`)
  });
}

// URL'S Array
let URLS = []
for (let index = 1; index <= 3; index++) {
  let url = `https://www.example.com/name-1-${index}`
  URLS.push(axios.get(url))
}

Promise.all(URLS)
  .then(responses => {
    getTitle(responses[0])
    getTitle(responses[1])
  });
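One way to get a separate file per request is to create the write stream inside getTitle instead of sharing one at module scope. A minimal sketch, assuming an index-based naming scheme (any unique key would do):

const getTitle = (res, index) => {
  const $ = cheerio.load(res.data);
  // One stream, and therefore one file, per response
  const writeStream = fs.createWriteStream(`./names/names-${index}.text`);
  const names = $('.name_wrap > .name');
  names.each(function (i, el) {
    const item = $(el).text().replace(/^\s*$/g, '');
    writeStream.write(`${item}\n`);
  });
  writeStream.end();
};

Promise.all(URLS).then(responses => {
  responses.forEach((res, index) => getTitle(res, index));
});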

Adding dynamically to a JSON file

I have a JSON file that is generated dynamically by a crawler taking values from a page; the JSON is created as follows:
{
  "temperatura": "31°C",
  "sensacao": "RealFeel® 36°",
  "chuva": "0 mm",
  "vento": "NNO11km/h",
  "momentoAtualizacao": "Dia",
  "Cidade": "carazinho",
  "Site": "Accuweather"
}
{
  "temperatura": "29 º",
  "sensacao": "29º ST",
  "vento": "11 Km/h",
  "umidade": "51% UR",
  "pressao": "1013 hPa",
  "Cidade": "carazinho",
  "Site": "Tempo Agora"
}
The problem with this generated file is that it's missing the [] to join all the objects into an array, and the commas to separate them.
The final json should look like this:
[{
  "temperatura": "31°C",
  "sensacao": "RealFeel® 36°",
  "chuva": "0 mm",
  "vento": "NNO11km/h",
  "momentoAtualizacao": "Dia",
  "Cidade": "carazinho",
  "Site": "Accuweather"
},
{
  "temperatura": "29 º",
  "sensacao": "29º ST",
  "vento": "11 Km/h",
  "umidade": "51% UR",
  "pressao": "1013 hPa",
  "Cidade": "carazinho",
  "Site": "Tempo Agora"
}]
I am currently using this code to generate the JSON:
const climatempo = async (config) => {
  const browser = await puppeteer.launch()
  const page = await browser.newPage()
  const override = Object.assign(page.viewport(), { width: 1920, height: 1024 });
  await page.setViewport(override);
  await page.goto(config.cidades[cidade], { waitUntil: 'load', timeout: '60000' })
  if (siteEscolhido == "accu") {
    const elementTemp = await page.$(config.regras.elementTemp)
    const temperatura = await page.evaluate(elementTemp => elementTemp.textContent, elementTemp)
    const sensacaoElement = await page.$(config.regras.sensacaoElement)
    const sensacao = await page.evaluate(sensacaoElement => sensacaoElement.textContent, sensacaoElement)
    const chuvaElement = await page.$(config.regras.chuvaElement)
    const chuva = await page.evaluate(chuvaElement => chuvaElement.textContent, chuvaElement)
    const ventoElement = await page.$(config.regras.ventoElement)
    const vento = await page.evaluate(ventoElement => ventoElement.textContent, ventoElement)
    const atualizadoA = await page.$(config.regras.atualizadoA)
    const momentoAtualizacao = await page.evaluate(atualizadoA => atualizadoA.textContent, atualizadoA)
    const dado = {
      temperatura: temperatura,
      sensacao: sensacao,
      chuva: chuva,
      vento: vento,
      momentoAtualizacao: momentoAtualizacao,
      Cidade: cidade,
      Site: "Accuweather"
    }
    // dados.push(dado)
    const x = JSON.stringify(dado)
    fs.appendFile('climatempo.json', x, function (err) {
      if (err) throw err
    })
    console.log("Temperatura:" + temperatura)
    console.log(sensacao)
    console.log("Vento:" + vento)
    console.log("chuva:" + chuva)
    console.log(momentoAtualizacao)
    await browser.close()
  }
}
If anyone has any idea how to solve my problem, please let me know!
Grateful, Carlos
I would suggest doing it a little differently.
I will try to explain in pseudocode, since I don't understand your variable names:
read json file
array = JSON.parse(fileContents)
array.push(newItem)
newContents = JSON.stringify(array)
file WRITE (not append) newContents
I recommend reading the file, pushing onto an array captured from that file, and then writing the file back to disk.
Assuming the file has content already in the form of an array:
let fileDado = JSON.parse(fs.readFileSync('climatempo.json'));
fileDado.push(dado);
fs.writeFileSync('climatempo.json', JSON.stringify(fileDado));
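If the file might not exist yet on the first run, a guarded read is a common variant (a sketch, same file name as above):

let fileDado = [];
try {
  fileDado = JSON.parse(fs.readFileSync('climatempo.json', 'utf8'));
} catch (err) {
  // First run: no file (or no valid JSON) yet, so start with an empty array
}
fileDado.push(dado);
fs.writeFileSync('climatempo.json', JSON.stringify(fileDado, null, 2));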
