Remove a subscription - javascript

I'm using a library (Polidea BLE / react-native-ble-plx) to connect to an external device and retrieve information.
Basically I read some data from the external device and then I should pass that data to another page to write it to the db.
My problem is that I don't understand how to stop the reading, because at the moment I read data and pass it directly to another page.
The docs say:
monitorCharacteristicForService also returns a subscription with a
remove() function, so setting transactionId is not necessary.
But I don't understand how to use it.
This is my code:
async setupNotifications1(device) {
  var timeagm = 0
  var time = 0
  const service = this.serviceGeneral();

  await device.monitorCharacteristicForService(service, this.AccGyrMg, (error, characteristic) => {
    if (error) {
      this.error(error.message);
      return;
    }
    const buf = Buffer.from(characteristic.value, "base64");
    const [...acc_dx] = [2, 4, 6].map(index => buf.readInt16LE(index));
    this.setState(state => ({
      acc_dx,
      array_acc_dx: [...state.array_acc_dx, [timeagm, acc_dx]]
    }));
    timeagm += 20
  });

  await device.monitorCharacteristicForService(service, this.Pressure, (error, characteristic) => {
    if (error) {
      this.error(error.message);
      return;
    }
    const buf = Buffer.from(characteristic.value, "base64");
    const [...pressure_dx] = [0, 2, 4, 6, 8].map(index => buf.readUInt16LE(index));
    this.setState(state => ({
      pressure_dx,
      array_pressure_dx: [...state.array_pressure_dx, [time, pressure_dx]]
    }));
    time += 20
  });
}
and when the user clicks a stop button I pass the data without stopping the reading (this is what I need to correct):
stopConnection() {
  console.log("Inizio stopConnection");
  Actions.registerattivita({
    array_acc_dx: this.state.array_acc_dx,
    array_pressure_dx: this.state.array_pressure_dx,
  })
}
In your opinion, how can I use this remove() to stop the reading of data? Thank you.

Store the subscription that monitorCharacteristicForService returns:
this.subscriptionMonitor = device.monitorCharacteristicForService(...)
and then in stopConnection():
if (this.subscriptionMonitor) {
  this.subscriptionMonitor.remove()
}
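
Putting it together with the code from the question, a minimal sketch (assuming the two subscriptions are kept as instance fields; the field names are illustrative). Note that monitorCharacteristicForService returns the Subscription synchronously, as the quoted docs say:

setupNotifications1(device) {
  // keep the subscriptions so stopConnection() can cancel them later
  this.accSubscription = device.monitorCharacteristicForService(
    this.serviceGeneral(), this.AccGyrMg, (error, characteristic) => { /* ... */ });
  this.pressureSubscription = device.monitorCharacteristicForService(
    this.serviceGeneral(), this.Pressure, (error, characteristic) => { /* ... */ });
}

stopConnection() {
  // stop both notification streams before handing the data off
  if (this.accSubscription) {
    this.accSubscription.remove();
    this.accSubscription = null;
  }
  if (this.pressureSubscription) {
    this.pressureSubscription.remove();
    this.pressureSubscription = null;
  }
  Actions.registerattivita({
    array_acc_dx: this.state.array_acc_dx,
    array_pressure_dx: this.state.array_pressure_dx,
  });
}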

Related

typescript node insert on mongodb database

I'm a total newbie in JS (TypeScript, MongoDB, Node).
I just found that my code is not behaving as I expected: I'm getting 6 registers in MongoDB instead of just one. It should check whether the register exists and then update it. I don't know if it is something related to await/async or if I am doing something wrong. Thanks in advance; here is my code.
fields.forEach(async (value) => {
  try {
    const mongoConnection = new DocumentDbRepository();
    let checksIfExists = await mongoConnection.getValue(key, information[uniqueValue]);
    if (checksIfExists == null) {
      let insert = await mongoConnection.insertValue(information);
      console.log(insert);
    }
    if (checksIfExists?.passValue === information.passValue) {
      console.log('---------update---------');
      let sons = Object.values(information.ticketToRide);
      information.ticketToRide = sons;
      let update = await mongoConnection.updateRegister(information, checksIfExists._id);
      console.log(update);
    } else {
      console.log('---------insert---------');
      let sons = Object.values(information.ticketToRide);
      information = sons;
      let insert = await mongoConnection.insertValue(information);
      console.log(insert);
    }
  } catch (error) {
    console.log(error)
  }
});
async getValue(uniqueValue: any, keyValue: any) {
  if (this._connection == null) {
    await this.connect();
  }
  const db = this._connection.db(DocumentDbRepository.DbName);
  const ticketToRide = db.collection("ticketToRide");
  const query = {};
  query[uniqueValue] = '' + keyValue + '';
  const passInfo = await ticketToRide.findOne(query);
  return passInfo;
}

async insertValue(information: any) {
  if (this._connection == null) {
    await this.connect();
  }
  const db = this._connection.db(DocumentDbRepository.DbName);
  const ticketToRide = db.collection("ticketToRide");
  let check = await ticketToRide.insertOne(information);
  return check;
}
First, you don't need to create a connection inside the loop.
Second, MongoDB's update() / updateMany() methods take a special option, { upsert: true }. If it is passed, the insert happens automatically when no matching document exists.
Usage example:
Person.update({ name: 'Ted' }, { name: 'Ted', age: 50 }, { upsert: true })
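
Applied to the code in the question, a minimal sketch using the Node driver's updateOne with { upsert: true }. The connection is created once, outside the loop, and a for...of loop is used so each await completes before the next iteration (forEach does not wait for async callbacks). Exposing the collection handle from DocumentDbRepository via getCollection() is a hypothetical helper:

const mongoConnection = new DocumentDbRepository();       // create once, outside the loop
const ticketToRide = await mongoConnection.getCollection("ticketToRide"); // hypothetical helper

for (const value of fields) {
  try {
    // update the matching register, or insert it when none exists
    const result = await ticketToRide.updateOne(
      { [uniqueValue]: '' + information[uniqueValue] },   // filter, mirroring getValue()
      { $set: information },                              // fields to write
      { upsert: true }
    );
    console.log(result.upsertedCount ? 'inserted' : 'updated');
  } catch (error) {
    console.log(error);
  }
}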

Before finishing the map in Promise.all it executes next line in react js

Good evening all,
I'm trying to create one collection with multiple pieces of data, and that data is nested with other collections. I call a function to get one id; that id is used to call another collection and get some data and ids, and those ids are used to call yet other collections, and so on until I've gathered all the data.
Here is my code:
firestore.collection("pods/").where("start_date", "==", dformat)
.where("pod_status", "==", "active").get()
.then((snap: any) => {
snap.docs.map((docsnap: any) => {
let details = docsnap.data();
let pod_docId = docsnap.id;
let pod_id = details.pod_id;
let course_id = details.course_id;
getDocDetails(course_id, firestore, "courses").then(
(courseData: any) => {
let course_data = courseData.data(); let levels = course_data.levels;
Promise.all(
levels.map((lev: any) => {
let level_id = lev;
getDocDetails(level_id, firestore, "levels")
.then((level_data) => {
let result: any = level_data; let level_details = result.data();
let modules = level_details.modules;
modules.map((module_id: any) => {
getDocDetails(module_id, firestore, "modules")
.then((module_data) => {
let response: any = module_data; let mod_det = response.data(); let sess = mod_det.sessions;
sess.map((sess_id: any) => {
sessions_id_list.push(sess_id);
logger.info("sessions_id_list", sessions_id_list);
});
})
.catch((err) => {
logger.info(
"Error while getting the Module details",
err
);
});
});
logger.info("sessions_id_list 2", sessions_id_list);
})
.catch((err) => {
logger.info("Error while getting the Level details", err);
});
})
);
logger.info("sessions_id_list 3", sessions_id_list);
let obj = {
pod_docId: pod_docId,
pod_id: pod_id,
sessions_id_list: sessions_id_list,
status: "created",
count: 0,
completed: 0,
};
addQueue(obj, firestore);
}
);
});
})
.catch((err: any) => {
reject(err);
});
Here, before the session list is pushed, it prints "sessions_id_list 3" and calls the addQueue function. How can I overcome this? Help me, guys. Thanks in advance.
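
The likely cause is visible in the snippet: the callbacks passed to levels.map() never return their getDocDetails(...) promises, so Promise.all receives an array of undefined values and resolves immediately, and the Promise.all result is itself never awaited before "sessions_id_list 3" is logged and addQueue is called. A sketch of the shape of a fix, using the names from the question (illustrative, untested):

// return the promises so Promise.all can actually wait on them
Promise.all(
  levels.map((lev: any) =>
    getDocDetails(lev, firestore, "levels").then((level_data: any) => {
      const modules = level_data.data().modules;
      // nested work must be returned too, or it won't be awaited either
      return Promise.all(
        modules.map((module_id: any) =>
          getDocDetails(module_id, firestore, "modules").then((module_data: any) => {
            module_data.data().sessions.map((sess_id: any) => sessions_id_list.push(sess_id));
          })
        )
      );
    })
  )
).then(() => {
  // runs only after every level/module/session has been collected
  logger.info("sessions_id_list 3", sessions_id_list);
  addQueue({ pod_docId, pod_id, sessions_id_list, status: "created", count: 0, completed: 0 }, firestore);
});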

How can I update more than 500 docs in Firestore using Batch?

I'm trying to update a field timestamp with the Firestore admin timestamp in a collection with more than 500 docs.
const batch = db.batch();
const serverTimestamp = admin.firestore.FieldValue.serverTimestamp();

db
  .collection('My Collection')
  .get()
  .then((docs) => {
    docs.forEach((doc) => {
      batch.set(doc.ref, {
        serverTimestamp,
      }, {
        merge: true,
      });
    });
    return batch.commit();
  })
  .then(() => res.send('All docs updated'))
  .catch(console.error);
This throws an error
{ Error: 3 INVALID_ARGUMENT: cannot write more than 500 entities in a single call
at Object.exports.createStatusError (C:\Users\Growthfile\Desktop\cf-test\functions\node_modules\grpc\src\common.js:87:15)
at Object.onReceiveStatus (C:\Users\Growthfile\Desktop\cf-test\functions\node_modules\grpc\src\client_interceptors.js:1188:28)
at InterceptingListener._callNext (C:\Users\Growthfile\Desktop\cf-test\functions\node_modules\grpc\src\client_interceptors.js:564:42)
at InterceptingListener.onReceiveStatus (C:\Users\Growthfile\Desktop\cf-test\functions\node_modules\grpc\src\client_interceptors.js:614:8)
at callback (C:\Users\Growthfile\Desktop\cf-test\functions\node_modules\grpc\src\client_interceptors.js:841:24)
code: 3,
metadata: Metadata { _internal_repr: {} },
details: 'cannot write more than 500 entities in a single call' }
Is there a way to write a recursive method which creates a batch object updating batches of 500 docs one by one until all the docs are updated?
From the docs I know that the delete operation is possible with the recursive approach, as mentioned here:
https://firebase.google.com/docs/firestore/manage-data/delete-data#collections
But for updating, I'm not sure how to end the execution, since the docs are not being deleted.
I also ran into the problem of updating more than 500 documents inside a Firestore collection, and I would like to share how I solved it.
I use Cloud Functions to update my collection inside Firestore, but this should also work in client-side code.
The solution counts every operation made against the batch, and after the limit is reached a new batch is created and pushed to the batchArray.
After all updates are staged, the code loops through the batchArray and commits every batch inside the array.
It is important to count every operation, set(), update(), and delete(), made against the batch, because they all count toward the 500-operation limit.
const documentSnapshotArray = await firestore.collection('my-collection').get();

const batchArray = [];
batchArray.push(firestore.batch());
let operationCounter = 0;
let batchIndex = 0;

documentSnapshotArray.forEach(documentSnapshot => {
  const documentData = documentSnapshot.data();
  // update document data here...
  batchArray[batchIndex].update(documentSnapshot.ref, documentData);
  operationCounter++;
  if (operationCounter === 499) {
    batchArray.push(firestore.batch());
    batchIndex++;
    operationCounter = 0;
  }
});

// commit every batch and wait for all commits to finish
// (forEach with an async callback would fire them without awaiting)
await Promise.all(batchArray.map(batch => batch.commit()));
return;
I liked this simple solution:
const users = await db.collection('users').get()

const batches = _.chunk(users.docs, 500).map(userDocs => {
  const batch = db.batch()
  userDocs.forEach(doc => {
    batch.set(doc.ref, { field: 'myNewValue' }, { merge: true })
  })
  return batch.commit()
})

await Promise.all(batches)
Just remember to add import * as _ from "lodash" at the top. Based on this answer.
You can use the default BulkWriter. It applies the 500/50/5 ramp-up rule: start at 500 operations per second and increase throughput by 50% every 5 minutes.
Example:
let bulkWriter = firestore.bulkWriter();

bulkWriter.create(documentRef, { foo: 'bar' });
bulkWriter.update(documentRef2, { foo: 'bar' });
bulkWriter.delete(documentRef3);
await bulkWriter.close().then(() => {
  console.log('Executed all writes');
});
As mentioned above, @Sebastian's answer is good and I upvoted it too, although I faced an issue while updating 25000+ documents in one go.
The tweak to the logic is below.
console.log(`Updating documents...`);
let collectionRef = db.collection('cities');
try {
  let batch = db.batch();
  const documentSnapshotArray = await collectionRef.get();
  const records = documentSnapshotArray.docs;
  const index = documentSnapshotArray.size;
  console.log(`TOTAL SIZE=====${index}`);
  for (let i = 0; i < index; i++) {
    const docRef = records[i].ref;
    // YOUR UPDATES
    batch.update(docRef, { isDeleted: false });
    if ((i + 1) % 499 === 0) {
      await batch.commit();
      batch = db.batch();
    }
  }
  // Commit the final partial batch, if any operations remain
  if (index % 499 !== 0) {
    await batch.commit();
  }
  console.log('write completed');
} catch (error) {
  console.error(`updateWorkers() errored out : ${error.stack}`);
  reject(error);
}
The explanations given in previous comments already cover the issue.
I'm sharing the final code that I built and that worked for me, since I needed something that worked in a more decoupled manner than most of the solutions presented above.
import { FireDb } from "#services/firebase"; // = firebase.firestore();

type TDocRef = FirebaseFirestore.DocumentReference;
type TDocData = FirebaseFirestore.DocumentData;

let fireBatches = [FireDb.batch()];
let batchSizes = [0];
let batchIdxToUse = 0;

export default class FirebaseUtil {
  static addBatchOperation(
    operation: "create",
    ref: TDocRef,
    data: TDocData
  ): void;
  static addBatchOperation(
    operation: "update",
    ref: TDocRef,
    data: TDocData,
    precondition?: FirebaseFirestore.Precondition
  ): void;
  static addBatchOperation(
    operation: "set",
    ref: TDocRef,
    data: TDocData,
    setOpts?: FirebaseFirestore.SetOptions
  ): void;
  static addBatchOperation(
    operation: "create" | "update" | "set",
    ref: TDocRef,
    data: TDocData,
    opts?: FirebaseFirestore.Precondition | FirebaseFirestore.SetOptions
  ): void {
    // Lines below make sure we stay below the limit of 500 writes per batch
    if (batchSizes[batchIdxToUse] === 500) {
      fireBatches.push(FireDb.batch());
      batchSizes.push(0);
      batchIdxToUse++;
    }
    batchSizes[batchIdxToUse]++;

    const batchArgs: [TDocRef, TDocData] = [ref, data];
    if (opts) batchArgs.push(opts);

    switch (operation) {
      // Specific case for "set" is required because of some weird TS
      // glitch that doesn't allow me to use the arg "operation" to
      // call the function
      case "set":
        fireBatches[batchIdxToUse].set(...batchArgs);
        break;
      default:
        fireBatches[batchIdxToUse][operation](...batchArgs);
        break;
    }
  }

  public static async runBatchOperations() {
    // The lines below clear the globally available batches so we
    // don't run them twice if we call this function more than once
    const currentBatches = [...fireBatches];
    fireBatches = [FireDb.batch()];
    batchSizes = [0];
    batchIdxToUse = 0;

    await Promise.all(currentBatches.map((batch) => batch.commit()));
  }
}
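
A brief usage sketch for the class above (usersSnapshot and the field being written are illustrative):

// stage any number of writes; a new batch is opened automatically at 500
usersSnapshot.docs.forEach((doc) => {
  FirebaseUtil.addBatchOperation("update", doc.ref, { migrated: true });
});

// commit all accumulated batches in parallel and reset the module state
await FirebaseUtil.runBatchOperations();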
Based on all the above answers, I put together the following pieces of code that one can put into a module in JavaScript back-end and front-end to easily use Firestore batch writes, without worrying about the 500 writes limit.
Back-end (Node.js)
// The Firebase Admin SDK to access Firestore.
const admin = require("firebase-admin");
admin.initializeApp();

// Firestore does not accept more than 500 writes in a transaction or batch write.
const MAX_TRANSACTION_WRITES = 499;

const isFirestoreDeadlineError = (err) => {
  console.log({ err });
  const errString = err.toString();
  return (
    errString.includes("Error: 13 INTERNAL: Received RST_STREAM") ||
    errString.includes("Error: 4 DEADLINE_EXCEEDED: Deadline exceeded")
  );
};

const db = admin.firestore();

// How many writes out of 500 so far.
// I wrote the following functions to easily use batch writes without worrying about the 500 limit.
let writeCounts = 0;
let batchIndex = 0;
let batchArray = [db.batch()];

// Commit all accumulated batches.
const makeCommitBatch = async () => {
  console.log("makeCommitBatch");
  await Promise.all(batchArray.map((bch) => bch.commit()));
};

// Commit the batch write; on a Firestore deadline error, retry every 4 seconds until it gets resolved.
const commitBatch = async () => {
  try {
    await makeCommitBatch();
  } catch (err) {
    console.log({ err });
    if (isFirestoreDeadlineError(err)) {
      const theInterval = setInterval(async () => {
        try {
          await makeCommitBatch();
          clearInterval(theInterval);
        } catch (err) {
          console.log({ err });
          if (!isFirestoreDeadlineError(err)) {
            clearInterval(theInterval);
            throw err;
          }
        }
      }, 4000);
    }
  }
};

// If the batch write exceeds 499 possible writes, open a new batch object and reset the counter.
const checkRestartBatchWriteCounts = () => {
  writeCounts += 1;
  if (writeCounts >= MAX_TRANSACTION_WRITES) {
    batchIndex++;
    batchArray.push(db.batch());
    writeCounts = 0;
  }
};

const batchSet = (docRef, docData) => {
  batchArray[batchIndex].set(docRef, docData);
  checkRestartBatchWriteCounts();
};

const batchUpdate = (docRef, docData) => {
  batchArray[batchIndex].update(docRef, docData);
  checkRestartBatchWriteCounts();
};

const batchDelete = (docRef) => {
  batchArray[batchIndex].delete(docRef);
  checkRestartBatchWriteCounts();
};

module.exports = {
  admin,
  db,
  MAX_TRANSACTION_WRITES,
  checkRestartBatchWriteCounts,
  commitBatch,
  isFirestoreDeadlineError,
  batchSet,
  batchUpdate,
  batchDelete,
};
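
A hypothetical usage sketch for this module (the module path, collection name, and field are illustrative):

const { db, batchUpdate, commitBatch } = require("./firestoreBatch"); // assumed module path

const docs = await db.collection("cities").get();
docs.forEach((doc) => {
  batchUpdate(doc.ref, { isDeleted: false }); // batches roll over automatically at 499 writes
});
await commitBatch(); // commits every accumulated batch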
Front-end
// Firestore does not accept more than 500 writes in a transaction or batch write.
const MAX_TRANSACTION_WRITES = 499;

const isFirestoreDeadlineError = (err) => {
  return (
    err.message.includes("DEADLINE_EXCEEDED") ||
    err.message.includes("Received RST_STREAM")
  );
};

class Firebase {
  constructor(fireConfig, instanceName) {
    let app = fbApp;
    if (instanceName) {
      app = app.initializeApp(fireConfig, instanceName);
    } else {
      app.initializeApp(fireConfig);
    }
    this.name = app.name;
    this.db = app.firestore();
    this.firestore = app.firestore;
    // How many writes out of 500 so far.
    // I wrote the following functions to easily use batch writes without worrying about the 500 limit.
    this.writeCounts = 0;
    this.batch = this.db.batch();
    this.isCommitting = false;
  }

  async makeCommitBatch() {
    console.log("makeCommitBatch");
    if (!this.isCommitting) {
      this.isCommitting = true;
      await this.batch.commit();
      this.writeCounts = 0;
      this.batch = this.db.batch();
      this.isCommitting = false;
    } else {
      const batchWaitInterval = setInterval(async () => {
        if (!this.isCommitting) {
          this.isCommitting = true;
          await this.batch.commit();
          this.writeCounts = 0;
          this.batch = this.db.batch();
          this.isCommitting = false;
          clearInterval(batchWaitInterval);
        }
      }, 400);
    }
  }

  async commitBatch() {
    try {
      await this.makeCommitBatch();
    } catch (err) {
      console.log({ err });
      if (isFirestoreDeadlineError(err)) {
        const theInterval = setInterval(async () => {
          try {
            await this.makeCommitBatch();
            clearInterval(theInterval);
          } catch (err) {
            console.log({ err });
            if (!isFirestoreDeadlineError(err)) {
              clearInterval(theInterval);
              throw err;
            }
          }
        }, 4000);
      }
    }
  }

  async checkRestartBatchWriteCounts() {
    this.writeCounts += 1;
    if (this.writeCounts >= MAX_TRANSACTION_WRITES) {
      await this.commitBatch();
    }
  }

  async batchSet(docRef, docData) {
    if (!this.isCommitting) {
      this.batch.set(docRef, docData);
      await this.checkRestartBatchWriteCounts();
    } else {
      const batchWaitInterval = setInterval(async () => {
        if (!this.isCommitting) {
          this.batch.set(docRef, docData);
          await this.checkRestartBatchWriteCounts();
          clearInterval(batchWaitInterval);
        }
      }, 400);
    }
  }

  async batchUpdate(docRef, docData) {
    if (!this.isCommitting) {
      this.batch.update(docRef, docData);
      await this.checkRestartBatchWriteCounts();
    } else {
      const batchWaitInterval = setInterval(async () => {
        if (!this.isCommitting) {
          this.batch.update(docRef, docData);
          await this.checkRestartBatchWriteCounts();
          clearInterval(batchWaitInterval);
        }
      }, 400);
    }
  }

  async batchDelete(docRef) {
    if (!this.isCommitting) {
      this.batch.delete(docRef);
      await this.checkRestartBatchWriteCounts();
    } else {
      const batchWaitInterval = setInterval(async () => {
        if (!this.isCommitting) {
          this.batch.delete(docRef);
          await this.checkRestartBatchWriteCounts();
          clearInterval(batchWaitInterval);
        }
      }, 400);
    }
  }
}
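
A hypothetical usage sketch for the class above (fireConfig, docs, and the field are illustrative):

const fb = new Firebase(fireConfig);

for (const doc of docs) {
  await fb.batchSet(doc.ref, { field: "myNewValue" }); // auto-commits whenever 499 writes accumulate
}
await fb.commitBatch(); // flush whatever remains staged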
No citations or documentation; I invented this code myself, it worked for me, and it looks clean and simple to read and use. If someone likes it, they can use it too.
Better to write an autotest, because the code uses the private var _ops, which can change after a package upgrade. For example, in old versions it can be _mutations.
async function commitBatch(batch) {
  const MAX_OPERATIONS_PER_COMMIT = 500;
  while (batch._ops.length > MAX_OPERATIONS_PER_COMMIT) {
    const batchPart = admin.firestore().batch();
    batchPart._ops = batch._ops.splice(0, MAX_OPERATIONS_PER_COMMIT - 1);
    await batchPart.commit();
  }
  await batch.commit();
}
Usage:
const batch = admin.firestore().batch();
batch.delete(someRef);
batch.update(someRef);
...
await commitBatch(batch);
Simple solution: just fire twice?
My array is resultsFinal.
I fire the batch once with a limit of 490, and a second time up to the length of the array (resultsFinal.length).
It works fine for me :)
How do you check it?
Go to Firebase and delete your collection; Firebase says you have deleted XXX docs. Same as the length of your array? OK, so you are good to go.
async function quickstart(results) {
  // we get results as a parameter to access the data inside quickstart
  const resultsFinal = results;
  // console.log(resultsFinal.length);
  let batch = firestore.batch();
  // the Firestore limit is 500 requests per transaction/batch
  for (let i = 0; i < 490; i++) {
    const doc = firestore.collection('testMore490').doc();
    const object = resultsFinal[i];
    batch.set(doc, object);
  }
  await batch.commit();
  batch = firestore.batch();
  // second batch covers the remaining elements of the array
  for (let i = 490; i < resultsFinal.length; i++) {
    const objectPartTwo = resultsFinal[i];
    const doc = firestore.collection('testMore490').doc();
    batch.set(doc, objectPartTwo);
  }
  await batch.commit();
}

Relation between storage and database - Firebase

I'm facing yet another issue.
I'm using the Firebase db to store text and Firebase Storage to store files, and here comes my issue.
Q: How do I fetch the correct image from Storage when fetching a particular element from the database?
Here's my attempt:
const storageRef = firebase.storage().ref('companyImages/companyImage' + 123);
// ^^^^^^^^^^^^ I don't have access to the id yet :(
const task = storageRef.put(companyImage);
task.on('state_changed', () => {
  const percentage = (snap.bytesTransferred / snap.totalBytes) * 100;
  // ^^^^^^^^^^^^ not sure if I even need this
}, (err) => {
  console.log(err);
}, () => {
  firebase.database().ref('offers').push(values);
  // ^^^^^^^^^^^^^^^ now I could retrieve the id from it with .key, but it's too late
});
As you can see, first I upload the image, and when that succeeds I start uploading the data to the database.
Still, it doesn't work as it is supposed to. When uploading the image I have to name it with the correct id so I can retrieve it easily later, in components.
It may look a little complex, but I will appreciate any kind of help, any suggestion or hint.
Should I first upload the data to the DB and then the image to Storage?
You can generate the push ID before you upload the file. You can also just save the download URL of the returned snapshot at task.snapshot.downloadURL, so you don't have to retrieve the file from Storage using the storage ref.
const offerRef = firebase.database().ref('offers').push();
const storageRef = firebase.storage().ref(`companyImages/${offerRef.key}`);
const task = storageRef.put(companyImage);
task.on('state_changed', (snap) => {
  const percentage = (snap.bytesTransferred / snap.totalBytes) * 100;
}, (error) => {
  console.log(error);
}, () => {
  offerRef.set(values);
});
I would suggest using .getDownloadURL(). Then push all your uploaded files' download URLs into an object or array and store that in your database. In the future you can access this object or array from, let's say, user/ProfilePhotos, and in your app-level code you can just use the download URL as the uri inside an image tag!
In this example I am using React Native. I upload multiple photos, save the download URL of each in an array, then set the array in Firebase under the user's account.
export const userVehiclePhotoUploadRequest = (photos, user, year) => dispatch => {
  console.log('Inside vehiclePhotoUpload Actions', photos, user)
  let referenceToUploadedPhotos = [];
  return new Promise((resolve, reject) => {
    photos.map(ele => {
      let mime = 'application/octet-stream'
      let uri = ele.uri
      let uploadUri = Platform.OS === 'ios' ? uri.replace('file://', '') : uri
      let sessionId = new Date().getTime()
      let uploadBlob = null
      let imageRef = firebase.storage().ref('vehicleImages/' + `${user.account.uid}`).child(`${sessionId}`)
      fs.readFile(uploadUri, 'base64')
        .then((data) => {
          return Blob.build(data, { type: `${mime};BASE64` })
        })
        .then((blob) => {
          uploadBlob = blob
          return imageRef.put(blob, { contentType: mime })
        })
        .then(() => {
          uploadBlob.close()
          return imageRef.getDownloadURL()
        })
        .then((url) => {
          referenceToUploadedPhotos.push(url)
          console.log('ARRAY OF URLS WHILE PUSHING', referenceToUploadedPhotos)
          resolve(url)
        })
        .catch((error) => {
          reject(error)
        })
    })
  }).then(() => {
    // I did this to not go home until the photos are done uploading.
    let vehicles;
    firebase.database().ref('users/' + user.account.uid + `/allVehicles/allVehiclesArray`).limitToFirst(1).once('value').then(function (snapshot) {
      // ******** This method is straight from their docs ********
      // ******** It returns whatever is found at the path xxxxx/users/user.uid ********
      vehicles = snapshot.val();
    }).then(() => {
      console.log('ARRAY OF URLS BEFORE SETTING', referenceToUploadedPhotos)
      // let lastVehicle = vehicles.length - 1;
      firebase.database().ref('users/' + user.account.uid + `/allVehicles/allVehiclesArray/` + `${Object.keys(vehicles)[0]}` + `/photosReference`).set({
        referenceToUploadedPhotos
      }).then(() => {
        dispatch(loginRequest(user.account))
      })
    })
  })
};
And then in your code, let's say inside a map over the user's information...
{ ele.photosReference !== undefined ? dynamicAvatar = { uri: `${ele.photosReference.referenceToUploadedPhotos[0]}` } : undefined }

vscode.commands.executeCommand was not working

I'm writing a VS Code extension to help migrate React.createClass to class extends React.Component. The problem is that I could not get vscode.commands.executeCommand('vscode.executeFormatDocumentProvider', ...) to work.
Note that the code below is plain JavaScript, not TypeScript.
function activate(context) {
  context.subscriptions.push(vscode.commands.registerCommand('migrate-to-react-es6-class', () => {
    const editor = vscode.window.activeTextEditor
    const document = editor.document
    try {
      const originalCode = document.getText()
      const modifiedCode = 'do something and return new code'
      if (originalCode === modifiedCode) {
        vscode.window.showInformationMessage('Nothing is to be migrated.')
      } else {
        editor.edit(edit => {
          const editingRange = document.validateRange(new vscode.Range(0, 0, Number.MAX_SAFE_INTEGER, Number.MAX_SAFE_INTEGER))
          edit.replace(editingRange, modifiedCode)
        })
        if (document.isUntitled === false) {
          vscode.commands.executeCommand('vscode.executeFormatDocumentProvider', document.uri, { insertSpaces: true, tabSize: 2 })
        }
      }
    } catch (error) {
      vscode.window.showErrorMessage(error.message)
      console.error(error)
    }
  }))
}
After 3.25 years you've probably figured this out, but for the record, I assume you hung a .then() on the editor.edit() and moved the executeCommand inside the then(), right?
editor.edit(edit => {
  const editingRange = ...
  edit.replace(editingRange, modifiedCode)
}).then(editWorked => {
  if (editWorked && !document.isUntitled) {
    vscode.commands.executeCommand('vscode.executeFormatDocumentProvider', ...)
  }
})
You must apply the returned edits:
private async formatDocument(): Promise<void> {
  const docUri = this.textEditor.document.uri;
  const textEdits = (await vscode.commands.executeCommand(
    'vscode.executeFormatDocumentProvider',
    docUri,
  )) as vscode.TextEdit[];
  const edit = new vscode.WorkspaceEdit();
  for (const textEdit of textEdits) {
    edit.replace(docUri, textEdit.range, textEdit.newText);
  }
  await vscode.workspace.applyEdit(edit);
}
I implemented it directly in the extension.ts:
commands.registerCommand(constants.commands.formatDocument, async () => {
  const docUri = editor?.document.uri;
  const textEdits: TextEdit[] | undefined = await commands.executeCommand(
    'vscode.executeFormatDocumentProvider',
    docUri
  );
  if (textEdits && docUri) {
    const edit = new WorkspaceEdit();
    for (const textEdit of textEdits) {
      edit.replace(docUri, textEdit.range, textEdit.newText);
    }
    await workspace.applyEdit(edit);
  }
});
constants.commands.formatDocument gets its value after parsing my package.json.
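
For context, a hypothetical sketch of the matching package.json contribution (the command ID and title are illustrative, not taken from the answer):

{
  "contributes": {
    "commands": [
      {
        "command": "myExtension.formatDocument",
        "title": "Format Document via Provider"
      }
    ]
  }
}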
