Firebase Storage behaves differently with a hardcoded URL - javascript

I don't know what is happening with my code, but whenever I use this hardcoded version it works:
const downloadUrlThumb = async (urlPathThumbs) => {
  const gsRef = projectStorage.refFromURL('gs://myapps.appspot.com/courses/m9APF8TEUnfS6IQew591sl4kphH2/thumbs/business-3560917_640_200x200.jpg') // hardcode mode
  // const gsRef = projectStorage.refFromURL(urlPathThumbs)
  await gsRef.getDownloadURL().then(urlDownload => {
    console.log('URL THUMB IS: ', urlDownload);
    url.value = urlDownload
  }).catch(err => {
    console.log(err.message);
  })
}
But when I switch to urlPathThumbs, it fails with a 404-style error. The error reads:
Firebase Storage: Object
'courses/m9APF8TEUnfS6IQew591sl4kphH2/thumbs/Rectangle 68
(1)_200x200.png' does not exist. (storage/object-not-found)
However, both follow the same pattern. This is how the urlPathThumbs variable is created in the preceding snippet:
var filename = file.name.replace(/(\.[\w\d_-]+)$/i, '_200x200$1')
console.log('filename ', filename);
let location = 'gs://myapps.appspot.com/courses/'+user.value.uid+'/thumbs/'+filename
console.log('full location: ', location); // --> This will print full location: gs://myapps.appspot.com/courses/m9APF8TEUnfS6IQew591sl4kphH2/thumbs/Rectangle 68 (1)_200x200.png
await downloadUrlThumb(location)
The file was uploaded successfully, and I can see it in the Firebase Storage location.
Please can somebody help me?

Since refFromURL expects a URL string, spaces are not allowed. You will need to URL-encode them, e.g.:
console.log(encodeURI("gs://myapps.appspot.com/courses/m9APF8TEUnfS6IQew591sl4kphH2/thumbs/Rectangle 68 (1)_200x200.png"))
refFromURL(url: string): Reference
Returns a reference for the given absolute URL.
Parameters: url: string. A URL in the form:
1) a gs:// URL, for example gs://bucket/files/image.png
2) a download URL taken from object metadata (see firebase.storage.FullMetadata.downloadURLs)
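Applied to the question's own snippet, a minimal sketch (reusing the variables from the question) would encode the constructed location before handing it to the download helper:
// Sketch: encode spaces and other unsafe characters in the gs:// URL.
// user, filename and downloadUrlThumb are the question's own variables.
let location = 'gs://myapps.appspot.com/courses/' + user.value.uid + '/thumbs/' + filename
await downloadUrlThumb(encodeURI(location))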

Related

Electron - write file before open save dialog

I'm using Electron to develop an app. After some encryption operations are done, I need to show a dialog for the user to save the file. The filename I want to give the file is a random hash, but I have had no success with that either. I'm trying with this code, but the file will not be saved. How can I fix this?
const downloadPath = app.getPath('downloads')
ipcMain.on('encryptFiles', (event, data) => {
  let output = [];
  const password = data.password;
  data.files.forEach((file) => {
    const buffer = fs.readFileSync(file.path);
    const dataURI = dauria.getBase64DataURI(buffer, file.type);
    const encrypted = CryptoJS.AES.encrypt(dataURI, password).toString();
    output.push(encrypted);
  })
  const filename = hash.createHash('md5').toString('hex');
  console.log(filename)
  const response = output.join(' :: ');
  dialog.showSaveDialog({ title: 'Save encrypted file', defaultPath: downloadPath }, () => {
    fs.writeFile(`${filename}.mfs`, response, (err) => console.log(err))
  })
})
The problem you're experiencing results from the asynchronous nature of Electron's UI functions: they do not take callback functions, but return promises instead. Thus, you do not have to pass in a callback function, but rather handle the promise's resolution. Note that this only applies to Electron >= version 6. If you run an older version of Electron, your code would be correct, but then you should really update to a newer version (Electron v6 was released well over a year ago).
Adapting your code like below can be a starting point to solve your problem. However, since you do not state how you generate the hash (where does hash.createHash come from? did you forget to declare/import hash? did you forget to pass any message string? are you using hash as an alias for NodeJS' crypto module?), it is (at this time) impossible to debug why you do not get any output from console.log(filename) (I assume you mean this by "in the code, the random filename will not be created"). Once you provide more details on this problem, I'd be happy to update this answer accordingly.
As for the default filename: as per the Electron documentation, you can pass a file path into dialog.showSaveDialog() to provide the user with a default filename.
The file extension you're using should also be passed into the save dialog. Passing this extension as a filter into the dialog will prevent users from selecting any other file type, which is ultimately what you're currently doing by appending it to the filename.
Also, you could utilise CryptoJS for the filename generation: given some arbitrary string, which could really be random bytes, you could do: filename = CryptoJS.MD5('some text here') + '.mfs'; However, remember to choose the input string wisely. MD5 has been broken and should thus no longer be used to store secrets; using any known information which is crucial for the encryption of the files you're storing (such as data.password) is inherently insecure. There are some good examples on how to create random strings in JavaScript around the internet, along with this answer here on SO.
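For example, a minimal sketch (my suggestion, not part of the original answer) that sidesteps MD5 entirely by deriving the random name from Node's built-in crypto module:
const crypto = require('crypto');
// 16 random bytes rendered as 32 hex characters; no secret material involved
const filename = crypto.randomBytes(16).toString('hex') + '.mfs';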
Taking all these issues into account, one might end up with the following code:
const downloadPath = app.getPath('downloads'),
  path = require('path');
ipcMain.on('encryptFiles', (event, data) => {
  let output = [];
  const password = data.password;
  data.files.forEach((file) => {
    const buffer = fs.readFileSync(file.path);
    const dataURI = dauria.getBase64DataURI(buffer, file.type);
    const encrypted = CryptoJS.AES.encrypt(dataURI, password).toString();
    output.push(encrypted);
  })
  // not working:
  // const filename = hash.createHash('md5').toString('hex') + '.mfs';
  // alternative requiring more research on your end
  const filename = CryptoJS.MD5('replace me with some random bytes') + '.mfs';
  console.log(filename);
  const response = output.join(' :: ');
  dialog.showSaveDialog(
    {
      title: 'Save encrypted file',
      defaultPath: path.format({ dir: downloadPath, base: filename }), // construct a proper path
      filters: [{ name: 'Encrypted File (*.mfs)', extensions: ['mfs'] }] // filter the possible files
    }
  ).then((result) => {
    if (result.canceled) return; // discard the result altogether; user has clicked "cancel"
    else {
      var filePath = result.filePath;
      if (!filePath.endsWith('.mfs')) {
        // This is an additional safety check which should not actually trigger.
        // However, generally appending a file extension to a filename is not a
        // good idea, as they would be (possibly) doubled without this check.
        filePath += '.mfs';
      }
      fs.writeFile(filePath, response, (err) => console.log(err))
    }
  }).catch((err) => {
    console.log(err);
  });
})

Sign PDF in a modern browser natively?

What I'm trying to achieve
Sign a PDF in the browser using the client's certificate store or a smart card
What I did so far
For accessing the local cert store I use FortifyApp.
The PDF is pre-signed on the server using iText(Sharp), then sent to the client via Ajax.
Relevant code:
using (var fileStream = new MemoryStream())
{
    using (var stamper = PdfStamper.CreateSignature(reader, fileStream, '\0', null, true))
    {
        var signatureAppearance = stamper.SignatureAppearance;
        signatureAppearance.SetVisibleSignature(new iTextSharp.text.Rectangle(15, 15, 15, 15), 1, "A");
        IExternalSignatureContainer external =
            new ExternalBlankSignatureContainer(PdfName.ADOBE_PPKLITE, PdfName.ADBE_PKCS7_DETACHED);
        signatureAppearance.Reason = "AsdAsd";
        signatureAppearance.Layer2Text = "Asd";
        signatureAppearance.SignatureRenderingMode =
            iTextSharp.text.pdf.PdfSignatureAppearance.RenderingMode.DESCRIPTION;
        MakeSignature.SignExternalContainer(signatureAppearance, external, 512);
        return fileStream.ToArray();
    }
}
Following this, I managed to manipulate the pdf, extract byteRange, insert signature, etc. Relevant code:
let pdfBuffer = Buffer.from(new Uint8Array(pdf));
const byteRangeString = `/ByteRange `;
const byteRangePos = pdfBuffer.indexOf(byteRangeString);
if (byteRangePos === -1)
  throw new Error('asd');
let len = pdfBuffer.slice(byteRangePos).indexOf(`]`) + 1;
// Calculate the actual ByteRange that needs to replace the placeholder.
const byteRangeEnd = byteRangePos + len;
const contentsTagPos = pdfBuffer.indexOf('/Contents ', byteRangeEnd);
const placeholderPos = pdfBuffer.indexOf('<', contentsTagPos);
const placeholderEnd = pdfBuffer.indexOf('>', placeholderPos);
const placeholderLengthWithBrackets = placeholderEnd + 1 - placeholderPos;
const placeholderLength = placeholderLengthWithBrackets - 2;
const byteRange = [0, 0, 0, 0];
byteRange[1] = placeholderPos;
byteRange[2] = byteRange[1] + placeholderLengthWithBrackets;
byteRange[3] = pdfBuffer.length - byteRange[2];
let actualByteRange = `/ByteRange [${byteRange.join(' ')}]`;
actualByteRange += ' '.repeat(len - actualByteRange.length);
// Replace the /ByteRange placeholder with the actual ByteRange
pdfBuffer = Buffer.concat([pdfBuffer.slice(0, byteRangePos) as any, Buffer.from(actualByteRange), pdfBuffer.slice(byteRangeEnd)]);
// Remove the placeholder signature
pdfBuffer = Buffer.concat([pdfBuffer.slice(0, byteRange[1]) as any, pdfBuffer.slice(byteRange[2], byteRange[2] + byteRange[3])]);
and
// stringSignature comes from the signature creation below, and is 'hex' encoded
// Pad the signature with zeroes so that it is the same length as the placeholder
stringSignature += Buffer
  .from(String.fromCharCode(0).repeat((placeholderLength / 2) - len))
  .toString('hex');
// Place it in the document.
pdfBuffer = Buffer.concat([
  pdfBuffer.slice(0, byteRange[1]) as any,
  Buffer.from(`<${stringSignature}>`),
  pdfBuffer.slice(byteRange[1])
]);
The problem
This uses forge and an uploaded p12 file. This would probably work if I could translate the privateKey imported from Fortify (it is a CryptoKey instance, and forge throws an error: TypeError: signer.key.sign is not a function).
p7.addCertificate(certificate); // certificate is the Certificate from Fortify CertificateStore.getItem(certId)
p7.addSigner({
  key: privateKey, // this is the CryptoKey from Fortify
  certificate: null /*certificate*/, // also tried certificate from Fortify
  digestAlgorithm: forge.pki.oids.sha256,
  authenticatedAttributes: [
    {
      type: forge.pki.oids.contentType,
      value: forge.pki.oids.data,
    }, {
      type: forge.pki.oids.messageDigest,
      // value will be auto-populated at signing time
    }, {
      type: forge.pki.oids.signingTime,
      // value can also be auto-populated at signing time
      // We may also support passing this as an option to sign().
      // Would be useful to match the creation time of the document for example.
      value: new Date(),
    },
  ],
});
// Sign in detached mode.
p7.sign({ detached: true });
I also tried pkijs for creating the signature (throws a similar error: Signing error: TypeError: Failed to execute 'sign' on 'SubtleCrypto': parameter 2 is not of type 'CryptoKey'.)
let cmsSigned = new pki.SignedData({
  encapContentInfo: new pki.EncapsulatedContentInfo({
    eContentType: "1.2.840.113549.1.7.1", // "data" content type
    eContent: new asn.OctetString({ valueHex: pdfBuffer })
  }),
  signerInfos: [
    new pki.SignerInfo({
      sid: new pki.IssuerAndSerialNumber({
        issuer: certificate.issuer,
        serialNumber: certificate.serialNumber
      })
    })
  ],
  certificates: [certificate]
});
let signature = await cmsSigned.sign(privateKey, 0, 'SHA-256');
What "works" is, if I create the signature using the code below:
let signature = await provider.subtle.sign(alg, privateKey, new Uint8Array(pdfBuffer).buffer);
"works", because it creates an invalid signature:
Error during signature verification.
ASN.1 parsing error:
Error encountered while BER decoding:
I tried multiple certificates, no luck.
Questions
Can I achieve my goal without having to manually upload a p12/pfx file, is it even possible?
Is the server-side implementation of the deferred signature correct, do I need something else?
Is the pdf manipulation in javascript correct?
Can I transform the native CryptoKey to forge or pkijs?
What is wrong with the last signature? At first glance it seems right (at least the format):
<>>>/ContactInfo()/M(D:20200619143454+02'00')/Filter/Adobe.PPKLite/SubFilter/adbe.pkcs7.detached/ByteRange [0 180165 181191 1492] /Contents <72eb2731c9de4a5ccc94f1e1f2d9b07be0c6eed8144cb73f3dfe2764595dcc8f58b8a55f5026618fd9c79146ea93afdafc00b617c6e70de553600e4520f290bef70c499ea91862bb3acc651b6a7b162c984987f05ec59db5b032af0127a1224cad82e3be38ae74dd110ef5f870f0a0a92a8fba295009f267508c372db680b3d89d3157d3b218f33e7bf30c500d599b977c956e6a6e4b02a0bbd4a86737378b421ae2af0a4a3c03584eaf076c1cdb56d372617da06729ef364605ecd98b6b32d3bb792b4541887b59b686b41db3fc32eb4c651060bb02e2babeb30e6545834b2935993f6ee9edcc8f99fee8ad6edd2958c780177df6071fdc75208f76bbbcc21a00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000>>>
Thanks:
F
Original answer:
So I figured it out.
Can I achieve my goal without having to manually upload a p12/pfx
file, is it even possible?
Yes, it is. (See below on what needs to be changed.)
Is the server-side implementation of the deferred signature correct, do I need something else?
Yes, the code above is fine.
Is the pdf manipulation in javascript correct?
Also fine.
Can I transform the native CryptoKey to forge or pkijs?
Yes, see below.
What is wrong with the last signature?
@mkl answered it in a comment, thank you. (In short: the output of a raw subtle.sign call is just the signature bytes, not a CMS/PKCS#7 container, which is why the verifier fails with an ASN.1/BER parsing error.)
FortifyApp has a CMS demo now. Although it didn't work with the version I was using, it works with version 1.3.4.
So I went with the pki.js implementation. The code changes needed for the signing to be successful are the following:
1. Export the certificate:
const cryptoCert = await provider.certStorage.getItem(selectedCertificateId);
const certRawData = await provider.certStorage.exportCert('raw', cryptoCert);
const pkiCert = new pki.Certificate({
  schema: asn.fromBER(certRawData).result,
});
return pkiCert;
2. Sign in detached mode:
let cmsSigned = new pki.SignedData({
  version: 1,
  encapContentInfo: new pki.EncapsulatedContentInfo({
    eContentType: '1.2.840.113549.1.7.1',
  }),
  signerInfos: [
    new pki.SignerInfo({
      version: 1,
      sid: new pki.IssuerAndSerialNumber({
        issuer: certificate.issuer,
        serialNumber: certificate.serialNumber
      })
    })
  ],
  certificates: [certificate]
});
let signature = await cmsSigned.sign(privateKey, 0, 'SHA-256', pdfBuffer);
const cms = new pki.ContentInfo({
  contentType: '1.2.840.113549.1.7.2',
  content: cmsSigned.toSchema(true),
});
const result = cms.toSchema().toBER(false);
return result;
3. Convert the signature to a 'HEX' string:
let stringSignature = Array.prototype.map.call(new Uint8Array(signature), x => (`00${x.toString(16)}`).slice(-2)).join('');
let len = signature.byteLength;
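Since the surrounding snippets already use Buffer, an equivalent one-liner (my shorthand, assuming the same signature ArrayBuffer) would be:
// Equivalent hex conversion via Buffer (assumes the Buffer polyfill the
// question's other snippets already rely on)
let stringSignature = Buffer.from(signature).toString('hex');
let len = signature.byteLength;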
Update (summary on the js side of things):
1. Download the pre-signed pdf (+ byteRange; this can be extracted with iText, so you can apply multiple signatures)
2. Prepare the signature (see the first part of point 3. in the question)
3. Get the private key:
const provider = await this.ws.getCrypto(selectedProviderId); // this.ws is a WebcryptoSocket
provider.sign = provider.subtle.sign.bind(provider.subtle);
setEngine(
  'newEngine',
  provider,
  new CryptoEngine({
    name: '',
    crypto: provider,
    subtle: provider.subtle,
  })
);
const key = await this.getCertificateKey('private', provider, selectedCertificateId); // can be null
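getCertificateKey itself is not shown in the answer; a hypothetical sketch, modelled on Fortify's example helpers (the "type-provider-id" index format is an assumption), could look like this:
// Hypothetical helper: find the CryptoKey whose storage index matches the
// certificate. Fortify key indexes are assumed to look like "<type>-<x>-<id>".
async function getCertificateKey(type, provider, certId) {
  const keyIds = await provider.keyStorage.keys();
  for (const keyId of keyIds) {
    const parts = keyId.split('-');
    if (parts[0] === type && parts[2] === certId.split('-')[2]) {
      return provider.keyStorage.getItem(keyId); // a CryptoKey, or null
    }
  }
  return null; // no matching key stored
}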
4. See the original answer, points 1. and 2. Between these I also have a hack:
let logout = await provider.logout();
let loggedIn = await provider.isLoggedIn();
if (!loggedIn) {
  let login = await provider.login();
}
5. Add the signature to the pdf. Use original answer point 3., then the second part of point 3 in the question.

Firebase Storage - Get Download URL from Public Image

My flow is: an image is uploaded from the frontend -> I get a download URL and attach it to the user's document via a Cloud Function.
This is the current code:
export const newProfilePictureUploaded = functions.region('europe-west1').storage.object().onFinalize(async (object) => {
  if (!object.contentType?.startsWith('image/')) return;
  return admin.storage().bucket(object.bucket).file(object.name || '').makePublic().then(async data => {
    const filename = object.name as string;
    const splitName = filename.split('/');
    const folderName = splitName[0];
    if (folderName === 'profilePics') {
      const uid = splitName[1].substring(0, splitName[1].indexOf('.'));
      console.log(uid);
      await updateUserAvatar(uid, 'https://storage.googleapis.com/' + object.bucket + '/' + object.name)
      return;
    }
    console.log('No operation for folder ' + folderName);
    return
  })
})
Now the thing is that the image is displayed at the URL, but even when the image in the Storage bucket is replaced, the URL still serves the old image, although the old one should be gone and the new image should be at that path.
Why don't I use the .getSignedUrl() method? The URL I get back is too long, especially for storing it in the RTDB, and the URL can be public anyway, so that method is really suboptimal for me.

Azure blob storage sas token generation using js

I have this function running in an Azure Function to get a SAS token for a browser application to upload to Azure Blob Storage:
var azure = require('azure-storage');
module.exports = function(context, req) {
  if (req.body.container) {
    // The following values can be used for permissions:
    // "a" (Add), "r" (Read), "w" (Write), "d" (Delete), "l" (List)
    // Concatenate multiple permissions, such as "rwa" = Read, Write, Add
    context.res = generateSasToken(
      context,
      req.body.container,
      req.body.blobName,
      req.body.permissions
    );
  } else {
    context.res = {
      status: 400,
      body: "Specify a value for 'container'"
    };
  }
  context.done(null, context);
};
function generateSasToken(context, container, blobName, permissions) {
  var connString = process.env.AzureWebJobsStorage;
  var blobService = azure.createBlobService(connString);
  // Create a SAS token that expires in an hour
  // Set start time to five minutes ago to avoid clock skew.
  var startDate = new Date();
  startDate.setMinutes(startDate.getMinutes() - 5);
  var expiryDate = new Date(startDate);
  expiryDate.setMinutes(startDate.getMinutes() + 60);
  permissions = azure.BlobUtilities.SharedAccessPermissions.READ +
    azure.BlobUtilities.SharedAccessPermissions.WRITE +
    azure.BlobUtilities.SharedAccessPermissions.DELETE +
    azure.BlobUtilities.SharedAccessPermissions.LIST;
  var sharedAccessPolicy = {
    AccessPolicy: {
      Permissions: permissions,
      Start: startDate,
      Expiry: expiryDate
    }
  };
  var sasToken = blobService.generateSharedAccessSignature(
    container,
    blobName,
    sharedAccessPolicy
  );
  context.log(sasToken);
  return {
    token: sasToken,
    uri: blobService.getUrl(container, blobName, sasToken, true)
  };
}
I then call this URL from the client and try to upload with this code:
const search = new URLSearchParams(`?${token}`);
const sig = encodeURIComponent(search.get('sig'));
const qs = `?sv=${search.get('sv')}&ss=b&srt=sco&sp=rwdlac&se=${search.get('sv')}&st=${search.get('st')}&spr=https&sig=${sig}`;
return `${url}/${containerName}/${filename}${qs}`;
Which generates a url like this:
https://mystorage.blob.core.windows.net/mycontainer/latest.png?sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2018-03-28&st=2019-01-30T19:11:10Z&spr=https&sig=g0sceq3EkiAQTvyaZ07C+C4SZQz9FaGTV4Zwq4HkAnc=
Which returns this error:
403 (Server failed to authenticate the request. Make sure the value of
Authorization header is formed correctly including the signature.)
If I generate the sas token from the azure portal it works, so the generated url looks like this:
https://mystorage.blob.core.windows.net/mycontainer/latest.png?sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2019-01-31T03:01:43Z&st=2019-01-30T19:01:43Z&spr=https&sig=ayE4gt%2FDfDzjv5DjMaD7AS%2F176Bi4Q6DWJNlnDzl%2FGc%3D
but my url looks like this:
https://mystorage.blob.core.windows.net/mycontainer/latest.png?sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2019-01-31T03:34:21Z&st=2019-01-30T19:34:21Z&spr=https&sig=Dx8Vm4XPnD1rn9uyzIAXZEfcdbWb0HjmOq%2BIq42Q%2FOM%3D
I have no idea what to do to get this working.
Your Azure Function code is correct, and
var sasToken = blobService.generateSharedAccessSignature(
  container,
  blobName,
  sharedAccessPolicy
);
is exactly the sasToken you need to upload the blob. There is no need to process the token again (actually, that mishandles it) as you have done in the 2nd code snippet.
It's expected that the SAS token from the Azure portal (an Account SAS) differs from the one generated in your code (a Service SAS). Have a look at the doc.
To conclude,
1. Make sure the connection string belongs to the storage account you want to connect to. You could avoid trouble and directly replace var connString = process.env.AzureWebJobsStorage; with var connString = "connectionStringGotFromPortal"; (see the sketch after this list for the shape of that string).
2. If 1 is confirmed, your Azure Function code is correct and returns the token as expected:
{
  token: sasToken,
  uri: blobService.getUrl(container, blobName, sasToken, true)
};
Based on the 2nd code snippet you provide, you only need
return `${url}/${containerName}/${filename}?${token}`;
if the token is identical to what the function returns.
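For reference, a connection string copied from the portal's Access keys blade has this general shape (all values below are placeholders):
// Placeholder values only; copy the real string from the Azure portal
var connString = "DefaultEndpointsProtocol=https;AccountName=mystorage;AccountKey=<base64-account-key>;EndpointSuffix=core.windows.net";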
The problem is that in your server-side code you're creating a Service SAS, then taking only the signature portion of the token (sig) and building an Account SAS on the client.
Since the parameters used to create the token have now changed (the original one didn't have parameters like ss, srt etc., but when you create your own URL you insert them), you get a 403 error when you use the modified SAS URL. This happens because the server recomputes the signature from the URL parameters and compares it with the signature passed in the URL. Since the two signatures won't match, you get the 403 error.
Since you're returning the SAS URL of the blob, there's no need to build the URL on the client. You can simply take the uri returned from your API layer and use that to upload.
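For instance, a minimal sketch (not from the answer) that uploads straight to the returned uri with fetch; the x-ms-blob-type header is required when putting a block blob via REST:
// 'response' is the JSON body returned by the Azure Function above
const res = await fetch(response.uri, {
  method: 'PUT',
  headers: { 'x-ms-blob-type': 'BlockBlob' },
  body: file, // a File or Blob from an <input type="file">
});
if (!res.ok) throw new Error(`Upload failed: ${res.status}`);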
As Jerry Liu's answer explained, your Azure Function generates the correct token and already gives you the correct uri to use, which includes your blob name and token.
On the client side you can also use azure-sdk-for-js:
// Assumes the v10-era @azure/storage-blob package, e.g.:
// import { StorageURL, AnonymousCredential, BlockBlobURL, BlobURL, Aborter } from '@azure/storage-blob';

// This is the response from your api with token and uri
const uri = response.uri;
const pipeline = StorageURL.newPipeline(new AnonymousCredential());
// Your uri already includes the full blob url with SAS signature
const blockBlobURL = BlockBlobURL.fromBlobURL(new BlobURL(uri, pipeline));
const uploadBlobResponse = await blockBlobURL.upload(
  Aborter.none,
  file,
  file.size,
  { blobHTTPHeaders: { blobContentType: `${mime}; charset=utf-8` } }
);

How to check whether it is a blob url using javascript

I have a situation where I am converting blob URLs to base64 data URLs, but I want to do this only if the URL is a blob URL.
So, is there any way to check whether it is a valid blob URL?
My blob URL: blob:http://192.168.0.136/85017e84-0f2d-4791-b563-240794abdcbf
You are facing an x-y problem.
You absolutely don't need to check if your blobURI is a valid one, because you absolutely don't need to use the blobURI in order to create a base64 version of the Blob it's pointing to.
Going through the blob URL would mean fetching the Blob, i.e. creating a copy of its data in memory for no good reason.
What you need is a way to retrieve this Blob.
There is unfortunately no official way to do so with the web APIs, but it's not that hard to make it ourselves:
We simply have to wrap the default URL.createObjectURL method so that it maps the passed Blob in a dictionary, using the blob URI as the key:
(() => {
  // overrides URL methods to be able to retrieve the original blobs later on
  const old_create = URL.createObjectURL;
  const old_revoke = URL.revokeObjectURL;
  Object.defineProperty(URL, 'createObjectURL', {
    get: () => storeAndCreate
  });
  Object.defineProperty(URL, 'revokeObjectURL', {
    get: () => forgetAndRevoke
  });
  Object.defineProperty(URL, 'getBlobFromObjectURL', {
    get: () => getBlob
  });
  const dict = {};

  function storeAndCreate(blob) {
    var url = old_create(blob); // let it throw if it has to
    dict[url] = blob;
    return url;
  }

  function forgetAndRevoke(url) {
    old_revoke(url);
    // some checks just because it's what the question title asks for, and to avoid deleting bad things
    try {
      if (new URL(url).protocol === 'blob:')
        delete dict[url];
    } catch (e) {} // avoided deleting some bad thing ;)
  }

  function getBlob(url) {
    return dict[url];
  }
})();
// a few example uses
const blob = new Blob(['foo bar']);
// first normal use, everything is alive
const url = URL.createObjectURL(blob);
const retrieved = URL.getBlobFromObjectURL(url);
console.log('retrieved: ', retrieved);
console.log('is same object: ', retrieved === blob);
// a revoked URL, of no use anymore
const revoked = URL.createObjectURL(blob);
URL.revokeObjectURL(revoked);
console.log('revoked: ', URL.getBlobFromObjectURL(revoked));
// an https:// URL
console.log('https: ', URL.getBlobFromObjectURL(location.href));
PS: for those concerned about the case where a Blob might be closed (e.g. a user-provided file has been deleted from disk), simply listen for the onerror event of the FileReader you'd use in the next step.
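A minimal sketch of that next step (my illustration, not part of the answer): convert the retrieved Blob to a base64 data URL, with onerror catching closed blobs:
function blobToDataURL(blob) {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => resolve(reader.result); // a base64 data: URL
    reader.onerror = () => reject(reader.error); // fires e.g. for a closed Blob
    reader.readAsDataURL(blob);
  });
}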
You could do something like:
var url = 'blob:http://192.168.0.136/85017e84-0f2d-4791-b563-240794abdcbf';
if (url.search('blob:') === -1) {
  // do something (the string does not contain 'blob:')
}
You may also use a regular-expression-based check with url.match('url expression').
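A stricter alternative (my suggestion, not from the answer) checks the scheme prefix explicitly, so that 'blob:' appearing elsewhere in the string cannot cause a false positive:
// true only when the string actually starts with the blob: scheme
function isBlobUrl(url) {
  return typeof url === 'string' && url.startsWith('blob:');
}
console.log(isBlobUrl('blob:http://192.168.0.136/85017e84-0f2d-4791-b563-240794abdcbf')); // true
console.log(isBlobUrl('http://example.com/?q=blob:foo')); // false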
