Google Extended Access
Google Extended Access provides access to news articles beyond the publisher paywall if the user signs in via Google OAuth. The number of free accesses is controlled by metering. More info.
The user is redirected to the publisher's article URL, where the publisher determines whether the visit is a valid one from a Google Showcase panel. One of the validation steps is verifying the signature sent along in the URL parameters (gaa_sig).
The signature is generated from the base URL and the other three GAA parameters, so you can use it to verify that this is a valid visit from a Showcase panel.
Verification Steps
Transform the gaa_sig value from its "web safe" (base64url) format by replacing all '-' characters with '+' and all '_' characters with '/'.
Base64 decode the transformed value from step 1.
Remove the gaa_sig URL parameter from the URL to form the data to verify.
Loop through the latest JSON web keys (JWKs):
For development: https://play.google.com/newsstand/api/v3/articleaccess/publicsigningkey/dev
Verify the signature value from step 2 against the data from step 3 using the keys from step 4.
Working Code in Node
const { subtle } = require('crypto').webcrypto;
const fetch = (...args) => import('node-fetch').then(({ default: fetch }) => fetch(...args));

const GOOGLE_JSON_WEB_KEYS = "https://play.google.com/newsstand/api/v3/articleaccess/publicsigningkey/dev";

async function getPlaySigningKeys() {
  const response = await fetch(GOOGLE_JSON_WEB_KEYS);
  return response.json();
}

async function verifyGaaUrlSignature(url, keys) {
  const urlObj = new URL(url);
  const params = new URLSearchParams(urlObj.search);
  const sigB64Str = params.get('gaa_sig');
  if (!sigB64Str) {
    return false;
  }
  // Convert from base64url ("web safe") to standard base64 and decode.
  const sigBuffer = Buffer.from(sigB64Str.replace(/-/g, '+').replace(/_/g, '/'), 'base64');
  // The signed data is the URL without the gaa_sig parameter.
  params.delete('gaa_sig');
  const data = new TextEncoder().encode(urlObj.origin + urlObj.pathname + '?' + params);

  async function verifySig(key) {
    const cryptoKey = await subtle.importKey(
      'jwk', key, { name: 'ECDSA', namedCurve: key.crv }, true, key.key_ops);
    return subtle.verify(
      { name: 'ECDSA', hash: 'SHA-256' }, cryptoKey, sigBuffer, data);
  }

  // The signature is valid if it verifies against any of the published keys.
  const results = await Promise.all(keys.map(verifySig));
  return results.includes(true);
}

async function run() {
  const playSigningKeys = await getPlaySigningKeys();
  const url = "https://www.bbc.com/?gaa_at=la&gaa_n=ATKjfPG-7F6PGpCXtPZFAfqigovblSKOl3G6jduKn8zWcjHMSu-a3wQ1ub-mKBl47rjP&gaa_ts=630be8f6&gaa_sig=ZGvbOCFg5J_zGAtd6R39YbEEYjcoarQ7AaAjQPsAae5jikZTjX57_Ja3vVyp8bUIcUbftI5dQdTP7gtwtIC3eQ%3D%3D";
  const isValidSig = await verifyGaaUrlSignature(url, playSigningKeys.keys);
  console.log('Valid Signature: ' + isValidSig);
}

if (require.main === module) {
  run();
}
Steps to produce the parameters and signature
Generate the GAA parameters for the required URL by appending it to the URL below. The request will be redirected to the original URL with the GAA parameters appended.
https://play.google.com/newsstand/api/v3/articleaccess?testurl=
Example:
https://play.google.com/newsstand/api/v3/articleaccess?testurl=https://bbc.com
Redirected to
https://www.bbc.com/?gaa_at=la&gaa_n=ATKjfPG-7F6PGpCXtPZFAfqigovblSKOl3G6jduKn8zWcjHMSu-a3wQ1ub-mKBl47rjP&gaa_ts=630be8f6&gaa_sig=ZGvbOCFg5J_zGAtd6R39YbEEYjcoarQ7AaAjQPsAae5jikZTjX57_Ja3vVyp8bUIcUbftI5dQdTP7gtwtIC3eQ%3D%3D
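To fetch such a test URL programmatically, a minimal Node sketch could look like the following. It assumes the articleaccess endpoint answers with an HTTP redirect whose Location header carries the GAA parameters, as in the example above; the function name getGaaUrl is just illustrative.

const fetch = (...args) => import('node-fetch').then(({ default: fetch }) => fetch(...args));

async function getGaaUrl(targetUrl) {
  const testUrl = 'https://play.google.com/newsstand/api/v3/articleaccess?testurl=' + targetUrl;
  // Don't follow the redirect automatically, so we can read the Location header ourselves.
  const response = await fetch(testUrl, { redirect: 'manual' });
  return response.headers.get('location');
}

getGaaUrl('https://bbc.com').then(url => console.log(url));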
What would be the working code in Python?
I tried looking into python-jose, but I couldn't figure out the equivalent of the Web Crypto API's SubtleCrypto.verify().
Related
I'm building a website where people log in to their Phantom wallet and then, by clicking a button, send a certain amount of our custom token to one wallet.
The code shown below works with SOL, and I would like to make it work with our custom SPL token. I have the mint address of the token, but I couldn't find any way to make it work. Could anyone help me?
async function transferSOL(toSend) {
  // Detecting and storing the Phantom wallet of the user (creator in this case)
  var provider = await getProvider();
  console.log("Public key of the emitter: ", provider.publicKey.toString());
  // Establishing connection
  var connection = new web3.Connection(
    "https://api.mainnet-beta.solana.com/"
  );
  // I have hardcoded my secondary wallet address here. You can take this address either from user input or your DB or wherever
  var receiverWallet = new web3.PublicKey("address of the wallet receiving the custom SPL Token");
  var transaction = new web3.Transaction().add(
    web3.SystemProgram.transfer({
      fromPubkey: provider.publicKey,
      toPubkey: receiverWallet,
      lamports: web3.LAMPORTS_PER_SOL * toSend // Remember 1 Lamport = 10^-9 SOL.
    }),
  );
  // Setting the variables for the transaction
  transaction.feePayer = provider.publicKey;
  let blockhashObj = await connection.getRecentBlockhash();
  transaction.recentBlockhash = blockhashObj.blockhash;
  // Request creator to sign the transaction (allow the transaction)
  let signed = await provider.signTransaction(transaction);
  // The signature is generated
  let signature = await connection.sendRawTransaction(signed.serialize());
  // Confirm whether the transaction went through or not
  console.log(await connection.confirmTransaction(signature));
  // Print the signature here
  console.log("Signature: ", signature);
}
I'd like to specify that people will use Phantom, and I can't have access to their private keys (which was required in all the answers I found on the internet).
You're very close! You just need to replace the web3.SystemProgram.transfer instruction with an instruction to transfer SPL tokens, referencing the proper accounts. There's an example at the Solana Cookbook covering exactly this situation: https://solanacookbook.com/references/token.html#transfer-token
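For illustration, a minimal sketch of that swap using @solana/spl-token is shown below. The mint address and DECIMALS value are placeholders, and it assumes both the sender's and the receiver's associated token accounts already exist (otherwise you also need a create-account instruction, as in the next answer).

import * as web3 from "@solana/web3.js";
import { getAssociatedTokenAddress, createTransferInstruction } from "@solana/spl-token";

// Placeholders for your token.
const MINT = new web3.PublicKey("your token mint address");
const DECIMALS = 9;

async function transferToken(provider, connection, receiver, amount) {
  // Derive the associated token accounts (assumed to already exist).
  const fromTokenAccount = await getAssociatedTokenAddress(MINT, provider.publicKey);
  const toTokenAccount = await getAssociatedTokenAddress(MINT, new web3.PublicKey(receiver));

  const transaction = new web3.Transaction().add(
    createTransferInstruction(
      fromTokenAccount,               // source token account
      toTokenAccount,                 // destination token account
      provider.publicKey,             // owner of the source account
      amount * Math.pow(10, DECIMALS) // amount in the token's smallest unit
    )
  );
  transaction.feePayer = provider.publicKey;
  transaction.recentBlockhash = (await connection.getRecentBlockhash()).blockhash;

  // Phantom signs the transaction; no private key ever leaves the wallet.
  const signed = await provider.signTransaction(transaction);
  const signature = await connection.sendRawTransaction(signed.serialize());
  await connection.confirmTransaction(signature);
  return signature;
}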
You can do this with the help of anchor and spl-token, which are used for dealing with custom tokens on Solana.
This is a custom transfer function. You'll need the mint address of the token, the wallet the tokens will be taken from (which you get on the front end when the user connects their wallet; you can make use of solana-web3), the destination address, and the amount.
import * as splToken from "@solana/spl-token";
import { web3, Wallet } from "@project-serum/anchor";

async function transfer(tokenMintAddress: string, wallet: Wallet, to: string, connection: web3.Connection, amount: number) {
  const mintPublicKey = new web3.PublicKey(tokenMintAddress);
  const { TOKEN_PROGRAM_ID } = splToken;
  const fromTokenAccount = await splToken.getOrCreateAssociatedTokenAccount(
    connection,
    wallet.payer,
    mintPublicKey,
    wallet.publicKey
  );
  const destPublicKey = new web3.PublicKey(to);
  // Get the derived address of the destination wallet which will hold the custom token
  const associatedDestinationTokenAddr = await splToken.getOrCreateAssociatedTokenAccount(
    connection,
    wallet.payer,
    mintPublicKey,
    destPublicKey
  );
  const receiverAccount = await connection.getAccountInfo(associatedDestinationTokenAddr.address);
  const instructions: web3.TransactionInstruction[] = [];
  instructions.push(
    splToken.createTransferInstruction(
      fromTokenAccount.address,
      associatedDestinationTokenAddr.address,
      wallet.publicKey,
      amount,
      [],
      TOKEN_PROGRAM_ID
    )
  );
  const transaction = new web3.Transaction().add(...instructions);
  transaction.feePayer = wallet.publicKey;
  transaction.recentBlockhash = (await connection.getRecentBlockhash()).blockhash;
  const transactionSignature = await connection.sendRawTransaction(
    transaction.serialize(),
    { skipPreflight: true }
  );
  await connection.confirmTransaction(transactionSignature);
}
I am trying to re-create AWS Signature Version 2 authentication in JavaScript. What I have right now is:
String.prototype.getBytes = () => {
  return this.toString()
    .split('')
    .map((i) => i.charCodeAt(0));
};

let key = 'redacted_access_key_id';
const bytes = key.getBytes();
let signingKey = crypto.HmacSHA256(bytes, key);
let data = JSON.stringify({ lang: 'en', pageNumber: 0, pageSize: 20 });
const contentMd5 = crypto.MD5(data).toString();
data = data.getBytes();
signingKey = crypto.HmacSHA256(data, key);
const result = Buffer.from(signingKey.toString()).toString('base64');
Which outputs something like
ZGY0MmI3MDVjNmJlNzY5ZWYwZjU1ZTc5MDhhOGNkYzI3ZWVjYzQ5ODBmY2M1NGI5NTc2MmVmNTY1NzEwNjhhMA==
which is incorrect, because the hash should be exactly 28 characters in length. The AWS Signature Version 2 auth docs show how it is made, but only in Java:
import java.security.SignatureException;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import com.amazonaws.util.*;

/**
 * This class defines common routines for generating
 * authentication signatures for AWS Platform requests.
 */
public class Signature {
    private static final String HMAC_SHA256_ALGORITHM = "HmacSHA256";

    public static String calculateRFC2104HMAC(String data, String key)
            throws java.security.SignatureException
    {
        String result;
        try {
            // Get an hmac_sha256 key from the raw key bytes.
            SecretKeySpec signingKey = new SecretKeySpec(key.getBytes("UTF-8"), HMAC_SHA256_ALGORITHM);
            // Get an hmac_sha256 Mac instance and initialize with the signing key.
            Mac mac = Mac.getInstance(HMAC_SHA256_ALGORITHM);
            mac.init(signingKey);
            // Compute the hmac on input data bytes.
            byte[] rawHmac = mac.doFinal(data.getBytes("UTF-8"));
            // Base64-encode the hmac by using the utility in the SDK
            result = BinaryUtils.toBase64(rawHmac);
        } catch (Exception e) {
            throw new SignatureException("Failed to generate HMAC : " + e.getMessage());
        }
        return result;
    }
}
I am trying to recreate this exact same code in JavaScript, but something is wrong. Can someone please help me with this? I can't find any examples in JavaScript.
Thank you.
The following code is the equivalent of the Java version of calculateRFC2104HMAC in JS.
const CryptoJS = require('crypto-js');
const calculateRFC2104HMAC = (data, key) => {
  const rawHmac = CryptoJS.HmacSHA256(CryptoJS.enc.Utf8.parse(data), CryptoJS.enc.Utf8.parse(key));
  return CryptoJS.enc.Base64.stringify(rawHmac);
};
Sample usage based on the example on AWS Signature V2 page
const urlSafeSignature = (data, key) => encodeURIComponent(calculateRFC2104HMAC(data, key));
const data =
`GET
elasticmapreduce.amazonaws.com
/
AWSAccessKeyId=AKIAIOSFODNN7EXAMPLE&Action=DescribeJobFlows&SignatureMethod=HmacSHA256&SignatureVersion=2&Timestamp=2011-10-03T15%3A19%3A30&Version=2009-03-31`
const key = `wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY`
console.log(urlSafeSignature(data, key));
The documentation advises using AWS Signature V4, which has an AWS-published library on npm. AWS-signed requests are for AWS services; the signature in the request helps validate the request and prevents replay attacks. I'm not sure what you are trying to send in the following code, or to which AWS service.
let data = JSON.stringify({ lang: 'en', pageNumber: 0, pageSize: 20 });
You must provide all details required to sign a request as per the AWS documentation.
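If you do move to Signature V4, a minimal sketch with the @aws-sdk/signature-v4, @aws-sdk/protocol-http and @aws-crypto/sha256-js packages could look like the following. The credentials, region, service name and hostname are placeholders you would substitute with your own values.

const { SignatureV4 } = require('@aws-sdk/signature-v4');
const { HttpRequest } = require('@aws-sdk/protocol-http');
const { Sha256 } = require('@aws-crypto/sha256-js');

async function signRequest() {
  const signer = new SignatureV4({
    credentials: { accessKeyId: 'AKIA...', secretAccessKey: '...' }, // placeholders
    region: 'us-east-1',    // placeholder region
    service: 'execute-api', // placeholder service name
    sha256: Sha256,
  });

  const request = new HttpRequest({
    method: 'POST',
    protocol: 'https:',
    hostname: 'example.execute-api.us-east-1.amazonaws.com', // placeholder host
    path: '/search',
    headers: {
      'content-type': 'application/json',
      host: 'example.execute-api.us-east-1.amazonaws.com',
    },
    body: JSON.stringify({ lang: 'en', pageNumber: 0, pageSize: 20 }),
  });

  // Returns a copy of the request with the Authorization and x-amz-date headers added.
  const signed = await signer.sign(request);
  return signed;
}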
What I'm trying to achieve
Sign a PDF in the browser using the client's certificate store or a smart card.
What I did so far
For accessing the local cert store I use FortifyApp.
The PDF is pre-signed on the server using iText(Sharp), then sent to the client via Ajax.
Relevant code:
using (var fileStream = new MemoryStream())
{
    using (var stamper = PdfStamper.CreateSignature(reader, fileStream, '0', null, true))
    {
        var signatureAppearance = stamper.SignatureAppearance;
        signatureAppearance.SetVisibleSignature(new iTextSharp.text.Rectangle(15, 15, 15, 15), 1, "A");

        IExternalSignatureContainer external =
            new ExternalBlankSignatureContainer(PdfName.ADOBE_PPKLITE, PdfName.ADBE_PKCS7_DETACHED);

        signatureAppearance.Reason = "AsdAsd";
        signatureAppearance.Layer2Text = "Asd";
        signatureAppearance.SignatureRenderingMode =
            iTextSharp.text.pdf.PdfSignatureAppearance.RenderingMode.DESCRIPTION;

        MakeSignature.SignExternalContainer(signatureAppearance, external, 512);

        return fileStream.ToArray();
    }
}
Following this, I managed to manipulate the PDF, extract the ByteRange, insert the signature, etc. Relevant code:
let pdfBuffer = Buffer.from(new Uint8Array(pdf));
const byteRangeString = `/ByteRange `;
const byteRangePos = pdfBuffer.indexOf(byteRangeString);
if (byteRangePos === -1)
throw new Error('asd');
let len = pdfBuffer.slice(byteRangePos).indexOf(`]`) + 1;
// Calculate the actual ByteRange that needs to replace the placeholder.
const byteRangeEnd = byteRangePos + len;
const contentsTagPos = pdfBuffer.indexOf('/Contents ', byteRangeEnd);
const placeholderPos = pdfBuffer.indexOf('<', contentsTagPos);
const placeholderEnd = pdfBuffer.indexOf('>', placeholderPos);
const placeholderLengthWithBrackets = placeholderEnd + 1 - placeholderPos;
const placeholderLength = placeholderLengthWithBrackets - 2;
const byteRange = [0, 0, 0, 0];
byteRange[1] = placeholderPos;
byteRange[2] = byteRange[1] + placeholderLengthWithBrackets;
byteRange[3] = pdfBuffer.length - byteRange[2];
let actualByteRange = `/ByteRange [${byteRange.join(' ')}]`;
actualByteRange += ' '.repeat(len - actualByteRange.length);
// Replace the /ByteRange placeholder with the actual ByteRange
pdfBuffer = Buffer.concat([pdfBuffer.slice(0, byteRangePos) as any, Buffer.from(actualByteRange), pdfBuffer.slice(byteRangeEnd)]);
// Remove the placeholder signature
pdfBuffer = Buffer.concat([pdfBuffer.slice(0, byteRange[1]) as any, pdfBuffer.slice(byteRange[2], byteRange[2] + byteRange[3])]);
and
// stringSignature comes from the signature creation below, and is 'hex' encoded
// Pad the signature with zeroes so that it is the same length as the placeholder
stringSignature += Buffer
.from(String.fromCharCode(0).repeat((placeholderLength / 2) - len))
.toString('hex');
// Place it in the document.
pdfBuffer = Buffer.concat([
pdfBuffer.slice(0, byteRange[1]) as any,
Buffer.from(`<${stringSignature}>`),
pdfBuffer.slice(byteRange[1])
]);
The problem
This uses forge and an uploaded p12 file. This would probably work if I could translate the imported(?) privateKey from Fortify (which is an instance of CryptoKey; forge throws an error: TypeError: signer.key.sign is not a function).
p7.addCertificate(certificate); // certificate is the Certificate from Fortify CertificateStore.getItem(certId)
p7.addSigner({
  key: privateKey, // this is the CryptoKey from Fortify
  certificate: null /*certificate*/, // also tried certificate from Fortify
  digestAlgorithm: forge.pki.oids.sha256,
  authenticatedAttributes: [
    {
      type: forge.pki.oids.contentType,
      value: forge.pki.oids.data,
    }, {
      type: forge.pki.oids.messageDigest,
      // value will be auto-populated at signing time
    }, {
      type: forge.pki.oids.signingTime,
      // value can also be auto-populated at signing time
      // We may also support passing this as an option to sign().
      // Would be useful to match the creation time of the document for example.
      value: new Date(),
    },
  ],
});

// Sign in detached mode.
p7.sign({ detached: true });
I also tried pkijs for creating the signature (throws a similar error: Signing error: TypeError: Failed to execute 'sign' on 'SubtleCrypto': parameter 2 is not of type 'CryptoKey'.)
let cmsSigned = new pki.SignedData({
  encapContentInfo: new pki.EncapsulatedContentInfo({
    eContentType: "1.2.840.113549.1.7.1", // "data" content type
    eContent: new asn.OctetString({ valueHex: pdfBuffer })
  }),
  signerInfos: [
    new pki.SignerInfo({
      sid: new pki.IssuerAndSerialNumber({
        issuer: certificate.issuer,
        serialNumber: certificate.serialNumber
      })
    })
  ],
  certificates: [certificate]
});

let signature = await cmsSigned.sign(privateKey, 0, 'SHA-256');
What "works" is, if I create the signature using the code below:
let signature = await provider.subtle.sign(alg, privateKey, new Uint8Array(pdfBuffer).buffer);
"works", because it creates an invalid signature:
Error during signature verification.
ASN.1 parsing error:
Error encountered while BER decoding:
I tried multiple certificates, no luck.
Questions
Can I achieve my goal without having to manually upload a p12/pfx file, is it even possible?
Is the server-side implementation of the deferred signature correct, do I need something else?
Is the pdf manipulation in javascript correct?
Can I transform the native CryptoKey to forge or pkijs?
What is wrong with the last signature? At first glance it seems right (at least the format):
<>>>/ContactInfo()/M(D:20200619143454+02'00')/Filter/Adobe.PPKLite/SubFilter/adbe.pkcs7.detached/ByteRange [0 180165 181191 1492] /Contents <72eb2731c9de4a5ccc94f1e1f2d9b07be0c6eed8144cb73f3dfe2764595dcc8f58b8a55f5026618fd9c79146ea93afdafc00b617c6e70de553600e4520f290bef70c499ea91862bb3acc651b6a7b162c984987f05ec59db5b032af0127a1224cad82e3be38ae74dd110ef5f870f0a0a92a8fba295009f267508c372db680b3d89d3157d3b218f33e7bf30c500d599b977c956e6a6e4b02a0bbd4a86737378b421ae2af0a4a3c03584eaf076c1cdb56d372617da06729ef364605ecd98b6b32d3bb792b4541887b59b686b41db3fc32eb4c651060bb02e2babeb30e6545834b2935993f6ee9edcc8f99fee8ad6edd2958c780177df6071fdc75208f76bbbcc21a00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000>>>
Thanks:
F
Original answer:
So I figured it out.
Can I achieve my goal without having to manually upload a p12/pfx
file, is it even possible?
Yes, it is. (See below on what needs to be changed.)
Is the server-side implementation of the deferred signature correct, do I need something else?
Yes, the code above is fine.
Is the pdf manipulation in javascript correct?
Also fine.
Can I transform the native CrytpoKey to forge or pkijs?
Yes, see below.
What is wrong with the last signature?
@mkl answered it in a comment, thank you.
FortifyApp has a CMS demo now. Although it didn't work with the version I was using, it works with version 1.3.4.
So I went with the pki.js implementation. The code changes needed for the signing to be successful are the following:
Export the certificate:
const cryptoCert = await provider.certStorage.getItem(selectedCertificateId);
const certRawData = await provider.certStorage.exportCert('raw', cryptoCert);
const pkiCert = new pki.Certificate({
schema: asn.fromBER(certRawData).result,
});
return pkiCert;
Sign in detached mode
let cmsSigned = new pki.SignedData({
  version: 1,
  encapContentInfo: new pki.EncapsulatedContentInfo({
    eContentType: '1.2.840.113549.1.7.1',
  }),
  signerInfos: [
    new pki.SignerInfo({
      version: 1,
      sid: new pki.IssuerAndSerialNumber({
        issuer: certificate.issuer,
        serialNumber: certificate.serialNumber
      })
    })
  ],
  certificates: [certificate]
});

let signature = await cmsSigned.sign(privateKey, 0, 'SHA-256', pdfBuffer);

const cms = new pki.ContentInfo({
  contentType: '1.2.840.113549.1.7.2',
  content: cmsSigned.toSchema(true),
});
const result = cms.toSchema().toBER(false);
return result;
Convert signature to 'HEX' string
let stringSignature = Array.prototype.map.call(new Uint8Array(signature), x => (`00${x.toString(16)}`).slice(-2)).join('');
let len = signature.byteLength;
Update (summary on the js side of things):
Download the pre-signed pdf (+ byteRange - this can be extracted with iText, so you can apply multiple signatures)
Prepare the signature (see first part of point 3. in the question)
Get private key:
const provider = await this.ws.getCrypto(selectedProviderId); // this.ws is a WebcryptoSocket
provider.sign = provider.subtle.sign.bind(provider.subtle);
setEngine(
  'newEngine',
  provider,
  new CryptoEngine({
    name: '',
    crypto: provider,
    subtle: provider.subtle,
  })
);
const key = await this.getCertificateKey('private', provider, selectedCertificateId); //can be null
See Original answer points 1 and 2. Between these I also have a hack:
let logout = await provider.logout();
let loggedIn = await provider.isLoggedIn();
if (!loggedIn) {
  let login = await provider.login();
}
Add the signature to the PDF. Use original answer point 3, then the second part of point 3 in the question.
I'm trying to activate the Revit Levels and 2D Minimap extension in the Autodesk Forge viewer, but I cannot get the AEC Model Data. I got this warning.
I tried to get the AEC data with this code:
const url = window.location.search;
console.log(url);
const svf_path = `${url.replace("?", "/storage/").replace(/%20/g, " ")}`;

Autodesk.Viewing.endpoint.getItemApi = (endpoint, derivativeUrn, api) => {
  return svf_path;
};

Autodesk.Viewing.Initializer(options, async () => {
  const paths = svf_path.split("/");
  const [dest, svf_dir] = [paths[2], paths[3]];
  const url = `/api/viewer/dest/${dest}/svf/${svf_dir}/manifest`;
  const response = await fetch(url);
  const manifest = await response.json();
  const init_div = document.getElementById("init_div");
  viewer = new Autodesk.Viewing.GuiViewer3D(init_div, config3d);
  const viewerDocument = new Autodesk.Viewing.Document(manifest);
  const viewable = viewerDocument.getRoot().getDefaultGeometry();
  viewer.start();
  await viewerDocument.downloadAecModelData();
  viewer.loadDocumentNode(viewerDocument, viewable)
    .then(function (result) {
      Autodesk.Viewing.Document.getAecModelData(viewable);
    });
});
What's wrong in my code?
The warning comes from the BubbleNode.prototype.getAecModelData method. You are not calling it in your code but it's possible that it's being called by the LevelsExtension itself. Try configuring the extension so that it doesn't detect the AEC data automatically by passing in { autoDetectAecModelData: false } as the extension options.
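For example, a minimal sketch of loading the extension that way (it assumes the levels extension is registered under the ID 'Autodesk.AEC.LevelsExtension' and is not already listed in your viewer config's extensions array):

// Load the levels extension manually with auto-detection turned off,
// so it does not try to fetch the AEC model data on its own.
viewer.loadExtension('Autodesk.AEC.LevelsExtension', {
  autoDetectAecModelData: false
});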
Btw. to debug the issue on your side, you can also try getting the non-minified version of viewer3D.js, putting a breakpoint where the warning is logged, and looking at the call stack when the breakpoint is hit.
I have this function running in an Azure Function to get a SAS token for a browser application to upload to Azure Blob Storage:
var azure = require('azure-storage');

module.exports = function(context, req) {
  if (req.body.container) {
    // The following values can be used for permissions:
    // "a" (Add), "r" (Read), "w" (Write), "d" (Delete), "l" (List)
    // Concatenate multiple permissions, such as "rwa" = Read, Write, Add
    context.res = generateSasToken(
      context,
      req.body.container,
      req.body.blobName,
      req.body.permissions
    );
  } else {
    context.res = {
      status: 400,
      body: "Specify a value for 'container'"
    };
  }
  context.done(null, context);
};
function generateSasToken(context, container, blobName, permissions) {
  var connString = process.env.AzureWebJobsStorage;
  var blobService = azure.createBlobService(connString);

  // Create a SAS token that expires in an hour
  // Set start time to five minutes ago to avoid clock skew.
  var startDate = new Date();
  startDate.setMinutes(startDate.getMinutes() - 5);
  var expiryDate = new Date(startDate);
  expiryDate.setMinutes(startDate.getMinutes() + 60);

  permissions = azure.BlobUtilities.SharedAccessPermissions.READ +
    azure.BlobUtilities.SharedAccessPermissions.WRITE +
    azure.BlobUtilities.SharedAccessPermissions.DELETE +
    azure.BlobUtilities.SharedAccessPermissions.LIST;

  var sharedAccessPolicy = {
    AccessPolicy: {
      Permissions: permissions,
      Start: startDate,
      Expiry: expiryDate
    }
  };

  var sasToken = blobService.generateSharedAccessSignature(
    container,
    blobName,
    sharedAccessPolicy
  );
  context.log(sasToken);

  return {
    token: sasToken,
    uri: blobService.getUrl(container, blobName, sasToken, true)
  };
}
I am then calling this URL from the client and trying to upload with this code:
const search = new URLSearchParams(`?${token}`);
const sig = encodeURIComponent(search.get('sig'));
const qs = `?sv=${search.get('sv')}&ss=b&srt=sco&sp=rwdlac&se=${search.get('sv')}&st=${search.get(
'st'
)}&spr=https&sig=${sig}`;
return `${url}/${containerName}/${filename}${qs}`;
Which generates a url like this:
https://mystorage.blob.core.windows.net/mycontainer/latest.png?sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2018-03-28&st=2019-01-30T19:11:10Z&spr=https&sig=g0sceq3EkiAQTvyaZ07C+C4SZQz9FaGTV4Zwq4HkAnc=
Which returns this error:
403 (Server failed to authenticate the request. Make sure the value of
Authorization header is formed correctly including the signature.)
If I generate the sas token from the azure portal it works, so the generated url looks like this:
https://mystorage.blob.core.windows.net/mycontainer/latest.png?sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2019-01-31T03:01:43Z&st=2019-01-30T19:01:43Z&spr=https&sig=ayE4gt%2FDfDzjv5DjMaD7AS%2F176Bi4Q6DWJNlnDzl%2FGc%3D
but my url looks like this:
https://mystorage.blob.core.windows.net/mycontainer/latest.png?sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2019-01-31T03:34:21Z&st=2019-01-30T19:34:21Z&spr=https&sig=Dx8Vm4XPnD1rn9uyzIAXZEfcdbWb0HjmOq%2BIq42Q%2FOM%3D
I have no idea what to do to get this working
Your Azure Function code is correct, and

var sasToken = blobService.generateSharedAccessSignature(
  container,
  blobName,
  sharedAccessPolicy
);

is exactly the sasToken you need to upload the blob. There is no need to process the token again (that actually mangles it) as you did in the 2nd code snippet.
It's expected that the SAS token from the Azure portal (an account SAS) is different from the one generated in your code (a service SAS). Have a look at the doc.
To conclude,
Make sure the connection string belongs to the storage account you want to connect to. To avoid trouble, you could directly replace var connString = process.env.AzureWebJobsStorage; with var connString = "connectionStringGotFromPortal";
If 1 is confirmed, your Azure Function code is correct and returns the token as expected:
{
  token: sasToken,
  uri: blobService.getUrl(container, blobName, sasToken, true)
};
Based on the 2nd code snippet you provided, you only need

return `${url}/${containerName}/${filename}?${token}`;

if the token is identical to what the function returns.
The problem is that in your server-side code you're creating a service SAS, then taking only the signature portion of it (sig) and building an account SAS on the client.
Since the parameters used to create the token have changed (the original token didn't have parameters like ss, srt etc., but when you create your own URL you insert them), you get a 403 error when using the modified SAS URL. This happens because the server recomputes the signature from the URL parameters and compares it with the signature passed in the URL; since the two signatures don't match, you get the 403 error.
Since you're returning the SAS URL of the blob, there's no need to build the URL on the client. You can simply take the uri returned from your API layer and use that to upload.
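For instance, a minimal sketch of uploading straight to that uri with fetch (it assumes uri is the blob URL with the SAS query string returned by the function, and file is a File or Blob from the browser):

// PUT the file directly to the SAS URL returned by the Azure Function.
// The x-ms-blob-type header is required when creating a block blob via a plain PUT.
async function uploadWithSasUri(uri, file) {
  const response = await fetch(uri, {
    method: 'PUT',
    headers: {
      'x-ms-blob-type': 'BlockBlob',
      'Content-Type': file.type || 'application/octet-stream',
    },
    body: file,
  });
  if (!response.ok) {
    throw new Error(`Upload failed: ${response.status}`);
  }
}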
As Jerry Liu's answer explained, your Azure Function generates the correct token and already gives you the correct uri to use, which includes your blob name and token.
In your client side you can also use azure-sdk-for-js
// This is the response from your api with token and uri
const uri = response.uri;
const pipeline = StorageURL.newPipeline(new AnonymousCredential());
// Your uri already includes the full blob url with SAS signature
const blockBlobURL = BlockBlobURL.fromBlobURL(new BlobURL(uri, pipeline));
const uploadBlobResponse = await blockBlobURL.upload(
  Aborter.none,
  file,
  file.size,
  { blobHTTPHeaders: { blobContentType: `${mime}; charset=utf-8` } }
);