I've been stuck all day making my first call to the Azure Storage REST API. Postman's response showed that it fails with an Azure authentication error, but I have no idea what the problem is.
Here is the browser script that sends the Azure Storage REST API request:
function azureListContainers() {
var key = "key-copied-from-azure-storage-account";
var strTime = (new Date()).toUTCString();
var strToSign = 'GET\n\n\n\nx-ms-date:' + strTime + '\nx-ms-version:2015-12-11\n/myaccount/?comp=list';
var hash = CryptoJS.HmacSHA256(strToSign, key);
var hashInBase64 = CryptoJS.enc.Base64.stringify(hash);
var auth = "SharedKeyLite myaccount:"+hashInBase64;
console.log(strToSign);
console.log(auth);
console.log(strTime);
$.ajax({
type: "GET",
beforeSend: function (request)
{
request.setRequestHeader("Authorization", auth);
request.setRequestHeader("x-ms-date", strTime);
request.setRequestHeader("x-ms-version", "2015-12-11");
},
url: "https://myaccount.blob.core.windows.net/?comp=list",
processData: false,
success: function(msg) {
console.log(msg);
}
});
}
Chrome Developer Tools returned only No 'Access-Control-Allow-Origin' header without further detail, so I copied the contents of var auth and var strTime and created the same request using the Postman tool:
[Command]
GET https://myaccount.blob.core.windows.net/?comp=list
[Headers]
Authorization:SharedKeyLite myaccount:Z9/kY/D+osJHHz3is+8yJRqhj09VUlr5n+PlePUa8Lk=
x-ms-date:Tue, 09 Aug 2016 10:30:49 GMT
x-ms-version:2015-12-11
[Response Body]
<?xml version="1.0" encoding="utf-8"?>
<Error>
<Code>AuthenticationFailed</Code>
<Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:9be3d595-0001-0012-4929-f2fde2000000
Time:2016-08-09T10:31:52.6542965Z</Message>
<AuthenticationErrorDetail>The MAC signature found in the HTTP request 'Z9/kY/D+osJHHz3is+8yJRqhj09VUlr5n+PlePUa8Lk=' is not the same as any computed signature. Server used following string to sign: 'GET
x-ms-date:Tue, 09 Aug 2016 10:30:49 GMT
x-ms-version:2015-12-11
/myaccount/?comp=list'.</AuthenticationErrorDetail>
</Error>
After diffing the two strings, I believe var strToSign in my script is identical to the string Azure used to sign. Yet the authentication error persists. Please help me figure out what the problem is.
Chrome Developer Tool just returned No 'Access-Control-Allow-Origin' header
Since you are sending HTTP requests against the Azure Storage server directly from client-side JavaScript, this is a common CORS issue.
When you hit this issue, you can enable CORS for your storage services. The simplest way is to use Microsoft Azure Storage Explorer to configure your storage account.
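If you prefer to set CORS programmatically, here is a minimal sketch using the azure-storage package for Node.js; the rule values mirror the Storage Explorer settings, and the exact property names are my assumption to verify against the SDK docs:

var azure = require('azure-storage');
var blobService = azure.createBlobService('<account name>', '<account key>');

blobService.getServiceProperties(function (error, properties) {
    if (error) { return console.error(error); }
    // Assumed CORS rule shape for the azure-storage SDK.
    properties.Cors = {
        CorsRule: [{
            AllowedOrigins: ['*'],
            AllowedMethods: ['GET', 'PUT'],
            AllowedHeaders: ['*'],
            ExposedHeaders: ['*'],
            MaxAgeInSeconds: 3600
        }]
    };
    blobService.setServiceProperties(properties, function (err) {
        if (err) { console.error(err); }
    });
});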
AuthenticationFailed
I made two modifications based on your code snippet in my test project to make it work.
In var hash = CryptoJS.HmacSHA256(strToSign, key);, the second parameter should be the Base64-decoded account key; refer to the Azure Storage SDK for Node.js.
In my test, I had to use the SharedKey scheme and authenticate with a SharedKey token to make your scenario work.
Here is my test code snippet, FYI:
var key = "key-copied-from-azure-storage-account";
var strTime = (new Date()).toUTCString();
// SharedKey signs every standard header slot (most are empty here) plus the canonicalized headers and resource.
var strToSign = 'GET\n\n\n\n\n\n\n\n\n\n\n\nx-ms-date:' + strTime + '\nx-ms-version:2015-12-11\n/<accountname>/\ncomp:list';
// Decode the account key from Base64 before using it as the HMAC secret.
var secret = CryptoJS.enc.Base64.parse(key);
var hash = CryptoJS.HmacSHA256(strToSign, secret);
var hashInBase64 = CryptoJS.enc.Base64.stringify(hash);
var auth = "SharedKey <accountname>:" + hashInBase64;
Additionally, for security reasons, please implement calls to the Azure Storage services in a backend web server. Using pure JavaScript on the client exposes your account name and account key to the public, which puts your data and information at risk.
I'm trying to migrate my authentication method from Power BI master user to service principal.
With the master user I'm using MSAL, with an authentication flow like below:
login to AAD --> request an AAD token --> import the pbix file with the REST API using the AAD token as credential
This is the code:
$(document).ready(function () {
myMSALObj.loginPopup(requestObj).then(function (loginResponse) {
acquireTokenPopup();
});
// myMSALObj is an instance of Msal.UserAgentApplication
});
function acquireTokenPopup() {
myMSALObj.acquireTokenSilent(requestObj).then(function (tokenResponse) {
AADToken = tokenResponse.accessToken;
importPBIX(AADToken);
});
}
function importPBIX(accessToken) {
xmlHttp.open("GET", "./importPBIX?accessToken=" + accessToken + "&pbixTemplate=" + pbixTemplate, true);
//the rest of import process//
}
So there are two questions:
1. What kind of flow would it be if I use a service principal instead?
In my head, and from the info I read in the Microsoft documentation, it would be simpler, like:
request a token using the application secret key --> import the pbix file with the REST API using the token
Is this correct?
2. What kind of code can I use to do this in JavaScript? I think MSAL can't request a token using a service principal. I would appreciate any info or a tutorial for this.
bests,
What kind of flow would it be if I use a service principal instead? In my head, and from the info I read in the Microsoft documentation, it would be simpler, like: request a token using the application secret key --> import the pbix file with the REST API using the token. Is this correct?
According to my research, if you want to use a service principal to get an Azure AD access token, you can use the client credentials grant flow:
The client application authenticates to the Azure AD token issuance endpoint and requests an access token.
The Azure AD token issuance endpoint issues the access token.
The access token is used to authenticate to the secured resource.
Data from the secured resource is returned to the client application.
Regarding how to get the access token, please refer to the following steps:
Register Azure AD application
Configure API permissions
Get access token
POST https://login.microsoftonline.com/<tenant id>/oauth2/token
Content-Type: application/x-www-form-urlencoded
grant_type=client_credentials
&client_id=<>
&client_secret=<>
&resource=https://analysis.windows.net/powerbi/api
2. What kind of code can I use to do this in JavaScript? I think MSAL can't request a token using a service principal. I would appreciate any info or a tutorial for this.
If you want to implement the client credentials grant flow with an SDK, you can use adal-node. For more details, please refer to https://www.npmjs.com/package/adal-node.
For example
var AuthenticationContext = require('adal-node').AuthenticationContext;

var authorityHostUrl = 'https://login.microsoftonline.com'; // no trailing slash; a slash is appended below
var tenant = 'myTenant.onmicrosoft.com'; // AAD tenant name.
var authorityUrl = authorityHostUrl + '/' + tenant;
var applicationId = 'yourApplicationIdHere'; // Application id of the app registered under AAD.
var clientSecret = 'yourAADIssuedClientSecretHere'; // Secret generated for the app. Read this from an environment variable.
var resource = 'https://analysis.windows.net/powerbi/api'; // URI that identifies the resource for which the token is valid (the Power BI API here).

var context = new AuthenticationContext(authorityUrl);

context.acquireTokenWithClientCredentials(resource, applicationId, clientSecret, function (err, tokenResponse) {
    if (err) {
        console.log('well that didn\'t work: ' + err.stack);
    } else {
        console.log(tokenResponse);
    }
});
Thanks to Jim's answer, I tweaked my code a little and the token authentication process went smoothly.
As my app uses JavaScript on the front end and Python on the back end, I decided to do the process in Python and used the Python MSAL library instead.
The code looks like this:
from msal import ConfidentialClientApplication

authority_host_uri = 'https://login.microsoftonline.com'
tenant = 'myTenantId'
authority_uri = authority_host_uri + '/' + tenant
client_id = 'myClientId'
client_secret = 'myClientSecretKey'
config = {"scope": ["https://analysis.windows.net/powerbi/api/.default"]}

app = ConfidentialClientApplication(client_id, client_credential=client_secret, authority=authority_uri)
token = app.acquire_token_for_client(scopes=config['scope'])
Once again, thanks to Jim for helping me on this one.
bests,
We are trying to invoke the TFS 2015 REST APIs from a web page using JavaScript and are having trouble establishing valid authentication with the TFS server.
We do not know how to generate a personal access token or an OAuth access token. The instructions below seem to apply more to VSO than to on-premises TFS.
https://www.visualstudio.com/en-us/integrate/get-started/rest/basics
How can I generate an authentication key/token?
UPDATE: As of Mar 2017, the latest release of on-prem TFS supports creating personal access tokens for all users. Using the JavaScript code below by @Elmar you can make requests to update and edit TFS work items via the REST API.
The OAuth mechanism is used against the VSO API at the time of writing, as you've seemingly identified. Official docs for VSO OAuth tokens are here.
For on-prem however, the following is required:
Via a JavaScript client (note I'm using jQuery for the ajax request here)
Since alternative credentials and token-based auth aren't available on-prem to match the current VSO implementation, you can consider the following approach: if you have admin permissions on the TFS app tier, you can configure basic authentication for the TFS application in IIS and set the default domain.
And then invoke as follows:
var self = this;
self.tasksURI = 'https://<SERVER>/tfs/<COLLECTION>/<PROJECT>/_apis/build/builds?api-version=2.0';
self.username = "<USERNAME>"; //basic username so no domain here.
self.password = "<PASSWORD>";
self.ajax = function (uri, method, data) {
var request = {
url: uri,
type: method,
contentType: "application/json",
accepts: "application/json",
cache: false,
dataType: 'json',
data: JSON.stringify(data),
beforeSend: function (xhr) {
xhr.setRequestHeader("Authorization", "Basic " + btoa(self.username + ":" + self.password));
},
error: function (jqXHR) {
console.log("ajax error " + jqXHR.status);
}
};
return $.ajax(request);
}
self.ajax(self.tasksURI, 'GET').done(function (data) {
alert(data);
});
IMPORTANT NOTE: If you enable basic auth you really should configure your site to use HTTPS too, or your credentials will be sent in clear text (IIS warns about this when you enable the setting).
Via a .NET client
In on-prem (currently RTM'd: 2015 Update 1) the API is generally gated/fenced off with NTLM, meaning a pre-flight request is made and the server returns 401 to challenge for auth; in this case, setting the request credentials as follows allows the request to authenticate against the server once the pre-flight challenge is received.
To accommodate the challenge you can do this:
request.Credentials = new NetworkCredential(this.UserName, this.Password);
//you may want to specify a domain too
If you've enabled basic auth for TFS on-prem, you can attempt authenticating as follows. This pattern matches the mechanism used when invoking VSO after enabling alternative credentials in the UI:
request.Headers[HttpRequestHeader.Authorization] = "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes(this.UserName + ":" + this.Password));
Note: in some code I modified a few weeks ago, support for both VSO and on-prem was required, so I used the two patterns above to deal with each specific scenario.
My question is old, but as of Mar 2017 the latest release of on-prem TFS supports creating personal access tokens for all users. Using the JavaScript code by @Elmar you can make requests to update and edit TFS work items via the REST API, as shown below.
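For example, here is a sketch of the same jQuery pattern using a personal access token instead of a username/password; TFS accepts the PAT via Basic auth, conventionally with an empty user name (the URL and token below are placeholders):

var pat = "<PERSONAL-ACCESS-TOKEN>";

$.ajax({
    url: 'https://<SERVER>/tfs/<COLLECTION>/_apis/projects?api-version=2.0',
    type: 'GET',
    dataType: 'json',
    beforeSend: function (xhr) {
        // Empty user name, PAT as the password.
        xhr.setRequestHeader("Authorization", "Basic " + btoa(":" + pat));
    }
}).done(function (data) {
    console.log(data);
});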
If possible, I would recommend using the .NET client libraries for Visual Studio Team Services (and TFS):
https://www.visualstudio.com/en-us/docs/integrate/get-started/client-libraries/dotnet
You can use Windows authentication (which is what I needed). After including the following NuGet packages in my code:
Microsoft.TeamFoundationServer.Client
Microsoft.VisualStudio.Services.Client
Microsoft.VisualStudio.Services.InteractiveClient
I was able to write this code:
// Create instance of VssConnection using Windows credentials (NTLM)
var connection = new VssConnection(new Uri("http://mytfsserver:8080/tfs/CollectionName"), new VssClientCredentials());
// Create instance of GitHttpClient using VssConnection
var gitClient = connection.GetClient<GitHttpClient>();
var items = gitClient.GetRepositoriesAsync().Result;
foreach (var item in items)
{
Console.WriteLine(item.Name);
}
See also: https://www.visualstudio.com/en-us/docs/integrate/get-started/client-libraries/samples
I am stuck with uploading large blobs to Windows Azure Blob Storage.
I followed this link and, locally, everything went fine.
Then I deployed my website, fully coded in HTML5 + JS, and the file upload feature no longer works. When I try to upload the first chunk, the browser blocks the request because of CORS.
I followed this link, hoping to solve the issue, and the CORS rule in Windows Azure Blob Storage now says:
Allowed origins: *
Allowed methods: Get, Put
Allowed headers: *
Exposed headers: x-ms-*
Max age (seconds): 3600
So I think that no request, from any origin, to my blob service should be rejected due to CORS. Am I wrong?
To describe my setup:
I have a Windows Azure Mobile Service which generates a SAS for me. I wonder why the SAS baseUrl uses http instead of https.
I have a website, protected by a certificate, that runs using the https protocol. Everything goes on https.
The website queries the service and it successfully gets a valid SAS.
If I use the SAS as is, with the http URL, the browser blocks the request because it is "Active Mixed Content" (an http request issued by an https web site).
If I change the SAS url, forcing it to use the https protocol, the browser blocks the request because of the CORS.
My Azure Mobile Service Back-end is in NodeJS and I have not found a way to let it create an https SAS.
The code which uploads a chunk of a file is the following (JS):
function uploadSlicedChunk(data){
//Block id padding
var num = '' + UploadManager.blockIdCounter;
while(num.length < 10){
num = '0'+num;
}
var blockId = btoa("block-"+num);
//URI and data creation (turn the SAS URL's 'http' into 'https' and append the block parameters)
var uri = UploadManager.submitUri.substr(0,4)+"s"+UploadManager.submitUri.substr(4, UploadManager.submitUri.length)+"&comp=block&blockId="+blockId;
var requestData = new Uint8Array(data);
//AJAX options
var ajaxOption = {
url : uri,
type : "PUT",
data : requestData,
processData : false,
beforeSend: function(xhr) {
xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob');
xhr.setRequestHeader('Content-Length', requestData.length);
},
success : function (data, status) {
//Upload successful -> update tracking variables.
UploadManager.blockIds.push(blockId);
UploadManager.blockIdCounter += 1;
UploadManager.bytesSent += requestData.length;
UploadManager.currentFilePointer += UploadManager.maxBlockSize;
UploadManager.totalRemainingBytes -= UploadManager.maxBlockSize;
if(UploadManager.totalRemainingBytes < UploadManager.maxBlockSize){
UploadManager.maxBlockSize = UploadManager.totalRemainingBytes;
}
//Prepare next upload;
var percentComplete = ((parseFloat(UploadManager.bytesSent) / parseFloat(UploadManager.fileLen)) * 100).toFixed(2);
$("#content-saving-logger").append(percentComplete + " %<br />");
if(UploadManager.totalRemainingBytes > 0){
var fileContent = UploadManager.file.slice(UploadManager.currentFilePointer,
UploadManager.currentFilePointer+UploadManager.maxBlockSize);
UploadManager.reader.readAsArrayBuffer(fileContent);
} else {
UploadManager.commitBlob();
}
},
error : function(jqXHR, status, error){
alert(status + ", "+error+": "+jqXHR.responseText);
}
};
$.ajax(ajaxOption);
}
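For context, the final commit performed by UploadManager.commitBlob (not shown above) is essentially a Put Block List call against the same SAS URL; a rough sketch of what it does, not the exact code:

function commitBlob() {
    // Build the XML block list from the collected block ids.
    var body = '<?xml version="1.0" encoding="utf-8"?><BlockList>';
    for (var i = 0; i < UploadManager.blockIds.length; i++) {
        body += '<Latest>' + UploadManager.blockIds[i] + '</Latest>';
    }
    body += '</BlockList>';
    $.ajax({
        url: UploadManager.submitUri + '&comp=blocklist',
        type: 'PUT',
        data: body,
        processData: false,
        success: function () {
            $("#content-saving-logger").append("Committed.<br />");
        },
        error: function (jqXHR, status, error) {
            alert(status + ", " + error + ": " + jqXHR.responseText);
        }
    });
}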
I do not know what to do now to solve the issue... The only solution I found is to build a service on my Azure Mobile Service which receives the chunks and uploads them to Blob Storage, making it a sort of proxy, and, at the end of the process, commits the blob. But that basically makes the service slower and requires a lot more work... while this solution should already work!
Thank you for your patience and for your help,
Ric.
Has anyone successfully used the AWS SDK to generate signed URLs to objects in an S3 bucket which also work over CloudFront? I'm using the JavaScript AWS SDK and it's really simple to generate signed URLs via the S3 links. I just created a private bucket and use the following code to generate the URL:
var AWS = require('aws-sdk')
, s3 = new AWS.S3()
, params = {Bucket: 'my-bucket', Key: 'path/to/key', Expiration: 20}
s3.getSignedUrl('getObject', params, function (err, url) {
console.log('Signed URL: ' + url)
})
This works great but I also want to expose a CloudFront URL to my users so they can get the increased download speeds of using the CDN. I set up a CloudFront distribution, which modified the bucket policy to allow access. However, after doing this any file could be accessed via the CloudFront URL, and Amazon appeared to ignore the signature in my link. After reading some more on this I've seen that people generate a .pem file to get signed URLs working with CloudFront, but why is this not necessary for S3? It seems like the getSignedUrl method simply does the signing with the AWS Secret Key and AWS Access Key. Has anyone gotten a setup like this working before?
Update:
After further research, it appears that CloudFront handles URL signatures completely differently than S3 [link]. However, I'm still unclear on how to create a signed CloudFront URL using JavaScript.
Update: I moved the signing functionality from the example code below into the aws-cloudfront-sign package on NPM. That way you can just require this package and call getSignedUrl().
After some further investigation I found a solution which is sort of a combination of this answer and a method I found in the Boto library. It is true that S3 URL signatures are handled differently than CloudFront URL signatures. If you just need to sign an S3 link, the example code in my initial question will work just fine for you. However, it gets a little more complicated if you want to generate signed URLs which utilize your CloudFront distribution. This is because CloudFront URL signatures are not currently supported in the AWS SDK, so you have to create the signature on your own. In case you also need to do this, here are the basic steps. I'll assume you already have an S3 bucket set up:
Configure CloudFront
Create a CloudFront distribution
Configure your origin with the following settings
Origin Domain Name: {your-s3-bucket}
Restrict Bucket Access: Yes
Grant Read Permissions on Bucket: Yes, Update Bucket Policy
Create a CloudFront key pair. You should be able to do this here.
Create Signed CloudFront URL
To create a signed CloudFront URL you just need to sign your policy using RSA-SHA1 and include it as a query param. You can find more on custom policies here, but I've included a basic one in the sample code below that should get you up and running. The sample code is for Node.js but the process can be applied in any language.
var crypto = require('crypto')
, fs = require('fs')
, util = require('util')
, moment = require('moment')
, urlParse = require('url')
, cloudfrontAccessKey = '<your-cloudfront-key-pair-id>'
, expiration = moment().add(30, 'seconds').unix() // epoch expiration time
// Define your policy.
var policy = {
'Statement': [{
'Resource': 'http://<your-cloudfront-domain-name>/path/to/object',
'Condition': {
'DateLessThan': {'AWS:EpochTime': expiration}
}
}]
}
// Now that you have your policy defined you can sign it like this:
var sign = crypto.createSign('RSA-SHA1')
, pem = fs.readFileSync('<path-to-cloudfront-private-key>')
, key = pem.toString('ascii')
sign.update(JSON.stringify(policy))
var signature = sign.sign(key, 'base64')
// Finally, you build the URL with all of the required query params:
var url = {
host: '<your-cloudfront-domain-name>',
protocol: 'http',
pathname: '<path-to-s3-object>'
}
var params = [
'Key-Pair-Id=' + cloudfrontAccessKey,
'Expires=' + expiration,
'Signature=' + signature
]
var signedUrl = util.format('%s?%s', urlParse.format(url), params.join('&'))
return signedUrl
For my code to work with Jason Sims's code, I also had to convert policy to base64 and add it to the final signedUrl, like this:
sign.update(JSON.stringify(policy))
var signature = sign.sign(key, 'base64')
var policy_64 = new Buffer(JSON.stringify(policy)).toString('base64'); // ADDED
// Finally, you build the URL with all of the required query params:
var url = {
host: '<your-cloudfront-domain-name>',
protocol: 'http',
pathname: '<path-to-s3-object>'
}
var params = [
'Key-Pair-Id=' + cloudfrontAccessKey,
'Expires=' + expiration,
'Signature=' + signature,
'Policy=' + policy_64 // ADDED
]
AWS includes some built-in classes and structures to assist in the creation of signed URLs and cookies for CloudFront. I utilized these alongside the excellent answer by Jason Sims to get it working in a slightly different pattern (which appears to be very similar to the NPM package he created).
Namely, the AWS.CloudFront.Signer type description, which abstracts the process of creating signed URLs and cookies.
export class Signer {
/**
* A signer object can be used to generate signed URLs and cookies for granting access to content on restricted CloudFront distributions.
*
* @param {string} keyPairId - The ID of the CloudFront key pair being used.
* @param {string} privateKey - A private key in RSA format.
*/
constructor(keyPairId: string, privateKey: string);
....
}
And an options shape: either with a policy JSON string, or without a policy but with a URL and an expiration time.
export interface SignerOptionsWithPolicy {
/**
* A CloudFront JSON policy. Required unless you pass in a url and an expiry time.
*/
policy: string;
}
export interface SignerOptionsWithoutPolicy {
/**
* The URL to which the signature will grant access. Required unless you pass in a full policy.
*/
url: string
/**
* A Unix UTC timestamp indicating when the signature should expire. Required unless you pass in a full policy.
*/
expires: number
}
Sample implementation:
import aws, { CloudFront } from 'aws-sdk';
export async function getSignedUrl() {
// https://abc.cloudfront.net/my-resource.jpg
const url = <cloud front url/resource>;
// Create signer object - requires a public key id and private key value
const signer = new CloudFront.Signer(<public-key-id>, <private-key>);
// Setup expiration time (one hour in the future, in this case)
const expiration = new Date();
expiration.setTime(expiration.getTime() + 1000 * 60 * 60);
const expirationEpoch = expiration.valueOf();
// Set options (Without policy in this example, but a JSON policy string can be substituted)
const options = {
url: url,
expires: expirationEpoch
};
return new Promise((resolve, reject) => {
// Call getSignedUrl passing in options, to be handled either by callback or synchronously without callback
signer.getSignedUrl(options, (err, url) => {
if (err) {
    console.error(err.stack);
    reject(err);
    return; // don't fall through to resolve
}
resolve(url);
});
});
}
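A usage sketch for the helper above:

getSignedUrl()
    .then(function (signedUrl) { console.log(signedUrl); })
    .catch(function (err) { console.error('Signing failed:', err); });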
I'm trying to authorize my application to integrate with Google Drive. Google documentation provides details for server based authorization and code samples for various server technologies.
There's also a JavaScript Google API library, that has support for authorization. Down in the samples section of the wiki there is a code snippet for creating a config and calling the authorize function. I've altered the scope to be that one I believe is required for drive:
var config = {
'client_id': 'my_client_ID',
'scope': 'https://www.googleapis.com/auth/drive.file'
};
gapi.auth.authorize(config, function() {
console.log(gapi.auth);
});
The callback function is never called (yes, the Google API library is loaded correctly). Looking at the Java Retrieve and Use OAuth 2.0 Credentials example, the client secret seems to be a parameter; should this go into the config?
Has anyone tried this in JS, for Drive or other Google APIs? Does anyone know the best route for debugging such a problem, i.e. do I need to just step through the library and stop whinging?
Please don't suggest doing the authorization on the server side, our application is entirely client side, I don't want any state on the server (and I understand the token refresh issues this will cause). I am familiar with the API configuration in the Google console and I believe that and the drive SDK setting are correct.
It is possible to use the Google APIs Javascript client library with Drive but you have to be aware that there are some pain points.
There are two main issues currently, both of which have workarounds:
Authorization
First, if you look closely at how Google Drive auth works, you will realize that after a user has installed your Drive application and tries to open a file or create a new file with your application, Drive initiates the OAuth 2.0 authorization flow automatically, with the auth parameters set to response_type=code and access_type=offline.
This basically means that right now Drive apps are forced to use the OAuth 2 server-side flow which is not going to be of any use to the Javascript client library (which only uses the client-side flow).
The issue is that: Drive initiates a server-side OAuth 2.0 flow, then the Javascript client library initiates a client-side OAuth 2.0 flow.
This can still work; all you have to do is use server-side code to process the authorization code returned after the Drive server-side flow (you need to exchange it for an access token and a refresh token).
That way, the user will be prompted for authorization only on the first flow. After the first time you exchange the authorization code, the auth page will be bypassed automatically.
Server-side samples for this are available in our documentation.
If you don't process/exchange the auth code on the server-side flow, the user will be prompted for auth every single time they try to use your app from Drive.
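To make the exchange concrete, here is a rough Node.js sketch of the standard authorization-code grant (assuming a fetch implementation is available; the placeholders are your registered client values):

const params = new URLSearchParams({
    code: '<authorization code from the Drive flow>',
    client_id: '<client id>',
    client_secret: '<client secret>',
    redirect_uri: '<registered redirect uri>',
    grant_type: 'authorization_code'
});

fetch('https://accounts.google.com/o/oauth2/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: params.toString()
})
    .then(function (res) { return res.json(); })
    .then(function (tokens) {
        // tokens.access_token for immediate use; tokens.refresh_token to store server-side.
        console.log(tokens);
    });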
Handling file content
The second issue is that uploading and accessing the actual Drive file content is not made easy by our Javascript client library. You can still do it but you will have to use custom Javascript code.
Reading the file content
When a file's metadata (a file object) is retrieved, it contains a downloadUrl attribute which points to the actual file content. It is now possible to download the file using a CORS request, and the simplest way to auth is to use the OAuth 2 access token in a URL param. So just append &access_token=... to the downloadUrl and fetch the file using XHR, or forward the user to the URL.
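A minimal sketch of that approach, assuming file is the retrieved metadata object and accessToken holds a valid OAuth 2 token:

var url = file.downloadUrl + '&access_token=' + encodeURIComponent(accessToken);
var xhr = new XMLHttpRequest();
xhr.open('GET', url);
xhr.onload = function () {
    // xhr.responseText now holds the file content.
    console.log(xhr.responseText);
};
xhr.send();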
Uploading file content
UPDATE: The upload endpoints do now support CORS.
~~UPDATE: The upload endpoints, unlike the rest of the Drive API, do not support CORS, so you'll have to use the trick below for now:~~
Uploading a file is tricky because it's not built into the JavaScript client lib and you can't entirely do it with HTTP as described in this response, because we don't allow cross-domain requests on these API endpoints. So you do have to take advantage of the iframe proxy used by our JavaScript client library and use it to send a constructed multipart request to the Drive SDK. Thanks to @Alain, we have a sample of how to do that below:
/**
* Insert new file.
*
* @param {File} fileData File object to read data from.
* @param {Function} callback Callback function to call when the request is complete.
*/
function insertFileData(fileData, callback) {
const boundary = '-------314159265358979323846';
const delimiter = "\r\n--" + boundary + "\r\n";
const close_delim = "\r\n--" + boundary + "--";
var reader = new FileReader();
reader.readAsBinaryString(fileData);
reader.onload = function(e) {
var contentType = fileData.type || 'application/octet-stream';
var metadata = {
'title': fileData.fileName,
'mimeType': contentType
};
var base64Data = btoa(reader.result);
var multipartRequestBody =
delimiter +
'Content-Type: application/json\r\n\r\n' +
JSON.stringify(metadata) +
delimiter +
'Content-Type: ' + contentType + '\r\n' +
'Content-Transfer-Encoding: base64\r\n' +
'\r\n' +
base64Data +
close_delim;
var request = gapi.client.request({
'path': '/upload/drive/v2/files',
'method': 'POST',
'params': {'uploadType': 'multipart'},
'headers': {
'Content-Type': 'multipart/mixed; boundary="' + boundary + '"'
},
'body': multipartRequestBody});
if (!callback) {
callback = function(file) {
console.log(file)
};
}
request.execute(callback);
};
}
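A usage sketch, wiring the function to a file input (assumes a hypothetical <input type="file" id="file-picker"> on the page):

document.getElementById('file-picker').addEventListener('change', function (e) {
    insertFileData(e.target.files[0], function (file) {
        console.log('Uploaded file id: ' + file.id);
    });
});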
To improve all this, in the future we might:
Let developers choose which OAuth 2.0 flow they want to use (server-side or client-side) or let the developer handle the OAuth flow entirely.
Allow CORS on the /upload/... endpoints
Allow CORS on the exportLinks for native gDocs
Make it easier to upload files using our JavaScript client library.
No promises at this point though :)
I did it. Here's my code:
<!DOCTYPE html>
<html>
<head>
<meta charset='utf-8' />
<style>
p {
font-family: Tahoma;
}
</style>
</head>
<body>
<!--Add a button for the user to click to initiate auth sequence -->
<button id="authorize-button" style="visibility: hidden">Authorize</button>
<script type="text/javascript">
var clientId = '######';
var apiKey = 'aaaaaaaaaaaaaaaaaaa';
// To enter one or more authentication scopes, refer to the documentation for the API.
var scopes = 'https://www.googleapis.com/auth/drive';
// Use a button to handle authentication the first time.
function handleClientLoad() {
gapi.client.setApiKey(apiKey);
window.setTimeout(checkAuth,1);
}
function checkAuth() {
gapi.auth.authorize({client_id: clientId, scope: scopes, immediate: true}, handleAuthResult);
}
function handleAuthResult(authResult) {
var authorizeButton = document.getElementById('authorize-button');
if (authResult && !authResult.error) {
authorizeButton.style.visibility = 'hidden';
makeApiCall();
} else {
authorizeButton.style.visibility = '';
authorizeButton.onclick = handleAuthClick;
}
}
function handleAuthClick(event) {
gapi.auth.authorize({client_id: clientId, scope: scopes, immediate: false}, handleAuthResult);
return false;
}
// Load the API and make an API call. Display the results on the screen.
function makeApiCall() {
gapi.client.load('drive', 'v2', function() {
var request = gapi.client.drive.files.list ( {'maxResults': 5 } );
request.execute(function(resp) {
for (var i = 0; i < resp.items.length; i++) {
var titulo = resp.items[i].title;
var fechaUpd = resp.items[i].modifiedDate;
var userUpd = resp.items[i].lastModifyingUserName;
var fileInfo = document.createElement('li');
fileInfo.appendChild(document.createTextNode('TITLE: ' + titulo + ' - LAST MODIF: ' + fechaUpd + ' - BY: ' + userUpd ));
document.getElementById('content').appendChild(fileInfo);
}
});
});
}
</script>
<script src="https://apis.google.com/js/client.js?onload=handleClientLoad"></script>
<p><b>These are 5 files from your GDrive :)</b></p>
<div id="content"></div>
</body>
</html>
You only have to change:
var clientId = '######';
var apiKey = 'aaaaaaaaaaaaaaaaaaa';
to your clientID and ApiKey from your Google API Console :)
Of course you have to create your project on Google API Console, activate the Drive API and activate Google Accounts auth in OAuth 2.0 (really eeeeasy!)
PS: it won't work locally on your PC; it will work on some hosting, and you must provide its URL in the project console :)