How to Use JavaScript and AJAX to Upload into Wasabi Storage

How would I use client-side browser JavaScript with AJAX to upload a file into Wasabi Storage?

Pankaj at Wasabi Tech Support got back to me and said that this code snippet will work just fine. They recommend that one download and use the Amazon AWS SDK for S3, because Wasabi is S3 API compliant.
<!DOCTYPE html>
<html>
<head>
  <script src="https://sdk.amazonaws.com/js/aws-sdk-2.619.0.min.js"></script>
</head>
<body>
  <h1>Wasabi Upload Test</h1>
  <input type="file" id="wasabiupload" onchange="handleFile()" />
  <script>
    function handleFile() {
      // console.log("handle file - " + JSON.stringify(event, null, 2));
      var files = document.getElementById('wasabiupload').files;
      if (!files.length) {
        return alert('Please choose a file to upload first.');
      }
      var f = files[0];
      var fileName = f.name;
      const s3 = new AWS.S3({
        correctClockSkew: true,
        endpoint: 'https://s3.wasabisys.com', // use the endpoint matching the bucket's region
        accessKeyId: 'Wasabi-Access-Key',
        secretAccessKey: 'Wasabi-Secret-Access-Key',
        region: 'us-east-1',
        logger: console
      });
      console.log('Loaded');
      const uploadRequest = new AWS.S3.ManagedUpload({
        // upload the object under the selected file's own name
        params: { Bucket: 'bucket-name', Key: fileName, Body: f },
        service: s3
      });
      uploadRequest.on('httpUploadProgress', function (event) {
        const progressPercentage = Math.floor(event.loaded * 100 / event.total);
        console.log('Upload progress ' + progressPercentage);
      });
      console.log('Configured and sending');
      uploadRequest.send(function (err) {
        if (err) {
          console.log('UPLOAD ERROR: ' + JSON.stringify(err, null, 2));
        } else {
          console.log('Good upload');
        }
      });
    }
  </script>
</body>
</html>
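A caveat on the snippet above: any access keys embedded in browser JavaScript are visible to every visitor, so for anything beyond a quick test you would normally keep the keys server-side and hand the browser a pre-signed URL instead. A minimal sketch of the browser half, assuming a hypothetical /presign backend endpoint that returns { url }:
```
// Sketch only: the /presign endpoint and its { url } response shape
// are assumptions, not part of Wasabi's answer above.
async function uploadViaPresignedUrl(file) {
  const resp = await fetch('/presign?key=' + encodeURIComponent(file.name));
  const { url } = await resp.json();
  // PUT the raw file body to the signed URL; no credentials reach the browser
  const put = await fetch(url, {
    method: 'PUT',
    headers: { 'Content-Type': file.type },
    body: file
  });
  if (!put.ok) {
    throw new Error('Upload failed: ' + put.status);
  }
}
```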

Related

Google Drive API - Save files

I volunteer for an association, and I wanted to create a landing page where the volunteers upload a picture, send it, and have it saved in my Google Drive.
After many attempts, I get an error at the moment of sending. Below is the code I wrote (the "---" values are the project's details).
Thanks!
```
<!DOCTYPE html>
<html>
<head>
  <title>Upload Image to Google Drive</title>
  <script src="https://apis.google.com/js/api.js"></script>
  <script>
    const CLIENT_ID = '------';
    const API_KEY = '-----';
    const PUBLIC_FOLDER_ID = '-----';
    // Authorization scopes required by the API
    const SCOPES = 'https://www.googleapis.com/auth/drive.file';

    /**
     * Load the API client library and authorize the user.
     */
    function handleClientLoad() {
      gapi.load('client:auth2', initClient);
    }

    function initClient() {
      gapi.client.init({
        apiKey: API_KEY,
        clientId: CLIENT_ID,
        discoveryDocs: ["https://www.googleapis.com/discovery/v1/apis/drive/v3/rest"],
        scope: SCOPES,
        plugin_name: 'demoApp'
      }).then(() => {
        // Listen for form submit events
        document.getElementById('upload-form').addEventListener('submit', uploadImage);
      }).catch(error => {
        console.error('Error initializing API client:', error);
      });
    }

    /**
     * Upload an image to Google Drive.
     */
    function uploadImage(event) {
      event.preventDefault();
      const file = document.getElementById('file-input').files[0];
      if (!file) {
        console.error('No file selected.');
        return;
      }
      const metadata = {
        name: file.name,
        parents: [PUBLIC_FOLDER_ID]
      };
      const reader = new FileReader();
      reader.onload = function(e) {
        const fileContent = e.target.result;
        const fileData = new Blob([fileContent], { type: file.type });
        const uploadRequest = gapi.client.drive.files.create({
          resource: metadata,
          media: {
            mimeType: file.type,
            body: fileData
          },
          fields: 'id'
        });
        uploadRequest.execute(response => {
          console.log('Image uploaded with ID:', response.id);
        });
      };
      reader.readAsArrayBuffer(file);
    }
  </script>
</head>
<body onload="handleClientLoad()">
  <h1>Upload Image to Google Drive</h1>
  <form id="upload-form">
    <input type="file" id="file-input">
    <button type="submit">Upload</button>
  </form>
</body>
</html>
```
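A hedged note, not from the original thread: in the browser client, gapi.client.drive.files.create built from the discovery document does not transmit the media body, which is a common reason uploads like the one above fail. File content generally has to be sent as a multipart request against the /upload/drive/v3/files endpoint via gapi.client.request. A sketch under that assumption, reusing the file and metadata from the code above:
```
// Sketch: multipart upload through gapi.client.request, since the
// discovery-based files.create ignores the media body in the browser.
function uploadViaMultipart(file, metadata) {
  const boundary = '-------314159265358979323846';
  const delimiter = '\r\n--' + boundary + '\r\n';
  const closeDelim = '\r\n--' + boundary + '--';

  const reader = new FileReader();
  reader.readAsDataURL(file); // yields "data:<mime>;base64,<data>"
  reader.onload = () => {
    const base64Data = reader.result.split(',')[1];
    const body =
      delimiter +
      'Content-Type: application/json\r\n\r\n' +
      JSON.stringify(metadata) +
      delimiter +
      'Content-Type: ' + file.type + '\r\n' +
      'Content-Transfer-Encoding: base64\r\n\r\n' +
      base64Data +
      closeDelim;

    gapi.client.request({
      path: '/upload/drive/v3/files',
      method: 'POST',
      params: { uploadType: 'multipart' },
      headers: { 'Content-Type': 'multipart/related; boundary=' + boundary },
      body: body
    }).then(
      resp => console.log('Image uploaded with ID:', resp.result.id),
      err => console.error('Upload failed:', err)
    );
  };
}
```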

net::ERR_NAME_NOT_RESOLVED while uploading picture to s3 bucket

I am using Angular to create my application. I am trying to upload a picture to an S3 bucket, but every time I try uploading it, this error shows in my console.
Here's my upload.service.ts file code:
uploadBarcode(file: any) {
  const uniqueId: number = Math.floor(Math.random() * Date.now());
  const file_name: string = 'lab_consignments/' + uniqueId + "_" + file.name;
  const contentType = file.type;
  console.log(contentType);
  const bucket = new S3({
    accessKeyId: myAccessKey,
    secretAccessKey: secretAccessKey,
    region: 'Asia Pacific (Mumbai) ap-south-1'
  });
  const params = {
    Bucket: 'chc-uploads',
    Key: file_name,
    Body: file,
    ACL: 'public-read',
    ContentType: contentType
  };
  return bucket.upload(params, function (err, data) {
    if (err) {
      console.log('There was an error uploading your file: ', err);
      localStorage.setItem('docDataStatus', 'false');
      return false;
    } else {
      console.log('Successfully uploaded file from service');
      return true;
    }
  });
}
The access key and secret access key are statically typed in my code, so don't worry about that; I just changed them to variable names for security reasons while posting this question to Stack Overflow.
Any help would be appreciated.
Though this does not answer your initial question: as Marcin said, hard-coding your AWS credentials into your code is very bad practice and should be avoided at all costs. For frontend applications, you can instead have a simple API endpoint generate a signed upload URL for the S3 bucket:
https://aws.amazon.com/blogs/developer/generate-presigned-url-modular-aws-sdk-javascript/
To actually answer your question: you are likely seeing the error because the region you are passing is incorrectly formatted. It should be the plain region code:
const bucket = new S3({
  region: 'ap-south-1'
});
https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/Config.html#region-property
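As a hedged sketch of that pre-signed-URL approach (the Express endpoint and route are assumptions for illustration, not part of the original answer), the backend half could look roughly like this with the same v2 SDK:
```
// Sketch only: a tiny backend endpoint that issues a pre-signed PUT URL,
// so no AWS credentials ever ship with the Angular bundle.
const AWS = require('aws-sdk');
const express = require('express');

const app = express();
const s3 = new AWS.S3({ region: 'ap-south-1' }); // credentials come from the environment

app.get('/presign', (req, res) => {
  const url = s3.getSignedUrl('putObject', {
    Bucket: 'chc-uploads',
    Key: 'lab_consignments/' + req.query.key, // object key sent by the frontend
    Expires: 300 // seconds the URL stays valid
  });
  res.json({ url });
});

app.listen(3000);
```
The Angular service would then do a plain HTTP PUT of the file to the returned URL instead of calling bucket.upload with embedded keys.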

MTurk how to upload file to S3 when task is submitted

I am trying to create a task on Amazon MTurk where the workers collect some data and upload a single file when they are ready, then submit the task. When the task is submitted, I want to upload the file to my linked S3 bucket; this is mostly based on this tutorial.
However, the file is sometimes uploaded successfully and sometimes not. Since the S3.upload function is asynchronous, it looks like the task submission sometimes completes before the file upload does. I am a JavaScript newbie: I tried to make this happen synchronously, but it still doesn't work properly. Here is my JavaScript code:
<script>
  let config = {
    region: 'xxx',
    pool: 'xxx',
    bucket: 'xxx'
  };
  AWS.config.region = config.region;
  AWS.config.credentials = new AWS.CognitoIdentityCredentials({
    IdentityPoolId: config.pool,
  });
  var s3 = new AWS.S3({
    apiVersion: '2006-03-01',
    params: { Bucket: config.bucket },
  });

  start_upload = function (event) {
    $("#status").text("Uploading...");
    let file = $("#file").prop('files')[0];
    if (file === null || file === undefined) {
      alert("You must upload a file before submitting.");
      $("#status").text("");
      return false;
    }
    console.log('Filename: ' + file.name);
    let workerId = turkGetParam('workerId');
    let fileKey = '${food_name}' + '/' + workerId + '-' + file.name;
    return upload_to_s3(file, fileKey);
  };

  upload_to_s3 = async (file, fileKey) => {
    const params = {
      Key: fileKey,
      Body: file,
      ContentType: file.type,
      ACL: 'bucket-owner-full-control'
    };
    try {
      console.log("Starting upload...");
      const data = await s3.upload(params).promise();
      console.log("Done uploading file");
      $("#status").text("Success.");
      return true;
    } catch (err) {
      console.log("Error uploading data. ", err);
      alert("Failed to upload, please try again. If the problem persists, contact the Requester.");
      $("#status").text("");
      return false;
    }
  };

  // Validate and upload file on submit
  window.onload = function () {
    document.getElementById('submitButton').setAttribute('onclick', 'return start_upload()');
  };
</script>
How can I make sure that the file upload is completed before the task is submitted? I saw that I can override the default submit button added by MTurk, but I would prefer not to do that if possible.
I've found the problem: S3#upload returns a ManagedUpload object immediately, which does not mean that the file upload has completed. I am now using promises, and in the success callback I submit the form manually. Note that the form is provided by MTurk by default; I just find it by its ID and invoke its submit function manually.
For reference, here is the working code:
<script>
  let config = {
    region: 'xxx',
    pool: 'xxx',
    bucket: 'xxx'
  };
  AWS.config.region = config.region;
  AWS.config.credentials = new AWS.CognitoIdentityCredentials({
    IdentityPoolId: config.pool,
  });
  var s3 = new AWS.S3({
    apiVersion: '2006-03-01',
    params: { Bucket: config.bucket },
  });

  start_upload = function (event) {
    $("#status").text("Uploading, please wait...");
    let file = $("#file").prop('files')[0];
    if (file === null || file === undefined) {
      alert("You must choose a file before submitting.");
      $("#status").text("");
      return false;
    }
    let workerId = turkGetParam('workerId');
    let fileKey = '${food_name}' + '/' + workerId + '-' + file.name;
    upload_to_s3(file, fileKey);
    // Always cancel the default submit; the form is submitted
    // programmatically once the upload has finished.
    return false;
  };

  upload_to_s3 = (file, fileKey) => {
    const params = {
      Key: fileKey,
      Body: file,
      ContentType: file.type,
      ACL: 'bucket-owner-full-control'
    };
    let promise = s3.upload(params).promise();
    promise.then((data) => {
      console.log("Upload completed");
      $("#status").text("Success.");
      const form = document.getElementById('mturk_form');
      form.submit();
    }, (err) => {
      console.log("Upload failed!!!", err);
      alert("Failed to upload, please try again. If the problem persists, contact the Requester.");
      $("#status").text("");
    });
  };

  // Validate and upload file on submit
  window.onload = function () {
    document.getElementById('submitButton').setAttribute('onclick', 'return start_upload()');
  };
</script>

AWS S3 Upload after GET Request to Image, Not Uploading Correctly

I'm trying to upload an image to my AWS S3 bucket after downloading the image from another URL using Node (using request-promise-native & aws-sdk):
'use strict';

const config = require('../../../configs');
const AWS = require('aws-sdk');
const request = require('request-promise-native');

AWS.config.update(config.aws);
let s3 = new AWS.S3();

function uploadFile(req, res) {
  function getContentTypeByFile(fileName) {
    var rc = 'application/octet-stream';
    var fn = fileName.toLowerCase();
    if (fn.indexOf('.png') >= 0) rc = 'image/png';
    else if (fn.indexOf('.jpg') >= 0) rc = 'image/jpg';
    return rc;
  }

  let body = req.body,
      params = {
        "ACL": "bucket-owner-full-control",
        "Bucket": 'testing-bucket',
        "ContentType": null, // note: the S3 SDK expects "ContentType", not "Content-Type"
        "Key": null,  // Name of the file
        "Body": null  // File body
      };

  // Grab the filename from the URL
  params.Key = body.url.substring(body.url.lastIndexOf('/') + 1);
  // Set the content type
  params.ContentType = getContentTypeByFile(params.Key);

  request.get(body.url)
    .then(response => {
      params.Body = response;
      s3.putObject(params, (err, data) => {
        if (err) { console.log(`Error uploading to S3 - ${err}`); }
        if (data) { console.log("Success - Uploaded to S3: " + data.toString()); }
      });
    })
    .catch(err => { console.log(`Error encountered: ${err}`); });
}
The upload succeeds when I test it out; however, after trying to redownload it from my bucket, the image won't display. Additionally, I notice that after uploading the file with my function, the file listed in the bucket is much larger in file size than the originally uploaded image. I'm trying to figure out where I've gone wrong but cannot find it. Any help is appreciated.
Try opening the faulty file with a text editor; you will see some errors written in it.
You can also try using s3.upload instead of putObject; it works better with streams.
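As a hedged sketch of that suggestion (reusing the s3 client from the question; the plain request package, not the promise wrapper, is what exposes a stream): either pipe the download straight into s3.upload, or, if you stay with request-promise-native, pass encoding: null so the body arrives as a Buffer rather than a mangled UTF-8 string, which would also explain the inflated file size.
```
// Option 1: stream the download directly into s3.upload
// (aws-sdk v2 ManagedUpload accepts readable streams as Body).
const requestStream = require('request'); // plain request, for streaming

function uploadFromUrl(url, key) {
  return s3.upload({
    Bucket: 'testing-bucket',
    Key: key,
    Body: requestStream.get(url) // readable stream, no buffering in memory
  }).promise();
}

// Option 2: keep request-promise-native but ask for a Buffer, since
// its default encoding decodes binary data as a string.
const rp = require('request-promise-native');

async function uploadFromUrlBuffered(url, key) {
  const body = await rp.get(url, { encoding: null }); // resolves with a Buffer
  return s3.upload({ Bucket: 'testing-bucket', Key: key, Body: body }).promise();
}
```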

Uploading base64 encoded Image to Amazon S3 via Node.js

Yesterday I did a deep-night coding session and created a small Node.js/JS app (well, actually CoffeeScript, but CoffeeScript is just JavaScript, so let's say JS).
What's the goal:
client sends a canvas data URI (png) to the server (via socket.io)
server uploads the image to Amazon S3
Step 1 is done.
The server now has a string like
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAADICAYAAACt...
My question is: what are my next steps to "stream"/upload this data to Amazon S3 and create an actual image there?
knox https://github.com/LearnBoost/knox seems like an awesome lib to PUT something to S3, but what I'm missing is the glue between the base64-encoded image string and the actual upload action.
Any ideas, pointers and feedback welcome.
For people who are still struggling with this issue, here is the approach I used with the native aws-sdk:
var AWS = require('aws-sdk');
AWS.config.loadFromPath('./s3_config.json');
var s3Bucket = new AWS.S3({ params: { Bucket: 'myBucket' } });
Inside your router method (ContentType should be set to the content type of the image file):
var buf = Buffer.from(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ""), 'base64');
var data = {
  Key: req.body.userId,
  Body: buf,
  ContentEncoding: 'base64',
  ContentType: 'image/jpeg'
};
s3Bucket.putObject(data, function (err, data) {
  if (err) {
    console.log(err);
    console.log('Error uploading data: ', data);
  } else {
    console.log('successfully uploaded the image!');
  }
});
s3_config.json file:
{
  "accessKeyId": "xxxxxxxxxxxxxxxx",
  "secretAccessKey": "xxxxxxxxxxxxxx",
  "region": "us-east-1"
}
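For context, a minimal router wiring for the snippet above might look like the following sketch (the Express app, route path, and request fields imageBinary/userId are assumptions for illustration, not prescribed by the answer):
```
// Illustrative Express wiring; route path and request fields are assumed.
const express = require('express');
const app = express();
app.use(express.json({ limit: '10mb' })); // data URIs can be large

app.post('/upload', function (req, res) {
  const buf = Buffer.from(
    req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ''), 'base64');
  s3Bucket.putObject({
    Key: req.body.userId,
    Body: buf,
    ContentEncoding: 'base64',
    ContentType: 'image/jpeg'
  }, function (err) {
    if (err) return res.status(500).send('upload failed');
    res.send('uploaded');
  });
});
```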
Here's the code from an article I came across:
const imageUpload = async (base64) => {
  const AWS = require('aws-sdk');
  const { ACCESS_KEY_ID, SECRET_ACCESS_KEY, AWS_REGION, S3_BUCKET } = process.env;

  AWS.config.setPromisesDependency(require('bluebird'));
  AWS.config.update({ accessKeyId: ACCESS_KEY_ID, secretAccessKey: SECRET_ACCESS_KEY, region: AWS_REGION });

  const s3 = new AWS.S3();

  // Strip the data URI prefix and decode the remainder as base64
  const base64Data = Buffer.from(base64.replace(/^data:image\/\w+;base64,/, ""), 'base64');
  const type = base64.split(';')[0].split('/')[1];
  const userId = 1;

  const params = {
    Bucket: S3_BUCKET,
    Key: `${userId}.${type}`, // type is not required
    Body: base64Data,
    ACL: 'public-read',
    ContentEncoding: 'base64', // required
    ContentType: `image/${type}` // required. Notice the back ticks
  };

  let location = '';
  let key = '';
  try {
    const { Location, Key } = await s3.upload(params).promise();
    location = Location;
    key = Key;
  } catch (error) {
    console.error(error); // don't swallow upload errors silently
  }
  console.log(location, key);
  return location;
};

module.exports = imageUpload;
Read more: http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property
Credits: https://medium.com/@mayneweb/upload-a-base64-image-data-from-nodejs-to-aws-s3-bucket-6c1bd945420f
OK, this one answers how to save the canvas data to a file. Basically it looks like this in my code:
buf = new Buffer(data.dataurl.replace(/^data:image\/\w+;base64,/, ""), 'base64')
req = knoxClient.put('/images/' + filename, {
  'Content-Length': buf.length,
  'Content-Type': 'image/png'
})
req.on('response', (res) ->
  if res.statusCode is 200
    console.log('saved to %s', req.url)
    socket.emit('upload success', imgurl: req.url)
  else
    console.log('error %d', res.statusCode) # the status code lives on the response
)
req.end(buf)
The accepted answer works great, but if you need to accept any file type instead of just images, this regexp does the job:
/^data:.+;base64,/
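For example (illustrative only), a PDF data URI decodes the same way with the broader pattern:
```
// The generalized regexp strips the prefix from any data URI, not just images.
const dataUri = 'data:application/pdf;base64,JVBERi0xLjQK'; // "%PDF-1.4\n"
const buf = Buffer.from(dataUri.replace(/^data:.+;base64,/, ''), 'base64');
console.log(buf.toString()); // -> %PDF-1.4
```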
For Laravel developers, this should work:
/* upload the file */
$path = Storage::putFileAs($uploadfolder, $uploadFile, $fileName, "s3");
Make sure to set up your .env file properly before calling this method.
