I'm trying to retrieve a profile picture from the Microsoft Graph API. The code below shows my approach.
private getProfilePicture(bearerToken: string, tenantId: string): void {
    let command = CommandHelper.createRestBlobCommand([
        { name: 'https://graph.microsoft.com/beta' },
        { name: 'me/photo/$value' }
    ]);
    command.Headers.push({ name: 'Authorization', value: 'bearer ' + bearerToken });
    // Load the photo from the Graph API.
    this._gateway.send(command).subscribe(result => {
        let info = result.payload;
        let Base64 = require('js-base64').Base64;
        let temp = 'data:image/bmp;base64,' + Base64.encode(info);
        console.log(temp);
        let action = actions.AuthenticationActions.photoLoaded(info);
        this._store.dispatch(action);
    });
}
The problem, however, is that when I look at the output it returns:
data:image/bmp;base64,W29iamVjdCBCbG9iXQ==
which decodes to [object Blob].
My question is: how do I get the actual image out of this?
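For reference: result.payload here is a Blob, and Base64-encoding it just stringifies it to "[object Blob]". A Blob has to be read asynchronously before you can get at its bytes; a minimal sketch, assuming result.payload is a browser Blob:

// Sketch: read the Blob asynchronously; FileReader.readAsDataURL
// already produces a complete data URI, including the MIME type.
const reader = new FileReader();
reader.onloadend = () => {
    console.log(reader.result); // e.g. "data:image/jpeg;base64,..."
    // dispatch the data URI instead of the raw Blob if the store expects a string
};
reader.readAsDataURL(info);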
With PHP this would be much simpler; in Node.js you have to do it inside the handler that receives req as a parameter, like this:
app.get('/...', function(req, res) {
    var base64Data = req.rawBody.replace(/^data:image\/bmp;base64,/, "");
    require("fs").writeFile("out.bmp", base64Data, 'base64', function(err) {
        console.log(err);
    });
});
The full path to the endpoint with the query string parameters is:
https://api.mydomain.com/getData?param_01=value_01&param_02=value_01
After importing the 'aws-api-gateway-client' package:
var apigClientFactory = require('aws-api-gateway-client').default;
I go ahead and configure the variables:
let url = 'https://api.mydomain.com';
let pathTemplate = '/getData?param_01=value_01&param_02=value_01';
let method = 'GET';
let params = '';
let additionalParams = '';
let body = '';
var client = apigClientFactory.newClient({
    invokeUrl: url,
    accessKey: 'my-accessKeyId',
    secretKey: 'my-secretAccessKey',
    sessionToken: 'my-sessionToken',
    region: 'MY_AWS_REGION'
});
Next, I invoke the endpoint with:
client
    .invokeApi(params, pathTemplate, method, additionalParams, body)
    .then(function(res) {
        console.log("...res:", res);
    })
    .catch(function(err) {
        console.log("...err:", err);
    });
But it fails with the error:
The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Consult the service documentation for details
Is there a way to send the queryStringParameters with invokeApi command?
The signature mismatch most likely happens because the query string is embedded in pathTemplate, which breaks the signed canonical request; pass the parameters through additionalParams.queryParams instead:
let params = {};
let pathTemplate = '/getData';
let additionalParams = {
    queryParams: {
        param0: 'value0',
        param1: 'value1'
    }
};
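With the client configured as above, a sketch of the full call using these values; the signature then matches because the query string travels through queryParams rather than being baked into the path:

// Sketch: same invokeApi signature as before, with the
// query string carried in additionalParams.queryParams.
client
    .invokeApi(params, pathTemplate, method, additionalParams, body)
    .then(function(res) {
        console.log("...res:", res);
    })
    .catch(function(err) {
        console.log("...err:", err);
    });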
See the aws-api-gateway-client package on npm for the full parameter reference.
I wrote a function in Solidity that looks like this:
function AddData(uint _index, string _projectName, string _devAddress, string _developer) public {
    Datas[_index] = Data(_index, 0, _projectName, _devAddress, _developer);
}
Note that this is just a fragment of the whole Solidity code. In JS, I'm trying to pass variables into the function, but it doesn't seem to be working. I assume the JS variables are not making it into the Solidity string parameters.
counterDB.AddData(id_db, projectName, devAddress, developer, function (err, result) {
    if (err) {
        console.log('Error: ' + err);
    } else {
        console.log(result);
    }
});
The variables passed in are data I pulled out of the database to pass into the smart contract. I checked that every value has been pulled in properly, but I can't pass the data into the function. Am I missing a function to convert the variables into strings?
Are you using web3 v1.0? Have you tried calling the function like this:
var yourContract = new web3.eth.Contract(ABI, contractAddress);
const contractFunction = yourContract.methods.AddData(id_db, projectName, devAddress, developer);
const functionBytes = contractFunction.encodeABI();
const rawTx = {
    gasLimit: web3.utils.toHex(200000),
    to: contractAddress,
    from: addressFrom,
    data: functionBytes
};
web3.eth.accounts.signTransaction(rawTx, privateKey)
    .then(RLPencodedTx => {
        web3.eth.sendSignedTransaction(RLPencodedTx['rawTransaction'])
            .on('error', error => { callback(null, error) })
            .on('receipt', receipt => { callback(receipt) });
    });
Keep in mind that you have to decrypt the private key if you are using the JSON export of it:
var path = process.cwd();
const exportedAccountString = fs.readFileSync(path + '/your_key_file.json').toString();
const decrypted = web3.eth.accounts.decrypt(exportedAccountString, 'YourPasswordHere...');
const privateKey = decrypted.privateKey;
I'm currently working with Node.js using the watson-developer-cloud Node.js SDK and I'm having problems when sending a query that includes an entity.
This is my code:
// require watson's node sdk and fs
var watson = require('watson-developer-cloud');
var fs = require('fs');
// Define output file
var outputJSONFile = '/home/vagrant/Desktop/node/dir/data.json';
// Create alchemy_data_news object using our api_key
var alchemy_data_news = watson.alchemy_data_news({
    api_key: ''
});
// Define params for the query and what values to return.
// Accepted return values:
// docs.alchemyapi.com/v1.0/docs/full-list-of-supported-news-api-fields
var params = {
    start: 'now-1m',
    end: 'now',
    count: 2,
    qs: ['q.enriched.url.enrichedTitle.entities.entity.text=apple'],
    return: ['enriched.url.url,enriched.url.title']
};
// Call the getNews method and write the JSON response to disk.
alchemy_data_news.getNews(params, function (err, news) {
    if (err) {
        console.log('error:', err);
    } else {
        fs.writeFile(outputJSONFile, JSON.stringify(news, null, 2), function (err) {
            if (err) {
                console.log('WriteFile Error:', err);
            } else {
                console.log('JSON saved to ' + outputJSONFile);
            }
        });
    }
});
I'm still trying to figure out how to send the entity parameters using the params object. While digging through some code I came across qs, so I have been using that to test, but I haven't had any success. Any suggestions are greatly appreciated.
P.S: I'm trying to pass:
q.enriched.url.enrichedTitle.entities.entity.text=apple
q.enriched.url.enrichedTitle.entities.entity.type=company
If you look at the node-sdk source code for AlchemyDataNews, you will see that the top-level parameters are sent as query strings. The params map should then be:
var params = {
    start: 'now-1m',
    end: 'now',
    count: 2,
    return: ['enriched.url.url,enriched.url.title'],
    // entity filters go here, as ordinary top-level keys
    'q.enriched.url.enrichedTitle.entities.entity.text': 'apple',
    'q.enriched.url.enrichedTitle.entities.entity.type': 'company'
};
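A minimal sketch of the resulting call, reusing the getNews callback shape from the question:

// Sketch: with the filters as plain top-level keys, no qs array is needed.
alchemy_data_news.getNews(params, function (err, news) {
    if (err) {
        console.log('error:', err);
    } else {
        console.log(JSON.stringify(news, null, 2));
    }
});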
I need to convert some JSON to XML with this library, in order to send data to the database in a POST request I am working on.
This is what req.body returns:
{ DealerName: 'amrcl',
CardId: '123',
Nickname: 'mkm123',
Picture: 'http://lorempixel.com/150/150/',
Active: '1',
LegalId: '1',
TypeId: '1'
}
This is dynamic data. Let me show you the code:
export default function (req, res) {
    try {
        const connection = new sql.Connection(spConfig, function spConnection (errSpConnection) {
            const request = connection.request();
            if (errSpConnection) {
                res.status(401);
            }
            request.input('Dealer_Param', sql.VarChar(1000), req.body);
            request.input('param_IS_DEBUG', sql.Bit, null);
            request.output('output_IS_SUCCESSFUL', sql.Bit);
            request.output('output_STATUS', sql.VarChar(500));
            request.execute('[mydbo].[StoredProcedure]', function spExecution (errSpExecution, dataset) {
                connection.close();
                if (errSpExecution) {
                    res.status(401);
                } else {
                    if (request.parameters.output_IS_SUCCESSFUL.value) {
                        res.status(200).json({
                            success: 'New dealer successfully inserted.',
                        });
                    }
                }
            });
        });
    } catch (err) {
        // added: a try block requires a catch to be valid syntax
        res.status(500);
    }
}
That is the stored-procedure approach, using the mssql module.
As you can see in the code above, it fails because in request.input('Dealer_Param', sql.VarChar(1000), req.body); I am sending the JSON I pasted at the beginning of the question. But if instead of req.body I put this XML with dummy data '<Dealers><Detail DealerName = "TESTING123" CardId = "1222" NickName = "tester123" Active = "1" LegalId = "16545" TypeId = "1"></Detail></Dealers>', then everything works fine, because the DB needs to receive XML.
So, what are your recommendations? What should I do to send the JSON data as XML?
I would just load the json2xml library:
Install it: $ npm install json2xml
Import the module in your code: var json2xml = require("json2xml");
Then convert the JSON to XML like so:
var key,
    attrs = [];
for (key in req.body) {
    if (req.body.hasOwnProperty(key)) {
        var obj = {};
        obj[key] = req.body[key]; // use bracket notation; obj.key would literally set a property named "key"
        attrs.push(obj);
    }
}
var dealerXml = json2xml({ dealer: req.body, attr: attrs }, { attributes_key: 'attr' });
request.input('Dealer_Param', sql.VarChar(1000), dealerXml);
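Alternatively, if the stored procedure expects exactly the <Dealers><Detail .../></Dealers> shape shown in the question, a plain string template is enough. A sketch, with the field names taken from the posted req.body and the values assumed to be validated and escaped already:

// Sketch: build the exact XML shape the stored procedure accepts.
var b = req.body;
var dealerXml = '<Dealers><Detail DealerName="' + b.DealerName +
    '" CardId="' + b.CardId + '" NickName="' + b.Nickname +
    '" Active="' + b.Active + '" LegalId="' + b.LegalId +
    '" TypeId="' + b.TypeId + '"></Detail></Dealers>';
request.input('Dealer_Param', sql.VarChar(1000), dealerXml);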
Yesterday I did a deep-night coding session and created a small Node.js/JS app (well, actually CoffeeScript, but CoffeeScript is just JavaScript, so let's say JS).
What's the goal:
1. client sends a canvas data URI (png) to the server (via socket.io)
2. server uploads the image to Amazon S3
Step 1 is done.
The server now has a string like
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAADICAYAAACt...
My question is: what are my next steps to "stream"/upload this data to Amazon S3 and create an actual image there?
knox (https://github.com/LearnBoost/knox) seems like an awesome lib to PUT something to S3, but what I'm missing is the glue between the base64-encoded image string and the actual upload action.
Any ideas, pointers, and feedback welcome.
For people who are still struggling with this issue, here is the approach I used with the native aws-sdk:
var AWS = require('aws-sdk');
AWS.config.loadFromPath('./s3_config.json');
var s3Bucket = new AWS.S3( { params: {Bucket: 'myBucket'} } );
Inside your router method (ContentType should be set to the content type of the image file):
var buf = Buffer.from(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ""), 'base64');
var data = {
    Key: req.body.userId,
    Body: buf,
    ContentEncoding: 'base64',
    ContentType: 'image/jpeg'
};
s3Bucket.putObject(data, function(err, data) {
    if (err) {
        console.log(err);
        console.log('Error uploading data: ', data);
    } else {
        console.log('successfully uploaded the image!');
    }
});
The s3_config.json file:
{
    "accessKeyId": "xxxxxxxxxxxxxxxx",
    "secretAccessKey": "xxxxxxxxxxxxxx",
    "region": "us-east-1"
}
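For context, a sketch of how this can sit in an Express route, assuming an Express app and the s3Bucket configured above; the '/upload' path and the imageBinary/userId field names are illustrative assumptions, not part of the original answer:

// Hypothetical Express route tying the pieces above together.
app.post('/upload', function (req, res) {
    var buf = Buffer.from(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ""), 'base64');
    s3Bucket.putObject({
        Key: req.body.userId,
        Body: buf,
        ContentEncoding: 'base64',
        ContentType: 'image/jpeg'
    }, function (err) {
        if (err) return res.status(500).send('upload failed');
        res.send('uploaded');
    });
});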
Here's the code from an article I came across, posted below:
const imageUpload = async (base64) => {
    const AWS = require('aws-sdk');
    const { ACCESS_KEY_ID, SECRET_ACCESS_KEY, AWS_REGION, S3_BUCKET } = process.env;
    AWS.config.setPromisesDependency(require('bluebird'));
    AWS.config.update({ accessKeyId: ACCESS_KEY_ID, secretAccessKey: SECRET_ACCESS_KEY, region: AWS_REGION });
    const s3 = new AWS.S3();
    // Buffer.from is a static method, so no `new`
    const base64Data = Buffer.from(base64.replace(/^data:image\/\w+;base64,/, ""), 'base64');
    const type = base64.split(';')[0].split('/')[1];
    const userId = 1;
    const params = {
        Bucket: S3_BUCKET,
        Key: `${userId}.${type}`, // type is not required
        Body: base64Data,
        ACL: 'public-read',
        ContentEncoding: 'base64', // required
        ContentType: `image/${type}` // required. Notice the back ticks
    };
    let location = '';
    let key = '';
    try {
        const { Location, Key } = await s3.upload(params).promise();
        location = Location;
        key = Key;
    } catch (error) {
        console.error(error); // surface upload failures instead of swallowing them
    }
    console.log(location, key);
    return location;
};
module.exports = imageUpload;
Read more: http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property
Credits: https://medium.com/@mayneweb/upload-a-base64-image-data-from-nodejs-to-aws-s3-bucket-6c1bd945420f
OK, this is the answer for how to save canvas data to a file.
Basically it looks like this in my code:
buf = Buffer.from(data.dataurl.replace(/^data:image\/\w+;base64,/, ""), 'base64')
req = knoxClient.put('/images/' + filename, {
    'Content-Length': buf.length,
    'Content-Type': 'image/png'
})
req.on('response', (res) ->
    if res.statusCode is 200
        console.log('saved to %s', req.url)
        socket.emit('upload success', imgurl: req.url)
    else
        console.log('error %d', res.statusCode) # the status code is on the response, not the request
)
req.end(buf)
The accepted answer works great, but if someone needs to accept any file instead of just images, this regexp does the trick:
/^data:.+;base64,/
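A small sketch of that regexp in use, also pulling the MIME type out of the data URI header (the variable names are illustrative):

// Sketch: split any data URI into its MIME type and decoded payload.
var dataUri = req.body.file; // e.g. 'data:application/pdf;base64,JVBERi0...'
var contentType = dataUri.substring(5, dataUri.indexOf(';')); // 'application/pdf'
var payload = Buffer.from(dataUri.replace(/^data:.+;base64,/, ""), 'base64');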
For Laravel developers, this should work:
/* upload the file */
$path = Storage::putFileAs($uploadfolder, $uploadFile, $fileName, "s3");
Make sure to set your .env file properties before calling this method.