Tweeting w/ media fails using Firebase Cloud Functions - javascript

My Firebase web project had been working for several months. But on Sunday, June 3, 2018, my application stopped sending tweets with media (images) attached. I have not changed the failing code, and I have even reverted to code that worked before that date, but the app still fails :(
SDK versions:
I am using the most up-to-date versions of Firebase tools (3.18.6) and Cloud Functions (1.0.3), along with twit (2.2.10), a JavaScript library for the Twitter API.
Note that my project also worked on older versions of the above, including pre-1.0 Cloud Functions. Also note that I am still able to send regular text tweets, just not ones with any media attached (image, GIF, MP4).
This mainly relates to Twitter's API, but I cannot rule out something funky going on in Firebase's Node.js environment.
How to reproduce:
For simplicity, I will link to the code in a tutorial which I originally used when starting the project.
Set up a Twitter account and retrieve all the necessary tokens as outlined in the tutorial. Then simply call the Cloud Function below and it will attempt to tweet the NASA image of the day.
The function is able to upload the picture to Twitter's server, and the response I get is as expected:
{ media_id: 1004461244487643100,
  media_id_string: '1004461244487643136',
  media_key: '5_1004461244487643136',
  size: 92917,
  expires_after_secs: 86400,
  image: { image_type: 'image/jpeg', w: 960, h: 1318 } }
However, once it attempts to post the tweet with the media attached, I receive an error:
code 324: 'Unsupported raw media category'
That message doesn't appear anywhere in Twitter's docs: https://developer.twitter.com/en/docs/basics/response-codes.html
Code 324 does exist there, but with a different description:
"The validation of media ids failed"
which is not the message I'm getting. So my media id appears to be valid, and something else must be wrong. Nowhere on the internet can I find someone with this exact error.
Link to tutorial code:
https://medium.freecodecamp.org/how-to-build-and-deploy-a-multifunctional-twitter-bot-49e941bb3092
JavaScript code that reproduces the issue:
index.js:
'use strict';

const functions = require('firebase-functions');
const request = require('request');
const path = require('path');
const os = require('os');
const fs = require('fs');
const tmpDir = os.tmpdir(); // ref to the temporary dir on the worker machine
const Twit = require('twit');
const T = new Twit({
  consumer_key: 'your twitter key',
  consumer_secret: 'your twitter secret',
  access_token: 'your twitter token',
  access_token_secret: 'your twitter token secret'
});

exports.http_testMediaTweet = functions.https.onRequest((req, res) => {
  // Fetch the NASA Astronomy Picture of the Day metadata
  function getPhoto() {
    const parameters = {
      url: 'https://api.nasa.gov/planetary/apod',
      qs: {
        api_key: 'DEMO_KEY'
      },
      encoding: 'binary'
    };
    request.get(parameters, (err, response, body) => {
      if (err) { console.log('err: ' + err); }
      body = JSON.parse(body);
      const f = path.join(tmpDir, 'nasa.jpg');
      saveFile(body, f);
    });
  }

  // Download the image to the temp dir, then hand it off for upload.
  // request(body) works because the parsed APOD response carries a `url` field.
  function saveFile(body, fileName) {
    const file = fs.createWriteStream(fileName);
    request(body).pipe(file).on('close', err => {
      if (err) {
        console.log(err);
      } else {
        console.log('Media saved! ' + body.title);
        const descriptionText = body.title;
        uploadMedia(descriptionText, fileName);
      }
    });
  }

  // Chunked upload of the saved image to Twitter
  function uploadMedia(descriptionText, fileName) {
    const filePath = path.join(__dirname, `../${fileName}`); // note: computed but never used
    console.log(`uploadMedia: file PATH ${fileName}`);
    T.postMediaChunked({
      file_path: fileName
    }, (err, data, response) => {
      if (err) {
        console.log(err);
      } else {
        console.log(data);
        const params = {
          status: descriptionText,
          media_ids: data.media_id_string
        };
        postStatus(params);
      }
    });
  }

  // Post the tweet referencing the uploaded media
  function postStatus(params) {
    T.post('statuses/update', params, (err, data, response) => {
      if (err) {
        console.log(err);
        res.status(500).send('Error: ' + err);
      } else {
        console.log('Status posted!');
        res.status(200).send('success');
      }
    });
  }

  // Do thing
  getPhoto();
});
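For what it's worth, twit's README also documents a simpler, non-chunked upload path that base64-encodes the file and posts it to media/upload directly. I have not verified this inside Cloud Functions, but as a debugging comparison, a minimal sketch reusing the T client and postStatus() from above would be:

// Sketch only: simple (non-chunked) media upload, per twit's README.
function uploadMediaSimple(descriptionText, fileName) {
  const b64content = fs.readFileSync(fileName, { encoding: 'base64' });
  T.post('media/upload', { media_data: b64content }, (err, data, response) => {
    if (err) {
      console.log(err);
    } else {
      postStatus({
        status: descriptionText,
        media_ids: [data.media_id_string]
      });
    }
  });
}

If only the chunked path fails, that would point at postMediaChunked rather than the tweet itself.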
I was hoping to launch my app next week, but this has become a major issue for me. I've tried everything I can think of and consulted the docs for Twitter and the twit library, but I seem to be doing everything right. Hopefully someone can shed some light on this, thanks.

Related

Problem with file input size (video upload to cloudinary through netlify serverless functions)

I'm having issues uploading video files to Cloudinary in my React app, deployed on Netlify. The app has a page with a form that sends the data to my API. My API handles the HTTP requests to my Netlify functions (using Axios), and the serverless function then calls the Cloudinary Node API to store the video file. The problem happens when I'm passing the data from my API to the serverless function: I get "Error: Stream body too big", because the video exceeds the payload limit of Netlify functions (6 MB). Do I have to compress the file? Is it okay to do it like this (frontend page -> api.js -> serverless function)? Thanks for all the help you guys provide every day, it has helped me a lot!
Files that I have and the error:
page.jsx
const formHandler = async (formValues) => {
  try {
    ...
    const res = await addExercise(formValues);
    ...
  } catch (error) {
    console.log("error ", error);
  }
};
api.js
import { instance } from "./instance";
...
export const addExercise = async (data) => {
  try {
    const reader = new FileReader();
    reader.readAsDataURL(data.exerciseVideo[0]);
    reader.onloadend = async () => {
      const cloudVideo = reader.result;
      const cloudinaryResponse = await instance.post("upload-exerciseVideo", { cloudVideo });
      ...
    };
  } catch (error) {
    console.log("error", error);
  }
};
serverless function (upload-exerciseVideo.js)
import cloudinary from '../src/config/cloudinary';

export const handler = async (event, context) => {
  try {
    const data = JSON.parse(event.body);
    const { cloudVideo } = data;
    const cloudinaryRequest = await cloudinary.uploader
      .upload(cloudVideo, {
        resource_type: "video",
        upload_preset: "z9qxfc6q"
      });
    ...
  } catch (error) {
    console.log('error', error);
    return {
      statusCode: 400,
      body: JSON.stringify(error)
    };
  }
};
Error: "Stream body too big"
Netlify serverless functions are built on top of AWS Lambda functions, so there are hard limits on the size of the payload and on the amount of time the code can run. You didn't mention the size of your video, but video does take longer to upload, and even if you are within the 1GB size limit, you may be exceeding the 10-second processing limit. Your video has likely already been compressed, so compressing it further is not a viable option, and decompressing it in the serverless function would probably exceed the time limit. https://www.netlify.com/blog/intro-to-serverless-function.
If you're uploading a large file like a video from front-end code, consider using the Upload Widget with an unsigned preset. Here's a link to a code sandbox showing how to create and use the Upload Widget in React: https://codesandbox.io/s/cld-uw-uymkb. You will need to add your Cloudinary cloud name and an unsigned preset to make this work. You'll find instructions for creating unsigned presets here: https://cloudinary.com/documentation/upload_presets
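If you'd rather not use the widget, the same unsigned-preset approach also works with a plain fetch from the browser straight to Cloudinary's upload endpoint, skipping the Netlify function (and its payload limit) entirely. A minimal sketch, with a placeholder cloud name and assuming the preset from the question is an unsigned one:

// Direct unsigned upload from the browser (no serverless hop).
const CLOUD_NAME = "your-cloud-name"; // placeholder
const UPLOAD_PRESET = "z9qxfc6q"; // assumed to be an unsigned preset

async function uploadVideoDirect(file) {
  const form = new FormData();
  form.append("file", file);
  form.append("upload_preset", UPLOAD_PRESET);
  const res = await fetch(
    `https://api.cloudinary.com/v1_1/${CLOUD_NAME}/video/upload`,
    { method: "POST", body: form }
  );
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  return res.json(); // includes secure_url, public_id, etc.
}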

ffmpeg app using node occasionally crashes as file doesn't appear to be read correctly

I have a simple Node application that allows me to pass an AWS S3 URL link to a file (in this case video files). It uses the FFmpeg library to read the video file and return data like codecs, duration, bitrate etc.
The script is called from a PHP script, which in turn sends the data to the Node endpoint and passes the Amazon S3 URL to Node. Sometimes, for no obvious reason, the video file fails to return the expected values for container, codec, duration etc. and just returns '0'. But when I try the exact same file/request again, it returns the data correctly, e.g. container: mp4.
I'm not sure, but I think the script somehow needs the createWriteStream to be closed. I cannot be sure, though; the issue doesn't happen all the time, only sporadically, so it's hard to pin down when it's difficult to replicate.
Any ideas?
router.post('/', async function(req, res) {
  const fileURL = new URL(req.body.file);
  var path = fileURL.pathname;
  path = 'tmp/' + path.substring(1); // removes the initial / from the path
  let file = fs.createWriteStream(path); // create the file locally
  const request = https.get(fileURL, function(response) {
    response.pipe(file);
  });

  // after file has saved
  file.on('finish', function () {
    var process = new ffmpeg(path);
    process.then(function (video) {
      let metadata = formatMetadata(video.metadata);
      res.send({
        status: '200',
        data: metadata,
        errors: errors,
        response: 'success'
      });
    }, function (err) {
      console.warn('Error: ' + err);
      res.send({
        status: '400',
        data: 'Something went wrong processing this video',
        response: 'fail',
      });
    });
  });

  file.on('error', function (err) {
    console.warn(err);
  });
});
function formatMetadata(metadata) {
  const data = {
    'video': metadata.video,
    'audio': metadata.audio,
    'duration': metadata.duration
  };
  return data;
}
// Expected output
{"data":{"video":{"container":"mov","bitrate":400,"stream":0,"codec":"h264","resolution":{"w":1280,"h":720},"resolutionSquare":{"w":1280,"h":720},"aspect":{"x":16,"y":9,"string":"16:9","value":1.7777777777777777},"rotate":0,"fps":25,"pixelString":"1:1","pixel":1},"audio":{"codec":"aac","bitrate":"127","sample_rate":44100,"stream":0,"channels":{"raw":"stereo","value":2}},"duration":{"raw":"00:00:25.68","seconds":25}}
// Actual output
{"data":{"video":{"container":"","bitrate":0,"stream":0,"codec":"","resolution":{"w":0,"h":0},"resolutionSquare":{"w":0,"h":null},"aspect":{},"rotate":0,"fps":0,"pixelString":"","pixel":0},"audio":{"codec":"","bitrate":"","sample_rate":0,"stream":0,"channels":{"raw":"","value":""}},"duration":{"raw":"","seconds":0}}
Note - this happens sporadically
You are not accounting for a failed fetch from AWS. You should check the status code of the response before you move on to your pipe.
const request = https.get(fileURL, function(response) {
  if (response.statusCode == 200) {
    response.pipe(file);
  } else {
    // Handle error case
  }
});
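A slightly fuller sketch of the same idea (reusing file, path, and res from the question's handler) would also discard the partial file and answer the HTTP request in the failure case:

const request = https.get(fileURL, function (response) {
  if (response.statusCode === 200) {
    response.pipe(file);
  } else {
    file.close(); // stop writing
    fs.unlink(path, () => {}); // discard the empty/partial file
    res.send({
      status: String(response.statusCode),
      data: 'Could not fetch the file from S3',
      response: 'fail',
    });
  }
});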

authClient.request is not a function problem

I followed several hundred links, most of them Stack Overflow links, trying to come up with a solution to this problem, but none yielded results.
I am simply trying to get the server to access the client's details via Google, i.e. get the client's Google Sheets. I followed their documentation, but for the most part it covers the client side only. I followed the instructions for the server side, but they have incomplete parts. I found out that the approach is to have the client sign in via OAuth 2.0 and then send the received code to the server, which exchanges it for its very own access token. That is what I'm doing; however, when I try to query any data, I get the error in the title. Here are the code snippets; please let me know if there's anything I'm missing. RIP my rep.
server:
const Router = require("express").Router();
const auth = require("../utils/auth");
const fs = require("fs");
const { OAuth2Client } = require("google-auth-library"); // I tried with this library instead, and it gives the exact same error.
const { google } = require('googleapis');

var auths = [];

function oAuth2ClientGetToken(oAuth2Client, code) {
  return new Promise((resolve, reject) => {
    oAuth2Client.getToken(code, (err, token) => { // errors out here
      if (err) reject(err);
      resolve(token);
    });
  });
}

async function formAuthClient(code) {
  const { client_secret, client_id, redirect_uris } = JSON.parse(fs.readFileSync(__dirname + "/credentials.json"));
  const oAuth2Client = new google.auth.OAuth2( // form authObject
    client_id, client_secret, redirect_uris[1]
  );
  // var oauth2Client = new OAuth2Client(client_id, client_secret, redirect_uris[1]); // other method
  const token = await oAuth2ClientGetToken(oAuth2Client, code).catch(console.error);
  // oauth2Client.credentials = token; // other method of OAuth 2.0
  oAuth2Client.setCredentials(token);
  return oAuth2Client;
}
Router.get("/",(req,res)=>{
res.render("home")
})
Router.post("/sheet",async (req,res)=>{
try {
const requestBody = {
properties: {
title:"hello"
}
};
var sheets= google.sheets({version:"v4", auth: auths[req.session.id]})
await sheets.spreadsheets.create(requestBody)
} catch (error) {
res.json(error)
}
})
Router.post("/login",(req,res)=>{
console.log("token: ",req.body.token);
req.session.token=req.body.token
console.log("req.session.id:",req.session.id);
auths[req.session.id]=formAuthClient(req.body.token)
res.status(200).json()
})
module.exports=Router
The client script is a simple button that triggers a "getOfflineAccess" call and asks the user to log in, then sends the resulting code to the server via "/login". Once another button is pushed, it calls "/sheet". I appreciate all help with this; I ran out of links to click on trying to solve this problem.
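One thing that stands out in the server snippet: formAuthClient is async, so auths[req.session.id] stores a Promise rather than an OAuth2 client, and google.sheets would then receive a Promise as its auth, which could plausibly produce "authClient.request is not a function". A minimal sketch (same names as above) that awaits the stored value before use:

// Sketch: resolve the stored Promise before handing the client to googleapis.
Router.post("/sheet", async (req, res) => {
  try {
    const authClient = await auths[req.session.id]; // await works for both a Promise and a plain client
    const sheets = google.sheets({ version: "v4", auth: authClient });
    await sheets.spreadsheets.create({
      properties: { title: "hello" }
    });
    res.status(200).json();
  } catch (error) {
    res.json(error);
  }
});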

AWS Lambda w/ Google Vision API throwing PEM_read_bio:no start line or Errno::ENAMETOOLONG

The Goal: User uploads to S3, Lambda is triggered to take the file and send it to the Google Vision API for analysis, returning the results.
According to this, google-cloud requires native libraries and must be compiled against the OS that Lambda runs on. Using lambda-packager threw an error, but some internet searching turned up using an EC2 instance with Node and npm to run the install instead. In the spirit of hacking through this, that's what I did to get it mostly working*. At least Lambda stopped giving me ELF header errors.
My current problem is that there are two ways to call the Vision API; neither works, and each returns a different error (mostly).
The Common Code: This code is always the same; it's at the top of the function, and I'm separating it out to keep the later code blocks focused on the issue.
'use strict';

const AWS = require('aws-sdk');
const S3 = new AWS.S3();
const Bucket = 'my-awesome-bucket';
const gCloudConfig = {
  projectId: 'myCoolApp',
  credentials: {
    client_email: 'your.serviceapi#project.email.com',
    private_key: 'yourServiceApiPrivateKey'
  }
}
const gCloud = require('google-cloud')(gCloudConfig);
const gVision = gCloud.vision();
Using detect(): This code always returns the error Error: error:0906D06C:PEM routines:PEM_read_bio:no start line. Theoretically it should work, because the URL is publicly accessible. From searching on the error, I considered that it might be an HTTPS thing, so I even tried a variation where I replaced HTTPS with HTTP, but got the same error.
exports.handler = (event, context, callback) => {
  const params = {
    Bucket,
    Key: event.Records[0].s3.object.key
  }
  const img = S3.getSignedUrl('getObject', params);
  gVision.detect(img, ['labels', 'text'], function(err, image) {
    if (err) {
      console.log('vision error', err);
    }
    console.log('vision result:', JSON.stringify(image, true, 2));
  });
}
Using detectLabels(): This code always returns Error: ENAMETOOLONG: name too long, open ....[the image in base64].... On a suggestion, it was believed that the method shouldn't be passed the base64 image but instead the public path, which would explain why it says the name is too long (a base64 image is quite the URL). Unfortunately, that gives the PEM error from above. I've also tried skipping the base64 encoding and passing the object buffer directly from AWS, but that resulted in a PEM error too.
exports.handler = (event, context, callback) => {
  const params = {
    Bucket,
    Key: event.Records[0].s3.object.key
  }
  S3.getObject(params, function(err, data) {
    const img = data.Body.toString('base64');
    gVision.detectLabels(img, function(err, labels) {
      if (err) {
        console.log('vision error', err);
      }
      console.log('vision result:', labels);
    });
  });
}
According to Best Practices, the image should be base64 encoded.
From the API docs and examples and whatever else, it seems that I'm using these correctly. I feel like I've read all those docs a million times.
I'm not sure what to make of the NAMETOOLONG error if it's expecting base64 stuff. These images aren't more than 1MB.
*The PEM error seems to be related to credentials, and given my limited understanding of how these credentials work and how the modules are compiled on EC2 (which doesn't have any kind of PEM files), that might be my problem. Maybe I need to set up some credentials before running npm install, in the same vein as needing to install on a Linux box? This is starting to be outside my range of understanding, so I'm hoping someone here knows.
Ideally, using detect would be better because I can specify what I want detected, but just getting any valid response from Google would be awesome. Any clues you all can provide would be greatly appreciated.
So, a conversation with a colleague pointed me toward abandoning the google-cloud module entirely and instead trying the Cloud REST API via curl to see if it would work that way.
Long story short, making an HTTP request to the Google Cloud REST API was how I solved this issue.
Here is the working Lambda function I have now. It probably still needs tweaks, but this is working.
'use strict';

const AWS = require('aws-sdk');
const S3 = new AWS.S3();
const Bucket = 'yourBucket';
const fs = require('fs');
const https = require('https');
const APIKey = 'AIza...your.api.key...kIVc';
const options = {
  method: 'POST',
  host: `vision.googleapis.com`,
  path: `/v1/images:annotate?key=${APIKey}`,
  headers: {
    'Content-Type': 'application/json'
  }
}

exports.handler = (event, context, callback) => {
  const req = https.request(options, res => {
    const body = [];
    res.setEncoding('utf8');
    res.on('data', chunk => {
      body.push(chunk);
    });
    res.on('end', () => {
      console.log('results', body.join(''));
      callback(null, body.join(''));
    });
  });

  req.on('error', err => {
    console.log('problem with request:', err.message);
  });

  const params = {
    Bucket,
    Key: event.Records[0].s3.object.key
  }

  S3.getObject(params, function(err, data) {
    const payload = {
      "requests": [{
        "image": {
          "content": data.Body.toString('base64')
        },
        "features": [{
          "type": "LABEL_DETECTION",
          "maxResults": 10
        }, {
          "type": "TEXT_DETECTION",
          "maxResults": 10
        }]
      }]
    };
    req.write(JSON.stringify(payload));
    req.end();
  });
}

push notifications via quickblox

I am trying to implement push notifications for my app using Node.js for the backend with QuickBlox. I'm following the steps described on the QuickBlox site, i.e. create a user session, create a push token, and finally subscribe to the notification channel. I'm facing a problem with the creation of the push token. My server-side code looks like this:
app.post('/test_quickblox', function(req, res) {
  var params = {
    login: req.user.qb_username,
    password: req.user.qb_password,
  }
  console.log(params);
  QB.createSession(params, function(err, result) {
    if (err) {
      console.log(err);
    }
    console.log(result);
    var options = {};
    options.headers = {};
    options.headers['QuickBlox-REST-API-Version'] = '0.1.0';
    options.headers['QB-Token'] = result.token;
    options.body = {};
    options.body['push_token'] = {};
    options.body['push_token']['environment'] = 'development';
    options.body['push_token']['client_identification_sequence'] = '54b1e2b9e9081ed60520824054b1e2b8e9081ed60520823f';
    options.body['device'] = {};
    options.body['device']['platform'] = 'ios';
    options.body['device']['udid'] = 'e0101010d38bde8e6740011221af335301010333';
    options.url = 'http://api.quickblox.com/push_tokens.json';
    QuickbloxRequest(options, function(err, response) {
      if (err) {
        console.log(err);
        return apiError();
      }
      console.log(response);
      res.apiSuccess();
    });
  });
});
When logging the session response, it looks like the following:
{ _id: '54b1e3a1535c121c2000be66',
  application_id: 18113,
  created_at: '2015-01-11T02:44:49Z',
  device_id: 0,
  nonce: 8394,
  token: 'bf61098a35fac9389be236caec44f0a9827630d1',
  ts: 1420944288,
  updated_at: '2015-01-11T02:44:49Z',
  user_id: 2179940,
  id: 56046 }
and the error I get is:
{"code":null,"message":"No device registered for current user session. Device is obligatory to be able to execute actions with push token."}
I guess the problem lies in the device_id being 0.
Note that I am creating the users in another controller without supplying any device_id upon creation, so I think that might be my problem, but I am new to QuickBlox and do not yet understand all the semantics, so please help me find out what the problem is. Thanks.
And here we are, 4 years later, and I faced the same problem. No answer, no nothing; it makes you wonder how large the QuickBlox community is :O
Anyway, for anyone coming here with the same problem: it seems the issue is that the Android UUID returned by PhoneGap is too short, so QuickBlox rejects it silently.
Here is what worked for me. Pay attention to the doubling of the UUID:
window.device.uuid + window.device.uuid
JS Code :
//REGISTER AS ANDROID
var message = {
environment: "development",
client_identification_sequence: e.regid,
platform: "android",
udid: window.device.uuid + window.device.uuid,
};
if (BBPushNotification.showLog) console.log(message);
QB.messages.tokens.create(message, function(err, response){
if (err) {
if (BBPushNotification.showLog) console.log("Create token error : ",err);
} else {
if (BBPushNotification.showLog) console.log("Create token success : ",response);
}
});
