Netlify functions: Uncaught Exception: Runtime.ImportModuleError

I'm running some webhooks on Netlify, but I'm getting an uncaught exception: Runtime.ImportModuleError.
Here is the full error, and you can access a snapshot of the function HERE
{"errorType":"Runtime.ImportModuleError","errorMessage":"Error: Cannot find module 'safe-buffer'\nRequire stack:\n- /var/task/src/node_modules/mqtt/lib/connect/ws.js
This happens when I import 'async-mqtt'. The confusing part is that I have another function which uses the exact same import without any problem. I can reproduce the error in another function, but I'm not sure how to reproduce it in isolation. I'm really lost, so please ask if there is any information I should add.
Netlify functions run on AWS Lambda, if that helps. Here is the full code of the function:
console.log('at least it opens');
const mqtt = require('async-mqtt');

exports.handler = async function (event, context) {
  const client = await mqtt.connectAsync(
    'mqtts://mqtt.flespi.io', {
      username: 'SECRET HERE',
      port: 8883,
      clientId: `action-on-google--${Math.random().toString(16).substr(2, 8)}`,
    },
  );

  await client.publish('lights/bulbs', 'N255,0');
  client.end();
  return { statusCode: 200, body: 'Hello world' };
};
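For anyone hitting the same error: one likely cause (an assumption, not confirmed in this thread) is that the function bundler is not packaging safe-buffer, which mqtt pulls in as a transitive dependency. A minimal workaround sketch is to declare it as a direct dependency in the site's package.json so it is always zipped with the function (the versions here are only illustrative):

{
  "dependencies": {
    "async-mqtt": "^2.6.0",
    "safe-buffer": "^5.2.1"
  }
}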

Related

How do I get AWS Lambda, API Gateway and SES to cooperate on a mailform?

The problem I'm having is implementing a simple mailform using a combination of S3/CloudFront/API Gateway/Lambda and SES. I've been banging my head against it for a while (and Googling every similar tutorial or Stack Overflow answer I can find) and have resolved a lot of the initial issues, but this last one is driving me crazy. The Lambda function is below:
const aws = require("aws-sdk");
const ses = new aws.SES({ region: "ap-southeast-2" });

var response = {
  "statusCode": 200,
  "headers": {
    "Content-Type": "application/json",
    "Access-Control-Allow-Origin": "*"
  },
  "isBase64Encoded": false,
  "body": JSON.stringify({ "Result": "Success" })
};

exports.handler = async function (event, context) {
  console.log('Received event:', event);
  sendEmail(event);
  return response;
};

function sendEmail(event) {
  const { name, email, phone, message } = JSON.parse(event.body);
  const params = {
    Destination: {
      ToAddresses: [process.env.RECEIVER],
    },
    Message: {
      Body: {
        Text: {
          Data: `You just got a message from ${name} - ${email} - ${phone}:
${message}`
        },
      },
      Subject: { Data: `Message from ${name}` },
    },
    Source: process.env.SENDER,
  };
  ses.sendEmail(params).promise();
}
The irritating thing that's happening is this: if the exports.handler function is async, API Gateway delivers the correct response to the browser and neither Lambda nor API Gateway produces any errors, BUT no email is sent. If it's not async, API Gateway produces a 502 error (Malformed Lambda response), BUT the email is sent with the correct information from the POST.
Clearly I'm a noob when it comes to network requests/async programming, and I'd really appreciate it if someone could see what I'm doing wrong.
*** EDIT ***
So I found the answer: the promise in the sendEmail function needs to be awaited. The interaction of multiple async operations confused me.
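For reference, a minimal sketch of that change (the params object is the same as in the original sendEmail, trimmed here):

async function sendEmail(event) {
  const { name, email, phone, message } = JSON.parse(event.body);
  const params = { /* ... same SES params as above ... */ };
  // Await the SES promise so the send actually completes instead of being dropped.
  await ses.sendEmail(params).promise();
}

exports.handler = async function (event, context) {
  console.log('Received event:', event);
  // Await the helper too, otherwise the handler can return before the email is sent.
  await sendEmail(event);
  return response;
};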
Does anyone have a link to a REALLY GOOD tutorial on this for Javascript?
*** END EDIT ***

Is there a way to use Firebase secret environment variables in Slack/Bolt?

We operate bots built with a combination of Firebase and Slack/Bolt.
I currently use functions.config() to manage my Slack tokens and secrets, but I would like to migrate to using Secret Manager.
https://firebase.google.com/docs/functions/config-env#secret-manager
boltapp.js
const functions = require("firebase-functions");
const { App, ExpressReceiver, subtype } = require("@slack/bolt");

const config = functions.config();

const expressReceiver = new ExpressReceiver({
  signingSecret: process.env.SLACK_SECRET,
  endpoints: "/events",
  processBeforeResponse: true,
});

const app = new App({
  receiver: expressReceiver,
  token: process.env.SLACK_TOKEN
});

app.error(error => { console.error(error) });

app.use(async ({ client, payload, context, next }) => {
  console.info('It\'s payload', JSON.stringify(payload))
  if (!context?.retryNum) {
    await next();
  } else {
    console.debug('app.use.context', context);
  }
});

//**bot processing**//

// https://{your domain}.cloudfunctions.net/slack/events
module.exports = functions
  .runWith({ secrets: ["SLACK_TOKEN", "SLACK_SECRET"] })
  .https.onRequest(expressReceiver.app);
But when I rewrote it for migration, I got the following error.
Is there a way to rewrite the code while avoiding this error?
Failed to load function definition from source: FirebaseError: Failed to load function definition from source: Failed to generate manifest from function source: Error: Apps used in a single workspace can be initialized with a token. Apps used in many workspaces should be initialized with oauth installer options or authorize.
Since you have not provided a token or authorize, you might be missing one or more required oauth installer options. See https://slack.dev/bolt-js/concepts#authenticating-oauth for these required fields.
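One likely cause (an assumption, not confirmed in this thread) is that Secret Manager values are only injected into process.env at runtime, so SLACK_TOKEN is undefined while the Firebase CLI loads the module to generate its deployment manifest, which is exactly when new App(...) runs and throws. A minimal sketch of deferring the Bolt setup until the first request, under that assumption:

const functions = require("firebase-functions");
const { App, ExpressReceiver } = require("@slack/bolt");

let receiver;

// Build the receiver and App lazily so the secret-backed env vars are read
// at invocation time (when they exist) rather than at module load.
function getReceiver() {
  if (!receiver) {
    receiver = new ExpressReceiver({
      signingSecret: process.env.SLACK_SECRET,
      endpoints: "/events",
      processBeforeResponse: true,
    });
    const app = new App({
      receiver,
      token: process.env.SLACK_TOKEN,
    });
    app.error((error) => console.error(error));
    // ...register the listeners/middleware from the original boltapp.js here...
  }
  return receiver;
}

module.exports = functions
  .runWith({ secrets: ["SLACK_TOKEN", "SLACK_SECRET"] })
  .https.onRequest((req, res) => getReceiver().app(req, res));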

getStaticPaths array.map isn't a function when deploying to Vercel [duplicate]

This question already has an answer here: Fetch error when building Next.js static website in production (1 answer). Closed last year.
Code is below. The dev server works perfectly fine without errors; however, when I try to deploy the site I get this error:
> Build error occurred
TypeError: artists.map is not a function
at getStaticPaths (/vercel/path0/.next/server/pages/artists/[slug].js:106:24)
at processTicksAndRejections (internal/process/task_queues.js:95:5)
at async buildStaticPaths (/vercel/path0/node_modules/next/dist/build/utils.js:498:31)
at async /vercel/path0/node_modules/next/dist/build/utils.js:641:119
at async Span.traceAsyncFn (/vercel/path0/node_modules/next/dist/trace/trace.js:74:20) {
type: 'TypeError'
}
export async function getStaticProps({ params }) {
  const siteSettings = await fetcher("http://localhost:3000/api/settings");
  const artists = await fetcher(
    `${process.env.URL}/api/artists/${params.slug}`
  );
  const artist = artists.allArtist[0];

  return {
    props: {
      siteSettings,
      artist,
    },
  };
}

export async function getStaticPaths() {
  const artists = await fetch(`${process.env.URL}/api/artists`);

  return {
    paths: artists.map((artist) => {
      return {
        params: {
          slug: artist.slug.current,
        },
      };
    }),
    fallback: false,
  };
}
How would I go about fixing this error? As I said, it works perfectly fine on the dev server, with no console logs containing any errors.
Any help would be greatly appreciated.
I attempted writing the API route logic directly in the getStaticProps function and this fixed the error. @juliomalves also posted this link, which suggests a similar answer: Fetch error when building Next.js static website in production
Thanks for the help.
Mac
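For reference, the map error itself points at the fetch call in getStaticPaths: fetch resolves to a Response object, not an array, so the body has to be parsed before mapping over it. A minimal sketch of that step, assuming /api/artists returns a JSON array (the linked duplicate also explains why calling your own API routes at build time is unreliable, which is why moving the logic directly into getStaticProps/getStaticPaths is the sturdier fix):

export async function getStaticPaths() {
  // fetch() resolves to a Response; parse the JSON body before mapping over it.
  const res = await fetch(`${process.env.URL}/api/artists`);
  const artists = await res.json();

  return {
    paths: artists.map((artist) => ({
      params: { slug: artist.slug.current },
    })),
    fallback: false,
  };
}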

Getting an error while deploying Firebase Cloud Pub/Sub code

While deploying Pub/Sub code on Firebase, I'm getting the following error:
ERROR: (gcloud.functions.deploy) OperationError: code=3, message=Function failed on loading user code. This is likely due to a bug in the user code. Error message: Code in file index.js can't be loaded.
Is there a syntax error in your code?
Detailed stack trace: /srv/node_modules/@google-cloud/pubsub/build/src/pubsub.js:527
async *listSchemas(view = schema_1.SchemaViews.Basic, options) {
^
I don't understand why this error is occurring.
The following is the code:
// (Assumed context, not shown in the original snippet: the Pub/Sub client the
// stack trace refers to is created at module load.)
const { PubSub } = require("@google-cloud/pubsub");
const pubsub = new PubSub();

exports.publish = async (req, res) => {
  if (!req.body.topic || !req.body.message) {
    res.status(400).send('Missing parameter(s); include "topic" and "message" properties in your request.');
    return;
  }

  console.log(`Publishing message to topic ${req.body.topic}`);
  const topic = pubsub.topic(req.body.topic);
  const messageObject = {
    data: {
      message: req.body.message,
    },
  };
  const messageBuffer = Buffer.from(JSON.stringify(messageObject), "utf8");

  try {
    await topic.publish(messageBuffer);
    res.status(200).send("Message published.");
  } catch (err) {
    console.error(err);
    res.status(500).send(err);
    return Promise.reject(err);
  }
};
I'm going to guess that this function is set to use the Node 8 runtime, since support for async iterators was added in Node 10. The Pub/Sub library has only supported Node 10 and above since 2.0, so bumping the runtime version on the Firebase function should help:
https://firebase.google.com/docs/functions/manage-functions#set_nodejs_version
Unfortunately I don't have enough points to ask for more details on the original question, but hopefully that helps!
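For reference, per the linked Firebase docs the runtime is selected with the engines field in functions/package.json, so the bump would look roughly like this (Node 10 being the minimum for @google-cloud/pubsub 2.x; newer runtimes work as well):

{
  "engines": {
    "node": "10"
  }
}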

AWS Lambda w/ Google Vision API throwing PEM_read_bio:no start line or Errno::ENAMETOOLONG

The Goal: User uploads to S3, Lambda is triggered to take the file and send to Google Vision API for analysis, returning the results.
According to this, google-cloud requires native libraries and must be compiled against the OS that lambda is running. Using lambda-packager threw an error but some internet searching turned up using an EC2 with Node and NPM to run the install instead. In the spirit of hacking through this, that's what I did to get it mostly working*. At least lambda stopped giving me ELF header errors.
My current problem is that there are two ways to call the Vision API; neither works, and each returns a different error (mostly).
The Common Code: This code is always the same, it's at the top of the function, and I'm separating it to keep the later code blocks focused on the issue.
'use strict';

const AWS = require('aws-sdk');
const S3 = new AWS.S3();
const Bucket = 'my-awesome-bucket';

const gCloudConfig = {
  projectId: 'myCoolApp',
  credentials: {
    client_email: 'your.serviceapi@project.email.com',
    private_key: 'yourServiceApiPrivateKey'
  }
};

const gCloud = require('google-cloud')(gCloudConfig);
const gVision = gCloud.vision();
Using detect(): This code always returns the error Error: error:0906D06C:PEM routines:PEM_read_bio:no start line. Theoretically it should work because the URL is public. From searching on the error, I considered it might be an HTTPS thing, so I've even tried a variation on this where I replaced HTTPS with HTTP but got the same error.
exports.handler = (event, context, callback) => {
  const params = {
    Bucket,
    Key: event.Records[0].s3.object.key
  };

  const img = S3.getSignedUrl('getObject', params);

  gVision.detect(img, ['labels', 'text'], function (err, image) {
    if (err) {
      console.log('vision error', err);
    }
    console.log('vision result:', JSON.stringify(image, true, 2));
  });
};
Using detectLabels(): This code always returns Error: ENAMETOOLONG: name too long, open ....[the image in base64].... On a suggestion, it was believed that the method shouldn't be passed the base64 image but instead the public path, which would explain why it says the name is too long (a base64 image is quite the URL). Unfortunately, that gives the PEM error from above. I've also tried skipping the base64 encoding and passing the object buffer directly from AWS, but that resulted in a PEM error too.
exports.handler = (event, context, callback) => {
  const params = {
    Bucket,
    Key: event.Records[0].s3.object.key
  };

  S3.getObject(params, function (err, data) {
    const img = data.Body.toString('base64');
    gVision.detectLabels(img, function (err, labels) {
      if (err) {
        console.log('vision error', err);
      }
      console.log('vision result:', labels);
    });
  });
};
According to Best Practices, the image should be base64 encoded.
From the API docs and examples and whatever else, it seems that I'm using these correctly. I feel like I've read all those docs a million times.
I'm not sure what to make of the ENAMETOOLONG error if it's expecting base64 content. These images aren't more than 1 MB.
*The PEM error seems to be related to credentials, and given my shaky understanding of how these credentials work and how the modules were compiled on EC2 (which doesn't have any kind of PEM files), that might be my problem. Maybe I need to set up some credentials before running npm install, kind of in the same vein as needing to install on a Linux box? This is starting to be outside my range of understanding, so I'm hoping someone here knows.
Ideally, using detect would be better because I can specify what I want detected, but just getting any valid response from Google would be awesome. Any clues you all can provide would be greatly appreciated.
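One aside on the PEM error (an assumption, not something confirmed in this thread): that exact "no start line" message frequently means the service-account private key lost its real newlines, for example because it was stored with literal \n sequences. A minimal sketch of normalizing it, assuming the credentials come from environment variables (the variable names here are hypothetical):

const gCloudConfig = {
  projectId: 'myCoolApp',
  credentials: {
    client_email: process.env.GCP_CLIENT_EMAIL, // hypothetical env var
    // Restore real newlines in a key that was stored with escaped "\n" sequences.
    private_key: (process.env.GCP_PRIVATE_KEY || '').replace(/\\n/g, '\n') // hypothetical env var
  }
};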
So, a conversation with another colleague pointed me to consider abandoning the google-cloud module entirely and instead trying the Cloud REST API via curl to see if it would work that way.
Long story short, making an HTTP request to the Google Cloud REST API was how I solved this issue.
Here is the working Lambda function I have now. It probably still needs tweaks, but it's working.
'use strict';

const AWS = require('aws-sdk');
const S3 = new AWS.S3();
const Bucket = 'yourBucket';
const fs = require('fs');
const https = require('https');
const APIKey = 'AIza...your.api.key...kIVc';

const options = {
  method: 'POST',
  host: `vision.googleapis.com`,
  path: `/v1/images:annotate?key=${APIKey}`,
  headers: {
    'Content-Type': 'application/json'
  }
};

exports.handler = (event, context, callback) => {
  const req = https.request(options, res => {
    const body = [];
    res.setEncoding('utf8');
    res.on('data', chunk => {
      body.push(chunk);
    });
    res.on('end', () => {
      console.log('results', body.join(''));
      callback(null, body.join(''));
    });
  });

  req.on('error', err => {
    console.log('problem with request:', err.message);
  });

  const params = {
    Bucket,
    Key: event.Records[0].s3.object.key
  };

  S3.getObject(params, function (err, data) {
    const payload = {
      "requests": [{
        "image": {
          "content": data.Body.toString('base64')
        },
        "features": [{
          "type": "LABEL_DETECTION",
          "maxResults": 10
        }, {
          "type": "TEXT_DETECTION",
          "maxResults": 10
        }]
      }]
    };

    req.write(JSON.stringify(payload));
    req.end();
  });
};
