Specifying a maxDuration in vercel.json for an Express.js application - javascript

I've upgraded to a Pro plan for hosting my Express app on Vercel. I expected the maxDuration to be increased from the default 10 seconds to 60 seconds (for the Pro plan), but it seems I have to explicitly specify the maxDuration value in my vercel.json file. Here's my vercel.json:
{
  "version": 2,
  "builds": [
    {
      "src": "./server.js",
      "use": "@vercel/node"
    }
  ],
  "routes": [
    {
      "src": "/(.*)",
      "dest": "/server.js"
    }
  ],
  "functions": {
    "controllers/*.js": {
      "maxDuration": 60
    },
    "middleware/**/*.js": {
      "maxDuration": 60
    }
  }
}
When I commit and push to GitHub, Vercel reports the error "Conflicting functions and builds configuration", which says that I can't use the "builds" and "functions" properties together in my vercel.json; I can only use one or the other.
I tried deleting the "builds" property. After pushing again, Vercel could no longer convert my Express.js controller files into serverless functions, even though I believe the build step is necessary.
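For reference, here is a minimal sketch of what I understand a functions-only configuration might look like, assuming the Express entry point is moved to api/index.js and "rewrites" replaces "routes" (I have not verified this):

{
  "rewrites": [
    { "source": "/(.*)", "destination": "/api" }
  ],
  "functions": {
    "api/index.js": {
      "maxDuration": 60
    }
  }
}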
I really need help. How can I configure this so that I can set a maxDuration and my API doesn't time out after the default 10 seconds?
Thanks.

Related

Is this a Vercel bug? Cannot find module './model'

I have 2 files:
controller.js
model.js
I'm making an Express.js app.
model.js is required inside controller.js, but I get an error when I call my API. The logs show: Cannot find module './model'.
Here is the problem: './model.js' really does exist, but Vercel doesn't find it. It works fine in local development and is required correctly.
This is model.js:
const { nanoid } = require("nanoid");

const getDateStr = () => {
  const now = new Date();
  return `${now.getFullYear()}-${now.getMonth() + 1}-${now.getDate()}`;
};

const Purchase = ({ id_user_buyer, id_user_seller, id_product, quantity }) => ({
  id_user_buyer,
  id_user_seller,
  id_product,
  quantity,
  id_purchase: nanoid(),
  date: getDateStr(),
});

module.exports = {
  Purchase
};
And this is controller.js:
const err = require("../../../utils/error");
const { Purchase } = require("./model");
// the other modules are imported correctly, so the problem is './model'
const userController = require("../user");
const productController = require("../product");
const cartController = require("../cart");

const TABLE = 'purchase';

function purchaseController(injectedStore) {
  // example code
  async function makePurchase(data) {
    const purchase = Purchase(data);
    await injectedStore.insert(TABLE, purchase);
  }

  return {
    makePurchase,
  };
}

module.exports = purchaseController;
As you can see, model.js is imported inside controller.js. I don't know why Vercel says ERROR Cannot find module './model'. Again, it works fine in local development but not on Vercel.
A quick fix is to copy and paste all the code from model.js into controller.js. I tried it, deployed, and it worked.
My app also works fine if I just comment out the line where I import ./model, but then my application loses that functionality. So the copy-and-paste workaround is ugly but works; neither of these is a good solution. The best solution would be for the file to be imported correctly.
Curious fact: I tried renaming the file and it didn't help either. Importing a new file doesn't work either.
NOTE: I changed the Node.js version from 12 to 14; could that have something to do with it?
Just in case, here is my folder structure:
root
api
This is my vercel.json:
{
  "version": 2,
  "builds": [
    {
      "src": "/api/index.js",
      "use": "@vercel/node"
    }
  ],
  "routes": [
    {
      "src": "/api/auth(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/users(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/products(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/cart(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/purchases(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/sales(.*)",
      "dest": "/api/index.js"
    }
  ]
}
I don't know whether it's a Vercel bug or a mistake on my part. My application currently works, but only with the trick I mentioned before: putting all the code from model.js inside controller.js.
Thanks for reading the whole issue.
Well, I don't know why, but I think it was a Vercel bug. The solution to my problem was to stop using optional chaining.
In controller.js I have more code where I was using optional chaining; after changing that logic, it works fine. Vercel supports optional chaining with Node 14.x, and I was using Node 12.x on Vercel. I changed it to 14.x and it then worked in the other files, but not in controller.js.
So, if you are using newer JavaScript features such as optional chaining or the nullish coalescing operator while running Node 12.x on Vercel, that will produce errors. Switching to Node 14.x can introduce bugs too.
The best you can do is make sure from the beginning that you are using Node.js 14.x on Vercel.
How to change the Node version in Vercel
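For completeness, a minimal sketch, assuming a plain Node.js project: Vercel generally picks the Node.js version from the "engines" field in package.json, so pinning 14.x would look roughly like this fragment of package.json:

"engines": {
  "node": "14.x"
}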

How to cache all urls like /page/id where id is a number using workbox

Given the following code snippet from my nodejs server:
router.get('/page/:id', async function (req, res, next) {
  var id = req.params.id;
  if (typeof req.params.id === "number") { id = parseInt(id); }
  res.render('page.ejs', { vara: a, varb: b });
});
I want to do exactly what I'm doing on the Node.js server, but from the service worker.
I've generated and built the service worker using Workbox, but I don't know how to cache all the URLs like /page/1, /page/2, ... /page/4353 and so on without bloating the service worker source code.
The Node.js code above works fine.
I tried something like:
.....{
  "url": "/page/\*",
  "revision": "q8j4t1d072f2g6l5unc0q6c0r7vgs5w0"
},....
It doesn't work in the service worker precache. When I reloaded the website with this code added to the service worker, the install step took quite a long time. Is that normal? Can't I do this without overloading the entire install process and the browser's cache storage?
Thank you for your help!
EDIT:
My service worker looks like:
importScripts("https://storage.googleapis.com/workbox-cdn/releases/3.6.2/workbox-sw.js");

if (workbox) {
  console.log('Workbox status : ONLINE');
} else {
  console.log('Workbox status : OFFLINE');
}

workbox.skipWaiting();
workbox.clientsClaim();

self.__precacheManifest = [
  {
    "url": "/",
    "revision": "7e50eec344ce4d01730894ef1d637d4d"
  },
  'head.ejs',
  'navbar.ejs',
  'map_script.ejs',
  'scripts.ejs',
  'details.json',
  'page.ejs',
  'home.ejs',
  'map.ejs',
  'about.ejs',
  {
    "url": "/page",
    "revision": "881d8ca1f2aacfc1617c09e3cf7364f0",
    "cleanUrls": "true"
  },
  {
    "url": "/about",
    "revision": "11d729194a0669e3bbc938351eba5d00"
  },
  {
    "url": "/map",
    "revision": "c3942a2a8ac5b713d63c53616676974a"
  },
  {
    "url": "/getJson",
    "revision": "15c88be34ben24a683f7be97fd8abc4e"
  },
  {
    "url": "/getJson1",
    "revision": "15c88be34bek24a6l3f7be97fd3aoc4e"
  },
  {
    "url": "/getJson2",
    "revision": "15c82be34ben24a683f7be17fd3amc4e"
  },
  {
    "url": "/getJson3",
    "revision": "15c62be94ben24a683f7be17gd3amc4r"
  },
  {
    "url": "/getJson4",
    "revision": "15c62beh4ben24a6g3f7be97gd3amc4p"
  },
  {
    "url": "/public/_processed_/2/7/csm.jpg",
    "revision": "15c62beh4bek44a6g3f7ben7gd3amc4p"
  },
  {
    "url": "/public/_processed_/2/7/csm.jpg",
    "revision": "15c62beh4ben24a6g3f7be9ngd3a2c4p"
  }
].concat(self.__precacheManifest || []);

workbox.precaching.suppressWarnings();
workbox.precaching.precacheAndRoute(self.__precacheManifest, {});
The Workbox service worker has its own process of caching.
Based on the documentation of Workbox Precaching:
One feature of service workers is the ability to save a set of files to the cache when the service worker is installing. This is often referred to as "precaching", since you are caching content ahead of the service worker being used.
The precacheAndRoute method sets up an implicit cache-first handler. This is why the home page loaded while we were offline even though we had not written a fetch handler for those files.
If a request fails to match the precache, we'll add .html to the end to support "clean" URLs (a.k.a. "pretty" URLs). This means a request like /about will match /about.html.
workbox.precaching.precacheAndRoute(
  [
    '/styles/index.0c9a31.css',
    '/scripts/main.0d5770.js',
    { url: '/index.html', revision: '383676' },
  ],
  {
    cleanUrls: true,
  }
);
You can also try this method of caching files with the service worker:
cache.addAll(requests) - This method is the same as add except it takes an array of URLs and adds them to the cache. If any of the files fail to be added to the cache, the whole operation will fail and none of the files will be added.
For more info:
Workbox-SW
Workbox Codelab
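For illustration, a minimal sketch of what cache.addAll could look like during service worker install; the cache name and the list of page URLs below are placeholders, not taken from the question:

self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open('pages-cache-v1').then((cache) =>
      // addAll rejects the whole operation if any single URL fails to fetch
      cache.addAll(['/page/1', '/page/2', '/page/3'])
    )
  );
});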
The idea behind the precache is to cache the minimal set of web assets (those you need to get your page up and running) during service worker installation. It is generally not used to cache a whole bunch of data.
What you probably need is a dynamic cache:
workbox.routing.registerRoute(
  /\/page\//,
  workbox.strategies.cacheFirst({
    cacheName: 'my-cache',
  })
);
In this case, the first hit will get the page from the network. Thereafter it will be served from the cache.
I think your issue may be that you are expecting clean URLs according to the manifest, but you are actually requesting dynamic URLs with different params. For that, I suspect you want to use a regex.
An example for caching google fonts:
workbox.routing.registerRoute(
  /^https:\/\/fonts\.googleapis\.com/,
  workbox.strategies.staleWhileRevalidate({
    cacheName: 'google-fonts-stylesheets'
  })
);
Maybe try something closer to (regex might need some love, I did not verify):
workbox.routing.registerRoute(
  /\/page\/.*/,
  workbox.strategies.staleWhileRevalidate({
    cacheName: 'pages'
  })
);
Edit: I realize my answer is the same as @voiceuideveloper's, but I believe this is the right answer. I do this in my current app and it works well.

Portable electron app is extracted in a different folder every time it opens

electron-builder version: 20.9.2
Target: windows/portable
I'm building a portable app with electron-builder and using socket.io to keep a real-time connection with a backend service, but I have an issue with the firewall. Because this is a portable app, every time it is opened it appears to be extracted into the temporary folder, which generates a new folder (so the path to the app is different) on every run. This makes the firewall think that a different app is asking for connection permissions. How can I change the extraction path when I run the app?
(This is the screen that I get every time I run the app.)
This is my socket.io configuration
const io = require("socket.io")(6524);

io.on("connect", socket => {
  socket.on("notification", data => {
    EventBus.$emit("notifications", JSON.parse(data));
  });
});
My build settings in package.json
"build": {
"productName": "xxx",
"appId": "xxx.xxx.xxx",
"directories": {
"output": "build"
},
"files": [
"dist/electron/**/*",
"!**/node_modules/*/{CHANGELOG.md,README.md,README,readme.md,readme,test,__tests__,tests,powered-test,example,examples,*.d.ts}",
"!**/node_modules/.bin",
"!**/*.{o,hprof,orig,pyc,pyo,rbc}",
"!**/._*",
"!**/{.DS_Store,.git,.hg,.svn,CVS,RCS,SCCS,__pycache__,thumbs.db,.gitignore,.gitattributes,.editorconfig,.flowconfig,.yarn-metadata.json,.idea,appveyor.yml,.travis.yml,circle.yml,npm-debug.log,.nyc_output,yarn.lock,.yarn-integrity}",
"!**/node_modules/search-index/si${/*}"
],
"win": {
"icon": "build/icons/myicon.ico",
"target": "portable"
}
},
Any idea how I could at least specify an extraction path, or make it extract into the execution folder?
BTW, I already created an issue about this in the electron-builder repo.
In version 20.40.1 they added a new configuration key, unpackDirName:
/**
 * The unpack directory name in [TEMP](https://www.askvg.com/where-does-windows-store-temporary-files-and-how-to-change-temp-folder-location/) directory.
 *
 * Defaults to [uuid](https://github.com/segmentio/ksuid) of build (changed on each build of portable executable).
 */
readonly unpackDirName?: string
Example
config: {
  portable: {
    unpackDirName: "0ujssxh0cECutqzMgbtXSGnjorm",
  }
}
More info #3799.
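In a package.json-based setup like the one in the question, this key would presumably go under the existing "build" section; a hedged sketch, where the directory name is just an example and the other "build" keys stay as they are:

"build": {
  "portable": {
    "unpackDirName": "my-app-portable"
  }
}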

Error retrieving a new session from the selenium server

I'm attempting to set up Nightwatch.js for the first time. I'm following this tutorial: https://github.com/dwyl/learn-nightwatch
Unfortunately I've hit a roadblock, and I need help resolving it.
Error retrieving a new session from the selenium server.
Connection refused! Is selenium server started?
nightwatch.conf.js
module.exports = {
  "src_folders": [
    "test" // Where you are storing your Nightwatch e2e/UAT tests
  ],
  "output_folder": "./reports", // reports (test outcome) output by nightwatch
  "selenium": {
    "start_process": true, // tells nightwatch to start/stop the selenium process
    "server_path": "./node_modules/nightwatch/bin/selenium.jar",
    "host": "127.0.0.1",
    "port": 4444, // standard selenium port
    "cli_args": {
      "webdriver.chrome.driver": "./node_modules/nightwatch/bin/chromedriver"
    }
  },
  "test_settings": {
    "default": {
      "screenshots": {
        "enabled": true, // if you want to keep screenshots
        "path": './screenshots' // save screenshots here
      },
      "globals": {
        "waitForConditionTimeout": 5000 // sometimes internet is slow so wait.
      },
      "desiredCapabilities": { // use Chrome as the default browser for tests
        "browserName": "chrome"
      }
    },
    "chrome": {
      "desiredCapabilities": {
        "browserName": "chrome",
        "javascriptEnabled": true // set to false to test progressive enhancement
      }
    }
  }
}
guinea-pig.js
module.exports = { // adapted from: https://git.io/vodU0
  'Guinea Pig Assert Title': function (browser) {
    browser
      .url('https://saucelabs.com/test/guinea-pig')
      .waitForElementVisible('body')
      .assert.title('I am a page title - Sauce Labs')
      .saveScreenshot('ginea-pig-test.png')
      .end();
  }
};
I kept the configuration as basic as possible. I can't pinpoint anything that would suggest another Selenium server has already started. Any ideas?
EDIT: TIMEOUT ERROR
In your nightwatch.json file, within "selenium":
Make sure your server_path is correct.
Make sure your webdriver.chrome.driver path is correct.
Those are specific to your machine. If they do not refer to the correct file in the correct location, you'll have problems starting the Selenium server.
After that, make certain that the version of the Selenium server you have works with the version of ChromeDriver you have, and that both work with the version of the Chrome browser you have.
But as Krishnan Mahadevan indicated, without the whole error message we can't be of much more help.
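As a quick sanity check (a hypothetical helper script, not part of Nightwatch itself), you could resolve both paths from the project root and confirm the files actually exist before suspecting anything else:

// check-paths.js - run with: node check-paths.js
const fs = require('fs');
const path = require('path');

// these paths mirror the ones used in nightwatch.conf.js above
const seleniumJar = path.join(__dirname, 'node_modules/nightwatch/bin/selenium.jar');
const chromedriver = path.join(__dirname, 'node_modules/nightwatch/bin/chromedriver');

console.log('selenium.jar exists:', fs.existsSync(seleniumJar));
console.log('chromedriver exists:', fs.existsSync(chromedriver));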
The solution for me involved deleting my installation of Chrome (although it was the most recent version) and simply reinstalling the browser.
I encourage everyone facing the same problem to first look at QualiT's response above, as it is the more conventional troubleshooting strategy.
I had the same issue when I used vue-cli init on my project. After I updated to Java 9, the problem was resolved.

Kartograph: Map Creation fails

Recently I started using Kartograph. I am inexperienced with SVG, so map creation is causing headaches for me. After initial trouble creating a world map that outlines country borders (similar to this) and a few other things (city regions and some decorative elements), my problem boils down to an undocumented error, or at least one I haven't found in the docs. I guess it is related to my unfamiliarity with the kartograph.py framework.
The JSON file I provide to Kartograph looks like this:
{
  "proj": {
    "id": "lonlat",
    "lon0": 20,
    "lat0": 0
  },
  "layers": {
    "background": {
      "special": "sea",
      "charset": "latin-1",
      "simplify": false
    },
    "graticule": {
      "special": "graticule",
      "charset": "latin-1",
      "simplify": false,
      "latitudes": 1,
      "longitudes": 1,
      "styles": {
        "stroke-width": "0.3px"
      }
    },
    "world": {
      "src": "ne_50m_admin_0_countries.shp",
      "charset": "latin-1",
      "simplify": false
    },
    "lakes": {
      "src": "Lakes.shp",
      "charset": "latin-1",
      "simplify": false
    },
    "trees": {
      "src": "Trees.shp",
      "charset": "latin-1",
      "simplify": false
    },
    "depth": {
      "src": "DepthContours.shp",
      "charset": "latin-1",
      "simplify": false
    },
    "cities": {
      "src": "CityAreas.shp",
      "charset": "latin-1",
      "simplify": false
    }
  }
}
I know the output file will be huge and the generation will take ages, but it is just a test; I will experiment with the "simplify" option later. Much of the code in the file is based on this tutorial. Also, the simplify setting might not be necessary, but Kartograph complained about the lack of the option, so I added it.
The command I use is this one:
kartograph world.json -o world.svg
It runs for some time (parsing all the input files, I guess) before aborting. The error I am facing is this one:
cli.py, in render_map()
  71: K.generate(cfg, args.output, preview=args.preview, format=format, stylesheet=css)
kartograph.py, in generate()
  46: _map = Map(opts, self.layerCache, format=format)
map.py, in __init__()
  50: me.bounds_poly = me._init_bounds()
map.py, in _init_bounds()
  192: features = self._get_bounding_geometry()
map.py, in _get_bounding_geometry()
  257: charset=layer.options['charset']
get_features() got an unexpected keyword argument 'filter'
I tried looking at the file that throws the error (map.py), but I quickly realized there is too much interaction between the files for me to understand things quickly.
I hope the data I provided is sufficient for someone more familiar with kartograph than me to track the error down.
UPDATE: The error is still present. I have now tested it on both a MacBook Pro and an Asus netbook (Arch and Bodhi Linux, respectively).
Thanks in advance,
Carson
As far as I know, you can solve that problem by including a 'bounds' parameter. It is indeed very tricky, because according to the documentation (if it is valid to call it 'documentation') this error should not appear, since the only required parameter is 'layers'. Also, how the bounds are defined apparently depends on the chosen projection. For your example I would use simple polygon bounds, as sketched below.
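A hedged sketch of what that might look like in the config, assuming the polygon-bounds mode and reusing the "world" layer name from the question (not verified against your data):

"bounds": {
  "mode": "polygons",
  "data": {
    "layer": "world"
  }
}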
I also had problems with that error. But after many attempts to set everything up, I noticed that it apparently only appears in the command-line version of Kartograph, and not when using Kartograph as a Python module in a script. That is, try including the JSON dictionary in a Python script where you import kartograph, like in the example below.
For the record, I also included an example of filtering, because that was another thing that failed to work when using the command-line version of Kartograph.
# file: makeMap.py
from kartograph import Kartograph

K = Kartograph()

def myfilter(record):
    return record['iso_a3'] in ["FRA", "ITA", "DEU"]

config = {
    "layers": {
        "mylayer": {
            "src": "ne_50m_admin_0_countries.shp",
            "filter": myfilter,
            "attributes": {"iso_a3": "iso_a3", "name": "name", "id": "iso_a3"}
        }
    },
}

K.generate(config, outfile='world.svg')
Then run it as a Python script:
python makeMap.py
