Retrieve or Specify output file name in electron-builder

I am working with electron-builder programmatically to generate installation packages. So far I have this as my utility to create the installation package for the current OS type:
const packagejson = require("../package.json");
const builder = require("electron-builder");
const Platform = builder.Platform;

function buildPromise() {
  // Development package.json
  const devMetadata = packagejson.electronBuilder;
  // Application package.json
  const appMetadata = {
    name: packagejson.name,
    version: packagejson.version,
    description: packagejson.description,
    author: packagejson.author,
    productName: packagejson.productName
  };
  // Build for the current target and send back a promise
  return builder.build({
    projectDir: "./",
    devMetadata,
    appMetadata
  });
}

module.exports = {
  buildPromise,
  outputPath: packagejson.electronBuilder.directories.output
};
What it does is pull in the needed metadata from the app's main package.json file, which contains this section (so the application package.json is empty):
...
"electronBuilder": {
  "build": {
    "productName": "Node App",
    "appId": "my.id",
    "asar": false,
    "win": {
      "iconUrl": "http://localhost:5000/images/logo-multi.ico",
      "target": "nsis"
    },
    "nsis": {
      "oneClick": false
    }
  },
  "directories": {
    "output": "electron/output",
    "app": "electron/app",
    "buildResources": "electron/buildResources"
  }
}
...
When I run the build on Windows I get a file called Node App Setup 1.0.0.exe. So far so good. But how do I actually control that final file name? Or at least retrieve that file name programmatically so I can read it in and respond to the client in some way? Obviously, I could piece it together from the JSON file settings, but I would rather it be more definitive.

You can specify the output filename using artifactName in the build section of your package.json.
The docs say the artifact file name template supports the ${ext} macro:
${ext} macro is supported in addition to file macros.
File Macros
You can use macros in the file patterns, artifact file name patterns and publish configuration url:
${arch} — expanded to ia32 or x64. If there is no arch, the macro is removed from your pattern along with a leading space, - or _ (so you don't need to worry and can reuse the pattern).
${os} — expanded to mac, linux or win according to target platform.
${name} — package.json name.
${productName} — Sanitized product name.
${version} — version from package.json.
${channel} — detected prerelease component from version (e.g. beta).
${env.ENV_NAME} — any environment variable.
Any property of AppInfo (e.g. buildVersion, buildNumber).
Example
"build": {
"appId": "com.electron.app.my",
"artifactName": "node-app-${version}.${ext}",
...
},
If your package version is 1.0.0, a Windows target would output:
node-app-1.0.0.exe

At my request, the author added this to the then-current version (8.5.1):
https://github.com/electron-userland/electron-builder/issues/899
so now we can do:
builder.build()
  .then(paths => {
    // paths contains an array of export file paths, e.g.:
    console.log(paths[0]); //= c:/MyProject/dist/My Project Setup 1.0.0.exe
    console.log(paths[1]); //= c:/MyProject/dist/myproject-1.0.0-x86_64.AppImage
  });

Related

Bundle multiple named AMD modules with dependencies into one JS file (building a web app extension system)

I'm working on an extension system for my web app. Third-party developers should be able to extend the app by providing named AMD modules exporting constants and functions following a predefined spec and bundled into a single .js JavaScript file.
Example JavaScript bundle:
define('module1', ['exports', 'module3'], (function (exports, module3) {
  exports.spec = 'http://example.com/spec/extension/v1'
  exports.onRequest = function (request) { return module3.respond('Hello, World.') }
}));
define('module2', ['exports', 'module3'], (function (exports, module3) {
  exports.spec = 'http://example.com/spec/extension/v1'
  exports.onRequest = function (request) { return module3.respond('Foo. Bar.') }
}));
define('module3', ['exports'], (function (exports) {
  exports.respond = function (message) { return { type: 'message', message: message } }
}));
In the above example module1 and module2 are extension modules (identified by the spec export) and module3 is a shared dependency (e.g. coming from an NPM package). Extension bundles will be loaded in a worker within a sandboxed iframe to seal off the untrusted code in the browser.
Example TypeScript source:
// module1.ts
import { respond } from 'module3'
export const spec = 'http://example.com/spec/extension/v1'
export const onRequest = (request: Request): Response => respond('Hello, World.')

// module2.ts
import { respond } from 'module3'
export const spec = 'http://example.com/spec/extension/v1'
export const onRequest = (request: Request): Response => respond('Foo. Bar.')

// module3.ts
import dep from 'some-npm-package'
export const respond = (message: string) => dep.createMessageObject(message)
Here is my list of requirements to bundling:
All necessary dependencies (e.g. shared module, NPM package logic) must be included in the bundle
The source code needs to be transpiled to browser compatible code if necessary
The AMD format is required by the custom extension loader implementation
The AMD modules must not be anonymous as the module file names are lost while bundling
No relative paths must be used among dependencies (e.g. ./path/to/module3 instead of module3)
The result should be one JavaScript bundle, thus ONE JavaScript file and ONE sourcemaps file
What's the easiest way to do this?
This is the closest solution I found using rollup and the following rollup.config.js:
import { nodeResolve } from '@rollup/plugin-node-resolve'
import { terser } from 'rollup-plugin-terser'
import typescript from '@rollup/plugin-typescript'

export default {
  input: [
    'src/module1.ts',
    'src/module2.ts'
  ],
  output: {
    dir: 'dist',
    format: 'amd',
    sourcemap: true,
    amd: {
      autoId: true
    }
  },
  plugins: [
    typescript(),
    nodeResolve(),
    terser()
  ]
}
From this I get the desired named AMD modules (one for each entry point and chunk) in separate .js files. Problems:
Some dependencies are referenced by ./module3 while being named module3.
The modules appear in separate JavaScript and Sourcemap files instead of being concatenated into a single bundle.
Questions:
Is there an easy fix to the above rollup.config.js config to solve the problem?
I tried to write a small rollup plugin but I failed to get the final AMD module code within it to concatenate it to a bundle. Only the transpiled code is available to me. In addition I don't know how to handle sourcemaps during concatenation.
Is there an alternative to rollup better suited to this bundling scenario?
The big picture: Am I completely on the wrong track when it comes to building an extension system? Is AMD the wrong choice?
I found a way to extend the rollup.config.js mentioned in the question with a custom concatChunks rollup plugin to bundle multiple AMD chunks within a single file and have the source maps rendered, too. The only issue I didn't find an answer to was the relative module names that kept popping up; however, this may be resolved in the AMD loader.
Here's the full rollup.config.js that worked for me:
import Concat from 'concat-with-sourcemaps'
import glob from 'glob'
import typescript from '@rollup/plugin-typescript'
import { nodeResolve } from '@rollup/plugin-node-resolve'
import { terser } from 'rollup-plugin-terser'

const concatChunks = (
  fileName = 'bundle.js',
  sourceMapFileName = 'bundle.js.map'
) => {
  return {
    name: 'rollup-plugin-concat-chunks',
    generateBundle: function (options, bundle, isWrite) {
      const concat = new Concat(true, fileName, '\n')
      // Go through each chunk in the bundle
      let hasSourceMaps = false
      Object.keys(bundle).forEach(fileId => {
        const fileInfo = bundle[fileId]
        if (fileInfo.type === 'chunk') {
          const hasSourceMap = fileInfo.map !== null
          hasSourceMaps = hasSourceMaps || hasSourceMap
          // Concat file content and source maps with bundle
          concat.add(
            fileInfo.fileName,
            fileInfo.code,
            hasSourceMap ? JSON.stringify(fileInfo.map) : null
          )
          // Prevent single chunks from being emitted
          delete bundle[fileId]
        }
      })
      // Emit concatenated chunks
      this.emitFile({
        type: 'asset',
        name: fileName,
        fileName: fileName,
        source: concat.content
      })
      // Emit concatenated source maps, if any
      if (hasSourceMaps) {
        this.emitFile({
          type: 'asset',
          name: sourceMapFileName,
          fileName: sourceMapFileName,
          source: concat.sourceMap
        })
      }
    }
  }
}

export default {
  input: glob.sync('./src/*.{ts,js}'),
  output: {
    dir: 'dist',
    format: 'amd',
    sourcemap: true,
    amd: {
      autoId: true
    }
  },
  plugins: [
    typescript(),
    nodeResolve(),
    terser(),
    concatChunks()
  ]
}
Please make sure you npm install the dependencies referenced in the import statements to make this work.
Considering the big picture, i.e. the extension system itself, I am moving away from a "one AMD module equals one extension/contribution" approach, as current developer tools and JavaScript bundlers are not ready for that (as this question shows). I'll go with an approach similar to the Visual Studio Code Extension API and will use a single "default" module with an activate export to register the contributions a bundle has to offer. I hope this will make extension bundling an easy task no matter what tools or languages are used.
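For illustration, a minimal sketch of what such a single entry module could look like (the activate name and the registry parameter are assumptions modeled on the VS Code style, not a finished spec):

// extension.ts - the single module bundled per extension
export const spec = 'http://example.com/spec/extension/v1'

// The host loads the bundle and calls activate() once, passing a
// hypothetical registry object through which contributions are registered.
export function activate(registry: { onRequest(handler: (request: unknown) => unknown): void }): void {
  registry.onRequest(request => ({ type: 'message', message: 'Hello, World.' }))
}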

Renaming file by using Node Package Version [duplicate]

Is there a way to get the version set in package.json in a nodejs app? I would want something like this
var port = process.env.PORT || 3000
app.listen port
console.log "Express server listening on port %d in %s mode %s", app.address().port, app.settings.env, app.VERSION
I found that the following code fragment worked best for me. Since it uses require to load the package.json, it works regardless of the current working directory.
var pjson = require('./package.json');
console.log(pjson.version);
A warning, courtesy of @Pathogen:
Doing this with Browserify has security implications.
Be careful not to expose your package.json to the client, as it means that all your dependency version numbers, build and test commands and more are sent to the client.
If you're building server and client in the same project, you expose your server-side version numbers too.
Such specific data can be used by an attacker to better fit the attack on your server.
If your application is launched with npm start, you can simply use:
process.env.npm_package_version
See package.json vars for more details.
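For example, matching the Express snippet from the question, a minimal sketch (this only works when the process is launched via npm, e.g. npm start):

const express = require('express');
const app = express();
const port = process.env.PORT || 3000;

app.listen(port, () => {
  // npm_package_version is only defined when npm launched the process
  console.log('Express server listening on port %d, version %s',
    port, process.env.npm_package_version);
});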
Using ES6 modules you can do the following:
import {version} from './package.json';
Or in plain old shell:
$ node -e "console.log(require('./package.json').version);"
This can be shortened to
$ node -p "require('./package.json').version"
There are two ways of retrieving the version:
Requiring package.json and getting the version:
const { version } = require('./package.json');
Using the environment variables:
const version = process.env.npm_package_version;
Please don't use JSON.parse, fs.readFile, or fs.readFileSync, and don't use another npm module; it's not necessary for this question.
For those who look for a safe client-side solution that also works on server-side, there is genversion. It is a command-line tool that reads the version from the nearest package.json and generates an importable CommonJS module file that exports the version. Disclaimer: I'm a maintainer.
$ genversion lib/version.js
I acknowledge the client-side safety was not OP's primary intention, but as discussed in answers by Mark Wallace and aug, it is highly relevant and also the reason I found this Q&A.
Here is how to read the version out of package.json:
fs = require('fs')
json = JSON.parse(fs.readFileSync('package.json', 'utf8'))
version = json.version
EDIT: Wow, this answer was originally from 2012! There are several better answers now. Probably the cleanest is:
const { version } = require('./package.json');
There is another way of fetching certain information from your package.json file, namely the pkginfo module.
Usage of this module is very simple. You can get all package variables using:
require('pkginfo')(module);
Or only certain details (version in this case)
require('pkginfo')(module, 'version');
And your package variables will be set to module.exports (so version number will be accessible via module.exports.version).
You could use the following code snippet:
require('pkginfo')(module, 'version');
console.log "Express server listening on port %d in %s mode %s", app.address().port, app.settings.env, module.exports.version
This module has a very nice feature: it can be used in any file in your project (e.g. in subfolders) and it will automatically fetch the information from your package.json. So you do not have to worry about where your package.json is.
I hope that will help.
NPM one liner:
From npm v7.20.0:
npm pkg get version
Prior to npm v7.20.0:
npm -s run env echo '$npm_package_version'
Note the output is slightly different between these two methods: the former outputs the version number surrounded by quotes (i.e. "1.0.0"), the latter without (i.e. 1.0.0).
One solution to remove the quotes in Unix is using xargs
npm pkg get version | xargs echo
Option 1
Best practice is to read the version from package.json using npm environment variables:
process.env.npm_package_version
More information: https://docs.npmjs.com/using-npm/config.html
This will work only when you start your service with an npm command.
Quick info: you can read any value in package.json using process.env.npm_package_[keyname]
Option 2
Setting the version in an environment variable using https://www.npmjs.com/package/dotenv with a .env file, and reading it as process.env.version.
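A minimal sketch of that option (assuming you maintain the version entry in the .env file yourself):

// .env contains a line such as:  version=1.0.0
require('dotenv').config();

console.log(process.env.version); //=> '1.0.0'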
Just adding an answer because I came to this question to see the best way to include the version from package.json in my web application.
I know this question is targeted at Node.js. However, if you are using Webpack to bundle your app, just a reminder that the recommended way is to use the DefinePlugin to declare a global version in the config and reference that. So in your webpack.config.js you could do:
const pkg = require('../../package.json');
...
plugins: [
  new webpack.DefinePlugin({
    AppVersion: JSON.stringify(pkg.version),
    ...
And then AppVersion is a global that is available for you to use. Also make sure you ignore it in your .eslintrc via the globals prop.
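For example, the .eslintrc entry could look like this (a sketch; adjust to your config format):

{
  "globals": {
    "AppVersion": "readonly"
  }
}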
If you are looking for module (package.json: "type": "module") (ES6 import) support, e.g. coming from refactoring CommonJS, you should (at the time of writing) do either:
import { readFile } from 'fs/promises';
const pkg = JSON.parse(await readFile(new URL('./package.json', import.meta.url)));
console.log(pkg.version)
or, run the node process with node --experimental-json-modules index.js to do:
import pkg from './package.json'
console.log(pkg.version)
You will, however, get a warning until JSON modules become generally available.
If you get syntax or (top-level) async errors, you are likely on an older Node version. Update to at least node@14.
You can use ES6 to import package.json, retrieve the version number and output the version to the console:
import {name as app_name, version as app_version} from './path/to/package.json';
console.log(`App ---- ${app_name}\nVersion ---- ${app_version}`);
A safe option is to add an npm script that generates a separate version file:
"scripts": {
"build": "yarn version:output && blitz build",
"version:output": "echo 'export const Version = { version: \"'$npm_package_version.$(date +%s)'\" }' > version.js"
}
This outputs version.js with the contents:
export const Version = { version: "1.0.1.1622225484" }
To determine the package version in node code, you can use the following:
const version = require('./package.json').version; for < ES6 versions
import {version} from './package.json'; for ES6 version
const version = process.env.npm_package_version;
if the application has been started using npm start, all npm_* environment variables become available.
You can use following npm packages as well - root-require, pkginfo, project-version.
We can read the version or other keys from package.json in two ways:
1 - using require and importing the key required, e.g.:
const { version } = require('./package.json')
2 - using npm package vars as mentioned in the docs:
process.env.npm_package_version
You can use the project-version package.
$ npm install --save project-version
Then
const version = require('project-version');
console.log(version);
//=> '1.0.0'
It uses process.env.npm_package_version but falls back on the version written in package.json in case the env var is missing for some reason.
Why not use require.resolve...
const path = require('path');
const packageJson = path.dirname(require.resolve('package-name')) + '/package.json';
const { version } = require(packageJson);
console.log('version', version);
This approach works from all subpaths :)
In case you want to get the version of a target package:
import { version } from 'TARGET_PACKAGE/package.json';
Example:
import { version } from 'react/package.json';
I know this isn't the intent of the OP, but I just had to do this, so I hope it helps the next person.
If you're using docker-compose for your CI/CD process, you can get it this way!
version:
  image: node:7-alpine
  volumes:
    - .:/usr/src/service/
  working_dir: /usr/src/service/
  command: ash -c "node -p \"require('./package.json').version.replace('\n', '')\""
For the image, you can use any Node image. I use Alpine because it is the smallest.
The leanest way I found:
const fs = require('fs')
const { version } = JSON.parse(fs.readFileSync('./package.json'))
I've actually been through most of the solutions here, and they either did not work on both Windows and Linux/OSX, didn't work at all, or relied on Unix shell tools like grep/awk/sed.
The accepted answer works technically, but it sucks your whole package.json into your build, and that's a Bad Thing that only the desperate should use temporarily to get unblocked; in general it should be avoided, at least for production code. The alternative is to use that method only to write the version to a single constant that can be used instead of the whole file.
So for anyone else looking for a cross-platform solution (not reliant on Unix shell commands) and local (without external dependencies):
Since it can be assumed that Node.js is installed, and it's already cross-platform for this, I just created a make_version.js file with:
const PACKAGE_VERSION = require("./package.json").version;
console.log(`export const PACKAGE_VERSION = "${PACKAGE_VERSION}";`);
console.error("package.json version:", PACKAGE_VERSION);
and added a version command to package.json:
"scripts": {
  "version": "node make_version.js > src/version.js",
and then added:
  "prebuild": "npm run version",
  "prestart": "npm run version",
and it creates a new src/version.js on every start or build. Of course this can easily be tuned in the version script to use a different location, or in the make_version.js file to output different syntax and a different constant name, etc.
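For a package version of, say, 1.2.3, the generated src/version.js would contain a single line like:

export const PACKAGE_VERSION = "1.2.3";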
I do this with findup-sync:
var findup = require('findup-sync');
var packagejson = require(findup('package.json'));
console.log(packagejson.version); // => '0.0.1'
I am using GitLab CI and want to automatically use the different versions to tag my Docker images and push them. Their default Docker image does not include Node, so my version to do this in shell only is this:
scripts/getCurrentVersion.sh
BASEDIR=$(dirname $0)
cat $BASEDIR/../package.json | grep '"version"' | head -n 1 | awk '{print $2}' | sed 's/"//g; s/,//g'
Now what this does is:
Print your package.json
Search for the lines with "version"
Take only the first result
Replace " and ,
Please note that I have my scripts in a subfolder with the according name in my repository. So if you don't, change $BASEDIR/../package.json to $BASEDIR/package.json.
Or, if you want to be able to get the major, minor and patch versions, I use this:
scripts/getCurrentVersion.sh
VERSION_TYPE=$1
BASEDIR=$(dirname $0)
VERSION=$(cat $BASEDIR/../package.json | grep '"version"' | head -n 1 | awk '{print $2}' | sed 's/"//g; s/,//g')
if [ $VERSION_TYPE = "major" ]; then
  echo $(echo $VERSION | awk -F "." '{print $1}')
elif [ $VERSION_TYPE = "minor" ]; then
  echo $(echo $VERSION | awk -F "." '{print $1"."$2}')
else
  echo $VERSION
fi
This way, if your version was 1.2.3, your output would look like this:
$ sh ./getCurrentVersion.sh major
1
$ sh ./getCurrentVersion.sh minor
1.2
$ sh ./getCurrentVersion.sh
1.2.3
Now the only thing you have to make sure of is that your package version is the first place in package.json where that key is used; otherwise you'll end up with the wrong version.
I'm using create-react-app, and I don't have process.env.npm_package_version available when executing my React app.
I did not want to reference package.json in my client code (because of exposing dangerous info to the client, like package versions), nor did I want to install another dependency (genversion).
I found out that I can reference the version within package.json itself, by using $npm_package_version:
"scripts": {
"my_build_script": "REACT_APP_VERSION=$npm_package_version react-scripts start"
}
Now the version always follows the one in package.json.
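In the client code you can then read the value from the build-time environment, for example:

// create-react-app inlines REACT_APP_* variables at build time
const appVersion = process.env.REACT_APP_VERSION;
console.log(`Running version ${appVersion}`);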
I made a useful function to get the parent module's package.json:
const path = require('path')
const { existsSync } = require('fs')

function loadParentPackageJson() {
  if (!module.parent || !module.parent.filename) return null
  let dir = path.dirname(module.parent.filename)
  let maxDepth = 5
  let packageJson = null
  while (maxDepth > 0) {
    const packageJsonPath = `${dir}/package.json`
    const exists = existsSync(packageJsonPath)
    if (exists) {
      packageJson = require(packageJsonPath)
      break
    }
    dir = path.resolve(dir, '../')
    maxDepth--
  }
  return packageJson
}
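A minimal usage sketch (assuming the function above lives in a module that some parent module requires):

const parentPackageJson = loadParentPackageJson();
console.log(parentPackageJson ? parentPackageJson.version : 'no parent package.json found');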
If using rollup, the rollup-plugin-replace plugin can be used to add the version without exposing package.json to the client.
// rollup.config.js
import pkg from './package.json';
import { terser } from "rollup-plugin-terser";
import resolve from 'rollup-plugin-node-resolve';
import commonJS from 'rollup-plugin-commonjs'
import replace from 'rollup-plugin-replace';
export default {
  plugins: [
    replace({
      exclude: 'node_modules/**',
      'MY_PACKAGE_JSON_VERSION': pkg.version, // will replace 'MY_PACKAGE_JSON_VERSION' with package.json version throughout source code
    }),
  ]
};
Then, in the source code, anywhere where you want to have the package.json version, you would use the string 'MY_PACKAGE_JSON_VERSION'.
// src/index.js
export const packageVersion = 'MY_PACKAGE_JSON_VERSION' // replaced with actual version number in rollup.config.js
const { version } = require("./package.json");
console.log(version);
const v = require("./package.json").version;
console.log(v);
Import your package.json file into your server.js or app.js, and then access its properties in the server file:
var package = require('./package.json');
The package variable contains all the data in package.json.
Used to version web-components like this:
const { version } = require('../package.json')

class Widget extends HTMLElement {
  constructor() {
    super()
    this.attachShadow({ mode: 'open' })
  }

  public connectedCallback(): void {
    this.renderWidget()
  }

  public renderWidget = (): void => {
    this.shadowRoot?.appendChild(this.setPageTemplate())
    this.setAttribute('version', version)
  }
}

Portable electron app is extracted in a different folder every time it opens

electron-builder version: 20.9.2
Target: windows/portable
I'm building a portable app with electron-builder and using socket.io to keep a real-time connection with a backend service, but I have an issue with the firewall. Because this is a portable app, every time the app is opened it is extracted into the temporary folder, which generates a new folder (so the path to the app is different) on every run, which makes the firewall think that this is another app asking for connection permissions. How can I change the extraction path when I run the app?
(This is the screen that I get every time I run the app)
This is my socket.io configuration
const io = require("socket.io")(6524);

io.on("connect", socket => {
  socket.on("notification", data => {
    EventBus.$emit("notifications", JSON.parse(data));
  });
});
My build settings in package.json
"build": {
"productName": "xxx",
"appId": "xxx.xxx.xxx",
"directories": {
"output": "build"
},
"files": [
"dist/electron/**/*",
"!**/node_modules/*/{CHANGELOG.md,README.md,README,readme.md,readme,test,__tests__,tests,powered-test,example,examples,*.d.ts}",
"!**/node_modules/.bin",
"!**/*.{o,hprof,orig,pyc,pyo,rbc}",
"!**/._*",
"!**/{.DS_Store,.git,.hg,.svn,CVS,RCS,SCCS,__pycache__,thumbs.db,.gitignore,.gitattributes,.editorconfig,.flowconfig,.yarn-metadata.json,.idea,appveyor.yml,.travis.yml,circle.yml,npm-debug.log,.nyc_output,yarn.lock,.yarn-integrity}",
"!**/node_modules/search-index/si${/*}"
],
"win": {
"icon": "build/icons/myicon.ico",
"target": "portable"
}
},
Any idea about how I could at least specify an extraction path, or make it extract to the execution folder?
BTW I already created an issue about this in the electron-builder repo
In version 20.40.1 they added a new configuration key, unpackDirName:
/**
 * The unpack directory name in the [TEMP](https://www.askvg.com/where-does-windows-store-temporary-files-and-how-to-change-temp-folder-location/) directory.
 *
 * Defaults to the [uuid](https://github.com/segmentio/ksuid) of the build (changed on each build of the portable executable).
 */
readonly unpackDirName?: string

Example
config: {
  portable: {
    unpackDirName: "0ujssxh0cECutqzMgbtXSGnjorm",
  }
}
More info #3799.

How to bind a static directory to an angular 4 application

Is it possible to bind a whole directory of static files to an angular 4 application? The intention behind this is to get the content of this directory dynamically to see what files are inside the directory.
I know it's possible to bind a directory to the application using
"assets": [
"favicon.ico",
"assets",
{
"glob": "**/*.svg",
"input": "../node_modules/#myModule/assets/images",
"output": "./assets/images"
}
]
With the above approach, you can only address the files by calling them explicitly, e.g. HOST:PORT/assets/images/myImage.svg.
So the question is:
Is it possible to bind a directory, so I'm able to call HOST:PORT/assets/images and get all contained files dynamically?
If not:
Is there another way to get all files dynamically from my static directory in my Angular app?
The short answer: no.
The longer answer: not right out of the box. The icons (as mentioned in your comments) are accessible, but not listable. So you can create a list of icons at build time, something like this:
Add an enumerator script in your tools or similar directory. E.g.
const fs = require('fs');
const readdirSync = fs.readdirSync;
const writeFileSync = fs.writeFileSync;
const files = readdirSync('src/assets/svg');
const jsonObj = { files };
writeFileSync('src/app/svg-files.json', JSON.stringify(jsonObj, null, 2));
Register that script in your package.json. E.g.
"scripts": {
  ...
  "listSvgs": "node tools/list-svgs"
}
Run that script in your build pipeline in package.json:
"scripts": {
...
- "build": "ng build -p",
+ "build": "npm run listSvgs && ng build -p"
...
}
(You'll know, I guess, which line goes out and which comes in its place.)
Generate the file the first time manually, so you don't forget (npm run listSvgs).
Add a service to fetch the files in your Angular code:
@Injectable()
export class SVGListingService {
  constructor(private httpClient: HttpClient) {}

  getSVGs() {
    return this.httpClient.get('svg-files.json');
  }
}
You should be able to use it; remember to rerun the listing script when you add SVGs, and adjust the paths to your project.
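A component could then consume the service along these lines (a sketch; the component and field names are assumptions):

import { Component, OnInit } from '@angular/core';

@Component({ selector: 'app-svg-list', template: '...' })
export class SvgListComponent implements OnInit {
  files: string[] = [];

  constructor(private svgListing: SVGListingService) {}

  ngOnInit(): void {
    // The JSON file generated at build time has the shape { files: [...] }
    this.svgListing.getSVGs().subscribe((res: any) => this.files = res.files);
  }
}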

Creating a lambda function in AWS from zip file

I am trying to create a simple lambda function, and I'm running into an error.
My code is basically
console.log('Loading function');
exports.handler = function(event, context) {
  console.log('value1 =', event.key1);
  console.log('value2 =', event.key2);
  console.log('value3 =', event.key3);
  context.succeed(event.key1); // Echo back the first key value
  // context.fail('Something went wrong');
}
in a helloworld.js file. I zip that up and upload it as a zip file when creating the Lambda function, and I keep getting this error:
{
  "errorMessage": "Cannot find module 'index'",
  "errorType": "Error",
  "stackTrace": [
    "Function.Module._resolveFilename (module.js:338:15)",
    "Function.Module._load (module.js:280:25)",
    "Module.require (module.js:364:17)",
    "require (module.js:380:17)"
  ]
}
Does anyone have any ideas?
The name of your file needs to match the module name in the Handler configuration. In this case, your Handler should be set to helloworld.handler, where helloworld is the file that would be require()'d and handler is the exported function. Then it should work with the same zip file.
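If you would rather keep the helloworld.js file name, you can also point the handler at it from the AWS CLI (a sketch; the function name is a placeholder):

$ aws lambda update-function-configuration --function-name my-function --handler helloworld.handler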
Make sure your index.js is in the root of the zipfile and not in a subdirectory.
In my case I had the name of the module matching the name of the file and the exported handler. The real problem was macOS and the zip program, which basically creates a folder inside the zip file, so when it is uncompressed in the AWS Lambda engine, index.js ends up in a subdirectory.
Using Finder
Don't right-click and compress the directory. Instead, select the individual files like index.js, package.json and the node_modules directory, and right-click to compress; you may end up with a file Archive.zip in the same directory. The name of the zip file is not going to be fancy, but at least it will work when you submit it to AWS Lambda.
Using the command line
You could make the same mistake using the command line with zip -r function.zip function, which basically creates a zip file with a directory called function in it. Instead, do:
$ zip function.zip index.js package.json node_modules
adding: index.js (deflated 47%)
adding: package.json (deflated 36%)
adding: node_modules/ (stored 0%)
How to verify your zip file
Using Finder: if you double-click the zip file and it uncompresses into a subdirectory, then Lambda won't be able to see the file, as index.js lives in that subdirectory.
Using the command line and zipinfo:
$ zipinfo function.zip | grep index.js | more
-rw-r--rw- 2.1 unx 1428 bX defN 27-Jul-16 12:21 function/index.js
Notice how index.js ended up inside the subdirectory function: you screwed up.
$ zipinfo function.zip | grep index.js | more
-rw-r--rw- 3.0 unx 1428 tx defN 27-Jul-16 12:21 index.js
Notice that index.js is not inside a subfolder; this zip file will work in AWS Lambda.
Leveraging npm commands to zip the function
So I added a script to my package to zip the project files for me just by running npm run zip
{
  "name": "function",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "zip": "zip function.zip package.json *.js node_modules"
  },
  "dependencies": {
    "aws-sdk": "^2.4.10"
  }
}
$ npm run zip

> function@1.0.0 zip
> zip function.zip package.json *.js node_modules

  adding: package.json (deflated 41%)
  adding: index.js (deflated 47%)
  adding: local.js (deflated 42%)
  adding: node_modules/ (stored 0%)
Here is a more advanced way using the AWS CLI. It will save you time in the long run.
First of all you should install and configure AWS CLI:
http://docs.aws.amazon.com/cli/latest/userguide/installing.html
1) Create an archive
$ zip -r lambda *
This creates a lambda.zip file with all the folders and files in our current location.
2) Get the role ARN
$ aws iam list-roles | grep "your_role"
This returns the ARN that we will use with our lambda. You should have created the role beforehand.
Example for list-roles
3) Create our lambda
$ aws lambda create-function --function-name "your_lambda_name" --zip-file fileb://lambda.zip --handler index.handler --runtime nodejs6.10 --timeout 15 --role COPY_HERE_YOUR_ARN_FROM_THE_STEP_2
We are done!
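As a quick smoke test, you can invoke the new function from the CLI too (a sketch; on AWS CLI v2 an inline JSON payload may additionally require --cli-binary-format raw-in-base64-out):

$ aws lambda invoke --function-name "your_lambda_name" --payload '{"key1":"value1","key2":"value2","key3":"value3"}' output.json
$ cat output.json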
Automation - using Grunt
Complete AWS Lambda Seed project is available on Git.
Step 1: Init npm module
npm init
Step 2: Install Grunt
npm install --save-dev grunt grunt-cli
Step 3: Install grunt-aws-lambda
npm install --save-dev grunt-aws-lambda
Step 4: Create Folder for Lambda service
# Create directory
mkdir lambdaTest
# Jump into folder
cd lambdaTest
# Create service file
touch lambdaTest.js
# Initialize npm
npm init
Keep your logic/code in lambdaTest.js:
'use strict'

exports.handler = (event, context, callback) => {
  console.log("Hello it's looks like working");
};
Step 5: Create Gruntfile.js
Navigate back to root folder
touch Gruntfile.js
'use strict'

module.exports = function (grunt) {
  grunt.initConfig({
    lambda_invoke: {
      lambdaTest: {
        options: {
          file_name: "lambdaTest/lambdaTest.js",
          event: "lambdaTest/test.json",
        }
      }
    },
    lambda_package: {
      lambdaTest: {
        options: {
          package_folder: 'lambdaTest/'
        }
      }
    },
    lambda_deploy: {
      lambdaTest: {
        arn: 'arn:aws:lambda:eu-central-1:XXXXXXXX:function:lambdaTest',
        options: {
          credentialsJSON: 'awsCredentials.json',
          region: "eu-central-1"
        },
      }
    },
  });

  grunt.loadNpmTasks('grunt-aws-lambda');

  grunt.registerTask('ls-deploy', ['lambda_package:lambdaTest', 'lambda_deploy:lambdaTest']);
};
Step 6: Create awsCredentials.json
Create an AWS IAM user with a custom policy. The custom policy should have access to lambda:GetFunction, lambda:UploadFunction, lambda:UpdateFunctionCode, lambda:UpdateFunctionConfiguration and iam:PassRole.
{
  "accessKeyId": "XXXXXXXXXXXXXXXXXXXX",
  "secretAccessKey": "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
}
Step 7: Create a zip and deploy to AWS Lambda.
ls-deploy is the custom task created in the Gruntfile above; it creates a zip of the source code and deploys it to Lambda.
grunt ls-deploy
Complete AWS Lambda Seed project is available on Git.
Let's take a folder named 'sample' as an example that we want to zip, and assume there are some subfolders or files within it.
Q: What do you have to do?
A: Follow these steps:
Go inside the 'sample' folder.
Select all required files or subfolders.
Right-click on any one and select "Send to" > compressed (zipped) folder.
You will see an Archive.zip; simply save it anywhere you want.
Upload this zip as your AWS Lambda function.
Q: What should you not do?
A: Do not zip the 'sample' folder itself. It won't work.
The same error occurs when you use the wrong runtime language
It's because exports.handler is not referencing the index function. This can be solved in a simpler way.
Try this:
console.log('Loading function');

exports.handler = function index (event, context) {
  console.log('value1 =', event.key1);
  console.log('value2 =', event.key2);
  console.log('value3 =', event.key3);
  context.succeed(event.key1); // Echo back the first key value
  // context.fail('Something went wrong');
}
