Using private key from GoDaddy on Nodejs - javascript

We purchased a domain name and an SSL certificate on GoDaddy, but our server is not hosted with GoDaddy. We run Lampp and Node.js on our server, and we are trying to set up SSL with both. There is no problem with Lampp: the private key and certificate from GoDaddy work. But when I try the same files with Node.js, it fails.
This is my js script:
ssl = {
    key: fs.readFileSync("./key.pem", 'utf8'),
    cert: fs.readFileSync("./cert.crt", 'utf8'),
    ca: [
        fs.readFileSync('./g1.crt', 'utf8'),
        fs.readFileSync('./g2.crt', 'utf8'),
        fs.readFileSync('./g3.crt', 'utf8')
    ]
};
server = require('https').createServer(ssl, app);
This is the error:
_tls_common.js:104
c.context.setKey(options.key, options.passphrase);
^
Error: error:0909006C:PEM routines:get_name:no start line
After some googling, I have tried several solutions: adding "utf8", splitting the gd bundle, and using Notepad++ to fix the files. None of them helped.
However, Node.js can use my self-signed key and certificate files. So I would like to ask: did I generate my key incorrectly? Should I manually generate the private key/CSR locally and request a new certificate on GoDaddy, or is there something wrong in my code?

This error message means that those files are wrong, corrupt, or were requested for a different OS environment. So we have some options.
Resolution in the code (import the file system library and use the full path):
let yourKey = fs.readFileSync('./folderOne/folderTwo/initial.key').toString();
let yourCertificate = fs.readFileSync('./folderOne/folderTwo/certificate.crt').toString();
var credentials = { key: yourKey, cert: yourCertificate };
Resolution by requesting OS-compatible certificates:
Request new certificates with a note about the OS (Linux, Windows, etc.), sending back the initial key that the provider sent to you.
Important: you only need the .crt file and the private key.
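Before handing the files to Node, a quick way to check whether they are actually PEM-encoded is the sketch below (it assumes the ./key.pem path from the question):
// Minimal diagnostic sketch, assuming the ./key.pem path from the question.
// The "no start line" error means OpenSSL could not find a "-----BEGIN ...-----"
// header, which usually points to a DER-encoded, empty, or BOM-prefixed file.
const fs = require('fs');

const raw = fs.readFileSync('./key.pem', 'utf8');
console.log(JSON.stringify(raw.slice(0, 40)));                // reveals BOMs, \r\n, or binary junk
console.log(/-----BEGIN [A-Z ]*PRIVATE KEY-----/.test(raw));  // should print true for a PEM key
If the BEGIN header is missing, the file was most likely exported in a different format, and re-downloading the certificate bundle (or converting it with OpenSSL) is the fix.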

Related

reading in a file from ubuntu (AWS EC2) on local machine?

I have a python script which I'm running on AWS (EC2 instance with Ubuntu). This python script outputs a JSON file daily, to a directory in /home/ubuntu:
with open("/home/ubuntu/bandsintown/sf_events.json", "w") as writeJSON:
    file_str = json.dumps(allEvents, sort_keys=True)
    file_str = "var sf_events = " + file_str
All works as expected here. My issue is that I'm unsure of how to read this JSON (existing on ubuntu) into a javascript file that I'm running on my local machine.
Javascript can't find the file if I call the file from ubuntu:
<script src="/home/ubuntu/bandsintown/sf_events.json"></script>
In other words, I'd like to read in the JSON that I've created in the cloud, to a file that exists on my local machine. Should I output the JSON somewhere other than home/ubuntu? Or, can my local file somehow recognize /home/ubuntu as a file location?
Thanks in advance.
The problem occurs because the file does not exist on your local machine, only on the running EC2 instance.
A possible solution is to upload the JSON file from EC2 instance to S3 and afterward download the JSON file to your local machine /home/ubuntu/bandsintown/sf_events.json.
First, install the AWS CLI toolkit on the running EC2 instance and run the following commands in the terminal:
aws configure
aws s3 cp /home/ubuntu/bandsintown/sf_events.json s3://mybucket/sf_events.json
Or install the Python AWS SDK boto3 and upload it via Python:
import boto3

s3 = boto3.resource('s3')

def upload_file_to_s3(s3_path, local_path):
    bucket = s3_path.split('/')[2]  # bucket is always second as paths are s3://bucket/.././
    file_path = '/'.join(s3_path.split('/')[3:])
    response = s3.Object(bucket, file_path).upload_file(local_path)
    return response

s3_path = "s3://mybucket/sf_events.json"
local_path = "/home/ubuntu/bandsintown/sf_events.json"
upload_file_to_s3(s3_path, local_path)
Then, on your local machine, download the file from S3 via the AWS CLI:
aws configure
aws s3 cp s3://mybucket/sf_events.json /home/ubuntu/bandsintown/sf_events.json
Or, if you prefer the Python SDK:
import boto3

s3 = boto3.resource('s3')

def download_file_from_s3(s3_path, local_path):
    bucket = s3_path.split('/')[2]  # bucket is always second as paths are s3://bucket/.././
    file_path = '/'.join(s3_path.split('/')[3:])
    s3.Object(bucket, file_path).download_file(local_path)

s3_path = "s3://mybucket/sf_events.json"
local_path = "/home/ubuntu/bandsintown/sf_events.json"
download_file_from_s3(s3_path, local_path)
Or use the JavaScript SDK running inside the browser, but I would not recommend this because you must make your bucket public and also take care of browser compatibility issues.
You can use AWS S3.
You can run one Python script on your instance that uploads the JSON file to S3 whenever the JSON gets generated, and another Python script on your local machine that either listens to an SQS queue and downloads from S3, or simply downloads the latest file uploaded to the S3 bucket.
Case 1:
Whenever the JSON file gets uploaded to S3, you will get a message in the SQS queue that the file has been uploaded, and then the file gets downloaded to your local machine.
Case 2:
Whenever the JSON file gets uploaded to S3, you can run the download script, which downloads the latest JSON file.
upload.py:
import boto3
import os

def upload_files(path):
    session = boto3.Session(
        aws_access_key_id='your access key id',
        aws_secret_access_key='your secret key id',
        region_name='region'
    )
    s3 = session.resource('s3')
    bucket = s3.Bucket('bucket name')
    for subdir, dirs, files in os.walk(path):
        for file in files:
            full_path = os.path.join(subdir, file)
            print(full_path[len(path):])
            with open(full_path, 'rb') as data:
                bucket.put_object(Key=full_path[len(path):], Body=data)

if __name__ == "__main__":
    upload_files('your path, which in your case is /home/ubuntu/')
Your other script on the local machine:
download1.py with SQS queue:
import boto3
import botocore
from logzero import logger

s3_resource = boto3.resource('s3')
sqs_client = boto3.client('sqs')

### Queue URL
queue_url = 'queue url'

### AWS S3 bucket
bucketName = "your bucket-name"

### Receive the message from the SQS queue
response_message = sqs_client.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=1,
    MessageAttributeNames=['All'],
)

message = response_message['Messages'][0]
receipt_handle = message['ReceiptHandle']
messageid = message['MessageId']
filename = message['Body']

try:
    s3_resource.Bucket(bucketName).download_file(filename, filename)
except botocore.exceptions.ClientError as e:
    if e.response['Error']['Code'] == '404':
        logger.info("The object does not exist.")
    else:
        raise

logger.info("File Downloaded")
download2.py, which downloads the latest file from S3:
import boto3

### S3 connection
s3_resource = boto3.resource('s3')
s3_client = boto3.client('s3')

bucketName = 'your bucket-name'
response = s3_client.list_objects_v2(Bucket=bucketName)
all_objects = response['Contents']
latest = max(all_objects, key=lambda x: x['LastModified'])
key = latest['Key']

print("downloading file")
s3_resource.Bucket(bucketName).download_file(key, key)
print("file downloaded")
You basically need to copy a file from the remote machine to your local one. The simplest way is to use scp. In the following example it just copies to your current directory. If you are on Windows, open PowerShell; if you are on Linux, scp should be installed already.
scp <username>@<your ec2 instance host or IP>:/home/ubuntu/bandsintown/sf_events.json ./
Run the command, enter your password, done. It is the same way you use ssh to connect to your remote machine. (I believe your username would be ubuntu.)
A more advanced method would be mounting your remote directory via SSHFS. It is a little cumbersome to set up, but then you will have instant access to the remote files as if they were local.
And if you want to do it programmatically from Python, see this question.
Copying files from local to EC2
Your private key must not be publicly visible. Run the following command so that only the file's owner can read it.
chmod 400 yourPrivateKeyFile.pem
To copy files between your computer and your instance you can use an FTP service like FileZilla or the command scp. “scp” means “secure copy”, which can copy files between computers on a network. You can use this tool in a Terminal on a Unix/Linux/Mac system.
To use scp with a key pair use the following command:
scp -i /directory/to/abc.pem /your/local/file/to/copy user@ec2-xx-xx-xxx-xxx.compute-1.amazonaws.com:path/to/file
You need to specify the correct Linux user. From Amazon:
For Amazon Linux, the user name is ec2-user.
For RHEL, the user name is ec2-user or root.
For Ubuntu, the user name is ubuntu or root.
For Centos, the user name is centos.
For Fedora, the user name is ec2-user.
For SUSE, the user name is ec2-user or root.
Otherwise, if ec2-user and root don’t work, check with your AMI provider.
To use it without a key pair, just omit the flag -i and type in the password of the user when prompted.
Note: You need to make sure that the user “user” has the permission to write in the target directory. In this example, if ~/path/to/file was created by user “user”, it should be fine.
Copying files from EC2 to local
To use scp with a key pair use the following command:
scp -i /directory/to/abc.pem user@ec2-xx-xx-xxx-xxx.compute-1.amazonaws.com:path/to/file /your/local/directory/files/to/download
Hack 1: While downloading files from EC2, download the whole folder by archiving it first.
zip -r squash.zip /your/ec2/directory/
Hack 2: You can download all archived files from EC2 with just the command below.
scp -i /directory/to/abc.pem user@ec2-xx-xx-xxx-xxx.compute-1.amazonaws.com:~/* /your/local/directory/files/to/download
Have you thought about using EFS for this? You can mount EFS on EC2 as well as on your local machine over a VPN or Direct Connect. Could you save the file on EFS so both sources can access it?
Hope this helps.

How to use .pem file inside Amazon Lambda to access EC2 instance

I'm currently working on a project that takes place inside the AWS environment. I have configured an S3 bucket in order to receive mails (the mails come from SES, but that's not relevant here).
What I want to do is create a Lambda function that will be able to access an EC2 instance and launch a Python script. So far I have the code below. The problem is that when I created my EC2 instance, I didn't create any username or password to connect via SSH. I only have a .pem file (certificate file) to authenticate to the instance.
I did some research but I couldn't find anything useful.
var SSH = require('simple-ssh');

var ssh = new SSH({
    host: 'localhost',
    user: 'username',
    pass: 'password'
});

ssh.exec('python3.6 path/to/my/python/script.py', {
    out: function(stdout) {
        console.log(stdout);
    }
}).start();
I've been thinking of several solutions, but I'm not sure at all:
find an SSH library in JavaScript that handles .pem files
convert the .pem into a String (not secure at all, in my opinion)
maybe create a new SSH user on EC2?
Thank you for your time.
A better option would be to use AWS Systems Manager to remotely run commands on your Amazon EC2 instances.
If you still choose to use simple-ssh then you need to supply an SSH key in config.key when creating your SSH object. You can store the private key in Parameter Store or Secrets Manager and retrieve it within the Lambda. In this case, you should definitely use passwordless SSH (with keypair).
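Here is a hedged sketch of that second route, assuming the AWS SDK v2 and a hypothetical Parameter Store entry named /ec2/ssh-private-key; the host and user are placeholders that must match your instance:
// Sketch only: fetch the .pem contents from SSM Parameter Store (SecureString) and
// pass them to simple-ssh via its `key` option. The parameter name, host and user
// below are placeholders, not values from the question.
const AWS = require('aws-sdk');
const SSH = require('simple-ssh');

exports.handler = async () => {
    const ssm = new AWS.SSM();
    const param = await ssm.getParameter({
        Name: '/ec2/ssh-private-key',   // hypothetical parameter name
        WithDecryption: true            // needed for SecureString parameters
    }).promise();

    const ssh = new SSH({
        host: 'ec2-xx-xx-xxx-xxx.compute-1.amazonaws.com', // your instance's public DNS
        user: 'ec2-user',                                  // or 'ubuntu', depending on the AMI
        key: param.Parameter.Value                         // the PEM private key, not a password
    });

    return new Promise((resolve, reject) => {
        ssh.exec('python3.6 path/to/my/python/script.py', {
            out: stdout => console.log(stdout),
            exit: code => resolve(code)
        }).on('error', reject).start();
    });
};
The Lambda's execution role would also need ssm:GetParameter permission on that parameter.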

Loading an intermediate Certificate to Node.js

I am trying to get my intermediate cert to be recognized by node.js.
I am on Windows Server 2008 R2 and IIS 7 for the main application.
However, I have an application on port 4443 that is Node.js and needs to be served via HTTPS. I have the .pfx file that I am pointing to in the ssloptions object and passing into createServer along with the passphrase.
The intermediate cert has been installed into the Windows certificate store.
I am using this middleware https://github.com/coolaj86/node-ssl-root-cas
Here is my code:
var myssl = require('ssl-root-cas/latest').inject()
    .addFile('path-to-app/ssl/gd-g2_iis_intermediates.pem');
…
var ssloptions = {
    pfx: fs.readFileSync('path-to-app/ssl/my.pfx'),
    passphrase: "thecorrectpassword"
};
app.listen = function(){
    //var server = http.createServer(this);
    var server = https.createServer(ssloptions, this);
    return server.listen.apply(server, arguments);
};
When I test the URL https://example.com/ with an SSL checker it reports the chain as complete. When I check https://example.com:4443/ I get the message:
The certificate is not trusted in all web browsers. You may need to install an Intermediate/chain certificate to link it to a trusted root certificate. Learn more about this error. You can fix this by following GoDaddy's Certificate Installation Instructions for your server platform. Pay attention to the parts about Intermediate certificates.
The GoDaddy docs are no help.
Any and all help very much appreciated.
Thanks
Mark

Unable to verify leaf signature

I'm using Node.js request.js to reach an API. I'm getting this error:
[Error: UNABLE_TO_VERIFY_LEAF_SIGNATURE]
All of my credentials are accurate and valid, and the server's fine. I made the same request with postman.
request({
"url": domain+"/api/orders/originator/"+id,
"method": "GET",
"headers":{
"X-API-VERSION": 1,
"X-API-KEY": key
},
}, function(err, response, body){
console.log(err);
console.log(response);
console.log(body);
});
This code is just running in an executable script, e.g. node ./run_file.js. Is that why? Does it need to run on a server?
Note: the following is dangerous, and will allow API content to be intercepted and modified between the client and the server.
This also worked
process.env['NODE_TLS_REJECT_UNAUTHORIZED'] = '0';
It's not an issue with the application, but with the certificate which is signed by an intermediary CA.
If you accept that fact and still want to proceed, add the following to request options:
rejectUnauthorized: false
Full request:
request({
"rejectUnauthorized": false,
"url": domain+"/api/orders/originator/"+id,
"method": "GET",
"headers":{
"X-API-VERSION": 1,
"X-API-KEY": key
},
}, function(err, response, body){
console.log(err);
console.log(response);
console.log(body);
});
The Secure Solution
Rather than turning off security you can add the necessary certificates to the chain. First install ssl-root-cas package from npm:
npm install ssl-root-cas
This package contains many intermediary certificates that browsers trust but node doesn't.
var sslRootCAs = require('ssl-root-cas/latest')
sslRootCAs.inject()
Will add the missing certificates. See here for more info:
https://git.coolaj86.com/coolaj86/ssl-root-cas.js
CoolAJ86's solution is correct and it does not compromise your security like disabling all checks using rejectUnauthorized or NODE_TLS_REJECT_UNAUTHORIZED. Still, you may need to inject an additional CA's certificate explicitly.
I tried first the root CAs included by the ssl-root-cas module:
require('ssl-root-cas/latest')
.inject();
I still ended up with the UNABLE_TO_VERIFY_LEAF_SIGNATURE error. Then I found out who issued the certificate for the web site I was connecting to using the COMODO SSL Analyzer, downloaded the certificate of that authority, and tried to add only that one:
require('ssl-root-cas/latest')
.addFile(__dirname + '/comodohigh-assurancesecureserverca.crt');
I ended up with another error: CERT_UNTRUSTED. Finally, I injected the additional root CAs and included "my" (apparently intermediary) CA, which worked:
require('ssl-root-cas/latest')
.inject()
.addFile(__dirname + '/comodohigh-assurancesecureserverca.crt');
For Create React App (where this error occurs too, and this question is the #1 Google result), you are probably using HTTPS=true npm start and a proxy (in package.json) that points to some HTTPS API which is itself self-signed in development.
If that's the case, consider changing proxy like this:
"proxy": {
"/api": {
"target": "https://localhost:5001",
"secure": false
}
}
secure decides whether the webpack proxy checks the certificate chain or not; disabling it ensures that the API's self-signed certificate is not verified, so that you get your data.
It may be very tempting to do rejectUnauthorized: false or process.env['NODE_TLS_REJECT_UNAUTHORIZED'] = '0'; but don't do it! It exposes you to man in the middle attacks.
The other answers are correct in that the issue lies in the fact that your cert is "signed by an intermediary CA." There is an easy solution to this, one which does not require a third party library like ssl-root-cas or injecting any additional CAs into node.
Most https clients in node support options that allow you to specify a CA per request, which will resolve UNABLE_TO_VERIFY_LEAF_SIGNATURE. Here's a simple example using node's built-in https module.
import https from 'https';
import { readFileSync } from 'fs';

const options = {
    host: '<your host>',
    defaultPort: 443,
    path: '<your path>',
    // assuming the bundle file is co-located with this file
    ca: readFileSync(__dirname + '/<your bundle file>.ca-bundle'),
    headers: {
        'content-type': 'application/json',
    }
};

https.get(options, res => {
    // do whatever you need to do
});
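Since the question itself uses request.js, the same idea can be applied there as well; a sketch, reusing the domain/id/key placeholders from the question and the hypothetical bundle file name above:
// Sketch: pass the CA bundle to request.js through agentOptions so the TLS layer
// can verify the leaf certificate. The file name and the domain/id/key variables
// are placeholders taken from the question.
const fs = require('fs');
const request = require('request');

request({
    url: domain + "/api/orders/originator/" + id,
    method: "GET",
    headers: { "X-API-VERSION": 1, "X-API-KEY": key },
    agentOptions: {
        ca: fs.readFileSync(__dirname + '/<your bundle file>.ca-bundle')  // placeholder file name
    }
}, function(err, response, body){
    console.log(err, body);
});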
If, however, you can configure the ssl settings in your hosting server, the best solution would be to add the intermediate certificates to your hosting provider. That way the client requester doesn't need to specify a CA, since it's included in the server itself. I personally use namecheap + heroku. The trick for me was to create one .crt file with cat yourcertificate.crt bundle.ca-bundle > server.crt. I then opened up this file and added a newline after the first certificate. You can read more at
https://www.namecheap.com/support/knowledgebase/article.aspx/10050/33/installing-an-ssl-certificate-on-heroku-ssl
You can also try setting strictSSL to false, like this:
{
url: "https://...",
method: "POST",
headers: {
"Content-Type": "application/json"},
strictSSL: false
}
I had the same issue. I followed ThomasReggi's and CoolAJ86's solutions and they worked well, but I'm not satisfied with them, because the "UNABLE_TO_VERIFY_LEAF_SIGNATURE" issue happens at the certificate configuration level.
I accept thirdender's solution, but it is only partial. As per the official nginx website, the served certificate should be a combination of the server certificate and the chained (intermediate) certificates.
Just putting this here in case it helps someone: my case was different and a bit of an odd mix. I was getting this on a request made via superagent; the problem had nothing to do with certificates (which were set up properly) and everything to do with the fact that I was then passing the superagent result through the async module's waterfall callback. To fix it: instead of passing the entire result, just pass result.body through the waterfall's callback.
The following commands worked for me:
> npm config set strict-ssl false
> npm cache clean --force
The problem is that you are attempting to install a module from a repository with a bad or untrusted SSL (Secure Sockets Layer) certificate. Once you clean the cache, this problem will be resolved. You might need to turn strict-ssl back to true later on.
Another approach to solving this securely is to use the following module.
node_extra_ca_certs_mozilla_bundle
This module can work without any code modification by generating a PEM file that includes all root and intermediate certificates trusted by Mozilla. You can use the following environment variable (works with Node.js v7.3+):
NODE_EXTRA_CA_CERTS
To generate the PEM file to use with the above environment variable, install the module using:
npm install --save node_extra_ca_certs_mozilla_bundle
and then launch your Node script with the environment variable:
NODE_EXTRA_CA_CERTS=node_modules/node_extra_ca_certs_mozilla_bundle/ca_bundle/ca_intermediate_root_bundle.pem node your_script.js
Other ways to use the generated PEM file are available at:
https://github.com/arvind-agarwal/node_extra_ca_certs_mozilla_bundle
NOTE: I am the author of the above module.
I had an issue with my Apache configuration after installing a GoDaddy certificate on a subdomain. I originally thought it might be an issue with Node not sending a Server Name Indicator (SNI), but that wasn't the case. Analyzing the subdomain's SSL certificate with https://www.ssllabs.com/ssltest/ returned the error Chain issues: Incomplete.
After adding the GoDaddy provided gd_bundle-g2-g1.crt file via the SSLCertificateChainFile Apache directive, Node was able to connect over HTTPS and the error went away.
If you come to this thread because you're using the node postgres / pg module, there is a better solution than setting NODE_TLS_REJECT_UNAUTHORIZED or rejectUnauthorized, which will lead to insecure connections.
Instead, configure the "ssl" option to match the parameters for tls.connect:
{
ca: fs.readFileSync('/path/to/server-ca.pem').toString(),
cert: fs.readFileSync('/path/to/client-cert.pem').toString(),
key: fs.readFileSync('/path/to/client-key.pem').toString(),
servername: 'my-server-name' // e.g. my-project-id/my-sql-instance-id for Google SQL
}
I've written a module to help with parsing these options from environment variables like PGSSLROOTCERT, PGSSLCERT, and PGSSLKEY:
https://github.com/programmarchy/pg-ssl
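For reference, a minimal usage sketch assuming the standard pg package; the connection details are placeholders, and only the ssl block mirrors the options above:
// Sketch only: wire the ssl options above into node-postgres. Host, user and
// database are placeholders; the file paths mirror the snippet above.
const fs = require('fs');
const { Pool } = require('pg');

const pool = new Pool({
    host: 'my-db-host',          // placeholder connection details
    user: 'my-db-user',
    database: 'my-db',
    ssl: {
        ca: fs.readFileSync('/path/to/server-ca.pem').toString(),
        cert: fs.readFileSync('/path/to/client-cert.pem').toString(),
        key: fs.readFileSync('/path/to/client-key.pem').toString(),
        servername: 'my-server-name'
    }
});

pool.query('SELECT 1', (err, res) => console.log(err || res.rows));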
Hello, just a small addition to this subject, since in my case the
require('ssl-root-cas/latest')
.inject()
.addFile(__dirname + '/comodohigh-assurancesecureserverca.crt');
didn't work out for me; it kept returning an error that the file could not be downloaded. I had been a couple of hours into the research of this particular error when I ran into this response: https://stackoverflow.com/a/65442604
Since my application uses a proxy for some of our requests (a security requirement of some of our users), I found that if you are consulting an API that has this issue and you can access the API URL through your browser, you can proxy your request and it might fix the [Error: UNABLE_TO_VERIFY_LEAF_SIGNATURE] issue.
An example of how I use my proxy:
await axios.get(url, {
timeout: TIME_OUT,
headers: {
'User-Agent': 'My app'
},
params: params,
proxy: {
protocol: _proxy.protocol,
host: _proxy.hostname,
port: _proxy.port,
auth: {
username: _proxy_username,
password: _proxy_password
}
}
});
I had the same problem and I was able to fix it in the following way:
Use the full-chain certificate (or at least the chain/intermediate certificates) instead of just the leaf certificate.
That is all.
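In Node terms, that means serving the leaf certificate followed by the intermediates; a minimal sketch, with hypothetical file names standing in for whatever your CA issued:
// Sketch only: concatenate the leaf certificate and the CA's intermediate bundle
// so clients can build the full chain. The file names are hypothetical.
const fs = require('fs');
const https = require('https');

const leaf = fs.readFileSync('./certificate.crt', 'utf8');   // your leaf certificate
const chain = fs.readFileSync('./ca-bundle.crt', 'utf8');    // intermediate bundle from the CA

https.createServer({
    key: fs.readFileSync('./private.key', 'utf8'),
    cert: leaf + '\n' + chain   // leaf first, then intermediates
}, (req, res) => res.end('ok')).listen(443);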
This same error can be received when trying to install a local git shared repo from npm.
The error will read: npm ERR! code UNABLE_TO_VERIFY_LEAF_SIGNATURE
Apparently there is an issue with the certificate; however, what worked for me was changing the link to my shared repo in the package.json file from:
"shared-frontend": "https://myreposerver"
to:
"shared-frontend": "git+https://myreposerver"
In short, just adding git+ to the link solved it.
Another reason node could print that error is because a backend connection/service is misconfigured.
Unfortunately, the node error doesn't say which certificate it was unable to verify (feature request!).
Your server may have a perfectly good certificate chain installed for clients to connect and even show a nice padlock in the browser's URL bar, but when the server tries to connect to a backend database using a different misconfigured certificate, then it could raise an identical error.
I had this issue in some vendor code for some time. Changing a backend database connection from self-signed to an actual certificate resolved it.
You have to include the Intermediate certificate in your server. This solves the [Error: UNABLE_TO_VERIFY_LEAF_SIGNATURE]

Writing firefox extension: How to export private key from certificate database? Possible?

I am writing a Firefox extension and am looking for a way to export the private key from an installed certificate.
This would replace the previous process of saving a backup PKCS12 .p12 file, then running: "openssl pkcs12 -nocerts -in backup.p12 -out userkey.pem"
Thanks!
EDIT: I can now save a PKCS12 backup using the XPCOM API, I can extract the Certificate, but am still looking for a way to extract the private key (see the openssl command above). This needs to be cross platform...
If the point is simply to avoid exporting the private key manually, then you can use the pk12util tool, which is part of NSS. You can export the certificate like this:
pk12util -o backup.p12 -n certificate_name -d /firefox/profile/dir
That's a lot easier than doing the same thing from an extension. From what I know, NSS explicitly doesn't allow storing the private key unencrypted in the PEM format so you would still need OpenSSL for that.
I have given up on doing this. Instead, I'm writing a Python CGI script and sending the certificate and keys over an SSL connection to an Apache server.
