I have a CSR and I can parse all the data with the pkijs.org library, but I've had no luck parsing the alternative names data. How can this be done with JavaScript? Other libraries could be used as well, I guess; do you know of one?
Following the docs for the CertificationRequest class provided by pkijs (https://pkijs.org/docs/classes/CertificationRequest.html), we can see the structure of a CSR. The subject alternative name is stored in the attributes property of the CertificationRequest object, but the structure inside attributes is fairly complex to turn into plain text. This is the code I used to print out the subject alternative name:
const pkijs = require("pkijs");
const utils = require("pvtsutils");
const asn1js = require("asn1js");

const base64 = "<your_csr_in_base64>";
const csrraw = utils.Convert.FromBase64(base64);
console.log(csrraw);

const pkcs10 = pkijs.CertificationRequest.fromBER(csrraw);

// The extensionRequest attribute holds the requested extensions as an ASN.1 sequence
const seq = pkcs10.attributes[0].values[0];
const exts = pkijs.Extensions.fromBER(seq.toBER(false));
console.log(exts);

const san = getExtensionsForSANFromExtensions(exts);
console.log(san);
if (san !== undefined) {
    san.names.forEach(element => {
        console.log(element.type + " = " + element.value);
    });
}

function getExtensionsForSANFromExtensions(exts) {
    for (let i = 0; i < exts.extensions.length; i++) {
        const ext = exts.extensions[i];
        if (ext.extnID === "2.5.29.17") { // id-ce-subjectAltName
            const octetString = asn1js.fromBER(ext.extnValue.toBER(false)).result;
            return pkijs.GeneralNames.fromBER(octetString.getValue());
        }
    }
}
I've tested this code and it works properly with a CSR generated by Keystore Explorer. I have not tested it with CSRs from other tools that support subject alternative names.
Cheers!
If you have a CSR and need to extract the alternative names data from it, you can use the following command:
openssl req -in csr.pem -noout -text
This will print out the entire CSR, including the alternative names data.
Is there a way to have a selection of many transactions printed into a single PDF document? I only see two options which seem to have significant drawbacks:
1) Load individual records into each of their own nlobjTemplateRenderer objects, and then stitch them all together within <pdfset> tags before rendering to PDF. This has a limit of fewer than 50 transactions, depending on other actions taken when used within a Suitelet.
2) Do a search based upon the internal IDs of the selected records and pass the search results into an nlobjTemplateRenderer object. Based on the existing documentation, this method does not lead me to believe that it will properly display records with line data as result columns completely within a single document.
It almost seems like my best option is #1, but splitting the desired transactions up into groups of 5-10 records and repeatedly calling a Suitelet with the small groups in the hope of staying within the 45-second timeout limit of nlapiRequestURL, then stitching together all of the results and returning the final PDF document. I pretty much see a basic form of that as the following:
// initial called function that will return the completed PDF document file
function buildPdfFromRecords() {
    var pdfBuilder = [];
    var selectedIDs = []; // internal IDs of the selected transactions
    var chunks = chunkify(selectedIDs, 10); // chunkify: helper that splits the array into groups of 10
    for (var c = 0; c < chunks.length; c++) {
        var param = { id_list: JSON.stringify(chunks[c]) };
        var result = nlapiRequestURL(url, param).getBody(); // url points at the Suitelet below
        pdfBuilder.push(result);
    }
    var finalXML = "<pdfset>" + pdfBuilder.join("") + "</pdfset>";
    var pdfDoc = nlapiXMLToPDF(finalXML);
    return pdfDoc;
}
// function in the Suitelet, called by URL, that handles individual groups of record internal IDs
// to mitigate scripting governance limits
function handleRecordIdListRequest(request, response) {
    var idList = JSON.parse(request.getParameter("id_list"));
    var templateXML = nlapiLoadFile("template.txt").getValue();
    var pdfBuilder = [];
    for (var i = 0; i < idList.length; i++) {
        var transRecord = nlapiLoadRecord("recordtype", idList[i]);
        var renderer = nlapiCreateTemplateRenderer();
        renderer.setTemplate(templateXML);
        renderer.addRecord("record", transRecord);
        pdfBuilder.push(renderer.renderToString());
    }
    response.write(pdfBuilder.join(""));
}
If this is really the best way, then so be it, but I'm hoping there's a more elegant solution out there that I'm just not seeing.
There are a number of pieces you can stitch together to get this done.
In the post handler of your Suitelet, use the N/task library to schedule a map/reduce task. The task.submit method returns a taskId that you can use to monitor the progress of your job. Once your UI has a taskId, it can periodically check whether the task has completed; when it has, you can show the generated PDF. You could also let the user know that the PDF might take a few minutes to generate and offer to email it to them when done. Here's a snippet that schedules a scheduled script with parameters:
const mrTask = task.create({
    taskType: task.TaskType.SCHEDULED_SCRIPT,
    scriptId: 'customscript_knsi_batch_products',
    deploymentId: deploymentId,
    params: {
        custscript_knsi_batch_operator: user.id,
        custscript_knsi_batch_sourcing: sourcingId
    }
});
try {
    const taskId = mrTask.submit();
    context.response.setHeader({ name: 'content-type', value: 'application/json' });
    context.response.write(JSON.stringify({
        success: true,
        message: 'queued as task: ' + taskId
    }));
} catch (e) {
    log.error({
        title: 'triggering ' + sourcingId + ' for ' + user.email,
        details: (e.message || e.toString()) + (e.getStackTrace ? (' \n \n' + e.getStackTrace().join(' \n')) : '')
    });
    context.response.setHeader({ name: 'content-type', value: 'application/json' });
    context.response.write(JSON.stringify({
        success: false,
        message: 'An error occurred scheduling this script\n' + e.message
    }));
}
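Once the client has that taskId, it can poll a small status endpoint. Here is a minimal sketch of what such a handler might look like (the separate Suitelet and the taskid parameter name are assumptions, not part of the snippet above):

// Minimal status-polling Suitelet sketch: reports the state of a previously submitted task.
define(['N/task'], function (task) {
    function onRequest(context) {
        var taskId = context.request.parameters.taskid; // hypothetical parameter name
        var status = task.checkStatus({ taskId: taskId });
        context.response.setHeader({ name: 'content-type', value: 'application/json' });
        context.response.write(JSON.stringify({
            taskId: taskId,
            status: status.status // PENDING, PROCESSING, COMPLETE or FAILED
        }));
    }
    return { onRequest: onRequest };
});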
Use a Map/Reduce script where your map method generates each transaction's PDF and returns the saved file's id or URL. Use only a single key so that the results of all map stages coalesce into a single reduce.
In the reduce step you can generate opening and closing PDF files as necessary and put their references into your mapped array of PDFs.
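As a rough sketch of that shape (not the original poster's script; the use of render.transaction with the default print template, the hard-coded ids, and the folder id are assumptions), the map stage might look like this, with reduce receiving every file id under the single 'pdfs' key and binding them with the renderSet function shown below:

// Rough Map/Reduce sketch. getInputData supplies transaction internal ids;
// map renders each transaction to a saved PDF file and emits the file id under one key;
// reduce binds them into a single PDF (renderSet also needs N/file, N/xml and N/log in scope).
define(['N/render'], function (render) {
    function getInputData() {
        return [123, 456, 789]; // transaction internal ids, however you collect them
    }

    function map(context) {
        var tranId = parseInt(JSON.parse(context.value), 10);
        var pdf = render.transaction({
            entityId: tranId,
            printMode: render.PrintMode.PDF // default print template for the transaction
        });
        pdf.folder = 1234; // placeholder folder internal id
        context.write({ key: 'pdfs', value: pdf.save() });
    }

    function reduce(context) {
        // context.values is the array of saved pdf file ids
        var boundPdf = renderSet({ files: context.values });
        // save or email boundPdf as needed
    }

    return { getInputData: getInputData, map: map, reduce: reduce };
});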
Use the pdfset to bind all your individual pdfs into a single pdf:
// Assumes the N/file, N/xml, N/render and N/log modules are available in scope
function renderSet(opts) {
    var tpl = ['<?xml version="1.0"?>', '<pdfset>'];
    opts.files.forEach(function (id) {
        var partFile = file.load({ id: id });
        var pdf_fileURL = xml.escape({ xmlText: partFile.url });
        tpl.push("<pdf src='" + pdf_fileURL + "'/>");
    });
    tpl.push("</pdfset>");
    log.debug({ title: 'bound template', details: xml.escape({ xmlText: tpl.join('\n') }) });
    return render.xmlToPdf({
        xmlString: tpl.join('\n')
    });
}
Why not use a Map Reduce script to generate the PDF? Does it need to be a Suitelet?
I am importing data from Google Sheets into a MySQL table using Google Apps Script. I have a significantly large dataset to import from the sheet into the table, but I am running into an "Exceeded maximum execution time" exception. Are there other options to speed up execution?
var address = 'database_IP_address';
var rootPwd = 'root_password';
var user = 'user_name';
var userPwd = 'user_password';
var db = 'database_name';

var root = 'root';
var instanceUrl = 'jdbc:mysql://' + address;
var dbUrl = instanceUrl + '/' + db;

function googleSheetsToMySQL() {
    var RecId;
    var Code;
    var ProductDescription;
    var Price;

    var dbconnection = Jdbc.getConnection(dbUrl, root, rootPwd);
    var statement = dbconnection.createStatement();
    var googlesheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('product');
    var data = googlesheet.getDataRange().getValues();

    for (var i = 1; i < data.length; i++) {
        RecId = data[i][0];
        Code = data[i][1];
        ProductDescription = data[i][2];
        Price = data[i][3];

        var sql = "{call [dbo].[sp_googlesheetstotable](?,?,?,?)}";
        statement = dbconnection.prepareCall(sql);
        statement.setString(1, RecId);
        statement.setString(2, Code);
        statement.setString(3, ProductDescription);
        statement.setString(4, Price);
        statement.executeUpdate();
    }

    statement.close();
    dbconnection.close();
}
Using batch execution
dbconnection.setAutoCommit(false);

for (var i = 1; i < data.length; i++) {
    RecId = data[i][0];
    Code = data[i][1];
    ProductDescription = data[i][2];
    Price = data[i][3];

    var sql = "{call [dbo].[sp_googlesheetstotable](?,?,?,?)}";
    statement = dbconnection.prepareCall(sql);
    statement.setString(1, RecId);
    statement.setString(2, Code);
    statement.setString(3, ProductDescription);
    statement.setString(4, Price);
    statement.addBatch();
    statement.executeBatch();
}

dbconnection.commit();
I suspect that you may have figured out the solution to your problem, but for all those who might stumble across this like I did, there is an easy way to speed up these requests. The OP was nearly there...
Using the provided code:
function googleSheetsToMySQL() {
    var sheetName = 'name_of_google_sheet';
    var dbAddress = 'database_ip_address';
    var dbUser = 'database_user_name';
    var dbPassword = 'database_user_password';
    var dbName = 'database_name';
    var dbTableName = 'database_table_name';
    var dbURL = 'jdbc:mysql://' + dbAddress + '/' + dbName;

    // Regarding the statement used by the OP, you might find something like...
    //
    //   "INSERT INTO " + dbTableName + " (recid, code, product_description, price) VALUES (?, ?, ?, ?);"
    //
    // to be more practical if you're trying to implement the OP's code, as you are unlikely to have a
    // stored procedure named 'sp_googlesheetstotable', or may be more familiar with basic queries like
    // INSERT, UPDATE, or SELECT.
    var sql = "{call [dbo].[sp_googlesheetstotable](?,?,?,?)}";

    // The more records/requests you load into the statement object, the longer it will take to process,
    // which may mean you exceed the execution time before you can do any post-processing.
    //
    // For example, you may want to record the last row you exported in the event the export must be halted
    // prematurely. You could create a series of Triggers to re-initiate the export, picking up right where
    // you left off.
    //
    // The other consideration is that you want your GAS memory utilization to remain as low as possible to
    // keep things running smoothly and quickly, so try to strike a balance that fits the data you're
    // working with.
    var maxRecordsPerBatch = 1000;

    var spreadsheet = SpreadsheetApp.getActiveSpreadsheet();
    var sheet = spreadsheet.getSheetByName(sheetName);
    var sheetData = sheet.getDataRange().getValues();

    var dbConnection = Jdbc.getConnection(dbURL, dbUser, dbPassword);

    // The following only needs to be set when you are changing the statement that needs to be prepared,
    // or when you need to reset the variable.
    //
    // For example, if you were to switch to a different sheet which may have different values, columns,
    // structure, and/or target database table.
    var dbStatement = dbConnection.prepareCall(sql);

    var RecId;
    var Code;
    var ProductDescription;
    var Price;

    var recordCounter = 0;
    var lastRow;

    dbConnection.setAutoCommit(false);

    for (var i = 1; i < sheetData.length; i++) {
        lastRow = (i + 1 == sheetData.length);

        RecId = sheetData[i][0];
        Code = sheetData[i][1];
        ProductDescription = sheetData[i][2];
        Price = sheetData[i][3];

        dbStatement.setString(1, RecId);
        dbStatement.setString(2, Code);
        dbStatement.setString(3, ProductDescription);
        dbStatement.setString(4, Price);

        // This command takes what has been set above and adds the request to the array that will be sent
        // to the database for processing.
        dbStatement.addBatch();

        recordCounter += 1;

        if (recordCounter == maxRecordsPerBatch || lastRow) {
            try {
                dbStatement.executeBatch();
            } catch (e) {
                console.log('Attempted to update TABLE `' + dbTableName + '` in DB `' + dbName + '`, but the following error was returned: ' + e);
            }

            if (!lastRow) {
                // Reset vars; better to re-prepare the statement to avoid any potential
                // "No operations allowed after statement closed" errors.
                dbStatement = dbConnection.prepareCall(sql);
                recordCounter = 0;
            }
        }
    }

    dbConnection.commit();
    dbConnection.close();
}
The OP may still have run up against the execution time limit (I did at less than 10k records), but you should avoid batching individual requests unless you're having trouble locating a problem row.
From this link
It is important to keep in mind that each update added to a Statement or PreparedStatement is executed separately by the database. That means that some of them may succeed before one of them fails. All the statements that have succeeded are now applied to the database, but the rest of the updates may not be. This can result in inconsistent data in the database.

To avoid this, you can execute the batch update inside a JDBC transaction. When executed inside a transaction you can make sure that either all updates are executed, or none are. Any successful updates can be rolled back, in case one of the updates fails.
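Applied to the Apps Script JDBC service, that advice looks roughly like the following sketch (illustrative only; the INSERT statement, table, and connection variables are assumed, not taken from the original code):

// Wrap the whole batch in a transaction so it is applied all-or-nothing.
var conn = Jdbc.getConnection(dbUrl, dbUser, dbPassword);
conn.setAutoCommit(false); // updates are held until commit()
try {
    var stmt = conn.prepareStatement(
        'INSERT INTO product (recid, code, product_description, price) VALUES (?, ?, ?, ?)'); // assumed statement
    for (var i = 1; i < data.length; i++) {
        stmt.setString(1, String(data[i][0]));
        stmt.setString(2, String(data[i][1]));
        stmt.setString(3, String(data[i][2]));
        stmt.setString(4, String(data[i][3]));
        stmt.addBatch();
    }
    stmt.executeBatch(); // one round trip for the whole batch
    conn.commit();       // all updates become permanent together
} catch (e) {
    conn.rollback();     // none of the batched updates are applied
    throw e;
} finally {
    conn.close();
}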
Alternative Solution
If the time limit is a huge bother, you might try externally accessing the data within your Sheets. I've copied the basic instructions for posterity's sake, but please visit the link if it still works.
Link to source
1) Update composer.json to require "google/apiclient": "^2.0" and run composer update.
2) Create a project on https://console.developers.google.com/apis/dashboard.
3) Click Enable APIs and enable the Google Sheets API.
4) Go to Credentials, then click Create credentials, and select Service account key.
5) Choose New service account in the drop-down. Give the account a name; anything is fine.
6) For Role I selected Project -> Service Account Actor.
7) For Key type, choose JSON (the default) and download the file. This file contains a private key, so be very careful with it; it is your credentials, after all.
8) Finally, edit the sharing permissions for the spreadsheet you want to access and share either View (if you only want to read the file) or Edit (if you need read/write) access with the client_email address you can find in the JSON file.
<?php
require __DIR__ . '/vendor/autoload.php';

/*
 * We need to get a Google_Client object first to handle auth and api calls, etc.
 */
$client = new \Google_Client();
$client->setApplicationName('My PHP App');
$client->setScopes([\Google_Service_Sheets::SPREADSHEETS]);
$client->setAccessType('offline');

/*
 * The JSON auth file can be provided to the Google Client in two ways: one is as a string, which is assumed
 * to be the path to the json file. This is a nice way to keep the creds out of the environment.
 *
 * The second option is as an array. For this example I'll pull the JSON from an environment variable,
 * decode it, and pass it along.
 */
$jsonAuth = getenv('JSON_AUTH');
$client->setAuthConfig(json_decode($jsonAuth, true));

/*
 * With the Google_Client we can get a Google_Service_Sheets service object to interact with sheets
 */
$sheets = new \Google_Service_Sheets($client);

/*
 * To read data from a sheet we need the spreadsheet ID and the range of data we want to retrieve.
 * Range is defined using A1 notation, see https://developers.google.com/sheets/api/guides/concepts#a1_notation
 */
$data = [];

// The first row contains the column titles, so let's start pulling data from row 2
$currentRow = 2;

// The range of A2:H will get columns A through H and all rows starting from row 2
$spreadsheetId = getenv('SPREADSHEET_ID');
$range = 'A2:H';
$rows = $sheets->spreadsheets_values->get($spreadsheetId, $range, ['majorDimension' => 'ROWS']);

if (isset($rows['values'])) {
    foreach ($rows['values'] as $row) {
        /*
         * If the first column is empty, consider it an empty row and skip (this is just for example)
         */
        if (empty($row[0])) {
            break;
        }

        $data[] = [
            'col-a' => $row[0],
            'col-b' => $row[1],
            'col-c' => $row[2],
            'col-d' => $row[3],
            'col-e' => $row[4],
            'col-f' => $row[5],
            'col-g' => $row[6],
            'col-h' => $row[7],
        ];

        /*
         * Now for each row we've seen, let's update the I column with the current date
         */
        $updateRange = 'I' . $currentRow;
        $updateBody = new \Google_Service_Sheets_ValueRange([
            'range'          => $updateRange,
            'majorDimension' => 'ROWS',
            'values'         => [[date('c')]], // values is a 2D array: rows of cells
        ]);
        $sheets->spreadsheets_values->update(
            $spreadsheetId,
            $updateRange,
            $updateBody,
            ['valueInputOption' => 'USER_ENTERED']
        );

        $currentRow++;
    }
}

print_r($data);
/* Output:
Array
(
    [0] => Array
        (
            [col-a] => 123
            [col-b] => test
            [col-c] => user
            [col-d] => test user
            [col-e] => usertest
            [col-f] => email#domain.com
            [col-g] => yes
            [col-h] => no
        )

    [1] => Array
        (
            [col-a] => 1234
            [col-b] => another
            [col-c] => user
            [col-d] =>
            [col-e] => another
            [col-f] => another#eom.com
            [col-g] => no
            [col-h] => yes
        )
)
*/
Try checking this related SO question for some information on how to import data from Google Spreadsheets into MySQL using Apps Script code.
Now, for your "exceeded maximum execution time" error: remember that the Apps Script quotas allow a maximum execution time of only 6 minutes per execution for a single script, so it means you have exceeded this limit.
Try checking this page for the technique on how to prevent Google Scripts from exceeding the maximum execution time limit.
For more information, check these links:
Exceeded maximum execution time in Google Apps Script
Google app script timeout ~ 5 minutes?
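For reference, the usual technique behind those links is to stop a little before the six-minute quota, remember where you stopped, and schedule a one-off trigger to resume. A minimal sketch (the property key, sheet name, and resume delay are assumptions):

// Sketch of the "stop and resume via trigger" pattern for long-running imports.
var MAX_RUNTIME_MS = 4.5 * 60 * 1000; // stay comfortably under the 6-minute quota

function importInChunks() {
    var start = Date.now();
    var props = PropertiesService.getScriptProperties();
    var startRow = Number(props.getProperty('lastProcessedRow')) || 1; // hypothetical property key
    var data = SpreadsheetApp.getActiveSpreadsheet()
        .getSheetByName('product').getDataRange().getValues();

    for (var i = startRow; i < data.length; i++) {
        // ... add data[i] to your batch / send it to the database here ...
        if (Date.now() - start > MAX_RUNTIME_MS) {
            props.setProperty('lastProcessedRow', String(i));
            // resume in one minute with a one-off time-based trigger
            ScriptApp.newTrigger('importInChunks').timeBased().after(60 * 1000).create();
            return;
        }
    }
    props.deleteProperty('lastProcessedRow'); // finished
}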
I have this response data from the server that I authenticate against:
<session session-guid="D39C1215-2F31-8641-E5T2-C7G35RT69127" user-id="34"> </session>
How can I get the values of session-guid and user-id and store each of them in its own variable?
Thank you.
In plain Javascript in a modern browser you can use document.querySelectorAll().
For example, assuming you only have one session tag:
var session = document.querySelectorAll("session")[0];
var session_guid = session.getAttribute("session-guid");
var user_id = session.getAttribute("user-id");
console.log("session-guid: " + session_guid);
console.log("user-id: " + user_id);
If you have more than one session tag you can use forEach on the results of querySelectorAll() to locate the one you want, as in the sketch below. If you know that you're only going to have one session element you can use document.querySelector() instead of document.querySelectorAll()[0].
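For example, with several session elements on the page the loop might look like this (a small sketch using the same attributes as above):

// Log the attributes of every <session> element in the document
document.querySelectorAll("session").forEach(function (el) {
    console.log(el.getAttribute("session-guid"), el.getAttribute("user-id"));
});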
Here is one way to get the values client side. For this method you will need to add a custom class (for example, mySession) to your session element.
Then write the lines of code below inside a <script> tag:
var $mySession = jQuery(document.getElementsByClassName("mySession"));
for (var i = 0; i < $mySession.length; i++) {
    var sessionguid = jQuery($mySession[i]).attr('session-guid');
    var userid = jQuery($mySession[i]).attr('user-id');
    console.log(sessionguid);
    console.log(userid);
}
You can check the values of the sessionguid and userid variables in your browser console.
Here is what you can do to get the required data from the XML.
function getXMLData(xml) {
    var txt, xmlDoc;
    // get the responseXML
    xmlDoc = xml.responseXML;
    txt = "";
    // get all the session nodes (this returns an array-like list of nodes)
    var sessions = xmlDoc.getElementsByTagName("session");
    for (var i = 0; i < sessions.length; i++) {
        // iterate over the nodes and get the attributes
        txt += sessions[i].getAttribute("session-guid") + "<br />";
    }
    document.getElementById("demo").innerHTML = txt;
}
This function accepts the XHR response as a parameter, then extracts the session nodes and the required attribute. You can modify it according to your requirements.
Here is a PLNKR demo of the same.
Users will be hitting up against a URL that contains a query string called inquirytype. For a number of reasons, I need to read in this query string with javascript (Dojo) and save its value to a variable. I've done a fair amount of research trying to find how to do this, and I've discovered a few possibilities, but none of them seem to actually read in a query string that isn't hard-coded somewhere in the script.
You can access parameters from the URL using location.search, without Dojo; see Can a javascript attribute value be determined by a manual url parameter?
function getUrlParams() {
    var paramMap = {};
    if (location.search.length == 0) {
        return paramMap;
    }
    var parts = location.search.substring(1).split("&");
    for (var i = 0; i < parts.length; i++) {
        var component = parts[i].split("=");
        paramMap[decodeURIComponent(component[0])] = decodeURIComponent(component[1]);
    }
    return paramMap;
}
Then you could do the following to extract id from the url /hello.php?id=5&name=value
var params = getUrlParams();
var id = params['id']; // or params.id
Dojo provides dojo.queryToObject (http://dojotoolkit.org/reference-guide/dojo/queryToObject.html), which is a bit smarter than my simple implementation and creates arrays out of duplicated keys.
var uri = "http://some.server.org/somecontext/?foo=bar&foo=bar2&bit=byte";
var query = uri.substring(uri.indexOf("?") + 1, uri.length);
var queryObject = dojo.queryToObject(query);
// The structure of queryObject will be:
// {
//     foo: ["bar", "bar2"],
//     bit: "byte"
// }
In newer Dojo it's accessed via dojo/io-query:
require([
    "dojo/io-query"
], function (ioQuery) {
    var GET = ioQuery.queryToObject(decodeURIComponent(dojo.doc.location.search.slice(1)));
    console.log(GET.id);
});
Since dojo 0.9, there is a better option, queryToObject.
dojo.queryToObject(query)
See this similar question with what I think is a cleaner answer.