Maximum size of a Google Apps Script Array - javascript

One of my scripts is a leave approval system.
It reads a spreadsheet of all leave requests ever submitted, loading all data into an array.
This array is then processed and displayed in a dynamic grid.
The way this is designed, all leave requests need to be in a single sheet. Even once requests are approved, employees can view their current and past requests through this script.
Over time this will grow into thousands of lines. Each line is ~140 bytes.
I can't find any reference to a maximum array size in Apps Script.
I suppose I may hit execution time limits before I exceed the size of the structure anyway!
Does anyone know if there is a limit, and what it is?
Tony
var DataSource = SpreadsheetApp.openById("0AgHhFhurd2nCdFV4dmdRS3....");
var DataSheet = DataSource.setActiveSheet(DataSource.getSheets()[0]);
var numRows = DataSheet.getLastRow() - 1; // -1 to omit header row
var LeaveData = DataSheet.getRange(2, 1, numRows, 16).getValues();

Related

Pasting Values Only For Certain Formulas

I have a large workbook that pulls in weekly data (columns) for hundreds of metrics on several tabs. It pulls this data in via SUMIFS formulas, and on most tabs there are several rows that contain ratios/rates calculated from these SUMIFS formulas.
Here is a toy example.
For each sheet, I would like to paste values only for formulas that are based on 'SUMIFS', and leave the other calculations alone. I was able to select a range and loop cell by cell to accomplish this, but it takes a long time due to the size of the workbook. Is there a way to do this at once, in a batch fashion? Basically, I want to copy and paste values only if a certain condition exists.
Solution:
Updated - Slight improvement to make it faster by Iamblichus.
I am not quite sure what you have tried so far, but this one works relatively fast for a couple of sheets:
function myFunction() {
  let ss = SpreadsheetApp.getActiveSpreadsheet();
  let sheets = ['Sheet1', 'Sheet2'].map(sh => ss.getSheetByName(sh));
  sheets.forEach(sh => {
    let range = sh.getDataRange();
    let formula = range.getFormulasR1C1();
    let values = range.getValues();
    formula.forEach((fr, fx) => {
      fr.forEach((fc, fy) => {
        // Overwrite only cells whose formula contains SUMIFS with their current value
        if (fc.toUpperCase().includes('SUMIFS')) {
          sh.getRange(fx + 1, fy + 1).setValue(values[fx][fy]);
        }
      });
    });
  });
}
Please adjust ['Sheet1','Sheet2'] to your needs.
Explanation:
I iterate through each sheet and check whether each cell contains a SUMIFS formula. If it does, I overwrite it with its value; otherwise I keep its formula.
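The per-cell setValue calls are still the slow part on large sheets. As a minimal sketch (not part of the original answer; the function name is mine), you could build the whole output grid in memory and write it back with one setValues() per sheet, relying on strings that begin with '=' being re-applied as formulas:
// Sketch only: same sheet names as above; adjust to your workbook.
function pasteValuesForSumifs() {
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  const sheets = ['Sheet1', 'Sheet2'].map(name => ss.getSheetByName(name));

  sheets.forEach(sheet => {
    const range = sheet.getDataRange();
    const formulas = range.getFormulas(); // A1-style; '' for cells without a formula
    const values = range.getValues();

    // SUMIFS cells become static values; everything else keeps its formula or plain value.
    const output = formulas.map((row, r) =>
      row.map((formula, c) => {
        if (formula === '') return values[r][c];                           // plain value cell
        if (formula.toUpperCase().includes('SUMIFS')) return values[r][c]; // freeze SUMIFS result
        return formula;                                                    // keep other formulas
      })
    );

    range.setValues(output); // one write per sheet instead of one per cell
  });
}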

change the domain of my line chart at runtime

In my code I am loading a JSON file of more than 900 records. These records represent data emitted by some machines. I'm drawing a line chart; the keys in this JSON represent the names of the machines.
This is the structure of my JSON:
{"AF3":3605.1496928113393,"AF4":-6000.4375230516,"F3":1700.3827875419374,"F4":4822.544985821321,"F7":4903.330735023786,"F8":824.4048714773611,"FC5":3259.4071092472655,"FC6":4248.067359141752,"O1":3714.5106599153364,"O2":697.2904723891061,"P7":522.7300768483767,"P8":4050.79490288753,"T7":2939.896657485737,"T8":9.551935316881588}
Each line represents one machine, and I put a space so I can see each machine separately. I am currently reading the data with the help of a counter called cont. All the data in the JSON is between 0 and 5000, but I have modified some objects of the JSON to force a change of the domain, and the new domain must then apply to all the lines.
For example, on line 106 of the JSON I set "AF3":7000 (in this case the domain should be [0, 7000] for all the lines),
and on line 300, "AF4":-1000 (in this case the domain should be [-1000, 7000] for all the lines).
I have modified some data on purpose to achieve this change. I would like all lines to be updated to this new domain, if possible with an animation.
How can I do it?
This is my code:
http://plnkr.co/edit/KVVyOYZ4CVjxeei7pd9H?p=preview
To update the domain across all the lines, we need to recalculate the domain before new data gets pushed in.
Plunker: http://plnkr.co/edit/AHWVM3HT7TDAiINFRlN9?p=preview
var newDomain = d3.extent(ids.map(function(d) {
    return aData[cont][d];
}));
var oldDomain = y.domain();
newDomain[0] = newDomain[0] < oldDomain[0] ? newDomain[0] : oldDomain[0];
newDomain[1] = newDomain[1] > oldDomain[1] ? newDomain[1] : oldDomain[1];
y.domain(newDomain);
domain.text(y.domain());
With respect to the graph getting trimmed, the data needs to be manipulated (in your case: 14 arrays, push and shift operations on each array, and a D3 transition) all within 1 ms, which may not be enough. Unfortunately I don't have any resource to back this up. In case anyone can edit this answer to provide proof, please feel free.
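If you also want the rescale to animate, as asked, the usual D3 pattern is to set the new domain and then transition the axis and each path. A hedged sketch, assuming the Plunker already defines a yAxis generator, a line generator, and ".y.axis" / ".line" selections (adjust the selectors to your markup):
// Sketch: selectors and generator names are assumptions about the Plunker code.
y.domain(newDomain);

svg.select(".y.axis")
    .transition().duration(500)
    .call(yAxis);                          // re-render the axis against the new domain

svg.selectAll(".line")
    .transition().duration(500)
    .attr("d", function(d) { return line(d.values); }); // redraw each machine's path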

Cesium large number of entity updates

I am working on a project dealing with sensor data. In my backend everything is stored in a database which is polled by a controller and converted into KML to display on the Cesium globe. This poll happens every 5-10 seconds and contains around 4000-8000 objects (we store up to 1 minute's worth of data, so we are looking at somewhere around 20k-50k points). Following this, I have an update function that runs every 5 seconds and slowly fades the markers out.
To load the kml on the map I use the following function:
var dataSource = new Cesium.KmlDataSource();
dataSource.load('link').then(function(value) {
viewer.dataSources.add(dataSource);
});
In the color update function I am iterating over all of the objects within the data source's entity collection and updating them like so (this is very inefficient):
var colorUpdate = Cesium.Color.fromAlpha(newColor, .4);
dataSource.entities.values[i].billboard.color = colorUpdate;
When I do an add or a color update I see a large amount of lag, and I was curious whether there is anything you would suggest to fix this? Generally I get a freeze-up for a few seconds. After the data has been on the map for 60 seconds it gets removed like so (just a different if case within the color update loop):
dataSource.entities.remove(dataSource.entities.values[i]);
Is there potentially a way to set a property for an entire entity collection, so that when the collection becomes 30 seconds old it updates the color to a new one? It seems that I just need to find a way to set a property for the entire collection vs. individual entities. Does anyone know how to do that, or have a suggestion for something better?
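No answer is recorded here, but as a hedged sketch of the update loop described above, one common mitigation is to wrap the per-entity changes in the collection's suspendEvents()/resumeEvents() so the changes are batched into a single notification instead of firing one per entity:
// Sketch only: `dataSource`, `newColor` and the isOlderThan() age check are
// assumptions based on the question, not a verified fix.
var entities = dataSource.entities;
var faded = Cesium.Color.fromAlpha(newColor, 0.4);

entities.suspendEvents();                        // batch change notifications for the whole sweep
var values = entities.values;
for (var i = values.length - 1; i >= 0; i--) {   // iterate backwards because remove() mutates the array
    var entity = values[i];
    if (isOlderThan(entity, 60)) {
        entities.remove(entity);                 // drop points older than 60 s
    } else if (isOlderThan(entity, 30)) {
        entity.billboard.color = faded;          // fade points older than 30 s
    }
}
entities.resumeEvents();                         // fire a single collectionChanged event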

Big data amounts with Highcharts / Highstock (async loading)

Since my data volume grows every day (right now > 200k MySQL rows in one week), the chart is very slow to load. I guess the async loading method is the right way to go here (http://www.highcharts.com/stock/demo/lazy-loading).
I tried to implement it, but it doesn't work. So far I can provide my data with Python via URL parameters, e.g. http://www.url.de/data?start=1482848100&end=1483107000, but there are several things I don't understand in the example code:
If a period of All Data is chosen in the Navigator, then all data is provided by my server and loaded by the chart. So it's the same as what I do right now without lazy loading. What's the difference then?
Why is there a second getJSON() call without any URL parameters in the above-mentioned example code? It's the following URL, which is empty. What do I need it for? I don't understand it:
https://www.highcharts.com/samples/data/from-sql.php?callback=?
And which method of loading the data is better? This one:
chart.series[0].setData(data);
or the code below, which I use so far:
var ohlc = [],
    volume = [],
    dataLength = data.length,
    i = 0;

for (i; i < dataLength; i += 1) {
    ohlc.push([
        data[i]['0'],   // date
        data[i]['1_x'], // open
        data[i]['2_x'], // high
        data[i]['3'],   // low
        data[i]['4']    // close
    ]);
}
The idea behind the lazy-loading demo is that you fetch only as many points as are necessary, so if your data set includes 1.7 million points, you never load that many points into the chart.
This is based on the Highcharts demo. Instead of loading too many points, you request points that are already grouped: you have 1.7 million daily points, you set the navigator to 'All' (time range 1998 - 2011), and since you don't need daily data at that zoom level, the response includes monthly points. The gains are: fetching a smaller amount of data (12 * 14 = 168 points instead of 1.7 million) and avoiding heavy data processing on the client side (parsing, grouping, etc.), which means lower memory and CPU usage for the client and faster chart loading.
The request for the data is in JSONP format. More information about its advantages here. So actually the URL has three params: a mandatory callback=? and optional start=?&stop=?, which indicate the points' time range and density. The first request does not have start/stop params because the server has default values already set. After the navigator is moved, more detailed points are requested and loaded into the chart. This is also the downside of lazy loading: every time the navigator is moved you request a new set of data, so you get frequent data requests and possible interruptions due to network failures.
The answer to your last question depends on whether your data is already in the proper format. If it is, you can avoid looping over the data on the client side and load it into the chart directly. If the format is not correct, then you have to preprocess the data so the chart can visualize it correctly. Ideally, you want the data to be in the right format as soon as you receive it, so if you can, do the formatting on the server side.
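As a hedged sketch of that request flow (adapted from the Highcharts lazy-loading demo; 'container', initialData and initialOverviewData are placeholders, and it assumes your Python endpoint returns arrays that setData() can take directly), the chart only asks the server for the window the navigator currently shows:
// Sketch only: adjust the container id, the initial data variables and the URL
// to your setup; e.min/e.max are in milliseconds, your API expects seconds.
function afterSetExtremes(e) {
    var chart = e.target.chart;
    chart.showLoading('Loading data from server...');
    $.getJSON('http://www.url.de/data?start=' + Math.round(e.min / 1000) +
              '&end=' + Math.round(e.max / 1000), function (data) {
        chart.series[0].setData(data);
        chart.hideLoading();
    });
}

Highcharts.stockChart('container', {
    navigator: {
        adaptToUpdatedData: false,            // keep the overview series fixed
        series: { data: initialOverviewData } // coarse, pre-grouped data for the full range
    },
    xAxis: {
        events: { afterSetExtremes: afterSetExtremes },
        minRange: 3600 * 1000                 // don't allow zooming below one hour
    },
    series: [{
        data: initialData,
        dataGrouping: { enabled: false }      // the server already returns grouped points
    }]
});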

How do I decrease the governance cost of the following code

In NetSuite there is a limit on how frequently you can use certain APIs (as well as certain scripts). For what I am doing, I believe the following are the applicable costs:
nlapiLoadSearch: 5
nlobjSearchResultSet.getSearch(): 10
It takes about an hour, but every time, my script (which follows) errors out, probably because of this. How do I change it to reduce its governance cost?
function walkCat2(catId, pad){
    var loadCategory = nlapiLoadRecord("sitecategory", "14958149");
    var dupRecords = nlapiLoadSearch('Item', '1951'); // load saved search
    var resultSet = dupRecords.runSearch();           // run saved search
    resultSet.forEachResult(function(searchResult)
    {
        var InterID = (searchResult.getValue('InternalID')); // process search
        var LINEINX = loadCategory.getLineItemCount('presentationitem');
        loadCategory.insertLineItem("presentationitem", LINEINX);
        loadCategory.setLineItemValue("presentationitem", "presentationitem", LINEINX, InterID+'INVTITEM'); //--- Sets the line value. -jf
        nlapiSubmitRecord(loadCategory, true);
        return true; // return true to keep iterating
    });
}
nlapiLoadRecord uses 5 units, nlapiLoadSearch uses 5, then actually it is resultSet.forEachResult that uses another 10. On top of that, you are running nlapiSubmitRecord for each search result, which will use 10 more units for each result.
It looks to me like all you are doing with your search results is adding line items to the Category record. You do not need to submit the record until you are completely done adding all the lines. Right now, you are submitting the record after every line you add.
Move the nlapiSubmitRecord after your forEachResult call. This will reduce your governance (and especially your execution time) from 10 units per search result to just 10 units.
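In other words, a sketch of the suggested restructuring (same hard-coded record and search IDs as the question; only the submit has moved):
function walkCat2(catId, pad) {
    var loadCategory = nlapiLoadRecord("sitecategory", "14958149"); // 5 units
    var dupRecords = nlapiLoadSearch('Item', '1951');               // 5 units
    var resultSet = dupRecords.runSearch();

    resultSet.forEachResult(function (searchResult) {               // 10 units for the whole iteration
        var InterID = searchResult.getValue('InternalID');
        var LINEINX = loadCategory.getLineItemCount('presentationitem');
        loadCategory.insertLineItem("presentationitem", LINEINX);
        loadCategory.setLineItemValue("presentationitem", "presentationitem", LINEINX, InterID + 'INVTITEM');
        return true; // keep iterating; no submit inside the loop
    });

    nlapiSubmitRecord(loadCategory, true); // 10 units, once, after all lines are added
}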
Different APIs have different costs associated with them [see SuiteAnswers ID 10365]. Also, different types of scripts (user, scheduled, etc.) have different maximums for the total usage limit [see SuiteAnswers ID 10481].
Your script should consume less than that limit, or else NetSuite will throw an error.
You can use the following line of code to measure your remaining usage at different points in your code.
nlapiLogExecution('AUDIT', 'Script Usage', 'RemainingUsage:'+nlapiGetContext().getRemainingUsage());
One strategy to avoid the maximum-usage-exceeded exception is to change the script type to "scheduled script", since that has the highest limit. Given that your loop is working off a search, the result set could be huge, and that may cause even a scheduled script to exceed its limits. In such a case, you would want to introduce checkpoints in your code and make it reentrant, as sketched below. That way, whenever nlapiGetContext().getRemainingUsage() drops below your threshold, you offload the remaining work to a subsequent scheduled script.
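A minimal sketch of that checkpoint idea in a SuiteScript 1.0 scheduled script; the 100-unit threshold and the getResultsToProcess()/processOneResult() helpers are placeholders, not NetSuite APIs:
function scheduled(type) {
    var results = getResultsToProcess(); // placeholder: however you gather the remaining work
    for (var i = 0; i < results.length; i++) {
        processOneResult(results[i]);    // placeholder: the per-result work

        // Checkpoint: when usage runs low, save state and yield; NetSuite
        // reschedules the script and it resumes from the recovery point.
        if (nlapiGetContext().getRemainingUsage() < 100) {
            nlapiLogExecution('AUDIT', 'Script Usage',
                'RemainingUsage: ' + nlapiGetContext().getRemainingUsage());
            nlapiSetRecoveryPoint();
            nlapiYieldScript();
        }
    }
}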
