Dynamic SQL using a Snowflake stored procedure - JavaScript

I want to be able to update rows of data in a set list of tables for certain columns within those tables.
Basically something like this:
TABLE_NAME   COL1
TABLE1       NAME
TABLE2       ADDRESS

SELECT * FROM TABLE1;
-- returns: Aidan
SELECT * FROM TABLE2;
-- returns: Ireland
So I'm looking for something like the query below, wrapped in a stored procedure, that would gather all the tables and columns to be altered and update the records accordingly:
update $TABLE_NAME set $COL1 ='*' where ID in (select ID FROM EXTERNAL_TABLE)
Any help greatly appreciated.
I tried messing around with this sample but got nowhere when I tried to build on it:
CREATE OR REPLACE PROCEDURE DATA_MASKING("P_TABLE_NAME" VARCHAR(16777216))
RETURNS VARCHAR(16777216)
LANGUAGE JAVASCRIPT
EXECUTE AS OWNER
AS '
    // Define query string
    var sql_command = "UPDATE " + P_TABLE_NAME + " SET IND_ADDR1 = NULL;";
    // Prepare SQL statement
    var stmt = snowflake.createStatement({sqlText: sql_command});
    // Execute SQL statement
    var rs = stmt.execute();
    return ''Table updated successfully.'';
';
CALL DATA_MASKING('TEST_TABLE');

The procedure you had given works. It just needs a few amendments, which you must have been able to make, so I'm not sure what exactly the question is.
CREATE OR REPLACE PROCEDURE DATA_MASKING("P_TABLE_NAME" VARCHAR(16777216), "P_COLUMN_NAME" VARCHAR(16777216))
RETURNS VARCHAR(16777216)
LANGUAGE JAVASCRIPT
EXECUTE AS OWNER
AS '
    // Define query string
    var sql_command = "UPDATE " + P_TABLE_NAME + " SET " + P_COLUMN_NAME + " = ''x'';";
    // Prepare SQL statement
    var stmt = snowflake.createStatement({sqlText: sql_command});
    // Execute SQL statement
    var rs = stmt.execute();
    return ''successful'';
';
CALL DATA_MASKING('SCHEMA.TABLE_NAME','COLUMN_NAME');
SELECT * FROM SCHEMA.TABLE_NAME;
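To drive the updates from a config table like the one in the question (TABLE_NAME / COL1 pairs), the same pattern can be extended to loop over that table. The following is only a sketch: MASK_CONFIG is a hypothetical name for the config table, and it assumes every target table has an ID column that matches EXTERNAL_TABLE, as in the pseudo-query above. Adjust the names to your schema.

CREATE OR REPLACE PROCEDURE DATA_MASKING_ALL()
RETURNS VARCHAR
LANGUAGE JAVASCRIPT
EXECUTE AS OWNER
AS
$$
    // MASK_CONFIG is a hypothetical config table holding TABLE_NAME / COL1 pairs
    var cfg = snowflake.createStatement({
        sqlText: "SELECT TABLE_NAME, COL1 FROM MASK_CONFIG"
    }).execute();

    var masked = 0;
    while (cfg.next()) {
        // Build and run one UPDATE per configured table/column
        var sql_command = "UPDATE " + cfg.getColumnValue("TABLE_NAME") +
                          " SET " + cfg.getColumnValue("COL1") + " = '*'" +
                          " WHERE ID IN (SELECT ID FROM EXTERNAL_TABLE)";
        snowflake.createStatement({sqlText: sql_command}).execute();
        masked = masked + 1;
    }
    return masked + " table(s) updated";
$$;

CALL DATA_MASKING_ALL();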

Related

Snowflake: how to optimize a JavaScript procedure that loops over an array and inserts into a satellite table, which is slowing down the procedure?

We have tabular forms (mainly CSV files) coming into the warehouse. Our architecture is based on adding each field/value as a single row into a satellite table in Snowflake.
After running all the necessary MERGE INTO queries (to add hash keys, load dates, record sources, and other metadata about the incoming data into several tables, and to make sure existing records are not added again), we loop over the fields extracted from the file into an array. Each field/value is then added to the table if its parent key does not exist in the parent hub table.
Here is the loop script:
for (var col_num = 0; col_num < odk_fields.length; col_num = col_num + 1) {
    var COL_NAME = odk_fields[col_num];
    var TABLE_COL_NAME = table_fields[col_num];
    var obs_field_query = "MERGE INTO LINK_OBSERVATION_FIELD AS LOBSF " +
        "USING (SELECT T.OBSERVATION_DATE, T.CAMPNO, T._SUBMISSION_TIME FROM " + TEMP_TABLE_NAME + " T) ST " +
        "ON (MD5(CONCAT_WS('', ?, DATE(ST.OBSERVATION_DATE, 'DD/MM/YYYY'), 'CAMP', CONCAT(ST.CAMPNO, ST._SUBMISSION_TIME))) = LOBSF.OBSERVATION_DATE_LOCATION_HASH_KEY) " +
        "AND MD5(?) = LOBSF.FIELD_NAME_HASH_KEY " +
        "WHEN NOT MATCHED THEN " +
        "INSERT (FIELD_NAME_OBSERVATION_HASH_KEY, LOAD_DT, RECORD_SRC, OBSERVATION_DATE_LOCATION_HASH_KEY, FIELD_NAME_HASH_KEY) " +
        "VALUES (MD5(CONCAT_WS('', ?, DATE(ST.OBSERVATION_DATE, 'DD/MM/YYYY'), 'CAMP', ST.CAMPNO)), CURRENT_TIMESTAMP(), 'ONA', " +
        "md5(CONCAT_WS('', DATE(ST.OBSERVATION_DATE, 'DD/MM/YYYY'), 'CAMP', ST.CAMPNO, ST._SUBMISSION_TIME)), md5(?)) ";
    var obs_field_query_stmt = snowflake.createStatement({sqlText: obs_field_query, binds: [COL_NAME, COL_NAME, COL_NAME, COL_NAME]});
    var obs_field_rs = obs_field_query_stmt.execute();
    obs_field_rs.next();
    if (obs_field_rs) {
        var field_value_query = "INSERT INTO SAT_FIELD_VALUE " +
            "SELECT md5(md5(CONCAT_WS('', ?, DATE(OBSERVATION_DATE, 'DD/MM/YYYY'), 'CAMP', CAMPNO))), " +
            "CURRENT_TIMESTAMP(), NULL, 'ONA', " + TABLE_COL_NAME + ", md5(CONCAT_WS('', ?, DATE(OBSERVATION_DATE, 'DD/MM/YYYY'), 'CAMP', CAMPNO)) " +
            "FROM " + TEMP_TABLE_NAME;
        var field_value_query_stmt = snowflake.createStatement({sqlText: field_value_query, binds: [COL_NAME, COL_NAME]});
        var field_value_rs = field_value_query_stmt.execute();
        field_value_rs.next();
        sat_field_val_log += field_value_rs['number of rows inserted'];
    }
}
We noticed that all the MERGE INTO commands execute noticeably fast, but inside the loop the INSERT INTO query takes a long time to execute.
For a file with 3 rows and 100 fields each, it takes about 5 minutes to load into Snowflake.
Any idea how to optimize the script so it uploads the data more efficiently?
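One way to cut the per-field round trips would be to build a single INSERT ... SELECT with UNION ALL over all the columns, so SAT_FIELD_VALUE is populated in one statement instead of one statement per field. This is only a sketch under the assumption that the column names in odk_fields are trusted and can be inlined as string literals rather than binds; the table and column names are taken from the question.

var selects = [];
for (var col_num = 0; col_num < odk_fields.length; col_num = col_num + 1) {
    var COL_NAME = odk_fields[col_num];
    var TABLE_COL_NAME = table_fields[col_num];
    // One SELECT per field; they are concatenated with UNION ALL below
    selects.push(
        "SELECT md5(md5(CONCAT_WS('', '" + COL_NAME + "', DATE(OBSERVATION_DATE, 'DD/MM/YYYY'), 'CAMP', CAMPNO))), " +
        "CURRENT_TIMESTAMP(), NULL, 'ONA', " + TABLE_COL_NAME + ", " +
        "md5(CONCAT_WS('', '" + COL_NAME + "', DATE(OBSERVATION_DATE, 'DD/MM/YYYY'), 'CAMP', CAMPNO)) " +
        "FROM " + TEMP_TABLE_NAME);
}
// A single INSERT that Snowflake compiles and runs once, instead of 100 separate statements
var batched_insert = "INSERT INTO SAT_FIELD_VALUE " + selects.join(" UNION ALL ");
snowflake.createStatement({sqlText: batched_insert}).execute();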

JavaScript function in Snowflake to append the current date to a table name

I have recently started to use Snowflake and have been stuck on this issue:
I want to clone a table called AB_USER to AB_USER_(current_date). I have written the following code to accomplish this:
CREATE OR REPLACE PROCEDURE backup_proc()
RETURNS VARCHAR
LANGUAGE javascript
AS
$$
    var tab_name = `AB_USER_BCK_2020_` + current_date();
    var stat = `create or replace table staging.` + tab_name + ` clone staging.AB_USER`;
    var rs = snowflake.execute( { sqlText: stat } );
    return 'Done.';
$$;
The problem is that I cannot find an appropriate function to get the current date. Snowflake provides a JS environment, but I don't know which function to use.
I am very new to Snowflake, so any help with this will be much appreciated.
Thanks.
CURRENT_DATE is a SQL function, so you need to call it in a SQL statement via snowflake.execute.
As I see it, you want the month and day values from the current date, so you can use the following procedure:
CREATE OR REPLACE PROCEDURE backup_proc()
RETURNS VARCHAR
LANGUAGE javascript
AS
$$
    var curdate = snowflake.execute( { sqlText: "SELECT TO_CHAR(CURRENT_DATE,'MMDD') AS curdate" } );
    curdate.next();
    var tab_name = "AB_USER_BCK_2020_" + curdate.getColumnValue('CURDATE');
    var stat = "create or replace table staging." + tab_name + " clone staging.AB_USER";
    var rs = snowflake.execute( { sqlText: stat } );
    return "Done.";
$$;
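Alternatively, the date can be formatted inside the JavaScript procedure itself, without the round trip to SQL. A small sketch of the first lines of the procedure body; note that new Date() uses the JS engine's clock, which may differ from the session time zone that CURRENT_DATE uses:

    var d = new Date();
    // Zero-pad month (0-based in JS) and day to get MMDD
    var mmdd = ("0" + (d.getMonth() + 1)).slice(-2) + ("0" + d.getDate()).slice(-2);
    var tab_name = "AB_USER_BCK_2020_" + mmdd;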

Retrieve the auto-incremented ID of the first INSERT for use in a second INSERT [duplicate]

This question already has answers here:
How to get the insert ID in JDBC?
(14 answers)
Closed 4 years ago.
I'm using JDBC and Google Apps Script.
I need to insert rows created via an HTML interface; the only problem is that each row contains two inputs that are stored in different tables of my MySQL database.
My second table has a foreign key that references the first table (product_id references id from the product table).
Here are my tables:
    product: id, name
    product_quantity: id, product_id, quantity
Here is what I would like to achieve in pseudo-code:
function update(row){
    var insertProductSQL = "INSERT INTO products (name) VALUES (" + row.name + ")";
    var insertProductQuantity = "INSERT INTO products_quantity (product_id, quantity) VALUES (" + /* HERE IS THE PROBLEM */ + ", " + row.quantity + ")";
    var conn = getConnection(); // JDBC connection object
    var stmt = conn.createStatement();
    stmt.executeQuery(insertProductSQL);
    stmt = conn.createStatement();
    stmt.executeQuery(insertProductQuantity);
    conn.close();
}
So the problem I'm facing is that I don't know whether SQL or JDBC gives a simple way to retrieve the auto-incremented id created by the first insert so I can use it in my second insert.
Help is greatly appreciated; don't hesitate to tell me if anything is unclear.
Use the LAST_INSERT_ID() function.
insertProductQuantity = "INSERT INTO products_quantity (product_id, quantity) VALUES (LAST_INSERT_ID(), " + row.quantity + ")"
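Roughly, the update() function from the question could then look like the sketch below. It reuses the hypothetical getConnection() helper and the product / product_quantity tables from the question. Note that LAST_INSERT_ID() is tracked per connection, so both statements must run on the same connection, and executeUpdate (not executeQuery) is the right call for INSERTs.

function update(row) {
    var conn = getConnection(); // JDBC connection object, as in the question
    var stmt = conn.createStatement();
    // First insert generates the auto-increment id
    stmt.executeUpdate("INSERT INTO product (name) VALUES ('" + row.name + "')");
    // LAST_INSERT_ID() still refers to the id generated above on this connection
    stmt.executeUpdate(
        "INSERT INTO product_quantity (product_id, quantity) " +
        "VALUES (LAST_INSERT_ID(), " + row.quantity + ")");
    stmt.close();
    conn.close();
}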

Array in SQL statement not working with placeholder - Node.js

The SQL query does not accept an array as a value when it is used with a placeholder; it only returns one result even though there should be more than one. Without a placeholder and without escaping, it works perfectly and returns the right number of results.
// works
SELECT * FROM users WHERE userId IN (" + followerIds.join() + ");";
// does not work
SELECT * FROM users WHERE userId IN (?);";
con.query(queryFollowerstTable, [followerIds.join()], function(err, result) ...
All I had to do was parse the result of followerIds.join() back into an array of numbers, and it worked:
followerIdsParsed = followerIds.join().split(',').map(Number);
followingIdsParsed = followingIds.join().split(',').map(Number);
var queryFollowerstTable = "SELECT * FROM users WHERE userId IN (?); SELECT * FROM users WHERE userId IN (?);";
con.query(queryFollowerstTable, [followerIdsParsed, followingIdsParsed], function(err, result) {..
Change
con.query(queryFollowerstTable, [followerIds.join()], function(err, result)
to
con.query(queryFollowerstTable, followerIds.join(), function(err, result)
In your original example:
SELECT * FROM users WHERE userId IN (" + followerIds.join() + ");";
You are passing in a string, not an array.
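For reference, the node mysql driver expands an array bound to a single ? into a comma-separated list, so passing the array itself (nested inside the values array) is enough. A minimal sketch, with illustrative connection settings:

var mysql = require('mysql');
var con = mysql.createConnection({ host: 'localhost', user: 'root', database: 'test' }); // illustrative credentials

var followerIds = [1, 2, 3]; // already numbers, no join() needed
con.query('SELECT * FROM users WHERE userId IN (?)', [followerIds], function (err, result) {
    if (err) throw err;
    console.log(result.length); // one row per matching id, not just one
});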

WebSQL: Error processing SQL: number of '?'s in statement string does not match argument count

I want to create a dynamic function to INSERT data into the WebSQL database. I cannot use IndexedDB because Zetakey does not support it.
tx.executeSql("INSERT INTO " + table + "(" + formatfields + ")
VALUES (" + formatqm + ")",
[formatvalues],
webdb.onSuccess,
webdb.onError);
I pass the following to the query:
formatfields = "one, two"; (up to an undefined number of fields)
formatqm = "?, ?";
formatvalues = "123, 456"; (dynamic user entries for x fields)
Can someone tell me what I have to do with formatvalues? When I write 123, 456 directly, it works fine.
Thanks in advance!
Instead of dynamically creating or changing table column fields, use JSON serialization of the record. Basically, store the stringified user-given object data on INSERT and parse it on retrieval. If you need to query over a column, initialize only those columns. It will work just like IndexedDB does.
// build formatvalues with array.push
var formatvalues = [];
formatvalues.push("123");
and so on!
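Putting that together, the key point is that the second argument to executeSql must be an array with one element per ?, not a single comma-joined string. A sketch, assuming it runs inside the same transaction callback as in the question and that webdb.onSuccess / webdb.onError are the existing handlers (the table and field names here are illustrative):

var table = "entries";                      // illustrative table name
var fields = ["one", "two"];                // dynamic field list
var formatfields = fields.join(", ");
var formatqm = fields.map(function () { return "?"; }).join(", ");
var formatvalues = [123, 456];              // one array element per ?

tx.executeSql("INSERT INTO " + table + " (" + formatfields + ") VALUES (" + formatqm + ")",
    formatvalues,                           // the array itself, not a "123, 456" string
    webdb.onSuccess,
    webdb.onError);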
