I have a drag & drop event and I would like to hash the file being dragged. I have this:
var file = ev.dataTransfer.items[i].getAsFile();
var hashf = CryptoJS.SHA512(file).toString();
console.log("hashf", hashf)
But when I drag different files, "hashf" is always the same string.
https://jsfiddle.net/9rfvnbza/1/
The issue is that you are attempting to hash the File object. Hash algorithms expect a string to hash.
When passing the File object to the CryptoJS.SHA512() method, the API attempts to convert the object to a string. That conversion results in CryptoJS.SHA512() receiving the same string no matter what File object you provide it.
The string is [object File] - you can replace file in your code with that string and discover it is the same hash you've seen all along.
To fix this, retrieve the text from the file first and pass that to the hashing algorithm:
file.text().then((text) => {
  const hashf = CryptoJS.SHA512(text).toString();
  console.log("hashf", hashf);
});
If you prefer async/await, you can put it in an IIFE:
(async () => {
  const text = await file.text();
  const hashf = CryptoJS.SHA512(text).toString();
  console.log("hashf", hashf);
})();
Related
I am trying to use the Spotify PKCE authorization with Siri Shortcuts. Unfortunately, none of the solutions I have found have been applicable to my specific situation. I have this bit of code
And I really have no idea what I am doing. Basically, I need a SHA-256 hash of a string of characters, but it needs to be the raw bytes rather than the hex string. That result then needs to be base64url encoded. I've tried most of the solutions on Stack Overflow, but I can't seem to output the final product onto a webpage, which is the main way I am able to run JavaScript natively on iPhone. Any help would be greatly appreciated.
<script>
  function sha256(plain) {
    // returns promise ArrayBuffer
    const encoder = new TextEncoder();
    const data = encoder.encode(plain);
    return window.crypto.subtle.digest('SHA-256', data);
  }

  function base64urlencode(a) {
    // Convert the ArrayBuffer to string using Uint8 array.
    // btoa takes chars from 0-255 and base64 encodes.
    // Then convert the base64 encoded to base64url encoded.
    // (replace + with -, replace / with _, trim trailing =)
    return btoa(String.fromCharCode.apply(null, new Uint8Array(a)))
      .replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
  }

  async function pkce_challenge_from_verifier(v) {
    hashed = await sha256(v);
    base64encoded = base64urlencode(hashed);
    return base64encoded;
  }

  const code = await pkce_challenge_from_verifier("Zg6klgrnixQJ629GsawRMV8MjWvwRAr-vyvP1MHnB6X8WKZN")
  document.getElementById("value").innerHTML = code;
</script>
<body>
  <p id="value"></p>
</body>
</html>
await is not allowed at the global level, i.e. you can use await only inside async functions.
So how to solve it? Use a promise instead of await, like this:
pkce_challenge_from_verifier("Zg6klgrnixQJ629GsawRMV8MjWvwRAr-vyvP1MHnB6X8WKZN")
.then((code) => document.getElementById("value").innerHTML = code)
.catch((error) => console.error(error))
Also, place the <script> ... </script> after the closing body tag; otherwise the JS can't find the p element.
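Putting both suggestions together, the corrected page could look roughly like this sketch (same verifier string and element id as above):
<body>
  <p id="value"></p>
</body>
<script>
  function sha256(plain) {
    const encoder = new TextEncoder();
    return window.crypto.subtle.digest('SHA-256', encoder.encode(plain));
  }

  function base64urlencode(a) {
    return btoa(String.fromCharCode.apply(null, new Uint8Array(a)))
      .replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
  }

  async function pkce_challenge_from_verifier(v) {
    return base64urlencode(await sha256(v));
  }

  pkce_challenge_from_verifier("Zg6klgrnixQJ629GsawRMV8MjWvwRAr-vyvP1MHnB6X8WKZN")
    .then((code) => document.getElementById("value").innerHTML = code)
    .catch((error) => console.error(error));
</script>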
So i am dropping a .txt file in an uploader which is converting it into base64 data like this:
const { getRootProps, getInputProps } = useDropzone({
  onDrop: async acceptedFiles => {
    // it's not actually a font, I'm just reusing some code and will change it later;
    // it's a .txt file, so wherever you see "font", assume it's NOT a font.
    let font = '';
    let reader = new FileReader();
    let filename = acceptedFiles[0].name.split(".")[0];
    console.log(filename);
    reader.readAsDataURL(acceptedFiles[0]);
    reader.onload = await function () {
      font = reader.result;
      console.log(font);
      dispatch({ type: 'SET_FILES', payload: font });
    };
    setFontSet(true);
  }
});
Then a POST request is made to the Node.js server and I indeed receive the base64 value. I then proceed to convert it back into a .txt file by writing it into a file called signals.txt like this:
server.post('/putInDB', (req, res) => {
  console.log(req.body);
  var bitmap = new Buffer(req.body.data, 'base64');
  let dirpath = `${process.cwd()}/signals.txt`;
  let signalPath = path.normalize(dirpath);
  connection.connect();
  fs.writeFile(signalPath, bitmap, async (err) => {
    if (err) throw err;
    console.log('Successfully updated the file data');
    // ...all the ending brackets and stuff
Now the thing is, the original file looks like this:
Time,1,2,3,4,5,6,7,8,9,10,11,12
0.000000,7.250553,14.951141,5.550423,2.850217,-1.050080,-3.050233,1.850141,2.850217,-3.150240,1.350103,-2.950225,1.150088
But the file, when written back from base64, looks like this:
u«Zµìmþ™ZŠvÚ±î¸Time,1,2,3,4,5,6,7,8,9,10,11,12
0.000000,1.250095,0.250019,-4.150317,-0.350027,3.650278,1.950149,0.950072,-1.250095,-1.150088,-7.750591,-1.850141,-0.050004
See the weird characters at the beginning? Why is this happening?
Remember to read up on what the functions you use actually do: readAsDataURL does not give you the base64-encoded version of your data. It gives you a Data-URL, and Data-URLs have a header prefix that tells URL parsers what kind of data follows and how to decode the data directly after the header.
To quote the MDN article:
Note: The blob's result cannot be directly decoded as Base64 without first removing the Data-URL declaration preceding the Base64-encoded data. To retrieve only the Base64 encoded string, first remove data:*/*;base64, from the result.
If you don't, blindly converting the Data-URL from base64 to plain text will give you some nonsense data at the start:
> Buffer.from('data:*/*;base64', 'base64').toString('utf-8')
'u�Z���{�'
Which raises another point: you would have caught this with POST data validation, because the Data-URL that you sent contains characters that are not allowed in base64. POST validation is always a good idea.
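For example, a sketch of stripping the header on the Node side before decoding (you could equally strip it client-side before sending):
// req.body.data is a Data-URL such as "data:text/plain;base64,VGltZSwxLDIs..."
// keep only the part after the first comma, then decode that as base64
const base64Data = req.body.data.split(',')[1];
const bitmap = Buffer.from(base64Data, 'base64');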
I know this isn't the exact code, and it is difficult to reproduce your problem with the code you provided, but the data you are sending needs to be in URL/URI-encoded form.
So essentially:
encodeURI(base64data);
encodeURI is built into JavaScript: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/encodeURI
EDIT:
I saw you used the function readAsDataURL(), but try using the encodeURI function and then readAsDataURL().
I am relatively new to JavaScript and I want to get the hash of a file, and would like to better understand the mechanism and code behind the process.
So, what I need: An MD5 or SHA-256 hash of an uploaded file to my website.
My understanding of how this works: A file is uploaded via an HTML input tag of type 'file', after which it is converted to a binary string, which is consequently hashed.
What I have so far: I have managed to get the hash of an input of type 'text', and also, somehow, the hash of an uploaded file, although the hash did not match the ones produced by websites I checked online, so I'm guessing it hashed some other details of the file instead of the binary string.
Question 1: Am I correct in my understanding of how a file is hashed? Meaning, is it the binary string that gets hashed?
Question 2: What should my code look like to upload a file, hash it, and display the output?
Thank you in advance.
Basically yes, that's how it works.
But to generate such a hash, you don't need to do the conversion to a string yourself. Instead, let the SubtleCrypto API handle it, and just pass it an ArrayBuffer of your file.
async function getHash(blob, algo = "SHA-256") {
  // convert your Blob to an ArrayBuffer
  // (could also use a FileReader for this, for browsers that don't support the Response API)
  const buf = await new Response(blob).arrayBuffer();
  const hash = await crypto.subtle.digest(algo, buf);
  let result = '';
  const view = new DataView(hash);
  for (let i = 0; i < hash.byteLength; i += 4) {
    // each 32-bit word is 8 hex characters; pad so leading zeros aren't dropped
    result += view.getUint32(i).toString(16).padStart(8, '0');
  }
  return result;
}

inp.onchange = e => {
  getHash(inp.files[0]).then(console.log);
};
<input id="inp" type="file">
I am trying to concatenate a string and a Readable stream (the readable stream is pointing to a file which may have data in multiple chunks, i.e. the file may be large) into one writable stream so that the writable stream can finally be written to a destination.
I am encrypting the string and content of the file and then applying zlib compression on them, then finally I want to pipe them to the writable stream.
To achieve this, I can:
a) Convert the file content into a string, concatenate it with the other string, then encrypt, compress, and finally pipe the result into the writable stream. But this is not possible, because the file may be big, so I can't convert its content to a string.
b) First encrypt and compress the string, convert it into a stream, and pipe it into the writable stream; once that is completely done, pipe the file contents into the same writable stream.
To do so, I have written this:
var crypto = require('crypto'),
    algorithm = 'aes-256-ctr',
    password = 'd6FAjkdlfAjk';
var stream = require('stream');
var fs = require('fs');
var zlib = require('zlib');

// input file
var r = fs.createReadStream('file.txt');
// zip content
var zip = zlib.createGzip();
// encrypt content (iv and hexiv are assumed to be defined elsewhere)
var encrypt = crypto.createCipheriv(algorithm, password, iv);
var w = fs.createWriteStream('file.out');

// the string contents are being converted into a stream so that they can be piped
var Readable = stream.Readable;
var s = new Readable();
s._read = function noop() {};
s.push(hexiv + ':');
s.push(null);
s.pipe(zlib.createGzip()).pipe(w);

// start pipe when 'end' event is encountered
s.on('end', function () {
  r.pipe(zip).pipe(encrypt).pipe(w);
});
What I observe is:
Only the first pipe completes successfully, i.e. the string is written to file.out. The second pipe doesn't make any difference to the output destination. At first, I thought the reason might be the asynchronous behavior of pipe, so I am piping the file content only after the first piping has finished. But I still don't get the desired output.
I want to know why this is happening and the appropriate way of doing this.
Found the reason why the second piping to the writable stream w wasn't working.
When the first .pipe(w) finishes, the writable at w is also automatically closed.
So instead of using s.pipe(zlib.createGzip()).pipe(w); I should have used:
s.pipe(zlib.createGzip()).pipe(w, {end: false});
Here {end: false} is passed to pipe() as the second argument, stating that the writable should not be closed when the piping is done.
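Applied to the code above, the flow becomes something like this (the second pipe keeps the default end: true, so w is closed once the file data has been written):
// keep w open after the header string has been written
s.pipe(zlib.createGzip()).pipe(w, { end: false });

s.on('end', function () {
  // default { end: true } here, so w is closed when the file data is done
  r.pipe(zip).pipe(encrypt).pipe(w);
});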
I want to read and write to a file in a specific way.
An example file could be:
name1:100
name2:400
name3:7865786
...etc etc
What would be the best way to read this data in and store in, and eventually write it out?
I don't know which type of data structure to use; I'm still fairly new to JavaScript.
I want to be able to determine if any key,values are matching.
For example, if I were to add to the file, I could see that name1 is already in the file, and I just edit the value instead of adding a duplicate.
You can use localStorage as a temporary storage between reads and writes.
Though, you cannot actually read and write to a user's filesystem at will using client-side JavaScript. You can, however, ask the user to select a file to read, in the same way you can ask the user to save the data you push as a file.
localStorage allows you to store the data as key-value pairs, and it's easy to check whether an item exists or not. Optionally, simply use an object literal, which can basically do the same but only exists in memory. localStorage persists between sessions and navigation between pages.
// set some data
localStorage.setItem("key", "value");
// get some data
var data = localStorage.getItem("key");
// check if key exists, set if not (though, you can simply override the key as well)
if (!localStorage.getItem("key")) localStorage.setItem("key", "value");
The method getItem will always return null if the key doesn't exist.
But note that localStorage can only store strings. For binary data and/or large sizes, look into Indexed DB instead.
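For completeness, a minimal IndexedDB sketch (the database name "demo-db" and store name "entries" are just placeholders):
// open (and, on first run, create) a database with one object store
var openReq = indexedDB.open("demo-db", 1);
openReq.onupgradeneeded = function() {
  openReq.result.createObjectStore("entries"); // keys are supplied explicitly below
};
openReq.onsuccess = function() {
  var store = openReq.result.transaction("entries", "readwrite").objectStore("entries");
  store.put(100, "name1"); // value first, key second
  store.get("name1").onsuccess = function(e) {
    console.log(e.target.result); // 100
  };
};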
To read a file you have to request the user to select one (or several):
HTML:
<label>Select a file: <input type=file id=selFile></label>
JavaScript
document.getElementById("selFile").onchange = function() {
  var fileReader = new FileReader();
  fileReader.onload = function() {
    var txt = this.result;
    // now we have the selected file as text.
  };
  fileReader.readAsText(this.files[0]);
};
To save a file you can use File objects this way:
var file = new File([txt], "myFilename.txt", {type: "application/octet-stream"});
var blobUrl = (URL || webkitURL).createObjectURL(file);
window.location = blobUrl;
The reason for using octet-stream is to "force" the browser to show a save as dialog instead of it trying to show the file in the tab, which would happen if we used text/plain as type.
So, how do we get the data between these stages? Assuming you're using the key/value approach and text only, you can use JSON objects.
var file = JSON.stringify(localStorage);
Then send to user as File blob shown above.
To read, you will have to either manually parse the file format if the data exists in a particular format, or, if the data is the same as what you saved out, read in the file as shown above and then convert it from a string to an object:
var data = JSON.parse(txt); // continue in the function block above
Object.assign(localStorage, data); // merge data from object with localStorage
Note that you may have to delete items from the storage first. There is also the chance that other data has been stored there, so these are cases that need to be considered, but this is the basis of one approach.
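For instance, a small sketch of the clear-then-merge case (using the txt variable from the reader example above):
localStorage.clear(); // wipe any previous entries so the imported file fully replaces them
Object.assign(localStorage, JSON.parse(txt));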
Example
// due to security reasons, localStorage can't be used in stacksnippet,
// so we'll use an object instead
var test = {"myKey": "Hello there!"}; // localStorage.setItem("myKey", "Hello there!");

document.getElementById("selFile").onchange = function() {
  var fileReader = new FileReader();
  fileReader.onload = function() {
    var o = JSON.parse(this.result);
    //Object.assign(localStorage, o); // use this with localStorage
    alert("done, myKey=" + o["myKey"]); // o[] -> localStorage.getItem("myKey")
  };
  fileReader.readAsText(this.files[0]);
};

document.querySelector("button").onclick = function() {
  var json = JSON.stringify(test); // test -> localStorage
  var file = new File([json], "myFilename.txt", {type: "application/octet-stream"});
  var blobUrl = (URL || webkitURL).createObjectURL(file);
  window.location = blobUrl;
};
Save first: <button>Save file</button> (<code>"myKey" = "Hello there!"</code>)<br><br>
Then read the saved file back in:<br>
<label>Select a file: <input type=file id=selFile></label>
Are you using Node.js or browser JavaScript?
In either case, the structure you should use is JS's standard object. Then you can turn it into JSON like this:
var dataJSON = JSON.stringify(yourDataObj)
With Node.js, you'll want to require the fs module and use one of the writeFile or appendFile functions. Here's sample code:
const fs = require('fs');
fs.writeFileSync('my/file/path', dataJSON);
With browser JS, this Stack Overflow question may help you: Javascript: Create and save file
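To get the "update the value if the key already exists" behaviour from your question, a minimal Node.js sketch could look like this (it assumes the data is kept as JSON in a file called data.json; the name is just a placeholder):
const fs = require('fs');
const filePath = 'data.json';

// read the existing data, or start with an empty object if the file doesn't exist yet
let data = {};
if (fs.existsSync(filePath)) {
  data = JSON.parse(fs.readFileSync(filePath, 'utf8'));
}

// assigning to a key that already exists simply overwrites its value, so no duplicates
data['name1'] = 100;

fs.writeFileSync(filePath, JSON.stringify(data));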
I know you want to write to a file, but consider a database instead so that you don't have to reinvent the wheel. INSERT ... ON DUPLICATE KEY UPDATE seems like the logical choice for what you're looking to do.
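With the mysql package, for illustration only (the table and column names are made up), that could look like:
const mysql = require('mysql');
const connection = mysql.createConnection({ host: 'localhost', user: 'user', password: 'pass', database: 'mydb' });

// assumes a table like: CREATE TABLE scores (name VARCHAR(64) PRIMARY KEY, value INT)
connection.query(
  'INSERT INTO scores (name, value) VALUES (?, ?) ON DUPLICATE KEY UPDATE value = ?',
  ['name1', 100, 100],
  (err) => {
    if (err) throw err;
    console.log('Row inserted or updated');
    connection.end();
  }
);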
For security reasons it's not possible to use JavaScript to write to a regular text or similar file on a client's system.
However, Asynchronous JavaScript and XML (AJAX) can be used to send an XMLHttpRequest to a file on the server, written in a server-side language like PHP or ASP.
The server-side file can then write to other files, or a database on the server.
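For example, a sketch of such a request (save.php is a hypothetical server-side endpoint that would do the actual writing):
// send the key/value data to a hypothetical server-side script that writes the file
var xhr = new XMLHttpRequest();
xhr.open("POST", "save.php");
xhr.setRequestHeader("Content-Type", "application/json");
xhr.onload = function() {
  console.log("Server responded:", xhr.status);
};
xhr.send(JSON.stringify({ name1: 100, name2: 400 }));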
Cookies are useful if you just need to save relatively small amounts of data locally on a client's system.
For more information have a look at
Read/write to file using jQuery