I have a code snippet in JavaScript that I'm trying to convert to Python:
var cipherAlgorithm = 'aes256';
var decipher = crypto.createDecipher(cipherAlgorithm, cipher);
password = decipher.update(encryptedpass, 'hex', 'utf8') + decipher.final('utf8');
I'm trying to rewrite that in Python using PyCrypto, and I'm getting the wrong values no matter what I do:
cyphertext = "faaafaaa"
cipher = AES.new(key, AES.MODE_CBC, "\0" * 16)
cipher.decrypt(cyphertext)
and it returns the wrong value (it doesn't decrypt correctly).
In the JavaScript code (I'm not a JS expert) I noticed that the output encoding is utf8,
so I tried something like this:
unicode(cipher.decrypt(cyphertext), "utf-8")
but I'm getting an error:
'utf8' codec can't decode byte 0x81 in position 6: invalid start byte
What's the solution?
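For reference, the usual culprit here is key derivation: Node's crypto.createDecipher does not use the password directly as the AES key; it derives a key and IV from it with OpenSSL's EVP_BytesToKey (MD5, one iteration, no salt), and 'aes256' means AES-256-CBC, so you need a 32-byte key and a 16-byte IV. The decrypted plaintext also carries PKCS#7 padding, which update()/final() strip for you on the Node side. Below is a minimal sketch of the equivalent in Python (written for Python 3 with PyCrypto/pycryptodome; the helper names evp_bytes_to_key and decrypt_like_node are just illustrative):

import hashlib
from Crypto.Cipher import AES

def evp_bytes_to_key(password, key_len=32, iv_len=16):
    # OpenSSL EVP_BytesToKey with MD5, one iteration, no salt (password must be bytes)
    d = b''
    block = b''
    while len(d) < key_len + iv_len:
        block = hashlib.md5(block + password).digest()
        d += block
    return d[:key_len], d[key_len:key_len + iv_len]

def decrypt_like_node(encrypted_hex, password):
    key, iv = evp_bytes_to_key(password)
    decryptor = AES.new(key, AES.MODE_CBC, iv)
    plaintext = decryptor.decrypt(bytes.fromhex(encrypted_hex))  # 'hex' input encoding
    plaintext = plaintext[:-plaintext[-1]]                       # strip PKCS#7 padding
    return plaintext.decode('utf-8')                             # 'utf8' output encoding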
I need to recode something from JS to C# that utilises the btoa method in JS on a string of Unicode chars to convert them to Base64. However, as far as I know, the encoding used by JavaScript is different to all those available in C#. I need the encoding to be exactly the same and not return different values across these languages. I have tried setting up a Node.js server and making GET requests to it, in order to run the encoding that way, but this is far too slow and unstable. I am under the impression I would need to make my own encoding table, but I have no idea where to start or how to implement this. If anyone could help that would be greatly appreciated.
tl;dr: JavaScript's btoa returns a different value than Base64 encoding in C# does. I need the values to be the same.
Code and outputs:
C#:
// note: JS String.fromCharCode truncates the fractional code points to 1 and 0
string fromChar = new string(new[] { (char)247, (char)71, (char)186, (char)8,
                                     (char)125, (char)72, (char)2, (char)1, (char)0 });
var plainTextBytes = System.Text.Encoding.UTF8.GetBytes(fromChar);
Console.WriteLine(System.Convert.ToBase64String(plainTextBytes));
output = w7dHwroIfUgCAQA=
JavaScript:
var x = btoa(String.fromCharCode(247, 71, 186, 8, 125, 72, 2, 1.0078125, 0.003936767578125));
console.log(x)
output = 90e6CH1IAgEA
I am aware the former example uses UTF-8 encoding and JS does not; the problem is there is no encoding in .NET that matches the JavaScript encoding.
Edit: I tried to compare the byte arrays from both C# and JavaScript, but the btoa function uses an unnamed encoding, so I can't actually get the bytes to print a byte array for it without assuming it is something like UTF-8, which it is not.
Worked it out. For anyone wondering, the encoding used is ISO-8859-1 (Latin-1, code page 28591). The btoa function in JavaScript can be replicated with the following C# method:
public string encoding(string toEncode)
{
    // ISO-8859-1 (Latin-1) maps every char code <= 0xFF straight to one byte, like btoa
    byte[] bytes = Encoding.GetEncoding(28591).GetBytes(toEncode);
    string toReturn = System.Convert.ToBase64String(bytes);
    return toReturn;
}
The decoding will be the following:
string base64EncodedString = "6Q==";
Encoding.GetEncoding(28591).GetString(Convert.FromBase64String(base64EncodedString));
// "é"
I have a JavaScript function that I'm trying to convert to PHP. It uses the CryptoJS library, specifically components/enc-base64-min.js and rollups/md5.js. They can be found here.
In it is this bit of code:
// Let's say str = 'hello';
var md5 = CryptoJS.MD5(str);
md5 = md5.toString(CryptoJS.enc.Base64);
// md5 outputs "XUFAKrxLKna5cZ2REBfFkg=="
I assumed the str variable is hashed with MD5 and then encoded to Base64, so I tried this simple code:
$md5 = md5($str);
$md5 = base64_encode($md5);
// md5 outputs "MmZjMGE0MzNiMjg4MDNlNWI5NzkwNzgyZTRkNzdmMjI="
Then I tried validating both outputs; it looks like the JS output isn't even a valid Base64 string.
To understand this further I looked up the toString() parameter on W3Schools, but it doesn't make sense to me: according to the reference the parameter is supposed to be an integer (2, 8 or 16), so why is CryptoJS.enc.Base64 used instead?
My goal here isn't to produce a valid base64 encoded string using JS but rather to produce the same output using PHP.
PHP's md5() with a single parameter returns the MD5 hash as a hex string.
Instead you want the raw bytes to be encoded into Base64, so you also have to pass the optional $raw_output parameter to md5(), set to true:
$md5 = md5($str, true);
http://php.net/manual/it/function.md5.php
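A quick way to convince yourself that this matches CryptoJS is to Base64-encode the raw 16-byte digest rather than its hex form, in any language that has MD5 and Base64 handy; a small Python check (purely illustrative):

import base64, hashlib
digest = hashlib.md5(b'hello').digest()       # raw 16 bytes, like md5($str, true) in PHP
print(base64.b64encode(digest).decode())      # XUFAKrxLKna5cZ2REBfFkg==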
To communicate with a server, I need to send the password SHA1-hashed and Base64-encoded the same way CryptoJS does it.
My problem is that I'm using VB.NET. The typical Base64 encoding (over the UTF-8 bytes) gives a different result than CryptoJS.
How can I Base64-encode the SHA1 hash in .NET the same way CryptoJS encodes it?
You can see both results here: https://jsfiddle.net/hn5qqLo7/
var helloworld = "Hello World";
var helloword_sha1 = CryptoJS.SHA1(helloworld);
document.write("SHA1: " + helloword_sha1);
var helloword_base64 = helloword_sha1.toString(CryptoJS.enc.Base64);
document.write("1) Base64: " + helloword_base64);
document.write("2) Base64: " + base64_encode(helloword_sha1.toString()));
where base64_encode converts a given string to a Base64 encoded string.
I saw a similar question, but I don't understand it.
Decode a Base64 String using CryptoJS
In (1) of your fiddle, CryptoJS calculates the SHA1 value of the string and then converts the raw bytes to Base64. However, (2) first calculates the SHA1 value of 'Hello World', then renders it in hexadecimal form (consisting of only 0-9 and a-f), and then converts that hexadecimal string to Base64. That is why you end up with two different results.
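The same distinction is easy to reproduce elsewhere. In .NET, Convert.ToBase64String over the raw SHA1 hash of the UTF-8 bytes of the input should match result (1); Base64 over the 40-character hex string gives result (2). A small Python illustration of the two paths, just to make the difference concrete:

import base64, hashlib
digest = hashlib.sha1(b'Hello World').digest()
print(base64.b64encode(digest).decode())                  # path (1): raw 20 bytes -> Base64
print(base64.b64encode(digest.hex().encode()).decode())   # path (2): hex string -> Base64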
I am having issues turning a Texture2D image into bytes and then into a string. When I do the following:
var myTextureBytes : byte[] = myTexture.EncodeToPNG();
Debug.Log(System.Text.Encoding.UTF8.GetString(myTextureBytes));
I just get a log output of "�PNG". Why is it so short? What's the question mark? Shouldn't Unity be able to interpret UTF-8 chars? Also, when I send that to my Node.js server it says SyntaxError: Unexpected token and crashes the server.
The problem is that the bytes of the PNG representation of the texture are not UTF-8 encoded; UTF-8 is only for text.
To convert binary data to a string, I would recommend Base64 encoding:
var myTextureBytes : byte[] = myTexture.EncodeToPNG();
var myTextureBytesEncodedAsBase64 : String = System.Convert.ToBase64String(myTextureBytes);
Have you tried using the Default encoding?
Debug.Log(System.Text.Encoding.Default.GetString(myTextureBytes));
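To make the first answer concrete: every PNG file starts with the fixed signature bytes 0x89 'P' 'N' 'G' 0x0D 0x0A 0x1A 0x0A, and 0x89 is not a valid UTF-8 start byte, which is where the replacement character comes from; Base64 turns the same bytes into plain ASCII that survives any text channel. A short illustration (Python, used here only for the demonstration):

import base64
png_header = b'\x89PNG\r\n\x1a\n'                    # fixed 8-byte PNG signature
# png_header.decode('utf-8')                         # would raise UnicodeDecodeError: invalid start byte
print(base64.b64encode(png_header).decode('ascii'))  # iVBORw0KGgo=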
I am new to the SJCL crypto library. I am doing the following to encrypt plain text using a 256-bit key:
var h = sjcl.codec.hex ;
salt = h.fromBits(sjcl.random.randomWords('10','0'));
var encryptedMessage = sjcl.encrypt(password,message,{count:2048,salt:salt,ks:256});
but I am unable to decrypt the resulting ciphertext. I want to know how to decrypt this cipher.
Well, after a lot of trial and error I found this line working for me:
sjcl.decrypt(password,encMessage,{count:2048,ks:256});