Java AES encryption value and JavaScript encryption value do not match - javascript
I have Java code which produces an AES-encrypted value for me. Now I am trying to reproduce it in JavaScript using crypto-js, but the two produce different results and I don't know why, or how to get the same output. Here is my code:
public static String encrypt(String text, byte[] iv, byte[] key) throws Exception {
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
    SecretKeySpec keySpec = new SecretKeySpec(key, "AES");
    System.out.println("KEY SPECCCC: " + keySpec);
    IvParameterSpec ivSpec = new IvParameterSpec(iv);
    cipher.init(Cipher.ENCRYPT_MODE, keySpec, ivSpec);
    byte[] results = cipher.doFinal(text.getBytes("UTF-8"));
    // java.util.Base64 replaces the internal sun.misc.BASE64Encoder, which no longer compiles on modern JDKs
    return Base64.getEncoder().encodeToString(results);
}
JavaScript code
require(["crypto-js/core", "crypto-js/aes"], function (CryptoJS, AES) {
ciphertext = CryptoJS.AES.encrypt(JSON.stringify(jsondata),
arr.toString(),arr.toString());
});
String to UTF-8 conversion:
var utf8 = unescape(encodeURIComponent(key));
var arr = [];
for (var i = 0; i < utf8.length; i++) {
arr.push(utf8.charCodeAt(i));
}
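(Note: the unescape(encodeURIComponent()) trick above works, but the standard TextEncoder API, available in modern browsers and Node, performs the same UTF-8 conversion more directly. A suggested alternative:)

```javascript
// Encode a string to its UTF-8 byte values without the unescape() workaround.
function toUtf8Bytes(key) {
  return Array.from(new TextEncoder().encode(key));
}

// '\u00e9' (é) encodes to the two UTF-8 bytes 0xC3 0xA9
```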
First of all, even though your code runs fine, you won't be able to decrypt the result properly, because when you create your AES cipher in Java you use CBC mode with a padding algorithm, PKCS5Padding.
So your Java code does the following:
It divides the input into 16-byte blocks; if the input does not divide evenly into 16, the last block is padded, with each padding byte set to the number of bytes added.
So the Java side encrypts the padded blocks, but in the JavaScript part you declare neither the block mode AES will use nor the padding it is supposed to apply. You should add those values to your code:
mode:CryptoJS.mode.CBC,
padding: CryptoJS.pad.Pkcs7
As for the different keys: this happens because you pass a byte[] into your encrypt method and build the key from it, while the JavaScript side builds its key differently. You didn't mention how your encryption method will be used in your program, but you must derive that "byte[] key" the same way on both sides. For instance, you can refer to the following code as an example of generating it; this is not a secure way of generating keys, it is only added to show what I mean by generating both keys the same way.
//DONT USE THIS IMPLEMENTATION SINCE IT IS NOT SAFE!
byte[] key = (username + password).getBytes("UTF-8");
The Java code generates an encrypted string; for JavaScript to generate the same encrypted string, the following code works!
(function (CryptoJS) {
var C_lib = CryptoJS.lib;
// Converts a ByteArray to a standard WordArray.
// Example: CryptoJS.MD5(CryptoJS.lib.ByteArray ([ Bytes ])).toString(CryptoJS.enc.Base64);
C_lib.ByteArray = function (arr) {
var word = [];
for (var i = 0; i < arr.length; i += 4) {
word.push (arr[i + 0] << 24 | arr[i + 1] << 16 | arr[i + 2] << 8 | arr[i + 3] << 0);
}
return C_lib.WordArray.create (word, arr.length);
};
})(CryptoJS);
var IVstring = CryptoJS.lib.ByteArray(your IV bytearray).toString(CryptoJS.enc.Base64);
var keystring = CryptoJS.lib.ByteArray(your KEY bytearray).toString(CryptoJS.enc.Base64);
var text = 'texttobeencrypted';
var key = CryptoJS.enc.Base64.parse(keystring);
var iv = CryptoJS.enc.Base64.parse(IVstring);
var encrypted = CryptoJS.AES.encrypt(text, key, {iv: iv});
console.log(encrypted.toString());
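To see what the ByteArray helper above actually does, here is the same big-endian word packing written as a standalone function with no CryptoJS dependency, so the conversion can be checked on its own:

```javascript
// Pack bytes into 32-bit big-endian words, the layout CryptoJS WordArrays use.
function bytesToWords(arr) {
  const words = [];
  for (let i = 0; i < arr.length; i += 4) {
    words.push(
      (arr[i] << 24) |
      ((arr[i + 1] || 0) << 16) |
      ((arr[i + 2] || 0) << 8) |
      (arr[i + 3] || 0)
    );
  }
  return words;
}

// bytes 0x41 0x42 0x43 0x44 ('A','B','C','D') pack into the single word 0x41424344
```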
Edited: Removed dangerous third party resource reference.
aes encryption javascript cryptojs java
Related
AES encryption in JS equivalent of C#
I need to encrypt a string using AES encryption. This encryption was happening in C# earlier, but it needs to be converted into JavaScript (it will be run in a browser). The current C# encryption code is the following:

public static string EncryptString(string plainText, string encryptionKey)
{
    byte[] clearBytes = Encoding.Unicode.GetBytes(plainText);
    using (Aes encryptor = Aes.Create())
    {
        Rfc2898DeriveBytes pdb = new Rfc2898DeriveBytes(encryptionKey,
            new byte[] { 0x49, 0x76, 0x61, 0x6e, 0x20, 0x4d, 0x65, 0x64, 0x76, 0x65, 0x64, 0x65, 0x76 });
        encryptor.Key = pdb.GetBytes(32);
        encryptor.IV = pdb.GetBytes(16);
        using (MemoryStream ms = new MemoryStream())
        {
            using (CryptoStream cs = new CryptoStream(ms, encryptor.CreateEncryptor(), CryptoStreamMode.Write))
            {
                cs.Write(clearBytes, 0, clearBytes.Length);
                cs.Close();
            }
            plainText = Convert.ToBase64String(ms.ToArray());
        }
    }
    return plainText;
}

I have tried to use CryptoJS to replicate the same functionality, but it's not giving me the equivalent encrypted Base64 string. Here's my CryptoJS code:

function encryptString(encryptString, secretKey) {
    var iv = CryptoJS.enc.Hex.parse('Ivan Medvedev');
    var key = CryptoJS.PBKDF2(secretKey, iv, { keySize: 256 / 32, iterations: 500 });
    var encrypted = CryptoJS.AES.encrypt(encryptString, key, { iv: iv });
    return encrypted;
}

The encrypted string has to be sent to a server which will be able to decrypt it. The server can decrypt the encrypted string generated from the C# code, but not the one generated from the JS code. Comparing the encrypted strings generated by both, I found that the C# code generates longer encrypted strings. For example, keeping 'Example String' as plainText and 'Example Key' as the key, I get the following results:

C# - eAQO+odxOdGlNRB81SHR2XzJhyWtz6XmQDko9HyDe0w=
JS - 9ex5i2g+8iUCwdwN92SF+A==

The length of the JS encrypted string is always shorter than the C# one. Is there something I am doing wrong?
I just have to replicate the C# code in the JS code.

Update: My current code after Zergatul's answer is this:

function encryptString(encryptString, secretKey) {
    var keyBytes = CryptoJS.PBKDF2(secretKey, 'Ivan Medvedev', { keySize: 48 / 4, iterations: 1000 });
    console.log(keyBytes.toString());
    // take first 32 bytes as key (like in C# code)
    var key = new CryptoJS.lib.WordArray.init(keyBytes.words, 32);
    // skip first 32 bytes and take next 16 bytes as IV
    var iv = new CryptoJS.lib.WordArray.init(keyBytes.words.splice(32 / 4), 16);
    console.log(key.toString());
    console.log(iv.toString());
    var encrypted = CryptoJS.AES.encrypt(encryptString, key, { iv: iv });
    return encrypted;
}

As illustrated in his/her answer, if the C# code converts the plainText into bytes using ASCII instead of Unicode, both the C# and JS code produce exactly the same results. But since I am not able to modify the decryption code, I have to make the JS code equivalent to the original C# code, which uses Unicode. So I looked at the difference between the ASCII and Unicode byte arrays in C#. Here's what I found:

ASCII Byte Array: [69,120,97,109,112,108,101,32,83,116,114,105,110,103]
Unicode Byte Array: [69,0,120,0,97,0,109,0,112,0,108,0,101,0,32,0,83,0,116,0,114,0,105,0,110,0,103,0]

So there are extra bytes for each character in C#'s Unicode encoding (Unicode allocates twice as many bytes per character as ASCII).
Here's the difference between the Unicode and ASCII conversions respectively:

ASCII
clearBytes: [69,120,97,109,112,108,101,32,83,116,114,105,110,103]
encryptor.Key: [123,213,18,82,141,249,182,218,247,31,246,83,80,77,195,134,230,92,0,125,232,210,135,115,145,193,140,239,228,225,183,13]
encryptor.IV: [101,74,46,177,46,233,68,252,83,169,211,13,249,61,118,167]
Result: eQus9GLPKULh9vhRWOJjog==

Unicode
clearBytes: [69,0,120,0,97,0,109,0,112,0,108,0,101,0,32,0,83,0,116,0,114,0,105,0,110,0,103,0]
encryptor.Key: [123,213,18,82,141,249,182,218,247,31,246,83,80,77,195,134,230,92,0,125,232,210,135,115,145,193,140,239,228,225,183,13]
encryptor.IV: [101,74,46,177,46,233,68,252,83,169,211,13,249,61,118,167]
Result: eAQO+odxOdGlNRB81SHR2XzJhyWtz6XmQDko9HyDe0w=

Since both the key and IV are generated from exactly the same byte arrays in the Unicode and ASCII approaches, they should not have produced different output; but somehow they do. I think it's because of the length of clearBytes, as that length is used when writing to the CryptoStream. I then looked at the bytes generated in the JS code and found that it uses words, which need to be converted into strings using the toString() method:

keyBytes: 7bd512528df9b6daf71ff653504dc386e65c007de8d2877391c18cefe4e1b70d654a2eb12ee944fc53a9d30df93d76a7
key: 7bd512528df9b6daf71ff653504dc386e65c007de8d2877391c18cefe4e1b70d
iv: 654a2eb12ee944fc53a9d30df93d76a7

Since I am not able to affect the length of the generated encrypted string in the JS code (no direct access to the write stream), I am still stuck here.
Here is an example of how to reproduce the same ciphertext between C# and CryptoJS:

static void Main(string[] args)
{
    byte[] plainText = Encoding.Unicode.GetBytes("Example String"); // this is UTF-16 LE
    string cipherText;
    using (Aes encryptor = Aes.Create())
    {
        var pdb = new Rfc2898DeriveBytes("Example Key", Encoding.ASCII.GetBytes("Ivan Medvedev"));
        encryptor.Key = pdb.GetBytes(32);
        encryptor.IV = pdb.GetBytes(16);
        using (MemoryStream ms = new MemoryStream())
        {
            using (CryptoStream cs = new CryptoStream(ms, encryptor.CreateEncryptor(), CryptoStreamMode.Write))
            {
                cs.Write(plainText, 0, plainText.Length);
                cs.Close();
            }
            cipherText = Convert.ToBase64String(ms.ToArray());
        }
    }
    Console.WriteLine(cipherText);
}

And JS:

var keyBytes = CryptoJS.PBKDF2('Example Key', 'Ivan Medvedev', { keySize: 48 / 4, iterations: 1000 });
// take first 32 bytes as key (like in C# code)
var key = new CryptoJS.lib.WordArray.init(keyBytes.words, 32);
// skip first 32 bytes and take next 16 bytes as IV
var iv = new CryptoJS.lib.WordArray.init(keyBytes.words.splice(32 / 4), 16);
// use the same encoding as in C# code, to convert string into bytes
var data = CryptoJS.enc.Utf16LE.parse("Example String");
var encrypted = CryptoJS.AES.encrypt(data, key, { iv: iv });
console.log(encrypted.toString());

Both codes return: eAQO+odxOdGlNRB81SHR2XzJhyWtz6XmQDko9HyDe0w=
TL;DR: the final code looks like this:

function encryptString(encryptString, secretKey) {
    encryptString = addExtraByteToChars(encryptString);
    var keyBytes = CryptoJS.PBKDF2(secretKey, 'Ivan Medvedev', { keySize: 48 / 4, iterations: 1000 });
    console.log(keyBytes.toString());
    var key = new CryptoJS.lib.WordArray.init(keyBytes.words, 32);
    var iv = new CryptoJS.lib.WordArray.init(keyBytes.words.splice(32 / 4), 16);
    var encrypted = CryptoJS.AES.encrypt(encryptString, key, { iv: iv });
    return encrypted;
}

function addExtraByteToChars(str) {
    let strResult = '';
    for (var i = 0; i < str.length; ++i) {
        strResult += str.charAt(i) + String.fromCharCode(0);
    }
    return strResult;
}

Explanation: The C# code in Zergatul's answer (thanks to him/her) used ASCII to convert the plainText into bytes, while my C# code used Unicode. Unicode assigns an extra byte to each character in the resulting byte array. This does not affect the generation of the key and IV bytes, but it does affect the result, because the length of the encrypted string depends on the length of the bytes generated from the plainText.
As seen in the following bytes generated for each of them, using "Example String" and "Example Key" as the plainText and secretKey respectively:

ASCII
clearBytes: [69,120,97,109,112,108,101,32,83,116,114,105,110,103]
encryptor.Key: [123,213,18,82,141,249,182,218,247,31,246,83,80,77,195,134,230,92,0,125,232,210,135,115,145,193,140,239,228,225,183,13]
encryptor.IV: [101,74,46,177,46,233,68,252,83,169,211,13,249,61,118,167]
Result: eQus9GLPKULh9vhRWOJjog==

Unicode
clearBytes: [69,0,120,0,97,0,109,0,112,0,108,0,101,0,32,0,83,0,116,0,114,0,105,0,110,0,103,0]
encryptor.Key: [123,213,18,82,141,249,182,218,247,31,246,83,80,77,195,134,230,92,0,125,232,210,135,115,145,193,140,239,228,225,183,13]
encryptor.IV: [101,74,46,177,46,233,68,252,83,169,211,13,249,61,118,167]
Result: eAQO+odxOdGlNRB81SHR2XzJhyWtz6XmQDko9HyDe0w=

The JS result matched the ASCII one, which confirmed it was using ASCII-style byte conversion:

keyBytes: 7bd512528df9b6daf71ff653504dc386e65c007de8d2877391c18cefe4e1b70d654a2eb12ee944fc53a9d30df93d76a7
key: 7bd512528df9b6daf71ff653504dc386e65c007de8d2877391c18cefe4e1b70d
iv: 654a2eb12ee944fc53a9d30df93d76a7

Thus I just needed to make the plainText produce the Unicode-equivalent bytes (the term for this encoding, as used in Zergatul's answer, is UTF-16LE). Since Unicode assigns two bytes to each character in the byte array, with the second byte 0 for these characters, I basically created a gap after each character of the plainText and filled that gap with a character whose code is 0, using the addExtraByteToChars() function. And it made all the difference. It's a workaround for sure, but it works for my scenario. I suppose this may or may not prove useful to others, thus sharing the findings. If anyone can suggest a better implementation of the addExtraByteToChars() function (or a less hacky way to do this conversion), please suggest it.
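The addExtraByteToChars() workaround above is effectively a hand-rolled UTF-16LE expansion for characters in the Latin-1 range. The same idea, written as a standalone byte-level function for illustration (it assumes every code point fits in a single UTF-16 code unit):

```javascript
// Expand a string into the byte sequence .NET's Encoding.Unicode (UTF-16LE) would produce.
function utf16leBytes(str) {
  const bytes = [];
  for (let i = 0; i < str.length; i++) {
    const code = str.charCodeAt(i);
    bytes.push(code & 0xff);        // low byte first (little-endian)
    bytes.push((code >> 8) & 0xff); // then high byte (0 for ASCII-range characters)
  }
  return bytes;
}

// "Ex" => [69, 0, 120, 0], matching the Unicode byte array listed above
```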
Trouble decrypting openSSL AES CTR encrypted text
I have trouble decrypting a message encrypted in PHP with the openssl_encrypt method. I am using the new WebCrypto API (so I use crypto.subtle). Encrypting in PHP:

$ALGO = "aes-256-ctr";
$key = "ae6865183f6f50deb68c3e8eafbede0b33f9e02961770ea5064f209f3bf156b4";
function encrypt ($data, $key) {
    global $ALGO;
    $iv = openssl_random_pseudo_bytes(openssl_cipher_iv_length($ALGO), $strong);
    if (!$strong) {
        exit("can't generate strong IV");
    }
    return bin2hex($iv).openssl_encrypt($data, $ALGO, $key, 0, $iv);
}
$enc = encrypt("Lorem ipsum dolor", $key);
exit($enc);

Example output: 8d8c3a57d2dbb3287aca61be0bce59fbeAQ4ILKouAQ5eizPtlUTeHU= (I can decrypt that in PHP and get the cleartext back.)

In JS I decrypt like this:

function Ui8FromStr (StrStart) {
    const Ui8Result = new Uint8Array(StrStart.length);
    for (let i = 0; i < StrStart.length; i++) {
        Ui8Result[i] = StrStart.charCodeAt(i);
    }
    return Ui8Result;
}
function StrFromUi8 (Ui8Start) {
    let StrResult = "";
    Ui8Start.forEach((charcode) => {
        StrResult += String.fromCharCode(charcode);
    });
    return StrResult;
}
function Ui8FromHex (hex) {
    for (var bytes = new Uint8Array(Math.ceil(hex.length / 2)), c = 0; c < hex.length; c += 2)
        bytes[c/2] = parseInt(hex.substr(c, 2), 16);
    return bytes;
}
const ALGO = 'AES-CTR'
function decrypt (CompCipher, HexKey) {
    return new Promise (function (resolve, reject) {
        // remove IV from cipher
        let HexIv = CompCipher.substr(0, 32);
        let B64cipher = CompCipher.substr(32);
        let Ui8Cipher = Ui8FromStr(atob(B64cipher));
        let Ui8Iv = Ui8FromHex (HexIv);
        let Ui8Key = Ui8FromHex (HexKey);
        crypto.subtle.importKey("raw", Ui8Key, {name: ALGO}, false, ["encrypt", "decrypt"])
        .then (function (cryptokey){
            return crypto.subtle.decrypt({ name: ALGO, counter: Ui8Iv, length: 128}, cryptokey, Ui8Cipher)
            .then(function(result){
                let Ui8Result = new Uint8Array(result);
                let StrResult = StrFromUi8(Ui8Result);
                resolve(StrResult);
            }).catch (function (err){
                reject(err)
            });
        })
    })
}

When I now run decrypt("8d8c3a57d2dbb3287aca61be0bce59fbeAQ4ILKouAQ5eizPtlUTeHU=", "ae6865183f6f50deb68c3e8eafbede0b33f9e02961770ea5064f209f3bf156b4").then(console.log) I get gibberish: SÌõÅ°blfçSÑ-

The problem I have is that I am not sure what is meant by "counter". I tried the IV but failed. This GitHub tutorial suggests*1 that it is the IV, or at least part of it; I've seen people say the counter is part of the IV (something like 4 bytes, meaning the IV is made from a 12-byte nonce and a 4-byte counter). If that is indeed true, my question then becomes: where do I give the script the other 12 bytes of the IV when the counter is only 4 bytes of it? Can anyone maybe give me a working example of encryption in PHP?

*1 It says that the same counter has to be used for en- and decryption. This leads me to believe that it is at least something similar to the IV.
You are handling the key incorrectly in PHP. In the PHP code you are passing the hex encoded key directly to the openssl_encrypt function, without decoding it. This means the key you are trying to use is twice as long as expected (i.e. 64 bytes). OpenSSL doesn’t check the key length, however—it just truncates it, taking the first 32 bytes and using them as the encryption key. The Javascript code handles the key correctly, hex decoding it before passing the decoded array to the decryption function. The overall result is you are using a different key in each case, and so the decryption doesn’t work. You need to add a call to hex2bin on the key in your PHP code, to convert it from the hex encoding to the actual 32 raw bytes.
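The JS side's hex decoding (the Ui8FromHex helper above, equivalent to PHP's hex2bin) can be shown in isolation. This is the step the PHP code was missing; the key value is the one from the question:

```javascript
// Decode a hex string into raw bytes, as PHP's hex2bin does.
function hexToBytes(hex) {
  const bytes = new Uint8Array(hex.length / 2);
  for (let i = 0; i < hex.length; i += 2) {
    bytes[i / 2] = parseInt(hex.substr(i, 2), 16);
  }
  return bytes;
}

// The 64-character hex key from the question decodes to the expected 32 raw bytes.
const key = hexToBytes("ae6865183f6f50deb68c3e8eafbede0b33f9e02961770ea5064f209f3bf156b4");
console.log(key.length); // 32
```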
different output for JAVA vs javascript AES 256 cbc
I'm trying to create an AES-256-CBC encryption using Java, and I need to emulate EXACTLY this JavaScript code (I know the IV is the same as the key, truncated to 16 bytes; that's how it is on the site I'm trying to log into using Java):

var recievedStr = "MDk4NTY1MDAyMjg2MTU1OA=="; //some
var key = CryptoJS.enc.Base64.parse(recievedStr);
var iv = CryptoJS.enc.Base64.parse(recievedStr);
var pw = "PASSWORD";
var encres = CryptoJS.AES.encrypt(pw, key, {iv:iv, keySize: 256, mode: CryptoJS.mode.CBC, padding: CryptoJS.pad.Pkcs7});
var finalStr = encres.toString();

finalStr will be: Su92ZXLm/MdOyruRnWDRqQ==

I need to make Java code that outputs exactly the same as finalStr from the JavaScript. I'm using Bouncy Castle for that:

String recievedStr = "MDk4NTY1MDAyMjg2MTU1OA==";
String pw = "PASSWORD";
AESEngine blockCipher = new AESEngine();
CBCBlockCipher cbcCipher = new CBCBlockCipher(blockCipher);
BufferedBlockCipher cipher = new PaddedBufferedBlockCipher(cbcCipher);
byte[] key = encodeBase64(recievedStr);
byte[] iv = java.util.Arrays.copyOf(key, 16);
byte[] input = pw.getBytes();
ParametersWithIV pwIV = new ParametersWithIV(new KeyParameter(key), iv);
cipher.init(true, pwIV);
byte[] cipherText = new byte[cipher.getOutputSize(input.length)];
int outputLen = cipher.processBytes(input, 0, input.length, cipherText, 0);
try {
    cipher.doFinal(cipherText, outputLen);
} catch (CryptoException ce) {
    System.err.println(ce);
}
System.out.println(new String(Base64.encodeBase64(cipherText)));

This will output: qEGQ1PC/QKxfAxGBIbLKpQ==

While I can decrypt it back to the original input, that is not what I want; I need my Java code to output exactly what the JavaScript did. I have no ideas left on how to approach this. Thanks.

EDIT: problem was solved. I had to Base64-decode the received string instead of encoding it.
I think you are on the right track, but you are running AES-128 instead of AES-256: the received Base64 string decodes to only 16 bytes, and the AES variant follows from the key length. If you have a look at Java 256-bit AES Password-Based Encryption, I think you may find something useful.
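The key-length point can be checked directly: the received Base64 string decodes to 16 bytes, and CryptoJS selects the AES variant from the key length, so this setup is AES-128. An illustration using Node's Buffer:

```javascript
// Decode the received string from the question and inspect the key length.
const received = "MDk4NTY1MDAyMjg2MTU1OA==";
const keyBytes = Buffer.from(received, 'base64');

// 16 bytes => AES-128; AES-256 would need a 32-byte key.
console.log(keyBytes.length);            // 16
console.log(keyBytes.toString('utf8'));  // "0985650022861558"
```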
Encrypting data with forgejs on the client side, and decrypting with ruby
For a given project, I'm looking to encrypt a piece of data with AES-256 and then RSA-encrypt the key. I've been using Forge and the Encryptor gem in Ruby, and I can't seem to get matching encryption values:

var key = 'strengthstrengthstrengthstrength';
var iv = 'cakecakecakecakecakecakecakecake';
var cipher = forge.aes.createEncryptionCipher(key, 'CBC');
cipher.start(iv);
cipher.update(forge.util.createBuffer("some string"));
cipher.finish();
var encrypted = cipher.output;
console.log(btoa(encrypted.data)); // outputs: CjLmWObDO2Dlwa5tJnRBRw==

Then in IRB:

Encryptor.encrypt 'some string', :key => 'strengthstrengthstrengthstrength', :key => 'cakecakecakecakecakecakecakecake'
Base64.encode64 _
# outputs: C9Gtk9YfciVMJEsbhZrQTw==\n

Instead of using string values for the key and IV, I also tried:

var key = forge.random.getBytesSync(32);
var iv = forge.random.getBytesSync(32);

then calling btoa() on each of them and using Base64.decode64 on the Ruby side before passing them to Encryptor.decrypt, but still no luck. Any idea where I've gone wrong?
I managed to get it to work. Since I'm just using one key to encrypt one value, I simply used the key as the IV and salt as well. This is not recommended if you are using the key to encrypt multiple values; you would then need to generate proper salt and IV values. Also, this way of generating the key is pretty poor, as Math.random is not secure. I just ran out of time to do this properly, but it works okay for this case.

var Encryption = (function () {
    var api = {
        getKey: function () {
            var possible = "ABCDEFabcdef0123456789";
            var key = '';
            for (var i = 0; i < 32; i++) {
                key += possible.charAt(Math.floor(Math.random() * possible.length));
            }
            return key;
        },
        encryptPII: function (rawKey, value) {
            var salt = rawKey;
            var iv = rawKey;
            var key = forge.pkcs5.pbkdf2(rawKey, salt, 2000, 32);
            var cipher = forge.aes.createEncryptionCipher(key, 'CBC');
            cipher.start(iv);
            cipher.update(forge.util.createBuffer(value));
            cipher.finish();
            return btoa(cipher.output.data);
        }
    };
    return api;
})();

The rawKey is the value returned from getKey(), and value is the string to be encrypted. I use the rawKey for the IV and salt values and generate the key the same way the Encryptor gem does in Ruby, then use Forge to encrypt the string value. If I take the Base64, decode it in Ruby, and pass the same rawKey value to the Encryptor gem for the key, salt, and IV, it works.
Encrypt text using AES in Javascript then Decrypt in C# WCF Service
I am trying to encrypt a string using AES 128-bit encryption. I have code for both JavaScript and C#. The main objective is to encrypt the string using JavaScript CryptoJS, then take the resulting ciphertext and decrypt it in C# using AesCryptoServiceProvider.

JavaScript code:

function EncryptText() {
    var text = document.getElementById('textbox').value;
    var Key = CryptoJS.enc.Hex.parse("PSVJQRk9QTEpNVU1DWUZCRVFGV1VVT0=");
    var IV = CryptoJS.enc.Hex.parse("YWlFLVEZZUFNaWl=");
    var encryptedText = CryptoJS.AES.encrypt(text, Key, {iv: IV, mode: CryptoJS.mode.CBC, padding: CryptoJS.pad.Pkcs7});
    //var decrypted = CryptoJS.AES.decrypt(encrypted, "Secret Passphrase");
    var encrypted = document.getElementById('encrypted');
    encrypted.value = encryptedText;
}

C# code:

private String AES_decrypt(string encrypted)
{
    byte[] encryptedBytes = Convert.FromBase64String(encrypted);
    AesCryptoServiceProvider aes = new AesCryptoServiceProvider();
    aes.BlockSize = 128;
    aes.KeySize = 256;
    aes.Mode = CipherMode.CBC;
    aes.Padding = PaddingMode.Pkcs7;
    aes.Key = Key;
    aes.IV = IV;
    ICryptoTransform crypto = aes.CreateDecryptor(aes.Key, aes.IV);
    byte[] secret = crypto.TransformFinalBlock(encryptedBytes, 0, encryptedBytes.Length);
    crypto.Dispose();
    return System.Text.ASCIIEncoding.ASCII.GetString(secret);
}

When using "hello" as the plaintext in JavaScript I get this ciphertext: uqhe5ya+mISuK4uc1WxxeQ==

When passing that into the C# application and running the decrypt method, I receive: "Padding is invalid and cannot be removed." I am stumped here and have tried many solutions, all resulting in the same error. When encrypting "hello" through the C# AES encryption method, I receive: Y9nb8DrV73+rmmYRUcJiOg==

I thank you for your help in advance!
JavaScript code:

function EncryptText() {
    var text = CryptoJS.enc.Utf8.parse(document.getElementById('textbox').value);
    var Key = CryptoJS.enc.Utf8.parse("PSVJQRk9QTEpNVU1DWUZCRVFGV1VVT0="); // secret key
    var IV = CryptoJS.enc.Utf8.parse("2314345645678765"); // 16 digits
    var encryptedText = CryptoJS.AES.encrypt(text, Key, {keySize: 128 / 8, iv: IV, mode: CryptoJS.mode.CBC, padding: CryptoJS.pad.Pkcs7});
    var encrypted = document.getElementById('encrypted');
    encrypted.value = encryptedText; // pass encryptedText through the service
}

C# code:

private String AES_decrypt(string encrypted, String secretKey, String initVec)
{
    byte[] encryptedBytes = Convert.FromBase64String(encrypted);
    AesCryptoServiceProvider aes = new AesCryptoServiceProvider();
    //aes.BlockSize = 128; not required
    //aes.KeySize = 256; not required
    aes.Mode = CipherMode.CBC;
    aes.Padding = PaddingMode.Pkcs7;
    aes.Key = Encoding.UTF8.GetBytes(secretKey); // PSVJQRk9QTEpNVU1DWUZCRVFGV1VVT0=
    aes.IV = Encoding.UTF8.GetBytes(initVec);    // 2314345645678765
    ICryptoTransform crypto = aes.CreateDecryptor(aes.Key, aes.IV);
    byte[] secret = crypto.TransformFinalBlock(encryptedBytes, 0, encryptedBytes.Length);
    crypto.Dispose();
    return System.Text.ASCIIEncoding.ASCII.GetString(secret);
}

I used the above code and it is working fine!
Try using var Key = CryptoJS.enc.Utf8.parse("PSVJQRk9QTEpNVU1DWUZCRVFGV1VVT0="); instead of Hex, because the string you are putting in your key (and IV) and parsing is not actually a hex string; hex digits are only 0 to F.
First, your Key variable in JS contains a string with 32 characters (after the odd-looking parse call). Although this might be interpreted as a 128-bit key, there is a certain chance that CryptoJS takes it as a passphrase instead (and derives a key from it using some algorithm), so your actual key may look quite different. The string also looks suspiciously hex-encoded, so there might be additional confusion about its value on the C# side. You have to make sure that you are using the same key in JS and C#. Second, the IV variable, after parsing, also looks like a hex-decoded value, so you have to be careful about what value you are using on the C# side as well. FYI, here are the values for Key and IV after parsing: Key = 00000000000e00000d000c0000010000, IV = 0000000e000f0a00
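The Hex-vs-Utf8 confusion is easy to demonstrate outside CryptoJS. Node's Buffer shows the same distinction: UTF-8 parsing yields one byte per ASCII character, while hex parsing of a string containing non-hex characters breaks down (Node stops decoding at the first invalid character, while CryptoJS produces the mostly-zero key shown in the answer above):

```javascript
const utf8Bytes = Buffer.from('PSVJ', 'utf8'); // [0x50, 0x53, 0x56, 0x4a] -- one byte per character
const hexBytes  = Buffer.from('PSVJ', 'hex');  // empty: 'P' is not a hex digit, decoding stops immediately

console.log(utf8Bytes.length); // 4
console.log(hexBytes.length);  // 0
```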
Thank you "Uwe", parsing with UTF8 solved everything.

> What happens if you use var Key = CryptoJS.enc.Utf8.parse("PSVJQRk9QTEpNVU1DWUZCRVFGV1VVT0="); instead of HEX? And what is your Key and IV in C#? Because actually the string you are putting in your key and parsing is not a hex string. hex is 0 to F.

Thank you so much!