I'm trying to hash a tokenId with a seed in my smart contract. For simplicity, and to avoid other sources of error, I'm leaving the seed out for now. I basically just want to hash a number in my contract, hash the same number in my JavaScript code, and receive the same output.
The Solidity code looks something like this:
function _tokenURI(uint256 tokenId) internal view returns (string memory) {
    string memory currentBaseURI = _baseURI();
    bytes32 hashedToken = keccak256(abi.encodePacked(tokenId));
    return
        bytes(currentBaseURI).length > 0
            ? string(abi.encodePacked(currentBaseURI, hashedToken, baseExtension))
            : "";
}
This also leads to an invalid codepoint at offset error on the client side. To tackle it, I tried to cast the bytes32 to a string using these functions:
function _bytes32ToString(bytes32 _bytes32)
    private
    pure
    returns (string memory)
{
    uint8 i = 0;
    bytes memory bytesArray = new bytes(64);
    for (i = 0; i < bytesArray.length; i++) {
        uint8 _f = uint8(_bytes32[i / 2] & 0x0f);
        uint8 _l = uint8(_bytes32[i / 2] >> 4);
        bytesArray[i] = _toByte(_f);
        i = i + 1;
        bytesArray[i] = _toByte(_l);
    }
    return string(bytesArray);
}
function _toByte(uint8 _uint8) private pure returns (bytes1) {
    if (_uint8 < 10) {
        return bytes1(_uint8 + 48);
    } else {
        return bytes1(_uint8 + 87);
    }
}
though I'm not sure whether this is equivalent. The frontend code looks like:
const hashed = web3.utils.soliditySha3(
    { type: "uint256", value: tokenId }
);
What do I need to change in order to receive the exact same output? And what does
invalid codepoint at offset
mean?
Maybe the issue is that tokenId is not a uint256, or it's the Web3 or Solidity version? I did a few tests with the Remix IDE and received the same results on both sides.
Solidity code:
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract Hash {
    function getHashValue_1() public view returns (bytes32) {
        return keccak256(abi.encodePacked(uint256(234)));
    }
    // bytes32: 0x61c831beab28d67d1bb40b5ae1a11e2757fa842f031a2d0bc94a7867bc5d26c2

    function getHashValue_3() public view returns (bytes32) {
        return keccak256(abi.encodePacked(uint256(10), string('StringSecretValue')));
    }
    // bytes32: 0x5938b4caf29ac4903ee34628c3dc1eb5c670a6bd392a006d0cb91f1fc5db3819
}
JS code:
(async () => {
    try {
        console.log('Web3 version is ' + Web3.version);
        // Web3 version is 1.3.0

        let theValueYouNeed = web3.utils.soliditySha3("234");
        theValueYouNeed = web3.utils.soliditySha3({type: 'uint256', value: '234'});
        theValueYouNeed = web3.utils.soliditySha3({t: 'uint256', v: '234'});
        // above hashed value is 0x61c831beab28d67d1bb40b5ae1a11e2757fa842f031a2d0bc94a7867bc5d26c2
        console.log('Hashed value 1 is ' + theValueYouNeed);

        theValueYouNeed = web3.utils.soliditySha3({t: 'uint256', v: '10'}, {t: 'string', v: 'StringSecretValue'});
        console.log('Hashed value 2 is ' + theValueYouNeed);
        // above hashed value is 0x5938b4caf29ac4903ee34628c3dc1eb5c670a6bd392a006d0cb91f1fc5db3819
    } catch (e) {
        console.log(e.message);
    }
})()
I'm not sure, but invalid codepoint at offset should mean that a given value does not fall within the range or set of allowed values. So maybe there is something wrong with tokenId, and you could run some tests with hardcoded values?
You get the invalid codepoint error because you mix string and byte data when you call abi.encodePacked(currentBaseURI, hashedToken, baseExtension).
When JavaScript gets the return value from the contract, it expects a UTF-8 string, but hashedToken contains byte values that are not valid in a UTF-8-encoded string.
This kind of error might be "intermittent". It might happen in just some cases. You're lucky to see it during development and not in production.
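You can reproduce the failure mode in isolation: some byte sequences simply aren't valid UTF-8, and a hash produces essentially random bytes. A minimal sketch in plain JavaScript (the three bytes are the start of the hash above):

// 0x61 is valid ASCII, but 0xc8 opens a two-byte UTF-8 sequence whose
// continuation byte must be in 0x80-0xbf, and 0x31 is not
const bytes = Uint8Array.from([0x61, 0xc8, 0x31]);
try {
    new TextDecoder('utf-8', { fatal: true }).decode(bytes);
} catch (e) {
    console.log('decoding failed:', e.message); // same class of error as the client sees
}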
How to fix it?
You are on the right track converting the hash result to a string.
There is an alternative way to do it in this answer, which uses less gas because it relies only on bitwise operations.
To convert the hex value in JavaScript you can use web3.utils.hexToNumberString().
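For example (a sketch assuming web3 1.x and the hash from above):

// soliditySha3 returns a 0x-prefixed hex string; hexToNumberString
// renders the same 256-bit value as a decimal string
const hash = web3.utils.soliditySha3({ t: 'uint256', v: '234' });
console.log(hash); // 0x61c831beab28d67d1bb40b5ae1a11e2757fa842f031a2d0bc94a7867bc5d26c2
console.log(web3.utils.hexToNumberString(hash));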
Related
As per the Aave documentation for liquidationCall, one must pass uint(-1) for the debtToCover parameter in order to liquidate the maximum amount possible for an account with a healthFactor < 1. How is it possible to encode -1 as a uint256 using web3, ethers, etc.?
Attempting this using web3, for example, yields an error:
> web3.eth.abi.encodeParameter("uint", "-1")
Uncaught:
Error: value out-of-bounds (argument=null, value="-1", code=INVALID_ARGUMENT, version=abi/5.0.7)
    at Logger.makeError (-/node_modules/@ethersproject/logger/lib/index.js:199:21)
    at Logger.throwError (-/node_modules/@ethersproject/logger/lib/index.js:208:20)
    at Logger.throwArgumentError (-/node_modules/@ethersproject/logger/lib/index.js:211:21)
    at NumberCoder.Coder._throwError (-/node_modules/web3-eth-abi/node_modules/@ethersproject/abi/lib/coders/abstract-coder.js:40:16)
    at NumberCoder.encode (-/node_modules/web3-eth-abi/node_modules/@ethersproject/abi/lib/coders/number.js:40:18)
    at -/node_modules/web3-eth-abi/node_modules/@ethersproject/abi/lib/coders/array.js:71:19
    at Array.forEach (<anonymous>)
    at Object.pack (-/node_modules/web3-eth-abi/node_modules/@ethersproject/abi/lib/coders/array.js:57:12)
    at TupleCoder.encode (-/node_modules/web3-eth-abi/node_modules/@ethersproject/abi/lib/coders/tuple.js:36:24)
    at AbiCoder.encode (-/node_modules/web3-eth-abi/node_modules/@ethersproject/abi/lib/abi-coder.js:86:15)
    at ABICoder.encodeParameters (-/node_modules/web3-eth-abi/lib/index.js:120:27)
    at ABICoder.encodeParameter (-/node_modules/web3-eth-abi/lib/index.js:78:17) {
  reason: 'value out-of-bounds',
  code: 'INVALID_ARGUMENT',
  argument: null,
  value: '-1'
}
uint stands for "unsigned integer", so it doesn't accept -1 as a valid value.
Up to version 0.7.6, Solidity converts uint(-1) to the maximal value of uint, because the value underflows.
pragma solidity ^0.7;

contract MyContract {
    // returns 115792089237316195423570985008687907853269984665640564039457584007913129639935
    function foo() external pure returns (uint256) {
        return uint(-1);
    }
}
Version 0.8.0 introduced automatic revert on integer underflow/overflow, and it doesn't even allow casting the -1 literal to uint, but you can test the revert this way:
pragma solidity ^0.8;

contract MyContract {
    // reverts on underflow
    function foo() external pure returns (uint256) {
        uint256 number = 0;
        number--;
        return number;
    }
}
Many JS libraries also don't allow passing -1 as an "unsigned integer", simply because it's an invalid value for the datatype. But since -1 effectively represents the maximal value in older versions of Solidity, you can pass the uint maximal value instead.
For uint8, that's (2^8)-1 (or 255)
const BN = web3.utils.BN;
const number = (new BN(2)).pow(new BN(8)).sub(new BN(1));
console.log(number.toString());
For uint256, that's (2^256)-1 (or the large number starting 115...)
const BN = web3.utils.BN;
const number = (new BN(2)).pow(new BN(256)).sub(new BN(1));
console.log(number.toString());
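As a shortcut, web3 can produce the same value directly via two's complement encoding of -1 (a sketch assuming web3 1.x, where toTwosComplement pads to 256 bits):

// web3.utils.toTwosComplement renders -1 as a 256-bit two's complement
// hex string, i.e. the uint256 maximum 0xff...ff (64 f's)
const maxUint256 = web3.utils.toTwosComplement('-1');
console.log(maxUint256);
console.log(web3.utils.hexToNumberString(maxUint256)); // the same 115... value quoted above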
Good day everyone. I'm trying to obtain the numerical value of review, but the JSON returns it as a string, so a sum I perform on it produces a meaningless value.
The objective is that when someone comments on a product X, the data is saved in Firebase as JSON; the problem is that review is saved as a string and not as a number.
Is there a way to take this single piece of data and convert it to a number?
This is the JSON response
"[{\"review\":\"4\",\"comment\":\"Good product \",\"name\":\"SnowFall\",\"image\":\"https://lh3.googleusercontent.com\",\"method\":\"google\"}]"
This is my TypeScript
callback(i, totalReviews) {
    if (!this.render) {
        this.render = true;
        let globalRating = 0;
        let globalReviews = 0;
        setTimeout(function () {
            totalReviews.forEach((review, index) => {
                globalRating += review.length;
                for (const i in review) {
                    globalReviews += review[i].review;
                    console.log("globalReviews", globalReviews);
                }
            })
            console.log("Raiting", globalRating);
            console.log("Reviews", globalReviews);
            let averageReviews = Math.round(globalReviews / globalRating);
            let precentage = Math.round(globalReviews * 100 / (globalRating * 5));
            $(".globalRating").html(globalRating);
            $(".percentage").html(precentage);
            let averageRating = DinamicReviews.fnc(averageReviews);
            $(".br-theme-fontawesome-stars").html(`
                <select class="ps-rating reviewsOption" data-read-only="true"></select>
            `)
            for (let i = 0; i < averageRating.length; i++) {
                $(".reviewsOption").append(`
                    <option value="${averageRating[i]}">${i + 1}</option>
                `)
            }
            Rating.fnc();
        }, i * 10)
    }
}
This is how it is saved in the Firebase Realtime Database (screenshot), and this is how it looks on my frontend (screenshot).
There is a line where you add the .review value to the globalReviews variable using +=; since review[i].review is a string, this concatenates strings instead of adding numbers, and that is the problem you are facing.
If you change that same line to globalReviews += +review[i].review;
this will fix the error.
The reason adding a + sign in front of a value works is that this is the unary plus operator, which converts its operand to a number (or to NaN if the stored value is not numeric; see the unary plus operator documentation).
To make this code safe and avoid a possible NaN, you can fall back to 0 using the following syntax: globalReviews += +review[i].review || 0;
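A quick illustration of the difference:

// '+' between a string and a number concatenates; unary plus coerces first
console.log('4' + 1);      // '41' (string concatenation)
console.log(+'4' + 1);     // 5    (numeric addition)
console.log(+'oops' || 0); // 0    (NaN falls back to 0)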
I've been spending a day or two on this issue, and I've cut the code down to the bare minimum needed. The two functions below return different outputs, even though the input is the same.
The Delphi code generates a result matching the output of https://www.freeformatter.com/hmac-generator.html
Is there any known issue with either CryptoJS or Delphi's Indy10 SHA-256 hashing code that might explain the different results in this case?
JS:
function SetHashMessage(message_str)
{
    var APIKey = "test";
    var signature_hash_obj = CryptoJS.HmacSHA256(message_str.toString(), APIKey);
    var signature_str = signature_hash_obj.toString(CryptoJS.enc.Base64);
    return signature_str;
}
Delphi:
function FDoHashMessageString(MessageString: String): String;
var
  hmac: TIdHMACSHA256;
  hash: TIdBytes;
begin
  LoadOpenSSLLibrary;
  if not TIdHashSHA256.IsAvailable then
    raise Exception.Create('SHA256 hashing is not available: ' + WhichFailedToLoad());
  hmac := TIdHMACSHA256.Create;
  try
    hmac.Key := ToBytes('test');
    hash := hmac.HashValue(ToBytes(MessageString));
    Result := TIdEncoderMIME.EncodeBytes(hash); // EncodeBytes returns base64
  finally
    hmac.Free;
  end;
end;
Is it possible to convert a Mongo ObjectId into a string?
The picture above shows the data I received and logged to the console. I need the id value in string form, but the ObjectId is returned as an object.
In the database the id looks like this: 565d3bf4cefddf1748d1fc5e (an ObjectId), and I need the id exactly like that.
According to the Mongo documentation, an ObjectId consists of:
a 4-byte value representing the seconds since the Unix epoch,
a 3-byte machine identifier,
a 2-byte process id, and
a 3-byte counter, starting with a random value.
You can check it out here: https://docs.mongodb.org/manual/reference/object-id/
So in JavaScript you could do something like this:
var mId = {
    Timestamp: 1448950573,
    Machine: 13565407,
    Pid: 1756,
    Increment: 8888962
};

function getId(mongoId) {
    var result =
        pad0(mongoId.Timestamp.toString(16), 8) +
        pad0(mongoId.Machine.toString(16), 6) +
        pad0(mongoId.Pid.toString(16), 4) +
        pad0(mongoId.Increment.toString(16), 6);
    return result;
}

function pad0(str, len) {
    var zeros = "00000000000000000000000000";
    if (str.length < len) {
        return zeros.substr(0, len - str.length) + str;
    }
    return str;
}

console.log(getId(mId));
It produces "565d3b2dcefddf06dc87a282", which is not exactly the id you had, but that might just need a tweak, or I was working with different data.
EDIT
Added a padding function so that zeros are not truncated.
Hope that helps
EDIT:
I assume you are using C# to connect to and serve documents from MongoDB. In that case, there is a driver that also supports toString().
Here is an example using the MongoDB C# driver:
using MongoDB.Bson;
using MongoDB.Bson.IO;
using MongoDB.Bson.Serialization;
using MongoDB.Driver;

// ...

string outputFileName; // initialize to the output file
IMongoCollection<BsonDocument> collection; // initialize to the collection to read from

using (var streamWriter = new StreamWriter(outputFileName))
{
    await collection.Find(new BsonDocument())
        .ForEachAsync(async (document) =>
        {
            using (var stringWriter = new StringWriter())
            using (var jsonWriter = new JsonWriter(stringWriter))
            {
                var context = BsonSerializationContext.CreateRoot(jsonWriter);
                collection.DocumentSerializer.Serialize(context, document);
                var line = stringWriter.ToString();
                await streamWriter.WriteLineAsync(line);
            }
        });
}
ORIGINAL:
These are Mongo ObjectIds, and if you haven't already deserialised the document, they should support a toString method that will return a hexadecimal string.
If you want this applied to the whole document, JSON.stringify(MongoDocument) should serialize it into a JSON string for you.
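For instance, a minimal sketch using the Node.js mongodb driver (assuming that's the driver in use):

// ObjectId exposes toString()/toHexString(), both returning the hex form
const { ObjectId } = require('mongodb');
const id = new ObjectId('565d3bf4cefddf1748d1fc5e');
console.log(id.toString());    // '565d3bf4cefddf1748d1fc5e'
console.log(id.toHexString()); // same hex string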
For example, if you have an expression like this:
Expression<Func<int, int>> fn = x => x * x;
Is there anything that will traverse the expression tree and generate this?
"function(x) { return x * x; }"
It's probably not easy, but yes, it's absolutely feasible. ORMs like Entity Framework or LINQ to SQL do it to translate LINQ queries into SQL, but you can actually generate anything you want from the expression tree...
You should implement an ExpressionVisitor to analyse and transform the expression.
EDIT: here's a very basic implementation that works for your example:
Expression<Func<int, int>> fn = x => x * x;
var visitor = new JsExpressionVisitor();
visitor.Visit(fn);
Console.WriteLine(visitor.JavaScriptCode);
...
class JsExpressionVisitor : ExpressionVisitor
{
    private readonly StringBuilder _builder;

    public JsExpressionVisitor()
    {
        _builder = new StringBuilder();
    }

    public string JavaScriptCode
    {
        get { return _builder.ToString(); }
    }

    public override Expression Visit(Expression node)
    {
        _builder.Clear();
        return base.Visit(node);
    }

    protected override Expression VisitParameter(ParameterExpression node)
    {
        _builder.Append(node.Name);
        base.VisitParameter(node);
        return node;
    }

    protected override Expression VisitBinary(BinaryExpression node)
    {
        base.Visit(node.Left);
        _builder.Append(GetOperator(node.NodeType));
        base.Visit(node.Right);
        return node;
    }

    protected override Expression VisitLambda<T>(Expression<T> node)
    {
        _builder.Append("function(");
        for (int i = 0; i < node.Parameters.Count; i++)
        {
            if (i > 0)
                _builder.Append(", ");
            _builder.Append(node.Parameters[i].Name);
        }
        _builder.Append(") {");
        if (node.Body.Type != typeof(void))
        {
            _builder.Append("return ");
        }
        base.Visit(node.Body);
        _builder.Append("; }");
        return node;
    }

    private static string GetOperator(ExpressionType nodeType)
    {
        switch (nodeType)
        {
            case ExpressionType.Add:
                return " + ";
            case ExpressionType.Multiply:
                return " * ";
            case ExpressionType.Subtract:
                return " - ";
            case ExpressionType.Divide:
                return " / ";
            case ExpressionType.Assign:
                return " = ";
            case ExpressionType.Equal:
                return " == ";
            case ExpressionType.NotEqual:
                return " != ";
            // TODO: Add other operators...
        }
        throw new NotImplementedException("Operator not implemented");
    }
}
It only handles lambdas with a single expression, but then the C# compiler can't generate an expression tree for a block lambda anyway.
There's still a lot of work to do, of course; this is a very minimal implementation. You probably need to add method calls (VisitMethodCall), property and field access (VisitMember), etc.
Script# is used by Microsoft internal developers to do exactly this.
Take a look at Lambda2Js, a library created by Miguel Angelo for this exact purpose.
It adds a CompileToJavascript extension method to any Expression.
Example 1:
Expression<Func<MyClass, object>> expr = x => x.PhonesByName["Miguel"].DDD == 32 | x.Phones.Length != 1;
var js = expr.CompileToJavascript();
Assert.AreEqual("PhonesByName[\"Miguel\"].DDD==32|Phones.length!=1", js);
Example 2:
Expression<Func<MyClass, object>> expr = x => x.Phones.FirstOrDefault(p => p.DDD > 10);
var js = expr.CompileToJavascript();
Assert.AreEqual("System.Linq.Enumerable.FirstOrDefault(Phones,function(p){return p.DDD>10;})", js);
More examples here.
The expression has already been parsed for you by the C# compiler; all that remains is for you to traverse the expression tree and generate the code. Traversing the tree can be done recursively, and each node can be handled by checking what type it is (there are several subclasses of Expression, representing e.g. functions, operators, and member lookup). The handler for each type can generate the appropriate code and traverse the node's children, which are available in different properties depending on the expression type.

For instance, a function node could be processed by first outputting "function(" followed by the parameter name followed by ") {". Then, the body could be processed recursively, and finally, you output "}".
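To make the shape of that recursion concrete, here is a small JavaScript sketch over a toy tree (the node shapes are invented for illustration, not any real expression-tree API):

// Each node kind gets a handler; handlers recurse into child nodes
function emit(node) {
    switch (node.kind) {
        case 'lambda':
            return 'function(' + node.params.join(', ') + ') { return ' + emit(node.body) + '; }';
        case 'binary':
            return emit(node.left) + ' ' + node.op + ' ' + emit(node.right);
        case 'param':
            return node.name;
    }
    throw new Error('unhandled node kind: ' + node.kind);
}

// x => x * x, modeled as a toy tree
console.log(emit({
    kind: 'lambda',
    params: ['x'],
    body: {
        kind: 'binary', op: '*',
        left: { kind: 'param', name: 'x' },
        right: { kind: 'param', name: 'x' }
    }
})); // function(x) { return x * x; }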
A few people have developed open source libraries seeking to solve this problem. The one I have been looking at is Linq2CodeDom, which converts expressions into a CodeDom graph, which can then be compiled to JavaScript as long as the code is compatible.
Script# leverages the original C# source code and the compiled assembly, not an expression tree.
I made some minor edits to Linq2CodeDom to add JScript as a supported language: essentially just adding a reference to Microsoft.JScript, updating an enum, and adding one more case in GenerateCode. Here is the code to convert an expression:
var c = new CodeDomGenerator();
c.AddNamespace("Example")
    .AddClass("Container")
    .AddMethod(
        MemberAttributes.Public | MemberAttributes.Static,
        (int x) => "Square",
        Emit.@return<int, int>(x => x * x)
    );
Console.WriteLine(c.GenerateCode(CodeDomGenerator.Language.JScript));
And here is the result:
package Example
{
    public class Container
    {
        public static function Square(x : int)
        {
            return (x * x);
        }
    }
}
The method signature reflects the more strongly-typed nature of JScript. It may be better to use Linq2CodeDom to generate C# and then pass that to Script# to convert it to JavaScript. I believe the first answer is the most correct, but as you can see by reviewing the Linq2CodeDom source, there is a lot of effort involved in handling every case to generate the code correctly.