I'm using the @azure/service-bus JavaScript library to publish and subscribe to messages on an Azure Service Bus topic from Azure Functions.
To receive messages, I'm using an Azure Service Bus topic trigger function created from the template without any changes.
When I publish a message using sender.send(message), I receive it fine.
import { AzureFunction, Context, HttpRequest } from "@azure/functions"
import * as sb from "@azure/service-bus"

const PublishToServiceBus: AzureFunction = async function (context: Context, req: HttpRequest): Promise<void> {
    const eventDoc = req.body;
    const serviceBusConnectionString = process.env["ServiceBusConnection"];
    const topicName = process.env["TopicName"];
    const sbClient = sb.ServiceBusClient.createFromConnectionString(serviceBusConnectionString);
    const topicClient = sbClient.createTopicClient(topicName);
    const sender = topicClient.createSender();
    const message: sb.SendableMessageInfo = { body: eventDoc };

    // this works
    await sender.send(message);

    // this creates a message without a body?
    const scheduledEnqueueTimeUtc = new Date(Date.now() + 10000);
    await sender.scheduleMessages(scheduledEnqueueTimeUtc, [message]);
};

export default PublishToServiceBus;
But when I schedule message with sender.scheduleMessages(), my incoming binding variable is undefined in Azure Service Bus Topic trigger function.
import { AzureFunction, Context } from "@azure/functions"

const serviceBusTopicTrigger: AzureFunction = async function (context: Context, mySbMsg: any): Promise<void> {
    context.log('ServiceBus topic trigger function processed message', mySbMsg);
};

export default serviceBusTopicTrigger;
Output: ServiceBus topic trigger function processed message undefined
Is this a problem with the library or I'm doing something wrong?
The issue is caused by a bug in the @azure/service-bus SDK.
Workaround
Import DefaultDataTransformer from the "@azure/amqp-common" library.
In TypeScript: import { DefaultDataTransformer } from "@azure/amqp-common";
In JavaScript: const { DefaultDataTransformer } = require("@azure/amqp-common");
Then update the message body before calling the scheduleMessages() method, as follows.
Instantiate the data transformer used by the SDK:
const dt = new DefaultDataTransformer();
When you need to schedule the message, encode the message body before sending:
message.body = dt.encode(message.body);
More reference and investigation around this bug can be found at Azure/azure-sdk-for-js#6816.
In the new world (Azure Functions on .NET 5, isolated process), you can no longer use brokered messages; the new libraries do not cater for them.
The function declaration is no longer [FunctionName(...)] but [Function(...)].
You can no longer receive a Message or byte[], only a string.
Example:
[Function("TestFA")]
public async Task Run([ServiceBusTrigger(topicName, subscriberName, Connection = ???)] string messageText, string id, FunctionContext executionContext)
The magic is now in the FunctionContext executionContext parameter; you can get properties from it, e.g.:
KeyValuePair<string, object> props = executionContext.BindingContext.BindingData.Where(x => x.Key == "UserProperties").FirstOrDefault();
I'm calling a smart contract method to transfer ether, using Alchemy. When I click the button, it calls the payDrink method and then shows this error:
[Error: Returned error: Unsupported method: eth_sendTransaction. Alchemy does not hold users' private keys. See available methods at https://docs.alchemy.com/alchemy/documentation/apis]
import Web3 from 'web3';

const web3 = new Web3("https://eth-goerli.g.alchemy.com/v2/YPhlCYJ_fLdms1LpSRNs1n6rfcIqGHT9");

const payDrink = async () => {
    console.log("ethAmount");
    try {
        const contract = new web3.eth.Contract(ContractAbi, contractAddress);
        // in web3.js, send() resolves to the transaction receipt directly
        const transactionReceipt = await contract.methods.transfer().send({
            from: '0x9126de09872d12c4f6d417e2cb6061d1ad9e4708',
            value: web3.utils.toWei("0.0001", 'ether'),
        });
        console.log(transactionReceipt);
    } catch (err) {
        console.log("eee", err);
    }
};
There are many different ways to construct a Web3 object.
When using a network RPC URL, you have to create an HTTP provider first, then pass it into the Web3 constructor:
const provider = new Web3.providers.HttpProvider('YOUR_NETWORK_RPC_URL');
const web3 = new Web3(provider);
Now this should work.
I'm making a dApp and I want to add a button where a user (the one with their wallet connected) can send exactly 0.01 SOL to another user. I already wrote the function in my Rust program and after testing it with anchor test it seems to be working when I use my own personal wallet's Keypair to sign the transaction. However, now I am writing the event handler function in my web app's frontend and I'm not sure what to pass for the signers parameter if I want the user to sign the transaction. What do I pass if I don't know their secret key? Is there a way that I can generate a user's Keypair from their public key alone or would I need to use the Solana Wallet Adapter for this? Any help would be appreciated. This is my first time working with Solana!
This is the function:
const tipSol = async (receiverAddress) => {
    try {
        const provider = getProvider();
        const program = new Program(idl, programID, provider);
        const lamportsToSend = LAMPORTS_PER_SOL / 100;
        const amount = new anchor.BN(lamportsToSend);

        await program.rpc.sendSol(amount, {
            accounts: {
                from: walletAddress,
                to: receiverAddress,
                systemProgram: SystemProgram.programId,
            },
            signers: ?
        })

        console.log('Successfully sent 0.01 SOL!')
        window.alert(`You successfully tipped ${receiverAddress} 0.01 SOL!`)
    } catch (error) {
        console.error('Failed to send SOL:', error);
        window.alert('Failed to send SOL:', error);
    }
}
Frontends never access private keys. Instead the flow is something like:
Frontend creates the transaction
Frontend sends the transaction to the wallet
Wallet signs the transaction
Wallet returns the signed transaction to the frontend
Frontend sends the transaction
You can use @solana/wallet-adapter to implement this on your frontend: https://github.com/solana-labs/wallet-adapter
In practice it would be something like this in your frontend
export const Component = () => {
    const { connection } = useConnection();
    const { sendTransaction } = useWallet();

    const handle = async () => {
        const ix: TransactionInstruction = await tipSol(receiverKey);
        const tx = new Transaction().add(ix);
        const sig = await sendTransaction(tx, connection);
    };

    // ...
};
I'm using SvelteKit and trying to understand all the new features added after retiring Sapper. One of those new features is hooks.js, which runs on the server and is not accessible to the frontend. That makes dealing with the database safe. So I created a connection to my MongoDB to retrieve the user's data before using the db results in my getSession function. It works, but I noticed that it accesses my database TWICE. Here is my hooks.js code:
import * as cookie from 'cookie';
import { connectToDatabase } from '$lib/mongodb.js';

export const handle = async ({ event, resolve }) => {
    const dbConnection = await connectToDatabase();
    const db = dbConnection.db;
    const userinfo = await db.collection('users').findOne({ username: "a" });
    console.log("db user is :", userinfo) // username : John

    const response = await resolve(event)
    response.headers.set(
        'set-cookie', cookie.serialize("cookiewithjwt", "sticksafterrefresh")
    )
    return response
}

export const getSession = (event) => {
    return {
        user: {
            name: "whatever"
        }
    }
}
The console.log you see here prints the user data twice: once as soon as I fire up my app at localhost:3000 with npm run dev,
db user is : John
and then less than a second later, without clicking on anything, a second console.log prints the same information:
db user is : John
So my understanding from the SvelteKit docs is that hooks.js runs every time SvelteKit receives a request. I removed all prerender and prefetch from my code. I made sure I only have index.svelte in my app, but it still prints twice. The connection code I copied from an online post has the following comment:
/**
* Global is used here to maintain a cached connection across hot reloads
* in development. This prevents connections growing exponentially
* during API Route usage.
*/
Here is my connection code:
import { MongoClient } from 'mongodb';

const mongoURI = "mongodb+srv://xxx:xxx@cluster0.qjeag.mongodb.net/xxxxdb?retryWrites=true&w=majority";
const mongoDB = "xxxxdb"

export const MONGODB_URI = mongoURI;
export const MONGODB_DB = mongoDB;

if (!MONGODB_URI) {
    throw new Error('Please define the mongoURI property inside config/default.json');
}
if (!MONGODB_DB) {
    throw new Error('Please define the mongoDB property inside config/default.json');
}

/**
 * Global is used here to maintain a cached connection across hot reloads
 * in development. This prevents connections growing exponentially
 * during API Route usage.
 */
let cached = global.mongo;
if (!cached) {
    cached = global.mongo = { conn: null, promise: null };
}

export const connectToDatabase = async () => {
    if (cached.conn) {
        return cached.conn;
    }
    if (!cached.promise) {
        const opts = {
            useNewUrlParser: true,
            useUnifiedTopology: true
        };
        cached.promise = MongoClient.connect(MONGODB_URI, opts).then((client) => {
            return {
                client,
                db: client.db(MONGODB_DB)
            };
        });
    }
    cached.conn = await cached.promise;
    return cached.conn;
}
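The caching pattern described in that comment can be demonstrated without Mongo at all. In this sketch, connect() is a hypothetical stand-in for MongoClient.connect; the point is that repeated and even concurrent callers all share the single cached promise, so the underlying connection is created only once:

```javascript
// Hypothetical slow resource; stands in for MongoClient.connect.
let connectCalls = 0;
const connect = async () => {
    connectCalls += 1;
    return { id: connectCalls };
};

const cached = { conn: null, promise: null };

const connectToResource = async () => {
    if (cached.conn) return cached.conn;             // reuse the live connection
    if (!cached.promise) cached.promise = connect(); // start connecting only once
    cached.conn = await cached.promise;
    return cached.conn;
};
```

Note that the in-flight promise (not just the resolved connection) is cached; that is what prevents two overlapping requests from each opening their own connection.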
So my question is: does hooks.js always run twice, once on the server and once on the front end? If not, why is hooks.js running and printing the db results twice in my case?
I am trying to make a simple nextjs API route (https://nextjs.org/docs/api-routes/introduction) that is connected to the Ethereum blockchain to perform a view function (requires no gas) from a smart contract.
I have a system where you can buy the rights to mint an NFT (ERC721), and this function checks if the user has paid for any collectionIds that are not yet minted.
import Web3 from 'web3'
import { getPaidForCollectionsIds } from '../../database'

const mnemonic2 = 'main check ...'
var HDWalletProvider = require('truffle-hdwallet-provider')

export default async function (req, res) {
    const paidFor = await getPaidForCollectionsIds(req.body.userId)
    if (paidFor.length < 1) return res.json({ data: [] })

    const provider = new HDWalletProvider(mnemonic2, 'https://rinkeby.infura.io/v3/INFURAAPIKEY', 0)
    const web3 = new Web3(provider)

    const TheContractAddress = '0xfbeF...'
    const { abi } = require('../../abis/TheContract.json')
    const KS = new web3.eth.Contract(abi, TheContractAddress, {
        from: '0x5EE...', // default from address
        gasPrice: '20000000000' // default gas price in wei, 20 gwei in this case
    })

    const unminted = []
    await Promise.all(paidFor.data.map(pf =>
        KS.methods.readCollectionIdIsUsed(pf.collectionId).call().then(d => {
            console.log(d, 'readCollectionIdIsUsed', pf.collectionId)
        }).catch(e => {
            // the contract reverts for unused ids, so a rejection means unminted
            unminted.push(pf.collectionId)
        })
    ))

    res.statusCode = 200
    res.json({ data: unminted })
}
here is the code from the readCollectionIdIsUsed method in the smart contract:
mapping (uint256 => bool) collectionIdIsUsed;

function readCollectionIdIsUsed(uint256 collectionId) external view returns (bool res) {
    require(collectionIdIsUsed[collectionId], 'This signature has not been used');
    res = collectionIdIsUsed[collectionId];
}
This all works fine, except that after a while I reach the 100,000 request limit of Infura.
(screenshot: Infura dashboard of top methods)
I don't know why it is calling eth_getBlockByNumber 10 times for each call; is this necessary, or is there a way around it?
Web3.js should not do this for calls, but it should for sends.
This is because when you await any web3.js contract method, it has an internal, somewhat unexpected, implied "wait X number of blocks before the tx is confirmed" mechanism, and somehow this gets triggered, although your example code does not seem to have any contract writes. It is documented here.
Because you are using an HTTPS connection instead of a WebSocket connection, web3.js needs to poll for new block numbers to count confirmations. If you switch to the WebSocket Infura provider, these calls should disappear, as web3.js can simply subscribe to new-block events over the WebSocket.
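As a concrete sketch of the switch (toInfuraWss is a hypothetical helper; the endpoint shape follows Infura's v3 URLs, and INFURAAPIKEY stands in for a real project ID), the HTTPS endpoint from the question maps to its WebSocket counterpart like this, and the resulting URL would be handed to new Web3.providers.WebsocketProvider(...) instead of HDWalletProvider's HTTPS URL:

```javascript
// Rewrite an Infura HTTPS endpoint as the matching WebSocket endpoint,
// e.g. https://rinkeby.infura.io/v3/INFURAAPIKEY
//   -> wss://rinkeby.infura.io/ws/v3/INFURAAPIKEY
const toInfuraWss = (httpsUrl) =>
    httpsUrl.replace(/^https:\/\//, 'wss://').replace('/v3/', '/ws/v3/');

console.log(toInfuraWss('https://rinkeby.infura.io/v3/INFURAAPIKEY'));
```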
I am trying to implement single-index searching using Algolia for my iOS mobile app. I have about 110 users on my application. However, when I upload their data to Algolia's index, the function times out before uploading all users. Instead it throws an error message in the HTTP browser and declares a timeout in the Firestore console.
Firestore console:
sendCollectionToAlgolia
Function execution took 60044 ms, finished with status: 'timeout'
I created the function using this tutorial:
https://medium.com/@soares.rfarias/how-to-set-up-firestore-and-algolia-319fcf2c0d37
Although I have run into some complications, I highly recommend that tutorial if your app uses SwiftUI on iOS and you implement Cloud Functions using TypeScript.
Here's my function:
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
import algoliasearch from 'algoliasearch';

admin.initializeApp();
const db = admin.firestore();

const algoliaClient = algoliasearch(functions.config().algolia.appid, functions.config().algolia.apikey)
const collectionIndexName = functions.config().projectId === 'PROJECT-XXXX' ? 'prod_SEARCH' : 'dev_SEARCH';
const collectionIndex = algoliaClient.initIndex(collectionIndexName);

// rename to uploadUsersToAlgolia
export const sendCollectionToAlgolia = functions.https.onRequest(async (req, res) => {
    const algoliaRecords: any[] = [];
    const querySnapshot = await db.collection('users').get();

    querySnapshot.docs.forEach(doc => {
        const document = doc.data();
        const record = {
            objectID: doc.id,
            fullname: document.fullname,
            bio: document.bio,
            username: document.username,
            uid: document.uid,
            profileImageURL: document.profileImageURL,
            backgroundImageURL: document.backgroundImageURL,
            fcmToken: document.fcmToken,
            accountCreated: document.accountCreated,
            inspirationCount: document.inspriationCount,
            BucketListCount: document.BucketListCount,
            CompletedBucketListCount: document.CompletedBucketListCount,
            FriendsCount: document.FriendsCount
        };
        algoliaRecords.push(record);
    });

    // After all records are created, we save them to the index
    collectionIndex.saveObjects(algoliaRecords, (_error: any, content: any) => {
        res.status(200).send("users collection was indexed to Algolia successfully.");
    });
});
If you just want to change the default 1-minute timeout, you can do that when you configure the function:
functions.runWith({ timeoutSeconds: X }).https.onRequest(async (req, res) => { ... })
Increasing the timeout won't help if your function doesn't end up sending a response, so you should also add some logging/debugging to figure out whether the final call to res.send() is actually happening. If the function never sends a response, it will definitely time out no matter what happens.
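The thing to watch for is the fire-and-forget callback in the question's code: if the handler returns before the saveObjects callback runs, no response may ever be sent. A dependency-free sketch of the safer shape (the saveObjects stub below is a stand-in for Algolia's saveObjects, which recent clients also expose in a promise form):

```javascript
// Stand-in for collectionIndex.saveObjects: resolves asynchronously.
const saveObjects = (records) =>
    new Promise((resolve) => setImmediate(() => resolve(records.length)));

// Awaiting the save before responding ensures the handler cannot finish
// before both the indexing work and the response have actually happened.
const handler = async (records) => {
    const saved = await saveObjects(records);
    return `indexed ${saved} records`;
};
```

In the real function, the returned string would instead be the res.status(200).send(...) call, placed after the awaited save.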