How can I get online and offline users using SignalR? - javascript

I already have simple, working code that gets the number of connected users with SignalR; I'm new to using it.
All I need now is a list of online users and a list of offline users.
Here is my code.
OnlineCountHub.cs
public class OnlineCountHub : Hub
{
    private static int Count = 0;

    public override async Task OnConnectedAsync()
    {
        var u = Context.User.Identity;
        var result = u.IsAuthenticated ? u.Name : "";
        var claims = Context.User.Claims.ToList();
        foreach (var user in claims)
        {
            //await Groups.AddToGroupAsync(Context.ConnectionId, user.Value);
            //await Groups.AddToGroupAsync("ConectedUsers-ConnectionId", "ConnectedUsers-GroupName");
            //await Groups.AddToGroupAsync("UNConectedUsers-ConnectionId", "UNConnectedUsers-GroupName");
        }
        Count++;
        await base.OnConnectedAsync();
        await Clients.User("9b9ff00d-6de4-4487-9f00-99bd7f3aa345").SendAsync("updateCount", Count);
        //return Task.CompletedTask;
    }

    public override async Task OnDisconnectedAsync(Exception exception)
    {
        //var claims = Context.User.Claims.ToList();
        //foreach (var user in claims)
        //{
        //    await Groups.RemoveFromGroupAsync(Context.ConnectionId, user.Value);
        //}
        Count--;
        await base.OnDisconnectedAsync(exception);
        await Clients.User("9b9ff00d-6de4-4487-9f00-99bd7f3aa345").SendAsync("updateCount", Count);
        //return Task.CompletedTask;
    }
}
onlinecount.js
let onlineCount = document.querySelector('span.online-count');

let updateCountCallback = function (message) {
    if (!message) return;
    console.log('updateCount = ' + message);
    if (onlineCount) onlineCount.innerText = message;
};

function onConnectionError(error) {
    if (error && error.message) console.error(error.message);
}

let countConnection = new signalR.HubConnectionBuilder().withUrl('/onlinecount').build();
countConnection.on('updateCount', updateCountCallback);
countConnection.onclose(onConnectionError);
countConnection.start()
    .then(function () {
        console.log('OnlineCount Connected');
    })
    .catch(function (error) {
        console.error(error.message);
    });
Don't forget
Startup.cs [ConfigureServices]
if (EnableSignalR)
    services.AddSignalR();
Startup.cs [Configure]
app.UseEndpoints(endpoints =>
{
    endpoints.MapControllerRoute(
        name: "default",
        pattern: "{controller=Home}/{action=Index}/{id?}");
    if (EnableSignalR)
        endpoints.MapHub<OnlineCountHub>("/onlinecount");
});
layout or index
@if (Startup.EnableSignalR)
{
    <strong class="text-warning ps-3 mt-3 text-white">Online(<span class="online-count"></span>)</strong>
}
@if (Startup.EnableSignalR)
{
    <script src="~/SignalR/signalRlibs/signalr.js"></script>
    <script src="~/SignalR/onlinecount.js"></script>
}
Now, how do I pass a list of online users and a list of offline users to the view?
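One common approach (a sketch only, not code from the original post): keep a static, thread-safe map of connected user names inside the hub and broadcast the whole list on every connect and disconnect. The hub name, the OnlineUsers field, and the updateUsers event below are all hypothetical.

using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class OnlineUsersHub : Hub
{
    // Hypothetical: user name -> number of open connections
    // (one user may have several tabs or devices connected).
    private static readonly ConcurrentDictionary<string, int> OnlineUsers =
        new ConcurrentDictionary<string, int>();

    public override async Task OnConnectedAsync()
    {
        OnlineUsers.AddOrUpdate(UserName(), 1, (_, n) => n + 1);
        await Clients.All.SendAsync("updateUsers", OnlineUsers.Keys.ToList());
        await base.OnConnectedAsync();
    }

    public override async Task OnDisconnectedAsync(Exception exception)
    {
        // Drop the user only when their last connection closes.
        if (OnlineUsers.AddOrUpdate(UserName(), 0, (_, n) => n - 1) <= 0)
            OnlineUsers.TryRemove(UserName(), out _);
        await Clients.All.SendAsync("updateUsers", OnlineUsers.Keys.ToList());
        await base.OnDisconnectedAsync(exception);
    }

    private string UserName() =>
        Context.User?.Identity?.IsAuthenticated == true
            ? Context.User.Identity.Name
            : Context.ConnectionId; // anonymous users fall back to the connection id
}

On the client, subscribe the same way as updateCount, e.g. countConnection.on('updateUsers', function (users) { ... }), and render the list. The offline list is then everyone in your user store who is not currently connected, for example all ASP.NET Core Identity users minus OnlineUsers.Keys.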

Related

OpenSea: place bid using MetaMask

// Assumed imports (opensea-js and @0x/subproviders, based on the classes used below):
import { OpenSeaPort, Network } from "opensea-js";
import {
  MnemonicWalletSubprovider,
  RPCSubprovider,
  SignerSubprovider,
  Web3ProviderEngine,
} from "@0x/subproviders";

const NetworkToUse = process.env.REACT_APP_NETWORK;

const mnemonicWalletSubprovider = new MnemonicWalletSubprovider({
  mnemonic: process.env.REACT_APP_MNEMONIC,
});
const infuraRpcSubprovider = new RPCSubprovider({
  rpcUrl: `https://${NetworkToUse}.infura.io/v3/${process.env.REACT_APP_INFURA_KEY}`,
});

const providerEngine = new Web3ProviderEngine();
if (window.ethereum) {
  providerEngine.addProvider(new SignerSubprovider(window.ethereum));
}
// providerEngine.addProvider(mnemonicWalletSubprovider);
providerEngine.addProvider(infuraRpcSubprovider);
providerEngine.start();

const seaport = new OpenSeaPort(
  providerEngine,
  {
    networkName: NetworkToUse === "mainnet" ? Network.Main : Network.Rinkeby,
    apiKey: process.env.REACT_APP_API_KEY,
  },
  (arg) => {
    console.log("From OpenSeaPort CB:");
    console.log(arg);
  }
);

const placeBidMetaMask = async (order) => {
  setIsProcessing(true);
  if (typeof window.ethereum === "undefined") {
    setError("Please make sure you have MetaMask installed!");
    return;
  }
  if (!bidPrice || bidPrice < asset.price) {
    setError("Insufficient Funds!");
    return;
  }
  const { tokenId, tokenAddress } = order.asset;
  try {
    const [userAccount] = await window.ethereum.request({
      method: "eth_requestAccounts",
    });
    const offer = await seaport.createBuyOrder({
      asset: {
        tokenId,
        tokenAddress,
        schemaName: asset.details.assetContract.schemaName,
      },
      accountAddress: userAccount,
      startAmount: bidPrice,
    });
    console.log(offer);
    setMessage("Buy Order Created");
  } catch (err) {
    setError(err.message);
    console.log(err.message);
  } finally {
    setIsProcessing(false);
  }
};
I am using MetaMask as the wallet for bidding.
Hi, I am using the above code to place a bid on OpenSea. It works, but it uses my personal MNEMONIC.
In real use, though, I can't get a mnemonic from a user's MetaMask wallet.
Is there an alternate way to place the bid?
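One direction that may work (a sketch, building on the snippet above): MetaMask never exposes a user's mnemonic, but it can sign on their behalf, so the provider chain can be built without MnemonicWalletSubprovider at all. The SignerSubprovider(window.ethereum) already present handles signing, and the Infura subprovider handles reads:

// Sketch: no mnemonic anywhere - MetaMask signs in the user's browser.
const providerEngine = new Web3ProviderEngine();
if (window.ethereum) {
  // Signing requests are forwarded to MetaMask, which prompts the user.
  providerEngine.addProvider(new SignerSubprovider(window.ethereum));
}
// Read-only JSON-RPC calls still go through Infura.
providerEngine.addProvider(infuraRpcSubprovider);
providerEngine.start();

// seaport.createBuyOrder({ ..., accountAddress: userAccount }) should then
// open a MetaMask signature prompt instead of signing with a stored mnemonic.

With that, REACT_APP_MNEMONIC can be removed from the client bundle entirely.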

Module export function that posts to a RabbitMQ queue

I have a Node.js server and want to set up a connection and export a function that posts messages to the queue from a JS file.
const amqp = require("amqplib");
const url = process.env.RABBITMQ_SERVER;

let channel = null;
amqp.connect(url, (e, conn) =>
  conn.createChannel((e, ch) => {
    channel = ch;
  })
);

module.exports = publishToQueue = (
  data,
  queueName = process.env.RABBITMQ_QUEUE
) => channel.sendToQueue(queueName, new Buffer.from(data));

process.on("exit", code => {
  ch.close();
  console.log("closing rabbitmq channel");
});
But when I try to import and use it, I get an empty object {}:
hooks: {
  beforeCreate (instance) {
    console.log(amqp)
    amqp(JSON.stringify(instance.toJSON()))
  }
}
UPDATE:
Thanks to HosseinAbha's answer, I ended up creating a class that sets up the connection in its constructor:
const amqp = require("amqplib");
const url = process.env.RABBITMQ_SERVER;

class RABBITMQ {
  constructor () {
    this.connection = null
    this.channel = null
    this.connect()
  }

  async connect () {
    try {
      this.connection = await amqp.connect(url)
      this.channel = await this.connection.createChannel()
      await this.channel.assertQueue(process.env.RABBITMQ_QUEUE)
      // Assert the exchange before binding the queue to it.
      await this.channel.assertExchange(process.env.RABBITMQ_EXCHANGE, 'fanout', { durable: true })
      await this.channel.bindQueue(process.env.RABBITMQ_QUEUE, process.env.RABBITMQ_EXCHANGE, '')
    } catch (err) {
      console.log(err)
      throw new Error('Connection failed')
    }
  }

  async postData (data) {
    if (!this.connection) await this.connect()
    try {
      this.channel.publish(process.env.RABBITMQ_EXCHANGE, `${data.type}.${data.event_type}`, Buffer.from(JSON.stringify(data)))
    } catch (err) {
      console.error(err)
    }
  }
}

module.exports = new RABBITMQ()
Your publishToQueue function should return a promise, and it should connect to RabbitMQ before doing anything. Your function should look something like this:
const connectToChannel = async () => {
  try {
    let connection = await amqp.connect(url)
    return connection.createChannel()
  } catch (e) {
    console.error('failed to create amqp channel: ', e)
  }
}

let channel;
module.exports = publishToQueue = async (
  data,
  queueName = process.env.RABBITMQ_QUEUE
) => {
  if (channel == null) {
    channel = await connectToChannel();
  }
  return channel.sendToQueue(queueName, Buffer.from(data));
}
You also don't need to instantiate the Buffer with new; Buffer.from(data) is enough.
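For completeness, a hypothetical consumer of this module (the './queue' path is assumed):

const publishToQueue = require('./queue');

async function onUserCreated(user) {
  // Connects lazily on the first call, then reuses the cached channel.
  await publishToQueue(JSON.stringify(user));
}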

Getting empty response while doing async and await

I am new to Node.js and JavaScript, and I'm confused by my code, as it gives me an empty response. I am trying to use promises with async/await, but I'm getting response {}.
Could anybody help me understand where I am wrong?
The code below is long, but the issue is with await: my result comes back empty.
var response = {};
var newSecret = '';

class FabricClientRegister {
  constructor() {
    console.log("called constructer");
  }

  async RegisterUser(Username, roleid) {
    try {
      const setAsyncTimeout = (cb, timeout = 0) => new Promise(resolve => {
        setTimeout(() => {
          cb();
          resolve();
        }, timeout);
      });

      let query1 = {}
      query1.RoleID = roleid;
      // query1.name = '';
      var name = '';
      console.log('roleid', roleid)
      // console.log('Username', Username);
      var fs = require('fs');
      var obj = JSON.parse(fs.readFileSync('./config/Config.json', 'utf8'));
      // var Username = req.body.username;
      console.log('Username', Username)
      const walletPath = path.join(process.cwd(), 'wallet');
      const wallet = new FileSystemWallet(walletPath);
      console.log(`Wallet path: ${walletPath}`);

      // Check to see if we've already enrolled the user.
      const userExists = await wallet.exists(Username);
      if (userExists) {
        response.data = null;
        response.httpstatus = 400;
        response.message = `An identity for the ${Username} already exists in the wallet`;
        return response;
      }
      console.log("Username1", Username)

      // Check to see if we've already enrolled the admin user.
      const adminExists = await wallet.exists(appAdmin);
      if (!adminExists) {
        response.data = null;
        response.httpstatus = 400;
        response.message = "An admin identity is not registered. Please register admin first";
        return response;
      }

      // Create a new gateway for connecting to our peer node.
      const gateway = new Gateway();
      await gateway.connect(ccp, {
        wallet, identity: appAdmin, discovery: { enabled: false, asLocalhost: true }
        /*** Uncomment lines below to disable commit listener on submit ****/
        , eventHandlerOptions: {
          strategy: null
        }
      });

      // Get the CA client object from the gateway for interacting with the CA.
      const ca = gateway.getClient().getCertificateAuthority();
      const adminIdentity = gateway.getCurrentIdentity();
      console.log("Username4", Username)

      MongoClient.connect(config.Database.DFARM.connectString, async function (err, client) {
        if (err) {
          let connError = new Error(500, "Error connecting to DFARM database", err);
          res.status(connError.status).json(connError);
        } else {
          client.db(config.Database.DFARM.dbName).collection("Role").find(query1).toArray(function (err, docs) {
            if (err) {
              console.log('err db', err);
            } else {
              console.log('Role name DB', docs);
              name = docs[0].name;
              query1.name = name;
              console.log('Role', query1);
            }
            client.close();
          })
        }
      })

      setTimeout(() => console.log('Role 10', query1.name), 5 * 1000);

      const doStuffAsync = async () => {
        setAsyncTimeout(async () => {
          console.log('Role Name', query1.name);
          const secret = await ca.register({ enrollmentID: Username, role: query1.name }, adminIdentity);
          console.log('secret', secret);
          response.secret = secret;
          newSecret = secret;
          console.log('newSecret', newSecret);
          response.httpstatus = 200;
          response.message = `Successfully registered admin user ${Username} and imported it into the wallet`;
          return response;
        }, 10000);
      };
      doStuffAsync();
      // .then(function(result) {
      //   // console.log('promise result', result) // error here undefined
      // }).catch(err)
      // {
      //   console.log("eee", err)
      // };
      console.log('newSecret1', newSecret)
      console.log('respones', response)
      return newSecret;
    } catch (error) {
      response.error = error;
      response.httpstatus = 500;
      response.message = "Failed to enroll admin due to above error";
      return response;
    }
  }
};
Please see the CLI output below:
Username Abhinav345
called constructer
roleid 1
Username Abhinav345
Wallet path: /vagrant/Dfarm-app/dFarmUserService/dFarmUserService/wallet
Username1 Abhinav345
Username4 Abhinav345
newSecret1
respones {} //getting empty one however need some data
data result
Role name DB [ { _id: 5d029dec7e8409b489e04cff,
appName: 'Farmer',
RoleID: 1,
name: 'farmer',
routes: [ [Object], [Object], [Object], [Object] ],
tabs: [ [Object], [Object], [Object], [Object] ] } ]
Role { RoleID: 1, name: 'farmer' }
Role 10 farmer
Role Name farmer
secret XXCephExVetS
newSecret XXCephExVetS
Maybe you should consider putting await before doStuffAsync().
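Beyond that, the timers can be removed entirely. A sketch of the same flow (assuming the same mongodb and fabric clients already used above): wrap the Mongo lookup in a promise, then await each step in order so RegisterUser only returns once the secret exists.

// Sketch: promisify the role lookup instead of racing it with setTimeout.
const getRoleName = (roleId) =>
  new Promise((resolve, reject) => {
    MongoClient.connect(config.Database.DFARM.connectString, (err, client) => {
      if (err) return reject(err);
      client.db(config.Database.DFARM.dbName)
        .collection("Role")
        .find({ RoleID: roleId })
        .toArray((err, docs) => {
          client.close();
          if (err) return reject(err);
          resolve(docs[0].name);
        });
    });
  });

// Inside RegisterUser, after the gateway is connected:
// const roleName = await getRoleName(roleid);
// const secret = await ca.register({ enrollmentID: Username, role: roleName }, adminIdentity);
// response.secret = secret;
// response.httpstatus = 200;
// return response; // now populated before the function returns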

Close Event Triggers Before Data Events on File Stream

I've got a script that adds JSON data from a file to a DynamoDB table. The script uses the "fs" module to open a read stream to the JSON file and retrieve the data line by line. As the data is returned, it's inserted into a DynamoDB table. When the operation ends, an execution summary is given with the number of records processed, successfully inserted, and unsuccessfully inserted. The problem is that the summary executes before the file has been completely processed, so the numbers are wrong.
The script...
ddb_table_has_records(table_name, (err, dat) => {
  if (dat.Count === 0 || force) {
    const transformStream = JSONStream.parse("*");
    const inputStream = fs.createReadStream(import_file);
    let record_position = 0;
    let count_imported_successful = 0;
    let count_imported_fail = 0;
    inputStream.pipe(transformStream).on("data", (Item) => {
      const params = {
        TableName: table_name,
        Item
      }
      ddb_client.put(params, (err, data) => {
        ++record_position;
        if (err) {
          console.error("Unable to add mapping for record " + record_position + ", error = " + err);
          ++count_imported_fail;
        } else {
          console.log("PutItem succeeded " + record_position);
          ++count_imported_successful;
        }
      });
    }).on("close", () => {
      console.log("=".repeat(70));
      console.log(`'Completed: ${import_file}' has been loaded into '${table_name}'.`);
      console.log(`  Record Count: ${record_position}`);
      console.log(`  Imported Record Count: ${count_imported_successful}`);
      console.log(`  Rejected Record Count: ${count_imported_fail}`);
    });
  } else {
    console.log("=".repeat(70));
    console.log(`Completed: Skipping import of '${import_file}' into '${table_name}'.`);
  };
});
When this runs, it looks like the following
PS C:\> node --max-old-space-size=8192 .\try.js 'foo' 'us-west-2' 'development' '.\data.json' true
Target Profile: development
Target Region: us-west-2
Target Table: foo
Source File: .\data.json
Force Import: true
Confirming Table's State...
======================================================================
'Completed: .\data.json' has been loaded into 'foo'.
Record Count: 0
Imported Record Count: 0
Rejected Record Count: 0
PutItem succeeded 1
PutItem succeeded 2
PutItem succeeded 3
PutItem succeeded 4
...
The portion of the code that gets the record counts runs before the inserts complete, so the imported and rejected counts are always wrong. It looks like the file stream closes while inserts are still occurring. I've tried changing from the "close" to the "end" event, with the same result.
Test this script with the following call...
node --max-old-space-size=8192 .\data.load.js 'foo' 'us-west-1' 'dev' '.\foo.default.json' true
Here is the content for the script I ultimately used...
'use strict'

if (process.argv.length < 6) {
  throw new Error('Please pass the table-name, aws-Region, aws-Profile, and file-path to the script.');
}

let [, , TableName, Region, Profile, ImportFile, Force] = process.argv;

process.env.AWS_SDK_LOAD_CONFIG = true;
process.env.AWS_PROFILE = Profile;
Force = typeof (Force) !== 'undefined' ? Force : false;

const AWS = require('aws-sdk');
const fs = require('fs');
const JSONStream = require('JSONStream');

AWS.config.update({ region: Region });
const ddbc = new AWS.DynamoDB.DocumentClient();

console.log('Target Profile: ', Profile);
console.log('Target Region: ', Region);
console.log('Target Table: ', TableName);
console.log('Source File: ', ImportFile);
console.log('Force Import: ', Force);

// Returns the number of records in a specified table
const ddb_table_has_items = (TableName) => {
  return new Promise((resolve, reject) => {
    const ddb_query_parameters = { TableName, Select: 'COUNT' }
    ddbc.scan(ddb_query_parameters, (error, data) => {
      (error) ? reject(error) : resolve(data);
    });
  });
};

const ddb_table_upsert_items = (TableName, Item) => {
  return new Promise((resolve, reject) => {
    const ddb_insert_payload = { TableName, Item };
    ddbc.put(ddb_insert_payload, (error, data) => {
      (error) ? reject(error) : resolve(data);
    });
  });
};

const ddb_bulk_load = (TableName, ImportFile) => {
  return new Promise((resolve, reject) => {
    let count_succeeded = 0;
    let count_failed = 0;
    let count_attempted = 0;
    let inserts = [];
    const json_stream = JSONStream.parse("*");
    const source_data_stream = fs.createReadStream(ImportFile);
    const ddb_source_item = source_data_stream.pipe(json_stream);
    ddb_source_item.on("data", (source_data_item) => {
      count_attempted++;
      let ddb_insert = ddb_table_upsert_items(TableName, source_data_item)
        .then((data) => count_succeeded++)
        .catch((error) => count_failed++);
      inserts.push(ddb_insert);
    });
    ddb_source_item.on("end", () => {
      Promise.all(inserts)
        .then(() => {
          resolve({ count_succeeded, count_failed, count_attempted });
        })
        .catch((error) => {
          console.log(error);
          reject(error);
        });
    });
    ddb_source_item.on("error", (error) => {
      reject(error);
    });
  });
};

(async () => {
  try {
    let proceed_with_import = false;
    if (Force.toString().toLowerCase() === 'true') {
      proceed_with_import = true;
    } else {
      const table_scan = await ddb_table_has_items(TableName);
      proceed_with_import = (table_scan.Count === 0);
    }
    if (proceed_with_import) {
      let ddb_inserts = await ddb_bulk_load(TableName, ImportFile);
      console.log("=".repeat(75));
      console.log("Completed: '%s' has been loaded into '%s'.", ImportFile, TableName);
      console.log("  Insert Attempted: %s", ddb_inserts.count_attempted);
      console.log("  Insert Succeeded: %s", ddb_inserts.count_succeeded);
      console.log("  Insert Failed   : %s", ddb_inserts.count_failed);
    }
  } catch (error) {
    console.log(error);
  }
})();
Wrapping each insert in a promise, pushing the insert promises into an array, and using Promise.all on that array did the trick. I execute the Promise.all once we're finished reading from the file, when the "end" event is emitted on the ddb_source_item stream.
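The same pattern in miniature (hypothetical insertAsync and stream names, with per-insert handlers attached up front so Promise.all never rejects early):

let succeeded = 0, failed = 0;
const pending = [];

stream.on("data", (item) => {
  // Each insert settles on its own; record the outcome immediately.
  pending.push(insertAsync(item).then(() => succeeded++, () => failed++));
});

stream.on("end", async () => {
  await Promise.all(pending); // resolves only after every insert has settled
  console.log(`succeeded: ${succeeded}, failed: ${failed}`); // counts now accurate
});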

How do I make this angular2 service somewhat synchronous?

I have an Angular 2 service and I want it to do the following:
1. FTP to a remote server
2. Find a file and read some lines from it
3. Build a 'results' JSON object and return it to the calling component
So - actually I have steps 1 and 2 working - but of course it's all async. What is happening is that in my component I make this call to the service, where this.ftp is the instance of my service:
this.servers = this.ftp.lookForServers();
Now this correctly calls the lookForServers method of my FTP service, which looks like this:
lookForServers() {
  var servers = [];
  var whereAreWe = 0;
  var possibles = ["/path/to/servers/"];
  for (var i = 0; i < possibles.length; i++) {
    whereAreWe = i;
    this.c.list(possibles[i], false, (err, list) => {
      for (var p = 0; p < list.length; p++) {
        console.log(list[p]);
        var server_version = this.grabLog(possibles[whereAreWe] + list[p].name);
        servers.push({
          name: list[p].name,
          path: possibles[whereAreWe] + list[p].name,
          version: server_version
        });
      }
    });
  }
  return servers;
}
Now, the this.grabLog(possibles[whereAreWe]+list[p].name) call ends up making further calls on this.c, the FTP client, which of course is async, so this method returns almost immediately while the callbacks continue to run. Those callbacks download a file, and then another callback processes that file line by line, asynchronously picking out the various details I want to store.
By the end of this chain I have all my details in the final lineReader.on('close', () => { ... }) handler - but of course my this.ftp.lookForServers() call has long since returned, and the component is none the wiser.
So how can I let this work happen asynchronously and still pass my results JSON object back to the component once the work is complete? This is probably quite a simple question about how to make a service call a component callback.
You don't need it to run synchronously. You should make lookForServers (and the other functions it uses) return observables, then subscribe to the result like this:
this.ftp.lookForServers().subscribe((data) => { this.servers = data });
Here are the implementations:
const Client = require('ftp');
const fs = require('fs');
const readline = require('readline');

import { NextObserver } from 'rxjs/Observer';
import { Observable } from 'rxjs/Rx';

interface server {
  name: string;
  path: string;
  version: string;
  java_version: string;
}

export class FTPClient {
  username: string;
  password: string;
  host: string;
  port: number;
  c: any;

  constructor() {
  }

  init(username, password, host, port) {
    console.log("initiating FTP connection to:" + host + "on port:" + port);
    this.username = username;
    this.password = password;
    this.host = host;
    this.port = port;
    this.c = new Client();
    console.log("Client created");
  }

  connect() {
    console.log("About to start connection");
    this.c.on('ready', () => {
      this.c.list((err: any, list: any) => {
        if (err) throw err;
        console.dir(list);
        this.c.end();
      });
    });
    // connect to localhost:21 as anonymous
    var connectProps = {
      host: this.host,
      port: this.port,
      user: this.username,
      password: this.password
    };
    console.log("Connecting now...");
    this.c.connect(connectProps);
  }

  public lookForServers(name: string): Observable<any[]> {
    return Observable.create((observer: NextObserver<any[]>) => {
      let servers = [];
      let whereAreWe = 0;
      let possibles = ["/path/to/servers/"];
      for (var i = 0; i < possibles.length; i++) {
        whereAreWe = i;
        this.c.list(possibles[i], false, (err: any, list: any) => {
          for (var p = 0; p < list.length; p++) {
            this.grabMessagesLog(possibles[whereAreWe] + list[p].name)
              .subscribe((data: any) => {
                let server_version = data;
                servers.push({
                  name: list[p].name,
                  path: possibles[whereAreWe] + list[p].name,
                  version: server_version
                });
                observer.next(servers);
                observer.complete();
              });
          }
        });
      }
    });
  }

  grabMessagesLog(path): Observable<any> {
    return Observable.create((observer: NextObserver<any>) => {
      let result = '';
      let unix = Math.round(+new Date() / 1000);
      this.c.binary(function (err) {
        console.log(err);
      });
      this.c.get(path + "/logs/messages.log", (err, stream) => {
        if (err) throw err;
        // Write the download to disk first; parse it once the write finishes.
        const out = fs.createWriteStream(unix + "_messages.log");
        out.on('finish', () => {
          this.c.end();
          this.getServerMetadataFromMessagesLog(unix + "_messages.log")
            .subscribe((data) => {
              observer.next(data);
              observer.complete();
            });
        });
        stream.pipe(out);
      });
    });
  }

  getServerMetadataFromMessagesLog(path): Observable<any> {
    return Observable.create((observer: NextObserver<any>) => {
      let lineReader = readline.createInterface({
        input: fs.createReadStream(path)
      });
      let server_version = "";
      let java_version = "";
      let line_no = 0;
      // Arrow function so the outer server_version/java_version stay in scope.
      lineReader.on('line', (line) => {
        line_no++;
        console.log("line number is:" + line_no);
        if (line.includes("STUFF") && line.includes("FLAG2") && line_no == 2) {
          var first = line.split("FLAG2")[1];
          var last = first.split(" (")[0];
          var version = "FLAG2" + last;
          server_version = version;
          console.log("version is:" + version);
        }
        if (line.includes("java.version =")) {
          var javav = line.split("java.version =")[1];
          java_version = javav;
          lineReader.close();
        }
        console.log('Line from file:', line);
      });
      lineReader.on('close', () => {
        var res = {
          version: server_version,
          java_version: java_version
        };
        alert("RES IS:" + JSON.stringify(res));
        observer.next(res);
        observer.complete();
      });
    });
  }
}
Try using a recursive function with the $timeout function of Angular
function recursiveWait(server_version) {
    if (server_version != null) {
        return;
    }
    $timeout(function () { recursiveWait(server_version); }, 500);
}
And place it here:
console.log(list[p]);
var server_version = this.grabLog(possibles[whereAreWe] + list[p].name);
recursiveWait(server_version);
servers.push({
    name: list[p].name,
This checks whether the variable is != null: if it is still null, the function calls itself again after 500 ms; if not, it returns and exits, letting the code continue.
