js-IPFS Error: cid.toBaseEncodedString() is not a function

I'm currently working on the backend of a website that would work similarly to YouTube but use IPFS exclusively for storage, meaning that to "upload" a video to the site, it would already have to be on the IPFS network. The actual function is more of an index than anything else, but it was something that I wanted to tackle.
The section I'm working on is intended to verify the integrity of the CID hashes by making sure that there are still providers on the network for that specific content. If there aren't any, then the CID and any information associated with it get removed from my database, but I'm currently running into an issue when trying to use the ipfs.dht.findProvs function.
Here is part of my code:
const ipfs = await IPFS.create({
  libp2p: { config: { dht: { enabled: true } } },
});

for (const i of integrityData) {
  const cid = new CID(i.CID);
  console.log(cid);
  const providers = ipfs.dht.findProvs(cid, { numProviders: 2 });
  for await (const provider of providers) {
    console.log(provider);
  }
}
Error Log:
C:\Users\...\node_modules\libp2p-kad-dht\src\providers.js:202
this._log('getProviders %s', cid.toBaseEncodedString())
^
TypeError: cid.toBaseEncodedString is not a function
To further explain my code: the outer for loop iterates over the JSON content received from the database after querying it for all the CIDs it holds. i.CID does return the correct CID as a string, from which I create a CID object and pass it to the function here: ipfs.dht.findProvs(cid, { numProviders: 2 }). The nested for loop is there to iterate through the object that is received, but I haven't made it to that stage, as I keep getting the same error.
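A hedged note on the likely cause: toBaseEncodedString() is a method of the legacy cids class, so this TypeError usually means the DHT was handed a CID built by a different, incompatible CID implementation than the one js-ipfs bundles (for example a second copy of cids at another version, or the newer multiformats CID class). A quick sanity check, as a sketch:

const CID = require('cids'); // the legacy CID class that defines toBaseEncodedString()

const cid = new CID('QmYourHashHere'); // placeholder: substitute a real CID string from the database
console.log(typeof cid.toBaseEncodedString); // 'function' on the legacy class
console.log(CID.isCID(cid));                 // version-tolerant check exported by the cids package

If the check passes here but the DHT still throws, npm ls cids will usually reveal duplicate versions in the dependency tree; running npm dedupe, or pinning cids to the version js-ipfs itself depends on, is one way to reconcile them.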

Cannot map variable from data stream to users identified response while developing voice app

I am currently developing a voice app with Google Actions where users are able to ask for information about items in a list that is provided through a file stream with Axios, as shown in the following LINK. The data looks like this:
[
  {
    "Name": "Beam",
    "Level": "2",
    "Zone": "A",
    "Location": "Beam is located on Level 2 in zone A",
    "Responsible": "Contractor"
  },
  {
    "Name": "Column",
    "Level": "3",
    "Zone": "A",
    "Location": "Column is located on Level 3 in zone A",
    "Responsible": "Kiewit"
  },
  {
    "Name": "Window",
    "Level": "2",
    "Zone": "B",
    "Location": "Window is located on Level 2 in zone B",
    "Responsible": "Tech"
  }
]
Here, it shows three items: a Beam, a Column, and a Window. The objective is that users ask about one of the items and the voice app provides the other information, such as Level, Zone, Location, or Responsible, to the user.
To accomplish this, I am using the web interface of Google Actions with inline cloud functions as webhooks, which look like this:
const { conversation } = require('@assistant/conversation');
const functions = require('firebase-functions');
require('firebase-functions/lib/logger/compat'); // console.log compat
const axios = require('axios');

const app = conversation({ debug: true });

app.handle('getItem', async conv => {
  const data = await getItem();
  const itemParam = conv.intent.params.Item.resolved;
  // console.log(itemParam);
  // conv.add(`This test to see if we are accessing the webhook for ${itemParam}`);
  data.map(item => {
    if (item.Name === itemParam);
    conv.add(`These are the details for ${itemParam}. It is located in zone ${item.Zone}, at level ${item.Level}`);
    // conv.add(`This test to see if we are accessing the webhook for ${item.Name}`);
  });
});

async function getItem() {
  const res = await axios.get('https://sheetdb.io/api/v1/n3ol4hwmfsmqd');
  console.log(res.data);
  return res.data; // To use in your Action's response
}

exports.ActionsOnGoogleFulfillment = functions.https.onRequest(app);
When I check the console logs, I can see that I am retrieving the data in a single event, as provided in the LINK. Also, recognition of the item name is working in the app, by defining a type within the app to be recognized based on type categories. This information is being stored in itemParam.
However, the main issue I have right now is linking both things together. I was trying to use a map function to match itemParam against Item.Name from the data stream, but this is not working at all. The function I was trying is:
data.map(item => {
  if (item.Name === itemParam);
  conv.add(`These are the details for ${itemParam}. It is located in zone ${item.Zone}, at level ${item.Level}`);
What I am trying to do here is: when the function detects that the user's itemParam matches an item in the stream, use the information from that stream item and add a phrase to the conversation that includes itemParam and the other information about that same item.
Besides, the way this function is written right now, it also throws this error:
cf-GPfYHj4HKDWGvHKWArq34w-name
Error: Error adding simple response: Two simple responses already defined
at addSimple (/workspace/node_modules/@assistant/conversation/dist/conversation/prompt/prompt.js:34:15)
at Prompt.add (/workspace/node_modules/@assistant/conversation/dist/conversation/prompt/prompt.js:108:17)
at ConversationV3.add (/workspace/node_modules/@assistant/conversation/dist/conversation/conv.js:102:21)
at data.map.item (/workspace/index.js:16:13)
at Array.map (<anonymous>)
at app.handle (/workspace/index.js:14:8)
at process._tickCallback (internal/process/next_tick.js:68:7)
I am honestly not that familiar with JavaScript and I might be making silly mistakes, but I really cannot figure this out.
Any help will be much appreciated. Thank you
The error you are seeing is:
Error: Error adding simple response: Two simple responses already defined
Your action's response can only include two simple responses. Each response is rendered as a separate text bubble on a phone, for instance.
So it seems like the item.Name === itemParam is true multiple times and you end up creating too many responses.
Why does this happen? It comes from how your conditional is written:
data.map(item => {
  if (item.Name === itemParam);
  conv.add(`These are the details for ${itemParam}. It is located in zone ${item.Zone}, at level ${item.Level}`);
});
You have correctly identified that the ; semicolon character denotes the end of a statement. However, with an if statement this has an unintended effect: written this way, the conditional is concluded (with an empty body) before conv.add actually runs. This means that conv.add escapes your check and runs for every item. If you were to log the conv response, you'd see a bunch of text.
To fix it, keep in mind that a conditional needs to wrap the surrounding code. This is done with curly braces { & }.
data.map(item => {
  if (item.Name === itemParam) {
    conv.add(`These are the details for ${itemParam}. It is located in zone ${item.Zone}, at level ${item.Level}`);
  }
});
You can even see this in the map method, where the mapping logic surrounds your if-statement with curly braces. This shows that one is contained entirely within the other.
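As a side note, since only one item is expected to match, Array.prototype.find is arguably a better fit than map here; a minimal sketch, reusing the same data, itemParam, and conv from above:

const match = data.find(item => item.Name === itemParam);
if (match) {
  conv.add(`These are the details for ${itemParam}. It is located in zone ${match.Zone}, at level ${match.Level}`);
}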
Thanks, Nick. I fixed my function based on your feedback and now I understand the map function a little better. Another issue that I figured out along the way was that case matters when matching inside the map function, so I also had to modify the type to lowercase and add .toLowerCase() calls to the variables.
Now my code works with two variables, Item and Item_ID, so if the user asks about a generic item, it can be narrowed down by adding the ID of the item to the query question.
Now my code looks like this:
// From here, these are all the required libraries to be loaded
const { conversation } = require('@assistant/conversation'); // This is the app conversation
const functions = require('firebase-functions'); // These are the Firebase functions
require('firebase-functions/lib/logger/compat'); // console.log compat
const axios = require('axios'); // This is axios, to retrieve the data stream
// To here, these are all the required libraries to be loaded

const app = conversation({ debug: true }); // This instantiates the conversation

/* This function retrieves the data from the file stream */
async function getItem() {
  const res = await axios.get('https://sheetdb.io/api/v1/n3ol4hwmfsmqd');
  return res.data; // To use in your Action's response
}

/* This is the function to match the user's responses and the data stream */
app.handle('getItem', async conv => { // getItem is the webhook name used in Google Actions, conv is the conversation
  const data = await getItem(); // Here the data stream is retrieved and assigned to the data variable
  // console.log(data);
  const itemParam = conv.intent.params.Item.resolved; // This is the user's response, in other words, which item the user wants to know about from the data.
  const itemIDParam = conv.intent.params.Item_ID.resolved.replace(/\s/g, ''); // This is the user's response for the item ID
  const itemFromUser = itemParam + " " + itemIDParam;
  console.log(itemParam);
  console.log(itemIDParam);
  console.log(itemFromUser);
  // conv.add(`This test to see if we are accessing the webhook for ${itemParam}`); // This is to know if I was getting the correct item from the user. Currently this is working
  // console.log(data);
  data.map(item => { // Then, I am trying to map the data stream to recognize the data headers and identify items
    // console.log(data);
    // console.log(item);
    if (item.Name.toLowerCase() === itemFromUser.toLowerCase()) {
      console.log(item);
      conv.add(`These are the details for ${itemFromUser}. It is located in zone ${item.Zone}, at level ${item.Level}.`);
      // console.log(conv);
      // console.log(data);
    }
    else {
      conv.add(`I am sorry. I could not find any information about that object. Please try with another construction object.`);
    }
  });
});

exports.ActionsOnGoogleFulfillment = functions.https.onRequest(app);
Now I can handle most of the questions, except when something is not in the data stream, which makes the app show me this error:
"error": "Error adding simple response: Two simple responses already defined"
This is the same error as I was getting before and I am not sure how to fix it yet. I tried to implement an else statement for that condition, as follows:
else {
  conv.add(`I am sorry. I could not find any information about that object. Please try with another construction object.`);
}
But I am still getting the same error.
I am still working on this.
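A hedged note on why the else still trips the limit: inside map, the else branch runs once for every non-matching row of the stream, so the fallback response is added several times, just like before. One sketch that avoids it, doing the match outside the loop with the same data, itemFromUser, and conv as above:

const match = data.find(item => item.Name.toLowerCase() === itemFromUser.toLowerCase());
if (match) {
  conv.add(`These are the details for ${itemFromUser}. It is located in zone ${match.Zone}, at level ${match.Level}.`);
} else {
  conv.add(`I am sorry. I could not find any information about that object. Please try with another construction object.`);
}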

Stored procedure azure Cosmos DB returns empty collection

I tried to create a stored procedure using the sample SP creation code from the Azure docs, but I couldn't fetch the collection details. It always returns null.
Stored Procedure
// SAMPLE STORED PROCEDURE
function sample(prefix) {
  var collection = getContext().getCollection();
  console.log(JSON.stringify(collection));

  // Query documents and take 1st item.
  var isAccepted = collection.queryDocuments(
    collection.getSelfLink(),
    'SELECT * FROM root r',
    function (err, feed, options) {
      if (err) throw err;

      // Check the feed and if empty, set the body to 'no docs found',
      // else take 1st element from feed
      if (!feed || !feed.length) {
        var response = getContext().getResponse();
        response.setBody('no docs found');
      }
      else {
        var response = getContext().getResponse();
        var body = { prefix: prefix, feed: feed[0] };
        response.setBody(JSON.stringify(body));
      }
    });

  if (!isAccepted) throw new Error('The query was not accepted by the server.');
}
The console shows only {"spatial":{}}.
The result shows 'no docs found' because the collection is not being retrieved. I have passed the partition key at the time of execution via the explorer.
I had a similar issue. I think the Azure portal doesn't execute stored procedures properly when the partition key is not a string.
In my case I had a partitionKey that is a number. When I executed the stored procedure via the portal I always got an empty resultSet, even though I had documents in my database. When I changed the structure a little, and made my partitionKey a string, the stored procedure worked fine.
Did you create the ToDoList database with the Items collection? You can do this from the Quick start blade in the Azure portal.
And then create an SP to run against that collection. There is no partition key required, so no additional params are required (leave blank).
The Collection is created without any documents. You may choose to add documents via the Query Explorer blade or via the sample ToDoList App that is available via the Quick start blade.
You are debugging in the wrong way.
It is perfectly fine to see "{\"spatial\":{}}" in your console log, even if the collection has items. Why? Because that is a property of that object.
So what you said:
the result shows 'no docs found' because the collection is not being retrieved
is false. I have the same console log text, but I have items in my collection.
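A hedged illustration of why that log output is expected: JSON.stringify only serializes an object's own enumerable data properties, not its methods, so the collection's query API is invisible in that output even though it is there.

var collection = getContext().getCollection();
console.log(JSON.stringify(collection));       // prints something like {"spatial":{}}
console.log(typeof collection.queryDocuments); // "function" - the query API is still present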
I have two scenarios for why your stored procedure returns no items:
1. I had the same issue trying it in the Azure portal UI (in the browser), and to my surprise I had to insert an item without the key in order for my stored procedure to see it.
2. In code, you specify the partition key as the path string, i.e. new PartitionKey("/UserId"), instead of the value from your object, i.e. new PartitionKey(stock.UserId).
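The same idea in the JavaScript SDK, as a hedged sketch (the .NET PartitionKey snippet above translates to passing the partition key value, not the path, to execute(); the endpoint, key, and someUserId names are placeholders, while ToDoList, Items, and sample come from the answers above):

const { CosmosClient } = require('@azure/cosmos');

const endpoint = 'https://<account>.documents.azure.com:443/'; // placeholder
const key = '<primary key>'; // placeholder

async function runSample(someUserId) { // someUserId: the partition key *value*
  const client = new CosmosClient({ endpoint, key });
  const container = client.database('ToDoList').container('Items');

  // Wrong: passing the partition key *path* targets no real partition:
  // await container.scripts.storedProcedure('sample').execute('/UserId', ['px']);

  // Right: pass the value stored in the documents' partition key field:
  const { resource } = await container.scripts
    .storedProcedure('sample')
    .execute(someUserId, ['px']);
  console.log(resource);
}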

Error: Network error: Error writing result to store for query (Apollo Client)

I am using Apollo Client to build an application that queries my server using GraphQL. I have a Python server on which I execute my GraphQL queries; it fetches data from the database and then returns it to the client.
I have created a custom NetworkInterface for the client that helps me make customized server requests (by default, ApolloClient makes a POST call to the URL we specify). The network interface only has to have a query() method that returns a promise for the result, of the form Promise<ExecutionResult>.
I am able to make the server call and fetch the requested data, but I am still getting the following error.
Error: Network error: Error writing result to store for query
{
  query something {
    row {
      data
    }
  }
}
Cannot read property 'row' of undefined
at new ApolloError (ApolloError.js:32)
at ObservableQuery.currentResult (ObservableQuery.js:76)
at GraphQL.dataForChild (react-apollo.browser.umd.js:410)
at GraphQL.render (react-apollo.browser.umd.js:448)
at ReactCompositeComponent.js:796
at measureLifeCyclePerf (ReactCompositeComponent.js:75)
at ReactCompositeComponentWrapper._renderValidatedComponentWithoutOwnerOrContext (ReactCompositeComponent.js:795)
at ReactCompositeComponentWrapper._renderValidatedComponent (ReactCompositeComponent.js:822)
at ReactCompositeComponentWrapper._updateRenderedComponent (ReactCompositeComponent.js:746)
at ReactCompositeComponentWrapper._performComponentUpdate (ReactCompositeComponent.js:724)
at ReactCompositeComponentWrapper.updateComponent (ReactCompositeComponent.js:645)
at ReactCompositeComponentWrapper.performUpdateIfNecessary (ReactCompositeComponent.js:561)
at Object.performUpdateIfNecessary (ReactReconciler.js:157)
at runBatchedUpdates (ReactUpdates.js:150)
at ReactReconcileTransaction.perform (Transaction.js:140)
at ReactUpdatesFlushTransaction.perform (Transaction.js:140)
at ReactUpdatesFlushTransaction.perform (ReactUpdates.js:89)
at Object.flushBatchedUpdates (ReactUpdates.js:172)
at ReactDefaultBatchingStrategyTransaction.closeAll (Transaction.js:206)
at ReactDefaultBatchingStrategyTransaction.perform (Transaction.js:153)
at Object.batchedUpdates (ReactDefaultBatchingStrategy.js:62)
at Object.enqueueUpdate (ReactUpdates.js:200)
I want to know the possible cause of the error, and a solution if possible.
I had a similar error.
I worked it out by adding id to the query.
For example, my current query was
query {
  service:me {
    productServices {
      id
      title
    }
  }
}
My new query was
query {
  service:me {
    id // <-------
    productServices {
      id
      title
    }
  }
}
We need to include id; otherwise it will cause the mentioned error.
{
  query something {
    id
    row {
      id
      data
    }
  }
}
I've finally found out what is causing this issue, after battling with it in various parts of our app for months. What helped shed some light on it was switching from apollo-cache-inmemory to apollo-cache-hermes.
I experimented with Hermes hoping to mitigate this issue, but unfortunately it fails to update the cache the same as apollo-cache-inmemory. What is curious, though, is that Hermes shows a very nice, user-friendly message, unlike apollo-cache-inmemory. This led me to a revelation: the cache really hits this problem when it's trying to store an object type that is already in the cache with an ID, but the new object is lacking it. So apollo-cache-inmemory should work fine if you are meticulously consistent when querying your fields. If you omit the id field everywhere for a certain object type, it will happily work. If you use the id field everywhere, it will work correctly. Once you mix queries with and without id, that's when the cache blows up with this horrible error message.
This is not a bug; it's working as intended, and it's even documented here: https://www.apollographql.com/docs/react/caching/cache-configuration/#default-identifiers
2020 update: Apollo has since removed this "feature" from the cache, so this error should not be thrown anymore in apollo-client 3 and newer.
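If you're stuck on apollo-cache-inmemory (pre-v3), one hedged workaround is to control normalization yourself via its dataIdFromObject option, so cache identity no longer depends on every query selecting id; a minimal sketch, where returning null means "don't normalize objects that lack an id" instead of colliding with an already-normalized copy:

const { InMemoryCache } = require('apollo-cache-inmemory');

const cache = new InMemoryCache({
  // Use the id when present; fall back to no normalization otherwise.
  dataIdFromObject: object => (object.id ? `${object.__typename}:${object.id}` : null),
});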
I had a similar-looking issue.
Perhaps your app was attempting to write the network response data to the store with the wrong store address?
Solution for my problem
I was updating the store after adding a player to a team:
// Apollo option object for `mutation AddPlayer`
update: (store, response) => {
  const addr = { query: gql(QUERY_TEAM), variables: { _id } };
  const data = store.readQuery(addr);
  data.teams.players.push(response.data.player);
  store.writeQuery({ ...addr, data });
}
I started to get a similar error to the above (I'm on Apollo 2.0.2).
After digging into the store, I realised my QUERY_TEAM request was made with one variable, meta, defaulting to null. The store "address" seems to use the stringified addr to identify the record. So I changed my code above to include the null:
// Apollo option object for `mutation AddPlayer`
update: (store, response) => {
  const addr = { query: gql(QUERY_TEAM), variables: { _id, meta: null } };
  const data = store.readQuery(addr);
  data.teams.players.push(response.data.player);
  store.writeQuery({ ...addr, data });
}
And this fixed my issue.
* Defaulting to undefined instead of null will probably avoid this nasty bug (unverified)
Further info
My issue may be only tangentially related, so if that doesn't help, I have two pieces of advice:
First, add these 3 lines to node_modules/apollo-cache-inmemory/lib/writeToStore.js to alert you when the "record" is empty.
And then investigate _a to understand what is going wrong.
exports.writeResultToStore = writeResultToStore;
function writeSelectionSetToStore(_a) {
  var result = _a.result, dataId = _a.dataId, selectionSet = _a.selectionSet, context = _a.context;
  var variables = context.variables, store = context.store, fragmentMap = context.fragmentMap;
+ if (typeof result === 'undefined') {
+   debugger;
+ }
Second, ensure all queries, mutations, and manual store updates are saved with the variables you expect.
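A hedged sketch of that last point, building the store "address" in one helper so the watched query and every readQuery/writeQuery pair use identical variables (names reuse the QUERY_TEAM example above):

const teamAddr = (_id) => ({
  query: gql(QUERY_TEAM),
  variables: { _id, meta: null }, // explicit null, matching how the watched query runs
});

// Apollo option object for `mutation AddPlayer`, rewritten to use the helper
update: (store, response) => {
  const addr = teamAddr(_id);
  const data = store.readQuery(addr);
  data.teams.players.push(response.data.player);
  store.writeQuery({ ...addr, data });
}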
For me, adding "__typename" to the query helped.
There are two causes for this: first, it happens when id is missing; second, it happens when you have two queries with the same shape and hit them alternately.
For example, if you have queries for dog and cat:
query dog() { id, name }
query cat() { id, name }
Both queries are the same; only their headers differ, and that is when this type of issue appears. Currently I am fetching the same query with a different status, getting this error, and am still lost in search of a solution.

Meteor: Best practice for modifying document data with user data

Thanks for looking at my question. It should be easy for anyone who has used Meteor in production; I am still at the learning stage.
My Meteor setup is that I have a bunch of documents with ownedBy _ids reflecting which user owns each document (https://github.com/rgstephens/base/tree/extendDoc is the full GitHub repo; note that it is the extendDoc branch and not the master branch).
I now want to modify my API so that I can display the real name of each document's owner. On the server side I can access this with Meteor.users.findOne({ownedBy}), but on the client side I have discovered that I cannot do this, due to Meteor security protocols (a user doesn't have access to another user's data).
So I have two options:
somehow modify the result of what I am publishing to include the user's real name on the server side
somehow push the full user data to the clientside and do the mapping of the _id to the real names on the clientside
What is the best practice here? I have tried both, and here are my results so far:
I have failed here. This is very 'Node' thinking, I know. I can access user data on the client side, but Meteor insists that my publications must return cursors and not JSON objects. How do I transform JSON objects into cursors, or otherwise circumvent this publish restriction? Google is strangely silent on this topic.
Meteor.publish('documents.listAll', function docPub() {
  let documents = Documents.find({}).fetch();
  documents = documents.map((x) => {
    const userobject = Meteor.users.findOne({ _id: x.ownedBy });
    const x2 = x;
    if (userobject) {
      x2.userobject = userobject.profile;
    }
    return x2;
  });
  return documents; // this causes an error due to not being a cursor
});
I have succeeded here, but I suspect at the cost of a massive security hole. I simply modified my publish to return an array of cursors, as below:
Meteor.publish('documents.listAll', function docPub() {
  return [
    Documents.find({}),
    Meteor.users.find({}),
  ];
});
I would really like to do 1, because I sense there is a big security hole in 2. Please advise on how I should do it. Thanks very much.
yes, you are right not to want to publish full user objects to the client. but you can certainly publish a subset of the full user object, using the "fields" key on the options object, which is the 2nd argument of find(). on my project, i created a "public profile" area on each user; that makes it easy to know which things about a user we can publish to other users.
there are several ways to approach getting this data to the client. you've already found one: returning multiple cursors from a publish.
in the example below, i'm returning all the documents, and a subset of the user objects that own those documents. this example assumes that the user's name, and whatever other info you decide is "public," is in a field called publicInfo that's part of the Meteor.user object:
Meteor.publish('documents.listAll', function() {
  let documentCursor = Documents.find({});
  let ownerIds = documentCursor.map(function(d) {
    return d.ownedBy;
  });
  let uniqueOwnerIds = _.uniq(ownerIds);
  let profileCursor = Meteor.users.find(
    {
      _id: { $in: uniqueOwnerIds }
    },
    {
      fields: { publicInfo: 1 }
    });
  return [documentCursor, profileCursor];
});
In the MeteorChef Slack channel, #distalx responded thusly:
Hi, you are using fetch, and fetch returns all matching documents as an array.
I think if you just use find - w/o fetch - it will do it.
Meteor.publish('documents.listAll', function docPub() {
  let cursor = Documents.find({});
  let DocsWithUserObject = cursor.filter((doc) => {
    const userobject = Meteor.users.findOne({ _id: doc.ownedBy });
    if (userobject) {
      doc.userobject = userobject.profile;
      return doc;
    }
  });
  return DocsWithUserObject;
});
I am going to try this.
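One hedged way to actually achieve option 1 without returning a cursor is Meteor's low-level publish API (this.added / this.ready), which lets the server push customized documents to the client. A sketch under the same names as above; note that, as written, it is not reactive to changes made after the subscription starts:

Meteor.publish('documents.listAll', function docPub() {
  Documents.find({}).forEach((doc) => {
    const owner = Meteor.users.findOne({ _id: doc.ownedBy });
    const { _id, ...fields } = doc;
    // 'documents' must match the collection name the client subscribes to
    this.added('documents', _id, {
      ...fields,
      userobject: owner ? owner.profile : null,
    });
  });
  this.ready();
});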

Chrome Storage Set - Asynchronous Issues onMessage [duplicate]

I'm developing a Chrome extension, and I will store objects sent by the server.
For example, I will receive:
command = {id: "1", type: "A", size: "B", priority: "C"}
If I had a database, I would insert it as a row in a commands table.
Using chrome.storage, I'm storing an array of these objects under the key commands.
But when I receive a new command from the server, I have to get the array from local storage, update it, and then set it again. I'm worried about cases where I receive another command while I'm getting and setting, or while I'm deleting a stored command. I'm thinking about semaphores, but I don't know if that's a great idea.
Can someone suggest what I should do?
Thanks!
Extensions can use a database: IndexedDB (the sample code may look convoluted, but it's pretty simple in actual extensions; see, for example, the two small functions getStyles and saveStyle here, or the IDB-keyval wrapper library).
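As a hedged sketch of that IndexedDB route via the idb-keyval wrapper mentioned above, its update() helper performs the read-modify-write for a key inside a single IndexedDB transaction, which sidesteps the interleaving worry entirely:

import { update } from 'idb-keyval';

function saveCommand(command) {
  // read-modify-write of the 'commands' array in one transaction
  return update('commands', (commands = []) => [...commands, command]);
}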
If you want to use chrome.storage, just maintain a global queue array that is populated by the server listener:
queue.push(newItem);
updateStorage();
and processed in chrome.storage.local.get callback:
function updateStorage() {
  if (!queue.length || updateStorage.running) {
    return;
  }
  updateStorage.running = true;
  chrome.storage.local.get('commands', data => {
    data.commands = [].concat(data.commands || [], queue);
    queue = [];
    chrome.storage.local.set(data, () => {
      updateStorage.running = false;
      if (queue.length) updateStorage();
    });
  });
}
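For completeness, a hedged sketch of wiring the queue into a message listener, assuming the commands arrive via chrome.runtime.onMessage (the message shape here is illustrative):

let queue = [];

chrome.runtime.onMessage.addListener((message) => {
  if (message.type === 'command') {
    queue.push(message.command); // e.g. {id: "1", type: "A", size: "B", priority: "C"}
    updateStorage();             // defined above; batches everything queued so far
  }
});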
