I'm sending a JS object from my front end to my Java backend. The object I'm passing looks like this and contains fields of different types:
wrapperObject = {
    JSONOBJ: {
        'key': 'value'
    },
    id: '123',
    date: 'exampledate'
}
My Java backend then takes this wrapperObject and converts every field inside it into a value in a HashMap. When it reaches the nested JSONObject, however, it parses it, attempts to insert it into the database, and I get:
bad SQL grammar []; nested exception is org.postgresql.util.PSQLException: No hstore extension installed.
What can I do about this, and is there a better way of approaching this?
It sounds like it may be as simple as adding the hstore extension to your database. The PostgreSQL documentation covers installation; it typically comes down to running CREATE EXTENSION hstore; on the database you are inserting into.
Let me know if I'm missing something, hope this helps!
My website relies on a database which is a big JSON file, like this:
var myjsonData =
[ {
    "ID": 0,
    "name": "Henry",
    "surname": "McLarry",
    "...": "..."
}]
I generate this data every month at significant cost to me, so I would like to avoid loading it straight into my HTML <head>, because that would allow any user to download the full database in no time.
I would like to build "something" that can fetch only specific items from the JSON file (just the ones I want to show) without exposing the full .json on the client side.
Today I use the call
var myvar = myjsonData.ID.Name
to get "Henry" into myvar. I would like to build something like
var myvar = mycallfunction(ID, Name)
I did try PHP as an intermediary, but the Ajax calls from JavaScript don't let me fetch the data.
Can I use jQuery with the JSON URL to get only the item I need?
What you can do is parse your JSON into an object, so you can get any value you want from it.
Example:
var myjsonData = '{"ID": 0,"name": "Henry","surname": "McLarry"}';
var obj = JSON.parse(myjsonData);
console.log(obj.ID);      // print the ID
console.log(obj.name);    // print the name
console.log(obj.surname); // print the surname
So you have a NoSQL-style database which has only one kind of document: the full JSON element you use on your website. In that scenario you have three options:
Depending on the NoSQL database you're using, you can limit the fields that will be returned (e.g. for MongoDB see https://docs.mongodb.com/manual/tutorial/project-fields-from-query-results/).
Change the way you store your data into more modular documents and put the logic to connect them in your application. Instead of one big document you'll have modular ones such as Users, Products, Transactions, etc., and your application can query them individually.
Build server-side logic as an API to deal with your data and provide only what you need. The API (which can be Node.js, PHP, or anything you like) holds the full JSON, and its endpoints return only the data you want, for example: myapi.com/getUser, myapi.com/getProducts and so on. A sketch of this option follows below.
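As a rough sketch of that third option, assuming a Node.js/Express server and that the monthly dataset sits on the server as data.json (the file name, port and the /getUser route are illustrative, not from your setup):

const express = require('express');
const fs = require('fs');

const app = express();
// The full dataset stays on the server and is never shipped to the browser.
const data = JSON.parse(fs.readFileSync('data.json', 'utf-8'));

app.get('/getUser/:id', (req, res) => {
    const record = data.find(item => item.ID === Number(req.params.id));
    if (!record) return res.status(404).json({ error: 'not found' });
    // Expose only the fields you want to show, never the whole record.
    res.json({ name: record.name, surname: record.surname });
});

app.listen(3000);

The page then calls e.g. /getUser/0 with $.getJSON and only ever receives the fields the endpoint chooses to return.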
If you're able to provide more info on the technologies you're using, that would help. Hope that helps :)
I have come across the read and write operations using fs in Node.js.
My scenario is this: I have a file containing data like
[
    {
        "Pref": "Freedom",
        "ID": "5545"
    },
    {
        "Pref": "Growth",
        "ID": "8946545"
    }
]
I have to replace the Pref of the element whose ID is 5545 using Node.js.
How can I do it? Thanks.
To do what you want, you would need to (a sketch follows the list):
read JSON data from the file with fs.readFile()
parse the JSON with JSON.parse()
find the correct object in the array
change the object
serialize the object back to JSON with JSON.stringify()
write the file with fs.writeFile()
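As a minimal sketch of those steps, assuming the file is called data.json and using the promise-based fs API (the file name and the new Pref value are only examples):

const fs = require('fs/promises');

async function updatePref(id, newPref) {
    const raw = await fs.readFile('data.json', 'utf-8');   // 1. read
    let items;
    try {
        items = JSON.parse(raw);                            // 2. parse, may throw on bad JSON
    } catch (err) {
        throw new Error('data.json is not valid JSON: ' + err.message);
    }
    const item = items.find(el => el.ID === id);            // 3. find the right object
    if (item) item.Pref = newPref;                          // 4. change it
    const json = JSON.stringify(items, null, 2);            // 5. serialize
    await fs.writeFile('data.json', json, 'utf-8');         // 6. write
}

updatePref('5545', 'Liberty').catch(console.error);

Note that this sketch still does none of the locking described next.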
However, this is not as simple as it may look, because you will also have to:
add locking to writes so that you never do two writes at the same time
add locking to reads so that you never read while the write is in progress
handle incorrect JSON
handle cases of objects that cannot be serialized
avoid blocking operations (those with "Sync" in their name) anywhere other than in the first tick of the event loop
Considering all of that, you should think about using a database to store any data that changes. Some databases like Mongo, Postgres or Redis need to run as a standalone application, either on the same or on a different server. Some embedded databases like SQLite don't need a standalone process and can run directly in your application.
It's not that it is impossible to write to JSON files and then read those files as needed, but the amount of work you'd have to do to synchronize access to the data, all without accidentally blocking the event loop in the process, is much greater than just using a database as intended.
You have some data:
const data = [
    {
        "Pref": "Freedom",
        "ID": "5545"
    },
    {
        "Pref": "Growth",
        "ID": "8946545"
    }
]
First we need to find the element you want to change (use [0] to select only the first element, in case there are multiple items with ID 5545):
const objectToChange = data.filter(item => item.ID === "5545")[0]
And then change it!
objectToChange['Pref'] = "Liberty"
We can see the change reflected in the data object:
console.log(data)
// [{
// ID: "5545",
// Pref: "Liberty"
// },{
// ID: "8946545",
// Pref: "Growth"
// }]
1- Load file: let json = JSON.parse(fs.readFileSync('file.json', 'utf-8'));
2- Update content:
json = json.map(el => {
    if (el.ID === "5545") {
        el.Pref = "TEST";
    }
    return el;
});
3- Save it again (here to test.json; use file.json to overwrite the original):
fs.writeFileSync('test.json', JSON.stringify(json), 'utf-8');
Apologies if this seems basic to some, but I'm new to JS/node.js/JSON and still finding my way. I've searched this forum for an hour but cannot find a specific solution.
I have a basic website set up running off a local Node.js server, along with two JSON data files with information about 32 local suburbs.
An example of an API GET request URL on the site would be:
.../api/b?field=HECTARES
The structure of the JSON files is like this:
[screenshot: JSON structure]
In the JSON file there are 32 Features (suburbs), each with its own list of Properties as shown above. What I am trying to do is use the API 'field' query to push the HECTARES value of each of the 32 Features into a single output variable. The code below shows how far I have got:
var fieldStats = [];
var fieldQ = req.query['field'];
for (var i in suburbs.features) {
    var x = suburbs.features[i].properties.HECTARES;
    fieldStats.push(x);
}
As you can see, in the above "HECTARES" is hard-coded. I need to be able to pass the 'fieldQ' variable into this code but have no idea how to.
Advice appreciated!
Use bracket notation, with exactly the same indexing syntax you are already using just above:
suburbs.features[i].properties[fieldQ];
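So the loop from the question, with only that one property access changed, becomes:

var fieldStats = [];
var fieldQ = req.query['field'];   // e.g. "HECTARES" from ?field=HECTARES
for (var i in suburbs.features) {
    // Bracket notation looks the property up by the name held in fieldQ.
    fieldStats.push(suburbs.features[i].properties[fieldQ]);
}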
I'm developing a web app with Node.js using the Sails framework (based on Express), and I'm using a third-party image solution called Transloadit (no need to know Transloadit).
Anyway, that's not the problem; I've been able to implement the Transloadit form and receive the information from their API.
My problem is that Transloadit gives me the response as a string, and I need to access the response objects, so I'm using var objRes = JSON.parse(req.body.transloadit); to parse it into an object. When I console.log(objRes); the object does not seem to be correctly parsed; I get this (see the full JSON here: https://gist.github.com/kevinblanco/9631085):
{
    a bunch of fields here .....
    last_seq: 2,
    results: {
        thumb: [
            [
                Object
            ]
        ]
    }
}
And I need the data from the thumb array. My question is: why is it doing that when parsing?
Here's the entire req.body object: https://gist.github.com/kevinblanco/9628156. As you can see, the transloadit field is a string, and I need the data from some of its fields.
Thanks in advance.
There is nothing wrong with the parsing of the JSON -- in fact there is no problem at all.
console.log limits the depth of what it is printing, which is why you are seeing [Object] in the output.
If you want to see the full output in Node.js, require the util module and use its inspect utility like this:
console.log(util.inspect(yourobject, { depth: null }));
and that will print the entire content.
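For example, with the objRes variable from the question:

var util = require('util');
var objRes = JSON.parse(req.body.transloadit);
// depth: null tells inspect not to truncate nested objects and arrays.
console.log(util.inspect(objRes, { depth: null }));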
Note that this is just an artifact of console.log printing it.
Somewhere in my Django app, I make an Ajax call to my view
$.post("/metrics", {
'program': 'AWebsite',
'marketplace': 'Japan',
'metrics': {'pageLoadTime': '1024'}
});
In my Python code, I have:
@require_POST
def metrics(request):
    program = request.POST.get('program', '')
    marketplace = request.POST.get('marketplace', '')
    metrics = request.POST.get('metrics', '')
    reportMetrics(metrics, program, marketplace)
The metrics() function in Python is supposed to call reportMetrics() with these parameters, which are then supposed to go into a log file. But in my log file I do not see the 'pageLoadTime' value, probably because it is being passed in as a dictionary. In future I need to add more items to this, so it needs to remain a dictionary (and not a string like the first two).
What's the easiest way to convert this incoming JavaScript dictionary to a Python dictionary?
Send the JavaScript dictionary as JSON and import the Python json module to pull it back out. You'll use json.loads(jsonString).
Edit - Added example
$.post("/metrics", {
data: JSON.stringify({
'program': 'AWebsite',
'marketplace': 'Japan',
'metrics': {'pageLoadTime': '1024'}
})
});
Then on the Python side:
import json

def metrics(request):
    data = json.loads(request.POST.get('data'))
    program = data.get('program', '')
    marketplace = data.get('marketplace', '')
    metrics = data.get('metrics', '')
I don't really have a good way of testing this right now, but I believe it should work. You may also have to do some checking if a field is blank, but I believe .get() will handle that for you.