I have a script that reads a Google Sheet and returns its content. I want to store all of the data in a database.
var request = gapi.client.sheets.spreadsheets.values.get(params);
request.then(function (response) {
    console.log(response.result.values[2][2]);
}, function (reason) {
    console.error('error: ' + reason.result.error.message);
});
How can I store it in a MySQL database, or how can I pass this returned data to a PHP script?
You'll need to create a PHP page to receive your data. To send the data, use an AJAX request that posts it to the PHP script. Have a look at XMLHttpRequest with JavaScript, or read this: https://developer.mozilla.org/en-US/docs/Web/Guide/AJAX
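For example, here is a minimal sketch of that approach (the /store_sheet.php endpoint is just a placeholder for whatever PHP page you create): it posts the returned values to the PHP script as JSON, and the PHP script can then insert the rows into MySQL.
var request = gapi.client.sheets.spreadsheets.values.get(params);
request.then(function (response) {
    // Send the whole values array to the (placeholder) PHP endpoint as JSON.
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/store_sheet.php');
    xhr.setRequestHeader('Content-Type', 'application/json;charset=UTF-8');
    xhr.onload = function () {
        console.log('PHP responded: ' + xhr.responseText);
    };
    xhr.send(JSON.stringify({ values: response.result.values }));
}, function (reason) {
    console.error('error: ' + reason.result.error.message);
});
On the PHP side, json_decode(file_get_contents('php://input'), true) gives you the rows as an array, which you can then insert into MySQL with PDO or mysqli.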
I am designing a web app where users can post content to a feed.
I usually send the post and the post data to the server via an XMLHttpRequest.
request = new XMLHttpRequest()
var user = 'current user'
var post = 'some text'
request.open("POST", "/sent_new_post?x=" + post + "&user=" + user)
request.setRequestHeader("X-CSRFToken", csrftoken);
request.setRequestHeader("Content-Type", "text/plain;charset=UTF-8");
request.onload = () => { /* do stuff when the response comes back */ }
request.send()
On the server I then access the data with
x = request.GET['x']
But I have run into some problems using this method, especially when the post text contains a '#'.
I was wondering if I can send the data using
request.send(data)
But I don't know how to access that data in the view function...
How can I access data in my view function that has been sent to the server using request.send(data)?
I am not sure, because the question is not entirely clear to me.
If what you want to do is accessing the payload of the request you receive, then you need to know that
In an HttpRequest object, the GET and POST attributes are instances of django.http.QueryDict, a dictionary-like class customized to deal with multiple values for the same key. This is necessary because some HTML form elements, notably <select multiple>, pass multiple values for the same key.
Therefore you can access the 'data' payload by writing something like this: data = request.POST.get('data', '').
If you want to send a # in a URL parameter to the backend, you have to encode it:
"/sent_new_post?x=" + encodeURIComponent(post) + "&user=" + encodeURIComponent(user)
If you want to send the data in the post body, you'll have to use request.POST to access the data; you will still have to encode the data you're sending.
request.send("x=" + encodeURIComponent(post) + "&user=" + encodeURIComponent(user));
I've got a small Express JS API that I'm building to handle and process multiple incoming requests from the browser, and I'm having some trouble figuring out the best approach to handle them.
The use case is a form with potentially up to 30 or so people submitting form data to the Express JS API at any given time. The API then POSTs this data off to some place using axios, and each call needs to return a response back to the browser of the person that submitted the data. My endpoint so far is:
app.post('/api/process', (req, res) => {
    if (!req.body) {
        res.status(400).send({ code: 400, success: false, message: "No data was submitted" })
        return
    }
    const application = req.body.Application
    axios.post('https://example.com/api/endpoint', application)
        .then(response => {
            res.status(200).send({ code: 200, success: true, message: response })
        })
        .catch(error => {
            res.status(200).send({ code: 200, success: false, message: error })
        });
})
If John and James submit form data from different browsers to my Express JS API, which is forwarded to another API, I need the respective responses to go back to the respective browsers...
To make this clear: the response to a request is only sent back to the requester, so John's browser gets the response to John's request and James's browser gets the response to James's. But if you want to accept a processing request and immediately reply with something like "I received your request; you can fetch the result from another GET route a bit later", then you need a way to identify which job is which. You can generate a UUID when the server receives a processing request and send it back to the sender as the response: "I received your processing request; check the result later using this UUID as your reference code." The client then passes that UUID as a path or query parameter, and the server returns the matching result.
This is also the usual pattern when you are using WebSockets: send a processing request, the server sends back a UUID reference code, and some time later the server pushes the result to the requester's WebSocket, saying "this is the result of the job with that UUID reference code."
I hope that's clear enough.
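As a rough sketch of that pattern (the route names, the in-memory job store, and the use of Node's built-in crypto.randomUUID are my own choices for illustration):
const express = require('express')
const crypto = require('crypto')
const axios = require('axios')

const app = express()
app.use(express.json())

const jobs = {} // in-memory job store: { [id]: { done, result } }

app.post('/api/process', (req, res) => {
    const id = crypto.randomUUID()
    jobs[id] = { done: false, result: null }

    // Forward the data upstream without blocking the response.
    axios.post('https://example.com/api/endpoint', req.body.Application)
        .then(response => { jobs[id] = { done: true, result: response.data } })
        .catch(error => { jobs[id] = { done: true, result: { error: error.message } } })

    // Reply immediately with the reference code.
    res.status(202).send({ success: true, reference: id })
})

app.get('/api/process/:id', (req, res) => {
    const job = jobs[req.params.id]
    if (!job) return res.status(404).send({ success: false, message: 'Unknown reference code' })
    if (!job.done) return res.status(200).send({ success: true, pending: true })
    res.status(200).send({ success: true, result: job.result })
})
That said, for the simple forwarding case in the question, the original handler already sends each response back to the browser that made the request, so the UUID pattern is only needed if you want to reply before the upstream call finishes.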
I want to use Node.js as a tool for website scraping. I have already implemented a script which logs me in to the system and parses some data from the page.
The steps are defined like:
Open login page
Enter login data
Submit login form
Go to desired page
Grab and parse values from the page
Save data to file
Exit
Obviously, the problem is that my script has to log in every time, and I want to eliminate that. I want to implement some kind of cookie management system, where I can save cookies to a .txt file and then, during the next run, load the cookies from the file and send them in the request headers.
This kind of cookie management system is not hard to implement, but the problem is how to access cookies in Node.js. The only way I found is using the request library's response object, where you can use something like this:
request.get({
    headers: requestHeaders,
    uri: user.getLoginUrl(),
    followRedirect: true,
    jar: jar,
    maxRedirects: 10,
}, function (err, res, body) {
    if (err) {
        console.log('GET request failed, here is the error:');
        console.log(err);
    }
    // Get cookies from the response
    var responseCookies = res.headers['set-cookie'];
    var requestCookies = '';
    for (var i = 0; i < responseCookies.length; i++) {
        var oneCookie = responseCookies[i];
        oneCookie = oneCookie.split(';');
        requestCookies = requestCookies + oneCookie[0] + ';';
    }
});
Now the content of the variable requestCookies can be saved to a .txt file and loaded the next time the script is executed; this way you can avoid logging the user in every time the script runs.
Is this the right way, or is there a method which returns cookies?
NOTE: If you want to set up your request object to automatically resend received cookies on every subsequent request, use the following lines during object creation:
var request = require("request");
request = request.defaults({jar: true}); // Send cookies on every subsequent request
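As a rough sketch of the file-based approach described in the question (the cookies.txt path and the URL are just examples), you could persist the cookie string with Node's fs module and send it back on the next run:
var fs = require('fs');
var request = require('request');

var COOKIE_FILE = 'cookies.txt'; // example path, adjust as needed

// Load previously saved cookies, if any.
var savedCookies = '';
if (fs.existsSync(COOKIE_FILE)) {
    savedCookies = fs.readFileSync(COOKIE_FILE, 'utf8');
}

request.get({
    uri: 'https://example.com/protected/page', // placeholder URL
    headers: savedCookies ? { Cookie: savedCookies } : {},
    followRedirect: true
}, function (err, res, body) {
    if (err) { return console.error(err); }
    // Save any cookies the server set, for the next run.
    var setCookie = res.headers['set-cookie'] || [];
    var cookieString = setCookie.map(function (c) { return c.split(';')[0]; }).join('; ');
    if (cookieString) {
        fs.writeFileSync(COOKIE_FILE, cookieString);
    }
});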
In my case, I've used the 'http' library like the following:
http.get(url, function(response) {
variable = response.headers['set-cookie'];
})
This function gets a specific cookie value from a server response (in TypeScript):
function getResponseCookieValue(res: Response, param: string) {
    // Several Set-Cookie values may be joined into one header string with ", ".
    const setCookieHeader = res.headers.get('Set-Cookie');
    // Capture the requested cookie's value: "<param>=<value>; " either at the
    // start of the header or right after a ", " separator.
    const parts = setCookieHeader?.match(new RegExp(`(^|, )${param}=([^;]+); `));
    const value = parts ? parts[2] : undefined;
    return value;
}
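A hypothetical usage, assuming a runtime where the Set-Cookie header is actually exposed on the Response object (browsers hide it from fetch; some server-side runtimes expose it):
// Fetch a login endpoint and read a session cookie from the response.
fetch('https://example.com/login', { method: 'POST' }).then((res) => {
    const sessionId = getResponseCookieValue(res, 'sessionid');
    console.log(sessionId);
});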
I use Axios personally.
axios.request(options).then(function (response) {
console.log(response.config.headers.Cookie)
}).catch(function (error) {
console.error(error)
});
I'm currently trying to use an API, and the developer console of that app asks the developer to submit a callback URL. Whenever the user of the app does something, it submits a GET request to the callback URL and I can retrieve data from that request. The current URL I am using is https://appId:javascript-key=myJavascriptKey#api.parse.com/1/functions/receiveInfo. How can I handle the data, i.e. the GET parameters, from the GET request? I found an answer on Parse.com that explains how to retrieve data from a POST request, but all it says is that data = request.body. Do I do the same for GET requests, and if so, what do I do after that? Is request.body a JSON value?
Parse.Cloud.define("receiveInfo", function(request,response){
var params = request.body;//is this right to get the GET parameters they send? if so what do I do next?
});
The documentation has your solution at: https://parse.com/docs/cloud_code_guide#functions
For GET requests you have to use the request.params object, which holds all of the request parameters for a GET. POST data is sent in the request body, GET data in the request parameters.
It looks like you are trying to get the params, so you can use something similar to:
Parse.Cloud.define("myMethod", function(request, response) {
if(request.params.myparam == "moo") {
response.success("Cow!");
}
else {
response.error("Unknown type of animal");
}
});
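For example, calling the function from the Parse JavaScript SDK passes the parameter the same way, and, per the above, if the external service hits your callback URL with ?myparam=moo appended, the value shows up in request.params as well:
// Hypothetical client-side call; 'myparam' matches the cloud function above.
Parse.Cloud.run("myMethod", { myparam: "moo" }).then(function (result) {
    console.log(result); // "Cow!"
}, function (error) {
    console.error(error);
});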
I am facing a very strange issue here. I have a servlet running on my machine which renders my web page based on some input parameters.
Now, my screen capture with PhantomJS is not working if I try to send my data as a JSON object with a POST request. For example, if I try:
Client side
var data = {"op": "get"};
page.open(address, 'POST', data, function (status) {
    if (status !== 'success') {
        console.log('[ERROR] :: Unable to fetch the address - ' + address + ", with data - " + data);
        phantom.exit();
    } else {
        page.render(output);
    }
    console.log('processing...');
});
Server side
Now, on the server side, I am using Apache Velocity view templating, so I have a single method which handles both GET and POST, like:
public Template handleRequest(HttpServletRequest request, HttpServletResponse response,
Context context){
System.out.println(request.getParameter("op"));
//Always null
}
However, if I try sending my data from PhantomJS as:
var data = "op=get&..."
it works.
Also, in many places elsewhere in my code I am doing Ajax POST requests to the same servlet, and it works perfectly for all of those requests.
Would anybody explain why my servlet is not reading the JSON parameters passed from PhantomJS?
Servlets deal with simple requests, so they only know how to parse (natively) HTTP parameters: either GET parameters from the URL, or POST parameters sent as application/x-www-form-urlencoded. Newer versions of the Servlet specification can also read multipart/form-data.
But there's nothing about JSON mentioned either in the Servlet or the HTTP specifications. So you must use a library that knows how to parse JSON and make the resulting object available in the Velocity context.
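For completeness, a sketch of the PhantomJS side if you do decide to send JSON (the settings-object form of page.open lets you set the Content-Type; the servlet still has to read the raw request body and parse it with a JSON library, since getParameter will not see it):
var data = { op: 'get' };
page.open(address, {
    operation: 'POST',
    data: JSON.stringify(data), // PhantomJS expects the body as a string
    headers: { 'Content-Type': 'application/json' }
}, function (status) {
    if (status !== 'success') {
        console.log('[ERROR] :: Unable to fetch the address - ' + address);
        phantom.exit();
    } else {
        page.render(output);
    }
});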