How can I use a Swagger-generated API client on the client side (a normal browser application without Node.js)?
As a first test I generated a JavaScript client for Swagger's Petstore API (https://petstore.swagger.io/v2) using editor.swagger.io.
The generated code contains an index.js that provides access to the constructors of the public API classes, which I am trying to embed and use in my web application.
The documentation describes the usage of the API like so:
var SwaggerPetstore = require('swagger_petstore');
var defaultClient = SwaggerPetstore.ApiClient.instance;
// Configure API key authorization: api_key
var api_key = defaultClient.authentications['api_key'];
api_key.apiKey = 'YOUR API KEY';
// Uncomment the following line to set a prefix for the API key, e.g. "Token" (defaults to null)
//api_key.apiKeyPrefix = 'Token';
var apiInstance = new SwaggerPetstore.PetApi();
var petId = 789; // Number | ID of pet to return
var callback = function(error, data, response) {
    if (error) {
        console.error(error);
    } else {
        console.log('API called successfully. Returned data: ' + data);
    }
};
apiInstance.getPetById(petId, callback);
This works fine for Node.js applications. But how can I use the API in a conventional client-side web app inside the browser? There, the Node.js function require is not available.
From https://github.com/swagger-api/swagger-js:
The example below has a cross-origin problem, but it should work in your own project:
<html>
<head>
    <script src='//unpkg.com/swagger-client' type='text/javascript'></script>
    <script>
        var specUrl = '//petstore.swagger.io/v2/swagger.json'; // data urls are OK too 'data:application/json;base64,abc...'
        SwaggerClient.http.withCredentials = true; // this activates CORS, if necessary
        var swaggerClient = new SwaggerClient(specUrl)
            .then(function (swaggerClient) {
                return swaggerClient.apis.pet.addPet({id: 1, name: "bobby"}); // chaining promises
            }, function (reason) {
                console.error("failed to load the spec" + reason);
            })
            .then(function (addPetResult) {
                console.log(addPetResult.obj);
                // you may return more promises, if necessary
            }, function (reason) {
                console.error("failed on API call " + reason);
            });
    </script>
</head>
<body>
    check console in browser's dev. tools
</body>
</html>
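For comparison with the Node.js snippet above, the equivalent getPetById call through the browser swagger-client would look roughly like this (a sketch reusing the specUrl from the example; operations are grouped by tag, so the call is client.apis.pet.getPetById, and parameters are passed by name):

var specUrl = 'https://petstore.swagger.io/v2/swagger.json';
new SwaggerClient(specUrl)
    .then(function (client) {
        // tag "pet", operationId "getPetById", path parameter "petId"
        return client.apis.pet.getPetById({petId: 789});
    }, function (reason) {
        console.error("failed to load the spec: " + reason);
    })
    .then(function (result) {
        console.log('API called successfully. Returned data:', result.obj);
    }, function (reason) {
        console.error("failed on API call: " + reason);
    });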
I am using a service worker to intercept requests and provide the responses to those fetch requests by communicating with a web worker (also created from the same parent page).
I have used message channels for direct communication between the worker and service worker. Here is a simple POC I have written:
var otherPort, parentPort;
var dummyObj;

var DummyHandler = function()
{
    this.onmessage = null;
    var selfRef = this;
    this.callHandler = function(arg)
    {
        if (typeof selfRef.onmessage === "function")
        {
            selfRef.onmessage(arg);
        }
        else
        {
            console.error("Message Handler not set");
        }
    };
};

function msgFromW(evt)
{
    console.log(evt.data);
    dummyObj.callHandler(evt);
}

self.addEventListener("message", function(evt) {
    var data = evt.data;
    if (data.msg === "connect")
    {
        otherPort = evt.ports[1];
        otherPort.onmessage = msgFromW;
        parentPort = evt.ports[0];
        parentPort.postMessage({"msg": "connect"});
    }
});

self.addEventListener("fetch", function(event)
{
    var url = event.request.url;
    var urlObj = new URL(url);
    if (!isToBeIntercepted(url))
    {
        return fetch(event.request);
    }
    url = decodeURI(url);
    var key = processURL(url).toLowerCase();
    console.log("Fetch For: " + key);
    event.respondWith(new Promise(function(resolve, reject) {
        dummyObj = new DummyHandler();
        dummyObj.onmessage = function(e)
        {
            if (e.data.error)
            {
                reject(e.data.error);
            }
            else
            {
                var content = e.data.data;
                var blob = new Blob([content]);
                resolve(new Response(blob));
            }
        };
        otherPort.postMessage({"msg": "content", param: key});
    }));
});
Roles of the ports:
otherPort: Communication with worker
parentPort: Communication with parent page
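For completeness, the parent page sets these channels up along the following lines (a rough sketch with illustrative names, not my exact code):

// parent page: create two channels and hand one port of each to the service worker
var worker = new Worker("worker.js");
var parentChannel = new MessageChannel(); // service worker <-> parent page
var workerChannel = new MessageChannel(); // service worker <-> web worker

// assumes the page is already controlled by the service worker
navigator.serviceWorker.controller.postMessage(
    {"msg": "connect"},
    [parentChannel.port1, workerChannel.port1] // arrive as evt.ports[0] and evt.ports[1]
);

// give the web worker its end of the second channel
worker.postMessage({"msg": "connect"}, [workerChannel.port2]);

parentChannel.port2.onmessage = function(evt)
{
    console.log("from service worker:", evt.data);
};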
In the worker, I have a database, something like this:
var dataBase = {
    "file1.txt": "This is File1",
    "file2.txt": "This is File2"
};
The worker just serves the correct data according to the key sent by the service worker. In reality these will be very large files.
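In essence, the worker handles these messages like this (a rough sketch of that logic):

// web worker: answer "content" requests with the matching entry from dataBase
var swPort;
self.onmessage = function(evt)
{
    if (evt.data.msg === "connect")
    {
        swPort = evt.ports[0];
        swPort.onmessage = function(e)
        {
            if (e.data.msg === "content")
            {
                var content = dataBase[e.data.param];
                if (content)
                {
                    swPort.postMessage({data: content});
                }
                else
                {
                    swPort.postMessage({error: "not found: " + e.data.param});
                }
            }
        };
    }
};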
The problem I am facing with this is the following:
Since I am using a global dummyObj, the older dummyObj (and hence its onmessage handler) is lost, and only the latest resource gets responded to with the received data.
In fact, file2 gets "This is File1", because the latest dummyObj is the one for file2.txt, but the worker sends the data for file1.txt first.
I tried this by creating iframes directly, and all the requests inside them are intercepted:
<html>
<head></head>
<body>
    <iframe src="tointercept/file1.txt"></iframe>
    <iframe src="tointercept/file2.txt"></iframe>
</body>
</html>
Here is what I get as output:
One approach could be to have the worker write all the files that could be fetched into IndexedDB before creating the iframes, and then have the service worker fetch them from IndexedDB. But I don't want to store all the resources in IDB, so this approach is not what I want.
Does anybody know a way to accomplish what I am trying to do in some other way? Or is there a fix to what I am doing?
Please help!
UPDATE
I have got this to work by queuing the dummyObjs in a global queue instead of having a single global object. On receiving a response from the worker in msgFromW, I pop an element from the queue and call its callHandler function.
But I am not sure whether this is a reliable solution, as it assumes that everything occurs in order. Is this assumption correct?
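Roughly, the change looks like this (a sketch of the workaround; it assumes the worker answers requests strictly in the order they arrive):

// sketch of the queue-based workaround: one handler per outstanding request
var pendingHandlers = [];

function msgFromW(evt)
{
    console.log(evt.data);
    var handler = pendingHandlers.shift(); // assumes responses come back in request order
    if (handler)
    {
        handler.callHandler(evt);
    }
}

// inside the fetch handler, instead of overwriting the single global dummyObj:
//   var dummyObj = new DummyHandler();
//   pendingHandlers.push(dummyObj);
//   otherPort.postMessage({"msg": "content", param: key});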
I'd recommend wrapping your message passing between the service worker and the web worker in promises, and then passing a promise that resolves with the data from the web worker to fetchEvent.respondWith().
The promise-worker library can automate this promise-wrapping for you, or you could do it by hand, using this example as a guide.
If you were using promise-worker, your code would look something like:
var promiseWorker = new PromiseWorker(/* your web worker */);

self.addEventListener('fetch', function(fetchEvent) {
    if (/* some optional check to see if you want to handle this event */) {
        fetchEvent.respondWith(promiseWorker.postMessage(/* file name */));
    }
});
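If you'd rather wire it up by hand, the same idea can be implemented by giving each request its own MessageChannel, so no handler is ever shared or overwritten. A sketch (workerPort stands for however the service worker reaches the web worker, e.g. the otherPort from the question, and the message shape mirrors the question's convention):

// one MessageChannel per request: the reply port is private to this request
function askWorker(workerPort, key) {
    return new Promise(function(resolve, reject) {
        var channel = new MessageChannel();
        channel.port1.onmessage = function(e) {
            if (e.data.error) {
                reject(e.data.error);
            } else {
                resolve(e.data.data);
            }
        };
        // transfer port2 to the worker so it can reply directly to this request
        workerPort.postMessage({"msg": "content", param: key}, [channel.port2]);
    });
}

self.addEventListener('fetch', function(fetchEvent) {
    var key = processURL(decodeURI(fetchEvent.request.url)).toLowerCase(); // helpers from the question
    fetchEvent.respondWith(
        askWorker(otherPort, key).then(function(content) {
            return new Response(new Blob([content]));
        })
    );
});

The web worker would then answer on the port it receives alongside each message (e.ports[0]) instead of on the shared connection.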
I am using IBM Bluemix to make a web service for a school project.
My project needs to request JSON from an API so I can use the data it provides. I use http.get for one data set, and I am not sure it is working properly.
When I run my code, I get the message:
Error: Protocol "https:" not supported. Expected "http:"
What is causing it and how can I solve it?
Here is my .js file:
/*eslint-env node*/
//------------------------------------------------------------------------------
// node.js starter application for Bluemix
//------------------------------------------------------------------------------

// HTTP request - two alternatives
var http = require('http');
var request = require('request');

// cfenv provides access to your Cloud Foundry environment
// for more info, see: https://www.npmjs.com/package/cfenv
var cfenv = require('cfenv');

// call express, which opens the server
var express = require('express');

// create a new express server
var app = express();

// serve the files out of ./public as our main files
app.use(express.static(__dirname + '/public'));

// get the app environment from Cloud Foundry
var appEnv = cfenv.getAppEnv();

// start server on the specified port and binding host
app.listen(appEnv.port, '0.0.0.0', function() {
    // print a message when the server starts listening
    console.log("server starting on " + appEnv.url);
});

app.get('/home1', function (req, res) {
    http.get('http://developers.agenciaideias.com.br/cotacoes/json', function (res2) {
        var body = '';
        res2.on('data', function (chunk) {
            body += chunk;
        });
        res2.on('end', function () {
            var json = JSON.parse(body);
            var CotacaoDolar = json["dolar"]["cotacao"];
            var VariacaoDolar = json["dolar"]["variacao"];
            var CotacaoEuro = json["euro"]["cotacao"];
            var VariacaoEuro = json["euro"]["variacao"];
            var Atualizacao = json["atualizacao"];
            obj = req.query;
            DolarUsuario = obj['dolar'];
            RealUsuario = Number(obj['dolar']) * CotacaoDolar;
            EuroUsuario = obj['euro'];
            RealUsuario2 = Number(obj['euro']) * CotacaoEuro;
            Oi = 1 * VariacaoDolar;
            Oi2 = 1 * VariacaoEuro;
            if (VariacaoDolar < 0) {
                recomend = "Recomenda-se, portanto, comprar dólares.";
            }
            else if (VariacaoDolar === 0) {
                recomend = "";
            }
            else {
                recomend = "Recomenda-se, portanto, vender dólares.";
            }
            if (VariacaoEuro < 0) {
                recomend2 = "Recomenda-se, portanto, comprar euros.";
            }
            else if (VariacaoEuro === 0) {
                recomend2 = "";
            }
            else {
                recomend2 = "Recomenda-se, portanto, vender euros.";
            }
            res.render('cotacao_response.jade', {
                'CotacaoDolar': CotacaoDolar,
                'VariacaoDolar': VariacaoDolar,
                'Atualizacao': Atualizacao,
                'RealUsuario': RealUsuario,
                'DolarUsuario': DolarUsuario,
                'CotacaoEuro': CotacaoEuro,
                'VariacaoEuro': VariacaoEuro,
                'RealUsuario2': RealUsuario2,
                'recomend': recomend,
                'recomend2': recomend2,
                'Oi': Oi,
                'Oi2': Oi2
            });
            app.get('/home2', function (req, res) {
                http.get('https://www.quandl.com/api/v3/datasets/BCB/432.json?api_key=d1HxqKq2esLRKDmZSHR2', function (res3) {
                    var body = '';
                    res3.on('data', function (chunk) {
                        body += chunk;
                    });
                    res3.on('end', function () {
                        var json = JSON.parse(body);
                        var x = json.dataset.data[0][1];
                        console.log("My JSON is " + x);
                    });
                });
            });
        });
    });
});
Here is a screenshot of the error screen I get:
When you want to request an https resource, you need to use https.get, not http.get.
https://nodejs.org/api/https.html
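Applied to the /home2 handler above, the change is essentially one word (a minimal sketch, kept as close to the original handler as possible):

var https = require('https');

app.get('/home2', function (req, res) {
    https.get('https://www.quandl.com/api/v3/datasets/BCB/432.json?api_key=d1HxqKq2esLRKDmZSHR2', function (res3) {
        var body = '';
        res3.on('data', function (chunk) {
            body += chunk;
        });
        res3.on('end', function () {
            // parse the body before reading from it
            var json = JSON.parse(body);
            console.log("My JSON is " + json.dataset.data[0][1]);
        });
    });
});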
As a side note to anyone looking for a solution from Google... make sure you are not using an http.Agent with an https request or you will get this error.
The reason for this error is that you are trying to call an HTTPS URI with an HTTP client. The ideal solution would be for a generic module to figure out the URI's protocol and decide internally whether to use HTTPS or HTTP.
The way I overcame this problem was to do the switching myself.
Below is some code which did the switching for me.
var http = require('http');
var https = require('https');

// Setting http to be the default client to retrieve the URI.
var url = new URL("https://www.google.com");
var client = http; /* default client */

// One option is to check the URL string itself:
/*
if (url.toString().indexOf("https") === 0) {
    client = https;
}
*/

/* Enhancement: you can use url.protocol as well.
 * The URL object provides a property url.protocol that gives you
 * the protocol value (determined by the protocol ID before
 * the ":" in the URL).
 * This makes it easier to determine the protocol, and to support
 * other protocols like ftp, file, etc.
 */
client = (url.protocol == "https:") ? https : client;

// Now `client` holds the correct module to retrieve the URI.
var req = client.get(url, function(res) {
    // Do what you wanted to do with the response 'res'.
    console.log(res);
});
Not sure why, but for me the issue appeared after updating Node to version 17; I was previously using version 12.
In my setup, node-fetch uses an HttpsProxyAgent as the agent in the options object.
options['agent'] = new HttpsProxyAgent(`http://${process.env.AWS_HTTP_PROXY}`)
response = await fetch(url, options)
Switching back to Node 12 fixed the problem:
nvm use 12.18.3
I got this error while deploying the code.
INFO error=> TypeError [ERR_INVALID_PROTOCOL]: Protocol "https:" not supported. Expected "http:"
at new NodeError (node:internal/errors:372:5)
To fix this issue, I updated the "https-proxy-agent" package version to "^5.0.0".
After that the error was gone and it is working for me.
While trying to build a web application using the Sinch Instant Messaging SDK, I ran into an issue of not being able to receive instant messages using the latest JavaScript Instant Messaging SDK found here. I have also been following along with this tutorial to help build my app, which I think uses a different version of the SDK in which instant messages can be received. However, the SDK version in the tutorial does not let me use generated userTickets for authentication, while the latest SDK version does.
So, I was wondering if there was a way to either use generated userTickets for the SDK found in the tutorial or receive instant messages using the latest SDK.
With the latest SDK I have tried setting supportActiveConnection to true during configuration in order to receive messages using the code from the tutorial, with no success. Here are some of the relevant code snippets from the tutorial for receiving messages:
sinchClient = new SinchClient({
    applicationKey: 'APP_KEY',
    capabilities: {
        messaging: true
    },
    supportActiveConnection: true,
});

var loginObject = {
    username: username,
    password: password
};

sinchClient.start(loginObject, function() {
    global_username = username;
    showPickRecipient();
}).fail(handleError);

var messageClient = sinchClient.getMessageClient();

var eventListener = {
    onIncomingMessage: function(message) {
        if (message.senderId == global_username) {
            $('div#chatArea').append('<div>' + message.textBody + '</div>');
        } else {
            $('div#chatArea').append('<div style="color:red;">' + message.textBody + '</div>');
        }
    }
};

messageClient.addEventListener(eventListener);
The authentication ticket is generated by a Python back end through the following function and handler:
def getAuthTicket(username):
    userTicket = {
        'identity': {'type': 'username', 'endpoint': username},
        'expiresIn': 3600,
        'applicationKey': APPLICATION_KEY,
        'created': datetime.utcnow().isoformat()
    }
    userTicketJson = json.dumps(userTicket).replace(" ", "")
    userTicketBase64 = base64.b64encode(userTicketJson)
    # TicketSignature = Base64 ( HMAC-SHA256 ( ApplicationSecret, UTF8 ( UserTicketJson ) ) )
    digest = hmac.new(base64.b64decode(APPLICATION_SECRET),
                      msg=userTicketJson, digestmod=hashlib.sha256).digest()
    signature = base64.b64encode(digest)
    # UserTicket = TicketData + ":" + TicketSignature
    signedUserTicket = userTicketBase64 + ':' + signature
    return {"userTicket": signedUserTicket}


class TicketHandler(BaseHandler):
    def get(self):
        self.response.write(getAuthTicket(self.username))
Then, on the client side, I make a GET request to the ticket handler.
$.get('/ticket', function(authTicket) {
    sinchClient.start(eval("(" + authTicket + ")"))
        .then(function() {
            console.log("success");
        })
        .fail(function(error) {
            console.log("fail");
        });
});
The error I get when I try to start the Sinch client using the sinch.min.js file from the tutorial is "no valid identity or authentication ticket".
I would like to know the right way to work with Google Cloud Endpoints in an Alloy Titanium application, and I would like to use the library that Google provides for the API endpoints.
I am new to Alloy and CommonJS, so I am trying to figure out the right way to do this.
From my understanding, Alloy prefers (or only allows) including JavaScript via modules (CommonJS exports):
var module = require('google.js');
google.api.endpoint.execute();
This would be the way CommonJS expects things to work, although the Google JavaScript library just creates a global variable called "gapi".
Is there a way I can include this file?
Is there a way I can create global variables?
Should I stay away from creating them in the first place?
Thanks!
The client.js library that Google provides for the API endpoints can only be run from browsers (a Titanium.UI.WebView in this case); it can't be run directly from Titanium code, since it contains objects that are not available in Titanium Appcelerator.
Also, using a Google Cloud Endpoint in an Alloy Titanium application requires having the JS code available in the project at compile time, as it is used by Titanium to generate the native code for the desired platforms.
To answer your questions:
Is there a way I can include this file?
No, if you plan to run the code as Titanium code, for the reasons mentioned above. Instead, you could use the following code snippet to connect to a Google Cloud Endpoint:
var url = "https://1-dot-projectid.appspot.com/_ah/api/rpc";
var methodName = "testendpoint.listGreetings";
var apiVersion = "v1";

callMethod(url, methodName, apiVersion, {
    success : function(responseText) {
        // work with the response
    },
    error : function(e) {
        // on error, do something
    }
});

function callMethod(url, methodName, apiVersion, callbacks) {
    var xhr = Titanium.Network.createHTTPClient();
    xhr.onload = function(e) {
        Ti.API.info("received text: " + this.responseText);
        if (typeof callbacks.success === 'function') {
            callbacks.success(this.responseText);
        }
    };
    xhr.onerror = function(e) {
        Ti.API.info(JSON.stringify(e));
        //Ti.API.info(e.responseText);
        if (typeof callbacks.error === 'function') {
            callbacks.error(e);
        }
    };
    xhr.timeout = 5000; /* in milliseconds */
    xhr.open("POST", url, true);
    xhr.setRequestHeader('Content-Type', 'application/json-rpc');
    //xhr.setRequestHeader('Authorization', 'Bearer ' + token);
    var d = [{
        jsonrpc: '2.0',
        method: methodName,
        id: 1,
        apiVersion: apiVersion
    }];
    Ti.API.info(JSON.stringify(d));
    // Send the request.
    xhr.send(JSON.stringify(d));
}
Yes, if you use the device's embedded browser like this (as can be found in the GAE web client samples):
webview = Titanium.UI.createWebView({
    width : '100%',
    height : '100%',
    url : url // put your link to the HTML page here
});
to call your server's HTML page, which should contain:
<script src="https://apis.google.com/js/client.js?onload=init"></script>
Is there a way I can create global variables?
Yes, define your global variables in the app/alloy.js file; see the default comments in that file:
// This is a great place to do any initialization for your app
// or create any global variables/functions that you'd like to
// make available throughout your app. You can easily make things
// accessible globally by attaching them to the Alloy.Globals
// object. For example:
//
Alloy.Globals.someGlobalFunction = function(){};
Alloy.Globals.someGlobalVariable = "80dp";
Should I stay away from creating them in the first place?
I suppose you are referring to global variables containing the module code for connecting to GAE endpoint methods. It's your call; here is how you can use them.
a) Create a file named jsonrpc.js in the app/lib folder of your Titanium project, put the following code into it, and use the callMethod function code from above as the function body:
JSONRPCClient = function () {
};

JSONRPCClient.prototype = {
    callMethod : function (url, methodName, apiVersion, callbacks) {
        // insert the function body here
    }
};

exports.JSONRPCClient = JSONRPCClient;
b) In the app/alloy.js file, define your global variable:
Alloy.Globals.JSONRPCClient = require('jsonrpc').JSONRPCClient;
c) Use it (e.g. from your controller JS files):
var client = new Alloy.Globals.JSONRPCClient();
var url = "https://1-dot-projectid.appspot.com/_ah/api/rpc";
var methodName = "testendpoint.listGreetings";
var apiVersion = "v1";

client.callMethod(url, methodName, apiVersion, {
    success: function(result) {
        // result handling
        Ti.API.info('response result=', JSON.stringify(result));
        //alert(JSON.stringify(result));
    },
    error: function(err) {
        // error handling
        Ti.API.info('response out err=', JSON.stringify(err));
    }
});
How would you go about creating a streaming API with Node, just like the Twitter streaming API?
What I ultimately want to do is get the first update from the FriendFeed API, stream when a new one becomes available (if the id is different), and later on expose it as a web service so I can use it with WebSockets on my website :).
So far I have this:
var sys = require('sys'),
    http = require('http');

var ff = http.createClient(80, 'friendfeed-api.com');
var request = ff.request('GET', '/v2/feed/igorgue?num=1',
                         {'host': 'friendfeed-api.com'});

request.addListener('response', function (response) {
    response.setEncoding('utf8'); // this is *very* important!
    response.addListener('data', function (chunk) {
        var data = JSON.parse(chunk);
        sys.puts(data.entries[0].body);
    });
});
request.end();
This only gets the data from FriendFeed. Creating the HTTP server with Node is easy, but I can't get it to return a stream (or I haven't yet found out how).
You would want to set up a system that keeps track of incoming requests and stores their response objects. Then, when it's time to stream a new event from FriendFeed, iterate through those response objects and responses[i].write('something') out to each of them.
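A minimal sketch of that idea with Node's http module (the code that polls FriendFeed and calls broadcast() is left out, and all names are illustrative):

var http = require('http');

var responses = []; // open response objects of connected listeners

http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'application/json'});
    responses.push(res);
    // stop writing to clients that disconnect
    req.on('close', function () {
        responses = responses.filter(function (r) { return r !== res; });
    });
}).listen(8000);

// call this whenever a new FriendFeed entry shows up
function broadcast(entry) {
    responses.forEach(function (res) {
        res.write(JSON.stringify(entry) + '\n');
    });
}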
Check out LearnBoost's Socket.IO-Node, you may even just be able to use that project as your framework and not have to code it yourself.
From the Socket.IO-Node example app (for chat):
io.listen(server, {
    onClientConnect: function(client) {
        client.send(json({ buffer: buffer }));
        client.broadcast(json({ announcement: client.sessionId + ' connected' }));
    },
    onClientDisconnect: function(client) {
        client.broadcast(json({ announcement: client.sessionId + ' disconnected' }));
    },
    onClientMessage: function(message, client) {
        var msg = { message: [client.sessionId, message] };
        buffer.push(msg);
        if (buffer.length > 15) buffer.shift();
        client.broadcast(json(msg));
    }
});