Can swagger generate a javascript client with CommonJS instead of ESM?

When I use Swagger-codegen 3.x to generate a JavaScript client, I noticed that the generated client only works with ESM rather than CommonJS (`require()` syntax). That is, to test the endpoint I have to create a test.mjs file instead of a test.js file.
The docker image I used is "swaggerapi/swagger-generator-v3:latest" and the endpoint I'm using is "http://localhost:8080/api/generate". I passed the useES6 option in the body, but it did not work.
Here is the body of the POST request I send. Is there any way to generate a CommonJS client?
json: {
  "options": {},
  "spec": parsedRawdata,
  "codegenVersion": "V3",
  "lang": "javascript",
  "type": "CLIENT"
},
headers: {
  "Accept": "application/octet-stream",
  "Content-Type": "application/json"
}
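For what it's worth, the generator's language-specific flags are usually passed inside the options object rather than at the top level. Here is a sketch of the body under that assumption, with spec shown as a placeholder for the parsed spec object; whether useES6 set to false actually switches the output to CommonJS depends on the generator version, so treat this as something to try, not a confirmed fix:

```json
{
  "options": {
    "useES6": "false"
  },
  "spec": {},
  "codegenVersion": "V3",
  "lang": "javascript",
  "type": "CLIENT"
}
```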


Nextjs app with proxy - headers not included in response

I'm proxying requests to an external backend from a Next.js app and trying to extract an Authorization header from the response using the native fetch API. However, the headers in the fetch response are empty even though they are present in the actual response from the server.
Here's the setup in the next.config.js file:
module.exports = {
  async headers() {
    return [
      {
        source: '/api/:path*',
        headers: [
          {
            key: 'Access-Control-Expose-Headers',
            value: '*, authorization',
          },
        ],
      },
    ]
  },
  async rewrites() {
    return [
      {
        source: '/api/:path*',
        destination: `${process.env.NEXT_PUBLIC_BASE_URL}/api/:path*`,
      },
    ]
  },
}
With my limited knowledge of proxies, I'm assuming it's a reverse proxy and the headers are lost somewhere inside it on the way back.
I've also tried adding a middleware to localize where the headers are being lost, without any luck.
Does anyone know where I can configure the proxy to include the headers in the response?
Thanks in advance! :)
I am reusing a fetcher from react-native and it seems like the method for extracting headers differs. 😅
This works using fetch in the browser:
response.headers.get('authorization')
In react-native:
response.headers.map['authorization']
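The two access patterns can be wrapped in a single helper so the same fetcher works in both environments. A minimal sketch, with the getHeader name being mine rather than anything from the post:

```javascript
// Sketch: read a response header whether the runtime exposes a standard
// Headers object (browser fetch) or a plain map (older react-native
// fetch implementations).
function getHeader(response, name) {
  const headers = response.headers;
  if (headers && typeof headers.get === 'function') {
    return headers.get(name); // standard Fetch API path
  }
  if (headers && headers.map) {
    return headers.map[name.toLowerCase()]; // react-native style
  }
  return undefined;
}
```

Usage would then be the same everywhere: const token = getHeader(response, 'authorization').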

Token visibility in Next.js

Is a fetch request token in a next.js app visible to a client side user?
I need to prepare an application using the GitHub GraphQL API. I thought about using a fetch request with a bearer token for it, but I have doubts about security and good practices in my app.
fetch('https://api.github.com/graphql', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer ' + TOKEN,
  },
  body: JSON.stringify({
    query: `
      {
        search(query: "is:public", type: REPOSITORY, first: 10) {
          repositoryCount
          pageInfo {
            endCursor
            startCursor
          }
          edges {
            node {
              ... on Repository {
                name
              }
            }
          }
        }
      }
    `
  })
})
According to the Next.js docs, you need to create a .env.local file at the root of the project directory and add the secrets.
If you need the secret in browser-side code, you must prefix it with NEXT_PUBLIC_ to expose it to the browser. For example:
NEXT_PUBLIC_GITHUB_TOKEN=amogussus
According to the docs:
The value will be inlined into JavaScript sent to the browser because of the NEXT_PUBLIC_ prefix.
You can then read it in your code as process.env.NEXT_PUBLIC_GITHUB_TOKEN.
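To make the trade-off concrete, here is a hypothetical helper (buildAuthHeaders is my name, not from the post) showing how the inlined NEXT_PUBLIC_ value would be used. Keep in mind that, as the docs quote says, the token ends up readable in the shipped browser bundle, so this is only acceptable for tokens you are willing to treat as public:

```javascript
// Hypothetical helper: builds the headers object for the GraphQL fetch
// call. The token passed in would typically be
// process.env.NEXT_PUBLIC_GITHUB_TOKEN, which Next.js inlines at build
// time, making it visible to anyone inspecting the bundle.
function buildAuthHeaders(token) {
  return {
    'Content-Type': 'application/json',
    Authorization: 'Bearer ' + token,
  };
}

// In the fetch call:
// headers: buildAuthHeaders(process.env.NEXT_PUBLIC_GITHUB_TOKEN)
```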

How to cache all urls like /page/id where id is a number using workbox

Given the following code snippet from my nodejs server:
router.get('/page/:id', async function (req, res, next) {
  var id = req.params.id;
  if (typeof req.params.id === "number") { id = parseInt(id); }
  res.render('page.ejs', { vara: a, varb: b });
});
I want to do exactly what I'm doing in the Node.js server, but from the service worker.
I've generated & built it using Workbox, but I don't know how to cache all the URLs like /page/1 or /page/2 or ... /page/4353 and so on without bloating the service worker source code.
The Node.js code above works 100%.
I tried something like:
.....{
"url": "/page/\*",
"revision": "q8j4t1d072f2g6l5unc0q6c0r7vgs5w0"
},....
It doesn't work in the service worker pre-cache. When I reloaded the website with this code added to the service worker, it stayed in the installing state for quite a while. Is that normal? Can I do this without bloating the entire install process and the browser cache?
Thank you for help!
EDIT:
My service worker looks like:
importScripts("https://storage.googleapis.com/workbox-cdn/releases/3.6.2/workbox-sw.js");
if (workbox) {
  console.log('Workbox status : ONLINE');
} else {
  console.log('Workbox status : OFFLINE');
}
workbox.skipWaiting();
workbox.clientsClaim();
self.__precacheManifest = [
  {
    "url": "/",
    "revision": "7e50eec344ce4d01730894ef1d637d4d"
  },
  'head.ejs',
  'navbar.ejs',
  'map_script.ejs',
  'scripts.ejs',
  'details.json',
  'page.ejs',
  'home.ejs',
  'map.ejs',
  'about.ejs',
  {
    "url": "/page",
    "revision": "881d8ca1f2aacfc1617c09e3cf7364f0",
    "cleanUrls": "true"
  },
  {
    "url": "/about",
    "revision": "11d729194a0669e3bbc938351eba5d00"
  },
  {
    "url": "/map",
    "revision": "c3942a2a8ac5b713d63c53616676974a"
  },
  {
    "url": "/getJson",
    "revision": "15c88be34ben24a683f7be97fd8abc4e"
  },
  {
    "url": "/getJson1",
    "revision": "15c88be34bek24a6l3f7be97fd3aoc4e"
  },
  {
    "url": "/getJson2",
    "revision": "15c82be34ben24a683f7be17fd3amc4e"
  },
  {
    "url": "/getJson3",
    "revision": "15c62be94ben24a683f7be17gd3amc4r"
  },
  {
    "url": "/getJson4",
    "revision": "15c62beh4ben24a6g3f7be97gd3amc4p"
  },
  {
    "url": "/public/_processed_/2/7/csm.jpg",
    "revision": "15c62beh4bek44a6g3f7ben7gd3amc4p"
  },
  {
    "url": "/public/_processed_/2/7/csm.jpg",
    "revision": "15c62beh4ben24a6g3f7be9ngd3a2c4p"
  }
].concat(self.__precacheManifest || []);
workbox.precaching.suppressWarnings();
workbox.precaching.precacheAndRoute(self.__precacheManifest, {});
Workbox service worker has its own process of caching.
Based from the documentation of Workbox PreCaching:
One feature of service workers is the ability to save a set of files
to the cache when the service worker is installing. This is often
referred to as "precaching", since you are caching content ahead of
the service worker being used.
The precacheAndRoute method sets up an implicit cache-first handler. This is why the home page loaded while we were offline even though we had not written a fetch handler for those files.
If a request fails to match the precache, we'll add .html to end to
support "clean" URLs (a.k.a "pretty" URLs). This means a request like
/about will match /about.html.
workbox.precaching.precacheAndRoute(
  [
    '/styles/index.0c9a31.css',
    '/scripts/main.0d5770.js',
    { url: '/index.html', revision: '383676' },
  ],
  {
    cleanUrls: true,
  }
);
You can try to implement this method of caching files with service worker:
cache.addAll(requests) - This method is the same as add except it
takes an array of URLs and adds them to the cache. If any of the files
fail to be added to the cache, the whole operation will fail and none
of the files will be added.
For more infos:
Workbox-SW
Workbox Codelab
The idea behind the precache is to cache the minimal set of web assets -that you need to get your page up and running - during service worker installation. It is generally not used to cache a whole bunch of data.
What you probably need is a dynamic cache:
workbox.routing.registerRoute(
  /\/page\//,
  workbox.strategies.cacheFirst({
    cacheName: 'my-cache',
  })
);
In this case, the first hit will get the page from the network. Thereafter it will be served from the cache.
I think your issue may be that you are precaching clean URLs according to the manifest, but you are actually requesting dynamic URLs with different params. For that, I suspect you want to use a regex route.
An example for caching google fonts:
workbox.routing.registerRoute(
  /^https:\/\/fonts\.googleapis\.com/,
  workbox.strategies.staleWhileRevalidate({
    cacheName: 'google-fonts-stylesheets'
  })
);
Maybe try something closer to this (the regex matches any numeric page id):
workbox.routing.registerRoute(
  /\/page\/\d+/,
  workbox.strategies.staleWhileRevalidate({
    cacheName: 'pages'
  })
);
Edit: I realize my answer is the same as #voiceuideveloper but I believe this is the right answer. I do this in my current app and it works well.

Node.js server responding with javascript file, not main .html?

I have a simple Node.js server which is meant to respond to GET requests at http://localhost:3000/hi/ with my index.html document, and I cannot figure out why it is responding with both an (index) document and index.js.
My function which works with router objects is:
const http = require('http');
const path = require('path');
const fs = require('fs');

let contacts = {
  "1234": {
    id: "1234",
    phone: "77012345678",
    name: "Sauron",
    address: "1234 Abc"
  },
  "4567": {
    id: "4567",
    phone: "77012345678",
    name: "Saruman",
    address: "Orthanc, Isengard"
  },
};

let loadStatic = (req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/html' });
  fs.readFile('./index.html', null, (err, data) => {
    if (err) {
      console.log('error');
    } else {
      res.write(data);
    }
    res.end(); // always end the response, even on a read error
  });
}

let routes = [
  {
    method: 'GET',
    url: /^\/contacts\/[0-9]+$/,
    run: getContact
  },
  {
    method: 'DELETE',
    url: /^\/contacts\/[0-9]+$/,
    run: deleteContact
  },
  {
    method: 'GET',
    url: /^\/contacts\/?$/,
    run: getContacts
  },
  {
    method: 'POST',
    url: /^\/contacts\/?$/,
    run: createContact
  },
  {
    method: 'GET',
    url: /\/hi\//,
    run: loadStatic
  },
  {
    method: 'GET',
    url: /^.*$/,
    run: notFound
  }
];

let server = http.createServer((req, res) => {
  let route = routes.find(route =>
    route.url.test(req.url) &&
    req.method === route.method
  );
  route.run(req, res);
});
server.listen(3000);
server.listen(3000);
Request URL: http://localhost:3000/hi/index.js
Request Method: GET
Status Code: 200 OK
Remote Address: [::1]:3000
Referrer Policy: no-referrer-when-downgrade
Connection: keep-alive
Content-Type: text/html
Date: Mon, 20 Aug 2018 23:41:45 GMT
Transfer-Encoding: chunked
Above is the response as processed by the Chrome browser. First of all, I have no idea what (index) is: it contains all my HTML, and it arrives along with index.js (presumably sent with the HTML doc), which is the name of the JavaScript file that's supposed to be controlling the DOM. Both contain the HTML markup, which is not right; the .js should be different, and the browser attempts to read the .js first. Also, the error in the console says "unexpected '<'", which makes sense since the browser is trying to parse a .js file that contains nothing but HTML. I am still very new to this and can't find an answer. Thanks for reading, and hopefully revising!
EDIT: I added more pertinent code to the server script. Lots of redacted functions, but those all work fine; it's this one request that isn't. index.html, index.js (this file controls the DOM and is referenced by script tags in index.html), and newserver.js (the name of this file) are in the same folder on my desktop. Hope that helps.
index.js is not "sent along" with index.html as you seem to think. The browser first requests /hi. When you send the content for that HTML file, the browser then parses that HTML file and for any <script src="xxx"> tags it finds in the HTML, the browser, then makes a separate request to your server for whatever the src file is in the script tag such as index.js.
Because you don't have a route for /index.js, it will match your notfound route (which you don't show the code for). FYI, your notfound routes should probably return a 404 status as this makes things a bit easier to debug (there will be 404 errors in the browser console which are easy to see).
You will need a route for all resources that your HTML files refer to (scripts, css files, images, etc...). node.js does not serve any files by default (unlike some other web servers). It only serves the files you specifically write routes for and write code to send.
It is possible to write a generic route that will serve any file in a specific directory that exactly matches a request. The Express framework has such as feature and is called express.static(). It is very useful for serving static resources that only need a static file sent. If you're going to continue to write your own http framework and not use one that's already been built, you will probably want to write such a static matcher. It is important, however, that you point a static file matcher at a directory that only contains public files so people can't inadvertently get access to private server files. And, one has to prevent things like .. or root paths being in the URL too.

swagger POST node.js express

I can make my GET services work in Swagger (with Node.js), but I am not able to make the POST work. Can someone point out where I am making a mistake in the testPost service below?
My service definition:
"/people/testPost": {
  "post": {
    "summary": "write the test POST.",
    "description": "Test",
    "operationId": "myService",
    "consumes": [
      "application/json"
    ],
    "tags": [
      "test"
    ],
    "parameters": [
      {
        "name": "body",
        "paramType": "body",
        "description": "body for the POST request",
        "required": false
      }
    ]
  }
}
In the service (filename = myService.js), I am trying this:
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
app.post("/people/testPost", function (req, res) {
  console.log("I am in the testPost service"); // this gets printed
  console.log("The req body", req.body); // says 'undefined'
});
Here is the curl command I use to POST:
curl -H "Content-Type: application/json" -X POST -d '{"arg1":"open", "arg2":"IVR","arg3":"", "arg4" :"1234567"}' http://localhost:8016/people/testPost
Any simple POST example that uses Swagger with Node (all I need is how to configure the "body" in the JSON file) would hopefully get me going.
Someone, please look at my "parameters" and the curl POST and tell me where I need to make changes/corrections.
Because you're receiving a JSON request, you need an appropriate handler for it. For JSON you should use app.use(bodyParser.json()); see the example on the module page.
By the way, app.use(bodyParser) on its own is not needed.
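Separately from the body-parser point, the definition itself mixes Swagger versions: paramType is Swagger 1.2 syntax. If the spec targets Swagger 2.0, a body parameter is declared with in: "body" and requires a schema, roughly like the sketch below (the bare object schema is a placeholder assumption; a real spec would describe arg1 through arg4):

```json
"parameters": [
  {
    "name": "body",
    "in": "body",
    "description": "body for the POST request",
    "required": false,
    "schema": {
      "type": "object"
    }
  }
]
```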
