Patch custom object with Kubernetes JavaScript SDK

I'm using External Secrets to sync my secrets from Azure, and now I need a programmatic way to trigger the sync. With kubectl the command is:
kubectl annotate es my-es force-sync=$(date +%s) --overwrite
So I'm trying to use the Kubernetes JavaScript SDK to do this. I can successfully get the ExternalSecret:
await crdApi.getNamespacedCustomObject("external-secrets.io", "v1beta1", "default", "externalsecrets", "my-es")
However, when I try to update it with patchNamespacedCustomObject, it always tells me "the body of the request was in an unknown format - accepted media types include: application/json-patch+json, application/merge-patch+json, application/apply-patch+yaml"
Here's my code
const kc = new k8s.KubeConfig();
kc.loadFromString(kubeConfig);
const crdApi = kc.makeApiClient(k8s.CustomObjectsApi);
let patch = [{
  "op": "replace",
  "path": "/metadata/annotations",
  "value": {
    "force-sync": "1663315075"
  }
}];
await crdApi.patchNamespacedCustomObject("external-secrets.io", "v1beta1", "default", "externalsecrets", "my-es", patch);
I am referring to their patch example here.

The options object from that example, which tells the API server which patch format the request body is in,

const options = {
  headers: {
    "Content-type": k8s.PatchUtils.PATCH_FORMAT_JSON_PATCH
  }
};

is still required; without it the request is sent with a content type the API server does not accept for patch operations, which is exactly what the error message is complaining about.
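Putting the two together, a minimal sketch of the full call could look like the following (assuming a pre-1.x release of @kubernetes/client-node, where the per-request options object is the last positional parameter after dryRun, fieldManager and force; check the signature of the version you have installed):

const options = {
  headers: { "Content-type": k8s.PatchUtils.PATCH_FORMAT_JSON_PATCH }
};

const patch = [{
  op: "replace",
  path: "/metadata/annotations",
  value: { "force-sync": String(Math.floor(Date.now() / 1000)) } // same style of value kubectl sets
}];

await crdApi.patchNamespacedCustomObject(
  "external-secrets.io", "v1beta1", "default", "externalsecrets", "my-es",
  patch,
  undefined, // dryRun
  undefined, // fieldManager
  undefined, // force
  options    // marks the body as application/json-patch+json
);

Note that a JSON Patch "replace" on /metadata/annotations only succeeds if the annotations map already exists on the object; a merge patch (application/merge-patch+json) is a common alternative when it might not.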


Why doesn't Nuxt's proxy work in production mode with my setup?

To keep users from seeing which endpoint we request data from, we are using @nuxtjs/proxy.
This is the config in nuxt.config.js:
const deployTarget = process.env.NUXTJS_DEPLOY_TARGET || 'server'
const deploySSR = (process.env.NUXTJS_SSR === 'true') || (process.env.NUXTJS_SSR === true)
And the proxy settings
proxy: {
  '/api/**/**': {
    changeOrigin: true,
    target: process.env.VUE_APP_API_URL,
    secure: true,
    ws: false,
    pathRewrite: { '^/api/': '' }
  }
},
We also deploy like so:
NUXTJS_DEPLOY_TARGET=server NUXTJS_SSR=false nuxt build && NUXTJS_DEPLOY_TARGET=server NUXTJS_SSR=false nuxt start
Also, the httpClient constructor, which normally is:
constructor (basePath, defaultTimeout, fetch, AbortController) {
  this.basePath = basePath
  this.defaultTimeout = parseInt(defaultTimeout, 10) || 1000
  this.isLocalhost = !this.basePath || this.basePath.includes('localhost')
  this.fetch = fetch
  this.AbortController = AbortController
}
has been modified like so:
constructor (basePath, defaultTimeout, fetch, AbortController) {
  this.basePath = '/api'
  this.defaultTimeout = parseInt(defaultTimeout, 10) || 1000
  this.isLocalhost = !this.basePath || this.basePath.includes('localhost')
  this.fetch = fetch
  this.AbortController = AbortController
}
The fetch options are
_getOpts (method, options) {
  const opts = Object.assign({}, options)
  opts.method = opts.method || method
  opts.cache = opts.cache || 'no-cache'
  opts.redirect = opts.redirect || 'follow'
  opts.referrerPolicy = opts.referrerPolicy || 'no-referrer'
  opts.credentials = opts.credentials || 'same-origin'
  opts.headers = opts.headers || {}
  opts.headers['Content-Type'] = opts.headers['Content-Type'] || 'application/json'
  if (typeof (opts.timeout) === 'undefined') {
    opts.timeout = this.defaultTimeout
  }
  return opts
}
So that's making a request to https://api.anothersite.com/api/?request..
On localhost, using npm run dev, it works just fine: it requests and fetches the desired data.
But somehow, when we deploy it to the staging environment, all those requests return:
{ "code": 401, "data": "{'statusCode':401,'error':'Unauthorized','message':'Invalid token.'}", "json": { "statusCode": 401, "error": "Unauthorized", "message": "Invalid token." }, "_isJSON": true }
Note that:
The frontend is deployed to example.com, which requires basic HTTP authentication, and we are properly authenticated.
The requests, both locally and on staging, go to api.example.com, which doesn't require HTTP auth; the data is served from a Strapi instance that doesn't need any token at all.
Is it possible that we are getting this response because the requests come from the proxy and are therefore not HTTP authenticated?
You should find somebody who knows the details of that deployment, because you will need them for this project.
Especially because here, you're hosting your app somewhere and that platform is probably missing an environment variable, hence the quite self-explanatory error
401,'error':'Unauthorized','message':'Invalid token
That also explains why it works locally (you probably have an .env file) and not once pushed.
You could try to create a repro on an SSR-ready VPS, but I'm pretty sure that @nuxtjs/proxy is working fine.
Otherwise, double checking the network requests in your browser devtools is still the way to go regarding the correct configuration of the module.
Anyway, further details are needed from your side here.
As a good practice, you should also have the following in your nuxt.config.js file
ssr: true,
target: 'server'
rather than using inline variables for those; it's safer and self-explanatory for everybody that way (on top of being less error-prone IMO). Or you can use an env variable for the key itself.
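For illustration, a minimal sketch of what that suggestion could look like (assuming Nuxt 2 with @nuxtjs/proxy registered in modules, and reusing the proxy block from the question):

// nuxt.config.js
export default {
  // set these explicitly instead of wiring them through NUXTJS_DEPLOY_TARGET / NUXTJS_SSR
  target: 'server',
  ssr: true,

  modules: ['@nuxtjs/proxy'],

  proxy: {
    '/api/**/**': {
      changeOrigin: true,
      target: process.env.VUE_APP_API_URL, // must also be defined on the staging host
      secure: true,
      ws: false,
      pathRewrite: { '^/api/': '' }
    }
  }
}

If VUE_APP_API_URL is not set in the staging environment, the proxy target is undefined there, which would fit the "works locally, fails once pushed" symptom described above.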

How do I create a Kubernetes Custom Resource using the JavaScript client?

My custom definition
apiVersion: something.com/v1alpha1
kind: MyKind
metadata:
  name: test
spec:
  size: 1
  image: myimage
Here is an answer that shows how to create a Deployment using the JavaScript client. However, I need to create a custom resource using the JavaScript client.
All the client libraries are auto-generated from the same underlying IDL, so it works like in Go: use createNamespacedCustomObject. You can also use the raw API directly.
const k8s = require('@kubernetes/client-node')

const kc = new k8s.KubeConfig();
kc.loadFromDefault();
const k8sClient = kc.makeApiClient(k8s.CustomObjectsApi);

var body = {
  "apiVersion": "something.com/v1alpha1",
  "kind": "MyKind",
  "metadata": {
    "name": "mycustomobject",
  },
  "spec": {
    "size": "1",
    "image": "myimage"
  }
}

k8sClient.createNamespacedCustomObject('something.com', 'v1alpha1', 'default', 'mykinds', body)
  .then((res) => {
    console.log(res)
  })
  .catch((err) => {
    console.log(err)
  })
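For reference, the "raw API" route mentioned above is the standard namespaced custom-resource path built from the same group/version/plural arguments, i.e. a POST to /apis/something.com/v1alpha1/namespaces/default/mykinds with the manifest as the JSON body; the plural segment (mykinds here) has to match spec.names.plural in the CRD.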

Delete By Query API working as curl but not in Node-RED

Background: What I am trying to achieve is to delete multiple documents from Elasticsearch using a single API call. Our app uses Node-RED to create the backend APIs.
I am using the below curl command to delete multiple doc IDs, and it works like a charm. It deletes the docs found with IDs xxxxx and yyyyy.
POST /tom-access/doc/_delete_by_query
{
  "query": {
    "terms": {
      "_id": [
        "xxxxx",
        "yyyyy"
      ]
    }
  }
}
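(For reference, the console-style request above corresponds roughly to the following curl invocation; the host and the basic-auth credentials are placeholders taken from the Node-RED flow below:)

curl -X POST "http://elastic.test.com/tom-access/doc/_delete_by_query" \
  -H "Content-Type: application/json" \
  -u user:password \
  -d '{ "query": { "terms": { "_id": ["xxxxx", "yyyyy"] } } }'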
However, when I try to do the same via Node-RED (using a JavaScript function), I get the below error:
{"error":{"root_cause":[{"type":"action_request_validation_exception","reason":"Validation
Failed: 1: query is
missing;"}],"type":"action_request_validation_exception","reason":"Validation
Failed: 1: query is missing;"},"status":400}
Here is what I have inside the Node-RED JavaScript function:
if (!msg.headers) msg.headers = {};
msg.req = {
  "query": {
    "terms": {
      "id": [
        "xxxxx",
        "yyyyy"
      ]
    }
  }
};
msg.headers = {
  "Content-Type": "application/json",
  "Authorization": "Basic xxxxxxxxxxxxxxxxxxxxx"
};
msg.method = "POST"
// New elastic
msg.url = "http://elastic.test.com/tom-access/doc/_delete_by_query";
return msg;
The next node makes an HTTP call using the above msg object, but it results in the error mentioned above. I am new to Node-RED, JavaScript and Elastic as well. HEEELP!!!
The endpoint is probably expecting the query to be in the body of the request.
You should be setting it under msg.payload, not msg.req.
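A minimal sketch of the corrected function body, moving the query into msg.payload as suggested (note it also uses _id, matching the working curl command, rather than id as in the original function):

if (!msg.headers) msg.headers = {};
msg.headers = {
  "Content-Type": "application/json",
  "Authorization": "Basic xxxxxxxxxxxxxxxxxxxxx"
};
msg.method = "POST";
msg.url = "http://elastic.test.com/tom-access/doc/_delete_by_query";
msg.payload = {
  "query": {
    "terms": {
      "_id": ["xxxxx", "yyyyy"]
    }
  }
};
return msg;

The http request node then serializes msg.payload as the request body, which is what _delete_by_query validates against.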

Error serving HTML files from an Azure function

I am trying to open, read, and return HTML files using Azure Functions. I am developing locally and the logs say that the function executed successfully; however, in the browser I am getting a 500 Internal Server Error. Am I doing something wrong here?
const fs = require('fs');
const path = require('path');
const mime = require('../node_modules/mime-types');

module.exports = function (context, req) {
  const staticFilesFolder = 'www/build/';
  const defaultPage = 'index.html';
  getFile(context, req.query.file);

  function getFile(context, file) {
    const homeLocation = process.env["HOME"];
    if (!file || file == null || file === undefined) {
      context.done(null, {
        status: 200,
        body: "<h1>Define a file</h1>",
        headers: {
          "Content-Type": " text/html; charset=utf-8"
        }
      });
    }
    fs.readFile(path.resolve(path.join(homeLocation, staticFilesFolder, file)),
      (err, htmlContent) => {
        if (err) {
          getFile(context, "404.html");
        }
        else {
          const res = {
            status: 200,
            body: htmlContent,
            headers: {
              "Content-Type": mime.lookup(path.join(homeLocation, staticFilesFolder, file))
            }
          }
          context.done(null, res);
        }
      })
  }
};
Note
I am sure that both 404.html and index.html exist. When I log the contents of htmlContent it shows the correct output.
function.json
{
  "disabled": false,
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "methods": ["get"],
      "route": "home",
      "name": "req"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
Response in Chrome:
If I remove the "Content-Length" header, the status code changes to 406.
Update 1: The code seems to run normally on the Azure Portal, but it is not working when running it locally.
It looks like you are combining two methods of returning data from an HTTP-triggered function (context.res and context.done()): https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-node#accessing-the-request-and-response
Since you are using context.res, try removing context.done();
You are making incorrect use of context.res: you shouldn't be overwriting it, but instead leveraging the methods of the Response class provided in the Azure NodeJS worker. If you are using VSCode you'll get IntelliSense for these methods. Otherwise, see: https://github.com/Azure/azure-functions-nodejs-worker/blob/dev/src/http/Response.ts
Your code should look something like this instead.
context.res.setHeader('content-type', 'text/html; charset=utf-8')
context.res.raw(htmlContent)
Using context.res.raw or context.res.send will already perform the context.done call for you.
Make sure you use content-type=text/html; charset=utf-8 instead of content-type=text/html, or you'll trigger an issue with the returned content type: instead of returning content-type=text/html you end up getting content-type=text/plain, which will fail to render your HTML.
Addressed on: https://github.com/Azure/azure-webjobs-sdk-script/issues/2053
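Putting that answer together with the original readFile callback, a rough sketch of the success path might look like this (setHeader and raw come from the worker's Response class linked above rather than the officially documented API, so treat this as illustrative):

fs.readFile(path.join(homeLocation, staticFilesFolder, file), (err, htmlContent) => {
  if (err) {
    getFile(context, "404.html");
    return;
  }
  // raw() sends the body and completes the invocation, so no context.done() is needed
  context.res.setHeader('content-type', 'text/html; charset=utf-8');
  context.res.raw(htmlContent);
});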

How do I call the GAS Execution API's run method from a container-bound script?

I have made a target project script as instructed by Google in the documentation and followed all the steps for deploying it as an API Executable. I have also enabled the Google Apps Script Execution API.
The documentation mentions the Execution API's run method; by using it we can call the project's functions. I don't know how to use this from GAS.
This is the target project's script:
/**
 * The function in this script will be called by the Apps Script Execution API.
 */

/**
 * Return the set of folder names contained in the user's root folder as an
 * object (with folder IDs as keys).
 * @return {Object} A set of folder names keyed by folder ID.
 */
function getFoldersUnderRoot() {
  var root = DriveApp.getRootFolder();
  var folders = root.getFolders();
  var folderSet = {};
  while (folders.hasNext()) {
    var folder = folders.next();
    folderSet[folder.getId()] = folder.getName();
  }
  return folderSet;
}
I tried the following method to call the Execution API's run method, but it requires an access token. How can I get the access token in this code?
var access_token = ScriptApp.getOAuthToken(); // Returning a string value
var url = "https://script.googleapis.com/v1/scripts/MvwvW29XZxP77hsnIkgD0H88m6KuyrhZ5:run";
var headers = {
  "Authorization": "Bearer " + access_token,
  "Content-Type": "application/json"
};
var payload = {
  'function': 'getFoldersUnderRoot',
  devMode: true
};
var res = UrlFetchApp.fetch(url, {
  method: "post",
  headers: headers,
  payload: JSON.stringify(payload),
  muteHttpExceptions: true
});
But in the response I'm getting this:
{
  "error": {
    "code": 401,
    "message": "ScriptError",
    "status": "UNAUTHENTICATED",
    "details": [
      {
        "@type": "type.googleapis.com/google.apps.script.v1.ExecutionError",
        "errorMessage": "Authorization is required to perform that action.",
        "errorType": "ScriptError"
      }
    ]
  }
}
How can I solve this? A code explanation would be appreciated. Thank you.
How about this sample script? From your question, I understand that you have already deployed the project as an API Executable and enabled the Execution API. In order to use the Execution API, an access token with the scopes you use is also required, as you can see under Requirements. Have you already retrieved it? If you don't have it yet, you can retrieve it by following the Auth Guide. With these prepared, you can use the Execution API.
The following sample runs getFoldersUnderRoot() in the project deployed as an API Executable. Please copy and paste this sample script into your spreadsheet's script editor. Before you use it, please put your access token and the script ID of the deployed project into the script.
Sample script:
var url = "https://script.googleapis.com/v1/scripts/### Script ID ###:run";
var headers = {
"Authorization": "Bearer ### acces token ###",
"Content-Type": "application/json"
};
var payload = {
'function': 'getFoldersUnderRoot',
devMode: true
};
var res = UrlFetchApp.fetch(url, {
method: "post",
headers: headers,
payload: JSON.stringify(payload),
muteHttpExceptions: true
});
Result:
Here, the result of getFoldersUnderRoot() is returned as follows:
{
  "done": true,
  "response": {
    "@type": "type.googleapis.com/google.apps.script.v1.ExecutionResponse",
    "result": {
      "### folder id1 ###": "### folder name1 ###",
      "### folder id2 ###": "### folder name2 ###"
    }
  }
}
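As a small follow-up (not part of the original answer), the HTTPResponse object returned by UrlFetchApp.fetch can be turned into that structure with getContentText():

var data = JSON.parse(res.getContentText());
if (data.done) {
  Logger.log(data.response.result); // the folder-ID-to-name map from getFoldersUnderRoot()
}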
