I am using the jira-client npm package to work with the Jira REST API, and I am trying to add an attachment to a ticket using the addAttachmentOnIssue method. This method requires the issue key as a string for its first parameter and a ReadStream as the second. I am able to attach a file to a ticket if I follow these steps:
var JiraApi = require('jira-client');
var jira = new JiraApi({
  protocol: "https",
  host: "myJiraInstance",
  username: "myUserName",
  password: "MyToken",
  apiVersion: "2",
  strictSSL: true
});
then:
const fs = require('fs');
const fileStream = fs.createReadStream(filePath);
jira.addAttachmentOnIssue(ticketID, fileStream);
As you can see, I have the file path and I attached the file, but that is not what I want. I want to create a file from a JSON object, without writing the file to the system, and then send it.
Is that possible?
By using:
var stream = require('stream');
var readable = new stream.Readable(); // new empty stream.Readable
readable.push('some data');
readable.push(null);
jira.addAttachmentOnIssue(ticketID, readable)
I am getting:
Processing of multipart/form-data request failed. Stream ended unexpectedly
The problem is that the required file-related information such as filename, knownLength, etc. is missing, which is why parsing the stream fails.
You need to provide the file-related information manually.
Since jira-client uses postman-request, you can do that by providing a custom file object as described here:
multipart/form-data (Multipart Form Uploads)
Try this:
var JiraApi = require('jira-client');
var jira = new JiraApi({
protocol: "https",
host: "myJiraInstance",
username: "myUserName",
password: "MyToken",
apiVersion: "2",
strictSSL: true
})
const inputData = JSON.stringify({
someProp: 'some data'
});
var stream = require('stream');
var readable = new stream.Readable();
readable._read = () => {}; // no-op _read() so the stream can be consumed
readable.push(inputData);
readable.push(null);
// https://www.npmjs.com/package/postman-request#forms
// Pass optional meta-data with an 'options' object with style: {value: DATA, options: OPTIONS}
// Use case: for some types of streams, you'll need to provide "file"-related information manually.
// See the `form-data` README for more information about options: https://github.com/form-data/form-data
const myStreamFile = {
value: readable,
options: {
filename: 'json.json',
contentType: 'application/json',
knownLength: Buffer.byteLength(inputData) // byte length, not character count
}
}
jira.addAttachmentOnIssue(ticketID, myStreamFile)
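As a side note (an assumption on my part, not part of the original answer): on Node.js 12+ you can let stream.Readable.from() build the stream instead of pushing manually. A minimal sketch, assuming the same jira client and inputData as above:

const { Readable } = require('stream');
// Readable.from() implements _read() and the terminating push(null) for you.
const altFile = {
  value: Readable.from([inputData]),
  options: {
    filename: 'json.json',
    contentType: 'application/json',
    knownLength: Buffer.byteLength(inputData)
  }
};
jira.addAttachmentOnIssue(ticketID, altFile);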
When trying to make a request to an API endpoint using Cypress, different results are returned depending on whether the URL is passed directly or comes from a method/enum. I'm not sure how to go about this, as I don't want to hardcode the URLs in the request.
const SPACEID = 'some_space_id'
const ENVIRONMENTID = 'development'
const ENTRIES_EP = 'entries'
const CONTENT_TYPE_EP = 'contentType'
enum endPoints {
entries = `https://app.contentful.com/spaces/${SPACEID}/environments/${ENVIRONMENTID}/${ENTRIES_EP}`,
content_type = `https://app.contentful.com/spaces/${SPACEID}/environments/${ENVIRONMENTID}/${CONTENT_TYPE_EP}`,
}
cy.api({
method: 'GET',
url: String(endPoints.content_type),
auth: {
bearer: `SOME_TOKEN`,
},
}).then((response) => {
expect(response.status).to.eq(200)
cy.wrap(response).as('resp')
})
The code above returns some HTML content stating that only Chrome and Firefox are supported.
However, if I put the absolute URL directly in the cy.api call, it works fine. I am not sure what I am doing wrong, as I have compared both URLs and they are exactly the same:
https://app.contentful.com/spaces/some_space_id/environments/development/content_types
https://api.contentful.com/spaces/some_space_id/environments/development/content_types
DISCLAIMER: Before creating this question, I checked here, here, and here, and also checked the Laravel docs.
Context
Laravel 9 full-stack
No JS framework on front-end, which means I'm using vanilla JS
The folders on Storage are set up like this:
storage
app
public
folder1
folder1A
folder1B
folder1C
etc
The files stored in each folder1X are in .pdf format, and I don't know their names.
No folder is empty, and none contains invalid/corrupted files.
The problem
I have a FileController.php to download files that are inside a folder1X/ directory. The method to download it is as follows:
public function downloadFileFromStorage(Request $request): mixed
{
$dirpath = $request->dirpath; // dirpath = public/folder1/folder1X.
$files = Storage::allFiles($dirpath);
return response()->download(storage_path('app\\' . $files[0]));
}
(Note: dirpath is sent in an axios request by the client and is also fetched from the database in a previous request.)
My JavaScript client needs to enable the download of this file. The download is triggered by clicking a button, which calls downloadPDF(dirpath); that function works as follows:
function downloadPDF(dirpath) {
axios.post('/download-pdf-file', { dirpath })
.then(
success => {
const url = success.data
const a = document.createElement('a')
a.download = 'file.pdf'
a.href = url
a.click()
},
error => {
console.log(error.response)
}
)
}
But, when I run this function, I get an about:blank#blocked error.
Attempts
Changed the <a> element approach to window.open(url) on the client;
Changed response() to Storage::download($files[0], 'file-name.pdf'), and with this I also tried using a Blob on the client as follows:
success => {
const blob = new Blob([success.data], { type: 'application/pdf' })
const fileURL = URL.createObjectURL(blob)
window.open(fileURL)
},
Also mixed the Blob approach with the <a> element approach;
Changed the storage_path argument to /app/public/ before concatenating $files[0].
UPDATE
Following tips from @BenGooding and @cengsemihsahin, I changed the files to the following:
JS
// FileDownload is imported on a require() at the code beginning
function downloadPDF(dirpath) {
axios({
url: '/download-pdf-file',
method: 'GET',
responseType: 'blob',
options: {
body: { dirpath }
}
}).then(
success => {
FileDownload(success.data, 'nota-fiscal.pdf')
}
)
}
PHP:
public function downloadFileFromStorage(Request $request): mixed
{
$dirpath = $request->dirpath; // dirpath = public/folder1/folder1X.
$files = Storage::allFiles($dirpath);
return Storage::download($files[0], 'filename.pdf');
}
and now it downloads a corrupted PDF that can't be opened.
Finally found the issue, and it was here:
axios({
url: '/download-pdf-file',
method: 'GET',
responseType: 'blob',
options: { // here
body: { dirpath } // here
}
})
Laravel's Request arrow operator (->) can't fetch a GET body sent through options (at least not in the $request->key fashion; see more about it here), thus making me download a corrupted file; it wasn't fetching any file on Laravel's side, since it didn't receive any path at all.
Here is the solution I came with:
As I want to get a file from a route that doesn't change except for the 1X in folder1X, I process the obtained path and send the 1X as a GET query param:
let folderNumber = dirpath.split('/')
folderNumber = folderNumber[folderNumber.length - 1].replace('folder', '')
axios({
  url: `/download-pdf-file?folder=${folderNumber}`,
  method: 'GET',
  responseType: 'blob'
})
This way I don't pass the whole path to the back-end, and it's possible to get the folder number by using $request->query():
public function downloadFileFromStorage(Request $request): mixed
{
$folderNumber = $request->query('folder');
$folderPath = '/public/folder1/folder' . $folderNumber . '/';
$files = Storage::allFiles($folderPath);
return Storage::download($files[0], 'file-name.pdf');
}
In a nutshell:
To download files, use GET requests;
To send arguments within GET requests, use query parameters and fetch them with $request->query('keyname') (or find out another way. Good luck!);
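For readers who prefer not to rely on the FileDownload helper, here is a minimal sketch of the same flow using a Blob and a temporary object URL (assuming the same /download-pdf-file route and ?folder= query param as above):

function downloadPDF(folderNumber) {
  axios({
    url: `/download-pdf-file?folder=${folderNumber}`,
    method: 'GET',
    responseType: 'blob'
  }).then(success => {
    // Wrap the binary response in a Blob and trigger the download
    // through a short-lived object URL.
    const blob = new Blob([success.data], { type: 'application/pdf' })
    const fileURL = URL.createObjectURL(blob)
    const a = document.createElement('a')
    a.href = fileURL
    a.download = 'file-name.pdf'
    a.click()
    URL.revokeObjectURL(fileURL)
  })
}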
So, first time posting; I usually find the answer I need by looking through similar questions, but this time I'm stumped.
First off, I'm self-taught and about on par with an entry-level developer at absolute best (for reference, my highest score on CodeSignal for JavaScript is 725).
Here is my problem:
I'm working on an SSG eCommerce website using the Nuxt.js framework. The products are digital, so they need to be fulfilled by providing a time-limited download link when a customer makes a purchase. I have the product files stored in a private Amazon S3 bucket. I also have a Netlify serverless function that, when called with a GET request, generates and returns a pre-signed URL for the file. (At the moment there is only one product, but ideally it should generate pre-signed URLs based on a filename sent as a JSON event body key string, since more products are planned in the near future; I can figure that out once the whole thing is working.)
The website is set up to generate dynamic routes based on the user's order number so they can view their previous orders (/pages/account/orders/_id.vue). I have placed a download button, nested in an element on this page, so that each order has a button to download the files. The idea is that the button press calls a function I defined in the methods object. The function makes an XMLHttpRequest to the endpoint URL of the Netlify function, which returns the pre-signed URL to the href property so that the file can be downloaded by the user.
But no matter what I try, it fails to download the file. When the page loads, it successfully calls the Netlify function and I get a response code 200, but the href property remains blank. Am I going about this the wrong way? There is clearly something I'm not understanding correctly; any input is greatly appreciated.
Here is my code....
The download button:
<a
:download="<<MY_PRODUCT_NAME>>"
:href="getmyurl()"
>
<BaseButton
v-if="order.status === 'complete'"
fit="auto"
appearance="light"
label="Download"
/>
</a>
The function that the button calls:
methods: {
getmyurl() {
let myurl = "";
const funcurl = <<MY_NETLIFY_FUNCTION_URL>>;
let xhr = new XMLHttpRequest();
xhr.open('GET', funcurl);
xhr.send();
xhr.onload = function() {
if (xhr.status != 200) {
alert(`Error ${xhr.status}: ${xhr.statusText}`);
} else {
myurl = xhr.response.Geturl
};
};
return myurl
},
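For what it's worth, XMLHttpRequest is asynchronous, so getmyurl() returns myurl before onload has run, which would leave the bound href blank. A minimal sketch of a click-driven alternative (my assumption, reusing the function URL placeholder and the getUrl key returned by the Netlify function below):

methods: {
  async downloadFile() {
    const funcurl = <<MY_NETLIFY_FUNCTION_URL>>;
    const res = await fetch(funcurl);
    if (!res.ok) {
      alert(`Error ${res.status}: ${res.statusText}`);
      return;
    }
    // The serverless function below responds with { getUrl: ... }
    const { getUrl } = await res.json();
    // Trigger the download only once the pre-signed URL has arrived,
    // instead of binding :href to a value that is not there yet.
    const a = document.createElement('a');
    a.href = getUrl;
    a.download = 'my-product-name'; // hypothetical file name
    a.click();
  },
},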
Netlify function:
require( "dotenv" ).config();
const AWS = require('aws-sdk');
let s3 = new AWS.S3({
accessKeyId: process.env.MY_AWS_ACCESS_KEY,
secretAccessKey: process.env.MY_AWS_SECRET_KEY,
region: process.env.MY_AWS_REGION,
signatureVersion: 'v4',
});
exports.handler = function( event, context, callback ) {
var headers = {
"Access-Control-Allow-Origin" : "*",
"Access-Control-Allow-Headers": "Content-Type"
};
if ( event.httpMethod === "OPTIONS" ) {
callback(
null,
{
statusCode: 200,
headers: headers,
body: JSON.stringify( "OK" )
}
);
return;
}
try {
var resourceKey = process.env.MY_FILE_NAME
var getParams = {
Bucket: process.env.MY_S3_BUCKET,
Key: resourceKey,
Expires: ( 60 * 60 ),
ResponseCacheControl: "max-age=604800"
};
var getUrl = s3.getSignedUrl( "getObject", getParams );
var response = {
statusCode: 200,
headers: headers,
body: JSON.stringify({
getUrl: getUrl
})
};
} catch ( error ) {
console.error( error );
var response = {
statusCode: 400,
headers: headers,
body: JSON.stringify({
message: "Request could not be processed."
})
};
}
callback( null, response );
}
My Node.js application writes logs with Winston. These logs are then picked up by Promtail, saved to S3 by Loki, and processed in a dashboard in Grafana.
I want to create logs in Winston that rotate every 30 minutes. The logs should first be stored in my folder "/home/gad-web/gad-logs" while they are still being appended to, and once they are rotated they should move to "/home/gad-web/gad-logs-rotated". Promtail will be watching that specific folder.
I want to use dynamic filenames for the different logs being written out, so that I can easily assign static labels to each file separately using Promtail, rather than having to process each log line and assign a dynamic label to every line in one large file.
My file logger.mjs looks like this (formats, levels, and other irrelevant data are left out):
import winston from 'winston'
import 'winston-daily-rotate-file' // registers winston.transports.DailyRotateFile
import fs from 'fs'
import path from 'path'
const logDir = '/home/gad-web/gad-logs'
const logDirRotated = '/home/gad-web/gad-logs-rotated'
let winstonGdprProofFormat = winston.format.combine(...)
let winstonDailyRotateFileTransport = new winston.transports.DailyRotateFile({
frequency: '30m',
format: winstonGdprProofFormat,
filename: `${logDir}/all-gdpr-proof-%DATE%.log`,
datePattern: 'YYYY-MM-DD HH-mm',
})
// Move the file to another location after it is rotated, so it can be picked up by Promtail
winstonDailyRotateFileTransport.on('rotate', function (oldFilenamePath, newFilenamePath) {
let pathToMoveTo = `${logDirRotated}/${path.basename(oldFilenamePath)}`
fs.rename(oldFilenamePath, pathToMoveTo, function (err) {
if (err) throw err
})
})
let winstonTransports = []
if (process.env.environment !== 'local') {
winstonTransports.push(winstonConsoleTransport)
winstonTransports.push(winstonDailyRotateFileTransport)
} else {
winstonTransports.push(winstonConsoleWithColorsTransport)
}
const logger = winston.createLogger({
level: process.env.environment !== 'local' ? 'info' : 'debug',
levels: winstonLevels,
transports: winstonTransports,
})
export function log (obj) {
let { level, requestId, method, uri, msg, time, data } = obj
if (!level) {
level = 'info'
}
logger.log({
level: level,
requestId: requestId,
method: method,
uri: uri,
msg: msg,
time: time,
data: data,
})
}
It is being called in files that write logs like this:
import { log } from '../config/logger.mjs'
...
function writeRequestLog (start, request, requestId) {
let end = new Date().getTime()
let diff = end - start
log({ level: 'info', requestId: requestId, method: request.method, uri: request.path, msg: null, time: `${diff}ms`, data: JSON.stringify(request.query) })
}
Since the file is imported directly, it is executed immediately, and winstonDailyRotateFileTransport is created with ${logDir}/all-gdpr-proof-%DATE%.log as the filename. How do I go about instantiating this with a dynamic filename, so that I get log files rotated every 30 minutes for a bunch of dynamically created files?
I tried creating a class in JS, but I quickly ran into trouble because of the .on('rotate', ...) handler defined on winstonDailyRotateFileTransport, and I'm also not sure what other implications creating a class for this might have (since this logger will be used in a lot of places in my code).
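For what it's worth, a minimal factory-function sketch (my assumption, not the original setup) that gives every dynamic base name its own transport and rotate handler, reusing the logDir, logDirRotated, and winstonGdprProofFormat constants from logger.mjs above:

// Hypothetical factory: one rotating transport (and logger) per base name.
function createRotatingLogger (baseName) {
  const transport = new winston.transports.DailyRotateFile({
    frequency: '30m',
    format: winstonGdprProofFormat,
    filename: `${logDir}/${baseName}-%DATE%.log`,
    datePattern: 'YYYY-MM-DD HH-mm',
  })
  // Same move-on-rotate behaviour as the single-transport version above.
  transport.on('rotate', function (oldFilenamePath) {
    const pathToMoveTo = `${logDirRotated}/${path.basename(oldFilenamePath)}`
    fs.rename(oldFilenamePath, pathToMoveTo, function (err) {
      if (err) throw err
    })
  })
  return winston.createLogger({ level: 'info', transports: [transport] })
}

// Cache instances so repeated calls share one logger per name.
const dynamicLoggers = new Map()
export function getLogger (baseName) {
  if (!dynamicLoggers.has(baseName)) {
    dynamicLoggers.set(baseName, createRotatingLogger(baseName))
  }
  return dynamicLoggers.get(baseName)
}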
Is it possible to alter the headers of the Request object that is received by the fetch event?
Two attempts:
Modify existing headers:
self.addEventListener('fetch', function (event) {
event.request.headers.set("foo", "bar");
event.respondWith(fetch(event.request));
});
Fails with Failed to execute 'set' on 'Headers': Headers are immutable.
Create new Request object:
self.addEventListener('fetch', function (event) {
var req = new Request(event.request, {
headers: { "foo": "bar" }
});
event.respondWith(fetch(req));
});
Fails with Failed to construct 'Request': Cannot construct a Request with a Request whose mode is 'navigate' and a non-empty RequestInit.
(See also How to alter the headers of a Response?)
Creating a new request object works as long as you set all the options:
// request is event.request sent by browser here
var req = new Request(request.url, {
method: request.method,
headers: request.headers,
mode: 'same-origin', // need to set this properly
credentials: request.credentials,
redirect: 'manual' // let browser handle redirects
});
You cannot use the original mode if it is navigate (that's why you were getting an exception), and you probably want to pass redirection back to the browser to let it change its URL instead of letting fetch handle it.
Make sure you don't set body on GET requests; fetch does not like it, but browsers sometimes generate GET requests with a body when responding to redirects from POST requests.
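Putting that together inside the fetch handler, a sketch based on the snippet above (the foo header is just an example):

self.addEventListener('fetch', function (event) {
  var request = event.request;
  // The original headers are immutable, so copy them before changing anything.
  var headers = new Headers(request.headers);
  headers.set('foo', 'bar');
  event.respondWith(fetch(new Request(request.url, {
    method: request.method,
    headers: headers,
    mode: 'same-origin', // 'navigate' is not allowed here
    credentials: request.credentials,
    redirect: 'manual'   // let the browser handle redirects
  })));
});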
You can create a new request based on the original one and override the headers. Note that Headers is not a plain object, so copy it with the Headers constructor instead of spreading it:
const headers = new Headers(originalRequest.headers);
headers.set('foo', 'bar');

const modifiedRequest = new Request(originalRequest, { headers });
See also: https://developer.mozilla.org/en-US/docs/Web/API/Request/Request
Have you tried a solution similar to the one in the question you mention (How to alter the headers of a Response?)?
In the Service Worker Cookbook, we manually copy Request objects to store them in IndexedDB (https://serviceworke.rs/request-deferrer_service-worker_doc.html). It's for a different reason (we wanted to store them in a Cache, but we can't store POST requests because of https://github.com/slightlyoff/ServiceWorker/issues/693), but it should be applicable to what you want to do as well.
// Serializing is a little bit convoluted because headers is not a simple object.
function serialize(request) {
var headers = {};
// `for (... of ...)` is ES6 notation, but the browsers that currently support
// service workers support this notation as well, and it is the only way of retrieving all the headers.
for (var entry of request.headers.entries()) {
headers[entry[0]] = entry[1];
}
var serialized = {
url: request.url,
headers: headers,
method: request.method,
mode: request.mode,
credentials: request.credentials,
cache: request.cache,
redirect: request.redirect,
referrer: request.referrer
};
// The request is only allowed to have a body if the method is not `GET` or `HEAD`.
if (request.method !== 'GET' && request.method !== 'HEAD') {
return request.clone().text().then(function(body) {
serialized.body = body;
return Promise.resolve(serialized);
});
}
return Promise.resolve(serialized);
}
// Compared to serialize, deserialize is pretty simple.
function deserialize(data) {
return Promise.resolve(new Request(data.url, data));
}
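As a usage sketch (mine, not from the cookbook): round-tripping a request through serialize()/deserialize() to tweak a header before re-fetching it could look like this, keeping the mode caveat from the earlier answer in mind:

self.addEventListener('fetch', function (event) {
  event.respondWith(
    serialize(event.request).then(function (data) {
      data.headers['foo'] = 'bar'; // example header override
      if (data.mode === 'navigate') {
        data.mode = 'same-origin'; // 'navigate' is not allowed in RequestInit
      }
      return deserialize(data);
    }).then(function (request) {
      return fetch(request);
    })
  );
});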
If future readers need to delete keys in the immutable Request/Response headers while keeping high fidelity to the original headers, you can effectively clone the Headers object:
const mutableHeaders = new Headers();
immutableHeaders.forEach((value, key) => mutableHeaders.set(key, value));
mutableHeaders.delete('content-encoding');
mutableHeaders.delete('vary');
mutableHeaders.set('host', 'example.com');
// etc.
You can then create a new Request and pass in your mutableHeaders.
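A minimal sketch of that last step (assuming originalRequest holds the immutable request):

const proxiedRequest = new Request(originalRequest, {
  headers: mutableHeaders // the cloned-and-edited copy from above
});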
This is preferable to the accepted answer because, if you need to proxy a Request, you don't want to manually specify every possible header, including the Cloudflare, AWS, Azure, Google, etc. custom CDN headers.
Background Info
The reason the headers are immutable or read-only in a Request is that the interface declares them readonly:
interface Request extends Body {
readonly cache: RequestCache;
readonly credentials: RequestCredentials;
readonly destination: RequestDestination;
readonly headers: Headers;
readonly integrity: string;
...
The interface for Headers is:
interface Headers {
append(name: string, value: string): void;
delete(name: string): void;
get(name: string): string | null;
has(name: string): boolean;
set(name: string, value: string): void;
forEach(callbackfn: (value: string, key: string, parent: Headers) => void, thisArg?: any): void;
}