I'm currently using node-bunyan to manage log information through Elasticsearch and Logstash, and I'm facing a problem.
My log files contain the expected information and are written correctly when I need them.
The problem is that Elasticsearch doesn't find anything at
http://localhost:9200/logstash-*/
I get an empty object, so I can't deliver my logs to Kibana.
Here's my Logstash config file:
input {
file {
type => "nextgen-app"
path => [ "F:\NextGen-dev\RestApi\app\logs\*.log" ]
codec => "json"
}
}
output {
elasticsearch {
host => "localhost"
protocol => "http"
}
}
And my JS code:
log = bunyan.createLogger({
name: 'myapp',
streams: [
{
level: 'info',
path: './app/logs/nextgen-info-log.log'
},
{
level: 'error',
path: './app/logs/nextgen-error-log.log'
}
]
})
router.all('*', (req, res, next) => {
  log.info(req.url)
  log.info(req.method)
  next()
})
NB: the logs are written correctly to the log files. The problem is between Logstash and Elasticsearch :-/
EDIT: querying http://localhost:9200/logstash-*/ gives me "{}", an empty JSON object.
Thanks in advance.
Here is how we managed to fix this and other problems with Logstash not processing files correctly on Windows:
Install the ruby-filewatch patch as explained here:
logstash + elasticsearch : reloads the same data
Properly configure the Logstash input plugin:
input {
file {
path => ["C:/Path/To/Logs/Directory/*.log"]
codec => json { }
sincedb_path => ["C:/Path/To/Config/Dir/sincedb"]
start_position => "beginning"
}
}
...
The sincedb file keeps track of how far Logstash has read into each log file, so it should have one line per log file; if not, then something else is wrong.
Hope this helps.
Your output section looks incomplete. Here's the list of the elasticsearch output parameters: http://logstash.net/docs/1.4.2/outputs/elasticsearch
Please try:
input {
file {
type => "nextgen-app"
path => [ "F:\NextGen-dev\RestApi\app\logs\*.log" ]
codec => "json"
}
}
output {
elasticsearch {
host => "localhost"
port => 9200
protocol => "http"
index => "logstash-%{+YYYY.MM.dd}"
}
}
Alternatively, you can try the transport protocol:
output {
elasticsearch {
host => "localhost"
port => 9300
protocol => "transport"
index => "logstash-%{+YYYY.MM.dd}"
}
}
I also recommend using Kibana as a data viewer. You can download it at https://www.elastic.co/downloads/kibana
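If the index still comes back empty, a quick sanity check is to ask Elasticsearch which indices it actually has, so you can see whether a logstash-YYYY.MM.dd index is being created at all. A minimal Node sketch (assuming Elasticsearch on localhost:9200, as above):
const http = require('http');

// list the indices Elasticsearch knows about, via the _cat/indices API
http.get('http://localhost:9200/_cat/indices?v', (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => console.log(body));
}).on('error', (err) => console.error(err));
If no logstash-* index appears in that list, the problem is on the Logstash side; if one does, the problem is in how you query it.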
Related
I would like to obtain all (or a subset) of my records from an Algolia index and access them via GraphQL.
I know there is a Gatsby plugin that allows you to do the opposite i.e., add data from a GraphQL query to Algolia, but not the other way around.
I have been able to get the tutorial for adding GraphQL data to work, but I have not had any success when trying to go beyond hardcoded arrays (this is in the gatsby-node.js file):
const algoliasearch = require("algoliasearch/lite");
const searchClient = algoliasearch(
process.env.GATSBY_ALGOLIA_APP_ID,
process.env.GATSBY_ALGOLIA_SEARCH_KEY
)
const searchIndex = searchClient.initIndex(process.env.GATSBY_ALGOLIA_INDEX_NAME)
exports.sourceNodes = ({ actions, createNodeId, createContentDigest }) => {
searchIndex.search("", {
attributesToRetrieve: ["name", "url"]
}).then(({ hits }) => {
hits.forEach(hit => {
const node = {
name: hit.name,
url: hit.url,
id: createNodeId(`hit-${hit.name}`),
internal: {
type: "hit",
contentDigest: createContentDigest(hit),
},
}
actions.createNode(hit)
})
});
}
While the console successfully logs the array of nodes, and the verbose Gatsby deploy output includes the "hit" node as a node type, they do not appear in the GraphQL explorer.
Any help is greatly appreciated, thank you!
I'm new to both Strapi and Mongoose, so I apologise if this is a stupid question.
Following the docs (https://strapi.io/documentation/developer-docs/latest/development/backend-customization.html), I'm trying to create a custom query in Strapi that returns the whole people collection sorted by name in descending order. But when I hit the endpoint I get a 500 error, and the terminal shows the error CastError: Cast to ObjectId failed for value "alldesc" at path "_id" for model "people".
Here's my code:
services/people.js
module.exports = {
findByNameDesc() {
const result = strapi
.query("people")
.model.find()
.sort({ name: "descending" });
return result.map((entry) => entry.toObject());
},
};
controllers/people.js
module.exports = {
async alldesc(ctx) {
const entities = await strapi.services.people.findByNameDesc(ctx);
return entities.map((entity) =>
sanitizeEntity(entity, { model: strapi.models.people })
);
},
};
config/routes.json
{
"routes": [
...
{
"method": "GET",
"path": "/people/alldesc",
"handler": "people.alldesc",
"config": {
"policies": []
}
}
]
}
What am I doing wrong?
UPDATE: even when removing .sort({ name: "descending" }); from the query, the error is still there, so I'm thinking that maybe there's something wrong in the way I use the service in the controller?
The problem was in routes.json. Basically, it seems Strapi doesn't like the slash /, so instead of /people/alldesc I tried /people-alldesc and it worked.
Also, in the service there's no need for return result.map((entry) => entry.toObject()); (that causes another error). Simply doing return result works.
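For reference, here's a sketch of the adjusted service after both changes (with the route registered at /people-alldesc in config/routes.json), using the same names as in the question:
// services/people.js, after the fix described above
module.exports = {
  findByNameDesc() {
    // return the query result directly; no .map((entry) => entry.toObject()) needed
    return strapi
      .query("people")
      .model.find()
      .sort({ name: "descending" });
  },
};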
I'm trying to deploy my first next.js project on vercel. Locally everything is working.
My problem: when deploying the web app (command: next build), I get the message "Generating static pages (0/2000)" and then nothing happens. The deployment is cancelled after 1 hour (time expired).
The problem is somewhere in the (simplified) code below. Here's why: when I deploy the project without the part that follows after const content, the deployment is successful - so basically, instead of having both singleProductResponse and contentResponse as props, I only have singleProductResponse.
I'm a little stuck and don't know how to solve this. Can someone tell me what I'm doing wrong? Thanks a lot!!
const Item = ({ singleProductResponse, contentResponse }) => {
const router = useRouter();
if (router.isFallback) {
return <div>Loading...</div>;
}
return (
<div className={styles.section}>
<Overview
singleProductResponse={singleProductResponse}
contentResponse={contentResponse}
/>
</div>
);
};
export async function getStaticPaths() {
const itemResponse = await knex("Items");
const paths = itemResponse.map((product) => ({
params: {
brand: product.brand,
item: product.item,
},
}));
return {
paths,
fallback: true,
};
}
export async function getStaticProps({ params }) {
try {
const itemData = await knex("Items").where(
"item",
"like",
`${params.item}`
);
const singleProductResponse = itemData[0];
// !!! when leaving the following part out, the deployment is successful !!!
const content = itemData[0].contentList;
const splitContent = content.split(", ");
const contentArray =
typeof splitContent === "string"
? [splitContent]
: splitContent;
const result = await knex("Cos")
.leftJoin("Actives", "Cos.content", "ilike", "Actives.content")
.leftJoin("Alcohol", "Cos.content", "ilike", "Alcohol.content")
.column([
"Cos.content",
"function",
"translationPlant",
"categoryAlcohol",
])
.where((qb) => {
contentArray.forEach((word) => {
qb.orWhere("Cos.content", word);
});
return qb;
});
return {
props: { singleProductResponse, contentResponse: result },
revalidate: 1,
};
} catch (err) {
console.log(err);
return {
redirect: {
destination: "/",
permanent: false,
},
};
}
}
UPDATE
After digging deeper, I think the problem is that the part after const content is too slow during the build process.
When running next build locally, as well as when deploying on Vercel, the build stops after approx. 1 hour. I suppose this is because of the 45-minute build limit.
The last message I get is "Generating static pages (1000/2000)" (on Vercel and locally) and "Build failed" (on Vercel). I don't get any other error messages (not even in the catch block).
I've already tried to optimize the part after const content: each table has an index (clustered indexes -> primary keys), I've redesigned the tables (only 4 tables to join instead of 6), eliminated everything unnecessary in the query, and checked that the database (Postgres hosted on Heroku, hobby-basic) is also in the US. The performance is better, but still not enough. Does anyone have some suggestions for improvement? TTFB might be somewhat slow.
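One simplification I'm considering (just a sketch, using the same tables and join conditions as above) is to collapse the per-word orWhere loop into a single whereIn, so the query sends one IN clause instead of a chain of ORs:
// same columns and joins as above, but a single IN clause instead of the orWhere loop
const result = await knex("Cos")
  .leftJoin("Actives", "Cos.content", "ilike", "Actives.content")
  .leftJoin("Alcohol", "Cos.content", "ilike", "Alcohol.content")
  .column(["Cos.content", "function", "translationPlant", "categoryAlcohol"])
  .whereIn("Cos.content", contentArray);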
I just want my app to download a file from the server using react-native-fetch-blob. The problem is: where is the file stored? I console.logged the callback from react-native-fetch-blob and got this object:
React-native-fetch-blob object callback
This is my code:
alert("downloading");
RNFetchBlob
.config({
useDownloadManager : true,
fileCache : true
})
.fetch('GET', 'http://fontawesome.io/assets/font-awesome-4.7.0.zip', {})
.then((res) => {
console.log(res);
alert("Download");
alert('The file saved to ', res.path());
})
Any solution?
To download a file directly with rn-fetch-blob, you need to set fileCache to true.
By the way, react-native-fetch-blob is not maintained anymore; use rn-fetch-blob instead.
See the documentation on downloading a file directly:
RNFetchBlob
.config({
// add this option that makes response data to be stored as a file,
// this is much more performant.
fileCache : true,
})
.fetch('GET', 'http://www.example.com/file/example.zip', {
//some headers ..
})
.then((res) => {
// the temp file path
console.log('The file saved to ', res.path())
})
function downloadFile(url,fileName) {
const { config, fs } = RNFetchBlob;
const downloads = fs.dirs.DownloadDir;
return config({
// add this option that makes response data to be stored as a file,
// this is much more performant.
fileCache : true,
addAndroidDownloads : {
useDownloadManager : true,
notification : false,
path: downloads + '/' + fileName + '.pdf',
}
})
.fetch('GET', url);
}
Use this answer; it will work for sure. I am using it and it works perfectly.
You just need to get the path like this:
var filePath = res.path();
This is where your file is stored.
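For example, you could call the downloadFile helper above like this (the URL and file name are just placeholders):
// placeholder URL and file name, just to show how the helper resolves
downloadFile('http://www.example.com/file/example.pdf', 'example')
  .then((res) => {
    // res.path() is the location the file was written to
    var filePath = res.path();
    console.log('The file saved to ', filePath);
  })
  .catch((err) => console.log(err));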
I am building an Electron app which handles file uploads. I am using dialog to get the files from the user, and I need to send them to the server. I get the file paths correctly, but I get errors when sending the files. I am using vue-resource for requests. Below is my code:
<template>
<div>
<button @click="uploadAct()" class="primary">New Upload</button>
</div>
</template>
<script>
const {dialog} = require('electron').remote
const fs = require('fs')
import reqApi from '../../api/something'
export default {
methods: {
uploadAct () {
dialog.showOpenDialog({
title: 'Upload Attachments',
buttonLabel: 'Upload',
filters: [
{name: 'Images', extensions: ['jpg', 'png', 'gif']},
{name: 'All Files', extensions: ['*']}
],
properties: ['openFile', 'multiSelections']
}, function (filenames) {
if (filenames) {
let d = ''
filenames.forEach(function (element) {
d = element
})
// here I get the file path correctly, something like /path/to/file.jpg
reqApi.uploadattachmnets({photo: fs.createReadStream(d)}).then(
(response) => {
console.log(response)
},
(error) => {
console.log(error)
})
// })
}
})
}
}
}
</script>
However, I end up with an error on the request. Any help will be appreciated.
Probably a typo but you have a call to an API:
carApi.uploadattachmnets({photo: fs.createReadStream(d)})
which is different to the one you are importing:
import reqApi from '../../api/something'
If it's not the above, and Postman is already able to send files and receive the correct response from the endpoint, I'd assume this is a CORS issue. Without more info, I'd recommend looking at: https://www.html5rocks.com/en/tutorials/cors/#toc-making-a-cors-request
For a more specific response you'd need to post the API code so we can review how you are sending the file.
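If it's none of the above, it may be worth looking at how the file ends up in the request body. As a rough sketch (the upload URL and the photo field name here are placeholders, not from your post), you could read the picked file into a Buffer and send it as multipart/form-data, since an XHR-based client like vue-resource can't serialize a Node ReadStream as file content:
const fs = require('fs')

// "d" is the path collected from the dialog in the code above;
// the URL and the "photo" field name are placeholders
const form = new FormData()
form.append('photo', new Blob([fs.readFileSync(d)]), 'upload.jpg')

// plain fetch here just for illustration; any XHR/fetch-based client works the same way
fetch('https://example.com/upload', { method: 'POST', body: form })
  .then((response) => console.log(response))
  .catch((error) => console.log(error))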