My Node.js application writes logs with Winston. These logs will then be picked up by Promtail, shipped to Loki (which stores them in S3), and finally visualized in a Grafana dashboard.
I want Winston to write logs that rotate every 30 minutes. The logs should first be stored in my folder "/home/gad-web/gad-logs" while they are still being appended to. When they are rotated, I want to move them to "/home/gad-web/gad-logs-rotated", which is the folder Promtail watches.
I want to use dynamic filenames for the different logs being written out, so that I can easily assign static labels to each file separately in Promtail, rather than having to process each log line and assign a dynamic label to every line in one large file.
My logger.mjs file looks like this (formats, levels and other irrelevant details are left out):
import fs from 'fs'
import path from 'path'
import winston from 'winston'
import 'winston-daily-rotate-file' // registers winston.transports.DailyRotateFile

const logDir = '/home/gad-web/gad-logs'
const logDirRotated = '/home/gad-web/gad-logs-rotated'

let winstonGdprProofFormat = winston.format.combine(...)

let winstonDailyRotateFileTransport = new winston.transports.DailyRotateFile({
  frequency: '30m',
  format: winstonGdprProofFormat,
  filename: `${logDir}/all-gdpr-proof-%DATE%.log`,
  datePattern: 'YYYY-MM-DD HH-mm',
})

// Move the file to another location after it is rotated, so it can be picked up by Promtail
winstonDailyRotateFileTransport.on('rotate', function (oldFilenamePath, newFilenamePath) {
  let pathToMoveTo = `${logDirRotated}/${path.basename(oldFilenamePath)}`
  fs.rename(oldFilenamePath, pathToMoveTo, function (err) {
    if (err) throw err
  })
})
let winstonTransports = []
if (process.env.environment !== 'local') {
  winstonTransports.push(winstonConsoleTransport)
  winstonTransports.push(winstonDailyRotateFileTransport)
} else {
  winstonTransports.push(winstonConsoleWithColorsTransport)
}

const logger = winston.createLogger({
  level: process.env.environment !== 'local' ? 'info' : 'debug',
  levels: winstonLevels,
  transports: winstonTransports,
})
export function log (obj) {
  let { level, requestId, method, uri, msg, time, data } = obj
  if (!level) {
    level = 'info'
  }
  logger.log({
    level: level,
    requestId: requestId,
    method: method,
    uri: uri,
    msg: msg,
    time: time,
    data: data,
  })
}
It is being called in files that write logs like this:
import { log } from '../config/logger.mjs'
...
function writeRequestLog (start, request, requestId) {
  let end = new Date().getTime()
  let diff = end - start
  log({ level: 'info', requestId: requestId, method: request.method, uri: request.path, msg: null, time: `${diff}ms`, data: JSON.stringify(request.query) })
}
Since the file is imported directly, it is executed immediately, and winstonDailyRotateFileTransport is created with `${logDir}/all-gdpr-proof-%DATE%.log` as the filename. How do I get around instantiating this with a fixed filename, so that I get log files rotated every 30 minutes for a number of dynamically named files?
I tried creating a class in JS, but I quickly got into trouble because of the .on('rotate', ...) handler defined on winstonDailyRotateFileTransport, and I'm also not sure what other implications creating a class for this might have (since this logger is used in many places in my code).
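One way around the fixed module-level filename, sketched below as an illustration rather than a drop-in answer: a small factory function that builds one rotating transport (with its own rotate handler) per log name. The createRotatingFileTransport name and its parameter are made up for this sketch; it assumes the same logDir, logDirRotated and winstonGdprProofFormat as above.
// Sketch: build one DailyRotateFile transport per dynamically named log file.
function createRotatingFileTransport (logName) {
  const transport = new winston.transports.DailyRotateFile({
    frequency: '30m',
    format: winstonGdprProofFormat,
    filename: `${logDir}/${logName}-%DATE%.log`,
    datePattern: 'YYYY-MM-DD HH-mm',
  })
  // Each transport moves its own rotated files so Promtail can pick them up.
  transport.on('rotate', (oldFilenamePath) => {
    const pathToMoveTo = `${logDirRotated}/${path.basename(oldFilenamePath)}`
    fs.rename(oldFilenamePath, pathToMoveTo, (err) => {
      if (err) console.error(err)
    })
  })
  return transport
}
// Usage: create one logger (or one extra transport) per log name, e.g.
// const requestLogger = winston.createLogger({ transports: [createRotatingFileTransport('requests')] })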
I need to access the fileHandler object of my logger so I can flush the buffer to the file.
This is my program:
import * as log from "https://deno.land/std@0.75.0/log/mod.ts"
import { Application } from "https://deno.land/x/oak@v6.3.1/mod.ts";

const app = new Application()
const port = 7001

await log.setup({
  handlers: {
    file: new log.handlers.FileHandler("DEBUG", {
      filename: "logger.log",
      formatter: lr => {
        return `${lr.datetime.toISOString()} [${lr.levelName}] ${lr.msg}`
      }
    })
  },
  loggers: {
    default: {
      level: "DEBUG",
      handlers: ["file"]
    }
  }
})

const logger = log.getLogger()
logger.debug("hi there")

app.use((ctx) => {
  ctx.response.body = 'Hi there'
})

console.log(`listening on port ${port}`)
app.listen({ port })
My problem is that the log message is never written to the file.
If I remove the last line (app.listen()) it does write to the file, because the process ends.
But if I leave it listening, the process never ends, so the log buffer is never flushed.
If I interrupt the process with Ctrl-C it doesn't write it either.
The documentation (https://deno.land/std@0.75.0/log/README.md) says I can force a log flush using the flush method of FileHandler, but I don't know how to access the fileHandler object.
So I've tried this:
const logger = log.getLogger()
logger.debug("hi there")
logger.handlers[0].flush()
And it works! But only as JavaScript, NOT as TypeScript.
As TypeScript I get this error:
error: TS2339 [ERROR]: Property 'flush' does not exist on type 'BaseHandler'.
logger.handlers[0].flush()
Well, I found a solution.
I just have to import the FileHandler class and cast my handler down from BaseHandler to FileHandler.
So I added this line among the imports:
import { FileHandler } from "https://deno.land/std@0.75.0/log/handlers.ts"
And then after creating the logger:
logger.debug("hi there")
const fileHandler = <FileHandler> logger.handlers[0]
fileHandler.flush()
It looks a little weird, and I still suspect there must be a less quirky / more semantic solution, but it works fine.
Let us just recap with the help of Santi's answer.
In my experience, logging to a file works fine in a program that ends, meaning a program that dies by itself or calls Deno.exit(0). The problem occurs in a never-ending loop: in that case the logs are never appended to their files. Below is how to overcome this situation:
// dev.js : "I want my logs" example
import { serve } from "https://deno.land/std@0.113.0/http/server_legacy.ts";
import * as log from "https://deno.land/std@0.113.0/log/mod.ts";

// very simple setup, adapted from the official standard lib https://deno.land/std@0.113.0/log
await log.setup({
  handlers: {
    file: new log.handlers.FileHandler("WARNING", {
      filename: "./log.txt",
      formatter: "{levelName} {msg}",
    }),
  },
  loggers: {
    default: {
      level: "DEBUG",
      handlers: ["file"],
    },
  },
});

// here we go
let logger;
logger = log.getLogger();
logger.warning('started');
const fileHandler = logger.handlers[0];
await fileHandler.flush(); // <---- the trick, need to flush! Thanks Santi

// loop on requests
const srv = serve(`:4321`);
for await (const request of srv) {
  request.respond({ body: 'bonjour', status: 200 });
  logger.warning('hit !');
  fileHandler.flush(); // <---- flush again
}
Run with
$ deno run -A dev.js
And check the file log.txt with the following trigger
$ curl localhost:4321
This is very low tech and probably adds a significant delay to the process. The next level would be to fire a timer event that flushes every minute or so, as sketched below.
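A minimal sketch of that timer-based approach, assuming the fileHandler obtained above is in scope (the 60-second interval is arbitrary):
// Flush buffered log records once a minute instead of after every request.
// Trades a bounded write delay for far fewer flush calls.
setInterval(() => {
  fileHandler.flush();
}, 60 * 1000);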
Sentry by default has an integration for console.log to make it part of breadcrumbs (import name: Sentry.Integrations.Console).
How can we make it work for the bunyan logger as well, like:
const koa = require('koa');
const app = new koa();

const bunyan = require('bunyan');
const log = bunyan.createLogger({
  name: 'app',
  ..... other settings go here ....
});

const Sentry = require('@sentry/node');
Sentry.init({
  dsn: MY_DSN_HERE,
  integrations: integrations => {
    // should anything be handled here & how?
    return [...integrations];
  },
  release: 'xxxx-xx-xx'
});

app.on('error', (err) => {
  Sentry.captureException(err);
});
// I am trying all to be part of sentry breadcrumbs
// but only console.log('foo'); is working
console.log('foo');
log.info('bar');
log.warn('baz');
log.debug('any');
log.error('many');
throw new Error('help!');
P.S. I have already tried bunyan-sentry-stream, but had no success with @sentry/node; it just pushes entries instead of treating them as breadcrumbs.
Bunyan supports custom streams, and those streams are just objects with a write function. See https://github.com/trentm/node-bunyan#streams
Below is an example custom stream that simply writes to the console. It would be straightforward to adapt this example to write to the Sentry module instead, likely calling Sentry.addBreadcrumb({}) or a similar function.
Please note, though, that the record variable in my example below is a JSON string, so you will likely want to parse it to get the log level, message, and other data out of it for submission to Sentry.
{
  level: 'debug',
  stream: (function () {
    return {
      write: function (record) {
        console.log('Hello: ' + record);
      }
    }
  })()
}
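As a concrete illustration, here is a hedged sketch of a stream that forwards records to Sentry as breadcrumbs. Sentry.addBreadcrumb is the documented @sentry/node call; the mapping from bunyan's numeric levels to Sentry severity strings is an assumption for illustration.
const Sentry = require('@sentry/node');

// Bunyan uses numeric levels (10 trace .. 60 fatal); map them to Sentry severities.
// This mapping is an assumption, adjust to taste.
const levelMap = { 10: 'debug', 20: 'debug', 30: 'info', 40: 'warning', 50: 'error', 60: 'fatal' };

const sentryBreadcrumbStream = {
  write: function (record) {
    // bunyan hands non-raw streams a serialized JSON string, so parse it first.
    const data = JSON.parse(record);
    Sentry.addBreadcrumb({
      category: 'bunyan',
      message: data.msg,
      level: levelMap[data.level] || 'info',
    });
  }
};

// Register it like any other stream:
// bunyan.createLogger({ name: 'app', streams: [{ level: 'debug', stream: sentryBreadcrumbStream }] });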
Filters modify the message, and rewriters modify the meta.
Using winston v2, what are my options if I want to filter out (i.e. not print) certain messages?
I know the question asks about winston 2, but maybe winston 3 is more relevant, since this thread is ~8 months old.
For winston 2, read https://github.com/winstonjs/winston/blob/2.4.0/docs/transports.md#console-transport and figure it out; you probably need to define a formatter, as that's a prop in the options for that version's Console transport.
I figured it out by reading the winston source code. It seems to depend on the logform module.
const logform = require('logform');
// const { MESSAGE } = require('triple-beam'); // prop for info in the winston formatter that exposes the shown message

function filterMessagesFormat(filterFunc) {
  const formatFunc = (info) => {
    if (filterFunc(info.message)) return info;
    return null;
  };
  const format = logform.format(formatFunc);
  format.transform = formatFunc;
  return format;
}
Usage is the same as the formats they define, like json, colorize, simple, and others.
In the options for winston.createLogger(options), you define a prop called transports, and you want one of the values of that array to be the output of this function, like:
transports: [
  new winston.transports.Console({
    format: winston.format.combine(
      filterMessagesFormat((msg) => msg !== 'useless message'),
    ),
    handleExceptions: false,
  }),
],
My personal logger creator func, https://gist.github.com/jtara1/3128cc6ed3dbea6d507b30967ab0e197, includes the change that allows a filter func to be used.
I am attempting to update an entity in my datastore kind using sample code from here https://cloud.google.com/datastore/docs/reference/libraries. The actual code is something like this:
// Imports the Google Cloud client library
const Datastore = require('@google-cloud/datastore');

// Your Google Cloud Platform project ID
const projectId = 'YOUR_PROJECT_ID';

// Creates a client
const datastore = new Datastore({
  projectId: projectId,
});

// The kind for the new entity
const kind = 'Task';
// The name/ID for the new entity
const name = 'sampletask1';
// The Cloud Datastore key for the new entity
const taskKey = datastore.key([kind, name]);

// Prepares the new entity
const task = {
  key: taskKey,
  data: {
    description: 'Buy milk',
  },
};

// Saves the entity
datastore
  .save(task)
  .then(() => {
    console.log(`Saved ${task.key.name}: ${task.data.description}`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
I tried to create a new entity using this code, but when I ran it and checked the Datastore console, no entities had been created. I am also unable to update an existing entity. What could be the reason for this?
I am writing the code in Google Cloud Functions. This is the log when I run the function:
{
  insertId: "-ft02akcfpq"
  logName: "projects/test-66600/logs/cloudaudit.googleapis.com%2Factivity"
  operation: {…}
  protoPayload: {…}
  receiveTimestamp: "2018-06-15T09:36:13.760751077Z"
  resource: {…}
  severity: "NOTICE"
  timestamp: "2018-06-15T09:36:13.436Z"
}
{
  insertId: "000000-ab6c5ad2-3371-429a-bea2-87f8f7e36bcf"
  labels: {…}
  logName: "projects/test-66600/logs/cloudfunctions.googleapis.com%2Fcloud-functions"
  receiveTimestamp: "2018-06-15T09:36:17.865654673Z"
  resource: {…}
  severity: "ERROR"
  textPayload: "Warning, estimating Firebase Config based on GCLOUD_PROJECT. Intializing firebase-admin may fail"
  timestamp: "2018-06-15T09:36:09.434Z"
}
I have tried the same code and it works for me. However, I have noticed that there was a delay before the entities appeared in Datastore. In order to update and overwrite existing entities, use .upsert(task) instead of .save(task) (link to GCP documentation). You can also use .insert(task) instead of .save(task) to store new entities.
Also check that the project id is correct and that you are inspecting the entities for the right kind.
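For reference, a minimal sketch of the distinction, reusing the datastore client and taskKey from the question (the entity payloads are illustrative):
// .insert() only creates: it fails if an entity with this key already exists.
datastore.insert({ key: taskKey, data: { description: 'Buy milk' } })
  .then(() => console.log('Created entity'))
  .catch(err => console.error('ERROR:', err));

// .upsert() creates the entity, or overwrites it if it already exists.
datastore.upsert({ key: taskKey, data: { description: 'Buy soy milk' } })
  .then(() => console.log('Created or updated entity'))
  .catch(err => console.error('ERROR:', err));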
Basically, I am trying to get a WSS feed going from Poloniex and update a collection with it, so that I can have 'latest' prices in a collection (I will update and overwrite existing entries) and show them on a web page. For now, I have the WSS feed working and am just trying to insert some of the data into the collection to see if it works, but it doesn't, and I can't figure out why!
Note: the collection works; I've manually inserted a record with the shell.
Here is the code I have now:
import { Meteor } from 'meteor/meteor';
import * as autobahn from "autobahn";
import { Mongo } from 'meteor/mongo'
import { SimpleSchema } from 'meteor/aldeed:simple-schema'

//quick DB
Maindb = new Mongo.Collection('maindb');
Maindb.schema = new SimpleSchema({
  place: { type: String },
  pair: { type: String },
  last: { type: Number, defaultValue: 0 }
});

Meteor.startup(() => {
  var wsuri = "wss://api.poloniex.com";
  var Connection = new autobahn.Connection({
    url: wsuri,
    realm: "realm1"
  });

  Connection.onopen = function (session) {
    function tickerEvent (args, kwargs) {
      console.log(args[0]);
      Maindb.insert({ place: 'Poloniex', pair: args[0] });
    }
    session.subscribe('ticker', tickerEvent);

    Connection.onclose = function () {
      console.log("Websocket connection closed");
    }
  }

  Connection.open();
});
The console logs the feed, but the insert does not work.
I looked online and it said that to get an insert to work in a 'non-Meteor' function, you need to use Meteor.bindEnvironment, which I did:
I changed
function tickerEvent (args, kwargs) {
  console.log(args[0]);
  Maindb.insert({ place: 'Poloniex', pair: args[0] });
}
which became
var tickerEvent = Meteor.bindEnvironment(function (args, kwargs) {
  console.log(args[0]);
  Maindb.insert({ place: 'Poloniex', pair: args[0] });
});
tickerEvent();
This doesn't do anything, not even print the feed to my console. Using the same structure but simply removing Meteor.bindEnvironment prints to the console again, but doesn't update the collection.
Am I doing something wrong?
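For comparison, a hedged sketch of how Meteor.bindEnvironment is usually applied in this situation: the wrapped function is handed to session.subscribe as the callback rather than being invoked immediately (the bare tickerEvent() call above runs once with no arguments instead of subscribing). This is a sketch of the usual pattern, not a verified fix:
// Wrap the callback so collection calls run inside a Meteor environment...
const tickerEvent = Meteor.bindEnvironment(function (args, kwargs) {
  console.log(args[0]);
  Maindb.insert({ place: 'Poloniex', pair: args[0] });
});
// ...then pass the wrapped function to subscribe; autobahn calls it on each tick.
session.subscribe('ticker', tickerEvent);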