i18next not loading files in backend - javascript

Hi, I am using i18next in my Node.js application.
The following is the config code for i18next:
const i18n = require('i18next');
const i18nextBackend = require('i18next-node-fs-backend');

i18n
  .use(i18nextBackend)
  .init({
    fallbackLng: 'en',
    debug: true,
    backend: {
      loadPath: 'locales/{{lng}}.json',
      addPath: 'locales/{{lng}}.json',
      jsonIndent: 2,
    },
  }, (err, t) => {
    // init set content
    console.log(t);
    // console.log('INIT DONE');
  });

console.log(i18n.t('hello'));
I have en.js in my locales folder, which contains data in JSON format, but the file is not being loaded. Can anybody tell me how to specify the path in loadPath properly?

Looks like you did not add a backend plugin at all. The filesystem backend at https://github.com/i18next/i18next-node-fs-backend might be what you are looking for.
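For reference, a minimal sketch of wiring up that filesystem backend might look like the following. It assumes the translation file is locales/en.json (matching the {{lng}}.json pattern) and resolves the path relative to the config file rather than the process working directory:

const path = require('path');
const i18n = require('i18next');
const i18nextBackend = require('i18next-node-fs-backend');

i18n
  .use(i18nextBackend)
  .init({
    fallbackLng: 'en',
    debug: true,
    backend: {
      // resolve relative to this file instead of the current working directory
      loadPath: path.join(__dirname, 'locales/{{lng}}.json'),
      addPath: path.join(__dirname, 'locales/{{lng}}.json'),
      jsonIndent: 2,
    },
  }, (err, t) => {
    // translations are only guaranteed to be loaded once this callback fires
    console.log(t('hello'));
  });

Note that calling i18n.t('hello') right after init (outside the callback), as in the question, usually runs before the file has finished loading.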

Related

Write to disk option for Vite

Recently I started working with Vite on a couple of small projects and found it very interesting; however, I hit a blocker once I tried to work on a coupled Express.js + Svelte project.
I usually use Express as a BFF (Backend For Frontend) on more serious projects, since it allows me to use HttpOnly cookies as well as a proxy gateway for the frontend. However, for development (especially when it comes to OAuth2) it is hard to develop the SPA separately from the server, so what I usually do with webpack is enable the writeToDisk option of the dev server, which then gives me the development build in the dist folder.
An example with webpack would be something like the config below for the frontend:
module.exports = {
  devServer: {
    devMiddleware: {
      writeToDisk: true,
    },
  },
  //...
}
and then on the server, basically serving the dist as a static folder:
app.get(
  "*",
  (req, res, next) => {
    if (req.session.isAuth) return next();
    else return res.redirect(staticURL);
  },
  (req, res) => {
    return res.sendFile(staticProxyPage());
  }
);
My problem
I cannot find any APIs in Vite's documentation to do something like this. Does anyone have experience with such cases?
If it is possible with the help of plugins, can you please provide references to the plugin or its dev logs?
Many thanks :)
Here is an example plugin I was able to hack together. You might need to modify it to suit your needs:
// https://vitejs.dev/guide/api-plugin.html#universal-hooks=
import {type Plugin} from 'vite';
import fs from 'fs/promises';
import path from 'path';

const writeToDisk: () => Plugin = () => ({
  name: 'write-to-disk',
  apply: 'serve',
  configResolved: async config => {
    config.logger.info('Writing contents of public folder to disk', {timestamp: true});
    await fs.cp(config.publicDir, config.build.outDir, {recursive: true});
  },
  handleHotUpdate: async ({file, server: {config, ws}, read}) => {
    if (path.dirname(file).startsWith(config.publicDir)) {
      const destPath = path.join(config.build.outDir, path.relative(config.publicDir, file));
      config.logger.info(`Writing contents of ${file} to disk`, {timestamp: true});
      await fs.access(path.dirname(destPath)).catch(() => fs.mkdir(path.dirname(destPath), {recursive: true}));
      await fs.writeFile(destPath, await read());
    }
  },
});
In short, it copies the content of the publicDir (public) to the outDir (dist).
The only thing missing is the ability to watch the public folder and copy whatever changed into the dist folder again; it will, however, re-write a file whenever one of the existing files within the public folder changes.
Note: The use of fs.cp relies on an experimental API from node 16.
Feel free to modify as you please. Would be nice to be able to specify file globs to include/exclude, etc.
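For completeness, registering a plugin like this in the Vite config would look roughly like the snippet below (the ./write-to-disk import path and the named export are assumptions; the answer above defines the plugin but does not show where it lives):

// vite.config.ts: rough sketch of wiring up the plugin above
import { defineConfig } from 'vite';
import { writeToDisk } from './write-to-disk'; // assumed location/export of the plugin

export default defineConfig({
  plugins: [writeToDisk()],
});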

i18n not working in react app after converting to single spa

After converting a React app that had i18n implemented to single-spa, I am facing a problem where translation.json cannot be accessed, hence the labels are not fetched.
Should I modify something in webpack.config.js to get it right?
import i18n from "i18next";
import { initReactI18next } from "react-i18next";
import i18nextHttpBackend from "i18next-http-backend";
import Cookies from "js-cookie";
import LanguageDetector from "i18next-browser-languagedetector";

i18n
  .use(i18nextHttpBackend)
  .use(initReactI18next)
  .use(LanguageDetector)
  .init({
    lng: Cookies.get("locale") || "es",
    fallbackLng: "en",
    debug: false,
    supportedLngs: ["en", "es"],
    interpolation: {
      escapeValue: false,
    },
  });

export default i18n;
i18n is imported in App.js:
import "./i18n";
Initially, before converting to single-spa, the app was working fine and making a call to
http://localhost:3000/locales/en/translation.json
but after converting the app to single-spa, the GET request to
http://single-spa-playground.org/locales/en/translation.json
would fail.
I followed this tutorial https://www.youtube.com/watch?v=W8oaySHuj3Y&list=PLLUD8RtHvsAOhtHnyGx57EYXoaNsxGrTU&index=13 to convert the React app to single-spa.
WebPack Config
const { merge } = require("webpack-merge");
const singleSpaDefaults = require("webpack-config-single-spa-react");
const Dotenv = require("dotenv-webpack");

module.exports = (webpackConfigEnv, argv) => {
  console.log(webpackConfigEnv);
  const defaultConfig = singleSpaDefaults({
    orgName: "WHATEVR",
    projectName: "WHATEVER",
    webpackConfigEnv,
    argv,
  });
  return merge(defaultConfig, {
    // modify the webpack config however you'd like to by adding to this object
    plugins: [new Dotenv()],
    devServer: {
      headers: {
        "Access-Control-Allow-Origin": "*",
        "Access-Control-Allow-Methods":
          "GET, POST, PUT, DELETE, PATCH, OPTIONS",
        "Access-Control-Allow-Headers":
          "X-Requested-With, content-type, Authorization",
      },
    },
  });
};
I tried this solution, but it is still not solved:
Reactjs - error loading translation files
The issue is that previously, the React app also served as the server that provided the index.html file along with other static assets (e.g. your localized translation JSON files). In single-spa, that is no longer the case; that is now the root-config. You'll need to update your i18next-http-backend loadPath configuration so that the library tries to retrieve them from the right path, which is no longer the root URL. Without being familiar with what you want to achieve, you have two options:
use __webpack_public_path__ to dynamically create the correct URL to point to the assets served by this microfrontend, e.g. loadPath: `${__webpack_public_path__}/locales/{{lng}}/{{ns}}.json` (see the sketch after this list)
if you have a separate i18n service, point the URL to that. This may also require crossDomain and withCredentials depending on how that is also configured.
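Applied to the init call from the question, the first option could look roughly like this (the exact locales path is an assumption; adjust it to wherever the microfrontend actually serves its static assets):

import i18n from "i18next";
import { initReactI18next } from "react-i18next";
import i18nextHttpBackend from "i18next-http-backend";
import Cookies from "js-cookie";
import LanguageDetector from "i18next-browser-languagedetector";

i18n
  .use(i18nextHttpBackend)
  .use(initReactI18next)
  .use(LanguageDetector)
  .init({
    lng: Cookies.get("locale") || "es",
    fallbackLng: "en",
    supportedLngs: ["en", "es"],
    interpolation: { escapeValue: false },
    backend: {
      // __webpack_public_path__ is injected by webpack at runtime
      loadPath: `${__webpack_public_path__}/locales/{{lng}}/{{ns}}.json`,
    },
  });

export default i18n;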
The answer from @filoxo was very helpful. What didn't work in my case, though, was having the public folder outside of src: when deploying my app to S3, the translation files were never included.
So I moved the locales inside src and used the webpack copy plugin to copy the files into the dist folder. Setup-wise this looks like this:
Webpack config:
const CopyPlugin = require('copy-webpack-plugin');

plugins: [
  new CopyPlugin({
    patterns: [
      { from: 'src/assets/locales', to: 'locales' }
    ]
  })
]
i18n Settings:
i18n
  .use(Backend)
  .use(LanguageDetector)
  .use(initReactI18next)
  .init({
    fallbackLng: 'en',
    supportedLngs: ['en', 'de'],
    backend: {
      loadPath: `${__webpack_public_path__}locales/{{lng}}-translation.json`
    }
  });
Hope this helps someone who, like me, was trying to figure out why it wouldn't load the translations.

Sapper/Svelte.js - How to specify client-side assets location?

I have a Sapper.js application that I have successfully running on AWS Lambda. Lambda is able to deliver the server-side generated HTML created by Sapper to AWS API Gateway which then serves the app to the user. I am using S3 to host the client side assets (scripts, webpack chunks, etc). The S3 bucket is on a different domain than API Gateway.
The issue I'm having is that I need to set an asset prefix for these scripts so that Sapper can find them. Currently all of my client side scripts include relative links and look like this: <script src="/client/be33a1fe9c8bbaa6fa9d/SCRIPT_NAME.js"></script> I need to have them look like this: <script src="https://AWS_S3_BUCKET_ENDPOINT.com/client/be33a1fe9c8bbaa6fa9d/SCRIPT_NAME.js"></script>
Looking in the Sapper docs, I see that I can specify a base url for the client and server. However, changing this base url breaks my app and causes the Lambda rendering the pages to return 404 errors.
I know that when using, say, Next.js, I can accomplish this by modifying the next.config.js file to include the following:
module.exports = {
  assetPrefix: "https://AWS_S3_BUCKET_ENDPOINT.com/client",
}
But I don't know how to do this in Sapper. Do I need to modify the bundler (using webpack) config? Or is there some other way?
Thank you.
I think I've figured it out.
I had to change two sapper files. First I went into sapper/dist/webpack.js and modified it like so:
'use strict';

var __chunk_3 = require('./chunk3.js');

var webpack = {
  dev: __chunk_3.dev,
  client: {
    entry: () => {
      return {
        main: `${__chunk_3.src}/client`
      };
    },
    output: () => {
      return {
        path: `${__chunk_3.dest}/client`,
        filename: '[hash]/[name].js',
        chunkFilename: '[hash]/[name].[id].js',
        // change this line to point to the s3 bucket client key
        publicPath: "https://AWS_S3_BUCKET_ENDPOINT.com/client"
      };
    }
  },
  server: {
    entry: () => {
      return {
        server: `${__chunk_3.src}/server`
      };
    },
    output: () => {
      return {
        path: `${__chunk_3.dest}/server`,
        filename: '[name].js',
        chunkFilename: '[hash]/[name].[id].js',
        libraryTarget: 'commonjs2'
      };
    }
  },
  serviceworker: {
    entry: () => {
      return {
        'service-worker': `${__chunk_3.src}/service-worker`
      };
    },
    output: () => {
      return {
        path: __chunk_3.dest,
        filename: '[name].js',
        chunkFilename: '[name].[id].[hash].js',
        // change this line to point to the s3 bucket root
        publicPath: "https://AWS_S3_BUCKET_ENDPOINT.com"
      }
    }
  }
};

module.exports = webpack;
//# sourceMappingURL=webpack.js.map
Then I had to modify sapper/runtime/server.mjs so that the main variable points to the bucket like so:
...
const main = `https://AWS_S3_BUCKET_ENDPOINT.com/client/${file}`;
...
Testing with the basic sapper webpack template, I can confirm that the scripts are loading from the S3 bucket successfully. So far this all looks good. Next I will mess around with the sapper build command so that I can pass these hacks in as command line arguments and don't have to hardcode them every time.
Now, I'm not sure if this will hold up as the app becomes more complicated. Looking into the sapper/runtime/server.mjs file, I see that the req.baseUrl property is referenced in several different locations, and I don't know if my hacks will cause any issues with this, or anywhere else in sapper for that matter.
If anyone with more experience with the Sapper internals is reading, let me know in the comments if I screwed something up 👍
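On the point about not hardcoding the endpoint: one rough option (purely a sketch; ASSET_PREFIX is a made-up environment variable, not a real Sapper setting) would be to compute the two publicPath values used in the patched webpack.js from the environment, for example:

// Hypothetical sketch: derive the bucket prefix from an assumed ASSET_PREFIX
// environment variable instead of hardcoding it in the patched webpack.js.
const assetPrefix = process.env.ASSET_PREFIX || 'https://AWS_S3_BUCKET_ENDPOINT.com';

const clientPublicPath = `${assetPrefix}/client`;
const serviceWorkerPublicPath = assetPrefix;

console.log(clientPublicPath);        // https://AWS_S3_BUCKET_ENDPOINT.com/client
console.log(serviceWorkerPublicPath); // https://AWS_S3_BUCKET_ENDPOINT.com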

Unable to send file from local to server in node js project using FTP

I want to upload a file from my local machine to a server using FTP in a Node.js project.
My project structure:
-media
  -one.jpg
  -two.jpg
-node_modules
-views
-index.js
My code:
client = new ftpClient(config, options);

client.connect(function () {
  client.upload(['./media/five.png'], 'product', {
    baseDir: 'test',
    overwrite: 'older'
  }, function (result) {
    console.log(result);
  });
});
I am getting this error:
Error: The system cannot find the path specified.
If I pass the full path instead of ./media/five.png, then I get this error:
Error: The parameter is incorrect.
How can I send my file to the server?
Please help.
Thanks in advance.
As mentioned here https://www.npmjs.com/package/ftp-client:
baseDir - local base path relative to the remote directory, e.g. if you want to upload file uploads/sample.js to public_html/uploads, baseDir has to be set to uploads
Moreover, your second parameter 'product' should be the path on the destination server.
If you want to upload a file from your local 'media' directory to the remote directory /product/media (assuming the 'product' directory is located at the root of the server), the parameters should look as follows:
client = new ftpClient(config, options);

client.connect(function () {
  client.upload(['./media/five.png'], '/product/media', {
    baseDir: 'media',
    overwrite: 'older'
  }, function (result) {
    console.log(result);
  });
});
Note: You should use the 'path' Node module to join URLs and path strings - https://nodejs.org/api/path.html
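As a small illustration of that note (file names taken from the question; the exact layout is assumed):

// Build the local file path with the 'path' module so it does not depend on
// the process working directory.
const path = require('path');

const mediaDir = path.join(__dirname, 'media');
const localFile = path.join(mediaDir, 'five.png');

console.log(localFile); // e.g. /path/to/project/media/five.png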

Loading JSON config for paths using express and node

So I have what is probably a noob question about routing in Express and Node.js.
I have a middleware that loads a JSON file containing all of the paths and the configuration for these paths. The configuration file looks something like this:
module.exports = {
  "/account": {
    template: "dashboard/account",
    page_title: "My Account",
    // more path configs....
  },
  "/account/:id/users": {
    template: "dashboard/account/account-users",
    page_title: "Account Users",
    // more path configs....
  }
}
And the middleware method that gets called in my app server when the app is created
app.use(middleware.getRouteConfig);
looks like this
var routeConfig = require('../config/route-config'); // path to the JSON above

getRouteConfig: function (req, res, next) {
  req.config = routeConfig[req.path];
  next();
},
And finally my Express route looks like this:
app.get('/account/:id/users', function(req, res){
  // Get the users
  //
});
Now let's say I go to /account/12345/users.
Express notices that the route has the :id in the middle and goes to that route. However, my middleware will then try to load the config for that route, and it can't find /account/12345/users in my JSON config file.
So my question is: how do I load that route configuration using my dynamic :id path variable and get the configuration from the JSON file? Thanks in advance.
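One possible direction (a rough, untested sketch reusing the names from the question) is to match req.path against each configured pattern by turning the :param segments into a wildcard:

var routeConfig = require('../config/route-config'); // same config file as above

// Turn '/account/:id/users' into ^/account/[^/]+/users$ and test req.path against it.
function findRouteConfig(reqPath) {
  for (var pattern in routeConfig) {
    var regex = new RegExp('^' + pattern.replace(/:[^\/]+/g, '[^/]+') + '$');
    if (regex.test(reqPath)) {
      return routeConfig[pattern];
    }
  }
  return undefined;
}

module.exports = {
  getRouteConfig: function (req, res, next) {
    req.config = findRouteConfig(req.path);
    next();
  }
};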
Implement a JSON parser using Google's GSON SDK.
You will find sample code at: http://www.javacreed.com/gson-deserialiser-example/
For a JSON string validator: http://jsonlint.com/
