React Native read yaml file - javascript

I'd like to read and parse a static YAML resource file (let's start from there) within my React Native source code. I tried require('path/to/file.yaml') to no avail.
I also tried npm libraries such as read-yaml and yaml-loader, none of which could read the YAML content properly.
It is a breeze with a JSON file: I can just `require('path/to/file.json')` and immediately get everything as a nice object.
Is there a way to read YAML in React Native? It is just another text file with a different format, so I don't think it should be that hard to read and parse, but I am new to JavaScript and React Native, coming from a C/C++ background.

Since there appears to be no built-in support for YAML, I believe you would need to use something like react-native-fs, or in Expo, FileSystem.readAsStringAsync (treating the file as an asset). You should then be able to use a library such as yaml-js to parse the string you read.

You can't use require directly to load YAML.
If you're using Expo:
Use a hook (uri must be a static string literal such as './config.yaml' — Metro cannot resolve dynamic require paths — and you may need to add 'yaml' to resolver.assetExts in metro.config.js so the file is treated as an asset):
import { useAssets } from 'expo-asset';

const [assets, assetsLoadError] = useAssets([
  require(uri)
]);
Load the asset (won't work without the hook above; note Asset.loadAsync resolves to an array of assets):
import { Asset } from 'expo-asset';

const [loadedAsset] = await Asset.loadAsync(
  require(uri)
);
Read the file into a string:
import * as FileSystem from 'expo-file-system';

let content;
try {
  content = await FileSystem.readAsStringAsync(
    loadedAsset.localUri
  );
} catch (e) {
  console.log({ e });
}
Parse the YAML string, e.g. with the yaml npm package:
import YAML from 'yaml';

const parsedYAML = YAML.parse(content);

First install the plugin:
npm install --save babel-plugin-content-transformer
Then add it to your Babel config:
plugins: [
  [
    'content-transformer',
    {
      transformers: [
        {
          file: /\.ya?ml$/,
          format: 'yaml',
        },
      ],
    },
  ],
],
After restarting the bundler (resetting its cache if needed), YAML files can be imported as plain objects, just like JSON.

Related

Update (write to) an object in a separate JS file using Node

I'm fairly new to Node, and am wracking my brains on how to achieve the following:
I have a config file that looks something like this:
// various es imports
export default {
input: {
index: 'src/index.ts',
Button: 'src/Button/index.ts',
Spinner: 'src/Spinner/index.ts',
'icons/Notification': 'src/_shared/components/icons/Notification.tsx',
'icons/Heart': 'src/_shared/components/icons/Heart.tsx',
},
//.. other properties
}
From my Node script, I need to somehow read this file and do the following:
Delete any entries in the input object that have a key starting
with icons/
Append new entries to the input object.
Write these changes back to the original config file.
Is there a recommended way to do this in Node? I've been looking at a couple of libraries, like replace-in-file, but none seem suited to this particular case.
Just faced the same concern; here is how I solved it:
1. Get your file content
If it is not a .js file, then use fs.readFileSync (or fs.readFile) like so:
const fs = require('fs');
const path = require('path');

const myObjectAsString = fs.readFileSync(
  path.join( process.cwd(), 'my-file.txt' ), // use path.join for cross-platform paths
  'utf-8' // otherwise you'll get a Buffer instead of a plain string
);
// process myObjectAsString to get it as something you can manipulate
Here I am using process.cwd(); in a CLI app, it gives you the current working directory path.
If it is a .js file (e.g. a JavaScript config file like webpack.config.js), then simply use the native require function, as you would with a regular internal file or npm module, and you will get all the module.exports content:
const path = require('path');
const myObject = require( path.join( process.cwd(), 'myFile.js') );
2. Modify your object as you want
// ...
myObject.options.foo = 'An awesome option value';
3. Then rewrite it back to the file
You can simply use fs.writeFileSync to achieve that (serialize the object first — writeFileSync expects a string or Buffer):
// ...
fs.writeFileSync( path.join(process.cwd(), 'my-file.txt'), JSON.stringify(myObject, null, 2) );
If you want to write a plain JavaScript object, then you can use the native util.inspect() method, and you may also use fs.appendFileSync:
const util = require('util');
// ...
// If we want to add the 'module.exports = ' part first
fs.writeFileSync( process.cwd() + '/file.js', 'module.exports = ');
// Write the plain object to the file
fs.appendFileSync( process.cwd() + '/file.js', util.inspect(myObject));

How to dynamically require JSON files in Webpack?

I have a bunch of json files that I need to dynamically require:
export const allLanguages = R.fromPairs(availableLanguages.map(language => {
return [
language,
{
translation: require('./' + language + '.json'),
formats
}
] as [Language, any]
}))
But I get Error: Cannot find module './ar.json' when running webpack --watch, and I can see that the json files have not been copied to the build directory.
So I tried adding {from: 'common/i18n/*.json'} to CopyWebpackPlugin arguments, and now the json files are copied to the correct place, but I still get Error: Cannot find module './ar.json' when running webpack --watch. It seems they are copied after the build, not before it, and hence the error.
There seems to be a feature request for allowing CopyWebpackPlugin to copy files before the build: https://github.com/webpack-contrib/copy-webpack-plugin/issues/195
What is the correct way to handle this in Webpack?
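For reference, webpack's built-in mechanism for this is require.context, which statically pulls every matching file into the bundle at build time, so no copy step is needed. A sketch mirroring the question's code (this only compiles inside a webpack build; availableLanguages, formats, R, and the Language type come from the question's surrounding module):

```typescript
// require.context('./', false, /\.json$/) tells webpack to bundle every
// ./*.json file up front; calling the returned function yields the
// parsed module for one of them.
const translations = require.context('./', false, /\.json$/)

export const allLanguages = R.fromPairs(availableLanguages.map(language => {
  return [
    language,
    {
      translation: translations('./' + language + '.json'),
      formats
    }
  ] as [Language, any]
}))
```

Because the context is built at compile time, a missing ar.json becomes a build error rather than a runtime "Cannot find module".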

How to 'require' an xml file (loaded by webpack) in order to read the xml contents in javascript

I'm using webpack as the bundler for an app and an XML file for its configuration. Currently I'm writing tests to retrieve the config from an XML file packaged by webpack.
Referencing the webpack article Loading Data, the piece of code I'm having trouble with is this:
import Data from './data.xml';
In my own test, I'm using the require form, which is as follows:
const select = require('xpath.js');
const DOMParser = require('xmldom').DOMParser
const parser = new DOMParser();
const data = require('./app.config.xml'); // <- THIS IS MY PROBLEM
const xdocument = parser.parseFromString(data.toString(), 'text/xml');
How does 'require' work for an XML file?
This code:
console.log("DATA: " + data.toString());
produces this output:
DATA: [object Object]
So I don't know what object is being created by the require.
I have seen a lot of other code samples about reading XML into JSON, but I don't want to have to handle a JSON object. I want to interrogate the XML as an XML object so I can run xpath queries against it.
I can easily, get to the JSON, by running
JSON.stringify(data)
which displays something like:
{"Application":{"$":{"name":"config-data"},"Imports":[{"Import":[{"$":{"from":"./another.config.xml"}}]}],"Con
which is not what I want. So how can I get the XML content from *data* as a string, which is what **parser.parseFromString** needs?
My test code is as follows:
describe.only('xpath examples (zenobia)', async assert => {
const xdoc = require('./app.config.xml');
let applicationNodes = select(xdoc, "/Application"); // THIS RETURNS a DOMObject
console.log(`Found ${applicationNodes.length} Application elements`);
assert({
given: 'an inline xml document with an Application',
should: 'Select an element from the root',
condition: applicationNodes !== null
});
});
That log message displays:
Found 0 Application elements
for an XML document that is:
<?xml version="1.0"?>
<Application name="app">
<Imports/>
</Application>
If you want to read XML files as strings, you can simply use webpack's raw-loader with the following configuration:
// webpack.config.js
module.exports = {
module: {
rules: [
{
test: /\.xml$/i,
use: 'raw-loader',
},
],
},
};
Now data in
const data = require('./app.config.xml');
should be the raw contents of the XML file (i.e. a string), and the rest of your code should now work as expected.

Using RequireJS with node to optimize creating single output file does not include all the required files

I use FayeJS, and the latest version has been modified to use RequireJS, so there is no longer a single file to link into the browser. Instead the structure is as follows:
/adapters
/engines
/mixins
/protocol
/transport
/util
faye_browser.js
I am using the following nodejs build script to try and end up with all the above minified into a single file:
var fs = require('fs-extra'),
requirejs = require('requirejs');
var config = {
baseUrl: 'htdocs/js/dev/faye/'
,name: 'faye_browser'
, out: 'htdocs/js/dev/faye/dist/faye.min.js'
, paths: {
dist: "empty:"
}
,findNestedDependencies: true
};
requirejs.optimize(config, function (buildResponse) {
//buildResponse is just a text output of the modules
//included. Load the built file for the contents.
//Use config.out to get the optimized file contents.
var contents = fs.readFileSync(config.out, 'utf8');
}, function (err) {
//optimization err callback
console.log(err);
});
The content of faye_browser.js is:
'use strict';
var constants = require('./util/constants'),
Logging = require('./mixins/logging');
var Faye = {
VERSION: constants.VERSION,
Client: require('./protocol/client'),
Scheduler: require('./protocol/scheduler')
};
Logging.wrapper = Faye;
module.exports = Faye;
As I understand it, the optimizer should pull in the required files, and then if those files have required files, it should pull those in too, and output a single minified faye.min.js that contains the whole lot, refactored so no additional server-side calls are necessary.
What happens is faye.min.js gets created, but it only contains the content of faye_browser.js, none of the other required files are included.
I have searched all over the web, and looked at a heap of different examples and none of them work for me.
What am I doing wrong here?
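One thing worth checking before reaching for the packaged build: faye_browser.js and its dependencies use CommonJS-style require calls, and the r.js optimizer only traces those when CommonJS translation is enabled in the build config. A sketch of the config above with that flag added (cjsTranslate is a documented r.js build option, but whether it resolves this particular case is untested here):

```javascript
// r.js build config: cjsTranslate wraps CommonJS modules in define()
// so their require() calls can be traced and inlined into the output.
var config = {
  baseUrl: 'htdocs/js/dev/faye/',
  name: 'faye_browser',
  out: 'htdocs/js/dev/faye/dist/faye.min.js',
  cjsTranslate: true,
  findNestedDependencies: true
};
```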
For anyone else trying to do this, I missed that on the download page it says:
The Node.js version is available through npm. This package contains a
copy of the browser client, which is served up by the Faye server when
running.
So to get it you have to pull down the code via npm, then go into the npm install directory; it is in the "client" dir...

What is the Node.js equivalent of a PHP include?

I'm looking for a simple solution for server side includes of regular HTML and CSS (for repeated components like headers, footers, navigation, etc) that doesn't require extensive frameworks.
You're looking for the
require()
function. Check out the documentation for this and other Node.js things here
If you want to use the newer import statement, you may do so; it's not yet fully implemented in Node, but you can use it by giving the file you want to import the .mjs extension and then running:
node --experimental-modules someFile.mjs
In Node.js, to include repetitive files like headers, footers, etc., you will need a templating language such as EJS.
Using EJS, here is a sample code snippet with the include tag:
<%- include('./path/to/your/html/file') %>
You can use require to include multiple files. However, since Node caches the files, you will need to delete them from the cache if you want to use an un-cached version of a file.
index.js
app.get('/path', (req, res) => {
clear()
let headers1 = require('/headers/a.js')
let headers2 = require('/headers/b.js')
res.set(headers1)
res.set(headers2)
})
// Remove from require cache
function clear() {
delete require.cache[require.resolve('/headers/a.js')]
delete require.cache[require.resolve('/headers/b.js')]
}
headers/a.js
module.exports = {
'Content-Type': 'application/json'
}
headers/b.js
module.exports = {
'Custom-Header': 'Brass Monkey'
}
I think what you want is to require "view components". Multiple view engines exist for Node.js, for example Pug or EJS. In that case, you use include.
