How to template dynamically generated react code? - javascript

I'm currently working on a framework for a web app that a number of developers will be working on. The web app will be in a dashboard style, with individual modules or "views" written in React.
One of the requirements of the framework is that it must ship with a tool that allows developers to automatically generate a new module or "view", so that it creates the files and folders needed and they can get straight to work on the code logic.
An extremely simple flow would be as follows:
Developer enters the name of their new module as an argument to an npm script
A script runs which creates [moduleName.js] and [moduleName.less], links them together, places them in a directory, and writes the generic React code.
I'm now up to the point where I'm generating the common React code. Here is what I wrote:
function writeBoilerplate(moduleName) {
    var jsFileStream = fs.createWriteStream("./src/m-" + moduleName + "/" + moduleName + ".js");
    jsFileStream.once('open', (fd) => {
        jsFileStream.write("import React from \"react\"\;\n");
        jsFileStream.write("import Style from \"\.\/" + moduleName + "\.less\"\;");
        jsFileStream.write("\n");
        jsFileStream.write("\n");
        jsFileStream.write("export default class " + moduleName + " extends React\.Component \{\n");
        jsFileStream.write("\n");
        jsFileStream.write("    constructor\(\) \{\n");
        jsFileStream.write("        super\(\)\;\n");
        jsFileStream.write("    \}\n");
        jsFileStream.write("\n");
        jsFileStream.write("    componentDidMount\(\) \{");
        jsFileStream.write("    \}");
        jsFileStream.write("\}");
        jsFileStream.end();
    });
}
You can immediately see the problem here, and I stopped before going too far. If I continue on this path, the code will become unreadable and unmanageable.
I want to refactor this to use JavaScript templates. However, I have never used templating before, and I am unsure how to create a template and use it, or whether there are any tools to help.
How can I refactor this code to use a template?

You need a template library for that. You can try lodash's _.template, for example: https://lodash.com/docs#template.
Put your boilerplate in a template file, read it, and use something like:
var compiled = _.template(templateFileContent);
compiled({ 'moduleName': 'mymodule' });
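For a fuller picture, here is a minimal sketch of how that could look end to end. The template file name (module.js.tpl) and its contents are illustrative assumptions, not something from the original question:

// generate-module.js: a sketch of the lodash template approach
var fs = require('fs');
var path = require('path');
var _ = require('lodash');

function writeBoilerplate(moduleName) {
    // module.js.tpl is a hypothetical template file kept next to this script
    var templateSource = fs.readFileSync(path.join(__dirname, 'module.js.tpl'), 'utf8');
    var compiled = _.template(templateSource);
    var output = compiled({ moduleName: moduleName });
    fs.writeFileSync('./src/m-' + moduleName + '/' + moduleName + '.js', output);
}

module.js.tpl would then hold the boilerplate with <%= moduleName %> placeholders (lodash's default interpolation delimiters):

import React from "react";
import Style from "./<%= moduleName %>.less";

export default class <%= moduleName %> extends React.Component {

    constructor() {
        super();
    }

    componentDidMount() {
    }
}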

Related

Rails 6 Webpacker calling javascript function from Rails view

I have the following structure for JavaScript in my Rails 6 app using Webpacker.
app/javascript
  + packs
    - application.js
  + custom
    - hello.js
Shown below are the contents of the above-mentioned JS files.
app/javascript/custom/hello.js
export function greet(name) {
  console.log("Hello, " + name);
}
app/javascript/packs/application.js
require("#rails/ujs").start()
require("jquery")
require("bootstrap")
import greet from '../custom/hello'
config/webpack/environment.js
const { environment } = require('@rails/webpacker')
const webpack = require('webpack')
environment.plugins.prepend('Provide',
  new webpack.ProvidePlugin({
    $: 'jquery',
    jQuery: 'jquery',
    Popper: ['popper.js', 'default']
  })
)
module.exports = environment
Now in my Rails view I am trying to use the imported function greet as shown below.
app/views/welcome/index.html.haml
- name = 'Jignesh'
:javascript
  var name = "#{name}"
  greet(name)
When I load the view, I see a ReferenceError: greet is not defined error in the browser's console.
I searched for a solution and found many resources on the web, but none of them helped. At last, while drafting this question, the suggestions pointed me to How to execute custom javascript functions in Rails 6, which is indeed close to my need; however, that solution is a workaround, and I am looking for a proper solution, because I have many views that need to pass data from the Rails view to JS functions which will be moved into custom files under the app/javascript/custom folder.
I would also highly appreciate it if anybody could help me understand the cause of the ReferenceError I am encountering.
Note:
I am not well-versed in JavaScript development in the Node realm and am also new to Webpacker, Webpack, JavaScript modules, and the import, export, and require syntax, so please bear with me if you find anything silly in what I am asking. I have landed in the above situation while trying to upgrade an existing Rails app to version 6.
Webpack does not make modules available to the global scope by default. That said, there are a few ways for you to pass information from Ruby to JavaScript outside of an AJAX request:
window.greet = function() { ... } and calling the function from the view as you have suggested is an option. I don't like having to code side effects in a lot of places, so it's my least favorite.
You could look at using expose-loader. This would mean customizing your webpack config to "expose" selected functions from selected modules to the global scope. It could work well for a handful of cases but would get tedious for many use cases.
Export selected functions from your entrypoint(s) and configure webpack to package your bundle as a library. This is my favorite approach if you prefer to call global functions from the view. I've written about this approach specifically for Webpacker on my blog.
// app/javascript/packs/application.js
export * from '../myGlobalFunctions'
// config/webpack/environment.js
environment.config.merge({
  output: {
    // Makes exports from entry packs available to global scope, e.g.
    // Packs.application.myFunction
    library: ['Packs', '[name]'],
    libraryTarget: 'var'
  },
})
// app/views/welcome/index.html.haml
:javascript
  Packs.application.greet("#{name}")
Take a different approach altogether and attach Ruby variables to a global object in your controller, such as with the gon gem. Assuming you set up the gem per the instructions, the gon object would be available both as a Ruby object which you can mutate server-side and as a global variable your JavaScript code can read from. You might need some other way to selectively call the greet function, such as a DOM query for a particular selector that's only rendered on the given page or for a given URL.
# welcome_controller.rb
def index
  gon.name = 'My name'
end

// app/javascript/someInitializer.js
window.addEventListener('DOMContentLoaded', function() {
  if (window.location.pathname.match(/posts/)) {
    greet(window.gon.name)
  }
})
@rossta Thanks a lot for your elaborate answer. It should definitely be highly helpful to viewers of this post.
Your 1st suggestion I found while searching for a solution to my problem, and I did reference it in my question. Like you, I also don't like it because it is sort of a workaround.
Your 2nd and 3rd suggestions, honestly speaking, went over my head, perhaps because I am a novice to the concepts of Webpack.
Your 4th approach sounds more practical to me, and as a matter of fact, after posting my question yesterday I tried out something along similar lines which did work. I am sharing the solution below for reference.
app/javascript/custom/hello.js
function greet(name) {
  console.log("Hello, " + name)
}

export { greet }
app/javascript/packs/application.js
require("#rails/ujs").start()
require("bootstrap")
Note that in the above file I removed require("jquery"). That's because it has already been made globally available in /config/webpack/environment.js through ProvidePlugin (please refer to the code in my question), so requiring it in this file is not needed. I found this out while going through
"Option 4: Adding Javascript to environment.js" in http://blog.blackninjadojo.com/ruby/rails/2019/03/01/webpack-webpacker-and-modules-oh-my-how-to-add-javascript-to-ruby-on-rails.html
app/views/welcome/index.html.haml
- first_name = 'Jignesh'
- last_name = 'Gohel'
= hidden_field_tag('name', nil, "data": { firstName: first_name, lastName: last_name }.to_json)
Note: The idea for "data" attribute got from https://github.com/rails/webpacker/blob/master/docs/props.md
app/javascript/custom/welcome_page.js
import { greet } from './hello'

function nameField() {
  return $('#name')
}

function greetUser() {
  var nameData = nameField().attr('data')
  //console.log(nameData)
  //console.log(typeof(nameData))
  var nameJson = $.parseJSON(nameData)
  var name = nameJson.firstName + nameJson.lastName
  greet(name)
}

export { greetUser }
app/javascript/packs/welcome.js
import { greetUser } from '../custom/welcome_page'
greetUser()
Note: The idea for a separate pack I found while going through https://blog.capsens.eu/how-to-write-javascript-in-rails-6-webpacker-yarn-and-sprockets-cdf990387463
under section "Do not try to use Webpack as you would use Sprockets!" (quoting the paragraph for quick view)
So how would you make a button trigger a JS action? From a pack, you add a behavior to an HTML element. You can do that using vanilla JS, JQuery, StimulusJS, you name it.
Also the information in https://prathamesh.tech/2019/09/24/mastering-packs-in-webpacker/ helped in guiding me to solve my problem.
Then I updated app/views/welcome/index.html.haml by adding the following at the bottom:
= javascript_pack_tag("welcome")
Finally, I reloaded the page, Webpacker compiled all the packs, and I could see the greeting in the console with the name from the view.
I hope this helps someone having a similar need like mine.

How can I reload an ES6 module at runtime?

Prior to ES6 modules, it was (I'm told by other Stack answers) easy to force a JS script to be reloaded, by deleting its require cache:
delete require.cache[require.resolve('./mymodule.js')]
However, I can't find an equivalent for ES6 modules loaded via import.
That might be enough to make this question clear, but just in case, here's a simplified version of the code. What I have is a node server running something like:
-- look.mjs --
var look = function(user) { console.log(user + " looks arond.") }
export { look };
-- parser.mjs --
import { look } from './look.mjs';
function parse(user, str) {
  if (str == "look") return look(user);
}
What I want is to be able to manually change the look.mjs file (e.g. to fix a misspelled word) and trigger a function that causes look.mjs to be re-imported at runtime, such that parse() uses the new code without having to restart the node server.
I tried changing to dynamic import, like this:
-- parser.mjs --
function parse(user, str) {
  if (str == "look") {
    import('./look.mjs').then(m => m.look(user))
  }
}
This doesn't work either. (I mean, it does, but it doesn't reload look.mjs each time it's called, just on the first time) And I'd prefer to keep using static imports if possible.
Also, in case this is not clear, this is all server side. I'm not trying to pass a new module to the client, just get one node module to reload another node module.
I don't know the reason behind doing this, but I think it is not safe to change the contents of modules at runtime; it can cause unexpected behavior, and this is one of the reasons Deno was created.
If you want to evaluate some code at runtime, you can use something like the vm module:
https://nodejs.org/dist/latest-v16.x/docs/api/vm.html
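For illustration, here is a minimal sketch of that idea, assuming the reloadable code is kept as a plain script (the file name look.js and the export shape are illustrative, not from the question):

// reload-look.js: re-evaluate a file at runtime with Node's vm module
const fs = require('fs');
const vm = require('vm');

function loadLook() {
  const code = fs.readFileSync('./look.js', 'utf8'); // read the latest contents from disk
  const sandbox = { exports: {}, console };          // minimal globals for the script
  vm.runInNewContext(code, sandbox);                 // evaluate it fresh on every call
  return sandbox.exports;
}

// look.js would assign its functions to exports, e.g.:
//   exports.look = function(user) { console.log(user + " looks around.") }
const { look } = loadLook();
look('alice');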
You could try using nodemon to dynamically refresh when you make code changes
https://www.npmjs.com/package/nodemon
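For reference, typical usage during development is just (assuming the entry file is called server.mjs; the file name is illustrative):
nodemon server.mjs
nodemon watches the project files and restarts the Node process whenever one changes, so edits to look.mjs are picked up, at the cost of a full restart rather than a true in-process reload.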
I agree with @tarek-salem that it's better to use the vm library, but there is another way to solve your problem.
There is no way to clear the dynamic import cache which you use in the question (by the way, there is a way to clear the common import cache, because require and the common import share the same cache, while dynamic import has its own cache). But you can use require instead of dynamic import. To do that, first create require in parser.mjs:
import Module from "module";
const require = Module.createRequire(import.meta.url);
Then you have 2 options:
Easier: convert look.mjs into CommonJS format (rename it to look.cjs and use module.exports).
If you want to make it possible to both import AND require look, you should create an npm package with a package.json like this:
{
  "main": "./look.cjs",
  "type": "commonjs"
}
In this case, in parser.mjs you will be able to use require('look'), and in other files import('look') or import * as look from 'look'.
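Putting those pieces together, a minimal sketch of a reloading parser.mjs could look like this. It assumes look has been converted to CommonJS as look.cjs (the easier option above); the cache-clearing line is the same trick from the question, now applied through the created require:

// parser.mjs: reload look.cjs from disk on every call
import Module from "module";
const require = Module.createRequire(import.meta.url);

function parse(user, str) {
  if (str === "look") {
    delete require.cache[require.resolve("./look.cjs")]; // drop the cached copy
    const { look } = require("./look.cjs");              // re-evaluates the file
    return look(user);
  }
}

export { parse };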

Workaround to dynamic includes in Pug/Jade

I understand that Pug does not support dynamic includes or extends in templates. That is,
extend path/to/template
works, but not
extend #{dynamic_path_to_template}
Is there a workaround (however convoluted) that will allow the same goal of modifying the template used by a view at runtime?
Context: My use case is that I am developing an npm module, and the template being used to extend other views is located inside the module. After the module is published and installed, the path will be defined (i.e. node_modules/my_module/path/to/template), but during the development phase I need to be able to just "npm link" to the module and have the templates work. I also would prefer not to hard-code the links, so I can publish the same code as tested.
I had this issue as well and found this question while searching for a solution. My solution is similar to Nikolay Schamberg's answer, but I thought I should share it.
I've created a function that renders templates when given a path and passed it in via the options object. Maybe it helps in your case as well:
const includeFunc = (pathToPug, options = {}) => {
  return pug.renderFile(pathToPug, options); // render the pug file
}

const html = pug.renderFile('template.pug', {include: includeFunc});
and then use it as follows in your template:
body
  h1 Hello World
  |!{include(dynamicPugFilePathFromVariable)}
There is no way to do this for now, but you can organise your application architecture without dynamic extends.
Possible solution #1
Make a layout.jade that conditionally includes multiple layouts:
layout.jade:
if conditionalVariable
  include firstLayout.jade
else
  include otherLayout
In your view, extend layout.jade, and define conditionalVariable in the controller (true/false):
view.jade:
extends layout

block content
  p here goes my content!
Possible solution #2
Pass configurations to the layout
- var lang = req.getLocale();
doctype html
block modifyLayout
Split the project into multiple entry points; each entry point extends the layout, passes its own configs, and includes different things in different blocks:
extends ../layout

block modifyLayout
  - var lang = "en" //force language to be en in this page.
block body
  include my-page-body
Possible solution #3
Use something like terraform, which uses Pug as its rendering engine but enables you to use dynamic partials like this:
!= partial(dynamicFileFromVariable)
It works!
First, set up the res.locals middleware.
middlewares/setResLocals.js
const pug = require('pug')
const path = require('path')

module.exports = function(req, res, next) {
  res.locals.include = (pathToPug, options = {}) => { // imitates Pug's include function
    return pug.renderFile(pathToPug, options); // render the pug file
  }
  res.locals.__PATH__ = path.join(__dirname, '../')
  next()
}
server/index.js
app.use(require('../middlewares/setResLocals'))
file.pug
|!{include(`${__PATH__}/${something}`)}
In order to do a dynamic include, you will have to use unescaped string interpolation, inserting Pug content that is pre-compiled before your main .pug file is compiled inside your route. In other words, it works as follows:
1) Some .pug files are pre-compiled into HTML
2) The HTML gets fed into another .pug file compilation process
Here's an example of how to do it.
Inside your router file (routes.js or whatever)
var pug = require('pug')
var path = require('path')
var fs = require('fs')

var html = []
var files = ['file1', 'file2'] // file names in your views folders
let dir = path.resolve(path.dirname(require.main.filename) + `/app/server/views/`)
// dir is the folder with your templates

app.get('/some-route', (req, res) => {
    for (let n = 0; n < files.length; n++) {
        let file = path.resolve(dir + '/' + files[n] + `.pug`)
        fs.access(file, fs.constants.F_OK, (err) => {
            if (!err) {
                html.push(pug.renderFile(file, data)) // data holds the locals passed to the templates
                if (n === files.length - 1) {
                    res.render('dashboard', {html})
                }
            }
            else {
                res.status(500).json({code: 500, status: "error", error: "system-error"})
            }
        })
    }
})
Inside your desired .pug file:
for item in html
  .
    !{item}
The example above is specific to my own use case, but it should be easy enough to adapt it.
I know this is a bit late for an answer, but I found a possibility suitable for my purpose in this bit of information from the Pug docs:
If the path is absolute (e.g., include /root.pug), it is resolved by
prepending options.basedir. Otherwise, paths are resolved relative to
the current file being compiled.
So, I provide most of my Pug modules via relative paths, and the stuff I want to exchange dynamically is organised in Pug files of the same name but in different folders (think: theme), which I include by absolute paths. Then I change the basedir option to dynamically choose a set of Pug files (like choosing the theme).
May this help others, too.
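A minimal sketch of that idea, with an illustrative themes/ folder layout that is not part of the original answer:

// render.js: switch a "theme" by switching the basedir option
const pug = require('pug');

// views/page.pug contains, for example:  include /partials/header
// The leading slash makes Pug resolve the include against basedir.
const selectedTheme = 'dark'; // hypothetical theme name
const html = pug.renderFile('views/page.pug', {
  basedir: `themes/${selectedTheme}`, // so themes/dark/partials/header.pug gets included
});
console.log(html);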

Using both coffeescript and typescript on the same project

I'm actually trying to work with both CoffeeScript and TypeScript in the same project.
In fact, I want to be able to choose which one I prefer when coding.
The problem is that the JavaScript generated by TypeScript doesn't seem to work as expected with the JavaScript generated by CoffeeScript.
Explanation:
I wrote a Controller class in CoffeeScript which works perfectly when I extend it in a CoffeeScript file like below:
Controller = require('../node_modules/Controller/Controller')

class HelloController extends Controller
  indexAction: (name) =>
    console.log 'hey ' + name

module.exports = HelloController
But when I try to use it with TypeScript like below:
import Controller = require('../node_modules/Controller/Controller');

export class HelloController extends Controller {
    constructor() {
        super()
    }

    indexAction(name: String) {
        console.log('hey ' + name);
    }
}
I get an error telling me that the Controller can't be found at the expected place (the .js file is generated correctly).
Can you help me?
If you want to do this, you'll need to supply type information about the Coffeescript-generated JavaScript file.
If you add a Controller.d.ts you can describe the types in your controller file so TypeScript can apply that type information during compilation.
For example:
declare class Controller {
    protected name: string;
    //... more type information
}

export = Controller;
Of course, you are essentially undertaking far more work writing JavaScript or Coffeescript and then also writing type information in another file - so you may want to make a decision on a per-unit basis about what you are going to write the program in. For example, if you write a tool-kit in Coffeescript, it is easy to write a single .d.ts file for it - whereas if you write a file here and there in Coffeescript you're going to have a bit of a maintenance nightmare (either creating lots of .d.ts files or managing a single merged one each time you change one of the parts).
Definition files work best against a stable API.

How do I share common typescript code from one VS2013 project into a separate VS2013 project

I have Visual Studio 2013, TypeScript 1.3, and the Module System set to AMD (RequireJS 2.1.15 pulled down via NuGet, type definitions for RequireJS 2.1.8).
I want to have a Web Application, let's call it MyWebApplication1, that has a simple HTML page which includes a JavaScript file generated from my TypeScript, e.g. MyTypescript.ts.
MyHtml.html:
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
    <script type="text/javascript" src="Scripts/require.js" data-main="Scripts/requireConfig.js"></script>
</head>
<body>
    <div id="result">DEFAULT</div>
</body>
</html>
requireConfig.ts:
require.config({
    baseUrl: 'Scripts',
});

require(['../Scripts/MyTypescript'],
    function (MyTypescript) {
        MyTypescript.MyFunction("Mr Anderson");
    }
);
MyTypescript.ts:
import MyCommonResource = require("MyCommonResource");

export = MyApp;

module MyApp {
    export function MyFunction(myParameter: string) {
        var aString: string = (new Date()).toTimeString();
        var myCommonThing: MyCommonResource.CommonClass = new MyCommonResource.CommonClass();
        aString = myCommonThing.CommonMethod();
        alert("HELLO " + myParameter + " # " + aString);
    }
}
MyCommonResource.ts:
export = MyCommon;

module MyCommon {
    export class CommonClass {
        constructor() { }

        public CommonMethod(): string {
            return "COMMON";
        }
    }
}
This all works fine with the following folder structure in one project:
MyWebApplication1
  -Scripts
    -MyTypescript.ts
    -MyCommonResource.ts
    -requireConfig.ts
  -MyHtml.html
However, I want something like this (a separate project that may be needed by various web projects)
MyWebApplication1
  -Scripts
    -MyTypescript.ts
    -requireConfig.ts
  -MyHtml.html

MyCommonWebCode1
  -Scripts
    -MyCommonResource.ts
I don't see a good way to accomplish this, without having something like this in MyTypescript.ts:
import MyCommonResource = require("..\..\AnotherProject\Scripts\MyCommonResource");
This won't work for deployment under IIS, because the generated JS define would look like this:
define(["require", "exports", "..\..\AnotherProject\Scripts\MyCommonResource"], function (require, exports, MyCommonResource) {
Is there a good solution for this situation?
I have considered using post-build events and a local NuGet server... there just doesn't seem to be a nice way of handling common bits of code across different web apps when using TypeScript and Visual Studio, though.
Thanks in advance.
You have already mentioned what I think is the right option.
If you have code you want to share across multiple projects, a package manager is the perfect way to share it. Not only does it aid the distribution of the shared code, it also helps with versioning.
There are some neat extensions that make NuGet package creation really easy - I am using NuGet Packager.
You can use the free edition of Pro Get to get started - it is a really neat NuGet server.
You can also use your local NuGet server as a proxy to keep versions of other libraries consistent, keep a list of "team-approved" packages and to ensure you have checked licenses and kept to license conditions (rather than having to check if anyone has added a new package).
If you prefer a different package manager (e.g. npm) you can pretty much follow the same pattern, as the features are very similar.
