I'm building a large React application that involves processing lots of data, formatting it, and outputting it to tables. Occasionally these functions are stored in variables (e.g. const x = () => ...).
I'm storing the functions that do this formatting in TypeScript files which I import into my React components.
To give an example, I might write a table formatting function like this:
export const buildMainData = (data: any) => {
  // do stuff
}
I'm placing it inside a file, called functions.ts (for example).
I then have a React component which makes use of the function.
My question is - is this a bad idea? Am I creating loads of functions that are polluting the memory heap? I'm using create-react-app so I'm not sure if Webpack is doing some magic behind the scenes to prevent global variables, or whether everything I write should be placed inside of React components.
It would be great if anyone with more experience / knowledge in this area could help out. If I'm also completely getting the wrong end of the stick that would also be helpful to know. Thanks.
The variables and functions you're exporting aren't globals, they're exports from the module in which you define them. They're used via import. If you have some that aren't used, modern bundlers like Webpack and Rollup can tree-shake the bundle they create, leaving unused functions out (if there are any). More about tree-shaking in Webpack and in Rollup.js.
It's true that top-level declarations in classic scripts are globals, but top-level declarations in modules are scoped to the module (much as if the module were a function and you were declaring things inside it), and then possibly exported from it.
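For example (file and function names are the question's hypothetical ones), a top-level declaration in a module is reachable only through an explicit import, never as a global:

```javascript
// functions.ts (hypothetical) — a top-level declaration in a module
// is scoped to the module, not attached to window/globalThis:
export const buildMainData = (data) => data.map((row) => ({ ...row, formatted: true }));

// In a React component file, the ONLY way to reach buildMainData
// is an explicit import:
//
//   import { buildMainData } from './functions';
//   const rows = buildMainData(rawRows);
```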
I am new to React and I've come across two bits of code that I think do the same thing, but I am not entirely sure.
import React from 'react'
export default function TestComponent() {
return (
<div>
</div>
)
}
as well as
import React from 'react'
function TestComponent() {
return (
<div>
</div>
)
}
export default TestComponent
The only difference that I can see between the two is that the export is just put at the end in one of them but they should both behave the same if my understanding is correct. Does this have any real effect on the behaviour of the code?
As several in the comments have already said, those two code blocks are practically equivalent (i.e. they are the same). In theory you could add code to make them different (e.g. you could re-assign the function before exporting it in the second example) ... but it would be pretty contrived.
Does this have any real effect on the behaviour of the code?
No. What matters is just that you give your exported functions a name at all (the Create React App lint rules will even flag it if you don't, because anonymous functions can make debugging harder).
It's the same thing in your case because you are creating your React component with a 'classic' JavaScript function.
I personally use the first form to save a line, but the second form is useful when your React.js component is written as an arrow function, like this:
const YourComponent = () => {};
export default YourComponent;
In this case, you must use this line at the end of your file :)
Both code blocks are functionally the same but I personally favor the second.
Both use a function declaration but the second one permits you to swap the declaration out for a function expression or class declaration without modifying the export.
Also, in cases where you can have multiple exports from a single file, it's a useful convention to declare the default export at the bottom of the file.
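As a sketch of that convention (file and names are hypothetical), the named exports sit above and the default export reads clearly at the bottom:

```javascript
// Table.js (hypothetical) — multiple exports, default declared last
function formatHeader(text) {
  return String(text).toUpperCase();
}

function Table() {
  return null; // placeholder for the component body
}

export { formatHeader };
export default Table;
```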
All in all, that's not a matter for React or components, but for coding in general and JavaScript modules especially.
It might be a matter of scalability on the dev side, i.e. how you yourself will be able to manage your code as it grows. If you have to add and maintain more functions / classes / variables, it's a benefit to separate the place in code where you define the functions / classes / variables from the place where you declare which of them to export, and from the place where you declare your default export. You can only have one default export but many non-default exports; imagine you decide to make a different export the default, label the new one as "default", and forget to "de-label" the old one. Collecting the export declarations at the end of the file gives you a helpful overview and makes such mistakes easy to spot.
A question of personal style might be whether you want to use "export" directly where a function / class / variable is defined, so as to see immediately which functions / classes / variables are "public" and which are "private" (i.e. not exported).
If your code grows into something requiring some kind of an API, you might use the "export ... as" option, e.g. maintaining complicated "speaking" function names inside your code but exposing the functionality under "simple" names to the module's consumers. This is obviously easier if the export declarations are separated from the definitions of the functions themselves.
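A sketch of that "export ... as" idea (all names are made up): a long descriptive name internally, a simple public name for consumers:

```javascript
// formatters.js (hypothetical)
function buildFormattedCurrencyCellFromRawValue(raw) {
  return '$' + Number(raw).toFixed(2);
}

// consumers see only the simple name:
export { buildFormattedCurrencyCellFromRawValue as formatCurrency };

// elsewhere:
//   import { formatCurrency } from './formatters';
//   formatCurrency(3.5); // '$3.50'
```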
In general, for your own sake, be as explicit as possible; here that means separate "export" statements. Trying to write short and clever code leads to more complexity than myriads of "stupid simple" lines. React and other soft- and hardware are not impressed by how cleverly you may have code-golfed; very rarely will something be faster or slower, premature optimization should not be part of development, and attempts to generalize should be well dosed.
For JavaScript "ES6" modules used by React for components, the 2015 introduction https://hacks.mozilla.org/2015/08/es6-in-depth-modules/ is still the best reference and surely a must-read.
Both are technically the same; the only difference is that you add an extra line to export your functional component in the second snippet. I generally prefer the second because it keeps my code cleaner compared to the first.
Today I realized that I'm missing some important theory in JS.
In our app written in TypeScript we have tons of constants (objects) declared within their own modules that are imported and used elsewhere in the app. Those constants are generally quite big settings objects that have tons of properties (as well as some functions as properties).
As our app grows, I've started to worry about performance and optimization. The idea that came into my head was to wrap each of those constants in a function that returns the constant, so that it's only loaded into memory once that function is called.
I'm not 100% sure that this is how it will work. Specifically, if I declare a function that defines a constant and returns it, what exactly will be in memory when I import that function? How will it be different from when I define a constant and import it straight away? Does it make sense to wrap such constants into functions?
At the moment I assume that once I define a constant within a module scope it's loaded into memory once it's imported. Is it correct?
Please let me know if there's some better approach to handle this.
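For reference, here is a sketch of the lazy-initialization idea the question describes (names are hypothetical). Note that a module's top-level code runs once at first import either way; wrapping the constant in a function defers only the construction of the object, not the loading of the module:

```javascript
// settings.js (hypothetical) — instead of constructing the big object
// at module load time:
//   export const GRID_SETTINGS = { /* hundreds of properties */ };
// ...construct it lazily on first call and cache the result:

let cached = null;

function getGridSettings() { // in the real module: export function getGridSettings()
  if (cached === null) {
    cached = {
      locale: 'en',
      pageSize: 200,
      formatCell: (value) => String(value).trim(),
    };
  }
  return cached; // every caller receives the same cached object
}
```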
In implementing the Python module mechanism on top of ES6 modules for the Transcrypt Python to JavaScript compiler, I am faced with the following problem:
There are a large number of standard functions imported from the Python runtime module, e.g. the Python input function (implemented in JS), which can be made available using named imports (since they shouldn't have to be prefixed with anything in the user code, so input rather than __runtime__.input, to be consistent with Python).
In Python it's allowed to rebind named imports. So I define another function input, which will override the one from the runtime. But if I do so in JS, I get an error:
Identifier 'input' has already been declared
It seems that all imported names are regarded as JS consts, so non-rebindable according to this article. I can think of several clever workarounds, like importing under an alias and then assigning to a module global var rather than const, but like to keep things simple, so my question is:
Am I right that JS named imports are consts, so non-rebindable (and if so, just curious, anyone knows WHY)? Where can I find details on this?
Is there a simple way to circumvent that and still put them in the global namespace of the importing module, but override them at will?
As per the language specification, imported bindings are immutable bindings, so they cannot be changed. The identifiers are reserved as the module gets parsed because of how ES6 modules work: Unlike in Python, imports are not statements that are included as they are executed; instead, all a module’s imports are basically collected during the early compilation and then resolved before the module starts executing.
This makes ES6 modules kind of unsuitable as an implementation for Python’s import system.
As a general way to avoid losing those names, you can simply give the imported bindings different names. For example, a from foo import bar, baz may be compiled to the following:
import { bar as _foo__bar, baz as _foo__baz } from 'foo';
let bar = _foo__bar;
let baz = _foo__baz;
That will only reserve some special names while keeping the bar and baz identifiers mutable.
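With that prologue in place, user code can rebind the name freely. A minimal sketch (the imported binding is faked here with a local function, since the runtime module isn't available):

```javascript
// generated prologue (sketch):
//   import { input as _runtime__input } from 'foo';
//   let input = _runtime__input;
let input = function () { return 'runtime input'; }; // stands in for _runtime__input

// later user code may redefine it, as Python allows:
input = function () { return 'user input'; }; // no "already declared" error
```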
Another way, which would probably also help you to solve possible import semantics differences would be to simply create a closure:
import { bar, baz } from 'foo';
(function (bar, baz) {
// …
})(bar, baz);
Or even add some other lookup mechanism in between.
Btw. Python’s import is very similar to Node’s require, so it might be worth looking into all those solutions that made Node’s module system work in the browser.
Disclaimer: I’m a Node.js newbie.
There’s a number of class-based languages in which you can/must use namespaces to organize your code, for example: Java, PHP, ActionScript 3… For a number of those languages, if you choose/have to use namespaces, there’s generally a set of common practices and conventions that govern project organization then:
Classes form the basic code units, and responsibilities are spread across multiple classes.
The class file hierarchy reside in a single top-level directory (most of the time: src/ or lib/).
Each source file contains a single class definition and nothing else.
Each class resides at a specific level of a namespace (or package) hierarchy, which mirrors the filesystem; for example:
in Java: class com.badlogic.gdx.Application would be found in the src/com/badlogic/gdx/Application.java file
in PHP (with PSR-0): class Symfony\Component\HttpKernel\Kernel would be found in the src/Symfony/Component/HttpKernel/Kernel.php file
Foreign class symbols can be imported into the current scope via a specific statement:
in Java: import com.badlogic.gdx.Application;
in PHP: use Symfony\Component\HttpKernel\Kernel;
I’m used to this type of project organization, but I do realize that it’s specific to class/namespace-based languages and that it might not match JavaScript/Node.js’ usual idioms. If I understand the concept of Node.js modules correctly, it’s 1 source file = 1 module, but from what I’ve seen in a lot of NPM packages, a module usually export more than one symbol, and more often than not those exports are functions and not classes/constructors, so it’s pretty different from the conventions described above.
So, I have the following questions:
In JavaScript/Node.js, is it relevant at all to think about distribution of responsibilities in terms of «classes only» (using either the traditional constructor + prototype composition method or the new class shorthand)?
Is the type of project organization described above possible at all in the context of a Node.js project?
In JavaScript/Node.js, is it relevant at all to think about distribution of responsibilities in terms of «classes only» (or «prototypes only» for that matter)?
In JavaScript it's a choice rather than a mandate. You can go full OOP, even file-structure-wise, or just write modules as pure functions. I'd advise you to stick to whichever structure is easiest for others who may want to understand your code to follow. For example, the OOP style:
Let namespace be the path under src
/src/org/xml/XMLDocument.js
and have a class very similar to the popular OOP languages:
// imports
const fs = require('fs');
const XMLNode = require('./XMLNode');
// class def
class XMLDocument extends XMLNode {
// constructor
constructor(filePath){
...
}
// property getter
get filePath(){
...
}
// method (no `function` keyword inside a class body)
getElementsByName(name){
...
}
}
// export class to outer world
module.exports = XMLDocument;
Use the class
// import
const XMLDocument = require('./org/xml/XMLDocument');
// create an instance
const doc = new XMLDocument('./mydoc.xml');
So yes, following an OOP structure is relevant when you tackle the problem the OOP way. And there are alternate ways as well.
Another "creator" oriented custom style:
function createXMLDocument(filePath){
const doc = {};
doc._type = "XMLDocument";
... // make the object have XMLDocument features
return doc;
}
function createDXMLDocument(filePath){
const doc = createXMLDocument(filePath);
doc._type = "DXMLDocument";
... // modify parent object with DXML features
return doc;
}
You see, there are patterns a developer adheres to, and all project code is then written in that style.
Is the type of project organization described above possible at all in the context of a Node.js project?
A Node.js project can have any kind of code organisation because of certain features:
The JavaScript module system is nothing but referencing a JS file present somewhere in the file system, so there are no special restrictions on file placement. There are also modules that are built in, or that can be installed via npm.
Module exports can export one or multiple "things" to the external world, so there is a lot of flexibility here as well.
JavaScript itself can easily be written in many styles: functional, OOP, procedural, etc. It lets the developer reshape much of the language's own nature and hence "mimic" many programming styles.
In JavaScript/Node.js, is it relevant at all to think about distribution of responsibilities in terms of «classes only» (or «prototypes only» for that matter)?
To be honest I don't really understand this question. You should follow OOP principles if you use classes, but if you do not, you still need to find cohesion between your functions and organize them in modules and folders based on that.
Is the type of code organization described above usual or relevant at all in the context of a Node.js project, and is it technically implementable without too much trouble?
JavaScript modules don't have namespaces, which makes things a bit easier (remember that C# and C++ projects usually have a folder structure totally different from their namespaces). Use folders as namespaces and you'll be fine. There is no rule that you can only have one class per source file; I usually start writing classes and functions in a single file and reorganize into multiple files when the file grows big. JavaScript's module system is very flexible, so you can organize the code literally any way you want.
If not, what are the traditional ways of handling repartition of responsibilities and code reuse in a Node.js project?
The same as anywhere else.
This may be just me lacking a 'bigger picture' so to speak, but I'm having trouble understanding why exporting modules is needed to just split up files.
I tried doing something like this:
//server.js
var app = require('koa')();
var othermodule1 = require('othermodule1')();
var othermodule2 = require('othermodule2')();
var router = require('./config/routes')();
app.use(router.routes());
//routes.js
module.exports = require('koa-router')()
.get('*', function*(next){
othermodule1.something;
})
realizing that routes.js does not have access to 'othermodule1' after calling it from server.js. I know that there's a way to pass needed variables during the require call, but I have a lot more than just 2 modules that I would need to pass. So from my probably naive perspective, this seems somewhat unnecessarily cumbersome. Could someone enlighten me, or is there actually a way to do this that I missed?
Each node.js module is meant to be a stand-alone sharable unit. It includes everything that it needs to do its job. That's the principle behind modules.
This principle makes for a little more overhead at the start of each module to require() in everything you need in that module, but it's only done once at the server startup and all modules are cached anyway so it isn't generally a meaningful performance issue.
You can make things global by assigning to the global object, but that often breaks modularity and definitely goes against the design spirit of independently sharable modules.
In your specific code, if routes needs access to othermodule1, then it should just require() it in as needed. That's how modules work. routes should just include the things it needs. Modules are cached so requiring it many times just gives every require() the same module handle from a cache.
This is an adjustment in thinking from other systems, but once you get used to it, you just do it and it's no big deal. Either require() in what you need (the plain sharable-module method), pass something into a module via its constructor (the push method), create init() methods so someone can initialize you properly, or call some other module to get the things you need (the pull method).