I have a React app that loads data from a local JSON file using
import data from './data.json'
It all works fine: I can open the app on localhost:3000 and also open index.html as a plain HTML page in the browser.
Now I run npm run build, which creates the build directory. However, my JSON values become static, since they are hardcoded into the JavaScript in the build. So the question is: how can the code in the build directory read JSON files from a specific location dynamically?
My question: why not use fetch and serve the JSON from a server-side API?
To partially answer your question:
Without changing any webpack configuration, you can use the import() function instead of a static import, and a chunk with the JSON content will be built into a JS file.
async function fn() {
  // dynamic import() resolves to a module namespace object
  const module = await import("./foo.json")
  // the parsed JSON lives on the module's default export
  document.title = module.default.bar
}
On the other hand, webpack can probably be configured to emit this output as JSON, but for that you'd need to npm run eject or use a tool that overrides the webpack production config.
Apart from the other alternatives, what you're looking for in vanilla JavaScript is called the Fetch API. It can read from either local or remote URLs via the fetch method.
As per the example you provided above, instead of doing this:
import data from './data.json'
you can use it like this:
fetch('./data.json')
It also works the same way with any URL:
// Sample working URL example to mock some real data
fetch('https://jsonmock.hackerrank.com/api/football_competitions?year=2015')
And the best part is that the argument the fetch method accepts can be modified easily, since it takes both a local file path and a URL in exactly the same way:
let baseURL = 'https://jsonmock.hackerrank.com/api',
    endpointToCall = 'football_competitions',
    year = '2015',
    url;
url = `${baseURL}/${endpointToCall}?year=${year}`;
fetch(url);
Note: with the last example above, my point is to break the same API endpoint used in the previous example into dynamic variables, to make it clearer. Please let me know if it isn't and you need more clarification.
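One point the examples above gloss over: fetch resolves to a Response object, not the parsed data, so you still need to call .json() on it. A minimal sketch using the same mock endpoint (the error handling is my own addition):

```javascript
// Build the request URL from its parts, then fetch and parse the JSON body.
const baseURL = 'https://jsonmock.hackerrank.com/api';
const endpointToCall = 'football_competitions';
const year = '2015';
const url = `${baseURL}/${endpointToCall}?year=${year}`;

async function loadCompetitions() {
  const response = await fetch(url);
  if (!response.ok) throw new Error(`Request failed: ${response.status}`);
  return response.json(); // parses the body and resolves to a plain object
}
```

The same pattern works for a local file such as fetch('./data.json'), as long as the file is served by the web server rather than bundled into the build.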
What you can do is, before you run npm run build, make a request to your server to get the data.json file, and only run npm run build once it has loaded. You can write a simple script for it.
For example:
#!/bin/bash
# Get the file from the server
curl https://yourServer/data.zip -o data.zip
# Unzip the file
unzip data.zip
# Move the file to the desired directory
mv data.json /yourApp/data/data.json
# Navigate to the directory where the npm package is
cd /yourApp/
# This one is optional but you should run a test to see if the app won't crash with the new json data that you fetched
# Run tests
npm test
# Run the build command for React
npm run build
You can modify this script with your paths and it should work.
Summary
Get the JSON data with curl
Unzip it
Move it to your react app where data.json is and replace it
Run the tests (optional)
Run the build
You're done.
Hope this helps.
Related
I need to compute and cache new data at build time that is required by both the frontend and backend API routes. Is there any way to read the static properties generated at build time by API routes? The API routes are publicly accessible, so I can't just pass the data from the frontend.
All build artifacts are saved in the .next folder, which is outside the application source, so your code won't be able to access it.
If the data you're computing in getStaticProps is time-consuming, I'd save it in some cache and then read from that cache in your API routes.
UPDATE:
I lied, I played around and it's actually possible to access cached page data, but there are some caveats.
Build artifacts for given page are saved in .next/server/pages/. Exported static props are stored in a JSON file, that matches page path. E.g. static props for /contact live in .next/server/pages/contact.json.
Those files are deleted when you run yarn build, therefore you can't simply import them with
import data from '../../.next/server/pages/contact.json'
in your code, as that would break the build by attempting to import a nonexistent file.
What you could do is load this file this way:
const path = require('path');
const fs = require('fs/promises');
const cacheDirectory = path.join(process.cwd(), '.next/server/pages');
const fileContents = await fs.readFile(cacheDirectory + '/contact.json', 'utf8');
This will build just fine and work when you do yarn start. But it won't work when you do yarn dev, because that command clears the build folder. To work around it, you could check the NODE_ENV value, run this logic in production mode only, and use some mock data otherwise.
I am currently writing some E2E tests with Cypress for a Gatsby based project.
For one test in particular, I'd like to loop through all pages of my Gatsby site, and to achieve this I need a test fixture (e.g. endpoints.json) that includes an array of all URLs.
I've tried the following methods (but all have limitations):
1. Running a Node script to check the folder structure in the src/pages folder
Limitation: this doesn't account for pages generated dynamically in gatsby-node.js with GraphQL
2. Running a Node script to scrape URLs from the sitemap.xml file generated by gatsby-plugin-sitemap
Limitation: the plugin only generates a sitemap.xml file in prod builds, not in dev (Cypress runs against a dev server)
I'd be more than grateful if anyone has a suggestion for how to get a full list of Gatsby endpoints in this environment.
You might just want to generate a file in the desired format on-build using the data in GraphQL:
// gatsby-node.js
const path = require("path")
const fs = require("fs").promises
exports.onPostBuild = async ({ graphql }) => {
  const { data } = await graphql(`
    {
      pages: allSitePage {
        nodes {
          path
        }
      }
    }
  `)
  return fs.writeFile(
    path.resolve(__dirname, "all-pages.txt"),
    data.pages.nodes.map(node => node.path).join("\n")
  )
}
This creates a .txt file with each page’s path on a line. You could also just write out the data as JSON by passing in JSON.stringify(data.pages) as the second argument to writeFile, though.
There is a pretty cool new plugin for Cypress, called gatsby-cypress-endpoints, that exposes the endpoints to your fixtures and env variables. coreyward's answer is perfectly valid, but this lets you get the endpoints only when you run Cypress.
Please be aware that with more recent versions of Gatsby (particularly v3) you may have to set the env variable CI=true for the suggested solutions to work.
I believe this is because of changes to the development experience where pages & queries are processed on-demand instead of upfront.
The following flags FAST_DEV / QUERY_ON_DEMAND may be responsible for this behaviour.
I know this question could be a duplicate of many similar existing questions; however, I still want to ask precisely about the scenario I want to understand:
I am using this repo in my example and I have following script block in my package.json
I need to pass one parameter from the command line to identify the environment in which I wish to test, for example something like:
-- testEnv=staging
How can I update the following script block to accommodate this change?
I have already tried to set different configurations for world parameter like this:
--world-parameters \"{\\\"environment\\\": \\\"Dev\\\"}\"
However, it is now confusing to maintain various versions of the world-parameter configs, hence I'm looking to use the command line to pass variable values through.
"scripts": {
"test-chrome": "./node_modules/.bin/cucumber-js.cmd --tags #Run --format json:./testlab/support/reports/report.json",
}
TestCafe allows you to use environment variables, and have a config.json file to store the baseUrl:
So, you could do
export testEnv=staging
npm run test-chrome
Then read that value where the config is built (note this needs to be a JS module rather than plain JSON, since a JSON file can't reference process.env):
{
  baseUrl: process.env.testEnv
}
Or, if you want a default baseUrl, you could have a helper class that just returns const targetUrl = process.env.testEnv || config.baseUrl.
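A sketch of that fallback helper (the URLs are placeholders of mine):

```javascript
// Stand-in for the values loaded from config.json.
const config = { baseUrl: 'https://dev.example.com' };

// Use the environment's testEnv when set, otherwise the config default.
function targetUrl(env = process.env.testEnv) {
  return env || config.baseUrl;
}
```

With this in place, export testEnv=staging-url before npm run test-chrome switches the target, and an unset variable silently falls back to the configured default.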
Starting with version 1.20.0, TestCafe offers a way to specify the base URL for test pages in all interfaces:
1. CLI
2. Program API: runner.run({ baseUrl })
3. Config file
react-native: 0.59.9,
react-native-document-picker: 3.2.4,
rn-fetch-blob: 0.10.15,
In the app, I use 'react-native-document-picker' to select files on the phone and get the URIs of the files, then call RNFetchBlob.fs.cp(uri, destPath) to copy the file to a specific folder.
However, the copy may fail depending on the URI returned from 'react-native-document-picker'.
For example, when selecting the file from different directories:
if the URI returned is "content://com.android.externalstorage.documents/document/primary%3ADownload%2FCopyFile.pdf",
it works,
but if the URI returned is "content://com.android.providers.downloads.documents/document/17",
then RNFetchBlob.fs.cp causes an error: 'Attempt to invoke virtual method 'boolean java.lang.String.startsWith(java.lang.String)' on a null object reference'
I guess it fails because of the URI format with 'com.android.providers.downloads'. Is there any React Native library that can resolve the URI so that RNFetchBlob.fs can work on it?
This is a great library for file/document picking.
Run npm install react-native-file-picker@latest --save and make some changes to the following files:
android/settings.gradle
android/app/build.gradle
android/src/main/AndroidManifest.xml
MainApplication.java
and you are good to go!
In your React Native javascript code, bring in the native module:
import FilePickerManager from 'react-native-file-picker';
and go on!
Hope this helps.
I want a StrongLoop example using only JavaScript, without Angular.
There's no complete working example without Angular for now.
I want to simply include browser.bundle.js in my index.html, then sync data from/to the server side. In fact, I'm trying to replace PouchDB in my program, since CouchDB doesn't seem to have succeeded in the open source community.
I can't follow this document correctly:
Running Loopback in the browser
1. Create browser-app.js with the content from Running Loopback in the browser
2. Copy-paste the content into browser-app.js
3. npm install loopback loopback-boot
4. browserify browser-app.js -o app.bundle.js
Then I got this error: Error: Cannot find module 'loopback-boot#instructions' from '/Users/simba/Projects/traveller-app/client/node_modules/loopback-boot'
There are a few steps for this, but it's pretty simple.
Bootstrap your application via slc loopback.
Delete server/boot/root.js.
Uncomment two lines in server/server.js so it looks like:
...
// -- Mount static files here--
// All static middleware should be registered at the end, as all requests
// passing the static middleware are hitting the file system
// Example:
var path = require('path'); //this line is now uncommented
app.use(loopback.static(path.resolve(__dirname, '../client'))); //this line is now uncommented
...
Create index.html in the client dir (i.e. client/index.html) with your contents.
That should give you a basic setup with just a simple front-end working. Let me know if you have any more issues.