So I'm trying to build a weather app using data from a weather API.
import fetch from 'node-fetch';

// fetch weather API
let weather;
let getWeather = async () => {
  let url = `https://api.openweathermap.org/data/2.5/weather?q=auckland&appid=c156947e2c7f0ccb0e2a20fde1d2c577`;
  try {
    let res = await fetch(url);
    weather = await res.json();
  } catch (error) {
    console.log("error");
  }
  let weatherMain = weather.weather.map((el) => el.description);
  if (weatherMain = "Rain") {
    console.log(weatherMain);
    // weatherImg = "https://icon-library.com/images/raining-icon/raining-icon-1.jpg"
  }
};
console.log(getWeather());
My problem is that I'm getting this error when running it in VS Code:
SyntaxError: Cannot use import statement outside a module
and this error when running in browser:
Uncaught TypeError: Failed to resolve module specifier "node-fetch". Relative references must start with either "/", "./", or "../".
Not sure what exactly is going on. Can someone please explain what's happening?
I've tried the fetch API once before and that time I didn't need to import fetch, so I'm pretty confused.
Edit: Understood now. Running in the browser and running in Node.js from VS Code are two different things; what works in the browser won't necessarily work in Node.js.
When running in the browser, there's no need to import fetch.
Thanks everyone.
let weather;

let getWeather = async () => {
  let url = `https://api.openweathermap.org/data/2.5/weather?q=auckland&appid=c156947e2c7f0ccb0e2a20fde1d2c577`;
  try {
    let res = await fetch(url);
    weather = await res.json();
    console.log('weather', weather);
  } catch (error) {
    console.log(error);
  }
  let weatherMain = weather.weather.map((el) => el.description);
  if (weatherMain.includes('Rain')) {
    console.log('weatherMain', weatherMain);
    let weatherImg = 'https://icon-library.com/images/raining-icon/raining-icon-1.jpg';
    return weatherImg;
  }
};

const main = async () => {
  const data = await getWeather();
  console.log('data', data);
};

main();
Yes, you are right that there's no need to import fetch if you are running the JS in the browser. But I see that you are importing node-fetch; this package brings the fetch API (window.fetch) to Node.
But if you want to run it in Node, then you should know that Node doesn't support ES modules out of the box. You can use the experimental flag to run the code, e.g.
node --experimental-modules app.mjs
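Alternatively, if you stay on CommonJS in Node rather than using ES modules, you can require node-fetch instead of importing it. A minimal sketch, assuming node-fetch v2.x (v3 and later are ESM-only) and a placeholder API key:
// CommonJS version for Node: works with node-fetch v2.x (v3+ is ESM-only)
const fetch = require('node-fetch');

const getWeather = async () => {
  // replace YOUR_API_KEY with your own OpenWeatherMap key
  const url = 'https://api.openweathermap.org/data/2.5/weather?q=auckland&appid=YOUR_API_KEY';
  const res = await fetch(url);
  const weather = await res.json();
  // weather.weather is an array of condition objects with main/description fields
  return weather.weather.map((el) => el.description);
};

getWeather().then((descriptions) => console.log(descriptions));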
Related
I have managed to use Fleek to update IPFS via plain JavaScript. I am now trying to add this functionality to a clean install of a SvelteKit app. I think I am having trouble with the syntax around imports, but I am not sure what I am doing wrong. When I click the button in index.svelte I get the following error:
Uncaught ReferenceError: require is not defined
uploadIPFS upload.js:3
listen index.mjs:412..........(I truncated the error here)
A few thoughts
I am wondering if it works in my plain JavaScript version because it is being called in Node (running on the server), but is running on the client in Svelte?
More Details
The index.svelte file looks like this
<script>
  import { uploadIPFS } from '../IPFS/upload';
</script>

<button on:click={uploadIPFS}>
  upload to ipfs
</button>
The upload.js file looks like this:
export const uploadIPFS = () => {
  const fleek = require('@fleekhq/fleek-storage-js');

  const apiKey = 'cZsQh9XV5+6Nd1+Bou4OuA==';
  const apiSecret = '';
  const data = 'pauls test load';

  const testFunctionUpload = async (data) => {
    const date = new Date();
    const timestamp = date.getTime();
    const input = {
      apiKey,
      apiSecret,
      key: `file-${timestamp}`,
      data
    };

    try {
      const result = await fleek.upload(input);
      console.log(result);
    } catch (e) {
      console.log('error', e);
    }
  };

  testFunctionUpload(data);
};
I have also tried using the other import syntax and when I do I get the following error
500
global is not defined....
The import with the other syntax is:
import fleekStorage from '@fleekhq/fleek-storage-js';

function uploadIPFS() {
  console.log('fleekStorage', fleekStorage);
}

export default uploadIPFS;
*I erased the API secret in the code above. In the future I will store these in a .env file.
Even more details (if you need them)
The file below updates IPFS and runs via the command
npm run upload
That file is below. For the version I used in Svelte, I simplified it by removing all the file management and just loading a variable instead of a file (as in the upload.js example above).
const fs = require('fs');
const path = require('path');
const fleek = require('@fleekhq/fleek-storage-js');
require('dotenv').config();

const apiKey = process.env.FLEEK_API_KEY;
const apiSecret = process.env.FLEEK_API_SECRET;

const testFunctionUpload = async (data) => {
  const date = new Date();
  const timestamp = date.getTime();
  const input = {
    apiKey,
    apiSecret,
    key: `file-${timestamp}`,
    data,
  };

  try {
    const result = await fleek.upload(input);
    console.log(result);
  } catch (e) {
    console.log('error', e);
  }
};

// File management not used in my Svelte version to keep it simple
const filePath = path.join(__dirname, 'README.md');
fs.readFile(filePath, (err, data) => {
  if (!err) {
    testFunctionUpload(data);
  }
});
I have a Next.js app with a Strapi API deployed on Heroku and the frontend on Firebase. I use an environment variable to fetch my API. It worked well on localhost but not with Heroku. I get the error: Only absolute URLs are supported
I tried to hardcode my API URL, but I guess this is not the best solution.
Has anyone ever run into this kind of problem?
> Build error occurred
TypeError: Only absolute URLs are supported
at getNodeRequestOptions (/Users/mac/Documents/Website/strapi/strapi-starter/strapi-starter-next-ecommerce/frontend/node_modules/next/dist/compiled/node-fetch/index.js:1:64341)
at /Users/mac/Documents/Website/strapi/strapi-starter/strapi-starter-next-ecommerce/frontend/node_modules/next/dist/compiled/node-fetch/index.js:1:65715
at new Promise (<anonymous>)
at fetch (/Users/mac/Documents/Website/strapi/strapi-starter/strapi-starter-next-ecommerce/frontend/node_modules/next/dist/compiled/node-fetch/index.js:1:65650)
at fetchAPI (/Users/mac/Documents/Website/strapi/strapi-starter/strapi-starter-next-ecommerce/frontend/.next/server/pages/books/[name].js:7040:26)
at getProducts (/Users/mac/Documents/Website/strapi/strapi-starter/strapi-starter-next-ecommerce/frontend/.next/server/pages/books/[name].js:7053:26)
at getStaticPaths (/Users/mac/Documents/Website/strapi/strapi-starter/strapi-starter-next-ecommerce/frontend/.next/server/pages/books/[name].js:6997:60)
at buildStaticPaths (/Users/mac/Documents/Website/strapi/strapi-starter/strapi-starter-next-ecommerce/frontend/node_modules/next/dist/build/utils.js:17:86)
at Object.isPageStatic (/Users/mac/Documents/Website/strapi/strapi-starter/strapi-starter-next-ecommerce/frontend/node_modules/next/dist/build/utils.js:24:555)
at execFunction (/Users/mac/Documents/Website/strapi/strapi-starter/strapi-starter-next-ecommerce/frontend/node_modules/jest-worker/build/workers/processChild.js:155:17) {
type: 'TypeError'
}
api.js
export function getStrapiURL(path) {
  return `https://intense-beyond-59367.herokuapp.com${path}`;
}

// Helper to make GET requests to Strapi
export async function fetchAPI(path) {
  const requestUrl = getStrapiURL(path);
  const response = await fetch(requestUrl);
  const data = await response.json();
  return data;
}

export async function getCategories() {
  const categories = await fetchAPI("/categories");
  return categories;
}

export async function getCategory(slug) {
  const categories = await fetchAPI(`/categories?slug=${slug}`);
  return categories?.[0];
}

export async function getProducts() {
  const products = await fetchAPI("/books");
  return products;
}

export async function getProduct(name) {
  const products = await fetchAPI(`/books?name=${name}`);
  return products?.[0];
}
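For reference, the environment-variable version of getStrapiURL in a Next.js project typically looks something like the sketch below. The variable name NEXT_PUBLIC_STRAPI_API_URL and the localhost fallback are assumptions on my part, not taken from the code above:
// Sketch only: variable name and fallback are illustrative, not from the original project.
// NEXT_PUBLIC_* variables are inlined by Next.js at build time, so they must be set in the
// environment that builds the frontend (e.g. the machine or CI job running `next build`),
// not only at runtime.
export function getStrapiURL(path = "") {
  const base = process.env.NEXT_PUBLIC_STRAPI_API_URL || "http://localhost:1337";
  return `${base}${path}`;
}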
I am trying to switch my environment variables to Google Secret Manager, but I am encountering a problem. I am running an Express.js API and trying to establish a database connection. But whatever I try, it only returns Promise { <pending> } instead of waiting for the async function to finish. Any help is highly appreciated!
gcloud.js
const { SecretManagerServiceClient } = require('@google-cloud/secret-manager');
const client = new SecretManagerServiceClient();

async function getSecret(secretName) {
  const name = `projects/PROJECT-ID/secrets/${secretName}/versions/latest`;
  const [version] = await client.accessSecretVersion({
    name: name,
  });
  const payload = version.payload.data.toString();
  return payload;
}

module.exports.getSecret = getSecret;
config.js
const gcloud = require('./config/gcloud.js');

const config = {
  authSecret: gcloud.getSecret(SECRET_NAME)
};

module.exports = config;
You need to await the result of gcloud.getSecret:
// Not actually valid
const config = {
  authSecret: await gcloud.getSecret(SECRET_NAME)
};
However, top-level await isn't widely supported yet, so you'll need to build the config inside a function: NodeJS Async / Await - Build configuration file with API call
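For illustration, a minimal sketch of that function-based approach, using the gcloud.js module above; buildConfig and the secret name string are placeholders, not part of the original code:
const gcloud = require('./config/gcloud.js');

// Build the config inside an async function so the secret can be awaited.
const buildConfig = async () => {
  const authSecret = await gcloud.getSecret('my-auth-secret');
  return { authSecret };
};

// Await the built config before starting the app.
buildConfig()
  .then((config) => {
    console.log('config ready', Object.keys(config));
    // start the Express app here and pass config along
  })
  .catch((err) => {
    console.error('failed to load secrets', err);
    process.exit(1);
  });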
I am relatively new to Selenium and JavaScript. I am trying to run multiple selenium-webdriver test files sequentially. To do this I have created another JS file (testAll) in which I call all the exported test functions I have created in separate files. I am running into an issue with where I define the driver, and I feel like I'm in a bit of a catch-22. When I define the driver within the test async function itself it works fine, but when I transfer the driver definition to my testAll file to avoid multiple browser windows opening up, I receive a 'cannot read property 'get' of undefined' message. This must be referring to my driver as an undefined variable, but I can't see a way to get around this. I have included an example test.js file and my testAll.js file code below:
testAll.js:
const servicesPage = require('./servicesPage');
const organisations = require('./organisations');
const { By, Builder, until } = require('selenium-webdriver');

const allServices = async () => {
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await servicesPage(driver);
    await organisations(driver);
  } finally {
    await driver.quit();
  }
};

allServices();
test.js:
//Setup
const { By, Builder, until } = require('selenium-webdriver');
const properties = require('../test_Properties');
const authentication = require('../mainAuth');
const assert = require('assert');

const organisations = async (driver) => {
  try {
    //Execution
    await driver.get(properties.servicesUrls.orgsPage);
    await authentication(driver);
    await driver.wait(until.elementLocated(By.linkText('Request an organisation')), 7000);
    await driver.findElement(By.linkText('Request an organisation')).click();
    //Assert organisations request page and click back
    let rqstOrg = await driver.findElement(By.tagName('h1')).getText();
    assert.equal(rqstOrg, 'Request an organisation', 'Request an organisation heading does not match');
    await driver.findElement(By.className('link-back')).click();
    //Assert organisations page
    let orgTitle = await driver.getTitle();
    assert.equal(orgTitle, 'Organisations', 'Organisations title does not match');
    await driver.findElement(By.linkText('Sign out')).click();
  } catch (e) {
    throw e;
  }
};

module.exports = organisations;
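For what it's worth, a minimal sketch of the failure mode described above; the file and function names here are illustrative, not from the original project:
// exampleTest.js (illustrative only)
const { By } = require('selenium-webdriver');

// A test module that accepts the shared driver works when called from testAll.js.
const exampleTest = async (driver) => {
  await driver.get('https://example.com');
  return driver.findElement(By.tagName('h1')).getText();
};

// If the function omits the `driver` parameter (or the caller forgets to pass it),
// `driver` is undefined inside the function and the first `driver.get(...)` throws
// "cannot read property 'get' of undefined".

module.exports = exampleTest;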
An employee moved on and left me with this code that was once working to generate PDFs. I haven't had any luck trying to debug the script listed at the bottom, with breakpoints or even console.logs; is there a way to search the huge list of loaded scripts in Visual Studio?
C# error:
{System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> System.IO.IOException: The server returned an invalid or unrecognized response.
at System.Net.Http.HttpConnection.FillAsync()
Client Side error: (is this because the server never returns anything?)
ERROR Error: Uncaught (in promise): DataCloneError: Failed to execute 'postMessage' on 'Worker': TypeError: Failed to fetch could not be cloned.
Error: Failed to execute 'postMessage' on 'Worker': TypeError: Failed to fetch could not be cloned.
at MessageHandler.postMessage (pdf.js:12334)
at sendStreamRequest (pdf.js:12151)
at Object.error (pdf.js:12194)
at eval (pdf.js:8419)
at ZoneDelegate.invoke (zone.js:392)
Controller method
public async Task<IActionResult> Generate(string id)
{
    try
    {
        var stream = await _reportService.GenerateReportAsync(id);
        return new FileStreamResult(stream, "application/pdf");
    }
    catch (Exception ex)
    {
        throw;
    }
}
Service method:
public async Task<Stream> GenerateReportAsync(string id)
{
    return await Services.InvokeAsync<Stream>("./Node/generate-pdf.js", Configuration.Url, id, new { format = "A4" });
}
generate-pdf.js:
const pdf = require('html-pdf');
const puppeteer = require('puppeteer');

module.exports = async function (result, url, id, options) {
  const browser = await createBrowser();
  const page = await browser.newPage();
  const css = await browser.newPage();

  await page.goto(`${url}/reports/${id}`, {
    waitUntil: 'networkidle2'
  });
  await css.goto(`${url}/styles.bundle.css`, {
    waitUntil: 'networkidle2'
  });
  await page.waitForSelector('.report-loaded');

  let cssBody = await css.evaluate(() => `<style>${document.documentElement.innerHTML}</style>`);
  let bodyHtml = await page.evaluate(() => document.documentElement.innerHTML);
  bodyHtml = bodyHtml.replace('<link href="styles.bundle.css" rel="stylesheet">', cssBody);

  browser.close();

  pdf.create(cssBody + bodyHtml, options).toStream((error, stream) => stream.pipe(result.stream));
};

async function createBrowser() {
  return await puppeteer.launch({
    args: ['--no-sandbox', '--disable-setuid-sandbox']
  });
}
Looks like the generate-pdf.js script is using "html-pdf". This can be found on npm:
https://www.npmjs.com/package/html-pdf
And it has a github page:
https://github.com/marcbachmann/node-html-pdf
So the problem is going to be with the usage of that package, or some kind of bug in it. (well, that's an assumption on my part, I don't know this package at all and have no experience working with it)
At this point I'd try to figure out which version of that package is being used, check out the source code and try to find a hint in there.
This structure seems rather convoluted, though. Why not just generate the PDF in the client in the first place, or generate it in the C# code? The fact that it was working at some point shouldn't be an argument, as you are now noticing that it is proving difficult to maintain.
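If it does turn out to be a usage issue, one thing worth checking in generate-pdf.js is the toStream callback: it pipes stream without checking error, so if html-pdf fails, stream is undefined and nothing is written back to the C# side, which might explain the "invalid or unrecognized response" symptom. A hedged sketch of a more defensive call, keeping the same result.stream interface as the script above:
// Sketch only: same inputs as the module above (an html string, html-pdf options, and `result`).
const pdf = require('html-pdf');

function createPdfStream(html, options, result) {
  pdf.create(html, options).toStream((error, stream) => {
    if (error) {
      // Surface the failure instead of calling .pipe on an undefined stream.
      console.error('html-pdf failed:', error);
      return;
    }
    stream.pipe(result.stream);
  });
}

module.exports = createPdfStream;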