I made a quotes generator website which uses an API. When I click Generate Quote, it just gets stuck on loading.
How can I fix it?
function randomQuote() {
    quoteBtn.classList.add("loading");
    quoteBtn.innerText = "Loading Quote...";
    fetch("http://api.quotable.io/random").then(response => response.json()).then(result => {
        quoteText.innerText = result.content;
        authorName.innerText = result.author;
        quoteBtn.classList.remove("loading");
        quoteBtn.innerText = "New Quote";
        quoteBtn.addEventListener("click", randomQuote);
    });
}
I also get this error, which I do not know how to fix:
index.js:12 Mixed Content: The page at 'https://mrquoter.netlify.app/' was loaded over HTTPS, but requested an insecure resource 'http://api.quotable.io/random'. This request has been blocked; the content must be served over HTTPS.
It runs fine on my local server, but when I hosted it on netlify.app it gives this error.
Use a secure endpoint: https://api.quotable.io/random
Perhaps use async/await to make your code easier to parse.
Keep the button outside of the element you're updating.
const button = document.querySelector('button');
const quote = document.querySelector('.quote');
const endpoint = 'https://api.quotable.io/random';

button.addEventListener('click', handleClick, false);

async function handleClick() {
    const response = await fetch(endpoint);
    const data = await response.json();
    quote.textContent = data.content;
}
.quote { margin-top: 1em; }
<button>New quote</button>
<div class="quote"></div>
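If you want to keep the loading state from your original code, the handler could be extended along these lines (a sketch building on the snippet above; the try/catch/finally handling is my addition, not part of the original answer):

async function handleClick() {
    // Disable the button while a request is in flight
    button.disabled = true;
    button.textContent = 'Loading quote...';
    try {
        const response = await fetch(endpoint);
        const data = await response.json();
        quote.textContent = data.content;
    } catch {
        quote.textContent = 'Could not load a quote, please try again.';
    } finally {
        button.disabled = false;
        button.textContent = 'New quote';
    }
}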
You cannot load resources from HTTP sources on an HTTPS website, because it alters the behavior of your HTTPS website (i.e. a secure website is no longer secure), which opens it up to new attack vectors, as described here.
You simply need to change the request URL to
'https://api.quotable.io/random'
since your website uses HTTPS but you are calling the API over HTTP.
HTTPS is the secure version of HTTP.
You can find more about HTTP and HTTPS here.
How can I get content from a Google Doc in HTML format for an email body in Apps Script? Earlier I was using classic Google Sites to get the body content of the email, but classic Sites is shutting down. Do you know of any alternative? This is the code I was using to get the content:
SitesApp.getPageByUrl(spSignURL).getHtmlContent()
You can export the document as HTML. To do so, you may use the following function:
function exportAsHtml(documentId) {
    DriveApp.getRootFolder() // Makes Apps Script request the right permissions
    // Call the Drive API's export endpoint, asking for text/html
    const result = UrlFetchApp.fetch(`https://www.googleapis.com/drive/v3/files/${documentId}/export?mimeType=text%2Fhtml`, {
        headers: {
            "Authorization": `Bearer ${ScriptApp.getOAuthToken()}`
        },
        muteHttpExceptions: true, // don't throw on HTTP errors so we can inspect them
    })
    const content = result.getContentText()
    if (result.getResponseCode() >= 400) {
        console.error(JSON.parse(content))
        throw new Error("Exception when exporting as HTML")
    }
    return content
}
The first line of the function makes Apps Script grant you the necessary Drive permissions. Alternatively, you can manually declare all the scopes you are using in the manifest.
It's worth noting that email HTML is not 100% the same as web HTML, so you may need to clean the result a bit depending on your use case.
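For instance, the exported HTML can then be used directly as an email body; here is a minimal sketch (the document ID and recipient are placeholders):

function sendDocAsEmail() {
    const html = exportAsHtml("YOUR_DOCUMENT_ID"); // placeholder document ID
    MailApp.sendEmail({
        to: "recipient@example.com", // placeholder recipient
        subject: "Document contents",
        htmlBody: html,
    });
}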
References
UrlFetchApp.fetch(url, params) (Apps Script reference)
Files: export (Google Drive API reference)
There's a very common problem I have seen from many people who use different versions of their site for mobile and desktop; many themes have this feature. The issue is that Cloudflare caches the same page regardless of the user's device, causing mixes and inconsistencies between the desktop and mobile versions.
The most common solution is to separate the mobile version into another URL, but in my case I want to use the same URL and make Cloudflare's cache work properly for both desktop and mobile.
I found this very nice guide showing how to fix the issue; however, the worker code seems to be outdated, so I had to modify some parts to make it work.
I created a new subdomain for my workers and then assigned the route to my site so it starts running.
The worker is caching everything; however, it does not have the desired feature of keeping different cached versions according to the device.
async function run(event) {
    const { request } = event;
    const cache = caches.default;

    // Read the user agent of the request
    const ua = request.headers.get('user-agent');
    let uaValue;
    if (ua.match(/mobile/i)) {
        uaValue = 'mobile';
    } else {
        uaValue = 'desktop';
    }
    console.log(uaValue);

    // Construct a new request object which distinguishes the cache key by
    // device type.
    const url = new URL(request.url);
    url.searchParams.set('ua', uaValue);
    const newRequest = new Request(url, request);

    let response = await cache.match(newRequest);
    if (!response) {
        // Use the original request object when fetching the response from the
        // server to avoid passing on the query parameters to our backend.
        response = await fetch(request, { cf: { cacheTtl: 14400 } });
        // Store the cached response with our extended query parameters.
        event.waitUntil(cache.put(newRequest, response.clone()));
    }
    return response;
}

addEventListener('fetch', (event) => {
    event.respondWith(run(event));
});
It is indeed detecting the right user agent, but it is not keeping two separate cached versions according to the assigned query string.
I think maybe I'm missing some configuration; I don't know why it's not working as expected. As it is right now, I still get my mobile and desktop cache versions mixed up.
The problem here is that fetch() itself already does normal caching, independent of your use of the Cache API around it. So fetch() might still return a cached response that is for the wrong UA.
If you could make your back-end ignore the query parameter, then you could include the query in the request passed to fetch(), so that it correctly caches the two results differently. (Enterprise customers can use custom cache keys as a way to accomplish this without changing the URL.)
If you do that, then you can also remove the cache.match() and cache.put() calls since fetch() itself will handle caching.
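A minimal sketch of that suggestion, assuming your backend ignores the extra ua query parameter:

async function run(event) {
    const { request } = event;
    const ua = request.headers.get('user-agent') || '';
    const uaValue = /mobile/i.test(ua) ? 'mobile' : 'desktop';

    // Include the device type in the URL passed to fetch() so that fetch()'s
    // own cache keys the two variants separately; the backend must ignore `ua`.
    const url = new URL(request.url);
    url.searchParams.set('ua', uaValue);
    const newRequest = new Request(url.toString(), request);

    return fetch(newRequest, { cf: { cacheTtl: 14400 } });
}

addEventListener('fetch', (event) => {
    event.respondWith(run(event));
});

On an Enterprise plan you could instead keep the original URL and pass a custom cache key via the cf object, as mentioned above.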
I'm new to web development and am trying to build a small webpage that'll attempt to detect if anyone's currently logged into Instagram on that browser. If I run my code command-by-command in Chrome's console and manually copy-paste the HTML, it works perfectly. But when I try to run my code separately, it seems to run as if it was in Incognito mode.
How would one get around this? If the user needs to grant my page permission to access Instagram, how would I do that?
Here's the JS part of my code:
const USERNAME_REGEX = /"viewer"\:{.*"full_name"\:"([^"]+)"/gm;
const FULLNAME = 'full_name\":\"';
document.addEventListener('DOMContentLoaded', () => {
    const request = new XMLHttpRequest();
    request.open('GET', 'https://instagram.com');
    request.send();
    request.onload = () => {
        const data = request.responseText;
        var u = data.match(USERNAME_REGEX);
        if (u != null) {
            u = u[0];
            u = u.substring(u.indexOf(FULLNAME) + FULLNAME.length).slice(0, -1);
            document.querySelector('#info').innerHTML = u;
        }
        else {
            document.querySelector('#info').innerHTML = 'no one is logged in!';
        }
    }
});
Thank you :) have a wonderful day!
I'm new to web development and am trying to build a small webpage that'll attempt to detect if anyone's currently logged into Instagram on that browser.
That sounds like a security and privacy nightmare, which fortunately for everyone (you included) isn't possible. (Or at least, it shouldn't be...)
If I run my code command-by-command in Chrome's console and manually copy-paste the HTML, it works perfectly.
Totally different context. You'll find, in fact, that if you run this code in a console that isn't on Instagram.com, it won't work.
The same-origin policy will prevent you from accessing another domain. What you're attempting is a classic cross-origin attack, and it isn't going to work. CORS could get around this, but Instagram surely isn't going to grant you access.
How would one get around this?
You don't.
This issue relates only to websites hosted over HTTPS that need to make requests to an HTTP endpoint (it seems to work in Chrome, however).
The following code was used:
HTML:
<button id="fetchButton">click me</button>
<script>
    var worker = new Worker("/worker.js");
    var button = document.getElementById("fetchButton");
    button.addEventListener("click", function() {
        worker.postMessage("");
    });
    worker.onmessage = response => {
        console.log("message from worker");
        console.log(response.data);
    };
</script>
Worker:
onmessage = async function (event) {
    const result = await (await fetch("http://localhost:3000/something")).json();
    this.postMessage(result);
}
When the site is served over HTTPS, I get back a response in Chrome but not in Firefox, and I couldn't find any more specific docs about this. Is there a way to get this to work in Firefox too? Or is Firefox's behavior the correct way to deal with mixed content? The above code works in both Chrome and Firefox when hosted locally without HTTPS, so it's probably related to whether the website is served over HTTPS.
Hi everyone!
I'm new to TestCafe and I need some help with something I want to achieve.
I have a React website with a Facebook login. Normally, when you enter the page and click on Login with Facebook, a popup window opens where you enter your credentials. After that, you are redirected back to the page and the token is saved in a localStorage variable for the page to consult later on.
However, when I run the test for the login process, TestCafe, instead of opening a popup window, opens the Facebook form on the same page and never redirects back.
Also, I tried to set a dummy token in localStorage using ClientFunction (and also Roles), but my website can never reach that token because TestCafe seems to put this variable under a key called hammerhead.
So, my question here is: how could I enter this token in the test, or manually, so my website can read it and run some functions with it?
This is what I have so far.
/* global test, fixture */
import { WelcomePage } from './pages/welcome-page'
import { ClientFunction, Role } from 'testcafe';

const welcomePage = new WelcomePage()

const setLocalStorageItem = ClientFunction((prop, value) => {
    localStorage.setItem(prop, value);
});

const facebookAccUser = Role(`https://mypage.net/`, async t => {
    await setLocalStorageItem('token', 'my-token');
}, { preserveUrl: true });

fixture`Check certain elements`.page(`https://mypage.net/`)

test('Check element is there', async (t) => {
    await t
        .navigateTo(`https://mypage.net/`)
        .wait(4000)
        .useRole(facebookAccUser)
        .expect(certainElementIfLoggedIn)
        .eql(certainValue)
        .wait(10000)
})
Any help would be highly appreciated
Thanks for your time.
UPDATE FROM FEB 2021
TestCafe now supports multiple browser windows, and you can log in via the Facebook popup form without any issues. Refer to the Multiple Browser Windows topic for more information.
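With multi-window support, the popup flow can be driven directly; here is a minimal sketch, reusing the page URL and selectors from the iframe workaround shown below (credentials are placeholders, and Facebook's form selectors may have changed):

fixture `facebook login`
    .page `https://www.soudfa.com/signup`;

test('login via popup', async t => {
    // When the popup opens, TestCafe switches to the new window automatically
    await t
        .click('button.facebook')
        .typeText('#email', '****')
        .typeText('#pass', '****')
        .click('#u_0_0');
});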
Currently, TestCafe does not support multiple browser windows. So it's impossible to log in via the Facebook popup form.
However, there is a workaround. Please refer to the following thread https://github.com/DevExpress/testcafe-hammerhead/issues/1428.
My working test looks like this:
import { Selector, ClientFunction } from 'testcafe';

const patchAuth = ClientFunction(() => {
    // 'op' + 'en' keeps TestCafe's proxy from rewriting window.open; the
    // replacement renders the auth URL in an iframe instead of a popup
    window['op' + 'en'] = function (url) {
        var iframe = document.createElement('iframe');
        iframe.style.position = 'fixed';
        iframe.style.left = '200px';
        iframe.style.top = '150px';
        iframe.style.width = '400px';
        iframe.style.height = '300px';
        iframe.style['z-index'] = '99999999';
        iframe.src = url;
        iframe.id = 'auth-iframe';
        document.body.appendChild(iframe);
    };
});

fixture `fixture`
    .page `https://www.soudfa.com/signup`;

test('test', async t => {
    await patchAuth();
    await t
        .click('button.facebook')
        .switchToIframe('#auth-iframe')
        .typeText('#email', '****')
        .typeText('#pass', '****')
        .click('#u_0_0')
        .wait(30e3);
});
Please keep in mind that manipulations with x-frame-options in the testcafe-hammerhead module are required.
In addition, I would like to mention that testing in multiple browser windows is one of our priority tasks and part of the TestCafe Roadmap.