I'm new to Cypress and want to know where I should put all of my external URLs.
In my website, we have separate URLs: one for the web app and another for the API.
Let's say they're:
App URL: https://app.com
API URL: https://api.myapi.com
I know I can put my app URL in baseUrl in cypress.json, and I've put all my subdirectories, e.g. /products and /users, in the env variables section. Because cy.visit() automatically uses this baseUrl, this works well for me.
What I'm stuck on now is where to place my API URLs properly. Putting the full paths like:
https://api.myapi.com/products
https://api.myapi.com/users
https://api.myapi.com/users/1/edit
is not a good idea, since I'd be repeating the API base URL everywhere. But if I split it into apiBaseUrl and sub-paths like /products, I now have to build the URL myself every time I want to watch it:
cy.route(`${Cypress.env('apiBaseUrl')}${Cypress.env('apiUrl').getProducts}`);
because my apiBaseUrl is not the baseUrl. This makes it even harder than the approach above.
Any suggestion?
I found a fairly simple solution: put your API backend URL right next to the (web frontend) Cypress baseUrl. My cypress.json looks like this:
{
  "baseUrl": "http://localhost:3001/",
  "backendBaseURL": "http://localhost:8080/api/v2",
  "env": {
    "someDummyEnvironmentVar": "fooBar"
  },
  "testFiles": "**/*.*",
  "watchForFileChanges": true
}
Both can be accessed anywhere in Cypress tests:
Cypress.env('someDummyEnvironmentVar') => "fooBar"
Cypress.config('backendBaseURL') => "http://localhost:8080/api/v2"
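If you end up joining backendBaseURL with endpoints in many tests, a tiny helper can keep the concatenation in one place. This is just a sketch; the name apiPath is my own, and inside a test you would pass Cypress.config('backendBaseURL') as the base:

```javascript
// Hypothetical helper (not part of Cypress): joins a base URL and an
// endpoint while normalizing slashes, so callers don't have to care
// whether either side has a trailing or leading slash.
function apiPath(base, endpoint) {
  return `${base.replace(/\/+$/, '')}/${endpoint.replace(/^\/+/, '')}`;
}

// e.g. inside a test:
// cy.request(apiPath(Cypress.config('backendBaseURL'), '/products'))
```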
We have a separate file called settings.js (cypress/settings.js), which holds the configuration for the API:
export const apiUri = 'APIURL'
export const apiOauthSecret = 'TOKEN'
To actually use the variable apiUri we import it in the file where we want to use it:
import {apiUri, apiOauthSecret} from '../settings'
Cypress.Commands.add('apiRequest', (endPoint, options) => {
  // requestOptions is a project-specific helper defined elsewhere
  const token = options.token || JSON.parse(window.sessionStorage.getItem('oauth-token') || '{}').accessToken
  if (token) {
    return cy.request({
      failOnStatusCode: false,
      ...requestOptions({
        ...options,
        token
      }),
      url: `${apiUri}/${endPoint}`
    })
  }
  return cy
})
This lets you use apiUri and baseUrl next to each other.
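The command above relies on a requestOptions helper that isn't shown. As a sketch of what it might look like (this is my guess at its shape, not the original implementation), it maps our options onto the object cy.request expects and attaches the token as an Authorization header:

```javascript
// Hypothetical sketch of the requestOptions helper assumed above:
// maps our option names onto cy.request's options and attaches the
// OAuth token as a bearer Authorization header.
function requestOptions({ token, method = 'GET', body } = {}) {
  return {
    method,
    body,
    headers: {
      Authorization: `Bearer ${token}`
    }
  };
}
```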
I'm trying to load Vue 3 components asynchronously. I've found that there is a function called defineAsyncComponent, which is supposed to be used as follows:
const GameUI = defineAsyncComponent(()=>import(filePath));
app.component("GameUI", GameUI);
filePath in this context is exactly: './components/GameUI/GameUI.element.vue'
Running the app like this leads to the following error:
Uncaught (in promise) Error: Cannot find module './components/GameUI/GameUI.element.vue'
at eval (eval at ./src lazy recursive)
But... if I change the filePath code to import the path as a string:
const GameUI = defineAsyncComponent(()=>import('./components/GameUI/GameUI.element.vue'));
The app works, it does find the component.
I don't want to use constant strings, because I have a lot of components and I want to load them asynchronously.
One of my main goals is to load the web app in parts, component by component as they are needed, instead of loading everything on start.
I've also found that if I append a comment as follows:
const GameUI = defineAsyncComponent(()=>import(/* webpackChunkName: "GameUI" */ './components/GameUI/GameUI.element.vue'));
the JavaScript for the GameUI component should get its own chunk .js file, but I keep getting everything bundled into a couple of .js chunk files, which defeats the async loading I want to achieve.
I'm using vue-cli-service, and my vue.config.js looks like this:
const fs = require('fs')

module.exports = {
  productionSourceMap: false,
  css: {
    loaderOptions: {
      sass: {
        additionalData: process.env.NODE_ENV === 'production'
          ? '$baseURL:"/dist/";'
          : '$baseURL:"/";'
      }
    }
  },
  publicPath: process.env.NODE_ENV === 'production'
    ? '/dist/'
    : '/',
  devServer: {
    https: true,
    port: "",
    host: 'website.com',
    disableHostCheck: true,
    cert: fs.readFileSync('cert.pem') + "",
    key: fs.readFileSync('privkey.pem') + "",
    ca: fs.readFileSync('ca.pem') + ""
  }
};
I've already tried several things I found online, but they aren't very explanatory. I'm doing exactly what some articles describe and can't find the problem on my side.
The two main problems are:
Cannot load .vue files from variables, only full strings.
Cannot split the code into different .js files for every async loaded component.
The answer to the first question: there are limitations on dynamic import().
What you are doing cannot work because you are passing a fully dynamic variable to import() inside defineAsyncComponent.
Because of these limitations, you cannot import your component using a plain variable. Instead, what you can do is:
// If the component to load is GameUI.element
const component = 'GameUI.element' // can come from anywhere
const GameUI = defineAsyncComponent(() => import(`./components/GameUI/${component}.vue`));
app.component("GameUI", GameUI);
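The important constraint here is that webpack can only prepare chunks for a dynamic import() when the call contains a static path prefix; it uses that prefix to enumerate candidate files at build time, and a fully dynamic variable gives it nothing to work with. As a sketch of the path construction (the helper name is mine; the literal './components/' prefix must still appear inside the actual import() call):

```javascript
// Hypothetical helper: builds the path interpolated into import().
// Only the file-name part may vary; the './components/' prefix has to
// stay literal in the import() call so webpack can create a context.
function componentPath(name) {
  return `./components/${name}/${name}.element.vue`;
}
```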
In my React app, hosted on Azure as a Static Web App, I use the MSAL library for authentication. In order to create a new Msal.UserAgentApplication, I need to pass a configuration object - which in particular contains the redirectUri. Something like:
const msalConfig = {
  auth: {
    clientId: "75d84e7a-40bx-f0a2-91b9-0c82d4c556aa",
    authority: "https://login.microsoftonline.com/common",
    redirectUri: "www.example.org/start",
  },
  ...
I'm wondering: how can I use a relative redirect url, not an absolute one?
The concrete problem I'm trying to solve is the following one: when I launch and test the app locally, then I want to redirect to localhost:3000/start instead of www.example.org/start. So far I haven't found any better method than switching the config file before each run/deploy, but it's annoying and of course can be easily forgotten...
You can make it dynamic like this:
const msalConfig = {
  auth: {
    redirectUri: window.location.origin + '/start'
  }
}
This takes the current origin and adds /start to the end.
This way it works locally and in the deployed environment(s).
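A small sketch of why the same expression resolves correctly in both environments (the function is just for illustration):

```javascript
// window.location.origin is scheme + host + port of the current page,
// so appending the route yields the right absolute URL everywhere.
function buildRedirectUri(origin) {
  return origin + '/start';
}
```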
I'm building the back-end for an app using Node and Express.
I separated different parts of the code into different files: for example, everything that concerns accessing the database is in DBService.js, and if I want to perform any action related to my users, I have a UserService.js file that does everything the app needs with users and uses DBService.js to save them in the DB.
I'm aware I have some circular dependencies in my code, but everything worked fine until now. I'm using GraphQL for pretty much everything, but I'm adding a normal endpoint to grab a file given its ID.
I require FileService.js in index.js (the entry point of the Node app) to serve the file, and this part works fine. The problem is that in another file (ZoneService.js) where I also require FileService.js, it returns an empty object.
I know for a fact that this is the problem, because if I remove the require in index.js, the problem disappears.
These are the paths that lead to the circular dependencies. The '->' means that the previous Service requires the next.
FileService -> ZoneService -> FileService
FileService -> ZoneService -> FileUploadService -> FileService
It may look silly, but I need this because I thought it was a good move to keep the GraphQL type definitions and resolvers of each entity in its own file.
I will try to explain my reasoning for the first path:
I want to grab files that belong to a certain zone, so this function goes into FileService. I then use ZoneService to get the file IDs given the zone ID, and then I get the paths from the DB.
ZoneService needs FileService to resolve the 'files' field in the Zone entity.
I could just move this function to ZoneService and get the files from there, but that would kind of break my whole logic of separating concerns.
What I would like to know is the best way to fix this so it does not happen again, and how it can be avoided.
I would post some code, but I'm not sure which parts, so if you think it's necessary let me know.
Thanks in advance!
Edit - Here is some code:
FileService.js
//Import services to use in resolvers
const EditService = require("./EditService.js")
const ZoneService = require("./ZoneService.js")
//Resolvers
const resolvers = {
  Query: {
    getFileById: (parent, {_id}) => {
      return getFileById(_id)
    },
    getFilesById: (parent, {ids}) => {
      return getFilesById(ids)
    },
    getFilesByZoneId: (parent, {_id}) => {
      return getFilesByZoneId(_id)
    },
  },
  File: {
    editHistory: file => {
      return EditService.getEditsById(file.editHistory)
    },
    fileName: file => {
      return file.path.split('\\').pop().split('/').pop();
    },
    zone: file => {
      return ZoneService.getZoneById(file.zone)
    }
  }
}
ZoneService.js
//Import services to use in resolvers
const UserService = require("./UserService.js")
const FileService = require("./FileService.js")
const EditService = require("./EditService.js")
const ErrorService = require("./ErrorService.js")
const FileUploadService = require("./FileUploadService.js")
//Resolvers
const resolvers = {
  Query: {
    getZone: (parent, {_id, label}) => {
      return _id ? getZoneById(_id) : getZoneByLabel(label)
    },
    getZones: () => {
      return getZones()
    },
  },
  Zone: {
    author: zone => {
      return UserService.getUserById(zone.author)
    },
    files: zone => {
      if (zone.files && zone.files.length > 0) return FileService.getFilesById(zone.files)
      else return []
    },
    editHistory: zone => {
      return EditService.getEditsById(zone.editHistory)
    }
  },
  Mutation: {
    createZone: async (parent, input, { loggedUser }) => {
      return insertZone(input, loggedUser)
    },
    editZone: async (parent, input, { loggedUser }) => {
      return editZone(input, loggedUser)
    },
    removeZone: async (parent, input, { loggedUser }) => {
      return removeZone(input, loggedUser)
    }
  },
}
A couple of dos and don'ts:
Do split your schema into smaller modules. For most schemas, it makes sense to split the type definitions and resolvers across multiple files, grouping together related types and Query/Mutation fields. The resolvers and type definitions might be exported from a single file, or the type definitions might reside in a file by themselves (maybe a plain text file with a .gql or .graphql extension). (Note: Borrowing Apollo's terminology, I'm going to refer to the related type definitions and resolvers as a module).
Don't introduce dependencies between these modules. Resolvers should operate independently of one another. There's no need to call one resolver inside another -- and certainly no need to call one module's resolver from inside another module. If there's some shared logic between modules, extract it into a separate function and then import it into both modules.
Do keep your API layer separate from your business logic layer. Keep business logic contained to your data model classes, and keep your resolvers out of these classes. For example, your app should have a Zone model, or ZoneService or ZoneRepository that contains methods like getZoneById. This file should not contain any resolvers and should instead be imported by your schema modules.
Do use context for dependency injection. Any data models, services, etc. that your resolvers need access to should be injected using context. This means instead of importing these files directly, you'll utilize the context parameter to access the needed resource instead. This makes testing easier, and enforces a unidirectional flow of dependencies.
So, to sum up the above, your project structure might look something like this:
services/
  zone-service.js
  file-service.js
schema/
  files/
    typeDefs.gql
    resolvers.js
  zones/
    typeDefs.gql
    resolvers.js
And you might initialize your server this way:
const FileService = require(...)
const ZoneService = require(...)

const server = new ApolloServer({
  typeDefs,
  resolvers,
  context: () => ({
    services: {
      FileService,
      ZoneService,
    }
  })
})
Which means your resolver file would not need to import anything, and your resolvers would simply look something like:
module.exports = {
  Query: {
    getFileById: (parent, {_id}, {services: {FileService}}) => {
      return FileService.getFileById(_id)
    },
    getFilesById: (parent, {ids}, {services: {FileService}}) => {
      return FileService.getFilesById(ids)
    },
    getFilesByZoneId: (parent, {_id}, {services: {FileService}}) => {
      return FileService.getFilesByZoneId(_id)
    },
  },
}
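This is also where the testing benefit shows up: because the services arrive through context, a resolver can be exercised with a plain stub and no module mocking. A small sketch (the stub data is made up for illustration):

```javascript
// Resolver shaped like the example above: the service comes from context
const resolvers = {
  Query: {
    getFileById: (parent, { _id }, { services: { FileService } }) => {
      return FileService.getFileById(_id);
    }
  }
};

// A stub standing in for the real FileService
const fakeFileService = {
  getFileById: _id => ({ _id, path: '/uploads/fake.txt' })
};

// Call the resolver directly with a fake context object
const file = resolvers.Query.getFileById(
  null,
  { _id: '123' },
  { services: { FileService: fakeFileService } }
);
// file → { _id: '123', path: '/uploads/fake.txt' }
```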
Ideally, you should avoid circular dependencies.
A simple way to do this is to split your modules into smaller ones, for example:
FileService -> CommonService
ZoneService -> CommonService
or
FileServicePartDependsOnZoneService -> ZoneService
ZoneService -> FileServicePartNotDependsOnZoneService
or
FileService -> ZoneServicePartNotDependsOnFileService
ZoneServicePartDependsOnFileService -> FileService
Note that these are just examples; you should give your modules names that are meaningful and shorter than mine.
Another way is to merge the modules together (but that may be a bad idea).
If you cannot avoid circular dependencies, you can also require a module when you need it instead of importing it at the top.
For example:
//FileService.js
let ZoneService

function doSomeThing() {
  if (!ZoneService) {
    ZoneService = require("./ZoneService.js").default
    //or ZoneService = require("./ZoneService.js")
  }
  //using ZoneService
}
To make this reusable, define a function such as getZoneService (or something similar).
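A generic version of that idea might look like this (the names are mine; in practice the loader would be the require call):

```javascript
// Generic lazy accessor: runs the loader on first use, then caches.
// This breaks a require cycle because the loader only fires after
// both modules have finished evaluating.
function lazy(loader) {
  let cached;
  return () => {
    if (cached === undefined) {
      cached = loader();
    }
    return cached;
  };
}

// e.g. const getZoneService = lazy(() => require('./ZoneService.js'));
```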
How do I set the baseURL so that if I switch from server to server on the frontend (Vue.js), it changes dynamically?
Here is my axios-auth.js code:
import axios from 'axios'
const instance = axios.create({
  baseURL: 'http://mvp.test/api/public/api/'
  // baseURL: 'http://127.0.0.1:8000/api/' for testing on localhost
});
and my .env file, which has the standard Laravel configuration.
As described in the official Laravel Mix documentation, you can use an environment variable by creating a key in your .env prefixed with MIX_:
MIX_BASE_URL=http://mvp.test/api/public/api/
And run php artisan config:clear to be sure the new config is set.
Then, in javascript you can access the variable inside the process.env object:
process.env.MIX_BASE_URL
So you can simply use it like this:
const instance = axios.create({
  baseURL: process.env.MIX_BASE_URL
});
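If you want a fallback for local development when the variable isn't set, a small sketch (resolveBaseURL is my own name; the default is just the localhost URL from the question):

```javascript
// Hypothetical fallback: prefer MIX_BASE_URL, else the local default.
function resolveBaseURL(env) {
  return env.MIX_BASE_URL || 'http://127.0.0.1:8000/api/';
}

// usage: axios.create({ baseURL: resolveBaseURL(process.env) })
```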
I have 4 servers with different URLs, and in my Angular 6 application I want to switch from one server URL to another if a server is down. If anyone knows how, please help me.
Consider that I have 4 servers with different URLs (1st, 2nd, 3rd and 4th). The 1st server has the highest priority and I want to make it the default. My question is how to switch from the 1st server URL to the 2nd (and likewise to the 3rd and 4th) if a server is down. Any help would be appreciated; thanks in advance.
service.ts file
firstserver = "http://111.121.1.342:8001/rest/formalities";
secondserver = "http://111.121.1.342:8002/rest/formalities";
thirdserver = "http://111.121.1.342:8003/rest/formalities";
fourthserver = "http://111.121.1.342:8004/rest/formalities";

validateUserDetails(employeeDetails): Observable<any> {
  console.log(serversurl);
  // serversurl would be the first/second/third/fourth server URL
  return this._httpClient.post(serversurl, employeeDetails);
}
Here I have to check whether the first server URL is up or down, and then use whichever server URL is up.
Expected result:
I should be able to switch from one server URL to another based on whether it is up or down.
For my problem above, I found a solution, described below. For this we need some help from plain JavaScript.
Step 1:
Create an env.js file in the same directory as index.html, like below.
(function (window) {
  window.__env = window.__env || {};

  // API url
  window.__env.apiUrl = "http://RestWebService";

  // Whether or not to enable debug mode
  // Setting this to false will disable console output
  window.__env.enableDebug = true;
}(this));
The meaning of the env.js file above is that we create one global variable, apiUrl, to store our server URL, so that we can access it globally. Next, we add a <script> element to the <head> section of our index.html to load env.js before Angular is loaded:
<html>
  <head>
    <!-- Load environment variables -->
    <script src="env.js"></script>
  </head>
  <body>
    ...
    <!-- Angular code is loaded here -->
  </body>
</html>
By default, JavaScript files such as env.js are not copied to the output directory when we build our application.
To make sure that the file is copied to the output directory when we run ng build or ng serve, we must add it to the assets section of our application's build configuration in angular.json:
angular.json file
{
  "projects": {
    "app-name": {
      "architect": {
        "build": {
          "options": {
            "assets": [
              "src/favicon.ico",
              "src/assets",
              "src/env.js"
            ]
          },
          "configurations": {
            "production": {
              "fileReplacements": [
                {
                  "replace": "src/environments/environment.ts",
                  "with": "src/environments/environment.prod.ts"
                }
              ],
              // ...
            }
          }
        }
      }
    }
  }
}
Step 2:
Create a service named anything you like; in my case I created env.service.ts in the same directory as app.module.ts.
This service will be used to read our server URL value, which is stored in apiUrl (in env.js).
env.service.ts
import { Injectable } from '@angular/core';

@Injectable({
  providedIn: 'root'
})
export class EnvService {
  // The values that are defined here are the default values that can
  // be overridden by env.js

  // API url
  public apiUrl = '';

  // Whether or not to enable debug mode
  public enableDebug = true;

  constructor() {
  }
}
Step 3:
Create a service provider file in the same directory as env.service.ts.
env.service.provider.ts
import { EnvService } from './env.service';

export const EnvServiceFactory = () => {
  // Create env
  const env = new EnvService();

  // Read environment variables from browser window
  const browserWindow = window || {};
  const browserWindowEnv = browserWindow['__env'] || {};

  // Assign environment variables from browser window to env
  // In the current implementation, properties from env.js overwrite defaults from the EnvService.
  // If needed, a deep merge can be performed here to merge properties instead of overwriting them.
  for (const key in browserWindowEnv) {
    if (browserWindowEnv.hasOwnProperty(key)) {
      env[key] = window['__env'][key];
    }
  }

  return env;
};

export const EnvServiceProvider = {
  provide: EnvService,
  useFactory: EnvServiceFactory,
  deps: [],
};
EnvServiceProvider: an Angular provider recipe that registers EnvServiceFactory with Angular's dependency injection as the factory for instantiating EnvService.
EnvServiceFactory: a factory that reads environment values from window.__env and instantiates an instance of the EnvService class.
In summary, env.service.provider.ts exports an EnvServiceFactory function that creates an instance of the EnvService class and copies all values from the window.__env object into that instance.
Finally to register EnvServiceProvider as a recipe with Angular's dependency injection system, we must list it as a provider in our application's providers array:
app.module.ts file
import { NgModule } from '@angular/core';
import { EnvServiceProvider } from './env.service.provider';

@NgModule({
  imports: [ /* ... */ ],
  providers: [EnvServiceProvider],
})
export class AppModule {}
We can now access our environment variables from anywhere in our application by injecting EnvService. I use it like below.
service.ts file
import { EnvService } from '../env.service';

constructor(private _httpClient: HttpClient, private env: EnvService) {
}

emplLoginCheckUrl = this.env.apiUrl + "/checkValidUser";

validateUserDetails(employeeDetails): Observable<any> {
  console.log(this.emplLoginCheckUrl);
  return this._httpClient.post(this.emplLoginCheckUrl, employeeDetails);
}
That's it for step 3.
Step 4:
Finally, before serving the app, build it with ng build --prod to get the dist folder, which contains the env.js file. There you can change your URLs; if you change a URL in that file, the application automatically picks up the new URL without a rebuild.
For more information, visit the site below; thanks to Jurgen Van de Moere.
https://www.jvandemo.com/how-to-use-environment-variables-to-configure-your-angular-application-without-a-rebuild/
You can use Ajax, and if the request fails, retry with another server URL:
$.ajax({
  type: "POST",
  url: "**serverIP**",
  data: {},
  success: function () {
    alert("done");
  },
  error: function (XMLHttpRequest, errorThrown) {
    alert("Here retry the post with a different server URL");
  }
});
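The same retry idea can be written framework-agnostically: try each base URL in priority order until one succeeds. This is only a sketch (the names are mine); doPost stands in for whatever actually performs the request, e.g. an HttpClient or fetch call:

```javascript
// Try each server URL in order; resolve with the first success,
// reject with the last error if every server is down.
async function postWithFailover(urls, body, doPost) {
  let lastError;
  for (const url of urls) {
    try {
      return await doPost(url, body);
    } catch (err) {
      lastError = err; // remember and fall through to the next URL
    }
  }
  throw lastError;
}
```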
You could check whether the request failed and then try another server, but what if only the fourth server is working and you have to make three failed attempts every time to reach it? That does not seem like the most efficient approach.
Instead, I would add an nginx load balancer with health checks:
http://nginx.org/en/docs/http/load_balancing.html