Building docker image task results in VSTS error - javascript

I'm following this Microsoft tutorial on how to containerize and deploy an Angular application on Azure using Docker. I'm running into an issue that is not addressed in the tutorial: every time the build reaches the "Build an image" task, it fails with the following error.
2018-01-29T12:37:28.3169577Z [command]/usr/local/bin/docker build -f /opt/vsts/work/1/s/Dockerfile -t iponam.azurecr.io/my-angular-app:197 /opt/vsts/work/1/s
2018-01-29T12:37:28.7630517Z Sending build context to Docker daemon 593.4 kB
2018-01-29T12:37:28.7639048Z
2018-01-29T12:37:28.7859042Z Step 1/3 : FROM nginx
2018-01-29T12:37:30.3172506Z latest: Pulling from library/nginx
2018-01-29T12:37:30.3188560Z e7bb522d92ff: Pulling fs layer
2018-01-29T12:37:30.3204163Z 6edc05228666: Pulling fs layer
2018-01-29T12:37:30.3219390Z cd866a17e81f: Pulling fs layer
2018-01-29T12:37:30.3279280Z cd866a17e81f: Download complete
2018-01-29T12:37:30.3294450Z e7bb522d92ff: Verifying Checksum
2018-01-29T12:37:30.3310189Z e7bb522d92ff: Download complete
2018-01-29T12:37:30.3511653Z 6edc05228666: Verifying Checksum
2018-01-29T12:37:30.3529571Z 6edc05228666: Download complete
2018-01-29T12:37:32.1518557Z e7bb522d92ff: Pull complete
2018-01-29T12:37:33.4982718Z 6edc05228666: Pull complete
2018-01-29T12:37:33.6163338Z cd866a17e81f: Pull complete
2018-01-29T12:37:33.6365600Z Digest: sha256:285b49d42c703fdf257d1e2422765c4ba9d3e37768d6ea83d7fe2043dad6e63d
2018-01-29T12:37:33.6540818Z Status: Downloaded newer image for nginx:latest
2018-01-29T12:37:33.6558500Z ---> 3f8a4339aadd
2018-01-29T12:37:33.6577887Z Step 2/3 : COPY dist /usr/share/nginx/html
2018-01-29T12:37:33.6595536Z COPY failed: stat /var/lib/docker/tmp/docker-builder545265281/dist: no such file or directory
2018-01-29T12:37:33.6780204Z ##[error]COPY failed: stat /var/lib/docker/tmp/docker-builder545265281/dist: no such file or directory
2018-01-29T12:37:33.6855716Z ##[error]/usr/local/bin/docker failed with return code: 1
Per the tutorial, this is running on a hosted Linux agent that I cannot access to examine the folder structure. Essentially, the pipeline builds the Angular application with ng build --prod, and then the Dockerfile (below) pulls an nginx image and tries to COPY the dist folder into /usr/share/nginx/html. Here's the Dockerfile; it's identical to the one in the tutorial as well:
FROM nginx
COPY dist /usr/share/nginx/html
EXPOSE 80
I have spent countless hours trying to debug and understand the issue. Any help will be tremendously appreciated. Thank you!
Update
After the run build step executes successfully, I can't see where the dist directory is output. I checked every single place on the agent with no luck. On my local machine, dist appears in the root of the project folder.

Use this command instead in the NPM build task (Command and arguments):
run build -- -op dist
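For context, a minimal sketch of what that change does, assuming the project's build script wraps ng build (-op is the old Angular CLI shorthand for --output-path): everything after -- is passed through to the build script, so the compiled files land in ./dist at the repo root, which is exactly where the Dockerfile's COPY dist expects them.
# Hypothetical equivalent of what the NPM task runs on the agent
npm run build -- -op dist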

Okay, first of all, you can examine the folder structure: you can add a Shell Script step before the docker build and do something like ls -Rla, which is a hack, undoubtedly, but a working one.
As for your error, it indicates that there is no such file or directory (dist) under the working directory. Again, what you could do is verify that dist is in the root of the repo, or use the ls hack to locate it. You should also check that you didn't change the default working directory of the docker build step. If you did, you need to specify the path to dist relative to that working directory.
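For example, a throwaway Shell Script step placed before the Docker build task might look like this (a sketch; BUILD_SOURCESDIRECTORY is the environment variable the hosted agent exposes for the checked-out sources, /opt/vsts/work/1/s in the log above):
# Dump the layout of the checked-out sources so you can see whether,
# and where, the dist folder exists on the agent.
echo "Contents of $BUILD_SOURCESDIRECTORY:"
ls -Rla "$BUILD_SOURCESDIRECTORY"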

Related

How to run gunicorn from 1 folder above manage.py

My folder structure is completely messed up, but I am trying to load my Django project onto a Docker image for the first time and it's failing on my CMD step.
My directories are as follows:
WORKDIR/appname/appname/wsgi.py
In my docker image I need to call the gunicorn command that runs the server. Typically when I am in the terminal I can cd into the right directory and run
gunicorn appname.wsgi:application --bind 0.0.0.0:8000
How can I run this same command from one folder above where I normally am?
I have tried gunicorn ./appname/appname.wsgi:application ...
Thanks in advance
Do I change the WORKDIR right before the command?
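One possible sketch (not from the question itself; it assumes the image's working directory sits one level above the outer appname folder): either change directory before starting gunicorn, or let gunicorn change it for you with --chdir.
# Option 1: cd into the outer appname folder first
cd appname && gunicorn appname.wsgi:application --bind 0.0.0.0:8000

# Option 2: stay one level up and let gunicorn change directory for you
gunicorn --chdir appname appname.wsgi:application --bind 0.0.0.0:8000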

AWS Lambda read-only file system error failed to create directory with Docker image

Problem
The Docker image builds successfully; however, it fails when run from Lambda because of Lambda's read-only file system.
Summary
Luminati-proxy has a Docker integration for their proxy manager. I copied over their Dockerfile and appended it to my own Dockerfile for pushing a script out to AWS Lambda. Building the Docker image was successful, but when pushed off to Lambda, it failed because of a read-only file system error:
Failed to create directory /home/sbx_user1051/proxy_manager: [code=EROFS] Error: EROFS: read-only file system, mkdir '/home/sbx_user1051'
2022-02-28 19:37:22.049 FILE (8): Failed to create directory /home/sbx_user1051/proxy_manager: [code=EROFS] Error: EROFS: read-only file system, mkdir '/home/sbx_user1051'
Analysis
Upon examining the traceback, the error is focused on the proxy_manager installation and fails on directory changes (mkdir, mk_work_dir ...). These changes are made within the .js files of the GitHub repo that the Dockerfile pulls as the proxy_manager installation. Obviously the only mutable directory on Lambda is /tmp, but is there a workaround for getting this set up without putting everything under /tmp, given that it is wiped between runs? Reinstalling the proxy manager on every run is not at all ideal...
Answer?
Could this be as simple as setting environment stipulations such as:
ENV PATH=...
ENV LD_LIBRARY_PATH=...
If so, how should they be configured? I am adding the Dockerfile below for quick reference:
FROM node:14.18.1
RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add - \
&& sh -c 'echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google.list' \
&& apt-get update \
&& apt-get install -y google-chrome-stable fonts-ipafont-gothic fonts-wqy-zenhei fonts-thai-tlwg fonts-kacst fonts-freefont-ttf \
--no-install-recommends \
&& rm -rf /var/lib/apt/lists/*
USER root
RUN npm config set user root
RUN npm install -g npm@8.1.3
RUN npm install -g @luminati-io/luminati-proxy
ENV DOCKER 1
CMD ["luminati", "--help"]
I appreciate the insight!
TL;DR:
You should instead leverage an S3 bucket to store, read, and modify any files. All Lambdas and microservices, in general, should always be treated as stateless.
All Luminati-proxy functionality comes prebuilt within Amazon Lambdas and API Gateway.
Lambda functions are not meant to run long-running processes, as they are limited to 15 minutes maximum, so the container you are trying to run on Lambda has to take AWS serverless architecture considerations into account in its design.
Explanation:
According to the AWS documentation for Lambda functions:
The container image must be able to run on a read-only file system. Your function code can access a writable /tmp directory with 512 MB of storage.
Since containers based on Linux images are already supposed to have a /tmp folder, you should pretty much be able to access that folder at any time from your code (remember, the rest of the file system is read-only).
If you are looking to store content, Amazon's solution is to have any content created and managed in an S3 bucket. Buckets are as easy to use as reading a file locally, but the content remains accessible after the Lambda instance finishes the workload.
Please refer to "Read file from aws s3 bucket using node fs" and "Upload a file to Amazon S3 with NodeJS" for more details on how to use an S3 bucket. There are plenty of ways to achieve it regardless of the language being used.
This is all based on a best practice promoted by AWS for their platform, where Lambdas remain stateless.
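As an illustration only (the bucket name is a placeholder, and the AWS CLI is just one of the many ways mentioned above), persisting a file written to /tmp could look like:
# Copy a file written to /tmp during the invocation to an S3 bucket so it survives the run
aws s3 cp /tmp/demo.txt s3://my-bucket/demo.txt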
AWS Lambda provides a /tmp folder for users to write files on Lambda. I don't know your full question context, but I hope this helps.
You can write files on AWS Lambda in the /tmp folder.
E.g. if I want to create a file demo.txt at runtime/programmatically using AWS Lambda, I can write the file to /tmp/demo.txt
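A rough way to reproduce the same constraint locally, assuming your image is tagged my-lambda-image (a placeholder name): run it with a read-only root filesystem and a writable /tmp, and confirm that writes succeed only under /tmp.
# Simulate Lambda's read-only root filesystem with a writable /tmp
docker run --rm --read-only --tmpfs /tmp my-lambda-image \
  sh -c 'echo hello > /tmp/demo.txt && cat /tmp/demo.txt'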

Generated next js chunks does not load on local environment

I recently joined a project written with Next Js and Squidex CMS.
The problem is I am not able to run it (properly) on my local environment.
The chunks are not loading and the images are not visible
I tried 2 ways
1 - Running the project with the same scripts on production pipeline
- clean `node_modules` and `.next` directories
- run `yarn build`
- run `yarn start`
The build is always successful, but when I try to run it with yarn start it does not load some JS chunks and assets (images in the static folder).
They are throwing 404 error on the console (please see the image)
I compared the hashes and they are fine.
My next config file is pretty empty.
const optimizedImages = require("next-optimized-images")
const withPlugins = require("next-compose-plugins")

module.exports = withPlugins([
  [optimizedImages, {
    responsive: {
      adapter: require('responsive-loader/sharp')
    }
  }]
])
I also tried with multiple next versions (9.5.0 and 10.1.3)
The issue was the directory name of the project.
The repository was named something like 'Some Project', and when you clone it via the SSH URL provided by the repository website, the directory on your computer ends up named 'Some%20Project'.
Example SSH command:
git@ssh.dev.provider.com:Some%20Project
In that case, the Node.js server is able to start and there is no error on the console until you launch your browser and try to reach localhost. Node.js is not able to serve files from the directory because of unsupported characters like % in the path.
This cost me a lot of hours to find out; I hope it helps.
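One simple way to avoid this (a sketch using the example URL above; the target folder name is up to you) is to give git clone an explicit directory so the %20 never ends up on disk:
# Clone into a plain folder name instead of the auto-generated "Some%20Project"
git clone git@ssh.dev.provider.com:Some%20Project some-project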

How to deploy node.js application in cyberpanel?

I have my application developed in node.js, and I have cyberpanel installed on my server. I have seen many examples of how to deploy a node application in cyberpanel, but I have doubts about how to view it from the browser.
So far I have the following configuration in vHost:
context / {
  type appserver
  location /FOLDER/FOLDER/PROJECT_FOLDER/dist
  binPath /usr/bin/node
  startupFile index.js
  appType node
  maxConns 100
}
My application runs perfectly on port 3000 when I run it from the console, but I need it to listen on port 80 through CyberPanel.
Does anyone have an idea how to do it?
Try the following steps. Essentially, the error lies in selecting the document root folder and allowing access to the application.
Create a website using the normal CyberPanel menu. [https://cyberpanel.net/docs/2-creating-website/]
Upload your Node.js files into the public_html folder of the website.
Enter the OpenLiteSpeed panel via port :7080 (you may need to enable the port on the firewall).
Navigate to Virtual Hosts > Your Domain > Context.
Select App Server; for location, using $VH_ROOT instead of the hardcoded path worked.
Additionally, don't forget to enable access to the site via access control by allowing all IPs (*).
context / {
  type appserver
  location $VH_ROOT/public_html/
  binPath /usr/bin/node
  appType node
  startupFile server.js   # this is the name of your startup file
  appserverEnv 1
  maxConns 100
  accessControl {
    allow *
  }
  rewrite {
  }
  addDefaultCharset off
}
Let me answer the question point by point.
First of all, CyberPanel by default only takes the app.js file as the core file to run the application.
Second, how do you change that default startup file?
context / {
  type appserver
  startupFile index.js   # NAME OF YOUR STARTUP FILE
  location /home/PROJECT_FOLDER/public_html/dist
  binPath /usr/bin/node
  appType node
  appserverEnv 1
  maxConns 100
  accessControl {
    allow *
  }
  rewrite {
  }
  addDefaultCharset off
}
location /FOLDER/FOLDER/PROJECT_FOLDER/dist
Note: this location parameter is the path to the directory containing your startup file; you can find it via the file manager. You cannot run TypeScript code directly here, so compile it to JavaScript with the tsc command and point the location parameter in the vhost config at the resulting dist folder.
Now the next question is how to run the application outside the console.
Create a website to deploy the project; use the link below for reference: click here
Issuing SSL for the website - link for reference
This is my folder structure for deployment: simply zip all the files, upload the archive to the CyberPanel file manager, and extract it there. You can see I have a dist folder that contains all the JavaScript files, as well as index.js, the main startup file.
Click on fix permissions in the file manager.
Go to the web terminal and install the node modules. How?
In the web terminal, type cd .. and press Enter.
There you have to find your project directory; you can use the ls command to list the files and folder structure.
My directory (after using cd ..) was: cd home/FOLDERNAME/public_html
At last, run your project through the terminal to check that it works.
Configure your vhost config file; below is a reference image.
The file you have to add to the vhost config is the one I provided above.
If your domain is set up correctly, you can view the API on your domain; otherwise you can click the preview button in CyberPanel.
Note: always run the code in the terminal first to check that it works.
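As a rough sketch of those terminal steps (FOLDERNAME and the path to index.js are placeholders; adjust them to your own layout):
# In the CyberPanel web terminal: go to the site folder, install dependencies,
# and run the startup file once to confirm the app works before touching the vhost config
cd /home/FOLDERNAME/public_html
npm install
node dist/index.js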

CORS policy error on file in same folder as HTML [duplicate]

I am getting the following error:
XMLHttpRequest cannot load file:///C:/Users/richa.agiwal/Desktop/get/rm_Library/templates/template_viewSettings.html. Cross origin requests are only supported for HTTP.
I realize that this question has been answered before, but I still have not found a solution to my problem. I tried running chrome.exe --allow-file-access-from-files from the command prompt, and moved the file to the local file system, but I still get the same error.
I appreciate any suggestions!
If you are doing something like writing HTML and JavaScript in a code editor on your personal computer, and testing the output in your browser, you will probably get error messages about cross-origin requests. Your browser will render HTML and run JavaScript, jQuery, and AngularJS without needing a server set up, but many web browsers are programmed to watch for cross-site attacks and will block requests. You don't want just anyone being able to read your hard drive from your web browser.
You can create a fully functioning web page using Notepad++ that runs JavaScript and frameworks like jQuery and AngularJS, and test everything just by using the Notepad++ menu item Run > Launch in Firefox. That's a nice, easy way to start creating a web page, but once you start creating anything more than layout, CSS, and simple page navigation, you need a local server set up on your machine.
Here are some options that I use.
Test your web page locally on Firefox, then deploy to your host.
or: Run a local server
Test on Firefox, Deploy to Host
Firefox currently allows Cross Origin Requests from files served from your hard drive
Your web hosting site will allow requests to files in folders as configured by the manifest file
Run a Local Server
Run a server on your computer, like Apache or Python
Python isn't a server, but it will run a simple server
Run a Local Server with Python
Get your IP address:
On Windows: Open up the 'Command Prompt'. All Programs, Accessories, Command Prompt
I always run the Command Prompt as Administrator. Right click the Command Prompt menu item and look for Run As Administrator
Type the command: ipconfig and hit Enter.
Look for: IPv4 Address . . . . . . . . 12.123.123.00
There are websites that will also display your IP address
If you don't have Python, download and install it.
Using the 'Command Prompt' you must go to the folder where the files are that you want to serve as a webpage.
If you need to get back to the C:\ root directory, type cd\
type cd Drive:\Folder\Folder\etc to get to the folder where your .Html file is (or php, etc)
Check the path. type: path at the command prompt. You must see the path to the folder where python is located. For example, if python is in C:\Python27, then you must see that address in the paths that are listed.
If the path to the Python directory is not in the path, you must set the path. type: help path and hit Enter. You will see help for path.
Type something like: path c:\python27;%path%
%path% keeps all your current paths. You don't want to wipe out all your current paths, just add a new path.
Create the new path FROM the folder where you want to serve the files.
Start the Python Server: Type: python -m SimpleHTTPServer port Where 'port' is the number of the port you want, for example python -m SimpleHTTPServer 1337
If you leave the port empty, it defaults to port 8000
If the Python server starts successfully, you will see a msg.
Run Your Web Application Locally
Open a browser
In the address line type: http://your IP address:port
http://xxx.xxx.x.x:1337 or http://xx.xxx.xxx.xx:8000 for the default
If the server is working, you will see a list of your files in the browser
Click the file you want to serve, and it should display.
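Note that SimpleHTTPServer is the Python 2 module name; on Python 3 the equivalent command is:
# Python 3 replacement for SimpleHTTPServer (serves the current folder on port 1337)
python -m http.server 1337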
More advanced solutions
Install a code editor, web server, and other services that are integrated.
You can install Apache, PHP, Python, SQL, Debuggers etc. all separately on your machine, and then spend lots of time trying to figure out how to make them all work together, or look for a solution that combines all those things.
I like using XAMPP with NetBeans IDE. You can also install WAMP which provides a User Interface for managing and integrating Apache and other services.
Simple Solution
If you are working with pure HTML/JS/CSS files, install this small server (link) app in Chrome. Open the app and point the file location to your project directory.
Go to the URL shown in the app.
Edit: Smarter solution using Gulp
Step 1: Install Gulp. Run the following commands in your terminal.
npm install gulp-cli -g
npm install gulp -D
Step 2: Inside your project directory create a file named gulpfile.js. Copy the following content inside it.
var gulp = require('gulp');
var bs = require('browser-sync').create();

// "serve" task: start a browser-sync static server for the current folder
// and reload the browser whenever any file in the project changes.
gulp.task('serve', [], () => {
  bs.init({
    server: {
      baseDir: "./",
    },
    port: 5000,
    reloadOnRestart: true,
    browser: "google chrome"
  });
  gulp.watch('./**/*').on('change', bs.reload);
});
Step 3: Install browser sync gulp plugin. Inside the same directory where gulpfile.js is present, run the following command
npm install browser-sync gulp --save-dev
Step 4: Start the server. Inside the same directory where gulpfile.js is present, run the following command
gulp serve
To add to Alan Wells's elaborate answer, here is a quick fix.
Run a Local Server
you can serve any folder in your computer with Serve
First, navigate using the command line into the folder you'd like to serve.
Then
npm i -g serve
serve
or if you'd like to test Serve without downloading it
npx serve
and that's it! You can view your files at http://localhost:5000
If you are using VS Code, you can easily start a live server. Click Live Server at the bottom of the window; once the server is started, VS Code will tell you the port the project is running on. Do ensure your project folder is the workspace.
This error is happening because you are just opening html documents directly from the browser. To fix this you will need to serve your code from a webserver and access it on localhost. If you have Apache setup, use it to serve your files. Some IDE's have built in web servers, like JetBrains IDE's, Eclipse...
If you have Node.Js setup then you can use http-server. Just run npm install http-server -g and you will be able to use it in terminal like http-server C:\location\to\app.
If you use the WebStorm Javascript IDE, you can just open your project from WebStorm in your browser. WebStorm will automatically start a server and you won't get any of these errors anymore, because you are now accessing the files with the allowed/supported protocols (HTTP).
I was facing this error while I deployed my Web API project locally and I was calling API project only with this URL given below:
localhost//myAPIProject
Since the error message says it is not http://, I changed the URL and added an http prefix as shown below, and the error was gone.
http://localhost//myAPIProject
Depending on your needs, there is also a quick way to temporarily check your (dummy) JSON: save your JSON on http://myjson.com, copy the API link, and paste it into your JavaScript code. Voila! When you deploy the code, you must not forget to change that URL in your code!
