This is what I did in the first place:
docker run --name=nodesetup -it --mount type=bind,source="$(pwd)",target=/usr/src/app -w /usr/src/app node bash
Installing mysql2 works fine:
npm i mysql2
But when I run
npx create-react-app test-client
sh: 1: create-react-app: Permission denied
I am root inside the bind-mounted directory (prompt: root@83d263a01dc7:/usr/src/app/test-app#).
How can I solve this?
It's an issue with permissions on the host directory that you map; exactly what the issue is, I can't figure out. The container runs as root, so it should have access.
But if you run your container with the UID and GID of your user on the host, by adding -u $(id -u):$(id -g) as an option, it works. Like this:
docker run --name=nodesetup -it --mount type=bind,source="$(pwd)",target=/usr/src/app -w /usr/src/app -u $(id -u):$(id -g) node bash
This also has the advantage that the files created on the host are owned by you, making it easier to edit them, etc.
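To see what -u will actually pass in, you can compare those IDs with the owner of the bind-mount source. A minimal sketch using standard POSIX tools (no Docker needed):

```shell
# Print the UID:GID that -u "$(id -u):$(id -g)" hands to docker run.
uid=$(id -u)
gid=$(id -g)
echo "container process would run as ${uid}:${gid}"
# Numeric owner and group of the current directory (the bind-mount source):
ls -dln . | awk '{print "host directory is owned by " $3 ":" $4}'
```

If the two differ, files the container creates as root won't be writable by your host user, which is exactly what the -u option avoids.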
Related
I have a NestJS application that I am trying to run in a Docker container. It all worked fine until recently, when I started getting the error Parse Error: Missing expected CR after header value. I'm making an HTTP GET request to the web server of an IoT device. The error occurs only when I run the server app inside a Docker container; when I run it locally on a Windows or macOS machine, everything works fine.
I tried different Node.js versions in Docker: 14, 16, and 18. The error always comes up, regardless of which version I use. I have no idea how to debug it, since it only occurs when the app is served from a Docker container.
This is my dockerfile:
FROM node:slim
RUN mkdir -p /app
WORKDIR /app
COPY src .
COPY package.json .
RUN apt update && apt install python3 make g++ -y
RUN npm install --force
EXPOSE 3000
CMD ["npm", "run", "start:dev"]
According to this issue, stricter header parsing was implemented in Node v14.20.0, v16.16.0, and v18.5.0; it fixes a vulnerability in the HTTP parser of earlier versions.
Comments below the issue suggest various workarounds if the misbehaving peer can't be fixed, the easiest of which seems to be setting the insecureHTTPParser flag to true when creating the HTTP server.
Alternatively, revert to an older minor version of each of the Node.js versions mentioned above.
I have this Dockerfile
FROM node:14
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
EXPOSE 3000
CMD [ "node", "app.mjs" ]
And I can successfully run
docker run hello-world
on my ubuntu 20.10 OS.
So I am assuming that docker is installed successfully.
But when I tried to run
docker build .
It gives me this error
This is not an npm problem; I can install the dependencies locally without any issue. I assume Docker can't reach the npm registry to pull the packages. Something to do with networking, I guess.
How do I fix this issue?
This is my code
https://github.com/Enuri-Information-Systems/docker-test
Depending on the network configuration, it can happen that the container is unable to connect to a server to download data. This can have various causes and also depends on how you run Docker.
One common cause is a network that requires a proxy. From the error message alone, it's not possible to tell what the exact reason is in your case.
One solution is to specify the network mode: adding --network host as an option should work most of the time.
So running your build command as docker build --network=host . should work.
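If the problem turns out to be a proxy, another option is to pass the host's proxy settings into the build; docker build treats HTTP_PROXY, HTTPS_PROXY, and NO_PROXY as predefined build args, so no ARG declaration is needed in the Dockerfile. A sketch (assumes the proxy variables are set in your shell):

```shell
# Show whether this shell has proxy variables configured at all.
env | grep -i _proxy || echo "no proxy variables set in this shell"

# The build would then be invoked like this (commented out: needs a Docker daemon):
# docker build --network=host \
#   --build-arg HTTP_PROXY="$HTTP_PROXY" \
#   --build-arg HTTPS_PROXY="$HTTPS_PROXY" .
```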
I want to scrape dynamic content with Splash and Scrapy. When I follow the installation documentation for Linux (https://splash.readthedocs.io/en/stable/),
I am not able to run the image with the command:
docker run -p 8050:8050 scrapinghub/splash
I get the following error:
python3: can't open file '/app/bin/splash': [Errno 13] Permission denied
I don't know where to find that file to change its permissions.
Thank you.
When you installed Docker, did you add your user to the docker group?
If not, add it using
sudo usermod -aG docker <your-user-name>
and then run
docker run -it -p 8050:8050 scrapinghub/splash
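Note that usermod only affects new login sessions, so log out and back in (or run newgrp docker) first. A quick sketch to check whether the group change is visible in the current session:

```shell
# "docker" must appear in the current user's group list for the docker
# CLI to reach the daemon socket without sudo.
if id -nG | tr ' ' '\n' | grep -qx docker; then
  echo "user is in the docker group"
else
  echo "user is NOT in the docker group in this session"
fi
```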
I'm trying to understand why env variables inside my Docker container keep appearing when I've clearly removed or commented them out from my .env file. I'm fairly new to Docker and don't know if this is expected behavior or an anomaly.
The way my system is set up, I spin up an instance of Azure's IoT Edge server locally (via deployment.template.json), which builds the Docker container and populates the environment variables from the associated .env file.
Now what's perplexing me is that if I were to completely stop the server (not pause), comment out/remove the variable from the .env file, restart the server, and inspect the container (docker container inspect), I still see the variable name and value. I've also used docker system prune -a --volumes after stopping the server to prune my system and volumes, then restarted the server only to see the variable still listed.
Just in case it helps, inside my deployment.template.json I'm passing my variables as MY_VAR=${MY_VAR}. Then in my .env file I have the variable as MY_VAR=abc123.
From my Dockerfile:
# -------------
# Build Sources
# -------------
FROM node:10-alpine as builder
# Install additional git and openssh dependencies and make sure GitLab domain is accepted by SSH
RUN apk add --no-cache openssh git curl \
&& mkdir /root/.ssh/ \
&& touch /root/.ssh/known_hosts \
&& ssh-keyscan gitlab.com github.com >> /root/.ssh/known_hosts
WORKDIR /app
# Install app dependencies
RUN npm i -g typescript --no-cache --silent
COPY package*.json ./
RUN npm ci --only=production --silent
# Copy sources and build
COPY . .
RUN npm run build
# ----------------
# Production Image
# ----------------
FROM node:10-alpine
RUN apk add --no-cache curl
WORKDIR /app
COPY --from=builder /app/node_modules /app/node_modules
COPY --from=builder /app/dist /app/dist
COPY . .
USER node
CMD ["node", "dist/index.js"]
You can run docker inspect on the container to see which environment variables are defined in its create options, e.g. docker inspect --format '{{json .Config.Env}}' <container-name>.
You can also check the module's create options in the Azure Portal.
I am trying to set up a Node.js app inside Docker, using as host the Google Compute Engine VM gci-stable-55-8872-71-0 (Debian), from image project google-containers:
$ gcloud compute instances create myvm --image-project google-containers --image gci-stable-55-8872-71-0 --zone europe-west1-b --machine-type f1-micro --scopes compute-rw
then I try to get a docker container running:
$ sudo docker build -t forperfuse/test .
but I keep getting errors when installing node:
The command '/bin/sh -c npm install' returned a non-zero code: 1
All other dependencies install fine, but node and npm do not. I have tried several options but still cannot get it to work. Can you please help? Many thanks in advance.
I'm not sure what is going on; it looks like the RUN command in the Dockerfile is pointing at a shell with a weird header. If you can publish the files, we can take a look. Or:
You can use the Bitnami image, available for free in Cloud Launcher, which works like a charm:
https://console.cloud.google.com/launcher
There, search for the Node.js image.