Difficulty with Python, Flask, JavaScript web app for microphone audio frequency

I'm trying to create a web page that shows the audio frequency from a microphone input, using a Python/Flask backend and an HTML/JavaScript frontend on CentOS 7 with an Apache server. The backend is started from the command line and I'm using a reverse proxy.
Website URL
The URL resolves to an index.html file in the tuner/template folder. When navigated to on the web, the index file simply shows the text "Loading...".
Some errors I get:
In the browser inspector:
(https://i.stack.imgur.com/j4DDF.jpg)
The error keeps repeating, I imagine because I coded it to update repeatedly.
I don't have a /var/log/httpd folder. In the log folder, the file messages shows:
Feb 7 19:06:10 1 kernel: Firewall: TCP_IN Blocked IN=eth0 OUT= MAC=c4:37:72:d5:fa:9b:d0:07:ca:5a:5d:21:08:00 SRC=209.159.151.30 DST=104.237.8.87 LEN=40 TOS=0x00 PREC=0x00 TTL=248 ID=11415 PROTO=TCP SPT=47477 DPT=3389 WINDOW=1024 RES=0x00 SYN URGP=0
Feb 7 19:06:11 1 kernel: Firewall: UDP_IN Blocked IN=eth0 OUT= MAC=ff:ff:ff:ff:ff:ff:c4:37:72:0f:3a:c6:08:00 SRC=104.237.8.160 DST=104.237.8.255 LEN=49 TOS=0x00 PREC=0x00 TTL=64 ID=40819 DF PROTO=UDP SPT=47903 DPT=32412 LEN=29
My app.py file in the tuner folder is as follows:
from flask import Flask, render_template, request, jsonify
import pyaudio
import numpy as np

app = Flask(__name__, template_folder='/home/spheres/public_html/tuner/template')

@app.route('/')
def index():
    return render_template('index.html')

@app.route('/frequency', methods=['GET'])
def get_frequency():
    # Audio configuration
    CHUNK = 1024
    FORMAT = pyaudio.paInt16
    CHANNELS = 1
    RATE = 44100

    # PyAudio object
    p = pyaudio.PyAudio()

    # Stream object
    stream = p.open(format=FORMAT,
                    channels=CHANNELS,
                    rate=RATE,
                    input=True,
                    frames_per_buffer=CHUNK)

    # Audio data
    data = stream.read(CHUNK)
    data = np.frombuffer(data, dtype=np.int16)

    # Estimate the dominant frequency from the FFT peak
    frequency = np.fft.fft(data)
    frequency = np.abs(frequency[:int(len(frequency) / 2)])
    frequency = np.argmax(frequency) * RATE / CHUNK

    # Close stream
    stream.stop_stream()
    stream.close()

    # Terminate PyAudio
    p.terminate()

    return jsonify({'frequency': frequency})

if __name__ == '__main__':
    app.run(debug=True)
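The FFT peak-picking math in get_frequency() can be sanity-checked offline with a synthetic tone; this is a sketch independent of PyAudio and the server setup:

```python
import numpy as np

# Offline check of the peak-picking formula used in get_frequency(),
# feeding a synthetic tone instead of a live PyAudio stream.
RATE, CHUNK = 44100, 1024
t = np.arange(CHUNK) / RATE
data = np.sin(2 * np.pi * 440.0 * t)  # 440 Hz test tone

spectrum = np.abs(np.fft.fft(data)[:CHUNK // 2])
frequency = np.argmax(spectrum) * RATE / CHUNK  # same formula as app.py

# The FFT bin resolution is RATE/CHUNK (about 43 Hz), so the estimate can
# only land on a multiple of that; expect roughly 430.7 Hz for 440 Hz input.
```

This also shows why the readout will be coarse at low frequencies: with CHUNK = 1024 the resolution is about 43 Hz, which is wider than the spacing between adjacent guitar-string fundamentals.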
The reverse proxy configuration I put in /usr/local/apache/conf/httpd.conf:
<VirtualHost 104.237.8.87:443>
    ServerName soundsphere.icu
    ServerAdmin soundsphere.icu@gmail.com
    DocumentRoot /home/spheres/public_html

    ProxyPass /tuner/ http://localhost:5000/
    ProxyPassReverse /tuner/ http://localhost:5000/
    ProxyPreserveHost On

    SSLEngine on
    SSLCertificateFile /etc/pki/tls/certs/soundsphere.icu.cert
    SSLCertificateKeyFile /etc/pki/tls/private/soundsphere.icu.key
</VirtualHost>
The index.html file is as follows:
<!DOCTYPE html>
<html>
<head>
  <script>
    function updateFrequency() {
      // Send a GET request to the '/frequency' endpoint
      var xhttp = new XMLHttpRequest();
      xhttp.onreadystatechange = function() {
        if (this.readyState == 4 && this.status == 200) {
          // Update the frequency readout with the response from the endpoint
          document.getElementById("frequencyReadout").innerHTML = this.responseText;
        }
      };
      xhttp.open("GET", "https://soundsphere.icu/tuner/frequency", true);
      xhttp.send();
      console.log("This is a debugging statement");
    }
  </script>
  <script>
    // Legacy, vendor-prefixed getUserMedia API
    navigator.getUserMedia = (navigator.getUserMedia ||
                              navigator.webkitGetUserMedia ||
                              navigator.mozGetUserMedia ||
                              navigator.msGetUserMedia);
    navigator.getUserMedia({audio: true}, function(stream) {
      // Success!
    }, function(error) {
      console.error(error);
    });
  </script>
</head>
<body onload="setInterval(updateFrequency, 100)">
  <div id="frequencyReadout">Loading...</div>
</body>
</html>
I'm seeing this in my terminal:
ALSA lib confmisc.c:767:(parse_card) cannot find card '0'
ALSA lib conf.c:4568:(_snd_config_evaluate) function snd_func_card_driver returned error: No such file or directory
ALSA lib confmisc.c:392:(snd_func_concat) error evaluating strings
ALSA lib conf.c:4568:(_snd_config_evaluate) function snd_func_concat returned error: No such file or directory
ALSA lib confmisc.c:1246:(snd_func_refer) error evaluating name
ALSA lib conf.c:4568:(_snd_config_evaluate) function snd_func_refer returned error: No such file or directory
ALSA lib conf.c:5047:(snd_config_expand) Evaluate error: No such file or directory
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM sysdefault
ALSA lib confmisc.c:767:(parse_card) cannot find card '0'
ALSA lib conf.c:4568:(_snd_config_evaluate) function snd_func_card_driver returned error: No such file or directory
ALSA lib confmisc.c:392:(snd_func_concat) error evaluating strings
ALSA lib conf.c:4568:(_snd_config_evaluate) function snd_func_concat returned error: No such file or directory
ALSA lib confmisc.c:1246:(snd_func_refer) error evaluating name
ALSA lib conf.c:4568:(_snd_config_evaluate) function snd_func_refer returned error: No such file or directory
ALSA lib conf.c:5047:(snd_config_expand) Evaluate error: No such file or directory
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM sysdefault
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.front
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.rear
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.center_lfe
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.side
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.surround21
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.surround21
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.surround40
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.surround41
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.surround50
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.surround51
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.surround71
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.iec958
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.iec958
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.iec958
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.hdmi
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.hdmi
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.modem
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.modem
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.phoneline
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.phoneline
ALSA lib pulse.c:243:(pulse_connect) PulseAudio: Unable to connect: Connection refused
Thank you for any assistance you can lend.
I had hoped to get a frequency readout.

Related

Why does ffmpeg fail to process streams?

I am struggling with an ffmpeg media conversion script.
I am using the fluent-ffmpeg library with node.js.
My app is supposed to receive a stream as input, resize it using ffmpeg, and then output a stream.
However, I am absolutely unable to process an input stream with ffmpeg, even when specifying the input format (ffmpeg's -f option).
However, when executing the exact same ffmpeg command on an mp4 file (without extension), it works and converts the media properly!
Working code (no stream)
import * as ffmpeg from 'fluent-ffmpeg';

ffmpeg('myMp4File')
  .inputFormat('mp4')
  .audioCodec('aac')
  .videoCodec('libx264')
  .format('avi')
  .size('960x540')
  .save('mySmallAviFile');
Failing code (using stream)
import * as ffmpeg from 'fluent-ffmpeg';
import { createReadStream } from 'fs';

ffmpeg(createReadStream('myMp4File'))
  .inputFormat('mp4')
  .audioCodec('aac')
  .videoCodec('libx264')
  .format('avi')
  .size('960x540')
  .save('mySmallAviFile');
It generates the following ffmpeg's error:
Error: ffmpeg exited with code 1: pipe:0: Invalid data found when processing input
Cannot determine format of input stream 0:0 after EOF
Error marking filters as finished
Conversion failed!
This error explicitly says that ffmpeg could not identify the input format, in spite of the -f mp4 argument.
I read pages and pages of ffmpeg's man page but could not find any relevant information concerning my issue.
Complementary information
Here is the output of command._getArguments(), showing the full ffmpeg command baked by the library:
[
'-f', 'mp4',
'-i', 'pipe:0',
'-y', '-acodec',
'aac', '-vcodec',
'libx264', '-filter:v',
'scale=w=960:h=540', '-f',
'avi', 'mySmallAviFile'
]
So the full ffmpeg command is the following:
ffmpeg -f mp4 -i pipe:0 -y -acodec aac -vcodec libx264 -filter:v scale=w=960:h=540 -f avi mySmallAviFile
I was getting the same error, but only for files which had the moov atom metadata at the end of the file.
After moving the moov atom to the beginning of the file with:
ffmpeg -i input.mp4 -movflags faststart out.mp4
the error disappeared.
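For batch use, the faststart remux above can be driven from a script. A minimal Python sketch; the file names are illustrative, and -c copy is an optional addition so the file is remuxed without re-encoding:

```python
import subprocess

def faststart_cmd(src, dst):
    # Move the moov atom to the front of the file so ffmpeg can read it
    # from a pipe; '-c copy' remuxes without re-encoding (fast, lossless).
    return ["ffmpeg", "-y", "-i", src, "-movflags", "faststart", "-c", "copy", dst]

# Requires ffmpeg on PATH:
# subprocess.run(faststart_cmd("input.mp4", "out.mp4"), check=True)
```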

reading in a file from ubuntu (AWS EC2) on local machine?

I have a python script which I'm running on AWS (EC2 instance with Ubuntu). This python script outputs a JSON file daily, to a directory in /home/ubuntu:
with open("/home/ubuntu/bandsintown/sf_events.json", "w") as writeJSON:
    file_str = json.dumps(allEvents, sort_keys=True)
    file_str = "var sf_events = " + file_str
    writeJSON.write(file_str)
All works as expected here. My issue is that I'm unsure how to read this JSON (existing on Ubuntu) into a JavaScript file that I'm running on my local machine.
JavaScript can't find the file if I reference it by its Ubuntu path:
<script src="/home/ubuntu/bandsintown/sf_events.json"></script>
In other words, I'd like to read the JSON that I've created in the cloud into a file that exists on my local machine. Should I output the JSON somewhere other than /home/ubuntu? Or can my local file somehow recognize /home/ubuntu as a file location?
Thanks in advance.
The problem occurs because the file does not exist on your local machine, only on the running EC2 instance.
A possible solution is to upload the JSON file from EC2 instance to S3 and afterward download the JSON file to your local machine /home/ubuntu/bandsintown/sf_events.json.
First, install the AWS CLI toolkit on the running EC2 instance and run the following commands in the terminal:
aws configure
aws s3 cp /home/ubuntu/bandsintown/sf_events.json s3://mybucket/sf_events.json
Or install the Python AWS SDK (boto3) and upload it via Python:
import boto3

s3 = boto3.resource('s3')

def upload_file_to_s3(s3_path, local_path):
    bucket = s3_path.split('/')[2]  # bucket is always the second element: s3://bucket/...
    file_path = '/'.join(s3_path.split('/')[3:])
    response = s3.Object(bucket, file_path).upload_file(local_path)
    return response

s3_path = "s3://mybucket/sf_events.json"
local_path = "/home/ubuntu/bandsintown/sf_events.json"
upload_file_to_s3(s3_path, local_path)
Then, on your local machine, download the file from S3 via the AWS CLI:
aws configure
aws s3 cp s3://mybucket/sf_events.json /home/ubuntu/bandsintown/sf_events.json
Or if you prefer python SDK:
import boto3

s3 = boto3.resource('s3')

def download_file_from_s3(s3_path, local_path):
    bucket = s3_path.split('/')[2]  # bucket is always the second element: s3://bucket/...
    file_path = '/'.join(s3_path.split('/')[3:])
    s3.Object(bucket, file_path).download_file(local_path)

s3_path = "s3://mybucket/sf_events.json"
local_path = "/home/ubuntu/bandsintown/sf_events.json"
download_file_from_s3(s3_path, local_path)
Or use the JavaScript SDK running inside the browser, but I would not recommend this because you must make your bucket public and also take care of browser compatibility issues.
You can use AWS S3.
You can run one Python script on your instance which uploads the JSON file to S3 whenever it gets generated, and another Python script on your local machine which either uses an SQS queue plus an S3 download, or simply downloads the latest file uploaded to the S3 bucket.
Case 1:
Whenever the JSON file gets uploaded to S3, you will get a message in the SQS queue saying the file has been uploaded, and then the file gets downloaded to your local machine.
Case 2:
Whenever the JSON file gets uploaded to S3, you can run the download script, which downloads the latest JSON file.
upload.py:
import boto3
import os

def upload_files(path):
    session = boto3.Session(
        aws_access_key_id='your access key id',
        aws_secret_access_key='your secret key id',
        region_name='region'
    )
    s3 = session.resource('s3')
    bucket = s3.Bucket('bucket name')
    for subdir, dirs, files in os.walk(path):
        for file in files:
            full_path = os.path.join(subdir, file)
            print(full_path[len(path):])
            with open(full_path, 'rb') as data:
                # Use the path relative to the walked root as the S3 key
                bucket.put_object(Key=full_path[len(path):], Body=data)

if __name__ == "__main__":
    upload_files('your path, which in your case is /home/ubuntu/')
Your other scripts on the local machine:
download1.py, with an SQS queue:
import boto3
import botocore
from logzero import logger

s3_resource = boto3.resource('s3')
sqs_client = boto3.client('sqs')

### Queue URL
queue_url = 'queue url'

### AWS S3 bucket
bucketName = "your bucket-name"

### Receive the message from the SQS queue
response_message = sqs_client.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=1,
    MessageAttributeNames=['All'],
)
message = response_message['Messages'][0]
receipt_handle = message['ReceiptHandle']
messageid = message['MessageId']
filename = message['Body']

try:
    s3_resource.Bucket(bucketName).download_file(filename, filename)
except botocore.exceptions.ClientError as e:
    if e.response['Error']['Code'] == '404':
        logger.info("The object does not exist.")
    else:
        raise

logger.info("File Downloaded")
download2.py, downloading the latest file from S3:
import boto3

### S3 connection
s3_resource = boto3.resource('s3')
s3_client = boto3.client('s3')

bucketName = 'your bucket-name'
response = s3_client.list_objects_v2(Bucket=bucketName)
all_objects = response['Contents']
latest = max(all_objects, key=lambda x: x['LastModified'])
key = latest['Key']

print("downloading file")
s3_resource.Bucket(bucketName).download_file(key, key)
print("file downloaded")
You basically need to copy a file from a remote machine to your local one. The simplest way is to use scp. In the following example it just copies to your current directory. If you are on Windows, open PowerShell; if you are on Linux, scp should already be installed.
scp <username>@<your ec2 instance host or IP>:/home/ubuntu/bandsintown/sf_events.json ./
Run the command, enter your password, done. It's the same way you use ssh to connect to your remote machine. (I believe your username would be ubuntu.)
A more advanced method would be mounting your remote directory via SSHFS. It is a little cumbersome to set up, but then you have instant access to the remote files as if they were local.
And if you want to do it programmatically from Python, see this question.
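One programmatic route that needs nothing beyond the standard library is to build and run the same scp invocation from Python. A sketch; the key path, user, and host are placeholders to fill in:

```python
import subprocess

def scp_download(key_path, user, host, remote_path, local_dir="."):
    # Mirrors: scp -i <key> <user>@<host>:<remote_path> <local_dir>
    return ["scp", "-i", key_path, f"{user}@{host}:{remote_path}", local_dir]

# Example (placeholder host, requires SSH access):
# subprocess.run(scp_download("abc.pem", "ubuntu",
#                             "ec2-xx-xx-xxx-xxx.compute-1.amazonaws.com",
#                             "/home/ubuntu/bandsintown/sf_events.json"),
#                check=True)
```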
Copying files from local to EC2
Your private key must not be publicly visible. Run the following command so that only the root user can read the file.
chmod 400 yourPublicKeyFile.pem
To copy files between your computer and your instance you can use an FTP service like FileZilla or the command scp. “scp” means “secure copy”, which can copy files between computers on a network. You can use this tool in a Terminal on a Unix/Linux/Mac system.
To use scp with a key pair use the following command:
scp -i /directory/to/abc.pem /your/local/file/to/copy user@ec2-xx-xx-xxx-xxx.compute-1.amazonaws.com:path/to/file
You need to specify the correct Linux user. From Amazon:
For Amazon Linux, the user name is ec2-user.
For RHEL, the user name is ec2-user or root.
For Ubuntu, the user name is ubuntu or root.
For CentOS, the user name is centos.
For Fedora, the user name is ec2-user.
For SUSE, the user name is ec2-user or root.
Otherwise, if ec2-user and root don’t work, check with your AMI provider.
To use it without a key pair, just omit the flag -i and type in the password of the user when prompted.
Note: You need to make sure that the user “user” has the permission to write in the target directory. In this example, if ~/path/to/file was created by user “user”, it should be fine.
Copying files from EC2 to local
To use scp with a key pair use the following command:
scp -i /directory/to/abc.pem user@ec2-xx-xx-xxx-xxx.compute-1.amazonaws.com:path/to/file /your/local/directory/files/to/download
Hack 1: While downloading files from EC2, download a whole folder by archiving it first.
zip -r squash.zip /your/ec2/directory/
Hack 2: You can then download all archived files from EC2 with the command below.
scp -i /directory/to/abc.pem user@ec2-xx-xx-xxx-xxx.compute-1.amazonaws.com:~/* /your/local/directory/files/to/download
Have you thought about using EFS for this? You can mount EFS on EC2 as well as on your local machine over a VPN or Direct Connect. Could you save the file on EFS so both sources can access it?
Hope this helps.

Error: EROFS: read-only file system while streaming the xlsx content in Lambda

I am using the xlsx library to parse an Excel document to get the data as sheet per file, row per file, column per file, etc.
While processing inside AWS Lambda I get the error stack below:
{"errorType":"Runtime.UnhandledPromiseRejection","errorMessage":"Error: EROFS: read-only file system, open 'Dist Share Summary.xlsx'","reason":{"errorType":"Error","errorMessage":"EROFS: read-only file system, open 'Dist Share Summary.xlsx'","code":"EROFS","errno":-30,"syscall":"open","path":"Dist Share Summary.xlsx","stack":["Error: EROFS: read-only file system, open 'Dist Share Summary.xlsx'"," at Object.openSync (fs.js:443:3)"," at Object.writeFileSync (fs.js:1194:35)"," at write_dl (/var/task/node_modules/xlsx/xlsx.js:2593:112)"," at write_zip_type (/var/task/node_modules/xlsx/xlsx.js:20730:31)"," at writeSync (/var/task/node_modules/xlsx/xlsx.js:20818:22)"," at Object.writeFileSync (/var/task/node_modules/xlsx/xlsx.js:20841:9)"," at workbook.SheetNames.forEach.element (/var/task/index.js:31:26)"," at Array.forEach ()"," at getParsedData (/var/task/index.js:27:32)"," at Parsing (/var/task/index.js:20:32)"]},"promise":{},"stack":["Runtime.UnhandledPromiseRejection: Error: EROFS: read-only file system, open 'Dist Share Summary.xlsx'"," at process.on (/var/runtime/index.js:37:15)"," at process.emit (events.js:198:13)"," at process.EventEmitter.emit (domain.js:448:20)"," at emitPromiseRejectionWarnings (internal/process/promises.js:140:18)"," at process._tickCallback (internal/process/next_tick.js:69:34)"]}
The Lambda file system is read-only aside from /tmp, where you have up to 500 MB to use. Don't forget to remove the file when you're done; if the container is reused, the file will still be there and you'll run out of space over time.
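The failing handler here is Node, but the /tmp rule applies to any Lambda runtime. A Python sketch of the write-use-delete pattern, with an illustrative file name:

```python
import os
import tempfile

# In Lambda only /tmp is writable; tempfile.gettempdir() resolves there.
path = os.path.join(tempfile.gettempdir(), "Dist Share Summary.xlsx")
with open(path, "wb") as f:
    f.write(b"placeholder workbook bytes")  # stand-in for the real xlsx output

# ... parse or upload the file here ...

os.remove(path)  # clean up so a reused container doesn't fill /tmp over time
```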

Calling flask via javascript run on an https external site

There is an external site that runs https. I have a UI-based "server" that I can run JavaScript from.
I want that site to send an XMLHttpRequest() via my JavaScript running on their site to my Flask-RESTful app running elsewhere.
This code works, but only on http sites; their site is https.
I am currently running this code in the Chrome console while on their site.
var xhttp = new XMLHttpRequest();
xhttp.open("PUT", "http://[my_server]:5000/todos/todo3", true);
xhttp.setRequestHeader("Content-Type", "application/json;charset=UTF-8");
xhttp.send(JSON.stringify({ "task": "new task2" }));
So I get the error:
This request has been blocked; the content must be served over HTTPS.
If I run it with https I get the error:
OPTIONS https://[my_flask_server]:5000/todos/todo3 net::ERR_SSL_PROTOCOL_ERROR
My Flask server also prints out:
code 400, message Bad request syntax ('\x16...
How do I enable my flask site to allow this SSL call?
I have enabled CORS like this:
from flask import Flask
from flask_restful import reqparse, abort, Api, Resource
from flask_cors import CORS, cross_origin
app = Flask(__name__)
CORS(app)
api = Api(app)
EDIT 1
I am trying to add ssl capabilities to my server.
I have made a host.cert file and a host.key file by using this:
openssl genrsa 1024 > host.key
chmod 400 host.key
openssl req -new -x509 -nodes -sha1 -days 365 -key host.key -out host.cert
source: https://serverfault.com/questions/224122/what-is-crt-and-key-and-how-can-i-generate-them
And I updated my Flask app.py such that the top looks like this:
from flask import Flask
from flask_restful import reqparse, abort, Api, Resource
from flask_cors import CORS, cross_origin
from OpenSSL import SSL
context = SSL.Context(SSL.SSLv23_METHOD)
context.use_privatekey_file('host.key')
context.use_certificate_file('host.cert')
app = Flask(__name__)
CORS(app)
api = Api(app)
source: http://flask.pocoo.org/snippets/111/
When I run python app.py, it now errors out.
Current error:
Traceback (most recent call last):
File "app.py", line 72, in <module>
app.run(host="[my_server]", port="5000", ssl_context=context)
File "/usr/local/lib/python2.7/dist-packages/flask/app.py", line 841, in run
run_simple(host, port, self, **options)
File "/usr/local/lib/python2.7/dist-packages/werkzeug/serving.py", line 742, in run_simple
inner()
File "/usr/local/lib/python2.7/dist-packages/werkzeug/serving.py", line 702, in inner
fd=fd)
File "/usr/local/lib/python2.7/dist-packages/werkzeug/serving.py", line 596, in make_server
passthrough_errors, ssl_context, fd=fd)
File "/usr/local/lib/python2.7/dist-packages/werkzeug/serving.py", line 528, in __init__
self.socket = ssl_context.wrap_socket(sock, server_side=True)
AttributeError: 'OpenSSL.SSL.Context' object has no attribute 'wrap_socket'
EDIT 2: (didn't work)
sudo pip install 'Werkzeug==0.9.6'
Changes the issue to:
Traceback (most recent call last):
File "/usr/lib/python2.7/logging/__init__.py", line 851, in emit
msg = self.format(record)
File "/usr/lib/python2.7/logging/__init__.py", line 724, in format
return fmt.format(record)
File "/usr/lib/python2.7/logging/__init__.py", line 464, in format
record.message = record.getMessage()
File "/usr/lib/python2.7/logging/__init__.py", line 328, in getMessage
msg = msg % self.args
TypeError: %d format: a number is required, not str
Logged from file _internal.py, line 87
EDIT 3: PROGRESS
Using just
from OpenSSL import SSL
...
and
...
if __name__ == '__main__':
    context = ('host.crt', 'host.key')
    app.run(host...
has changed the console-side error from:
OPTIONS https://[my_flask_server]:5000/todos/todo3 net::ERR_SSL_PROTOCOL_ERROR
to
OPTIONS https://[my_flask_server]:5000/todos/todo3 net::ERR_INSECURE_RESPONSE
EDIT 4
At this point it's just browser-side security checks, since my SSL certificate is not recognized / weak. I think I've got it working as well as I can; I'm going to try to get it properly authorized, I guess.
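For reference, the earlier wrap_socket error came from passing a pyOpenSSL SSL.Context, which Werkzeug cannot wrap; Werkzeug instead accepts a standard-library ssl.SSLContext (or the ('cert', 'key') tuple used above). A sketch, with the certificate file names from the question as placeholders:

```python
import ssl

# Werkzeug can wrap sockets with a stdlib SSLContext directly.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
# context.load_cert_chain('host.crt', 'host.key')  # uncomment with real files

# app.run(host='0.0.0.0', port=5000, ssl_context=context)
```

A self-signed certificate will still trigger ERR_INSECURE_RESPONSE in the browser; only a certificate from a trusted CA removes that warning.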

Wscript.shell.run and Wscript.shell.exec do not display the same outputs

I use an ActiveXObject in JavaScript.
var shell = new ActiveXObject("WScript.Shell");
exec = shell.exec('cmd /c ftp -i -A -s:file.ftp host');
var output = exec.StdOut.ReadAll();
I'm getting the expected error "Could not create file" because the file already exists on the server. Everything's OK here. But the output doesn't display the error codes of ftp, while the Run method does (553 Could not create file).
I don't use the Run method because its only output option is redirecting the output into a file on the client machine.
Trust me, I read a lot of websites (including the official Windows descriptions of Run and Exec).
How can I get the error codes of the ftp command using WScript.Shell.Exec?
More info:
exec.StdOut.ReadAll() output ->
"bin
cd my_dir/
mput file_path file_path
Could not create file.
Could not create file.
quit"
output file from WScript.shell.run ->
ftp> bin
200 Switching to Binary mode.
ftp> cd my_dir/
250 Directory successfully changed.
ftp> mput file_path file_path
200 PORT command successful. Consider using PASV.
553 Could not create file.
200 PORT command successful. Consider using PASV.
553 Could not create file.
ftp> quit
221 Goodbye.
OK, I found the solution in an 8-year-old post (http://computer-programming-forum.com/61-wsh/813f07658378176c.htm).
In order to get the complete output from the ftp command using the Exec method, the -v option has to be added to the ftp command.
Works like a charm.
var shell = new ActiveXObject("WScript.Shell");
exec = shell.exec('cmd /c ftp -i -v -A -s:file.ftp host');
var output = exec.StdOut.ReadAll();
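For comparison, the same "capture everything the child prints" problem in Python is handled by reading both streams of the subprocess. A small sketch; a trivial child process stands in for the ftp command:

```python
import subprocess
import sys

# A stand-in child that, like ftp, writes some lines to stdout and
# others (e.g. error codes) to stderr.
child = [sys.executable, "-c",
         "import sys; print('200 OK'); "
         "print('553 Could not create file', file=sys.stderr)"]

proc = subprocess.run(child, capture_output=True, text=True)
output = proc.stdout + proc.stderr  # merge, so no replies are lost
```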
