I'm trying to formulate a POST request to the Pastebin API in Code by Zapier using fetch. After some trial and error, I managed to do it using URL-encoded data, like this:
const { URLSearchParams } = require('url');
const encodedParams = new URLSearchParams();
encodedParams.set('api_paste_code', 'test');
encodedParams.set('api_option', 'paste');
encodedParams.set('api_dev_key', '1NLa6pHlhuBYTMAb1GDiAYHa1xAy5Ihd');
let url = 'https://pastebin.com/api/api_post.php';
let options = {
method: 'POST',
headers: {'Content-Type': 'application/x-www-form-urlencoded'},
body: encodedParams.toString()
};
const fetchResult = await fetch(url, options);
output = {output: fetchResult}
I get the response 200 but the data I receive is something I've never seen in my life. It's something like this:
output
  url: https://pastebin.com/api/api_post.php
  status: 200
  statusText: OK
  headers:
    date: Fri, 08 Apr 2022 13:45:23 GMT
    content-type: text/html; charset=UTF-8
    transfer-encoding: chunked
    connection: close
    x-custom-api-dev-id: 8604170
    set-cookie: pastebin_posted=49abfc325ca96e5daf3e6b81de576fbdae7e584ab1dc300294f770a2200d6771a%3A2%3A%7Bi%3A0%3Bs%3A15%3A%22pastebin_posted%22%3Bi%3A1%3Bs%3A8%3A%22X6WmZQ3u%22%3B%7D; expires=Fri, 08-Apr-2022 14:45:23 GMT; Max-Age=3600; path=/; HttpOnly
    content-encoding: gzip
    cf-cache-status: DYNAMIC
    expect-ct: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
    server: cloudflare
    cf-ray: 6f8b716dfe295790-IAD
  ok: true
  body: (a Node stream object: _writeState, _readableState, _writableState, _transformState internals such as objectMode: false, highWaterMark: 16384, ended: false, destroyed: false, defaultEncoding: utf8, readable: true, writable: true, ...)
    writechunk: type: Buffer
      data: 31, 139, 8, 0, 0, 0, 0, 0, 0, 3, 203, 40, 41, 41, 40, 182, 210, 215, 47, 72, 44, 46, 73, 77, 202, 204, 211, 75, 206, 207, 213, 143, 48, 11, ...
And then it continues with these rows of numbers for thousands of lines. I tried to use decodeURIComponent, but the returned value is [object Object], which I can't parse into a string.
I suspect that converting encodedParams to a string before sending may be causing the problem, since in Node in my IDE and in Postman I don't use toString() and I get the correct response with the URL to my paste. But in Zapier, if I use the same code, I get a 422 "Unprocessable Entity" response. Only after converting encodedParams to a string did I manage to get a 200 response. I'm stuck.
I hope my question is formatted OK; this is my first time on Stack Overflow. Thanks!
This is actually working: you can see in your response that the status is 200. The problem is that you didn't extract the data out of the HTTP response.
You need to make this change to see the actual data:
const fetchResult = await fetch(url, options);
const data = await fetchResult.json();
output = {output: data}
You could also replace .json() with .text() if you prefer.
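For example, since the Pastebin API replies with a plain-text body (the paste URL on success, or an error message), reading it with .text() would look roughly like this:
const fetchResult = await fetch(url, options);
const text = await fetchResult.text(); // e.g. the URL of the new paste, as a string
output = {output: text};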
I hope this helps!
How to know the cryptocurrency used in a transaction through the transaction hash?
I'm using the following code:
<script src="https://cdn.jsdelivr.net/npm/web3@latest/dist/web3.min.js"></script>
...
if (typeof web3 !== 'undefined') {
web3 = new Web3(web3.currentProvider);
} else {
web3 = new Web3(new Web3.providers.HttpProvider("http://localhost:8545"));
}
var transaction = web3.eth.getTransaction('0x42ca92e05d21592e92e36c93ae6e31d0583c4c95cadd42901e297b55c6b465af');
console.log(transaction);
And I'm getting the object below:
Promise
[[Prototype]]: Promise
[[PromiseState]]: "fulfilled"
[[PromiseResult]]: Object
blockHash: "0x91a5799cac3de01498a4988d4e55bb29f6b56617490bf854ce5e1f21821db2e2"
blockNumber: 14832362
from: "0xdE4429D4cD2afa9C4314254F1cb074181f0e184e"
gas: 200000
gasPrice: "10000000000"
hash: "0x42ca92e05d21592e92e36c93ae6e31d0583c4c95cadd42901e297b55c6b465af"
input: "0x"
nonce: 12
r: "0xec1cb9f6b134c375542c4aaff07aa6ae8bd831918e52a609e77f227683524f67"
s: "0x798123f2eb75584430c2016cf91eeb384306291dcbc50eae64fc73c140cc972d"
to: "0x69AF47261C4585C365e4E1D93b635974a30fb117"
transactionIndex: 7
type: 0
v: "0x94"
value: "10000000000000000"
[[Prototype]]: Object
I can find the value, the recipient, and the origin, but I can't figure out where any information about the token used is.
I would be glad if someone could help me.
I do not think that you can get it from the transaction hash. This is the breakdown of a raw transaction:
0x
// https://eth.wiki/fundamentals/rlp for more on f8
// 0xf8-0xf7 = 1 byte = 2 chars, which tells you the next 2 chars are the length of the payload
f8
// length of payload. 0x69=105bytes=210 chars.
69
// Nonce. 1byte
12
//0x85-0x80=133-128= 5bytes=10chars= GAS PRICE
85 012a05f200
// 0x83-0x80=131-128=3bytes=6chars= GAS LIMIT
83 07a120
// 0x94 - 0x80=148-128=20bytes=40chars= TO
94 b75fe87f44cb2003f3472424261f021d2100ce5a
// this should be the value; an empty value/string is represented as 80
80
// 0x84-0x80=132-128=4bytes=8 chars= DATA
84 3ccfd60b
// "v" value. 1byte
2a
// a0-0x80=160-128=32 bytes=64 chars= "r" VALUE
a0 fce2a70cde55dc86a203d2387469c36da00e9a9f3536b867f6761c314b1f1423
// a0-0x80=160-128=32 bytes=64 chars= "s" VALUE
a0 7b85742fd48a687b8daa097d1fb9d4b5135de6deb5b6fd16dd0f28d6c28727b6
To get an ERC20 transfer's information, you need the transaction receipt, since transfer information is recorded in a Transfer event log. You should be using eth_getTransactionReceipt.
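For reference, a rough sketch of that with web3.js (assuming web3 is initialized as in the question; the Transfer event signature below is the standard ERC20 one, and the function name is just illustrative):
// Find ERC20 Transfer events in a transaction's receipt.
const TRANSFER_TOPIC = web3.utils.sha3('Transfer(address,address,uint256)');

async function getTokenTransfers(txHash) {
  const receipt = await web3.eth.getTransactionReceipt(txHash);
  // A plain ETH transfer (like the one in the question) has no logs at all.
  return receipt.logs
    .filter(log => log.topics[0] === TRANSFER_TOPIC)
    .map(log => ({
      token: log.address, // the ERC20 contract is the address that emitted the log
      from: web3.eth.abi.decodeParameter('address', log.topics[1]),
      to: web3.eth.abi.decodeParameter('address', log.topics[2]),
      value: web3.eth.abi.decodeParameter('uint256', log.data)
    }));
}

getTokenTransfers('0x42ca92e05d21592e92e36c93ae6e31d0583c4c95cadd42901e297b55c6b465af')
  .then(transfers => console.log(transfers)); // [] here, since no token was involved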
This post explains how to get it from transaction_receipt:
ERC20 Tokens Transferred Information from Transaction Hash
I am trying to plug individual ids into a function to pull log outputs for each id. These ids are inside objects inside a larger object. I cannot figure out how to call these ids individually in order to print out the log report for each object.
objects
(2) [{…}, {…}]
0: {id: "60844b1f5ce50a40c1adfc32", rdbmsJobId: null, sqoopJobId: "30471e65-c9b9-40e0-b27f-472769480f0c", status: "FINISHED", progress: 100, …}
1: {id: "60844b1f5ce50a40c1adfc33", rdbmsJobId: null, sqoopJobId: "1e74fb62-4a61-420b-9c78-e005007eb17e", status: "FINISHED", progress: 100, …}
length: 2
__proto__: Array(0)
Function
getLogDataById(id: string){
this.logService.getLogById(id).subscribe(logRes => {
this.logReport = logRes
console.log("Log report is: " + this.logReport)
})
}
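Roughly, I think I need something along these lines, where objects is the array shown above exposed on the component (not sure if this is the right approach):
getAllLogs() {
  // For each object in the array, call getLogDataById with that object's id
  this.objects.forEach(obj => this.getLogDataById(obj.id));
}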
Expected Output
{
"id": "60844b1f5ce50a40c1adfc34",
"jobId": "60844b1f5ce50a40c1adfc32",
"log": "2021-04-24 16:45:20,180 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7\n2021-04-24 16:45:21,084 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.\nWarning: /opt/sqoop/../hbase does not exist! HBase imports will fail.\nPlease set $HBASE_HOME to the root of your HBase installation.\nWarning: /opt/sqoop/../hcatalog does not exist! HCatalog jobs will fail.\nPlease set $HCAT_HOME to the root of your HCatalog installation.\nWarning: /opt/sqoop/../accumulo does not exist! Accumulo imports will fail.\nPlease set $ACCUMULO_HOME to the root of your Accumulo installation.\nWarning: /opt/sqoop/../zookeeper does not exist! Accumulo imports will fail.\nPlease set $ZOOKEEPER_HOME to the root of your Zookeeper installation.\nJob Created and Saved to Sqoop.\nSqoop Job Submitted to Yarn\n2021-04-24 16:46:27,894 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\n2021-04-24 16:46:28,051 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032\nContainer: container_1618958577187_0219_01_000001 on ip-172-31-50-48.us-west-2.compute.internal_43829\nLogAggregationType: AGGREGATED\n=====================================================================================================\nLogType:stderr\nLogLastModifiedTime:Sat Apr 24 16:46:26 +0000 2021\nLogLength:1722\nLogContents:\nApr 24, 2021 4:46:02 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register\nINFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver as a provider class\nApr 24, 2021 4:46:02 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register\nINFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class\nApr 24, 2021 4:46:02 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register\nINFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices as a root resource class\nApr 24, 2021 4:46:03 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate\nINFO: Initiating Jersey application, version 'Jersey: 1.19 02/11/2015 03:25 AM'\nApr 24, 2021 4:46:03 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider\nINFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope \"Singleton\"\nApr 24, 2021 4:46:04 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider\nINFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope \"Singleton\"\nApr 24, 2021 4:46:06 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider\nINFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices to GuiceManagedComponentProvider with the scope \"PerRequest\"\nlog4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).\nlog4j:WARN Please initialize the log4j system properly.\nlog4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.\n\nEnd of LogType:stderr\n***********************************************************************\n\n\nEnd of LogType:stdout\n***********************************************************************\n\nContainer: container_1618958577187_0219_01_000003 on ip-172-31-50-48.us-west-2.compute.internal_43829\nLogAggregationType: 
AGGREGATED\n=====================================================================================================\nLogType:stderr\nLogLastModifiedTime:Sat Apr 24 16:46:26 +0000 2021\nLogLength:227\nLogContents:\nLoading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.\n\nEnd of LogType:stderr\n***********************************************************************\n\n\nEnd of LogType:stdout\n***********************************************************************\n\nContainer: container_1618958577187_0219_01_000002 on ip-172-31-50-48.us-west-2.compute.internal_43829\nLogAggregationType: AGGREGATED\n=====================================================================================================\nLogType:stderr\nLogLastModifiedTime:Sat Apr 24 16:46:26 +0000 2021\nLogLength:227\nLogContents:\nLoading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.\n\nEnd of LogType:stderr\n***********************************************************************\n\n\nEnd of LogType:stdout\n***********************************************************************\n\nIngestion Finished"
}
Does anyone know how to retrieve data from Google's Cloud Natural Language (NL) API using client-side javascript? What is incorrect with the code block below - likely line 29?
I've been looking at a lot of documentation today to figure out how to perform sentiment analysis using Google's Cloud NL API. Many of them have not been very clear.
This tutorial helped me understand how Google's Cloud NL API works to a large degree.
I still have a few gaps.
This is the block of code I derived after looking at a few docs.
One area that I think may be a potential issue is line 29.
I've never fetched using two arguments.
1 sentiApiKey = ***API KEY***
2 sentiEndPoint = ***Google's Cloud Natural Language endpoint***
3 //dictData is existing JSON data
5 function searchSenti(dictData) {
6 const sentiParams = {
7 key: sentiApiKey,
8 }
10 const sentiApiKeyString = formatQueryParams(sentiParams)
11 const sentiUrl = sentiAnEndPoint + "?" + sentiApiKeyString;
13 const def = dictData[0].shortdef[0]
14 const dict = {
15 language: 'en-us',
16 type: 'PLAIN_TEXT',
17 content: def
18 };
20 const nlApiData = {
21 document: dict,
22 encodingType: 'UTF8'
23 };
25 const nlCallOptions = {
26 method: 'post',
27 contentType: 'application/json',
28 payload: JSON.stringify(nlApiData)
29 }
31 fetch(sentiUrl, nlCallOptions)
32 .then(response => {
33 if (response.ok) {
34 return response.json();
35 }
36 throw new Error(response.statusText);
37 })
38 .then(sentiData => parseSenti(sentiData))
39 .catch(err => {
40 $("#error-message").removeClass("hidden");
41 $("#js-error-message").text(`Something went wrong with the
Sentiment API: ${err.message}`);
43 });
44 }
46 function parseSenti(sentiData) {
47 const data = JSON.parse(sentiData);
48 const sentiment = 0.0;
50 if (data && data.documentSentiment && data.documentSentiment.score){
51 sentiment = data.documentSentiment.score;
52 }
54 console.log(sentiment);
55 }
A few days ago, I learned that there is no way to use Google's Cloud Natural Language API using client-side JS.
Here's some documentation that will help you use client-side JS for sentiment analysis:
http://www.gtlambert.com/blog/sentiment-analysis-sentimoodjs
https://github.com/thinkroth/Sentimental
Cheers.
I have created a script to migrate a file from gridfs to the new local C: drive and then rename the file since I am unable to use variables with writeFileSync to dynamically name the file with its extension included.
The issue I am running into is that when I use any variables in the location path, it returns an ENOENT (no such file or directory) error. When I explicitly write out the string and use no variables, it finds the file with no issue and renames it correctly. Same path, just two different methods for how the path is created. Might this be an issue caused by JavaScript's pass-by-reference vs. pass-by-value? No idea at this point.
I am debugging by console logging all paths as I go to confirm that the final destination string is correct and have already verified that it is exactly the same when using referenced variables as it is when written specifically as a single string.
fs.writeFileSync(__dirname + `/../public/uploads/${tempFile.metadata.parent}/${tempId}/tempfile`, data);
unprocessedPath1 = (__dirname + `/../public/uploads/${tempFile.metadata.parent}/${tempId}/tempfile`); //DOES NOT WORK
unprocessedPath2 = (__dirname + `/../public/uploads/${tempFile.metadata.parent}/${tempId}/${tempFile.filename}`); //DOES NOT WORK
//RETURNS: C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\tempfile
//RETURNS: C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\tempfile1
// unprocessedPath1 = `C:/Sites/CRM/public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60/tempfile`; // WORKS
// unprocessedPath2 = `C:/Sites/CRM/public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60/tempfile1`; // WORKS
//RETURNS: C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\tempfile
//RETURNS: C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\tempfile1
var correctPath1 = path.normalize(unprocessedPath1);
var correctPath2 = path.normalize(unprocessedPath2);
fs.renameSync(correctPath1, correctPath2, function(err) {
if ( err ) console.log('RENAME ERROR: ' + err);
});
RENAME ERROR: Error: ENOENT: no such file or directory, rename 'C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\tempfile' -> 'C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\2019-08-16T10:50.wav'
EDIT: Confirmed that file exists at the destination before rename attempt. Added a console.log(fs.existsSync(correctPath1)) line before rename.
true
RENAME ERROR: Error: ENOENT: no such file or directory, rename 'C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\tempfile' -> 'C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\2019-08-16T10:50.wav'
EDIT: Here is the entire request.
//DOWNLOAD GRIDFS DATABASE'S OLD FILES AND SAVE TO LOCAL DRIVE
router.get('/export/gridfs', middleware.isLoggedIn, (req, res) => {
gfs = Grid(conn.db, mongoose.mongo);
gfs.collection('uploads').find().toArray((err, files) => {
console.log("OPENING FILES OBJECT: ");
console.log(util.inspect(files, false, null, true /* enable colors */));
files.forEach(file => {
console.log("FOR EACH FILE");
tempFile = file;
console.log("CREATING FILE PRE SAVE");
File.create(tempFile, function(err, file){
console.log("FILE PRE CREATED");
if(err){
console.log("ERROR OCCURED: " + err);
console.log("SKIPPING FILE");
} else {
console.log("NO ERROR");
console.log(util.inspect(file, false, null, true /* enable colors */))
console.log(util.inspect(tempFile, false, null, true /* enable colors */))
console.log("MAKING NEW FOLDER IN LEAD ID USING FILE ID");
console.log('public/uploads/' + tempFile.metadata.parent +"/"+ file._id);
mkdirp('public/uploads/' + tempFile.metadata.parent +"/"+ file._id, function() {});
console.log("FOLDER CREATED");
console.log("File Before");
console.log(file);
console.log("OPENING file OBJECT: ");
console.log(util.inspect(file, {showHidden: false, depth: null}))
console.log("OPENING tempfile OBJECT: ");
console.log(util.inspect(tempFile, false, null, true /* enable colors */))
file.filename = tempFile.filename;
file.contentType = tempFile.mimetype;
file.fileLocation = `/public/uploads/${tempFile.metadata.parent}/${file._id}/${tempFile.filename}`;
file.metadata = { parent: tempFile.metadata.parent };
file.createdAt = tempFile.uploadDate;
console.log("filename: " + file.filename);
console.log("contentType" + file.contentType);
console.log("fileLocation" + file.fileLocation);
console.log("metadata: " + file.metadata);
console.log("File After");
console.log("FILE: " + file);
console.log("tempFile: " + tempFile);
console.log(tempFile.newFileName);
console.log(tempFile.originalname);
console.log(tempFile.mimetype);
console.log(tempFile.contentType);
console.log("File After");
console.log(file);
//save note
file.save();
console.log("File Saved");
const tempId = file._id;
console.log("tempId: " + tempId);
console.log("OPENING tempId OBJECT: ");
console.log(util.inspect(tempId, false, null, true /* enable colors */));
console.log("PROCESSING FILE");
console.log(util.inspect(file, false, null, true /* enable colors */));
console.log("Starting gridfs stream");
gfs.files.find({ _id: new ObjectId(file._id) }, (err, file) => {
// Check if file
console.log("CHECKING FOR FILE ENTRY");
if (!file || file.length === 0) {
return res.status(404).json({
err: 'No file exists'
});
}
console.log("FILE ENTRY FOUND");
let data = [];
let readstream = gfs.createReadStream({
filename: tempFile.filename
});
console.log("Creating read stream");
readstream.on('data', function(chunk) {
console.log("PUSHING CHUNK");
console.log(chunk);
data.push(chunk);
console.log("PUSHED CHUNK");
});
readstream.on('end', function() {
console.log("ENDING STREAM");
data = Buffer.concat(data);
console.log("WRITING TO LOCAL DRIVE");
var fileExt = path.extname(tempFile.filename);
console.log(fileExt)
fs.writeFileSync(__dirname + `/../public/uploads/${tempFile.metadata.parent}/${tempId}/tempfile`, data);
console.log("RETURNING FILE TO CLIENT");
console.log("RENAMING FILE AT LOCATION WITH EXTENSION");
var unprocessedPath1 = (__dirname + `/../public/uploads/${tempFile.metadata.parent}/${tempId}/tempfile`);
var unprocessedPath1String = `C:/Sites/CRM/public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60/tempfile`; //works
console.log("Path1: " + unprocessedPath1);
console.log("Path1String: " + unprocessedPath1String);
var unprocessedPath2 = (__dirname + `/../public/uploads/${tempFile.metadata.parent}/${tempId}/${tempFile.filename}`);
// unprocessedPath2 = `C:/Sites/CRM/public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60/tempfile1`; //works
console.log("Path2: " + unprocessedPath2);
var correctPath1 = path.normalize(unprocessedPath1);
console.log("NORMALIZED Path1: "+ correctPath1);
var correctPath2 = path.normalize(unprocessedPath2);
console.log("NORMALIZED Path2: " + correctPath2);
// correctPath1 = String(correctPath1);
// correctPath2 = String(correctPath2);
console.log(fs.existsSync(correctPath1))
console.log(fs.existsSync(unprocessedPath1String))
fs.rename(correctPath1, correctPath2, function(err) {
if ( err ) console.log('RENAME ERROR: ' + err);
console.log("RENAME COMPLETE");
});
});
readstream.on('error', function(err) {
console.log('An error occured!', err);
throw err;
});
res.send("EXPORTED");
});
}
});
});
});
});
EDIT: And here is the log.
OPENING FILES OBJECT:
[ { _id: 5d56ece48f6b3b09f0068f60,
length: 221228,
chunkSize: 261120,
uploadDate: 2019-08-16T17:50:30.212Z,
filename: '2019-08-16T10:50.wav',
md5: '47fbec41801f73efc53d7e8f73b4e596',
contentType: 'audio/wav',
metadata: { parent: '5d56ebd88f6b3b09f0068f5c' } } ]
FOR EACH FILE
CREATING FILE PRE SAVE
FILE PRE CREATED
NO ERROR
{ _id: 5d56ece48f6b3b09f0068f60,
filename: '2019-08-16T10:50.wav',
contentType: 'audio/wav',
metadata: { parent: '5d56ebd88f6b3b09f0068f5c' },
createdAt: 2019-08-22T02:50:55.594Z,
__v: 0 }
{ _id: 5d56ece48f6b3b09f0068f60,
length: 221228,
chunkSize: 261120,
uploadDate: 2019-08-16T17:50:30.212Z,
filename: '2019-08-16T10:50.wav',
md5: '47fbec41801f73efc53d7e8f73b4e596',
contentType: 'audio/wav',
metadata: { parent: '5d56ebd88f6b3b09f0068f5c' } }
MAKING NEW FOLDER IN LEAD ID USING FILE ID
public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60
FOLDER CREATED
File Before
{ _id: 5d56ece48f6b3b09f0068f60,
filename: '2019-08-16T10:50.wav',
contentType: 'audio/wav',
metadata: { parent: '5d56ebd88f6b3b09f0068f5c' },
createdAt: 2019-08-22T02:50:55.594Z,
__v: 0 }
OPENING file OBJECT:
{ _id: 5d56ece48f6b3b09f0068f60,
filename: '2019-08-16T10:50.wav',
contentType: 'audio/wav',
metadata: { parent: '5d56ebd88f6b3b09f0068f5c' },
createdAt: 2019-08-22T02:50:55.594Z,
__v: 0 }
OPENING tempfile OBJECT:
{ _id: 5d56ece48f6b3b09f0068f60,
length: 221228,
chunkSize: 261120,
uploadDate: 2019-08-16T17:50:30.212Z,
filename: '2019-08-16T10:50.wav',
md5: '47fbec41801f73efc53d7e8f73b4e596',
contentType: 'audio/wav',
metadata: { parent: '5d56ebd88f6b3b09f0068f5c' } }
filename: 2019-08-16T10:50.wav
contentTypeundefined
fileLocation/public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60/2019-08-16T10:50.wav
metadata: { parent: '5d56ebd88f6b3b09f0068f5c' }
File After
FILE: { _id: 5d56ece48f6b3b09f0068f60,
filename: '2019-08-16T10:50.wav',
metadata: { parent: '5d56ebd88f6b3b09f0068f5c' },
createdAt: 2019-08-16T17:50:30.212Z,
__v: 0,
fileLocation: '/public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60/2019-08-16T10:50.wav' }
tempFile: [object Object]
undefined
undefined
undefined
audio/wav
File After
{ _id: 5d56ece48f6b3b09f0068f60,
filename: '2019-08-16T10:50.wav',
metadata: { parent: '5d56ebd88f6b3b09f0068f5c' },
createdAt: 2019-08-16T17:50:30.212Z,
__v: 0,
fileLocation: '/public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60/2019-08-16T10:50.wav' }
File Saved
tempId: 5d56ece48f6b3b09f0068f60
OPENING tempId OBJECT:
5d56ece48f6b3b09f0068f60
PROCESSING FILE
{ _id: 5d56ece48f6b3b09f0068f60,
filename: '2019-08-16T10:50.wav',
metadata: { parent: '5d56ebd88f6b3b09f0068f5c' },
createdAt: 2019-08-16T17:50:30.212Z,
__v: 0,
fileLocation: '/public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60/2019-08-16T10:50.wav' }
Starting gridfs stream
CHECKING FOR FILE ENTRY
FILE ENTRY FOUND
Creating read stream
(node:3008) DeprecationWarning: GridStore is deprecated, and will be removed in a future version. Please use GridFSBucket instead
PUSHING CHUNK
<Buffer 52 49 46 46 24 60 03 00 57 41 56 45 66 6d 74 20 10 00 00 00 01 00 01 00 80 bb 00 00 00 ee 02 00 02 00 10 00 64 61 74 61 00 60 03 00 00 00 00 00 00 00 ... >
PUSHED CHUNK
(node:3008) DeprecationWarning: GridStore is deprecated, and will be removed in a future version. Please use GridFSBucket instead
PUSHING CHUNK
<Buffer 52 49 46 46 24 60 03 00 57 41 56 45 66 6d 74 20 10 00 00 00 01 00 01 00 80 bb 00 00 00 ee 02 00 02 00 10 00 64 61 74 61 00 60 03 00 00 00 00 00 00 00 ... >
PUSHED CHUNK
PUSHING CHUNK
<Buffer 52 49 46 46 24 60 03 00 57 41 56 45 66 6d 74 20 10 00 00 00 01 00 01 00 80 bb 00 00 00 ee 02 00 02 00 10 00 64 61 74 61 00 60 03 00 00 00 00 00 00 00 ... >
PUSHED CHUNK
ENDING STREAM
WRITING TO LOCAL DRIVE
.wav
RETURNING FILE TO CLIENT
RENAMING FILE AT LOCATION WITH EXTENSION
Path1: C:\Sites\CRM\routes/../public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60/tempfile
Path1String: C:/Sites/CRM/public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60/tempfile
Path2: C:\Sites\CRM\routes/../public/uploads/5d56ebd88f6b3b09f0068f5c/5d56ece48f6b3b09f0068f60/2019-08-16T10:50.wav
NORMALIZED Path1: C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\tempfile
NORMALIZED Path2: C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\2019-08-16T10:50.wav
true
true
RENAME ERROR: Error: ENOENT: no such file or directory, rename 'C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\tempfile' -> 'C:\Sites\CRM\public\uploads\5d56ebd88f6b3b09f0068f5c\5d56ece48f6b3b09f0068f60\2019-08-16T10:50.wav'
RENAME COMPLETE
Just saw the new code you added.
This looks to me like shared variables inside a loop using async operations.
Your loop is going to conflict with itself. The .forEach() doesn't wait for your async operations to finish, so you'll have multiple iterations of the loop running and trying to use the same variables. You can fix this either by stopping the parallel running of operations in the loop, or by very carefully declaring variables with let so they are unique to each iteration of the loop and using NO shared variables that are ever modified.
The reason there's a problem when you assign to a variable is that the variable is in a shared scope, and ALL the iterations of the loop are trying to use the same variable, some of them conflicting with one another.
All variables modified inside the .forEach() need to be declared inside the .forEach() callback so they are unique and separate for each iteration of the loop. None of them should be declared at a higher scope.
The variable tempFile is part of the problem. It will be overwritten by subsequent iterations of the .forEach() loop BEFORE some of your asynchronous operations try to use it. You have so many nested async operations that I haven't studied every single variable used inside an async callback to see which other ones may have this same issue.
So, the point where you do this:
var unprocessedPath1 = (__dirname + `/../public/uploads/${tempFile.metadata.parent}/${tempId}/tempfile`);
tempFile may well have been overwritten by the next iteration of the loop, because this happens in deeply nested asynchronous callbacks that will get called after the .forEach() loop has already completed and other values have potentially been written to tempFile.
FYI, you show no declaration at all for the tempFile variable, so perhaps it's declared at a higher scope or is an accidental module-level variable. Just declaring it as:
let tempFile = file;
at the top of your .forEach() callback will give each invocation of the loop its own copy of that variable and will fix at least this first issue.
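Applied to your route, a minimal sketch (only the declaration moves; the rest of your nested callbacks stay as they are):
files.forEach(file => {
  // Per-iteration variable: every nested async callback below now closes
  // over this iteration's copy instead of one shared, constantly-overwritten value.
  let tempFile = file;
  File.create(tempFile, function(err, file) {
    // ... existing logic, unchanged ...
  });
});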
As a simplified example of the problem, run this snippet:
const data = [1,2,3,4];
const base = "base";
let tempFile;
data.forEach(function(num) {
tempFile = base + num;
setTimeout(function() {
console.log(tempFile);
}, 1);
});
I use setTimeout() here for simplicity's sake as a simple asynchronous callback, but the issue is the same in your code because you're assigning to tempFile at each invocation of the loop and then referencing it in asynchronous callbacks.
console.log(tempFile); does not show the desired value because the .forEach() loop runs to its completion before a single async callback gets called (a product of the node.js event loop and how asynchronous operations use it).
Whereas, if you move the declaration of tempFile into the loop itself, there's a separate copy of that variable unique to each iteration of the loop and you get the desired output:
const data = [1,2,3,4];
const base = "base";
data.forEach(function(num) {
let tempFile = base + num;
setTimeout(function() {
console.log(tempFile);
}, 1);
});