I want to send images from a Node.js server to Android clients.
I am using a REST service between Node.js and the Android devices.
I can send an image using the Node.js 'fs' module and receive it on an Android device.
That works, but I have over 200 images, each between 1 KB and 2 KB in size. They are very small images, so I don't want to send them one by one; it's too slow. I'm wondering: if I ".rar" all the image files (about 2 MB in total), can I send them in a single request and show the images on the Android devices?
Or is there any way to send them in one request without ".rar"?
Of course you can compress them into an archive (of any kind) and decompress them on the device.
Using nodejs-zip you can generate zip archives. Here is an example of compression (taken from here):
var http = require('http'),
    nodejszip = require('../lib/nodejs-zip');

http.createServer(function (req, res) {
    // Output archive name, flags passed through to zip ('-j' drops directory paths),
    // and the list of files to include.
    var file = 'compress-example.zip',
        zipArgs = ['-j'],
        fileList = [
            'assets/image_1.jpg',
            'assets/image_2.jpg',
            'assets/image_3.jpg',
            'assets/image_4.jpg',
            'assets/image_5.jpg',
            'assets/image_6.jpg',
            'assets/image_7.jpg',
            'assets/image_8.jpg',
            'assets/image_9.jpg',
            'assets/image_10.jpg',
            'assets/image_11.jpg',
            'assets/image_12.jpg',
            'assets/image_13.jpg',
            'assets/image_14.jpg'];

    var zip = new nodejszip();
    zip.compress(file, fileList, zipArgs, function(err) {
        if (err) {
            throw err;
        }
        res.writeHead(200, {'Content-Type': 'text/plain'});
        res.end('Complete.\n');
    });
}).listen(8000);
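If you would rather not write a temporary .zip to disk at all, a streaming archive library such as archiver (a separate npm module, not part of nodejs-zip) can pipe the archive straight into the HTTP response, so the device downloads all the images in one request. A rough sketch, assuming the images live in an assets/ directory:
var http = require('http');
var archiver = require('archiver'); // npm install archiver

http.createServer(function (req, res) {
    res.writeHead(200, {
        'Content-Type': 'application/zip',
        'Content-Disposition': 'attachment; filename="images.zip"'
    });

    var archive = archiver('zip');
    archive.on('error', function (err) {
        console.error(err);
        res.end();
    });

    archive.pipe(res);                   // stream the zip directly into the response
    archive.directory('assets/', false); // add everything under assets/ at the archive root
    archive.finalize();
}).listen(8000);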
On the device you can decompress a zip archive like this (taken from here):
public class Decompress {
    private String _zipFile;
    private String _location;

    public Decompress(String zipFile, String location) {
        _zipFile = zipFile;
        _location = location;
        _dirChecker("");
    }

    public void unzip() {
        try {
            FileInputStream fin = new FileInputStream(_zipFile);
            ZipInputStream zin = new ZipInputStream(fin);
            ZipEntry ze = null;
            while ((ze = zin.getNextEntry()) != null) {
                Log.v("Decompress", "Unzipping " + ze.getName());
                if (ze.isDirectory()) {
                    _dirChecker(ze.getName());
                } else {
                    FileOutputStream fout = new FileOutputStream(_location + ze.getName());
                    for (int c = zin.read(); c != -1; c = zin.read()) {
                        fout.write(c);
                    }
                    zin.closeEntry();
                    fout.close();
                }
            }
            zin.close();
        } catch (Exception e) {
            Log.e("Decompress", "unzip", e);
        }
    }

    private void _dirChecker(String dir) {
        File f = new File(_location + dir);
        if (!f.isDirectory()) {
            f.mkdirs();
        }
    }
}
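For completeness, a rough sketch of how the device could fetch the archive and hand it to the Decompress class above. This is illustrative only: the URL and file names are placeholders, it needs a valid Context, it must run off the main thread, and the usual java.io/java.net/android.content imports are omitted, as in the snippet above.
// Sketch only: run off the main thread; URL and paths are placeholders.
void downloadAndUnzip(Context context) throws IOException {
    String url = "http://example.com/images.zip";           // placeholder for your server's endpoint
    File zipFile = new File(context.getCacheDir(), "images.zip");
    File imageDir = new File(context.getFilesDir(), "images");

    HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
    InputStream in = conn.getInputStream();
    FileOutputStream out = new FileOutputStream(zipFile);
    byte[] buffer = new byte[4096];
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
    out.close();
    in.close();
    conn.disconnect();

    // Decompress concatenates paths, so the target location needs a trailing separator
    new Decompress(zipFile.getAbsolutePath(), imageDir.getAbsolutePath() + "/").unzip();
}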
Related
I'm trying to download an xlsx file with ReactJS, but I'm receiving this message when I try to open the file after downloading it:
"Excel can not open file 'file.xlsx' because the file format or file extension is not valid. Verify that the file has not been corrupted and that the file extension matches the file format"
Here's the frontend code:
const REST_DOWNLOAD_URL = REST_URL + '/token';
Rest.ajaxPromise('GET', REST_DOWNLOAD_URL).then(function (res) {
    var FILE_URL = "/supermarket/token/" + res;
    Rest.ajaxPromise('GET', FILE_URL).then(function (my_file) {
        let blob = new Blob([my_file._body], { type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet;charset=utf-8' });
        if (navigator.msSaveOrOpenBlob) {
            navigator.msSaveBlob(blob, 'file.xlsx');
        } else {
            let link = document.createElement('a');
            link.href = window.URL.createObjectURL(blob);
            link.setAttribute('download', 'file.xlsx');
            document.body.appendChild(link);
            link.download = '';
            link.click();
            document.body.removeChild(link);
        }
    });
});
Why am I getting this error? Please, somebody help me; I've been stuck on this for 3 weeks.
[EDIT 1]
The file that I'm trying to download is built on the backend; basically I get the values from the database and use an Apache POI workbook to create the Excel sheet. I will show you the main parts of the code:
1) This method is called by the frontend on its first GET request and prepares the file before the download. It is very simple: it just creates a token (buildToken()) and associates a temp file with this token (createTempFile(randomBackendToken)). The temp file is then filled with what I get from my database (createFile(os)):
@RequestMapping(value = "/token", method = RequestMethod.GET)
public String returnToken() throws IOException {
    String randomBackendToken = tokenGenerator.buildToken();
    OutputStream os = tokenGenerator.createTempFile(randomBackendToken);
    tokenGenerator.createFile(os);
    return randomBackendToken;
}
2) The method where I create the temp file:
public OutputStream createTempFile(String randomBackendToken) throws IOException {
    OutputStream os = null;
    File file = File.createTempFile(randomBackendToken, ".xlsx");
    os = new FileOutputStream(file);
    return os;
}
3) The method where I receive the empty temp file and fill it with my data from the database:
public void createFile(OutputStream os) throws IOException {
    List<Supermarket> supermarkets = service.findAllSupermarkets();
    Workbook workbook = writeExcel.createSheet(supermarkets);
    workbook.write(os);
    IOUtils.closeQuietly(os);
}
4) My WriteExcel class that builds the xlsx file:
private static String[] columns = {"Code", "Name", "Type"};

public Workbook createSheet(List<Supermarket> supermarkets) throws IOException {
    Workbook workbook = new XSSFWorkbook();
    Sheet sheet = workbook.createSheet("file");
    [....]
    // Row for Header
    Row headerRow = sheet.createRow(0);
    // Header
    for (int col = 0; col < columns.length; col++) {
        Cell cell = headerRow.createCell(col);
        cell.setCellValue(columns[col]);
        cell.setCellStyle(headerCellStyle);
    }
    // Content
    int rowIdx = 1;
    for (Supermarket supermarket : supermarkets) {
        Row row = sheet.createRow(rowIdx++);
        row.createCell(0).setCellValue(supermarket.getCode());
        row.createCell(1).setCellValue(supermarket.getName());
        row.createCell(2).setCellValue(supermarket.getType());
    }
    return workbook;
}
So, all of the above is just for the first GET request. I then make another one, and the method below handles the second request. I just verify the token that the frontend returns to me and then, based on the validation, I allow the download of the file that I created in the previous step:
public void export(@PathVariable(value = "frontendToken") String frontendToken, HttpServletResponse response) throws IOException {
    if (StringUtils.isNotBlank(frontendToken)) {
        String tmpdir = System.getProperty("java.io.tmpdir");
        File folder = new File(tmpdir);
        File[] listOfFiles = folder.listFiles();
        for (int i = 0; i < listOfFiles.length; i++) {
            if (listOfFiles[i].isFile()) {
                boolean fileIsValid = tokenGenerator.validateToken(frontendToken, listOfFiles[i]);
                if (fileIsValid) {
                    InputStream input = new FileInputStream(listOfFiles[i]);
                    OutputStream output = response.getOutputStream();
                    int data = input.read();
                    while (data != -1) {
                        output.write(data);
                        data = input.read();
                    }
                    input.close();
                    output.flush();
                    output.close();
                    String mimeType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
                    response.setContentType(mimeType);
                    listOfFiles[i].delete();
                }
            }
        }
    }
}
And that's all I'm doing. I can't find what's wrong or what I'm missing. When I press F12 in my browser to see the response of the request, it shows me something encoded, like:
PK#SM_rels/.rels’ÁjÃ0†_ÅèÞ8í`ŒQ·—2èmŒî4[ILbËØÚ–½ýÌ.[Kì($}ÿÒv?‡I½Q.ž£uÓ‚¢hÙùØx>=¬î#ÁèpâH"Ã~·}¢
Any suspicions of what it could be?
Guys!
The problem was: my binary data was being converted to a string by JavaScript, and this was breaking my Excel file. I solved the problem by converting the binary data to text (Base64) on the backend and then doing the inverse on the frontend. The following links helped me:
java convert inputStream to base64 string
Creating a Blob from a base64 string in JavaScript
Thank you to everyone who tried to help. I hope my question can help others.
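For reference, a minimal sketch of the frontend half of that approach, assuming the second GET now returns the file as a Base64 string in my_file._body (the Rest wrapper and the MIME type are the same ones used in the question):
Rest.ajaxPromise('GET', FILE_URL).then(function (my_file) {
    // Decode the Base64 string back into raw bytes before building the Blob
    var byteCharacters = atob(my_file._body);
    var byteNumbers = new Array(byteCharacters.length);
    for (var i = 0; i < byteCharacters.length; i++) {
        byteNumbers[i] = byteCharacters.charCodeAt(i);
    }
    var byteArray = new Uint8Array(byteNumbers);
    var blob = new Blob([byteArray], {
        type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
    });

    // The existing download-link code can then be reused unchanged
    var link = document.createElement('a');
    link.href = window.URL.createObjectURL(blob);
    link.setAttribute('download', 'file.xlsx');
    document.body.appendChild(link);
    link.click();
    document.body.removeChild(link);
});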
I have a client written in C# and a server written in Java. I capture audio and send it over a socket to the server, and the server sends it over a WebSocket to the browser, where I want to play it. But when I try, the browser says: Uncaught (in promise) DOMException: Failed to load because no supported source was found.
Could you help me?
private static void Recordwav()
{
    waveInEvent = new WaveInEvent();
    int devicenum = 0;
    for (int i = 0; i < WaveIn.DeviceCount; i++)
    {
        if (WaveIn.GetCapabilities(i).ProductName.Contains("icrophone"))
            devicenum = i;
    }
    waveInEvent.DeviceNumber = devicenum;
    waveInEvent.WaveFormat = new WaveFormat(44100, WaveIn.GetCapabilities(devicenum).Channels);
    waveInEvent.DataAvailable += new EventHandler<WaveInEventArgs>(VoiceDataAvailable);
    waveInEvent.StartRecording();
}

private static void VoiceDataAvailable(object sender, WaveInEventArgs e)
{
    JObject jObject = new JObject();
    jObject["voice"] = Convert.ToBase64String(e.Buffer);
    byte[] messageByte = Encoding.ASCII.GetBytes(jObject.ToString().Replace("\r\n", "") + "\n");
    socket.Send(messageByte);
}
$scope.socket.onmessage = function (response)
{
    var data = JSON.parse(response.data);
    if (data.id == $scope.id) {
        if (data.voice) {
            var voice = data.voice;
            var sound = new Audio("data:audio/wav;base64," + voice);
            sound.play();
        }
    }
};
You're just sending raw samples, not a properly formatted WAV file. You'd need to use WaveFileWriter to write to a MemoryStream (wrapped in an IgnoreDisposeStream), dispose of the WaveFileWriter, and then access the MemoryStream's underlying byte array. Also, you're not taking BytesRecorded into account.
Even if you get this working, I suspect you'll get very choppy audio, as each WAV file will be a few hundred ms, and they won't necessarily play perfectly one after the other.
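In code, that suggestion would look roughly like the sketch below. It reuses waveInEvent and socket from the question and assumes NAudio's WaveFileWriter and IgnoreDisposeStream (from NAudio.Utils) are available; treat it as an outline rather than a drop-in replacement:
private static void VoiceDataAvailable(object sender, WaveInEventArgs e)
{
    // Wrap each captured chunk in a complete WAV file so the browser can decode it
    byte[] wavBytes;
    using (var ms = new MemoryStream())
    {
        // IgnoreDisposeStream keeps the MemoryStream open when the writer is disposed
        using (var writer = new WaveFileWriter(new IgnoreDisposeStream(ms), waveInEvent.WaveFormat))
        {
            // Only BytesRecorded bytes of the buffer are valid audio data
            writer.Write(e.Buffer, 0, e.BytesRecorded);
        }
        wavBytes = ms.ToArray();
    }

    JObject jObject = new JObject();
    jObject["voice"] = Convert.ToBase64String(wavBytes);
    byte[] messageByte = Encoding.ASCII.GetBytes(jObject.ToString().Replace("\r\n", "") + "\n");
    socket.Send(messageByte);
}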
I have written a file to a specified folder. After writing it to the folder, I attach that file to a mail. After attaching the file to the mail, I want to delete that folder, but the folder is not getting deleted and it throws the exception: "The process cannot access the file because it is being used by another process".
Here is my code.
public HttpResponseMessage SendChannelPartenersMessage(string Name, string FirmName, string Address, string Email, string Mobile)
{
    var httpRequest = HttpContext.Current.Request;
    ContactUs contactUs = new ContactUs();
    contactUs.Address = Address;
    contactUs.Name = Name;
    contactUs.FirmName = FirmName;
    contactUs.Email = Email;
    contactUs.Mobile = Mobile;
    try
    {
        if (httpRequest.Files.Count > 0)
        {
            contactUs.AttachFileName = WriteAttachedFile(httpRequest, contactUs.Email);
            if (ContactUsService.SendChannelPartenersMessage(contactUs))
            {
                var fileToBeDeleted = contactUs.AttachFileName;
                var deleteFile = DeleteAttachedFile(contactUs.AttachFileName);
            }
            return Request.CreateResponse(HttpStatusCode.OK, contactUs);
        }
        else
        {
            return Request.CreateResponse(HttpStatusCode.BadRequest);
        }
    }
    catch (Exception e)
    {
        throw new HttpResponseException(new HttpResponseMessage(HttpStatusCode.InternalServerError)
        {
            Content = new StringContent("An error occurred, please try again or contact the administrator."),
            ReasonPhrase = "Critical Exception"
        });
    }
}

private string WriteAttachedFile(HttpRequest httpRequest, string FileName)
{
    var postedFile = httpRequest.Files[0];
    var directoryPath = System.Configuration.ConfigurationManager.AppSettings["FolderPath"].ToString() + FileName + "\\\\";
    var filePath = directoryPath + postedFile.FileName;
    Directory.CreateDirectory(directoryPath);
    postedFile.SaveAs(filePath);
    var Path = filePath.Replace("\\", "/");
    return (Path);
}

private bool DeleteAttachedFile(string FileName)
{
    if (System.IO.File.Exists(FileName))
    {
        System.IO.File.Delete(FileName);
    }
    string[] words = FileName.Split('/');
    string directoryPath = words[words.Length - 2];
    if (Directory.Exists(directoryPath))
    {
        Directory.Delete(directoryPath);
    }
    return (true);
}
That's because the file you sent through the mail has still not been downloaded at the receiver's end. This happens even when sending files through Skype or copying them to a USB stick. Make sure the file has been downloaded on the receiver's end.
I'm trying to open a binary file './test' and send its contents, one byte at a time, to an external device through the UART. The external device echoes back each byte.
The regular file './test' and the buffer 'dta' are both 19860 bytes in length; however, the code continues to send bytes from beyond the end of the buffer 'dta' well after 'a' becomes greater than 'dta.length', and I can't figure out why. Any ideas?
var fs = require('fs');
var SerialPort = require("serialport").SerialPort;
var serialPort = new SerialPort("/dev/ttyAMA0", {baudrate: 115200}, false);

stats = fs.statSync(__dirname + "/test");
dta = new Buffer(stats.size);
dta = fs.readFileSync(__dirname + "/test");
a = 0;

serialPort.open(function (error) {
    if (error) {
        console.log('failed to open: ' + error);
    } else {
        serialPort.write(dta[a]);
    }
});

serialPort.on('data', function(data) {
    a++;
    if (a < dta.length) serialPort.write(dta[a]);
});
I am currently struggling to run my Node.js server.
What I want to do:
Upload a CSV file from a mobile device to my local server and save it on the file system
Read each line of the .csv file and save each row to my MongoDB database
Uploading and saving the file works flawlessly. Reading the .csv file and saving each row to the database only works for files with a small number of lines.
I don't know the exact number of lines at which it stops working; it seems to differ every time I read a file.
Sometimes (if the file has more than 1000 lines) the CSV reader I use doesn't even start processing the file. Other times it reads only 100-200 lines and then stops.
Here is the code for how I upload the file:
var fs = require('fs');
var sys = require("sys");
var url = require('url');
var http = require('http');

http.createServer(function(request, response) {
    sys.puts("Got new file to upload!");

    var urlString = url.parse(request.url).pathname;
    var pathParts = urlString.split("/");
    var deviceID = pathParts[1];
    var fileName = pathParts[2];

    sys.puts("DeviceID: " + deviceID);
    sys.puts("Filename: " + fileName);

    sys.puts("Start saving file");
    var tempFile = fs.createWriteStream(fileName);
    request.pipe(tempFile);
    sys.puts("File saved");

    // Starting a new child process which reads the file
    // and inserts each row to the database
    var task = require('child_process').fork('databaseInsert.js');
    task.on('message', function(childResponse) {
        sys.puts('Finished child process!');
    });
    task.send({
        start : true,
        deviceID : deviceID,
        fileName : fileName
    });
    sys.puts("After task");

    response.writeHead(200, {
        "Content-Type" : "text/plain"
    });
    response.end('MESSAGE');
}).listen(8080);
This all works fine.
Now the code of the child process (databaseInsert.js):
var sys = require("sys");
var yaCSV = require('ya-csv');
var Db = require('mongodb').Db;
var dbServer = require('mongodb').Server;

process.on('message', function(info) {
    sys.puts("Doing work in child process");
    var fileName = info.fileName;
    var deviceID = info.deviceID;

    sys.puts("Starting db insert!");
    var dbClient = new Db('test', new dbServer("127.0.0.1", 27017, {}), {
        w : 1
    });

    dbClient.open(function(err, client) {
        if (err) {
            sys.puts(err);
        }
        dbClient.createCollection(deviceID, function(err, collection) {
            if (err) {
                sys.puts("Error creating collection: " + err);
            } else {
                sys.puts("Created collection: " + deviceID);

                var csvReader = yaCSV.createCsvFileReader(fileName, {
                    columnsFromHeader : true,
                    'separator' : ';'
                });
                csvReader.setColumnNames([ 'LineCounter', 'Time', 'Activity',
                        'Latitude', 'Longitude' ]);

                var lines = 0;
                csvReader.addListener('data', function(data) {
                    lines++;
                    sys.puts("Line: " + data.LineCounter);
                    var docRecord = {
                        fileName : fileName,
                        lineCounter : data.LineCounter,
                        time : data.Time,
                        activity : data.Activity,
                        latitude : data.Latitude,
                        longitude : data.Longitude
                    };
                    collection.insert(docRecord, {
                        safe : true
                    }, function(err, res) {
                        if (err) {
                            sys.puts(err);
                        }
                    });
                });
            }
        });
    });

    process.send('finished');
});
At first I didn't use a child process, but I had the same behaviour as I have now, so I tried this approach.
Hopefully someone who has some experience with Node.js can help me.
I think your issue is that you are trying to read the tempFile while it is still being written to. Right now you are piping the request into the file stream (which proceeds in parallel and asynchronously) and starting the reader process at the same time. The reader process will then start reading the file in parallel with the write operations. If the reader is faster (it usually will be), it will read the first couple of records but then encounter an end of file and stop reading.
To remedy this, you should only start the reader process after the writing has completely finished, i.e., put the part from sys.puts("File saved"); onward into a callback that runs once tempFile has finished writing (see http://nodejs.org/api/stream.html#stream_writable_end_chunk_encoding_callback).
Reading the file while it is still being written to, akin to the tail command in Unix, is fairly hard in my understanding (google for details on how difficult it is to implement a proper tail).
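A minimal sketch of that reordering, using the write stream's 'finish' event (which fires once request.pipe(tempFile) has flushed everything to disk); the rest of the handler stays as it is:
var tempFile = fs.createWriteStream(fileName);
request.pipe(tempFile);

// Only fork the reader once the file is completely on disk
tempFile.on('finish', function() {
    sys.puts("File saved");

    var task = require('child_process').fork('databaseInsert.js');
    task.on('message', function(childResponse) {
        sys.puts('Finished child process!');
    });
    task.send({
        start : true,
        deviceID : deviceID,
        fileName : fileName
    });

    response.writeHead(200, {
        "Content-Type" : "text/plain"
    });
    response.end('MESSAGE');
});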
Are you familiar with mongoimport/mongoexport?
I used it in the past to export from my DB to a CSV file, so you can do the opposite after you upload the file from the mobile client to the server.
It's a shell tool, but you can run it from code using Node.js child_process spawn, as sketched below.
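For example, a rough sketch of calling mongoimport from the upload handler above; the flags shown are assumptions about the CSV layout, and deviceID and fileName come from that handler:
var spawn = require('child_process').spawn;

// Note: mongoimport expects comma-separated values, so a ';'-separated file
// would need to be converted first. '--headerline' takes the field names
// from the first row of the file.
var importer = spawn('mongoimport', [
    '--db', 'test',
    '--collection', deviceID,
    '--type', 'csv',
    '--headerline',
    '--file', fileName
]);

importer.on('close', function (code) {
    sys.puts('mongoimport exited with code ' + code);
});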