I have spent several days researching and working on a solution for uploading/downloading byte[]’s. I am close, but have one remaining issue that appears to be in my AngularJS code block.
There is a similar question on SO, but it has no responses. See https://stackoverflow.com/questions/23849665/web-api-accept-and-post-byte-array
Here is some background information to set the context before I state my problem.
I am attempting to create a general purpose client/server interface to upload and download byte[]’s, which are used as part of a proprietary server database.
I am using TypeScript, AngularJS, JavaScript, and Bootstrap CSS on the client to create a single page app (SPA).
I am using ASP.NET Web API/C# on the server.
The SPA is being developed to replace an existing product that was developed in Silverlight so it is constrained to existing system requirements. The SPA also needs to target a broad range of devices (mobile to desktop) and major OSs.
With the help of several online resources (listed below), I have gotten most of my code working. I am using an asynchronous media type formatter for byte[]’s from the Byte Rot link below.
http://byterot.blogspot.com/2012/04/aspnet-web-api-series-part-5.html
Returning binary file from controller in ASP.NET Web API
I am using a jpeg converted to a Uint8Array as my test case on the client.
The actual system byte arrays will contain mixed content compacted into predefined data packets. However, I need to be able to handle any valid byte array so an image is a valid test case.
The data is transmitted to the server correctly using the client and server code shown below AND the Byte Rot Formatter (NOT shown but available on their website).
I have verified that the jpeg is received properly on the server as a byte[] along with the string parameter metadata.
I have used Fiddler to verify that the correct response is sent back to the client:
The size is correct.
The image is viewable in Fiddler.
My problem is that the server response in the Angular client code shown below is not correct.
By incorrect, I mean the wrong size (~10K versus ~27.5K), and it is not recognized as a valid value for the Uint8Array constructor. Visual Studio shows JFIF when I place the cursor over the returned “response” shown in the client code below, but there is no other visible indicator of the content.
/********************** Server Code ************************/
Edit: added the missing item parameter ([FromBody]byte[] item) to the code.
public class ItemUploadController : ApiController
{
    [AcceptVerbs("Post")]
    public HttpResponseMessage Upload(string var1, string var2, [FromBody]byte[] item)
    {
        HttpResponseMessage result = new HttpResponseMessage(HttpStatusCode.OK);
        var stream = new MemoryStream(item);
        result.Content = new StreamContent(stream);
        result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
        return result;
    }
}
/***************** Example Client Code ********************/
The only things that I have omitted from the code are the actual variable parameter values.
$http({
url: 'api/ItemUpload/Upload',
method: 'POST',
headers: { 'Content-Type': 'application/octet-stream' },// Added per Byte Rot blog...
params: {
// Other params here, including string metadata about uploads
var1: var1,
var2: var2
},
data: new Uint8Array(item),
// 'arraybuffer' must be all lowercase. Once changed, it fixed my problem.
responseType: 'arraybuffer',// Added per http://www.html5rocks.com/en/tutorials/file/xhr2/
transformRequest: [],// empty array bypasses Angular's default JSON serialization of the typed array
})
.success((response, status) => {
if (status === 200) {
// The response variable length is about 10K, whereas the correct Fiddler size is ~27.5K.
// The error that I receive here is that the constructor argument is invalid.
// My guess is that I am doing something incorrectly with the AngularJS code, but I
// have implemented everything that I have read about. Any thoughts???
var unsigned8Int = new Uint8Array(response);
// For the test case, I want to convert the byte array back to a base64 encoded string
// before verifying with the original source that was used to generate the byte[] upload.
// (Caution: String.fromCharCode.apply can exceed the argument limit for very large
// arrays; converting in chunks is safer for big payloads.)
var b64Encoded = btoa(String.fromCharCode.apply(null, unsigned8Int));
callback(b64Encoded);
}
})
.error((data, status) => {
console.log('[ERROR] Status Code:' + status);
});
/****************************************************************/
Any help or suggestions would be greatly appreciated.
Thanks...
Edited to include more diagnostic data
First, I used the angular.isArray function to determine that the response value is NOT an array, which I think it should be.
Second, I used the following code to interrogate the response, which appears to be a string of mostly unprintable characters. The leading characters do not seem to correspond to any valid sequence in the image byte array.
var buffer = new ArrayBuffer(response.length);
var data = new Uint8Array(buffer);
var len = data.length, i;
for (i = 0; i < len; i++) {
    data[i] = response.charCodeAt(i); // treat the response as a binary string
}
Experiment Results
I ran an experiment by creating byte array values from 0 - 255 on the server, which I downloaded. The AngularJS client received the first 128 bytes correctly (i.e., 0,1,...,126,127), but the remaining values were 65535 in Internet Explorer 11, and 65533 in Chrome and Firefox. Fiddler shows that 256 values were sent over the network, but there are only 217 characters received in the AngularJS client code. If I only use 0-127 as the server values, everything seems to work. I have no idea what can cause this, but the client response seems more in line with signed bytes, which I do not think is possible.
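In hindsight, 65533 is a telling value: it is U+FFFD, the Unicode replacement character that a UTF-8 decoder substitutes for invalid byte sequences, which is consistent with the response body being decoded as text rather than kept binary. A quick sketch that reproduces the symptom in the console (assumes TextDecoder support, so Chrome/Firefox rather than IE11):
var raw = new Uint8Array([0x41, 0x80, 0xFF]); // 'A' plus two bytes that are invalid as standalone UTF-8
var text = new TextDecoder("utf-8").decode(raw);
console.log(text.charCodeAt(0)); // 65 ('A') -- values below 128 survive
console.log(text.charCodeAt(1)); // 65533 (U+FFFD) -- bytes >= 128 get replaced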
Fiddler Hex data from the server shows 256 bytes with the values ranging from 00,01,...,FE,FF, which is correct. As I mentioned earlier, I can return an image and view it properly in Fiddler, so the Web API server interface works for both POST and GET.
I am trying vanilla XMLHttpRequest to see if I can get that working outside of the AngularJS environment.
XMLHttpRequest Testing Update
I have been able to confirm that vanilla XMLHttpRequest works with the server for the GET and is able to return the correct byte codes and the test image.
The good news is that I can hack around AngularJS to get my system working, but the bad news is that I do not like doing this. I would prefer to stay with Angular for all my client-side server communication.
I am going to open up a separate issue on Stack Overflow that only deals with the GET byte[] issues that I am having with AngularJS. If I can get a resolution, I will update this issue with the solution for historical purposes to help others.
Update
Eric Eslinger on Google Groups sent me a small code segment highlighting that responseType should be "arraybuffer", all lower case. I updated the code block above to show the lowercase value and added a note.
Thanks...
I finally received a response from Eric Eslinger on Google Groups. He pointed out that he uses
$http.get('http://example.com/bindata.jpg', {responseType: 'arraybuffer'}).
He mentioned that the casing was probably significant, and it is: changing one character (the camel-case 'B' to lowercase) got the entire flow working.
All credit goes to Eric Eslinger.
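For the record, a minimal sketch of the working download path (the endpoint name is illustrative; the key detail is the all-lowercase responseType):
$http.get('api/ItemUpload/Download', { responseType: 'arraybuffer' })
    .success(function (response) {
        // response is now a real ArrayBuffer, so this constructor call succeeds
        var bytes = new Uint8Array(response);
        var b64Encoded = btoa(String.fromCharCode.apply(null, bytes));
        console.log('Received ' + bytes.length + ' bytes');
    });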
Related
I use an HTML table whose content can be changed with mouse drag and drop. Technically, you can move the data from any table cell to another. The table is 50 rows by 10 columns, and each cell has a unique identifier. I want to export it to .xlsx format with the C# EPPlus library and give the exported file back to the client.
So I need to pass the whole table data upon a button press and post it to either a Web API or an MVC controller, create an Excel file (like the original HTML table data), and send it back to be downloaded by the browser.
So the idea is to create an array which contains each table cell's value (of course there will be empty cells in that array), and post that array to the controller, as sketched below.
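Collecting the values is the easy part; a rough sketch, assuming each cell has an id of the form cell-<row>-<col> (the real ids are whatever the table actually uses):
var cellValues = [];
for (var row = 0; row < 50; row++) {
    for (var col = 0; col < 10; col++) {
        var cell = document.getElementById("cell-" + row + "-" + col);
        cellValues.push(cell ? cell.textContent : ""); // empty cells stay in the array
    }
}
// cellValues now holds all 500 values in row-major order, ready to post as JSON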
The problem with that approach lies in the download: if I call the Web API or MVC controller with a regular jQuery AJAX post, the browser does not recognize the response as a file.
The C# code handling the AJAX post:
[HttpPost]
public IHttpActionResult PostSavedReportExcel([FromBody]List<SavedReports> savedReports, [FromUri] string dateid)
{
//some excel creation code
HttpResponseMessage response = new HttpResponseMessage(HttpStatusCode.OK)
{
Content = new StreamContent(new MemoryStream(package.GetAsByteArray()))
};
response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
response.Content.Headers.ContentDisposition = new System.Net.Http.Headers.ContentDispositionHeaderValue("attachment")
{
FileName = dateid + "_report.xlsx"
};
ResponseMessageResult responseMessageResult = ResponseMessage(response);
return responseMessageResult;
}
Usually, for this kind of result, I could use window.location = myurltocontroller to trigger the download properly, but that only works for GET requests; POSTing anything that way is not possible.
I found some answers which could help me in this topic:
JavaScript post request like a form submit
This points out that I should create a form which passes the values, but I do not know how to do so for arrays (the table consists of 50*10 = 500 values which I have to pass in the form).
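(For reference, a minimal sketch of the form approach, with hypothetical names: serialize the whole array into one hidden field as JSON and let the controller deserialize that field.)
var form = document.createElement("form");
form.method = "POST";
form.action = "/api/PostSavedReportExcel?dateid=20180101"; // hypothetical route and date
var field = document.createElement("input");
field.type = "hidden";
field.name = "json"; // the controller would need to read and deserialize this form field
field.value = JSON.stringify(cellValues); // the flattened 500-element array from above
form.appendChild(field);
document.body.appendChild(form);
form.submit(); // a real browser navigation, so the response is treated as a download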
I tried some frontend-only solutions to the HTML-to-Excel export problem, which of course do not require building files on the API side, but the free jQuery add-ins are deprecated, not customizable, handle only the .xls format, etc.
I found the EPPlus NuGet package to be a highly customizable tool, which is why I want to try it in the first place.
So the question is: how can I post an array of 500 elements so that the controller recognizes it, generates the file, and makes it automatically download in the browser?
If you can provide some code that would be fantastic, but giving me the right direction is also helpful.
Thank you.
You can use fetch() to send the request from the JS frontend. When the browser (JS) has received the response, it can then offer its binary content as a download. Something like this:
fetch("http://your-api/convert-to-excel", // Send the POST request to the Backend
{
method:"POST",
body: JSON.stringify(
[[1,2],[3,4]] // Here you can put your matrix
)
})
.then(response => response.blob())
.then(blob => {
// Put the response BLOB into a virtual download from JS
if (navigator.appVersion.toString().indexOf('.NET') > 0) {
window.navigator.msSaveBlob(blob, "my-excel-export.xlsx");
} else {
var a = window.document.createElement('a');
a.href = URL.createObjectURL(blob);
a.download = "my-excel-export.xlsx";
a.click();
}});
So the JS part of the browser actually downloads the file behind the scenes first, and only when that is done does it trigger the "download" from the browser's memory into a file on disk.
This is a quite common scenario with REST APIs that require bearer token authentication.
I have a (GET) endpoint that sends data in chunks (Transfer-Encoding: chunked). The data is JSON encoded and sent line by line.
Is there a way to consume the data sent by this endpoint in an asynchronous manner in JavaScript (or using some JavaScript library)?
To be clear, I know how to perform an asynchronous GET, but I would like the GET request not to wait for the whole data to be transferred, but instead to read the data line by line as it arrives. For instance, when doing:
curl http://localhost:8081/numbers
The lines below are shown one by one as they become available (the example server I made waits a second between sending one line and the next).
{"age":1,"name":"John"}
{"age":2,"name":"John"}
{"age":3,"name":"John"}
{"age":4,"name":"John"}
I would like to reproduce the same behavior curl exhibits, but in the browser. What I don't want is to leave the user waiting until all the data becomes available before showing anything.
Thanks to Dan and Redu, I was able to put together an example that consumes data incrementally, using the Fetch API. The caveat is that this will not work on Internet Explorer, and it has to be enabled by the user in Firefox:
/** This works on Edge, Chrome, and Firefox (from version 57). To use this example
navigate to about:config and change
- dom.streams.enabled preference to true
- javascript.options.streams to true
See https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream
*/
fetch('http://localhost:8081/numbers').then(function (response) {
    const reader = response.body.getReader();
    const decoder = new TextDecoder("utf-8");
    function go() {
        reader.read().then(function (result) {
            if (!result.done) {
                // Each read() resolves with the bytes received so far; the test
                // server above sends one JSON line at a time.
                var obj = JSON.parse(decoder.decode(result.value, { stream: true }));
                console.log("Got " + obj.name + ", age " + obj.age);
                go();
            }
        });
    }
    go();
});
The full example (with the server) is available at my sandbox. I find it illustrative of the limitations of XMLHttpRequest to compare this version with this one, which does not use the Fetch API.
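Since a chunk is not guaranteed to align with line boundaries on a real network, a slightly more defensive variant buffers partial lines before parsing (a sketch against the same hypothetical endpoint):
fetch('http://localhost:8081/numbers').then(function (response) {
    const reader = response.body.getReader();
    const decoder = new TextDecoder("utf-8");
    let buffered = "";
    function pump() {
        return reader.read().then(function (result) {
            buffered += decoder.decode(result.value || new Uint8Array(), { stream: !result.done });
            const lines = buffered.split("\n");
            buffered = lines.pop(); // keep the trailing partial line for the next chunk
            lines.filter(Boolean).forEach(function (line) {
                const obj = JSON.parse(line);
                console.log("Got " + obj.name + ", age " + obj.age);
            });
            if (!result.done) return pump();
        });
    }
    return pump();
});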
Background:
I am making a simple game in PHP, JavaScript, and HTML for the web. A player controls the movement of a box on the screen and sees others fly around with their boxes.
I have the following files, that I upload to my domain via a hosting company:
index.html: a file with some buttons (eg. to start the game) and frames (for putting boxes in).
server.php: PHP script that receives messages from client, performs reads/writes to a database, echoes (using echo) boxes from database to the client. Does not echo the box of the player the message came from.
database.txt: a JSON text file containing data of players and the next free ID number. When empty it looks like this: {"players":[], "id": 1}. players contain objects with values such as ID, position and rotation.
script.js: JavaScript file with script to send/receive messages, display data from messages etc. Linked to index.html. Moves your box.
A screenshot, two players in movement:
Problem: The game crashes, always. Sooner or later. This is what happens:
The client receives player data from server.php, and everything is fine. This could last for 10 seconds or up to a few minutes.
The data starts to falter; the message is sometimes null instead of actual data.
The data received is always null. The database file is now {"players":null,"id":5}. (The "id" could be any number; it does not have to be 5.)
Picture of data flow, printing of players from the database, two players: before this screenshot, lots of rows with valid data; then, as seen, two null messages; then after a while, null forever.
I am not completely sure where the problem is, but I am guessing it has to do with my read/write in server.php. It feels like a lot of player movement makes the program more likely to crash. How often the program sends data also has an effect.
Code Piece 1: This is code from server.php that writes to the database. I have some sort of semaphore (the flock( ... )) to prevent clients from reading/writing at the same time (which causes errors). I have another function, read, which is very similar to this. Possible problems here:
The semaphore is incorrect.
The mode for fopen() is incorrect. See the PHP docs. The mode w is for writing. The flag b is for binary: "If you do not specify the 'b' flag when working with binary files, you may experience strange problems with your data ...".
Something weird happening because I use read() in my writing function?
Code:
// Write $val to $obj in database JSON
function write($obj,$val){
$content = read();
$json = json_decode($content);
$json->{$obj} = $val; // eg. $json->{'id'} = 5;
$myfile = fopen("database.txt", "wb") or die("Unable to open file!");
if(flock($myfile, LOCK_EX|LOCK_NB)) {
fwrite($myfile,json_encode($json));
flock($myfile, LOCK_UN);
}
fclose($myfile);
}
Code Piece 2: This is my code to send data. It is called via a setInterval(). In script.js:
// Send message to server.php, call callback with answer
function communicate(messageFunc,callback){
var message = messageFunc();
if (window.XMLHttpRequest) {
var xmlhttp=new XMLHttpRequest();
}
xmlhttp.onreadystatechange= function() {
if (this.readyState==4 && this.status==200) {
callback(this.responseText);
}
}
xmlhttp.open("GET","server.php?msg="+message,true);
xmlhttp.send();
}
This is my code to receive data, in server.php: $receive = $_GET["msg"].
My work on solving it so far
This is what I have done so far, but nothing has changed:
Added mode b to fopen().
Added flock() to read/write functions in server.php.
Reworked much of script.js; I would say it now looks and works very cleanly.
Checked memory_get_peak_usage() and checked with the hosting company about memory limits. That should be no problem at all.
Looked at PHP garbage collection and gc_enable() (I don't know why that would change anything).
Lots of testing, looking at the data flow.
Crying.
Conclusion: Is this type of application what PHP is for? What do you think is wrong? If you want more code/info I provide. Thank you very much.
Here is the root of your problem:
$myfile = fopen("database.txt", "wb") or die("Unable to open file!");
Note the behavior of the w open mode (emphasis mine):
Open for writing only; place the file pointer at the beginning of the file and truncate the file to zero length. If the file does not exist, attempt to create it.
This happens before you lock the file. What's happening is that between this fopen() call and the following flock() call, the file's content is zero length, and a reader is coming along during that time and reading the empty file.
Why doesn't this cause an error in PHP when you parse the empty string as JSON? Because json_decode() is defective, and returns null when the input is not valid JSON rather than throwing an exception. Nevermind that the string "null" is valid JSON -- json_decode() gives you no way to differentiate between the cases of valid input representing the null value and invalid input. If json_decode() actually threw an exception or triggered a PHP error (don't ask me why two error-signalling mechanisms are necessary in PHP), you would have a fantastic point to start debugging to figure out why the file is empty, and you might have solved this problem by now!
... sigh ...
PHP's "design" gives me headaches. But I digress.
To fix this whole problem, change the open mode to "cb" and call ftruncate($myfile, 0) after you successfully acquire the lock.
Note the behavior of the c mode, which actually specifically mentions the approach you are using (emphasis mine):
Open the file for writing only. If the file does not exist, it is created. If it exists, it is neither truncated (as opposed to 'w'), nor the call to this function fails (as is the case with 'x'). The file pointer is positioned on the beginning of the file. This may be useful if it's desired to get an advisory lock (see flock()) before attempting to modify the file, as using 'w' could truncate the file before the lock was obtained (if truncation is desired, ftruncate() can be used after the lock is requested).
I've created a SAPUI5 table widget and made sure that it works. Now, when clicking on a row, the detail view is loaded, but no data is present. The server exposes an entity Site with a primary key which is of type "string".
The client-side code is as follows (assume that oModel is ODataModel, sSiteCode is a string that may contain Cyrillic characters):
// sSiteCode may contain Cyrillic characters
var oKey = {
SiteCode: sSiteCode
};
var sPath = "/" + oModel.createKey("Sites", oKey);
this.getView().bindElement({path: sPath});
It turns out that, if sSiteCode = 'б' (i.e., contains Cyrillic characters), then a GET request will be sent (via batching) to the following URI:
http://<server>:<port>/odata/Sites('б')
However, the server is unable to parse this URI (and subsequently replies with a 404), as it doesn't know what encoding to use. I patched the method ODataModel.prototype._createRequestUrl as follows:
sNormalizedPath = this._normalizePath(sPath, oContext);
sNormalizedPath = encodeURI(sNormalizedPath); // my addition
Then it seems to work, for this particular case. I'm wondering whether this is a bug or a feature, and what I should do next.
FYI, I'm using OpenUI5 1.32.11.
Instead of sending
http://<server>:<port>/odata/Sites('б')
the actual string sent to the server should be
http://<server>:<port>/odata/Sites('%D0%B1')
which is the result of the encodeURI() call (note that encodeURI percent-encodes the Cyrillic character but leaves the single quotes as they are). Since UI5 allows you to freely define the model's URL and its parameters, you have to take care of correct URI encoding yourself (for the path and all parameters).
So in my opinion this is not a bug, but the downside of being able to configure the URI freely, without "black-box" behaviour from UI5.
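A quick console check of what encodeURI actually does to such a key predicate (illustrative values):
console.log(encodeURI("/odata/Sites('б')")); // -> /odata/Sites('%D0%B1')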
I have a custom Document Library action for Alfresco files. When I press this button, a new page opens with a JavaScript applet for making changes to a file. I am doing the modifications in base64, and I make the result "appear" on the screen with this:
var stringPDF = "<object data=\"data:application/pdf;base64," +
    JSON.parse(pdfbase64).message +
    "\" type=\"application/pdf\" width=\"100%\" height=\"100%\"></object>";
$("#pdfTexto").html(stringPDF);
But what I really need is to change the file itself, so that when it comes from the repository again the change is there, not just displayed. How do I change the existing file's contents to the new version with the change?
I use this URL to make GET of the file:
http://localhost:8080/share/proxy/alfresco/slingshot/node/content/workspace/SpacesStore/21384098-19dc-4d3f-bcc1-9fdc647c05dc/latexexemplo.pdf
Then I convert to the base64... And I make the changes...
But if I want to make a POST to change the content, how can I do this?
Thanks in advance.
As I mentioned in my response to this question:
The fastest and easiest way to achieve that is to leverage the RESTful API.
This will also ensure compatibility with new versions of Alfresco.
Note that you need to provide the noderef of the document to update in the form property updatenoderef, and that the property majorversion is a boolean flag specifying whether the new version is a major or minor version of the document.
Here is some sample code that might help you with your use case:
CloseableHttpClient httpClient = HttpClients.createDefault();
HttpPost uploadFile = new HttpPost(<alfresco-service-uri>+"/api/upload?alf_ticket="+<al-ticket>);
MultipartEntityBuilder builder = MultipartEntityBuilder.create();
builder.addTextBody("username", "<username>", ContentType.TEXT_PLAIN);
builder.addTextBody("updatenoderef", <noderef>, ContentType.TEXT_PLAIN);
builder.addTextBody("...", "...", ContentType.TEXT_PLAIN);
builder.addBinaryBody("filedata", <InputStream>, ContentType.DEFAULT_BINARY, <filename>);
HttpEntity multipart = builder.build();
uploadFile.setEntity(multipart);
CloseableHttpResponse response = httpClient.execute(uploadFile);
String responseString = IOUtils.toString(response.getEntity().getContent(), "UTF-8");
JSONObject responseJson = new JSONObject(responseString);
if (response.getStatusLine().getStatusCode() != 200) {
    throw new Exception("Couldn't upload file to the repository, webscript response: " + responseString);
}
Note 1: You need to replace the <*> tokens with your own values/vars.
Note 2: If you have problems retrieving a ticket, check this link, or this one.
Note 3: To do this in JavaScript instead of Java, visit this link and try to use JS to post the parameters I referred to, as instructed!
Note 4: Since you are working in Share, you are most probably authenticated.
If that is the case, you can access your Alfresco repo through the proxy endpoint in Share, and all requests will have the authentication ticket attached to them before being forwarded to your repo!
In other terms, use this endpoint:
/share/proxy/alfresco/api/upload
Instead of :
/alfresco/service/api/upload
and you won't even have to attach a ticket to your requests.
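For completeness, a browser-side sketch of the same upload through the Share proxy (untested; the noderef is taken from the question's URL, and pdfbase64 is the base64 payload the applet already has):
// Rebuild a Blob from the base64 PDF (hypothetical variable names)
var byteString = atob(JSON.parse(pdfbase64).message);
var buf = new Uint8Array(byteString.length);
for (var i = 0; i < byteString.length; i++) {
    buf[i] = byteString.charCodeAt(i);
}
var fileBlob = new Blob([buf], { type: "application/pdf" });

// Post it as multipart/form-data to the upload webscript via the Share proxy
var form = new FormData();
form.append("filedata", fileBlob, "latexexemplo.pdf");
form.append("updatenoderef", "workspace://SpacesStore/21384098-19dc-4d3f-bcc1-9fdc647c05dc");
form.append("majorversion", "false");
fetch("/share/proxy/alfresco/api/upload", { method: "POST", body: form })
    .then(function (res) { return res.json(); })
    .then(function (json) { console.log("Upload response:", json); });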
You need to follow these steps to achieve what you are looking for.
1) Reading File:
To display the content of an already uploaded PDF file, you need to read the file's content. You are able to do this successfully using the following API call.
http://localhost:8080/share/proxy/alfresco/slingshot/node/content/workspace/SpacesStore/21384098-19dc-4d3f-bcc1-9fdc647c05dc/latexexemplo.pdf
2) Capture New Content:
Capture the new file content from the user in the applet. I guess you are storing it in some string variable.
3) Edit Existing File Content:
The issue here is that you cannot simply edit a PDF file using any out-of-the-box Alfresco REST API (as far as I know). So you need to create your own RESTful API which can edit the PDF file's content. You can consider using some third-party library to do this job; you need to plug the PDF-editing logic into that RESTful API.
4) Changes back to Repo:
Call your API from step 3.
You could also have a look at this plugin, which could fulfill your requirements.
https://addons.alfresco.com/addons/alfresco-pdf-toolkit
Hope this helps.