Error: 413 “Request Entity Too Large” Html + MVC WCF service - javascript

Basically, I created one HTML page from which I want to send two files, an XML and an SVG. Both files are converted to text and then sent to the WCF service in the MVC project, but it gives "Error: 413 Request Entity Too Large". I tried a small file and it works, but anything over 200 KB fails.
I tried converting to a stream, but I had no luck, so I decided to convert the content to a string and pass that.
My HTML Page -
My WCF code -
I also checked the online solutions for this, but I have already applied them.
ERROR -
If anyone knows how to convert a string to a Stream in JavaScript, please tell me; I am able to accept a stream value.

You can set the Maximum limit for receiving files in WCF like this:
WebServiceHost webServiceHost = new WebServiceHost(typeof(UploadService), uri);
WebHttpBinding binding = new WebHttpBinding();
binding.MaxReceivedMessageSize = Int32.MaxValue;
webServiceHost.AddServiceEndpoint(typeof(IUploadService), binding, "WebServiceHost");
webServiceHost.Open();
And you can try raising the upload size limit in your web.config.
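For reference, a sketch of the web.config entries that typically cap request size in an ASP.NET app; the values below are illustrative maximums (note that maxRequestLength is in KB while maxAllowedContentLength is in bytes):

```xml
<!-- Sketch: common ASP.NET request-size limits (values illustrative) -->
<system.web>
  <!-- in KB; 2097151 KB is roughly 2 GB -->
  <httpRuntime maxRequestLength="2097151" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <!-- in bytes; 2147483647 is Int32.MaxValue -->
      <requestLimits maxAllowedContentLength="2147483647" />
    </requestFiltering>
  </security>
</system.webServer>
```

Both limits apply independently, so a 413 can come back until the smaller of the two is raised.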

I found the solution after spending 1-2 days. Look into this URL:
https://cmatskas.com/upload-files-in-asp-net-mvc-with-javascript-and-c/


How do I get all the data passing through a WebSocket

I want to know how to get all the data passed over the WebSocket.
I already tried using Firefox to look, but all the data is strange Unicode text and symbols (the game is https://sploop.io). Is there a way to maybe decrypt it?
I also tried:
var data = new WebSocket("wss://usa1.sploop.io/ws")
data.onmessage = (sa) => { console.log(sa) }
And after some actions in the game, the code logged an event object that didn't contain any of the data...
You're already getting all the data the WebSocket is receiving. The problem is that the data is "encoded" binary data using the game's protocol. The scripts in Sploop.io know how to decode this data (and encode new data to be sent back), but since you don't "speak" that protocol, it looks like gibberish to you.
Problem aside, you can have fun and all, but trying to cheat isn't nice towards the other players.
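To see what such frames actually contain, you can switch the socket to ArrayBuffer mode and pick the payload apart with a DataView. The frame layout below is purely hypothetical (the real Sploop.io protocol is not documented anywhere); it only illustrates the general decoding technique:

```javascript
// Hypothetical frame layout: [opcode: uint8][x: int16 LE][y: int16 LE].
// The real game protocol is unknown; this is just a sketch of the idea.
function decodeFrame(buffer) {
  const view = new DataView(buffer);
  return {
    opcode: view.getUint8(0),
    x: view.getInt16(1, true), // true = little-endian
    y: view.getInt16(3, true),
  };
}

// In the browser you would feed it from the socket like this:
// const ws = new WebSocket("wss://usa1.sploop.io/ws");
// ws.binaryType = "arraybuffer"; // receive ArrayBuffer instead of Blob
// ws.onmessage = (event) => console.log(decodeFrame(event.data));
```

Until you reverse-engineer the actual field layout, the decoded values will still be meaningless, but at least you will be looking at raw bytes instead of mangled text.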

Loading buffer data from database as PDF

I have been developing a web app where the user can upload a PDF file, and then later retrieve it and view it. What I have been doing to achieve this is having the PDF uploaded to a PostgreSQL database as bytea datatype (the column is called "attachment"), and then having nodeJS offer up this data to be fetched back and turned back into a PDF to view.
However I have been struggling to convert the data back into a valid PDF. This is the method I am using so far.
var file = new Blob([res[i].attachment], { type: 'application/pdf' });
var fileURL = URL.createObjectURL(file);
document.getElementById("pdf_box").data = fileURL;
The #pdf_box identifier refers to an object element in a HTML file which is used to display the PDF (this has been shown to work when I provide the file location of a dummy PDF file to the data attribute of this element).
The res[i].attachment is also shown to provide valid buffer data in JSON format, with an example provided below:
"attachment":{"type":"Buffer","data":[91,111,98,106,101,99,116,32,70,105,108,101,93]}
When I load the created fileURL into the data attribute of #pdf_box, however, I get an error indicating an invalid PDF file. My research so far suggests this may be because the data arrives as a serialized buffer whereas the Blob constructor needs actual bytes, but I haven't found much that shows how to convert between the two forms with the data I have access to. I have also seen occasional references to a method called pdf.create(), but I cannot find documentation for it, so I assume it belongs to a third-party JS library.
If you know of any information that can help me with understanding what my problem is and what to search to figure out a solution, it would all be greatly appreciated, thank you.
To all reading this question: this approach did not work out for me and would likely be very inefficient to implement. In my experience, sending a whole file's contents as one large string in the request body becomes unreliable as the payload grows, and the request can fail without a helpful error. What you should do instead is use "multipart/form-data" encoding to send files to the server.
It should also be mentioned that storing file contents in the database is generally not recommended (databases make inefficient storage for large blobs); instead, store a reference, i.e. the path or URL where the file can be found. On the client side, the file can then be rendered like this once you have retrieved its location...
<object id="pdf" data="../files/test.pdf"></object>
To change the data attribute, the following can be done...
document.getElementById("pdf").data = "../files/test2.pdf";
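That said, the original symptom has a direct cause worth noting: when Node serializes a Buffer to JSON it becomes { type: "Buffer", data: [...] }, and passing that plain object straight to the Blob constructor stringifies it rather than using its bytes. A minimal sketch of rebuilding the bytes first, assuming the response shape shown in the question:

```javascript
// res[i].attachment arrives JSON-serialized as { type: "Buffer", data: [...] }.
// Passing that object to Blob directly just stringifies it; rebuild the raw
// bytes with a Uint8Array first.
function attachmentToBlob(attachment, mimeType) {
  const bytes = new Uint8Array(attachment.data);
  return new Blob([bytes], { type: mimeType });
}

// Usage in the original code:
// const file = attachmentToBlob(res[i].attachment, "application/pdf");
// document.getElementById("pdf_box").data = URL.createObjectURL(file);
```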

How to send large data from native Java to JavaScript?

I am developing an Android app, I have a large JSON string around 5 MB in the Java native code. I need to send this string from Java code to JavaScript (JS files are in assets folder). When I run the application, it hangs.
public String readFileDataofIamge()
{
    StringBuilder text = new StringBuilder();
    String FILE_NAME = "ImageDataFile.txt";
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(openFileInput(FILE_NAME)))) {
        String line;
        while ((line = reader.readLine()) != null) {
            text.append(line);
        }
    } catch (Exception e) {
        // Swallowing exceptions silently hides read failures; at least report them.
        e.printStackTrace();
    }
    return text.toString();
}

@JavascriptInterface
public String readImageData()
{
    return readFileDataofIamge();
}
JS Code:
function getDatafromFile() {
var imageData=Android.readImageData();
}
If I return some small string value like hello world, I get the value in JavaScript code.
So is this possible to send this large data from native code to JavaScript code? If not possible is there any other approach to send large data from native code to JS Code?
Condition given: There is no internet connection, so I cannot make any HTTP or network call.
============================================================================
Application overview:
I have an Android app in which I'm using a WebView. All web-related files (JS and HTML) are inside the assets folder. On first launch I need to get all the data from the server (I have an internet connection the very first time the app launches). The data is large, around 5 MB, but it may vary up to 50 MB. I need to store this data somewhere, so that if I relaunch the app at any time without an internet connection, the app still has the data and works in offline mode.
So for this requirement, I have tried writing this data (around 20 MB) to a file in internal storage, then reading the file back and sending the data to the JavaScript code. But it's not working.
I have thought about using a SQLite DB instead of a file, but I think I will hit the same problem when sending the data from native code to JS code.
So could you please suggest a better approach?
Storing the data in a database won't solve your problem unless you can avoid needing the whole file at once.
If the website uses REST or similar and you have access to the code, you could hook the requests and hand them over to the Java/Kotlin part of your app. Otherwise you could use the WebView's shouldInterceptRequest.
Once you are able to catch the requests, you can return the required data from your database. Think of it as your app playing the backend for the website.
Example:
The website requires all users via GET users/all. You intercept this request either via a hook that calls a Java method and returns the result, or via shouldInterceptRequest. Either way, you run a query against your local database, something like (pseudocode): SELECT * FROM USER, then return the result.
Since you are dealing with JSON, you could use a NoSQL database like Couchbase Lite, which stores JSON directly. Disadvantage: if you need to modify the data, you will have to convert it to a POJO and back to JSON when you return or store it again. Alternatively, use a SQLite database; for this you would convert the downloaded JSON to POJOs and store those.
It would help if the backend (from where you get your JSON) lets you fetch individual model types rather than everything at once (otherwise you might run into memory issues). You could use Retrofit for networking and Moshi or Gson for the conversion to POJOs.
If there are no REST requests or similar and you really need the whole file at once, honestly, I'd say there is no pleasant way of handling this.

knockout.js external model js file

I have an MVC.NET app which uses Knockout.js (+ knockout.mapping) to deal with some cascading dropdowns. The data for these comes from a WebAPI call to an external service. As it happens, this service requires an authentication token which expires after 2 hours, so I have the MVC app put the data from the service in a System.Web.Caching.Cache and return it from there, unless the token has expired, in which case it grabs the data from the service again.
This is working fine.
However when I need to get this to the View, I am currently using the following method, which is to have a property of the ViewModel that I assign the service/Cache data to and then do this in the view:
var model = new ViewModel(@Html.Raw(Json.Encode(Model.ReferenceData)));
ko.applyBindings(model);
where Model.ReferenceData is the data from the service.
Again, this is working fine, but... the thing is, the page then has all that JSON data dumped into it on every request.
I would like to use an external JS file for the ReferenceData as then at least it can be cached by the browser and lessen the weight of the page on future requests.
However, I imagine that the overhead of generating a JS file is not that small. What I really need is for it to generate a link to that file that changes, in much the same way that the built-in MVC bundling of JS files works: generating a link with a querystring.
My question is: is there an easy way of doing this?
For sure I can generate a JS file when the cache is first filled and reference that from the view, but as I say, getting the link to change each time the file is refreshed (or at least working out whether the data in it has changed and updating only then) is where the problem lies.
Any insight to this would be of great help
Thanks
Nat
Version the JS file (you can keep a GUID in the file itself).
In Application_Start(), read this version ID into a static variable.
In your controller, pass this static variable to the ViewBag.
Reference your script with this ID.
When you regenerate the file, update the version in the file as well as in the static variable. The next request from the client then gets the new version with the new key.
Now if you want to update clients on the new version you have to use bi-directional protocol like web sockets or long-polling.
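As a sketch of the script reference itself (the ViewBag property name RefDataVersion and the file path are illustrative, not from the original post):

```cshtml
@* Razor view: cache-bust the reference-data script with the version key *@
<script src="~/Scripts/referenceData.js?v=@ViewBag.RefDataVersion"></script>
```

The browser caches the file against the full URL, so a new version key forces a re-download while an unchanged key keeps serving from cache.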

Image corrupted or truncated Phonegap

I'm using Phonegap to develop an Android application. Users take a photo; the photo is stored in a MySQL database (medium-blob column). I store it using a simple INSERT INTO query, without changing the data. The data is sent to the server using a REST call (PUT).
Here's an example of the content of this column:
thumb = '/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDACgcHiMeG...'
It is written in the Phonegap documentation that the image captured through the camera is encoded in base64. The problem is, when I try to retrieve my images from the database, I cannot display them using this JS code:
$('#myImg').attr("src", "data:image/png;base64," + data);
Any idea where this "Image corrupted or truncated" comes from? :(
The problem was located in the way I was sending the images.
I was sending the data as a raw string. I tried passing it nested in a JSON object instead, and it worked.
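A likely explanation (an assumption, not confirmed in the thread): a raw application/x-www-form-urlencoded body decodes "+" as a space, which silently corrupts base64 data, whereas a JSON body preserves the string byte-for-byte. A sketch of the JSON approach (the field name thumb comes from the question; the URL is illustrative):

```javascript
// Wrap the base64 thumbnail in a JSON object so characters like "+" survive
// transport intact (form-urlencoded bodies decode "+" as a space).
function buildPhotoPayload(thumbBase64) {
  return JSON.stringify({ thumb: thumbBase64 });
}

// Hypothetical PUT to the REST endpoint:
// fetch("/api/photos/42", {
//   method: "PUT",
//   headers: { "Content-Type": "application/json" },
//   body: buildPhotoPayload(thumb),
// });
```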
