Server disconnect while returning data from JavaScript

My Blazor Server app uses a canvas element to manipulate an image that is loaded into an img element via a file drop zone; the img serves as a reference for the canvas and holds the data that is worked on via JS. I'm using Excubo's Canvas wrapper to stay in C# as much as possible, but I don't think this error is related to it, as it also happens with pure JS. Since this is more of a logic problem, I don't think sharing code would help much, but here is a crude visual representation of what I mean; perhaps it will be more useful in conveying the problem.
The end goal is to invoke a function in JS that loads the image src into a secondary work canvas and, using the data present in the main canvas, manipulates the pixel information accordingly for both. After processing the data, I convert it back to a data URL with toDataURL and set the img.src attribute.
The problem happens when returning the base64 data from that function, particularly with large images. The server disconnects immediately, and I'm not sure whether there is a way to extend the allowed wait time or this is just an implementation problem on my part.
It's also worth mentioning that I extended the maximum file size for the drop zone InputFile component to 30MB, but this happens with files of 3MB (and maybe even lower; I haven't been able to pin down the threshold yet).
Perhaps there's another way to achieve what I'm aiming for, but I'm not sure how I would reference the img src attribute in C# without JS. Client/server-side logic isn't a concern, because this app is supposed to run locally on the server machine only, so they are essentially the same target.

Found a solution that works for my case. Here is the reference for the fix.
The default MaximumReceiveMessageSize is too low for my needs; I increased it to 30MB and it now works without disconnecting from the server. I also tried increasing ApplicationMaxBufferSize and TransportMaxBufferSize in the MapBlazorHub pipeline configuration, but that didn't help. Increasing the JS interop timeout in the AddServerSideBlazor service options was also unfruitful.
Here is the relevant Program.cs code I'm using now:
var maxBufferSize = 30 * 1024 * 1024;
builder.Services.AddServerSideBlazor()
    .AddHubOptions(opt =>
    {
        opt.MaximumReceiveMessageSize = maxBufferSize;
    });

Related

Async scaling of images in electron

I am populating a list (in my electron app), each item showing a hi-res image from disk.
Now, since the list is getting longer, I am running into performance issues: Electron is occupying a whole lot of RAM, so I guess I will need to render thumbnails somehow.
Electron provides an API for downscaling images.
But this is a synchronous process, so downscaling when the UI renders is not an option, since it would freeze the UI.
So it would need to be done when new images are added by the user.
However, I don't like this idea much, since I would need to take care of managing those additional thumbnail image files.
What is a good solution here? My idea was to run an HTTP server (Express) that takes care of delivering downscaled versions of the images, similar to this API:
http://localhost:30948/imageshrinker?image=imagedir/myimg.jpg&size=medium
This seems like a lot to implement (especially because I think I would need to run it in another worker window, right?), and I didn't find any library I could use.
Any ideas how to best approach this issue?
How about using nativeImage.createThumbnailFromPath? It returns a Promise, so you could display a loading/dummy/shimmer image initially and show the thumbnail when the particular promise resolves. With proper cache management (like the render chunks known, for example, from the Facebook wall), loading even 100,000 images should be a breeze.
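As a rough sketch of that placeholder-then-swap flow in plain JavaScript (here `loadThumbnail` merely stands in for `nativeImage.createThumbnailFromPath`, which only exists inside Electron, and the callback/image names are illustrative):

```javascript
// Show a placeholder immediately, then swap in the real thumbnail
// once the async loader resolves.
function showThumbnail(loadThumbnail, onUpdate) {
  onUpdate('placeholder.png');             // dummy/shimmer image shown at once
  return loadThumbnail().then((src) => {
    onUpdate(src);                         // swap in the resolved thumbnail
    return src;
  });
}
```

Each list item would call this with its own path; items whose promises are still pending simply keep showing the placeholder.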
Other than that, when I was faced with a similar issue, I did some research, and for me the best and (actually) fastest library was Sharp. I was loading JPG images from a Hasselblad that were more than 100MB in size, and creating a set of thumbnails with Sharp took about 0.4s. This is a bit troublesome, because you'll have to implement thumbnail management on your own, but in the long run you'll benefit from it. In your case I would follow an algorithm such as:
1. On open file
2. Create an MD5 checksum
3. Compare the MD5 with the database entry; if the entry exists, jump to 7
4. Open the image with the Sharp constructor and resize it to the desired value
5. Save it to a file in the temp/thumbnails folder:
   sharp(input)
     .resize(200, 300)
     .toFile(`thumbnails/${md5}.png`)
     .then(() => {
       // update the list item with the proper image source
     });
6. Write a database entry with the thumbnail source path under the original MD5 checksum
7. Insert the thumbnail as the image source
8. If Image.onError fires (meaning the thumbnail file does not exist), jump to 4
This way you wouldn't have to generate thumbnails every time someone opens the app. Just remember to clean the thumbnails directory from time to time, or drop the idea of keeping thumbnails on disk altogether: keep them in memory, display them on request, and check whether the performance is good; it might be, like:
sharp(input)
  .resize({ width: 100, height: 100 })
  .toBuffer()
  .then(data => {
    tempImages[itemID] = nativeImage.createFromBuffer(data, { width: 100, height: 100 })
  });

What is the most efficient way to display images from a ZIP archive on an offline desktop app?

I have an offline Electron + React desktop app which uses no server to store data. Here is the situation:
A user can create some text data and add any number of images to it. When they save the data, a ZIP with the following structure is created:
myFile.zip
├ main.json // Text data
└ images
├ 00.jpg
├ 01.jpg
├ 02.jpg
└ ...
If they want to edit the data, they can open the saved ZIP with the app.
I use ADM-ZIP to read the ZIP content. Once it's open, I send the JSON data to Redux, since I need it for the interface display. The images, however, are not displayed in the component that reads the archive and can appear on multiple "pages" of the app (so in different components). I display them with an Image component that takes a name prop and returns an img tag. My problem is how to get their src from the archive.
So here is my question: what could be the most efficient way to get the data of the images contained in the ZIP given the above situation?
What I am currently doing:
When the user opens their file, I extract the ZIP to a temp folder and get the image paths from there. The temp folder is only deleted when the user closes the file, not when they close the app (I don't want to change this behaviour).
// When opening the archive
const archive = new AdmZip('myFiles.zip');
archive.extractAllTo(tempFolderPath, true);
// Image component
const imageSrc = `${tempFolderPath}/images/${this.props.name}`;
Problem: if the ZIP contains a lot of images, the user must have enough disk space for them to be extracted properly. Since the temp folder is not necessarily deleted when the app is closed, that disk space stays taken until they close the file.
What I tried:
Storing the ZIP data in the memory
I tried to save the result of the opened ZIP in a prop and then get my image data from it when I need it:
// When opening the archive
const archive = new AdmZip('myFile.zip');
this.props.saveArchive(archive); // Save in redux
// Image component
const imageData = this.props.archive.readFile(this.props.name);
const imageSrc = URL.createObjectURL(new Blob([imageData], {type: "image/jpg"}));
With this process, the images loading time is acceptable.
Problem: with a large archive, storing the archive data in memory might hurt performance (I think?), and I guess there is a limit on how much I can store there.
Opening the ZIP again with ADM-ZIP every time I have to display an image
// Image component
const archive = new AdmZip('myFile.zip');
const imageData = archive.readFile(this.props.name);
const imageSrc = URL.createObjectURL(new Blob([imageData], {type: "image/jpg"}));
Problem: bad performances, extremely slow when I have multiple images on the same page.
Storing the images Buffer/Blob in IndexedDB
Problem: it's still stored on the disk and the size is much bigger than the extracted files.
Using a ZIP that doesn't compress the data
Problem: I didn't see any difference compared to a compressed ZIP.
What I considered:
Instead of a ZIP, I tried to find a file type that could act as a non-compressed archive and be read like a directory, but I couldn't find anything like that. I also tried to create a custom file format with vanilla Node.js, but I'm afraid I don't know the File System API well enough to do what I want.
I'm out of ideas, so any suggestions are welcome. Maybe I missed the most obvious approach... or maybe what I'm currently doing is not that bad and I am overthinking the problem.
What you mean by "most efficient" is not exactly clear. I'll make some assumptions and respond according to them.
Cache Solution
Basically what you have already done. Loading everything at once (extracting to a temporary folder) is pretty efficient, because whenever you need to reuse something, the most expensive task doesn't have to be done again. It's common practice for heavy applications to load assets/modules/etc. at startup.
Obs¹: Since you consider the lack of disk space a problem, if you want to stick with this approach it is desirable that you handle that case programmatically and alert the user, since having little free storage is critical.
TL;DR - Expensive at first, but faster later thanks to asset reuse.
Lazy Loading
Another very common concept, which consists of basically "load as you need, only what you need". This is efficient because it ensures your application only loads the minimum needed to run and then loads things as the user demands them.
Obs¹: This is pretty much what you did in your second attempt.
TL;DR - Faster at startup, slower during runtime.
"Smart" Loading
This is not a real name, but it describes what I mean satisfactorily. Basically, focus on understanding the goal of your project and mix both of the previous solutions according to your context, so you can achieve the best overall performance in your application and reduce the trade-offs of each approach.
Ex:
Lazy loading images per view/page while keeping an in-memory cache with a limited size
Loading images in the background while the user navigates
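The limited-size in-memory cache from the first example can be sketched as a minimal LRU built on a Map; the class name and capacity are illustrative, not from any library:

```javascript
// Minimal LRU cache: a Map iterates keys in insertion order, so the
// first key is always the least recently used entry.
class LruCache {
  constructor(maxEntries) {
    this.maxEntries = maxEntries;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key);      // re-insert to mark as most recently used
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxEntries) {
      // evict the least recently used entry (the Map's first key)
      this.map.delete(this.map.keys().next().value);
    }
  }
}
```

Evicted image buffers can simply be re-read from the archive on the next request, which keeps memory bounded at the cost of an occasional re-read.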
Now, regardless of your final decision, the following considerations should not be ignored:
Memory will always be faster to write/read than disk
Partial unzipping (extracting specific files) is possible in many packages (including ADM-ZIP) and is always faster than unzipping everything, especially if the ZIP file is huge.
Using IndexedDB or a file-based database like SQLite gives good overall results for a huge number of files and suits a "smart" approach, since querying and organizing data is easier through them.
Always keep in mind the reason for every application design choice; the better you understand why you are doing something, the better your final result will be.
That said (point 4), in my opinion you did overthink it a little in this case, but that's not a bad thing. It's admirable to care this much about doing things the best possible way; I do it often even when it's not necessary, and it's good for self-improvement.
Well, I wrote a lot :P
I would be very pleased if any of this helps you somehow.
Good luck!
TL;DR - There is no closed answer to your question; it depends a lot on the context of your application. Some of your attempts are already pretty good, but putting some effort into understanding the usability context, so you can handle image loading "smartly", will surely pay off.
The issue of using ZIP
Technically, streaming the content would be the most efficient way, but streaming compressed content from inside a ZIP is hard and inefficient by design, since the data is fragmented and the whole file has to be parsed to find the central directory.
A much better format for your specific use case is 7zip: it allows you to stream out a single file and does not require reading the whole archive to get the information about that single file.
The cost of using 7zip
You will have to ship a 7zip binary with your program; luckily it's very small, ~2.5MB.
How to use 7zip
const { spawn } = require('child_process')
// Extract all PNG images from the archive (adjust the glob to the ones you need).
// 'x' extracts while keeping the full path;
// (optionally, instead of 'x') 'e' extracts flat.
// Any additional parameter after the archive path acts as a filter.
// Note: spawn() returns a ChildProcess handle, not the images themselves;
// with 'x' the matching files are written to disk.
const sevenZip = spawn('path/to/7zipExecutable', ['x', 'path/to/7zArchive', '*.png'])
Additional consideration
Either you use the Buffer data and display it in the HTML via Base64, or you place the files in a cache folder that gets cleared as part of the extraction process, depending on how granular you want to be.
Keep in mind that you should use an appropriate folder for that: under Windows this would be %localappdata%/yourappname/cache; under Linux and macOS I would place it in ~/.yourappname/cache.
Conclusion
Streaming data is the most efficient (lowest memory/data footprint)
By using a native executable you can compress/decompress much faster than via pure JS
The compression/decompression happens in another process, which keeps your application from being blocked in execution and rendering
ASAR as Alternative
Since you mentioned that you considered another, non-compressed format, you should give the asar file format a try. It is a built-in format in Electron that can be accessed through regular file paths via FS, with no extraction step.
For example, if you have a ~/.myapp/save0001.asar containing an img folder with multiple images in it, you would access one simply as ~/.myapp/save0001.asar/img/x.png

Strategy for making React image gallery SEO-friendly

I wrote a React image gallery or slideshow. I need to make the alt text indexable by search engines, but because my server is in PHP, React.renderToString is of limited use.
The server is in PHP + MySQL. The PHP uses Smarty, a decent PHP template engine, to render the HTML. The rest of the PHP framework is my own. The Smarty template has this single ungainly line:
<script>
var GalleryData = {$gallery};
</script>
which is rendered by the PHP's controller function as follows:
return array(
'gallery' => json_encode($gallery),
);
($gallery being the result table of a MySQL query).
And my .js:
React.render(<Gallery gallery={GalleryData} />, $('.gallery').get(0));
Not the most elegant setup, but given that my server is in PHP, there doesn't seem to be a much better way to do it (?)
As a super quick hack to fix this at first, I copied the rendered HTML from Firebug and manually inserted it into a new table in the DB. I then simply render this blob of markup from PHP and we're good to go in the browser.
There was one complication: because React components are only inserted into the DOM as they're first rendered (as I understand it), and because the gallery only shows one slide at a time, I had to manually click through all the slides once before saving the HTML out.
Now, however, the alt text is editable via the CMS, so I need to automate this process or come up with a better solution.
Rewriting the server in Node.js is out of the question.
My first guess is that I need to install Node and write a script that creates the same React component. Because the input data (including the alt text) has to come from MySQL, I have a few choices:
connect to the MySQL DB from Node and replicate the query
create a response URL on the PHP side that returns only the JSON (moving the SQL query into a common function)
fetch the entire page in Node, but extracting GalleryData from it will be a mess
I then have to ensure that all components are rendered into the DOM, which I can script by calling the nextSlide() method manually as many times as there are slides (less one).
Finally I'll save the rendered DOM into the DB again (so the Node script will require a MySQL connection after all - maybe the 1st option is the best).
This whole process seems very complicated for such a basic requirement. Am I missing something?
I'm completely new to Node and the whole idea of building a DOM outside of the browser is basically new to me. I don't mind introducing Node into my architecture but it will only be to support React being used on the front-end.
Note that the website has about 15,000 pageviews a month, so massive scalability isn't a consideration - I don't use any page caching as it simply isn't needed for this volume of traffic.
I'm likely to have a few React components that need to be rendered statically like this, but maintaining a small technical overhead (e.g. maintaining a set of parallel SQL queries in Node) won't be a big problem.
Can anyone guide me here?
I think you should try rendering your React components server-side using PHP. Here is a PHP lib to do that.
But, yes, you'll basically need to use V8Js from your PHP code. However, it's kind of experimental, and you may need to go the other way around. (And this "other way around" may be using Node/Express to render your component. Here are some thoughts on how to do it.)

"data:image/png;base64,b64Encoderpngstring" VS url-to-folder when set html img's src

I use pywebkit and HTML5 to develop a desktop map server.
Map tiles are stored in an SQLite database.
So when I set the html img's src, I have two options.
One is to read the image from the database and base64-encode it, then set the img's src to "data:image/png;base64,<b64-encoded-string>".
The other is to read the image from the database and save it to disk, then set the img's src to a URL pointing to the local folder.
My question is which way is better.
I am most concerned about rendering speed: which one is faster for the browser to render?
<img src="http://mysite.com/images/myimage.jpg" /> actually uses the http URI scheme,
while <img src="data:image/png;base64,efT....." /> is a data URI. This way the image is inlined in the HTML and there is no extra HTTP request; however, embedded images can't be cached between different page loads in most cases. So the choice really comes down to what is better and more convenient for you, without overlooking what Ian suggested :)
Now, moving on to browser compatibility:
Data URIs don't work in IE 5-7, but are supported in IE 8. Source
Page size:
base64 encoding also increases page size, which is another thing to look at.
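That overhead is easy to quantify: base64 encodes every 3 input bytes as 4 output characters, roughly a 33% increase, as a quick Node sketch shows (the buffer stands in for real PNG tile bytes):

```javascript
// Build an inline data URI from raw image bytes.
function toDataUri(buffer, mimeType) {
  return `data:${mimeType};base64,${buffer.toString('base64')}`;
}

const raw = Buffer.alloc(3000);            // pretend tile bytes from SQLite
const uri = toDataUri(raw, 'image/png');
// 3000 raw bytes become 4000 base64 characters, plus the data: prefix
```

So a page inlining many tiles carries about a third more markup than the same tiles served as separate files.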
It largely depends on how the application is going to run, and on other details such as the running environment.
Saving the image to disk has the advantage that you avoid re-encoding the image every time you need it (which can also be avoided by adding a column in your DB with the computed base-64 string).
Summary:
1. Use the first option, but cache the encoded string in the database or a server variable.
2. Use the second option and cache the file names.
But then there is the question of how fast your secondary storage is. You might want to avoid using the HD even with caching, because it's slow, for example, or because you have limited binary storage (and more DB storage).
I'd suggest you add specifics regarding what you are worried about.

Javascript caching question

I'm making an application that draws a lot of widgets and tiles on a canvas. The core UI is defined by a long string of characters and drawn at page load by JavaScript. Since that core UI is big (>250K) and never changes, what's a good way to cache it?
I know I COULD just stick it in a variable at the top of the file, but is there a more efficient way? Like, if I wrote:
var img = new Image();
img.src = 'moose.png';
I assume that the browser will download and cache this image, so that on subsequent requests it won't have to hit my server again. But how do I do that with a chunk of text?
EDIT: basically I'm looking for an alternative to this:
var myUI = "AAAABBBCBVBVNCNDGAGAGABVBVB.... etc. for about 20 pages";
You can create a JavaScript file that contains the string of text.
var text='.....';
Then you can just include the script:
<script src="/ui.initialization.js" type="text/javascript"></script>
Followed by whatever other javascript you use to render the UI. The browser will cache the js file.
EDIT
I'm assuming you're opposed to having the long string of text inline. By moving it to a separate file, you allow the browser to cache it (if it's truly static and your server is configured with proper cache-control headers for static resources).
Here's some information for tweaking caching on Apache (I'm assuming you're running PHP on Apache).
Most static resources are cacheable by a browser. Just put your data in a .txt, .dat, .xml or whatever (even a .js) and load it with your JavaScript via AJAX.
The time to download 250K over anything above 1Mbps throughput is under a second; is this really an issue for you?
And the very file you are downloading, the JavaScript with its 250K baggage, will probably be cached itself.
You can use Google Gears or the new HTML5 data storage features, supported by FF 3.5 and others.
Use Google Page Speed or YSlow to figure out what other (HTTP) improvements you can make.
