I'm using the nativescript-background-http plugin to upload images to a remote server.
The idea is that I use the nativescript-imagepicker plugin to select the image, save it to a folder using saveToFile, then upload it.
saveToFile works well with smaller files, but there's a noticeable lag when the file size is big. Is there a way to reduce the file size before saving?
I was able to work around the issue by setting the image height, width and quality using getImage(), as shown below.
getImage({maxWidth: 200, maxHeight: 200, quality: 100})
This makes all files saved with saveToFile a uniform size and quality, so there's no lag when dealing with large or small files. A better suggestion would be great though.
You can compress images on the client side before uploading, but you will lose some data in the process. In the case of images, that data is resolution.
Related
I'm asking users on registration to upload an avatar image, but some images take a long time to upload because they're very large. Is there any way to reduce the image size before sending it with the POST request? I'm using Nuxt.js.
Not really.
You could validate the file size and reject anything above a specific threshold.
Don't do image compression on the client side, though.
There are some ugly solutions, like converting your image into a canvas and back; you will definitely get some sub-par image quality, but it may be okay in your case.
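For reference, that canvas round-trip looks roughly like this (a minimal sketch; `scaledSize` and `compressImage` are hypothetical names, and the canvas part only runs in a browser):

```javascript
// Compute a target size that fits within maxW x maxH while keeping aspect ratio.
function scaledSize(width, height, maxW, maxH) {
  const ratio = Math.min(maxW / width, maxH / height, 1); // never upscale
  return { width: Math.round(width * ratio), height: Math.round(height * ratio) };
}

// Browser-only: draw the image onto a canvas and re-encode it as a smaller JPEG.
function compressImage(file, maxW, maxH, quality) {
  return new Promise((resolve, reject) => {
    const img = new Image();
    img.onload = () => {
      const { width, height } = scaledSize(img.width, img.height, maxW, maxH);
      const canvas = document.createElement('canvas');
      canvas.width = width;
      canvas.height = height;
      canvas.getContext('2d').drawImage(img, 0, 0, width, height);
      canvas.toBlob(resolve, 'image/jpeg', quality); // lossy re-encode
    };
    img.onerror = reject;
    img.src = URL.createObjectURL(file);
  });
}
```

The resulting Blob can then be appended to a FormData and posted as usual; the quality loss mentioned above comes from the lossy JPEG re-encode.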
This answer is still relevant too.
So I'm currently building a website using a PHP backend and Polymer frontend. The client wants a news feature for their own events. For this I want to convert all the images to WebP and create a few different sizes so I can serve them quickly to different devices (mobile, tablet, desktop, etc.). However, I haven't been able to find a good way of doing this in PHP or JS. Squoosh is great for static assets but not for user-generated content. Any help appreciated, thanks!
PHP's GD extension has functions for creating WebP images. Try this:
<?php
$im = imagecreatefromstring(file_get_contents('path/to/image.jpg')); // Create image resource
$im = imagescale($im, 800, -1); // Optionally resize (800px wide, height computed automatically)
imagewebp($im, 'path/to/image.webp', 80); // Encode as WebP at quality 80 and save
imagedestroy($im); // Free the image resource
?>
The resizing necessarily has to be done server-side. What you can do on the client is use the srcset and sizes attributes of the img tag so the browser picks the best version:
<img srcset="elva-fairy-320w.jpg 320w,
             elva-fairy-480w.jpg 480w,
             elva-fairy-800w.jpg 800w"
     sizes="(max-width: 320px) 280px,
            (max-width: 480px) 440px,
            800px"
     src="elva-fairy-800w.jpg" alt="Elva dressed as a fairy" />
(directly from Mozilla documentation)
I would highly recommend using Adobe Photoshop. With this you can manually compress/resize images or submit them in batch.
I don't know if you have access to the server, but one way could be to call ImageMagick from PHP.
That requires PHP to execute commands on the server, which can be dangerous, so please keep that in mind.
ImageMagick doesn't support WebP in every build, to my knowledge, but I'm sure you get the idea behind the thought.
If you don't want PHP to interact with the server itself, you could also periodically scan for non-converted / non-resized images and then convert them.
On Linux that could be: find ./ -name "*.jpg" -exec CONVERT_FUNCTION {} \;
For resizing and compressing the image, you need an image library installed with your PHP, like ImageMagick or GD.
You can write your own resizing function as shown in https://stackoverflow.com/a/14649689, but you have to be careful with the image types, as each type has its own creation function.
A perhaps easier way to resize is the Intervention Image package: https://image.intervention.io/v2/api/resize (this also requires either GD or ImageMagick to be installed):
// resize image to fixed size
$img->resize(300, 200);
// resize only the width of the image
$img->resize(300, null);
// resize only the height of the image
$img->resize(null, 200);
// resize the image to a width of 300 and constrain aspect ratio (auto height)
$img->resize(300, null, function ($constraint) {
    $constraint->aspectRatio();
});
Using this library, you can also compress the image with the encode or save functions:
https://image.intervention.io/v2/api/encode
https://image.intervention.io/v2/api/save
// open and resize an image file
$img = Image::make('public/foo.jpg')->resize(300, 200);
// save file as jpg with medium quality
$img->save('public/bar.jpg', 60);
You may also use the TinyPNG API to compress your images: https://tinypng.com/developers. It compresses JPEG, PNG and WebP, and is free for up to 500 images per month.
I have a form file input, and when an image is uploaded it should be resized twice: first to a square resolution (100x100), and second to a landscape resolution (1000x500). After upload, the square version goes to a square folder and the landscape version to a landscape folder.
So the original image won't be saved to the database, but the resized images will. Do you think a jQuery plugin for my case exists?
JavaScript/jQuery is not the right choice here.
JavaScript/jQuery run client-side: they operate on the user's PC, not on your server. There may be plugins capable of resizing images, but you certainly won't be able to store the results in different server folders using JavaScript/jQuery alone.
Such an operation must be done on your server, with a server-side language like PHP, Node.js (which still uses JavaScript), Java or many others.
The answer depends on the language of your choice, so I can't give a general one.
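Whatever server-side language you pick, the core step is the same: compute a centered crop that matches the target aspect ratio, then scale it down. A minimal Node-style sketch of that math (`centerCrop` is a hypothetical helper; the actual cropping/resizing would be done by an image library such as GD or sharp):

```javascript
// Compute the largest centered crop of (width x height) whose aspect ratio
// matches the target (targetW x targetH), e.g. 100x100 or 1000x500.
function centerCrop(width, height, targetW, targetH) {
  const targetRatio = targetW / targetH;
  let cropW = width;
  let cropH = Math.round(width / targetRatio);
  if (cropH > height) {            // source is too wide: constrain by height instead
    cropH = height;
    cropW = Math.round(height * targetRatio);
  }
  return {
    x: Math.round((width - cropW) / 2),  // left offset of the crop
    y: Math.round((height - cropH) / 2), // top offset of the crop
    width: cropW,
    height: cropH,
  };
}
```

The image library then crops to that rectangle and resizes the result to 100x100 (saved to the square folder) or 1000x500 (saved to the landscape folder).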
I'm working on a site where users can edit a banner for themselves, and I'm planning to use KineticJS for the image editor. The thing is, the banner will go to print, and the resulting image has to be 5014px by 12402px. Is it possible to work with an image that size while the stage is only 4% (200x496) of the final image? If not, what would you suggest for rendering a huge image like that? (I was thinking of generating a data file at the end of editing, then rendering the image in PHP with GD.)
Thank you for your answers!
Just upload the huge image to the server without displaying it on canvas.
If you need to give the user visual feedback you can resize the image in PHP and send back the 4% image for Kinetic to display.
Here's a link to a php resizer: http://php.net/manual/en/imagick.resizeimage.php
BTW, a 60MB+ upload exceeds PHP's default limits (upload_max_filesize, post_max_size), so be sure to reconfigure your server!
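If the editor works at a fixed 4% scale, mapping the edited shapes back to print size is just a division by that scale. A minimal sketch (`scaleShape` is a hypothetical helper; the serialized stage is assumed to hold plain x/y/width/height numbers):

```javascript
const EDITOR_SCALE = 200 / 5014; // stage is ~4% of the 5014x12402 print size

// Scale one serialized shape's geometry from editor coordinates up to print coordinates.
function scaleShape(shape, scale = EDITOR_SCALE) {
  return {
    ...shape,
    x: Math.round(shape.x / scale),
    y: Math.round(shape.y / scale),
    width: Math.round(shape.width / scale),
    height: Math.round(shape.height / scale),
  };
}
```

The scaled-up geometry is what you would hand to the PHP/GD renderer mentioned in the question.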
Well, if you can restrict your users to drawing with shapes only, you could serialize the stage and convert it to a vector format.
Then you can render this vector format to whatever size of bitmap you want.
But that's quite a project...
I was wondering if there are any other ways to compress my images, or any script that would make the page load faster or load the images behind the scenes.
The site is very interactive and uses very high-quality layers of images for the main layout. I have already used Save for Web & Devices in Photoshop and re-compressed with ImageOptim; some are JPEGs, but the majority are PNG-24 to maintain transparency, and they are all set in CSS.
I have used JPEGs and CSS sprites where I can, but there is one image in particular, a tree illustration stretching the full length of the site, that is really slowing down the loading time. Is there any way I could compress these images further, or code them differently, that I've missed?
Any help would be great thanks!
You said you are spriting. That is good.
You can also use tools such as PNGcrush, which make files smaller by dropping things such as metadata.
You should also send far-future expiry headers so the images won't be re-downloaded unnecessarily, and use a cache buster in the filenames so you can still push updated versions.
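A minimal .htaccess sketch of those expiry headers, assuming Apache with mod_expires enabled (the one-year lifetime is an arbitrary example):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Far-future expiry for images: browsers keep them cached for a year.
  ExpiresByType image/png  "access plus 1 year"
  ExpiresByType image/jpeg "access plus 1 year"
</IfModule>
```

With lifetimes that long, rename the file (e.g. tree.v2.png) whenever it changes so clients actually fetch the new version.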
In Photoshop, choose File → Save for Web; there you will be able to find the best compromise between size and quality.
Do you really need the transparency there? PNG transparency is unsupported in some browsers and makes page rendering processing-intensive and slow even on high-end computers, depending on image size and the number of layers. If you can show something of your site, maybe someone can give more specific hints about how to optimize it.
You can compress them on the fly with Apache if that's your web server. One of many available articles on the subject: http://www.samaxes.com/2009/01/more-on-compressing-and-caching-your-site-with-htaccess/