danlev wrote: Does anybody know how to deal with this issue?
All uploaded images are passed through the ValidatePicture function in PictureService.cs, which uses the ImageResizer library to compress the image before it is saved to either the file system or the database. There are a couple of easily accessible settings under Media Settings in nop that affect the file size of the resulting image:
1. Maximum image size (default 1980): constrains the longest dimension of the image and resizes it proportionally.
2. Default image quality (0 - 100) (default 80): determines the amount of compression applied to the resulting image (higher number = higher quality = less compression).
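To make the first setting concrete, here is a minimal sketch of the proportional-resize arithmetic that a MaxWidth/MaxHeight constraint implies (the helper name ResizeProportional is mine for illustration, not part of nop or ImageResizer):

```csharp
using System;

public static class MaxSizeDemo
{
    // Hypothetical helper: cap the longest side at maxSize and scale
    // the other side by the same factor, which is the effect of setting
    // both MaxWidth and MaxHeight to MaximumImageSize.
    public static (int Width, int Height) ResizeProportional(int width, int height, int maxSize)
    {
        int longest = Math.Max(width, height);
        if (longest <= maxSize)
            return (width, height); // already within the limit, no resize

        double scale = (double)maxSize / longest;
        return ((int)Math.Round(width * scale), (int)Math.Round(height * scale));
    }

    public static void Main()
    {
        // A 4000x3000 upload with the default MaximumImageSize of 1980
        var (w, h) = ResizeProportional(4000, 3000, 1980);
        Console.WriteLine($"{w}x{h}"); // prints "1980x1485"
    }
}
```

So a 4000x3000 upload comes out at 1980x1485, while anything already within the limit is left at its original dimensions.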
Andrei has previously asked about this on Stack Overflow and received an answer from the author of the ImageResizer library, though that mainly covers PNG images rather than JPEGs.
I think that if your JPEG is heavily compressed before you upload it to nop, it can end up being re-encoded with less compression than it originally had. Obviously this process cannot improve the image quality, but it can increase the file size.
One possible way to prevent this is to change the ValidatePicture function so that it only returns the re-encoded image when that is smaller than the original (the alteration is the final return statement):
public virtual byte[] ValidatePicture(byte[] pictureBinary, string mimeType)
{
    using (var destStream = new MemoryStream())
    {
        ImageBuilder.Current.Build(pictureBinary, destStream, new ResizeSettings
        {
            MaxWidth = _mediaSettings.MaximumImageSize,
            MaxHeight = _mediaSettings.MaximumImageSize,
            Quality = _mediaSettings.DefaultImageQuality
        });
        // Only use the re-encoded image if it is actually smaller than the original
        return (destStream.Length < pictureBinary.Length) ? destStream.ToArray() : pictureBinary;
    }
}
(though note that this potentially bypasses the MaximumImageSize constraint: an oversized original whose file happens to be smaller than the re-encoded copy would be kept without being resized)
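One way to keep the size guarantee would be to fall back to the original bytes only when the original already fits within the maximum dimension. A minimal sketch of that decision logic as a pure helper (the name and signature are mine, not nop's; the real ValidatePicture would still call ImageBuilder as above, and reading the original's pixel dimensions would need an extra decode step that I haven't shown):

```csharp
using System;

public static class PictureSizeCheck
{
    // Hypothetical helper: return true if we should keep the original
    // upload instead of the re-encoded copy. The original is kept only
    // when it is both smaller on disk AND already within the
    // MaximumImageSize constraint, so resizing is never skipped.
    public static bool KeepOriginal(
        long originalBytes, long encodedBytes,
        int width, int height, int maximumImageSize)
    {
        bool withinSizeLimit = Math.Max(width, height) <= maximumImageSize;
        return withinSizeLimit && originalBytes < encodedBytes;
    }
}
```

ValidatePicture would then return pictureBinary only when KeepOriginal(...) is true, and destStream.ToArray() otherwise.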