I guess you need to cache the images so these methods only run once.
You could also use a queue to process the images in the background.
Setting the php.ini memory limit to 1 GB is a bad idea. If your server gets many connections it will soon run out of memory and crash.
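For the queue idea, here is a minimal sketch of a queued job, assuming Laravel's queue system and the Intervention Image v2 facade (the class name, constructor fields, paths, and quality value are made up for illustration):

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Intervention\Image\Facades\Image;

// Hypothetical job that crops and resizes a single image in the background.
class ProcessImage implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $path;   // source image path
    protected $crop;   // ['width' => ..., 'height' => ..., 'x' => ..., 'y' => ...]
    protected $width;  // target width for the resize

    public function __construct($path, array $crop, $width)
    {
        $this->path  = $path;
        $this->crop  = $crop;
        $this->width = $width;
    }

    public function handle()
    {
        $img = Image::make($this->path);
        $img->crop($this->crop['width'], $this->crop['height'], $this->crop['x'], $this->crop['y']);
        $img->resize($this->width, null, function ($constraint) {
            $constraint->aspectRatio();
            $constraint->upsize();
        });
        // 80 is just an example JPEG quality
        $img->save($this->path . '_' . $this->width . '.jpg', 80);
    }
}

// Dispatch it from a controller or route, e.g.:
// ProcessImage::dispatch('/tmp/upload.jpg', ['width' => 800, 'height' => 600, 'x' => 0, 'y' => 0], 1920);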
Thanks for your answer.
Caching: Those images are only rendered once. Next time they are rendered, there's a different crop, so caching doesn't seem to make sense, right?
Queue: I haven't worked with queues so far, but that seems like a good idea. I guess callback functions must be possible as well. I will dig into that.
Memory Limit: You advise me to keep the limit open? The server hosts a special multimedia content management system; it's not a frontend server. Do you still think there might be a need to keep the limit open in that case?
My progress so far: using the backup() and reset() methods I was able to reduce the rendering time drastically, by over 50%. I only create and crop an image once. After that I loop over the resize sizes, starting with the largest one. I resize, set a restore point with backup(), and then encode and save the result. On the next iteration I reset() the image back to the restore point, so I am resizing an already smaller image, which is faster.
Here's the code:
$filename_base = 'test';

// crop the image
$img->crop($pos['width'], $pos['height'], $pos['x'], $pos['y']);

// ----
// optimizing image resize speed:

// 1. resize steps
$resize = array(
    array('size_x' => $width[0], 'compress' => $compress[0], 'filename' => 'desktop'),
    array('size_x' => $width[1], 'compress' => $compress[1], 'filename' => 'mobile'),
    array('size_x' => $width[2], 'compress' => $compress[2], 'filename' => 'full'),
);

// 2. sort the array by size_x
//    (using a helper comparator, Helpers::cmp -- see the sketch after the code)
usort($resize, "Helpers::cmp");

// 3. to save time, start resizing with the biggest image and then continue with the smaller ones
for ($i = count($resize) - 1; $i >= 0; $i--) {
    if ($i < count($resize) - 1) {
        $img->reset();
    }

    // resize image
    $img->resize($resize[$i]['size_x'], null, function ($constraint) {
        $constraint->aspectRatio();
        $constraint->upsize();
    });

    // create restore point
    $img->backup();

    // encode image with individual compression
    $img->encode('jpg', $resize[$i]['compress']);

    // save file to BLOB
    FileStore::add($connect_blob, Config::get('privateconfig.ablob1'), $filename_base . $resize[$i]['filename'] . '.jpg', $img);
}
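The Helpers::cmp comparator referenced above isn't shown in the post; a guess at what it might look like, assuming it simply sorts the resize steps ascending by size_x:

class Helpers
{
    // Compare two resize steps by their target width so that
    // usort() orders the array from smallest to largest size_x.
    public static function cmp(array $a, array $b)
    {
        return $a['size_x'] <=> $b['size_x'];
    }
}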
Any further ideas for improvements are more than welcome.
Those images are only rendered once. Next time they are rendered, there's a different crop, so caching doesn't seem to make sense, right?
I guess so. Can you please explain the actual use case of this script?
You advise me to keep the limit open? The server hosts a special multimedia content management system; it's not a frontend server. Do you still think there might be a need to keep the limit open in that case?
Ah, then it is a different case. I can't tell for sure until you run a load test on the server.
Did you try this without compression? It will produce bigger files, yes, but I think it will make the whole process finish faster.
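If you want to test that, one option is to switch the encoding step to a lossless format for the test run, so no JPEG quality compression happens at all (a sketch, assuming Intervention Image's encode() as used above):

// encode losslessly instead of 'jpg' with a quality value
$img->encode('png');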
Thanks :)
Did you try this without compression? It will produce bigger files, yes, but I think it will make the whole process finish faster.
We are rendering the images for multimedia reportages. These stories are viewed by thousands of users, many of whom have a slow connection, so image compression is crucial.
But hey, I came up with a different approach ;) As soon as an image is resized it should be rendered by the server right away: a JavaScript call triggers a server function in the background that renders the image. So if you work on a story with 50 images, you don't need to process all 50 photos when you publish it at the end, because they are already on the server.
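On the server side that could be as simple as a small endpoint which the editor's JavaScript calls after each crop and which queues the rendering, so the work is already done at publish time. A sketch, reusing the hypothetical ProcessImage job from above; the route name and request fields are assumptions:

// routes/web.php -- hypothetical endpoint called via AJAX from the editor
Route::post('/images/render', function (Illuminate\Http\Request $request) {
    // Queue the heavy crop/resize work now instead of doing it all at publish time.
    \App\Jobs\ProcessImage::dispatch(
        $request->input('path'),
        $request->input('crop'),
        (int) $request->input('width')
    );

    return response()->json(['queued' => true]);
});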