There are better ways that place usability first. By usability I mean that there is nothing for the content creator to do and nothing for the frontend developer to do.
I use mod_pagespeed - there are versions for nginx and Apache that do all of the heavy lifting.
With mod_pagespeed you can get all of the srcset images at sensible compression levels. All you need to do is mark up your code with width= and height= attributes on each img.
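As a sketch, enabling this in Apache looks something like the following (filter names are from the modpagespeed.com docs; paths and the exact filter list will vary by install):

```apache
# Minimal sketch of a pagespeed.conf enabling image rewriting.
ModPagespeed on
ModPagespeedEnableFilters rewrite_images,resize_images,responsive_images
```

With explicit dimensions on each image, e.g. `<img src="photo.jpg" width="800" height="600">`, the responsive_images filter can generate the srcset variants itself.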
With this in place the client can upload multi-megabyte images from their camera without having to fiddle about in Photoshop etc. It just works, and the hard part is abstracted away to mod_pagespeed.
By taking this approach there is no need for fancy build tools. However, a background script to 'mogrify' your source images is a nice complement to mod_pagespeed: if you want your images to appear in Google Image Search then 1920x1080 is what you need.
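That background pass might look like this, assuming ImageMagick is installed and UPLOAD_DIR is wherever client images land (both assumptions for illustration):

```shell
# Cap source images at 1920x1080 for Google Image Search. The '>'
# geometry suffix tells ImageMagick to shrink only: images already
# within 1920x1080 are left untouched, and aspect ratio is preserved.
UPLOAD_DIR="${UPLOAD_DIR:-/var/www/uploads}"

shrink_originals() {
    # Walk the tree and resize every JPEG in place.
    find "$1" -type f \( -name '*.jpg' -o -name '*.jpeg' \) \
        -exec mogrify -resize '1920x1080>' -quality 85 {} +
}

# Run from cron or a post-upload hook:
# shrink_originals "$UPLOAD_DIR"
```

Because mogrify overwrites in place, keep it pointed at the uploads folder only, never at originals you cannot re-fetch.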
The really good thing about taking the mod_pagespeed route is that you do get 'infinite zoom' on mobile: pinch to zoom and it fills in the next srcset size. Keep going and you eventually get the original, which you have converted in the background to 1920x1080.
There is also the option to optimise images perceptually, so you are not just mashing everything down to 70% (or 84%) quality.
On your local development box you can run without mod_pagespeed and just have the full resolution images.
Or you can experiment with more advanced features such as lazy loading - this also comes for free with mod_pagespeed.
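Turning that on is a one-liner (lazyload_images is the documented filter name; worth trying on a staging box first):

```apache
ModPagespeedEnableFilters lazyload_images
```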
If you want your images to line up in nice squares then you might add whitespace to the images, perhaps spending time in Photoshop to do so. However, it is easier to 'identify' the image heights/widths and set something sensible for them, keeping the aspect ratio correct. Then you can use modern CSS to align them in figure elements and let mod_pagespeed fill out the srcsets.
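A sketch of that CSS, with illustrative class names - the figure provides the square, and object-fit letterboxes each image inside it without distorting it:

```css
.gallery {
  display: grid;
  grid-template-columns: repeat(auto-fill, minmax(200px, 1fr));
  gap: 1rem;
}
.gallery figure {
  aspect-ratio: 1 / 1; /* the 'square', no whitespace baked in */
  margin: 0;
}
.gallery img {
  width: 100%;
  height: 100%;
  object-fit: contain; /* letterbox rather than crop */
}
```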
Icons and other required images are best manually tweaked into cut-down SVG files and then put into CSS as data URLs, thereby avoiding a whole load of extra requests (even if it is just the one for a fiddly 'sprite sheet').
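For example, a hand-minified icon inlined straight into the stylesheet (the path data here is just a placeholder triangle; characters like < and # are percent-encoded so the data URL is valid in CSS):

```css
.icon-play {
  background-image: url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16'%3E%3Cpath d='M3 2l10 6-10 6z'/%3E%3C/svg%3E");
  background-repeat: no-repeat;
  background-size: 1em 1em;
}
```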
Oh, a final tweak: if you are running a script to optimise uploaded images and restrict their maximum size then you can also use 4:2:0 chroma subsampling. The image keeps full resolution for brightness, but the colour channels are 'halved in resolution', which is not noticeable in a lot of use cases. Strictly it only applies to lossy formats such as JPEG and WebP - PNG is lossless - but converting transparency PNGs to lossy WebP (which mod_pagespeed can do for supporting browsers) gets a similar win while keeping the alpha channel.
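With ImageMagick that subsampling step is one flag (again a sketch, assuming ImageMagick is installed; filenames are placeholders):

```shell
# Force 4:2:0 chroma subsampling: luma (brightness) is stored at full
# resolution, the two colour channels at half resolution in each
# direction, which most eyes cannot spot on photographic content.
to_420() {
    # $1 = input image, $2 = output image
    convert "$1" -sampling-factor 4:2:0 -quality 85 "$2"
}

# to_420 upload.jpg optimised.jpg
```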
As mentioned, mod_pagespeed reduces project complexity by offloading the hard work to the server, keeping cruft out of the project and keeping build tools out of the way. It can also be configured to inline some images, and plenty else besides, to get really good performance.
Mileage may vary if the decision has been made to use a CDN where such functionality is not possible. However, if you are serving a local market then a faux CDN is pretty good, i.e. a static domain on HTTP/2 where the cache headers are set properly and no cookies are sent up/down the wire with every image.
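A minimal sketch of that faux CDN in nginx (domain and paths are placeholders; certificate directives omitted for brevity):

```nginx
server {
    listen 443 ssl http2;
    server_name static.example.com;
    root /var/www/static;

    location ~* \.(jpg|jpeg|png|webp|svg|css|js)$ {
        add_header Cache-Control "public, max-age=31536000, immutable";
        # This host never issues Set-Cookie, so every image request
        # stays cookie-free in both directions.
    }
}
```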
https://www.modpagespeed.com/doc/filter-image-optimize

https://www.modpagespeed.com/doc/filter-image-responsive