Cause
Some images are simply hard to down-sample and interpolate, such as this one with curves, when you want to go from a large size to a small one.
Browsers typically appear to use bi-linear (2x2 sampling) interpolation with the canvas element rather than bi-cubic (4x4 sampling), likely for performance reasons.
If the scaling step is too large, there are simply not enough pixels to sample from, which is reflected in the result.
From a signal/DSP perspective you could see this as a low-pass filter whose cut-off is set too high, which can result in aliasing when there are many high frequencies (details) in the signal.
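To see the effect, this is the kind of direct one-step downscale that produces the aliased result (a minimal sketch; the canvas id and image URL are simply reused from the demo further down):
const canvas = document.getElementById("canvas");
const ctx = canvas.getContext("2d");

const img = new Image();
img.onload = function() {
  // a single drawImage() straight from full size to target size:
  // bi-linear sampling only looks at a 2x2 neighborhood per destination pixel,
  // so most of the source pixels are never sampled at all
  ctx.drawImage(this, 0, 0, canvas.width, canvas.height);
};
img.src = "//i.stack.imgur.com/cYfuM.jpg";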
Solution
Update 2018:
Here's a neat trick you can use for browsers that support the filter property on the 2D context. It pre-blurs the image, which is in essence the same as resampling, and then scales down. This allows for a large reduction while needing only two steps and two draws.
Pre-blur the image using the number of steps, (original size / destination size) / 2, as the blur radius (you may need to adjust this heuristically depending on browser and odd/even steps; only a simplified version is shown here):
const canvas = document.getElementById("canvas");
const ctx = canvas.getContext("2d");
if (typeof ctx.filter === "undefined") {
  alert("Sorry, the browser doesn't support Context2D filters.");
}

const img = new Image();
img.onload = function() {
  // step 1: create an off-screen canvas at the original size
  const oc = document.createElement('canvas');
  const octx = oc.getContext('2d');
  oc.width = this.width;
  oc.height = this.height;

  // step 2: pre-filter the image using the number of steps as blur radius
  const steps = (oc.width / canvas.width) >> 1;
  octx.filter = `blur(${steps}px)`;
  octx.drawImage(this, 0, 0);

  // step 3: draw the pre-blurred image scaled down to the target size
  ctx.drawImage(oc, 0, 0, oc.width, oc.height, 0, 0, canvas.width, canvas.height);
};
img.src = "//i.stack.imgur.com/cYfuM.jpg";
body{ background-color: ivory; }
canvas{border:1px solid red;}
<br/><p>Original was 1600x1200, reduced to 400x300 canvas</p><br/>
<canvas id="canvas" width=400 height=300></canvas>
Support for filter as of Oct/2018:
CanvasRenderingContext2D.filter
api.CanvasRenderingContext2D.filter
On Standard Track, Experimental
https://developer.mozilla.org/docs/Web/API/CanvasRenderingContext2D/filter
DESKTOP > |Chrome |Edge |Firefox |IE |Opera |Safari
:----------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------
filter ! | 52 | ? | 49 | - | - | -
MOBILE > |Chrome/A |Edge/mob |Firefox/A |Opera/A |Safari/iOS|Webview/A
:----------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------
filter ! | 52 | ? | 49 | - | - | 52
! = Experimental
Data from MDN - "npm i -g mdncomp" (c) epistemex
Update 2017: There is now a new property defined in the specs for setting resampling quality:
context.imageSmoothingQuality = "low|medium|high"
It's currently only supported in Chrome. The actual method used for each level is left to the vendor to decide, but it's reasonable to assume Lanczos for "high", or something equivalent in quality. This means step-down may be skipped altogether, or larger steps can be used with fewer redraws, depending on the image size and browser.
Support for imageSmoothingQuality:
CanvasRenderingContext2D.imageSmoothingQuality
api.CanvasRenderingContext2D.imageSmoothingQuality
On Standard Track, Experimental
https://developer.mozilla.org/docs/Web/API/CanvasRenderingContext2D/imageSmoothingQuality
DESKTOP > |Chrome |Edge |Firefox |IE |Opera |Safari
:----------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:
imageSmoothingQuality !| 54 | ? | - | ? | 41 | Y
MOBILE > |Chrome/A |Edge/mob |Firefox/A |Opera/A |Safari/iOS|Webview/A
:----------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:
imageSmoothingQuality !| 54 | ? | - | 41 | Y | 54
! = Experimental
Data from MDN - "npm i -g mdncomp" (c) epistemex
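Where the property is available it can be used with a single draw; a minimal sketch, assuming the same canvas and image as above and feature-detecting the property since it is still experimental:
const canvas = document.getElementById("canvas");
const ctx = canvas.getContext("2d");

const img = new Image();
img.onload = function() {
  ctx.imageSmoothingEnabled = true;         // on by default, shown for clarity
  if ("imageSmoothingQuality" in ctx) {
    ctx.imageSmoothingQuality = "high";     // let the browser use a higher-quality filter
  }
  ctx.drawImage(this, 0, 0, canvas.width, canvas.height);
};
img.src = "//i.stack.imgur.com/cYfuM.jpg";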
Until then..:
End of transmission
The solution is to use step-down to get a proper result. Step-down means you reduce the size in steps to allow the limited interpolation range to cover enough pixels for sampling.
This gives good results also with bi-linear interpolation (it actually behaves much like bi-cubic when used this way), and the overhead is minimal as there are fewer pixels to sample in each step.
The ideal approach is to halve the resolution in each step until you reach the target size (thanks to Joe Mabel for mentioning this!).
Modified fiddle
Using direct scaling as in original question:
Using step-down as shown below:
In this case you will need to step down in 3 steps:
In step 1 we reduce the image to half by using an off-screen canvas:
// step 1 - create off-screen canvas
var oc = document.createElement('canvas'),
octx = oc.getContext('2d');
oc.width = img.width * 0.5;
oc.height = img.height * 0.5;
octx.drawImage(img, 0, 0, oc.width, oc.height);
Step 2 reuses the off-screen canvas and draws the image reduced to half again:
// step 2
octx.drawImage(oc, 0, 0, oc.width * 0.5, oc.height * 0.5);
Finally, we draw once more to the main canvas, again reduced by half, but this time to the final target size:
// step 3
ctx.drawImage(oc, 0, 0, oc.width * 0.5, oc.height * 0.5,
0, 0, canvas.width, canvas.height);
Tip:
You can calculate the total number of steps needed using this formula (it includes the final step that sets the target size):
steps = Math.ceil(Math.log(sourceWidth / targetWidth) / Math.log(2))
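Putting the halving loop and the formula together, a generalized step-down helper could look something like this (a sketch only; the function name downScaleImage and returning a new canvas are my own choices, not part of the original fiddle):
function downScaleImage(img, targetWidth, targetHeight) {
  // total number of halving steps, including the final draw to target size
  const steps = Math.ceil(Math.log(img.width / targetWidth) / Math.log(2));

  // off-screen canvas holding the intermediate results
  const oc = document.createElement("canvas");
  const octx = oc.getContext("2d");
  oc.width = img.width;
  oc.height = img.height;
  octx.drawImage(img, 0, 0);

  let curWidth = img.width;
  let curHeight = img.height;

  // halve repeatedly; the last halving is replaced by the draw to target size
  for (let i = 1; i < steps; i++) {
    octx.drawImage(oc, 0, 0, curWidth, curHeight,
                       0, 0, curWidth * 0.5, curHeight * 0.5);
    curWidth *= 0.5;
    curHeight *= 0.5;
  }

  // final step: draw the last intermediate region at the exact target size
  const result = document.createElement("canvas");
  result.width = targetWidth;
  result.height = targetHeight;
  result.getContext("2d").drawImage(oc, 0, 0, curWidth, curHeight,
                                        0, 0, targetWidth, targetHeight);
  return result;
}
The returned canvas can then be drawn onto the visible canvas, or read back with toDataURL(), as needed.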