Isn't current or the ability of the amp to provide decent current more important than watts?
Kind of, it depends really.
The amplifier has voltage rails which limit the maximum voltage it can swing, but it also has a maximum current it can deliver, which might be limited by the output devices or by the capacity of its power supply. The amplifier's power limit is reached when either of these limits is hit.
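To make that concrete, here is a rough back-of-envelope sketch in Python. The numbers (±40 V rails, an 8 A peak current limit) are purely illustrative assumptions, not any particular amplifier; the point is just that deliverable power into a load is capped by whichever limit you hit first.

```python
# Toy numbers only: power into a resistive load is capped by either the
# rail-limited voltage swing or the peak output current, whichever runs out first.

def max_power_into_load(v_peak, i_peak, load_ohms):
    """Approximate max sine-wave power (W RMS) into a resistive load."""
    v_limited = v_peak ** 2 / (2 * load_ohms)   # rails clip first
    i_limited = i_peak ** 2 * load_ohms / 2     # current limit reached first
    return min(v_limited, i_limited)

# e.g. assume +/-40 V rails and an 8 A peak current limit:
for load in (16, 8, 4, 2):
    print(f"{load} ohms -> {max_power_into_load(40, 8, load):.0f} W")
# 16 ohms -> 50 W, 8 ohms -> 100 W (voltage-limited)
# 4 ohms -> 128 W, 2 ohms -> 64 W (current-limited)
```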
If you make an amplifier, there is little point in having massive voltage swing potential if you can't deliver the current, or massive current delivery potential if the voltage output is limited. So some sort of balanced design is typical, and that balance has to be struck against the expected impedance of the speakers, since impedance determines the current required for a given voltage.
Amplifiers in the 70s were designed expecting 16 or 8 ohm speaker loads, so the balance of voltage to current delivery was chosen with this in mind. The 80s trend for lower speaker impedance (6 or even 4 ohm) meant the current delivery assumptions in those older designs were wrong. However, if you are never near the limit of current or voltage, you won't see any particular difference.
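Again with made-up numbers: for the same rail-limited swing, halving the speaker impedance doubles the peak current the amplifier is asked to supply, which is exactly what a design balanced for 8 or 16 ohm loads didn't allow for.

```python
# Same assumed 40 V peak swing as above; the current demanded doubles each
# time the load impedance halves.
v_peak = 40
for load in (16, 8, 4):
    print(f"{load} ohms: {v_peak / load:.1f} A peak to use the full swing")
# 16 ohms: 2.5 A, 8 ohms: 5.0 A, 4 ohms: 10.0 A
```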
There have always been speaker designs which are grim to drive (e.g. ESL57s) and amplifiers which are massively over-engineered in the current delivery department (e.g. Krell), but whether this will help you rather depends on what you are doing with these things.
But I do agree: running out of current is probably the more likely limit to hit in an amplifier than running out of voltage swing, because current capability is expensive to add while extra voltage swing is comparatively cheap. That said, I don't think I've ever got near clipping an amplifier in my years with domestic hifi, even with 20 W amps.