The relation between voltage and current is inversely proportional for a fixed load. A device with a 120 volt input will draw 4 times the current of a device with a 480 volt input for the same load (and energy, in watt-hours, is what you actually pay for).
As voltage increases, current decreases, all else being equal as far as the load is concerned. This is an oversimplification of Ohm's law of course, but close to the limit of my understanding of electricity.
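The arithmetic behind that inverse relationship can be sketched in a few lines. This is just illustrative; the 9,600 W load value is a made-up example, not a figure from the thread.

```python
# For a fixed-power load, current scales inversely with voltage: I = P / V.
def current_amps(power_watts: float, volts: float) -> float:
    """Current drawn by a load of the given power at the given voltage."""
    return power_watts / volts

load_w = 9600.0  # hypothetical fixed load in watts

i_120 = current_amps(load_w, 120)  # 80 A at 120 V
i_480 = current_amps(load_w, 480)  # 20 A at 480 V

# The 120 V circuit draws 4x the current of the 480 V circuit,
# matching the 480/120 = 4 ratio described above.
print(i_120 / i_480)  # → 4.0
```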
I have no reason to doubt the data that
@rookhawk has posted so lets work with that.
The charging rate of the type of station appears to be directly proportional to the input voltage.
If we can't figure out a way to safely install a residential service with a higher input voltage than 240VAC, then we do have serious problems.
Those skinny little wires along the road that carry primary voltage to the transformer at your home are, on the low end, 13,800VAC before it is stepped down to 240VAC. Much higher voltages are commonly present.
Keep in mind when you are sticking that "adaptor" in the wall in Namibia to charge your phone that it is 220VAC phase to GROUND. The same service supplying that residence is 380VAC phase to phase.
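Those two numbers are consistent: in a three-phase wye system, the phase-to-phase voltage is the phase-to-ground voltage times the square root of 3. A quick check, assuming the nominal 220VAC figure above:

```python
import math

def phase_to_phase(v_phase_to_ground: float) -> float:
    """Line-to-line voltage in a three-phase wye system: sqrt(3) x line-to-neutral."""
    return v_phase_to_ground * math.sqrt(3)

# 220 V phase-to-ground works out to roughly 381 V phase-to-phase,
# which is quoted as the nominal 380 VAC service.
print(round(phase_to_phase(220)))  # → 381
```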
One day who knows?