OK, I've done it properly now. Because of the low precision of the display (to 0.01MΩ and 1V), I can only do it sensibly from 22kΩ upwards, and the results are summarised here:

I foolishly did not note down the actual resistance readings being displayed when I did my tests (so I calculated the test current only on the basis of the nominal values of the resistors), but they were all pretty close. I'll do it again (without that oversight!) so that I can determine precisely what current it's using.
As well as estimating the test current in use by dividing the displayed voltage by the displayed resistance, I have (in view of the precision issue) also estimated upper and lower bounds for each of those estimates, by considering the worst possible rounding errors arising from the limited resolution of the display.
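For anyone who wants to reproduce that arithmetic, here's a minimal sketch of the bounds calculation. It assumes display resolutions of 0.01MΩ and 1V and that the meter rounds each reading to the nearest display step; the function name and the example figures are purely illustrative, not my actual readings:

```python
# Estimate the test current (with worst-case bounds) from the displayed
# voltage and resistance, given display resolutions of 0.01 MOhm and 1 V.
# Assumes the meter rounds each reading to the nearest display step.

def current_estimate(v_disp, r_disp_mohm, v_step=1.0, r_step_mohm=0.01):
    """Return (nominal, lower, upper) test-current estimates in mA."""
    r_disp = r_disp_mohm * 1e6        # displayed resistance, ohms
    r_step = r_step_mohm * 1e6        # resistance display step, ohms

    nominal = v_disp / r_disp * 1e3   # straightforward V/R estimate, mA

    # Worst case: the true voltage and resistance could each be up to
    # half a display step above or below what is shown.
    lower = (v_disp - v_step / 2) / (r_disp + r_step / 2) * 1e3
    upper = (v_disp + v_step / 2) / (r_disp - r_step / 2) * 1e3
    return nominal, lower, upper

# Illustrative example: 150 V displayed across a reading of 0.10 MOhm
print(current_estimate(150, 0.10))   # ~1.5 mA nominal, plus rounding bounds
```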
It is apparent that (whatever the manual may say about 1mA) my Fluke 1652 is using a test current of around 1.5 mA. As previously discussed, one would expect to hit a 'ceiling' at about 350kΩ - beyond which the test voltage could not increase any further, so the test current would fall progressively as the resistance increased above that value. I've included separate data for 390kΩ to illustrate that this 'ceiling' point seems to have been more-or-less reached by that resistance.
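To spell out the 'ceiling' arithmetic (again purely illustrative, and assuming the test current really is about 1.5 mA and the voltage tops out at roughly the 350kΩ point, which would imply a maximum test voltage of about 525 V):

```python
# Illustrative 'ceiling' behaviour, assuming a ~1.5 mA constant-current drive
# up to the maximum test voltage implied by a ~350 kOhm ceiling.
I_TEST = 1.5e-3                 # assumed constant test current, A
R_CEILING = 350e3               # resistance at which the voltage tops out, ohms
V_MAX = I_TEST * R_CEILING      # implied maximum test voltage, ~525 V

def test_current_ma(r_ohms):
    """Test current in mA: constant below the ceiling, voltage-limited above it."""
    return min(I_TEST, V_MAX / r_ohms) * 1e3

print(test_current_ma(330e3))   # still ~1.5 mA (below the ceiling)
print(test_current_ma(390e3))   # ~1.35 mA - current now falling with resistance
```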
Kind Regards, John