Most likely they determine impurities and subtract from 100%.
I thought about that, but there is no free lunch there either: if you are measuring impurities, you have to measure them in some known quantity of the total substance, which means you still have to measure out that total substance with some accuracy, and that measurement in turn limits the overall accuracy.
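For what it's worth, here is a minimal error-propagation sketch in Python with made-up numbers (the sample mass, the impurity mass and both uncertainties are pure assumptions, not from any real assay), just to show how both the impurity determination and the weighing of the total sample feed into a purity-by-difference figure:

```python
import math

# All numbers below are assumed for illustration only.
m_sample   = 10.000   # g, weighed-out sample
m_imp      = 0.010    # g, total impurities found (0.1% of the sample)
u_m_sample = 0.001    # g, weighing uncertainty of the sample
u_m_imp    = 0.0005   # g, uncertainty of the impurity determination

frac_imp = m_imp / m_sample   # impurity mass fraction
purity   = 1.0 - frac_imp     # purity "by difference"

# Standard propagation for a quotient: relative uncertainties add in quadrature.
u_frac = frac_imp * math.sqrt((u_m_imp / m_imp) ** 2 + (u_m_sample / m_sample) ** 2)

print(f"purity = {purity * 100:.4f}% +/- {u_frac * 100:.4f}% (absolute)")
# -> purity = 99.9000% +/- 0.0050% (absolute)
```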
The best I can think of would be a very tall burette that uses ultrasound or a laser to measure the distance to the top of the titration column. That could give very high accuracy. Let's say in a lab you have a 200 cm tall burette and the laser or ultrasound can measure down to 0.1 mm; that gives you about 1 part in 20,000.
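The arithmetic on that idea, written out with the same numbers as above and nothing else assumed:

```python
# Back-of-the-envelope resolution of the tall-burette-plus-distance-sensor idea.
column_height_mm     = 2000.0   # 200 cm burette
sensor_resolution_mm = 0.1      # laser / ultrasound distance resolution

levels = column_height_mm / sensor_resolution_mm
print(f"about 1 part in {levels:.0f}")   # -> about 1 part in 20000
```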
Now, granted, this is probably just a theoretical exercise... I do not know of any chemical process that would require accuracy better than 1%. If you are doing research, you probably care about finding molar ratios, so the measurements only need to be accurate enough to tell which of the plausible ratios, say 1:2 or 3:7 or whatever, the measured value is closest to; probably 0.1% is good enough (a toy sketch of that reasoning follows below). In a production environment you often run things to excess, with the excess being waste, so 1% waste is usually acceptable. In my particular case I am titrating to measure total acid, free acid and iron content in my phosphating bath, but those numbers will easily change by 10% or more as I use the bath, so for me 1% accuracy is more than enough.
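To make the molar-ratio point concrete, here is a toy sketch (the candidate ratios, the cap on the integers, and the example numbers are my own assumptions, not a standard analytical procedure) that lists which simple p:q ratios are still compatible with a measured value at a given relative accuracy:

```python
from itertools import product
from math import gcd

def compatible_ratios(measured: float, rel_uncertainty: float, max_int: int = 9):
    """Return every reduced p:q ratio (p, q <= max_int) inside the error band."""
    lo = measured * (1 - rel_uncertainty)
    hi = measured * (1 + rel_uncertainty)
    return sorted({(p // gcd(p, q), q // gcd(p, q))
                   for p, q in product(range(1, max_int + 1), repeat=2)
                   if lo <= p / q <= hi})

print(compatible_ratios(0.4286, 0.001))  # 0.1% accuracy -> [(3, 7)], unambiguous
print(compatible_ratios(0.4286, 0.05))   # 5% accuracy   -> [(3, 7), (4, 9)], ambiguous
```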
Just wondering if I am missing something: is there really a need, and are there tools available, to do chemical analysis beyond the 1/20,000 error offered by $4000 digital burettes, or is this basically the limit set by the most stringent needs of this field of science?
When you are sending a rocket to Mars, a 1/20,000 error in trajectory angle amounts to tens of thousands of miles, but for chemistry, is this the practical limit of needed accuracy?