The explanation for why division by zero is undefined often goes like this: to say that 6/3=2 is to say that 3*2=6. Now take 6/0=x: we would have to find some number that, when multiplied by zero, gave us 6 (x*0=6). This we can’t do. So, since division and multiplication are inter-defined in this way (generally, a/b=c if and only if c*b=a), we can’t divide by 0.
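The point that x*0=6 has no solution can be illustrated with a minimal Python sketch (the candidate list is just an illustrative sample, not part of the original argument): whatever x we try, x*0 comes out 0, never 6.

```python
# Try a sample of candidates for an x with x * 0 == 6.
# Since x * 0 == 0 for every x, no candidate can qualify.
candidates = [x / 10 for x in range(-1000, 1001)]  # -100.0 to 100.0 in steps of 0.1
solutions = [x for x in candidates if x * 0 == 6]
print(solutions)  # []
```

This is why Python (like most languages) raises an error for `6 / 0` rather than returning a number.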
But another way of stating the interdefinition is to start from multiplication and work back to division: we can say that a*b=c if and only if c/b=a and c/a=b (e.g. 3*2=6 if and only if 6/3=2 and 6/2=3). But this will not work for 5*0=0. One direction works fine, since 0/5=0, but the other fails, since it tells us that 0/0=5. But 0/0 is indeterminate and so cannot equal 5. Therefore 0*5 is indeterminate.
Now, in a sense it is true that 0/0=5, since according to our interdefinition this just means that 0*5=0, which is of course true. But the problem is that this will be true for any answer. So, suppose you thought that 0/0=120. This is a perfectly good answer, since 0*120=0. But since any number will do as an answer for 0/0, it is defined as indeterminate, and the above argument should go through.
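The "any answer will do" point can be checked directly in a short sketch (the particular candidates, including 5 and 120 from above, are just an illustrative sample): every candidate x passes the test 0*x=0, so the interdefinition picks out no unique value for 0/0.

```python
# Every candidate "answer" to 0/0 satisfies the defining equation 0 * x == 0,
# so the equation fails to single out any one of them.
candidates = [0, 1, 5, 120, -7, 3.14]
print(all(0 * x == 0 for x in candidates))  # True
```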
What’s the right answer to this problem?