Make the number you multiply by essentially the concatenation of a long series of random digits, and I can just about guarantee most humans will get different results on the two sides, because they'll make one or more mistakes doing the math. That is, of course, assuming the humans don't have suitable traditional computer tools capable of handling such a scenario.
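For what it's worth, here's a minimal sketch (Python, purely illustrative; the names are mine, not from the thread) of what those "suitable traditional computer tools" buy you: arbitrary-precision integers make the multiplication exact, so the two sides always agree.

    import random

    # Build "essentially the concatenation of a long series of random digits".
    digits = "".join(random.choice("0123456789") for _ in range(200))
    big = int(digits)  # Python ints are arbitrary precision, so no overflow

    # Multiply both sides of 2 * 2 = 4 by the same big number.
    left = (2 * 2) * big
    right = 4 * big

    # The machine does this exactly; no carry or transcription errors.
    print(left == right)  # always True

A human doing the same 200-digit multiplication by hand twice is another story entirely, which was the whole point.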
You don't see how asking humans to multiply both sides of 2 * 2 = 4 by the same very large, random-ish number, and expecting that they'll get different results, is relevant to this:
> 2 * 2 = 4, but if you multiply both sides of that equation by a big number, that's no longer true.
You know, the very same scenario I pulled from your comment?