Remember that a = 0 holds in all cases for this equation.
Start with the factored form: (a-a)(4-3)=0
Expand it into: a(4-3)-a(4-3)=0
Since a=0: a(4-4)-a(3-3)=0
Take out common factors: 4a(1-1)-3a(1-1)=0
This step is a bit tricky. Since a = 0, and a(4-3)-a(4-3)=0, it follows that a(3-3)=0. So replacing the 0 at the end of the last equation shown with a(3-3) makes no difference; and since 0 contributes nothing to the left side of each statement, it doesn’t matter if it’s gone. So it’s safe to say that I may move it over to the right.
So: 4a(1-1)=3a(1-1)=0
Divide each: 4a(1-1)/a(1-1)=3a(1-1)/a(1-1)=0
You end up with 4/1=3/1=0
Since it can also be written as 4=3=0, it is safe to say that: 4=3
Step 1 is wrong. You don’t expand like that; you use FOIL to expand a product of factors. You also divided by 0. Simplify before moving things around: a(1-1) = 0, so you can’t divide it out on both sides. You’d get an indeterminate form, or something like that.
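To make the divide-by-zero objection concrete, here’s a quick Python sketch (mine, not from the thread): with a = 0, every line of the “proof” is just 0 = 0 until the forbidden division step.

```python
a = 0  # the "proof" assumes a = 0 throughout

left = 4 * a * (1 - 1)   # = 0
right = 3 * a * (1 - 1)  # = 0
assert left == right == 0  # so far just 0 = 0: true, but it says nothing

# The fallacious step divides both sides by a*(1-1), which is 0:
try:
    left / (a * (1 - 1))
except ZeroDivisionError as err:
    print("can't do it:", err)  # can't do it: division by zero
```

The moment you try the division, Python refuses, which is exactly the point of the refutation above.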
Again: why did you make this thread? There was another thread, and it was already proven. Why did you think you would even come close to solving something that would actually be revolutionary to mathematicians, when you constantly come to us with problems that a middle schooler shouldn’t have trouble with? Do you think before you post threads?
There’s no real explaining it, because it’s defined by people to just <b>be</b> like that. x^4 * x^2 = x^(2+4) = x^6, and x^4 * x^0 = x^(4+0) = x^4, so x^0 is, <b>by definition</b>, equal to 1.
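You can check that add-the-exponents rule numerically. A quick Python sketch (the value 7 is just an arbitrary example):

```python
x = 7  # any nonzero number works here

# Product rule: x^m * x^n = x^(m+n)
assert x**4 * x**2 == x**(2 + 4) == x**6

# The only value of x^0 consistent with that rule is 1,
# since x^4 * x^0 has to equal x^(4+0) = x^4:
assert x**4 * x**0 == x**4
assert x**0 == 1
print("x^0 =", x**0)  # x^0 = 1
```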
0^0 is undefined because it is in conflict with two definitions: that x^0 = 1, and that 0^x = 0 (which holds for x > 0).
You could go ahead and say that 0^0 = 1, and I don’t think it would do any harm. But it can also be argued that 0^0 = 0. So it’s usually left undefined, because there’s no great reason to favor one over the other, and defining it either way doesn’t help solve any problems.
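For what it’s worth, many programming languages do pick a side: Python, for instance, defines 0**0 as 1 (a convention, not a theorem):

```python
# Python follows the common combinatorics convention that 0^0 = 1.
print(0 ** 0)      # 1
print(0.0 ** 0.0)  # 1.0  (IEEE-style pow(0, 0) is also 1)

# This is a definition chosen for convenience; it doesn't resolve
# the limit-based argument that 0^0 is indeterminate.
```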
On the other hand, 0/0 is undefined because there is no reasonable value it could be. It can’t be 1 because then all numbers could be proven equal.
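Python agrees with that, too: 0/0 is rejected outright rather than given a value. A small sketch:

```python
# 0/0 has no sensible value, so Python raises instead of answering.
try:
    0 / 0
except ZeroDivisionError as err:
    print("0/0 ->", err)  # 0/0 -> division by zero

# If 0/0 were defined as 1, then from 2*0 == 3*0 you could
# "divide both sides by 0" and conclude 2 == 3.
```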
I recall some formula I once found that tried to show why 0^0 != 1, but I don’t remember what it was. I do remember my teacher saying that the 0 here isn’t really 0; it really means something really small. And if you raise something really small to the power of something really small, you can’t determine its answer. So it’s an indeterminate form.
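That “really small to the power of really small” idea can be made concrete numerically. A quick sketch (the limits are approximated with shrinking values of t):

```python
# Two expressions that both look like 0^0 as t -> 0+, with different limits:
for t in (0.1, 0.01, 0.001):
    print(t ** t, 0 ** t)

# t**t heads toward 1.0, while 0**t stays at 0.0,
# so "tiny ** tiny" has no single answer: an indeterminate form.
```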