**Ohm's law** states that the electric current through a conductor is **directly proportional** to the voltage across it, provided physical conditions such as temperature remain constant, i.e.
$$V = IR$$
On the other hand,
$$\text{Power} = \text{Voltage} \times \text{Current}, \quad \text{i.e. } P = VI$$
So here it seems that the *higher* the voltage, the *lower* the current, provided the power remains constant. That is, the current is **inversely proportional** to the voltage, which appears to go against Ohm's law.
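To make my confusion concrete, here is a small numerical sketch of the two relations (the values of $R$, $P$, and the voltages are made up purely for illustration):

```python
# Illustrative numbers only: compare how the current behaves
# under the two relations as the voltage increases.

V_values = [10.0, 20.0, 40.0]  # volts

# Ohm's law with a fixed resistance R: I = V / R, so I grows with V.
R = 5.0  # ohms (arbitrary)
ohm_currents = [V / R for V in V_values]

# Fixed power P: I = P / V, so I shrinks as V grows.
P = 100.0  # watts (arbitrary)
power_currents = [P / V for V in V_values]

print(ohm_currents)    # current increasing with voltage
print(power_currents)  # current decreasing with voltage
```

With resistance held fixed the current goes up with voltage, but with power held fixed it goes down, which is exactly the tension I am asking about.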
Now my question is: how do physicists explain this apparent *contradiction*? Or is this not a contradiction at all, and I am analysing things incorrectly?
P.S.: I am a tenth-grade student, so please refrain from using highly complicated terminology in your answers.