@Kabir5454
You have not defined the joint pdf of the random vector $(X,Y)$, if it even exists.
Compare with the single-variable case: say $X$ is a random variable and $Y=g(X)$ for some function $g$. We first define the interval or set of values of $x$ together with the pdf/pmf, and then we write $E[Y]=\sum_{x \in S_X}g(x)p_X(x)$ or $E[Y]=\int_{-\infty}^{\infty}g(x)f_X(x)\,dx$ when the expectation exists, i.e. $\sum_{x \in S_X}|g(x)|p_X(x) <\infty$ or $\int_{-\infty}^{\infty}|g(x)|f_X(x)\,dx <\infty.$
This condition is called absolute convergence. (Here $S_X$ is the support of $X$: the set of points carrying positive probability, or positive density in the continuous case.)
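A quick concrete check (my own toy example, not from your question): take $X \sim \text{Uniform}(0,1)$ and $g(x)=x^2$. A few lines of sympy verify both the absolute-convergence condition and the value $E[g(X)]=\frac{1}{3}$:

```python
# Sketch only: X ~ Uniform(0, 1), g(x) = x^2 (my illustrative choices).
import sympy as sp

x = sp.symbols('x', real=True)
f_X = sp.Integer(1)        # pdf of Uniform(0, 1) on its support [0, 1]
g = x**2

# Absolute-integrability check: finite, so E[g(X)] exists.
print(sp.integrate(sp.Abs(g) * f_X, (x, 0, 1)))   # 1/3

# The expectation itself.
print(sp.integrate(g * f_X, (x, 0, 1)))           # 1/3
```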
So it is also possible that the expectation of a random variable does not exist; the Cauchy distribution is the classic example.
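To see that Cauchy failure concretely, here is a minimal simulation sketch (the seed and sample size are arbitrary choices of mine): the running sample mean of Cauchy draws never settles down, while the Normal running mean converges by the law of large numbers.

```python
# Sketch: running means of Normal vs. Cauchy samples.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

cauchy = rng.standard_cauchy(n)
normal = rng.standard_normal(n)

k = np.arange(1, n + 1)
cauchy_mean = np.cumsum(cauchy) / k   # keeps jumping around
normal_mean = np.cumsum(normal) / k   # stabilizes near 0

for m in (10**3, 10**4, 10**5):
    print(m, normal_mean[m - 1], cauchy_mean[m - 1])
```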
The same goes for more than one random variable.
Suppose $(X_1,X_2)$ is a random vector of the continuous type and $Y=g(X_1,X_2)$ is a real-valued function of it. Then
$E[Y]=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}g(x_1,x_2)f_{X_1,X_2}(x_1,x_2) \ dx_1 \ dx_2,$
provided that $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty}|g(x_1,x_2)|f_{X_1,X_2}(x_1,x_2) \ dx_1 \ dx_2 < \infty.$
Here $f_{X_1,X_2}(x_1,x_2)$ is called the joint probability density function (joint pdf), defined on the given region of $(x_1,x_2)$ values, and from it we obtain the marginal pdfs $f_{X_1}(x_1)$ and $f_{X_2}(x_2)$ by integrating out the other variable.
So, when we talk about the random vector $(X_1,X_2)$ we talk about all these things.
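To make that concrete, here is a minimal sympy sketch with a joint pdf of my own choosing, $f_{X_1,X_2}(x_1,x_2)=4x_1x_2$ on the unit square (so $X_1$ and $X_2$ happen to be independent); it recovers the marginals by integrating out one variable and computes $E[X_1X_2]$:

```python
# Sketch: joint pdf -> marginals -> E[g(X1, X2)] (illustrative pdf, my choice).
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = 4 * x1 * x2                        # joint pdf on 0 <= x1, x2 <= 1

# Sanity check: the joint pdf integrates to 1 over its support.
print(sp.integrate(f, (x1, 0, 1), (x2, 0, 1)))    # 1

# Marginal pdfs, obtained by integrating out the other coordinate.
print(sp.integrate(f, (x2, 0, 1)))                # 2*x1  = f_{X1}(x1)
print(sp.integrate(f, (x1, 0, 1)))                # 2*x2  = f_{X2}(x2)

# E[X1*X2]; the integrand is nonnegative on the support, so the
# absolute-integrability condition holds automatically.
print(sp.integrate(x1 * x2 * f, (x1, 0, 1), (x2, 0, 1)))   # 4/9
```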
You have defined $Y=1-X$, which is a transformed random variable with the same cdf as $X$; hence $X$ and $Y$ are said to be "equal in distribution," even though $X \neq Y.$
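If it helps to see "equal in distribution" concretely, here is a small simulation sketch; note I am assuming $X \sim \text{Uniform}(0,1)$ (the setting your question suggests), which is an assumption on my part:

```python
# Sketch, ASSUMING X ~ Uniform(0, 1): Y = 1 - X differs from X pointwise
# but has the same distribution, so their empirical quantiles agree.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, size=100_000)
y = 1.0 - x

print(np.mean(np.isclose(x, y)))      # ~0: X != Y realization by realization

qs = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.quantile(x, qs))             # the two quantile vectors nearly match,
print(np.quantile(y, qs))             # reflecting the identical cdfs
```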
----------------------------------------------------------------------------------------------------
You have already proved the given claim very nicely, and hopefully others will see that too.
Since a counter-example for C was not given, I am adding one, although strictly speaking it is not needed: your proof already settles the claim.
Counter-example for C:
$f(x,y)=x+y$ for $0 \leq x \leq 1$ and $0 \leq y \leq 1$
and $f(x,y)=0$ otherwise
Here, $E[X]=E[Y]=\frac{7}{12}$ and $E[XY]=\frac{1}{3}$, so $E[XY] \neq E[X]E[Y] = \frac{49}{144}.$
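A quick symbolic check of these numbers (just sympy redoing the double integrals):

```python
# Verifying the counter-example: f(x, y) = x + y on the unit square.
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x + y

EX  = sp.integrate(x * f, (x, 0, 1), (y, 0, 1))      # 7/12
EY  = sp.integrate(y * f, (x, 0, 1), (y, 0, 1))      # 7/12
EXY = sp.integrate(x * y * f, (x, 0, 1), (y, 0, 1))  # 1/3

print(EX, EY, EXY, EX * EY)   # 7/12 7/12 1/3 49/144  -> E[XY] != E[X]E[Y]
```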
(Sorry for the long comment)