Let $Z$ be a standard normal random variable.
I am trying to find a solution to the following problem:\begin{align}&\max_{ x_1,x_2 \in \mathbb{R}, t\in[0,1]} (1-t) E[|x_1+Z|^m]+t E[|x_2+Z|^m]\\&\text{ s.t. } (1-t) |x_1|^k+t|x_2|^k=c\end{align}where $0\le m\le k$.
This problem can also be cast as the following problem:\begin{align}&\max_{X} E[|X+Z|^m] \quad (*)\\&\text{ s.t. } X \text{ has two mass points}, \ E[|X|^k]=c, \ X \text{ is independent of } Z\end{align}
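For what it's worth, the equivalence of the two formulations is easy to check numerically: sampling $X$ from the two-point law and averaging $|X+Z|^m$ reproduces the mixture objective. Here is a quick Python sketch (the values $x_1, x_2, t$ are arbitrary illustrations; for $m=2$ the objective is available in closed form, $E[|x+Z|^2]=x^2+1$):

```python
import numpy as np

rng = np.random.default_rng(1)
m = 2.0
x1, x2, t = 0.5, 2.0, 0.3  # illustrative two-point law: P(X=x2)=t, P(X=x1)=1-t
N = 1_000_000

# sample X from the two-point law, then add an independent standard normal Z
X = np.where(rng.uniform(size=N) < t, x2, x1)
Z = rng.normal(size=N)
mc = np.mean(np.abs(X + Z) ** m)          # Monte Carlo estimate of E[|X+Z|^m]

# mixture form of the objective; for m = 2, E[|x+Z|^2] = x^2 + 1 exactly
exact = (1 - t) * (x1**2 + 1) + t * (x2**2 + 1)
print(mc, exact)  # the two agree up to Monte Carlo error
```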
My conjecture is that the above problem is maximized by the deterministic random variable $X= c^{\frac{1}{k}}$ and\begin{align}\max_{X} E[|X+Z|^m]= E[|c^{\frac{1}{k}}+Z|^m].\end{align}
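One can probe the conjecture numerically before trying to prove it. The sketch below (my own illustration, not a proof; the choices $m=1$, $k=2$, $c=1$ and the particular two-point competitor are arbitrary) computes $E[|x+Z|^m]$ by Gauss–Hermite quadrature and compares the deterministic candidate $X=c^{1/k}$ against a two-point candidate satisfying the constraint:

```python
import numpy as np

def E_abs_pow(x, m, n=200):
    """E[|x + Z|^m] for Z ~ N(0,1), via Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n)
    z = np.sqrt(2.0) * nodes                      # change of variables to N(0,1)
    return float(weights @ np.abs(x + z) ** m) / np.sqrt(np.pi)

def objective(x1, x2, t, m):
    return (1 - t) * E_abs_pow(x1, m) + t * E_abs_pow(x2, m)

def constraint(x1, x2, t, k):
    return (1 - t) * abs(x1) ** k + t * abs(x2) ** k

m, k, c = 1.0, 2.0, 1.0

# deterministic candidate X = c**(1/k)
base = E_abs_pow(c ** (1 / k), m)

# a two-point competitor meeting the constraint: mass 1-t at 0, mass t at (c/t)**(1/k)
t = 0.1
x1, x2 = 0.0, (c / t) ** (1 / k)
alt = objective(x1, x2, t, m)

print("deterministic:", base, "two-point:", alt,
      "constraint value:", constraint(x1, x2, t, k))
```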
While we restrict $Z$ to be standard normal here, it would be nice to have a proof that works for all symmetric, absolutely continuous distributions.
I feel like the proof should use Jensen's inequality, but I am not sure how to apply it. The reason is the following. Suppose we remove $Z$ and seek to optimize
\begin{align}&\max_{X} E[|X|^m]\\&\text{ s.t. } X \text{ has two mass points}, \ E[|X|^k]=c, \ X \text{ is independent of } Z.\end{align}Since $m \le k$, the map $u \mapsto u^{\frac{m}{k}}$ is concave on $[0,\infty)$, so by Jensen's inequality\begin{align}E[|X|^m] = E\big[(|X|^k)^{\frac{m}{k}}\big] \le \big( E[|X|^k] \big)^{\frac{m}{k}} = c^{\frac{m}{k}}.\end{align}Note that for $m<k$ Jensen's inequality is an equality iff $|X|$ is almost surely constant. So the optimization problem $\max_{X} E[|X|^m]$ is solved by a deterministic random variable.
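This inequality $E[|X|^m] \le (E[|X|^k])^{m/k}$ is easy to spot-check numerically for two-point laws; here is a small sketch (the values $m=1.5$, $k=3$ and the random two-point laws are illustrative only):

```python
import numpy as np

# spot-check Jensen: E[|X|^m] <= (E[|X|^k])^(m/k) for random two-point laws
rng = np.random.default_rng(0)
m, k = 1.5, 3.0
worst = -np.inf
for _ in range(1000):
    x1, x2 = rng.normal(size=2)      # the two mass points
    t = rng.uniform()                # P(X = x2) = t, P(X = x1) = 1 - t
    lhs = (1 - t) * abs(x1) ** m + t * abs(x2) ** m               # E[|X|^m]
    rhs = ((1 - t) * abs(x1) ** k + t * abs(x2) ** k) ** (m / k)  # (E[|X|^k])^(m/k)
    worst = max(worst, lhs - rhs)

print("largest violation:", worst)   # should be <= 0 up to rounding
```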
Edit 1: It appears that my conjecture is only true in some cases. See the very nice approach by kimchilover.
Edit 2: It also appears that in $(*)$ the $\max$ should be replaced with $\sup$. This was also pointed out by kimchilover.